Source: https://en.wikipedia.org/wiki/Sega
Sega Corporation[a][b] is a Japanese video game company and subsidiary of Sega Sammy Holdings headquartered in Tokyo. It produces several multi-million-selling game franchises for arcades and consoles, including Sonic the Hedgehog, Angry Birds,[c] Phantasy Star, Puyo Puyo, Super Monkey Ball, Total War, Virtua Fighter, Megami Tensei, Sakura Wars, Persona, The House of the Dead, and Yakuza. From 1983 until 2001, Sega also developed its own consoles.

Sega was founded by Martin Bromley and Richard Stewart as Nihon Goraku Bussan[d] on June 3, 1960. Shortly after, it acquired the assets of its predecessor, Service Games of Japan. In 1965, it became known as Sega Enterprises, Ltd., after acquiring Rosen Enterprises, an importer of coin-operated games. Sega developed its first coin-operated game, Periscope, in 1966. Sega was sold to Gulf and Western Industries in 1969. Following a downturn in the arcade business in the early 1980s, Sega began to develop video game consoles, starting with the SG-1000 and Master System, but struggled against competitors such as the Nintendo Entertainment System. In 1984, Sega executives David Rosen and Hayao Nakayama led a management buyout, with backing from CSK Corporation. In 1988, Sega released the Mega Drive (the "Genesis" in North America). The Mega Drive struggled against competition in Japan, but the Genesis found success overseas after the release of Sonic the Hedgehog in 1991 and briefly outsold its main competitor, the Super Nintendo Entertainment System, in the US. In 2001, after several commercial failures such as the 32X, Saturn, and Dreamcast, Sega stopped manufacturing consoles to become a third-party developer and publisher, and was acquired by Sammy Corporation in 2004. Sega Holdings Co., Ltd. was established in 2015; Sega Corporation was renamed Sega Games Co., Ltd., and its arcade division was split into Sega Interactive. In 2020, Sega Games and Sega Interactive merged to become Sega Corporation.

Sega's international branches, Sega of America and Sega Europe, are headquartered in Irvine, California, and London. Its development studios include its internal research and development divisions (which use the Ryu Ga Gotoku Studio and Sonic Team brands for several core franchise entries); Sega Sapporo Studio, which mainly supports the Tokyo-based development teams and handles partial game development; Atlus (including its R&D divisions); and five development studios in the UK and Europe: Creative Assembly, Sports Interactive, Sega Hardlight, Two Point Studios, and Rovio Entertainment (including Ruby Games).

Sega is one of the world's most prolific arcade game producers, and its mascot, Sonic, is internationally recognized. Sega is recognized for its video game consoles, creativity and innovations. In more recent years, it has been criticized for its business decisions and the quality of its creative output. As the entertainment contents division of Sega Sammy Holdings, forming one half of the Sega Sammy Group, Sega also owns a toy and amusement machine company, Sega Fave, which comprises its arcade development and manufacturing divisions and two animation studios: TMS Entertainment, which animates, produces, and distributes anime, and Marza Animation Planet, which specializes in CG animation.

History

In May 1940, American businessmen Martin Bromley, Irving Bromberg and James Humpert formed Standard Games in Honolulu, Hawaii.
Their aim was to provide coin-operated amusement machines, including slot machines, to military bases as the increase in personnel with the onset of World War II would create demand for entertainment. After the war, the founders sold Standard Games in 1945, and established Service Games the next year, named for the military focus. After the US government outlawed slot machines in its territories in 1952, Bromley sent employees Richard Stewart and Ray LeMaire to Tokyo to establish Service Games of Japan to provide coin-operated slot machines to US bases in Japan. A year later, all five men established Service Games Panama to control the entities of Service Games worldwide. The company expanded over the next seven years to include distribution in South Korea, the Philippines, and South Vietnam. The name Sega, an abbreviation of Service Games, was first used in 1954 on a slot machine, the Diamond Star. Due to notoriety arising from investigations by the US government into criminal business practices, Service Games of Japan was dissolved on May 31, 1960. On June 3, Bromley established two companies to take over its business activities, Nihon Goraku Bussan and Nihon Kikai Seizō.[e] The two new companies purchased all of Service Games of Japan's assets. Kikai Seizō, doing business as Sega, Inc., focused on manufacturing slot machines. Goraku Bussan, doing business under Stewart as Utamatic, Inc., served as a distributor and operator of coin-operated machines, particularly jukeboxes. The companies merged in 1964, retaining the Nihon Goraku Bussan name. Around the same time, David Rosen, an American officer in the United States Air Force stationed in Japan, launched a photo booth business in Tokyo in 1954. This company became Rosen Enterprises, and in 1957 began importing coin-operated games into Japan. In 1965, Nihon Goraku Bussan acquired Rosen Enterprises to form Sega Enterprises, Ltd.[f] Rosen was installed as the CEO and managing director, while Stewart was named president and LeMaire was the director of planning. Shortly afterward, Sega stopped leasing to military bases and moved its focus from slot machines to coin-operated amusement machines. Its imports included Rock-Ola jukeboxes, pinball games by Williams, and gun games by Midway Manufacturing. Because Sega imported second-hand machines, which required frequent maintenance, it began constructing replacement guns and flippers for its imported games. According to former Sega director Akira Nagai, this led to the company developing its own games. The first arcade electro-mechanical game (EM game) Sega manufactured was the submarine simulator Periscope, released worldwide in the late 1960s. It featured light and sound effects considered innovative and was successful in Japan. It was then exported to malls and department stores in Europe and the United States and helped standardize the 25-cent-per-play cost for arcade games in the US. Sega was surprised by the success, and for the next two years, the company produced and exported between eight and ten games per year. The worldwide success of Periscope led to a "technological renaissance" in the arcade industry, which was reinvigorated by a wave of "audio-visual" EM novelty games that followed in the wake of Periscope during the late 1960s to early 1970s. However, rampant piracy led Sega to cease exporting its games around 1970. In 1969, Sega was sold to the American conglomerate Gulf and Western Industries, although Rosen remained CEO. 
In 1974, Gulf and Western made Sega Enterprises, Ltd., a subsidiary of an American company renamed Sega Enterprises, Inc. Sega released Pong-Tron, its first video-based game, in 1973. Despite late competition from Taito's hit arcade game Space Invaders in 1978, Sega prospered from the arcade video game boom of the late 1970s, with revenues climbing to over US$100 million by 1979. During this period, Sega acquired Gremlin Industries, which manufactured microprocessor-based arcade games, and Esco Boueki, a coin-op distributor founded and owned by Hayao Nakayama. Nakayama was placed in a management role of Sega's Japanese operations. In the early 1980s, Sega was one of the top five arcade game manufacturers active in the United States, as company revenues rose to $214 million. 1979 saw the release of Head On, which introduced the "eat-the-dots" gameplay Namco later used in Pac-Man. In 1981, Sega licensed Konami's Frogger, its most successful game until then. In 1982, Sega introduced the first game with isometric graphics, Zaxxon. Following a downturn in the arcade business starting in 1982, Gulf and Western sold its North American arcade game manufacturing organization and the licensing rights for its arcade games to Bally Manufacturing in September 1983. Gulf and Western retained Sega's North American R&D operation and its Japanese subsidiary, Sega Enterprises, Ltd. With its arcade business in decline, Sega Enterprises, Ltd. president Nakayama advocated for the company to use its hardware expertise to move into the home consumer market in Japan. This led to Sega's development of a computer, the SC-3000. Learning that Nintendo was developing a games-only console, the Famicom, Sega developed its first home video game system, the SG-1000, alongside the SC-3000. Rebranded versions of the SG-1000 were released in several other markets worldwide. The SG-1000 sold 160,000 units in 1983, which far exceeded Sega's projection of 50,000 in the first year but was outpaced by the Famicom. This was in part because Nintendo expanded its game library by courting third-party developers, whereas Sega was hesitant to collaborate with companies with which it was competing in the arcades. In November 1983, Rosen announced his intention to step down as president of Sega Enterprises, Inc. on January 1, 1984. Jeffrey Rochlis was announced as the new president and CEO of Sega. Shortly after the launch of the SG-1000, and the death of company founder Charles Bluhdorn, Gulf and Western began to sell off its secondary businesses. Nakayama and Rosen arranged a management buyout of the Japanese subsidiary in 1984 with financial backing from Computer Service, a prominent Japanese software company. Sega's Japanese assets were purchased for $38 million by a group of investors led by Rosen and Nakayama. Isao Okawa, head of CSK, became chairman, while Nakayama was installed as CEO of Sega Enterprises, Ltd. In 1985, Sega began working on the Mark III, a redesigned SG-1000. For North America, Sega rebranded the Mark III as the Master System, with a futuristic design intended to appeal to Western tastes. The Mark III was released in Japan in October 1985. Despite featuring more powerful hardware than the Famicom in some ways, it was unsuccessful at launch. As Nintendo required third-party developers not to publish their Famicom games on other consoles, Sega developed its own games and obtained the rights to port games from other developers. 
To help market the console in North America, Sega planned to sell the Master System as a toy, similar to how Nintendo had done with the Nintendo Entertainment System. Sega partnered with Tonka, an American toy company, to make use of Tonka's expertise in the toy industry. Ineffective marketing by Tonka handicapped sales of the Master System. By early 1992, production had ceased in North America. The Master System sold between 1.5 million and 2 million units in the region. This was less market share in North America than both Nintendo and Atari, which controlled 80 percent and 12 percent of the market. The Master System was eventually a success in Europe, where its sales were comparable to the NES. As late as 1993, the Master System's active installed user base in Europe was 6.25 million units. The Master System has had continued success in Brazil. New versions continue to be released by Sega's partner in the region, Tectoy. By 2016, the Master System had sold 8 million units in Brazil. During 1984, Sega opened its European division of arcade distribution, Sega Europe. It re-entered the North American arcade market in 1985 with the establishment of Sega Enterprises USA at the end of a deal with Bally. The release of Hang-On in 1985 would prove successful in the region, becoming so popular that Sega struggled to keep up with demand for the game. UFO Catcher was introduced in 1985 and as of 2005 was Japan's most commonly installed claw machine. In 1986, Sega of America was established to manage the company's consumer products in North America, beginning with marketing the Master System. During Sega's partnership with Tonka, Sega of America relinquished marketing and distribution of the console and focused on customer support and some localization of games. Out Run, released in 1986, became Sega's best selling arcade cabinet of the 1980s. Former Sega director Akira Nagai said Hang-On and Out Run helped to pull the arcade game market out of the 1982 downturn and created new genres of video games. With the arcade game market once again growing, Sega was one of the most recognized game brands at the end of the 1980s. In the arcades, the company focused on releasing games to appeal to diverse tastes, including racing games and side-scrollers. Sega released the Master System's successor, the Mega Drive, in Japan on October 29, 1988. The launch was overshadowed by Nintendo's release of Super Mario Bros. 3 a week earlier. Positive coverage from magazines Famitsu and Beep! helped establish a following, with the latter launching a new publication dedicated to the console, but Sega shipped only 400,000 units in the first year. The Mega Drive struggled to compete against the Famicom and lagged behind Nintendo's Super Famicom and the TurboGrafx-16, made by NEC, in Japanese sales throughout the 16-bit era. For the North American launch, where the console was renamed Genesis, Sega had no sales and marketing organization. After Atari declined an offer to market the console in the region, Sega launched it through its own Sega of America subsidiary. The Genesis was launched in New York City and Los Angeles on August 14, 1989, and in the rest of North America later that year. The European version of the Mega Drive was released in September 1990. Former Atari executive and new Sega of America president Michael Katz developed a two-part strategy to build sales in North America. 
The first part involved a marketing campaign to challenge Nintendo and emphasize the more arcade-like experience available on the Genesis, with slogans including "Genesis does what Nintendon't". Since Nintendo owned the console rights to most arcade games of the time, the second part involved creating a library of games which used the names and likenesses of celebrities, such as Michael Jackson's Moonwalker and Joe Montana Football. Nonetheless, Sega had difficulty overcoming Nintendo's ubiquity in homes. Sega of America sold only 500,000 Genesis units in its first year, half of Nakayama's goal. After the launch of the Genesis, Sega sought a new flagship line of releases to compete with Nintendo's Mario series. Its new character, Sonic the Hedgehog, went on to feature in one of the best-selling video game franchises in history. Sonic the Hedgehog began with a tech demo created by Yuji Naka involving a fast-moving character rolling in a ball through a winding tube; this was fleshed out with Naoto Ohshima's character design and levels conceived by designer Hirokazu Yasuhara. Sonic's color was chosen to match Sega's cobalt blue logo; his shoes were inspired by Michael Jackson's boots, and his personality by Bill Clinton's "can-do" attitude. Nakayama hired Tom Kalinske as CEO of Sega of America in mid-1990, and Katz departed soon after. Kalinske knew little about the video game market, but surrounded himself with industry-savvy advisors. A believer in the razor-and-blades business model, he developed a four-point plan: cut the price of the Genesis, create a US team to develop games targeted at the American market, expand the aggressive advertising campaigns, and replace the bundled game Altered Beast with Sonic the Hedgehog. The Japanese board of directors disapproved, but it was approved by Nakayama, who told Kalinske, "I hired you to make the decisions for Europe and the Americas, so go ahead and do it." In large part due to the popularity of Sonic the Hedgehog, the Genesis outsold its main competitor, the Super Nintendo Entertainment System (SNES), in the United States nearly two to one during the 1991 holiday season. By January 1992, Sega controlled 65 percent of the 16-bit console market. Sega outsold Nintendo for four consecutive Christmas seasons due to the Genesis' head start, lower price, and a larger library compared to the SNES at release. Nintendo's dollar share of the US 16-bit market dropped from 60% at the end of 1992 to 37% at the end of 1993, Sega claimed 55% of all 16-bit hardware sales during 1994, and the SNES outsold the Genesis from 1995 through 1997. In 1990, Sega launched the Game Gear, a handheld console, to compete against Nintendo's Game Boy. The Game Gear was designed as a portable version of the Master System and featured a full-color screen, in contrast to the monochrome Game Boy screen. Due to its short battery life, lack of original games, and weak support from Sega, the Game Gear did not surpass the Game Boy, having sold approximately 11 million units. Sega launched the Mega-CD in Japan on December 1, 1991, initially retailing at JP¥49,800. The add-on uses CD-ROM technology. Further features include a second, faster processor, vastly expanded system memory, a graphics chip that performed scaling and rotation similar to the company's arcade games, and another sound chip. In North America, it was renamed the Sega CD and launched on October 15, 1992, with a retail price of US$299. It was released in Europe as the Mega-CD in 1993. 
The Mega-CD sold only 100,000 units during its first year in Japan, falling well below expectations. Throughout the early 1990s, Sega largely continued its success in arcades around the world. In 1992 and 1993, the new Sega Model 1 arcade system board showcased in-house development studio Sega AM2's Virtua Racing and Virtua Fighter (the first 3D fighting game), which, though expensive, played a crucial role in popularizing 3D polygonal graphics.[g] In addition, complex simulator equipment like the rotational R360 kept Sega competing with machines by rival arcade companies, including Taito. New official region-specific distributors and manufacturers, including the UK's Deith Leisure, allowed Sega to sell its machines outside of Japan with ease. Sega's domestic operations division also opened hundreds of family-oriented suburban Sega World amusement arcades in Japan during this period, as well as large over-18s "GiGO" facilities in the high-profile urban areas of Roppongi and Ikebukuro. In 1993, this success was mirrored in overseas territories with the openings of several large branded entertainment centers, such as Sega VirtuaLand in Luxor Las Vegas. In 1994, Sega generated revenue of ¥354.032 billion, or about $3.464 billion (equivalent to $7.525 billion in 2025).

In 1993, the American media began to focus on the mature content of certain video games, such as Night Trap for the Sega CD and the Genesis version of Midway's Mortal Kombat. This came at a time when Sega was capitalizing on its image as an "edgy" company with "attitude", and the controversy reinforced that image. To handle this, Sega instituted the United States' first video game ratings system, the Videogame Rating Council (VRC), for all its systems. Ratings ranged from the family-friendly GA rating to the more mature rating of MA-13, and the adults-only rating of MA-17. Executive vice president of Nintendo of America Howard Lincoln was quick to point out in the United States congressional hearings in 1993 that Night Trap was not rated at all. Senator Joe Lieberman called for another hearing in February 1994 to check progress toward a rating system for video game violence. After the hearings, Sega proposed the universal adoption of the VRC; after objections by Nintendo and others, Sega took a role in forming the Entertainment Software Rating Board.

Sega began work on the Genesis' successor, the Sega Saturn, more than two years before showcasing it at the Tokyo Toy Show in June 1994. According to former Sega of America producer Scot Bayless, Nakayama became concerned about the 1994 release of the Atari Jaguar and the fact that the Saturn would not be available until the next year. As a result, Nakayama decided to bring a second console to market by the end of 1994. Sega began to develop the 32X, a Genesis add-on which would serve as a less expensive entry into the 32-bit era. The 32X would not be compatible with the Saturn, but would play Genesis games. Sega released the 32X on November 21, 1994, in North America, December 3, 1994, in Japan, and January 1995 in PAL territories; it was sold at less than half of the Saturn's launch price. After the holiday season, interest in the 32X rapidly declined. Sega released the Saturn in Japan on November 22, 1994. Virtua Fighter, a port of the popular arcade game, sold at a nearly one-to-one ratio with the Saturn at launch and was crucial to the system's early success in Japan.
Sega's initial shipment of 200,000 Saturn units sold out on the first day, and it was more popular than the PlayStation, made by Sony, in Japan. In March 1995, Sega of America CEO Tom Kalinske announced that the Saturn would be released in the US on Saturday, September 2, 1995, advertised as "Saturn-day". Sega executives in Japan mandated an early launch to give the Saturn an advantage over the PlayStation. At the first Electronic Entertainment Expo (E3) in Los Angeles on May 11, 1995, Kalinske revealed the release price and that Sega had shipped 30,000 Saturns to Toys "R" Us, Babbage's, Electronics Boutique, and Software Etc. for immediate release. A by-product of the surprise launch was the provocation of retailers not included in Sega's rollout; KB Toys in particular decided to no longer stock its products in response. The Saturn's release in Europe also came before the previously announced North American date, on July 8, 1995. Within two days of the PlayStation's American launch on September 9, 1995, the PlayStation sold more units than the Saturn. Within its first year, the PlayStation secured over twenty percent of the US video game market. The console's high price point, surprise launch, and difficulty handling polygonal graphics were factors in its lack of success. Sega also underestimated the continued popularity of the Genesis; 16-bit sales accounted for 64 percent of the market in 1995. Despite capturing 43 percent of the US market dollar share and selling more than 2 million Genesis units in 1995, Kalinske estimated that, if prepared for demand, another 300,000 could have been sold. Sega announced that Shoichiro Irimajiri had been appointed chairman and CEO of Sega of America in July 1996, while Kalinske left Sega after September 30 of that year. A former Honda executive, Irimajiri had been involved with Sega of America since joining Sega in 1993. The company also announced that Rosen and Nakayama had resigned from their positions at Sega of America, though both remained with Sega. Bernie Stolar, a former executive at Sony Computer Entertainment of America, became Sega of America's executive vice president in charge of product development and third-party relations. Stolar was not supportive of the Saturn, believing its hardware was poorly designed. While Stolar had said "the Saturn is not our future" at E3 1997, he continued to emphasize the quality of its games, and later reflected that "we tried to wind it down as cleanly as we could for the consumer." At Sony, Stolar had opposed the localization of certain Japanese PlayStation games that he felt would not represent the system well in North America. He advocated a similar policy for the Saturn, generally blocking 2D arcade games and role-playing games from release, although he later sought to distance himself from this stance. Other changes included a softer image in Sega's advertising, including removing the "Sega!" scream, and holding press events for the education industry. Sega partnered with GE to develop the Sega Model 2 arcade system board, building on 3D technology in the arcade industry at the time. This led to several successful arcade games, including Daytona USA, launched in a limited capacity in late 1993 and worldwide in 1994. Other popular games included Virtua Cop, Sega Rally Championship, and Virtua Fighter 2. Virtua Fighter and Virtua Fighter 2 became Sega's best-selling arcade games of all time, surpassing their previous record holder Out Run. 
There was also a technological arms race between Sega and Namco during this period, driving the growth of 3D gaming. Beginning in 1994, Sega launched a series of indoor theme parks in Japan under a concept dubbed "Amusement Theme Park", including Joypolis parks sited in urban locations such as Yokohama and Odaiba. A rapid overseas rollout was planned, with at least 100 locations across the world proposed to be opened by 2000; however, only two, Sega World London and Sega World Sydney, ultimately materialized, in September 1996 and March 1997, respectively. Following difficulties in setting up theme parks in the United States, Sega established the GameWorks chain of urban entertainment centers in a joint venture with DreamWorks SKG and Universal Studios in March 1997.

In 1995, Sega partnered with Atlus to launch Print Club (purikura), an arcade photo sticker machine that produces selfie photos. Atlus and Sega introduced Purikura in February 1995, initially at game arcades, before expanding to other popular culture locations such as fast food shops, train stations, karaoke establishments and bowling alleys. Purikura became a popular form of entertainment among youths across East Asia, laying the foundations for modern selfie culture. By 1997, about 47,000 Purikura machines had been sold, earning Sega an estimated ¥25 billion (£173 million, or about $283 million; equivalent to $568 million in 2025) from Purikura sales that year. Similar machines from other manufacturers followed, with Sega controlling about half of the market in 1997. Sega also made forays into the PC market with the 1995 establishment of SegaSoft, which was tasked with creating original Saturn and PC games. From 1994 to 1999, Sega also participated in the arcade pinball market when it took over Data East's pinball division, renaming it Sega Pinball.

In January 1997, Sega announced its intention to merge with the Japanese toy maker Bandai. The merger, planned as a $1 billion stock swap whereby Sega would wholly acquire Bandai, was set to form a company known as Sega Bandai, Ltd. Though it was to be finalized in October of that year, it was called off in May after growing opposition from Bandai's mid-level executives. Bandai instead agreed to a business alliance with Sega. As a result of Sega's deteriorating financial situation, Nakayama resigned as Sega president in January 1998 in favor of Irimajiri. Nakayama's resignation may have in part been due to the failure of the merger, as well as Sega's 1997 performance. Stolar became CEO and president of Sega of America.

After the launch of the Nintendo 64 in the US during 1996, sales of the Saturn and its games fell sharply in much of the west. The PlayStation outsold the Saturn three-to-one in the US in 1997, and the latter failed to gain a foothold in Europe and Australia, where the Nintendo 64 was not released until March 1997. After several years of declining profits, Sega posted a slight increase in profit in the fiscal year ending March 1997, partly driven by increasing arcade revenue, while outperforming Nintendo during the mid-term period. However, in the fiscal year ending March 1998, Sega suffered its first financial loss since its 1988 listing on the Tokyo Stock Exchange, both as a parent company and as a corporation as a whole. In the company's 1998 year-end report, Irimajiri placed the blame for these losses on the failure to transition from the Genesis to the Saturn in North America and on Sega Enterprises covering the debts of Sega of America.
Shortly before the announcement of the losses, Sega discontinued the Saturn in North America to prepare for the launch of its successor, the Dreamcast, releasing remaining games in low quantities. The decision to discontinue the Saturn effectively left the North American home console market without Sega games for over a year, with most of its activity in the country coming from arcade divisions. The Saturn lasted longer in some Europe territories and particularly Japan, with it notably outperforming the Nintendo 64 in the latter. Nonetheless, Irimajiri confirmed in an interview with Japanese newspaper Daily Yomiuri that Saturn development would stop at the end of 1998 and games would continue to be produced until mid-1999. With lifetime sales of 9.26 million units, the Saturn is retrospectively considered a commercial failure in much of the world. While Sega had success with the Model 3 arcade board and titles like Virtua Fighter 3, Sega's arcade divisions struggled in the West during the late 1990s. On the other hand, Sega's arcade divisions were more successful in Asia, with Sega's overall arcade revenues increasing year-on-year throughout the late 1990s, but it was not enough to offset the significant declining revenues of Sega's home consumer divisions. Despite a 75 percent drop in half-year profits just before the Japanese launch of the Dreamcast, Sega felt confident about its new system. The Dreamcast attracted significant interest and drew many pre-orders. Sega announced that Sonic Adventure, the first major 3D Sonic the Hedgehog game, would be a Dreamcast launch game. It was promoted with a large-scale public demonstration at the Tokyo Kokusai Forum Hall. Due to a high failure rate in the manufacturing process, Sega could not ship enough consoles for the Dreamcast's Japanese launch. As more than half of its limited stock had been pre-ordered, Sega stopped pre-orders in Japan. Before the launch, Sega announced the release of its New Arcade Operation Machine Idea (NAOMI) arcade system board, which served as a cheaper alternative to the Sega Model 3. NAOMI shared technology with the Dreamcast, allowing nearly identical ports of arcade games. The Dreamcast launched in Japan on November 27, 1998. The entire stock of 150,000 consoles sold out by the end of the day. Irimajiri estimated that another 200,000 to 300,000 Dreamcast units could have been sold with sufficient supply. He hoped to sell more than a million Dreamcast units in Japan by February 1999, but less than 900,000 were sold. The low sales undermined Sega's attempts to build up a sufficient installed base to ensure the Dreamcast's survival after the arrival of competition from other manufacturers. Sega suffered a further ¥42.881 billion consolidated net loss in the fiscal year ending March 1999, and announced plans to eliminate 1,000 jobs, nearly a quarter of its workforce. Before the Western launch, Sega reduced the price of the Dreamcast in Japan by JP¥9,100, effectively making it unprofitable but increasing sales. On August 11, 1999, Sega of America confirmed that Stolar had been fired. Peter Moore, whom Stolar had hired as a Sega of America executive only six months before, was placed in charge of the North American launch.[h] The Dreamcast launched in North America on September 9, 1999, with 18 games. Sega set a record by selling more than 225,132 Dreamcast units in 24 hours, earning $98.4 million in what Moore called "the biggest 24 hours in entertainment retail history". 
Within two weeks, US Dreamcast sales exceeded 500,000. By Christmas, Sega held 31 percent of the US video game market by revenue. On November 4, Sega announced it had sold more than a million Dreamcast units. Nevertheless, the launch was marred by a glitch at one of Sega's manufacturing plants, which produced defective GD-ROMs where data was not properly recorded onto the disc. Sega released the Dreamcast in Europe on October 14, 1999. While Sega sold 500,000 units in Europe by Christmas 1999, sales there slowed, and by October 2000 Sega had sold only about a million units. Though the Dreamcast was successful, Sony's PlayStation still held 60 percent of the overall market share in North America at the end of 1999. On March 2, 1999, in what one report called a "highly publicized, vaporware-like announcement", Sony revealed the first details of the PlayStation 2. The same year, Nintendo announced that its next console would meet or exceed anything on the market, and Microsoft began development of its own console, the Xbox. Sega's initial momentum proved fleeting as US Dreamcast sales—which exceeded 1.5 million by the end of 1999—began to decline as early as January 2000. Poor Japanese sales contributed to Sega's ¥42.88 billion ($404 million) consolidated net loss in the fiscal year ending March 2000. This followed a similar loss of ¥42.881 billion the previous year and marked Sega's third consecutive annual loss. Sega's overall sales for the term increased 27.4 percent, and Dreamcast sales in North America and Europe greatly exceeded its expectations. However, this coincided with a decrease in profitability due to the investments required to launch the Dreamcast in Western markets and poor software sales in Japan. At the same time, worsening conditions reduced the profitability of Sega's Japanese arcade business, prompting the closure of 246 locations. Moore became the president and chief operating officer of Sega of America on May 8, 2000. He said the Dreamcast would need to sell 5 million units in the US by the end of 2000 to remain viable, but Sega fell short of this goal with some 3 million units sold. Moreover, Sega's attempts to spur Dreamcast sales through lower prices and cash rebates caused escalating financial losses. In March 2001, Sega posted a consolidated net loss of ¥51.7 billion ($417.5 million). While the PlayStation 2's October 26 US launch was marred by shortages, this did not benefit the Dreamcast as much as expected, as many disappointed consumers continued to wait or purchased a PSone. Eventually, Sony and Nintendo held 50 and 35 percent of the US video game market, while Sega held only 15 percent. CSK chairman Isao Okawa replaced Irimajiri as president of Sega on May 22, 2000. Okawa had long advocated that Sega abandon the console business. Others shared this view; Sega co-founder David Rosen had "always felt it was a bit of a folly for them to be limiting their potential to Sega hardware", and Stolar had suggested Sega should have sold the company to Microsoft. In a September 2000 meeting with Sega's Japanese executives and heads of its first-party game studios, Moore and Sega of America executive Charles Bellfield recommended that Sega abandon its console business. In response, the studio heads walked out. Sega announced an official company name change from Sega Enterprises, Ltd. to Sega Corporation effective November 1, 2000, officially dropping the Sega Enterprises name used in Japan as well as transitioning to the Sega name used globally. 
Sega stated in a release that this was to display its commitment to its "network entertainment business". On January 23, 2001, Japanese newspaper Nihon Keizai Shinbun reported that Sega would cease production of the Dreamcast and develop software for other platforms. After an initial denial, Sega released a press release confirming it was considering producing software for the PlayStation 2 and Game Boy Advance as part of its "new management policy". On January 31, 2001, Sega announced the discontinuation of the Dreamcast after March 31 and the restructuring of the company as a "platform-agnostic" third-party developer. Sega also announced a Dreamcast price reduction to eliminate its unsold inventory, estimated at 930,000 units as of April 2001. This was followed by further reductions to clear the remaining inventory. The final manufactured Dreamcast was autographed by the heads of all nine of Sega's first-party game studios, plus the heads of sports game developer Visual Concepts and audio studio Wave Master, and given away with all 55 first-party Dreamcast games through a competition organized by GamePro. Okawa, who had loaned Sega $500 million in 1999, died on March 16, 2001. Shortly before his death, he forgave Sega's debts to him and returned his $695 million worth of Sega and CSK stock, helping the company survive the third-party transition. He held failed talks with Microsoft about a sale or merger with their Xbox division. According to former Microsoft executive Joachim Kempin, Microsoft founder, Bill Gates, decided against acquiring Sega because "he didn't think that Sega had enough muscle to eventually stop Sony". A business alliance with Microsoft was announced whereby Sega would develop 11 games for the Xbox. As part of the restructuring, nearly one third of Sega's Tokyo workforce was laid off in 2001. 2002 was Sega's fifth consecutive fiscal year of net losses. After Okawa's death, Hideki Sato, a 30-year Sega veteran who had worked on Sega's consoles, became the company president. Following poor sales in 2002, Sega cut its profit forecast for 2003 by 90 percent, and explored opportunities for mergers. In 2003, Sega began talks with Sammy Corporation–a pachinko and pachislot manufacturing company–and Namco. The president of Sammy, Hajime Satomi, had been mentored by Okawa and was previously asked to be CEO of Sega. On February 13, Sega announced that it would merge with Sammy; however, as late as April 17, Sega was still in talks with Namco, which was attempting to overturn the merger. Sega's consideration of Namco's offer upset Sammy executives. The day after Sega announced it no longer planned to merge with Sammy, Namco withdrew its offer. In 2003, Sato and COO Tetsu Kamaya stepped down. Sato was replaced by Hisao Oguchi, the head of the Sega studio Hitmaker. Moore left Sega in January 2003, feeling that the Japanese executives were refusing to adapt to industry changes, such as the demand for mature games such as Grand Theft Auto III. Hideaki Irie, who had worked at Agetec and ASCII, became the new president and COO of Sega of America in October 2003. In August 2003, Sammy bought 22.4 percent of Sega's shares from CSK, making Sammy Sega's largest shareholder. In the same year, Hajime Satomi said Sega's activity would focus on its profitable arcade business as opposed to loss-incurring home software development. 
Successful console games during this period include entries in the Sonic the Hedgehog, Puyo Puyo, Virtua Fighter, Super Monkey Ball, Phantasy Star Online and Sakura Wars franchises. In 2004, Sega Sammy Holdings, an entertainment conglomerate, was created; Sega and Sammy became subsidiaries of the new holding company, with both companies operating independently while their executive departments merged. According to the first Sega Sammy Annual Report, the merger went ahead as both companies were facing difficulties. Satomi said Sega had been operating at a loss for nearly ten years, while Sammy feared stagnation and over-reliance on its highly profitable pachislot and pachinko machine business and wanted to diversify. Sammy acquired the remaining shares of Sega, completing the takeover. The stock swap deal valued Sega between $1.45 billion and $1.8 billion. Sega Sammy Holdings was structured into four parts: Consumer Business (video games), Amusement Machine Business (arcade games), Amusement Center Business (Sega's theme parks and arcades) and Pachislot and Pachinko Business (Sammy's pachinko and pachislot business).

According to an industry survey, as of 2005, sales of arcade machines had been up for the previous four years in Japan, while down for nine straight years overseas. In response to the decline of the global arcade industry in the late 1990s, Sega created several novel concepts tailored to the Japanese market. Derby Owners Club was an arcade machine with memory cards for data storage. Testing of Derby Owners Club in a Chicago arcade showed that it had become the most popular machine at the location, with a 92% replay rate. However, the cabinet was too expensive, and the game did not entice the casual users who are essential to the western arcade market. While the Japanese market retained core players, western arcades had become more focused on casual players, and Sega Amusements Europe, the entity created to officially distribute and manufacture Sega's machines on the continent after the consolidation of its regional divisions, subsequently decided to develop more games locally that were better suited to western tastes. In 2005, the GameWorks chain of arcades, whose ownership had previously been shared with Vivendi Universal, came under the sole ownership of Sega and remained so until 2011. In 2009, Sega Republic, an indoor theme park, opened in Dubai. Sega gradually reduced its arcades from 450 in 2005 to around 200 in 2015. Arcade machine sales generated higher profits than the company's console, mobile and PC games on a year-to-year basis until the fiscal year of 2014.

In order to drive growth in western markets, Sega announced new leadership for Sega of America and Sega Europe in 2005. Simon Jeffery became president and COO of Sega of America, and Mike Hayes president and COO of Sega Europe. In 2009, after Jeffery left, Hayes became president of the combined Sega of America and Sega Europe. Sega sold Visual Concepts to Take-Two Interactive and purchased UK-based developer Creative Assembly, known for its Total War series. In the same year, Sega Racing Studio was also formed by former Codemasters employees. In 2006, Sega Europe purchased Sports Interactive, known for its Football Manager series. Sega found success in the Japanese market with the Yakuza, Phantasy Star and Hatsune Miku: Project DIVA series.[i] Sega began providing the 3D imaging for Hatsune Miku holographic concerts in 2010.
Sega also distributes games from smaller Japanese game developers and sells localizations of Western games in Japan. In 2013, Index Corporation was purchased by Sega Sammy after going bankrupt. The year before, Sega had signed a deal to distribute Atlus titles in Japan. After the buyout, Sega implemented a corporate spin-off with Index, and the latter's game assets were rebranded as Atlus, a wholly owned subsidiary of Sega. Atlus is known for the Persona and Megami Tensei series. The Sonic the Hedgehog games had grossed over $5 billion in sales by 2014.

In the mobile market, Sega released its first app on the iTunes Store with a version of Super Monkey Ball for iOS in 2008. Due in part to the decline of packaged game sales worldwide in the 2010s, Sega began layoffs and closed five offices based in Europe and Australia on July 1, 2012, in order to focus on the digital game market, including personal computers and mobile devices. In 2012, Sega also began acquiring studios for mobile development; studios such as Hardlight, Three Rings Design, and Demiurge Studios became fully owned subsidiaries. In May 2015, 19 older mobile games were pulled due to quality concerns.

To streamline operations, Sega established operational firms for each of its businesses in the 2010s. In 2012, Sega established Sega Networks as a subsidiary company for its mobile games. The same year, Sega Entertainment was established for Sega's amusement facility business. In January 2015, Sega of America announced its relocation from San Francisco to Atlus USA's headquarters in Irvine, California, which was completed later that year. From 2005 to 2015, Sega's operating income generally improved compared with its earlier financial troubles, but the company was not profitable every year, reporting overall losses in 2008, 2009 and 2012.

In April 2015, Sega Corporation was reorganized into Sega Group, one of the three groups of Sega Sammy Holdings. Sega Holdings Co., Ltd. was established, with four business sectors under its control. Haruki Satomi, son of Hajime Satomi, took office as president and CEO of the company in April 2015. As a result, Sega Corporation rebranded itself as Sega Games Co., Ltd.[j] and continued to manage home video games, while Sega Interactive Co., Ltd.[k] was founded to take control of the arcade division. Sega Networks merged with Sega Games in 2015. At the Tokyo Game Show in September 2016, Sega announced that it had acquired the intellectual property and development rights to all games developed and published by Technosoft. Effective January 2017, China Animations Character Co. acquired 85.1% of the shares in Sega's theme park business, and the former Sega Live Creation was renamed CA Sega Joypolis. Sega Sammy Holdings announced in April 2017 that it would relocate its head office functions and domestic subsidiaries located in the Tokyo metropolitan area to Shinagawa by January 2018. This was to consolidate scattered head office functions, including Sega Sammy Holdings, Sammy Corporation, Sega Holdings, Sega Games, Atlus, Sammy Networks, and Dartslive. Sega's previous headquarters in Ōta was sold in 2019. In June 2017, Chris Bergstresser replaced Jurgen Post as president and COO of Sega Europe. In June 2018, Gary Dale, formerly of Rockstar Games and Take-Two Interactive, replaced Chris Bergstresser as president and COO of Sega Europe. In August 2018, Ian Curran, a former executive at THQ and Acclaim Entertainment, replaced John Cheng as president and COO of Sega of America.
In October 2018, Sega reported favorable western sales results from games such as Yakuza 6 and Persona 5, due to the localization work of Atlus USA. Despite a 35-percent increase in console game sales and success in its PC game business, profits fell 70 percent for the 2018 fiscal year in comparison to the previous year, mainly due to its digital games business, which includes mobile games as well as Phantasy Star Online 2. In response, Sega announced that its digital games would focus on releases based on its existing intellectual property, and that it would pursue growth areas such as packaged games in the overseas market. Sega blamed the loss on market miscalculations and having too many games under development. Projects in development at Sega included a new game in the Yakuza series, the Sonic the Hedgehog film, and the Sega Genesis Mini, which was released in September 2019. In May 2019, Sega acquired Two Point Studios, known for Two Point Hospital.

On April 1, 2020, Sega Interactive merged with Sega Games. The company was again renamed Sega Corporation, while Sega Holdings Co., Ltd. was renamed Sega Group Corporation. According to a company statement, the move was made to allow greater research and development flexibility. Also in April 2020, Sega sold Demiurge Studios to Demiurge co-founder Albert Reed. Demiurge said it would continue to support the mobile games it developed under Sega. As part of its 60th anniversary, Sega announced the Game Gear Micro microconsole for release on October 6, 2020, in Japan. Other anniversary projects included the Astro City Mini and other merchandise. Sega also announced its Fog Gaming platform, which uses the idle processing power of arcade machines in Japanese arcades overnight to help power cloud gaming applications.

Sega made a number of restructuring moves in the early 2020s. During the latter half of 2020, many of the financial gains Sega made in the earlier part of the year dissolved due to the impact of the COVID-19 pandemic on its Sega Entertainment division, which ran its arcades. That November, Sega Sammy sold 85.1% of its shares in the division to Genda Inc., though the Sega branding and coin-operated machines continued to be used in arcades. Arcade game development was unaffected by the sale. By January 2022, Sega had sold the remaining portion of this division to Genda. Sega Amusement International was sold via a management buyout to Kaizen Entertainment; however, the Sega brand continues to be used for all games and the company name remains through a royalty agreement. Sega Group Corporation was formally dissolved by its parent company in 2021.

In contrast to the losses from its amusement operations in 2020, sales and critical reception of Sega's home console games improved; Metacritic named Sega the best publisher of the year in 2020. Of its 28 releases that year, 95% had "good" Metacritic scores (above 75/100), including two with "great" scores (above 90/100, for Persona 5 Royal and Yakuza 0), with an average Metacritic score of 81.6 for all 2020 Sega releases. Phantasy Star Online 2 was reported in 2021 to have made over $900 million since its release in 2012. In 2022, Sega announced "Super Game", an initiative covering several high-budget games expected to generate $672 million in lifetime sales, with about $200 million allocated to its budget across three years. In 2023, Sega acquired the Finnish video game developer Rovio Entertainment, best known for the Angry Birds series, for US$776 million.
Rovio was expected to help Sega continue building its mobile presence worldwide.

On April 24, 2023, 144 Sega of America employees announced plans to file for a union election under a new labor union, Allied Employees Guild Improving Sega (AEGIS), which is allied with the Communications Workers of America via CWA Local 9510. AEGIS represents workers from departments including marketing, quality assurance, development and localization, making it the first of its kind in the game industry in the United States. On July 10, 2023, it was announced that workers had voted 91–26 to form the union. AEGIS then underwent certification with the National Labor Relations Board before entering bargaining. In May 2023, Sega announced that 121 employees at Relic Entertainment had been made redundant so the studio could focus on core franchises. That same year, Sega cancelled its upcoming shooter Hyenas and began restructuring its British and European operations.

At the Game Awards 2023, Sega announced an initiative to revive many of its dormant franchises, beginning with new Crazy Taxi, Golden Axe, Jet Set Radio, Shinobi and Streets of Rage games. The Washington Post characterized the announcement as a return to Sega's 1990s "bohemian" and "countercultural" spirit. The co-CEO, Shuji Utsumi, said Sega wanted to "show edginess and a rebellious mindset", and that the industry was now large enough to sustain its less conventional games.

In November 2023, AEGIS filed an unfair labor practice charge after Sega proposed a plan to phase out temporary employees by February 2024, which would affect around 80 employees. In January 2024, Jurgen Post rejoined Sega Europe to become COO of its western studios and also serve as managing director. That month, Shuji Utsumi became the president, COO and CEO of Sega of America and Europe. Utsumi had previously helped found Sony Computer Entertainment, where he helped launch the original PlayStation, before moving to Sega and assisting with the North American Dreamcast launch. After a period with Disney Interactive, he co-founded Q Entertainment before returning to Sega in 2020. On January 9, 2024, Sega Sammy Holdings announced that Sega's amusement machine business would be demerged and transferred to Sega Toys, which would be renamed Sega Fave Corporation; the changes were to take effect by April. On February 29, Sega appointed Justin Scarpone as an executive vice president of a group to expand Sega's transmedia strategy. In January 2024, Sega of America announced that it would lay off 61 workers at its Irvine, California location. AEGIS had been negotiating with Sega of America since November to reduce the total redundancies. On March 27, 2024, AEGIS announced that its workers had ratified a contract with Sega of America, focusing on key issues. The following day, Sega laid off 240 workers from its British and European operations, including Sega Europe, Creative Assembly, and Hardlight, and sold Relic Entertainment to an external investor. On November 8, Sega sold Amplitude Studios to its staff via a management buyout. Sonic X Shadow Generations, Like a Dragon: Infinite Wealth and Persona 3 Reload each reached a million sales within a week, a record for their respective franchises. The Sonic the Hedgehog film franchise grossed over $1 billion.

Corporate structure

Since 2004, Sega has been a subsidiary of Sega Sammy Holdings. Sega's global headquarters are in Shinagawa, Tokyo, Japan.
Sega also has offices in Irvine, California (as Sega of America), in London (as Sega Europe), in Seoul, South Korea (as Sega Publishing Korea), and in Singapore, Hong Kong, Shanghai, and Taipei. In other regions, Sega has contracted distributors for its games and consoles, such as Tectoy in Brazil. Sega has had offices in France, Germany, Spain, and Australia; those markets have since contracted distributors. Relations between the regional offices have not always been smooth. Some conflict in the 1990s may have been caused by Sega president Nakayama and his admiration for Sega of America; according to Kalinske, "There were some guys in the executive suites who really didn't like that Nakayama in particular appeared to favor the US executives. A lot of the Japanese executives were maybe a little jealous, and I think some of that played into the decisions that were made." By contrast, author Steven L. Kent said Nakayama bullied American executives and that Nakayama believed the Japanese executives made the best decisions. Kent also said Sega of America CEOs Kalinske, Stolar, and Moore dreaded meeting with Sega of Japan executives. After the formation of Sega Group in 2015 and the founding of Sega Holdings, the former Sega Corporation was renamed Sega Games Co., Ltd. Under this structure, Sega Games was responsible for the home video game market and consumer development, while Sega Interactive Co., Ltd., comprised Sega's arcade game business. The two were consolidated in 2020, renamed as Sega Corporation, and Sega Group Corporation was formally absorbed into Sega Corporation in 2021. The company includes Sega Networks, which handles game development for smartphones. Sega Corporation develops and publishes games for major video game consoles and has not expressed interest in developing consoles again. According to former Sega Europe CEO Mike Brogan, "There is no future in selling hardware. In any market, through competition, the hardware eventually becomes a commodity ... If a company has to sell hardware then it should only be to leverage software, even if that means taking a hit on the hardware." Sega Fave Corporation, originally known as Yonezawa Toys and acquired by Sega in 1991, has created toys for children's franchises such as Oshare Majo: Love and Berry, Mushiking: King of the Beetles, Lilpri, Bakugan, Jewelpet, Rilu Rilu Fairilu, Dinosaur King, and Hero Bank. Products released in the West include the home planetarium Homestar and the robot dog iDog. The Homestar was released in 2005 and has been improved several times. Its newest model, Flux, was released in 2019. The series is developed by the Japanese inventor and entrepreneur Takayuki Ohira. As a recognized specialist for professional planetariums, he has received numerous innovation prizes and supplies large planetariums internationally with his company Megastar. Sega Toys also inherited the Sega Pico handheld system and produced Pico software. The company also develops and sells arcade games that were previously held under Sega until 2024. Since the late 1960s, Sega has been affiliated with operations of bowling alleys and arcades through its former Sega Entertainment Co., Ltd. subsidiary in Japan, as well as a number of other smaller regional subsidiaries in other countries. 
Initiatives to expand operations in other territories, such as the US, UK, France, Spain, and Taiwan, have been more short-lived. Following Genda Inc.'s acquisition of an 85.1% majority of Sega Entertainment's shares in November 2020, made to mitigate losses caused by the COVID-19 pandemic, Sega's arcades in Japan have since been run under Genda's Genda GiGO Entertainment division. Its DartsLive subsidiary creates electronic darts games, while Sega Logistics Service distributes and repairs arcade games. In 2015, Sega and Japanese advertising agency Hakuhodo formed a joint venture, Stories LLC, to create entertainment for film and TV. Stories LLC has exclusive licensing rights to adapt Sega properties into film and television, and has partnered with producers to develop series based on properties including Shinobi, Golden Axe, Virtua Fighter, The House of the Dead, and Crazy Taxi.

Sega releases games developed by its research and development teams. The Sonic the Hedgehog franchise, maintained through Sega's Sonic Team division, is one of the best-selling franchises in video games. Sega has also acquired third-party studios including Atlus, Play Heart, Creative Assembly, Hardlight, Sports Interactive, Two Point Studios, and Rovio Entertainment.

Sega's software research and development teams began with one development division operating under Sega's longtime head of R&D, Hisashi Suzuki. As the market for home video game consoles grew, Sega expanded with three Consumer Development (CS) divisions. After October 1983, arcade development expanded to three teams: Sega DD No. 1, 2, and 3. Some time after the release of Power Drift, Sega restructured its teams again as the Sega Amusement Machine Research and Development Teams, or AM teams. Each arcade division was segregated, and a rivalry existed between the arcade and consumer development divisions. In what has been called "a brief moment of remarkable creativity", in 2000, Sega restructured its arcade and console development teams into ten semi-autonomous studios headed by the company's top designers. The studios were United Game Artists, Smilebit, Hitmaker, Sega Rosso, WOW Entertainment, Overworks, Wave Master, Amusement Vision, Sega-AM2, and Sonic Team. Sega's design houses were encouraged to experiment and benefited from a relatively lax approval process. After taking over as company president in 2003, Hisao Oguchi announced his intention to consolidate Sega's studios. Prior to the acquisition by Sammy, Sega began the process of re-integrating its subsidiaries into the main company. Toshihiro Nagoshi, formerly the head of Amusement Vision, recalled this period as "in many ways a labour of love" from Sega, which taught its creatives the experience of managing a business. Sega still operates first-party studios as departments of its research and development division. Sonic Team exists as Sega's CS2 research and development department, while Sega's CS3 or Online department has developed games such as Phantasy Star Online 2, and Sega's AM2 department has more recently worked on projects such as the smartphone game Soul Reverse Zero. Toshihiro Nagoshi remained involved with research and development as Sega's chief creative officer or creative director while working on the Yakuza series until 2021. Other studios include Ignited Artists and Play Heart.

Legacy

Sega is one of the world's most prolific arcade game producers, having developed more than 500 games, 70 franchises, and 20 arcade system boards since 1981.
It has been recognized by Guinness World Records for this achievement. Of Sega's arcade division, Eurogamer's Martin Robinson said, "It's boisterous, broad and with a neat sense of showmanship running through its range. On top of that, it has something that's often evaded its console-dwelling cousin: success." Hideki Sato, who developed most of Sega's hardware has said a major failure of Sega has been not integrating their arcade and console divisions more for synergy. The Sega Genesis is often ranked among the best consoles in history. In 2014, USgamer's Jeremy Parish credited it for galvanizing the market by breaking Nintendo's near-monopoly, helping create modern sports game franchises, and popularizing television games in the UK. Kalinske felt Sega had innovated by developing games for an older demographic and pioneering the "street date" concept with the simultaneous North American and European release of Sonic the Hedgehog 2. Sega of America's marketing campaign for the Genesis influenced marketing for later consoles. Despite its commercial failure, the Saturn is well regarded for its library, though it has been criticized for a lack of high-profile franchise releases. Edge wrote that "hardened loyalists continue to reminisce about the console that brought forth games like Burning Rangers, Guardian Heroes, Dragon Force, and Panzer Dragoon Saga." Sega's management was criticized for its handling of the Saturn. According to Greg Sewart of 1Up.com, "the Saturn will go down in history as one of the most troubled, and greatest, systems of all time". The Dreamcast is remembered for being ahead of its time, with several concepts that became standard in consoles, such as motion controls and online functionality. Its demise has been connected with transitions in the video game industry. In 1001 Video Games You Must Play Before You Die, Duncan Harris wrote that the Dreamcast's end "signaled the demise of arcade gaming culture... Sega's console gave hope that things were not about to change for the worse and that the tenets of fast fun and bright, attractive graphics were not about to sink into a brown and green bog of realistic war games." Parish contrasted the Dreamcast's diverse library with the "suffocating sense of conservatism" that pervaded the industry in the following decade. In Eurogamer, Damien McFerran wrote that Sega's decisions in the late 1990s were "a tragic spectacle of overconfidence and woefully misguided business practice". Travis Fahs of IGN noted that since the Sammy takeover Sega had developed fewer games and outsourced to more western studios, and that its arcade operations had been significantly reduced. Nonetheless, he wrote: "Sega was one of the most active, creative, and productive developers the industry has ever known, and nothing that can happen to their name since will change that." In 2015, Sega president Haruki Satomi told Famitsu that, in the previous ten years, Sega had "betrayed" the trust of older fans and that he hoped to re-establish the Sega brand. During the promotion of the Sega Genesis Mini, Sega executive manager Hiroyuki Miyazaki reflected on Sega's history, saying, "I feel like Sega has never been the champion, at the top of all the video game companies, but I feel like a lot of people love Sega because of the underdog image." Former Sega management cited the absence of Dragon Quest and Final Fantasy games on their home consoles as a reason for the console division's struggles, especially in Japan. 
In his 2018 book The Sega Arcade Revolution, Horowitz connected Sega's decline in the arcades after 1995 with broader industry changes. He argued that its most serious problems came from the loss of its creative talent, particularly Yuji Naka and Yu Suzuki, after the Sammy takeover, but concluded that "as of this writing, Sega is in its best financial shape of the past two decades. The company has endured." See also Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Birthday#cite_note-21] | [TOKENS: 4101] |
Contents Birthday A birthday is the anniversary of the birth of a person or the figurative birth of an institution. Birthdays of people are celebrated in numerous cultures, often with birthday gifts, birthday cards, a birthday party, or a rite of passage. Many religions celebrate the birth of their founders or religious figures with special holidays (e.g. Christmas, Mawlid, Buddha's Birthday, Krishna Janmashtami, and Gurpurb). There is a distinction between birthday and birthdate (also known as date of birth): the former, except for February 29, occurs each year (e.g. January 15), while the latter is the complete date when a person was born (e.g. January 15, 2001). Coming of age In most legal systems, one becomes a legal adult on a particular birthday when they reach the age of majority (usually between 12 and 21), and reaching age-specific milestones confers particular rights and responsibilities. At certain ages, one may become eligible to leave full-time education, become subject to military conscription or to enlist in the military, to consent to sexual intercourse, to marry with parental consent, to marry without parental consent, to vote, to run for elected office, to legally purchase (or consume) alcohol and tobacco products, to purchase lottery tickets, or to obtain a driver's licence. The age of majority is when minors cease to legally be considered children and assume control over their persons, actions, and decisions, thereby terminating the legal control and responsibilities of their parents or guardians over and for them. Most countries set the age of majority at 18, though it varies by jurisdiction. Many cultures celebrate a coming of age birthday when a person reaches a particular year of life. Some cultures celebrate landmark birthdays in early life or old age. In many cultures and jurisdictions, if a person's real birthday is unknown (for example, if they are an orphan), their birthday may be adopted or assigned to a specific day of the year, such as January 1. Racehorses are reckoned to become one year old in the year following their birth on January 1 in the Northern Hemisphere and August 1 in the Southern Hemisphere.[relevant?] Birthday parties In certain parts of the world, an individual's birthday is celebrated by a party featuring a specially made cake. Presents are bestowed on the individual by the guests appropriate to their age. Other birthday activities may include entertainment (sometimes by a hired professional, i.e., a clown, magician, or musician) and a special toast or speech by the birthday celebrant. The last stanza of Patty Hill's and Mildred Hill's famous song, "Good Morning to You" (unofficially titled "Happy Birthday to You") is typically sung by the guests at some point in the proceedings. In some countries, a piñata takes the place of a cake. The birthday cake may be decorated with lettering and the person's age, or studded with the same number of lit candles as the age of the individual. The celebrated individual may make a silent wish and attempt to blow out the candles in one breath; if successful, superstition holds that the wish will be granted. In many cultures, the wish must be kept secret or it will not "come true". Birthdays as holidays Historically significant people's birthdays, such as national heroes or founders, are often commemorated by an official holiday marking the anniversary of their birth. 
Some notables, particularly monarchs, have an official birthday on a fixed day of the year, which may not necessarily match the day of their birth, but on which celebrations are held. In Mahayana Buddhism, many monasteries celebrate the anniversary of Buddha's birth, usually in a highly formal, ritualized manner. They treat Buddha's statue as if it was Buddha himself as if he were alive; bathing, and "feeding" him. Jesus Christ's traditional birthday is celebrated as Christmas Eve or Christmas Day around the world, on December 24 or 25, respectively. As some Eastern churches use the Julian calendar, December 25 will fall on January 7 in the Gregorian calendar. These dates are traditional and have no connection with Jesus's actual birthday, which is not recorded in the Gospels. Similarly, the birthdays of the Virgin Mary and John the Baptist are liturgically celebrated on September 8 and June 24, especially in the Roman Catholic and Eastern Orthodox traditions (although for those Eastern Orthodox churches using the Julian calendar the corresponding Gregorian dates are September 21 and July 7 respectively). As with Christmas, the dates of these celebrations are traditional and probably have no connection with the actual birthdays of these individuals. Catholic saints are remembered by a liturgical feast on the anniversary of their "birth" into heaven a.k.a. their day of death. In Hinduism, Ganesh Chaturthi is a festival celebrating the birth of the elephant-headed deity Ganesha in extensive community celebrations and at home. Figurines of Ganesha are made for the holiday and are widely sold. Sikhs celebrate the anniversary of the birth of Guru Nanak and other Sikh gurus, which is known as Gurpurb. Mawlid is the anniversary of the birth of Muhammad and is celebrated on the 12th or 17th day of Rabi' al-awwal by adherents of Sunni and Shia Islam respectively. These are the two most commonly accepted dates of birth of Muhammad. However, there is much controversy regarding the permissibility of celebrating Mawlid, as some Muslims judge the custom as an unacceptable practice according to Islamic tradition. In Iran, Mother's Day is celebrated on the birthday of Fatima al-Zahra, the daughter of Muhammad. Banners reading Ya Fatima ("O Fatima") are displayed on government buildings, private buildings, public streets and car windows. Religious views In Judaism, rabbis are divided about celebrating this custom, although the majority of the faithful accept it. In the Torah, the only mention of a birthday is the celebration of Pharaoh's birthday in Egypt (Genesis 40:20). Although the birthday of Jesus of Nazareth is celebrated as a Christian holiday on December 25, historically the celebrating of an individual person's birthday has been subject to theological debate. Early Christians, notes The World Book Encyclopedia, "considered the celebration of anyone's birth to be a pagan custom." Origen, in his commentary "On Levites," wrote that Christians should not only refrain from celebrating their birthdays but should look at them with disgust as a pagan custom. A saint's day was typically celebrated on the anniversary of their martyrdom or death, considered the occasion of or preparation for their entrance into Heaven or the New Jerusalem. 
Ordinary folk in the Middle Ages celebrated their saint's day (the saint they were named after), but nobility celebrated the anniversary of their birth.[citation needed] The "Squire's Tale", one of Chaucer's Canterbury Tales, opens as King Cambuskan proclaims a feast to celebrate his birthday. In the Modern era, the Catholic Church, the Eastern Orthodox Church and Protestantism, i.e. the three main branches of Christianity, as well as almost all Christian religious denominations, consider celebrating birthdays acceptable or at most a choice of the individual. An exception is Jehovah's Witnesses, who do not celebrate them for various reasons: in their interpretation this feast has pagan origins, was not celebrated by early Christians, is negatively expounded in the Holy Scriptures and has customs linked to superstition and magic. In some historically Roman Catholic and Eastern Orthodox countries,[a] it is common to have a 'name day', otherwise known as a 'Saint's day'. It is celebrated in much the same way as a birthday, but it is held on the official day of a saint with the same Christian name as the birthday person; the difference being that one may look up a person's name day in a calendar, or easily remember common name days (for example, John or Mary); however in pious traditions, the two were often made to concur by giving a newborn the name of a saint celebrated on its day of confirmation, more seldom one's birthday. Some are given the name of the religious feast of their christening's day or birthday, for example, Noel or Pascal (French for Christmas and "of Easter"); as another example, Togliatti was given Palmiro as his first name because he was born on Palm Sunday. The birthday does not reflect Islamic tradition, and because of this, the majority of Muslims refrain from celebrating it. Others do not object, as long as it is not accompanied by behavior contrary to Islamic tradition. A good portion of Muslims (and Arab Christians) who have emigrated to the United States and Europe celebrate birthdays as customary, especially for children, while others abstain. Hindus celebrate the birth anniversary day every year when the day that corresponds to the lunar month or solar month (Sun Signs Nirayana System – Sourava Mana Masa) of birth and has the same asterism (Star/Nakshatra) as that of the date of birth. That age is reckoned whenever Janma Nakshatra of the same month passes. Hindus regard death to be more auspicious than birth, since the person is liberated from the bondages of material society. Also, traditionally, rituals and prayers for the departed are observed on the 5th and 11th days, with many relatives gathering. Historical and cultural perspectives According to Herodotus (5th century BC), of all the days in the year, the one which the Persians celebrate most is their birthday. It was customary to have the board furnished on that day with an ampler supply than common: the richer people eat wholly baked cow, horse, camel, or donkey (Greek: ὄνον), while the poorer classes use instead the smaller kinds of cattle. On his birthday, the king anointed his head and presented gifts to the Persians. According to the law of the Royal Supper, on that day "no one should be refused a request". The rule for drinking was "No restrictions". In ancient Rome, a birthday (dies natalis) was originally an act of religious cultivation (cultus). 
A dies natalis was celebrated annually for a temple on the day of its founding, and the term is still used sometimes for the anniversary of an institution such as a university. The temple founding day might become the "birthday" of the deity housed there. March 1, for example, was celebrated as the birthday of the god Mars. Each human likewise had a natal divinity, the guardian spirit called the Genius, or sometimes the Juno for a woman, who was owed religious devotion on the day of birth, usually in the household shrine (lararium). The decoration of a lararium often shows the Genius in the role of the person carrying out the rites. A person marked their birthday with ritual acts that might include lighting an altar, saying prayers, making vows (vota), anointing and wreathing a statue of the Genius, or sacrificing to a patron deity. Incense, cakes, and wine were common offerings. Celebrating someone else's birthday was a way to show affection, friendship, or respect. In exile, the poet Ovid, though alone, celebrated not only his own birthday rite but that of his far distant wife. Birthday parties affirmed social as well as sacred ties. One of the Vindolanda tablets is an invitation to a birthday party from the wife of one Roman officer to the wife of another. Books were a popular birthday gift, sometimes handcrafted as a luxury edition or composed especially for the person honored. Birthday poems are a minor but distinctive genre of Latin literature. The banquets, libations, and offerings or gifts that were a regular part of most Roman religious observances thus became part of birthday celebrations for individuals. A highly esteemed person would continue to be celebrated on their birthday after death, in addition to the several holidays on the Roman calendar for commemorating the dead collectively. Birthday commemoration was considered so important that money was often bequeathed to a social organization to fund an annual banquet in the deceased's honor. The observance of a patron's birthday or the honoring of a political figure's Genius was one of the religious foundations for imperial cult or so-called "emperor worship." The Chinese word for "year(s) old" (t 歲, s 岁, suì) is entirely different from the usual word for "year(s)" (年, nián), reflecting the former importance of Chinese astrology and the belief that one's fate was bound to the stars imagined to be in opposition to the planet Jupiter at the time of one's birth. The importance of this duodecennial orbital cycle only survives in popular culture as the 12 animals of the Chinese zodiac, which change each Chinese New Year and may be used as a theme for some gifts or decorations. Because of the importance attached to the influence of these stars in ancient China and throughout the Sinosphere, East Asian age reckoning previously began with one at birth and then added years at each Chinese New Year, so that it formed a record of the suì one had lived through rather than of the exact amount of time from one's birth. This method—which can differ by as much as two years of age from other systems—is increasingly uncommon and is not used for official purposes in the PRC or on Taiwan, although the word suì is still used for describing age. Traditionally, Chinese birthdays—when celebrated—were reckoned using the lunisolar calendar, which varies from the Gregorian calendar by as much as a month forward or backward depending on the year. 
Celebrating the lunisolar birthday remains common on Taiwan while growing increasingly uncommon on the mainland. Birthday traditions reflected the culture's deep-seated focus on longevity and wordplay. From the homophony in some dialects between 酒 ("rice wine") and 久 (meaning "long" in the sense of time passing), osmanthus and other rice wines are traditional gifts for birthdays in China. Longevity noodles are another traditional food consumed on the day, although western-style birthday cakes are increasingly common among urban Chinese. Hongbaos—red envelopes stuffed with money, now especially the red 100 RMB notes—are the usual gift from relatives and close family friends for most children. Gifts for adults on their birthdays are much less common, although the birthday for each decade is a larger occasion that might prompt a large dinner and celebration. The Japanese reckoned their birthdays by the Chinese system until the Meiji Reforms. Celebrations remained uncommon or muted until after the American occupation that followed World War II.[citation needed] Children's birthday parties are the most important, typically celebrated with a cake, candles, and singing. Adults often just celebrate with their partner. In North Korea, the Day of the Sun, Kim Il Sung's birthday, is the most important public holiday of the country, and Kim Jong Il's birthday is celebrated as the Day of the Shining Star. North Koreans are not permitted to celebrate birthdays on July 8 and December 17 because these were the dates of the deaths of Kim Il Sung and Kim Jong Il, respectively. More than 100,000 North Koreans celebrate displaced birthdays on July 9 and December 18 instead to avoid these dates. A person born on July 8 before 1994 may change their birthday, with official recognition. South Korea was one of the last countries to use a form of East Asian age reckoning for many official purposes. Prior to June 2023, three systems were used together—"Korean ages" that start with 1 at birth and increase every January 1st with the Gregorian New Year, "year ages" that start with 0 at birth and otherwise increase the same way, and "actual ages" that start with 0 at birth and increase each birthday. First birthday celebrations was heavily celebrated, despite usually having little to do with the child's age. In June 2023, all Korean ages were set back at least one year, and official ages henceforth are reckoned only by birthdays. In Ghana, children wake up on their birthday to a special treat called oto, which is a patty made from mashed sweet potato and eggs fried in palm oil. Later they have a birthday party where they usually eat stew and rice and a dish known as kelewele, which is fried plantain chunks. Distribution through the year Birthdays are fairly evenly distributed throughout the year, with some seasonal effects. In the United States, there tend to be more births in September and October. This may be because there is a holiday season nine months before (the human gestation period is about nine months), or because the longest nights of the year also occur in the Northern Hemisphere nine months before. However, the holidays affect birth rates more than the winter: New Zealand, a Southern Hemisphere country, has the same September and October peak with no corresponding peak in March and April. The least common birthdays tend to fall around public holidays, such as Christmas, New Year's Day and fixed-date holidays such as Independence Day in the US, which falls on July 4. 
Between 1973 and 1999, September 16 was the most common birthday in the United States, and December 25 was the least common birthday (other than February 29 because of leap years). In 2011, October 5 and 6 were reported as the most frequently occurring birthdays. New Zealand's most common birthday is September 29, and the least common birthday is December 25. The ten most common birthdays all fall within a thirteen-day period, between September 22 and October 4. The ten least common birthdays (other than February 29) are December 24–27, January 1–2, February 6, March 22, April 1, and April 25. This is based on all live births registered in New Zealand between 1980 and 2017. Positive and negative associations with culturally significant dates may influence birth rates. The study shows a 5.3% decrease in spontaneous births and a 16.9% decrease in Caesarean births on Halloween, compared to dates occurring within one week before and one week after the October holiday. In contrast, on Valentine's Day, there is a 3.6% increase in spontaneous births and a 12.1% increase in Caesarean births. In Sweden, 9.3% of the population is born in March and 7.3% in November, when a uniform distribution would give 8.3%. In the Gregorian calendar (a common solar calendar), February in a leap year has 29 days instead of the usual 28, so the year lasts 366 days instead of the usual 365. A person born on February 29 may be called a "leapling" or a "leaper". In common years, they usually celebrate their birthdays on February 28. In some situations, March 1 is used as the birthday in a non-leap year since it is the day following February 28. Technically, a leapling will have fewer birthday anniversaries than their age in years. This phenomenon is exploited when a person claims to be only a quarter of their actual age, by counting their leap-year birthday anniversaries only. In Gilbert and Sullivan's 1879 comic opera The Pirates of Penzance, Frederic the pirate apprentice discovers that he is bound to serve the pirates until his 21st birthday rather than until his 21st year. For legal purposes, legal birthdays depend on how local laws count time intervals. An individual's Beddian birthday, named in tribute to firefighter Bobby Beddia, occurs during the year that their age matches the last two digits of the year they were born. Some studies show people are more likely to die on their birthdays, with explanations including excessive drinking, suicide, cardiovascular events due to high stress or happiness, efforts to postpone death for major social events, and death certificate paperwork errors. See also References Notes External links |
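As a small worked illustration of the two age curiosities described above (the numbers here are hypothetical examples, not data from the article): a leapling born on February 29 only sees a true calendar anniversary in leap years, and a Beddian birthday falls in the year birth_year + (birth_year mod 100). A minimal sketch in Python, under those assumptions:

from datetime import date

def beddian_year(birth_year):
    # Year in which a person's age equals the last two digits of their birth year,
    # e.g. someone born in 1953 turns 53 in 2006 (illustrative example only)
    return birth_year + birth_year % 100

def leap_birthdays(birth, today):
    # Count the actual February 29ths a leapling has lived through by `today`
    count = 0
    for year in range(birth.year + 1, today.year + 1):
        is_leap = (year % 4 == 0 and year % 100 != 0) or year % 400 == 0
        if is_leap and (year < today.year or (today.month, today.day) >= (2, 29)):
            count += 1
    return count

print(beddian_year(1953))                                    # 2006
print(leap_birthdays(date(2000, 2, 29), date(2024, 3, 1)))   # 6, at an actual age of 24

The second call shows the "quarter of their actual age" claim in practice: a leapling aged 24 has had only 6 true birthday anniversaries.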
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Grey_alien#cite_note-20] | [TOKENS: 2835] |
Contents Grey alien Grey aliens, also referred to as Zeta Reticulans, Roswell Greys, or simply, Greys,[a] are purported extraterrestrial beings. They are frequently featured in claims of close encounter and alien abduction. Greys are typically described as having small, humanoid bodies, smooth, grey skin, disproportionately large, hairless heads, and large, black, almond-shaped eyes. The 1961 Barney and Betty Hill abduction claim was key to the popularization of Grey aliens. Precursor figures have been described in science fiction and similar descriptions appeared in later accounts of the 1947 Roswell UFO incident and early accounts of the 1948 Aztec UFO hoax. The Grey alien is cited an archetypal image of an intelligent non-human creature and extraterrestrial life in general, as well as an iconic trope of popular culture in the age of space exploration. Description Greys are typically depicted as grey-skinned, diminutive humanoid beings that possess reduced forms of, or completely lack, external human body parts such as noses, ears, or sex organs. Their bodies are usually depicted as being elongated, having a small chest, and lacking in muscular definition and visible skeletal structure. Their legs are depicted as being shorter and jointed differently from humans with limbs proportionally different from a human. Greys are depicted as having unusually large heads in proportion to their bodies, and as having no hair, no noticeable outer ears or noses, and small orifices for ears, nostrils, and mouths. In drawings, Greys are almost always shown with very large, opaque, black eyes, without eye whites. They are frequently described as shorter than average adult humans. The association between Grey aliens and Zeta Reticuli originated with the interpretation of a map drawn by Betty Hill by a school-teacher named Marjorie Fish sometime in 1969. Betty Hill, under hypnosis, had claimed to have been shown a map that displayed the aliens' home system and nearby stars. Upon learning of this, Fish attempted to create a model from a drawing produced by Hill, eventually determining that the stars marked as the aliens' home were Zeta Reticuli, a binary star system. History In literature, descriptions of beings similar to Grey aliens predate claims of supposed encounters with them. In 1893, H. G. Wells presented a description of humanity's future appearance in the article "The Man of the Year Million", describing humans as having no mouths, noses, or hair, and with large heads. In 1895, Wells also depicted the Eloi, a successor species to humanity, in similar terms in the novel The Time Machine. Both share many characteristics with future perceptions of Greys. As early as 1917, the occultist Aleister Crowley described a meeting with a "preternatural entity" named Lam that was similar in appearance to a modern Grey. Crowley claimed to have contacted Lam through a process called the "Amalantrah Workings," which he believed allowed humans to contact beings from outer space and across dimensions. Other occultists and ufologists, many of whom have retroactively linked Lam to later Grey encounters, have since described their own visitations from him, with one describing the being as a "cold, computer-like intelligence," and utterly beyond human comprehension. ...the creatures did not resemble any race of humans. They were short, shorter than the average Japanese, and their heads were big and bald, with strong, square foreheads, and very small noses and mouths, and weak chins. 
What was most extraordinary about them were the eyes—large, dark, gleaming, with a sharp gaze. They wore clothes made of soft grey fabric, and their limbs seemed to be similar to those of humans. In 1933, the Swedish novelist Gustav Sandgren, using the pen name Gabriel Linde, published a science fiction novel called Den okända faran (The Unknown Danger), in which he describes a race of extraterrestrials who wore clothes made of soft grey fabric and were short, with big bald heads, and large, dark, gleaming eyes. The novel, aimed at young readers, included illustrations of the imagined aliens. This description would become the template upon which the popular image of grey aliens is based. The conception remained a niche one until 1965, when newspaper reports of the Betty and Barney Hill abduction made the archetype famous. The alleged abductees, Betty and Barney Hill, claimed that in 1961, humanoid alien beings with greyish skin had abducted them and taken them to a flying saucer. In his 1990 article "Entirely Unpredisposed", Martin Kottmeyer suggested that Barney's memories revealed under hypnosis might have been influenced by an episode of the science-fiction television show The Outer Limits titled "The Bellero Shield", which was broadcast 12 days before Barney's first hypnotic session. The episode featured an extraterrestrial with large eyes, who says, "In all the universes, in all the unities beyond the universes, all who have eyes have eyes that speak." The report from the regression featured a scenario that was in some respects similar to the television show. In part, Kottmeyer wrote: Wraparound eyes are an extreme rarity in science fiction films. I know of only one instance. They appeared on the alien of an episode of an old TV series The Outer Limits entitled "The Bellero Shield." A person familiar with Barney's sketch in "The Interrupted Journey" and the sketch done in collaboration with the artist David Baker will find a "frisson" of "déjà vu" creeping up his spine when seeing this episode. The resemblance is much abetted by an absence of ears, hair, and nose on both aliens. Could it be by chance? Consider this: Barney first described and drew the wraparound eyes during the hypnosis session dated 22 February 1964. "The Bellero Shield" was first broadcast on 10 February 1964. Only twelve days separate the two instances. If the identification is admitted, the commonness of wraparound eyes in the abduction literature falls to cultural forces. — Martin Kottmeyer, Entirely Unpredisposed: The Cultural Background of UFO Reports Carl Sagan echoed Kottmeyer's suspicions in his 1997 book, The Demon Haunted World: Science as a Candle in the Dark, where Invaders from Mars was cited as another potential inspiration. After the Hills' encounter, Greys would go on to become an integral part of ufology and other extraterrestrial-related folklore. This is particularly true in the case of the United States: according to journalist C. D. B. Bryan, 73% of all reported alien encounters in the United States describe Grey aliens, a significantly higher proportion than other countries.: 68 During the early 1980s, Greys were linked to the alleged crash-landing of a flying saucer in Roswell, New Mexico, in 1947. A number of publications contained statements from individuals who claimed to have seen the U.S. military handling a number of unusually proportioned, bald, child-sized beings. 
These individuals claimed, during and after the incident, that the beings had oversized heads and slanted eyes, but scant other distinguishable facial features. In 1987, novelist Whitley Strieber published the book Communion, which, unlike his previous works, was categorized as non-fiction, and in which he describes a number of close encounters he alleges to have experienced with Greys and other extraterrestrial beings. The book became a New York Times bestseller, and New Line Cinema released a 1989 film adaption that starred Christopher Walken as Strieber. In 1988, Christophe Dechavanne interviewed the French science-fiction writer and ufologist Jimmy Guieu on TF1's Ciel, mon mardi !. Besides mentioning Majestic 12, Guieu described the existence of what he called "the little greys", which later on became better known in French under the name: les Petits-Gris. Guieu later wrote two docudramas, using as a plot the Grey aliens / Majestic-12 conspiracy theory as described by John Lear and Milton William Cooper: the series "E.B.E." (for "Extraterrestrial Biological Entity"): E.B.E.: Alerte rouge (first part) (1990) and E.B.E.: L'entité noire d'Andamooka (second part) (1991).[citation needed] Greys have since become the subject of many conspiracy theories. Many conspiracy theorists believe that Greys represent part of a government-led disinformation or plausible deniability campaign, or that they are a product of government mind-control experiments. During the 1990s, popular culture also began to increasingly link Greys to a number of military-industrial complex and New World Order conspiracy theories. In 1995, filmmaker Ray Santilli claimed to have obtained 22 reels of 16 mm film that depicted the autopsy of a "real" Grey supposedly recovered from the site of the 1947 incident in Roswell. In 2006, though, Santilli announced that the film was not original, but was instead a "reconstruction" created after the original film was found to have degraded. He maintained that a real Grey had been found and autopsied on camera in 1947, and that the footage released to the public contained a percentage of that original footage. Analysis Greys are often involved in alien abduction claims. Among reports of alien encounters, Greys make up about 50% in Australia, 73% in the United States, 48% in continental Europe, and around 12% in the United Kingdom.: 68 These reports include two distinct groups of Greys that differ in height.: 74 Abduction claims are often described as extremely traumatic, similar to an abduction by humans or even a sexual assault in the level of trauma and distress. The emotional impact of perceived abductions can be as great as that of combat, sexual abuse, and other traumatic events. The eyes are often a focus of abduction claims, which often describe a Grey staring into the eyes of an abductee when conducting mental procedures. This staring is claimed to induce hallucinogenic states or directly provoke different emotions. Neurologist Steven Novella proposes that Grey aliens are a byproduct of the human imagination, with the Greys' most distinctive features representing everything that modern humans traditionally link with intelligence. "The aliens, however, do not just appear as humans, they appear like humans with those traits we psychologically associate with intelligence." In 2005, Frederick V. Malmstrom, writing in Skeptic magazine, Volume 11, issue 4, presents his idea that Greys are actually residual memories of early childhood development. 
Malmstrom reconstructs the face of a Grey through transformation of a mother's face based on our best understanding of early-childhood sensation and perception. Malmstrom's study offers another alternative to the existence of Greys, the intense instinctive response many people experience when presented an image of a Grey, and the act of regression hypnosis and recovered-memory therapy in "recovering" memories of alien abduction experiences, along with their common themes. According to biologist Jack Cohen, the typical image of a Grey, assuming that it would have evolved from a world with different environmental and ecological conditions from Earth, is too physiologically similar to a human to be credible as a representation of an alien. The interdimensional hypothesis, the cryptoterrestrial hypothesis, and the time-traveller hypothesis attempt to provide an alternative explanation to the humanoid anatomy and behavior of these alleged beings. In popular culture Depictions of Grey aliens have gone on to appear in a number of films and television shows, supplanting the previously popular little green men. As early as 1966, for example, the superhero character Ultraman was explicitly based on them, and in 1977 they were featured in Close Encounters of the Third Kind. Greys have also been worked into space opera and other interstellar settings: in Babylon 5, the Greys are referred to as the "Vree", and are depicted as being allies and trade partners of 23rd-century Earth, while in the Stargate franchise they are called the "Asgard" and depicted as ancient astronauts allied with modern-day Earth.[citation needed] South Park refers to them as "visitors". During the 1990s, plotlines wherein Greys were linked to conspiracy theories became common. A well-known example is the Fox television series The X-Files, which first aired in 1993. It combined the quest to find proof of the existence of Grey-like extraterrestrials with a number of UFO conspiracy theory subplots, to form its primary story arc. Other notable examples include the XCOM video game franchise (where they are called "Sectoids"); Dark Skies, first broadcast in 1996, which expanded upon the MJ-12 conspiracy;[citation needed] and American Dad!, which features a Grey-like alien named Roger, whose backstory draws from both the Roswell incident and Area 51 conspiracy theories. The 2011 film Paul tells the story of a Grey named Paul who attributes the Greys' frequent presence in science fiction pop culture to the US government deliberately inserting the stereotypical Grey alien image into mainstream media; this is done so that if humanity came into contact with Paul's species, no immediate shock would occur as to their appearance. Child abduction by Greys is a key plot point in the 2013 film, Dark Skies. Greys appear in Syfy's 2021 science fiction dramedy series Resident Alien. The Greys appear as the main antagonistic faction in the 2023 independent game Greyhill Incident. See also Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Zygote] | [TOKENS: 963] |
Contents Zygote A zygote (/ˈzaɪˌɡoʊt/ ⓘ; from Ancient Greek ζυγωτός (zygōtós) 'joined, yoked', from ζυγοῦν (zygoun) 'to join, to yoke') is a eukaryotic cell formed by a fertilization event between two gametes. The zygote's genome is a combination of the DNA in each gamete, and contains all of the genetic information of a new individual organism. The sexual fusion of haploid cells is called karyogamy, the result of which is the formation of a diploid cell called the zygote (or zygospore in specific cases). History German zoologists Oscar and Richard Hertwig made some of the first discoveries on animal zygote formation in the late 19th century.[citation needed] In multicellular organisms The zygote is the earliest developmental stage. In humans and most other anisogamous organisms, a zygote is formed when an egg cell and sperm cell come together to create a new unique organism. The formation of a totipotent zygote with the potential to produce a whole organism depends on epigenetic reprogramming. DNA demethylation of the paternal genome in the zygote appears to be an important part of epigenetic reprogramming. In the paternal genome of the mouse, demethylation of DNA, particularly at sites of methylated cytosines, is likely a key process in establishing totipotency. Demethylation involves the processes of base excision repair and possibly other DNA-repair–based mechanisms. In human fertilization, a released ovum (a haploid secondary oocyte with replicate chromosome copies) and a haploid sperm cell (male gamete) combine to form a single diploid cell called the zygote. Once the single sperm fuses with the oocyte, the latter completes the division of the second meiosis forming a haploid daughter with only 23 chromosomes, almost all of the cytoplasm, and the male pronucleus. The other product of meiosis is the second polar body with only chromosomes but no ability to replicate or survive. In the fertilized daughter, DNA is then replicated in the two separate pronuclei derived from the sperm and ovum, making the zygote's chromosome number temporarily 4n diploid. After approximately 30 hours from the time of fertilization, a fusion of the pronuclei and immediate mitotic division produce two 2n diploid daughter cells called blastomeres. Between the stages of fertilization and implantation, the developing embryo is sometimes termed as a preimplantation embryo. This stage has also been referred to as the pre-embryo in legal discourses including relevance to the use of embryonic stem cells. After fertilization, the conceptus travels down the fallopian tube towards the uterus while continuing to divide without actually increasing in size, in a process called cleavage. After four divisions, the conceptus consists of 16 blastomeres, and it is known as the morula. Through the processes of compaction, cell division, and blastulation, the conceptus takes the form of the blastocyst by the fifth day of development, just as it approaches the site of implantation. When the blastocyst hatches from the zona pellucida, it can implant in the endometrial lining of the uterus and begin the gastrulation stage of embryonic development.[citation needed] The human zygote has been genetically edited in experiments designed to cure inherited diseases. In fungi, this cell may then enter meiosis or mitosis depending on the life cycle of the species.[citation needed] In plants, the zygote may be polyploid if fertilization occurs between meiotically unreduced gametes. 
In land plants, the zygote is formed within a chamber called the archegonium. In seedless plants, the archegonium is usually flask-shaped, with a long hollow neck through which the sperm cell enters. As the zygote divides and grows, it does so inside the archegonium.[citation needed] In single-celled organisms The zygote can divide asexually by mitosis to produce identical offspring.[citation needed] A Chlamydomonas zygote contains chloroplast DNA (cpDNA) from both parents; such cells are generally rare, since normally cpDNA is inherited uniparentally from the mt+ mating type parent. These rare biparental zygotes allowed mapping of chloroplast genes by recombination.[citation needed] See also References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Tom_Kalinske] | [TOKENS: 1283] |
Contents Tom Kalinske Thomas Kalinske (born July 17, 1944) is an American businessman who has worked for Mattel (1972–1987), Matchbox (1987–1990), Sega of America (1990–1996) and LeapFrog (1997–2006). At Mattel, Kalinske was credited with reviving the Barbie and Hot Wheels brands and launching Masters of the Universe, and he served as CEO from 1985 to 1987. He next became CEO of Matchbox, was then recruited to be president and CEO of Sega of America from 1990 to 1996, and then served as CEO and chairman of the board of LeapFrog from 1997 to 2006. His aggressive marketing decisions during his time at Sega, such as price drops, anti-Nintendo attack ads and the famous "Sega Scream" TV campaign, are often cited as key elements in the success of the Sega Genesis. The book Console Wars and the documentary film of the same name chronicle Kalinske's strategies and success in competing with Nintendo. Kalinske was inducted into the Toy Industry Hall of Fame in 1997 and received a Lifetime Achievement award from People of Play in 2021. He has also been honored as "Man of the Year" by the NY Boy Scouts and the LA Make-A-Wish Foundation. He is chairman of Mixed Dimensions, a 3D printing company, and sits on the boards of Stitched Insights, Adjunct Professor Link, Storyworld, the University of Wisconsin School of Business and the Teach The World Foundation. His estimated net worth is $10–20 million. Career Thomas Kalinske earned a Bachelor of Science degree at the University of Wisconsin in 1966 and an MBA at the University of Arizona in 1968. In 1976, he attended Harvard Business School's Strategic Management Program. After leaving Mattel in 1987, Kalinske went to work for rival company Matchbox, which had recently been placed into receivership. He cut costs by moving production to lower-cost regions in Asia, and by 1990 the company had turned a profit for the first time in years, with revenue of over $350 million. While Kalinske was CEO of Sega of America, the company grew from $72 million to more than $1.5 billion and the market value of Sega grew from less than $2 billion to more than $5 billion.[citation needed] His market strategies have been cited as the key factor in breaking Nintendo's dominance of the video game industry, aided greatly by a strategic partnership that Kalinske developed with Blockbuster CEO Joe Baczko and Blockbuster's retail strategy officer Mark Allen Stuart. The partnership included the rental of Sega hardware at a time when in-home penetration of Sega consoles was very low; letting consumers rent the hardware and then rent games served as a form of marketing and increased retail sales of both hardware and software. The Sega Genesis and Game Gear systems were highly successful during this time. Later, the commercial failure of the Sega Saturn in the US was exacerbated by Sega's announcement at E3 1995 that the system was available in selected stores effective immediately, rather than on the planned date of September 2, 1995, which had been dubbed "Saturnday". Kalinske was generally against releasing the Saturn early, but was forced to do so by Sega. With few games and consoles in stock, the pre-selected retailers simply were not ready to begin distributing the console, and retailers not provided with consoles were so offended that they refused to distribute the Saturn at all.
Kalinske was not a fan of the Saturn's architecture, and was later quoted as saying that he believed no one would be able to market the console successfully. After Kalinske tendered his resignation from Sega on July 15, 1996, to take effect on September 30, Isao Okawa, who owned Sega at the time through CSK, reached out and invited him to work with the Okawa Foundation. Kalinske remained on Sega's board of directors following his resignation. Kalinske later became CEO and chairman of the educational toy company LeapFrog, which grew to over $600 million in revenue and became the largest educational toy company in the US. He was president of Knowledge Universe, a company that aimed to use technology to improve education. According to Oregon Business in 2011, the company was "the largest single private provider of early childhood education services in the country", with 40,000 employees on three continents and the biggest market share in the United Kingdom, Malaysia and Singapore. Many education companies came out of Knowledge Universe, including LeapFrog, K-12, Knowledge Beginnings (the largest chain of preschools in the US), Spring and Vistage (executive education). Kalinske was on the board of directors of the Toy Manufacturers of America for twelve years. He was on the board of Blackboard between 2005 and 2012. He is on the boards of Cambium Learning Group, Genyous (a cancer drug development company) and WCEPS (Wisconsin Center for Educational Products & Services). He is an emeritus advisor to the UW Business School and the University of Arizona Eller School of Management, and is vice chairman of LeapFrog Inc.[citation needed] Legacy Blake J. Harris's book Console Wars, released in May 2014, is a retelling of Tom Kalinske's efforts to overthrow Nintendo. The story is told as a third-person narrative assembled from more than two hundred interviews the author conducted with former Sega and Nintendo employees. A documentary based on the book was released on September 23, 2020. For his work at Mattel, Universal Matchbox, Sega and LeapFrog, Kalinske was a 1997 inductee into the Toy Industry Hall of Fame. He has received the NYC Boy Scouts' Good Scout Award, the Starlight Foundation Man of the Year Award, the University of Wisconsin Business Partners "Distinguished low" Award and the Video Software Dealers Man of the Year Award.[citation needed] References External links
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Scikit-learn] | [TOKENS: 768] |
Contents scikit-learn scikit-learn (formerly scikits.learn and also known as sklearn) is a free and open-source machine learning library for the Python programming language. It features various classification, regression and clustering algorithms including support-vector machines, random forests, gradient boosting, k-means and DBSCAN, and is designed to interoperate with the Python numerical and scientific libraries NumPy and SciPy. Scikit-learn is a NumFOCUS fiscally sponsored project. Overview The scikit-learn project started as scikits.learn, a Google Summer of Code project by French data scientist David Cournapeau. The name of the project derives from its role as a "scientific toolkit for machine learning", originally developed and distributed as a third-party extension to SciPy. The original codebase was later rewritten by other developers.[who?] In 2010, contributors Fabian Pedregosa, Gaël Varoquaux, Alexandre Gramfort and Vincent Michel, from the French Institute for Research in Computer Science and Automation in Saclay, France, took leadership of the project and released the first public version of the library on February 1, 2010. In November 2012, scikit-learn as well as scikit-image were described as two of the "well-maintained and popular" scikits libraries[update]. In 2019, it was noted that scikit-learn is one of the most popular machine learning libraries on GitHub. At that time, the project had over 1,400 contributors and the documentation received 42 million visits in 2018. According to a 2022 Kaggle survey of nearly 24,000 respondents from 173 countries, scikit-learn was identified as the most widely used machine learning framework. Features Examples Fitting a random forest classifier: Implementation scikit-learn is largely written in Python, and uses NumPy extensively for high-performance linear algebra and array operations. Furthermore, some core algorithms are written in Cython to improve performance. Support vector machines are implemented by a Cython wrapper around LIBSVM; logistic regression and linear support vector machines by a similar wrapper around LIBLINEAR. In such cases, extending these methods with Python may not be possible. scikit-learn integrates well with many other Python libraries, such as Matplotlib and plotly for plotting, NumPy for array vectorization, Pandas dataframes, SciPy, and many more. History scikit-learn was initially developed by David Cournapeau as a Google Summer of Code project in 2007. Later that year, Matthieu Brucher joined the project and started to use it as a part of his thesis work. In 2010, INRIA, the French Institute for Research in Computer Science and Automation, got involved and the first public release (v0.1 beta) was published in late January 2010. The project released its first stable version, 1.0.0, on September 24, 2021. The release was the result of over 2,100 merged pull requests, approximately 800 of which were dedicated to improving documentation. Development continues to focus on bug fixes, efficiency and feature expansion. The latest version, 1.8, was released on December 10, 2025. This update introduced native Array API support, enabling the library to perform GPU computations by directly using PyTorch and CuPy arrays. This version also included bug fixes, improvements and new features, such as efficiency improvements to the fit time of linear models. 
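The "Examples" heading above refers to fitting a random forest classifier, but the code itself did not survive extraction. A minimal sketch of what such an example typically looks like, using scikit-learn's public estimator API; the choice of the bundled iris dataset and these specific parameters is an assumption for illustration, not taken from the original article:

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load a small built-in dataset; any tabular feature matrix X and label vector y works the same way
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Fit a random forest classifier and evaluate it on held-out data
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))

The same fit/predict pattern applies across the library's estimators, which is part of why it interoperates easily with NumPy arrays and pandas dataframes.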
Applications Scikit-learn is widely used across industries for a variety of machine learning tasks such as classification, regression, clustering, and model selection. Awards References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Mariner_9] | [TOKENS: 2696] |
Contents Mariner 9 Mariner 9 (Mariner Mars '71 / Mariner-I) was a robotic spacecraft that contributed greatly to the exploration of Mars and was part of the NASA Mariner program. Mariner 9 was launched toward Mars on May 30, 1971, from LC-36B at Cape Canaveral Air Force Station, Florida, and reached the planet on November 14 of the same year, becoming the first spacecraft to orbit another planet – only narrowly beating the Soviet probes Mars 2 (launched May 19) and Mars 3 (launched May 28), both of which arrived at Mars only weeks later. After the occurrence of dust storms on the planet for several months following its arrival, the orbiter managed to send back clear pictures of the surface. Mariner 9 successfully returned 7,329 images, covering 85% of Mars's surface, over the course of its mission, which concluded in October 1972. Spacecraft and subsystems Mariner 9 carried an instrument payload similar to Mariners 6 and 7, but because of the need for a larger propulsion system to control the spacecraft in Martian orbit, it weighed more than Mariners 6 and 7 combined (Mariner 6 and Mariner 7 weighed 413 kilograms while Mariner 9 weighed 997.9 kilograms). The power for the spacecraft was provided by a total of 14,742 solar cells, being distributed between 4 solar panels, which in total resulted in 7.7 meters of solar panels being present in the spacecraft. The solar panels produced 500 watts in the orbit of Mars. The energy was stored in a 20 amp-hour (Ah) nickel-cadmium battery. Propulsion was provided by the RS-2101a engine, which could produce 1340 N thrust, and in total could have 5 restarts. The engine was fueled by monomethyl hydrazine and nitrogen tetroxide. For attitude control, the spacecraft contained 2 sets of 6 nitrogen jets on the tip of the solar panels. Attitude knowledge was provided by a Sun sensor, a Canopus star tracker, gyroscopes, an inertial reference unit, and an accelerometer. The thermal control was achieved by the use of louvers on the eight sides of the frame and thermal blankets. The Ultraviolet Spectrometer (UVS) studied the composition and density of Mars's upper atmosphere, detecting hydrogen, oxygen, and ozone. It worked on a wavelength range of 110–340 nm with a spectral resolution of 2.5 nm. The instrument identified atomic hydrogen and oxygen in the upper atmosphere; provided data on the escape rates of these elements, influencing Mars's atmospheric evolution and mapped ozone distribution, showing seasonal variations. The UVS was constructed by the Laboratory for Atmospheric and Space Physics at the University of Colorado, Boulder, Colorado. The ultraviolet spectrometer team was led by Professor Charles Barth. The Infrared Interferometer Spectrometer (IRIS) measured thermal radiation emitted by Mars to analyze atmospheric composition, surface temperature, and dust properties. It worked on a wavelength range of 6–50 μm with a spectral resolution of 2.4 cm-1. The instrument confirmed the presence of CO2 as the dominant atmospheric gas; detected water vapor in the Martian atmosphere; measured surface and atmospheric temperatures and provided insights into dust storms' thermal properties. The IRIS team was led by Dr. Rudolf A. Hanel from NASA Goddard Spaceflight Center (GSFC). The IRIS instrument was built by Texas Instruments, Dallas, Texas. The Celestial Mechanics Experiment was not a separate instrument. It used radio tracking to determine Mars's gravitational field and refine its mass estimates. 
It was based on analysis of Doppler shifts in the spacecraft's radio signals and measuring range and range rate to track Mariner 9's precise motion. The experiment improved the understanding of Mars's gravitational field, provided more accurate estimates of Mars's mass and shape and helped refine the planet's rotational parameters. The S-Band Occultation Experiment was not a separate instrument. It used Mariner 9's radio signal at 2.295 GHz (S-band) passing through Mars's atmosphere to study its density, pressure, and temperature profiles. The experiment measured vertical profiles of temperature and pressure in the Martian atmosphere, detected variations in the ionosphere and confirmed the presence of CO2 as the main atmospheric component. The Infrared Radiometer (IRR) measured surface and atmospheric temperatures using infrared radiation. It worked on a wavelength range of 10–12 μm with a field of view of 1.7° × 1.7°. The instrument provided surface temperature maps of Mars, monitored thermal properties of dust storms and identified temperature variations between day and night cycles. The IRR team was led by Professor Gerald Neugebauer from the California Institute of Technology (Caltech). The Visual Imaging System captured high-resolution images of Mars's surface, weather patterns, and moons. It employed two vidicon television cameras, with a resolution of 832 by 700 pixels. In a lower orbit, half that of Mariner 6 and Mariner 7 flyby missions, and with a vastly improved imaging system, Mariner 9 achieved a resolution of 98 metres (320 ft) per pixel, whereas previous Martian probes had achieved only approximately 790 metres (2,600 ft) per pixel. It used broadband filters of various wavelengths optimized for surface and atmospheric studies. The wide-angle Camera A produced pictures using eight selectable colored filters: Minus Blue/ Yellow, Orange, Polarized 0°, Green, Polarized 60°, Blue, Polarized 120° and Violet. From a periapsis altitude of 2000 km each image covered an area of 11° × 14°. The narrow-angle Camera B didn't use any filters, but had a response equivalent to camera A's Minus Blue / Yellow filter. From a periapsis altitude of 2000 km each image covered an area of 1.1° × 1.4°. The following table summarizes characteristics of both cameras: The instrument provided the first global mapping of Mars's surface; discovered volcanoes, valleys, and dried riverbeds, suggesting past water activity; captured dust storms covering the entire planet and mapped Phobos and Deimos, Mars's two moons. Mission Mariner 9 was designed to continue the atmospheric studies begun by Mariner 6 and 7, and to map over 70% of the Martian surface from the lowest altitude (1,500 kilometers (930 mi)) and at the highest resolutions (from 1 kilometer to 100 meters (1,100 to 110 yards) per pixel) of any Mars mission up to that point.[according to whom?] An infrared radiometer was included to detect heat sources in search of evidence of volcanic activity. It was to study temporal changes in the Martian atmosphere and surface. Mars's two moons, Deimos and Phobos, were also to be analyzed. Mariner 9 more than met its objectives. Under original plans, a dual mission was to be flown like Mariners 6–7, however the launch failure of Mariner 8 ruined this scheme and forced NASA planners to fall back on a simpler one-probe mission. NASA still held out hope that another Mariner probe and Atlas-Centaur could be readied before the 1971 Mars launch window closed. 
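As an illustration of the radio-science measurements described above (the Celestial Mechanics and S-Band Occultation experiments), the one-way Doppler relation v ≈ c·Δf/f0 converts a measured shift of the 2.295 GHz carrier into a line-of-sight velocity. The sketch below uses made-up numbers purely for illustration; real deep-space tracking uses two-way coherent Doppler and far more elaborate orbit-determination models:

C = 299_792_458.0   # speed of light, m/s
F0 = 2.295e9        # Mariner 9 S-band carrier frequency, Hz

def radial_velocity(freq_shift_hz):
    # One-way Doppler approximation: line-of-sight velocity implied by a carrier frequency shift
    return C * freq_shift_hz / F0

# Illustrative only: a 10 kHz shift on the S-band carrier corresponds to roughly 1.3 km/s
print(radial_velocity(10_000.0))   # ~1306 m/s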
A few logistical problems emerged, including the lack of an available Centaur payload shroud of the correct configuration for the Mariner probes; however, there was a shroud in NASA's inventory that could be modified. Convair also had an available Centaur stage on hand and could have an Atlas readied in time, but the idea was ultimately abandoned for lack of funding. Mariner 9 was mated to Atlas-Centaur AC-23 on May 9, while the investigation into Mariner 8's failure was still ongoing. The malfunction was traced to a problem in the Centaur's pitch control servo amplifier, and because it was not clear whether the spacecraft itself had been responsible, radio-frequency interference (RFI) testing was conducted on Mariner 9 to ensure the probe was not emitting interference that could disturb the Centaur's electronics. All testing came back negative, and on May 22 a tested and verified rate gyro package arrived from Convair and was installed in the Centaur. Liftoff took place on May 30 at 22:23:04 UT. All launch vehicle systems performed normally, and the Mariner separated from the Centaur 13 minutes and 18 seconds after launch. When Mariner 9 arrived at Mars on November 14, 1971, planetary scientists were surprised (although a storm had been anticipated during the perihelic opposition) to find the atmosphere thick with "a planet-wide robe of dust, the largest storm ever observed." The surface was totally obscured. After the spacecraft had slowed into orbit, its computer was reprogrammed from Earth to delay imaging of the surface for a couple of months until the dust settled, and the camera was shut down to save energy. The main surface imaging did not get underway until mid-January 1972. However, images taken while the surface was obscured did contribute to the collection of Mars science, including understanding of the existence of several huge high-altitude volcanoes of the Tharsis Bulge that gradually became visible as the dust storm abated. This unexpected situation made a strong case for the desirability of studying a planet from orbit rather than merely flying past. It also highlighted the importance of flexible mission software. The Soviet Union's Mars 2 and Mars 3 probes, which arrived during the same dust storm, had been preprogrammed and were unable to adapt to the unexpected conditions, which severely limited the amount of data they were able to collect. After 349 days in orbit, Mariner 9 had transmitted 7,329 images, covering 85% of Mars's surface, whereas previous flyby missions had returned fewer than one thousand images covering only a small portion of the planetary surface. The images revealed river beds, craters, massive extinct volcanoes (such as Olympus Mons, the largest known volcano in the Solar System; Mariner 9 led directly to its reclassification from Nix Olympica), canyons (including Valles Marineris, a system of canyons about 4,020 kilometres (2,500 mi) long), evidence of wind erosion and deposition, weather fronts, fogs, and more. Mars's small moons, Phobos and Deimos, were also photographed. The findings from the Mariner 9 mission underpinned the later Viking program. The enormous Valles Marineris canyon system is named after Mariner 9 in honor of its achievements. After depleting its supply of attitude control gas, the spacecraft was turned off on October 27, 1972. Present location Mariner 9 remained in orbit around Mars after the end of its operational use. Today it is thought that Mariner 9 has either burnt up on entering the Martian atmosphere or impacted the surface.
NASA had provided multiple dates for when Mariner 9 could enter the Martian atmosphere. At the end of the mission, Mariner 9 was left in an orbit that was not expected to decay for at least 50 years. In 2011, NASA predicted that Mariner 9 would burn up or crash into Mars around 2022. However, a 2018 revision to NASA's Mariner 9 mission page stated that Mariner 9 was expected to crash into Mars "sometime around 2020". Error-correction codes To control errors in the reception of the grayscale image data sent by Mariner 9 (caused by a low signal-to-noise ratio), the data had to be encoded before transmission using a forward error-correcting code (FEC). Without FEC, noise would have corrupted roughly a quarter of a received image; the FEC encoded the data in a redundant way that allowed most of the transmitted image data to be reconstructed at reception. Since the flight hardware was constrained in weight, power consumption, storage, and computing power, the choice of FEC required careful consideration, and a Hadamard code was selected for Mariner 9. Each image pixel was represented as a six-bit binary value, giving 64 possible grayscale levels. Because of limitations of the transmitter, the maximum useful data length was about 30 bits. Instead of using a repetition code, a [32, 6, 16] Hadamard code was used, which is also a first-order Reed–Muller code. Up to seven bit errors per 32-bit word could be corrected using this scheme. Compared to a five-repetition code, the error-correcting properties of this Hadamard code were much better, yet its data rate was comparable. The efficient decoding algorithm was an important factor in the decision to use this code. The decoding circuitry was called the "Green Machine"; it employed the fast Fourier transform, increasing the decoding speed by a factor of three.
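To make the coding scheme concrete, the following is a minimal sketch in Python of a [32, 6, 16] first-order Reed–Muller / biorthogonal Hadamard encoder and correlation decoder of the kind described above. It is illustrative only, not the flight implementation: the function names and the seven-error test case are this sketch's own, and the decoder uses a software fast Hadamard transform where the mission used dedicated circuitry.

```python
# A minimal illustrative sketch of a [32, 6, 16] biorthogonal Hadamard code -- not the
# flight implementation. Six data bits become a 32-bit codeword (a first-order
# Reed-Muller construction); decoding correlates the received word against all
# codewords at once with a fast Hadamard transform and keeps the strongest match.

def encode(bits6):
    """Encode 6 bits [a0, a1..a5] (each 0 or 1) into a 32-bit codeword."""
    a0, a = bits6[0], bits6[1:]
    # Codeword bit i is the affine Boolean function a0 XOR (a . binary digits of i).
    return [a0 ^ (sum(a[k] & ((i >> k) & 1) for k in range(5)) & 1) for i in range(32)]

def fht(values):
    """Fast (Walsh-)Hadamard transform of a length-32 vector."""
    v = list(values)
    step = 1
    while step < 32:
        for start in range(0, 32, 2 * step):
            for j in range(start, start + step):
                x, y = v[j], v[j + step]
                v[j], v[j + step] = x + y, x - y
        step *= 2
    return v

def decode(word32):
    """Maximum-likelihood decode of a (possibly corrupted) 32-bit word to 6 data bits."""
    # Map bits {0,1} -> {+1,-1}; the transform then gives the correlation with every
    # codeword. The index of the largest magnitude identifies a1..a5, its sign gives a0.
    spectrum = fht([1 - 2 * b for b in word32])
    index = max(range(32), key=lambda i: abs(spectrum[i]))
    a0 = 1 if spectrum[index] < 0 else 0
    return [a0] + [(index >> k) & 1 for k in range(5)]

if __name__ == "__main__":
    pixel = [1, 0, 1, 1, 0, 1]                 # one 6-bit grayscale sample
    noisy = encode(pixel)
    for pos in (3, 8, 15, 20, 27, 30, 31):     # flip 7 of the 32 bits
        noisy[pos] ^= 1
    assert decode(noisy) == pixel              # the sample is still recovered exactly
```

Because the code's minimum distance is 16, the correlation with the transmitted codeword remains the strongest even after seven bit flips, which is what the final assertion exercises; the butterfly structure of fht is, loosely, the role the "Green Machine" circuitry played in hardware.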
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/File:Orion%E2%80%93Eridanus_Superbubble_in_H-alpha_and_continuum.jpg] | [TOKENS: 186] |
File:Orion–Eridanus Superbubble in H-alpha and continuum.jpg Summary The field of view is 50° × 39°. Equatorial center coordinates are RA=4h36m and DEC=3°. North is up. Objects labeled in the image: Orion Nebula (M42), Horsehead Nebula, Lambda Orionis Nebula (SH2-264), Orion–Eridanus Superbubble, Barnard's Loop, Witch Head Nebula (IC 2118), NGC 2023, M78, Monkey Head Nebula (NGC 2174, SH2-252), SH2-261, SH2-279S.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Cappella_dei_Mercanti_(Turin)#Perpetual_calendar] | [TOKENS: 935] |
Cappella dei Mercanti, Turin The cappella dei Mercanti, Negozianti e Banchieri (chapel of merchants, shopkeepers, and bankers), better known as cappella dei Mercanti, is a Catholic chapel in the historic city center of Turin, Italy. The chapel, whose construction was authorized during the 16th century, was built at the end of the 1600s, and most of the artwork it contains originated in the 1600s and 1700s, in the baroque style. The sacristy hosts the Perpetual Calendar, a primitive computing machine built by the astronomer and mathematician Giovanni Plana. History and description The Pious Congregation of Bankers, Merchants and Shopkeepers of Turin was chartered in 1663, and built its own chapel inside the Jesuit palace, on the city block of San Paolo (owned by the congregation itself) on Via Dora Grossa, now Via Garibaldi. The space is adjacent to the sixteenth-century Church of the Holy Martyrs, which was staffed by the Jesuits. The chapel was built during the rectorate of Fr. Agostino Provana (1680–1726). Inaugurated in 1692, the large rectangular hall was decorated in the following years under the guidance of Provana. The theme of the interior decorations is the Epiphany, which represents the manifestation of Christ to the powerful of the earth and on whose feast day the Congregation celebrates its own feast. The walls of the chapel present numerous seventeenth-century paintings, all inspired by the theme of the Biblical Magi. On the left wall are Herod with the Magi and the wise men (circa 1694) by Sebastiano Taricco, Journey of the Magi towards Bethlehem (circa 1694) by Luigi Vannier, Opening of the treasures of the Wise Men (1705) by Stefano Maria Legnani (called Legnanino), and Announcement of the angel to the Magi (circa 1694) by Sebastiano Taricco. On the right wall are Appearance of the star to the Magi (1703) by Andrea Pozzo, King David meditates on the mystery of the Epiphany (circa 1695) by Stefano Maria Legnani, Massacre of the Innocents (1703) by Andrea Pozzo, and Procession of the Magi into Jerusalem (1712) by Niccolò Carone. The paintings alternate with marbled wooden statues made by Carlo Giuseppe Plura between 1707 and 1715 depicting popes and Church Fathers: John Chrysostom, Gregory the Great, and Saint Ambrose on the left wall, and Saint Jerome, Saint Leo the Great, and Saint Augustine on the right wall. Plura also carved the marble bust of the Madonna to the left of the altar. The altar dates back to 1797 and is the work of Michele Emanuele Buscaglione. On either side there are two reliquaries, while on the wall there are three paintings by the Jesuit painter Andrea Pozzo: Nativity with shepherds (circa 1699), Adoration of the Magi (before 1694), and Flight to Egypt (circa 1699). The baroque frescoed ceiling by Legnanino depicts Heaven, prophets, sibyls and biblical episodes and dates to 1694–1695. The organ on the wall opposite the altar dates back to the eighteenth century. In the sacristy there are the altarpiece Adoration of the Magi (circa 1620) by Guglielmo Caccia (called Moncalvo), a Piccolo Trono (1792) by Michele Brassiè, and a wardrobe by Natale Favriano from 1712. The sacristy also houses precious antependia (altar frontals) and the archive of the Congregation. On 21 January 2017 the chapel was returned to the public after a period of renovation.
Perpetual calendar The sacristy contains several sacred objects, but above all the famous Perpetual Calendar by Giovanni Plana, one of the oldest calculating machines. Equipped with rotating drums and a transmission system that correctly combines the various pieces of information stored in the mechanism, it allows precise calendrical calculation over a period of 4,000 years starting from year zero, including the calculation of lunations, days of the week, and Christian holidays.
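As a modern point of comparison, the short Python sketch below performs two of the calculations the Perpetual Calendar mechanizes: the day of the week of an arbitrary date and the date of Easter Sunday for a given year. It is purely illustrative and not a description of Plana's mechanism; the function names and the choice of algorithms (Zeller's congruence and the anonymous Gregorian computus, whose 19-year Metonic cycle is also what underlies lunation tables) are assumptions of this sketch.

```python
# A purely illustrative sketch, not a model of Plana's drum-and-gear mechanism: two of
# the quantities the Perpetual Calendar mechanizes, computed with textbook formulas.
# Both functions assume the (proleptic) Gregorian calendar.

def weekday(year, month, day):
    """Day of the week for a Gregorian date, via Zeller's congruence."""
    if month < 3:                       # January and February are counted as
        month += 12                     # months 13 and 14 of the previous year
        year -= 1
    k, j = year % 100, year // 100
    h = (day + (13 * (month + 1)) // 5 + k + k // 4 + j // 4 + 5 * j) % 7
    return ["Saturday", "Sunday", "Monday", "Tuesday",
            "Wednesday", "Thursday", "Friday"][h]

def easter(year):
    """(month, day) of Easter Sunday, via the anonymous Gregorian computus."""
    a = year % 19                       # position in the 19-year Metonic lunar cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

if __name__ == "__main__":
    print(weekday(2017, 1, 21))   # "Saturday": the chapel's reopening date, 21 January 2017
    print(easter(2017))           # (4, 16): Easter Sunday fell on 16 April 2017
```

Plana's machine encodes its 4,000-year span mechanically, whereas the formulas above simply assume the proleptic Gregorian calendar throughout, so they only approximate what the drums would show for dates before the 1582 calendar reform.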
======================================== |
[SOURCE: https://he.wikipedia.org/wiki/%D7%A7%D7%98%D7%92%D7%95%D7%A8%D7%99%D7%94:%D7%95%D7%99%D7%A7%D7%99%D7%A0%D7%AA%D7%95%D7%A0%D7%99%D7%9D_-_%D7%94%D7%A9%D7%95%D7%95%D7%90%D7%AA_%D7%A2%D7%A8%D7%9B%D7%99%D7%9D:_%D7%97%D7%A1%D7%A8] | [TOKENS: 416] |
Category: ויקינתונים - השוואת ערכים: חסר (Wikidata – value comparison: missing). This category collects articles in which local information exists in the Hebrew Wikipedia that does not exist in Wikidata. It can be used to help enrich Wikidata with information from the Hebrew Wikipedia. Subcategories: this category page includes the following 200 subcategories, out of 554 in the category as a whole. Pages in the category "Wikidata – value comparison: missing": this category page includes the following 200 pages, out of 93,316 in the category as a whole.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/British_Empire] | [TOKENS: 15822] |
Contents British Empire The British Empire comprised the dominions, colonies, protectorates, mandates, and other territories ruled or administered by the United Kingdom and its predecessor states. It began with the overseas possessions and trading posts established by England in the late 16th and early 17th centuries, and colonisation attempts by Scotland during the 17th century. At its height in the 19th and early 20th centuries, it became the largest empire in history and, for a century, was the foremost global power. By 1913, the British Empire held sway over 412 million people, 23 percent of the world population at the time, and by 1920, it covered 35.5 million km2 (13.7 million sq mi), 24 per cent of the Earth's total land area. As a result, its constitutional, legal, linguistic, and cultural legacy is widespread. At the peak of its power, it was described as "the empire on which the sun never sets", as the sun was always shining on at least one of its territories. During the Age of Discovery in the 15th and 16th centuries, Portugal and Spain pioneered European exploration of the world, and in the process established large overseas empires. Motivated by the great wealth these empires generated, England, France, and the Netherlands began to establish colonies and trade networks of their own in the Americas and Asia. A series of wars in the 17th and 18th centuries with the Netherlands and France left Britain the dominant colonial power in North America. Britain became a major power in the Indian subcontinent after the East India Company's conquest of Mughal Bengal at the Battle of Plassey in 1757. The American War of Independence resulted in Britain losing some of its oldest and most populous colonies in North America by 1783. While retaining control of British North America (now Canada) and territories in and near the Caribbean in the British West Indies, British colonial expansion turned towards Asia, Africa, and the Pacific. After the defeat of France in the Napoleonic Wars (1803–1815), Britain emerged as the principal naval and imperial power of the 19th century and expanded its imperial holdings. It pursued trade concessions in China and Japan, and territory in Southeast Asia. The Great Game and Scramble for Africa also ensued. The period of relative peace (1815–1914) during which the British Empire became the global hegemon was later described as Pax Britannica (Latin for "British Peace"). Alongside the formal control that Britain exerted over its colonies, its dominance of much of world trade, and of its oceans, meant that it effectively controlled the economies of, and readily enforced its interests in, many regions, such as Asia and Latin America. It also came to dominate the Middle East. Increasing degrees of autonomy were granted to its white settler colonies, some of which were formally reclassified as Dominions by the 1920s. By the start of the 20th century, Germany and the United States had begun to challenge Britain's economic lead. Military, economic and colonial tensions between Britain and Germany were major causes of the First World War, during which Britain relied heavily on its empire. The conflict placed enormous strain on its military, financial, and manpower resources. Although the empire achieved its largest territorial extent immediately after the First World War, Britain was no longer the world's preeminent industrial or military power. In the Second World War, Britain's colonies in East Asia and Southeast Asia were occupied by the Empire of Japan. 
Despite the final victory of Britain and its allies, the damage to British prestige and the British economy helped accelerate the decline of the empire. India, Britain's most valuable and populous possession, achieved independence in 1947 as part of a larger decolonisation movement, in which Britain granted independence to most territories of the empire. The Suez Crisis of 1956 confirmed Britain's decline as a global power, and the handover of Hong Kong to China on 1 July 1997 symbolised for many the end of the British Empire, though fourteen overseas territories that are remnants of the empire remain under British sovereignty. After independence, many former British colonies, along with most of the dominions, joined the Commonwealth of Nations, a free association of independent states. Fifteen of these, including the United Kingdom, retain the same person as monarch, currently King Charles III. Origins (1497–1583) The foundations of the overseas British Empire were laid when England and Scotland were separate kingdoms, and, at the time ruled by different monarchs. In 1496, King Henry VII of England, following the successes of Spain and Portugal in overseas exploration, commissioned John Cabot to lead an expedition to discover a northwest passage to Asia via the North Atlantic. Cabot sailed in 1497, five years after the first voyage of Christopher Columbus, and made landfall on the coast of Newfoundland. He believed he had reached Asia, and there was no attempt to found a colony. Cabot led another voyage to the Americas the following year but did not return; it is unknown what happened to his ships. There is "inferential evidence [that] suggests that Bristol mariners may already have known of that North American shore and had been harvesting its rich seas for a decade or more", but England made no formal claims. No further attempts to establish English colonies in the Americas were made until well into the reign of Queen Elizabeth I, during the last decades of the 16th century. In the meantime, Henry VIII's 1533 Statute in Restraint of Appeals had declared "that this realm of England is an Empire". The Protestant Reformation turned England and Catholic Spain into implacable enemies. In 1562, Elizabeth I encouraged the privateers John Hawkins and Francis Drake to engage in slave-raiding attacks against Spanish and Portuguese ships off the coast of West Africa with the aim of establishing an Atlantic slave trade. This effort was rebuffed and later, as the Anglo-Spanish Wars intensified, Elizabeth I gave her blessing to further privateering raids against Spanish ports in the Americas and shipping that was returning across the Atlantic, laden with treasure from the New World. At the same time, influential writers such as Richard Hakluyt and John Dee (who was the first to use the term "British Empire") were beginning to press for the establishment of England's own empire. By this time, Spain had become the dominant power in the Americas and was exploring the Pacific Ocean, Portugal had established trading posts and forts from the coasts of Africa and Brazil to China, and France had begun to settle the Saint Lawrence River area, later to become New France. Although England tended to trail behind Portugal, Spain, and France in establishing overseas colonies, it carried out its first modern colonisation, in sixteenth-century Ireland, with settlements, known as "plantations", of English and Welsh Protestants. 
England had already colonised part of the country following the Norman invasion of Ireland in 1169, but English authority had waned over the centuries. During the reign of Elizabeth I, a pattern of conquest and attempts at colonization was "contemporaneous with and parallel to the first effective of Englishmen in North America". Several men who helped establish the Munster plantations in Ireland later played a part in the early colonisation of North America, particularly a group known as the West Country Men. English overseas possessions (1583–1707) In 1578, Elizabeth I granted a patent to Humphrey Gilbert for discovery and overseas exploration. That year, Gilbert sailed for the Caribbean with the intention of engaging in piracy and establishing a colony in North America, but the expedition was aborted before it had crossed the Atlantic. In 1583, he embarked on a second attempt. On this occasion, he formally claimed the harbour of the island of Newfoundland, although no settlers were left behind. Gilbert did not survive the return journey to England and was succeeded by his half-brother, Walter Raleigh, who was granted his own patent by Elizabeth in 1584. Later that year, Raleigh founded the Roanoke Colony on the coast of present-day North Carolina, but lack of supplies caused the colony to fail. In 1603, James VI of Scotland ascended (as James I) to the English throne and in 1604 negotiated the Treaty of London, ending hostilities with Spain. Now at peace with its main rival, English attention shifted from preying on other nations' colonial infrastructures to the business of establishing its own overseas colonies. The British Empire began to take shape during the early 17th century, with the English settlement of North America and the smaller islands of the Caribbean, and the establishment of joint-stock companies, most notably the East India Company, to administer colonies and overseas trade. This period, until the loss of the Thirteen Colonies after the American War of Independence towards the end of the 18th century, has been referred to by some historians as the "First British Empire". England's early efforts at colonisation in the Americas met with mixed success. An attempt to establish a colony in Guiana in 1604 lasted only two years and failed in its main objective to find gold deposits. Colonies on the Caribbean islands of St Lucia (1605) and Grenada (1609) rapidly folded. The first permanent English settlement in the Americas was founded in 1607 in Jamestown by Captain John Smith, and managed by the Virginia Company; the Crown took direct control of the venture in 1624, thereby founding the Colony of Virginia. Bermuda was settled and claimed by England as a result of the 1609 shipwreck of the Virginia Company's flagship, while attempts to settle Newfoundland were largely unsuccessful. In 1620, Plymouth was founded as a haven by Puritan religious separatists, later known as the Pilgrims. Fleeing from religious persecution would become the motive for many English would-be colonists to risk the arduous trans-Atlantic voyage: Maryland was established by English Roman Catholics (1634), Rhode Island (1636) as a colony tolerant of all religions and Connecticut (1639) for Congregationalists. England's North American holdings were further expanded by the annexation of the Dutch colony of New Netherland in 1664, following the capture of New Amsterdam, which was renamed New York. 
Although less financially successful than colonies in the Caribbean, these territories had large areas of good agricultural land and attracted far greater numbers of English emigrants, who preferred their temperate climates. The British West Indies initially provided England's most important and lucrative colonies. Settlements were successfully established in St. Kitts (1624), Barbados (1627) and Nevis (1628), but struggled until the "Sugar Revolution" transformed the Caribbean economy in the mid-17th century. Large sugarcane plantations were first established in the 1640s on Barbados, with assistance from Dutch merchants and Sephardic Jews fleeing Portuguese Brazil. At first, sugar was grown primarily using white indentured labour, but rising costs soon led English traders to embrace the use of imported African slaves. The enormous wealth generated by slave-produced sugar made Barbados the most successful colony in the Americas, and one of the most densely populated places in the world. This boom led to the spread of sugar cultivation across the Caribbean, financed the development of non-plantation colonies in North America, and accelerated the growth of the Atlantic slave trade, particularly the triangular trade of slaves, sugar and provisions between Africa, the West Indies and Europe. To ensure that the increasingly healthy profits of colonial trade remained in English hands, Parliament decreed in 1651 that only English ships would be able to ply their trade in English colonies. This led to hostilities with the United Dutch Provinces—a series of Anglo-Dutch Wars—which would eventually strengthen England's position in the Americas at the expense of the Dutch. In 1655, England annexed the island of Jamaica from the Spanish, and in 1666 succeeded in colonising the Bahamas. In 1670, Charles II incorporated by royal charter the Hudson's Bay Company (HBC), granting it a monopoly on the fur trade in the area known as Rupert's Land, which would later form a large proportion of the Dominion of Canada. Forts and trading posts established by the HBC were frequently the subject of attacks by the French, who had established their own fur trading colony in adjacent New France. Two years later, the Royal African Company was granted a monopoly on the supply of slaves to the British colonies in the Caribbean. The company would transport more slaves across the Atlantic than any other, and significantly grew England's share of the trade, from 33 per cent in 1673 to 74 per cent in 1683. The removal of this monopoly between 1688 and 1712 allowed independent British slave traders to thrive, leading to a rapid escalation in the number of slaves transported. British ships carried a third of all slaves shipped across the Atlantic—approximately 3.5 million Africans—until the abolition of the trade by Parliament in 1807 (see § Abolition of slavery). To facilitate the shipment of slaves, forts were established on the coast of West Africa, such as James Island, Accra and Bunce Island. In the British Caribbean, the percentage of the population of African descent rose from 25 per cent in 1650 to around 80 per cent in 1780, and in the Thirteen Colonies from 10 per cent to 40 per cent over the same period (the majority in the southern colonies). The transatlantic slave trade played a pervasive role in British economic life, and became a major economic mainstay for western port cities. Ships registered in Bristol, Liverpool and London were responsible for the bulk of British slave trading. 
For the transported, harsh and unhygienic conditions on the slaving ships and poor diets meant that the average mortality rate during the Middle Passage was one in seven. At the end of the 16th century, England and the Dutch Empire began to challenge the Portuguese Empire's monopoly of trade with Asia, forming private joint-stock companies to finance the voyages—the English, later British, East India Company and the Dutch East India Company, chartered in 1600 and 1602 respectively. The primary aim of these companies was to tap into the lucrative spice trade, an effort focused mainly on two regions: the East Indies archipelago, and an important hub in the trade network, India. There, they competed for trade supremacy with Portugal and with each other. Although England eclipsed the Netherlands as a colonial power, in the short term the Netherlands' more advanced financial system and the three Anglo-Dutch Wars of the 17th century left it with a stronger position in Asia. Hostilities ceased after the Glorious Revolution of 1688 when the Dutch William of Orange ascended the English throne, bringing peace between the Dutch Republic and England. A deal between the two nations left the spice trade of the East Indies archipelago to the Netherlands and the textiles industry of India to England, but textiles soon overtook spices in terms of profitability. Peace between England and the Netherlands in 1688 meant the two countries entered the Nine Years' War as allies, but the conflict—waged in Europe and overseas between France, Spain and the Anglo-Dutch alliance—left the English a stronger colonial power than the Dutch, who were forced to devote a larger proportion of their military budget to the costly land war in Europe. The death of Charles II of Spain in 1700 and his bequeathal of Spain and its colonial empire to Philip V of Spain, a grandson of the King of France, raised the prospect of the unification of France, Spain and their respective colonies, an unacceptable state of affairs for England and the other powers of Europe. In 1701, England, Portugal and the Netherlands sided with the Holy Roman Empire against Spain and France in the War of the Spanish Succession, which lasted for thirteen years. Colonies and territories of Scotland (1629–1707) Colonisation attempts by the Kingdom of Scotland predate the establishment of the United Kingdom of Great Britain and the British Empire. During the 17th century, the Kingdom of Scotland made attempts to establish trading schemes in Ireland and Canada. Nova Scotia, today a province of Canada, was a short-lived scheme in the territory of the Mi'kmaq people. Additionally, a large number of Scots settled in Ireland as planters, particularly in the region of Ulster. The first documented Scottish settlement in the Americas was that of Nova Scotia in 1629. On 29 September 1621, the charter for the foundation of a colony was granted by James VI of Scotland to Sir William Alexander. Between 1622 and 1628, Sir William launched four attempts to send colonists to Nova Scotia; all failed for various reasons. A successful settlement of Nova Scotia was finally achieved in 1629. The colony's charter, in law, made Nova Scotia (defined as all land between Newfoundland and New England; i.e., The Maritimes) a part of mainland Scotland; this was later used to get around the English Navigation Acts. As a country, Scotland engaged in two ventures during the 1680s to establish colonies within the English colonies of Carolina and East New Jersey.
The attempt to establish a settlement in Carolina was largely due to Scottish Presbyterians fleeing the threat of religious persecution in Scotland; they were assisted in their efforts by Scottish merchants aiming to develop transatlantic trade that was not subject to the English Navigation Acts, which restricted Scottish trade with English colonies. Scottish settlement attempts in Carolina began in 1682 and ended when the Scottish settlement of Stewartstoun, established in 1684, was destroyed by the Spanish in 1686. In 1695, the Parliament of Scotland granted a charter to the Company of Scotland, which established a settlement in 1698 on the Isthmus of Panama. Besieged by neighbouring Spanish colonists of New Granada, and affected by malaria, the colony was abandoned two years later. The Darien scheme was a financial disaster for Scotland: a quarter of Scottish capital was lost in the enterprise. The episode had major political consequences, helping to persuade the government of the Kingdom of Scotland of the merits of turning the personal union with England into a political and economic one under the Kingdom of Great Britain established by the Acts of Union 1707. Expansion and colonial conflict (1707–1783) The 18th century saw the newly united Great Britain rise to be the world's dominant colonial power, with France becoming its main rival on the imperial stage. Great Britain, Portugal, the Netherlands, and the Holy Roman Empire continued the War of the Spanish Succession, which lasted until 1714 and was concluded by the Treaty of Utrecht. Philip V of Spain renounced his and his descendants' claim to the French throne, and Spain lost its empire in Europe. The British Empire was territorially enlarged: from France, Britain gained Newfoundland and Acadia, and from Spain, Gibraltar and Menorca. Gibraltar became a critical naval base and allowed Britain to control the Atlantic entry and exit point to the Mediterranean. Spain ceded the rights to the lucrative asiento (permission to sell African slaves in Spanish America) to Britain. With the outbreak of the Anglo-Spanish War of Jenkins' Ear in 1739, Spanish privateers attacked British merchant shipping along the Triangle Trade routes. In 1746, the Spanish and British began peace talks, with the King of Spain agreeing to stop all attacks on British shipping; however, in the 1750 Treaty of Madrid, Britain lost its slave-trading rights in Latin America. In the East Indies, British and Dutch merchants continued to compete in spices and textiles. With textiles becoming the larger trade, by 1720 the British company had overtaken the Dutch in terms of sales. During the middle decades of the 18th century, there were several outbreaks of military conflict on the Indian subcontinent, as the English East India Company and its French counterpart struggled alongside local rulers to fill the vacuum that had been left by the decline of the Mughal Empire. The Battle of Plassey in 1757, in which the British defeated the Nawab of Bengal and his French allies, left the British East India Company in control of Bengal and as a major military and political power in India. France was left in control of its enclaves but with military restrictions and an obligation to support British client states, ending French hopes of controlling India.
In the following decades the British East India Company gradually increased the size of the territories under its control, either ruling directly or via local rulers under the threat of force from the Presidency Armies, the vast majority of which was composed of Indian sepoys, led by British officers. The British and French struggles in India became but one theatre of the global Seven Years' War (1756–1763) involving France, Britain, and the other major European powers. The signing of the Treaty of Paris of 1763 had important consequences for the future of the British Empire. In North America, France's future as a colonial power effectively ended with the recognition of British claims to Rupert's Land, and the ceding of New France to Britain (leaving a sizeable French-speaking population under British control) and Louisiana to Spain. Spain ceded the Floridas to Britain. Along with its victory over France in India, the Seven Years' War therefore left Britain as the world's most powerful maritime power. During the 1760s and early 1770s, relations between the Thirteen Colonies and Britain became increasingly strained, primarily because of resentment of the British Parliament's attempts to govern and tax American colonists without their consent. This was summarised at the time by the colonists' slogan "No taxation without representation", a perceived violation of the guaranteed Rights of Englishmen. The American Revolution began with a rejection of Parliamentary authority and moves towards self-government. In response, Britain sent troops to reimpose direct rule, leading to the outbreak of war in 1775. The following year, in 1776, the Second Continental Congress issued the Declaration of Independence proclaiming the colonies' sovereignty from the British Empire as the new United States of America. The entry of French and Spanish forces into the war tipped the military balance in the Americans' favour and after a decisive defeat at Yorktown in 1781, Britain began negotiating peace terms. American independence was acknowledged at the Peace of Paris in 1783. The loss of such a large portion of British America, at the time Britain's most populous overseas possession, is seen by some historians as the event defining the transition between the first and second empires, in which Britain shifted its attention away from the Americas to Asia, the Pacific and later Africa. Adam Smith's Wealth of Nations, published in 1776, had argued that colonies were redundant, and that free trade should replace the old mercantilist policies that had characterised the first period of colonial expansion, dating back to the protectionism of Spain and Portugal. The growth of trade between the newly independent United States and Britain after 1783 seemed to confirm Smith's view that political control was not necessary for economic success. The war to the south influenced British policy in Canada, where between 40,000 and 100,000 defeated Loyalists had migrated from the new United States following independence. The 14,000 Loyalists who went to the Saint John and Saint Croix river valleys, then part of Nova Scotia, felt too far removed from the provincial government in Halifax, so London split off New Brunswick as a separate colony in 1784. 
The Constitutional Act 1791 created the provinces of Upper Canada (mainly English speaking) and Lower Canada (mainly French-speaking) to defuse tensions between the French and British communities, and implemented governmental systems similar to those employed in Britain, with the intention of asserting imperial authority and not allowing the sort of popular control of government that was perceived to have led to the American Revolution. Tensions between Britain and the United States escalated again during the Napoleonic Wars, as Britain tried to cut off American trade with France and boarded American ships to impress men into the Royal Navy. The United States Congress declared war, the War of 1812, and invaded Canadian territory. In response, Britain invaded the US, but the pre-war boundaries were reaffirmed by the 1814 Treaty of Ghent, ensuring Canada's future would be separate from that of the United States. Consolidation and global dominance (1783–1815) Since 1718, transportation to the American colonies had been a penalty for various offences in Britain, with approximately one thousand convicts transported per year. Forced to find an alternative location after the loss of the Thirteen Colonies in 1783, the British government looked for an alternative, eventually turning to Australia. On his first of three voyages commissioned by the government, James Cook reached New Zealand in October 1769. He was the first European to circumnavigate and map the country. From the late 18th century, the country was regularly visited by explorers and other sailors, missionaries, traders and adventurers but no attempt was made to settle the country or establish possession. The coast of Australia had been discovered for Europeans by the Dutch in 1606, but there was no attempt to colonise it. In 1770, after leaving New Zealand, James Cook charted the eastern coast, claimed the continent for Britain, and named it New South Wales. In 1778, Joseph Banks, Cook's botanist on the voyage, presented evidence to the government on the suitability of Botany Bay for the establishment of a penal settlement, and in 1787 the first shipment of convicts set sail, arriving in 1788. Unusually, Australia was claimed through proclamation. Indigenous Australians were considered too uncivilised to require treaties, and colonisation brought disease and violence that together with the deliberate dispossession of land and culture were devastating to these peoples. Britain continued to transport convicts to New South Wales until 1840, to Tasmania until 1853 and to Western Australia until 1868. The Australian colonies became profitable exporters of wool and gold, mainly because of the Victorian gold rush, making its capital Melbourne for a time the richest city in the world. The British also expanded their mercantile interests in the North Pacific. Spain and Britain had become rivals in the area, culminating in the Nootka Crisis in 1789. Both sides mobilised for war, but when France refused to support Spain, it was forced to back down, leading to the Nootka Convention. The outcome was a humiliation for Spain, which practically renounced all sovereignty on the North Pacific coast. This opened the way to British expansion in the area, and a number of expeditions took place; firstly a naval expedition led by George Vancouver which explored the inlets around the Pacific North West, particularly around Vancouver Island. On land, expeditions sought to discover a river route to the Pacific for the extension of the North American fur trade. 
Alexander Mackenzie of the North West Company led the first, starting out in 1792, and a year later he became the first European to reach the Pacific overland north of the Rio Grande, reaching the ocean near present-day Bella Coola. This preceded the Lewis and Clark Expedition by twelve years. Shortly thereafter, Mackenzie's companion, John Finlay, founded the first permanent European settlement in British Columbia, Fort St. John. The North West Company sought further exploration and backed expeditions by David Thompson, starting in 1797, and later by Simon Fraser. These pushed into the wilderness territories of the Rocky Mountains and Interior Plateau to the Strait of Georgia on the Pacific Coast, expanding British North America westward. The East India Company fought a series of Anglo-Mysore wars in Southern India with the Sultanate of Mysore under Hyder Ali and then Tipu Sultan. Defeats in the First Anglo-Mysore war and stalemate in the Second were followed by victories in the Third and the Fourth. Following Tipu Sultan's death in the fourth war in the Siege of Seringapatam (1799), the kingdom became a protectorate of the company. The East India Company fought three Anglo-Maratha Wars with the Maratha Confederacy. The First Anglo-Maratha War ended in 1782 with a restoration of the pre-war status quo. The Second and Third Anglo-Maratha wars resulted in British victories. After the surrender of Peshwa Bajirao II in 1818, the East India Company acquired control of a large majority of the Indian subcontinent. Britain was challenged again by France under Napoleon, in a struggle that, unlike previous wars, represented a contest of ideologies between the two nations. It was not only Britain's position on the world stage that was at risk: Napoleon threatened to invade Britain itself, just as his armies had overrun many countries of continental Europe. The Napoleonic Wars were therefore ones in which Britain invested large amounts of capital and resources to win. French ports were blockaded by the Royal Navy, which won a decisive victory over a French Imperial Navy-Spanish Navy fleet at the Battle of Trafalgar in 1805. Overseas colonies were attacked and occupied, including those of the Netherlands, which was annexed by Napoleon in 1810. France was finally defeated by a coalition of European armies in 1815. Britain was again the beneficiary of peace treaties: France ceded the Ionian Islands, Malta (which it had occupied in 1798), Mauritius, St Lucia, the Seychelles, and Tobago; Spain ceded Trinidad; the Netherlands ceded Guiana, Ceylon and the Cape Colony, while the Danish ceded Heligoland. Britain returned Guadeloupe, Martinique, French Guiana, and Réunion to France; Menorca to Spain; Danish West Indies to Denmark and Java and Suriname to the Netherlands. With the advent of the Industrial Revolution, goods produced by slavery became less important to the British economy. Added to this was the cost of suppressing regular slave rebellions. With support from the British abolitionist movement, Parliament enacted the Slave Trade Act in 1807, which abolished the slave trade in the empire. In 1808, Sierra Leone Colony was designated an official British colony for freed slaves. Parliamentary reform in 1832 saw the influence of the West India Committee decline. 
The Slavery Abolition Act, passed the following year, abolished slavery in the British Empire on 1 August 1834, finally bringing the empire into line with the law in the UK (with the exception of the territories administered by the East India Company and Ceylon, where slavery was ended in 1844). Under the Act, slaves were granted full emancipation after a period of four to six years of "apprenticeship". Facing further opposition from abolitionists, the apprenticeship system was abolished in 1838. The British government compensated slave-owners. Britain's imperial century (1815–1914) During Britain's "imperial century" between 1815 and 1914, around 10 million sq mi (26 million km2) of territory and roughly 400 million people were added to the British Empire. Victory over Napoleon left Britain without any serious international rival, other than Russia in Central Asia. Unchallenged at sea, Britain became the global policeman, a state of affairs later known as the Pax Britannica, and a foreign policy of "splendid isolation". Alongside the formal control it exerted over its own colonies, Britain's dominant position in world trade meant that it effectively controlled the economies of many countries, such as China, Argentina and Siam, which were seen as its "informal empire". British imperial strength was underpinned by the steamship and the telegraph, new technologies invented in the second half of the 19th century, allowing it to control and defend the empire. By 1902, the British Empire was linked together by a network of telegraph cables, the All Red Line. The East India Company drove the expansion of the British Empire in Asia. The company's army had first joined forces with the Royal Navy during the Seven Years' War, and the two continued to co-operate in arenas outside India: the eviction of the French from Egypt (1799), the capture of Java from the Netherlands (1811), the acquisition of Penang Island (1786), Singapore (1819) and Malacca (1824), and the defeat of Burma (1826). From its base in India, the company had been engaged in an increasingly profitable opium export trade to Qing China since the 1730s. This trade, illegal since it was outlawed by China in 1729, helped reverse the trade imbalances resulting from the British imports of tea, which saw large outflows of silver from Britain to China. In 1839, the confiscation by the Chinese authorities at Canton of 20,000 chests of opium led Britain to attack China in the First Opium War, and resulted in the seizure by Britain of Hong Kong Island, at that time a minor settlement, and other treaty ports including Shanghai. During the late 18th and early 19th centuries, the British Crown began to assume an increasingly large role in the affairs of the company. A series of acts of Parliament were passed, including the Regulating Act 1773, East India Company Act 1784 and the Charter Act 1813 which regulated the company's affairs and established the sovereignty of the Crown over the territories that it had acquired. The company's eventual end was precipitated by the Indian Rebellion in 1857, a conflict that had begun with the mutiny of sepoys, Indian troops under British officers and discipline. The rebellion took six months to suppress, with heavy loss of life on both sides. 
The following year the British government dissolved the company and assumed direct control over India through the Government of India Act 1858, establishing the British Raj, in which an appointed governor-general administered India and Queen Victoria was later proclaimed Empress of India. India became the empire's most valuable possession, "the Jewel in the Crown", and was the most important source of Britain's strength. A series of serious crop failures in the late 19th century led to widespread famines on the subcontinent in which it is estimated that over 15 million people died. The East India Company had failed to implement any coordinated policy to deal with the famines during its period of rule. Later, under direct British rule, commissions were set up after each famine to investigate the causes and implement new policies, which took until the early 1900s to have an effect. On each of his three voyages to the Pacific between 1769 and 1777, James Cook visited New Zealand. He was followed by an assortment of Europeans and Americans, including whalers, sealers, escaped convicts from New South Wales, missionaries and adventurers. Initially, contact with the indigenous Māori people was limited to the trading of goods, although interaction increased during the early decades of the 19th century with many trading and missionary stations being set up, especially in the north. The first of several Church of England missionaries arrived in 1814 and, as well as their missionary role, they soon became the only form of European authority in a land that was not subject to British jurisdiction, the closest authority being the New South Wales governor in Sydney. From 1818 on, the sale of weapons to Māori fuelled the intertribal warfare of the Musket Wars, with devastating consequences for the Māori population. The UK government finally decided to act, dispatching Captain William Hobson with instructions to take formal possession after obtaining native consent. There was no central Māori authority able to represent all New Zealand so, on 6 February 1840, Hobson and many Māori chiefs signed the Treaty of Waitangi in the Bay of Islands, with most other chiefs signing in stages over the following months. William Hobson declared British sovereignty over all New Zealand on 21 May 1840, over the North Island by cession and over the South Island by discovery (the island was sparsely populated and deemed terra nullius). Hobson became Lieutenant-Governor, subject to Governor Sir George Gipps in Sydney, with British possession of New Zealand initially administered from Australia as a dependency of the New South Wales colony. From 16 June 1840 New South Wales laws applied in New Zealand. This transitional arrangement ended with the Charter for Erecting the Colony of New Zealand on 16 November 1840. The Charter stated that New Zealand would be established as a separate Crown colony on 3 May 1841 with Hobson as its governor. During the 19th century, Britain and the Russian Empire vied to fill the power vacuums that had been left by the declining Ottoman Empire, Qajar dynasty and Qing dynasty. This rivalry in Central Asia came to be known as the "Great Game". As far as Britain was concerned, defeats inflicted by Russia on Persia and Turkey demonstrated Russia's imperial ambitions and capabilities and stoked fears in Britain of an overland invasion of India. In 1839, Britain moved to pre-empt this by invading Afghanistan, but the First Anglo-Afghan War was a disaster for Britain.
When Russia invaded the Ottoman Balkans in 1853, fears of Russian dominance in the Mediterranean and the Middle East led Britain and France to enter the war in support of the Ottoman Empire and invade the Crimean Peninsula to destroy Russian naval capabilities. The ensuing Crimean War (1854–1856), which involved new techniques of modern warfare, was the only global war fought between Britain and another imperial power during the Pax Britannica and was a resounding defeat for Russia. The situation remained unresolved in Central Asia for two more decades, with Britain annexing Baluchistan in 1876 and Russia annexing Kirghizia, Kazakhstan, and Turkmenistan. For a while, it appeared that another war would be inevitable, but the two countries reached an agreement on their respective spheres of influence in the region in 1878 and on all outstanding matters in 1907 with the signing of the Anglo-Russian Entente. The destruction of the Imperial Russian Navy by the Imperial Japanese Navy at the Battle of Tsushima during the Russo-Japanese War of 1904–1905 limited its threat to the British. The Dutch East India Company had founded the Dutch Cape Colony on the southern tip of Africa in 1652 as a way station for its ships travelling to and from its colonies in the East Indies. Britain formally acquired the colony, and its large Afrikaner (or Boer) population in 1806, having occupied it in 1795 to prevent its falling into French hands during the Flanders Campaign. British immigration to the Cape Colony began to rise after 1820, and pushed thousands of Boers, resentful of British rule, northwards to found their own—mostly short-lived—independent republics, during the Great Trek of the late 1830s and early 1840s. In the process the Voortrekkers clashed repeatedly with the British, who had their own agenda with regard to colonial expansion in South Africa and to the various native African polities, including those of the Sotho people and the Zulu Kingdom. Eventually, the Boers established two republics that had a longer lifespan: the South African Republic or Transvaal Republic (1852–1877; 1881–1902) and the Orange Free State (1854–1902). In 1902 Britain occupied both republics, concluding a treaty with the two Boer Republics following the Second Boer War (1899–1902). In 1869 the Suez Canal opened under Napoleon III, linking the Mediterranean Sea with the Indian Ocean. Initially the Canal was opposed by the British; but once opened, its strategic value was quickly recognised and became the "jugular vein of the Empire". In 1875, the Conservative government of Benjamin Disraeli bought the indebted Egyptian ruler Isma'il Pasha's 44 per cent shareholding in the Suez Canal for £4 million (equivalent to £480 million in 2023). Although this did not grant outright control of the strategic waterway, it did give Britain leverage. Joint Anglo-French financial control over Egypt ended in outright British occupation in 1882. Although Britain controlled the Khedivate of Egypt into the 20th century, it was officially a vassal state of the Ottoman Empire and not part of the British Empire. The French were still majority shareholders and attempted to weaken the British position, but a compromise was reached with the 1888 Convention of Constantinople, which made the Canal officially neutral territory. 
With competitive French, Belgian and Portuguese activity in the lower Congo River region undermining orderly colonisation of tropical Africa, the Berlin Conference of 1884–85 was held to regulate the competition between the European powers in what was called the "Scramble for Africa" by defining "effective occupation" as the criterion for international recognition of territorial claims. The scramble continued into the 1890s, and caused Britain to reconsider its decision in 1885 to withdraw from Sudan. A joint force of British and Egyptian troops defeated the Mahdist Army in 1896 and rebuffed an attempted French invasion at Fashoda in 1898. Sudan was nominally made an Anglo-Egyptian condominium, but a British colony in reality. British gains in Southern and East Africa prompted Cecil Rhodes, pioneer of British expansion in Southern Africa, to urge a "Cape to Cairo" railway linking the strategically important Suez Canal to the mineral-rich south of the continent. During the 1880s and 1890s, Rhodes, with his privately owned British South Africa Company, occupied and annexed territories named after him, Rhodesia. The path to independence for the white colonies of the British Empire began with the 1839 Durham Report, which proposed unification and self-government for Upper and Lower Canada, as a solution to political unrest which had erupted in armed rebellions in 1837. This began with the passing of the Act of Union in 1840, which created the Province of Canada. Responsible government was first granted to Nova Scotia in 1848, and was soon extended to the other British North American colonies. With the passage of the British North America Act, 1867 by the British Parliament, the Province of Canada, New Brunswick and Nova Scotia were formed into Canada, a confederation enjoying full self-government with the exception of international relations. Australia and New Zealand achieved similar levels of self-government after 1900, with the Australian colonies federating in 1901. The term "dominion status" was officially introduced at the 1907 Imperial Conference. As the dominions gained greater autonomy, they would come to be recognised as distinct realms of the empire with unique customs and symbols of their own. Imperial identity, through imagery such as patriotic artworks and banners, began developing into a form that attempted to be more inclusive by showcasing the empire as a family of newly birthed nations with common roots. The last decades of the 19th century saw concerted political campaigns for Irish home rule. Ireland had been united with Britain into the United Kingdom of Great Britain and Ireland with the Act of Union 1800 after the Irish Rebellion of 1798, and had suffered a severe famine between 1845 and 1852. Home rule was supported by the British prime minister, William Gladstone, who hoped that Ireland might follow in Canada's footsteps as a Dominion within the empire, but his 1886 Home Rule bill was defeated in Parliament. Although the bill, if passed, would have granted Ireland less autonomy within the UK than the Canadian provinces had within their own federation, many MPs feared that a partially independent Ireland might pose a security threat to Great Britain or mark the beginning of the break-up of the empire. A second Home Rule bill was defeated for similar reasons. A third bill was passed by Parliament in 1914, but not implemented because of the outbreak of the First World War leading to the 1916 Easter Rising. 
World wars (1914–1945) By the turn of the 20th century, fears had begun to grow in Britain that it would no longer be able to defend the metropole and the entirety of the empire while at the same time maintaining the policy of "splendid isolation". Germany was rapidly rising as a military and industrial power and was now seen as the most likely opponent in any future war. Recognising that it was overstretched in the Pacific and threatened at home by the Imperial German Navy, Britain formed an alliance with Japan in 1902 and with its old enemies France and Russia in 1904 and 1907, respectively. Britain's fears of war with Germany were realised in 1914 with the outbreak of the First World War. Britain quickly invaded and occupied most of Germany's overseas colonies in Africa. In the Pacific, Australia and New Zealand occupied German New Guinea and German Samoa respectively. Plans for a post-war division of the Ottoman Empire, which had joined the war on Germany's side, were secretly drawn up by Britain and France under the 1916 Sykes–Picot Agreement. This agreement was not divulged to the Sharif of Mecca, who the British had been encouraging to launch an Arab revolt against their Ottoman rulers, giving the impression that Britain was supporting the creation of an independent Arab state. The British declaration of war on Germany and its allies committed the colonies and Dominions, which provided invaluable military, financial and material support. Over 2.5 million men served in the armies of the Dominions, as well as many thousands of volunteers from the Crown colonies. The contributions of Australian and New Zealand troops during the 1915 Gallipoli Campaign against the Ottoman Empire had a great impact on the national consciousness at home and marked a watershed in the transition of Australia and New Zealand from colonies to nations in their own right. The countries continue to commemorate this occasion on Anzac Day. Canadians viewed the Battle of Vimy Ridge in a similar light. The important contribution of the Dominions to the war effort was recognised in 1917 by British prime minister David Lloyd George when he invited each of the Dominion prime ministers to join an Imperial War Cabinet to co-ordinate imperial policy. Under the terms of the concluding Treaty of Versailles signed in 1919, the empire reached its greatest extent with the addition of 1.8 million sq mi (4.7 million km2) and 13 million new subjects. The colonies of Germany and the Ottoman Empire were distributed to the Allied powers as League of Nations mandates. Britain gained control of Palestine, Transjordan, Iraq, parts of Cameroon and Togoland, and Tanganyika. The Dominions themselves acquired mandates of their own: the Union of South Africa gained South West Africa (modern-day Namibia), Australia gained New Guinea, and New Zealand Western Samoa. Nauru was made a combined mandate of Britain and the two Pacific Dominions. The changing world order that the war had brought about, in particular the growth of the United States and Japan as naval powers, and the rise of independence movements in India and Ireland, caused a major reassessment of British imperial policy. Forced to choose between alignment with the United States or Japan, Britain opted not to renew its Anglo-Japanese Alliance and instead signed the 1922 Washington Naval Treaty, where Britain accepted naval parity with the United States. 
This decision was the source of much debate in Britain during the 1930s as militaristic governments, helped in part by the Great Depression, took hold in Germany and Japan, for it was feared that the empire could not survive a simultaneous attack by both nations. The issue of the empire's security was a serious concern in Britain, as it was vital to the British economy. In 1919, the frustrations caused by delays to Irish home rule led the MPs of Sinn Féin, a pro-independence party that had won a majority of the Irish seats in the 1918 British general election, to establish an independent parliament in Dublin, at which Irish independence was declared. The Irish Republican Army simultaneously began a guerrilla war against the British administration. The Irish War of Independence ended in 1921 with a stalemate and the signing of the Anglo-Irish Treaty, creating the Irish Free State, a Dominion within the British Empire, with effective internal independence but still constitutionally linked with the British Crown. Northern Ireland, consisting of six of the 32 Irish counties and established as a devolved region under the 1920 Government of Ireland Act, immediately exercised its option under the treaty to retain its existing status within the United Kingdom. A similar struggle began in India when the Government of India Act 1919 failed to satisfy the demand for independence. Concerns over communist and foreign plots following the Ghadar conspiracy ensured that war-time strictures were renewed by the Rowlatt Acts. This led to tension, particularly in the Punjab region, where repressive measures culminated in the Amritsar Massacre. In Britain, public opinion was divided over the morality of the massacre, between those who saw it as having saved India from anarchy, and those who viewed it with revulsion. The non-cooperation movement that Gandhi subsequently launched in protest was called off in March 1922 following the Chauri Chaura incident, and discontent continued to simmer for the next 25 years. In 1922, Egypt, which had been declared a British protectorate at the outbreak of the First World War, was granted formal independence, though it continued to be a British client state until 1954. British troops remained stationed in Egypt until the signing of the Anglo-Egyptian Treaty in 1936, under which it was agreed that the troops would withdraw but continue to occupy and defend the Suez Canal zone. In return, Egypt was assisted in joining the League of Nations. Iraq, a British mandate since 1920, gained membership of the League in its own right after achieving independence from Britain in 1932. In Palestine, Britain was presented with the problem of mediating between the Arabs and increasing numbers of Jews. The Balfour Declaration, which had been incorporated into the terms of the mandate, stated that a national home for the Jewish people would be established in Palestine, and Jewish immigration allowed up to a limit that would be determined by the mandatory power. This led to increasing conflict with the Arab population, who openly revolted in 1936. As the threat of war with Germany increased during the 1930s, Britain judged the support of the Arabs as more important than the establishment of a Jewish homeland, and shifted to a pro-Arab stance, limiting Jewish immigration and in turn triggering a Jewish insurgency. The right of the Dominions to set their own foreign policy, independent of Britain, was recognised at the 1923 Imperial Conference. 
Britain's request for military assistance from the Dominions at the outbreak of the Chanak Crisis the previous year had been turned down by Canada and South Africa, and Canada had refused to be bound by the 1923 Treaty of Lausanne. After pressure from the Irish Free State and South Africa, the 1926 Imperial Conference issued the Balfour Declaration of 1926, declaring Britain and the Dominions to be "autonomous Communities within the British Empire, equal in status, in no way subordinate one to another" within a "British Commonwealth of Nations". This declaration was given legal substance under the 1931 Statute of Westminster. The parliaments of Canada, Australia, New Zealand, the Union of South Africa, the Irish Free State and Newfoundland were now independent of British legislative control: they could nullify British laws, and Britain could no longer pass laws for them without their consent. Newfoundland reverted to colonial status in 1933, suffering from financial difficulties during the Great Depression. In 1937 the Irish Free State introduced a republican constitution, renaming itself Ireland. Britain's declaration of war against Nazi Germany in September 1939 included the Crown colonies and India but did not automatically commit the Dominions of Australia, Canada, New Zealand, Newfoundland and South Africa. All soon declared war on Germany. While Britain continued to regard Ireland as still within the British Commonwealth, Ireland chose to remain legally neutral throughout the war. After the Fall of France in June 1940, Britain and the empire stood alone against Germany, until the German invasion of Greece on 7 April 1941. British Prime Minister Winston Churchill successfully lobbied US President Franklin D. Roosevelt for military aid from the United States, but Roosevelt was not yet ready to ask Congress to commit the country to war. In August 1941, Churchill and Roosevelt met and agreed the Atlantic Charter, which included the statement that "the rights of all peoples to choose the form of government under which they live" should be respected. This wording was ambiguous as to whether it referred to European countries invaded by Germany and Italy, or the peoples colonised by European nations, and would later be interpreted differently by the British, Americans, and nationalist movements. Nevertheless, Churchill rejected its universal applicability when it came to the self-determination of subject nations, including the British Indian Empire. Churchill added that he had not become Prime Minister to oversee the liquidation of the empire. For Churchill, the entry of the United States into the war was the "greatest joy". He felt that Britain was now assured of victory, but failed to recognise that the "many disasters, immeasurable costs and tribulations [which he knew] lay ahead" in December 1941 would have permanent consequences for the future of the empire. The manner in which British forces were rapidly defeated in the Far East, particularly the Fall of Singapore, which had previously been hailed as an impregnable fortress and the eastern equivalent of Gibraltar, irreversibly harmed Britain's standing and prestige as an imperial power. The realisation that Britain could not defend its entire empire pushed Australia and New Zealand, which now appeared threatened by Japanese forces, into closer ties with the United States and, ultimately, the 1951 ANZUS Pact. 
The war weakened the empire in other ways: undermining Britain's control of politics in India, inflicting long-term economic damage, and irrevocably changing geopolitics by pushing the Soviet Union and the United States to the centre of the global stage. Decolonisation and decline (1945–1997) Though Britain and the empire emerged victorious from the Second World War, the effects of the conflict were profound, both at home and abroad. Much of Europe, a continent that had dominated the world for several centuries, was in ruins, and host to the armies of the United States and the Soviet Union, who now held the balance of global power. Britain was left essentially bankrupt, with insolvency only averted in 1946 after the negotiation of a US$3.75 billion loan from the United States, the last instalment of which was repaid in 2006. At the same time, anti-colonial movements were on the rise in the colonies of European nations. The situation was complicated further by the increasing Cold War rivalry of the United States and the Soviet Union. In principle, both nations were opposed to European colonialism. In practice, American anti-communism prevailed over anti-imperialism, and therefore the United States supported the continued existence of the British Empire to keep Communist expansion in check. At first, British politicians believed it would be possible to maintain Britain's role as a world power at the head of a re-imagined Commonwealth, but by 1960 they were forced to recognise that there was an irresistible "wind of change" blowing. Their priorities changed to maintaining an extensive zone of British influence and ensuring that stable, non-Communist governments were established in former colonies. In this context, while other European powers such as France and Portugal waged costly and unsuccessful wars to keep their empires intact, Britain generally adopted a policy of peaceful disengagement from its colonies, although violence occurred in Malaya, Kenya and Palestine. Between 1945 and 1965, the number of people under British rule outside the UK itself fell from 700 million to 5 million, 3 million of whom were in Hong Kong. The pro-decolonisation Labour government, elected at the 1945 general election and led by Clement Attlee, moved quickly to tackle the most pressing issue facing the empire: Indian independence. India's major political party, the Indian National Congress (led by Mahatma Gandhi), had been campaigning for independence for decades, but disagreed with the Muslim League (led by Muhammad Ali Jinnah) as to how it should be implemented. Congress favoured a unified secular Indian state, whereas the League, fearing domination by the Hindu majority, desired a separate Islamic state for Muslim-majority regions. Increasing civil unrest led Attlee to promise independence no later than 30 June 1948. When the urgency of the situation and risk of civil war became apparent, the newly appointed (and last) Viceroy, Lord Mountbatten, hastily brought forward the date to 15 August 1947. The borders drawn by the British to broadly partition India into Hindu and Muslim areas left tens of millions as minorities in the newly independent states of India and Pakistan. The princely states were given the choice of remaining independent or joining India or Pakistan. Millions of Hindus and Sikhs crossed from Pakistan to India, and millions of Muslims moved in the opposite direction; violence between the two communities cost hundreds of thousands of lives. 
Burma, which had been administered as part of British India until 1937, gained independence the following year, in 1948, along with Sri Lanka (formerly known as British Ceylon). India, Pakistan and Sri Lanka became members of the Commonwealth, while Burma chose not to join. That same year, the British Nationality Act was enacted, in hopes of strengthening and unifying the Commonwealth: it provided British citizenship and the right of entry to all those living within its jurisdiction. The British Mandate in Palestine, where an Arab majority lived alongside a Jewish minority, presented the British with a similar problem to that of India. The matter was complicated by large numbers of Jewish refugees seeking to be admitted to Palestine following the Holocaust, while Arabs were opposed to the creation of a Jewish state. Frustrated by the intractability of the problem, attacks by Jewish paramilitary organisations and the increasing cost of maintaining its military presence, Britain announced in 1947 that it would withdraw in 1948 and leave the matter to the United Nations to solve. The UN General Assembly subsequently voted for a plan to partition Palestine into a Jewish and an Arab state. The vote was immediately followed by the outbreak of a civil war between the Arabs and Jews of Palestine, and British forces withdrew amid the fighting. The British Mandate for Palestine officially terminated at midnight on 15 May 1948 as the State of Israel declared independence and the 1948 Arab-Israeli War broke out, during which the territory of the former Mandate became divided between Israel and the surrounding Arab states. Amid the fighting, British forces continued to withdraw from Israel, with the last British troops departing from Haifa on 30 June 1948. Following the surrender of Japan in the Second World War, anti-Japanese resistance movements in Malaya turned their attention towards the British, who had moved quickly to retake control of the colony, valuing it as a source of rubber and tin. The fact that the guerrillas were primarily Malayan Chinese communists meant that the British attempt to quell the uprising was supported by the Muslim Malay majority, on the understanding that once the insurgency had been put down, independence would be granted. The Malayan Emergency, as it was called, began in 1948 and lasted until 1960, but by 1957, Britain felt confident enough to grant independence to the Federation of Malaya within the Commonwealth. In 1963, the 11 states of the federation together with Singapore, Sarawak and North Borneo joined to form Malaysia, but in 1965 Chinese-majority Singapore was expelled from the union following tensions between the Malay and Chinese populations and became an independent city-state. Brunei, which had been a British protectorate since 1888, declined to join the union. In the 1951 general election, the Conservative Party returned to power in Britain under the leadership of Winston Churchill. Churchill and the Conservatives believed that Britain's position as a world power relied on the continued existence of the empire, with the base at the Suez Canal allowing Britain to maintain its pre-eminent position in the Middle East in spite of the loss of India. 
Churchill could not ignore Gamal Abdel Nasser's new revolutionary government of Egypt, which had taken power in 1952, and the following year it was agreed that British troops would withdraw from the Suez Canal zone and that Sudan would be granted self-determination by 1955, with independence to follow; Sudan was granted independence on 1 January 1956. In July 1956, Nasser unilaterally nationalised the Suez Canal. The response of Anthony Eden, who had succeeded Churchill as Prime Minister, was to collude with France to engineer an Israeli attack on Egypt that would give Britain and France an excuse to intervene militarily and retake the canal. Eden infuriated US President Dwight D. Eisenhower by his lack of consultation, and Eisenhower refused to back the invasion. Another of Eisenhower's concerns was the possibility of a wider war with the Soviet Union after it threatened to intervene on the Egyptian side. Eisenhower applied financial leverage by threatening to sell US reserves of the British pound and thereby precipitate a collapse of the British currency. Though the invasion force was militarily successful in its objectives, UN intervention and US pressure forced Britain into a humiliating withdrawal of its forces, and Eden resigned. The Suez Crisis very publicly exposed Britain's limitations to the world and confirmed Britain's decline on the world stage and its end as a first-rate power, demonstrating that henceforth it could no longer act without at least the acquiescence, if not the full support, of the United States. The events at Suez wounded British national pride, leading one Member of Parliament (MP) to describe it as "Britain's Waterloo" and another to suggest that the country had become an "American satellite". Margaret Thatcher later described the mindset she believed had befallen Britain's political leaders after Suez, in which they "went from believing that Britain could do anything to an almost neurotic belief that Britain could do nothing", a mindset from which Britain did not recover until the successful recapture of the Falkland Islands from Argentina in 1982. While the Suez Crisis caused British power in the Middle East to weaken, it did not collapse. Britain again deployed its armed forces to the region, intervening in Oman (1957), Jordan (1958) and Kuwait (1961), though on these occasions with American approval, as the new Prime Minister Harold Macmillan's foreign policy was to remain firmly aligned with the United States. Although Britain granted Kuwait independence in 1961, it continued to maintain a military presence in the Middle East for another decade. On 16 January 1968, a few weeks after the devaluation of the pound, Prime Minister Harold Wilson and his Defence Secretary Denis Healey announced that British troops would be withdrawn from major military bases East of Suez, including those in the Middle East, and primarily from Malaysia and Singapore, by the end of 1971, instead of 1975 as earlier planned. By that time over 50,000 British military personnel were still stationed in the Far East, including 30,000 in Singapore. The British granted independence to the Maldives in 1965 but continued to station a garrison there until 1976, withdrew from Aden in 1967, and granted independence to Bahrain, Qatar, and the United Arab Emirates in 1971. Macmillan gave a speech in Cape Town, South Africa in February 1960 where he spoke of "the wind of change blowing through this continent". 
Macmillan wished to avoid the same kind of colonial war that France was fighting in Algeria, and under his premiership decolonisation proceeded rapidly. To the three colonies that had been granted independence in the 1950s—Sudan, the Gold Coast and Malaya—were added nearly ten times that number during the 1960s. Owing to the rapid pace of decolonisation during this period, the cabinet post of Secretary of State for the Colonies was abolished in 1966, along with the Colonial Office, which merged with the Commonwealth Relations Office to form the Foreign and Commonwealth Office (now the Foreign, Commonwealth and Development Office) in October 1968. Britain's remaining colonies in Africa, except for self-governing Southern Rhodesia, were all granted independence by 1968. British withdrawal from the southern and eastern parts of Africa was not a peaceful process. From 1952 the Kenya Colony saw the eight-year long Mau Mau rebellion, in which tens of thousands of suspected rebels were interned by the colonial government in detention camps to suppress the rebellion and over 1000 convicts executed, with records systematically destroyed. Throughout the 1960s, the British government took a "No independence until majority rule" policy towards decolonising the empire, leading the white minority government of Southern Rhodesia to enact the 1965 Unilateral Declaration of Independence from Britain, resulting in a civil war that lasted until the British-mediated Lancaster House Agreement of 1979. The agreement saw the British Empire temporarily re-establish the Colony of Southern Rhodesia from 1979 to 1980 as a transitionary government to a majority rule Republic of Zimbabwe. This was the last British possession in Africa. In Cyprus, a guerrilla war waged by the Greek Cypriot organisation EOKA against British rule, was ended in 1959 by the London and Zürich Agreements, which resulted in Cyprus being granted independence in 1960. The UK retained the military bases of Akrotiri and Dhekelia as sovereign base areas. The Mediterranean colony of Malta was amicably granted independence from the UK in 1964 and became the country of Malta, though the idea had been raised in 1955 of integration with Britain. Most of the UK's Caribbean territories achieved independence after the departure in 1961 and 1962 of Jamaica and Trinidad from the West Indies Federation, established in 1958 in an attempt to unite the British Caribbean colonies under one government, but which collapsed following the loss of its two largest members. Jamaica attained independence in 1962, as did Trinidad and Tobago. Barbados achieved independence in 1966 and the remainder of the eastern Caribbean islands, including the Bahamas, in the 1970s and 1980s, but Anguilla and the Turks and Caicos Islands opted to revert to British rule after they had already started on the path to independence. The British Virgin Islands, The Cayman Islands and Montserrat opted to retain ties with Britain, while Guyana achieved independence in 1966. Britain's last colony on the American mainland, British Honduras, became a self-governing colony in 1964 and was renamed Belize in 1973, achieving full independence in 1981. A dispute with Guatemala over claims to Belize was left unresolved. British Overseas Territories in the Pacific acquired independence in the 1970s beginning with Fiji in 1970 and ending with Vanuatu in 1980. 
Vanuatu's independence was delayed because of political conflict between English- and French-speaking communities, as the islands had been jointly administered as a condominium with France. Fiji, Papua New Guinea, Solomon Islands and Tuvalu became Commonwealth realms. By 1981, aside from a scattering of islands and outposts, the process of decolonisation that had begun after the Second World War was largely complete. In 1982, Britain's resolve in defending its remaining overseas territories was tested when Argentina invaded the Falkland Islands, acting on a long-standing claim that dated back to the Spanish Empire. Britain's successful military response to retake the Falkland Islands during the ensuing Falklands War contributed to reversing the downward trend in Britain's status as a world power. The 1980s saw Canada, Australia, and New Zealand sever their final constitutional links with Britain. Although these countries had been granted legislative independence by the Statute of Westminster 1931, vestigial constitutional links had remained in place. The British Parliament retained the power to amend key Canadian constitutional statutes, meaning that an act of the British Parliament was required to make certain changes to the Canadian Constitution. The British Parliament also had the power to pass laws extending to Canada at Canadian request. Although it could no longer pass laws applying to the Commonwealth of Australia, the British Parliament retained the power to legislate for the individual Australian states. With regard to New Zealand, the British Parliament retained the power to pass legislation applying to New Zealand with the New Zealand Parliament's consent. In 1982, the last legal link between Canada and Britain was severed by the Canada Act 1982, which was passed by the British Parliament, formally patriating the Canadian Constitution. The act ended the need for British involvement in changes to the Canadian constitution. Similarly, the Australia Act 1986 (effective 3 March 1986) severed the constitutional link between Britain and the Australian states, while New Zealand's Constitution Act 1986 (effective 1 January 1987) reformed the constitution of New Zealand to sever its constitutional link with Britain. On 1 January 1984, Brunei, Britain's last remaining Asian protectorate, was granted full independence. Independence had been delayed due to the opposition of the Sultan, who had preferred British protection. In September 1982 the Prime Minister, Margaret Thatcher, travelled to Beijing to negotiate with the Chinese Communist government on the future of Britain's last major and most populous overseas territory, Hong Kong. Under the terms of the 1842 Treaty of Nanking and 1860 Convention of Peking, Hong Kong Island and Kowloon Peninsula had been respectively ceded to Britain in perpetuity, but the majority of the colony consisted of the New Territories, which had been acquired under a 99-year lease in 1898, due to expire in 1997. Thatcher, seeing parallels with the Falkland Islands, initially wished to hold Hong Kong and proposed British administration with Chinese sovereignty, though this was rejected by China. A deal was reached in 1984: under the terms of the Sino-British Joint Declaration, Hong Kong would become a special administrative region of the People's Republic of China. The handover ceremony in 1997 marked for many, including King Charles III, then Prince of Wales, who was in attendance, "the end of Empire", though a number of British territories remain as remnants of the empire. 
Legacy Britain retains sovereignty over 14 territories outside the British Isles. In 1983, the British Nationality Act 1981 renamed the existing Crown Colonies as "British Dependent Territories",[a] and in 2002 they were renamed the British Overseas Territories. Most former British colonies and protectorates are members of the Commonwealth of Nations, a voluntary association of equal members, comprising a population of around 2.2 billion people. The United Kingdom and 14 other countries, all collectively known as the Commonwealth realms, voluntarily continue to share the same person—King Charles III—as their respective head of state. These 15 nations are distinct and equal legal entities: the United Kingdom, Australia, Canada, New Zealand, Antigua and Barbuda, The Bahamas, Belize, Grenada, Jamaica, Papua New Guinea, Saint Kitts and Nevis, Saint Lucia, Saint Vincent and the Grenadines, Solomon Islands and Tuvalu. During the colonial era, emphasis was given to the study of the classical Greco-Roman heritage and the Greek and Roman experience of empire, with the aim of understanding how that heritage could be applied to improve the future of the colonies. American hegemony, which throughout its early rise had challenged British claims of being the "New Rome", became the successor to British dominance in the mid-20th century; the two countries' historical ties and wartime collaboration supported a peaceful handoff of power after World War II. As for the United Kingdom itself, British views of the former Empire are more positive than is the case with other post-imperial nations; discourse around the former Empire has continued to impact the nation's present-day understanding of itself, as seen in the debate leading up to its decision to leave the European Union in 2016. Decades, and in some cases centuries, of British rule and emigration have left their mark on the independent nations that rose from the British Empire. The empire established the use of the English language in regions around the world. Today it is the primary language of up to 460 million people and is spoken by about 1.5 billion as a first, second or foreign language. It has also significantly influenced other languages. Individual and team sports developed in Britain, particularly football, cricket, lawn tennis and golf, were exported. Some sports were also invented or standardised in the former colonies, such as badminton, polo, and snooker in India (see also: Sport in British India). British missionaries, who travelled around the globe often in advance of soldiers and civil servants, spread Protestantism (including Anglicanism) to all continents. The British Empire provided refuge for religiously persecuted continental Europeans for hundreds of years. British educational institutions also remain popular in the present day, in part due to the importance of the English language and the similarity of curriculums in the former colonies to British ones. Political boundaries drawn by the British did not always reflect homogeneous ethnicities or religions, contributing to conflicts in formerly colonised areas. The British Empire was responsible for large migrations of peoples (see also: Commonwealth diaspora). Millions left the British Isles, with the founding settler colonist populations of the United States, Canada, Australia and New Zealand coming mainly from Britain and Ireland. 
Millions of people moved between British colonies, with large numbers of South Asian people emigrating to other parts of the empire, such as Malaysia and Fiji, and Overseas Chinese people to Malaysia, Singapore and the Caribbean; about half of all modern immigration to Commonwealth nations continues to come from other Commonwealth nations. The demographics of the United Kingdom changed after the Second World War owing to immigration to Britain from its former colonies. In the 19th century, innovation in Britain led to revolutionary changes in manufacturing, the development of factory systems, and the growth of transportation by railway and steamship. Debate has also occurred over the extent to which the Industrial Revolution, which originated in the United Kingdom, was facilitated by or dependent on imperialism. British colonial architecture, such as in churches, railway stations and government buildings, can be seen in many cities that were once part of the British Empire; Western technologies and architecture had been globalised in part due to the Empire's military and administrative requirements. Integration of former colonies into the global economy was also a major legacy. Britain's system of measurement, the imperial system, continues to be used in some countries in various ways. The convention of driving on the left-hand side of the road has been retained in much of the former empire. The Westminster system of parliamentary democracy has served as the template for the governments of many former colonies, and English common law for legal systems. It has been observed that almost every former colony that emerged as a stable independent democracy had been under British rule, though this correlation declines greatly in strength after the first 30 years of an ex-colony's independence. International commercial contracts are often based on English common law. The British Judicial Committee of the Privy Council still serves as the highest court of appeal for twelve former colonies. Historians' approaches to understanding the British Empire are diverse and evolving. Two key sites of debate over recent decades have been the impact of post-colonial studies, which seek to critically re-evaluate the history of imperialism, and the continued relevance of historians Ronald Robinson and John Gallagher, whose work greatly influenced imperial historiography during the 1950s and 1960s. In addition, differing assessments of the empire's legacy remain relevant to debates over recent history and politics, such as the Anglo-American invasions of Iraq and Afghanistan, as well as Britain's role and identity in the contemporary world. Historians such as Caroline Elkins have argued against perceptions of the British Empire as a primarily liberalising and modernising enterprise, criticising its widespread use of violence and emergency laws to maintain power. Common criticisms of the empire include the use of detention camps in its colonies, massacres of indigenous peoples, and famine-response policies. Some scholars, including Amartya Sen, assert that British policies worsened the famines in India that killed millions during British rule. Conversely, historians such as Niall Ferguson say that the economic and institutional development the British Empire brought resulted in a net benefit to its colonies. Other historians treat its legacy as varied and ambiguous. Public attitudes towards the empire within 21st-century Britain have been broadly positive, although sentiment towards the Commonwealth has been one of apathy and declining interest. 
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Diaspora_Revolt] | [TOKENS: 10596] |
Contents Diaspora Revolt The Diaspora Revolt (115–117 CE, Hebrew: מרד הגלויות, romanized: mered ha-galuyot, or מרד התפוצות, mered ha-tfutzot, 'rebellion of the diaspora'; Latin: Tumultus Iudaicus), sometimes known as the Second Jewish–Roman War,[a] was a series of uprisings launched by Jewish diaspora communities across the eastern provinces of the Roman Empire during the final years of Emperor Trajan's reign. Hostilities began while Trajan was engaged in his Parthian campaign in Mesopotamia, creating a favorable opportunity for rebellion. Ancient sources do not specify the motivations, but they were likely shaped by the Roman destruction of the Second Temple during the First Jewish Revolt in 70 CE, long-standing tensions between Jews and Greeks, the Fiscus Judaicus tax, messianic expectations, and hopes for a return to Judaea. The uprisings broke out almost simultaneously in Egypt, Libya and Cyprus. Rebel attacks were directed mainly against locals rather than Roman authorities, with ancient authors such as Cassius Dio and Eusebius, as well as epigraphic evidence, reporting extreme violence. By contrast, the revolt in Mesopotamia appears to have formed part of a broader resistance to Roman expansion into Parthian territories. Major conflicts Marcius Turbo, one of Trajan's top generals, was dispatched to suppress the uprisings in Egypt and Libya. Literary sources suggest that the Jewish population in these regions faced severe reprisals. Meanwhile, General Lusius Quietus quelled the rebellion in Mesopotamia and was subsequently appointed governor of Judaea. It was during this period that the poorly understood Kitos War may have occurred in Judaea, apparently involving unrest among the Jewish population. The diaspora uprisings were likely suppressed before autumn 117, just prior to Trajan's death; however, some unrest may have persisted into the winter of 117–118. The Diaspora Revolt appears to have led to the devastation, and in some cases the annihilation, of Jewish communities in Egypt, Libya, and other regions. Significant damage to buildings, temples, and roads is well attested in Cyrene and other parts of Cyrenaica. A festival celebrating victory over the Jews was still being observed eighty years later in the Egyptian city of Oxyrhynchus. Thirteen years after the uprisings, Trajan's successor Hadrian decided to rebuild Jerusalem as a colony dedicated to Jupiter, possibly aiming to erase what he saw as the center of Jewish rebelliousness. In response, the Bar Kokhba revolt erupted, the last major Jewish attempt to regain independence in Judaea. In the Diaspora, the largest Jewish communities were now concentrated in Mesopotamia, Asia Minor and Italy. Jewish communities reestablished themselves in Egypt, Cyprus, and Cyrenaica during the 3rd–4th centuries CE, though they never reached their former prominence. Primary sources The available sources on the Diaspora Revolt are limited, fragmented, and incomplete, making it difficult for historians to reconstruct a comprehensive account of the events. The principal sources, Cassius Dio and Eusebius, offer only brief coverage, and only a few other, less detailed literary references survive. Scholars therefore rely on archaeological evidence, including ancient documents and inscriptions, to supplement and clarify the limited textual record. Cassius Dio (c. 155 – c. 
235), a Roman historian and senator, addresses the revolt in his Roman History (68.32.1–3); his narrative survives, however, only through a 12th-century abridgment by the Byzantine scholar Xiphilinus. Dio's account provides the most detail on the events in the city of Cyrene, Libya, while offering only brief mention of Cyprus and a passing reference to Egypt. This account attributes responsibility for the uprisings to the Jewish population. Dio also records the Roman suppression of unrest in Mesopotamia, though he does not explicitly identify a Jewish role in that region. Scholars disagree on the extent of changes and bias introduced by the abridgment: classicist Timothy Barnes suggested that Xiphilinus's anti-Jewish sentiment may have influenced and distorted the original text, whereas historian Lester L. Grabbe argued that "there is no reason to assume that it has been extensively distorted or rewritten, only shortened by omission." Additional descriptions of the revolt come from Eusebius (c. 260–339), a bishop and scholar from late-antique Syria Palaestina, who discusses it in his Chronicon (68.32) and Ecclesiastical History (4.2.1–5), works generally considered reliable sources. His account centers on the uprisings in Egypt, with additional references to a Jewish rebellion in Mesopotamia and events in Cyprus. Eusebius notes that Greek historians provide accounts of the revolt similar to his, though he appears unaware of Cassius Dio's version, which puts greater emphasis on violent atrocities. In contrast, Eusebius adopts a more neutral tone. Nonetheless, his portrayal of the revolt is framed within his broader theological argument that Jewish suffering was a consequence of their rejection of Christ, a theme common in early Christian references to the Jewish–Roman wars. Appian (c. 95 – c. 165), an Egyptian-born Greek historian and lawyer, provides a first-person account of the revolt in the surviving portions of his Roman History 2.90. Among several anecdotes, he recounts his narrow escape from capture, fleeing a Jewish ship via wilderness paths and boat near Pelusium, and describes the destruction of the Pompey monument near Alexandria by Jewish rebels. His tone is neutral, much like Eusebius, who may have relied on Appian as a source. Also active in the second century, Arrian, a Greek author, wrote a now-lost work on the Parthians that reportedly included references to Trajan's actions against the Jews, and is believed to have been used as a source by Eusebius. A much later source is Paulus Orosius (c. 375 – after 418), a Christian Roman historian and theologian, who discusses the events in his Seven Books Against the Pagans (7.12.6–8), composed around 418. Orosius saw the Jewish uprisings as divine punishment resulting from its persecution of Christians. His narrative draws on Eusebius, likely via Latin translations by Jerome and Rufinus, though Orosius rearranges the material and adopts a more vivid, dramatic rhetorical style. His account's reliability has been questioned due to chronological and historical inaccuracies, and, according to Judaic scholar William Horbury, his version is derivative, lacking immediacy and "vague" in its presentation. The uprisings in Egypt are also documented by papyrological evidence, especially texts in the Corpus Papyrorum Judaicarum (CPJ), a collection of papyri relating to Jews and Judaism in Egypt. 
These documents illuminate the revolt's chronology, casualties, impact, and aftermath, and show, for example, that local Egyptians fought against the Jews rather than supporting them, as was suggested earlier. Archaeological and epigraphic evidence also clarifies the events in Cyprus and Cyrenaica, with Latin and Greek inscriptions from Cyrenaica recording the reconstruction of buildings damaged during the "Jewish uprising", thereby revealing the scale of destruction and subsequent rebuilding programs. The Diaspora Revolt also finds echoes in rabbinic literature composed in later centuries, based on earlier oral traditions. The Jerusalem Talmud, a rabbinic compilation redacted in Galilee during the 4th–5th centuries CE, refers to the revolt in tractate Sukkah 5:1, which preserves three stories about it, including accounts of the destruction of the Great Synagogue of Alexandria and the massacre of Jews by Trajan. These narratives, which focus on Roman actions rather than the Greeks or Egyptians, were likely influenced by the heightened anti-Roman sentiment following the Bar Kokhba revolt, which occurred about fifteen years later and had disastrous consequences for the Jews of Judaea. While the stories contain historical kernels, they also incorporate legendary elements that limit their reliability as strict historical sources. Nonetheless, these sources reflect contemporary rabbinic debates about Jewish life in the diaspora in the aftermath of the Jewish–Roman wars, highlight hostilities and tensions between Jews and Romans, and reveal continuing hope for the coming of the Messiah among the Jews of Judaea. Background The motivations behind the diaspora uprisings are complex and difficult to discern, owing to the lack of direct sources on their underlying causes. The Roman destruction of the Second Temple in Jerusalem in 70 CE, at the height of the First Jewish Revolt, was a deeply traumatic event whose effect was exacerbated by the simultaneous introduction of the Fiscus Judaicus, a humiliating, punitive tax imposed on all Jews across the empire. The post-revolt period also saw widespread messianic expectations—a belief in the imminent coming of a redeemer, a descendant of David, who would bring transformative change and restore the Davidic kingdom in Israel—as well as a longing for the re-establishment of the Jewish state. Contemporary Jewish texts such as the Third Sibylline Oracle, 4 Ezra, and 2 Baruch reflect these themes, emphasizing anticipation of a messianic figure, the ingathering of the exiles (the biblical promise that Jews dispersed in exile will be restored to their ancestral land) and the eventual rebuilding of the Temple. The messianic aspect of the revolt is perhaps suggested by Eusebius referring to Lukuas, the leader of the Jewish rebels in Libya, as "king", suggesting that the uprising evolved from an ethnic conflict into a nationalist movement with messianic ambitions for political independence. Inter-ethnic tensions and local conditions also fueled the unrest, especially in Egypt, where longstanding social, economic, political, and ideological frictions between Jews and Greeks had intensified since the third century BCE. The situation deteriorated under Roman rule, leading to sporadic violence in various eastern cities, including severe riots in Alexandria in 29 BCE, 38 CE, 41 CE, and 66 CE. 
The Jewish defeat in the First Jewish Revolt amplified hostility toward the Jews of Egypt, leading to their violent exclusion from civic positions and the imposition of higher business fees. It also intensified anti-Jewish rhetoric in Egypt, greatly exacerbating tensions between Jews and Egyptians. In the years leading up to the Diaspora Revolt, incidents of anti-Jewish violence by Greeks occurred in 112 and the summer of 115 CE. These attacks, especially the latter, were likely direct catalysts for the Jewish uprising in Egypt. In Libya, earlier disturbances in 73 CE, which resulted in the deaths and dispossession of many wealthy Jews, may have weakened the moderating influence of the Jewish elite, allowing more radical elements in Jewish society to gain prominence and push for revolt. Additionally, the destruction of the Jewish landholding aristocracy exacerbated economic hardships for Jewish tenant farmers, pushing them into cities and worsening their plight. Horbury wrote that the revolt was influenced by a strong national hope and local interpretations of messianic expectations, particularly the return of the exiles and the rebuilding of the Temple. He added that Jews in the diaspora may have been influenced by the ideals of "liberty" and "redemption," which were central to the First Jewish Revolt and spread to communities in Egypt, Cyrene, and possibly Cyprus through refugees and traders from Judaea in its aftermath. This idea is supported by Josephus' account of Jews belonging to the radical Sicarii faction who migrated to Cyrene after the war, the discovery of First Jewish Revolt coinage in Memphis and near Cyrene, and traces of these themes in diasporic literature. Classicist E. Mary Smallwood suggested that the revolutionary movement during the Diaspora Revolt can be viewed as an early form of Zionism, seeking the return of Jewish exiles from North Africa to Palestine. The advance of the Cyrenaican Jews into Egypt, marked by widespread destruction, may have been intended as the initial phase of this large-scale migration. Archaeologist Shimon Applebaum wrote that the movement aimed at "the setting up of a new Jewish commonwealth, whose task was to inaugurate the messianic era." Biblical scholar and historian John M. G. Barclay likewise argued that the extensive damage to Cyrenaica's infrastructure during the uprising implies that the Jews involved intended to leave the province, probably aiming ultimately to reach Judaea. Horbury similarly concludes that the Jewish forces likely aimed to return to and defend Judaea. Uprisings The Jewish uprisings erupted almost simultaneously across several eastern provinces of the Roman Empire. In Egypt, Libya, and Cyprus, Jewish actions were directed primarily against local populations rather than Roman authorities. By contrast, the rebellion in Mesopotamia appears to have been part of a broader resistance to Roman expansion into areas ruled by the Parthian Empire. There is no evidence that Jewish communities in Asia Minor participated in the revolt, and the Jewish community in Rome also did not join the uprising. Eusebius links the revolts in Libya and Egypt, while late Syriac sources mention Jews from Egypt fleeing to Judaea. However, there is no definitive evidence of coordinated action among the diaspora communities in revolt. In Libya, Jewish rebels launched attacks against their Greek and Roman neighbors, led either by Andreas (according to Dio/Xiphilinus) or Lukuas (according to Eusebius). 
These could have been two separate individuals or a single person known by both names—a common practice at the time. Eusebius refers to Lukuas as "king", a title that has prompted some scholars to suggest a possible messianic motivation behind the uprising, though evidence supporting this theory remains limited. Eusebius writes that the Jews of Libya collaborated with the Jews of Egypt, forming a symmachia (military alliance). He also mentions that, at one point, the Jews of Libya moved into Egypt. Dio's account describes the Jews of Libya as engaging in shockingly violent and cruel behavior. They are said to have engaged in cannibalism, mutilation, and other atrocities, including using the victims' skins and entrails to make clothing and belts, and staging gladiatorial and wild beast shows. Dio reports that the Jewish rebels in Cyrenaica were responsible for approximately 220,000 deaths, though this figure is likely exaggerated for rhetorical effect. Historian Miriam Pucci Ben Zeev argued that this portrayal should be viewed within the broader context of how "barbarian" revolts against Rome were typically described in contemporary historiography, adding that the atrocities Dio attributes to the Jews are no more egregious than those he ascribes to the Britons during the Boudican revolt in 61 CE or to the Bucoli, a group of Nile Delta herdsmen, during their uprising in Egypt in 171 CE. Archaeological evidence points to extensive destruction in Cyrenaica attributed to Jewish rebels. Inscriptions record attacks on religious and civic structures, including temples and statues. At Cyrene, for instance, the sanctuary of Apollo and its surroundings saw the destruction and burning of the baths, porticoes, ball-courts, and other nearby buildings; the temple of Hecate was burned down during the uprising, and significant damage is also recorded at the Caesareum and the temple of Zeus. Classicist Joyce Reynolds notes significant damage to the sanctuary of Asclepius at Balagrae, west of Cyrene, which was later rebuilt under the Antonine dynasty. The destruction of a small second-century temple near modern El Dab'a in Marmarica is likely also attributable to the Jewish rebels. Damage is also attested along major roads, possibly resulting from deliberate rebel strategy. A Hadrianic milestone commemorates the repair of the road linking Cyrene with its port, Apollonia, which according to its inscription, "had been overturned and smashed up in the Jewish revolt," perhaps in anticipation of a Roman military advance from the sea. The presence of a deeply incised seven-branched menorah, a prominent Jewish symbol, on a road northwest of Balagrae may indicate, according to Reynolds, that Jews deliberately sought to disrupt the route connecting Cyrene with neighboring regions to the west. Bishop Synesius, a native of Cyrene writing in the early 5th century, still refers to the devastation caused by the Jews four centuries after the revolt. The uprising in Egypt is often believed to have started around October 115 CE, based on papyrus CPJ II 435, which mentions a conflict between Jews and Greeks. Pucci Ben Zeev, however, contends that this document actually describes Greek attacks on Jews, rather than the beginning of a Jewish uprising, and prefers to date the revolt's start to 116. 
Evidence from ostraca found in the Jewish quarter of Edfu, in Upper Egypt, indicates that tax receipts for Jews paying the Fiscus Judaicus cease by the end of May 116, suggesting this date as the earliest possible start for the revolt in that city. The latest possible date for the revolt's start is the beginning of September 116, as indicated by CPJ II 436, a concerned letter from the wife of the strategos (military leader) Apollonios in Hermoupolis, a major city between Lower and Upper Egypt. According to Eusebius, unrest in Egypt arose when Jewish communities, seized by a spirit of discord (stasis), engaged in civil conflict with their Greek neighbors. This unrest was soon followed by the advance of Jewish forces from Cyrene, led by Lukuas, who then achieved an initial victory over the Greeks. The Greeks fled to Alexandria, where they massacred the city's Jewish population. Lukuas's forces, supported by Egyptian Jews who rallied to his side, continued to plunder the Egyptian countryside (chora) and destroy various districts throughout the country. Papyrological evidence indicates that the revolt indeed affected extensive areas, including the Athribite district, the region around Memphis (noted for its long-standing antisemitism), the Faiyum, Oxyrhynchus, and the Herakleopolite district. Further south, fighting also impacted the Kynopolite, Hermopolite, Lycopolite, and Apollinopolite districts. It seems that the Jewish forces were well-organized and capable of presenting serious military challenges to their adversaries; as they moved through Egyptian villages, they quickly overcame local resistance. Appian writes that the Jews destroyed the shrine of Nemesis, the Greek goddess of retribution, near Alexandria. He adds that this was done "for the needs of the war," suggesting an effort to remove a strategic point of advantage for the enemy, possibly by reusing the stone to strengthen Jewish defenses. Since the shrine housed the buried head of Pompey, its destruction may also have been motivated by a desire to avenge his desecration of the Temple during his conquest of Jerusalem in 63 BCE. This attack, together with other assaults on pagan temples in Egypt and Cyrenaica, may help explain the description of "impious Jews" found in some papyri. Appian notes that the Jews seized control of waterways near Pelusium, located at the eastern edge of the Nile Delta, a region of critical strategic value. Further evidence of military activity in Egypt's waterways is found in another papyrus, CPJ II 441, and in a 7th-century chronicle by Coptic bishop John of Nikiû. The latter mentions the Babylon Fortress, a fort situated at the entrance of Amnis Traianus, a canal constructed under Trajan which facilitated connections between the Nile and the Red Sea. Papyri indicate that the Greeks, led by strategoi, retaliated against the Jews, with assistance from Egyptian peasants and Romans. Prefect Rutilius Lupus is noted to have personally participated in these engagements. Some efforts were successful, as evidenced by the recorded "victory and success" of Apollonios near Memphis; however, due to many Roman forces being deployed in Mesopotamia, the remaining troops, including the Legio XXII Deiotariana and part of the Legio III Cyrenaica, were insufficient to restore order effectively. Most of what is known about the events in Cyprus comes from literary sources, since the epigraphic evidence is limited, indirect, and difficult to interpret. Dio reports that the local Jewish rebels were led by an individual named Artemion. 
Eusebius's Chronicon states that the Jews attacked the island's pagan inhabitants and destroyed the major port city of Salamis. Both pagan and Christian sources describe the revolt as having heavy casualties, with Dio claiming that "two hundred and forty thousand perished" in Cyprus and Orosius asserting that "all the Greek inhabitants of Salamis were killed." Suppression According to Eusebius, Trajan sent Quintus Marcius Turbo, one of his leading generals, "with land and sea forces including cavalry. He waged war vigorously against them in many battles for a considerable time and killed many thousands of Jews, not only those of Cyrene but also those of Egypt." According to Judaic scholar Allen Kerkeslager, Trajan diverted Marcius Turbo from the Parthian front because the Jewish uprisings threatened the stability of the Roman Empire by disrupting grain shipments from Egypt, which served as a major source of grain for Rome and other provinces. Turbo arrived in Egypt in late 116 or early 117. He was likely accompanied by the cohors (tactical unit) I Ulpia Afrorum equitata and the cohors I Augusta praetoria Lusitanorum equitata, both present in Egypt in 117 CE, with the latter suffering heavy losses during the early summer of the same year. One papyrus details plans to mobilize large forces, including fleets from the Italian ports of Misenum and Ravenna, the Legio III Cyrenaica, and auxiliary units such as the cohors I Flavia Cilicum equitata. Legio XXII Deiotariana and Legio III Cyrenaica fought against the Jews, with the names of specific Roman legionaries from these units recorded as being killed in combat. Native Egyptians and Greeks, driven by entrenched anti-Jewish sentiments intensified by wartime conditions and imperial support, eagerly joined the Romans in attacking Jews. The severe early losses suffered by the Roman military had resulted in the conscription of locals into the army, and the presence of seasoned Roman troops, eager for retribution, further exacerbated the violence. Turbo's mission seemingly included not only quelling the revolt but also exterminating Jews in the affected areas. Roman repression was severe, with Appian describing it as an extermination of the Jewish population in Egypt, and Arrian noting that Trajan asked "to destroy the nation entirely, but if not, at least to crush it and stop its presumptuous wickedness." Turbo's military actions may have extended to Libya, where a Roman praefectus castrorum was killed. In Cyprus, the suppression of the Jewish revolt was led by Gaius Valerius Rufus, one of Trajan's generals. The military actions there may also corroborate the Babylonian Talmud (Sukkah 51b), according to which the blood of Jews killed in Egypt reached as far as Cyprus. Scholarly debate surrounds the precise end date of the Jewish uprising. Pucci Ben Zeev argues that the revolt was likely suppressed before autumn 117, and possibly by summer, prior to Trajan's death. The reassignment of Marcius Turbo to Mauretania following Hadrian's accession as emperor in August 117 appears to support this timeline. However, historians Noah Hacham and Tal Ilan point to evidence suggesting more prolonged unrest. In CPJ 664c, a letter dated 20 December 117, a woman named Eudaimonis urged her son Apollonios, the strategos of Heptakomia, to remain in his secure residence—a warning that hints at persistent danger. This correspondence, along with a subsequent letter concerning the same family, suggests that instability continued in some areas into the winter of 117–118 CE. 
Related events Literary sources describing Roman violence against Jews in Mesopotamia, conquered by Trajan around 115 CE, are scarce. As a result, scholars debate whether a distinct Jewish revolt occurred there, as in other provinces, or whether Jewish activity in Mesopotamia formed part of a broader resistance to Roman rule in the recently conquered Parthian territories. Pucci Ben Zeev argues for the latter, suggesting that Jews joined the broader regional insurgency in order to preserve the relatively favorable status they had enjoyed under Parthian rule, in contrast to the harsher conditions they expected under Roman rule. Eusebius reports that Trajan suspected the Jews in Mesopotamia "would also attack the inhabitants", prompting him to send General Lusius Quietus to suppress them harshly. Eusebius further notes that Quietus "murdered a great number of the Jews there." Later Christian sources also describe a military campaign led by Quietus against the Jews. In contrast, Cassius Dio's account does not mention a Jewish uprising or a campaign against Jews in Mesopotamia. Instead, Dio describes a generalized regional rebellion during the summer of 116 CE. In this version, Trajan dispatched several generals—including Quietus—to quell these revolts, which resulted in the recovery of Nisibis and the destruction of Edessa, both in northern Mesopotamia. Dio does not link Jews to these Mesopotamian events; while he does note that Quietus took part in suppressing Jewish rebels, he places that action within the context of the uprisings in Egypt, Cyprus, and Cyrenaica, leaving the exact location of Quietus's involvement unspecified. The question of whether the Diaspora Revolt also spread to the province of Judaea remains debated in modern scholarship. Some researchers use the term "Kitos War" (also known as the "War of Kitos/Qitos/Quietus," after Lusius Quietus) to refer to this possible unrest in Judaea. This interpretation is based on passages in the rabbinic works Seder Olam Rabbah (30) and the Mishnah (Sotah 9:14), which date a "war of Quietus" to fifty-two years after the destruction of the Second Temple and sixteen years before the Bar Kokhba revolt—placing it roughly in the period of the Diaspora Revolt. These sources also mention new Jewish bans, enacted after the war, prohibiting brides from wearing crowns at their weddings and fathers from teaching their sons Greek. However, rabbinic texts do not explicitly associate this "war of Quietus" with Judaea and may instead refer to Quietus's suppression of Jewish uprisings in Mesopotamia rather than to disturbances in Judaea itself.[b] In non-Jewish sources, there are several indications that some military activity did occur in or around Judaea during this period. After the war in Mesopotamia, Lusius Quietus was appointed governor of Judaea and likely brought additional forces with him, likely including a vexillatio (temporary detachment) of Legio III Cyrenaica. An inscription from Sardinia lists an expeditio Judaeae among Trajan's military campaigns, and medieval Syriac sources mention unrest in Judaea, claiming that Jews from Egypt and Libya were defeated there by Roman forces. It is possible that tensions were exacerbated by Roman cult activity: Hippolytus reports (in a fragment preserved in Syriac) that "Traianus Quintos," possibly Quietus, set up a statue of kore (Persephone) in Jerusalem, while an inscription records soldiers of Legio III Cyrenaica dedicating an altar or statue to Serapis in the city during Trajan's final year. 
Nevertheless, the lateness of the Syriac sources, and the fact that neither of the main primary accounts of the revolt, Cassius Dio and Eusebius, mentions hostilities in Judaea, make these indications uncertain. Some scholars have also pointed to the Talmudic legends of Lulianos and Paphos, two wealthy Jewish brothers from "the pride of Israel" who are said to have been executed in Laodicea (or, in some versions, Lydda), taking them as evidence of hostilities in Judaea. These stories, which appear in several passages in rabbinic literature, are, however, vague and often mutually contradictory. They also pose historical difficulties, since they portray Trajan as a kind of Roman governor in Judaea who is himself executed. In Ecclesiastes Rabbah (III, 17), Lulianos and Paphos are condemned to death by Trajan for an unknown offence but are saved at the last moment when two officials arrive from Rome and put Trajan to death. In the Babylonian Talmud (Ta'anit 18b), Trajan executes the brothers shortly before being executed himself. The Jerusalem Talmud (Ta'anit 2:12) connects their execution with the abolition of a festival called "the day of Tirion/Turianus" ("Trajan's Day"), mentioned in several rabbinic passages as a minor feast celebrated on 12 Adar (February/March), whereas the parallel passage in the Babylonian Talmud (Ta'anit 18b) instead connects the abolition with two other figures executed on that day. Smallwood argued that there may be a historical core behind these legends, suggesting that Lulianos and Paphos were leaders of a local Jewish rising in Judaea that was suppressed by Quietus. In her view, the "Trajan" who appears in the story is possibly a distorted echo of Quietus, who is known to have been killed, perhaps on Hadrian's orders, not long after his governorship in Judaea. Historian Aharon Oppenheimer, drawing on Genesis Rabbah, notes that their activity is associated with both Galilee and Syria and takes this as an indication of Jewish unrest in Galilee during the Diaspora Revolt. Historian Moshe David Herr, however, contests this reconstruction, arguing that the rabbinic passages adduced by Oppenheimer cannot be securely dated to the time of the Diaspora Revolt and therefore do not provide firm evidence for a Kitos War in Judaea. Another historian, David Rokeah, likewise rejects the link between the two brothers and unrest in Galilee, noting that the Sifra (Bechukotai 5:2), a rabbinic exegesis on Leviticus, describes them as being from Alexandria—something he takes to suggest that they were refugees from Egypt, possibly after taking part in the uprising there. Some scholars dispute that any conflict occurred in Judaea during the Diaspora Revolt. Historians Eric M. Meyers and Mark A. Chancey, for example, write that "the rebellion did not apparently spread to Judea, where the arrival of a second legion to complement the Tenth Legion provided a successful buffer against further uprisings." Similarly, Fergus Millar notes that "there is no concrete evidence for a Jewish revolt in Judaea" concurrent with the Diaspora Revolt. According to Lester L. Grabbe, "the evidence for such [hostilities in Judaea] seems extremely skimpy and is indirect at best. It is possible that there were attempts at an uprising there, but, if so, they seem to have been quickly put down by the governor Quietus."
Aftermath The suppression of the revolt saw a devastating campaign of ethnic cleansing, which effectively led to the near-total expulsion and annihilation of Jews from Cyrenaica, Cyprus, and many parts of Egypt. Historical evidence indicates that Jewish communities were either annihilated or forced into migration, with only a few survivors possibly remaining in isolated areas on the fringes of Roman control. In Egypt, the Jewish community suffered near-total destruction during the revolt, an event historian Willy Clarysse characterized as a genocide. Appian reported that Trajan "was exterminating the Jewish race in Egypt," a claim corroborated by papyri and inscriptions documenting widespread devastation of Jewish populations across many regions. Jewish lands were confiscated, and Trajan implemented a new registry, the Ioudaikos logos, to catalog properties that had previously belonged to Jews. The Jewish community in Alexandria appears to have been entirely eradicated, with the only survivors likely being those who had fled to other regions at the onset of the uprising. The Jerusalem Talmud (Sukkot 5.1.55b) records the destruction of the celebrated Great Synagogue of Alexandria. Furthermore, the 2nd-century Tosefta (Pe'ah 4.6 and Ketubot 3.1) contains passages mentioning a former Jewish court in Alexandria that appears to have been abolished during this same period. Horbury suggests that some Jewish refugees fled to Judaea, bringing with them stories about Egypt and Trajan, which were later preserved through rabbinic transmission. Others may have fled to Syria, where it is possible that works like 4 Maccabees were created by Alexandrian Jews who had resettled in the province's capital, Antioch. After 117 CE, Jewish presence in Egypt and Libya virtually disappears from historical sources. No Jewish inscriptions from Egypt have been securely dated from the period following the revolt until the 4th century, and Egyptian papyri that mention Jews predominantly refer to isolated individuals rather than communities. In the Faiyum region, which previously had substantial Jewish communities, mid-2nd century tax records show only one Jew among a thousand adult males. Moreover, no Jewish tax receipts have been discovered in Edfu from after 116. It was not until the 3rd century that Jews re-established communities in Egypt, but they never regained their former influence. In Cyrenaica, a gap in the evidence following the revolt suggests that the region was virtually depopulated of Jews due to their migration to Egypt and subsequent massacres by non-Jews. After the war ended, laws were enacted ordering the exile of Jews from Cyrene, which historian Renzo De Felice said "reduced the flourishing [Jewish] community of Cyrene to insignificance and set it on the road to an inevitable decline." According to De Felice, many of the expelled Jews joined Berber tribes, particularly those around modern-day Sirte. A substantial Jewish community was not reestablished in Cyrenaica until the 4th century. Cassius Dio reports that, even in his day in third-century Cyprus, "no Jew may set foot on that island, and even if one of them is driven upon the shores by a storm he is put to death." This claim is corroborated by archaeological evidence, which indicates no Jewish presence on the island until the 4th century. After his accession in 117 CE, Trajan's successor, Hadrian, was confronted with the devastation left by the Diaspora Revolt, particularly in Cyrenaica.
There was significant damage to buildings, temples, and roads, especially in Cyrene, where the city center was extensively destroyed. The scale of the destruction was such that Hadrian was compelled to rebuild the city at the beginning of his reign, as attested by archaeological evidence. Hadrianic inscriptions document the restoration of sites such as the baths by the Sanctuary of Apollo and the Caesareum. A letter from Hadrian to the citizens of Cyrene in 134/135 CE urged them to prevent their city from remaining in ruins. Following the devastation caused by the revolt, the Roman authorities initiated a large-scale recolonization of Cyrenaica, sending 3,000 veterans under the command of the prefect of Legio XV Apollinaris to settle in the region. Some of these veterans were stationed in Cyrene itself, while others were relocated to other sites, including the newly founded city of Hadrianopolis on the Mediterranean coast. Eusebius's Chronicon and Orosius also report extensive destruction in Salamis and Alexandria, with Orosius noting that Libya would have remained depopulated without Hadrian's resettlement efforts. The Jews [...] waged war on the inhabitants throughout Libya in the most savage fashion, and to such an extent was the country wasted that, its cultivators having been slain, its land would have remained utterly depopulated, had not Emperor Hadrian gathered settlers from other places and sent them thither, for the inhabitants had been wiped out. — Orosius, Seven Books of History Against the Pagans, 7.12.6 In Egypt, the aftermath of the revolts caused agricultural decline, shortages of slave labor and textiles, and an economic crisis with unstable prices and a shortage of essentials like bread. Roman troops in Egypt suffered significant losses, with some units experiencing 30–40 percent casualties. Egypt's agricultural hinterlands were heavily impacted by the war, and many farmlands remained unrecovered and underproductive for decades. Despite this, census data do not show a major demographic disruption in the overall population. In Alexandria, the damage was less extensive than suggested by Eusebius, who claimed the city was "overthrown" and required rebuilding by Hadrian. The primary loss was the sanctuary of Nemesis. The Serapeum and other structures were likely damaged later by Egyptian and Cyrenaican Jews, rather than by Alexandrian Jews. The total destruction of the Cypriot city of Salamis has also been questioned, since it received the title of metropolis in 123, only a few years after the Jewish uprising, suggesting that not all damage was as severe as reported. Some Roman actions, such as Trajan's colony in Libya and Hadrian's edict improving conditions for the Egyptian peasantry, may likewise not be directly linked to the uprisings but instead reflect pre-existing circumstances. The simultaneous Jewish uprisings across various regions forced Trajan to divert his top military leaders from the Parthian front, impacting his campaign. The resistance in Mesopotamia, together with Rome's ultimately unsuccessful siege of Hatra, led to a compromise with the Parthians and coincided with Trajan's illness and death. The siege of Hatra continued throughout the summer of 117, but the years of constant campaigning and reports of revolts had taken a toll on Trajan, who suffered a stroke resulting in partial paralysis. He decided to begin the long journey back to Rome to recover. As he sailed from Seleucia, his health deteriorated rapidly.
He was taken ashore at Selinus in Cilicia, where he died. His successor, Hadrian, soon assumed power, reversing Trajan's approach by abandoning further imperial expansion. Despite a triumph celebrated at his funeral, Trajan's Parthian campaign ended in failure and ensured that Babylonian Jews remained outside Roman control, as reflected in the Babylonian Talmud's assertion of their protection from Roman decrees (Pesachim, 87b): "The Holy One, blessed be He, knows that Israel is unable to endure the cruel decrees of Edom, therefore He exiled them to Babylonia". In the aftermath of the Diaspora Revolt, Roman authorities tightened their control over Judaea, increasing the military presence and reorganizing the province's administration. With Hadrian's accession to the throne in 117 CE, Quietus was dismissed from his role in Judaea and replaced by Marcus Titius Lustricus Bruttianus. Around the same time, a second legion, Legio II Traiana Fortis, was stationed in the province. This raised the permanent garrison to two legions and elevated Judaea from a praetorian to consular province. By around 120, milestones attest to the construction of a new Roman road securing the key corridor linking Judaea, Galilee, Egypt, and Syria; Caparcotna in Galilee was also integrated into this network and developed into a Roman base of operations. The Romans further consolidated their grip on Judaea by settling loyal populations, including discharged legionaries, in the province. According to historian Martin Goodman, this growing military build‑up indicates Roman anxiety about the possibility of another uprising in Judaea, despite the reluctance of local Jews to join the recent revolts in the diaspora. Around 130, Hadrian visited Judaea and decided to rebuild Jerusalem as a Roman colony dedicated to Jupiter, naming it Aelia Capitolina. This decision, together with a possible imperial ban on circumcision, a key Jewish practice, was among the immediate triggers of the Bar Kokhba revolt, the final major Jewish uprising against Roman rule and the last serious attempt to restore Jewish independence in the Land of Israel until the modern era. According to Goodman, Hadrian—an activist emperor who preferred to impose reforms rather than merely react to crises—was acutely aware of the disastrous consequences of the Diaspora Revolt, as indicated by his post-revolt construction projects in Cyrenaica. Goodman argues that Hadrian's decision to refound Jerusalem as Aelia Capitolina was intended as a "final solution for Jewish rebelliousness": by permanently transforming the Jewish holy city into a Roman colonia modeled on the imperial capital, Hadrian aimed to prevent future Jewish uprisings. Archaeologist Hanan Eshel also points to a rise in Jewish nationalistic sentiment, possibly fueled by the Diaspora Revolt, as one of the motivations behind the revolt. Following a brief period of Jewish independence, a large-scale Roman military campaign devastated Judaea and was followed by severe punitive measures. The Jewish population of Judaea was drastically reduced, and the province was renamed Syria Palaestina. In the aftermath, Galilee emerged as the province's major Jewish center, while the largest diaspora communities were concentrated in central Mesopotamia under Parthian and later Sasanian rule. Other significant Jewish populations remained in Asia Minor and Italy, both within the Roman Empire. 
In the Mekhilta of Rabbi Ishmael (3:25–27), a tannaitic exegesis on Exodus, the 'days of Trajan' are cited as the third instance in which the Torah's injunction against returning to Egypt was violated, resulting in three punishments: In three places God warned Israel not to return to Egypt [...] Yet three times they returned, and three times they fell. The first was in the days of Sennacherib, as it is said, Woe to them that go down to Egypt for help. The second was in the days of Yohanan son of Kareah, as it is said, 'Then it shall come to pass that the word, which you fear shall overtake you there in the land of Egypt. The third time was in the days of Trajan. On these three occasions they returned, and on all three occasions they fell. — Mekhilta of Rabbi Ishmael, Tractate Vayehi Beshalach (ed. Lauterbach, vol. 1, 213–4), 3:25–27 The reference to the calamity during Trajan's reign is more concise than the detailed accounts of the earlier violations, suggesting that the event was still vivid in the Jewish consciousness. According to this interpretation, the destruction of the community in Alexandria was a consequence of violating the prohibition against returning to Egypt, implying that every Jewish settlement in Egypt was a sin. While the Mekhilta does not identify the sage behind this saying, a parallel tradition in the Jerusalem Talmud (Sukkot 5:1) attributes it to Shimon bar Yochai, a sage of the generation following the Bar Kokhba revolt, who, in numerous other sayings, emphasized the centrality of the Land of Israel. According to Noah Hacham, Bar Yochai's statement served a dual purpose: it aimed to explain to his contemporaries the destruction of the Jewish community in Egypt, while also reinforcing the notion that, despite the disastrous consequences of the Bar Kokhba revolt and subsequent distress, only the Land of Israel offered the hope of safety and salvation for the Jewish people. The Jerusalem Talmud (Sukkot 5:1), following Bar Yochai's statement and preceding a description of the Great Synagogue of Alexandria and its destruction by Trajan, includes an amoraic passage (composed between roughly 200 and 500 CE in a blend of Hebrew and Aramaic) that offers an explanation for Trajan's massacre of the Jews in Alexandria. The legend presents a stark contrast between Jewish and Roman behavior: while the Emperor celebrates the birth of his son, the Jews fast in mourning on the Ninth of Av; when his daughter dies, the Jews celebrate Hanukkah with festive lights. Interpreting these actions as signs of rebellion, Trajan's wife persuades him to redirect his military focus from a campaign against the "Barbarians" toward the suppression of the Jews. When the emperor arrives, he finds the Jews engaged with a prophetic verse from the Torah that alludes to an enemy nation. A later portion of the passage mentions the eagle—a Roman symbol—which identifies the prophesied biblical oppressor with Rome. The emperor then "surrounded them with legions and killed them. He said to their wives, if you listen to my legions I shall not kill you. They told him, what you did to those on the ground floor do to those on the gallery. He mixed their blood with their blood, and the blood flowed into the sea as far as Cyprus. At that moment the horn of Israel was trimmed and will not be restored until the Son of David comes." 
Although the account contains clear fictionalizations—Trajan is not known to have had children, and there is no evidence of his presence in Egypt at the time—it nevertheless incorporates historical elements, including Trajan's diversion of troops from his Parthian campaign and the destruction of Jewish communities. Noah Hacham interprets the stories as reflecting a fundamental and irreconcilable conflict between Jews and Romans. The Ninth of Av, when Jews commemorate Rome's destruction of the Second Temple, coincides with Rome celebrating the continuity of its empire, while Hanukkah, marking the Temple's rededication, contrasts with the disruption of Roman continuity. Additionally, the Egyptian context casts Trajan as harsher than the biblical Pharaoh: the latter targeted male infants, whereas Trajan annihilated all. According to Hacham, these stories, put together in the Jerusalem Talmud, frame the destruction of Alexandria's Jewish community as part of a pattern of calamities endured by the Jewish people. Another Jewish perspective on the aftermath of the uprising appears in a late 2nd-century rabbinic story attributed to Eleazar ben Jose. In this account, Eleazar visits Alexandria and is shown the bones of Jews buried beneath a building by an elderly local, who boasts that "some they drowned, some they slew with the sword, some they crushed beneath buildings [under construction]." The narrator applies this scene to the oppression of the Israelites in Egypt, but Horbury has argued that the attribution to Alexandria and the motif of construction over Jewish corpses are well suited to Jewish memories of the Diaspora Revolt, and that the story likely reflects what was considered plausible in late second-century Judaea rather than an actual eyewitness report. The impact of the revolt on the development of Christianity in Egypt, and particularly the fate and influence of the pre-revolt Jewish Christian community, has been the subject of scholarly debate. Although Christian traditions place the arrival of Christianity in Egypt in the first century, evidence for a Christian presence in Alexandria before the late second century remains sparse. It is nevertheless plausible that, prior to the Diaspora Revolt, a small and largely inconspicuous early Christian community existed in Alexandria, predominantly of Jewish origin and embedded within the wider Jewish community. Historian Joseph Mélèze-Modrzejewski argues that because early Christianity in Egypt was closely bound to Alexandrian Jewry, it was effectively devastated together with that community during the Diaspora Revolt. In this view, Jewish Christianity disappeared in the catastrophe, and Christianity in Egypt was later reconstituted as a non-Jewish movement (so-called "pagan Christianity"), with any surviving Christian Jews being absorbed in the new community. From this perspective, the institutional church attested from the Severan period onward represents a post-revolt development rather than a direct continuation of the earlier Jewish Christian environment. An alternative interpretation, advanced by the scholar of religion Birger A. Pearson, holds that although the revolt was probably a significant event for Christians in Egypt, it did not result in a complete rupture. Rather, the evidence points to substantial continuities. 
Pearson notes that Christians preserved and transmitted much of the literary legacy of Alexandrian Jewry, most notably the Septuagint (the Greek translation of the Hebrew Bible) as well as the works of Philo of Alexandria, a Jewish philosopher. Theological traditions rooted in the Jewish community of Alexandria, such as Logos theology and negative theology, were likewise adopted by Christians, leaving their intellectual foundations deeply shaped by Jewish thought. Pearson also points to social and institutional continuities, including the existence of ascetic Christian communities possibly influenced by the Jewish Therapeutae and the organization of each Christian congregation under a presbyter, in a structure modelled on the synagogue. As Christianity later expanded into the countryside, it developed beyond its Jewish and Hellenistic roots under the influence of native Egyptian culture and language, leading to the emergence of Coptic Christianity. At Oxyrhynchus, a festival commemorating the victory over the Jews continued to be observed nearly 80 years later, around 200 CE, during the visit of Emperor Septimius Severus to Egypt, as documented in papyrus CPJ II 450: The inhabitants of Oxyrhynchus possess the goodwill, faithfulness and friendship to the Romans, which they showed in the war against the Jews, fighting on your side. And even now they celebrate the day of victory as a festival day each year. David Frankfurter, a scholar of ancient religion, draws on Egyptian texts that portray Jews as worshippers of Set and associate them with cosmic disorder, as well as on the Egyptian practice of re-enacting mythic battles, to propose a reconstruction of the festival. He theorizes that it involved a ritual re-dramatization of the victory, portraying the Jews as Typhonians (followers of Set-Typhon) and their defeat as the triumph of Horus-Pharaoh, with their expulsion framed as a purification of the land. The Egyptian priesthood, who had previously recast the Greek Ptolemaic rulers as traditional pharaohs, possibly led these celebrations, continuing an earlier priestly tradition that had produced anti-Jewish polemics through figures such as Manetho and Chaeremon. Frankfurter also suggests that the festival drew participants and spectators from diverse social groups, including Greco-Egyptian elites and local Egyptian peasants, reflecting its development within traditional Egyptian festival frameworks. Its annual occurrence linked it to the agricultural cycle of the period, highlighting its significance in the community. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Combinatorial_optimization] | [TOKENS: 979] |
Contents Combinatorial optimization Combinatorial optimization is a subfield of mathematical optimization that consists of finding an optimal object from a finite set of objects, where the set of feasible solutions is discrete or can be reduced to a discrete set. Typical combinatorial optimization problems are the travelling salesman problem ("TSP"), the minimum spanning tree problem ("MST"), and the knapsack problem. In many such problems, including those just mentioned, exhaustive search is not tractable, so one must instead resort to specialized algorithms that quickly rule out large parts of the search space, or to approximation algorithms. Combinatorial optimization is related to operations research, algorithm theory, and computational complexity theory. It has important applications in several fields, including artificial intelligence, machine learning, auction theory, software engineering, VLSI, applied mathematics and theoretical computer science. Applications Basic applications of combinatorial optimization include, but are not limited to: Methods There is a large amount of literature on polynomial-time algorithms for certain special classes of discrete optimization. A considerable amount of it is unified by the theory of linear programming. Some examples of combinatorial optimization problems that are covered by this framework are shortest paths and shortest-path trees, flows and circulations, spanning trees, matching, and matroid problems. For NP-complete discrete optimization problems, current research literature includes the following topics: Combinatorial optimization problems can be viewed as searching for the best element of some set of discrete items; therefore, in principle, any sort of search algorithm or metaheuristic can be used to solve them. Widely applicable approaches include branch-and-bound (an exact algorithm which can be stopped at any point in time to serve as a heuristic), branch-and-cut (uses linear optimisation to generate bounds), dynamic programming (a recursive solution construction with limited search window) and tabu search (a greedy-type swapping algorithm). However, generic search algorithms are not guaranteed to find an optimal solution first, nor are they guaranteed to run quickly (in polynomial time). Since some discrete optimization problems are NP-complete, such as the traveling salesman (decision) problem, this is expected unless P=NP. For each combinatorial optimization problem, there is a corresponding decision problem that asks whether there is a feasible solution for some particular measure m₀. For example, if there is a graph G which contains vertices u and v, an optimization problem might be "find a path from u to v that uses the fewest edges". This problem might have an answer of, say, 4. A corresponding decision problem would be "is there a path from u to v that uses 10 or fewer edges?" This problem can be answered with a simple 'yes' or 'no'. The field of approximation algorithms deals with algorithms to find near-optimal solutions to hard problems. The usual decision version is then an inadequate definition of the problem since it only specifies acceptable solutions. Even though we could introduce suitable decision problems, the problem is then more naturally characterized as an optimization problem.
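As an illustration of the fewest-edges example above, the optimization version can be solved with a breadth-first search, and the corresponding decision version then reduces to comparing the optimum against the threshold. The following minimal Python sketch (the function names are placeholders chosen for this example) shows both versions side by side.

```python
from collections import deque

def fewest_edges(graph, u, v):
    """Optimization version: minimum number of edges on a path from u to v
    in an unweighted graph (dict: vertex -> iterable of neighbours),
    or None if no path exists. BFS visits vertices in order of increasing
    distance, so the first time v is reached its distance is optimal."""
    dist = {u: 0}
    queue = deque([u])
    while queue:
        node = queue.popleft()
        if node == v:
            return dist[node]
        for nxt in graph.get(node, ()):
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return None

def path_with_at_most(graph, u, v, k):
    """Decision version: is there a path from u to v using at most k edges?"""
    d = fewest_edges(graph, u, v)
    return d is not None and d <= k

# Example: the shortest u-v path in this small graph uses 2 edges.
g = {"u": ["a", "b"], "a": ["v"], "b": ["a"], "v": []}
print(fewest_edges(g, "u", "v"))           # 2
print(path_with_at_most(g, "u", "v", 10))  # True
```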
NP optimization problem An NP-optimization problem (NPO) is a combinatorial optimization problem with the following additional conditions. Note that the polynomials referred to below are functions of the size of the respective functions' inputs, not the size of some implicit set of input instances. This implies that the corresponding decision problem is in NP. In computer science, interesting optimization problems usually have the above properties and are therefore NPO problems. A problem is additionally called a P-optimization (PO) problem, if there exists an algorithm which finds optimal solutions in polynomial time. Often, when dealing with the class NPO, one is interested in optimization problems for which the decision versions are NP-complete. Note that hardness relations are always with respect to some reduction. Due to the connection between approximation algorithms and computational optimization problems, reductions which preserve approximation in some respect are preferred for this subject over the usual Turing and Karp reductions. An example of such a reduction is the L-reduction. For this reason, optimization problems with NP-complete decision versions are not necessarily called NPO-complete. NPO is divided into the following subclasses according to their approximability: An NPO problem is called polynomially bounded (PB) if, for every instance x and for every solution y ∈ f(x), the measure m(x, y) is bounded by a polynomial function of the size of x. The class NPOPB is the class of NPO problems that are polynomially bounded. |
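As a concrete illustration of these ingredients, consider the 0/1 knapsack problem: feasibility of a candidate solution and its measure m(x, y) can both be checked in polynomial time, which is what places the corresponding decision version in NP. The Python sketch below (with placeholder function names) spells this out; note that because the measure is a sum of item values, knapsack is not polynomially bounded in the sense defined above, unlike, say, maximum independent set, whose measure never exceeds the number of vertices.

```python
def knapsack_feasible(weights, capacity, chosen):
    """Feasibility check: a candidate solution (a set of item indices) is
    feasible iff its total weight fits within the capacity; computable in
    time polynomial in the instance size."""
    return sum(weights[i] for i in chosen) <= capacity

def knapsack_measure(values, chosen):
    """Measure m(x, y): total value of the chosen items, also computable
    in polynomial time."""
    return sum(values[i] for i in chosen)

def verify_decision(weights, values, capacity, m0, certificate):
    """Verifier for the decision version 'is there a feasible solution of
    value at least m0?'. A valid certificate (the chosen item set) can be
    checked in polynomial time, placing the decision problem in NP."""
    return (knapsack_feasible(weights, capacity, certificate)
            and knapsack_measure(values, certificate) >= m0)

# Tiny instance: items with (weight, value) = (3, 4), (2, 3), (2, 2), capacity 4.
weights, values, capacity = [3, 2, 2], [4, 3, 2], 4
print(verify_decision(weights, values, capacity, 5, {1, 2}))  # True: weight 4, value 5
```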
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Perpetual_calendar] | [TOKENS: 1450] |
Contents Perpetual calendar A perpetual calendar is a calendar valid for many years, usually designed to look up the day of the week for a given date in the past or future. For the Gregorian and Julian calendars, a perpetual calendar typically consists of one of three general variations: Such a perpetual calendar fails to indicate the dates of moveable feasts such as Easter, which are calculated based on a combination of events in the tropical year and lunar cycles. These issues are dealt with in great detail in computus. An early example of a perpetual calendar for practical use is found in the Nürnberger Handschrift GNM 3227a. The calendar covers the period of 1390–1495 (on which grounds the manuscript is dated to c. 1389). For each year of this period, it lists the number of weeks between Christmas and Quinquagesima. This is the first known instance of the kind of tabular perpetual calendar, allowing the calculation of the moveable feasts, that became popular during the 15th century. The chapel Cappella dei Mercanti in Turin contains a perpetual calendar machine made by Giovanni Plana using rotating drums. Other uses of the term "perpetual calendar" Offices and retail establishments often display devices containing a set of elements to form all possible numbers from 1 through 31, as well as the names/abbreviations for the months and the days of the week, to show the current date for convenience of people who might be signing and dating documents such as checks. Establishments that serve alcoholic beverages may use a variant that shows the current month and day but subtracts the legal age of alcohol consumption in years, indicating the latest legal birth date for alcohol purchases. A common device consists of two cubes in a holder. One cube carries the digits zero to five. The other bears the digits 0, 1, 2, 6 (or 9 if inverted), 7, and 8. This is sufficient because only the digits one and two may appear twice in a date and they are on both cubes, while the 0 is on both cubes so that all single-digit dates can be shown in double-digit format. In addition to the two cubes, three blocks, each as wide as the two cubes combined, one-third as tall, and as deep, have the names of the months printed on their long faces. The current month is turned forward on the front block, with the other two month blocks behind it. Certain calendar reforms have been labeled perpetual calendars because their dates are fixed on the same weekdays every year. Examples are The World Calendar, the International Fixed Calendar and the Pax Calendar. Technically, these are not perpetual calendars but perennial calendars. Their purpose, in part, is to eliminate the need for perpetual calendar tables, algorithms, and computation devices. In watchmaking, "perpetual calendar" describes a calendar mechanism that correctly displays the date on the watch "perpetually", taking into account the different lengths of the months as well as leap years. The internal mechanism will move the dial to the next day. Algorithms Perpetual calendars use algorithms to compute the day of the week for any given year, month, and day of the month. Even though the individual operations in the formulas can be very efficiently implemented in software, they are too complicated for most people to perform all of the arithmetic mentally. Perpetual calendar designers hide the complexity in tables to simplify their use. A perpetual calendar employs a table for finding which of fourteen yearly calendars to use.
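The fourteen yearly calendars arise because a year's calendar page is fixed by just two facts: the weekday of 1 January and whether the year is a leap year, giving 7 × 2 = 14 possibilities. A minimal Python sketch of that lookup key (an illustration only, using the proleptic Gregorian calendar assumed by Python's datetime module):

```python
import calendar
from datetime import date

def yearly_calendar_index(year):
    """Identify which of the fourteen one-year calendars applies: the
    weekday of 1 January (0 = Monday ... 6 = Sunday) together with the
    leap-year status of the year."""
    return date(year, 1, 1).weekday(), calendar.isleap(year)

# Years with the same index share exactly the same calendar page:
print(yearly_calendar_index(2025))  # (2, False): common year starting on Wednesday
print(yearly_calendar_index(2014))  # (2, False): the same page as 2025
```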
A table for the Gregorian calendar expresses its 400-year grand cycle: 303 common years and 97 leap years total to 146,097 days, or exactly 20,871 weeks. This cycle breaks down into one 100-year period with 25 leap years, making 36,525 days, or one day less than 5,218 full weeks; and three 100-year periods with 24 leap years each, making 36,524 days, or two days less than 5,218 full weeks. Within each 100-year block, the cyclic nature of the Gregorian calendar proceeds in the same fashion as its Julian predecessor: A common year begins and ends on the same day of the week, so the following year will begin on the next successive day of the week. A leap year has one more day, so the year following a leap year begins on the second day of the week after the leap year began. Every four years, the starting weekday advances five days, so over a 28-year period, it advances 35, returning to the same place in both the leap year progression and the starting weekday. This cycle completes three times in 84 years, leaving 16 years in the fourth, incomplete cycle of the century. A major complicating factor in constructing a perpetual calendar algorithm is the peculiar and variable length of February, which was at one time the last month of the year, leaving the first 11 months March through January with a five-month repeating pattern: 31, 30, 31, 30, 31, ..., so that the offset from March of the starting day of the week for any month could be easily determined. Zeller's congruence, a well-known algorithm for finding the day of the week for any date, explicitly defines January and February as the "13th" and "14th" months of the previous year to take advantage of this regularity, but the month-dependent calculation is still very complicated for mental arithmetic: Instead, a table-based perpetual calendar provides a simple lookup mechanism to find the offset of the day of the week for the first day of each month. To simplify the table, in a leap year January and February must either be treated as a separate year or have extra entries in the month table: Perpetual Julian and Gregorian calendar tables The following calendar works for any date from 15 October 1582 onwards, but only for Gregorian calendar dates. Gregorian 31 March 2006: Greg century 20(c) and year 06(y) meet at A in the table of Latin square. The A in row Mar(m) meets 31(d) at Fri in the table of Weekdays. The day is Friday. BC 1 January 45: BC 45 = -44 = -100 + 56 (a leap year). -1 and 56 meet at B and Jan_B meets 1 at Fri(day). Julian 1 January 1900: Julian 19 meets 00 at A and Jan_A meets 1 at Sat(urday). Gregorian 1 January 1900: Greg 19 meets 00 at G and Jan_G meets 1 at Mon(day). A compact perpetual calendar (Julian and Gregorian) for the years 0 to 2399 based on the dominical letter of a year was devised by the American astronomer G.M. Clemence. It was first published in 1954 in the 9th edition of the Smithsonian Physical Tables and was also adopted from 1956 until the mid-1960s in The World Almanac and Book of Facts. |
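As an illustration of the algorithmic route mentioned above, the following Python sketch implements Zeller's congruence for Gregorian dates, counting January and February as months 13 and 14 of the previous year exactly as described; it reproduces the worked example above of 31 March 2006 falling on a Friday.

```python
def zeller_gregorian(year, month, day):
    """Day of the week for a Gregorian date via Zeller's congruence.
    Returns 0 = Saturday, 1 = Sunday, ..., 6 = Friday."""
    if month < 3:          # treat January/February as months 13/14 of the previous year
        month += 12
        year -= 1
    q = day
    m = month
    K = year % 100         # year within the century
    J = year // 100        # zero-based century
    return (q + (13 * (m + 1)) // 5 + K + K // 4 + J // 4 + 5 * J) % 7

names = ["Saturday", "Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday"]
print(names[zeller_gregorian(2006, 3, 31)])  # Friday, matching the worked example above
print(names[zeller_gregorian(1900, 1, 1)])   # Monday (Gregorian reckoning)
```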
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Orion%E2%80%93Eridanus_Superbubble] | [TOKENS: 445] |
Contents Orion–Eridanus Superbubble The Orion–Eridanus Superbubble is a superbubble located in the constellations Orion and Eridanus. The region is formed from overlapping supernova remnants that were suspected to be associated with the Orion OB1 stellar association. The bubble is approximately 1200 ly across. It is the nearest superbubble to the Local Bubble containing the Sun, with the respective shock fronts being about 500 ly apart. The Orion–Eridanus Superbubble is formed by the stellar winds of tens of massive stars and 10–20 supernovae. The superbubble likely formed from the Orion blue stream, which is composed of massive stars in front of the Orion Molecular Cloud Complex. The Orion blue stream begins at around 150 parsecs and extends towards Orion OB1 at around 300 parsecs. The stream could, however, include the Bellatrix cluster, which is around 80 parsecs distant. The structure was discovered from 21 cm radio observations by Carl Heiles and interstellar optical emission line observations by Reynolds and Ogden in the 1970s. The western part of the Orion–Eridanus Superbubble is visible in X-ray images and is therefore also referred to as the Eridanus Soft X-ray Enhancement. In the eastern part, these wavelengths are obscured by molecular clouds, making it impossible to determine the morphology from X-rays alone. Older works consider Barnard's Loop to be either the nearest or the most distant edge of the Orion–Eridanus Superbubble, assuming that the λ Orionis Nebula lies outside. More recent studies suggest that the superbubble extends to the Galactic plane and that both Barnard's Loop and the λ Orionis Nebula lie inside. The exact morphology and orientation in space remain uncertain. The Sun might have passed through the Orion–Eridanus Superbubble before it passed through the Local Bubble. This could explain an older peak of iron-60 found in deep sea sediments. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Minecraft#cite_ref-NS_245-0] | [TOKENS: 12858] |
Contents Minecraft Minecraft is a sandbox game developed and published by Mojang Studios. Following its initial public alpha release in 2009, it was formally released in 2011 for personal computers. The game has since been ported to numerous platforms, including mobile devices and various video game consoles. In Minecraft, players explore a procedurally generated world with virtually infinite terrain made up of voxels (cubes). They can discover and extract raw materials, craft tools and items, build structures, fight hostile mobs, and cooperate with or compete against other players in multiplayer. The game's large community offers a wide variety of user-generated content, such as modifications, servers, player skins, texture packs, and custom maps, which add new game mechanics and possibilities. Originally created by Markus "Notch" Persson using the Java programming language, Jens "Jeb" Bergensten was handed control over the game's development following its full release. In 2014, Mojang and the Minecraft intellectual property were purchased by Microsoft for US$2.5 billion; Xbox Game Studios hold the publishing rights for the Bedrock Edition, the unified cross-platform version which evolved from the Pocket Edition codebase[i] and replaced the legacy console versions. Bedrock is updated concurrently with Mojang's original Java Edition, although with numerous, generally small, differences. Minecraft is the best-selling video game in history with over 350 million copies sold. It has received critical acclaim, winning several awards and being cited as one of the greatest video games of all time. Social media, parodies, adaptations, merchandise, and the annual Minecon conventions have played prominent roles in popularizing it. The wider Minecraft franchise includes several spin-off games, such as Minecraft: Story Mode, Minecraft Dungeons, and Minecraft Legends. A film adaptation, titled A Minecraft Movie, was released in 2025 and became the second highest-grossing video game film of all time. Gameplay Minecraft is a 3D sandbox video game that has no required goals to accomplish, giving players a large amount of freedom in choosing how to play the game. The game features an optional achievement system. Gameplay is in the first-person perspective by default, but players have the option of third-person perspectives. The game world is composed of rough 3D objects—mainly cubes, referred to as blocks—representing various materials, such as dirt, stone, ores, tree trunks, water, and lava. The core gameplay revolves around picking up and placing these objects. These blocks are arranged in a voxel grid, while players can move freely around the world. Players can break, or mine, blocks and then place them elsewhere, enabling them to build things. Very few blocks are affected by gravity, instead maintaining their voxel position in the air. Players can also craft a wide variety of items, such as armor, which mitigates damage from attacks; weapons (such as swords or bows and arrows), which allow monsters and animals to be killed more easily; and tools (such as pickaxes or shovels), which break certain types of blocks more quickly. Some items have multiple tiers depending on the material used to craft them, with higher-tier items being more effective and durable. They may also freely craft helpful blocks—such as furnaces which can cook food and smelt ores, and torches that produce light—or exchange items with villagers (NPC) through trading emeralds for different goods and vice versa. 
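The voxel-grid arrangement described above can be illustrated with a toy data structure: block identifiers keyed by integer coordinates and grouped into fixed-size chunks so that only visited regions occupy memory. The Python sketch below is purely illustrative; the chunk size and names are assumptions made for the example, not Minecraft's actual implementation.

```python
CHUNK = 16  # chunk footprint chosen for this illustration

class VoxelWorld:
    def __init__(self):
        # (chunk_x, chunk_z) -> {(x, y, z): block_id}
        self.chunks = {}

    def _chunk_key(self, x, z):
        # Python's floor division keeps chunk keys consistent for negative coordinates.
        return (x // CHUNK, z // CHUNK)

    def set_block(self, x, y, z, block_id):
        """Place a block; the containing chunk is created on first use."""
        self.chunks.setdefault(self._chunk_key(x, z), {})[(x, y, z)] = block_id

    def get_block(self, x, y, z):
        """Return the block at a position, or 'air' if nothing was placed there."""
        return self.chunks.get(self._chunk_key(x, z), {}).get((x, y, z), "air")

world = VoxelWorld()
world.set_block(5, 10, -3, "stone")
print(world.get_block(5, 10, -3))  # stone
print(world.get_block(0, 0, 0))    # air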
The game has an inventory system, allowing players to carry a limited number of items. The in-game time system follows a day and night cycle, with one full cycle lasting for 20 real-time minutes. The game also contains a material called redstone, which can be used to make primitive mechanical devices, electrical circuits, and logic gates, allowing for the construction of many complex systems. New players are given a randomly selected default character skin out of nine possibilities, including Steve or Alex, but are able to create and upload their own skins. Players encounter various mobs (short for mobile entities) including animals, villagers, and hostile creatures. Passive mobs, such as cows, pigs, and chickens, spawn during the daytime and can be hunted for food and crafting materials, while hostile mobs—including large spiders, witches, skeletons, and zombies—spawn during nighttime or in dark places such as caves. Some hostile mobs, such as zombies and skeletons, burn under the sun if they have no headgear and are not standing in water. Other creatures unique to Minecraft include the creeper (an exploding creature that sneaks up on the player) and the enderman (a creature with the ability to teleport as well as pick up and place blocks). There are also variants of mobs that spawn in different conditions; for example, zombies have husk and drowned variants that spawn in deserts and oceans, respectively. The Minecraft environment is procedurally generated as players explore it using a map seed that is randomly chosen at the time of world creation (or manually specified by the player). Divided into biomes representing different environments with unique resources and structures, worlds are designed to be effectively infinite in traditional gameplay, though technical limits on the player have existed throughout development, both intentionally and not. Implementation of horizontally infinite generation initially resulted in a glitch termed the "Far Lands" at over 12 million blocks away from the world center, where terrain generated as wall-like, fissured patterns. The Far Lands and associated glitches were considered the effective edge of the world until they were resolved, with the current horizontal limit instead being a special impassable barrier called the world border, located 30 million blocks away. Vertical space is comparatively limited, with an unbreakable bedrock layer at the bottom and a building limit several hundred blocks into the sky. Minecraft features three independent dimensions accessible through portals and providing alternate game environments. The Overworld is the starting dimension and represents the real world, with a terrestrial surface setting including plains, mountains, forests, oceans, caves, and small sources of lava. The Nether is a hell-like underworld dimension accessed via an obsidian portal and composed mainly of lava. Mobs that populate the Nether include shrieking, fireball-shooting ghasts, alongside anthropomorphic pigs called piglins and their zombified counterparts. Piglins in particular have a bartering system, where players can give them gold ingots and receive items in return. Structures known as Nether Fortresses generate in the Nether, containing mobs such as wither skeletons and blazes, which can drop blaze rods needed to access the End dimension. The player can also choose to build an optional boss mob known as the Wither, using skulls obtained from wither skeletons and soul sand. 
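The role of the map seed in the procedural generation described above can be sketched in a few lines: any deterministic function of the seed and the block coordinates reproduces the same terrain regardless of when, or in what order, regions are visited. The Python example below is a deliberately simplified illustration; real terrain generators, Minecraft's included, layer smooth noise functions rather than hashing each column independently.

```python
import hashlib

def column_height(seed, x, z, max_height=64):
    """Deterministic terrain height for the column at (x, z): hash the seed
    together with the coordinates, so the same seed always reproduces the
    same landscape no matter where exploration starts."""
    digest = hashlib.sha256(f"{seed}:{x}:{z}".encode()).digest()
    return int.from_bytes(digest[:4], "big") % max_height

seed = 1234567890
print(column_height(seed, 10, -7))      # same value every run for this seed
print(column_height(seed, 10, -7))      # identical
print(column_height(seed + 1, 10, -7))  # a different seed gives a different world
```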
The End can be reached through an end portal, consisting of twelve end portal frames. End portals are found in underground structures in the Overworld known as strongholds. To find strongholds, players must craft eyes of ender using an ender pearl and blaze powder. Eyes of ender can then be thrown, traveling in the direction of the stronghold. Once the player reaches the stronghold, they can place eyes of ender into each portal frame to activate the end portal. The dimension consists of islands floating in a dark, bottomless void. A boss enemy called the Ender Dragon guards the largest, central island. Killing the dragon opens access to an exit portal, which, when entered, cues the game's ending credits and the End Poem, a roughly 1,500-word work written by Irish novelist Julian Gough. The poem takes about nine minutes to scroll past and is the game's only narrative text, as well as the only text of significant length directed at the player. At the conclusion of the credits, the player is teleported back to their respawn point and may continue the game indefinitely. In Survival mode, players have to gather natural resources such as wood and stone found in the environment in order to craft certain blocks and items. Depending on the difficulty, monsters spawn in darker areas outside a certain radius of the character, requiring players to build a shelter in order to survive at night. The mode also has a health bar which is depleted by attacks from mobs, falls, drowning, falling into lava, suffocation, starvation, and other events. Players also have a hunger bar, which must be periodically refilled by eating food in-game unless the player is playing on peaceful difficulty. If the hunger bar is empty, the player starves. Health replenishes when players have a full hunger bar or continuously on peaceful. Upon losing all health, players die. The items in the players' inventories are dropped unless the game is reconfigured not to do so. Players then re-spawn at their spawn point, which by default is where players first spawn in the game and can be changed by sleeping in a bed or using a respawn anchor. Dropped items can be recovered if players can reach them before they despawn after 5 minutes. Players may acquire experience points (commonly referred to as "xp" or "exp") by killing mobs and other players, mining, smelting ores, animal breeding, and cooking food. Experience can then be spent on enchanting tools, armor and weapons. Enchanted items are generally more powerful, last longer, or have other special effects. The game features two more game modes based on Survival, known as Hardcore mode and Adventure mode. Hardcore mode plays identically to Survival mode, but with the game's difficulty setting locked to "Hard" and with permadeath, forcing players to delete the world or explore it as a spectator after dying. Adventure mode was added to the game in a post-launch update, and prevents the player from directly modifying the game's world. It was designed primarily for use in custom maps, allowing map designers to let players experience it as intended. In Creative mode, players have access to an infinite number of all resources and items in the game through the inventory menu and can place or mine them instantly. Players can toggle the ability to fly freely around the game world at will, and their characters usually do not take any damage and are not affected by hunger. The game mode helps players focus on building and creating projects of any size without disturbance.
Multiplayer in Minecraft enables multiple players to interact and communicate with each other on a single world. It is available through direct game-to-game multiplayer, local area network (LAN) play, local split screen (console-only), and servers (player-hosted and business-hosted). Players can run their own server by making a realm, using a host provider, or hosting one themselves, or they can connect directly to another player's game via Xbox Live, PlayStation Network or Nintendo Switch Online. Single-player worlds have LAN support, allowing players to join a world on locally interconnected computers without a server setup. Minecraft multiplayer servers are guided by server operators, who have access to server commands such as setting the time of day and teleporting players. Operators can also set up restrictions concerning which usernames or IP addresses are allowed or disallowed to enter the server. Multiplayer servers have a wide range of activities, with some servers having their own unique rules and customs. The largest and most popular server is Hypixel, which has been visited by over 14 million unique players. Player versus player combat (PvP) can be enabled to allow fighting between players. In 2013, Mojang announced Minecraft Realms, a server hosting service intended to enable players to run server multiplayer games easily and safely without having to set up their own. Unlike a standard server, only invited players can join Realms servers, and these servers do not use server addresses. Minecraft: Java Edition Realms server owners can invite up to twenty people to play on their server, with up to ten players online at a time. Bedrock Edition Realms server owners can invite up to 3,000 people to play on their server, with up to ten players online at one time. The Minecraft: Java Edition Realms servers do not support user-made plugins, but players can play custom Minecraft maps. Minecraft Bedrock Realms servers support user-made add-ons, resource packs, behavior packs, and custom Minecraft maps. At Electronic Entertainment Expo 2016, support for cross-platform play between Windows 10, iOS, and Android platforms was added through Realms starting in June 2016, with Xbox One and Nintendo Switch support to come later in 2017, and support for virtual reality devices. On 31 July 2017, Mojang released the beta version of the update allowing cross-platform play. Nintendo Switch support for Realms was released in July 2018.
Players can also create their own "maps" (custom world save files) that often contain specific rules, challenges, puzzles and quests, and share them for others to play. Mojang added an adventure mode in August 2012 and "command blocks" in October 2012, which were created specially for custom maps in Java Edition. Data packs, introduced in version 1.13 of the Java Edition, allow further customization, including the ability to add new achievements, dimensions, functions, loot tables, predicates, recipes, structures, tags, and world generation. The Xbox 360 Edition supported downloadable content, which was available to purchase via the Xbox Games Store; these content packs usually contained additional character skins. It later received support for texture packs in its twelfth title update while introducing "mash-up packs", which combined texture packs with skin packs and changes to the game's sounds, music and user interface. The first mash-up pack (and by extension, the first texture pack) for the Xbox 360 Edition was released on 4 September 2013, and was themed after the Mass Effect franchise. Unlike Java Edition, however, the Xbox 360 Edition did not support player-made mods or custom maps. A cross-promotional resource pack based on the Super Mario franchise by Nintendo was released exclusively for the Wii U Edition worldwide on 17 May 2016, and later bundled free with the Nintendo Switch Edition at launch. Another based on Fallout was released on consoles that December, and for Windows and Mobile in April 2017. In April 2018, malware was discovered in several downloadable user-made Minecraft skins for use with the Java Edition of the game. Avast stated that nearly 50,000 accounts were infected, and when activated, the malware would attempt to reformat the user's hard drive. Mojang promptly patched the issue, and released a statement stating that "the code would not be run or read by the game itself", and would run only when the image containing the skin itself was opened. In June 2017, Mojang released the "1.1 Discovery Update" to the Pocket Edition of the game, which later became the Bedrock Edition. The update introduced the "Marketplace", a catalogue of purchasable user-generated content intended to give Minecraft creators "another way to make a living from the game". Various skins, maps, texture packs and add-ons from different creators can be bought with "Minecoins", a digital currency that is purchased with real money. Additionally, users can access specific content with a subscription service titled "Marketplace Pass". Alongside content from independent creators, the Marketplace also houses items published by Mojang and Microsoft themselves, as well as official collaborations between Minecraft and other intellectual properties. By 2022, the Marketplace had over 1.7 billion content downloads, generating over $500 million in revenue. Development Before creating Minecraft, Markus "Notch" Persson was a game developer at King, where he worked until March 2009. At King, he primarily developed browser games and learned several programming languages. During his free time, he prototyped his own games, often drawing inspiration from other titles, and was an active participant on the TIGSource forums for independent developers. One such project was "RubyDung", a base-building game inspired by Dwarf Fortress, but with an isometric, three-dimensional perspective similar to RollerCoaster Tycoon. 
Among the features in RubyDung that he explored was a first-person view similar to Dungeon Keeper, though he ultimately discarded this idea, feeling the graphics were too pixelated at the time. Around March 2009, Persson left King and joined jAlbum, while continuing to work on his prototypes. Infiniminer, a block-based open-ended mining game first released in April 2009, inspired Persson's vision for RubyDung's future direction. Infiniminer heavily influenced the visual style of gameplay, including bringing back the first-person mode, the "blocky" visual style and the block-building fundamentals. However, unlike Infiniminer, Persson wanted Minecraft to have RPG elements. The first public alpha build of Minecraft was released on 17 May 2009 on TIGSource. Over the years, Persson regularly released test builds that added new features, including tools, mobs, and entire new dimensions. In 2011, partly due to the game's rising popularity, Persson decided to release a full 1.0 version—a second part of the "Adventure Update"—on 18 November 2011. Shortly after, Persson stepped down from development, handing the project's lead to Jens "Jeb" Bergensten. On 15 September 2014, Microsoft, the developer behind the Microsoft Windows operating system and Xbox video game console, announced a $2.5 billion acquisition of Mojang, which included the Minecraft intellectual property. Persson had suggested the deal on Twitter, asking a corporation to buy his stake in the game after receiving criticism for enforcing terms in the game's end-user license agreement (EULA), which had been in place for the past three years. According to Persson, Mojang CEO Carl Manneh received a call from a Microsoft executive shortly after the tweet, asking if Persson was serious about a deal. Mojang was also approached by other companies including Activision Blizzard and Electronic Arts. The deal with Microsoft was arbitrated on 6 November 2014 and led to Persson becoming one of Forbes' "World's Billionaires". After 2014, Minecraft's primary versions received usually annual major updates—free to players who have purchased the game— each primarily centered around a specific theme. For instance, version 1.13, the Update Aquatic, focused on ocean-related features, while version 1.16, the Nether Update, introduced significant changes to the Nether dimension. However, in late 2024, Mojang announced a shift in their update strategy; rather than releasing large updates annually, they opted for a more frequent release schedule with smaller, incremental updates, stating, "We know that you want new Minecraft content more often." The Bedrock Edition has also received regular updates, now matching the themes of the Java Edition updates. Other versions of the game, such as various console editions and the Pocket Edition, were either merged into Bedrock or discontinued and have not received further updates. On 7 May 2019, coinciding with Minecraft's 10th anniversary, a JavaScript recreation of an old 2009 Java Edition build named Minecraft Classic was made available to play online for free. On 16 April 2020, a Bedrock Edition-exclusive beta version of Minecraft, called Minecraft RTX, was released by Nvidia. It introduced physically-based rendering, real-time path tracing, and DLSS for RTX-enabled GPUs. The public release was made available on 8 December 2020. 
Path tracing can only be enabled in supported worlds, which can be downloaded for free via the in-game Minecraft Marketplace, with a texture pack from Nvidia's website, or with compatible third-party texture packs. It cannot be enabled by default with any texture pack on any world. Initially, Minecraft RTX was affected by many bugs, display errors, and instability issues. On 22 March 2025, a new visual mode called Vibrant Visuals, an optional graphical overhaul similar to Minecraft RTX, was announced. It promises modern rendering features—such as dynamic shadows, screen space reflections, volumetric fog, and bloom—without the need for RTX-capable hardware. Vibrant Visuals was released as a part of the Chase the Skies update on 17 June 2025 for Bedrock Edition and is planned to release on Java Edition at a later date. Development of the original edition of Minecraft—then known as Cave Game, and now known as the Java Edition—began in May 2009,[k] and this earliest phase ended on 13 May, when Persson released a test video on YouTube of an early version of the game, dubbed the "Cave game tech test" or the "Cave game tech demo". The game was named Minecraft: Order of the Stone the next day, after a suggestion made by a player. "Order of the Stone" came from the webcomic The Order of the Stick, and "Minecraft" was chosen "because it's a good name". The title was later shortened to just Minecraft, omitting the subtitle. Persson completed the game's base programming over a weekend in May 2009, and private testing began on TigIRC on 16 May. The first public release followed on 17 May 2009 as a developmental version shared on the TIGSource forums. Based on feedback from forum users, Persson continued updating the game. This initial public build later became known as Classic. Further developmental phases—dubbed Survival Test, Indev, and Infdev—were released throughout 2009 and 2010. The first major update, known as Alpha, was released on 30 June 2010. At the time, Persson was still working a day job at jAlbum but later resigned to focus on Minecraft full-time as sales of the alpha version surged. Updates were distributed automatically, introducing new blocks, items, mobs, and changes to game mechanics such as water flow. With revenue generated from the game, Persson founded Mojang, a video game studio, alongside former colleagues Jakob Porser and Carl Manneh. On 11 December 2010, Persson announced that Minecraft would enter its beta phase on 20 December. He assured players that bug fixes and all pre-release updates would remain free. As development progressed, Mojang expanded, hiring additional employees to work on the project. The game officially exited beta and launched in full on 18 November 2011. On 1 December 2011, Jens "Jeb" Bergensten took full creative control over Minecraft, replacing Persson as lead designer. On 28 February 2012, Mojang announced the hiring of the developers behind Bukkit, a popular developer API for Minecraft servers, to improve Minecraft's support of server modifications. This move included Mojang apparently taking ownership of the CraftBukkit server mod, though the acquisition later became controversial and its legitimacy was questioned due to CraftBukkit's open-source nature and its licensing under the GNU General Public License and Lesser General Public License. In August 2011, Minecraft: Pocket Edition was released as an early alpha for the Xperia Play via the Android Market, later expanding to other Android devices on 8 October 2011. The iOS version followed on 17 November 2011. 
A port was made available for Windows Phones shortly after Microsoft acquired Mojang. Unlike Java Edition, Pocket Edition initially focused on Minecraft's creative building and basic survival elements but lacked many features of the PC version. Bergensten confirmed on Twitter that the Pocket Edition was written in C++ rather than Java, as iOS does not support Java. On 10 December 2014, a port of Pocket Edition was released for Windows Phone 8.1. In July 2015, a port of the Pocket Edition to Windows 10 was released as the Windows 10 Edition, with full crossplay to other Pocket versions. In January 2017, Microsoft announced that it would no longer maintain the Windows Phone versions of Pocket Edition. On 20 September 2017, with the "Better Together Update", the Pocket Edition was ported to the Xbox One, and was renamed to the Bedrock Edition. The console versions of Minecraft debuted with the Xbox 360 edition, developed by 4J Studios and released on 9 May 2012. Announced as part of the Xbox Live Arcade NEXT promotion, this version introduced a redesigned crafting system, a new control interface, in-game tutorials, split-screen multiplayer, and online play via Xbox Live. Unlike the PC version, its worlds were finite, bordered by invisible walls. Initially, the Xbox 360 version resembled outdated PC versions but received updates to bring it closer to Java Edition before eventually being discontinued. The Xbox One version launched on 5 September 2014, featuring larger worlds and support for more players. Minecraft expanded to PlayStation platforms with PlayStation 3 and PlayStation 4 editions released on 17 December 2013 and 4 September 2014, respectively. Originally planned as a PS4 launch title, it was delayed before its eventual release. A PlayStation Vita version followed in October 2014. Like the Xbox versions, the PlayStation editions were developed by 4J Studios. Nintendo platforms received Minecraft: Wii U Edition on 17 December 2015, with a physical release in North America on 17 June 2016 and in Europe on 30 June. The Nintendo Switch version launched via the eShop on 11 May 2017. During a Nintendo Direct presentation on 13 September 2017, Nintendo announced that Minecraft: New Nintendo 3DS Edition, based on the Pocket Edition, would be available for download immediately after the livestream, and a physical copy available on a later date. The game is compatible only with the New Nintendo 3DS or New Nintendo 2DS XL systems and does not work with the original 3DS or 2DS systems. On 20 September 2017, the Better Together Update introduced Bedrock Edition across Xbox One, Windows 10, VR, and mobile platforms, enabling cross-play between these versions. Bedrock Edition later expanded to Nintendo Switch and PlayStation 4, with the latter receiving the update in December 2019, allowing cross-platform play for users with a free Xbox Live account. The Bedrock Edition released a native version for PlayStation 5 on 22 October 2024, while the Xbox Series X/S version launched on 17 June 2025. On 18 December 2018, the PlayStation 3, PlayStation Vita, Xbox 360, and Wii U versions of Minecraft received their final update and would later become known as "Legacy Console Editions". On 15 January 2019, the New Nintendo 3DS version of Minecraft received its final update, effectively becoming discontinued as well. An educational version of Minecraft, designed for use in schools, launched on 1 November 2016. It is available on Android, ChromeOS, iPadOS, iOS, MacOS, and Windows. 
On 20 August 2018, Mojang announced that it would bring Education Edition to iPadOS in Autumn 2018. It was released to the App Store on 6 September 2018. On 27 March 2019, it was announced that the Education Edition would be operated by JD.com in China. On 26 June 2020, a public beta for the Education Edition was made available to Google Play Store compatible Chromebooks. The full game was released to the Google Play Store for Chromebooks on 7 August 2020. On 20 May 2016, China Edition (also known as My World) was announced as a localized edition for China, where it was released under a licensing agreement between NetEase and Mojang. The PC edition was released for public testing on 8 August 2017. The iOS version was released on 15 September 2017, and the Android version was released on 12 October 2017. The PC edition is based on the original Java Edition, while the iOS and Android mobile versions are based on the Bedrock Edition. The edition is free-to-play and had over 700 million registered accounts by September 2023. A dedicated desktop version of Bedrock Edition is exclusive to Microsoft's Windows 10 and Windows 11 operating systems. The beta release for Windows 10 launched on the Windows Store on 29 July 2015. After nearly a year and a half in beta, Microsoft fully released the version on 19 December 2016. Called the "Ender Update", this release added new features to this version of Minecraft, such as world templates and add-on packs. On 7 June 2022, the Java and Bedrock Editions of Minecraft were merged into a single bundle for purchase on Windows; those who owned one version would automatically gain access to the other version. Both game versions would otherwise remain separate. Around 2011, prior to Minecraft's full release, Mojang collaborated with The Lego Group to create a Lego brick-based Minecraft game called Brickcraft. This would have modified the base Minecraft game to use Lego bricks, which meant adapting the basic 1×1 block to account for larger pieces typically used in Lego sets. Persson worked on an early version called "Project Rex Kwon Do", named after the character of the same name from the film Napoleon Dynamite. Although Lego approved the project and Mojang assigned two developers for six months, it was canceled due to the Lego Group's demands, according to Mojang's Daniel Kaplan. Lego considered buying Mojang to complete the game, but when Microsoft offered over $2 billion for the company, Lego stepped back, unsure of Minecraft's potential. On 26 June 2025, a build of Brickcraft dated 28 June 2012 was published on Omniarchive, a community archive website. Initially, Markus Persson planned to support the Oculus Rift with a Minecraft port. However, after Facebook acquired Oculus in 2014, he abruptly canceled the plans, stating, "Facebook creeps me out." In 2016, a community-made mod, Minecraft VR, added VR support for Java Edition, followed by Vivecraft for HTC Vive. Later that year, Microsoft introduced official Oculus Rift support for Windows 10 Edition, leading to the discontinuation of the Minecraft VR mod due to trademark complaints. Vivecraft was endorsed by Minecraft VR contributors for its Rift support. Also available is a Gear VR version, titled Minecraft: Gear VR Edition. Windows Mixed Reality support was added in 2017. On 7 September 2020, Mojang Studios announced that the PlayStation 4 Bedrock version would receive PlayStation VR support later that month. 
In September 2024, the Minecraft team announced they would no longer support PlayStation VR, which received its final update in March 2025. Music and sound design Minecraft's music and sound effects were produced by German musician Daniel Rosenfeld, better known as C418. To create the sound effects for the game, Rosenfeld made extensive use of Foley techniques. On learning the processes for the game, he remarked, "Foley's an interesting thing, and I had to learn its subtleties. Early on, I wasn't that knowledgeable about it. It's a whole trial-and-error process. You just make a sound and eventually you go, 'Oh my God, that's it! Get the microphone!' There's no set way of doing anything at all." He reminisced on creating the in-game sound for grass blocks, stating "It turns out that to make grass sounds you don't actually walk on grass and record it, because grass sounds like nothing. What you want to do is get a VHS, break it apart, and just lightly touch the tape." According to Rosenfeld, his favorite sound to design for the game was the hisses of spiders. He elaborates, "I like the spiders. Recording that was a whole day of me researching what a spider sounds like. Turns out, there are spiders that make little screeching sounds, so I think I got this recording of a fire hose, put it in a sampler, and just pitched it around until it sounded like a weird spider was talking to you." Many of the sound design decisions by Rosenfeld were done accidentally or spontaneously. The creeper notably lacks any specific noises apart from a loud fuse-like sound when about to explode; Rosenfeld later recalled "That was just a complete accident by Markus and me [sic]. We just put in a placeholder sound of burning a matchstick. It seemed to work hilariously well, so we kept it." On other sounds, such as those of the zombie, Rosenfeld remarked, "I actually never wanted the zombies so scary. I intentionally made them sound comical. It's nice to hear that they work so well [...]." Rosenfeld remarked that the sound engine was "terrible" to work with, remembering "If you had two song files at once, it [the game engine] would actually crash. There were so many more weird glitches like that the guys never really fixed because they were too busy with the actual game and not the sound engine." The background music in Minecraft consists of instrumental ambient music. To compose the music of Minecraft, Rosenfeld used the package from Ableton Live, along with several additional plug-ins. Speaking on them, Rosenfeld said "They can be pretty much everything from an effect to an entire orchestra. Additionally, I've got some synthesizers that are attached to the computer. Like a Moog Voyager, Dave Smith Prophet 08 and a Virus TI." On 4 March 2011, Rosenfeld released a soundtrack titled Minecraft – Volume Alpha; it includes most of the tracks featured in Minecraft, as well as other music not featured in the game. Kirk Hamilton of Kotaku chose the music in Minecraft as one of the best video game soundtracks of 2011. On 9 November 2013, Rosenfeld released the second official soundtrack, titled Minecraft – Volume Beta, which included the music that was added in a 2013 "Music Update" for the game. A physical release of Volume Alpha, consisting of CDs, black vinyl, and limited-edition transparent green vinyl LPs, was issued by indie electronic label Ghostly International on 21 August 2015. 
On 14 August 2020, Ghostly released Volume Beta on CD and vinyl, with alternate color LPs and lenticular cover pressings released in limited quantities. The final update Rosenfeld worked on was 2018's 1.13 Update Aquatic. His music remained the only music in the game until 2020's "Nether Update", which introduced pieces by Lena Raine. Since then, other composers have made contributions, including Kumi Tanioka, Samuel Åberg, Aaron Cherof, and Amos Roddy, with Raine serving as the new primary composer. Ownership of all music besides Rosenfeld's independently released albums has been retained by Microsoft, with its label publishing all of the other artists' releases. Gareth Coker also composed some of the music for the minigames in the Legacy Console editions. Rosenfeld had stated his intent to create a third album of music for the game in a 2015 interview with Fact, and confirmed its existence in a 2017 tweet, stating that his work on the record had by then grown longer than the previous two albums combined, which together run over 3 hours and 18 minutes. However, due to licensing issues with Microsoft, the third volume has not seen release. On 8 January 2021, Rosenfeld was asked in an interview with Anthony Fantano whether there was still a third volume of his music intended for release. Rosenfeld responded, saying, "I have something—I consider it finished—but things have become complicated, especially as Minecraft is now a big property, so I don't know." Reception Minecraft has received critical acclaim, with praise for the creative freedom it grants players in-game, as well as the ease of enabling emergent gameplay. Critics have expressed enjoyment of Minecraft's complex crafting system, commenting that it is an important aspect of the game's open-ended gameplay. Most publications were impressed by the game's "blocky" graphics, with IGN describing them as "instantly memorable". Reviewers also liked the game's adventure elements, noting that the game creates a good balance between exploring and building. The game's multiplayer feature has generally been received favorably, with IGN commenting that "adventuring is always better with friends". Jaz McDougall of PC Gamer said Minecraft is "intuitively interesting and contagiously fun, with an unparalleled scope for creativity and memorable experiences". It has been regarded as having introduced millions of children to the digital world, insofar as its basic game mechanics are logically analogous to computer commands. IGN was disappointed by the troublesome steps needed to set up multiplayer servers, calling the process a "hassle". Critics also said that visual glitches occur periodically. Despite its release out of beta in 2011, GameSpot said the game had an "unfinished feel", adding that some game elements seem "incomplete or thrown together in haste". A review of the alpha version, by Scott Munro of the Daily Record, called it "already something special" and urged readers to buy it. Jim Rossignol of Rock Paper Shotgun also recommended the alpha of the game, calling it "a kind of generative 8-bit Lego Stalker". On 17 September 2010, the gaming webcomic Penny Arcade began a series of comics and news posts about the addictiveness of the game. The Xbox 360 version was generally received positively by critics, but did not receive as much praise as the PC version. 
Although reviewers were disappointed by the lack of features such as mod support and content from the PC version, they praised the port's addition of a tutorial, in-game tips, and crafting recipes, saying that these made the game more user-friendly. The Xbox One Edition was one of the best-received ports, being praised for its relatively large worlds. The PlayStation 3 Edition also received generally favorable reviews, being compared to the Xbox 360 Edition and praised for its well-adapted controls. The PlayStation 4 edition was the best-received port to date, being praised for having worlds 36 times larger than the PlayStation 3 edition's and described as nearly identical to the Xbox One edition. The PlayStation Vita Edition received generally positive reviews from critics but was noted for its technical limitations. The Wii U version received generally positive reviews from critics but was noted for a lack of GamePad integration. The 3DS version received mixed reviews, being criticized for its high price, technical issues, and lack of cross-platform play. The Nintendo Switch Edition received fairly positive reviews from critics, being praised, like other modern ports, for its relatively larger worlds. Minecraft: Pocket Edition initially received mixed reviews from critics. Although reviewers appreciated the game's intuitive controls, they were disappointed by the lack of content. The inability to collect resources and craft items, as well as the limited types of blocks and lack of hostile mobs, were especially criticized. After updates added more content, Pocket Edition started receiving more positive reviews. Reviewers complimented the controls and the graphics, but still noted a lack of content. Minecraft surpassed a million purchases less than a month after entering its beta phase in early 2011. At the same time, the game had no publisher backing and had never been commercially advertised, spreading instead through word of mouth and various unpaid references in popular media such as the Penny Arcade webcomic. By April 2011, Persson estimated that Minecraft had made €23 million (US$33 million) in revenue, with 800,000 sales of the alpha version of the game, and over 1 million sales of the beta version. In November 2011, prior to the game's full release, Minecraft beta surpassed 16 million registered users and 4 million purchases. By March 2012, Minecraft had become the 6th best-selling PC game of all time. As of 10 October 2014, the game had sold 17 million copies on PC, becoming the best-selling PC game of all time. On 25 February 2014, the game reached 100 million registered users. By May 2019, 180 million copies had been sold across all platforms, making it the single best-selling video game of all time. The free-to-play Minecraft China version had over 700 million registered accounts by September 2023. By 2023, the game had sold over 300 million copies. As of April 2025, Minecraft has sold over 350 million copies. The Xbox 360 version of Minecraft became profitable within the first day of the game's release in 2012, when the game broke Xbox Live sales records with 400,000 players online. Within a week of being on the Xbox Live Marketplace, Minecraft sold a million copies. GameSpot announced in December 2012 that Minecraft had sold over 4.48 million copies since the game debuted on Xbox Live Arcade in May 2012. In 2012, Minecraft was the most purchased title on Xbox Live Arcade; it was also the fourth most played title on Xbox Live based on average unique users per day. 
As of 4 April 2014, the Xbox 360 version had sold 12 million copies. In addition, Minecraft: Pocket Edition had reached 21 million sales. The PlayStation 3 Edition sold one million copies in five weeks. The release of the game's PlayStation Vita version boosted Minecraft sales by 79%, outselling both PS3 and PS4 debut releases and becoming the largest Minecraft launch on a PlayStation console. The PS Vita version sold 100,000 digital copies in Japan within the first two months of release, according to an announcement by SCE Japan Asia. By January 2015, 500,000 digital copies of Minecraft were sold in Japan across all PlayStation platforms, with a surge in primary school children purchasing the PS Vita version. As of 2022, the Vita version had sold over 1.65 million physical copies in Japan, making it the best-selling Vita game in the country. Minecraft helped improve Microsoft's total first-party revenue by $63 million for the 2015 second quarter. The game, including all of its versions, had over 112 million monthly active players by September 2019. On its 11th anniversary in May 2020, Mojang announced that Minecraft had reached over 200 million copies sold across platforms with over 126 million monthly active players. By April 2021, the number of active monthly users had climbed to 140 million. In July 2010, PC Gamer listed Minecraft as the fourth-best game to play at work. In December of that year, Good Game selected Minecraft as their choice for Best Downloadable Game of 2010, Gamasutra named it the eighth best game of the year as well as the eighth best indie game of the year, and Rock, Paper, Shotgun named it the "game of the year". Indie DB awarded the game the 2010 Indie of the Year award as chosen by voters, in addition to two out of five Editor's Choice awards for Most Innovative and Best Singleplayer Indie. It was also awarded Game of the Year by PC Gamer UK. The game was nominated for the Seumas McNally Grand Prize, Technical Excellence, and Excellence in Design awards at the March 2011 Independent Games Festival and won the Grand Prize and the community-voted Audience Award. At the 2011 Game Developers Choice Awards, Minecraft won in the Best Debut Game, Best Downloadable Game, and Innovation Award categories, winning every award for which it was nominated. It also won GameCity's video game arts award. On 5 May 2011, Minecraft was selected as one of the 80 games that would be displayed at the Smithsonian American Art Museum as part of The Art of Video Games exhibit that opened on 16 March 2012. At the 2011 Spike Video Game Awards, Minecraft won the award for Best Independent Game and was nominated in the Best PC Game category. In 2012, at the British Academy Video Games Awards, Minecraft was nominated in the GAME Award of 2011 category and Persson received The Special Award. In 2012, Minecraft XBLA was awarded a Golden Joystick Award in the Best Downloadable Game category, and a TIGA Games Industry Award in the Best Arcade Game category. In 2013, it was nominated as the family game of the year at the British Academy Video Games Awards. During the 16th Annual D.I.C.E. Awards, the Academy of Interactive Arts & Sciences nominated the Xbox 360 version of Minecraft for "Strategy/Simulation Game of the Year". Minecraft Console Edition won the TIGA Game of the Year award in 2014. In 2015, the game placed 6th on USgamer's The 15 Best Games Since 2000 list. In 2016, Minecraft placed 6th on Time's The 50 Best Video Games of All Time list. 
Minecraft was nominated for the 2013 Kids' Choice Awards for Favorite App, but lost to Temple Run. It was nominated for the 2014 Kids' Choice Awards for Favorite Video Game, but lost to Just Dance 2014. The game later won the Most Addicting Game award at the 2015 Kids' Choice Awards. In addition, the Java Edition was nominated for "Favorite Video Game" at the 2018 Kids' Choice Awards, while the game itself won the "Still Playing" award at the 2019 Golden Joystick Awards, as well as the "Favorite Video Game" award at the 2020 Kids' Choice Awards. Minecraft also won "Stream Game of the Year" at the inaugural Streamer Awards in 2022. The game later garnered a Nickelodeon Kids' Choice Award nomination for Favorite Video Game in 2021, and won the same category in 2022 and 2023. At the Golden Joystick Awards 2025, it won the Still Playing Award – PC and Console. Minecraft has been subject to several notable controversies. In June 2014, Mojang announced that it would begin enforcing the portion of Minecraft's end-user license agreement (EULA) which prohibits servers from giving in-game advantages to players in exchange for donations or payments. Spokesperson Owen Hill stated that servers could still require players to pay a fee to access the server and could sell in-game cosmetic items. The change was supported by Persson, citing emails he received from parents of children who had spent hundreds of dollars on servers. The Minecraft community and server owners protested, arguing that the EULA's terms were broader than Mojang was claiming, that the crackdown would force smaller servers to shut down for financial reasons, and that Mojang was suppressing competition for its own Minecraft Realms subscription service. The controversy contributed to Persson's decision to sell Mojang. In 2020, Mojang announced an eventual change to the Java Edition to require a login from a Microsoft account rather than a Mojang account, the latter of which would be sunsetted. This also required Java Edition players to create Xbox network Gamertags. Mojang defended the move to Microsoft accounts by saying that improved security could be offered, including two-factor authentication, blocking cyberbullies in chat, and improved parental controls. The community responded with intense backlash, citing various technical difficulties encountered in the process and the fact that account migration would be mandatory, even for those who do not play on servers. As of 10 March 2022, Microsoft required that all players migrate in order to maintain access to the Java Edition of Minecraft. Mojang announced a deadline of 19 September 2023 for account migration, after which all legacy Mojang accounts became inaccessible and unable to be migrated. In June 2022, Mojang added a player-reporting feature to the Java Edition. Players could report other players on multiplayer servers for sending messages prohibited by the Xbox Live Code of Conduct; report categories included profane language,[l] substance abuse, hate speech, threats of violence, and nudity. If a player was found to be in violation of Xbox Community Standards, they would be banned from all servers for a specific period of time or permanently. The update containing the report feature (1.19.1) was released on 27 July 2022. Mojang received substantial backlash and protest from community members, one of the most common complaints being that banned players would be forbidden from joining any server, even private ones. 
Others took issue with what they saw as Microsoft increasing control over its player base and exercising censorship, leading some to start the hashtag #saveminecraft and dub the version "1.19.84", a reference to the dystopian novel Nineteen Eighty-Four. The "Mob Vote" was an online event organized by Mojang in which the Minecraft community voted for one of three original mob concepts; initially, the winning mob was to be implemented in a future update, while the losing mobs were scrapped, though after the first Mob Vote this was changed so that losing mobs could still be added to the game in the future. The first Mob Vote was held during Minecon Earth 2017 and became an annual event starting with Minecraft Live 2020. The Mob Vote was often criticized for forcing players to choose one mob instead of implementing all three, causing divisions and flaming within the community, and potentially allowing internet bots and Minecraft content creators with large fanbases to conduct vote brigading. The Mob Vote was also blamed for a perceived lack of new content added to Minecraft since Microsoft's acquisition of Mojang in 2014. The 2023 Mob Vote featured three passive mobs—the crab, the penguin, and the armadillo—with voting scheduled to start on 13 October. In response, a Change.org petition was created on 6 October, demanding that Mojang eliminate the Mob Vote and instead implement all three mobs going forward. The petition received approximately 445,000 signatures by 13 October and was joined by calls to boycott the Mob Vote, as well as a partially tongue-in-cheek "revolutionary" propaganda campaign in which sympathizers created anti-Mojang and pro-boycott posters in the vein of real 20th-century propaganda posters. Mojang did not release an official response to the boycott, and the Mob Vote otherwise proceeded normally, with the armadillo winning the vote. In September 2024, as part of a blog post detailing their future plans for Minecraft's development, Mojang announced the Mob Vote would be retired. Cultural impact In September 2019, The Guardian classified Minecraft as the best video game of the 21st century to date, and in November 2019, Polygon called it the "most important game of the decade" in its 2010s "decade in review". In June 2020, Minecraft was inducted into the World Video Game Hall of Fame. Minecraft is recognized as one of the first successful games to use an early access model, drawing in sales prior to its full release to help fund development. As Minecraft helped to bolster indie game development in the early 2010s, it also helped to popularize the use of the early access model in indie game development. Social media sites such as YouTube, Facebook, and Reddit have played a significant role in popularizing Minecraft. Research conducted by the Annenberg School for Communication at the University of Pennsylvania showed that one-third of Minecraft players learned about the game via Internet videos. In 2010, Minecraft-related videos, often made by commentators, began to gain influence on YouTube. The videos usually contain screen-capture footage of the game and voice-overs. Common coverage in the videos includes creations made by players, walkthroughs of various tasks, and parodies of works in popular culture. By May 2012, over four million Minecraft-related YouTube videos had been uploaded. The game would go on to be a prominent fixture within YouTube's gaming scene throughout the 2010s; in 2014, it was the second-most searched term on the entire platform. 
By 2018, it was still YouTube's biggest game globally. Some popular commentators were employed by Machinima, a now-defunct gaming video company that owned a highly watched entertainment channel on YouTube. The Yogscast is a British company that regularly produces Minecraft videos; their YouTube channel has attained billions of views, and their panel at Minecon 2011 had the highest attendance. Another well-known YouTube personality is Jordan Maron, known online as CaptainSparklez, who has also created many Minecraft music parodies, including "Revenge", a parody of Usher's "DJ Got Us Fallin' in Love". Minecraft's popularity on YouTube was described by Polygon as quietly dominant, although in 2019, thanks in part to PewDiePie's playthroughs of the game, Minecraft experienced a visible uptick in popularity on the platform. Longer-running series include Far Lands or Bust, dedicated to reaching the obsolete "Far Lands" glitch on foot on an older version of the game. On 14 December 2021, YouTube announced that the total number of Minecraft-related views on the platform had exceeded one trillion. Minecraft has been referenced by other video games, such as Torchlight II, Team Fortress 2, Borderlands 2, Choplifter HD, Super Meat Boy, The Elder Scrolls V: Skyrim, The Binding of Isaac, The Stanley Parable, and FTL: Faster Than Light. Minecraft is officially represented in downloadable content for the crossover fighter Super Smash Bros. Ultimate, with Steve as a playable character whose moveset includes references to building, crafting, and redstone, alongside an Overworld-themed stage. It was also referenced by electronic music artist Deadmau5 in his performances. The game is also referenced heavily in "Informative Murder Porn", the second episode of the seventeenth season of the animated television series South Park. In 2025, A Minecraft Movie was released. It made $313 million at the box office in its first week, a record-breaking opening for a video game adaptation. Minecraft has been noted as a cultural touchstone for Generation Z, as many of the generation's members played the game at a young age. The possible applications of Minecraft have been discussed extensively, especially in the fields of computer-aided design (CAD) and education. In a panel at Minecon 2011, a Swedish developer discussed the possibility of using the game to redesign public buildings and parks, stating that rendering using Minecraft was much more user-friendly for the community, making it easier to envision the functionality of new buildings and parks. In 2012, a member of the Human Dynamics group at the MIT Media Lab, Cody Sumter, said: "Notch hasn't just built a game. He's tricked 40 million people into learning to use a CAD program." Various software has been developed to allow virtual designs to be printed using professional 3D printers or personal printers such as MakerBot and RepRap. In September 2012, Mojang began the Block by Block project in cooperation with UN Habitat to create real-world environments in Minecraft. The project allows young people who live in those environments to participate in designing the changes they would like to see. Using Minecraft, the community has helped reconstruct the areas of concern, and citizens are invited to enter the Minecraft servers and modify their own neighborhood. 
Carl Manneh, Mojang's managing director, called the game "the perfect tool to facilitate this process", adding "The three-year partnership will support UN-Habitat's Sustainable Urban Development Network to upgrade 300 public spaces by 2016." Mojang signed the Minecraft building community FyreUK to help render the environments in Minecraft. The first pilot project began in Kibera, one of Nairobi's informal settlements, and is in the planning phase. The Block by Block project is based on an earlier initiative started in October 2011, Mina Kvarter (My Block), which gave young people in Swedish communities a tool to visualize how they wanted to change their part of town. According to Manneh, the project was a helpful way to visualize urban planning ideas without necessarily having training in architecture. The ideas presented by the citizens were a template for political decisions. In April 2014, the Danish Geodata Agency generated all of Denmark at full scale in Minecraft based on its own geodata. This was possible because Denmark is one of the flattest countries, with its highest point at 171 meters (the 30th-smallest elevation span of any country), while the height limit in default Minecraft was around 192 meters above in-game sea level when the project was completed. Taking advantage of the game's accessibility where other websites are censored, the non-governmental organization Reporters Without Borders has used an open Minecraft server to create the Uncensored Library, a repository within the game of journalism by authors from countries (including Egypt, Mexico, Russia, Saudi Arabia and Vietnam) who have been censored and arrested, such as Jamal Khashoggi. The neoclassical virtual building was created over about 250 hours by an international team of 24 people. Despite its unpredictable nature, Minecraft speedrunning, where players time themselves from spawning into a new world to reaching The End and defeating the Ender Dragon boss, is popular. Some speedrunners use a combination of mods, external programs, and debug menus, while other runners play the game in a more vanilla or more consistency-oriented way. Minecraft has been used in educational settings through initiatives such as MinecraftEdu, founded in 2011 to make the game affordable and accessible for schools in collaboration with Mojang. MinecraftEdu provided features allowing teachers to monitor student progress, including screenshot submissions as evidence of lesson completion, and by 2012 reported that approximately 250,000 students worldwide had access to the platform. Mojang also developed Minecraft: Education Edition with pre-built lesson plans for up to 30 students in a closed environment. Educators have used Minecraft to teach subjects such as history, language arts, and science through custom-built environments, including reconstructions of historical landmarks and large-scale models of biological structures such as animal cells. The introduction of redstone circuits enabled the construction of functional virtual machines such as a hard drive and an 8-bit computer. Mods have been created to use these mechanics for teaching programming. In 2014, the British Museum announced a project to reproduce its building and exhibits in Minecraft in collaboration with the public. Microsoft and Code.org have offered Minecraft-based tutorials and activities designed to teach programming, reporting by 2018 that more than 85 million children had used their resources. 
In 2025, the Musée de Minéralogie in Paris held a temporary exhibition titled "Minerals in Minecraft." Following the initial surge in popularity of Minecraft in 2010, other video games were criticised for having various similarities to Minecraft, and some were described as being "clones", often due to a direct inspiration from Minecraft, or a superficial similarity. Examples include Ace of Spades, CastleMiner, CraftWorld, FortressCraft, Terraria, BlockWorld 3D, Total Miner, and Luanti (formerly Minetest). David Frampton, designer of The Blockheads, reported that one failure of his 2D game was the "low resolution pixel art" that too closely resembled the art in Minecraft, which resulted in "some resistance" from fans. A homebrew adaptation of the alpha version of Minecraft for the Nintendo DS, titled DScraft, has been released; it has been noted for its similarity to the original game considering the technical limitations of the system. In response to Microsoft's acquisition of Mojang and their Minecraft IP, various developers announced further clone titles developed specifically for Nintendo's consoles, as they were the only major platforms not to officially receive Minecraft at the time. These clone titles include UCraft (Nexis Games), Cube Life: Island Survival (Cypronia), Discovery (Noowanda), Battleminer (Wobbly Tooth Games), Cube Creator 3D (Big John Games), and Stone Shire (Finger Gun Games). Despite this, the fears of fans were unfounded, with official Minecraft releases on Nintendo consoles eventually resuming. Markus Persson made another similar game, Minicraft, for a Ludum Dare competition in 2011. In 2025, Persson announced through a poll on his X account that he was considering developing a spiritual successor to Minecraft. He later clarified that he was "100% serious", and that he had "basically announced Minecraft 2". Within days, however, Persson cancelled the plans after speaking to his team. In November 2024, artificial intelligence companies Decart and Etched released Oasis, an artificially generated version of Minecraft, as a proof of concept. Every in-game element is completely AI-generated in real time and the model does not store world data, leading to "hallucinations" such as items and blocks appearing that were not there before. In January 2026, indie game developer Unomelon announced that their voxel sandbox game Allumeria would be playable in Steam Next Fest that year. On 10 February, Mojang issued a DMCA takedown of Allumeria on Steam through Valve, alleging the game was infringing on Minecraft's copyright. Some reports suggested that the takedown may have used an automatic AI copyright claiming service. The DMCA was later withdrawn. Minecon was an annual official fan convention dedicated to Minecraft. The first full Minecon was held in November 2011 at the Mandalay Bay Hotel and Casino in Las Vegas. The event included the official launch of Minecraft; keynote speeches, including one by Persson; building and costume contests; Minecraft-themed breakout classes; exhibits by leading gaming and Minecraft-related companies; commemorative merchandise; and autograph and picture times with Mojang employees and well-known contributors from the Minecraft community. In 2016, Minecon was held in-person for the last time, with the following years featuring annual "Minecon Earth" livestreams on minecraft.net and YouTube instead. These livestreams, later rebranded to "Minecraft Live", included the mob/biome votes, and announcements of new game updates. 
In 2025, "Minecraft Live" became a twice-yearly event as part of Minecraft's changing update schedule. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/NASA] | [TOKENS: 16674] |
Contents NASA The National Aeronautics and Space Administration (NASA /ˈnæsə/) is an independent agency of the U.S. federal government responsible for the United States' civil space program and for research in aeronautics and space exploration. Headquartered in Washington, D.C., NASA operates ten field centers across the United States and is organized into mission directorates for Science, Space Operations, Exploration Systems Development, Space Technology, Aeronautics Research, and Mission Support. Established in 1958, NASA succeeded the National Advisory Committee for Aeronautics (NACA) to give the American space development effort a distinct civilian orientation, emphasizing peaceful applications in space science. It has since led most of America's space exploration programs, including Project Mercury, Project Gemini, the 1968–1972 Apollo program missions, the Skylab space station, and the Space Shuttle. The agency maintains major ground and communications infrastructure including the Deep Space Network and the Near Space Network. NASA's science division is focused on better understanding Earth through the Earth Observing System; advancing heliophysics through the efforts of the Science Mission Directorate's Heliophysics Research Program; exploring bodies throughout the Solar System with advanced robotic spacecraft such as New Horizons and planetary rovers such as Perseverance; and researching astrophysics topics, such as the Big Bang, through the James Webb Space Telescope, the four Great Observatories (including the Hubble Space Telescope), and associated programs. The Launch Services Program oversees launch operations for its uncrewed launches. NASA supports the International Space Station (ISS) along with the Commercial Crew Program and oversees the development of the Orion spacecraft and the Space Launch System for the lunar Artemis program. It maintains programmatic partnerships with agencies such as ESA, JAXA, CSA, Roscosmos (for ISS operations), NOAA, and the USGS. NASA's missions and media operations—such as NASA TV, Astronomy Picture of the Day, and the NASA+ streaming service—have maintained high public visibility and contributed to spaceflight outreach in the United States and abroad. A subject of numerous major films, NASA has maintained an influence on American popular culture since the Apollo 11 mission in 1969. For FY2022, Congress authorized a $24.041 billion budget, with a civil-service workforce of roughly 18,400; since December 2025, the administrator is Jared Isaacman. History NASA traces its roots to the National Advisory Committee for Aeronautics (NACA). Despite Dayton, Ohio being the birthplace of aviation, by 1914 the United States recognized that it was far behind Europe in aviation capability. Determined to regain American leadership in aviation, the United States Congress created the Aviation Section of the US Army Signal Corps in 1914 and established NACA in 1915 to foster aeronautical research and development. Over the next forty years, NACA would conduct aeronautical research in support of the US Air Force, US Army, US Navy, and the civil aviation sector. After the end of World War II, NACA became interested in the possibilities of guided missiles and supersonic aircraft, developing and testing the Bell X-1 in a joint program with the US Air Force. NACA's interest in space grew out of its rocketry program at the Pilotless Aircraft Research Division. The Soviet Union's launch of Sputnik 1 ushered in the Space Age and kicked off the Space Race. 
Despite NACA's early rocketry program, the responsibility for launching the first American satellite fell to the Naval Research Laboratory's Project Vanguard, whose operational issues ensured the Army Ballistic Missile Agency would launch Explorer 1, America's first satellite, on February 1, 1958. The Eisenhower Administration decided to split the United States' military and civil spaceflight programs, which were organized together under the Department of Defense's Advanced Research Projects Agency. NASA was established on July 29, 1958, with the signing of the National Aeronautics and Space Act, and it began operations on October 1, 1958. As America's premier aeronautics agency, NACA formed the core of NASA's new structure by reassigning 8,000 employees and three major research laboratories. NASA also proceeded to absorb the Naval Research Laboratory's Project Vanguard, the Army's Jet Propulsion Laboratory (JPL), and the Army Ballistic Missile Agency under Wernher von Braun. This left NASA firmly as the United States' civil space lead and the Air Force as the military space lead. Plans for human spaceflight began in the US Armed Forces prior to NASA's creation. The Air Force's Man in Space Soonest project, formed in 1956, coupled with the Army's Project Adam, served as the foundation for Project Mercury. NASA established the Space Task Group to manage the program, which would conduct crewed suborbital flights with the Army's Redstone rockets and orbital flights with the Air Force's Atlas launch vehicles. While NASA intended for its first astronauts to be civilians, President Eisenhower directed that they be selected from the military. The Mercury 7 astronauts included three Air Force pilots, three Navy aviators, and one Marine Corps pilot. On May 5, 1961, Alan Shepard became the first American to enter space, performing a suborbital spaceflight in the Freedom 7. This flight occurred less than a month after the Soviet cosmonaut Yuri Gagarin became the first human in space, executing a full orbital spaceflight. NASA's first orbital spaceflight was conducted by John Glenn on February 20, 1962, in the Friendship 7, making three full orbits before reentering. Glenn had to fly parts of his final two orbits manually due to an autopilot malfunction. The sixth and final Mercury mission was flown by Gordon Cooper in May 1963, performing 22 orbits over 34 hours in the Faith 7. The Mercury Program was widely recognized as a resounding success, achieving its objectives to orbit a human in space, develop tracking and control systems, and identify other issues associated with human spaceflight. While much of NASA's attention turned to space, it did not put aside its aeronautics mission. Early aeronautics research built upon the X-1's supersonic flight in an attempt to develop an aircraft capable of hypersonic flight. The North American X-15 was a joint NASA–US Air Force program, with the hypersonic test aircraft becoming the first aircraft not designed as a dedicated spacecraft to cross from the atmosphere into outer space. The X-15 also served as a testbed for Apollo program technologies, as well as ramjet and scramjet propulsion. Escalations in the Cold War between the United States and Soviet Union prompted President John F. Kennedy to charge NASA with landing a man on the Moon and returning him safely to Earth by the end of the 1960s; Kennedy installed James E. Webb as NASA administrator to achieve this goal. 
On May 25, 1961, President Kennedy openly stated this goal in his "Urgent National Needs" speech to the United States Congress, declaring: I believe this Nation should commit itself to achieving the goal, before this decade is out, of landing a man on the Moon and returning him safely to Earth. No single space project in this period will be more impressive to mankind, or more important for the long-range exploration of space; and none will be so difficult or expensive to accomplish. Kennedy gave his "We choose to go to the Moon" speech the next year, on September 12, 1962, at Rice University, where he addressed the nation, hoping to reinforce public support for the Apollo program. Despite attacks on the goal of landing astronauts on the Moon from the former president Dwight Eisenhower and 1964 presidential candidate Barry Goldwater, President Kennedy was able to protect NASA's growing budget, of which 50% went directly to human spaceflight; it was later estimated that, at its height, 5% of Americans worked on some aspect of the Apollo program. Mirroring the Department of Defense's program management concept using redundant systems in building the first intercontinental ballistic missiles, NASA requested the Air Force assign Major General Samuel C. Phillips to the space agency, where he would serve as the director of the Apollo program. Development of the Saturn V rocket, derived from the Army Ballistic Missile Agency's original Saturn I, was led by Wernher von Braun and his team at the Marshall Space Flight Center. The Apollo spacecraft was designed and built by North American Aviation, while the Apollo Lunar Module was designed and built by Grumman. To develop the spaceflight skills and equipment required for a lunar mission, NASA initiated Project Gemini. Using a modified Air Force Titan II launch vehicle, the Gemini capsule could hold two astronauts for flights of over two weeks. Gemini pioneered the use of fuel cells instead of batteries, and conducted the first American spacewalks and rendezvous operations. The Ranger program was started in the 1950s as a response to Soviet lunar exploration; however, most missions ended in failure. The Lunar Orbiter program had greater success, mapping the surface in preparation for Apollo landings, conducting meteoroid detection, and measuring radiation levels. The Surveyor program conducted uncrewed lunar landings and takeoffs, as well as making surface and regolith observations. Despite the setback caused by the Apollo 1 fire, which killed three astronauts, the program proceeded. Apollo 8 was the first crewed spacecraft to leave low Earth orbit and the first human spaceflight to reach the Moon. The crew orbited the Moon ten times on December 24 and 25, 1968, and then traveled safely back to Earth. The three Apollo 8 astronauts—Frank Borman, James Lovell, and William Anders—were the first humans to see the Earth as a globe in space, the first to witness an Earthrise, and the first to see and manually photograph the far side of the Moon. The first lunar landing was conducted by Apollo 11. Commanded by Neil Armstrong with astronauts Buzz Aldrin and Michael Collins, Apollo 11 was one of the most significant missions in NASA's history, marking the end of the Space Race when the Soviet Union gave up its lunar ambitions. As the first human to step on the surface of the Moon, Neil Armstrong uttered the now-famous words: That's one small step for man, one giant leap for mankind. 
NASA would conduct six total lunar landings as part of the Apollo program, with Apollo 17 concluding the program in 1972. Wernher von Braun had advocated for NASA to develop a space station since the agency was created. In 1973, following the end of the Apollo lunar missions, NASA launched its first space station, Skylab, on the final launch of the Saturn V. Skylab reused a significant amount of Apollo and Saturn hardware, with a repurposed Saturn V third stage serving as the primary module for the space station. Damage to Skylab during its launch required spacewalks to be performed by the first crew to make it habitable and operational. Skylab hosted three crewed missions and was decommissioned in 1974 and deorbited in 1979, two years prior to the first launch of the Space Shuttle and any possibility of boosting its orbit. In 1975, the Apollo–Soyuz mission, in which a US Apollo spacecraft docked with a Soviet Soyuz capsule, became the first international spaceflight and a major diplomatic accomplishment between the Cold War rivals; it also marked the last flight of the Apollo capsule. During the 1960s, NASA started its space science and interplanetary probe program. The Mariner program was its flagship program, launching probes to Venus, Mars, and Mercury in the 1960s. The Jet Propulsion Laboratory was the lead NASA center for robotic interplanetary exploration, making significant discoveries about the inner planets. Despite these successes, Congress was unwilling to fund further interplanetary missions and NASA Administrator James Webb suspended all future interplanetary probes to focus resources on the Apollo program. Following the conclusion of the Apollo program, NASA resumed launching interplanetary probes and expanded its space science program. The first planet targeted for exploration was Venus, which shares many characteristics with Earth. First visited by the American Mariner 2 spacecraft, Venus was observed to be a hot and inhospitable planet. Follow-on missions included the Pioneer Venus project in the 1970s and Magellan, which performed radar mapping of Venus' surface in the 1980s and 1990s. Later missions flew by Venus on their way to other destinations in the Solar System. Mars has long been a planet of intense fascination for NASA, being suspected of potentially having harbored life. Mariner 4 was the first NASA spacecraft to fly by Mars, followed by Mariner 6 and Mariner 7. Mariner 9 was the first orbital mission to Mars. Launched in 1975, the Viking program consisted of two landings on Mars in 1976. Follow-on missions would not be launched until 1996, with the Mars Global Surveyor orbiter and Mars Pathfinder, which deployed the first Mars rover, Sojourner. During the early 2000s, the 2001 Mars Odyssey orbiter reached the planet and in 2004 the Spirit and Opportunity rovers landed on the Red Planet. This was followed in 2005 by the Mars Reconnaissance Orbiter and in 2007 by the Phoenix Mars lander. The Curiosity rover, which landed in 2012, found that radiation levels on Mars were equal to those on the International Space Station, greatly increasing the possibility of human exploration, and observed the key chemical ingredients for life to occur. In 2013, the Mars Atmosphere and Volatile Evolution (MAVEN) mission observed the Martian upper atmosphere and space environment and in 2018, the Interior Exploration using Seismic Investigations, Geodesy, and Heat Transport (InSight) lander studied the Martian interior. 
The 2021 Perseverance rover carried the first extraplanetary aircraft, a helicopter named Ingenuity. NASA also launched a mission to Mercury in 2004, with the MESSENGER probe demonstrating the first use of a solar sail. NASA also launched probes to the outer Solar System starting in the 1960s. Pioneer 10 was the first probe to the outer planets, flying by Jupiter, while Pioneer 11 provided the first close-up view of Saturn. Both probes became the first objects to leave the Solar System. The Voyager program launched in 1977, conducting flybys of Jupiter, Saturn, Uranus, and Neptune on a trajectory to leave the Solar System. The Galileo spacecraft, deployed from the Space Shuttle flight STS-34, was the first spacecraft to orbit Jupiter, discovering evidence of a subsurface ocean on Europa and observing that the moon may hold ice or liquid water. A joint NASA-European Space Agency-Italian Space Agency mission, Cassini–Huygens, was sent to Saturn's moon Titan, which, along with Mars and Europa, is one of the only celestial bodies in the Solar System suspected of being capable of harboring life. Cassini discovered three new moons of Saturn, and the Huygens probe entered Titan's atmosphere. The mission discovered evidence of liquid hydrocarbon lakes on Titan and subsurface water oceans on the moon Enceladus, which could harbor life. Launched in 2006, the New Horizons mission was the first spacecraft to visit Pluto and the Kuiper belt. Beyond interplanetary probes, NASA has launched many space telescopes. Launched in the 1960s, the Orbiting Astronomical Observatory satellites were NASA's first orbital telescopes, providing ultraviolet, gamma-ray, x-ray, and infrared observations. NASA launched the Orbiting Geophysical Observatory satellites in the 1960s and 1970s to look down at Earth and observe its interactions with the Sun. The Uhuru satellite was the first dedicated x-ray telescope, mapping 85% of the sky and discovering a large number of black holes. Launched in the 1990s and early 2000s, the Great Observatories are among NASA's most powerful telescopes. The Hubble Space Telescope was launched in 1990 on STS-31 from the Discovery and could view galaxies 15 billion light-years away. A major defect in the telescope's mirror could have crippled the program, had NASA not used computer enhancement to compensate for the imperfection and launched five Space Shuttle servicing flights to replace the damaged components. The Compton Gamma Ray Observatory was launched from the Atlantis on STS-37 in 1991, discovering a possible source of antimatter at the center of the Milky Way and observing that the majority of gamma-ray bursts occur outside of the Milky Way galaxy. The Chandra X-ray Observatory was launched from the Columbia on STS-93 in 1999, observing black holes, quasars, supernovae, and dark matter. It provided critical observations of the Sagittarius A* black hole at the center of the Milky Way galaxy and the separation of dark and regular matter during galactic collisions. Finally, the Spitzer Space Telescope is an infrared telescope launched in 2003 on a Delta II rocket. It is in an Earth-trailing orbit around the Sun and discovered the existence of brown dwarf stars. Other telescopes, such as the Cosmic Background Explorer and the Wilkinson Microwave Anisotropy Probe, provided evidence to support the Big Bang. The James Webb Space Telescope, named after the NASA administrator who led the Apollo program, is an infrared observatory launched in 2021. 
The James Webb Space Telescope is a direct successor to the Hubble Space Telescope, intended to observe the formation of the first galaxies. Other space telescopes include the Kepler space telescope, launched in 2009 to identify planets orbiting extrasolar stars that may be Earth-like and possibly harbor life. The first exoplanet that the Kepler space telescope confirmed within the habitable zone of its star was Kepler-22b. NASA also launched a number of different satellites to study Earth, such as the Television Infrared Observation Satellite (TIROS) in 1960, which was the first weather satellite. NASA and the United States Weather Bureau cooperated on later TIROS satellites and the second-generation Nimbus program of weather satellites. It also worked with the Environmental Science Services Administration on a series of weather satellites, and the agency launched its experimental Applications Technology Satellites into geosynchronous orbit. NASA's first dedicated Earth observation satellite, Landsat, was launched in 1972. This led to NASA and the National Oceanic and Atmospheric Administration jointly developing the Geostationary Operational Environmental Satellites and discovering ozone depletion. NASA had been pursuing spaceplane development since the 1960s, blending the administration's dual aeronautics and space missions. NASA viewed a spaceplane as part of a larger program, providing routine and economical logistical support to a space station in Earth orbit that would be used as a hub for lunar and Mars missions. A reusable launch vehicle would then end the need for expensive, expendable boosters like the Saturn V. In 1969, NASA designated the Johnson Space Center as the lead center for the design, development, and manufacturing of the Space Shuttle orbiter, while the Marshall Space Flight Center would lead the development of the launch system. NASA's series of lifting body aircraft, culminating in the joint NASA-US Air Force Martin Marietta X-24, directly informed the development of the Space Shuttle and future hypersonic flight aircraft. Official development of the Space Shuttle began in 1972, with Rockwell International contracted to design the orbiter and engines, Martin Marietta for the external fuel tank, and Morton Thiokol for the solid rocket boosters. NASA acquired six orbiters: the Enterprise, Columbia, Challenger, Discovery, Atlantis, and Endeavour. The Space Shuttle program also allowed NASA to make major changes to its Astronaut Corps. While almost all previous astronauts were Air Force or Naval test pilots, the Space Shuttle allowed NASA to begin recruiting more non-military scientific and technical experts. A prime example is Sally Ride, who became the first American woman to fly in space on STS-7. This new astronaut selection process also allowed NASA to accept exchange astronauts from US allies and partners for the first time. The first Space Shuttle flight occurred in 1981, when the Columbia launched on the STS-1 mission, designed to serve as a flight test for the new spaceplane. NASA intended for the Space Shuttle to replace expendable launch systems like the Air Force's Atlas, Delta, and Titan and the European Space Agency's Ariane. The Space Shuttle's Spacelab payload, developed by the European Space Agency, increased the scientific capabilities of shuttle missions beyond anything NASA had previously been able to accomplish.
NASA launched its first commercial satellites on the STS-5 mission, and in 1984 the STS-41-C mission conducted the world's first on-orbit satellite servicing mission when the Challenger captured and repaired the malfunctioning Solar Maximum Mission satellite. It also had the capability to return malfunctioning satellites to Earth, as it did with the Palapa B2 and Westar 6 satellites. Once returned to Earth, the satellites were repaired and relaunched. Despite ushering in a new era of spaceflight, in which NASA was contracting launch services to commercial companies, the Space Shuttle was criticized for not being as reusable and cost-effective as advertised. In 1986, the Challenger disaster on the STS-51-L mission resulted in the loss of the spacecraft and all seven astronauts during launch, grounding the entire Space Shuttle fleet for nearly three years and forcing the 44 commercial companies that had contracted with NASA to deploy their satellites to return to expendable launch vehicles. When the Space Shuttle returned to flight with the STS-26 mission, it had undergone significant modifications to improve its reliability and safety. Following the collapse of the Soviet Union, the Russian Federation and United States initiated the Shuttle-Mir program. The first Russian cosmonaut flew on the STS-60 mission in 1994, and the Discovery rendezvoused with, but did not dock with, the Russian space station Mir on the STS-63 mission. This was followed by Atlantis' STS-71 mission, which accomplished the Space Shuttle's originally intended role of docking with a space station and transferring supplies and personnel. The Shuttle-Mir program continued until 1998, when a series of orbital accidents on the space station spelled an end to the program. In 2003, a second orbiter was lost when the Columbia broke apart upon reentry during the STS-107 mission, resulting in the loss of the spacecraft and all seven astronauts. This accident marked the beginning of the end of the Space Shuttle program, with President George W. Bush directing that the Space Shuttle be retired upon the completion of the International Space Station. The Space Shuttle returned to flight in 2005 and flew a final servicing mission to the Hubble Space Telescope in 2009, before being retired following the STS-135 resupply mission to the International Space Station in 2011. NASA never gave up on the idea of a space station after Skylab's reentry in 1979. The agency began lobbying politicians to support building a larger space station as soon as the Space Shuttle began flying, selling it as an orbital laboratory, repair station, and a jumping-off point for lunar and Mars missions. NASA found a strong advocate in President Ronald Reagan, who declared in a 1984 speech: "America has always been greatest when we dared to be great. We can reach for greatness again. We can follow our dreams to distant stars, living and working in space for peaceful, economic, and scientific gain. Tonight I am directing NASA to develop a permanently manned space station and to do it within a decade." In 1985, NASA proposed the Space Station Freedom, which both the agency and President Reagan intended to be an international program. While this would add legitimacy to the program, there were concerns within NASA that the international component would dilute its authority over the project, as the agency had never been willing to work with domestic or international partners as true equals.
There was also concern about sharing sensitive space technologies with the Europeans, which had the potential to dilute America's technical lead. Ultimately, an international agreement to develop the Space Station Freedom program was signed with thirteen countries in 1985, including the European Space Agency member states, Canada, and Japan. Despite its status as the first international space program, the Space Station Freedom was controversial, with much of the debate centering on cost. Several redesigns to reduce cost were conducted in the early 1990s, stripping away many of its functions. Despite calls for Congress to terminate the program, it continued, in large part because by 1992 it had created 75,000 jobs across 39 states. By 1993, President Bill Clinton had attempted to significantly reduce NASA's budget and directed that station costs be significantly reduced, that aerospace industry jobs not be lost, and that the Russians be included. In 1993, the Clinton Administration announced that the Space Station Freedom would become the International Space Station in an agreement with the Russian Federation. This allowed the Russians to sustain their space program through an infusion of American money and to maintain their status as one of the two premier space programs. While the United States built and launched the majority of the International Space Station, Russia, Canada, Japan, and the European Space Agency all contributed components. Despite NASA's insistence that costs would be kept to a budget of $17.4 billion, they kept rising, and NASA had to transfer funds from other programs to keep the International Space Station solvent. Ultimately, the total cost of the station was $150 billion, with the United States paying for two-thirds. Following the Space Shuttle Columbia disaster in 2003, NASA was forced to rely on Russian Soyuz launches for its astronauts, and the planned 2011 retirement of the Space Shuttle accelerated the station's completion. In the 1980s, right after the first flight of the Space Shuttle, NASA started a joint program with the Department of Defense to develop the Rockwell X-30 National Aerospace Plane. NASA realized that the Space Shuttle, while a massive technological accomplishment, would not be able to live up to all its promises. Designed to be a single-stage-to-orbit spaceplane, the X-30 had both civil and military applications. With the end of the Cold War, the X-30 was canceled in 1992 before reaching flight status. Following the Space Shuttle Columbia disaster in 2003, President Bush started the Constellation program to replace the Space Shuttle and expand space exploration beyond low Earth orbit. Constellation was intended to use a significant amount of former Space Shuttle equipment and return astronauts to the Moon. This program was canceled by the Obama Administration. Former astronauts Neil Armstrong, Gene Cernan, and Jim Lovell sent a letter to President Barack Obama warning him that if the United States did not develop a new human spaceflight capability, it risked becoming a second- or third-rate space power. As early as the Reagan Administration, there had been calls for NASA to expand private sector involvement in space exploration rather than do it all in-house. In the 1990s, NASA and Lockheed Martin entered into an agreement to develop the Lockheed Martin X-33, a demonstrator for the VentureStar spaceplane, which was intended to replace the Space Shuttle. Due to technical challenges, the program was cancelled in 2001.
Despite this, it was the first time a commercial space company had directly invested a significant amount of its own resources in spacecraft development. The advent of space tourism also forced NASA to challenge its assumption that only governments would send people into space. The first space tourist was Dennis Tito, an American investment manager and former aerospace engineer who contracted with the Russians to fly to the International Space Station, despite NASA's opposition to the idea. Advocates of this new commercial approach for NASA included former astronaut Buzz Aldrin, who remarked that it would return NASA to its roots as a research and development agency, with commercial entities actually operating the space systems. Having corporations take over orbital operations would also allow NASA to focus its efforts on deep space exploration, returning humans to the Moon, and going to Mars. Embracing this approach, NASA began by contracting commercial cargo delivery to the International Space Station and later established the Commercial Crew Program, which flew its first operational contracted mission, SpaceX Crew-1, in 2020. This marked the first time since the retirement of the Space Shuttle that NASA was able to launch its own astronauts on an American spacecraft from the United States, ending a decade of reliance on the Russians. In 2019, NASA announced the Artemis program, intending to return to the Moon and establish a permanent human presence. This was paired with the Artemis Accords, agreed with partner nations to establish rules of behavior and norms of space commercialization on the Moon. In 2023, NASA established the Moon to Mars Program office. The office is designed to oversee the various projects, mission architectures, and associated timelines relevant to lunar and Mars exploration and science. Active programs The International Space Station (ISS) combines NASA's Space Station Freedom project with the Russian Mir-2 station, the European Columbus station, and the Japanese Kibō laboratory module. NASA originally planned in the 1980s to develop Freedom alone, but US budget constraints led to the merger of these projects into a single multi-national program in 1993, managed by NASA, the Russian Federal Space Agency (RKA), the Japan Aerospace Exploration Agency (JAXA), the European Space Agency (ESA), and the Canadian Space Agency (CSA). The station consists of pressurized modules, external trusses, solar arrays, and other components, which were manufactured in various factories around the world and launched by Russian Proton and Soyuz rockets and the American Space Shuttle. On-orbit assembly began in 1998; the US Orbital Segment was completed in 2009 and the Russian Orbital Segment in 2010. The ownership and use of the space station is established in intergovernmental treaties and agreements, which divide the station into two areas and allow Russia to retain full ownership of the Russian Orbital Segment (with the exception of Zarya), with the US Orbital Segment allocated between the other international partners. Long-duration missions to the ISS are referred to as ISS Expeditions. Expedition crew members typically spend approximately six months on the ISS. The initial expedition crew size was three, temporarily decreased to two following the Columbia disaster. Since May 2009, the expedition crew size has been six crew members.
As of 2024, though the Commercial Crew Program's capsules can carry a crew of up to seven, expeditions using them typically consist of a crew of four. The ISS has been continuously occupied for the past 25 years and 111 days, having exceeded the previous record held by Mir, and has been visited by astronauts and cosmonauts from 15 different nations. The station can be seen from the Earth with the naked eye and, as of 2026, is the largest artificial satellite in Earth orbit, with a mass and volume greater than that of any previous space station. The Russian Soyuz and American Dragon and Starliner spacecraft are used to send astronauts to and from the ISS. Several uncrewed cargo spacecraft provide service to the ISS; they are the Russian Progress spacecraft, which has done so since 2000, the European Automated Transfer Vehicle (ATV) since 2008, the Japanese H-II Transfer Vehicle (HTV) since 2009, the (uncrewed) Dragon since 2012, and the American Cygnus spacecraft since 2013. The Space Shuttle, before its retirement, was also used for cargo transfer and would often switch out expedition crew members, although it did not have the capability to remain docked for the duration of their stay. Between the retirement of the Shuttle in 2011 and the commencement of crewed Dragon flights in 2020, American astronauts exclusively used the Soyuz for crew transport to and from the ISS. The highest number of people occupying the ISS has been thirteen; this occurred three times, during the late Shuttle ISS assembly missions. The ISS program is expected to continue until 2030, after which the space station will be retired and destroyed in a controlled de-orbit. Commercial Resupply Services (CRS) are a contract solution to deliver cargo and supplies to the International Space Station on a commercial basis by private companies. NASA signed its first CRS contracts in 2008, awarding $1.6 billion to SpaceX for twelve cargo Dragon flights and $1.9 billion to Orbital Sciences for eight Cygnus flights, covering deliveries until 2016. Both companies evolved or created their launch vehicles to launch these spacecraft (SpaceX with the Falcon 9 and Orbital with the Antares). SpaceX flew its first operational resupply mission (SpaceX CRS-1) in 2012. Orbital Sciences followed in 2014 (Cygnus CRS Orb-1). In 2015, NASA extended CRS-1 to twenty flights for SpaceX and twelve flights for Orbital ATK. A second phase of contracts (known as CRS-2) was solicited in 2014; contracts were awarded in January 2016 to Orbital ATK Cygnus, Sierra Nevada Corporation Dream Chaser, and SpaceX Dragon 2, for cargo transport flights beginning in 2019 and expected to last through 2024. In March 2022, NASA awarded an additional six CRS-2 missions each to both SpaceX and Northrop Grumman (formerly Orbital). Northrop Grumman successfully delivered Cygnus NG-17 to the ISS in February 2022. In July 2022, SpaceX launched its 25th CRS flight (SpaceX CRS-25) and successfully delivered its cargo to the ISS. The Dream Chaser spacecraft is currently scheduled for its Demo-1 launch in the first half of 2024. The Commercial Crew Program (CCP) provides commercially operated crew transportation service to and from the International Space Station (ISS) under contract to NASA, conducting crew rotations between the expeditions of the International Space Station program. American space manufacturer SpaceX began providing service in 2020, using the Crew Dragon spacecraft, while Boeing's Starliner spacecraft flew a crewed test flight in 2024.
Boeing was on contract for six operational missions, but after its first crewed flight nearly ended in disaster and left its two astronauts aboard the ISS for months longer than planned, NASA froze its contract with Boeing. In total, NASA has contracted for six operational missions from Boeing and fourteen from SpaceX, ensuring sufficient support for the ISS through 2030. The spacecraft are owned and operated by the vendor, and crew transportation is provided to NASA as a commercial service. Each mission sends up to four astronauts to the ISS, with an option for a fifth passenger available. Operational flights occur approximately once every six months, for missions that last approximately six months. A spacecraft remains docked to the ISS during its mission, and missions usually overlap by at least a few days. Between the retirement of the Space Shuttle in 2011 and the first operational CCP mission in 2020, NASA relied on the Soyuz program to transport its astronauts to the ISS. A Crew Dragon spacecraft is launched to space atop a Falcon 9 Block 5 launch vehicle, and the capsule returns to Earth via splashdown in the ocean near Florida. Boeing Starliner operational flights will now commence with Boeing Starliner-1, which will launch atop an Atlas V N22 launch vehicle. Instead of a splashdown, Starliner capsules return on land with airbags at one of four designated sites in the western United States. Since 2017, NASA's crewed spaceflight program has been the Artemis program, which involves the help of US commercial spaceflight companies and international partners such as ESA, JAXA, and CSA. The goal of this program is to land "the first woman and the next man" on the lunar south pole region by 2025. Artemis would be the first step towards the long-term goal of establishing a sustainable presence on the Moon, laying the foundation for companies to build a lunar economy, and eventually sending humans to Mars. The Orion Crew Exploration Vehicle was held over from the canceled Constellation program for Artemis. Artemis I was the uncrewed initial launch of the Space Launch System (SLS), which also sent an Orion spacecraft into a distant retrograde orbit around the Moon. The first tentative step in returning to crewed lunar missions will be Artemis II, which is to include the Orion crew module, propelled by the SLS, and is expected to launch no later than April 2026. This 10-day mission is planned to briefly place a crew of four on a lunar flyby trajectory. Artemis III aims to conduct the first crewed lunar landing since Apollo 17, and is scheduled for no earlier than mid-2027. In support of the Artemis missions, NASA has been funding private companies to land robotic probes on the lunar surface in a program known as Commercial Lunar Payload Services. As of March 2022, NASA has awarded contracts for robotic lunar probes to companies such as Intuitive Machines, Firefly Space Systems, and Astrobotic. On April 16, 2021, NASA announced it had selected the SpaceX Lunar Starship as its Human Landing System. The agency's Space Launch System rocket will launch four astronauts aboard the Orion spacecraft for their multi-day journey to lunar orbit, where they will transfer to SpaceX's Starship for the final leg of their journey to the surface of the Moon. In November 2021, it was announced that the goal of landing astronauts on the Moon by 2024 had slipped to no earlier than 2025 due to numerous factors.
Artemis I launched on November 16, 2022, and returned to Earth safely on December 11, 2022. As of April 2025, NASA plans to launch Artemis II in April 2026 and Artemis III in 2027. Additional Artemis missions, Artemis IV, Artemis V, and Artemis VI, are planned to launch between 2028 and 2031. NASA's next major space initiative is the construction of the Lunar Gateway, a small space station in lunar orbit. This space station will be designed primarily for non-continuous human habitation. The construction of the Gateway is expected to begin in 2027 with the launch of the first two modules: the Power and Propulsion Element (PPE) and the Habitation and Logistics Outpost (HALO). Operations on the Gateway will begin with the Artemis IV mission, which plans to deliver a crew of four to the Gateway in 2028. In 2017, NASA was directed by the congressional NASA Transition Authorization Act of 2017 to get humans to Mars orbit (or to the Martian surface) by the 2030s. The Commercial Low Earth Orbit Destinations program is an initiative by NASA to support work on commercial space stations that the agency hopes to have in place by the end of the current decade to replace the International Space Station. The three selected companies are Blue Origin (with partners) with its Orbital Reef station concept, Nanoracks (with partners) with its Starlab Space Station concept, and Northrop Grumman with a station concept based on the HALO module for the Gateway station. NASA has conducted many uncrewed and robotic spaceflight programs throughout its history. More than 1,000 uncrewed missions have been designed to explore the Earth and the Solar System. NASA executes a mission development framework to plan, select, develop, and operate robotic missions. This framework defines cost, schedule, and technical risk parameters to enable competitive selection among mission candidates that have been developed by principal investigators and their teams from across NASA, the broader US Government research and development stakeholders, and industry. The mission development construct is defined by four umbrella programs. The Explorer program traces its origin to the earliest days of the US space program. In its current form, the program consists of three classes of missions – Small Explorers (SMEX), Medium Explorers (MIDEX), and University-Class Explorers (UNEX). The NASA Explorer program office provides frequent flight opportunities for moderate-cost, innovative missions in the heliophysics and astrophysics science areas. Small Explorer missions are required to limit cost to NASA to below $150M (2022 dollars). Medium-class Explorer missions have typically involved NASA cost caps of $350M. The Explorer program office is based at NASA Goddard Space Flight Center. The NASA Discovery program develops and delivers robotic spacecraft solutions in the planetary science domain. Discovery enables scientists and engineers to assemble a team to deliver a solution against a defined set of objectives and competitively bid that solution against other candidate programs. Cost caps vary, but recent mission selection processes were accomplished using a $500M cost cap for NASA. The Planetary Mission Program Office is based at the NASA Marshall Space Flight Center and manages both the Discovery and New Frontiers missions. The office is part of the Science Mission Directorate.
NASA Administrator Bill Nelson announced on June 2, 2021, that the DAVINCI+ and VERITAS missions were selected to launch to Venus in the late 2020s, having beaten out competing proposals for missions to Jupiter's volcanic moon Io and Neptune's large moon Triton that had also been selected as Discovery program finalists in early 2020. Each mission has an estimated cost of $500 million, with launches expected between 2028 and 2030. Launch contracts will be awarded later in each mission's development. The New Frontiers program focuses on specific Solar System exploration goals identified as top priorities by the planetary science community. Primary objectives include Solar System exploration employing medium-class spacecraft missions to conduct high-science-return investigations. New Frontiers builds on the development approach employed by the Discovery program but provides for higher cost caps and longer schedule durations than are available with Discovery. Cost caps vary by opportunity; recent missions have been awarded based on a defined cap of $1 billion. The higher cost cap and projected longer mission durations result in a lower frequency of new opportunities for the program – typically one every several years. OSIRIS-REx and New Horizons are examples of New Frontiers missions. NASA has determined that the next opportunity to propose for the fifth round of New Frontiers missions will occur no later than the fall of 2024. Large strategic missions (formerly called Flagship missions) are strategic missions that are typically developed and managed by large teams that may span several NASA centers. The individual missions become the program, as opposed to being part of a larger effort (see Discovery, New Frontiers, etc.). The James Webb Space Telescope is a strategic mission that was developed over a period of more than 20 years. Strategic missions are developed on an ad-hoc basis as program objectives and priorities are established. Missions like Voyager, had they been developed today, would have been strategic missions. Three of the Great Observatories were strategic missions (the Chandra X-ray Observatory, the Compton Gamma Ray Observatory, and the Hubble Space Telescope). Europa Clipper is the next large strategic mission in development by NASA. NASA continues to play a material role in exploration of the Solar System, as it has for decades. Ongoing missions have current science objectives with respect to more than five extraterrestrial bodies within the Solar System – the Moon (Lunar Reconnaissance Orbiter), Mars (Perseverance rover), Jupiter (Juno), asteroid Bennu (OSIRIS-REx), and Kuiper Belt objects (New Horizons). The Juno extended mission will make multiple flybys of the Jovian moon Io in 2023 and 2024, after flybys of Ganymede in 2021 and Europa in 2022. Voyager 1 and Voyager 2 continue to provide science data back to Earth while continuing on their outward journeys into interstellar space. On November 26, 2011, NASA's Mars Science Laboratory mission was successfully launched for Mars. The Curiosity rover successfully landed on Mars on August 6, 2012, and subsequently began its search for evidence of past or present life on Mars.
In September 2014, NASA's MAVEN spacecraft, which is part of the Mars Scout Program, successfully entered Mars orbit and, as of October 2022, continues its study of the atmosphere of Mars. NASA's ongoing Mars investigations include in-depth surveys of Mars by the Perseverance rover. NASA's Europa Clipper, launched in October 2024, will study the Galilean moon Europa through a series of flybys while in orbit around Jupiter. Dragonfly will send a mobile robotic rotorcraft to Saturn's biggest moon, Titan. As of May 2021, Dragonfly is scheduled for launch in June 2027. The NASA Science Mission Directorate Astrophysics division manages the agency's astrophysics science portfolio. NASA has invested significant resources in the development, delivery, and operations of various forms of space telescopes. These telescopes have provided the means to study the cosmos over a large range of the electromagnetic spectrum. The Great Observatories, launched in the 1990s and early 2000s, have provided a wealth of observations for study by physicists across the planet. The first of them, the Hubble Space Telescope, was delivered to orbit in 1990 and continues to function, in part due to prior servicing missions performed by the Space Shuttle. The other remaining active Great Observatory is the Chandra X-ray Observatory (CXO), launched on STS-93 in July 1999 and now in a 64-hour elliptical orbit studying X-ray sources that are not readily viewable from terrestrial observatories. The Imaging X-ray Polarimetry Explorer (IXPE) is a space observatory designed to improve the understanding of X-ray production in objects such as neutron stars and pulsar wind nebulae, as well as stellar and supermassive black holes. IXPE launched in December 2021 and is an international collaboration between NASA and the Italian Space Agency (ASI). It is part of the NASA Small Explorers program (SMEX), which designs low-cost spacecraft to study heliophysics and astrophysics. The Neil Gehrels Swift Observatory was launched in November 2004 and is a gamma-ray burst observatory that also monitors the afterglow in X-ray and UV/visible light at the location of a burst. The mission was developed in a joint partnership between Goddard Space Flight Center (GSFC) and an international consortium from the United States, United Kingdom, and Italy. Pennsylvania State University operates the mission as part of NASA's Medium Explorer program (MIDEX). The Fermi Gamma-ray Space Telescope (FGST) is another gamma-ray focused space observatory that was launched to low Earth orbit in June 2008 and is being used to perform gamma-ray astronomy observations. In addition to NASA, the mission involves the United States Department of Energy and government agencies in France, Germany, Italy, Japan, and Sweden. The James Webb Space Telescope (JWST), launched in December 2021 on an Ariane 5 rocket, operates in a halo orbit circling the Sun-Earth L2 point. JWST's high sensitivity in the infrared spectrum and its imaging resolution will allow it to view more distant, faint, or older objects than its predecessors, including Hubble. NASA Earth Science is a large umbrella program comprising a range of terrestrial and space-based collection systems intended to better understand the Earth system and its response to natural and human-caused changes. Numerous systems have been developed and fielded over several decades to provide improved prediction for weather, climate, and other changes in the natural environment.
Several of the currently operating spacecraft programs include Aqua, Aura, Orbiting Carbon Observatory 2 (OCO-2), Gravity Recovery and Climate Experiment Follow-on (GRACE FO), and Ice, Cloud, and land Elevation Satellite 2 (ICESat-2). In addition to systems already in orbit, NASA is designing a new set of Earth Observing Systems to study, assess, and generate responses for climate change, natural hazards, forest fires, and real-time agricultural processes. The GOES-T satellite (designated GOES-18 after launch) joined the fleet of US geostationary weather monitoring satellites in March 2022. NASA also maintains the Earth Science Data Systems (ESDS) program to oversee the life cycle of NASA's Earth science data – from acquisition through processing and distribution. The primary goal of ESDS is to maximize the scientific return from NASA's missions and experiments for research and applied scientists, decision makers, and society at large. The Earth Science program is managed by the Earth Science Division of the NASA Science Mission Directorate. NASA invests in various ground and space-based infrastructures to support its science and exploration mandate. The agency maintains access to suborbital and orbital space launch capabilities and sustains ground station solutions to support its evolving fleet of spacecraft and remote systems. The NASA Deep Space Network (DSN) serves as the primary ground station solution for NASA's interplanetary spacecraft and select Earth-orbiting missions. The system employs ground station complexes near Barstow, California, in Spain near Madrid, and in Australia near Canberra. Placing these ground station complexes approximately 120 degrees apart around the planet allows communication with spacecraft throughout the Solar System even as the Earth rotates about its axis each day. The system is controlled from a 24x7 operations center at JPL in Pasadena, California, which manages recurring communications links with up to 40 spacecraft, and is managed by the Jet Propulsion Laboratory. The Near Space Network (NSN) provides telemetry, commanding, ground-based tracking, data, and communications services to a wide range of customers with satellites in low Earth orbit (LEO), geosynchronous orbit (GEO), highly elliptical orbits (HEO), and lunar orbits. The NSN combines ground station and antenna assets from the Near-Earth Network with the Tracking and Data Relay Satellite System (TDRS), which operates in geosynchronous orbit to provide continuous real-time coverage for launch vehicles and low-Earth-orbit NASA missions. The NSN consists of 19 ground stations worldwide operated by the US Government and by contractors including Kongsberg Satellite Services (KSAT), Swedish Space Corporation (SSC), and the South African National Space Agency (SANSA). The ground network averages between 120 and 150 spacecraft contacts a day, with TDRS engaging with systems on a near-continuous basis as needed; the system is managed and operated by the Goddard Space Flight Center. The NASA Sounding Rocket Program (NSRP) is located at the Wallops Flight Facility and provides launch capability, payload development and integration, and field operations support to execute suborbital missions. The program has been in operation since 1959 and is managed by the Goddard Space Flight Center using a combined US Government and contractor team.
The NSRP team conducts approximately 20 missions per year from both Wallops and other launch locations worldwide to allow scientists to collect data "where it occurs". The program supports the strategic vision of the Science Mission Directorate collecting important scientific data for earth science, heliophysics, and astrophysics programs. In June 2022, NASA conducted its first rocket launch from a commercial spaceport outside the US. It launched a Black Brant IX from the Arnhem Space Centre in Australia. The NASA Launch Services Program (LSP) is responsible for procurement of launch services for NASA uncrewed missions and oversight of launch integration and launch preparation activity, providing added quality and mission assurance to meet program objectives. Since 1990, NASA has purchased expendable launch vehicle launch services directly from commercial providers, whenever possible, for its scientific and applications missions. Expendable launch vehicles can accommodate all types of orbit inclinations and altitudes and are ideal vehicles for launching Earth-orbit and interplanetary missions. LSP operates from Kennedy Space Center and falls under the NASA Space Operations Mission Directorate (SOMD). The Aeronautics Research Mission Directorate (ARMD) is one of five mission directorates within NASA, the other four being the Exploration Systems Development Mission Directorate, the Space Operations Mission Directorate, the Science Mission Directorate, and the Space Technology Mission Directorate. The ARMD is responsible for NASA's aeronautical research, which benefits the commercial, military, and general aviation sectors. ARMD performs its aeronautics research at four NASA facilities: Ames Research Center and Armstrong Flight Research Center in California, Glenn Research Center in Ohio, and Langley Research Center in Virginia. The NASA X-57 Maxwell is an experimental aircraft being developed by NASA to demonstrate the technologies required to deliver a highly efficient all-electric aircraft. The primary goal of the program is to develop and deliver all-electric technology solutions that can also achieve airworthiness certification with regulators. The program involves development of the system in several phases, or modifications, to incrementally grow the capability and operability of the system. The initial configuration of the aircraft has now completed ground testing as it approaches its first flights. In mid-2022, the X-57 was scheduled to fly before the end of the year. The development team includes staff from the NASA Armstrong, Glenn, and Langley centers along with a number of industry partners from the United States and Italy. NASA is collaborating with the Federal Aviation Administration and industry stakeholders to modernize the United States National Airspace System (NAS). Efforts began in 2007 with a goal to deliver major modernization components by 2025. The modernization effort intends to increase the safety, efficiency, capacity, access, flexibility, predictability, and resilience of the NAS while reducing the environmental impact of aviation. The Aviation Systems Division of NASA Ames operates the joint NASA/FAA North Texas Research Station. The station supports all phases of NextGen research, from concept development to prototype system field evaluation. This facility has already transitioned advanced NextGen concepts and technologies to use through technology transfers to the FAA. 
NASA contributions also include development of advanced automation concepts and tools that provide air traffic controllers, pilots, and other airspace users with more accurate real-time information about the nation's traffic flow, weather, and routing. Ames' advanced airspace modeling and simulation tools have been used extensively to model the flow of air traffic across the US and to evaluate new concepts in airspace design, traffic flow management, and optimization. NASA has made use of technologies such as the multi-mission radioisotope thermoelectric generator (MMRTG), which is a type of radioisotope thermoelectric generator used to power spacecraft. Shortages of the required plutonium-238 have curtailed deep space missions since the turn of the millennium. An example of a spacecraft that was not developed because of a shortage of this material was New Horizons 2. In July 2021, NASA announced contract awards for development of nuclear thermal propulsion reactors. Three contractors will develop individual designs over 12 months for later evaluation by NASA and the US Department of Energy. NASA's space nuclear technologies portfolio is led and funded by its Space Technology Mission Directorate. In January 2023, NASA announced a partnership with the Defense Advanced Research Projects Agency (DARPA) on the Demonstration Rocket for Agile Cislunar Operations (DRACO) program to demonstrate a nuclear thermal rocket (NTR) engine in space, an enabling capability for NASA missions to Mars. In July 2023, NASA and DARPA jointly announced the award of $499 million to Lockheed Martin to design and build an experimental NTR rocket to be launched in 2027. In July 2025, Acting NASA Administrator Sean Duffy issued a directive to fast-track plans for placing a nuclear reactor on the Moon to support the agency's Artemis program and maintain U.S. leadership in space exploration. The directive, prompted by concerns that China and Russia may deploy a joint lunar reactor by the mid-2030s, emphasizes the need for a 100-kilowatt system to power long-term lunar missions. Duffy warned that if another nation establishes a reactor first, it could create "keep-out zones" limiting U.S. access. The Socioeconomic Data and Applications Center (SEDAC), founded in 1994, "focuses on archiving and distributing data related to human interactions in the environment. SEDAC synthesizes Earth science and socioeconomic data and information". It is based in Palisades, NY, with partner the Center for Integrated Earth System Information at Columbia University, and has extensive geospatial data holdings. Free Space Optics. NASA contracted a third party to study the feasibility of using Free Space Optics (FSO) to communicate with optical (laser) stations on the ground (OGS), called laser-com RF networks, for satellite communications. Water Extraction from Lunar Soil. On July 29, 2020, NASA asked American universities to propose new technologies for extracting water from the lunar soil and developing power systems. These technologies would help the space agency conduct sustainable exploration of the Moon. In 2024, NASA was tasked by the US Government to create a time standard for the Moon. The standard is to be called Coordinated Lunar Time and is expected to be finalized in 2026. NASA's Human Research Program (HRP) is designed to study the effects of space on human health and also to provide countermeasures and technologies for human space exploration. The medical effects of space exploration are reasonably limited in low Earth orbit or in travel to the Moon.
Travel to Mars is significantly longer and deeper into space, and significant medical issues can result. These include bone density loss, radiation exposure, vision changes, circadian rhythm disturbances, heart remodeling, and immune alterations. In order to study and diagnose these ill effects, HRP has been tasked with identifying or developing small portable instrumentation with low mass, volume, and power to monitor the health of astronauts. To achieve this aim, on May 13, 2022, NASA and SpaceX Crew-4 astronauts successfully tested the rHEALTH ONE universal biomedical analyzer for its ability to identify and analyze biomarkers, cells, microorganisms, and proteins in a spaceflight environment. NASA established the Planetary Defense Coordination Office (PDCO) in 2016 to catalog and track potentially hazardous near-Earth objects (NEOs), such as asteroids and comets, and to develop potential responses and defenses against these threats. The PDCO is chartered to provide timely and accurate information to the government and the public on close approaches by potentially hazardous objects (PHOs) and any potential for impact. The office functions within the Science Mission Directorate Planetary Science Division. The PDCO augmented prior cooperative actions between the United States, the European Union, and other nations, which had been scanning the sky for NEOs since 1998 in an effort called Spaceguard. Since the 1990s, NASA has run many NEO detection programs from Earth-based observatories, greatly increasing the number of objects that have been detected. Many asteroids are very dark, and those near the Sun are much harder to detect from Earth-based telescopes, which observe at night and thus face away from the Sun. NEOs inside Earth's orbit also show only a partly lit face to Earth, rather than the potentially "full Moon" appearance of objects that are beyond the Earth and fully lit by the Sun. In 1998, the United States Congress gave NASA a mandate to detect 90% of near-Earth asteroids over 1 km (0.62 mi) in diameter (those that threaten global devastation) by 2008. This initial mandate was met by 2011. In 2005, the original USA Spaceguard mandate was extended by the George E. Brown, Jr. Near-Earth Object Survey Act, which calls for NASA to detect 90% of NEOs with diameters of 140 m (460 ft) or greater by 2020 (compare to the 20-meter Chelyabinsk meteor that hit Russia in 2013). As of January 2020, it is estimated that less than half of these have been found, but objects of this size hit the Earth only about once in 2,000 years. In January 2020, NASA officials estimated it would take 30 years to find all objects meeting the 140 m (460 ft) size criterion, more than twice the timeframe that was built into the 2005 mandate. In June 2021, NASA authorized the development of the NEO Surveyor spacecraft to reduce the projected time to achieve the mandate to 10 years. NASA has incorporated planetary defense objectives into several ongoing missions. In 1999, NASA visited 433 Eros with the NEAR Shoemaker spacecraft, which entered orbit around the asteroid in 2000, closely imaging it with various instruments at that time. NEAR Shoemaker became the first spacecraft to successfully orbit and land on an asteroid, improving our understanding of these bodies and demonstrating our capacity to study them in greater detail.
OSIRIS-REx used its suite of instruments to transmit radio tracking signals and capture optical images of Bennu during its study of the asteroid, which will help NASA scientists determine its precise position in the Solar System and its exact orbital path. As Bennu has the potential for recurring approaches to the Earth-Moon system in the next 100–200 years, the precision gained from OSIRIS-REx will enable scientists to better predict the future gravitational interactions between Bennu and our planet and resultant changes in Bennu's onward flight path. The WISE/NEOWISE mission was launched by NASA JPL in 2009 as an infrared-wavelength astronomical space telescope. In 2013, NASA repurposed it as the NEOWISE mission to find potentially hazardous near-Earth asteroids and comets; its mission has been extended into 2023. NASA and the Johns Hopkins Applied Physics Laboratory (JHAPL) jointly developed the first purpose-built planetary defense satellite, the Double Asteroid Redirection Test (DART), to test possible planetary defense concepts. DART was launched in November 2021 by a SpaceX Falcon 9 from California on a trajectory designed to impact the Dimorphos asteroid. Scientists were seeking to determine whether an impact could alter the subsequent path of the asteroid, a concept that could be applied to future planetary defense. On September 26, 2022, DART hit its target. In the weeks following impact, NASA declared DART a success, confirming it had shortened Dimorphos' orbital period around Didymos by about 32 minutes, surpassing the pre-defined success threshold of 73 seconds. NEO Surveyor, formerly called the Near-Earth Object Camera (NEOCam) mission, is a space-based infrared telescope under development to survey the Solar System for potentially hazardous asteroids. The spacecraft is scheduled to launch in 2026. In June 2022, the head of the NASA Science Mission Directorate, Thomas Zurbuchen, confirmed the start of NASA's UAP independent study team. At a speech before the National Academies of Sciences, Engineering, and Medicine, Zurbuchen said the space agency would bring a scientific perspective to efforts already underway by the Pentagon and intelligence agencies to make sense of dozens of such sightings. He said it was "high-risk, high-impact" research that the space agency should not shy away from, even if it is a controversial field of study. Collaboration In response to the Apollo 1 accident, which killed three astronauts in 1967, Congress directed NASA to form an Aerospace Safety Advisory Panel (ASAP) to advise the NASA Administrator on safety issues and hazards in NASA's air and space programs. In the aftermath of the Shuttle Columbia disaster, Congress required that the ASAP submit an annual report to the NASA Administrator and to Congress. By 1971, NASA had also established the Space Program Advisory Council and the Research and Technology Advisory Council to provide the administrator with advisory committee support. In 1977, the latter two were combined to form the NASA Advisory Council (NAC). The NASA Authorization Act of 2014 reaffirmed the importance of ASAP. NASA and NOAA have cooperated for decades on the development, delivery, and operation of polar and geosynchronous weather satellites. The relationship typically involves NASA developing the space systems, launch solutions, and ground control technology for the satellites, and NOAA operating the systems and delivering weather forecasting products to users.
Multiple generations of NOAA polar-orbiting platforms have operated to provide detailed imaging of weather from low altitude. Geostationary Operational Environmental Satellites (GOES) provide near-real-time coverage of the western hemisphere to ensure accurate and timely understanding of developing weather phenomena. The United States Space Force (USSF) is the space service branch of the United States Armed Forces, while the National Aeronautics and Space Administration (NASA) is an independent agency of the United States government responsible for civil spaceflight. NASA and the Space Force's predecessors in the Air Force have a long-standing cooperative relationship, with the Space Force supporting NASA launches out of Kennedy Space Center, Cape Canaveral Space Force Station, and Vandenberg Space Force Base, including range support and rescue operations from Task Force 45. NASA and the Space Force also partner on matters such as defending Earth from asteroids. Space Force members can be NASA astronauts, with Colonel Michael S. Hopkins, the commander of SpaceX Crew-1, commissioned into the Space Force from the International Space Station on December 18, 2020. In September 2020, the Space Force and NASA signed a memorandum of understanding formally acknowledging the joint role of both agencies. This new memorandum replaced a similar document signed in 2006 between NASA and Air Force Space Command. The Landsat program is the longest-running enterprise for acquisition of satellite imagery of Earth. It is a joint NASA/USGS program. On July 23, 1972, the Earth Resources Technology Satellite was launched. It was eventually renamed Landsat 1 in 1975. The most recent satellite in the series, Landsat 9, was launched on September 27, 2021. The instruments on the Landsat satellites have acquired millions of images. The images, archived in the United States and at Landsat receiving stations around the world, are a unique resource for global change research and applications in agriculture, cartography, geology, forestry, regional planning, surveillance, and education, and can be viewed through the US Geological Survey (USGS) "EarthExplorer" website. The collaboration between NASA and USGS involves NASA designing and delivering the space system (satellite) solution and launching the satellite into orbit, with the USGS operating the system once in orbit. As of October 2022, nine satellites have been built, with eight of them successfully reaching orbit. NASA collaborates with the European Space Agency on a wide range of scientific and exploration requirements. From participation with the Space Shuttle (the Spacelab missions) to major roles on the Artemis program (the Orion Service Module), ESA and NASA have supported the science and exploration missions of each agency. There are NASA payloads on ESA spacecraft and ESA payloads on NASA spacecraft. The agencies have developed joint missions in areas including heliophysics (e.g. Solar Orbiter) and astronomy (Hubble Space Telescope, James Webb Space Telescope). Under the Artemis Gateway partnership, ESA will contribute habitation and refueling modules, along with enhanced lunar communications, to the Gateway.
NASA and ESA continue to advance cooperation on Earth science, including climate change, with agreements to cooperate on various missions, including the Sentinel-6 series of spacecraft. In September 2014, NASA and the Indian Space Research Organisation (ISRO) signed a partnership to collaborate on and launch a joint radar mission, the NASA-ISRO Synthetic Aperture Radar (NISAR) mission. The mission was launched on July 30, 2025. NASA has provided the mission's L-band synthetic aperture radar, a high-rate communication subsystem for science data, GPS receivers, a solid-state recorder, and a payload data subsystem. ISRO has provided the spacecraft bus, the S-band radar, the launch vehicle, and associated launch services. NASA and the Japan Aerospace Exploration Agency (JAXA) cooperate on a range of space projects. JAXA is a direct participant in the Artemis program, including the Lunar Gateway effort. JAXA's planned contributions to Gateway include I-Hab's environmental control and life support system, batteries, thermal control, and imagery components, which will be integrated into the module by the European Space Agency (ESA) prior to launch. These capabilities are critical for sustained Gateway operations during crewed and uncrewed time periods. JAXA and NASA have collaborated on numerous satellite programs, especially in areas of Earth science. NASA has contributed to JAXA satellites and vice versa. Japanese instruments are flying on NASA's Terra and Aqua satellites, and NASA sensors have flown on previous Japanese Earth-observation missions. The NASA-JAXA Global Precipitation Measurement mission was launched in 2014 and includes both NASA- and JAXA-supplied sensors on a NASA satellite launched on a JAXA rocket. The mission provides frequent, accurate measurements of rainfall over the entire globe for use by scientists and weather forecasters. NASA and Roscosmos have cooperated on the development and operation of the International Space Station since September 1993. The agencies have used launch systems from both countries to deliver station elements to orbit. Astronauts and cosmonauts jointly maintain various elements of the station. Both countries provide access to the station via their own launch systems, with Russia serving as the sole provider of crew and cargo delivery between the retirement of the Space Shuttle in 2011 and the commencement of NASA COTS and commercial crew flights. In July 2022, NASA and Roscosmos signed a deal to share space station flights, enabling crew from each country to ride on the systems provided by the other. Current geopolitical conditions in late 2022 make it unlikely that cooperation will be extended to other programs such as Artemis or lunar exploration. The Artemis Accords have been established to define a framework for cooperating in the peaceful exploration and exploitation of the Moon, Mars, asteroids, and comets. The accords were drafted by NASA and the US State Department and are executed as a series of bilateral agreements between the United States and the participating countries. As of June 2023, 22 countries have signed the accords. They are Australia, Bahrain, Brazil, Canada, Colombia, France, India, Israel, Italy, Japan, the Republic of Korea, Luxembourg, Mexico, New Zealand, Poland, Romania, the Kingdom of Saudi Arabia, Singapore, Ukraine, the United Arab Emirates, the United Kingdom, and the United States.
The Wolf Amendment was passed by the US Congress into law in 2011 and prevents NASA from engaging in direct, bilateral cooperation with the Chinese government and China-affiliated organizations such as the China National Space Administration without the explicit authorization from Congress and the Federal Bureau of Investigation. The law has been renewed annually since then by inclusion in annual appropriations bills. Management The agency's administration is located at NASA Headquarters in Washington, DC, and provides overall guidance and direction. Except under exceptional circumstances, NASA civil service employees are required to be US citizens. NASA's administrator is nominated by the President of the United States subject to the approval of the US Senate, and serves at the President's pleasure as a senior space science advisor. Jared Isaacman has been the administrator of NASA since December 2025. His first nomination was withdrawn by President Donald Trump on May 31, 2025. He was renominated on November 4, 2025, and confirmed by the Senate on December 17. NASA operates with four FY2022 strategic goals. NASA budget requests are developed by NASA and approved by the administration prior to submission to the US Congress. Authorized budgets are those that have been included in enacted appropriations bills that are approved by both houses of Congress and enacted into law by the US president. NASA fiscal year budget requests and authorized budgets are listed below. NASA funding and priorities are developed through its six Mission Directorates. Center-wide activities such as the Chief Engineer and Safety and Mission Assurance organizations are aligned to the headquarters function. The MSD budget estimate includes funds for these HQ functions. The administration operates 10 major field centers, with several managing additional subordinate facilities across the country. Each center is led by a director (data below valid as of December 23, 2024). Sustainability The exhaust gases produced by rocket propulsion systems, both in Earth's atmosphere and in space, can adversely affect the Earth's environment. Some hypergolic rocket propellants, such as hydrazine, are highly toxic prior to combustion, but decompose into less toxic compounds after burning. Rockets using hydrocarbon fuels, such as kerosene, release carbon dioxide and soot in their exhaust. Carbon dioxide emissions are insignificant compared to those from other sources; on average, the United States consumed 803 million US gal (3.0 million m3) of liquid fuels per day in 2014, while a single Falcon 9 rocket first stage burns around 25,000 US gallons (95 m3) of kerosene fuel per launch. Even if a Falcon 9 were launched every single day, it would only represent 0.006% of liquid fuel consumption (and carbon dioxide emissions) for that day. Additionally, the exhaust from LOx- and LH2-fueled engines, like the SSME, is almost entirely water vapor. NASA addressed environmental concerns with its canceled Constellation program in accordance with the National Environmental Policy Act in 2011. In contrast, ion engines use harmless noble gases like xenon for propulsion and need far less propellant for a given maneuver, as illustrated in the sketch below. An example of NASA's environmental efforts is the NASA Sustainability Base. Additionally, the Exploration Sciences Building was awarded the LEED Gold rating in 2010. On May 8, 2003, the Environmental Protection Agency recognized NASA as the first federal agency to directly use landfill gas to produce energy at one of its facilities—the Goddard Space Flight Center, Greenbelt, Maryland.
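The contrast between chemical rockets and ion engines noted above ultimately comes down to exhaust velocity: the higher a thruster's specific impulse, the less propellant (and therefore exhaust) it must expel for the same change in velocity. The following is a minimal sketch of that relationship using the Tsiolkovsky rocket equation; the specific impulse values and the 5 km/s velocity change are illustrative assumptions, not figures taken from this article.

import math

# Tsiolkovsky rocket equation: delta_v = Isp * g0 * ln(m0 / mf).
# Rearranged, the propellant fraction needed for a given delta-v is
# 1 - exp(-delta_v / (Isp * g0)).
G0 = 9.80665  # standard gravity, m/s^2

def propellant_fraction(delta_v_m_s, isp_s):
    """Fraction of a vehicle's initial mass that must be propellant."""
    return 1.0 - math.exp(-delta_v_m_s / (isp_s * G0))

DELTA_V = 5000.0  # m/s; an arbitrary, illustrative mission delta-v

for label, isp in [("kerosene/LOx chemical engine (assumed Isp 310 s)", 310.0),
                   ("xenon ion engine (assumed Isp 2500 s)", 2500.0)]:
    print(f"{label}: propellant fraction ~ {propellant_fraction(DELTA_V, isp):.0%}")

With these assumed numbers, the chemical vehicle must be roughly four-fifths propellant, while the ion-propelled craft needs less than a fifth, which is why deep-space probes can carry comparatively small xenon loads.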
In 2018, NASA, along with companies including Sensor Coating Systems, Pratt & Whitney, Monitor Coating and UTRC, launched the project CAUTION (CoAtings for Ultra High Temperature detectION). This project aims to enhance the temperature range of the Thermal History Coating up to 1,500 °C (2,730 °F) and beyond. The final goal of the project is to improve the safety of jet engines, increase their efficiency, and reduce CO2 emissions. NASA also researches and publishes on climate change. Its statements concur with the global scientific consensus that the climate is warming. Bob Walker, who has advised former US President Donald Trump on space issues, has advocated that NASA should focus on space exploration and that its climate study operations should be transferred to other agencies such as NOAA. Former NASA atmospheric scientist J. Marshall Shepherd countered that Earth science study was built into NASA's mission at its creation in the 1958 National Aeronautics and Space Act. NASA won the 2020 Webby People's Voice Award for Green in the category Web. Through the Educational Launch of Nanosatellites (ELaNa) program, NASA has since 2011 worked with university teams to test emerging technologies and commercial off-the-shelf solutions by providing launch opportunities for their CubeSats on NASA-procured launches. For example, two NASA-sponsored CubeSats launched in June 2022 on a Virgin Orbit LauncherOne vehicle as the ELaNa 39 mission. In 2014, NASA started an annual competition named "Cubes in Space", jointly organized by NASA and the global education company I Doodle Learning, with the objective of teaching school students aged 11–18 to design and build scientific experiments to be launched into space on a NASA rocket or balloon. On June 21, 2017, the world's smallest satellite, KalamSAT, was launched. US law requires the International System of Units to be used in all US Government programs, "except where impractical". In 1969, Apollo 11 landed on the Moon using a mix of United States customary units and metric units. In the 1980s, NASA started the transition towards the metric system, but was still using both systems in the 1990s. On September 23, 1999, a mix-up between NASA's use of SI units and Lockheed Martin Space's use of US units resulted in the loss of the Mars Climate Orbiter; a short unit-handling sketch illustrating this failure mode follows this passage. In August 2007, NASA stated that all future missions and explorations of the Moon would be done entirely using the SI system. This was done to improve cooperation with space agencies of other countries that already use the metric system. As of 2007, NASA is predominantly working with SI units, but some projects still use US units, and some, including the International Space Station, use a mix of both. Media presence Approaching 40 years of service, the NASA TV channel airs content ranging from live coverage of crewed missions to video coverage of significant milestones for operating robotic spacecraft (e.g. rover landings on Mars) and domestic and international launches. The channel is delivered by NASA and is broadcast by satellite and over the Internet. The system was initially created to capture archival footage of important space events for NASA managers and engineers, and it expanded as public interest grew. The Apollo 8 Christmas Eve broadcast, made while in orbit around the Moon, was received by more than a billion people. 
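The sketch below is a minimal, hypothetical illustration of the explicit unit handling that guards against the failure mode described above; it is not NASA or Lockheed Martin code, and nothing beyond the standard conversion factor (1 lbf is about 4.448222 N) is assumed.

```python
# Hypothetical illustration of explicit unit handling (not NASA flight code).
# The widely reported cause of the Mars Climate Orbiter loss was thruster
# impulse data delivered in pound-force seconds (US customary) but consumed
# as newton-seconds (SI).

LBF_TO_NEWTON = 4.448222  # standard conversion: 1 lbf = 4.448222 N

def impulse_si(value: float, unit: str) -> float:
    """Normalize an impulse value to newton-seconds before any further use."""
    if unit == "N*s":
        return value
    if unit == "lbf*s":
        return value * LBF_TO_NEWTON
    raise ValueError(f"unknown impulse unit: {unit!r}")

# A ground system reporting 100 lbf*s is ~444.8 N*s -- a 4.45x error if the
# unit tag is dropped and the number is read as SI.
print(impulse_si(100.0, "lbf*s"))   # 444.8222
print(impulse_si(100.0, "N*s"))     # 100.0
```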
NASA's video transmission of the Apollo 11 Moon landing was awarded a primetime Emmy in commemoration of the 40th anniversary of the landing. The channel is a product of the US Government and is widely available across many television and Internet platforms. NASAcast is the official audio and video podcast of the NASA website. Created in late 2005, the podcast service contains the latest audio and video features from the NASA web site, including NASA TV's This Week at NASA and educational materials produced by NASA. Additional NASA podcasts, such as Science@NASA, are also featured and give subscribers an in-depth look at content by subject matter. NASA EDGE is a video podcast which explores different missions, technologies and projects developed by NASA. The program was released by NASA on March 18, 2007, and, as of August 2020, 200 vodcasts had been produced. It is a public outreach vodcast sponsored by NASA's Exploration Systems Mission Directorate and based out of the Exploration and Space Operations Directorate at Langley Research Center in Hampton, Virginia. The NASA EDGE team takes an insider's look at current projects and technologies from NASA facilities around the United States, depicted through on-scene broadcasts, computer animations, and personal interviews with top scientists and engineers at NASA. The show explores the contributions NASA has made to society as well as the progress of current projects in materials and space exploration. NASA EDGE vodcasts can be downloaded from the NASA website and from iTunes. In its first year of production, the show was downloaded over 450,000 times. As of February 2010, the average download rate was more than 420,000 per month, with over one million downloads in December 2009 and January 2010. NASA and the NASA EDGE team have also developed interactive programs designed to complement the vodcast. The Lunar Electric Rover App allows users to drive a simulated Lunar Electric Rover between objectives, and it provides information about and images of the vehicle. The NASA EDGE Widget provides a graphical user interface for accessing NASA EDGE vodcasts, image galleries, and the program's Twitter feed, as well as a live NASA news feed. Astronomy Picture of the Day (APOD) is a website provided by NASA and Michigan Technological University (MTU). Each day it features a different image of the universe accompanied by an explanation written by a professional astronomer. The photograph does not necessarily correspond to a celestial event on the exact day that it is displayed, and images are sometimes repeated. The images often relate to current events in astronomy and space exploration. The text has several hyperlinks to more pictures and websites for more information. The images are either visible spectrum photographs, images taken at non-visible wavelengths and displayed in false color, video footage, animations, artist's conceptions, or micrographs that relate to space or cosmology. Past images are stored in the APOD Archive, with the first image appearing on June 16, 1995. This initiative has received support from NASA, the National Science Foundation, and MTU. The images are sometimes authored by people or organizations outside NASA, and therefore APOD images are often copyrighted, unlike many other NASA image galleries (an illustrative sketch of fetching APOD data programmatically appears at the end of this section). In July 2023, NASA announced a new streaming service known as NASA+. It launched on November 8, 2023, and has live coverage of launches, documentaries and original programs. 
According to NASA, it will be free of ads and subscription fees. It will be a part of the NASA app on iOS, Android, Amazon Fire TV, Roku and Apple TV as well as on the web on desktop and mobile devices. |
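As referenced above, APOD data can also be retrieved programmatically. The following is a minimal sketch assuming NASA's publicly documented api.nasa.gov APOD endpoint and its JSON field names (date, title, url, explanation); these details come from NASA's public API documentation, not from the text above.

```python
# Illustrative sketch of fetching the current Astronomy Picture of the Day.
# Assumes NASA's public APOD endpoint at api.nasa.gov and its documented
# JSON fields; these details are assumptions from NASA's API docs.
import json
import urllib.request

API_URL = "https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY"

with urllib.request.urlopen(API_URL) as response:
    apod = json.load(response)

print(apod["date"], "-", apod["title"])
print(apod["url"])                        # link to the day's image (or video)
print(apod["explanation"][:200], "...")   # astronomer-written caption, truncated
```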
======================================== |
[SOURCE: https://github.com/#start-of-content] | [TOKENS: 508] |
The future of building happens together Tools and trends evolve, but collaboration endures. With GitHub, developers, agents, and code come together on one platform. GitHub features Write, test, and fix code quickly with GitHub Copilot, from simple boilerplate to complex features. GitHub customers Accelerate your entire workflow From your first line of code to final deployment, GitHub provides AI and automation tools to help you build and ship better software faster. Duolingo boosts developer speed by 25% with GitHub Copilot 2025 Gartner® Magic Quadrant™ for AI Code Assistants Ship faster with secure, reliable CI/CD. Launch a full, cloud-based development environment in seconds. Manage projects and assign tasks to Copilot, all from your mobile device. Extend your stack with apps, actions, and AI models. Built-in application security where found means fixed Use AI to find and fix vulnerabilities so your team can ship more secure software faster. Security debt, solved. Leverage security campaigns and Copilot Autofix to reduce application vulnerabilities. Dependencies you can depend on. Update vulnerable dependencies with supported fixes for breaking changes. Your secrets, your business. Detect, prevent, and remediate leaked secrets across your organization. 70% MTTR reduction with Copilot Autofix 8.3M secret leaks stopped in the past 12 months with push protection Work together, achieve more From planning and discussion to code review, GitHub keeps your team’s conversation and context next to your code. Create issues and manage projects with tools that adapt to your code. Create space for open-ended conversations alongside your project. Assign initial reviews to Copilot for greater speed and quality. Become an open source partner and support the tools and libraries that power your work. From startups to enterprises, GitHub scales with teams of any size in any industry. Figma streamlines development and strengthens security Mercedes-Benz standardizes source code and automates onboarding Mercado Libre cuts coding time by 50% Millions of developers and businesses call GitHub home Whether you’re scaling your development process or just learning how to code, GitHub is where you belong. Join the world’s most widely adopted developer platform to build the technologies that shape what’s next. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Easytrieve] | [TOKENS: 205] |
Contents Easytrieve Easytrieve is a report generator, sold by CA Technologies. Easytrieve Classic and Easytrieve Plus are two available versions of this programming language primarily designed to generate reports and are used by large corporations operating in mainframe (z/OS, z/VM, z/VSE), UNIX, Linux, and Microsoft Windows environments. Easytrieve was originally developed by Ribek Corporation (named for its owner Robert I. Beckler), with an initial release around 1971 or 1972. Easytrieve was originally developed for IBM Systems 360/370 and RCA Series 70 mainframes. Pansophic became the exclusive reseller of Easytrieve in North America in 1973, and then purchased it from Ribek in 1979. Pansophic was bought by Computer Associates in 1991, who in turn were acquired by Broadcom in 2018. Easytrieve has been described as "[o]ne of the most successful software products of the 1970s". Example References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Martius_(month)] | [TOKENS: 986] |
Contents Martius (month) Martius or mensis Martius ("March") was the first month of the ancient Roman year until possibly as late as 153 BC. After that time, it was the third month, following Februarius (February) and preceding Aprilis (April). Martius was one of the few Roman months named for a deity, Mars, who was regarded as an ancestor of the Roman people through his sons Romulus and Remus. March marked a return to the active life of farming, military campaigning, and sailing. It was densely packed with religious observances dating from the earliest period of Roman history. Because of its original position as the first month, a number of festivals originally associated with the new year occurred in March. In the Imperial period, March was also a time for public celebration of syncretic or international deities whose cultus was spread throughout the empire, including Isis and Cybele. In the agricultural year The menologia rustica told farmers to expect 12 hours of daylight and 12 of night in March. The spring equinox was placed March 25. The tutelary deity of the month was Minerva, and the Sun was in Pisces. Farmers were instructed in this month to trellis vines, to prune, and to sow spring wheat. Religious observances Festivals for Mars as the month's namesake deity date from the time of the kings and the early Republic. As a god of war, Mars was a guardian of agriculture and of the state, and was associated with the cycle of life and death. The season of Mars was felt to close in October, when most farming and military activities ceased, and the god has a second round of festivals clustered then. During the Principate, a "holy week" for Cybele and Attis developed in the latter half of the month, with an entry festival on the Ides, and a series of observances from March 22 through March 27 or 28. Isis had official festivals on March 7 and 20. Dates The Romans did not number days of a month sequentially from the 1st through the last day. Instead, they counted back from the three fixed points of the month: the Nones (5th or 7th, depending on the length of the month), the Ides (13th or 15th), and the Kalends (1st) of the following month. The Nones of March was the 7th, and the Ides of March was the 15th. Thus the last day of March was the pridie Kalendas Aprilis, "day before the Kalends of April". Roman counting was inclusive; March 9 was ante diem VII Idūs Martias, "the 7th day before the Ides of March," usually abbreviated a.d. VII Id. Mart. (or with the a.d. omitted altogether); March 23 was X Kal. Apr., "the 10th day before the Kalends of April." On the calendar of the Roman Republic and early Principate, each day was marked with a letter to denote its religiously lawful status. In March, these were: By the late 2nd century AD, extant calendars no longer show days marked with these letters, probably in part as a result of calendar reforms undertaken by Marcus Aurelius. Days were also marked with nundinal letters in cycles of A B C D E F G H, to mark the "market week" (these are omitted in the table below). A dies natalis was an anniversary such as a temple founding or rededication, sometimes thought of as the "birthday" of a deity. During the Imperial period, some of the traditional festivals localized at Rome became less important, and the birthdays and anniversaries of the emperor and his family gained prominence as Roman holidays. 
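The backward, inclusive counting described above can be made concrete with a small sketch. It encodes only the rules stated in this section for March (Nones on the 7th, Ides on the 15th, 31 days, followed by the Kalends of April) and reproduces the two worked examples; the Latin forms and abbreviations are illustrative.

```python
# Sketch of the Roman backward, inclusive day count for March (Martius):
# Kalends = 1st, Nones = 7th, Ides = 15th, 31 days, next month Aprilis.
def roman_numeral(n: int) -> str:
    vals = [(10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = ""
    for v, s in vals:
        while n >= v:
            out += s
            n -= v
    return out

def march_date(day: int) -> str:
    NONES, IDES, LENGTH = 7, 15, 31
    if day == 1:
        return "Kalendis Martiis"
    if day == NONES:
        return "Nonis Martiis"
    if day == IDES:
        return "Idibus Martiis"
    if day < NONES:
        anchor, count = "Non. Mart.", NONES - day + 1
    elif day < IDES:
        anchor, count = "Id. Mart.", IDES - day + 1
    else:
        # counting toward the Kalends of April includes both endpoints
        anchor, count = "Kal. Apr.", LENGTH - day + 2
    if count == 2:
        return f"pridie {anchor}"
    return f"a.d. {roman_numeral(count)} {anchor}"

print(march_date(9))   # a.d. VII Id. Mart. (matches the example in the text)
print(march_date(23))  # a.d. X Kal. Apr.   (text gives X Kal. Apr., a.d. optional)
print(march_date(31))  # pridie Kal. Apr.   (day before the Kalends of April)
```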
On the calendar of military religious observances known as the Feriale Duranum, sacrifices pertaining to Imperial cult outnumber the older festivals, but among the military the importance of Mars was maintained and perhaps magnified. The dies imperii was the anniversary of an emperor's accession. After the mid-1st century AD, a number of dates are added to calendars for spectacles and games (ludi) held in honor of various deities in the venue called a "circus" (ludi circenses). Festivals marked in large letters on extant fasti, represented by festival names in all capital letters on the table, are thought to have been the most ancient holidays, becoming part of the calendar before 509 BC. Unless otherwise noted, the dating and observances on the following table are from H. H. Scullard, Festivals and Ceremonies of the Roman Republic (Cornell University Press, 1981), pp. 84–95. References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Lod#cite_note-Pringlep11-53] | [TOKENS: 4733] |
Contents Lod Lod (Hebrew: לוד, fully vocalized: לֹד), also known as Lydda (Ancient Greek: Λύδδα) and Lidd (Arabic: اللِّدّ, romanized: al-Lidd, or اللُّدّ, al-Ludd), is a city 15 km (9+1⁄2 mi) southeast of Tel Aviv and 40 km (25 mi) northwest of Jerusalem in the Central District of Israel. It is situated between the lower Shephelah on the east and the coastal plain on the west. The city had a population of 90,814 in 2023. Lod has been inhabited since at least the Neolithic period. It is mentioned a few times in the Hebrew Bible and in the New Testament. Between the 5th century BCE and up until the late Roman period, it was a prominent center for Jewish scholarship and trade. Around 200 CE, the city became a Roman colony and was renamed Diospolis (Ancient Greek: Διόσπολις, lit. 'city of Zeus'). Tradition identifies Lod as the 4th century martyrdom site of Saint George; the Church of Saint George and Mosque of Al-Khadr located in the city is believed to have housed his remains. Following the Arab conquest of the Levant, Lod served as the capital of Jund Filastin; however, a few decades later, the seat of power was transferred to Ramla, and Lod slipped in importance. Under Crusader rule, the city was a Catholic diocese of the Latin Church and it remains a titular see to this day.[citation needed] Lod underwent a major change in its population in the mid-20th century. Exclusively Palestinian Arab in 1947, Lod was part of the area designated for an Arab state in the United Nations Partition Plan for Palestine; however, in July 1948, the city was occupied by the Israel Defense Forces, and most of its Arab inhabitants were expelled in the Palestinian expulsion from Lydda and Ramle. The city was largely resettled by Jewish immigrants, most of them expelled from Arab countries. Today, Lod is one of Israel's mixed cities, with an Arab population of 30%. Lod is one of Israel's major transportation hubs. The main international airport, Ben Gurion Airport, is located 8 km (5 miles) north of the city. The city is also a major railway and road junction. Religious references The Hebrew name Lod appears in the Hebrew Bible as a town of Benjamin, founded along with Ono by Shamed or Shamer (1 Chronicles 8:12; Ezra 2:33; Nehemiah 7:37; 11:35). In Ezra 2:33, it is mentioned as one of the cities whose inhabitants returned after the Babylonian captivity. Lod is not mentioned among the towns allocated to the tribe of Benjamin in Joshua 18:11–28. The name Lod derives from a tri-consonantal root not extant in Northwest Semitic, but only in Arabic (“to quarrel; withhold, hinder”). An Arabic etymology of such an ancient name is unlikely (the earliest attestation is from the Achaemenid period). In the New Testament, the town appears in its Greek form, Lydda, as the site of Peter's healing of Aeneas in Acts 9:32–38. The city is also mentioned in an Islamic hadith as the location of the battlefield where the false messiah (al-Masih ad-Dajjal) will be slain before the Day of Judgment. History The first occupation dates to the Neolithic in the Near East and is associated with the Lodian culture. Occupation continued in the Levant Chalcolithic. Pottery finds have dated the initial settlement in the area now occupied by the town to 5600–5250 BCE. In the Early Bronze, it was an important settlement in the central coastal plain between the Judean Shephelah and the Mediterranean coast, along Nahal Ayalon. Other important nearby sites were Tel Dalit, Tel Bareqet, Khirbat Abu Hamid (Shoham North), Tel Afeq, Azor and Jaffa. 
Two architectural phases belong to the late EB I in Area B. The first phase had a mudbrick wall, while the late phase included a circular stone structure. Later excavations have revealed another occupation layer, Stratum IV. It consists of two phases: Stratum IVb, with a mudbrick wall on stone foundations and rounded exterior corners, and Stratum IVa, with a mudbrick wall without stone foundations and with imported Egyptian pottery and local imitations. Another excavation revealed nine occupation strata. Strata VI–III belonged to Early Bronze IB. The material culture showed Egyptian imports in strata V and IV. Occupation continued into Early Bronze II with four strata (V–II). There was continuity in the material culture and indications of centralized urban planning. North of the tell were scattered MB II burials. The earliest written record is in a list of Canaanite towns drawn up by the Egyptian pharaoh Thutmose III at Karnak in 1465 BCE. From the fifth century BCE until the Roman period, the city was a centre of Jewish scholarship and commerce. According to British historian Martin Gilbert, during the Hasmonean period, Jonathan Maccabee and his brother, Simon Maccabaeus, enlarged the area under Jewish control, which included conquering the city. The Jewish community in Lod during the Mishnah and Talmud era is described in a significant number of sources, including information on its institutions, demographics, and way of life. The city reached its height as a Jewish center between the First Jewish–Roman War and the Bar Kokhba revolt, and again in the days of Judah ha-Nasi and the start of the Amoraim period. The city was then the site of numerous public institutions, including schools, study houses, and synagogues. In 43 BCE, Cassius, the Roman governor of Syria, sold the inhabitants of Lod into slavery, but they were set free two years later by Mark Antony. During the First Jewish–Roman War, the Roman proconsul of Syria, Cestius Gallus, razed the town on his way to Jerusalem in Tishrei 66 CE. According to Josephus, "[he] found the city deserted, for the entire population had gone up to Jerusalem for the Feast of Tabernacles. He killed fifty people whom he found, burned the town and marched on". Lydda was occupied by Emperor Vespasian in 68 CE. In the period following the destruction of Jerusalem in 70 CE, Rabbi Tarfon, who appears in many Tannaitic and Jewish legal discussions, served as a rabbinic authority in Lod. During the Kitos War, 115–117 CE, the Roman army laid siege to Lod, where the rebel Jews had gathered under the leadership of Julian and Pappos. Torah study was outlawed by the Romans and pursued mostly underground. The distress became so great that the patriarch Rabban Gamaliel II, who was shut up there and died soon afterwards, permitted fasting on Ḥanukkah; other rabbis disagreed with this ruling. Lydda was next taken and many of the Jews were executed; the "slain of Lydda" are often mentioned in words of reverential praise in the Talmud. In 200 CE, emperor Septimius Severus elevated the town to the status of a city, calling it Colonia Lucia Septimia Severa Diospolis. The name Diospolis ("City of Zeus") may have been bestowed earlier, possibly by Hadrian. At that point, most of its inhabitants were Christian. The earliest known bishop is Aëtius, a friend of Arius. During the following century (200–300 CE), Joshua ben Levi is said to have founded a yeshiva in Lod. In December 415, the Council of Diospolis was held here to try Pelagius; he was acquitted. 
In the sixth century, the city was renamed Georgiopolis after St. George, a soldier in the guard of the emperor Diocletian, who was born there between 256 and 285 CE. The Church of Saint George and Mosque of Al-Khadr is named for him. The 6th-century Madaba map shows Lydda as an unwalled city with a cluster of buildings under a black inscription reading "Lod, also Lydea, also Diospolis". An isolated large building with a semicircular colonnaded plaza in front of it might represent the St George shrine. After the Muslim conquest of Palestine by Amr ibn al-'As in 636 CE, Lod which was referred to as "al-Ludd" in Arabic served as the capital of Jund Filastin ("Military District of Palaestina") before the seat of power was moved to nearby Ramla during the reign of the Umayyad Caliph Suleiman ibn Abd al-Malik in 715–716. The population of al-Ludd was relocated to Ramla, as well. With the relocation of its inhabitants and the construction of the White Mosque in Ramla, al-Ludd lost its importance and fell into decay. The city was visited by the local Arab geographer al-Muqaddasi in 985, when it was under the Fatimid Caliphate, and was noted for its Great Mosque which served the residents of al-Ludd, Ramla, and the nearby villages. He also wrote of the city's "wonderful church (of St. George) at the gate of which Christ will slay the Antichrist." The Crusaders occupied the city in 1099 and named it St Jorge de Lidde. It was briefly conquered by Saladin, but retaken by the Crusaders in 1191. For the English Crusaders, it was a place of great significance as the birthplace of Saint George. The Crusaders made it the seat of a Latin Church diocese, and it remains a titular see. It owed the service of 10 knights and 20 sergeants, and it had its own burgess court during this era. In 1226, Ayyubid Syrian geographer Yaqut al-Hamawi visited al-Ludd and stated it was part of the Jerusalem District during Ayyubid rule. Sultan Baybars brought Lydda again under Muslim control by 1267–8. According to Qalqashandi, Lydda was an administrative centre of a wilaya during the fourteenth and fifteenth century in the Mamluk empire. Mujir al-Din described it as a pleasant village with an active Friday mosque. During this time, Lydda was a station on the postal route between Cairo and Damascus. In 1517, Lydda was incorporated into the Ottoman Empire as part of the Damascus Eyalet, and in the 1550s, the revenues of Lydda were designated for the new waqf of Hasseki Sultan Imaret in Jerusalem, established by Hasseki Hurrem Sultan (Roxelana), the wife of Suleiman the Magnificent. By 1596 Lydda was a part of the nahiya ("subdistrict") of Ramla, which was under the administration of the liwa ("district") of Gaza. It had a population of 241 households and 14 bachelors who were all Muslims, and 233 households who were Christians. They paid a fixed tax-rate of 33,3 % on agricultural products, including wheat, barley, summer crops, vineyards, fruit trees, sesame, special product ("dawalib" =spinning wheels), goats and beehives, in addition to occasional revenues and market toll, a total of 45,000 Akçe. All of the revenue went to the Waqf. In 1051 AH/1641/2, the Bedouin tribe of al-Sawālima from around Jaffa attacked the villages of Subṭāra, Bayt Dajan, al-Sāfiriya, Jindās, Lydda and Yāzūr belonging to Waqf Haseki Sultan. The village appeared as Lydda, though misplaced, on the map of Pierre Jacotin compiled in 1799. Missionary William M. 
Thomson visited Lydda in the mid-19th century, describing it as a "flourishing village of some 2,000 inhabitants, imbosomed in noble orchards of olive, fig, pomegranate, mulberry, sycamore, and other trees, surrounded every way by a very fertile neighbourhood. The inhabitants are evidently industrious and thriving, and the whole country between this and Ramleh is fast being filled up with their flourishing orchards. Rarely have I beheld a rural scene more delightful than this presented in early harvest ... It must be seen, heard, and enjoyed to be appreciated." In 1869, the population of Ludd was given as: 55 Catholics, 1,940 "Greeks", 5 Protestants and 4,850 Muslims. In 1870, the Church of Saint George was rebuilt. In 1892, the first railway station in the entire region was established in the city. In the second half of the 19th century, Jewish merchants migrated to the city, but left after the 1921 Jaffa riots. In 1882, the Palestine Exploration Fund's Survey of Western Palestine described Lod as "A small town, standing among enclosure of prickly pear, and having fine olive groves around it, especially to the south. The minaret of the mosque is a very conspicuous object over the whole of the plain. The inhabitants are principally Moslim, though the place is the seat of a Greek bishop resident of Jerusalem. The Crusading church has lately been restored, and is used by the Greeks. Wells are found in the gardens...." From 1918, Lydda was under the administration of the British Mandate in Palestine, as per a League of Nations decree that followed the Great War. During the Second World War, the British set up supply posts in and around Lydda and its railway station, also building an airport that was renamed Ben Gurion Airport after the death of Israel's first prime minister in 1973. At the time of the 1922 census of Palestine, Lydda had a population of 8,103 inhabitants (7,166 Muslims, 926 Christians, and 11 Jews), the Christians were 921 Orthodox, 4 Roman Catholics and 1 Melkite. This had increased by the 1931 census to 11,250 (10,002 Muslims, 1,210 Christians, 28 Jews, and 10 Bahai), in a total of 2475 residential houses. In 1938, Lydda had a population of 12,750. In 1945, Lydda had a population of 16,780 (14,910 Muslims, 1,840 Christians, 20 Jews and 10 "other"). Until 1948, Lydda was an Arab town with a population of around 20,000—18,500 Muslims and 1,500 Christians. In 1947, the United Nations proposed dividing Mandatory Palestine into two states, one Jewish state and one Arab; Lydda was to form part of the proposed Arab state. In the ensuing war, Israel captured Arab towns outside the area the UN had allotted it, including Lydda. In December 1947, thirteen Jewish passengers in a seven-car convoy to Ben Shemen Youth Village were ambushed and murdered.In a separate incident, three Jewish youths, two men and a woman were captured, then raped and murdered in a neighbouring village. Their bodies were paraded in Lydda’s principal street. The Israel Defense Forces entered Lydda on 11 July 1948. The following day, under the impression that it was under attack, the 3rd Battalion was ordered to shoot anyone "seen on the streets". According to Israel, 250 Arabs were killed. Other estimates are higher: Arab historian Aref al Aref estimated 400, and Nimr al Khatib 1,700. In 1948, the population rose to 50,000 during the Nakba, as Arab refugees fleeing other areas made their way there. 
A key event was the Palestinian expulsion from Lydda and Ramle, with the expulsion of 50,000-70,000 Palestinians from Lydda and Ramle by the Israel Defense Forces. All but 700 to 1,056 were expelled by order of the Israeli high command, and forced to walk 17 km (10+1⁄2 mi) to the Jordanian Arab Legion lines. Estimates of those who died from exhaustion and dehydration vary from a handful to 355. The town was subsequently sacked by the Israeli army. Some scholars, including Ilan Pappé, characterize this as ethnic cleansing. The few hundred Arabs who remained in the city were soon outnumbered by the influx of Jews who immigrated to Lod from August 1948 onward, most of them from Arab countries. As a result, Lod became a predominantly Jewish town. After the establishment of the state, the biblical name Lod was readopted. The Jewish immigrants who settled Lod came in waves, first from Morocco and Tunisia, later from Ethiopia, and then from the former Soviet Union. Since 2008, many urban development projects have been undertaken to improve the image of the city. Upscale neighbourhoods have been built, among them Ganei Ya'ar and Ahisemah, expanding the city to the east. According to a 2010 report in the Economist, a three-meter-high wall was built between Jewish and Arab neighbourhoods and construction in Jewish areas was given priority over construction in Arab neighborhoods. The newspaper says that violent crime in the Arab sector revolves mainly around family feuds over turf and honour crimes. In 2010, the Lod Community Foundation organised an event for representatives of bicultural youth movements, volunteer aid organisations, educational start-ups, businessmen, sports organizations, and conservationists working on programmes to better the city. In the 2021 Israel–Palestine crisis, a state of emergency was declared in Lod after Arab rioting led to the death of an Israeli Jew. The Mayor of Lod, Yair Revivio, urged Prime Minister of Israel Benjamin Netanyahu to deploy Israel Border Police to restore order in the city. This was the first time since 1966 that Israel had declared this kind of emergency lockdown. International media noted that both Jewish and Palestinian mobs were active in Lod, but the "crackdown came for one side" only. Demographics In the 19th century and until the Lydda Death March, Lod was an exclusively Muslim-Christian town, with an estimated 6,850 inhabitants, of whom approximately 2,000 (29%) were Christian. According to the Israel Central Bureau of Statistics (CBS), the population of Lod in 2010 was 69,500 people. According to the 2019 census, the population of Lod was 77,223, of which 53,581 people, comprising 69.4% of the city's population, were classified as "Jews and Others", and 23,642 people, comprising 30.6% as "Arab". Education According to CBS, 38 schools and 13,188 pupils are in the city. They are spread out as 26 elementary schools and 8,325 elementary school pupils, and 13 high schools and 4,863 high school pupils. About 52.5% of 12th-grade pupils were entitled to a matriculation certificate in 2001.[citation needed] Economy The airport and related industries are a major source of employment for the residents of Lod. Other important factories in the city are the communication equipment company "Talard", "Cafe-Co" - a subsidiary of the Strauss Group and "Kashev" - the computer center of Bank Leumi. A Jewish Agency Absorption Centre is also located in Lod. According to CBS figures for 2000, 23,032 people were salaried workers and 1,405 were self-employed. 
The mean monthly wage for a salaried worker was NIS 4,754, a real change of 2.9% over the course of 2000. Salaried men had a mean monthly wage of NIS 5,821 (a real change of 1.4%) versus NIS 3,547 for women (a real change of 4.6%). The mean income for the self-employed was NIS 4,991. About 1,275 people were receiving unemployment benefits and 7,145 were receiving an income supplement. Art and culture In 2009–2010, Dor Guez held an exhibit focusing on Lod, Georgeopolis, at the Petach Tikva art museum. Archaeology A well-preserved mosaic floor dating to the Roman period was excavated in 1996 as part of a salvage dig conducted on behalf of the Israel Antiquities Authority and the Municipality of Lod, prior to widening HeHalutz Street. According to Jacob Fisch, executive director of the Friends of the Israel Antiquities Authority, a worker at the construction site noticed the tail of a tiger and halted work. The mosaic was initially covered over with soil at the conclusion of the excavation for lack of funds to conserve and develop the site. The mosaic is now part of the Lod Mosaic Archaeological Center. The floor, with its colorful display of birds, fish, exotic animals and merchant ships, is believed to have been commissioned by a wealthy resident of the city for his private home. The Lod Community Archaeology Program, which operates in ten Lod schools, five Jewish and five Israeli Arab, combines archaeological studies with participation in digs in Lod. Sports The city's major football club, Hapoel Bnei Lod, plays in Liga Leumit (the second division). Its home is at the Lod Municipal Stadium. The club was formed by a merger of Bnei Lod and Rakevet Lod in the 1980s. Two other clubs in the city play in the regional leagues: Hapoel MS Ortodoxim Lod in Liga Bet and Maccabi Lod in Liga Gimel. Hapoel Lod played in the top division during the 1960s and 1980s, and won the State Cup in 1984. The club folded in 2002. A new club, Hapoel Maxim Lod (named after former mayor Maxim Levy), was established soon after, but folded in 2007. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Rabbinic_period] | [TOKENS: 3195] |
Contents Rabbinic period The Rabbinic period, or the Talmudic period, denotes a transformative era in Jewish history, spanning from the destruction of the Second Temple in 70 CE to the Muslim conquest in 638 CE. Pivotal in shaping Judaism into its classical form, it is regarded as the second most important era in Jewish history after the Biblical period. After the failure of the Great Jewish Revolt of 66–73 CE, Roman measures such as the fiscus Judaicus (Latin for 'Jewish tax') and land confiscation severely impacted the Jewish population of Judaea. The destruction of Jerusalem and the Temple required Jewish culture to adapt in order to survive. Judaism endured through the establishment of new centers of scholarship and leadership, initially at Yavne under Yohanan ben Zakkai, who promoted a focus on Torah study and synagogue worship. The next decades also saw the Jewish response to several catastrophic events, including the failed Diaspora uprisings of 115–117 CE and the Bar Kokhba revolt of 132–135 CE, a failed bid for the reestablishment of an independent Jewish state in Judaea. The suppression of these revolts by the Romans led to the devastation of Judea proper as well as diaspora communities, the death and enslavement of many Jews, further displacement, and economic hardship. Despite these challenges, Jewish communal life continued to thrive, particularly in the Galilee, which became a major center of Jewish life and scholarship. The authority of the Patriarchs was instrumental in maintaining Jewish continuity during this transformative period. During the later Rabbinic period, the Jewish population in the Land of Israel continued to decline under Christianized Roman rule. Jews started facing discriminatory laws and religious persecution, and many emigrated from the country, eventually establishing flourishing Diaspora communities. From the 3rd century onward, the Jewish community in Babylonia became a central hub of Jewish life, benefiting from a relatively tolerant environment under the Sasanian Empire. Contemporary estimates frequently place the Babylonian Jewish population during this period at approximately one million, establishing it as the largest Jewish diaspora community of the time. This period of economic prosperity and political freedom allowed the Babylonian Jewish community, led by the Exilarch, to thrive and foster significant theological and literary developments. During the Rabbinic period, Jewish communities were also present in various regions of the Mediterranean, including Egypt, North Africa, Asia Minor, Italy, and the Iberian Peninsula. The Rabbinic period was consequential in the ongoing development of Judaism and its traditions. During this time, the Jewish religious practice transitioned from a focus on the Temple and sacrificial practices to a greater emphasis on Halakha (Jewish law) and Aggadah (biblical interpretation). This period saw the creation of major texts of rabbinic literature, such as the Mishnah, Tosefta, Jerusalem Talmud, Babylonian Talmud, and various midrashim (biblical commentaries). Jews maintained their cultural and religious identity by continuing to speak and write in Hebrew and Aramaic, and developed liturgy, including piyyutim (liturgical poetry). They set up synagogues and yeshivas, engaged in mysticism, and hoped for the Messiah to bring their exile to an end. History The First Jewish–Roman War took a heavy toll on the Jewish people in the Land of Israel. 
Approximately one quarter of the Jewish population in Judaea was killed in the fighting and its aftermath and about one tenth was taken into captivity. The Temple, as a national and administrative center of Jewish life and worship was demolished, Jerusalem was destroyed, and the autonomous positions of the Sanhedrin and the High-priesthood were rendered null and void. The social structure prior to the destruction collapsed and the factions of the Sadducees and the Essenes disappeared. On the other hand, the status of the Jews as a people recognized as a nation in the Roman Empire remained, as did their freedom to follow their faith and religious law. Vespasian placed an additional tax of two Dinar for each Jew, the fiscus Judaicus (Latin for 'Jewish tax'), creating a financial burden on Jews and meant to humiliate them. The Romans also confiscated land from Jews. Around the period of the destruction of the Temple, Yohanan ben Zakkai (Ribaz) moved from Jerusalem to Yavneh, a small town on the coast, where he established a new center of leadership. The Rabbinic movement adopted and further developed the Pharisee approach to Halakha. This new movement put an emphasis on Torah study, and prayer and the Synagogue emerged as the center of community life. At this stage, the center of Jewish leadership was still in the Land of Israel, although it would eventually move to Babylonia. Although Yohanan ben Zakkai made certain decrees "to remember the Temple", his general approach was to continue observing Judaism regardless of the Temple or lack thereof. Ribaz was replaced by Gamaliel II, who sought to maintain ties with the diaspora by visiting communities abroad and welcoming visitors to Yavneh for study and consultation. Within the Jewish community, prestige and authority was given to the nasi (patriarch). But this authority was challenged by rabbis several times until Rabbi Judah ha-Nasi consolidated his authority as both the patriarch and religious leader. Between 132–135 CE, the Jews made their last serious attempt to regain their independence in the form of the Bar Kokhba revolt. The Rabbis made an effort to unify the people under Bar Kokhba. The rabbinic leaders understood that such a revolt had no chance of surviving without unity within the Jewish community, and they put much effort into unifying the people behind Bar Kokhba. The failure of the revolt led to many casualties, as well as an economic downturn that caused many Jews to migrate to the Galilee and outside of Israel. In fact, Jews were prohibited from living in the area surrounding Jerusalem during this period; nevertheless, this prohibition was not always enforced, and there appears to have been a small Jewish community that established itself in Jerusalem around the end of the second century. The Talmud describes the ten places where the Sanhedrin was exiled, the later places - namely Usha, Shefa-Amr, Beit She'arim, Sepphoris, and Tiberias being in the Galilee. These exiles lasted a total of about one hundred years. The population of Judea also migrated to the north during this period, making the Galilee the center of Jewish life during this time. Following the Bar Kokhba revolt around 140 CE, when the Sanhedrin was located in Usha, Simeon ben Gamaliel II took its leadership in the form of the Patriarch. This title was passed down from father to son from then on. Although Gamaliel II, his father, is also referred to as "Patriarch", this title may be simply applying the family title retroactively.) 
The Patriarchs managed to stabilize the economy; in light of the many fields that were left empty following the revolt, they made decrees allowing the owners to reclaim them. To preserve the upper hand of the Talmudic academies in Syria Palaestina, the Patriarchate clarified to the community of Lower Mesopotamia, known to Jews as "Babylonia", that the calendar can only be established in the Land of Israel. The Jews experienced more favorable conditions under the Severan dynasty. According to Jerome, both Septimius Severus and Caracalla "very greatly cherished the Jews." Towards the middle of the third century, the Christian scholar Origen wrote that the Jewish ethnarchs held power comparable to kings and had the authority to condemn individuals to death: Now, for instance, that the Romans rule, and the Jews pay the two drachmas to them, we who have had experience of it know how much power the ethnarches has among them and that differs in little from a king of the nation. Trials are held according to the law, and some are condemned to death. And though there is not full permission for this, still it is not done without the knowledge of the ruler. The fortunes of the Jews in the Land of Israel changed significantly under Byzantine rule. Between the fourth and seventh centuries, the region ceased to be predominantly Jewish, as much of the non-Jewish population had converted to Christianity. The decline and eventual disappearance of the patriarchate, which several scholars suggest occurred around 425 CE, led to the loss of central Jewish leadership, while their spiritual academies (yeshivot) also diminished. Decentralization increased the prominence of local communities centered around synagogues. In 553, Byzantine emperor Justinian issued a decree banning the study of the Mishnah and mandating the use of the Septuagint or Aquila's translation for biblical readings, part of his campaign to convert Jews to Christianity. This marked a decline in the influence of the Jewish community in Palestine, reflected in the cessation of scholarly exchange with Babylon. In the 9th century, Pirqoi ben Baboi described the dire conditions for Jews under Christian rule, contrasting it with the flourishing Torah study in Babylonia: Thus, said Mar Yehudai of blessed memory: religious persecution was decreed upon the Jews of the Land of Israel—that they should not recite the Shema and should not pray, because the [...] evil Edom [Rome, Byzantium] decreed, religious persecution against the Land of Israel that they should not read the Torah, and they hid away all the Torah scrolls because they would burn them." As the influence of the Jewish community in the Land of Israel over the Diaspora waned, Babylonian leadership emerged as the central authority for Jewish cultural and political matters by the Early Muslim conquests. The origins of the Jewish community in Babylonia go back to the Babylonian exile. Beginning in the 3rd century, Lower Mesopotamia became the center of the Jewish world. Babylon was the only major Jewish community outside of the Roman Empire, which attracted Jews and influenced their spiritual world. With estimates around one million, the community under the Sasanian Empire from the 3rd to 7th centuries is thought to have been the world's largest diasporic population, possibly exceeding the number in the Land of Israel. By the late 3rd century, Jewish communities had re-established themselves in Egypt following their near-elimination during the Diaspora Revolt in the early 2nd century. 
This period witnessed a significant increase in Jewish immigration from Palestine, as supported by the growing number of Jewish texts and documents written in Hebrew and Aramaic during the 4th and 5th centuries. Additional evidence of a demographic shift in the fourth and fifth centuries can be found in the re-establishment of a Jewish population in Cyrenaica. This community appears to have been settled by immigrants from both Palestine and the growing Jewish communities in Egypt. Rabbinic literature In addition to the synagogue, the study hall (bet medrash) played an essential role in the development of Judaism. The sages composed liturgy (piyyutim), targum, and most importantly codified the Halakha and Aggadah. Halakha is the corpus of Jewish laws, and every matter is carefully considered. The Talmud contains not just the final ruling which is codified as binding law, but also the discussions that lead to that conclusion. The major Halachic works are Mishnah and Tosefta (1st–2nd centuries), Babylonian Talmud and Jerusalem Talmud (3rd to 6th centuries), as well as Halakhic midrashim. These inspired later discussions and codifications of Jewish law such as Maimonides in his Mishneh Torah and Rabbi Yosef Karo in his Shulchan Aruch. Aggadah contains interpretations of Biblical stories. It is dispersed tangentially throughout the Talmud, and it also appears in Midrashim such as Genesis Rabbah. Daily life In the Land of Israel, while some Jews lived in towns such as Tiberias, Sepphoris, Caesarea and Lydda, most lived in "villages" with populations ranging from 2,000 to 5,000. Thus, the economy remained similar to what it had been in the Second Temple period. The archeological discovery of many presses indicates that there were large wine and oil industries, and fishing was also common. As Jews moved towards the coast, many began to engage in commerce, primarily with port towns in Lebanon and Syria. The Jerusalem Talmud advises that Jews should reside only in towns that possess essential public amenities such as a medical doctor, a public bathhouse, a municipal kitchen garden, a synagogue, a study hall, as well as access to water through aqueducts and wells. Villages were governed by seven archons, who were authorized to buy and sell public property, including the Synagogue. There was a concept of "citizenship", with a distinction between permanent and temporary residents. Taxes were collected to finance the Synagogue building, the Torah scrolls, maintaining public property, and paying for public officials such as the market inspector, the synagogue officer, city guards, and school teachers. There is evidence for some kind of institutions for elementary religious education. In Late Antiquity, Jewish magical practices are attested through textual and material evidence from the Levant and Mesopotamia. In the Levant, various amulets were discovered, usually thin metal sheets (lemallae) or sometimes scratched onto ceramic sherds. These were typically commissioned for specific clients, serving protective (apotropaic), healing, and generally favorable functions, often involving adjurations of angels and demons and citations of biblical verses. The Jewish community of Palaestina also engaged in other forms of magic, such as binding and erotic spells. One notable example is the bronze defixio (binding spell) found beneath the threshold of the Meroth synagogue in Upper Galilee, which contained a plea for the entire community to be subdued, broken, and fallen before its commissioner, Yose son of Zenobia. 
Similarly, the use of love magic is attested by an Aramaic text scratched onto ceramic sherds found near the synagogal complex of Horvat Rimmon in southern Judea, intended to make a victim "burn in love" for the agent. Meanwhile, the Jews of Babylonia left behind a corpus of a different magical medium: thousands of inscribed clay incantation bowls. Written in Jewish Babylonian Aramaic, these bowls were typically apotropaic "demon traps," intended to attract and bind evil entities, and were often found inverted or glued together within domestic settings. While influences from non-Jewish Mesopotamian traditions are evident, Jewish scribes were careful to omit references to pagan deities. In addition, Jewish magical knowledge was formalized in specialized manuals and works. One of the best-known compositions is Sefer HaRazim ("The Book of Secrets"), which provides recipes and instructions and is thought to date to the 5th or 6th century. The persistent reappearance of specific formulas and recipes (such as those from the love charm of Horvat Rimmon), in medieval texts (including some preserved in the Cairo Genizah) points to a degree of textual continuity in the Jewish magical tradition stretching from Late Antiquity. References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Rabbinic_Judaism] | [TOKENS: 5672] |
Contents Rabbinic Judaism Rabbinic Judaism (Hebrew: יהדות רבנית, romanized: Yahadut Rabanit), also called Rabbinism, Rabbinicism, or Rabbanite Judaism, is rooted in the many forms of Judaism that coexisted and together formed Second Temple Judaism in the land of Israel, giving birth to classical rabbinic Judaism, which flourished from the 1st century CE to the final redaction of the Talmud in c. 600. Mainly developing after the destruction of the Jerusalem Temple in 70 CE, it eventually became the normative form of Judaism. Rabbinic Judaism has been an orthodox form of Judaism since the 6th century CE, after the codification of the Babylonian Talmud. It has its roots in the Pharisaic school of Second Temple Judaism and is based on the claim that Moses at Mount Sinai received both the Written Torah (Torah she-be-Khetav) and the Oral Torah (Torah she-be-al Peh) from God. The Oral Torah explains the Written Torah, and the rabbis claimed that it was they who possessed this memorized and orally transmitted part of the divine revelation. At first, it was forbidden to write down the Oral Torah, but after the destruction of the Second Temple, it was decided to write it down in the form of the Talmud and other rabbinic texts for the sake of preservation. Rabbinic Judaism contrasts with the non-Rabbinic forms which emphasize the Tanakh over the Talmud, including the defunct Sadducee Judaism as well as with Karaite Judaism, Ethiopian Judaism, and Samaritanism, which do not recognize the Oral Torah as a divine authority nor the rabbinic procedures used to interpret Jewish scripture (e.g., the Hebrew Bible). Although there are now profound differences among Jewish denominations of Rabbinic Judaism with respect to the binding force of Halakha (Jewish religious law) and the willingness to challenge preceding interpretations, all identify themselves as coming from the tradition of the Oral Law and the rabbinic method of analysis. Background In 332 BCE, the Persians were defeated by Alexander the Great. After his demise and the division of Alexander's empire among his generals, the Seleucid Kingdom was formed. During this time, currents of Judaism were influenced by Hellenistic philosophy developed from the 3rd century BCE, notably among the Jewish diaspora in Alexandria, culminating in a Greek translation of the Hebrew Bible known as the Septuagint. An important advocate of the symbiosis of Jewish theology and Hellenistic thought is Philo. Hellenistic culture had a profound impact on Jewish customs and practices, both in Judea and in the diaspora. These inroads into Judaism gave rise to Hellenistic Judaism in the Jewish diaspora, which sought to establish a Hebraic-Jewish religious tradition within the surrounding Hellenist culture and language. There was a general deterioration in relations between Hellenized Jews and other Jews, leading the Seleucid king Antiochus IV Epiphanes to effectively outlaw the observance of Judaism, replace the High Priest Jason with Menelaus—who had bribed Antiochus for the position—and syncretize the Temple worship of the God of Israel with the worship of Zeus. Consequently, the Jews who rejected Hellenism revolted against the Greek ruler, forming a Jewish kingdom of varying degrees of autonomy or independence ruled by the Hasmonean dynasty, which lasted from 110 BCE to 37 BCE. The Hasmonean dynasty eventually disintegrated into a civil war. 
The competing leaders appealed to Rome for intervention, which in time led to a total Roman conquest and annexation of the region (see Iudaea province). Nevertheless, the region's cultural issues remained unresolved. The main issue separating Hellenistic and other Jews was the application of biblical laws in a Hellenistic melting pot culture. Hellenistic Judaism spread to Ptolemaic Egypt from the 3rd century BCE, becoming a notable religio licita throughout the Roman Empire until its decline in the 3rd century concurrent with the rise of Gnosticism and early Christianity. The decline of Hellenistic Judaism remains obscure. It may have been marginalized, absorbed into, or become early Christianity (see the Gospel of the Hebrews). The Acts of the Apostles, at least, report how Paul the Apostle preferentially evangelized communities of Jewish proselytes, Godfearers, and non-Jewish circles sympathetic to Judaism: the Apostolic Decree, which authorized proselytes to forgo circumcision, made Christianity an easier option for interested pagans in the empire than did Judaism (see also circumcision controversy in early Christianity). However, nascent Christianity's attractiveness may have suffered a setback when it was explicitly outlawed in the 80s CE by Domitian as a "Jewish superstition", while Judaism retained its privileges as long as Jews paid the Fiscus Judaicus. However, from a historical perspective, the state's persecution of Christians seemed only to increase the frequency of pagan conversions to Christianity, leading eventually to the adoption of Christianity by the Roman emperor Constantine the Great, himself a convert. On the other hand, mainstream Judaism began to reject Hellenistic currents, outlawing the use of the Septuagint (see also the Council of Jamnia). Remaining currents of Hellenistic Judaism may have merged into Gnostic movements or early Rabbinic Judaism in the early centuries CE.[citation needed] Classical rabbinic Judaism is seen as consisting of three separate strata: tannaitic (until 200 CE), amoraic (200–500 CE), and saboraic (500 CE–7th century). The views of the Tannaim, who witnessed the destruction of the Second Temple in 70 CE and the defeat of the Bar Kokhba revolt of 132–135 CE, are preserved in the Mishnah (finalized c. 200 CE), a law code constituting the first phase of formative rabbinic Judaism. The late-2nd-century Tosefta ("supplement, addition") was another piece of oral tradition put to paper by rabbinic authors. The two Talmuds (Jerusalem and Babylonian) are also their work, along with Midrashic ( exegetic) texts. After usually opposing the priestly tradition with its exclusive focus on the written tradition and the Temple-related sacrificial cult, rabbinic Judaism reached the end of its formative period, offering a synthesis of a triad of traditions - the interpretive, messianic, and priestly. In medieval times, rabbinic Judaism continued to flourish in the diaspora, nowadays representing normative Judaism. In the later part of the Second Temple period (2nd century BCE), the Second Commonwealth of Judea (Hasmonean Kingdom) was established, and religious matters were determined by a pair (zugot) which led the Sanhedrin. The Hasmonean Kingdom ended in 37 BCE, but it is believed that the "two-man rule of the Sanhedrin" lasted until the early part of the 1st century CE during the period of the Roman province of Judea. The last pair, Hillel and Shammai, was the most influential of the Sanhedrin zugot. 
Both were Pharisees, but the Sadducees were actually the dominant party while the Temple stood. Since the Sadducees did not survive the First Jewish–Roman War, their version of events has perished. In addition, Rabbinic Judaism sees Hillel's views as superior to Shammai's. The development of an oral tradition of teaching called the tanna would be the means by which the faith of Judaism would survive the fall of the Second Temple. Jewish messianism has its roots in the apocalyptic literature of the 2nd to 1st centuries BCE, promising a future "anointed" leader, or Messiah, to resurrect the Israelite "Kingdom of God" in place of the foreign rulers of the time. This corresponded with the Maccabean Revolt directed against the Seleucids. Following the fall of the Hasmonean kingdom, it was directed against the Roman administration of Iudaea Province, which, according to Josephus, began with the formation of the Zealots during the Census of Quirinius of 6 CE, although full-scale open revolt did not occur until the First Jewish–Roman War in 66 CE. Historian H. H. Ben-Sasson has proposed that the "Crisis under Caligula" (37–41) was the "first open break" between Rome and the Jews, even though tension already existed during the census in 6 CE and under Sejanus (before 31 CE). Emergence of Rabbinic Judaism While the Torah represents the written law, Rabbinic tradition holds that its details and interpretation, which are called the Oral Torah or Oral Law, were originally an unwritten tradition based upon the Law given to Moses on Mount Sinai. However, as the persecutions of the Jews increased and the details were in danger of being forgotten, these oral laws were recorded by rabbi Judah ha-Nasi ("Judah the Prince") in the Mishnah, redacted c. 200 CE. The Talmud was a compilation of both the Mishnah and the Gemara, rabbinic commentaries redacted over the next three centuries. The Gemara originated in two major centers of Jewish scholarship, Syria Palaestina and Babylonia. Correspondingly, two bodies of analysis developed, and two works of Talmud were created. The older compilation is called the Jerusalem Talmud. It was compiled sometime during the 4th century in Syria Palaestina. Judaism in the late Second Temple period was divided into antagonistic factions. The main camps were the Pharisees, Sadducees, and Zealots, but also included other less influential sects. This led to further unrest, and the 1st century BCE and 1st century CE saw a number of charismatic religious leaders, contributing to what would become the Mishnah of Rabbinic Judaism, including Yohanan ben Zakkai and Hanina Ben Dosa. Following the destruction of the Temple in 70 CE and the expulsion of the Jews from the Roman province of Judea, Jewish worship stopped being centrally organized around the Temple, prayer took the place of sacrifice, and worship was rebuilt around rabbis who acted as teachers and leaders of individual communities. The destruction of the Second Temple was a profoundly traumatic experience for the Jews, who were now confronted with difficult and far-reaching questions. How people answered these questions depended largely on their position prior to the revolt. But the destruction of the Second Temple by the Romans not only put an end to the revolt, it marked the end of an era. Revolutionaries like the Zealots had been crushed by the Romans, and had little credibility (the last Zealots died at Masada in 73). The Sadducees, whose teachings were so closely connected to the Temple cult, disappeared. 
The Essenes also vanished (or developed into Christians), perhaps because their teachings so diverged from the issues of the times that the destruction of the Second Temple was of no consequence to them; precisely for this reason, they were of little consequence to the vast majority of Jews. Two organized groups remained: the Early Christians, and Pharisees. Some scholars, such as Daniel Boyarin and Paula Fredricksen, suggest that it was at this time, when Christians and Pharisees were competing for leadership of the Jewish people, that accounts of debates between Jesus and the apostles, debates with Pharisees, and anti-Pharisaic passages, were written and incorporated into the New Testament.[citation needed] Of all the major Second Temple sects, only the Pharisees remained. Their vision of Jewish law as a means by which ordinary people could engage with the sacred in their daily lives, provided them with a position from which to respond to all four challenges, in a way meaningful to the vast majority of Jews.[citation needed] Following the destruction of the Temple, Rome governed Judea through a Procurator at Caesarea and a Jewish Patriarch. A former leading Pharisee, Yohanan ben Zakkai, was appointed the first Patriarch (the Hebrew word, Nasi, also means prince, or president), and he reestablished the Sanhedrin at Javneh under Pharisee control. Instead of giving tithes to the priests and sacrificing offerings at the Temple, the rabbis instructed Jews to give money to charities and study in local synagogues, as well as to pay the Fiscus Iudaicus. In 132, the Emperor Hadrian threatened to rebuild Jerusalem as a pagan city dedicated to Jupiter, called Aelia Capitolina. Some of the leading sages of the Sanhedrin supported a rebellion (and, for a short time, an independent state) led by Simon bar Kozeba (also called Simon bar Kokhba, or "son of a star"); some, such as Rabbi Akiva, believed Bar Kokhba to be a messiah. Up until this time, a number of Christians were still part of the Jewish community. However, they did not support or take part in the revolt. Whether because they had no wish to fight, or because they could not support a second messiah in addition to Jesus, or because of their harsh treatment by Bar Kokhba during his brief reign, these Christians also left the Jewish community around this time.[citation needed] This revolt ended in 135 when Bar Kokhba and his army were defeated. The Romans then barred Jews from Jerusalem, until Constantine allowed Jews to enter for one day each year, during the holiday of Tisha B'Av. After the suppression of the revolt the vast majority of Jews were sent into exile; shortly thereafter (around 200), Judah haNasi edited together judgments and traditions into an authoritative code, the Mishnah. This marks the transformation of Pharisaic Judaism into Rabbinic Judaism. Although the rabbis traced their origins to the Pharisees, Rabbinic Judaism nevertheless involved a radical repudiation of certain elements of Pharisaism, elements that were basic to Second Temple Judaism. The Pharisees had been partisan. Members of different sects argued with one another over the correctness of their respective interpretations. After the destruction of the Second Temple, these sectarian divisions ended. The term Pharisee was no longer used, perhaps because it was a term more often used by non-Pharisees, but also because the term was explicitly sectarian. 
The rabbis claimed leadership over all Jews, and added to the Amidah the birkat haMinim, a prayer which in part exclaims, "Praised are You O Lord, who breaks enemies and defeats the arrogant", and which is understood as a rejection of sectarians and sectarianism. This shift by no means resolved conflicts over the interpretation of the Torah; rather, it relocated debates between sects to debates within Rabbinic Judaism.[citation needed] The survival of Pharisaic or Rabbinic Judaism is attributed to Rabbi Yohanan ben Zakkai, the founder of the Yeshiva (religious school) in Yavne. Yavneh replaced Jerusalem as the new seat of a reconstituted Sanhedrin, which reestablished its authority and became a means of reuniting Jewry. The destruction of the Second Temple brought about a dramatic change in Judaism. Rabbinic Judaism built upon Jewish tradition while adjusting to new realities. Temple ritual was replaced with prayer service in synagogues which built upon practices of Jews in the diaspora dating back to the Babylonian exile. As the rabbis were required to face two shattering new realities, Judaism without a Temple (to serve as the center of teaching and study) and Judea without autonomy, there was a flurry of legal discourse and the old system of oral scholarship could not be maintained. It is during this period that rabbinic discourse began to be recorded in writing. The theory that the destruction of the Temple and subsequent upheaval led to the committing of Oral Law into writing was first explained in the Epistle of Sherira Gaon and often repeated. The Oral Law was subsequently codified in the Mishnah and Gemarah, and is interpreted in rabbinic literature detailing subsequent rabbinic decisions and writings. Rabbinic Jewish literature is predicated on the belief that the Written Law cannot be properly understood without recourse to the Oral Law (the Mishnah). Much rabbinic Jewish literature concerns specifying what behavior is sanctioned by the law; this body of interpretations is called halakha (the way). The Talmud contains discussions and opinions regarding details of many oral laws believed to have originally been transmitted to Moses. Some see Exodus 18 and Numbers 11 as a display of Moses' appointing elders as judges to govern with him and judge disputes, imparting to them details and guidance of how to interpret the laws of God while carrying out their duties.[citation needed] The Oral Torah includes rules intended to prevent violations of the laws of the Torah and Talmud, sometimes referred to as "a fence around the Torah". For example, the written Torah prohibits certain types of travelling on the Sabbath; consequently, the Oral Torah prohibits walking great distances on the Sabbath to ensure that one does not accidentally engage in a type of travelling prohibited by the written Torah. Similarly, the written Torah prohibits plowing on the Sabbath; the Oral Torah prohibits carrying a stick on the Sabbath to ensure that one does not drag the stick and accidentally engage in prohibited plowing. Rabbinic literature As the rabbis were required to face a new reality, that of Judaism without a Temple (to serve as the location for sacrifice and study) and Judea without autonomy, there was a flurry of legal discourse, and the old system of oral scholarship could not be maintained. It is during this period that rabbinic discourse began to be recorded in writing. 
The theory that the destruction of the Temple and subsequent upheaval led to the committing of Oral Torah into writing was first explained in the Epistle of Sherira Gaon and often repeated. The Oral Torah was subsequently codified in the Mishnah and Gemara, and is interpreted in rabbinic literature detailing subsequent rabbinic decisions and writings. Rabbinic Jewish literature is predicated on the belief that the Torah cannot be properly understood without recourse to the Oral Torah. It states that many commandments and stipulations contained in the Written Torah would be difficult, if not impossible, to keep without the Oral Torah to define them. For example, the prohibition to do any "creative work" (melakha) on the Sabbath, which is given no definition in the Torah, is given a practical meaning in the Oral Torah, which provides definition of what constitutes melakha. Numerous examples exist of this general prohibitive language in the Torah (such as, "don't steal", without defining what is considered theft, or ownership and property laws), requiring—according to rabbinic thought—a subsequent definition through the Oral Torah. Thus Rabbinic Judaism claims that almost all directives, both positive and negative, in the Torah are non-specific in nature and require the existence of either an Oral Torah or some other method to explain them.[citation needed] Much rabbinic Jewish literature concerns specifying what behavior is sanctioned by the law; this body of interpretations is called halakha (the way). Originally, Jewish scholarship was oral. Rabbis expounded and debated the law (the written law expressed in the Hebrew Bible) and discussed the Tanakh without the benefit of written works (other than the biblical books themselves), though some may have made private notes (megillot setarim), for example of court decisions. This situation changed drastically, however, mainly as the result of the destruction of the Jewish commonwealth in the year 70 CE and the consequent upheaval of Jewish social and legal norms. As the rabbis were required to face a new reality—mainly Judaism without a Temple (to serve as the center of teaching and study) and Judea without autonomy—there was a flurry of legal discourse and the old system of oral scholarship could not be maintained. It is during this period that rabbinic discourse began to be recorded in writing. The earliest recorded oral law may have been of the midrashic form, in which halakhic discussion is structured as exegetical commentary on the Pentateuch (Torah). But an alternative form, organized by subject matter instead of by biblical verse, became dominant about the year 200 CE, when Rabbi Judah haNasi redacted the Mishnah (משנה). The Oral Law was far from monolithic; rather, it varied among various schools. The most famous two were the School of Shammai and the School of Hillel. In general, all valid[citation needed] opinions, even the non-normative ones, were recorded in the Talmud. The Talmud has two components: the Mishnah (c. 200 CE), the first written compendium of Judaism's Oral Law; and the Gemara (c. 500 CE), a discussion of the Mishnah and related Tannaitic writings that often ventures onto other subjects and expounds broadly on the Tanakh. The rabbis of the Mishnah are known as Tannaim (sing. Tanna תנא). The rabbis of the Gemara are referred to as Amoraim (sing. Amora אמורא). The Mishnah does not claim to be the development of new laws, but merely the collection of existing oral laws, traditions and traditional wisdom. 
The rabbis who contributed to the Mishnah are known as the Tannaim, of whom approximately 120 are known. The period during which the Mishnah was assembled spanned about 130 years, and five generations. Most of the Mishnah is related without attribution (stam). This usually indicates that many sages taught so, or that Judah haNasi, who redacted the Mishnah together with his academy/court, ruled so. The halakhic ruling usually follows that view. Sometimes, however, it appears to be the opinion of a single sage, and the view of the sages collectively (Hebrew: חכמים, hachamim) is given separately. The Talmud records a tradition that unattributed statements of the law represent the views of Rabbi Meir (Sanhedrin 86a), which supports the theory (recorded by Rav Sherira Gaon in his famous Iggeret) that he was the author of an earlier collection. For this reason, the few passages that actually say "this is the view of Rabbi Meir" represent cases where the author intended to present Rabbi Meir's view as a "minority opinion" not representing the accepted law. Judah haNasi is credited with publishing the Mishnah, although there have been a few edits since his time (for example, those passages that cite him or his grandson, Rabbi Yehuda Nesi'ah; in addition, the Mishnah at the end of Tractate Sotah refers to the period after Judah haNasi's death and so could not have been written by Judah haNasi himself). According to the Iggeret of Sherira Gaon, after the tremendous upheaval caused by the destruction of the Temple and the Bar Kokhba revolt, the Oral Torah was in danger of being forgotten. It was for this reason that Judah haNasi chose to redact the Mishnah. In addition to redacting the Mishnah, Judah haNasi and his court also ruled on which opinions should be followed, although the rulings do not always appear in the text. As he went through the tractates, the Mishnah was set forth, but throughout his life some parts were updated as new information came to light. Because of the proliferation of earlier versions, it was deemed too hard to retract anything already released, and therefore a second version of certain laws was released. The Talmud refers to these differing versions as Mishnah Rishonah ("First Mishnah") and Mishnah Acharonah ("Last Mishnah"). David Zvi Hoffmann suggests that Mishnah Rishonah actually refers to texts from earlier sages upon which Judah haNasi based his Mishnah. One theory is that the present Mishnah was based on an earlier collection by Rabbi Meir. There are also references to the "Mishnah of Rabbi Akiva", although this may simply mean his teachings in general. It is possible that Rabbi Akiva and Rabbi Meir established the divisions and order of subjects in the Mishnah, but this would make them the authors of a school curriculum rather than of a book. Authorities are divided on whether Judah haNasi recorded the Mishnah in writing or established it as an oral text for memorisation. The most important early account of its composition, the Iggeret of Rabbi Sherira Gaon, is ambiguous on the point, although the "Spanish" recension leans towards the theory that the Mishnah was written. The Gemara is the part of the Talmud that contains rabbinical commentaries and analysis of the Mishnah. In the three centuries following the redaction of the Mishnah by Judah ha-Nasi (c. 200 CE), rabbis throughout Palestine and Babylonia analyzed, debated and discussed that work. These discussions form the Gemara (גמרא). 
Gemara means "completion" (from the Hebrew gamar גמר: "to complete") or "learning" (from the Aramaic: "to study"). The Gemara mainly focuses on elucidating and elaborating the opinions of the Tannaim. The rabbis of the Gemara are known as Amoraim (sing. Amora אמורא). Much of the Gemara consists of legal analysis. The starting point for the analysis is usually a legal statement found in a Mishnah. The statement is then analyzed and compared with other statements, using the different approaches to biblical exegesis in rabbinic Judaism (or, more simply, the interpretation of text in Torah study), in exchanges between two (frequently anonymous and sometimes metaphorical) disputants, termed the makshan (questioner) and tartzan (answerer). Another important function of Gemara is to identify the correct biblical basis for a given law presented in the Mishnah and the logical process connecting one with the other: this activity was known as talmud long before the existence of the Talmud as a text. Orthodox Jewish view Orthodox Judaism sees itself as continuing organically from the religious and cultural heritage of the Israelites, stemming from the Law given to Moses at Sinai onwards. According to this view, while the title rabbi was not used earlier, Moses was the first rabbi (and is commonly referred to by Orthodox Jews as "Moses our Rabbi"), with the knowledge and laws received at Sinai being passed down from teachers to students through the era of the Judges, and the prophets (most of whom are seen as the "rabbis" of their time), through the sages of the late Second Temple period, and continuing until today. See also References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Middle_East#cite_note-18] | [TOKENS: 6152] |
Contents Middle East The Middle East[b] is a geopolitical region encompassing the Arabian Peninsula, Egypt, Iran, Iraq, the Levant, and Turkey. The term came into widespread usage by Western European nations in the early 20th century as a replacement of the term Near East (both were in contrast to the Far East). The term "Middle East" has led to some confusion over its changing definitions. Since the late 20th century, it has been criticized as being too Eurocentric. The region includes the vast majority of the territories included in the closely associated definition of West Asia, but without the South Caucasus. It also includes all of Egypt (not just the Sinai region) and all of Turkey (including East Thrace). Most Middle Eastern countries (13 out of 18) are part of the Arab world. The three most populous countries in the region are Egypt, Iran, and Turkey, while Saudi Arabia is the largest Middle Eastern country by area. The history of the Middle East dates back to ancient times, and it was long considered the "cradle of civilization". The geopolitical importance of the region has been recognized and competed for during millennia. The Abrahamic religions (Judaism, Christianity, and Islam) have their origins in the Middle East. Arabs constitute the main ethnic group in the region, followed by Turks, Persians, Kurds, Jews, and Assyrians. The Middle East generally has a hot, arid climate, especially in the Arabian and Egyptian regions. Several major rivers provide irrigation to support agriculture in limited areas here, such as the Nile Delta in Egypt, the Tigris and Euphrates watersheds of Mesopotamia, and the basin of the Jordan River that spans most of the Levant. These regions are collectively known as the Fertile Crescent, and comprise the core of what historians had long referred to as the cradle of civilization; multiple regions of the world have since been classified as also having developed independent, original civilizations. Conversely, the Levantine coast and most of Turkey have relatively temperate climates typical of the Mediterranean, with dry summers and cool, wet winters. Most of the countries that border the Persian Gulf have vast reserves of petroleum. Monarchs of the Arabian Peninsula in particular have benefitted economically from petroleum exports. Because of the arid climate and dependence on the fossil fuel industry, the Middle East is both a major contributor to climate change and a region that is expected to be severely adversely affected by it. Other concepts of the region exist, including the broader Middle East and North Africa (MENA), which includes states of the Maghreb and the Sudan. The term the "Greater Middle East" also includes Afghanistan, Mauritania, Pakistan, as well as parts of East Africa, and sometimes Central Asia and the South Caucasus. Terminology The term "Middle East" may have originated in the 1850s in the British India Office. However, it became more widely known when United States naval strategist Alfred Thayer Mahan used the term in 1902 to "designate the area between Arabia and India". During this time the British and Russian empires were vying for influence in Central Asia, a rivalry that would become known as the Great Game. Mahan realized not only the strategic importance of the region, but also of its center, the Persian Gulf. He labeled the area surrounding the Persian Gulf as the Middle East. 
He said that, beyond Egypt's Suez Canal, the Gulf was the most important passage for Britain to control in order to keep the Russians from advancing towards British India. Mahan first used the term in his article "The Persian Gulf and International Relations", published in September 1902 in the National Review, a British journal. The Middle East, if I may adopt a term which I have not seen, will some day need its Malta, as well as its Gibraltar; it does not follow that either will be in the Persian Gulf. Naval force has the quality of mobility which carries with it the privilege of temporary absences; but it needs to find on every scene of operation established bases of refit, of supply, and in case of disaster, of security. The British Navy should have the facility to concentrate in force if occasion arise, about Aden, India, and the Persian Gulf. Mahan's article was reprinted in The Times and followed in October by a 20-article series entitled "The Middle Eastern Question", written by Sir Ignatius Valentine Chirol. During this series, Sir Ignatius expanded the definition of Middle East to include "those regions of Asia which extend to the borders of India or command the approaches to India." After the series ended in 1903, The Times removed quotation marks from subsequent uses of the term. Until World War II, it was customary to refer to areas centered on Turkey and the eastern shore of the Mediterranean as the "Near East", while the "Far East" centered on China, India and Japan. The Middle East was then defined as the area from Mesopotamia to Burma; namely, the area between the Near East and the Far East. This area broadly corresponds to South Asia. In the late 1930s, the British established the Middle East Command, which was based in Cairo, for its military forces in the region. After that time, the term "Middle East" gained broader usage in Europe and the United States. Following World War II, for example, the Middle East Institute was founded in Washington, D.C. in 1946. The corresponding adjective is Middle Eastern and the derived noun is Middle Easterner. While non-Eurocentric terms such as "Southwest Asia" or "Swasia" have been sparsely used, the classification of the African country, Egypt, among those counted in the Middle East challenges the usefulness of using such terms. The description Middle has also led to some confusion over changing definitions. Before the First World War, "Near East" was used in English to refer to the Balkans and the Ottoman Empire, while "Middle East" referred to the Caucasus, Persia, and Arabian lands, and sometimes Afghanistan, India and others. In contrast, "Far East" referred to the countries of East Asia (e.g. China, Japan, and Korea). With the collapse of the Ottoman Empire in 1918, "Near East" largely fell out of common use in English, while "Middle East" came to be applied to the emerging independent countries of the Islamic world. However, the usage "Near East" was retained by a variety of academic disciplines, including archaeology and ancient history. In their usage, the term describes an area identical to the term Middle East, which is not used by these disciplines (see ancient Near East).[citation needed] The first official use of the term "Middle East" by the United States government was in the 1957 Eisenhower Doctrine, which pertained to the Suez Crisis. 
Secretary of State John Foster Dulles defined the Middle East as "the area lying between and including Libya on the west and Pakistan on the east, Syria and Iraq on the North and the Arabian peninsula to the south, plus the Sudan and Ethiopia." In 1958, the State Department explained that the terms "Near East" and "Middle East" were interchangeable, and defined the region as including only Egypt, Syria, Israel, Lebanon, Jordan, Iraq, Saudi Arabia, Kuwait, Bahrain, and Qatar. Since the late 20th century, scholars and journalists from the region, such as journalist Louay Khraish and historian Hassan Hanafi, have criticized the use of "Middle East" as a Eurocentric and colonialist term. The Associated Press Stylebook of 2004 says that Near East formerly referred to the countries farther to the west while Middle East referred to the eastern ones, but that now they are synonymous. It instructs: Use Middle East unless Near East is used by a source in a story; Mideast is also acceptable, but Middle East is preferred. European languages have adopted terms similar to Near East and Middle East. Since these are based on a relative description, the meanings depend on the country and are generally different from the English terms. In German the term Naher Osten (Near East) is still in common use (nowadays the term Mittlerer Osten is more and more common in press texts translated from English sources, albeit having a distinct meaning). In four Slavic languages (Russian Ближний Восток or Blizhniy Vostok, Bulgarian Близкия Изток, Polish Bliski Wschód, and Croatian Bliski istok), terms meaning Near East are the only appropriate ones for the region. However, some European languages do have "Middle East" equivalents, such as French Moyen-Orient, Swedish Mellanöstern, Spanish Oriente Medio or Medio Oriente, Greek Μέση Ανατολή (Mesi Anatoli), and Italian Medio Oriente.[c] Perhaps because of the political influence of the United States and Europe, and the prominence of the Western press, the Arabic equivalent of Middle East (Arabic: الشرق الأوسط ash-Sharq al-Awsaṭ) has become standard usage in the mainstream Arabic press. It carries the same meaning as the term "Middle East" in North American and Western European usage. The designation Mashriq, also from the Arabic root for East, denotes a variously defined region around the Levant, the eastern part of the Arabic-speaking world (as opposed to the Maghreb, the western part). Even though the term originated in the West, countries of the Middle East that use languages other than Arabic also use that term in translation. For instance, the Persian equivalent for Middle East is خاورمیانه (Khāvar-e miyāneh), the Hebrew is המזרח התיכון (hamizrach hatikhon), and the Turkish is Orta Doğu. Countries and territory Traditionally included within the Middle East are Arabia, Asia Minor, East Thrace, Egypt, Iran, the Levant, Mesopotamia, and the Socotra Archipelago. The region includes 17 UN-recognized countries and one British Overseas Territory. Various concepts are often paralleled to the Middle East, most notably the Near East, Fertile Crescent, and Levant. These are geographical concepts, which refer to large sections of the modern-day Middle East, with the Near East being the closest to the Middle East in its geographical meaning. Because it is primarily Arabic-speaking, the Maghreb region of North Africa is sometimes included. 
"Greater Middle East" is a political term coined by the second Bush administration in the first decade of the 21st century to denote various countries, pertaining to the Muslim world, specifically Afghanistan, Iran, Pakistan, and Turkey. Various Central Asian countries are sometimes also included. History The Middle East lies at the juncture of Africa and Eurasia and of the Indian Ocean and the Mediterranean Sea (see also: Indo-Mediterranean). It is the birthplace and spiritual center of religions such as Christianity, Islam, Judaism, Manichaeism, Yezidi, Druze, Yarsan, and Mandeanism, and in Iran, Mithraism, Zoroastrianism, Manicheanism, and the Baháʼí Faith. Throughout its history the Middle East has been a major center of world affairs; a strategically, economically, politically, culturally, and religiously sensitive area. The region is one of the regions where agriculture was independently discovered, and from the Middle East it was spread, during the Neolithic, to different regions of the world such as Europe, the Indus Valley and Eastern Africa. Prior to the formation of civilizations, advanced cultures formed all over the Middle East during the Stone Age. The search for agricultural lands by agriculturalists, and pastoral lands by herdsmen meant different migrations took place within the region and shaped its ethnic and demographic makeup. The Middle East is widely and most famously known as the cradle of civilization. The world's earliest civilizations, Mesopotamia (Sumer, Akkad, Assyria and Babylonia), ancient Egypt and Kish in the Levant, all originated in the Fertile Crescent and Nile Valley regions of the ancient Near East. These were followed by the Hittite, Greek, Hurrian and Urartian civilisations of Asia Minor; Elam, Persia and Median civilizations in Iran, as well as the civilizations of the Levant (such as Ebla, Mari, Nagar, Ugarit, Canaan, Aramea, Mitanni, Phoenicia and Israel) and the Arabian Peninsula (Magan, Sheba, Ubar). The Near East was first largely unified under the Neo Assyrian Empire, then the Achaemenid Empire followed later by the Macedonian Empire and after this to some degree by the Iranian empires (namely the Parthian and Sassanid Empires), the Roman Empire and Byzantine Empire. The region served as the intellectual and economic center of the Roman Empire and played an exceptionally important role due to its periphery on the Sassanid Empire. Thus, the Romans stationed up to five or six of their legions in the region for the sole purpose of defending it from Sassanid and Bedouin raids and invasions. From the 4th century CE onwards, the Middle East became the center of the two main powers at the time, the Byzantine Empire and the Sassanid Empire. However, it would be the later Islamic Caliphates of the Middle Ages, or Islamic Golden Age which began with the Islamic conquest of the region in the 7th century AD, that would first unify the entire Middle East as a distinct region and create the dominant Islamic Arab ethnic identity that largely (but not exclusively) persists today. The 4 caliphates that dominated the Middle East for more than 600 years were the Rashidun Caliphate, the Umayyad caliphate, the Abbasid caliphate and the Fatimid caliphate. Additionally, the Mongols would come to dominate the region, the Kingdom of Armenia would incorporate parts of the region to their domain, the Seljuks would rule the region and spread Turko-Persian culture, and the Franks would found the Crusader states that would stand for roughly two centuries. 
Josiah Russell estimates the population of what he calls "Islamic territory" as roughly 12.5 million in 1000 – Anatolia 8 million, Syria 2 million, and Egypt 1.5 million. From the 16th century onward, the Middle East came to be dominated, once again, by two main powers: the Ottoman Empire and the Safavid dynasty. The modern Middle East began after World War I, when the Ottoman Empire, which was allied with the Central Powers, was defeated by the Allies and partitioned into a number of separate nations, initially under British and French Mandates. Other defining events in this transformation included the establishment of Israel in 1948 and the eventual departure of European powers, notably Britain and France by the end of the 1960s. They were supplanted in some part by the rising influence of the United States from the 1970s onwards. In the 20th century, the region's significant stocks of crude oil gave it new strategic and economic importance. Mass production of oil began around 1945, with Saudi Arabia, Iran, Kuwait, Iraq, and the United Arab Emirates having large quantities of oil. Estimated oil reserves, especially in Saudi Arabia and Iran, are some of the highest in the world, and the international oil cartel OPEC is dominated by Middle Eastern countries. During the Cold War, the Middle East was a theater of ideological struggle between the two superpowers and their allies: NATO and the United States on one side, and the Soviet Union and Warsaw Pact on the other, as they competed to influence regional allies. Besides the political reasons there was also the "ideological conflict" between the two systems. Moreover, as Louise Fawcett argues, among many important areas of contention, or perhaps more accurately of anxiety, were, first, the desires of the superpowers to gain strategic advantage in the region, second, the fact that the region contained some two-thirds of the world's oil reserves in a context where oil was becoming increasingly vital to the economy of the Western world [...] Within this contextual framework, the United States sought to divert the Arab world from Soviet influence. Throughout the 20th and 21st centuries, the region has experienced both periods of relative peace and tolerance and periods of conflict particularly between Sunnis and Shiites. Geography In 2018, the MENA region emitted 3.2 billion tonnes of carbon dioxide and produced 8.7% of global greenhouse gas emissions (GHG) despite making up only 6% of the global population. These emissions are mostly from the energy sector, an integral component of many Middle Eastern and North African economies due to the extensive oil and natural gas reserves that are found within the region. The Middle East region is one of the most vulnerable to climate change. The impacts include increase in drought conditions, aridity, heatwaves and sea level rise. Sharp global temperature and sea level changes, shifting precipitation patterns and increased frequency of extreme weather events are some of the main impacts of climate change as identified by the Intergovernmental Panel on Climate Change (IPCC). The MENA region is especially vulnerable to such impacts due to its arid and semi-arid environment, facing climatic challenges such as low rainfall, high temperatures and dry soil. The climatic conditions that foster such challenges for MENA are projected by the IPCC to worsen throughout the 21st century. 
If greenhouse gas emissions are not significantly reduced, part of the MENA region risks becoming uninhabitable before the year 2100. Climate change is expected to put significant strain on already scarce water and agricultural resources within the MENA region, threatening the national security and political stability of all included countries. Over 60 percent of the region's population lives in high and very high water-stressed areas compared to the global average of 35 percent. This has prompted some MENA countries to engage with the issue of climate change on an international level through environmental accords such as the Paris Agreement. Law and policy are also being established on a national level amongst MENA countries, with a focus on the development of renewable energies. Economy Middle Eastern economies range from being very poor (such as Gaza and Yemen) to extremely wealthy nations (such as Qatar and UAE). According to the International Monetary Fund, the three largest Middle Eastern economies in nominal GDP in 2023 were Saudi Arabia ($1.06 trillion), Turkey ($1.03 trillion), and Israel ($0.54 trillion). For nominal GDP per person, the highest ranking countries are Qatar ($83,891), Israel ($55,535), the United Arab Emirates ($49,451) and Cyprus ($33,807). Turkey ($3.6 trillion), Saudi Arabia ($2.3 trillion), and Iran ($1.7 trillion) had the largest economies in terms of GDP PPP. For GDP PPP per person, the highest-ranking countries are Qatar ($124,834), the United Arab Emirates ($88,221), Saudi Arabia ($64,836), Bahrain ($60,596) and Israel ($54,997). The lowest-ranking country in the Middle East, in terms of GDP nominal per capita, is Yemen ($573). The economic structure of Middle Eastern nations are different because while some are heavily dependent on export of only oil and oil-related products (Saudi Arabia, the UAE and Kuwait), others have a highly diverse economic base (such as Cyprus, Israel, Turkey and Egypt). Industries of the Middle Eastern region include oil and oil-related products, agriculture, cotton, cattle, dairy, textiles, leather products, surgical instruments, defence equipment (guns, ammunition, tanks, submarines, fighter jets, UAVs, and missiles). Banking is an important sector, especially for UAE and Bahrain. With the exception of Cyprus, Turkey, Egypt, Lebanon and Israel, tourism has been a relatively undeveloped area of the economy, in part because of the socially conservative nature of the region as well as political turmoil in certain regions. Since the end of the COVID pandemic however, countries such as the UAE, Bahrain, and Jordan have begun attracting greater numbers of tourists because of improving tourist facilities and the relaxing of tourism-related restrictive policies. Unemployment is high in the Middle East and North Africa region, particularly among people aged 15–29, a demographic representing 30% of the region's population. The total regional unemployment rate in 2025 is 10.8%, and among youth is as high as 28%. Demographics Arabs constitute the largest ethnic group in the Middle East, followed by various Iranian peoples and then by Turkic peoples (Turkish, Azeris, Syrian Turkmen, and Iraqi Turkmen). Native ethnic groups of the region include, in addition to Arabs, Arameans, Assyrians, Baloch, Berbers, Copts, Druze, Greek Cypriots, Jews, Kurds, Lurs, Mandaeans, Persians, Samaritans, Shabaks, Tats, and Zazas. 
European ethnic groups that form a diaspora in the region include Albanians, Bosniaks, Circassians (including Kabardians), Crimean Tatars, Greeks, Franco-Levantines, Italo-Levantines, and Iraqi Turkmens. Among other migrant populations are Chinese, Filipinos, Indians, Indonesians, Pakistanis, Pashtuns, Romani, and Afro-Arabs. "Migration has always provided an important vent for labor market pressures in the Middle East. For the period between the 1970s and 1990s, the Arab states of the Persian Gulf in particular provided a rich source of employment for workers from Egypt, Yemen and the countries of the Levant, while Europe had attracted young workers from North African countries due both to proximity and the legacy of colonial ties between France and the majority of North African states." According to the International Organization for Migration, there are 13 million first-generation migrants from Arab nations in the world, of whom 5.8 million reside in other Arab countries. Expatriates from Arab countries contribute to the circulation of financial and human capital in the region and thus significantly promote regional development. In 2009, Arab countries received a total of US$35.1 billion in remittance inflows, and remittances sent to Jordan, Egypt and Lebanon from other Arab countries were 40 to 190 per cent higher than trade revenues between these and other Arab countries. In Somalia, the Somali Civil War has greatly increased the size of the Somali diaspora, as many of the best educated Somalis left for Middle Eastern countries as well as Europe and North America. Non-Arab Middle Eastern countries such as Turkey, Israel and Iran are also subject to important migration dynamics. A fair proportion of those migrating from Arab nations are from ethnic and religious minorities facing persecution and are not necessarily ethnic Arabs, Iranians or Turks.[citation needed] Large numbers of Kurds, Jews, Assyrians, Greeks and Armenians as well as many Mandaeans have left nations such as Iraq, Iran, Syria and Turkey for these reasons during the last century. In Iran, many religious minorities such as Christians, Baháʼís, Jews and Zoroastrians have left since the Islamic Revolution of 1979. The Middle East is very diverse when it comes to religions, many of which originated there. Islam is the largest religion in the Middle East, but other faiths that originated there, such as Judaism and Christianity, are also well represented. Christian communities have played a vital role in the Middle East, and they represent 78% of the population of Cyprus and 40.5% of the population of Lebanon, where the Lebanese president, half of the cabinet, and half of the parliament follow one of the various Lebanese Christian rites. There are also important minority religions like the Baháʼí Faith, Yarsanism, Yazidism, Zoroastrianism, Mandaeism, Druze, and Shabakism, and in ancient times the region was home to Mesopotamian religions, Canaanite religions, Manichaeism, Mithraism and various monotheist gnostic sects. The six top languages, in terms of numbers of speakers, are Arabic, Persian, Turkish, Kurdish, Modern Hebrew and Greek. About 20 minority languages are also spoken in the Middle East. Arabic, with all its dialects, is the most widely spoken language in the Middle East, with Literary Arabic being official in all North African and in most West Asian countries. Arabic dialects are also spoken in some adjacent areas in neighbouring Middle Eastern non-Arab countries. It is a member of the Semitic branch of the Afro-Asiatic languages. 
Several Modern South Arabian languages such as Mehri and Soqotri are also spoken in Yemen and Oman. Another Semitic language is Aramaic, whose dialects are spoken mainly by Assyrians and Mandaeans, with Western Aramaic still spoken in two villages near Damascus, Syria. There is also an Oasis Berber-speaking community in Egypt, where the language is also known as Siwa; it is a non-Semitic Afro-Asiatic sister language. Persian is the second most spoken language. While it is primarily spoken in Iran and some border areas in neighbouring countries, the country is one of the region's largest and most populous. It belongs to the Indo-Iranian branch of the family of Indo-European languages. Other Western Iranic languages spoken in the region include Achomi, Daylami, Kurdish dialects, Semnani and Lurish, amongst many others. The close third-most widely spoken language, Turkish, is largely confined to Turkey, which is also one of the region's largest and most populous countries, but it is present in areas in neighboring countries. It is a member of the Turkic languages, which have their origins in East Asia. Another Turkic language, Azerbaijani, is spoken by Azerbaijanis in Iran. The fourth-most widely spoken language, Kurdish, is spoken in Iran, Iraq, Syria and Turkey; Sorani Kurdish is the second official language in Iraq (instated after the 2005 constitution) after Arabic. Hebrew is the official language of Israel, with Arabic given a special status after the 2018 Basic Law lowered it from the official status it had held before 2018. Hebrew is spoken and used by over 80% of Israel's population, the other 20% using Arabic. Modern Hebrew only began being spoken in the 20th century, after being revived in the late 19th century by Eliezer Ben-Yehuda (Eliezer Perlman) and European Jewish settlers, with the first native Hebrew speaker being born in 1882. Greek is one of the two official languages of Cyprus, and the country's main language. Small communities of Greek speakers exist all around the Middle East; until the 20th century it was also widely spoken in Asia Minor (being the second most spoken language there, after Turkish) and Egypt. In antiquity, Ancient Greek was the lingua franca for many areas of the western Middle East, and until the Muslim expansion it was widely spoken there as well. Until the late 11th century, it was also the main spoken language in Asia Minor; after that it was gradually replaced by the Turkish language as the Anatolian Turks expanded and the local Greeks were assimilated, especially in the interior. English is one of the official languages of Akrotiri and Dhekelia. It is also commonly taught and used as a foreign second language in countries such as Egypt, Jordan, Iran, Iraq, Qatar, Bahrain, the United Arab Emirates and Kuwait. It is also a main language in some emirates of the United Arab Emirates, and it is spoken as a native language by Jewish immigrants from Anglophone countries (UK, US, Australia) in Israel and widely understood as a second language there. French is taught and used in many government facilities and media in Lebanon, and is taught in some primary and secondary schools of Egypt and Syria. Maltese, a Semitic language mainly spoken in Europe, is used by the Franco-Maltese diaspora in Egypt. Due to widespread immigration of French Jews to Israel, French is the native language of approximately 200,000 Jews in Israel. Armenian speakers are also to be found in the region. Georgian is spoken by the Georgian diaspora. 
Russian is spoken by a large portion of the Israeli population, because of emigration from the former Soviet Union in the 1990s. Russian today is a popular unofficial language in use in Israel; news outlets, radio broadcasts and signboards in Russian can be found around the country, after Hebrew and Arabic. Circassian is also spoken by the diaspora in the region and by almost all Circassians in Israel, who speak Hebrew and English as well. The largest Romanian-speaking community in the Middle East is found in Israel, where as of 1995 Romanian was spoken by 5% of the population.[d] Bengali, Hindi and Urdu are widely spoken by migrant communities in many Middle Eastern countries, such as Saudi Arabia (where 20–25% of the population is South Asian), the United Arab Emirates (where 50–55% of the population is South Asian), and Qatar, which have large numbers of Pakistani, Bangladeshi and Indian immigrants. Culture The Middle East has recently become more prominent in hosting global sporting events due to its wealth and desire to diversify its economy. The South Asian diaspora is a major backer of cricket in the region. See also Notes References Further reading External links 29°N 41°E |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Differential_analyser] | [TOKENS: 1433] |
Contents Differential analyser The differential analyser is a mechanical analogue computer designed to solve differential equations by integration, using wheel-and-disc mechanisms to perform the integration. It was one of the first advanced computing devices to be used operationally. In addition to the integrator devices, the machine used an epicyclic differential mechanism to perform addition or subtraction - similar to that used on a front-wheel drive car, where the speed of the two output shafts (driving the wheels) may differ but the speeds add up to the speed of the input shaft. Multiplication/division by integer values was achieved by simple gear ratios; multiplication by fractional values was achieved by means of a multiplier table, where a human operator would have to keep a stylus tracking the slope of a bar. A variant of this human-operated table was used to implement other functions such as polynomials. History Research on solutions for differential equations using mechanical devices, discounting planimeters, started at least as early as 1836, when the French physicist Gaspard-Gustave Coriolis designed a mechanical device to integrate differential equations of the first order. The first description of a device which could integrate differential equations of any order was published in 1876 by James Thomson, who was born in Belfast in 1822, but lived in Scotland from the age of 10. Though Thomson called his device an "integrating machine", it is his description of the device, together with the additional publication in 1876 of two further descriptions by his younger brother, Lord Kelvin, which represents the invention of the differential analyser. One of the earliest practical uses of Thomson's concepts was a tide-predicting machine built by Kelvin starting in 1872–3. On Lord Kelvin's advice, Thomson's integrating machine was later incorporated into a fire-control system for naval gunnery being developed by Arthur Pollen, resulting in an electrically driven, mechanical analogue computer, which was completed by about 1912. Italian mathematician Ernesto Pascal also developed integraphs for the mechanical integration of differential equations and published details in 1914. However, the first widely practical general-purpose differential analyser was constructed by Harold Locke Hazen and Vannevar Bush at MIT, 1928–1931, comprising six mechanical integrators. In the same year, Bush described this machine in a journal article as a "continuous integraph". When he published a further article on the device in 1931, he called it a "differential analyzer". In this article, Bush stated that "[the] present device incorporates the same basic idea of interconnection of integrating units as did [Lord Kelvin's]. In detail, however, there is little resemblance to the earlier model." According to his 1970 autobiography, Bush was "unaware of Kelvin’s work until after the first differential analyzer was operational." Claude Shannon was hired as a research assistant in 1936 to run the differential analyser in Bush's lab. Douglas Hartree of Manchester University brought Bush's design to England, where he constructed his first "proof of concept" model with his student, Arthur Porter, during 1934. As a result of this, the university acquired a full-scale machine incorporating four mechanical integrators in March 1935, which was built by Metropolitan-Vickers, and was, according to Hartree, "[the] first machine of its kind in operation outside the United States". 
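The integrator-interconnection idea described above can be illustrated with a minimal numerical sketch, assuming nothing beyond standard Python; it is a loose software analogy, not a description of any actual machine. Two chained software "integrators", fed back through a sign inversion that stands in for the gear linkage, are wired the way an operator would set up a differential analyser to solve y'' = -y. The function name, step size, and printed checkpoints are illustrative choices.

# Hypothetical sketch (not from the article): solve y'' = -y by chaining
# two integrators. Each integrator stands in for a wheel-and-disc unit that
# accumulates its input multiplied by the increment of the independent
# variable; the sign inversion plays the role of the gearing that feeds the
# output shaft back into the first integrator's input.

def run_analyser(y0=1.0, v0=0.0, dt=1e-3, t_end=10.0):
    """Integrate y'' = -y with two chained integrators (semi-implicit Euler)."""
    t, y, v = 0.0, y0, v0          # "shaft positions": time, displacement, velocity
    trace = []
    while t < t_end:
        a = -y                     # gear linkage: acceleration shaft driven by -y
        v += a * dt                # first integrator: velocity accumulates acceleration
        y += v * dt                # second integrator: displacement accumulates velocity
        t += dt
        trace.append((t, y))
    return trace

if __name__ == "__main__":
    # The exact solution is y = cos(t); print a few checkpoints to compare.
    for t, y in run_analyser()[::2500]:
        print(f"t = {t:5.2f}  y = {y: .4f}")

The fixed-step update here is only a crude stand-in for the continuous rotation of the machine's shafts; a real analyser integrated continuously, with accuracy limited by mechanical slip and setup rather than by a step size.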
During the next five years three more were added, at Cambridge University, Queen's University Belfast, and the Royal Aircraft Establishment in Farnborough. One of the integrators from this proof of concept is on display in the History of Computing section of the Science Museum in London, alongside a complete Manchester machine. In Norway, the locally built Oslo Analyser was finished during 1938, based on the same principles as the MIT machine. This machine had 12 integrators, and was the largest analyser built for a period of four years. In the United States, further differential analysers were built at the Ballistic Research Laboratory in Maryland and in the basement of the Moore School of Electrical Engineering at the University of Pennsylvania during the early 1940s. The latter was used extensively in the computation of artillery firing tables prior to the invention of the ENIAC, which, in many ways, was modelled on the differential analyser. Also in the early 1940s, with Samuel H. Caldwell, one of the initial contributors during the early 1930s, Bush attempted an electrical, rather than mechanical, variation, but the digital computer built elsewhere had much greater promise and the project ceased. In 1947, UCLA installed a differential analyser built for them by General Electric at a cost of $125,000. By 1950, this machine had been joined by three more. The UCLA differential analyser appeared in 1950's Destination Moon, and the same footage in 1951's When Worlds Collide, where it was called "DA". A different shot appears in 1956's Earth vs. the Flying Saucers. At Osaka Imperial University (present-day Osaka University) around 1944, a complete differential analyser machine was developed (illustrated) to calculate the movement of an object and other problems with mechanical components, and then draws graphs on paper with a pen. It was later transferred to the Tokyo University of Science and has been displayed at the school's Museum of Science in Shinjuku Ward. Restored in 2014, it is one of only two still operational differential analysers produced before the end of World War II. In Canada, a differential analyser was constructed at the University of Toronto in 1948 by Beatrice Helen Worsley, but it appears to have had little or no use. A differential analyser may have been used in the development of the bouncing bomb, used to attack German hydroelectric dams during World War II. Differential analysers have also been used in the calculation of soil erosion by river control authorities. The differential analyser was eventually rendered obsolete by electronic analogue computers and, later, digital computers. Use of Meccano The model differential analyser built at Manchester University in 1934 by Douglas Hartree and Arthur Porter made extensive use of Meccano parts: this meant that the machine was less costly to build, and it proved "accurate enough for the solution of many scientific problems". A similar machine built by J.B. Bratt at Cambridge University in 1935 is now in the Museum of Transport and Technology (MOTAT) collection in Auckland, New Zealand. A memorandum written for the British military's Armament Research Department in 1944 describes how this machine had been modified during World War II for improved reliability and enhanced capability, and identifies its wartime applications as including research on the flow of heat, explosive detonations, and simulations of transmission lines. 
It has been estimated by Garry Tee that "about 15 Meccano model Differential Analysers were built for serious work by scientists and researchers around the world". See also Notes Bibliography External links |
======================================== |