https://en.wikipedia.org/wiki/Fast%20combat%20support%20ship
Fast combat support ship
The fast combat support ship (US Navy hull classification symbol: AOE) is the United States Navy's largest combat logistics ship, designed as a combined oiler, ammunition and supply ship. All fast combat support ships currently in service are operated by Military Sealift Command. They can carry more than 177,000 barrels of oil, 2,150 tons of ammunition, 500 tons of dry stores and 250 tons of refrigerated stores. The ship receives petroleum products, ammunition and stores from various shuttle ships and redistributes these items as needed to ships in the carrier battle group, greatly reducing the number of service ships that must travel with the group.

The four ships of the Sacramento class displaced 53,000 tons at full load, were 796 feet in overall length, and carried two Boeing Vertol CH-46 Sea Knight helicopters; the class was retired in 2005. The ships of the succeeding class displace 48,800 tons at full load and carry two Boeing Vertol CH-46 Sea Knight helicopters or two Sikorsky MH-60S Knighthawk helicopters. Air defense includes the Sea Sparrow surface-to-air missile in eight-cell launchers, providing point defense at ranges of 15 km to 25 km, two Phalanx Mk 15 20 mm Gatling-gun close-in weapon systems (CIWS), and two 25 mm Raytheon Mk 88 guns. China has developed the Type 901 fast combat support ship, which serves a similar mission in its navy.

List of fast combat support ships:
(AOE-1) USS Sacramento
(AOE-2) USS Camden
(AOE-3) USS Seattle
(AOE-4) USS Detroit
(AOE-6) USS Supply (first of the new class)
(AOE-7) USS Rainier
(AOE-8) USS Arctic
(AOE-10) USS Bridge

See also: List of Military Sealift Command ships
https://en.wikipedia.org/wiki/FASA
FASA
FASA Corporation was an American publisher of role-playing games, wargames and board games between 1980 and 2001, after which they closed publishing operations for several years, becoming an IP holding company under the name FASA Inc. In 2012, a wholly owned subsidiary called FASA Games Inc. went into operation, using the name and logo under license from the parent company. FASA Games Inc. works alongside Ral Partha Europe, also a subsidiary of FASA Corporation, to bring out new editions of existing properties such as Earthdawn and Demonworld, and to develop new properties within the FASA cosmology. FASA first appeared as a Traveller licensee, producing supplements for that Game Designers' Workshop role-playing game, especially the work of the Keith Brothers. The company went on to establish itself as a major gaming company with the publication of the first licensed Star Trek RPG, then several successful original games. Noteworthy lines included BattleTech and Shadowrun. Their Star Trek role-playing supplements and tactical ship game enjoyed popularity outside the wargaming community since, at the time, official descriptions of the Star Trek universe were not common, and the gaming supplements offered details fans craved. The highly successful BattleTech line led to a series of video games, some of the first virtual reality gaming suites, called Virtual World (created by a subdivision of the company known at the time of development as ESP, an acronym for "Extremely Secret Project") and a Saturday-morning animated TV series. Originally the name FASA was an acronym for "Freedonian Aeronautics and Space Administration", a joking allusion to the Marx Brothers film Duck Soup. This tongue-in-cheek attitude was carried over in humorous self-references in its games. For example, in Shadowrun, a tactical nuclear device was detonated near FASA's offices at 1026 W. Van Buren St in Chicago, Illinois. History FASA Corporation was founded by Jordan Weisman and L. 
Ross Babcock III in 1980 with a starting capital of $350 ($1,200 adjusted for inflation). The two were fellow gamers at the United States Merchant Marine Academy. Mort Weisman, Jordan's father, joined the company in 1985 to lead its operational management, having sold his book publishing business, Swallow Press. Under the new commercial direction, and with Mort's capital injection, the company diversified into books and miniature figures. After consulting their UK distributor, Chart Hobby Distributors, FASA licensed the manufacture of its BattleTech figurines to Miniature Figurines (also known as Minifigs). FASA would later acquire Ral Partha, the U.S. manufacturer of Minifigs figures. While Mort ran the paper- and metal-based sides of the business, the company's founders focused on the development of computer-based games. They were especially interested in virtual reality (particularly the BattleTech Centers / Virtual World) but also developed desktop computer games. When Microsoft acquired the FASA Interactive subsidiary, Babcock went with that company. After the sale of Virtual World, Jordan turned his attention to founding a new games venture called WizKids. 
Unwilling to wrestle with the complexities of dividing up the going concern, the owners issued a press release on January 25, 2001, announcing the immediate closure of the business. The BattleTech and Shadowrun properties were sold to WizKids, who in turn licensed their publication to FanPro LLC and then to Catalyst Game Labs. The Earthdawn license was sold to WizKids, and then back to FASA. Living Room Games published Earthdawn (Second Edition) and RedBrick published Earthdawn (Classic and Third Editions), but the license has since returned to FASA Corporation, and FASA Games, Inc. is the current license holder for new material. Crimson Skies was originally developed by Zipper Interactive under the FASA Interactive brand in late 2000 and used under license by FASA; FASA Interactive had been purchased by Microsoft, so rights to Crimson Skies stayed with Microsoft. Rights to the miniatures game VOR: The Maelstrom reverted to the designer Mike "Skuzzy" Nielsen, but it has not been republished in any form, due partly to legal difficulties. Microsoft officially closed the FASA team in the company's gaming division on September 12, 2007. On December 6, 2007, FASA founder Jordan Weisman announced that his new venture, Smith & Tinker, had licensed the electronic gaming rights to MechWarrior, Shadowrun, and Crimson Skies from Microsoft. On April 28, 2008, Mike "Skuzzy" Nielsen announced plans to create Vor 2.0. At Gen Con 2012, FASA Games, Inc. was revealed, with FASA Corporation co-founder Ross Babcock on its board of directors. While FASA Corporation still owns and manages the FASA IP and brands, FASA Games, Inc. would release new games and content. As of 2020, FASA Games has released content for two games: a fourth edition of Earthdawn and the new game 1879, which creates an alternate-future "6th Age" as a successor to Shadowrun. 
Notable games

Role-playing games:
Star Trek: The Role Playing Game (1982)
Star Trek: Starship Tactical Combat Simulator
Doctor Who (1985)
MechWarrior (1986)
Shadowrun (1989)
Legionnaire (1990)
Earthdawn (1993)

Board games:
BattleTech (released in 1984 as BattleDroids, retitled BattleTech in 1985)
Renegade Legion (1989)
Crimson Skies (1998)

Miniature games:
VOR: The Maelstrom (1999)
Demonworld (second edition 2011, with miniatures by Ral Partha Europe; the first edition was released in 1999 by Hobby Products)

Video games: see FASA Studio
https://en.wikipedia.org/wiki/McDonnell%20Douglas%20F-4%20Phantom%20II
McDonnell Douglas F-4 Phantom II
The McDonnell Douglas F-4 Phantom II is an American tandem two-seat, twin-engine, all-weather, long-range supersonic jet interceptor and fighter-bomber originally developed by McDonnell Aircraft for the United States Navy. Proving highly adaptable, it first entered service with the Navy in 1961 before it was adopted by the United States Marine Corps and the United States Air Force, and by the mid-1960s it had become a major part of their air arms. Phantom production ran from 1958 to 1981 with a total of 5,195 aircraft built, making it the most produced American supersonic military aircraft in history, and cementing its position as an iconic combat aircraft of the Cold War. The Phantom is a large fighter with a top speed of over Mach 2.2. It can carry more than 18,000 pounds (8,400 kg) of weapons on nine external hardpoints, including air-to-air missiles, air-to-ground missiles, and various bombs. The F-4, like other interceptors of its time, was initially designed without an internal cannon. Later models incorporated an M61 Vulcan rotary cannon. Beginning in 1959, it set 15 world records for in-flight performance, including an absolute speed record and an absolute altitude record. The F-4 was used extensively during the Vietnam War. It served as the principal air superiority fighter for the U.S. Air Force, Navy, and Marine Corps and became important in the ground-attack and aerial reconnaissance roles late in the war. During the Vietnam War, one U.S. Air Force pilot, two weapon systems officers (WSOs), one U.S. Navy pilot and one radar intercept officer (RIO) became aces by achieving five aerial kills against enemy fighter aircraft. The F-4 continued to form a major part of U.S. military air power throughout the 1970s and 1980s, being gradually replaced by more modern aircraft such as the F-15 Eagle and F-16 Fighting Falcon in the U.S. Air Force, the F-14 Tomcat in the U.S. Navy, and the F/A-18 Hornet in the U.S. Navy and U.S. Marine Corps. 
The F-4 Phantom II remained in use by the U.S. in the reconnaissance and Wild Weasel (Suppression of Enemy Air Defenses) roles in the 1991 Gulf War, finally leaving service in 1996. It was also the only aircraft used by both U.S. flight demonstration teams: the United States Air Force Thunderbirds (F-4E) and the United States Navy Blue Angels (F-4J). The F-4 was also operated by the armed forces of 11 other nations. Israeli Phantoms saw extensive combat in several Arab–Israeli conflicts, while Iran used its large fleet of Phantoms, acquired before the fall of the Shah, in the Iran–Iraq War. As of 2021, 63 years after its first flight, the F-4 remains in active service with the air forces of Iran, South Korea, Greece and Turkey. The aircraft has most recently been in service against the Islamic State group in the Middle East. Development Origins In 1952, McDonnell's Chief of Aerodynamics, Dave Lewis, was appointed by CEO Jim McDonnell to be the company's preliminary design manager. With no new aircraft competitions on the horizon, internal studies concluded the Navy had the greatest need for a new and different aircraft type: an attack fighter. In 1953, McDonnell Aircraft began work on revising its F3H Demon naval fighter, seeking expanded capabilities and better performance. The company developed several projects, including a variant powered by a Wright J67 engine, and variants powered by two Wright J65 engines, or two General Electric J79 engines. The J79-powered version promised a top speed of Mach 1.97. On 19 September 1953, McDonnell approached the United States Navy with a proposal for the "Super Demon". Uniquely, the aircraft was to be modular, as it could be fitted with one- or two-seat noses for different missions, with different nose cones to accommodate radar, photo cameras, four 20 mm (.79 in) cannon, or 56 FFAR unguided rockets in addition to the nine hardpoints under the wings and the fuselage. 
The Navy was sufficiently interested to order a full-scale mock-up of the F3H-G/H, but felt that the upcoming Grumman XF9F-9 and Vought XF8U-1 already satisfied the need for a supersonic fighter. The McDonnell design was therefore reworked into an all-weather fighter-bomber with 11 external hardpoints for weapons and on 18 October 1954, the company received a letter of intent for two YAH-1 prototypes. Then on 26 May 1955, four Navy officers arrived at the McDonnell offices and, within an hour, presented the company with an entirely new set of requirements. Because the Navy already had the Douglas A-4 Skyhawk for ground attack and F-8 Crusader for dogfighting, the project now had to fulfill the need for an all-weather fleet defense interceptor. A second crewman was added to operate the powerful radar; designers believed that air combat in the next war would overload solo pilots with information. XF4H-1 prototype The XF4H-1 was designed to carry four semi-recessed AAM-N-6 Sparrow III radar-guided missiles, and to be powered by two J79-GE-8 engines. As in the McDonnell F-101 Voodoo, the engines sat low in the fuselage to maximize internal fuel capacity and ingested air through fixed geometry intakes. The thin-section wing had a leading edge sweep of 45° and was equipped with blown flaps for better low-speed handling. Wind tunnel testing had revealed lateral instability, requiring the addition of 5° dihedral to the wings. To avoid redesigning the titanium central section of the aircraft, McDonnell engineers angled up only the outer portions of the wings by 12°, which averaged to the required 5° over the entire wingspan. The wings also received the distinctive "dogtooth" for improved control at high angles of attack. The all-moving tailplane was given 23° of anhedral to improve control at high angles of attack, while still keeping the tailplane clear of the engine exhaust. 
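The outer-panel workaround described above is just a span-weighted average: 12° applied over part of the semispan yields the required 5° overall. A minimal sketch of that arithmetic; note the outer-panel fraction of the semispan is inferred here from the two angles in the text, not a figure given in the source:

```python
# Effective (span-averaged) dihedral when only the outer wing panels are
# angled up and the inboard section stays flat. The 5/12 outer-panel
# fraction is inferred from the angles in the text, not a published figure.

def average_dihedral(outer_angle_deg: float, outer_fraction: float) -> float:
    """Span-weighted average dihedral; the flat inboard section contributes 0 deg."""
    return outer_angle_deg * outer_fraction

# 12 deg on the outer panels must average to 5 deg over the whole span,
# so the panels must cover 5/12 (~42%) of the semispan.
outer_fraction = 5.0 / 12.0
print(round(average_dihedral(12.0, outer_fraction), 3))  # 5.0
```

The same weighted-average logic explains why angling only the outer panels, rather than the whole wing, could hit the wind-tunnel-derived 5° target without touching the titanium center section.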
In addition, air intakes were equipped with one fixed ramp and one variable geometry ramp with angle scheduled to give maximum pressure recovery between Mach 1.4 and Mach 2.2. Airflow matching between the inlet and engine was achieved by bypassing the engine as secondary air into the exhaust nozzle. All-weather intercept capability was achieved with the AN/APQ-50 radar. To meet requirements for carrier operations, the landing gear was designed to withstand landings with a maximum sink rate of , while the nose strut could extend by to increase angle of attack on the catapult portion of a takeoff. On 25 July 1955, the Navy ordered two XF4H-1 test aircraft and five YF4H-1 pre-production examples. The Phantom made its maiden flight on 27 May 1958 with Robert C. Little at the controls. A hydraulic problem precluded retraction of the landing gear, but subsequent flights went more smoothly. Early testing resulted in redesign of the air intakes, including the distinctive addition of 12,500 holes to "bleed off" the slow-moving boundary layer air from the surface of each intake ramp. Series production aircraft also featured splitter plates to divert the boundary layer away from the engine intakes. The aircraft was soon in competition with the XF8U-3 Crusader III. Due to cockpit workload, the Navy wanted a two-seat aircraft, and on 17 December 1958 the F4H was declared the winner. Delays with the J79-GE-8 engines meant that the first production aircraft were fitted with J79-GE-2 and −2A engines, each having 16,100 lbf (71.62 kN) of afterburning thrust. In 1959, the Phantom began carrier suitability trials with the first complete launch-recovery cycle performed on 15 February 1960 from . There were proposals to name the F4H "Satan" and "Mithras". In the end, the aircraft was given the less controversial name "Phantom II", the first "Phantom" being another McDonnell jet fighter, the FH-1 Phantom. 
The Phantom II was briefly given the designation F-110A and named "Spectre" by the USAF, but these were not officially used, and the Tri-Service aircraft designation system was adopted in September 1962. Production Early in production, the radar was upgraded to the Westinghouse AN/APQ-72, an AN/APQ-50 with a larger radar antenna, necessitating the bulbous nose, and the canopy was reworked to improve visibility and make the rear cockpit less claustrophobic. During its career the Phantom underwent many changes, with numerous variants developed. The USN operated the F4H-1 (re-designated F-4A in 1962) with J79-GE-2 and -2A engines of 16,100 lbf (71.62 kN) thrust, with later builds receiving -8 engines. A total of 45 F-4As were built; none saw combat, and most ended up as test or training aircraft. The USN and USMC received the first definitive Phantom, the F-4B, which was equipped with the Westinghouse APQ-72 radar (pulse only), a Texas Instruments AAA-4 infrared search-and-track pod under the nose, and an AN/AJB-3 bombing system, and was powered by J79-GE-8, -8A and -8B engines of 10,900 lbf (48.5 kN) dry and 16,950 lbf (75.4 kN) afterburning (reheat) thrust; the first flight was on 25 March 1961. 649 F-4Bs were built, with deliveries beginning in 1961 and VF-121 Pacemakers receiving the first examples at NAS Miramar. The USAF received Phantoms as the result of Defense Secretary Robert McNamara's push to create a unified fighter for all branches of the US military. After an F-4B won the "Operation Highspeed" fly-off against the Convair F-106 Delta Dart, the USAF borrowed two Naval F-4Bs, temporarily designating them F-110A in January 1962, and developed requirements for its own version. Unlike the US Navy's focus on air-to-air interception in the Fleet Air Defense (FAD) mission, the USAF emphasized both an air-to-air and an air-to-ground fighter-bomber role. 
With McNamara's unification of designations on 18 September 1962, the Phantom became the F-4, with the naval version designated F-4B and the USAF version F-4C. The first Air Force Phantom flew on 27 May 1963, exceeding Mach 2 on its maiden flight. The F-4J improved both air-to-air and ground-attack capability; deliveries began in 1966 and ended in 1972 with 522 built. It was equipped with J79-GE-10 engines with 17,844 lbf (79.374 kN) thrust, the Westinghouse AN/AWG-10 Fire Control System (making the F-4J the first fighter in the world with operational look-down/shoot-down capability), a new integrated missile control system, and the AN/AJB-7 bombing system for expanded ground attack capability. The F-4N (updated F-4Bs) with smokeless engines and F-4J aerodynamic improvements started in 1972 under a U.S. Navy-initiated refurbishment program called "Project Bee Line", with 228 converted by 1978. The F-4S model resulted from the refurbishment of 265 F-4Js with J79-GE-17 smokeless engines of 17,900 lbf (79.6 kN), AWG-10B radar with digitized circuitry for improved performance and reliability, the Honeywell AN/AVG-8 Visual Target Acquisition Set or VTAS (the world's first operational helmet sighting system), classified avionics improvements, airframe reinforcement, and leading edge slats for enhanced maneuvering. The USMC also operated the RF-4B with reconnaissance cameras, of which 46 were built; the RF-4B flew alone and unarmed, with a requirement to fly straight and level at 5,000 feet while taking photographs. Unable to make evasive maneuvers, the crews relied on the shortcomings of enemy anti-aircraft defenses for survival. Phantom II production ended in the United States in 1979 after 5,195 had been built (5,057 by McDonnell Douglas and 138 in Japan by Mitsubishi). Of these, 2,874 went to the USAF, 1,264 to the Navy and Marine Corps, and the rest to foreign customers. 
The last U.S.-built F-4 went to South Korea, while the last F-4 built was an F-4EJ built by Mitsubishi Heavy Industries in Japan and delivered on 20 May 1981. As of 2008, 631 Phantoms were in service worldwide, while Phantoms remained in use as target drones (specifically QF-4Cs) operated by the U.S. military until 21 December 2016, when the Air Force officially ended use of the type. World records To show off their new fighter, the Navy led a series of record-breaking flights early in Phantom development. All in all, the Phantom set 16 world records. Except for Skyburner, all records were achieved in unmodified production aircraft. Five of the speed records remained unbeaten until the F-15 Eagle appeared in 1975. Operation Top Flight: On 6 December 1959, the second XF4H-1 performed a zoom climb to a world record 98,557 ft (30,040 m). Commander Lawrence E. Flint Jr., USN accelerated his aircraft to at 47,000 ft (14,330 m) and climbed to 90,000 ft (27,430 m) at a 45° angle. He then shut down the engines and glided to the peak altitude. As the aircraft fell through 70,000 ft (21,300 m), Flint restarted the engines and resumed normal flight. On 5 September 1960, an F4H-1 averaged 1,216.78 mph (1,958.16 km/h) over a 500 km (311 mi) closed-circuit course. On 25 September 1960, an F4H-1F averaged 1,390.24 mph (2,237.37 km/h) over a 100 km (62.1 mi) closed-circuit course (FAI Record File Number 8898). Operation LANA: To celebrate the 50th anniversary of Naval aviation (L is the Roman numeral for 50 and ANA stood for Anniversary of Naval Aviation), on 24 May 1961 Phantoms flew across the continental United States in under three hours, with several tanker refuelings en route. The fastest of the aircraft averaged 869.74 mph (1,400.28 km/h) and completed the trip in 2 hours 47 minutes, earning the pilot (and future NASA astronaut) Lieutenant Richard Gordon, USN, and RIO Lieutenant Bobbie Young, USN, the 1961 Bendix trophy. 
Operation Sageburner: On 28 August 1961, an F4H-1F Phantom II averaged 1,452.777 kilometers per hour (902.714 miles per hour) over a 3 mi (4.82 km) course, flying below at all times. Commander J.L. Felsman, USN was killed during the first attempt at this record on 18 May 1961 when his aircraft disintegrated in the air after pitch damper failure. Operation Skyburner: On 22 November 1961, a modified Phantom with water injection, piloted by Lt. Col. Robert B. Robinson, set an absolute world record average speed of 1,606.342 mph (2,585.086 km/h) over a 20-mile (32.2 km) two-way straight course. On 5 December 1961, another Phantom set a sustained altitude record of . Project High Jump: A series of time-to-altitude records was set in early 1962: 34.523 seconds to , 48.787 seconds to , 61.629 seconds to , 77.156 seconds to , 114.548 seconds to , 178.5 s to , 230.44 s to , and 371.43 s to . Design Overview The F-4 Phantom is a tandem-seat fighter-bomber designed as a carrier-based interceptor to fill the U.S. Navy's fleet defense fighter role. Innovations in the F-4 included an advanced pulse-Doppler radar and extensive use of titanium in its airframe. Despite imposing dimensions and a maximum takeoff weight of over 60,000 lb (27,000 kg), the F-4 has a top speed of Mach 2.23 and an initial climb rate of over 41,000 ft/min (210 m/s). The F-4's nine external hardpoints can carry up to 18,650 pounds (8,480 kg) of weapons, including air-to-air and air-to-surface missiles, and unguided, guided, and thermonuclear weapons. Like other interceptors of its day, the F-4 was designed without an internal cannon. The baseline performance of a Mach 2-class fighter with long range and a bomber-sized payload would become the template for the next generation of large and light/middle-weight fighters optimized for daylight air combat. Flight characteristics "Speed is life" was the F-4 pilots' slogan. 
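The paired mph and km/h figures in the record flights above can be cross-checked with plain unit arithmetic (1 international mile = 1.609344 km, exactly). This sketch only verifies the internal consistency of the quoted numbers; it adds no data beyond the source:

```python
# Cross-check the paired speed figures quoted for the record flights.
# 1 international mile = 1.609344 km (exact by definition).
MILE_KM = 1.609344

records_mph_kmh = {
    "500 km closed circuit (5 Sep 1960)": (1216.78, 1958.16),
    "100 km closed circuit (25 Sep 1960)": (1390.24, 2237.37),
    "Operation Sageburner (28 Aug 1961)": (902.714, 1452.777),
    "Operation Skyburner (22 Nov 1961)": (1606.342, 2585.086),
}

for name, (mph, kmh) in records_mph_kmh.items():
    # agreement within ~0.01% covers rounding in the published figures
    rel_err = abs(mph * MILE_KM - kmh) / kmh
    assert rel_err < 1e-4, f"{name}: {rel_err:.2e}"
print("all quoted mph/km/h pairs are mutually consistent")
```

Running the check shows each pair agrees to within rounding of the published values, which is a quick way to catch transcription errors in converted figures.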
The Phantom's greatest advantage in air combat was acceleration and thrust, which permitted a skilled pilot to engage and disengage from the fight at will. MiGs usually could outturn the F-4 because of the high drag on its airframe; as a massive fighter aircraft designed to fire radar-guided missiles from beyond visual range, the F-4 lacked the agility of its Soviet opponents and was subject to adverse yaw during hard maneuvering. Although subject to irrecoverable spins during aileron rolls, the aircraft was reported by pilots to be very responsive and easy to fly on the edge of its performance envelope. In 1972, the F-4E model was upgraded with leading edge slats on the wing, greatly improving high angle of attack maneuverability at the expense of top speed. Compared to earlier engines, the J79 had a reduced time lag between the pilot slamming the throttle from idle to maximum and the engine producing maximum thrust. During one carrier landing, John Chesire's tailhook missed the arresting gear after he selected idle thrust; by slamming the throttle to full afterburner, he turned his bolter into a touch-and-go landing. The J79 produced noticeable amounts of black smoke (at mid-throttle/cruise settings), a severe disadvantage in that it made the aircraft easier for the enemy to spot. Two decades after the aircraft entered service this was solved on the F-4S, which was fitted with the −10A engine variant with a smokeless combustor. The lack of an internal gun "was the biggest mistake on the F-4", Chesire said; "Bullets are cheap and tend to go where you aim them. I needed a gun, and I really wished I had one". Marine Corps general John R. Dailey recalled that "everyone in RF-4s wished they had a gun on the aircraft". For a brief period, doctrine held that turning combat would be impossible at supersonic speeds, and little effort was made to teach pilots air combat maneuvering. 
In reality, engagements quickly became subsonic, as pilots would slow down in an effort to get behind their adversaries. Furthermore, the relatively new heat-seeking and radar-guided missiles were frequently reported as unreliable, and pilots had to fire multiple missiles just to hit one enemy fighter. To compound the problem, rules of engagement in Vietnam precluded long-range missile attacks in most instances, as visual identification was normally required. Many pilots found themselves on the tail of an enemy aircraft but too close to fire short-range Falcons or Sidewinders. Although by 1965 USAF F-4Cs began carrying SUU-16 external gunpods containing a 20 mm (.79 in) M61A1 Vulcan Gatling cannon, USAF cockpits were not equipped with lead-computing gunsights until the introduction of the SUU-23, virtually assuring a miss in a maneuvering fight. Some Marine Corps aircraft carried two pods for strafing. In addition to the loss of performance due to drag, combat showed the externally mounted cannon to be inaccurate unless frequently boresighted, yet far more cost-effective than missiles. The lack of a cannon was finally addressed by adding an internally mounted 20 mm (.79 in) M61A1 Vulcan on the F-4E. Costs Note: Original amounts were in 1965 U.S. dollars; the figures have been adjusted for inflation to the current year. Operational history United States Air Force In USAF service, the F-4 was initially designated the F-110A prior to the introduction of the 1962 United States Tri-Service aircraft designation system. The USAF quickly embraced the design and became the largest Phantom user. The first USAF Phantoms in Vietnam were F-4Cs from the 43rd Tactical Fighter Squadron, which arrived in December 1964. Unlike the U.S. Navy and U.S. 
Marine Corps, which flew the Phantom with a Naval Aviator (pilot) in the front seat and a Naval Flight Officer as a radar intercept officer (RIO) in the back seat, the USAF initially flew its Phantoms with a rated Air Force Pilot in front and back seats. Pilots usually did not like flying in the back seat; while the GIB, or "guy in back", could fly and ostensibly land the aircraft, he had fewer flight instruments and a very restricted forward view. The Air Force later assigned a rated Air Force Navigator qualified as a weapon/targeting systems officer (later designated as weapon systems officer or WSO) in the rear seat instead of another pilot. On 10 July 1965, F-4Cs of the 45th Tactical Fighter Squadron, 15th TFW, on temporary assignment in Ubon, Thailand, scored the USAF's first victories against North Vietnamese MiG-17s using AIM-9 Sidewinder air-to-air missiles. On 26 April 1966, an F-4C from the 480th Tactical Fighter Squadron scored the first aerial victory by a U.S. aircrew over a North Vietnamese MiG-21 "Fishbed". On 24 July 1965, another Phantom from the 45th Tactical Fighter Squadron became the first American aircraft to be downed by an enemy SAM, and on 5 October 1966 an 8th Tactical Fighter Wing F-4C became the first U.S. jet lost to an air-to-air missile, fired by a MiG-21. Early aircraft suffered from leaks in wing fuel tanks that required re-sealing after each flight and 85 aircraft were found to have cracks in outer wing ribs and stringers. There were also problems with aileron control cylinders, electrical connectors, and engine compartment fires. Reconnaissance RF-4Cs made their debut in Vietnam on 30 October 1965, flying the hazardous post-strike reconnaissance missions. The USAF Thunderbirds used the F-4E from the 1969 season until 1974. 
Although the F-4C was essentially identical to the Navy/Marine Corps F-4B in-flight performance and carried the AIM-9 Sidewinder missiles, USAF-tailored F-4Ds initially arrived in June 1967 equipped with AIM-4 Falcons. However, the Falcon, like its predecessors, was designed to shoot down heavy bombers flying straight and level. Its reliability proved no better than others and its complex firing sequence and limited seeker-head cooling time made it virtually useless in combat against agile fighters. The F-4Ds reverted to using Sidewinders under the "Rivet Haste" program in early 1968, and by 1972 the AIM-7E-2 "Dogfight Sparrow" had become the preferred missile for USAF pilots. Like other Vietnam War Phantoms, the F-4Ds were urgently fitted with radar warning receivers to detect the Soviet-built S-75 Dvina SAMs. From the initial deployment of the F-4C to Southeast Asia, USAF Phantoms performed both air superiority and ground attack roles, supporting not only ground troops in South Vietnam, but also conducting bombing sorties in Laos and North Vietnam. As the F-105 force underwent severe attrition between 1965 and 1968, the bombing role of the F-4 proportionately increased until after November 1970 (when the last F-105D was withdrawn from combat) it became the primary USAF tactical ordnance delivery system. In October 1972 the first squadron of EF-4C Wild Weasel aircraft deployed to Thailand on temporary duty. The "E" prefix was later dropped and the aircraft was simply known as the F-4C Wild Weasel. Sixteen squadrons of Phantoms were permanently deployed between 1965 and 1973, and 17 others deployed on temporary combat assignments. Peak numbers of combat F-4s occurred in 1972, when 353 were based in Thailand. A total of 445 Air Force Phantom fighter-bombers were lost, 370 in combat and 193 of those over North Vietnam (33 to MiGs, 30 to SAMs, and 307 to AAA). 
The RF-4C was operated by four squadrons, and of the 83 losses, 72 were in combat (seven to SAMs and 65 to AAA), including 38 over North Vietnam. By war's end, the U.S. Air Force had lost a total of 528 F-4 and RF-4C Phantoms. When combined with U.S. Navy and Marine Corps losses of 233 Phantoms, 761 F-4/RF-4 Phantoms were lost in the Vietnam War. On 28 August 1972, Captain Steve Ritchie became the first USAF ace of the war. On 9 September 1972, WSO Capt Charles B. DeBellevue became the highest-scoring American ace of the war with six victories, and WSO Capt Jeffrey Feinstein became the last USAF ace of the war on 13 October 1972. Upon return to the United States, DeBellevue and Feinstein were assigned to undergraduate pilot training (Feinstein was given a vision waiver) and requalified as USAF pilots in the F-4. USAF F-4C/D/E crews claimed 107.5 MiG kills in Southeast Asia (50 by Sparrow, 31 by Sidewinder, five by Falcon, 15.5 by gun, and six by other means). On 31 January 1972, the 170th Tactical Fighter Squadron/183d Tactical Fighter Group of the Illinois Air National Guard became the first Air National Guard unit to transition to Phantoms from Republic F-84F Thunderstreaks, which were found to have corrosion problems. Phantoms would eventually equip numerous tactical fighter and tactical reconnaissance units in the USAF active, National Guard, and reserve components. On 2 June 1972, a Phantom flying at supersonic speed shot down a MiG-19 over Thud Ridge in Vietnam with its cannon. At a recorded speed of Mach 1.2, Major Phil Handley's shootdown was the first and only recorded gun kill while flying at supersonic speeds. On 15 August 1990, 24 F-4G Wild Weasel Vs and six RF-4Cs were deployed to Shaikh Isa AB, Bahrain, for Operation Desert Storm. The F-4G was the only aircraft in the USAF inventory equipped for the Suppression of Enemy Air Defenses (SEAD) role, and was needed to protect coalition aircraft from Iraq's extensive air defense system.
The RF-4C was the only aircraft equipped with the ultra-long-range KS-127 LOROP (long-range oblique photography) camera, and was used for a variety of reconnaissance missions. In spite of flying almost daily missions, only one RF-4C was lost in a fatal accident before the start of hostilities. One F-4G was lost when enemy fire damaged the fuel tanks and the aircraft ran out of fuel near a friendly airbase. The last USAF Phantoms, F-4G Wild Weasel Vs from 561st Fighter Squadron, were retired on 26 March 1996. The last operational flight of the F-4G Wild Weasel was from the 190th Fighter Squadron, Idaho Air National Guard, in April 1996. The last operational USAF/ANG F-4 to land was flown by Maj Mike Webb and Maj Gary Leeder of the Idaho ANG. Like the Navy, the Air Force has operated QF-4 target drones, serving with the 82d Aerial Targets Squadron at Tyndall Air Force Base, Florida, and Holloman Air Force Base, New Mexico. It was expected that the F-4 would remain in the target role with the 82d ATRS until at least 2015, when they would be replaced by early versions of the F-16 Fighting Falcon converted to a QF-16 configuration. Several QF-4s also retain capability as manned aircraft and are maintained in historical color schemes, being displayed as part of Air Combat Command's Heritage Flight at air shows, base open houses, and other events while serving as non-expendable target aircraft during the week. On 19 November 2013, BAE Systems delivered the last QF-4 aerial target to the Air Force. The example had been in storage for over 20 years before being converted. Over 16 years, BAE had converted 314 F-4 and RF-4 Phantom IIs into QF-4s and QRF-4s, with each aircraft taking six months to adapt. As of December 2013, QF-4 and QRF-4 aircraft had flown over 16,000 manned and 600 unmanned training sorties, with 250 unmanned aircraft being shot down in firing exercises. 
The remaining QF-4s and QRF-4s held their training role until the first of 126 QF-16s were delivered by Boeing. The final flight of an Air Force QF-4 from Tyndall AFB took place on 27 May 2015 to Holloman AFB. After Tyndall AFB ceased operations, the 53d Weapons Evaluation Group at Holloman became the last remaining operator of the fleet of 22 QF-4s. The base continued using them to fly manned test and unmanned live fire test support and Foreign Military Sales testing, with the final unmanned flight taking place in August 2016. The type was officially retired from US military service with a four-ship flight at Holloman during an event on 21 December 2016. The remaining QF-4s were to be demilitarized after 1 January 2017. United States Navy On 30 December 1960, the VF-121 "Pacemakers" at NAS Miramar became the first Phantom operator with its F4H-1Fs (F-4As). The VF-74 "Be-devilers" at NAS Oceana became the first deployable Phantom squadron when it received its F4H-1s (F-4Bs) on 8 July 1961. The squadron completed carrier qualifications in October 1961 and the Phantom's first full carrier deployment between August 1962 and March 1963 aboard USS Forrestal. The second deployable U.S. Atlantic Fleet squadron to receive F-4Bs was the VF-102 "Diamondbacks", who promptly took their new aircraft on the shakedown cruise of USS Enterprise. The first deployable U.S. Pacific Fleet squadron to receive the F-4B was the VF-114 "Aardvarks", which participated in the September 1962 cruise aboard USS Kitty Hawk. By the time of the Tonkin Gulf incident, 13 of 31 deployable Navy squadrons were armed with the type. F-4Bs from USS Constellation made the first Phantom combat sortie of the Vietnam War on 5 August 1964, flying bomber escort in Operation Pierce Arrow. Navy fighter pilots were unused to flying with a non-pilot RIO, but learned from air combat in Vietnam the benefits of the GIB, the "guy in back" or "voice in the luggage compartment", helping with the workload.
The first Phantom air-to-air victory of the war took place on 9 April 1965 when an F-4B from VF-96 "Fighting Falcons" piloted by Lieutenant (junior grade) Terence M. Murphy and his RIO, Ensign Ronald Fegan, shot down a Chinese MiG-17 "Fresco". The Phantom itself was then shot down; there continues to be controversy over whether it was downed by MiG guns or, as enemy reports later indicated, by an AIM-7 Sparrow III fired by one of Murphy's and Fegan's wingmen. On 17 June 1965, an F-4B from VF-21 "Freelancers" piloted by Commander Louis Page and Lieutenant John C. Smith shot down the first North Vietnamese MiG of the war. On 10 May 1972, Lieutenant Randy "Duke" Cunningham and Lieutenant (junior grade) William P. Driscoll, flying an F-4J, call sign "Showtime 100", shot down three MiG-17s to become the first American flying aces of the war. Their fifth victory was believed at the time to be over a mysterious North Vietnamese ace, Colonel Nguyen Toon, now considered mythical. On the return flight, the Phantom was damaged by an enemy surface-to-air missile. To avoid being captured, Cunningham and Driscoll flew their burning aircraft using only the rudder and afterburner (the damage to the aircraft rendered conventional control nearly impossible), until they could eject over water. During the war, U.S. Navy F-4 Phantom squadrons participated in 84 combat tours with F-4Bs, F-4Js, and F-4Ns. The Navy claimed 40 air-to-air victories at a cost of 73 Phantoms lost in combat (seven to enemy aircraft, 13 to SAMs, and 53 to AAA). An additional 54 Phantoms were lost in mishaps. In 1984, all Navy F-4Ns were retired from Fleet service in deployable USN squadrons, and by 1987 the last F-4Ss were retired from deployable USN squadrons. On 25 March 1986, an F-4S belonging to the VF-151 "Vigilantes" became the last active duty U.S. Navy Phantom to launch from an aircraft carrier, in this case USS Midway.
On 18 October 1986, an F-4S from the VF-202 "Superheats", a Naval Reserve fighter squadron, made the last-ever Phantom carrier landing while operating aboard USS America. In 1987, the last of the Naval Reserve-operated F-4S aircraft were replaced by F-14As. The last Phantoms in service with the Navy were QF-4N and QF-4S target drones operated by the Naval Air Warfare Center at NAS Point Mugu, California. These airframes were subsequently retired in 2004. United States Marine Corps The Marine Corps received its first F-4Bs in June 1962, with the "Black Knights" of VMFA-314 at Marine Corps Air Station El Toro, California becoming the first operational squadron. Marine Phantoms from VMFA-531 "Grey Ghosts" were assigned to Da Nang airbase on South Vietnam's northeast coast on 10 May 1965 and were initially tasked with providing air defense for the USMC. They soon began close air support (CAS) missions, and VMFA-314 "Black Knights", VMFA-232 "Red Devils", VMFA-323 "Death Rattlers", and VMFA-542 "Bengals" soon arrived at the primitive airfield. Marine F-4 pilots claimed three enemy MiGs (two while on exchange duty with the USAF) at the cost of 75 aircraft lost in combat, mostly to ground fire, and four in accidents. The VMCJ-1 "Golden Hawks" (later VMAQ-1 and VMAQ-4, which had the old RM tailcode) flew the first photo reconnaissance mission with an RF-4B variant on 3 November 1966 from Da Nang AB, South Vietnam, and remained there until 1970 with no RF-4B losses and only one aircraft damaged by anti-aircraft artillery (AAA) fire. VMCJ-2 and VMCJ-3 (now VMAQ-3) provided aircraft for VMCJ-1 in Da Nang, and VMFP-3 was formed in 1975 at MCAS El Toro, California, consolidating all USMC RF-4Bs in one unit that became known as "The Eyes of the Corps." VMFP-3 disestablished in August 1990 after the Advanced Tactical Airborne Reconnaissance System was introduced for the F/A-18D Hornet.
The F-4 continued to equip fighter-attack squadrons in both active and reserve Marine Corps units throughout the 1960s, 1970s, and 1980s and into the early 1990s. In the early 1980s, these squadrons began to transition to the F/A-18 Hornet, starting with the same squadron that introduced the F-4 to the Marine Corps, VMFA-314 at MCAS El Toro, California. On 18 January 1992, the last Marine Corps Phantom, an F-4S in the Marine Corps Reserve, was retired by the "Cowboys" of VMFA-112 at NAS Dallas, Texas, after which the squadron was re-equipped with F/A-18 Hornets. Aerial combat in the Vietnam War The USAF and the US Navy had high expectations of the F-4 Phantom, assuming that its massive firepower, the best available on-board radar, and the highest speed and acceleration properties, coupled with new tactics, would provide Phantoms with an advantage over the MiGs. However, in confrontations with the lighter MiG-21, F-4s did not always succeed and began to suffer losses. Over the course of the air war in Vietnam, between 3 April 1965 and 8 January 1973, each side would ultimately claim favorable kill ratios. During the war, U.S. Navy F-4 Phantoms claimed 40 air-to-air victories at a loss of seven Phantoms to enemy aircraft. USMC F-4 pilots claimed three enemy MiGs at the cost of one aircraft in air combat. USAF F-4 Phantom crews scored 107.5 MiG kills (including 33.5 MiG-17s, eight MiG-19s and 66 MiG-21s) at a cost of 33 Phantoms in air combat. Across all three services, F-4 pilots were credited with a total of 150.5 MiG kills at a cost of 42 Phantoms in air combat. According to the VPAF, 103 F-4 Phantoms were shot down by MiG-21s at a cost of 54 MiG-21s downed by F-4s. During the war, the VPAF lost 131 MiGs in air combat (63 MiG-17s, eight MiG-19s and 60 MiG-21s), of which one half were downed by F-4s. From 1966 to November 1968, in 46 air battles conducted over North Vietnam between F-4s and MiG-21s, the VPAF claimed 27 F-4s shot down by MiG-21s at a cost of 20 MiG-21s. In 1970, one F-4 Phantom was shot down by a MiG-21.
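The overall tallies in this section follow from simple addition of the per-service claims. As an illustrative cross-check (a sketch only: the per-type USAF breakdown of 33.5 MiG-17s, eight MiG-19s, and 66 MiG-21s is the commonly cited figure and is treated here as an assumption rather than a claim of this article), the totals can be verified to be mutually consistent:

```python
# Cross-check of the F-4's Vietnam air-to-air kill tallies.
# The per-type USAF breakdown below is the commonly cited figure and is
# an assumption here, not sourced from this article alone.

usaf_kills = {"MiG-17": 33.5, "MiG-19": 8, "MiG-21": 66}
usaf_total = sum(usaf_kills.values())  # USAF crews' claimed kills

# Per-service claimed victories: Navy 40, Marine Corps 3, plus the USAF total.
service_kills = {"USN": 40, "USMC": 3, "USAF": usaf_total}
combined_total = sum(service_kills.values())  # all-service credited kills

print(usaf_total)      # 107.5
print(combined_total)  # 150.5
```

The sums reproduce the 107.5 USAF and 150.5 combined totals credited to F-4 crews.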
The struggle culminated on 10 May 1972, with VPAF aircraft completing 64 sorties, resulting in 15 air battles. The VPAF claimed seven F-4s shot down, while the U.S. confirmed the loss of five F-4s. The Phantoms, in turn, managed to destroy two MiG-21s, three MiG-17s, and one MiG-19. On 11 May, two MiG-21s, playing the role of "bait", lured four F-4s toward two other MiG-21s circling at low altitude. The ambushing MiGs quickly engaged and shot down two F-4s. On 18 May, Vietnamese aircraft made 26 sorties in eight air engagements, which cost the Americans four F-4 Phantoms; the Vietnamese fighters suffered no losses that day. Non-U.S. users The Phantom has served with the air forces of many countries, including Australia, Egypt, Germany, the United Kingdom, Greece, Iran, Israel, Japan, Spain, South Korea and Turkey. Australia The Royal Australian Air Force (RAAF) leased 24 USAF F-4Es from 1970 to 1973 while waiting for its order of General Dynamics F-111Cs to be delivered. They were so well liked that the RAAF considered retaining the aircraft after the F-111Cs were delivered. They were operated from RAAF Amberley by No. 1 Squadron and No. 6 Squadron. Egypt In 1979, the Egyptian Air Force purchased 35 former USAF F-4Es along with a number of Sparrow, Sidewinder, and Maverick missiles from the U.S. for $594 million as part of the "Peace Pharaoh" program. An additional seven surplus USAF aircraft were purchased in 1988. Three attrition replacements had been received by the end of the 1990s. Egyptian F-4Es were retired in 2020, with their former base at Cairo West Airport being reconfigured for the operation of F-16C/D Fighting Falcons. Germany The German Air Force (Luftwaffe) initially ordered the reconnaissance RF-4E in 1969, receiving a total of 88 aircraft from January 1971. In 1982, the initially unarmed RF-4Es were given a secondary ground attack capability; these aircraft were retired in 1994.
In 1973, under the "Peace Rhine" program, the Luftwaffe purchased the F-4F (a lightened and simplified version of the F-4E) which was upgraded in the mid-1980s. 24 German F-4F Phantom IIs were operated by the 49th Tactical Fighter Wing of the USAF at Holloman AFB to train Luftwaffe crews until December 2004. In 1975, Germany also received 10 F-4Es for training in the U.S. In the late 1990s, these were withdrawn from service after being replaced by F-4Fs. Germany also initiated the Improved Combat Efficiency (ICE) program in 1983. The 110 ICE-upgraded F-4Fs entered service in 1992, and were expected to remain in service until 2012. All the remaining Luftwaffe Phantoms were based at Wittmund with Jagdgeschwader 71 (fighter wing 71) in Northern Germany and WTD61 at Manching. Phantoms were deployed to NATO states under the Baltic Air Policing starting in 2005, 2008, 2009, 2011 and 2012. The German Air Force retired its last F-4Fs on 29 June 2013. German F-4Fs flew 279,000 hours from entering service on 31 August 1973 until retirement. Greece In 1971, the Hellenic Air Force ordered brand new F-4E Phantoms, with deliveries starting in 1974. In the early 1990s, the Hellenic AF acquired surplus RF-4Es and F-4Es from the Luftwaffe and U.S. ANG. Following the success of the German ICE program, on 11 August 1997, a contract was signed between DASA of Germany and Hellenic Aerospace Industry for the upgrade of 39 aircraft to the very similar "Peace Icarus 2000" standard. The Hellenic AF operated 34 upgraded F-4E-PI2000 (338 and 339 Squadrons) and 12 RF-4E aircraft (348 Squadron) as of September 2013. On 5 May 2017, the Hellenic Air Force officially retired the RF-4E Phantom II during a public ceremony. Iran In the 1960s and 1970s when the U.S. and Iran were on friendly terms, the U.S. sold 225 F-4D, F-4E, and RF-4E Phantoms to Iran. 
The Imperial Iranian Air Force saw at least one engagement, resulting in a loss, after an RF-4C was rammed by a Soviet MiG-21 during Project Dark Gene, an ELINT operation during the Cold War. The Islamic Republic of Iran Air Force Phantoms saw heavy action in the Iran–Iraq War in the 1980s and are kept operational by overhaul and servicing from Iran's aerospace industry. Notable operations of Iranian F-4s during the war included Operation Scorch Sword, an attack by two F-4s against the Iraqi Osirak nuclear reactor site near Baghdad on 30 September 1980, and the attack on H3, a 4 April 1981 strike by eight Iranian F-4s against the H-3 complex of air bases in the far west of Iraq, which resulted in many Iraqi aircraft being destroyed or damaged for no Iranian losses. On 5 June 1984, two Saudi Arabian fighter pilots shot down two Iranian F-4 fighters. The Royal Saudi Air Force pilots were flying American-built F-15s and fired air-to-air missiles to bring down the Iranian planes. The Saudi fighter pilots had KC-135 aerial tanker planes and Boeing E-3 Sentry AWACS surveillance planes assist in the encounter. The aerial fight occurred in Saudi airspace over the Persian Gulf near the Saudi island Al Arabiyah, about 60 miles northeast of Jubail. Iranian F-4s were in use as of late 2014; the aircraft reportedly conducted air strikes on ISIS targets in the eastern Iraqi province of Diyala. Israel The Israeli Air Force was the largest foreign operator of the Phantom, flying both newly built and ex-USAF aircraft, as well as several one-off special reconnaissance variants. The first F-4Es, nicknamed "Kurnass" (Sledgehammer), and RF-4Es, nicknamed "Orev" (Raven), were delivered in 1969 under the "Peace Echo I" program. Additional Phantoms arrived during the 1970s under "Peace Echo II" through "Peace Echo V" and "Nickel Grass" programs. Israeli Phantoms saw extensive combat during Arab–Israeli conflicts, first seeing action during the War of Attrition. 
In the 1980s, Israel began the "Kurnass 2000" modernization program, which significantly updated avionics. The last Israeli F-4s were retired in 2004. Japan From 1968, the Japan Air Self-Defense Force (JASDF) purchased a total of 140 F-4EJ Phantoms, built without aerial refueling capability, the AGM-12 Bullpup missile system, nuclear weapons control systems or ground attack capabilities. Mitsubishi built 138 under license in Japan, and 14 unarmed reconnaissance RF-4Es were imported. One of the aircraft (17-8440) was the last of the 5,195 F-4 Phantoms to be produced. It was manufactured by Mitsubishi Heavy Industries on 21 May 1981. "The Final Phantom" served with the 306th Tactical Fighter Squadron and later transferred to the 301st Tactical Fighter Squadron. Of the fleet, 96 F-4EJs were modified to the upgraded F-4EJ Kai standard, and 15 F-4EJ and F-4EJ Kai aircraft were converted to reconnaissance aircraft designated RF-4EJ. Japan had a fleet of 90 F-4s in service in 2007. After studying several replacement fighters, the F-35A Lightning II was chosen in 2011. The 302nd Tactical Fighter Squadron became the first JASDF F-35 squadron at Misawa Air Base when it converted from the F-4EJ Kai on 29 March 2019. The JASDF's sole aerial reconnaissance unit, the 501st Tactical Reconnaissance Squadron, retired its RF-4Es and RF-4EJs on 9 March 2020, and the unit itself dissolved on 26 March. The 301st Tactical Fighter Squadron then became the sole user of the F-4EJ in the Air Defense Command, with retirement originally scheduled for 2021 along with the unit's transition to the F-35A. However, on 20 November 2020, the 301st Tactical Fighter Squadron announced the earlier retirement of its remaining F-4EJs, concluding the Phantom's long-running career in the JASDF Air Defense Command. Although retirement was announced, the 301st TFS continued operations up until 10 December 2020, with the squadron's Phantoms being decommissioned on 14 December.
Two F-4EJs and an F-4EJ Kai continued to be operated by the Air Development and Test Wing in Gifu Prefecture until their retirement on 17 March 2021, marking the end of Phantom operations in Japan. South Korea The Republic of Korea Air Force purchased its first batch of secondhand USAF F-4D Phantoms in 1968 under the "Peace Spectator" program. F-4Ds continued to be delivered until 1988. The "Peace Pheasant II" program also provided new-built and former USAF F-4Es. Spain The Spanish Air Force acquired its first batch of ex-USAF F-4C Phantoms in 1971 under the "Peace Alfa" program. Designated C.12, the aircraft were retired in 1989. At the same time, the air arm received a number of ex-USAF RF-4Cs, designated CR.12. In 1995–1996, these aircraft received extensive avionics upgrades. Spain retired its RF-4s in 2002. Turkey The Turkish Air Force (TAF) received 40 F-4Es in 1974, with a further 32 F-4Es and 8 RF-4Es in 1977–78 under the "Peace Diamond III" program, followed by 40 ex-USAF aircraft in "Peace Diamond IV" in 1987, and a further 40 ex-U.S. Air National Guard aircraft in 1991. A further 32 RF-4Es were transferred to Turkey after being retired by the Luftwaffe between 1992 and 1994. In 1995, Israel Aerospace Industries (IAI) implemented an upgrade similar to Kurnass 2000 on 54 Turkish F-4Es, which were dubbed the F-4E 2020 Terminator. Turkish F-4s, and more modern F-16s, have been used to strike Kurdish PKK bases in ongoing military operations in Northern Iraq. On 22 June 2012, a Turkish RF-4E was shot down by Syrian air defenses while flying a reconnaissance flight near the Turkish-Syrian border. Turkey stated that the reconnaissance aircraft was in international airspace when it was shot down, while Syrian authorities stated it was inside Syrian airspace. Turkish F-4s remained in use as of 2020. On 24 February 2015, two RF-4Es crashed in the Malatya region in the southeast of Turkey, under unknown circumstances, killing all four crew members.
On 5 March 2015, an F-4E 2020 crashed in central Anatolia, killing both crew members. After these accidents, the TAF withdrew the RF-4Es from active service. Turkey was reported to have used F-4 jets to attack PKK separatists and the ISIS capital on 19 September 2015. The Turkish Air Force reportedly used the F-4E 2020s against the more recent third phase of the PKK conflict on heavy bombardment missions into Iraq on 15 November 2015, 12 January 2016, and 12 March 2016. United Kingdom The United Kingdom bought versions based on the U.S. Navy's F-4J for use with the Royal Air Force and the Royal Navy's Fleet Air Arm. The UK was the only country outside the United States to operate the Phantom at sea, flying them from HMS Ark Royal. The main differences were the use of British Rolls-Royce Spey engines and British-made avionics. The RN and RAF versions were given the designations F-4K and F-4M respectively, and entered service with the British military aircraft designations Phantom FG.1 (fighter/ground attack) and Phantom FGR.2 (fighter/ground attack/reconnaissance). Initially, the FGR.2 was used in the ground attack and reconnaissance role, primarily with RAF Germany, while 43 Squadron was formed in the air defence role using the FG.1s that had been intended for the Fleet Air Arm for use aboard HMS Ark Royal. The superiority of the Phantom over the English Electric Lightning in terms of both range and weapons system capability, combined with the successful introduction of the SEPECAT Jaguar, meant that, during the mid-1970s, most of the ground attack Phantoms in Germany were redeployed to the UK to replace air defence Lightning squadrons. A second RAF squadron, 111 Squadron, was formed on the FG.1 in 1979 after the disbandment of 892 NAS. In 1982, during the Falklands War, three Phantom FGR.2s of No. 29 Squadron were on active Quick Reaction Alert duty on Ascension Island to protect the base from air attack.
After the Falklands War, 15 upgraded ex-USN F-4Js, known as the F-4J(UK), entered RAF service to compensate for one interceptor squadron redeployed to the Falklands. Around 15 RAF squadrons received various marks of Phantom, many of them based in Germany. The first to be equipped was No. 228 Operational Conversion Unit at RAF Coningsby in August 1968. One noteworthy operator was No. 43 Squadron, where Phantom FG.1s remained the squadron equipment for 20 years, arriving in September 1969 and departing in July 1989. During this period the squadron was based at Leuchars. The interceptor Phantoms were replaced by the Panavia Tornado F3 from the late 1980s onwards, and the last combat British Phantoms were retired in October 1992 when No. 74(F) Squadron was disbanded. Phantom FG.1 XT597 was the last British Phantom to be retired, on 28 January 1994; it had been used as a test jet by the Aeroplane and Armament Experimental Establishment for its whole service life. Civilian use Sandia National Laboratories expended an F-4 mounted on a "rocket sled" in a crash test to record the results of an aircraft impacting a reinforced concrete structure, such as a nuclear power plant. One aircraft, an F-4D (civilian registration N749CF), is operated by the Massachusetts-based non-profit organization Collings Foundation as a "living history" exhibit. Funds to maintain and operate the aircraft, which is based in Houston, Texas, are raised through donations and sponsorships from public and commercial parties. After finding the Lockheed F-104 Starfighter inadequate, NASA used the F-4 to photograph and film Titan II missiles after launch from Cape Canaveral during the 1960s. Retired U.S. Air Force colonel Jack Petry described how he put his F-4 into a Mach 1.2 dive synchronized to the launch countdown, then "walked the (rocket's) contrail". Petry's Phantom stayed with the Titan for 90 seconds, reaching 68,000 feet, then broke away as the missile continued into space.
NASA's Dryden Flight Research Center acquired an F-4A on 3 December 1965. It made 55 flights in support of short programs, chase on X-15 missions and lifting body flights. The F-4 also supported a biomedical monitoring program involving 1,000 flights by NASA Flight Research Center aerospace research pilots and students of the USAF Aerospace Research Pilot School flying high-performance aircraft. The pilots were instrumented to record accurate and reliable data of electrocardiogram, respiration rate, and normal acceleration. In 1967, the Phantom supported a brief military-inspired program to determine whether an airplane's sonic boom could be directed and whether it could be used as a weapon of sorts, or at least an annoyance. NASA also flew an F-4C in a spanwise blowing study from 1983 to 1985, after which it was returned. Variants F-4A, B, J, N and S Variants for the U.S. Navy and the U.S. Marine Corps. F-4B was upgraded to F-4N, and F-4J was upgraded to F-4S. F-110 (original USAF designation for F-4C), F-4C, D and E Variants for the U.S. Air Force. F-4E introduced an internal M61 Vulcan cannon. The F-4D and E were the most numerously produced, widely exported, and also extensively used under the Semi Automatic Ground Environment (SAGE) U.S. air defense system. F-4G Wild Weasel V A dedicated SEAD variant for the U.S. Air Force with updated radar and avionics, converted from F-4E. The designation F-4G was applied earlier to an entirely different U.S. Navy Phantom. F-4K and M Variants for the Royal Navy and Royal Air Force, respectively, re-engined with Rolls-Royce Spey turbofan engines. F-4EJ and RF-4EJ Simplified F-4E exported to and license-built in Japan. Some modified for reconnaissance role, carrying photographic and/or electronic reconnaissance pods and designated RF-4EJ. F-4F Simplified F-4E exported to Germany. 
QRF-4C, QF-4B, E, G, N and S Retired aircraft converted into remote-controlled target drones used for weapons and defensive systems research by the USAF and USN/USMC. RF-4B, C, and E Tactical reconnaissance variants. Operators
Hellenic Air Force – 18 F-4E AUPs in service
  Andravida Air Base, Elis – 338 MDV
Islamic Republic of Iran Air Force – 62 F-4D, F-4E, and RF-4Es in service
  Bandar Abbas Air Base, Hormozgan Province – 91st Tactical Fighter Squadron (F-4E)
  Bushehr Air Base, Bushehr Province – 61st Tactical Fighter Squadron (F-4E)
  Chabahar Konarak Air Base, Sistan and Baluchestan Province – 101st Tactical Fighter Squadron (F-4D)
  Hamadan Air Base, Hamadan Province – 31st Tactical Reconnaissance Squadron (RF-4E); 31st Tactical Fighter Squadron (F-4E)
Republic of Korea Air Force – 27 F-4Es in service
  Suwon Air Base, Gyeonggi Province – 153rd Fighter Squadron
Turkish Air Force – 26 F-4E 2020 Terminators in service
  Eskişehir Air Base, Eskişehir Province – 111 Filo
Former operators
Royal Australian Air Force (F-4E 1970 to 1973)
Egyptian Air Force (F-4E 1977 to 2020)
German Air Force (RF-4E 1971 to 1994; F-4F 1973 to 2013; F-4E 1978 to 1992)
Hellenic Air Force (RF-4E 1978 to 2017)
Imperial Iranian Air Force (F-4D 1968 to 1979; F-4E 1971 to 1979; RF-4E 1971 to 1979)
Israeli Air Force (F-4E 1969 to 2004; RF-4C 1970 to 1971; RF-4E 1971 to 2004)
Japan Air Self-Defense Force (F-4EJ 1971 to 2021; RF-4E 1974 to 2020; RF-4EJ 1992 to 2020)
Republic of Korea Air Force (F-4D 1969 to 2010; RF-4C 1989 to 2014)
Spanish Air Force (F-4C 1971 to 1990; RF-4C 1978 to 2002)
Turkish Air Force (RF-4E 1980 to 2015)
Aeroplane and Armament Experimental Establishment (F-4K 1970 to 1994)
Fleet Air Arm (F-4K 1968 to 1978)
Royal Air Force (F-4M 1968 to 1992; F-4K 1969 to 1990; F-4J(UK) 1984 to 1991)
NASA (F-4A 1965 to 1967; F-4C 1983 to 1985)
United States Air Force (F-4B 1963 to 1964; F-4C 1964 to 1989; RF-4C 1964 to 1995; F-4D 1965 to 1992; F-4E 1967 to 1991; F-4G 1978 to 1996; QF-4 1996 to 2016)
United States Marine Corps (F-4B 1962 to 1979; RF-4B 1965 to 1990; F-4J 1967 to 1984; F-4N 1973 to 1985; F-4S 1978 to 1992)
United States Navy (F-4A 1960 to 1968; F-4B 1961 to 1974; F-4J 1966 to 1982; F-4N 1973 to 1984; F-4S 1979 to 1987; QF-4 1983 to 2004)
Culture Nicknames The Phantom gathered a number of nicknames during its career. Some of these names included "Snoopy", "Rhino", "Double Ugly", "Old Smokey", the "Flying Anvil", "Flying Footlocker", "Flying Brick", "Lead Sled", the "Big Iron Sled", and the "St. Louis Slugger". In recognition of its record of downing large numbers of Soviet-built MiGs, it was called the "World's Leading Distributor of MiG Parts". As a reflection of excellent performance in spite of its bulk, the F-4 was dubbed "the triumph of thrust over aerodynamics". German Luftwaffe crews called their F-4s the Eisenschwein ("Iron Pig"), Fliegender Ziegelstein ("Flying Brick") and Luftverteidigungsdiesel ("Air Defense Diesel"). Reputation Imitating the spelling of the aircraft's name, McDonnell issued a series of patches. Pilots became "Phantom Phlyers", backseaters became "Phantom Pherrets", and fans of the F-4 became "Phantom Phanatics" who called it the "Phabulous Phantom". Ground crewmen who worked on the aircraft were known as "Phantom Phixers". Several active websites are devoted to sharing information on the F-4, and the aircraft is grudgingly admired as brutally effective by those who have flown it. Colonel (Ret.) Chuck DeBellevue reminisced, "The F-4 Phantom was the last plane that looked like it was made to kill somebody. It was a beast. It could go through a flock of birds and kick out barbeque from the back." It also had a reputation of being "a clumsy bruiser reliant on brute engine power and obsolete weapons technology." The Spook The aircraft's emblem is a whimsical cartoon ghost called "The Spook", which was created by McDonnell Douglas technical artist Anthony "Tony" Wong for shoulder patches.
The name "Spook" was coined by the crews of either the 12th Tactical Fighter Wing or the 4453rd Combat Crew Training Wing at MacDill AFB. The figure is ubiquitous, appearing on many items associated with the F-4. The Spook has followed the Phantom around the world, adopting local fashions; for example, the British adaptation of the U.S. "Phantom Man" is a Spook that sometimes wears a bowler hat and smokes a pipe. Aircraft on display As a result of its extensive number of operators and the large number of aircraft produced, there are many F-4 Phantom IIs of numerous variants on display worldwide. Notable accidents On 6 June 1971, Hughes Airwest Flight 706, a McDonnell Douglas DC-9-31, collided in mid-air with a United States Marine Corps F-4B Phantom above the San Gabriel Mountains while en route from Los Angeles International Airport to Salt Lake City. All 49 on board the DC-9 were killed, while the pilot of the F-4B was unable to eject and died when the aircraft crashed shortly afterwards. The F-4B's radar intercept officer successfully ejected and parachuted to safety, the sole survivor of the incident. On 9 August 1974, a Royal Air Force Phantom FGR.2 was involved in a fatal collision with a civilian PA-25-235 Pawnee crop-sprayer over Norfolk, England. On 21 March 1987, Captain Dean Paul Martin, a pilot in the 163d Tactical Fighter Group of the California Air National Guard and son of entertainer Dean Martin, crashed his F-4C into San Gorgonio Mountain, California, shortly after departure from March Air Force Base. Both Martin and his weapon systems officer (WSO), Captain Ramon Ortiz, were killed. Specifications (F-4E) See also References
Iranian F-4 Phantom II Units in Combat (Osprey Combat Aircraft #37). Oxford, UK: Osprey Publishing Limited, 2003. . Bowers, Peter M. and Enzo Angellucci. The American Fighter. New York: Orion Books, 1987. . Burden, Rodney, Michael I. Draper, Douglas A. Rough, Colin R. Smith and David L. Wilton. Falklands: The Air War. London: Arms and Armour Press, 1986. . Burgess, Richard E. The Naval Aviation Guide, 4th ed. Annapolis, Maryland: Naval Institute Press, 1985. . Calvert, Denis. Le Tigri della RAF (RAF's Tigers)(in Italian). Aerei magazine N.5, Parma, Italy: Delta editrice, 1991. Carrara, Dino. Phantom Targets: The USAFs Last F-4 Squadron. Air International, Volume 71, no. 5, November 2006. Stamford, Lincolnshire, UK: Key Publishing, pp. 42–48.. Cooper, Tom and Farzad Bishop. Target Saddam's Reactor: Israeli and Iranian Operations Against Iraqi Planes to Develop Nuclear Weapons. Air Enthusiast, No. 110, March/April 2004. pp. 2–12.. Davies, Peter E. USAF F-4 Phantom II MiG Killers 1965-68 (Osprey Combat Aircraft #45). Oxford, UK: Osprey Publishing Limited, 2004. . Davies, Peter E. USAF F-4 Phantom II MiG Killers 1972-73 (Osprey Combat Aircraft #55). Oxford, UK: Osprey Publishing Limited, 2005. . Deurenberg, Rudd. Shedding Light on Iranian Phantoms. Air Enthusiast, No. 111, May/June 2004, p. 72. Donald, David. RAF Phantoms. Wings of Fame. London: Aerospace. Volume 15, 1999. pp. 4–21. . Donald, David and Jon Lake, eds. Desert Storm: The First Phase. World Air Power Journal. London: Aerospace, Volume 5, Spring 1991.. Donald, David and Jon Lake, eds. Desert Storm: Gulf Victory. World Air Power Journal. London: Aerospace, Volume 6, Summer 1991.. Donald, David and Jon Lake, eds. Encyclopedia of World Military Aircraft. London: AIRtime Publishing, 1996. . Donald, David and Jon Lake, eds. McDonnell F-4 Phantom: Spirit in the Skies. London: AIRtime Publishing, 2002. . Dorr, Robert F. Navy Phantoms in Vietnam. Wings of Fame, Volume 1, 1995. London: Aerospace Publishing. . 
Dorr, Robert F. "McDonnell F3H Demon". Aeroplane. Volume 36, No. 3, March 2008, pp. 58–61. London: IBC. Dorr, Robert F. and Chris Bishop, eds. Vietnam Air War Debrief. London: Aerospace Publishing, 1996. . Dorr, Robert F. and Jon Lake. Fighters of the United States Air Force. London: Temple Press, 1990. . Dorr, Robert F. Phantoms Forever. London: Osprey Publishing Limited, 1987. . Eden, Paul ed. The Encyclopedia of Modern Military Aircraft. London: Amber Books Ltd, 2004. . Elward, Brad and Peter Davies. US Navy F-4 Phantom II MiG Killers 1965-70 (Osprey Combat Aircraft #26). Oxford, UK: Osprey Publishing Limited, 2001. . Elward, Brad and Peter Davies. US Navy F-4 Phantom II MiG Killers 1972-73 (Osprey Combat Aircraft #30). Oxford, UK: Osprey Publishing Limited, 2002. . Freeman, CJ and Gunston, Bill Consulting ed. The Encyclopedia of World Airpower. Crown Publishers, 1979. . Fricker, John. "Boeing /McDonnell Douglas F-4 Phantom II Current Operators". World Air Power Journal. London: Aerospace, Volume 40, Spring 2000. . Green, William and Gordon Swanborough. The Great Book of Fighters. St. Paul, Minnesota: MBI Publishing, 2001. . Gimmi, Russell M. Airman: The Life of Richard F. B. Gimmi. Bloomington, Indiana: iUniverse, 2009. . Grossnick, Roy and William J. Armstrong. United States Naval Aviation, 1910–1995. Annapolis, Maryland: Naval Historical Center, 1997. . Gunston, Bill ed. The Illustrated History of Fighters. New York, New York: Exeter Books Div. of Simon Schuster, 1981. . Gunston, Bill Consulting ed. The Encyclopedia of World Airpower. Crown Publishers, 1979. . Higham, Robin and Carol Williams. Flying Combat Aircraft of USAAF-USAF (Vol.2). Manhattan, Kansas: Sunflower University Press, 1978. . Hobson, Chris. Vietnam Air Losses, USAF, USN, USMC, Fixed-Wing Aircraft Losses in Southeast Asia 1961–1973. North Branch, Minnesota: Specialty Press, 2001. . Howarth, Alan. Spanish Phantoms and Their Legacy. Air Enthusiast 115, January–February 2005, p. 74 Jefford, C.G. 
RAF Squadrons: A Comprehensive Record of the Movement and Equipment of All RAF Squadrons and Their Antecedents Since 1912:. Shrewsbury, UK: Airlife Publishing, 2nd edition, 2001. Jones, Lloyd S. U.S. Fighters: 1925–1980s. Fallbrook, California: Aero Publishers, Inc., 1975. . Knaack, Marcelle Size. Encyclopedia of U.S. Air Force Aircraft and Missile Systems: Volume 1 Post-World War II Fighters 1945–1973. Washington, DC: Office of Air Force History, 1978. . Lake Jon. McDonnell F-4 Phantom: Spirit in the Skies. London: Aerospace Publishing, 1992. . List, Friedrich. "German Air Arms Review". Air International, Volume 70, No. 5, May 2006, pp. 50–57. Stamford, Lincolnshire, UK: Key Publishing.. Melampy, Jake. "Phantoms West". Air International, Volume 80, No. 1, January 2011, pp. 36–38. Stamford, Lincolnshire, UK: Key Publishing.. Nordeen, Lon. Fighters Over Israel: The Story of the Israeli Air Force from the War of Independence to the Bekaa Valley. London: Guild Publishing, 1991. . Richardson, Doug and Mike Spick. F-4 Phantom II (Modern Fighting Aircraft, Volume 4) . New York: Arco Publishing, 1984. . Swanborough, Gordon and Peter Bowers. United States Military Aircraft Since 1909. Washington, District of Columbia: Smithsonian, 1989. . Swanborough, Gordon and Peter Bowers. United States Navy Aircraft since 1911. London: Putnam, 1976. . Taylor, Michael J.H. Jane's American Fighting Aircraft of the 20th century. New York: Mallard Press, 1991. . Thetford, Owen. British Naval Aircraft since 1912. London: Putnam, Fourth Edition, 1994, pp. 254–255. . Thornborough, Anthony M. and Peter E. Davies. The Phantom Story. London: Arms and Armour Press, 1994. . Wagner, Ray. American Combat Planes, Third Enlarged Edition. New York: Doubleday, 1982. . Wilson, Stewart. Phantom, Hornet and Skyhawk in Australian Service. Weston Creek, ACT, Australia: Aerospace Publications, 1993. . 
External links F-4 Phantom II history page on Boeing.com F-4 Phantom II Society site PhantomF4K.org – Fleet Air Arm – Royal Navy site F-4.nl site Countering Israeli Reaction to F-4 Sales to Saudi Arabia and Kuwait 8th Tactical Fighter Wing site F-4 Phantom II articles and publications, theaviationindex.com The Phantom page with images on fas.org "The Phantom Turns 50" article at Fence Check site F-4 Phantom page on Aerospaceweb.org RAF Phantom Losses The Phantom Zone Phantom 50th Anniversary Slideshow F-004 Phantom II 1950s United States fighter aircraft Aircraft first flown in 1958 Carrier-based aircraft Low-wing aircraft Twinjets Articles containing video clips
https://en.wikipedia.org/wiki/McDonnell%20FH%20Phantom
McDonnell FH Phantom
The McDonnell FH Phantom was a twinjet fighter aircraft designed and first flown during World War II for the United States Navy. The Phantom was the first purely jet-powered aircraft to land on an American aircraft carrier and the first jet deployed by the United States Marine Corps. Although only 62 FH-1s were built because of the end of the war, the type helped prove the viability of carrier-based jet fighters. McDonnell's first successful fighter, it led to the development of the follow-on F2H Banshee, one of the two most important naval jet fighters of the Korean War, and established McDonnell as an important supplier of navy aircraft. When McDonnell chose to bring the name back with the Mach 2–class McDonnell Douglas F-4 Phantom II, it launched what would become the most versatile and widely used western combat aircraft of the Vietnam War era, adopted by the USAF and the US Navy, remaining in use with various countries to the present day. The FH Phantom was originally designated the FD Phantom, but the designation was changed as the aircraft entered production. Design and development In early 1943, aviation officials in the United States Navy were impressed with McDonnell's audacious XP-67 Bat project. McDonnell was invited by the navy to cooperate in the development of a shipboard jet fighter, using an engine from the turbojets under development by Westinghouse Electric Corporation. Three prototypes were ordered on 30 August 1943 and the designation XFD-1 was assigned. Under the 1922 United States Navy aircraft designation system, the letter "D" before the dash designated the aircraft's manufacturer. The Douglas Aircraft Company had previously been assigned this letter, but the USN elected to reassign it to McDonnell because Douglas had not provided any fighters for navy service in years. McDonnell engineers evaluated a number of engine combinations, varying from eight 9.5 in (24 cm) diameter engines down to two engines of 19 inch (48 cm) diameter. 
The final design used the two 19 in (48 cm) engines after it was found to be the lightest and simplest configuration. The engines were buried in the wing root to keep intake and exhaust ducts short, offering greater aerodynamic efficiency than underwing nacelles, and the engines were angled slightly outwards to protect the fuselage from the hot exhaust blast. Placement of the engines in the middle of the airframe allowed the cockpit with its bubble-style canopy to be placed ahead of the wing, granting the pilot excellent visibility in all directions. This engine location also freed up space under the nose, allowing designers to use tricycle gear, thereby elevating the engine exhaust path and reducing the risk that the hot blast would damage the aircraft carrier deck. The construction methods and aerodynamic design of the Phantom were fairly conventional for the time; the aircraft had unswept wings, a conventional empennage, and an aluminum monocoque structure with flush riveted aluminum skin. Folding wings were used to reduce the width of the aircraft in storage configuration. Provisions for four .50-caliber (12.7 mm) machine guns were made in the nose, while racks for eight 5 in (127 mm) High Velocity Aircraft Rockets could be fitted under the wings, although these were seldom used in service. Adapting a jet to carrier use was a much greater challenge than producing a land-based fighter because of slower landing and takeoff speeds required on a small carrier deck. The Phantom used split flaps on both the folding and fixed wing sections to enhance low-speed landing performance, but no other high-lift devices were used. Provisions were also made for Rocket Assisted Take Off (RATO) bottles to improve takeoff performance. When the first XFD-1, serial number 48235, was completed in January 1945, only one Westinghouse 19XB-2B engine was available for installation. 
Ground runs and taxi tests were conducted with the single engine, and such was the confidence in the aircraft that the first flight on 26 January 1945 was made with only the one turbojet engine. During flight tests, the Phantom became the first U.S. Navy aircraft to exceed 500 mph (434 kn, 805 km/h). With successful completion of tests, a production contract was awarded on 7 March 1945 for 100 FD-1 aircraft. With the end of the war, the Phantom production contract was reduced to 30 aircraft, but was soon increased back to 60. The first prototype was lost in a fatal crash on 1 November 1945, but the second and final Phantom prototype (serial number 48236) was completed early the next year and became the first purely jet-powered aircraft to operate from an American aircraft carrier, completing four successful takeoffs and landings on 21 July 1946 from a carrier operating near Norfolk, Virginia. At the time, she was the largest carrier serving with the U.S. Navy, allowing the aircraft to take off without assistance from a catapult. The second prototype crashed on 26 August 1946. Production Phantoms incorporated a number of design improvements. These included provisions for a flush-fitting centerline drop tank, an improved gunsight, and the addition of speed brakes. Production models used Westinghouse J30-WE-20 engines with 1,600 lbf (7.1 kN) of thrust per engine. The top of the vertical tail had a more square shape than the rounder tail used on the prototypes, and a smaller rudder was used to resolve problems with control surface clearance discovered during test flights. The horizontal tail surfaces were shortened slightly, while the fuselage was stretched by 19 in (48 cm). The amount of framing in the windshield was reduced to enhance pilot visibility. Halfway through the production run, the navy reassigned the designation letter "D" back to Douglas, with the Phantom being redesignated FH-1. 
Including the two prototypes, a total of 62 Phantoms were finally produced, with the last FH-1 rolling off the assembly line in May 1948. Realizing that the production of more powerful jet engines was imminent, McDonnell engineers proposed a more powerful variant of the Phantom while the original aircraft was still under development – a proposal that would lead to the design of the Phantom's replacement, the F2H Banshee. Although the new aircraft was originally envisioned as a modified Phantom, the need for heavier armament, greater internal fuel capacity, and other improvements eventually led to a substantially heavier and bulkier aircraft that shared few parts with its agile predecessor. Despite this, the two aircraft were similar enough that McDonnell was able to complete its first F2H-1 in August 1948, a mere three months after the last FH-1 had rolled off the assembly line. Operational history The first Phantoms were delivered to USN fighter squadron VF-17A (later redesignated VF-171) in August 1947; the squadron received a full complement of 24 aircraft on 29 May 1948. Beginning in November 1947, Phantoms were delivered to United States Marine Corps squadron VMF-122, making it the first USMC combat squadron to deploy jets. VF-17A became the USN's first fully operational jet carrier squadron when it deployed aboard on 5 May 1948. The Phantom was one of the first jets used by the U.S. military for exhibition flying. Three Phantoms used by the Naval Air Test Center were used by a unique demonstration team called the Gray Angels, whose members consisted entirely of naval aviators holding the rank of rear admiral (Daniel V. Gallery, Apollo Soucek and Edgar A. Cruise.) The team's name was an obvious play on the name of the recently formed U.S. Navy Blue Angels, who were still flying propeller-powered Grumman F8F Bearcats at the time. 
The "Grays" flew in various air shows during the summer of 1947, but the team was abruptly disbanded after their poorly timed arrival at a September air show in Cleveland, Ohio, nearly caused a head-on low-altitude collision with a large formation of other aircraft; their Phantoms were turned over to test squadron VX-3. The VMF-122 Phantoms were later used for air show demonstrations until they were taken out of service in 1949, with the team being known alternately as the Marine Phantoms or the Flying Leathernecks. The Phantom's service as a frontline fighter would be short-lived. Its limited range and light armament – notably, its inability to carry bombs – made it best suited for duty as a point-defence interceptor aircraft. However, its speed and rate of climb were only slightly better than existing propeller-powered fighters and fell short of other contemporary jets, such as the Lockheed P-80 Shooting Star, prompting concerns that the Phantom would be outmatched by future enemy jets it might soon face. Moreover, recent experience in World War II had demonstrated the value of naval fighters that could double as fighter-bombers, a capability the Phantom lacked. Finally, the aircraft exhibited some design deficiencies – its navigational avionics were poor, it could not accommodate newly developed ejection seats, and the location of the machine guns in the upper nose caused pilots to be dazzled by muzzle flash. The F2H Banshee and Grumman F9F Panther, both of which began flight tests around the time of the Phantom's entry into service, better satisfied the navy's desire for a versatile, long-range, high-performance jet. Consequently, the FH-1 saw little weapons training, and was primarily used for carrier qualifications to transition pilots from propeller-powered fighters to jets in preparation for flying the Panther or Banshee. 
In June 1949, VF-171 (VF-17A) re-equipped with the Banshee, and their Phantoms were turned over to VF-172; this squadron, along with the NATC, VX-3, and VMF-122, turned over their Phantoms to the United States Naval Reserve by late 1949 after receiving F2H-1 Banshees. The FH-1 would see training duty with the USNR until being replaced by the F9F Panther in July 1954; none ever saw combat, having been retired from frontline service prior to the outbreak of the Korean War. Civilian use In 1964, Progressive Aero, Incorporated of Fort Lauderdale, Florida purchased three surplus Phantoms, intending to use them to teach civilians how to fly jets. A pair were stripped of military equipment and restored to flying condition, but the venture was unsuccessful, and the aircraft were soon retired once again. Variants XFD-1 Prototype aircraft powered by Westinghouse 19XB-2B engines (J-30). Two built. FH-1 (FD-1) Production version with Westinghouse J30-WE-20 engines (originally designated FD-1). 60 built. Operators United States Navy VX-3 VF-171 (VF-17A) VF-172 Naval Air Reserve United States Marine Corps VMF-122 VMF-311 Aircraft on display FH-1 BuNo 111759 - National Air and Space Museum of the Smithsonian Institution in Washington, D.C., United States. This aircraft served with Marine Fighter Squadron 122 (VMF-122). It was retired in April 1954, with a total of 418 flight hours. The aircraft was transferred to the Smithsonian by the U.S. Navy in 1959. BuNo 111768 - Wings of Eagles Discovery Center in Horseheads, New York. It has had a busy post-retirement life. Formerly a Progressive Aero aircraft c/n 456 (civil registration N4283A), it was placed on display at the Marine Corps Museum. The aircraft was later transferred to the St. Louis Aviation Museum, and then the National Warplane Museum in Geneseo, New York. In 2006 the aircraft was moved to its current location. As of 5 August 2016, the aircraft is on display in H3 of the Pima Air & Space Museum, Tucson, Arizona. 
BuNo 111793 - National Naval Aviation Museum at Naval Air Station Pensacola, Florida. This aircraft was accepted by the navy on 28 February 1948. After flying for a brief time with Marine Fighter Squadron (VMF) 122, the first Marine jet squadron, at Marine Corps Air Station Cherry Point, North Carolina, it was stricken from the naval inventory in 1949. The museum acquired the aircraft from National Jets, Inc., of Fort Lauderdale, Florida, in 1983. Specifications (FH-1 Phantom) See also References Notes Citations Bibliography Angelucci, Enzo and Peter M. Bowers. The American Fighter. Sparkford, Somerset, UK: Haynes Publishing Group, 1987. . Francillon, René J. McDonnell Douglas Aircraft since 1920. London: Putnam & Company, Ltd, 1979. . Green, William. War Planes of the Second World War, Volume Four: Fighters. London: MacDonald & Co. (Publishers) Ltd., 1961 (sixth impression 1969). . Green, William and Gordon Swanborough. WW2 Aircraft Fact Files: US Navy and Marine Corps Fighters. London: Macdonald and Jane's, 1976. . Grossnick, Roy A. "Part 6: Postwar Years: 1946–1949". United States Naval Aviation 1910–1995. Washington, D.C.: Naval Historical Center, 1997. . Hamilton, Hayden. "The McDonnell FH-1 Phantom: the Forgotten Phantom". AAHS Journal, Vol. 55, No. 2, Summer 2010. Mesko, Jim. FH Phantom/F2H Banshee in action. Carrollton, Texas: Squadron/Signal Publications, Inc., 2002. . Mills, Carl. Banshees in the Royal Canadian Navy. Willowdale, Ontario, Canada: Banshee Publication, 1991. . "Mr Mac's First Phantom: The Story of the McDonnell FH-1". Air International Vol. 33, No. 5, November 1987, pp. 231–235, 258–260. Bromley, UK: Fine Scroll. . Wagner, Ray. American Combat Planes. New York: Doubleday, third edition, 1982. . External links "Phantom Development" a 1947 Flight article by John W. R. Taylor Carrier-based aircraft F1H Phantom McDonnell F1H Phantom World War II jet aircraft of the United States Cruciform tail aircraft Twinjets Aircraft first flown in 1947
https://en.wikipedia.org/wiki/Fricative
Fricative
Fricatives are consonants produced by forcing air through a narrow channel made by placing two articulators close together. These may be the lower lip against the upper teeth, in the case of ; the back of the tongue against the soft palate, in the case of German (the final consonant of Bach); or the side of the tongue against the molars, in the case of Welsh (appearing twice in the name Llanelli). This turbulent airflow is called frication. A particular subset of fricatives are the sibilants. When forming a sibilant, one is still forcing air through a narrow channel, but in addition, the tongue is curled lengthwise to direct the air over the edge of the teeth. English , , , and are examples of sibilants. The usage of two other terms is less standardized: "Spirant" is an older term for fricatives used by some American and European phoneticians and phonologists. "Strident" could mean just "sibilant", but some authors also include labiodental and uvular fricatives in the class. Types The airflow is never completely stopped in the production of fricative consonants; instead, it is forced through a constriction and so experiences friction. 
Sibilants voiceless coronal sibilant, as in English sip voiced coronal sibilant, as in English zip voiceless dental sibilant voiced dental sibilant voiceless apical sibilant voiced apical sibilant voiceless predorsal sibilant (laminal, with tongue tip at lower teeth) voiced predorsal sibilant (laminal) voiceless postalveolar sibilant (laminal) voiced postalveolar sibilant (laminal) voiceless palato-alveolar sibilant (domed, partially palatalized), as in English ship voiced palato-alveolar sibilant (domed, partially palatalized), as the si in English vision voiceless alveolo-palatal sibilant (laminal, palatalized) voiced alveolo-palatal sibilant (laminal, palatalized) voiceless retroflex sibilant (apical or subapical) voiced retroflex sibilant (apical or subapical) All sibilants are coronal, but may be dental, alveolar, postalveolar, or palatal (retroflex) within that range. However, at the postalveolar place of articulation, the tongue may take several shapes: domed, laminal, or apical, and each of these is given a separate symbol and a separate name. Prototypical retroflexes are subapical and palatal, but they are usually written with the same symbol as the apical postalveolars. The alveolars and dentals may also be either apical or laminal, but this difference is indicated with diacritics rather than with separate symbols. 
Central non-sibilant fricatives voiceless bilabial fricative voiced bilabial fricative voiceless labiodental fricative, as in English fine voiced labiodental fricative, as in English vine voiceless linguolabial fricative voiced linguolabial fricative voiceless dental non-sibilant fricative, as in English thing voiced dental non-sibilant fricative, as in English that voiceless alveolar non-sibilant fricative voiced alveolar non-sibilant fricative voiceless trilled fricative voiced trilled fricative voiceless palatal fricative voiced palatal fricative voiceless velar fricative voiced velar fricative voiceless palatal-velar fricative (articulation disputed) voiceless uvular fricative voiceless pharyngeal fricative The IPA also has letters for epiglottal fricatives, voiceless epiglottal fricative voiced epiglottal fricative with allophonic trilling, but these might be better analyzed as pharyngeal trills. voiceless velopharyngeal fricative (often occurs with a cleft palate) voiced velopharyngeal fricative Lateral fricatives voiceless dental lateral fricative voiced dental lateral fricative voiceless alveolar lateral fricative voiced alveolar lateral fricative voiceless postalveolar lateral fricative (Mehri) or extIPA voiceless retroflex lateral fricative or extIPA Voiced retroflex lateral fricative (in Ao) or or extIPA voiceless palatal lateral fricative (PUA ) or extIPA voiced palatal lateral fricative (allophonic in Jebero) or extIPA voiceless velar lateral fricative (PUA ) or extIPA voiced velar lateral fricative The lateral fricative occurs as the ll of Welsh, as in Lloyd, Llewelyn, and Machynlleth (, a town), as the unvoiced 'hl' and voiced 'dl' or 'dhl' in the several languages of Southern Africa (such as Xhosa and Zulu), and in Mongolian. 
or and voiceless grooved lateral alveolar fricative (a laterally lisped or ) (Modern South Arabian) or and voiced grooved lateral alveolar fricative (a laterally lisped or ) (Modern South Arabian) IPA letters used for both fricatives and approximants voiced uvular fricative voiced pharyngeal fricative No language distinguishes voiced fricatives from approximants at these places, so the same symbol is used for both. For the pharyngeal, approximants are more numerous than fricatives. A fricative realization may be specified by adding the uptack to the letters, . Likewise, the downtack may be added to specify an approximant realization, . (The bilabial approximant and dental approximant do not have dedicated symbols either and are transcribed in a similar fashion: . However, the base letters are understood to specifically refer to the fricatives.) Pseudo-fricatives voiceless glottal transition, as in English hat breathy-voiced glottal transition In many languages, such as English, the glottal "fricatives" are unaccompanied phonation states of the glottis, without any accompanying manner, fricative or otherwise. However, in languages such as Arabic, they are true fricatives. In addition, is usually called a "voiceless labial-velar fricative", but it is actually an approximant. True doubly articulated fricatives may not occur in any language; but see voiceless palatal-velar fricative for a putative (and rather controversial) example. Aspirated fricatives Fricatives are very commonly voiced, though cross-linguistically voiced fricatives are not nearly as common as tenuis ("plain") fricatives. Other phonations are common in languages that have those phonations in their stop consonants. However, phonemically aspirated fricatives are rare. contrasts with a tense, unaspirated in Korean; aspirated fricatives are also found in a few Sino-Tibetan languages, in some Oto-Manguean languages, in the Siouan language Ofo ( and ), and in the (central?) Chumash languages ( and ). 
The record may be Cone Tibetan, which has four contrastive aspirated fricatives: , , and . Nasalized fricatives Phonemically nasalized fricatives are rare. Umbundu has and Kwangali and Souletin Basque have . In Coatzospan Mixtec, appear allophonically before a nasal vowel, and in Igbo nasality is a feature of the syllable; when occur in nasal syllables they are themselves nasalized. Occurrence Until its extinction, Ubykh may have been the language with the most fricatives (29 not including ), some of which did not have dedicated symbols or diacritics in the IPA. This number actually outstrips the number of all consonants in English (which has 24 consonants). By contrast, approximately 8.7% of the world's languages have no phonemic fricatives at all. This is a typical feature of Australian Aboriginal languages, where the few fricatives that exist result from changes to plosives or approximants, but also occurs in some indigenous languages of New Guinea and South America that have especially small numbers of consonants. However, whereas is entirely unknown in indigenous Australian languages, most of the other languages without true fricatives do have in their consonant inventory. Voicing contrasts in fricatives are largely confined to Europe, Africa, and Western Asia. Languages of South and East Asia, such as Mandarin Chinese, Korean, the Dravidian and Austronesian languages, typically do not have such voiced fricatives as and , which are familiar to many European speakers. These voiced fricatives are also relatively rare in indigenous languages of the Americas. Overall, voicing contrasts in fricatives are much rarer than in plosives, being found only in about a third of the world's languages as compared to 60 percent for plosive voicing contrasts. About 15 percent of the world's languages, however, have unpaired voiced fricatives, i.e. a voiced fricative without a voiceless counterpart. 
Two-thirds of these, or 10 percent of all languages, have unpaired voiced fricatives but no voicing contrast between any fricative pair. This phenomenon occurs because voiced fricatives have developed from lenition of plosives or fortition of approximants. This phenomenon of unpaired voiced fricatives is scattered throughout the world, but is confined to nonsibilant fricatives with the exception of a couple of languages that have but lack . (Relatedly, several languages have the voiced affricate but lack , and vice versa.) The fricatives that occur most often without a voiceless counterpart are – in order of ratio of unpaired occurrences to total occurrences – , , , and . Acoustics Fricatives appear in waveforms as random noise caused by the turbulent airflow, upon which a periodic pattern is overlaid if voiced. Fricatives produced in the front of the mouth tend to have energy concentration at higher frequencies than ones produced in the back. The centre of gravity, the average frequency in a spectrum weighted by the amplitude, may be used to determine the place of articulation of a fricative relative to that of another. See also Apical consonant Hush consonant Laminal consonant List of phonetics topics Notes References External links Fricatives in English Manner of articulation
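The centre of gravity described above, the amplitude-weighted mean frequency of a spectrum, is straightforward to compute from a discrete Fourier transform. The sketch below is illustrative only (the function name and parameters are not from the article):

```python
import numpy as np

def spectral_centroid(signal: np.ndarray, sample_rate: int) -> float:
    """Centre of gravity of the magnitude spectrum: the average frequency
    weighted by amplitude, in Hz. For fricatives, higher values tend to
    indicate a more front place of articulation."""
    mags = np.abs(np.fft.rfft(signal))                      # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return float(np.sum(freqs * mags) / np.sum(mags))       # weighted mean
```

Comparing the centroid of one fricative against another (rather than reading it as an absolute value) is how the measure is typically used to rank places of articulation.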
https://en.wikipedia.org/wiki/Frost
Frost
Frost is a thin layer of ice on a solid surface, which forms when water vapor in an above-freezing atmosphere comes in contact with a solid surface whose temperature is below freezing, causing a phase change from water vapor (a gas) to ice (a solid) as the water vapor reaches the freezing point. In temperate climates, it most commonly appears on surfaces near the ground as fragile white crystals; in cold climates, it occurs in a greater variety of forms. The propagation of crystal formation occurs by the process of nucleation. The ice crystals of frost form as the result of fractal process development. The depth of frost crystals varies depending on the amount of time they have been accumulating, and the concentration of the water vapor (humidity). Frost crystals may be invisible (black), clear (translucent), or white; if a mass of frost crystals scatters light in all directions, the coating of frost appears white. Types of frost include crystalline frost (hoar frost or radiation frost) from deposition of water vapor from air of low humidity, white frost in humid conditions, window frost on glass surfaces, advection frost from cold wind over cold surfaces, black frost without visible ice at low temperatures and very low humidity, and rime under supercooled wet conditions. Plants that have evolved in warmer climates suffer damage when the temperature falls low enough to freeze the water in the cells that make up the plant tissue. The tissue damage resulting from this process is known as "frost damage". Farmers in those regions where frost damage is known to affect their crops often invest in substantial means to protect their crops from such damage. Formation If a solid surface is chilled below the dew point of the surrounding humid air, and the surface itself is colder than freezing, ice will form on it. If the water deposits as a liquid that then freezes, it forms a coating that may look glassy, opaque, or crystalline, depending on its type. 
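The formation condition just stated (a surface chilled below both the dew point of the surrounding air and the freezing point of water) can be checked numerically. This is an illustrative sketch, not part of the article: the function names are invented, and the dew point is estimated with the widely used Magnus approximation and commonly cited coefficient values.

```python
import math

def dew_point_c(air_temp_c: float, rel_humidity_pct: float) -> float:
    """Dew point in degrees Celsius via the Magnus approximation.
    Coefficients a, b are one commonly cited pairing for water vapor."""
    a, b = 17.62, 243.12
    gamma = (a * air_temp_c) / (b + air_temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

def frost_expected(surface_temp_c: float, air_temp_c: float,
                   rel_humidity_pct: float) -> bool:
    """Ice forms on the surface when it is below the dew point of the
    surrounding humid air AND below the freezing point of water."""
    below_dew_point = surface_temp_c < dew_point_c(air_temp_c, rel_humidity_pct)
    below_freezing = surface_temp_c < 0.0
    return below_dew_point and below_freezing
```

For example, a surface at -5 °C under air at 2 °C and 90% relative humidity satisfies both conditions, while the same surface at +1 °C does not, matching the two-part rule in the text.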
Depending on context, that process also may be called atmospheric icing. The ice it produces differs in some ways from crystalline frost, which consists of spicules of ice that typically project from the solid surface on which they grow. The main difference between the ice coatings and frost spicules arises because the crystalline spicules grow directly from desublimation of water vapour from air, and desublimation is not a factor in icing of freezing surfaces. For desublimation to proceed, the surface must be below the frost point of the air, meaning that it is sufficiently cold for ice to form without passing through the liquid phase. The air must be humid, but not sufficiently humid to permit the condensation of liquid water, or icing will result instead of desublimation. The size of the crystals depends largely on the temperature, the amount of water vapor available, and how long they have been growing undisturbed. As a rule, except in conditions where supercooled droplets are present in the air, frost will form only if the deposition surface is colder than the surrounding air. For instance, frost may be observed around cracks in cold wooden sidewalks when humid air escapes from the warmer ground beneath. Other objects on which frost commonly forms are those with low specific heat or high thermal emissivity, such as blackened metals, hence the accumulation of frost on the heads of rusty nails. The apparently erratic occurrence of frost in adjacent localities is due partly to differences of elevation, the lower areas becoming colder on calm nights. Where static air settles above an area of ground in the absence of wind, the absorptivity and specific heat of the ground strongly influence the temperature that the trapped air attains. Types Hoar frost Hoar frost, also hoarfrost, radiation frost, or pruina, refers to white ice crystals deposited on the ground or loosely attached to exposed objects, such as wires or leaves. 
They form on cold, clear nights when conditions are such that heat radiates out to the open air faster than it can be replaced from nearby sources, such as wind or warm objects. Under suitable circumstances, objects cool to below the frost point of the surrounding air, well below the freezing point of water. Such freezing may be promoted by effects such as flood frost or frost pocket. These occur when ground-level radiation losses cool air until it flows downhill and accumulates in pockets of very cold air in valleys and hollows. Hoar frost may freeze in such low-lying cold air even when the air temperature a few feet above ground is well above freezing. The word "hoar" comes from an Old English adjective that means "showing signs of old age". In this context, it refers to the frost that makes trees and bushes look like white hair. Hoar frost may have different names depending on where it forms: Air hoar is a deposit of hoar frost on objects above the surface, such as tree branches, plant stems, and wires. Surface hoar refers to fern-like ice crystals directly deposited on snow, ice, or already frozen surfaces. Crevasse hoar consists of crystals that form in glacial crevasses where water vapour can accumulate under calm weather conditions. Depth hoar refers to faceted crystals that have slowly grown large within cavities beneath the surface of banks of dry snow. Depth hoar crystals grow continuously at the expense of neighbouring smaller crystals, so typically are visibly stepped and have faceted hollows. When surface hoar covers sloping snowbanks, the layer of frost crystals may create an avalanche risk; when heavy layers of new snow cover the frosty surface, furry crystals standing out from the old snow hold off the falling flakes, forming a layer of voids that prevents the new snow layers from bonding strongly to the old snow beneath. 
Ideal conditions for hoarfrost to form on snow are cold, clear nights, with very light, cold air currents conveying humidity at the right rate for growth of frost crystals. Wind that is too strong or warm destroys the furry crystals, and thereby may permit a stronger bond between the old and new snow layers. However, if the winds are strong enough and cold enough to lay the crystals flat and dry, carpeting the snow with cold, loose crystals without removing or destroying them or letting them warm up and become sticky, then the frost interface between the snow layers may still present an avalanche danger, because the texture of the frost crystals differs from the snow texture, and the dry crystals will not stick to fresh snow. Such conditions still prevent a strong bond between the snow layers. In very low temperatures where fluffy surface hoar crystals form without subsequently being covered with snow, strong winds may break them off, forming a dust of ice particles and blowing them over the surface. The ice dust then may form yukimarimo, as has been observed in parts of Antarctica, in a process similar to the formation of dust bunnies and similar structures. Hoar frost and white frost also occur in man-made environments such as in freezers or industrial cold-storage facilities. If such cold spaces or the pipes serving them are not well insulated and are exposed to ambient humidity, the moisture will freeze instantly depending on the freezer temperature. The frost may coat pipes thickly, partly insulating them, but such inefficient insulation still is a source of heat loss. Advection frost Advection frost (also called wind frost) refers to tiny ice spikes that form when very cold wind is blowing over tree branches, poles, and other surfaces. It looks like rimming on the edges of flowers and leaves, and usually forms against the direction of the wind. It can occur at any hour, day or night. 
Window frost Window frost (also called fern frost or ice flowers) forms when a glass pane is exposed to very cold air on the outside and warmer, moderately moist air on the inside. If the pane is a bad insulator (for example, if it is a single-pane window), water vapour condenses on the glass, forming frost patterns. With very low temperatures outside, frost can appear on the bottom of the window even with double-pane energy-efficient windows because the air convection between two panes of glass ensures that the bottom part of the glazing unit is colder than the top part. On unheated motor vehicles, the frost usually forms on the outside surface of the glass first. The glass surface influences the shape of crystals, so imperfections, scratches, or dust can modify the way ice nucleates. The patterns in window frost form a fractal with a fractal dimension greater than one, but less than two. This is a consequence of the nucleation process being constrained to unfold in two dimensions, unlike a snowflake, which is shaped by a similar process, but forms in three dimensions and has a fractal dimension greater than two. If the indoor air is very humid, rather than moderately so, water first condenses in small droplets, and then freezes into clear ice. Similar patterns of freezing may occur on other smooth vertical surfaces, but they seldom are as obvious or spectacular as on clear glass. White frost White frost is a solid deposition of ice that forms directly from water vapour contained in air. White frost forms when relative humidity is above 90% and the temperature below −8 °C (18 °F), and it grows against the wind direction, since air arriving from windward has a higher humidity than leeward air, but the wind must not be strong, else it damages the delicate icy structures as they begin to form. White frost resembles a heavy coating of hoar frost with big, interlocking crystals, usually needle-shaped. 
Rime Rime is a type of ice deposition that occurs quickly, often under heavily humid and windy conditions. Technically speaking, it is not a type of frost, since usually supercooled water drops are involved, in contrast to the formation of hoar frost, in which water vapour desublimates slowly and directly. Ships travelling through Arctic seas may accumulate large quantities of rime on the rigging. Unlike hoar frost, which has a feathery appearance, rime generally has an icy, solid appearance. Black frost Black frost (or "killing frost") is not strictly speaking frost at all, because it is the condition seen in crops when the humidity is too low for frost to form, but the temperature falls so low that plant tissues freeze and die, becoming blackened, hence the term "black frost". Black frost often is called "killing frost" because white frost tends to be less cold, partly because the latent heat of freezing of the water reduces the temperature drop. Effect on plants Damage Many plants can be damaged or killed by freezing temperatures or frost. This varies with the type of plant, the tissue exposed, and how low temperatures get; a "light frost" of damages fewer types of plants than a "hard frost" below . Plants likely to be damaged even by a light frost include vines—such as beans, grapes, squashes, melons—along with nightshades such as tomatoes, eggplants, and peppers. Plants that may tolerate (or even benefit from) frosts include: root vegetables (e.g. beets, carrots, parsnips, onions) leafy greens (e.g. lettuces, spinach, chard, cucumber) cruciferous vegetables (e.g. cabbages, cauliflower, bok choy, broccoli, Brussels sprouts, radishes, kale, collard, mustard, turnips, rutabagas) Even those plants that tolerate frost may be damaged once temperatures drop even lower (below ). Hardy perennials, such as Hosta, become dormant after the first frosts and regrow when spring arrives. 
The entire visible plant may turn completely brown until the spring warmth, or may drop all of its leaves and flowers, leaving the stem and stalk only. Evergreen plants, such as pine trees, withstand frost although all or most growth stops. Frost crack is a bark defect caused by a combination of low temperatures and heat from the winter sun. Vegetation is not necessarily damaged when leaf temperatures drop below the freezing point of their cell contents. In the absence of a site nucleating the formation of ice crystals, the leaves remain in a supercooled liquid state, safely reaching temperatures of . However, once frost forms, the leaf cells may be damaged by sharp ice crystals. Hardening is the process by which a plant becomes tolerant to low temperatures. See also Cryobiology. Certain bacteria, notably Pseudomonas syringae, are particularly effective at triggering frost formation, raising the nucleation temperature to about . Bacteria lacking ice nucleation-active proteins (ice-minus bacteria) result in greatly reduced frost damage. Protection methods Typical measures to prevent frost or reduce its severity include one or more of: deploying powerful blowers to simulate wind, thereby preventing the formation of accumulations of cold air. There are variations on this theme. One variety is the wind machine, an engine-driven propeller mounted on a vertical pole that blows air almost horizontally. Wind machines were introduced as a method for frost protection in California during the 1920s, but they were not widely accepted until the 1940s and 1950s. Now, they are commonly used in many parts of the world. Another is the selective inverted sink, a device which prevents frost by drawing cold air from the ground and blowing it up through a chimney. It was originally developed to prevent frost damage to citrus fruits in Uruguay. In New Zealand, helicopters are used in similar fashion, especially in the vineyard regions such as Marlborough. 
By dragging down warmer air from the inversion layers, and preventing the ponding of colder air on the ground, the low-flying helicopters prevent damage to the fruit buds. As the operations are conducted at night, and have in the past involved up to 130 aircraft per night in one region, safety rules are strict. Although not a dedicated method, wind turbines have a similar (small) effect of vertically mixing air layers of different temperature. For high-value crops, farmers may wrap trees and cover crops. Heating to slow the drop in temperature is not practical except for high-value crops grown over small areas. Production of smoke to reduce cooling by radiation. Spraying crops with a layer of water releases latent heat, preventing harmful freezing of the tissues of the plants that it coats. Such measures need to be applied with discretion, because they may do more harm than good; for example, spraying crops with water can cause damage if the plants become overburdened with ice. An effective, low-cost method for small crop farms and plant nurseries exploits the latent heat of freezing. A pulsed irrigation timer delivers water through existing overhead sprinklers at low volumes to combat frosts down to . If the water freezes, it gives off its latent heat, preventing the temperature of the foliage from falling much below zero. Frost-free areas Frost-free areas are found mainly in the lowland tropics, where they cover almost all land except at altitudes above about near the equator and around in the semiarid areas in tropical regions. Some areas on the oceanic margins of the subtropics are also frost-free, as are highly oceanic areas near windward coasts. The most poleward frost-free areas are the lower altitudes of the Azores, Île Amsterdam, Île Saint-Paul, and Tristan da Cunha. In the United States, southern Florida around Miami Beach and the Florida Keys are the only reliably frost-free areas, as well as the Channel Islands off the coast of California. 
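The latent-heat irrigation method described above can be quantified with a back-of-envelope calculation. This is an illustrative sketch, not from the article, assuming standard textbook values (latent heat of fusion of water ≈334 kJ/kg, specific heat of liquid water ≈4.19 kJ/(kg·K)); the function name is hypothetical.

```python
# Back-of-envelope: heat released when sprayed irrigation water freezes.
# Assumed textbook constants, not values from the article:
L_FUSION_KJ_PER_KG = 334.0      # latent heat of fusion of water
C_WATER_KJ_PER_KG_K = 4.19      # specific heat of liquid water

def heat_released_kj(mass_kg: float, water_temp_c: float) -> float:
    """Heat given up by `mass_kg` of water cooling from `water_temp_c`
    down to 0 deg C and then freezing completely."""
    return mass_kg * (C_WATER_KJ_PER_KG_K * water_temp_c + L_FUSION_KJ_PER_KG)

# One litre (~1 kg) of 5 deg C sprinkler water releases roughly 355 kJ
# as it cools and freezes, heat that holds the coated foliage near
# 0 deg C instead of letting it fall to the air temperature.
print(round(heat_released_kj(1.0, 5.0)))  # 355
```

The freezing term dominates: even water already at 0 °C releases 334 kJ per kilogram as it turns to ice, which is why the foliage stays near zero as long as liquid water keeps being applied.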
The hardiness zones in these regions are 11a and 11b. Personifications Frost is personified in Russian culture as Ded Moroz. Indigenous peoples of Russia such as the Mordvins have their own traditions of frost deities. English folklore tradition holds that Jack Frost, an elfish creature, is responsible for feathery patterns of frost found on windows on cold mornings. Gallery See also Black ice Frost (temperature) Frost heaving Frost line Frostbite Ground frost Icing (nautical) Needle ice References External links Guide to Frost How much do you know about frost? – BBC American Meteorological Society, Glossary of Meteorology – Hoarfrost The Weather Doctor – Weather Whys – Frost Precipitation Psychrometrics Water ice
https://en.wikipedia.org/wiki/Franz%20Schmidt
Franz Schmidt
Franz Schmidt, also Ferenc Schmidt (22 December 1874 – 11 February 1939) was an Austro-Hungarian composer, cellist and pianist. Life Schmidt was born in Pozsony/Pressburg, in the Hungarian part of Austria-Hungary (today Bratislava, Slovakia) to a half-Hungarian father – with the same name, born in the same city – and to a Hungarian mother, Mária Ravasz. He was a Roman Catholic. His earliest teacher was his mother, Mária Ravasz, an accomplished pianist, who gave him a systematic instruction in the keyboard works of J. S. Bach. He received a foundation in theory from , the organist at the Franciscan church in Pressburg. He studied piano briefly with Theodor Leschetizky, with whom he clashed. He moved to Vienna with his family in 1888, and studied at the Vienna Conservatory (composition with Robert Fuchs, cello with Ferdinand Hellmesberger, and, for a few lessons, counterpoint with Anton Bruckner, who was already seriously ill at that time), graduating "with excellence" in 1896. He obtained a post as cellist with the Vienna Court Opera Orchestra, where he played until 1914, often under Gustav Mahler. Mahler habitually had Schmidt play all the cello solos, even though Friedrich Buxbaum was the principal cellist. Schmidt was also in demand as a chamber musician. Schmidt and Arnold Schoenberg maintained cordial relations despite their vast differences in eventual outlook and style (Schmidt certainly shows a perceptible influence from Schoenberg's early, tonal works such as Verklärte Nacht, Op. 4, in whose Viennese première he participated as cellist, the Chamber Symphony No. 1, Op. 9 and the gigantic cantata Gurre-Lieder. Unable to procure a teaching position for Schoenberg at the Academy, Schmidt rehearsed his students in a performance of Pierrot Lunaire, Op. 21 which Schoenberg warmly praised). 
Also a brilliant pianist, in 1914 Schmidt took up a professorship in piano at the Vienna Conservatory, which had been recently renamed Imperial Academy of Music and the Performing Arts. (Apparently, when asked who the greatest living pianist was, Leopold Godowsky replied, "The other one is Franz Schmidt.") In 1925 he became Director of the Academy, and from 1927 to 1931 its Rector. As teacher of piano, cello and counterpoint and composition at the Academy, Schmidt trained numerous instrumentalists, conductors, and composers who later achieved fame. Among his best-known students were the pianist Friedrich Wührer and Alfred Rosé (son of Arnold Rosé, the founder of the Rosé Quartet, Konzertmeister of the Vienna Philharmonic and brother-in-law of Gustav Mahler). Among the composers were Walter Bricht (his favourite student), Theodor Berger, Marcel Rubin, Alfred Uhl and Ľudovít Rajter. He received many tokens of the high esteem in which he was held, notably the Order of Franz Joseph, and an Honorary Doctorate from the University of Vienna. Schmidt's private life was in stark contrast to the success of his distinguished professional career. His first wife, Karoline Perssin (c. 1880–1943), was confined in the Vienna mental hospital Am Steinhof in 1919, and three years after his death was murdered under the Nazi euthanasia program. Their daughter Emma Schmidt Holzschuh (1902–1932, married 1929) died unexpectedly after the birth of her first child. Schmidt experienced a spiritual and physical breakdown after this, and achieved an artistic revival and resolution in his Fourth Symphony of 1933 (which he inscribed as "Requiem for my Daughter") and, especially, in his oratorio The Book with Seven Seals. His second marriage in 1923, to a successful young piano student Margarethe Jirasek (1891–1964), for the first time brought some desperately needed stability into the private life of the artist, who was plagued by many serious health problems. 
Schmidt's worsening health forced his retirement from the Academy in early 1937. In the last year of his life Austria was brought into the German Reich by the Anschluss, and Schmidt was feted by the Nazi authorities as the greatest living composer of the so-called Ostmark. He was given a commission to write a cantata entitled The German Resurrection, which, after 1945, was taken by many as a reason to brand him as having been tainted by Nazi sympathy. However, Schmidt left this composition unfinished, and in the summer and autumn of 1938, a few months before his death, set it aside to devote himself to two other commissioned works for the one-armed pianist Paul Wittgenstein: the Quintet in A major for piano left-hand, clarinet, and string trio; and the Toccata in D minor for solo piano. Schmidt died on 11 February 1939. Musical works As a composer, Schmidt was slow to develop, but his reputation, at least in Austria, saw a steady growth from the late 1890s until his death in 1939. In his music, Schmidt continued to develop the Viennese classic-romantic traditions he inherited from Schubert, Brahms, and Bruckner. He also takes forward the "gypsy" style of Liszt and Brahms. His works are monumental in form and firmly tonal in language, though quite often innovative in their designs and clearly open to some of the new developments in musical syntax initiated by Mahler and Schoenberg. Although Schmidt did not write a lot of chamber music, what he did write, in the opinion of such critics as Wilhelm Altmann, was important and of high quality. Although Schmidt's organ works may resemble others of the era in terms of length, complexity, and difficulty, they are forward-looking in being conceived for the smaller, clearer, classical-style instruments of the Orgelbewegung, which he advocated. Schmidt worked mainly in large forms, including four symphonies (1899, 1913, 1928 and 1933) and two operas: Notre Dame (1904–6) and Fredigundis (1916–21). 
A CD recording of Notre Dame has been available for many years, starring Dame Gwyneth Jones and James King. Fredigundis No really adequate recording has been made of Schmidt's second and last opera Fredigundis, of which there has been but one "unauthorized" release in the early 1980s on the Voce label of an Austrian Radio broadcast of a 1979 Vienna performance under the direction of Ernst Märzendorfer. Aside from numerous "royal fanfares" (Fredigundis held the French throne in the sixth century) the score contains some fine examples of Schmidt's transitional style between his earlier and later manner. In many respects, Schmidt seldom ventured so far from traditional tonality again, and his third and final period (in the last decade-and-a-half of his life) was generally one of (at least partial) retrenchment, consolidation and the integration of the style of his opulently scored and melodious early compositions (the First Symphony, "Notre Dame") with elements of the overt experimentation seen in "Fredigundis", combined with an economy of utterance born of artistic maturity. New Grove encyclopaedia states that Fredigundis was a critical and popular failure, which may be partly attributable to the fact that Fredigundis (Fredegund, the widow of Chilperic I), is presented as a murderous and sadistic feminine monster. Add to this some structural problems with the libretto, and the opera's failure to make headway – despite an admirable and impressive score – becomes comprehensible. The Book with Seven Seals Aside from the mature symphonies (Nos. 2–4), Schmidt's crowning achievement was the oratorio The Book with Seven Seals (1935–37), a setting of passages from the Book of Revelation. His choice of subject was prophetic: with hindsight the work appears to foretell, in the most powerful terms, the disasters that were shortly to be visited upon Europe in the Second World War. Here his invention rises to a sustained pitch of genius. 
A narrative upon the text of the oratorio was provided by the composer. Schmidt's oratorio stands in the Austro-German tradition stretching back to the time of J. S. Bach and Handel. He was one of relatively few composers to write an oratorio fully on the subject of the Book of Revelation (earlier works include Georg Philipp Telemann: Der Tag des Gerichts, Schneider: Das Weltgericht, Louis Spohr: Die letzten Dinge, Joachim Raff: Weltende, and Ralph Vaughan Williams: Sancta Civitas). Far from glorifying its subject, it is a mystical contemplation, a horrified warning, and a prayer for salvation. The premiere was held in Vienna on 15 June 1938, with the Vienna Symphony Orchestra under Oswald Kabasta: the soloists were Rudolf Gerlach (John), Erika Rokyta, Enid Szánthó, Anton Dermota, Josef von Manowarda and Franz Schütz at the organ. Symphonies Schmidt is generally regarded as a conservative composer, but the rhythmic subtlety and harmonic complexity of much of his music belie this. His music combines a reverence for the Austro-German lineage of composers with innovations in harmony and orchestration (showing an awareness of the output of composers such as Debussy and Ravel, whose piano music he greatly admired, along with a knowledge of more recent composers in his own German-speaking realm, such as Schoenberg, Berg, Hindemith, etc.). Symphony No. 1 in E major. Written in 1896 at age 22. The scherzo (which shows a mature absorption of Bruckner and Richard Strauss) is especially noteworthy, while Schmidt demonstrates his contrapuntal skills in the Finale. Symphony No. 2 in E-flat major. Written in 1913 in a style reminiscent of Strauss and Reger, with homage to the grandiosity of Bruckner. This is Schmidt's longest symphony and it employs a huge orchestra. The central movement (of three) is an ingenious set of variations, which are grouped to suggest the characters of slow movement and scherzo. 
The complex scoring renders it a considerable challenge for most orchestras. Symphony No. 3 in A major. A sunny, melodic work in the Schubert vein (although its lyricism and superb orchestration do much to conceal the fact that it is one of the composer's most harmonically advanced works). Winner of the Austrian section of the 1928 International Columbia Graphophone Competition (the overall winner was Swedish composer Kurt Atterberg with his 6th Symphony), it enjoyed some popularity at the time (1928). Symphony No. 4 in C major. Written in 1933, this is the best-known work of his entire oeuvre. The composer called it "A requiem for my daughter". It begins with a long 23-bar melody on an unaccompanied solo trumpet (which returns at the symphony's close, "transfigured" by all that has intervened). The Adagio is an immense ABA ternary structure. The first A is an expansive threnody on solo cello (Schmidt's own instrument) whose seamless lyricism predates Strauss's Metamorphosen by more than a decade (its theme is later adjusted to form the scherzo of the symphony); the B section is an equally expansive funeral march (unmistakably referencing the Marcia Funebre from Beethoven's Eroica in its texture) whose dramatic climax is marked by an orchestral crescendo culminating in a gong and cymbal crash (again, a clear allusion to similar climaxes in the later symphonies of Bruckner, and followed by what Harold Truscott has described as a "reverse climax", leading back to a repeat of the A section). Schmidt and Nazism Schmidt's premiere of The Book with Seven Seals was made much of by the Nazis (who had annexed Austria shortly before in the Anschluss), and Schmidt was seen to give the Nazi salute (according to a report by Georg Tintner, who revered Schmidt and whose intent to record his symphonies was never realised). His conductor Oswald Kabasta was apparently an enthusiastic Nazi who, being prohibited from conducting in 1946 during de-nazification, committed suicide. 
These facts long placed Schmidt's posthumous reputation under a cloud. His lifelong friend and colleague Oskar Adler, who fled the Nazis in 1938, wrote afterwards that Schmidt was never a Nazi and never antisemitic but was extremely naive about politics. Hans Keller gave a similar endorsement. Regarding Schmidt's political naivety, Michael Steinberg, in his book The Symphony, tells of Schmidt's recommending Variations on a Hebrew Theme by his student Israel Brandmann to a musical group associated with the proto-Nazi German National Party. Most of Schmidt's principal musical friends were Jews, and they benefited from his generosity. Schmidt's last listed work, the cantata Deutsche Auferstehung (German Resurrection), was composed to a Nazi text. As one of the most famous living Austrian composers, Schmidt was well known to Hitler and received this commission after the Anschluss. He left it unfinished, to be completed later by Robert Wagner. Already seriously ill, Schmidt worked instead on other compositions such as the Quintet in A major for piano (left hand), clarinet and string trio, intended for Paul Wittgenstein and incorporating a variation set based on a theme by Wittgenstein's old teacher, Josef Labor. His failure to complete the cantata is likely to be a further indication that he was not committed to the Nazi cause; such, at any rate, was the opinion of his friend Oskar Adler. Listing of works Operas Notre Dame, romantic Opera in two acts, text after Victor Hugo by Franz Schmidt and Leopold Wilk; comp. 1902–4, premiered Vienna 1914 Fredigundis, Opera in three acts, text after Felix Dahn by and ; comp. 1916–21, premiered Berlin 1922 Oratorio The Book with Seven Seals (Das Buch mit sieben Siegeln) for Soli, Chorus, Organ and Orchestra, Text after the Revelation of St John; comp. 1935–37; premiered Vienna, 1938 Cantata Deutsche Auferstehung a Festival Song for Soli, Chorus, Organ and Orchestra, Text by Oskar Dietrich; comp. 
1938–39, unfinished, prepared for performance by Dr. Robert Wagner; premiered Vienna, 1940 Symphonies Symphony No. 1 in E major; comp. 1896–99, premiered Vienna 1902 Symphony No. 2 in E-flat major; comp. 1911–13, premiered Vienna 1913 Symphony No. 3 in A major; comp. 1927–28, premiered Vienna 1928 Symphony No. 4 in C major; comp. 1932–33, premiered Vienna 1934 Piano concertos Concertante Variations on a Theme of Beethoven for Piano (left hand alone) with orchestral accompaniment; comp. 1923, premiered Vienna 1924; Two-handed arrangement by Friedrich Wührer (1952) Piano Concerto in E-flat major (for left hand alone); comp. 1934, premiered: Vienna 1935; Two-handed version by Friedrich Wührer (1952) Other orchestral works Carnival music and Intermezzo from the Opera Notre Dame; comp. 1902–03; premiered Vienna 1903 Variations on a Hussar Song for orchestra; comp. 1930–31; premiered Vienna 1931 Chaconne in D minor; transcribed from the Chaconne in C-sharp minor for organ from 1925; completed 1931; Manuscript Chamber music Four Little Fantasy pieces after Hungarian National Melodies, for cello with piano accompaniment; comp. 1892; premiered Vienna 1926 (three pieces) String Quartet in A major; comp. 1925; premiered Vienna 1925 String Quartet in G major; comp. 1929; premiered Vienna 1930 Quintet for piano left hand, two violins, viola and cello in G major; comp. 1926; premiered Stuttgart 1931; two-handed arrangement by Friedrich Wührer (1954) Quintet for clarinet, piano left hand, violin, viola and cello in B-flat major; comp. 1932; premiered Vienna 1933 Quintet for clarinet, piano left hand, violin, viola and cello in A major; comp. 1938; premiered Vienna 1939; two-handed arrangement by Friedrich Wührer (1952) Music for trumpets Variations and Fugue on an original Theme in D major (King's Fanfare from Fredigundis); 3. Arrangement for Trumpets alone; comp. 
1925, premiered 1925

Music for organ and trumpet
Variations and Fugue on an original Theme in D major (King's Fanfare from Fredigundis); 4th arrangement, for 14 Trumpets, Kettledrum and Organ; comp. 1925, premiered Vienna 1925
Choral overture "God preserve us" for Organ with ad libitum processional Trumpet-chorus; comp. 1933, premiered Vienna 1933
Solemn Fugue (Fuga solemnis) for Organ with Entrance of 6 Trumpets, 6 Horns, 3 Trombones, Bass Tuba and Kettledrums; comp. 1937, premiered Vienna 1939

Piano music
Romance in A major
Christmas pastorale in A major (arrangement of the organ work)
Intermezzo in F-sharp minor (2nd movement of the A major Quintet)
Toccata in D minor (for left hand alone); comp. 1938, premiered Vienna 1940 (two-handed arrangement); two-handed arrangement by Friedrich Wührer (1952)

Organ works
Variations on a theme by Christoph Willibald Gluck (lost)
Variations and Fugue on an original theme in D major (King's Fanfare from Fredigundis), 1st arrangement; comp. 1916
Phantasie and Fugue in D major; comp. 1923–24, premiered Vienna 1924
Variations and Fugue on an original theme in D major (King's Fanfare from Fredigundis), 2nd arrangement; comp. 1924, premiered Vienna 1924
Toccata in C major; comp. 1924, premiered Vienna 1925
Prelude and Fugue in E-flat major; comp. 1924, premiered Vienna 1925
Chaconne in C-sharp minor; comp. 1925, premiered Vienna 1925
Four small Chorale preludes; comp. 1926, premiered Vienna 1926:
"O Ewigkeit du Donnerwort" (O Eternity, Thou Thundrous Word), F major
"Was mein Gott will" (What My God Wills), D major
"O, wie selig seid ihr doch, ihr Frommen" (O How Happy Are Ye Now, You Blessed), D minor
"Nun danket alle Gott" (Now Thank We All Our God), A major
Fugue in F major; comp. 1927, premiered Vienna 1932
Prelude and Fugue in C major; comp. 1927, premiered Vienna 1928
Four little Preludes and Fugues; comp. 1928, premiered Berlin 1929:
Prelude and Fugue in E-flat major
Prelude and Fugue in C minor
Prelude and Fugue in G major
Prelude and Fugue in D major
Chorale Prelude, "Der Heiland ist erstanden" (The Saviour Is Risen); comp. 1934, premiered Vienna 1934
Prelude and Fugue in A major (Christmas pastoral); comp. 1934, premiered Vienna 1934
Toccata and Fugue in A-flat major; comp. 1935, premiered Vienna 1936

External links
Franz Schmidt String Quartet No.1 sound-bites and information about the work

Categories: 1874 births; 1939 deaths; 20th-century Austrian composers; 20th-century Austrian male musicians; 19th-century classical composers; 19th-century classical pianists; 19th-century male musicians; 20th-century classical composers; 20th-century classical pianists; Burials at the Vienna Central Cemetery; Hungarian classical cellists; Hungarian classical composers; Hungarian male classical composers; Hungarian classical pianists; Hungarian music educators; Hungarian Roman Catholics; Male classical pianists; Male opera composers; Musicians from Bratislava; Composers from Vienna
https://en.wikipedia.org/wiki/Fucking
Fucking
Fucking may refer to:
Fucking, an English profanity derived from fuck
Fucking, a synonym for sexual intercourse
Fugging, Upper Austria, a village known as Fucking until 2021
Fugging, Lower Austria, a village known as Fucking until 1836

See also
Fakkin, abbreviation of the Japanese restaurant First Kitchen
Fuck (disambiguation)
Fugging (disambiguation)
Fukin (disambiguation)
Fuqing
Fuxing (disambiguation)
https://en.wikipedia.org/wiki/Finnish%20Civil%20War
Finnish Civil War
The Finnish Civil War was a civil war in Finland in 1918 fought for the leadership and control of the country between White Finland and the Finnish Socialist Workers' Republic (Red Finland) during the country's transition from a Grand Duchy of the Russian Empire to an independent state. The clashes took place in the context of the national, political, and social turmoil caused by World War I (Eastern Front) in Europe. The war was fought between the Reds, led by a section of the Social Democratic Party, and the Whites, conducted by the conservative-based Senate and the German Imperial Army. The paramilitary Red Guards, which were composed of industrial and agrarian workers, controlled the cities and industrial centers of southern Finland. The paramilitary White Guards, which consisted of land owners and those in the middle and upper-classes, controlled rural central and northern Finland, and were led by General C. G. E. Mannerheim. In the years before the conflict, Finland had experienced rapid population growth, industrialisation, pre-urbanisation and the rise of a comprehensive labour movement. The country's political and governmental systems were in an unstable phase of democratisation and modernisation. The socio-economic condition and education of the population had gradually improved, and national thinking and cultural life had increased. World War I led to the collapse of the Russian Empire, causing a power vacuum in Finland, and the subsequent struggle for dominance led to militarisation and an escalating crisis between the left-leaning labour movement and the conservatives. The Reds carried out an unsuccessful general offensive in February 1918, supplied with weapons by Soviet Russia. A counteroffensive by the Whites began in March, reinforced by the German Empire's military detachments in April. 
The decisive engagements were the Battles of Tampere and Vyborg, won by the Whites, and the Battles of Helsinki and Lahti, won by German troops, leading to overall victory for the Whites and the German forces. Political violence became a part of this warfare. Around 12,500 Red prisoners died of malnutrition and disease in camps. About 39,000 people, of whom 36,000 were Finns, perished in the conflict. In the immediate aftermath, the Finns passed from Russian governance to the German sphere of influence with a plan to establish a German-led Finnish monarchy. The scheme ended with Germany's defeat in World War I, and Finland instead emerged as an independent, democratic republic. The Civil War divided the nation for decades. Finnish society was reunited through social compromises based on a long-term culture of moderate politics and religion and the post-war economic recovery. The Finnish Civil War of 1918 was the second civil conflict within Finland's borders, as the Cudgel War of 1596–1597 (in which poor peasants rose up against the troops, nobles and cavalry who taxed them) had similar features to the Civil War of 1918.

Background

International politics

The main factor behind the Finnish Civil War was a political crisis arising out of World War I. Under the pressures of the Great War, the Russian Empire collapsed, leading to the February and October Revolutions in 1917. This breakdown caused a power vacuum and a subsequent struggle for power in Eastern Europe. The Grand Duchy of Finland (1809–1917) became embroiled in the turmoil. Geopolitically less important than the continental Moscow–Warsaw gateway, Finland, isolated by the Baltic Sea, was relatively peaceful until early 1918. The war between the German Empire and Russia had only indirect effects on the Finns.
Since the end of the 19th century, the Grand Duchy had become a vital source of raw materials, industrial products, food and labour for the growing Imperial Russian capital Petrograd (modern Saint Petersburg), and World War I emphasised that role. Strategically, the Finnish territory was the less important northern section of the Estonian–Finnish gateway and a buffer zone to and from Petrograd through the Narva area, the Gulf of Finland and the Karelian Isthmus. The German Empire saw Eastern Europe—primarily Russia—as a major source of vital products and raw materials, both during World War I and for the future. Her resources overstretched by the two-front war, Germany attempted to divide Russia by providing financial support to revolutionary groups, such as the Bolsheviks and the Socialist Revolutionary Party, and to radical, separatist factions, such as the Finnish national activist movement leaning toward Germanism. Between 30 and 40 million marks were spent on this endeavour. Controlling the Finnish area would allow the Imperial German Army to penetrate Petrograd and the Kola Peninsula, an area rich in raw materials for the mining industry. Finland possessed large ore reserves and a well-developed forest industry. From 1809 to 1898, a period called Pax Russica, the peripheral authority of the Finns gradually increased, and Russo-Finnish relations were exceptionally peaceful in comparison with other parts of the Russian Empire. Russia's defeat in the Crimean War in the 1850s led to attempts to speed up the modernisation of the country. This caused more than 50 years of economic, industrial, cultural and educational progress in the Grand Duchy of Finland, including an improvement in the status of the Finnish language. 
All this encouraged Finnish nationalism and cultural unity through the birth of the Fennoman movement, which bound the Finns to the domestic administration and led to the idea that the Grand Duchy was an increasingly autonomous state of the Russian Empire. In 1899, the Russian Empire initiated a policy of integration through the Russification of Finland. The strengthened, pan-Slavist central power tried to unite the "Russian Multinational Dynastic Union" as the military and strategic situation of Russia became more perilous due to the rise of Germany and Japan. Finns called the increased military and administrative control "the First Period of Oppression", and for the first time Finnish politicians drew up plans for disengagement from Russia or sovereignty for Finland. In the struggle against integration, activists drawn from sections of the working class and the Swedish-speaking intelligentsia carried out terrorist acts. During World War I and the rise of Germanism, the pro-Swedish Svecomans began their covert collaboration with Imperial Germany and, from 1915 to 1917, a Jäger battalion consisting of 1,900 Finnish volunteers was trained in Germany.

Domestic politics

The major reasons for rising political tensions among Finns were the autocratic rule of the Russian czar and the undemocratic class system of the estates of the realm. The latter system originated in the regime of the Swedish Empire that preceded Russian governance and divided the Finnish people economically, socially and politically. Finland's population grew rapidly in the nineteenth century (from 860,000 in 1810 to 3,130,000 in 1917), and a class of agrarian and industrial workers, as well as crofters, emerged over the period. The Industrial Revolution was rapid in Finland, though it started later than in the rest of Western Europe. Industrialisation was financed by the state and some of the social problems associated with the industrial process were diminished by the administration's actions.
Among urban workers, socio-economic problems steepened during periods of industrial depression. The position of rural workers worsened after the end of the nineteenth century, as farming became more efficient and market-oriented, and the development of industry was insufficiently vigorous to fully utilise the rapid population growth of the countryside. The difference between Scandinavian-Finnish and Russian-Slavic culture affected the nature of Finnish national integration. The upper social strata took the lead and gained domestic authority from the Russian czar in 1809. The estates planned to build an increasingly autonomous Finnish state, led by the elite and the intelligentsia. The Fennoman movement aimed to include the common people in a non-political role; the labour movement, youth associations and the temperance movement were initially led "from above". Between 1870 and 1916 industrialisation gradually improved social conditions and the self-confidence of workers, but while the standard of living of the common people rose in absolute terms, the rift between rich and poor deepened markedly. The commoners' rising awareness of socio-economic and political questions interacted with the ideas of socialism, social liberalism and nationalism. The workers' initiatives and the corresponding responses of the dominant authorities intensified social conflict in Finland. The Finnish labour movement, which emerged at the end of the nineteenth century from temperance, religious movements and Fennomania, had a Finnish nationalist, working-class character. From 1899 to 1906, the movement became conclusively independent, shedding the paternalistic thinking of the Fennoman estates, and it was represented by the Finnish Social Democratic Party, established in 1899. Workers' activism was directed both toward opposing Russification and toward developing a domestic policy that tackled social problems and responded to the demand for democracy.
This was a reaction to the domestic dispute, ongoing since the 1880s, between the Finnish nobility-bourgeoisie and the labour movement concerning voting rights for the common people. Despite their obligations as obedient, peaceful and non-political inhabitants of the Grand Duchy (who had, only a few decades earlier, accepted the class system as the natural order of their life), the commoners began to demand their civil rights and citizenship in Finnish society. The power struggle between the Finnish estates and the Russian administration gave a concrete role model and free space for the labour movement. On the other side, due to an at-least century-long tradition and experience of administrative authority, the Finnish elite saw itself as the inherent natural leader of the nation. The political struggle for democracy was solved outside Finland, in international politics: the Russian Empire's failed 1904–1905 war against Japan led to the 1905 Revolution in Russia and to a general strike in Finland. In an attempt to quell the general unrest, the system of estates was abolished in the Parliamentary Reform of 1906. The general strike increased support for the social democrats substantially. The party encompassed a higher proportion of the population than any other socialist movement in the world. The Reform of 1906 was a giant leap towards the political and social liberalisation of the common Finnish people because the Russian House of Romanov had been the most autocratic and conservative ruler in Europe. The Finns adopted a unicameral parliamentary system, the Parliament of Finland, with universal suffrage. The number of voters increased from 126,000 to 1,273,000, including female citizens. The reform led to the social democrats obtaining about fifty percent of the popular vote, but the Czar regained his authority after the crisis of 1905.
Subsequently, during the more severe programme of Russification, called "the Second Period of Oppression" by the Finns, the Czar neutralised the power of the Finnish Parliament between 1908 and 1917. He dissolved the assembly, ordered parliamentary elections almost annually, and determined the composition of the Finnish Senate, which did not correspond to the composition of Parliament. The capacity of the Finnish Parliament to solve socio-economic problems was stymied by confrontations between the largely uneducated commoners and the former estates. Another conflict festered as employers denied collective bargaining and the right of the labour unions to represent workers. The parliamentary process disappointed the labour movement, but as dominance in the Parliament and legislation was the workers' most likely way to obtain a more balanced society, they identified themselves with the state. Overall domestic politics led to a contest for leadership of the Finnish state during the ten years before the collapse of the Russian Empire.

February Revolution

Build-up

The Second Period of Russification was halted on 15 March 1917 by the February Revolution, which removed the czar, Nicholas II. The collapse of Russia was caused by military defeats, war-weariness with the duration and hardships of the Great War, and the collision between the most conservative regime in Europe and a Russian people desiring modernisation. The Czar's power was transferred to the State Duma (Russian Parliament) and the right-wing Provisional Government, but this new authority was challenged by the Petrograd Soviet (city council), leading to dual power in the country. The autonomous status of 1809–1899 was returned to the Finns by the March 1917 manifesto of the Russian Provisional Government. For the first time in history, de facto political power existed in the Parliament of Finland. The political left, consisting mainly of social democrats, covered a wide spectrum from moderate to revolutionary socialists.
The political right was even more diverse, ranging from social liberals and moderate conservatives to rightist conservative elements. The four main parties were: the conservative Finnish Party; the Young Finnish Party, which included both liberals and conservatives, with the liberals divided between social liberals and economic liberals; the social reformist, centrist Agrarian League, which drew its support mainly from peasants with small or mid-sized farms; and the conservative Swedish People's Party, which sought to retain the rights of the former nobility and the Swedish-speaking minority of Finland. During 1917, a power struggle and social disintegration interacted. The collapse of Russia induced a chain reaction of disintegration, starting from the government, military and economy, and spreading to all fields of society, such as local administration, workplaces and individual citizens. The social democrats wanted to retain the civil rights already achieved and to increase the socialists' power over society. The conservatives feared the loss of their long-held socio-economic dominance. Both factions collaborated with their equivalents in Russia, deepening the split in the nation. The Social Democratic Party gained an absolute majority in the parliamentary elections of 1916. A new Senate was formed in March 1917 by Oskari Tokoi, but it did not reflect the socialists' large parliamentary majority: it comprised six social democrats and six non-socialists. In theory, the Senate consisted of a broad national coalition, but in practice (with the main political groups unwilling to compromise and top politicians remaining outside of it), it proved unable to solve any major Finnish problem. After the February Revolution, political authority descended to the street level: mass meetings, strike organisations and worker-soldier councils on the left, and active organisations of employers on the right, all serving to undermine the authority of the state.
The February Revolution halted the Finnish economic boom caused by the Russian war-economy. The collapse in business led to unemployment and high inflation, but the employed workers gained an opportunity to resolve workplace problems. The commoners' call for the eight-hour working day, better working conditions and higher wages led to demonstrations and large-scale strikes in industry and agriculture. While the Finns had specialised in milk and butter production, the bulk of the food supply for the country depended on cereals produced in southern Russia. The cessation of cereal imports from disintegrating Russia led to food shortages in Finland. The Senate responded by introducing rationing and price controls. The farmers resisted the state control and thus a black market, accompanied by sharply rising food prices, formed. As a consequence, export to the free market of the Petrograd area increased. Food supply, prices and, in the end, the fear of starvation became emotional political issues between farmers and urban workers, especially those who were unemployed. Common people, their fears exploited by politicians and an incendiary, polarised political media, took to the streets. Despite the food shortages, no actual large-scale starvation hit southern Finland before the civil war and the food market remained a secondary stimulator in the power struggle of the Finnish state.

Contest for leadership

The passing of the Tokoi Senate bill called the "Law of Supreme Power" (more commonly known as valtalaki) in July 1917 triggered one of the key crises in the power struggle between the social democrats and the conservatives. The fall of the Russian Empire opened the question of who would hold sovereign political authority in the former Grand Duchy. After decades of political disappointment, the February Revolution offered the Finnish social democrats an opportunity to govern; they held the absolute majority in Parliament.
The conservatives were alarmed by the continuous increase of the socialists' influence since 1899, which reached a climax in 1917. The "Law of Supreme Power" incorporated a plan by the socialists to substantially increase the authority of Parliament, as a reaction to the non-parliamentary and conservative leadership of the Finnish Senate between 1906 and 1916. The bill furthered Finnish autonomy in domestic affairs: the Russian Provisional Government was only allowed the right to control Finnish foreign and military policies. The Act was adopted with the support of the Social Democratic Party, the Agrarian League, part of the Young Finnish Party and some activists eager for Finnish sovereignty. The conservatives opposed the bill and some of the most right-wing representatives resigned from Parliament. In Petrograd, the social democrats' plan had the backing of the Bolsheviks. They had been plotting a revolt against the Provisional Government since April 1917, and pro-Soviet demonstrations during the July Days brought matters to a head. The Helsinki Soviet and the Regional Committee of the Finnish Soviets, led by the Bolshevik Ivar Smilga, both pledged to defend the Finnish Parliament, were it threatened with attack. However, the Provisional Government still had sufficient support in the Russian army to survive and as the street movement waned, Vladimir Lenin fled to Karelia. In the aftermath of these events, the "Law of Supreme Power" was overruled and the social democrats eventually backed down; more Russian troops were sent to Finland and, with the co-operation and insistence of the Finnish conservatives, Parliament was dissolved and new elections announced. In the October 1917 elections, the social democrats lost their absolute majority, which radicalised the labour movement and decreased support for moderate politics. 
The crisis of July 1917 did not bring about the Red Revolution of January 1918 on its own, but together with political developments based on the commoners' interpretation of the ideas of Fennomania and socialism, the events favoured a Finnish revolution. In order to win power, the socialists had to overcome Parliament. The February Revolution resulted in a loss of institutional authority in Finland and the dissolution of the police force, creating fear and uncertainty. In response, both the right and left assembled their own security groups, which were initially local and largely unarmed. By late 1917, following the dissolution of Parliament, in the absence of a strong government and national armed forces, the security groups began assuming a broader and more paramilitary character. The Civil Guards and the later White Guards were organised by local men of influence: conservative academics, industrialists, major landowners, and activists. The Workers' Order Guards and the Red Guards were recruited through the local social democratic party sections and from the labour unions.

October Revolution

The Bolsheviks' and Vladimir Lenin's October Revolution of 7 November 1917 transferred political power in Petrograd to the radical, left-wing socialists. The German government's decision to arrange safe-conduct for Lenin and his comrades from exile in Switzerland to Petrograd in April 1917 was a success. An armistice between Germany and the Bolshevik regime came into force on 6 December and peace negotiations began on 22 December 1917 at Brest-Litovsk. November 1917 became another watershed in the 1917–1918 rivalry for the leadership of Finland. After the dissolution of the Finnish Parliament, polarisation between the social democrats and the conservatives increased markedly and the period witnessed the appearance of political violence.
An agricultural worker was shot during a local strike on 9 August 1917 at Ypäjä and a Civil Guard member was killed in a local political crisis at Malmi on 24 September. The October Revolution disrupted the informal truce between the Finnish non-socialists and the Russian Provisional Government. After political wrangling over how to react to the revolt, the majority of the politicians accepted a compromise proposal by Santeri Alkio, the leader of the Agrarian League. Parliament seized the sovereign power in Finland on 15 November 1917 based on the socialists' "Law of Supreme Power" and ratified their proposals of an eight-hour working day and universal suffrage in local elections, from July 1917. The purely non-socialist, conservative-led government of Pehr Evind Svinhufvud was appointed on 27 November. This nomination was both a long-term aim of the conservatives and a response to the challenges of the labour movement during November 1917. Svinhufvud's main aspirations were to separate Finland from Russia, to strengthen the Civil Guards, and to return a part of Parliament's new authority to the Senate. There were 149 Civil Guards on 31 August 1917 in Finland, counting local units and subsidiary White Guards in towns and rural communes; 251 on 30 September; 315 on 31 October; 380 on 30 November and 408 on 26 January 1918. The first attempt at serious military training among the Guards was the establishment of a 200-strong cavalry school at the Saksanniemi estate in the vicinity of the town of Porvoo, in September 1917. The vanguard of the Finnish Jägers and German weaponry arrived in Finland during October–November 1917 on the freighter and the German U-boat; around 50 Jägers had returned by the end of 1917. After political defeats in July and October 1917, the social democrats put forward an uncompromising program called "We Demand" on 1 November, in order to push for political concessions.
They insisted upon a return to the political status before the dissolution of Parliament in July 1917, disbandment of the Civil Guards and elections to establish a Finnish Constituent Assembly. The program failed and the socialists initiated a general strike during 14–19 November to increase political pressure on the conservatives, who had opposed the "Law of Supreme Power" and the parliamentary proclamation of sovereign power on 15 November. Revolution became the goal of the radicalised socialists after the loss of political control, and events in November 1917 offered momentum for a socialist uprising. In this phase, Lenin and Joseph Stalin, under threat in Petrograd, urged the social democrats to take power in Finland. The majority of Finnish socialists were moderate and preferred parliamentary methods, prompting the Bolsheviks to label them "reluctant revolutionaries". The reluctance diminished as the general strike appeared to offer a major channel of influence for the workers in southern Finland. The strike leadership voted by a narrow majority to start a revolution on 16 November, but the uprising had to be called off the same day due to the lack of active revolutionaries to execute it. At the end of November 1917, the moderate socialists among the social democrats won a second vote over the radicals in a debate over revolutionary versus parliamentary means, but when they tried to pass a resolution to completely abandon the idea of a socialist revolution, the party representatives and several influential leaders voted it down. The Finnish labour movement wanted to sustain a military force of its own and to keep the revolutionary road open, too. The wavering Finnish socialists disappointed V. I. Lenin and in turn, he began to encourage the Finnish Bolsheviks in Petrograd. Among the labour movement, a more marked consequence of the events of 1917 was the rise of the Workers' Order Guards. 
There were 20–60 separate guards between 31 August and 30 September 1917, but on 20 October, after defeat in the parliamentary elections, the Finnish labour movement proclaimed the need to establish more worker units. The announcement led to a rush of recruits: on 31 October the number of guards was 100–150; 342 on 30 November 1917 and 375 on 26 January 1918. Since May 1917, the paramilitary organisations of the left had grown in two phases, the majority of them as Workers' Order Guards. The minority were Red Guards; these were partly underground groups formed in industrialised towns and industrial centres, such as Helsinki, Kotka and Tampere, based on the original Red Guards that had been formed during 1905–1906 in Finland. The presence of the two opposing armed forces created a state of dual power and divided sovereignty in Finnish society. The decisive rift between the guards broke out during the general strike: the Reds executed several political opponents in southern Finland and the first armed clashes between the Whites and Reds took place. In total, 34 casualties were reported. Eventually, the political rivalries of 1917 led to an arms race and an escalation towards civil war.

Independence of Finland

The disintegration of Russia offered Finns an historic opportunity to gain national independence. After the October Revolution, the conservatives were eager for secession from Russia in order to control the left and minimise the influence of the Bolsheviks. The socialists were skeptical about sovereignty under conservative rule, but they feared a loss of support among nationalistic workers, particularly after having promised increased national liberty through the "Law of Supreme Power". Eventually, both political factions supported an independent Finland, despite strong disagreement over the composition of the nation's leadership.
Nationalism had become a "civic religion" in Finland by the end of the nineteenth century, but the goal during the general strike of 1905 was a return to the autonomy of 1809–1898, not full independence. In comparison to the unitary Swedish regime, the domestic power of Finns had increased under the less uniform Russian rule. Economically, the Grand Duchy of Finland benefited from having an independent domestic state budget, a central bank with a national currency, the markka (introduced in 1860), its own customs organisation, and the industrial progress of 1860–1916. The economy was dependent on the huge Russian market and separation would disrupt the profitable Finnish financial zone. The economic collapse of Russia and the power struggle of the Finnish state in 1917 were among the key factors that brought sovereignty to the fore in Finland. Svinhufvud's Senate introduced Finland's Declaration of Independence on 4 December 1917 and Parliament adopted it on 6 December. The social democrats voted against the Senate's proposal, while presenting an alternative declaration of sovereignty. The establishment of an independent state was not a guaranteed conclusion for the small Finnish nation. Recognition by Russia and other great powers was essential; Svinhufvud accepted that he had to negotiate with Lenin for the acknowledgement. The socialists, having been reluctant to enter talks with the Russian leadership in July 1917, sent two delegations to Petrograd to request that Lenin approve Finnish sovereignty. In December 1917, Lenin was under intense pressure from the Germans to conclude peace negotiations at Brest-Litovsk, and the Bolsheviks' rule was in crisis, with an inexperienced administration and the demoralised army facing powerful political and military opponents. Lenin calculated that the Bolsheviks could fight for central parts of Russia but had to give up some peripheral territories, including Finland in the geopolitically less important north-western corner.
As a result, Svinhufvud's delegation won Lenin's concession of sovereignty on 31 December 1917. By the beginning of the Civil War, Austria-Hungary, Denmark, France, Germany, Greece, Norway, Sweden and Switzerland had recognised Finnish independence. The United Kingdom and United States did not approve it; they waited and monitored the relations between Finland and Germany (the main enemy of the Allies), hoping to override Lenin's regime and to get Russia back into the war against the German Empire. In turn, the Germans hastened Finland's separation from Russia so as to move the country within their sphere of influence.

Warfare

Escalation

The final escalation towards war began in early January 1918, as each military or political action of the Reds or the Whites resulted in a corresponding counteraction by the other. Both sides justified their activities as defensive measures, particularly to their own supporters. On the left, the vanguard of the movement was the urban Red Guards from Helsinki, Kotka and Turku; they led the rural Reds and convinced the socialist leaders who wavered between peace and war to support the revolution. On the right, the vanguard was the Jägers, who had transferred to Finland, and the volunteer Civil Guards of southwestern Finland, southern Ostrobothnia and Vyborg province in the southeastern corner of Finland. The first local battles were fought during 9–21 January 1918 in southern and southeastern Finland, mainly to win the arms race and to control Vyborg. On 12 January 1918, Parliament authorised the Svinhufvud Senate to establish internal order and discipline on behalf of the state. On 15 January, Carl Gustaf Emil Mannerheim, a former Finnish general of the Imperial Russian Army, was appointed the commander-in-chief of the Civil Guards. The Senate appointed the Guards, henceforth called the White Guards, as the White Army of Finland. Mannerheim placed his Headquarters of the White Army in the Vaasa–Seinäjoki area.
The White Order to engage was issued on 25 January. The Whites gained weaponry by disarming Russian garrisons during 21–28 January, in particular in southern Ostrobothnia. The Red Guards, led by Ali Aaltonen, refused to recognise the Whites' hegemony and established a military authority of their own. Aaltonen installed his headquarters in Helsinki and nicknamed it Smolna, echoing the Smolny Institute, the Bolsheviks' headquarters in Petrograd. The Red Order of Revolution was issued on 26 January, and a red lantern, a symbolic indicator of the uprising, was lit in the tower of the Helsinki Workers' House. A large-scale mobilisation of the Reds began late in the evening of 27 January, with the Helsinki Red Guard and some of the Guards located along the Vyborg–Tampere railway having been activated between 23 and 26 January, in order to safeguard vital positions and escort a heavy railroad shipment of Bolshevik weapons from Petrograd to Finland. White troops tried to capture the shipment: 20–30 Finns, Red and White, died in the Battle of Kämärä on the Karelian Isthmus on 27 January 1918. The Finnish power struggle had come to a head.

Opposing parties

Red Finland and White Finland

At the beginning of the war, a discontinuous front line ran through southern Finland from west to east, dividing the country into White Finland and Red Finland. The Red Guards controlled the area to the south, including nearly all the major towns and industrial centres, along with the largest estates and farms with the highest numbers of crofters and tenant farmers. The White Army controlled the area to the north, which was predominantly agrarian and contained small or medium-sized farms and tenant farmers. The number of crofters was lower and they held a better social status than those in the south.
Enclaves of the opposing forces existed on both sides of the front line: within the White area lay the industrial towns of Varkaus, Kuopio, Oulu, Raahe, Kemi and Tornio; within the Red area lay Porvoo, Kirkkonummi and Uusikaupunki. The elimination of these strongholds was a priority for both armies in February 1918. Red Finland was led by the Finnish People's Delegation, established on 28 January 1918 in Helsinki and supervised by the Central Workers' Council. The delegation sought democratic socialism based on the Finnish Social Democratic Party's ethos; their visions differed from Lenin's dictatorship of the proletariat. Otto Ville Kuusinen formulated a proposal for a new constitution, influenced by those of Switzerland and the United States. With it, political power was to be concentrated in Parliament, with a lesser role for the government. The proposal included a multi-party system; freedom of assembly, speech and press; and the use of referenda in political decision-making. In order to ensure the authority of the labour movement, the common people would have a right to permanent revolution. The socialists planned to transfer a substantial part of property rights to the state and local administrations. In foreign policy, Red Finland leaned on Bolshevist Russia. A Red-initiated Finno–Russian treaty and peace agreement was signed on 1 March 1918, in which Red Finland was called the Finnish Socialist Workers' Republic. The negotiations for the treaty implied that – as in World War I in general – nationalism was more important for both sides than the principles of international socialism. The Red Finns did not simply accept an alliance with the Bolsheviks, and major disputes appeared, for example, over the demarcation of the border between Red Finland and Soviet Russia. The significance of the Russo–Finnish Treaty evaporated quickly due to the signing of the Treaty of Brest-Litovsk between the Bolsheviks and the German Empire on 3 March 1918.
Lenin's policy on the right of nations to self-determination aimed at preventing the disintegration of Russia during the period of military weakness. He assumed that in war-torn, splintering Europe, the proletariat of free nations would carry out socialist revolutions and unite with Soviet Russia later. The majority of the Finnish labour movement supported Finland's independence. The Finnish Bolsheviks, influential, though few in number, favoured annexation of Finland by Russia. The government of White Finland, Pehr Evind Svinhufvud's first senate, was called the Vaasa Senate after its relocation to the safer west-coast city of Vaasa, which acted as the capital of the Whites from 29 January to 3 May 1918. In domestic policy, the White Senate's main goal was to return the political right to power in Finland. The conservatives planned a monarchist political system, with a lesser role for Parliament. A section of the conservatives had always supported monarchy and opposed democracy; others had approved of parliamentarianism since the revolutionary reform of 1906, but after the crisis of 1917–1918, concluded that empowering the common people would not work. Social liberals and reformist non-socialists opposed any restriction of parliamentarianism. They initially resisted German military help, but the prolonged warfare changed their stance. In foreign policy, the Vaasa Senate relied on the German Empire for military and political aid. Their objective was to defeat the Finnish Reds; end the influence of Bolshevist Russia in Finland and expand Finnish territory to East Karelia, a geopolitically significant home to people speaking Finnic languages. The weakness of Russia inspired an idea of Greater Finland among the expansionist factions of both the right and left: the Reds had claims concerning the same areas. General Mannerheim agreed on the need to take over East Karelia and to request German weapons, but opposed actual German intervention in Finland. 
Mannerheim recognised the Red Guards' lack of combat skill and trusted in the abilities of the German-trained Finnish Jägers. As a former Russian army officer, Mannerheim was well aware of the demoralisation of the Russian army. He co-operated with White-aligned Russian officers in Finland and Russia.

Soldiers and weapons

The number of Finnish troops on each side varied from 70,000 to 90,000 and both had around 100,000 rifles, 300–400 machine guns and a few hundred cannons. While the Red Guards consisted mostly of volunteers, with wages paid at the beginning of the war, the White Army consisted predominantly of conscripts with 11,000–15,000 volunteers. The main motives for volunteering were socio-economic factors, such as salary and food, as well as idealism and peer pressure. The Red Guards included 2,600 women, mostly girls recruited from the industrial centres and cities of southern Finland. Urban and agricultural workers constituted the majority of the Red Guards, whereas land-owning farmers and well-educated people formed the backbone of the White Army. Both armies used child soldiers, mainly between 14 and 17 years of age. The use of juvenile soldiers was not rare in World War I; children of the time were under the absolute authority of adults and were not shielded against exploitation. Rifles and machine guns from Imperial Russia were the main armaments of the Reds and the Whites. The most commonly used rifle was the Russian Mosin–Nagant Model 1891. In total, around ten different rifle models were in service, causing problems for ammunition supply. The Maxim gun was the most-used machine gun, along with the less-used M1895 Colt–Browning, Lewis and Madsen guns. The machine guns caused a substantial part of the casualties in combat. Russian field guns were mostly used with direct fire.
The Civil War was fought primarily along railways, a vital means of transporting troops and supplies, as well as of deploying armoured trains equipped with light cannons and heavy machine guns. The strategically most important railway junction was Haapamäki, approximately northeast of Tampere, connecting eastern and western Finland as well as southern and northern Finland. Other critical junctions included Kouvola, Riihimäki, Tampere, Toijala and Vyborg. The Whites captured Haapamäki at the end of January 1918, leading to the Battle of Vilppula.

Red Guards and Soviet troops

The Finnish Red Guards seized the early initiative in the war by taking control of Helsinki on 28 January 1918 and by undertaking a general offensive lasting from February to early March 1918. The Reds were relatively well-armed, but a chronic shortage of skilled leaders, both at the command level and in the field, left them unable to capitalise on this momentum, and most of the offensives came to nothing. The military chain of command functioned relatively well at company and platoon level, but leadership and authority remained weak as most of the field commanders were chosen by the vote of the troops. The common troops were more or less armed civilians, whose military training and discipline were inadequate and whose combat morale was low. Ali Aaltonen was replaced on 28 January 1918 by Eero Haapalainen as commander-in-chief. He, in turn, was displaced by the Bolshevik triumvirate of Eino Rahja, Adolf Taimi and Evert Eloranta on 20 March. The last commander-in-chief of the Red Guard was Kullervo Manner, who served from 10 April until the last period of the war, when the Reds no longer had a named leader. Some talented local commanders, such as Hugo Salmela in the Battle of Tampere, provided successful leadership, but could not change the course of the war.
The Reds achieved some local victories as they retreated from southern Finland toward Russia, such as against German troops in the Battle of Syrjäntaka on 28–29 April in Tuulos. Around 50,000 of the former czar's army troops were stationed in Finland in January 1918. The soldiers were demoralised and war-weary, and the former serfs among them were hungry for the farmland set free by the revolutions. The majority of the troops returned to Russia by the end of March 1918. In total, 7,000 to 10,000 Red Russian soldiers supported the Finnish Reds, but only around 3,000, in separate, smaller units of 100–1,000 soldiers, could be persuaded to fight in the front line. The revolutions in Russia divided the Soviet army officers politically and their attitudes towards the Finnish Civil War varied. Mikhail Svechnikov led Finnish Red troops in western Finland in February, while Konstantin Yeremejev led Soviet forces on the Karelian Isthmus; other officers were mistrustful of their revolutionary peers and instead co-operated with General Mannerheim in disarming Soviet garrisons in Finland. On 30 January 1918, Mannerheim proclaimed to Russian soldiers in Finland that the White Army did not fight against Russia, but that the objective of the White campaign was to beat the Finnish Reds and the Soviet troops supporting them. The number of Soviet soldiers active in the civil war declined markedly once Germany attacked Russia on 18 February 1918. The German–Soviet Treaty of Brest-Litovsk of 3 March restricted the Bolsheviks' support for the Finnish Reds to weapons and supplies. The Soviets remained active on the south-eastern front, mainly in the Battle of Rautu on the Karelian Isthmus between February and April 1918, where they defended the approaches to Petrograd.
White Guards and Sweden's role

While the conflict has been called by some "The War of Amateurs", the White Army had two major advantages over the Red Guards: the professional military leadership of Gustaf Mannerheim and his staff, which included 84 Swedish volunteer officers and former Finnish officers of the czar's army; and 1,450 soldiers of the 1,900-strong Jäger battalion. The majority of the unit arrived in Vaasa on 25 February 1918. On the battlefield, the Jägers, battle-hardened on the Eastern Front, provided strong leadership that made disciplined combat possible for the common White troopers. The soldiers were similar to those of the Reds, having brief and inadequate training. At the beginning of the war, the White Guards' top leadership had little authority over volunteer White units, which obeyed only their local leaders. At the end of February, the Jägers started the rapid training of six conscript regiments. The Jäger battalion was politically divided, too. Four hundred and fifty – mostly socialist – Jägers remained stationed in Germany, as it was feared they were likely to side with the Reds. White Guard leaders faced a similar problem when drafting young men to the army in February 1918: 30,000 obvious supporters of the Finnish labour movement never showed up. It was also uncertain whether common troops drafted from the small and poor farms of central and northern Finland had strong enough motivation to fight the Finnish Reds. The Whites' propaganda promoted the idea that they were fighting a defensive war against Bolshevist Russians, and belittled the role of the Red Finns among their enemies. Social divisions appeared both between southern and northern Finland and within rural Finland. The economy and society of the north had modernised more slowly than that of the south.
There was a more pronounced conflict between Christianity and socialism in the north, and the ownership of farmland conferred major social status, motivating the farmers to fight against the Reds. Sweden declared neutrality both during World War I and the Finnish Civil War. General opinion, in particular among the Swedish elite, was divided between supporters of the Allies and of the Central Powers, Germanism being somewhat more popular. Three war-time priorities determined the pragmatic policy of the Swedish liberal–social democratic government: sound economics, with the export of iron ore and foodstuffs to Germany; sustaining the tranquility of Swedish society; and geopolitics. The government accepted the participation of Swedish volunteer officers and soldiers in the Finnish White Army in order to block the expansion of revolutionary unrest to Scandinavia. A 1,000-strong paramilitary Swedish Brigade, led by Hjalmar Frisell, took part in the Battle of Tampere and in the fighting south of the town. In February 1918, the Swedish Navy escorted the German naval squadron transporting Finnish Jägers and German weapons, and allowed it to pass through Swedish territorial waters. The Swedish socialists tried to open peace negotiations between the Whites and the Reds. The weakness of Finland offered Sweden a chance to take over the geopolitically vital Finnish Åland Islands, east of Stockholm, but the German army's Finland operation stalled this plan.

German intervention

In March 1918, the German Empire intervened in the Finnish Civil War on the side of the White Army. Finnish activists leaning on Germanism had been seeking German aid in freeing Finland from Soviet hegemony since late 1917, but because of the pressure the Germans were facing on the Western Front, they did not want to jeopardise their armistice and peace negotiations with the Soviet Union.
The German stance changed after 10 February when Leon Trotsky, despite the weakness of the Bolsheviks' position, broke off negotiations, hoping revolutions would break out in the German Empire and change everything. On 13 February, the German leadership decided to retaliate and send military detachments to Finland too. As a pretext for aggression, the Germans invited "requests for help" from the western neighbouring countries of Russia. Representatives of White Finland in Berlin duly requested help on 14 February. The Imperial German Army attacked Russia on 18 February. The offensive led to a rapid collapse of the Soviet forces and to the signing of the first Treaty of Brest-Litovsk by the Bolsheviks on 3 March 1918. Finland, the Baltic countries, Poland and Ukraine were transferred to the German sphere of influence. The Finnish Civil War opened a low-cost access route to Fennoscandia, where the geopolitical status was altered as a Royal Navy squadron occupied the Soviet harbour of Murmansk by the Arctic Ocean on 9 March 1918. The leader of the German war effort, General Erich Ludendorff, wanted to keep Petrograd under threat of attack via the Vyborg-Narva area and to install a German-led monarchy in Finland. On 5 March 1918, a German naval squadron landed on the Åland Islands (in mid-February 1918, the islands had been occupied by a Swedish military expedition, which departed from there in May). On 3 April 1918, the 10,000-strong Baltic Sea Division (), led by General Rüdiger von der Goltz, launched the main attack at Hanko, west of Helsinki. It was followed on 7 April by Colonel Otto von Brandenstein's 3,000-strong Detachment Brandenstein () taking the town of Loviisa east of Helsinki. The larger German formations advanced eastwards from Hanko and took Helsinki on 12–13 April, while Detachment Brandenstein overran the town of Lahti on 19 April. 
The main German detachment proceeded northwards from Helsinki and took Hyvinkää and Riihimäki on 21–22 April, followed by Hämeenlinna on 26 April. The final blow to the cause of the Finnish Reds was dealt when the Bolsheviks broke off the peace negotiations at Brest-Litovsk, leading to the German eastern offensive in February 1918.

Decisive engagements

Battle of Tampere

In February 1918, General Mannerheim deliberated on where to focus the general offensive of the Whites. There were two strategically vital enemy strongholds: Tampere, Finland's major industrial town in the south-west, and Vyborg, Karelia's main city. Although seizing Vyborg offered many advantages, his army's lack of combat skills and the potential for a major counterattack by the Reds in the area or in the south-west made it too risky. Mannerheim decided to strike first at Tampere, despite the fact that the town, mostly known for its working class, housed nearly 15,000 heavily armed Red Guards. He launched the main assault on 16 March 1918, at Längelmäki north-east of the town, through the right flank of the Reds' defence. At the same time, the Whites attacked through the north-western front line Vilppula–Kuru–Kyröskoski–Suodenniemi. Although the Whites were unaccustomed to offensive warfare, some Red Guard units collapsed and retreated in panic under the weight of the offensive, while other Red detachments defended their posts to the last and were able to slow the advance of the White troops. Eventually, the Whites laid siege to Tampere. They cut off the Reds' southward connection at Lempäälä on 24 March and the westward ones at Siuro, Nokia, and Ylöjärvi on 25 March. The Battle for Tampere was fought between 16,000 White and 14,000 Red soldiers. It was Finland's first large-scale urban battle and one of the four most decisive military engagements of the war.
The fight for the area of Tampere began on 28 March, on the eve of Easter 1918, later called "Bloody Maundy Thursday", in the Kalevankangas Cemetery. The White Army did not achieve a decisive victory in the fierce combat, suffering losses of more than 50 percent in some of its units. The Whites had to re-organise their troops and battle plans, managing to raid the town centre in the early hours of 3 April. After a heavy, concentrated artillery barrage, the White Guards advanced from house to house and street to street, as the Red Guards retreated. In the late evening of 3 April, the Whites reached the eastern banks of the Tammerkoski rapids. The Reds' attempts to break the siege of Tampere from the outside along the Helsinki–Tampere railway failed. The Red Guards lost the western parts of the town between 4 and 5 April. The Tampere City Hall was among the last strongholds of the Reds. The battle ended on 6 April 1918 with the surrender of the Red forces in the Pyynikki and Pispala sections of Tampere. The Reds, now on the defensive, showed increased motivation to fight during the battle. General Mannerheim was compelled to deploy some of the best-trained Jäger detachments, initially meant to be conserved for later use in the Vyborg area. The Battle of Tampere was the bloodiest action of the Civil War. The White Army lost 700–900 men, including 50 Jägers, the highest number of deaths the Jäger battalion suffered in a single battle of the 1918 war. The Red Guards lost 1,000–1,500 soldiers, with a further 11,000–12,000 captured. 71 civilians died, mainly due to artillery fire. The eastern parts of the city, consisting mostly of wooden buildings, were completely destroyed.

Battle of Helsinki

After peace talks between the Germans and the Finnish Reds broke off on 11 April 1918, the battle for the capital of Finland began.
At 05:00 on 12 April, around 2,000–3,000 German Baltic Sea Division soldiers, led by Colonel Hans von Tschirsky und von Bögendorff, attacked the city from the north-west, supported via the Helsinki–Turku railway. The Germans broke through the area between Munkkiniemi and Pasila, and advanced on the central-western parts of the town. The German naval squadron led by Vice Admiral Hugo Meurer blocked the city harbour, bombarded the southern part of the town, and landed Seebataillon marines at Katajanokka. Around 7,000 Finnish Reds defended Helsinki, but their best troops fought on other fronts of the war. The main strongholds of the Red defence were the Workers' Hall, the Helsinki railway station, the Red Headquarters at Smolna, the Senate Palace–Helsinki University area and the former Russian garrisons. By the late evening of 12 April, most of the southern parts and all of the western area of the city had been occupied by the Germans. Local Helsinki White Guards, who had hidden in the city during the war, joined the battle as the Germans advanced through the town. On 13 April, German troops took over the Market Square, the Smolna, the Presidential Palace and the Senate–Ritarihuone area. Toward the end, a German brigade of 2,000–3,000 soldiers, led by Colonel Konrad Wolf, joined the battle. The unit pushed from the north into the eastern parts of Helsinki, advancing into the working-class neighbourhoods of Hermanni, Kallio and Sörnäinen. German artillery bombarded and destroyed the Workers' Hall and put out the red lantern of the Finnish revolution. The eastern parts of the town surrendered around 14:00 on 13 April, when a white flag was raised in the tower of the Kallio Church. Sporadic fighting lasted until the evening. In total, 60 Germans, 300–400 Reds and 23 White Guard troopers were killed in the battle. Around 7,000 Reds were captured. The German army celebrated the victory with a military parade in the centre of Helsinki on 14 April 1918.
Battle of Hyvinkää

After losing Helsinki, the Red Defence Command moved to Riihimäki, where it was headed by the painter and member of parliament Efraim Kronqvist. The German troops, led by Major General Konrad Wolf, meanwhile attacked northwards from Helsinki on 15 April and conquered Klaukkala four days later, continuing from there towards Hämeenlinna. In the course of this advance, the Battle of Hyvinkää was fought in the town of Hyvinkää, in which 21 Germans and about 50 Red Guards were killed. After the battle, at least 150 of the Reds were executed by the Whites.

Battle of Lahti

On 19 April 1918, Detachment Brandenstein took over the town of Lahti. The German troops advanced from the east-southeast via Nastola, through the Mustankallio graveyard in Salpausselkä and the Russian garrisons at Hennala. The battle was minor but strategically important as it cut the connection between the western and eastern Red Guards. Local engagements broke out in the town and the surrounding area between 22 April and 1 May 1918 as several thousand western Red Guards and Red civilian refugees tried to push through on their way to Russia. The German troops were able to hold major parts of the town and halt the Red advance. In total, 600 Reds and 80 German soldiers perished, and 30,000 Reds were captured in and around Lahti.

Battle of Vyborg

After the defeat in Tampere, the Red Guards began a slow retreat eastwards. As the German army seized Helsinki, the White Army shifted its military focus to the Vyborg area, where 18,500 Whites advanced against 15,000 defending Reds. General Mannerheim's war plan had been revised as a result of the Battle for Tampere, a civilian, industrial town. He aimed to avoid new, complex city combat in Vyborg, an old military fortress. The Jäger detachments tried to tie down and destroy the Red force outside the town.
The Whites were able to cut the Reds' connection to Petrograd and weaken the troops on the Karelian Isthmus on 20–26 April, but the decisive blow remained to be dealt in Vyborg. The final attack began late on 27 April with a heavy Jäger artillery barrage. The Reds' defence collapsed gradually, and eventually the Whites conquered Patterinmäki – the Reds' symbolic last stand of the 1918 uprising – in the early hours of 29 April 1918. In total, 400 Whites died, while 500–600 Reds perished and 12,000–15,000 were captured.

Red and White terror

Both Whites and Reds carried out political violence through executions, respectively termed White Terror and Red Terror. The threshold of political violence had already been crossed by the Finnish activists during the First Period of Russification. Large-scale terror operations were born and bred in Europe during World War I, the first total war. The February and October Revolutions initiated similar violence in Finland: at first by Russian army troops executing their officers, and later between the Finnish Reds and Whites. The terror consisted, on the one hand, of a calculated aspect of general warfare and, on the other, of local, personal murders and corresponding acts of revenge. In the former, the commanding staff planned and organised the actions and gave orders to the lower ranks. At least a third of the Red terror and most of the White terror was centrally led. In February 1918, a Desk for Securing Occupied Areas was established by the highest-ranking White staff, and the White troops were given Instructions for Wartime Judicature, later called the Shoot on the Spot Declaration. This order authorised field commanders to execute essentially anyone they saw fit. No order by the less-organised, highest Red Guard leadership authorising Red Terror has been found; the paper was either "burned" or the command was oral.
The main goals of the terror were to destroy the command structure of the enemy; to clear and secure the areas governed and occupied by armies; and to create shock and fear among the civil population and the enemy soldiers. Additionally, the common troops' paramilitary nature and their lack of combat skills drove them to use political violence as a military weapon. Most of the executions were carried out by cavalry units called Flying Patrols, consisting of 10 to 80 soldiers aged 15 to 20 and led by an experienced, adult leader with absolute authority. The patrols, specialised in search and destroy operations and death squad tactics, were similar to German Sturmbattalions and Russian Assault units organized during World War I. The terror achieved some of its objectives but also gave additional motivation to fight against an enemy perceived to be inhuman and cruel. Both Red and White propaganda made effective use of their opponents' actions, increasing the spiral of revenge. The Red Guards executed influential Whites, including politicians, major landowners, industrialists, police officers, civil servants and teachers as well as White Guards. Ten priests of the Evangelical Lutheran Church and 90 moderate socialists were killed. The number of executions varied over the war months, peaking in February as the Reds secured power, but March saw low counts because the Reds could not seize new areas outside of the original frontlines. The numbers rose again in April as the Reds aimed to leave Finland. The two major centres for Red Terror were Toijala and Kouvola, where 300–350 Whites were executed between February and April 1918. The White Guards executed Red Guard and party leaders, Red troops, socialist members of the Finnish Parliament and local Red administrators, and those active in implementing Red Terror. The numbers varied over the months as the Whites conquered southern Finland. 
Comprehensive White Terror started with the Whites' general offensive in March 1918 and increased constantly. It peaked at the end of the war, then declined and ceased after the enemy troops had been transferred to prison camps. During the high point of the executions, between the end of April and the beginning of May, 200 Reds were shot per day. White Terror was decisive against Russian soldiers who assisted the Finnish Reds, and several Russian non-socialist civilians were killed in the Vyborg massacre, in the aftermath of the Battle of Vyborg. In total, 1,650 Whites died as a result of Red Terror, while around 10,000 Reds perished through White Terror, which turned into political cleansing. The White victims were recorded precisely, while the number of Red troops executed immediately after battles remains unclear. Together with the harsh prison-camp treatment of the Reds during 1918, the executions inflicted the deepest mental scars on the Finns, regardless of their political allegiance. Some of those who carried out the killings were traumatised, a phenomenon that was later documented.

End

On 8 April 1918, after the defeat in Tampere and the German army's intervention, the People's Delegation retreated from Helsinki to Vyborg. The loss of Helsinki pushed them on to Petrograd on 25 April. The escape of the leadership embittered many Reds, and thousands of them tried to flee to Russia, but most of the refugees were encircled by White and German troops. In the Lahti area they surrendered on 1–2 May. The long Red caravans included women and children, who experienced a desperate, chaotic escape with severe losses due to White attacks. The scene was described as a "road of tears" for the Reds, but for the Whites, the sight of the long enemy caravans heading east was a moment of victory. The Red Guards' last strongholds, between the Kouvola and Kotka area, fell by 5 May, after the Battle of Ahvenkoski.
The war of 1918 ended on 15 May 1918, when the Whites took over Fort Ino, a Russian coastal artillery base on the Karelian Isthmus, from the Russian troops. White Finland and General Mannerheim celebrated the victory with a large military parade in Helsinki on 16 May 1918. The Red Guards had been defeated: the Finnish labour movement had lost the Civil War, several of its military leaders committed suicide and a majority of the Reds were sent to prison camps. The Vaasa Senate returned to Helsinki on 4 May 1918, but the capital was under the control of the German army. White Finland had become a protectorate of the German Empire, and General Rüdiger von der Goltz was called "the true Regent of Finland". No armistice or peace negotiations were carried out between the Whites and the Reds, and an official peace treaty to end the Finnish Civil War was never signed.

Aftermath and impact

Casualties

According to a Finnish Government project (2004), the casualties of the Finnish Civil War were: died in battle – 3,414 Whites, 5,199 Reds; missing – 46 Whites, 1,767 Reds; executed – 1,424 Whites, 7,370 Reds; died in prison camps – 4 Whites, 11,652 Reds. Total deaths: 36,640.

Prison camps

The White Army and German troops captured around 80,000 Red prisoners, including 5,000 women, 1,500 children and 8,000 Russians. The largest prison camps were Suomenlinna (an island facing Helsinki), Hämeenlinna, Lahti, Riihimäki, Tammisaari, Tampere and Vyborg. The Senate decided to keep the prisoners detained until each individual's role in the Civil War had been investigated. Legislation making provision for a Treason Court was enacted on 29 May 1918. The judicature of the 145 inferior courts, led by the Supreme Treason Court, did not meet the standards of impartiality, due to the condemnatory atmosphere of White Finland.
In total, 76,000 cases were examined and 68,000 Reds were convicted, primarily for treason; 39,000 were released on parole, while the mean length of punishment for the rest was two to four years in jail. 555 people were sentenced to death, of whom 113 were executed. The trials revealed that some innocent adults had been imprisoned. Combined with the severe food shortages caused by the Civil War, mass imprisonment led to high mortality rates in the prison camps, and the catastrophe was compounded by the angry, punitive and uncaring mentality of the victors. Many prisoners felt that they had been abandoned by their own leaders, who had fled to Russia. The physical and mental condition of the prisoners declined in May 1918. Many prisoners had been sent to the camps in Tampere and Helsinki in the first half of April, and food supplies had been disrupted during the Reds' eastward retreat. Consequently, 2,900 prisoners starved to death or died in June as a result of diseases caused by malnutrition or the Spanish flu; 5,000 died in July, 2,200 in August and 1,000 in September. The mortality rate was highest in the Tammisaari camp at 34 percent, while the rate varied between 5 percent and 20 percent in the others. In total, around 12,500 Finns perished (3,000–4,000 due to the Spanish flu) while detained. The dead were buried in mass graves near the camps; more than 2,500 Red Guards lie in the large mass grave in the Kalevankangas Cemetery. Moreover, 700 severely weakened prisoners died soon after release from the camps. Most prisoners were paroled or pardoned by the end of 1918, after a shift in the political situation. There were 6,100 Red prisoners left at the end of the year and 4,000 at the end of 1919. In January 1920, 3,000 prisoners were pardoned and civil rights were returned to 40,000 former Reds. In 1927, the Social Democratic Party government led by Väinö Tanner pardoned the last 50 prisoners.
The Finnish government paid reparations to 11,600 prisoners in 1973. The traumatic hardships of the prison camps increased support for communism in Finland. War-torn nation The Civil War was a catastrophe for Finland: around 36,000 people – 1.2 percent of the population – perished. The war left approximately 15,000 children orphaned. Most of the casualties occurred outside the battlefields: in the prison camps and the terror campaigns. Many Reds fled to Russia at the end of the war and during the period that followed. The fear, bitterness and trauma caused by the war deepened the divisions within Finnish society, and many moderate Finns identified themselves as "citizens of two nations." During and after the war, the warring sides were derogatorily referred to as "butchers" (the Whites) and "red russkies" (the Reds, punaryssä), or just "commies". Among the Reds in particular, the loss of the war caused such bitterness that some of those who had fled across the eastern border attempted to assassinate General Mannerheim during a White Guard victory parade in Tampere in 1920, without success. The conflict caused disintegration within both socialist and non-socialist factions. The rightward shift of power caused a dispute between conservatives and liberals on the best system of government for Finland to adopt: the former demanded monarchy and restricted parliamentarianism; the latter demanded a democratic republic. Both sides justified their views on political and legal grounds. The monarchists leaned on the Swedish regime's 1772 monarchist constitution (accepted by Russia in 1809), belittled the Declaration of Independence of 1917, and proposed a modernised, monarchist constitution for Finland.
The republicans argued that the 1772 law had lost its validity in the February Revolution, that the authority of the Russian czar was assumed by the Finnish Parliament on 15 November 1917, and that the Republic of Finland had been adopted on 6 December that year. The republicans were able to halt the passage of the monarchists' proposal in Parliament. The royalists responded by applying the 1772 law to select a new monarch for the country without reference to Parliament. The Finnish labour movement was divided into three parts: moderate social democrats in Finland; radical socialists in Finland; and communists in Soviet Russia. The Social Democratic Party had its first official party meeting after the Civil War on 25 December 1918, at which the party proclaimed a commitment to parliamentary means and disavowed Bolshevism and communism. The leaders of Red Finland, who had fled to Russia, established the Communist Party of Finland in Moscow on 29 August 1918. After the power struggle of 1917 and the bloody civil war, the former Fennomans and the social democrats who had supported "ultra-democratic" means in Red Finland declared a commitment to revolutionary Bolshevism–communism and to the dictatorship of the proletariat, under the control of Lenin. In May 1918, a conservative-monarchist Senate was formed by J. K. Paasikivi, and the Senate asked the German troops to remain in Finland. The Treaty of Brest-Litovsk of 3 March 1918 and the German-Finnish agreements of 7 March bound White Finland to the German Empire's sphere of influence. General Mannerheim resigned his post on 25 May after disagreements with the Senate about German hegemony over Finland, and about his planned attack on Petrograd to repulse the Bolsheviks and capture Russian Karelia. The Germans opposed these plans due to their peace treaties with Lenin. The Civil War weakened the Finnish Parliament; it became a Rump Parliament that included only three socialist representatives.
On 9 October 1918, under pressure from Germany, the Senate and Parliament elected a German prince, Friedrich Karl, the brother-in-law of German Emperor William II, to become the King of Finland. The German leadership was able to utilise the breakdown of Russia for the geopolitical benefit of the German Empire in Fennoscandia as well. The Civil War and its aftermath diminished the independence of Finland, compared to the status it had held at the turn of the year 1917–1918. The economic condition of Finland deteriorated drastically from 1918; recovery to pre-conflict levels was achieved only in 1925. The most acute crisis was in food supply, already deficient in 1917, though large-scale starvation had been avoided that year. The Civil War caused marked starvation in southern Finland. Late in 1918, Finnish politician Rudolf Holsti appealed for relief to Herbert Hoover, the American chairman of the Committee for Relief in Belgium. Hoover arranged for the delivery of food shipments and persuaded the Allies to relax their blockade of the Baltic Sea, which had obstructed food supplies to Finland, and to allow food into the country. Compromise On 15 March 1917, the fate of the Finns had been decided outside Finland, in Petrograd. On 11 November 1918, the future of the nation was determined in Berlin, as a result of Germany's surrender to end World War I. The German Empire collapsed in the German Revolution of 1918–19, caused by lack of food, war-weariness and defeat in the battles of the Western Front. General Rüdiger von der Goltz and his division left Helsinki on 16 December 1918, and Prince Friedrich Karl, who had not yet been crowned, abandoned his role four days later. Finland's status shifted from a monarchist protectorate of the German Empire to an independent republic. The new system of government was confirmed by the Constitution Act on 17 July 1919.
The first local elections based on universal suffrage in Finland were held during 17–28 December 1918, and the first free parliamentary election took place after the Civil War on 3 March 1919. The United States and the United Kingdom recognised Finnish sovereignty on 6–7 May 1919. The Western powers demanded the establishment of democratic republics in post-war Europe, to lure the masses away from widespread revolutionary movements. The Finno–Russian Treaty of Tartu was signed on 14 October 1920, with the aim of stabilizing political relations between Finland and Russia and settling the border question. In April 1918, the leading Finnish social liberal and eventual first President of Finland, Kaarlo Juho Ståhlberg, wrote: "It is urgent to get the life and development in this country back on the path that we had already reached in 1906 and which the turmoil of war turned us away from." Moderate social democrat Väinö Voionmaa agonised in 1919: "Those who still trust in the future of this nation must have an exceptionally strong faith. This young independent country has lost almost everything due to the war." Voionmaa was a vital companion for the leader of the reformed Social Democratic Party, Väinö Tanner. Santeri Alkio supported moderate politics. His party colleague, Kyösti Kallio, urged in his Nivala address of 5 May 1918: "We must rebuild a Finnish nation, which is not divided into the Reds and Whites. We have to establish a democratic Finnish republic, where all the Finns can feel that we are true citizens and members of this society." In the end, many of the moderate Finnish conservatives followed the thinking of National Coalition Party member Lauri Ingman, who wrote in early 1918: "A political turn more to the right will not help us now, instead it would strengthen the support of socialism in this country."
Together with other broad-minded Finns, the new partnership constructed a Finnish compromise which eventually delivered a stable and broad parliamentary democracy. The compromise was based both on the defeat of the Reds in the Civil War and on the fact that most of the Whites' political goals had not been achieved. After foreign forces left Finland, the militant factions of the Reds and the Whites lost their backing, while the pre-1918 cultural and national integrity and the legacy of Fennomania stood out among the Finns. The weakness of both Germany and Russia after World War I empowered Finland and made a peaceful, domestic Finnish social and political settlement possible. A reconciliation process led to a slow and painful, but steady, national unification. In the end, the power vacuum and interregnum of 1917–1919 gave way to the Finnish compromise. From 1919 to 1991, the democracy and sovereignty of the Finns withstood challenges from right-wing and left-wing political radicalism, the crisis of World War II and pressure from the Soviet Union during the Cold War. In popular culture Literature Although the Civil War remained one of the most sensitive and controversial topics in Finland more than a hundred years later, between 1918 and the 1950s mainstream literature and poetry presented the 1918 war from the White victors' point of view, with works such as the "Psalm of the Cannons" by Arvi Järventaus in 1918. In poetry, Bertel Gripenberg, who had volunteered for the White Army, celebrated its cause in "The Great Age" in 1928, and V. A. Koskenniemi in "Young Anthony" in 1918. The war tales of the Reds were left untold. The first neutrally critical books were written soon after the war, notably "Devout Misery", written by the Nobel Prize laureate Frans Emil Sillanpää in 1919; "Dead Apple Trees" by Joel Lehtonen in 1918; and "Homecoming" by Runar Schildt in 1919.
These were followed by Jarl Hemmer in 1931 with the book "A Man and His Conscience" and Oiva Paloheimo in 1942 with "Restless Childhood". Lauri Viita's book "Scrambled Ground" from 1950 presented the life and experiences of a worker family in the Tampere of 1918, including a point of view from outsiders to the Civil War. Between 1959 and 1962, Väinö Linna described in his trilogy "Under the North Star" the Civil War and World War II from the viewpoint of the common people. Part II of Linna's work opened a larger view of these events and included tales of the Reds in the 1918 war. At the same time, a new outlook on the war was opened by Paavo Haavikko's book "Private Matters", Veijo Meri's "The Events of 1918" and Paavo Rintala's "My Grandmother and Mannerheim", all published in 1960. In poetry, Viljo Kajava, who had experienced the Battle of Tampere at the age of nine, presented a pacifist view of the Civil War in his "Poems of Tampere" in 1966. The same battle is described in the novel "Corpse Bearer" by Antti Tuuri from 2007. Jenni Linturi's multilayered "Malmi 1917" (2013) describes contradictory emotions and attitudes in a village drifting towards civil war. Väinö Linna's trilogy turned the general tide, and after it, several books were written mainly from the Red viewpoint: the Tampere trilogy by Erkki Lepokorpi in 1977; Juhani Syrjä's "Juho 18" in 1998; "The Command" by Leena Lander in 2003; and "Sandra" by Heidi Köngäs in 2017. Kjell Westö's epic novel "Where We Once Went", published in 2006, deals with the period of 1915–1930 from both the Red and the White sides. Westö's book "Mirage 38" from 2013 describes the post-war traumas of the 1918 war and the Finnish mentality in the 1930s. Many of the stories have been adapted for motion pictures and the theatre. Cinema and television The Civil War and the literature about it have inspired many Finnish filmmakers to adapt the subject for film and television.
As early as 1957, 1918, a film directed by Toivo Särkkä and based on Jarl Hemmer's play and novel A Man and His Conscience, was screened at the 7th Berlin International Film Festival. The most recent films about the civil war include the 2007 film The Border, directed by Lauri Törhönen, and the 2008 film Tears of April, directed by Aku Louhimies and based on Leena Lander's novel The Command. However, perhaps the most famous film about the Finnish Civil War is the 1968 film Here, Beneath the North Star, directed by Edvin Laine and based on the first two books of Väinö Linna's Under the North Star trilogy. In 2012, the dramatized documentary Dead or Alive 1918 (or The Battle of Näsilinna 1918) was made, which tells the story of the Battle of Tampere during the Civil War. Other noteworthy documentary-styled films about the Finnish Civil War include a 1973 film, Trust from 1976, and Flame Top from 1980. See also Cudgel War Finnish War Åland War History of Finland Kagal (Finnish resistance movement) List of Finnish wars Lotta Svärd Mensheviks Ukrainian War of Independence Estonian War of Independence Winter War References Notes Citations Bibliography English Finnish External links Tepora, Tuomas: Finnish Civil War 1918, in: 1914-1918-online. International Encyclopedia of the First World War. Jalonen, Jussi: Tampere, Battle of, in: 1914-1918-online. International Encyclopedia of the First World War.
1918 (pictures of the Civil War on Flickr uploaded by the Vapriikki Museum Centre under CC-BY 2.0) Finna.fi (search service for information from Finnish archives, libraries and museums) Finnish Civil War 1918 (part of the 1914–1918 online International Encyclopedia of the First World War) The Representation of Violence in the Finnish (press-) Photography of the Civil War (requires an Adobe Flash player) Finnish War Victims 1914–22 1918 in Finland Civil wars involving the states and peoples of Europe Civil wars of the Industrial era Conflicts in 1918 Finland–Russia relations Finland–Soviet Union relations Proxy wars Revolution-based civil wars Subsidiary conflicts of World War I Wars involving Finland Wars involving Germany
Flynn effect
The Flynn effect is the substantial and long-sustained increase in both fluid and crystallized intelligence test scores that were measured in many parts of the world over the 20th century. When intelligence quotient (IQ) tests are initially standardized using a sample of test-takers, by convention the average of the test results is set to 100 and their standard deviation is set to 15 or 16 IQ points. When IQ tests are revised, they are again standardized using a new sample of test-takers, usually born more recently than the first. Again, the average result is set to 100. However, when the new test subjects take the older tests, in almost every case their average scores are significantly above 100. Test score increases have been continuous and approximately linear from the earliest years of testing to the present. For example, a study published in the year 2009 found that British children's average scores on the Raven's Progressive Matrices test rose by 14 IQ points from 1942 to 2008. Similar gains have been observed in many other countries in which IQ testing has long been widely used, including other Western European countries, Japan, and South Korea. There are numerous proposed explanations of the Flynn effect, such as the rise in efficiency of education, along with skepticism concerning its implications. Similar improvements have been reported for semantic and episodic memory. Some research suggests that there may be an ongoing reversed Flynn effect (i.e., a decline in IQ scores) in Norway, Denmark, Australia, Britain, the Netherlands, Sweden, Finland, and German-speaking countries, a development which appears to have started in the 1990s. In certain cases, this apparent reversal may be due to cultural changes which render parts of intelligence tests obsolete. Meta-analyses indicate that, overall, the Flynn effect continues, either at the same rate or at a slower rate in developed countries. Origin of term The Flynn effect is named for James R. 
Flynn, who did much to document it and promote awareness of its implications. The term itself was coined by Richard Herrnstein and Charles Murray in their 1994 book The Bell Curve. Although the general term for the phenomenon—referring to no researcher in particular—continues to be "secular rise in IQ scores", many textbooks on psychology and IQ testing have now followed the lead of Herrnstein and Murray in calling the phenomenon the Flynn effect. Rise in IQ IQ tests are updated periodically. For example, the Wechsler Intelligence Scale for Children (WISC), originally developed in 1949, was updated in 1974, 1991, 2003, and again in 2014. The revised versions are standardized based on the performance of test-takers in standardization samples. A standard score of IQ 100 is defined as the median performance of the standardization sample. Thus one way to see changes in norms over time is to conduct a study in which the same test-takers take both an old and new version of the same test. Doing so confirms IQ gains over time. Some IQ tests, for example, tests used for military draftees in NATO countries in Europe, report raw scores, and those also confirm a trend of rising scores over time. The average rate of increase seems to be about three IQ points per decade in the United States, as scaled by the Wechsler tests. The increasing test performance over time appears on every major test, in every age range, at every ability level, and in every modern industrialized country, although not necessarily at the same rate as in the United States. The increase was continuous and roughly linear from the earliest days of testing to the mid-1990s. Though the effect is most associated with IQ increases, a similar effect has been found with increases in attention and of semantic and episodic memory. 
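The renorming arithmetic described above can be sketched in a few lines of Python. The numbers below are hypothetical, chosen so that the raw-score gain works out to roughly the three IQ points per decade reported for the Wechsler tests; real tests use carefully stratified standardization samples rather than a simple z-score against one cohort.

```python
# Minimal sketch of norm-referenced IQ scoring (illustrative numbers only;
# real standardization uses stratified samples, not a plain z-score).

def iq_score(raw, norm_mean, norm_sd, sd_points=15):
    """Convert a raw test score to an IQ score against a norming sample:
    the sample mean maps to 100, one standard deviation to `sd_points`."""
    return 100 + sd_points * (raw - norm_mean) / norm_sd

# Hypothetical norming samples, three decades apart: the newer cohort
# performs better on the same raw items, so the norms shift upward.
norms_1970 = {"mean": 40.0, "sd": 8.0}
norms_2000 = {"mean": 44.8, "sd": 8.0}

raw = 44.8  # an average test-taker from the 2000 cohort

on_new_norms = iq_score(raw, norms_2000["mean"], norms_2000["sd"])  # 100.0
on_old_norms = iq_score(raw, norms_1970["mean"], norms_1970["sd"])  # 109.0

print(on_new_norms, on_old_norms)
```

On the newer norms the average test-taker scores exactly 100 by construction; scored against the thirty-year-old norms, the same performance comes out nine points higher, i.e. three points per decade, which is the pattern the renorming studies observe.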
Ulric Neisser estimated that using the IQ values of 1997, the average IQ of the United States in 1932, according to the first Stanford–Binet Intelligence Scales standardization sample, was 80. Neisser states that "Hardly any of them would have scored 'very superior', but nearly one-quarter would have appeared to be 'deficient.'" He also wrote that "Test scores are certainly going up all over the world, but whether intelligence itself has risen remains controversial." Trahan et al. (2014) found that the effect was about 2.93 points per decade, based on both Stanford–Binet and Wechsler tests; they also found no evidence the effect was diminishing. In contrast, Pietschnig and Voracek (2015) reported, in their meta-analysis of studies involving nearly 4 million participants, that the Flynn effect had decreased in recent decades. They also reported that the magnitude of the effect was different for different types of intelligence ("0.41, 0.30, 0.28, and 0.21 IQ points annually for fluid, spatial, full-scale, and crystallized IQ test performance, respectively"), and that the effect was stronger for adults than for children. Raven (2000) found that, as Flynn suggested, data interpreted as showing a decrease in many abilities with increasing age must be re-interpreted as showing that there has been a dramatic increase of these abilities with the date of birth. On many tests this occurs at all levels of ability. Some studies have found the gains of the Flynn effect to be particularly concentrated at the lower end of the distribution. Teasdale and Owen (1989), for example, found the effect primarily reduced the number of low-end scores, resulting in an increased number of moderately high scores, with no increase in very high scores. In another study, two large samples of Spanish children were assessed with a 30-year gap. 
Comparison of the IQ distributions indicated that the mean IQ scores on the test had increased by 9.7 points (the Flynn effect), the gains were concentrated in the lower half of the distribution and negligible in the top half, and the gains gradually decreased as the IQ of the individuals increased. Some studies have found a reverse Flynn effect with declining scores for those with high IQ. In 1987, Flynn took the position that the very large increase indicates that IQ tests do not measure intelligence but only a minor sort of "abstract problem-solving ability" with little practical significance. He argued that if IQ gains do reflect intelligence increases, there would have been consequent changes of our society that have not been observed (a presumed non-occurrence of a "cultural renaissance"). Flynn no longer endorses this view of intelligence and has since elaborated and refined his view of what rising IQ scores mean. Precursors to Flynn's publications Earlier investigators had discovered rises in raw IQ test scores in some study populations, but had not published general investigations of that issue in particular. Historian Daniel C. Calhoun cited earlier psychology literature on IQ score trends in his book The Intelligence of a People (1973). R. L. Thorndike drew attention to rises in Stanford-Binet scores in a 1975 review of the history of intelligence testing. Richard Lynn recorded an increase in Japanese IQ in 1982. Intelligence There is debate about whether the rise in IQ scores also corresponds to a rise in general intelligence, or only a rise in special skills related to taking IQ tests. Because children attend school longer now and have become much more familiar with the testing of school-related material, one might expect the greatest gains to occur on such school content-related tests as vocabulary, arithmetic or general information. 
Just the opposite is the case: abilities such as these have experienced relatively small gains and even occasional decreases over the years. Meta-analytic findings indicate that Flynn effects occur for tests assessing both fluid and crystallized abilities. For example, Dutch conscripts gained 21 points during only 30 years, or 7 points per decade, between 1952 and 1982. But this rise in IQ test scores is not wholly explained by an increase in general intelligence. Studies have shown that while test scores have improved over time, the improvement is not fully correlated with latent factors related to intelligence. Rushton argues that the gains in IQ over time are unrelated to general intelligence. Other researchers argue that the IQ gains described by the Flynn effect are due in part to increasing intelligence, and in part to increases in test-specific skills. Proposed explanations Schooling and test familiarity The duration of average schooling has increased steadily. One problem with this explanation is that, when older and more recent US subjects with similar educational levels are compared, the IQ gains appear almost undiminished within each such group. Many studies find that children who do not attend school score drastically lower on the tests than their regularly attending peers. During the 1960s, when some Virginia counties closed their public schools to avoid racial integration, compensatory private schooling was available only for Caucasian children. On average, the scores of African-American children who received no formal education during that period decreased at a rate of about six IQ points per year. Another explanation is an increased familiarity of the general population with tests and testing. For example, children who take the very same IQ test a second time usually gain five or six points. However, this seems to set an upper limit on the effects of test sophistication.
One problem with this explanation and others related to schooling is that in the US, the groups with greater test familiarity show smaller IQ increases. Early intervention programs have shown mixed results. Some preschool (ages 3–4) intervention programs like "Head Start" do not produce lasting changes of IQ, although they may confer other benefits. The "Abecedarian Early Intervention Project", an all-day program that provided various forms of environmental enrichment to children from infancy onward, showed IQ gains that did not diminish over time. The IQ difference between the groups, although only five points, was still present at age 12. Not all such projects have been successful. Also, such IQ gains may fade by age 18. Citing a high correlation between rising literacy rates and gains in IQ, David Marks has argued that the Flynn effect is caused by changes in literacy rates. Generally more stimulating environment Still another theory is that the general environment today is much more complex and stimulating. One of the most striking 20th-century changes in the human intellectual environment has come from the increase of exposure to many types of visual media. From pictures on the wall to movies to television to video games to computers, each successive generation has been exposed to richer optical displays than the one before and may have become more adept at visual analysis. This would explain why visual tests like the Raven's have shown the greatest increases. An increase only of particular forms of intelligence would explain why the Flynn effect has not caused a "cultural renaissance too great to be overlooked." In 2001, William Dickens and James Flynn presented a model for resolving several contradictory findings regarding IQ. They argue that the measure "heritability" includes both a direct effect of the genotype on IQ and also indirect effects such that the genotype changes the environment, thereby affecting IQ.
That is, those with a greater IQ tend to seek stimulating environments that further increase IQ. These reciprocal effects result in gene-environment correlation. The direct effect could initially have been very small, but feedback can create large differences in IQ. In their model, an environmental stimulus can have a very great effect on IQ, even for adults, but this effect also decays over time unless the stimulus continues (the model could be adapted to include possible factors, like nutrition during early childhood, that may cause permanent effects). The Flynn effect can be explained by a generally more stimulating environment for all people. The authors suggest that any program designed to increase IQ may produce long-term IQ gains if that program teaches children how to replicate the types of cognitively demanding experiences that produce IQ gains outside the program. To maximize lifetime IQ, the programs should also motivate them to continue searching for cognitively demanding experiences after they have left the program. Flynn in his 2007 book What Is Intelligence? further expanded on this theory. Environmental changes resulting from modernization, such as more intellectually demanding work, greater use of technology, and smaller families, have meant that a much larger proportion of people are more accustomed to manipulating abstract concepts such as hypotheses and categories than a century ago. Substantial portions of IQ tests deal with these abilities. Flynn gives, as an example, the question 'What do a dog and a rabbit have in common?' A modern respondent might say they are both mammals (an abstract, or a priori answer, which depends only on the meanings of the words dog and rabbit), whereas someone a century ago might have said that humans catch rabbits with dogs (a concrete, or a posteriori answer, which depended on what happened to be the case at that time). Nutrition Improved nutrition is another possible explanation.
Today's average adult from an industrialized nation is taller than a comparable adult of a century ago. That increase of stature, likely the result of general improvements in nutrition and health, has been at a rate of more than a centimeter per decade. Available data suggest that these gains have been accompanied by analogous increases in head size, and by an increase in the average size of the brain. This argument had been thought to suffer the difficulty that groups with smaller average body size (e.g. women, or people of Asian ancestry) do not have lower average IQs. A 2005 study presented data supporting the nutrition hypothesis, which predicts that gains will occur predominantly at the low end of the IQ distribution, where nutritional deprivation is probably most severe. An alternative interpretation of skewed IQ gains could be that improved education has been particularly important for this group. Richard Lynn makes the case for nutrition, arguing that cultural factors cannot typically explain the Flynn effect because its gains are observed even at infant and preschool levels, with rates of IQ test score increase about equal to those of school students and adults. Lynn states that "This rules out improvements in education, greater test sophistication, etc., and most of the other factors that have been proposed to explain the Flynn effect." He proposes that the most probable factor has been improvements in pre-natal and early post-natal nutrition. A century ago, nutritional deficiencies may have limited body and organ functionality, including skull volume. The first two years of life are a critical time for nutrition. The consequences of malnutrition can be irreversible and may include poor cognitive development, educability, and future economic productivity. On the other hand, Flynn has pointed to 20-point gains on Dutch military (Raven's type) IQ tests between 1952, 1962, 1972, and 1982.
He observes that the Dutch 18-year-olds of 1962 had a major nutritional handicap. They were either in the womb or recently born during the great Dutch famine of 1944, when German troops monopolized food and 18,000 people died of starvation. Yet, concludes Flynn, "they do not show up even as a blip in the pattern of Dutch IQ gains. It is as if the famine had never occurred." It appears that the effects of diet are gradual, taking effect over decades (affecting the mother as well as the child) rather than a few months. In support of the nutritional hypothesis, it is known that, in the United States, the average height before 1900 was about 10 cm (~4 inches) shorter than it is today. Possibly related to the Flynn effect is a similar change of skull size and shape during the last 150 years. A Norwegian study found that height gains were strongly correlated with intelligence gains until the cessation of height gains in military conscript cohorts towards the end of the 1980s. Both height and skull size increases probably result from a combination of phenotypic plasticity and genetic selection over this period. With only five or six human generations in 150 years, time for natural selection has been very limited, suggesting that increased skeletal size resulting from changes in population phenotypes is more likely than recent genetic evolution. It is well known that micronutrient deficiencies change the development of intelligence. For instance, one study has found that iodine deficiency causes a fall, on average, of 12 IQ points in China. Scientists James Feyrer, Dimitra Politi, and David N. Weil have found in the U.S. that the proliferation of iodized salt increased IQ by 15 points in some areas. Journalist Max Nisen has stated that, with this type of salt becoming popular, "the aggregate effect has been extremely positive." Daley et al.
(2003) found a significant Flynn effect among children in rural Kenya, and concluded that nutrition was one of the hypothesized explanations that best explained their results (the others were parental literacy and family structure). Infectious diseases Eppig, Fincher, and Thornhill (2009) argue that "From an energetics standpoint, a developing human will have difficulty building a brain and fighting off infectious diseases at the same time, as both are very metabolically costly tasks" and that "the Flynn effect may be caused in part by the decrease in the intensity of infectious diseases as nations develop." They suggest that improvements in gross domestic product (GDP), education, literacy, and nutrition may have an effect on IQ mainly through reducing the intensity of infectious diseases. In a similar study looking at US states, Eppig, Fincher, and Thornhill (2011) found that states with a higher prevalence of infectious diseases had lower average IQs. The effect remained after controlling for the effects of wealth and educational variation. Atheendar Venkataramani (2010) studied the effect of malaria on IQ in a sample of Mexicans. Malaria eradication during the birth year was associated with increases in IQ. It also increased the probability of employment in a skilled occupation. The author suggests that this may be one explanation for the Flynn effect and that this may be an important explanation for the link between national malaria burden and economic development. A literature review of 44 papers states that cognitive abilities and school performance were shown to be impaired in sub-groups of patients (with either cerebral malaria or uncomplicated malaria) when compared with healthy controls. Studies comparing cognitive functions before and after treatment for acute malarial illness continued to show significantly impaired school performance and cognitive abilities even after recovery.
Malaria prophylaxis was shown to improve cognitive function and school performance in clinical trials when compared to placebo groups. Heterosis Heterosis, or hybrid vigor associated with historical reductions of the levels of inbreeding, has been proposed by Michael Mingroni as an alternative explanation of the Flynn effect. However, James Flynn has pointed out that even if everyone mated with a sibling in 1900, subsequent increases in heterosis would not be a sufficient explanation of the observed IQ gains. Reduction of lead in gasoline One study found the drop in blood lead levels in the United States from the 1970s to 2007 correlated with a 4-5 point increase in IQ. Possible end of progression Jon Martin Sundet and colleagues (2004) examined scores on intelligence tests given to Norwegian conscripts between the 1950s and 2002. They found that the increase of scores of general intelligence stopped after the mid-1990s and declined in numerical reasoning sub-tests. Teasdale and Owen (2005) examined the results of IQ tests given to Danish male conscripts. Between 1959 and 1979 the gains were 3 points per decade. Between 1979 and 1989 the increase approached 2 IQ points. Between 1989 and 1998 the gain was about 1.3 points. Between 1998 and 2004 IQ declined by about the same amount as it gained between 1989 and 1998. They speculate that "a contributing factor in this recent fall could be a simultaneous decline in proportions of students entering 3-year advanced-level school programs for 16–18-year-olds." The same authors in a more comprehensive 2008 study, again on Danish male conscripts, found that there was a 1.5-point increase between 1988 and 1998, but a 1.5-point decrease between 1998 and 2003/2004. A possible contributing factor to the more recent decline may be the changes in the Danish educational system. Another may be the rising proportion of immigrants or their immediate descendants in Denmark. 
This is supported by data on Danish draftees where first or second-generation immigrants with Danish nationality score below average. In Australia, the IQ of 6–12-year-olds as measured by the Coloured Progressive Matrices has shown no increase from 1975 to 2003. In the United Kingdom, a study by Flynn (2009) found that tests carried out in 1980 and again in 2008 show that the IQ score of an average 14-year-old dropped by more than two points over the period. For the upper half of the results, the performance was even worse: average IQ scores declined by six points. However, children aged between five and 10 saw their IQs increase by up to half a point a year over the three decades. Flynn argues that the abnormal drop in British teenage IQ could be due to youth culture having "stagnated" or even dumbed down. He also states that the youth culture is more oriented towards computer games than towards reading and holding conversations. Researcher Richard House, commenting on the study, also mentions the computer culture diminishing the reading of books, as well as a tendency towards teaching to the test. Stefansson et al. (2017) argue for a decline in polygenic scores pertaining to educational attainment in Icelandic individuals born from 1910 to 1990. They point out that the observed effect is very small, however, and may only be of concern if the trend is assumed to be larger in genomic effect and to continue across centuries. Bratsberg & Rogeberg (2018) present evidence that the Flynn effect in Norway has reversed, and that both the original rise in mean IQ scores and their subsequent decline were caused by environmental factors.
They conclude that environmental factors explain all or almost all of the decline, and that the hypothesized declines in genotypic IQ are negligible, although they "cannot rule out the theoretical possibility of negative selection on a genetic component that is masked when assessed using environmentally influenced measures", not being able to rule out the decline posited by Stefansson et al. One possible explanation of a worldwide decline in intelligence, suggested by the World Health Organization and the Forum of International Respiratory Societies' Environmental Committee, is an increase in air pollution, which now affects over 90% of the world's population. IQ group differences If the Flynn effect has ended in developed nations but continues in less developed ones, this would tend to diminish national differences in IQ scores. Also, if the Flynn effect has ended for the majority in developed nations, it may still continue for minorities, especially for groups like immigrants where many may have received poor nutrition during early childhood or have had other disadvantages. A study in the Netherlands found that children of non-Western immigrants had improvements in g, educational achievements, and work proficiency compared to their parents, although there were still remaining differences compared to ethnic Dutch. In the United States, the IQ gap between black and white people was gradually closing over the last decades of the 20th century, as black test-takers increased their average scores relative to white test-takers. For instance, Vincent reported in 1991 that the black–white IQ gap was decreasing among children, but that it was remaining constant among adults. Similarly, a 2006 study by Dickens and Flynn estimated that the difference between mean scores of black people and white people closed by about 5 or 6 IQ points between 1972 and 2002, a reduction of about one-third. In the same period, the educational achievement disparity also diminished.
Reviews by Flynn and Dickens, Mackintosh, and Nisbett et al. all concluded that the gradual closing of the gap was a real phenomenon. Flynn has commented that he never claimed that the Flynn effect has the same causes as the black-white gap, but that it shows that environmental factors can create IQ differences of a magnitude similar to the gap. A meta-analysis which examined whether g factor and IQ gains from the Flynn effect are related found a small negative correlation between the two, which may indicate that group differences and the Flynn effect have differing causes. The Flynn effect has also been part of the discussions regarding Spearman's hypothesis, which states that differences in the g factor are the major source of differences between blacks and whites observed in many studies of race and intelligence.

See also
Academic inflation
Environment and intelligence
Euthenics
Gene–environment correlation
Grade inflation
Impact of health on intelligence
Intelligence

External links
Beyond the Flynn Effect – a 2006 lecture by James R. Flynn at the University of Cambridge
The Flynn Effect – Indiana University
Marguerite Holloway, "Flynn's effect", Scientific American, January 1999 (online edition)
"Heritability Estimates Versus Large Environmental Effects: The IQ Paradox Resolved" – a 2001 article by Dickens and Flynn
"Dome Improvement" (Wired article)
Malcolm Gladwell in The New Yorker on race, I.Q., and the Flynn effect
Increasing intelligence: the Flynn effect

Intelligence quotient
Race and intelligence controversy
https://en.wikipedia.org/wiki/Field%20ion%20microscope
Field ion microscope
The field ion microscope (FIM), invented by Erwin Müller in 1951, is a type of microscope that can be used to image the arrangement of atoms at the surface of a sharp metal tip. On October 11, 1955, Müller and his Ph.D. student Kanwar Bahadur (Pennsylvania State University) observed individual tungsten atoms on the surface of a sharply pointed tungsten tip by cooling it to 21 K and employing helium as the imaging gas. Müller and Bahadur were the first people to observe individual atoms directly. Introduction In FIM, a sharp (<50 nm tip radius) metal tip is produced and placed in an ultra-high vacuum chamber, which is backfilled with an imaging gas such as helium or neon. The tip is cooled to cryogenic temperatures (20–100 K). A positive voltage of 5 to 10 kilovolts is applied to the tip. Gas atoms adsorbed on the tip are ionized by the strong electric field in the vicinity of the tip (thus, "field ionization"), becoming positively charged and being repelled from the tip. The curvature of the surface near the tip causes a natural magnification — ions are repelled in a direction roughly perpendicular to the surface (a "point projection" effect). A detector is placed so as to collect these repelled ions; the image formed from all the collected ions can be of sufficient resolution to image individual atoms on the tip surface. Unlike conventional microscopes, where the spatial resolution is limited by the wavelength of the particles used for imaging, the FIM is a projection-type microscope with atomic resolution and an approximate magnification of a few million times. Design, limitations and applications FIM, like field emission microscopy (FEM), consists of a sharp sample tip and a fluorescent screen (now replaced by a multichannel plate) as its key elements. However, there are some essential differences: The tip potential is positive. The chamber is filled with an imaging gas (typically He or Ne at 10⁻⁵ to 10⁻³ Torr).
The tip is cooled to low temperatures (~20–80 K). Like FEM, the field strength at the tip apex is typically a few V/Å. The experimental set-up and image formation in FIM are illustrated in the accompanying figures. In FIM, the presence of a strong field is critical. The imaging gas atoms (He, Ne) near the tip are polarized by the field and, since the field is non-uniform, the polarized atoms are attracted towards the tip surface. The imaging atoms then lose their kinetic energy in a series of hops and accommodate to the tip temperature. Eventually, the imaging atoms are ionized by electron tunneling into the surface, and the resulting positive ions are accelerated along the field lines to the screen to form a highly magnified image of the sample tip. In FIM, the ionization takes place close to the tip, where the field is strongest. The electron that tunnels from the atom is picked up by the tip. There is a critical distance, xc, at which the tunneling probability is a maximum. This distance is typically about 0.4 nm. The very high spatial resolution and high contrast for features on the atomic scale arise from the fact that the electric field is enhanced in the vicinity of the surface atoms because of the higher local curvature. The resolution of FIM is limited by the thermal velocity of the imaging ion. Resolution of the order of 1 Å (atomic resolution) can be achieved by effective cooling of the tip. The application of FIM, like that of FEM, is limited to materials which can be fabricated in the shape of a sharp tip, can be used in an ultra-high vacuum (UHV) environment, and can tolerate the high electrostatic fields. For these reasons, refractory metals with high melting temperatures (e.g. W, Mo, Pt, Ir) are conventional objects for FIM experiments. Metal tips for FEM and FIM are prepared by electropolishing (electrochemical polishing) of thin wires. However, these tips usually contain many asperities.
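The apex field, the point-projection magnification quoted earlier, and the critical ionization distance xc can all be estimated with simple back-of-the-envelope formulas. The sketch below is illustrative, not from the article: the field-reduction factor k ≈ 5 and image-compression factor β ≈ 1.5 are assumed typical values, as are the helium ionization energy (24.6 eV) and the tungsten work function (~4.5 eV).

```python
# Back-of-the-envelope FIM estimates (illustrative sketch).
# The factors k (field reduction) and beta (image compression) are
# assumed typical values, not figures given in the text.

def apex_field(voltage_V, tip_radius_m, k=5.0):
    """Apex field F = V / (k * r), returned in V/Angstrom."""
    return voltage_V / (k * tip_radius_m) * 1e-10  # V/m -> V/A

def magnification(screen_distance_m, tip_radius_m, beta=1.5):
    """Point-projection magnification M ~ L / (beta * r)."""
    return screen_distance_m / (beta * tip_radius_m)

def critical_distance(field_V_per_A, ionization_eV=24.6, work_fn_eV=4.5):
    """Standard field-ionization condition x_c ~ (I - phi) / (e F),
    in Angstrom; electrons can only tunnel into the tip beyond x_c."""
    return (ionization_eV - work_fn_eV) / field_V_per_A

F = apex_field(10e3, 50e-9)     # 10 kV on a 50 nm tip -> 4.0 V/A ("a few V/A")
M = magnification(0.10, 50e-9)  # 10 cm tip-screen distance -> ~1.3 million
xc = critical_distance(5.0)     # ~4 A = 0.4 nm, matching the value quoted above
print(F, M, xc)
```

With these assumed values the three quantities reproduce the figures quoted in the text: a field of a few V/Å, a magnification of about a million, and a critical distance of roughly 0.4 nm.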
The final preparation procedure involves the in situ removal of these asperities by field evaporation, simply by raising the tip voltage. Field evaporation is a field-induced process which involves the removal of atoms from the surface itself at very high field strengths and typically occurs in the range 2–5 V/Å. The effect of the field in this case is to reduce the effective binding energy of the atom to the surface and to give, in effect, a greatly increased evaporation rate relative to that expected at the same temperature at zero field. This process is self-regulating, since the atoms at positions of high local curvature, such as adatoms or ledge atoms, are removed preferentially. The tips used in FIM are sharper (tip radius 100–300 Å) than those used in FEM experiments (tip radius ~1000 Å). FIM has been used to study the dynamical behavior of surfaces and the behavior of adatoms on surfaces. The problems studied include adsorption-desorption phenomena, surface diffusion of adatoms and clusters, adatom-adatom interactions, step motion, equilibrium crystal shape, etc. However, the results may be affected by the limited surface area (i.e. edge effects) and by the presence of the large electric field.

See also
Atom probe
Electron microscope
Field emission microscopy
List of surface analysis methods

References
K. Oura, V. G. Lifshits, A. A. Saranin, A. V. Zotov and M. Katayama, Surface Science: An Introduction (Springer-Verlag, Berlin Heidelberg, 2003).
John B. Hudson, Surface Science: An Introduction (Butterworth-Heinemann, 1992).

External links
Northwestern University Center for Atom-Probe Tomography

Microscopes
Scientific techniques
https://en.wikipedia.org/wiki/First%20Battle%20of%20El%20Alamein
First Battle of El Alamein
The First Battle of El Alamein (1–27 July 1942) was a battle of the Western Desert Campaign of the Second World War, fought in Egypt between Axis forces (Germany and Italy) of Panzer Army Africa (Panzerarmee Afrika), which included the Afrika Korps under Field Marshal (Generalfeldmarschall) Erwin Rommel, and Allied (British Imperial and Commonwealth) forces (United Kingdom, British India, Australia, South Africa and New Zealand) of the Eighth Army (General Claude Auchinleck). The British prevented a second advance by the Axis forces into Egypt. Axis positions near El Alamein were dangerously close to Alexandria and to the ports and cities of Egypt, the base facilities of the Commonwealth forces and the Suez Canal. However, the Axis forces were too far from their base at Tripoli in Libya to remain at El Alamein indefinitely, which led both sides to accumulate supplies for more offensives, against the constraints of time and distance. The battle and the Second Battle of El Alamein three months later are highly regarded within some of the countries that took part. In New Zealand, this is due to the country's significant contribution to the defence of El Alamein, especially the heavy role the Māori Battalion played. Members of this battalion have been labelled war heroes since, such as commander Frederick Baker, James Henare and Eruera Te Whiti o Rongomai Love, the last of whom was killed in action. Background Retreat from Gazala Following their defeat at the Battle of Gazala in eastern Libya in June 1942, the British Eighth Army, commanded by Lieutenant-General Neil Ritchie, had retreated east from the Gazala line into north-western Egypt as far as Mersa Matruh, well inside the Egyptian border. Ritchie had decided not to hold the defences on the Egyptian border, because the defensive plan there was for infantry to hold defended localities and a strong armoured force behind them to meet any attempts to penetrate or outflank the fixed defences.
Since General Ritchie had virtually no armoured units left fit to fight, the infantry positions would be defeated in detail. The Mersa defence plan also included an armoured reserve but in its absence Ritchie believed he could organise his infantry to cover the minefields between the defended localities to prevent Axis engineers from having undisturbed access. To defend the Matruh line, Ritchie placed the 10th Indian Infantry Division (in Matruh itself) and the 50th (Northumbrian) Infantry Division (down the coast at Gerawla) under X Corps HQ, newly arrived from Syria. Inland from X Corps would be XIII Corps with the 5th Indian Infantry Division (with only one infantry brigade, the 29th Indian, and two artillery regiments) around Sidi Hamza, inland, and the newly arrived 2nd New Zealand Division (short one brigade, the 6th, which had been left out of combat in case the division was captured and it would be needed to serve as the nucleus of a new division) at Minqar Qaim (on the escarpment inland), with the 1st Armoured Division in the open desert to the south. The 1st Armoured Division had taken over the 4th and 22nd Armoured Brigades from the 7th Armoured Division, which by this time had only three tank regiments (battalions) between them. On 25 June, General Claude Auchinleck—Commander-in-Chief (C-in-C) Middle East Command—relieved Ritchie and assumed direct command of the Eighth Army himself. He decided not to seek a decisive confrontation at the Mersa Matruh position. He concluded that his inferiority in armour after the Gazala defeat meant he would be unable to prevent Rommel either breaking through his centre or enveloping his open left flank to the south in the same way he had at Gazala. He decided instead to employ delaying tactics while withdrawing further east to a more defensible position near El Alamein on the Mediterranean coast.
To the south of El Alamein, the steep slopes of the Qattara Depression ruled out the possibility of Axis armour moving around the southern flank of his defences and limited the width of the front he had to defend. Battle of Mersa Matruh While preparing the Alamein positions, Auchinleck fought strong delaying actions, first at Mersa Matruh on 26–27 June and then at Fuka on 28 June. The late change of orders resulted in some confusion in the forward formations (X Corps and XIII Corps) between the desire to inflict damage on the enemy and the intention not to get trapped in the Matruh position but to retreat in good order. The result was poor co-ordination between the two forward corps and the units within them. Late on 26 June, the German 90th Light and 21st Panzer Divisions managed to find their way through the minefields in the centre of the front. Early on 27 June, resuming its advance, the 90th Light was checked by the British 50th Division's artillery. Meanwhile, the 15th and 21st Panzer Divisions advanced east above and below the escarpment. The 15th Panzer was blocked by the 4th Armoured and 7th Motor Brigades, but the 21st Panzer was ordered on to attack Minqar Qaim. Rommel ordered the 90th Light to resume its advance, requiring it to cut the coast road behind the 50th Division by the evening. As the 21st Panzer moved on Minqar Qaim, the 2nd New Zealand Division found itself surrounded but broke out on the night of 27/28 June without serious losses and withdrew east. Auchinleck had planned a second delaying position at Fuka, east of Matruh, and at 21:20 he issued the orders for a withdrawal to Fuka. Confusion in communication led to the division withdrawing immediately to the El Alamein position. X Corps, having made an unsuccessful attempt to secure a position on the escarpment, were out of touch with Eighth Army from 19:30 until 04:30 the next morning. Only then did they discover that the withdrawal order had been given.
The withdrawal of XIII Corps had left the southern flank of X Corps on the coast at Matruh exposed and their line of retreat compromised by the cutting of the coastal road east of Matruh. They were ordered to break out southwards into the desert and then make their way east. Auchinleck ordered XIII Corps to provide support but they were in no position to do so. At 21:00 on 28 June, X Corps—organised into brigade groups—headed south. In the darkness, there was considerable confusion as they came across enemy units laagered for the night. In the process, the 5th Indian Division in particular sustained heavy casualties, including the destruction of the 29th Indian Infantry Brigade at Fuka. Axis forces captured more than 6,000 prisoners, in addition to 40 tanks and an enormous quantity of supplies. Prelude Defences at El Alamein Alamein itself was an inconsequential railway station on the coast. To the south lay the Ruweisat Ridge, a low stony prominence that gave excellent observation for many miles over the surrounding desert; further south was the Qattara Depression. The line the British chose to defend stretched between the sea and the Depression, which meant that Rommel could outflank it only by taking a significant detour to the south and crossing the Sahara Desert. The British Army in Egypt recognised this before the war and had the Eighth Army begin construction of several "boxes" (localities with dug-outs, surrounded by minefields and barbed wire), the most developed being around the railway station at Alamein. Most of the "line" was open, empty desert. Lieutenant-General William Norrie (General Officer Commanding [GOC] XXX Corps) organised the position and started to construct three defended "boxes". The first and strongest, at El Alamein on the coast, had been partly wired and mined by the 1st South African Division.
The Bab el Qattara box, south-west of the Ruweisat Ridge, had been dug but had not been wired or mined, while at the Naq Abu Dweis box, on the edge of the Qattara Depression, very little work had been done. The British position in Egypt was desperate; the rout from Mersa Matruh had created a panic in the British headquarters at Cairo, something later called "the Flap". On what came to be referred to as "Ash Wednesday", at British headquarters, rear echelon units and the British Embassy, papers were hurriedly burned in anticipation of the fall of the city. Auchinleck—although believing he could stop Rommel at Alamein—felt he could not ignore the possibility that he might once more be outmanoeuvred or outfought. To maintain his army, plans had to be made for the possibility of a further retreat whilst maintaining morale and retaining the support and co-operation of the Egyptians. Defensive positions were constructed west of Alexandria and on the approaches to Cairo while considerable areas in the Nile delta were flooded. The Axis, too, believed that the capture of Egypt was imminent; the Italian leader Benito Mussolini—sensing a historic moment—flew to Libya to prepare for his triumphal entry into Cairo. The scattering of X Corps at Mersa Matruh disrupted Auchinleck's plan for occupying the Alamein defences. On 29 June, he ordered XXX Corps—the 1st South African, 5th and 10th Indian Divisions—to take the coastal sector on the right of the front and XIII Corps—the 2nd New Zealand and 4th Indian Divisions—to be on the left. The remains of the 1st Armoured Division and the 7th Armoured Division were to be held as a mobile army reserve. His intention was for the fixed defensive positions to channel and disorganise the enemy's advance while mobile units attacked their flanks and rear. On 30 June, Rommel's Panzerarmee Afrika approached the Alamein position. The Axis forces were exhausted and understrength.
Rommel had driven them forward ruthlessly, confident that, provided he struck quickly before Eighth Army had time to settle, his momentum would take him through the Alamein position and he could then advance to the Nile with little further opposition. Supplies remained a problem because the Axis staff had originally expected a pause of six weeks after the capture of Tobruk. German air units were also exhausted and provided little help against the RAF's all-out attack on the Axis supply lines which, with the arrival of United States Army Air Forces (USAAF) heavy bombers, could reach as far as Benghazi. Although captured supplies proved useful, water and ammunition were constantly in short supply, while a shortage of transport impeded the distribution of the supplies that the Axis forces did have. Axis plan of attack Rommel's plan was for the 90th Light Division and the 15th and 21st Panzer Divisions of the Afrika Korps to penetrate the Eighth Army lines between the Alamein box and Deir el Abyad (which he believed was defended). The 90th Light Division was then to veer north to cut the coastal road and trap the defenders of the Alamein box (which Rommel thought was occupied by the remains of the 50th Infantry Division), while the Afrika Korps would veer right to attack the rear of XIII Corps. An Italian division was to attack the Alamein box from the west and another was to follow the 90th Light Division. The Italian XX Corps was to follow the Afrika Korps and deal with the Qattara box while the 133rd Armoured Division "Littorio" and German reconnaissance units protected the right flank. Rommel had planned to attack on 30 June but supply and transport difficulties resulted in a day's delay, vital to the defending forces reorganising on the Alamein line.
On 30 June, the 90th Light Division was still short of its start line, the 21st Panzer Division was immobilised through lack of fuel and the promised air support had yet to move into its advanced airfields. Panzer Army Africa attacks At 03:00 on 1 July, the 90th Light Division advanced east but strayed too far north, ran into the 1st South African Division's defences and became pinned down. The 15th and 21st Panzer Divisions of the Afrika Korps were delayed by a sandstorm and then a heavy air attack. It was broad daylight by the time they circled round the back of Deir el Abyad, where they found the feature to the east of it occupied by the 18th Indian Infantry Brigade which, after a hasty journey from Iraq, had occupied the exposed position just west of Ruweisat Ridge and east of Deir el Abyad at Deir el Shein late on 28 June to create one of Norrie's additional defensive boxes. At about 10:00 on 1 July, the 21st Panzer Division attacked Deir el Shein. The 18th Indian Infantry Brigade—supported by twenty-three 25-pounder gun-howitzers, 16 of the new 6-pounder anti-tank guns and nine Matilda tanks—held out the whole day in desperate fighting but by evening the Germans succeeded in overrunning them. The time they bought allowed Auchinleck to organise the defence of the western end of Ruweisat Ridge. The 1st Armoured Division had been sent to intervene at Deir el Shein. They ran into the 15th Panzer Division just south of Deir el Shein and drove it west. By the end of the day's fighting, the Afrika Korps had 37 tanks left out of its initial complement of 55. During the early afternoon, the 90th Light had extricated itself from the El Alamein box defences and resumed its move eastward. It came under artillery fire from the three South African brigade groups and was forced to dig in. On 2 July, Rommel ordered the resumption of the offensive.
Once again, the 90th Light failed to make progress, so Rommel called on the Afrika Korps to abandon its planned sweep southward and instead join the effort to break through to the coast road by attacking east toward Ruweisat Ridge. The British defence of Ruweisat Ridge relied on an improvised formation called "Robcol", comprising a regiment each of field artillery and light anti-aircraft artillery and a company of infantry. Robcol—in line with normal British Army practice for ad hoc formations—was named after its commander, Brigadier Robert Waller, the Commander Royal Artillery of the 10th Indian Infantry Division. Robcol was able to buy time, and by late afternoon the two British armoured brigades had joined the battle, the 4th Armoured Brigade engaging the 15th Panzer and the 22nd Armoured Brigade the 21st Panzer. They drove back repeated attacks by the Axis armour, which then withdrew before dusk. The British reinforced Ruweisat on the night of 2 July. The now enlarged Robcol became "Walgroup". Meanwhile, the Royal Air Force (RAF) made heavy air attacks on the Axis units. The next day, 3 July, Rommel ordered the Afrika Korps to resume its attack on the Ruweisat ridge with the Italian XX Motorised Corps on its southern flank. The Italian X Corps, meanwhile, was to hold El Mreir. By this stage the Afrika Korps had only 26 operational tanks. There was a sharp armoured exchange south of Ruweisat ridge during the morning and the main Axis advance was held. On 3 July, the RAF flew 780 sorties. To relieve the pressure on the right and centre of the Eighth Army line, XIII Corps on the left advanced from the Qattara box (known to the New Zealanders as the Kaponga box). The plan was that the New Zealand 2nd Division—with the remains of the Indian 5th Division and the 7th Motor Brigade under its command—would swing north to threaten the Axis flank and rear.
This force encountered the artillery of the 132nd Armoured Division "Ariete", which was driving on the southern flank of the division as it attacked Ruweisat. The Italian commander ordered his battalions to fight their way out independently but the Ariete lost 531 men (about 350 of them prisoners), 36 pieces of artillery, six (possibly eight) tanks and 55 trucks. By the end of the day, the Ariete Division had only five tanks. The day ended once again with the Afrika Korps and the Ariete coming off second best against the superior numbers of the British 22nd Armoured and 4th Armoured Brigades, frustrating Rommel's attempts to resume his advance. The RAF once again played its part, flying 900 sorties during the day. To the south, on 5 July the New Zealand group resumed its advance northwards towards El Mreir, intending to cut the rear of the Ariete Division. Heavy fire from the Italian 27th Infantry Division "Brescia" at El Mreir, north of the Qattara box, however, checked their progress and led XIII Corps to call off its attack. Rommel digs in At this point, Rommel decided his exhausted forces could make no further headway without resting and regrouping. He reported to the German High Command that his three German divisions numbered just 1,200–1,500 men each and that resupply was proving highly problematic because of enemy interference from the air. He expected to have to remain on the defensive for at least two weeks. Rommel was by this time suffering from the extended length of his supply lines. The Allied Desert Air Force (DAF) was concentrating fiercely on his fragile and elongated supply routes while British mobile columns moving west and striking from the south were causing havoc in the Axis rear echelons. Rommel could afford these losses even less since shipments from Italy had been substantially reduced: the supply tonnage received in June was a fraction of May's, and only 400 vehicles arrived compared with 2,000 in May.
Meanwhile, the Eighth Army was reorganising and rebuilding, benefiting from its short lines of communication. By 4 July, the Australian 9th Division had entered the line in the north, and on 9 July the Indian 5th Infantry Brigade also returned, taking over the Ruweisat position. At the same time, the fresh Indian 161st Infantry Brigade reinforced the depleted Indian 5th Infantry Division. Tel el Eisa On 8 July, Auchinleck ordered the new XXX Corps commander—Lieutenant-General William Ramsden—to capture the low ridges at Tel el Eisa and Tel el Makh Khad and then to push mobile battle groups south toward Deir el Shein and raiding parties west toward the airfields at El Daba. Meanwhile, XIII Corps would prevent the Axis from moving troops north to reinforce the coastal sector. Ramsden tasked the Australian 9th Division, with the 44th Royal Tank Regiment under command, with the Tel el Eisa objective, and the South African 1st Division, with eight supporting tanks, with Tel el Makh Khad. The raiding parties were to be provided by the 1st Armoured Division. Following a bombardment which started at 03:30 on 10 July, the Australian 26th Brigade launched an attack against the ridge north of Tel el Eisa station along the coast (Trig 33). The bombardment was the heaviest barrage yet experienced in North Africa and created panic among the inexperienced soldiers of the Italian 60th Infantry Division "Sabratha", who had only just occupied sketchy defences in the sector. The Australian attack took more than 1,500 prisoners, routed an Italian division and overran the German Signals Intercept Company 621. Meanwhile, the South Africans had by late morning taken Tel el Makh Khad and were in covering positions. Elements of the German 164th Light Division and the Italian 101st Motorised Division "Trieste" arrived to plug the gap torn in the Axis defences.
That afternoon and evening, tanks from the German 15th Panzer and Italian Trieste Divisions launched counter-attacks against the Australian positions, which failed in the face of overwhelming Allied artillery and the Australian anti-tank guns. At first light on 11 July, the Australian 2/24th Battalion, supported by tanks from 44th Royal Tank Regiment, attacked the western end of Tel el Eisa hill (Point 24). By early afternoon, the feature was captured and was then held against a series of Axis counter-attacks throughout the day. A small column of armour, motorised infantry, and guns then set off to raid Deir el Abyad and caused a battalion of Italian infantry to surrender. Its progress was checked at the Miteirya ridge and it was forced to withdraw that evening to the El Alamein box. During the day, more than 1,000 Italian prisoners were taken. On 12 July, the 21st Panzer Division launched a counter-attack against Trig 33 and Point 24, which was beaten off after a 2½-hour fight, with more than 600 German dead and wounded left strewn in front of the Australian positions. The next day, the 21st Panzer Division launched an attack against Point 33 and the South African positions in the El Alamein box, where the Royal Durban Light Infantry (RDLI) faced the full force of the German attacks. The RDLI did not have adequate anti-tank guns, and German artillery cut the South African telephone cables, disrupting field artillery support. The attack was nevertheless halted by intense artillery fire from the defenders. Although the South Africans repulsed the German attack, by 16:10 German tanks and dive bombers had advanced to within 300 metres of the South African positions. The 9th Australian Division's field artillery and the British 7th Medium Regiment had to assist in repulsing the German attack. At last light, the British 79th Anti-Tank Regiment was deployed to assist the South African forces, but by then the German attack was petering out.
The South African losses on 13 July totalled nine dead and 42 wounded. Though their casualties were relatively light, the South Africans had shown considerable skill in withstanding the German attacks. Had the El Alamein box been captured by Rommel's forces, the consequences for the Eighth Army would have been devastating: the El Alamein line would have been ruptured and the Australian forces cut off, forcing a general retreat to the Nile Delta. Rommel was still determined to drive the British forces from the northern salient. Although the Australian defenders had been forced back from Point 24, heavy casualties had been inflicted on 21st Panzer Division. Another attack was mounted on 15 July but made no ground against tenacious resistance. On 16 July, the Australians—supported by British tanks—launched an attack to try to take Point 24 but were forced back by German counter-attacks, suffering nearly fifty per cent casualties. After seven days of fierce fighting, the battle in the north for the Tel el Eisa salient petered out. The Australian 9th Division estimated that at least 2,000 Axis troops had been killed and more than 3,700 prisoners of war taken in the battle. Possibly the most important outcome of the battle, however, was the Australian capture of Signals Intercept Company 621, which had provided Rommel with priceless intelligence from British radio communications.

First Battle of Ruweisat Ridge

As the Axis forces dug in, Auchinleck—having drawn a number of German units to the coastal sector during the Tel el Eisa fighting—developed a plan, codenamed Operation Bacon, to attack the Italian 17th Infantry Division "Pavia" and the Brescia Division in the centre of the front at the Ruweisat ridge. Signals intelligence was giving Auchinleck clear details of the Axis order of battle and force dispositions.
His policy was to "...hit the Italians wherever possible in view of their low morale and because the Germans cannot hold extended fronts without them." The intention was for the 4th New Zealand Brigade and 5th New Zealand Brigade (on 4th Brigade's right) to attack north-west to seize the western part of the ridge, and, on their right, the Indian 5th Infantry Brigade to capture the eastern part of the ridge, in a night attack. The 2nd Armoured Brigade would then pass through the centre of the infantry objectives to exploit toward Deir el Shein and the Miteirya Ridge. On the left, the 22nd Armoured Brigade would be ready to move forward to protect the infantry as they consolidated on the ridge. The attack commenced at 23:00 on 14 July. The two New Zealand brigades took their objectives shortly before dawn on 15 July, but minefields and pockets of resistance created disarray among the attackers. A number of pockets of resistance were left behind the forward troops' advance, which impeded the move forward of reserves, artillery, and support arms. As a result, the New Zealand brigades occupied exposed positions on the ridge without support weapons except for a few anti-tank guns. More significantly, the two British armoured brigades failed to move forward to protect the infantry. At first light, a detachment from the 15th Panzer Division's 8th Panzer Regiment launched a counter-attack against the New Zealand 4th Brigade's 22nd Battalion. A sharp exchange knocked out their anti-tank guns, and the infantry, exposed in the open, had no alternative but to surrender. About 350 New Zealanders were taken prisoner. While the 2nd New Zealand Division attacked the western slopes of Ruweisat Ridge, the Indian 5th Brigade made small gains on the eastern end of the ridge. By 07:00, word had finally reached 2nd Armoured Brigade, which started to move north-west.
Two regiments became entangled in a minefield, but the third was able to join the Indian 5th Infantry Brigade as it renewed its attack. With the help of the armour and artillery, the Indians were able to take their objectives by early afternoon. Meanwhile, the 22nd Armoured Brigade had been engaged at Alam Nayil by the 90th Light Division and the Ariete Armoured Division, advancing from the south. While—with help from mobile infantry and artillery columns from 7th Armoured Division—they pushed back the Axis probe with ease, they were prevented from advancing north to protect the New Zealand flank. Seeing the Brescia and Pavia under pressure, Rommel rushed German troops to Ruweisat. By 15:00, the 3rd Reconnaissance Regiment and part of 21st Panzer Division from the north, and the 33rd Reconnaissance Regiment and the Baade Group (comprising elements from 15th Panzer Division) from the south, were in place under Lieutenant-General (General der Panzertruppe) Walther Nehring. At 17:00, Nehring launched his counter-attack. The 4th New Zealand Brigade was still short of support weapons and also, by this time, ammunition. Once again, the anti-tank defences were overwhelmed and about 380 New Zealanders were taken prisoner, including Captain Charles Upham, who gained a second Victoria Cross for his actions, which included destroying a German tank and several guns and vehicles with grenades despite being shot through the elbow by a machine-gun bullet. At about 18:00, the brigade HQ was overrun. At about 18:15, 2nd Armoured Brigade engaged the German armour and halted the Axis eastward advance. At dusk, Nehring broke off the action. Early on 16 July, Nehring renewed his attack. The Indian 5th Infantry Brigade pushed the attackers back, but it was clear from intercepted radio traffic that a further attempt would be made. Strenuous preparations were made to dig in anti-tank guns, artillery fire plans were organised, and a regiment from the 22nd Armoured Brigade was sent to reinforce the 2nd Armoured Brigade.
When the attack resumed late in the afternoon, it was repulsed. After the battle, the Indians counted 24 knocked-out tanks, as well as armoured cars and numerous anti-tank guns left on the battlefield. In three days' fighting, the Allies took more than 2,000 Axis prisoners, mostly from the Italian Brescia and Pavia Divisions; the New Zealand division suffered 1,405 casualties. The fighting at Tel el Eisa and Ruweisat had caused the destruction of three Italian divisions, forced Rommel to redeploy his armour from the south, and made it necessary to lay minefields in front of the remaining Italian divisions and to stiffen them with detachments of German troops.

Miteirya Ridge (Ruin Ridge)

To relieve pressure on Ruweisat ridge, Auchinleck ordered the Australian 9th Division to make another attack from the north. In the early hours of 17 July, the Australian 24th Brigade—supported by 44th Royal Tank Regiment (RTR) and strong fighter cover from the air—assaulted Miteirya ridge (known as "Ruin ridge" to the Australians). The initial night attack went well, with 736 prisoners taken, mostly from the Italian Trento and Trieste motorised divisions. Once again, however, a critical situation for the Axis forces was retrieved by vigorous counter-attacks from hastily assembled German and Italian forces, which forced the Australians to withdraw to their start line with 300 casualties. Although the Australian Official History describes the force that overran the 24th Brigade's 2/32nd Battalion as "German", the Australian historian Mark Johnston reports that German records indicate it was the Trento Division that overran the Australian battalion.
Second Battle of Ruweisat Ridge (El Mreir)

The Eighth Army now enjoyed a massive superiority in materiel over the Axis forces: 1st Armoured Division had 173 tanks, with more in reserve or in transit, including 61 Grants, while Rommel possessed only 38 German tanks and 51 Italian tanks, although his armoured units had some 100 tanks awaiting repair. Auchinleck's plan was for the Indian 161st Infantry Brigade to attack along Ruweisat ridge to take Deir el Shein, while the New Zealand 6th Brigade attacked from south of the ridge to the El Mreir depression. At daylight, two British armoured brigades—2nd Armoured Brigade and the fresh 23rd Armoured Brigade—would sweep through the gap created by the infantry. The plan was complicated and ambitious. The infantry night attack began at 16:30 on 21 July. The New Zealanders took their objectives in the El Mreir depression but, once again, many vehicles failed to arrive and they were left short of support arms in an exposed position. At daybreak on 22 July, the British armoured brigades again failed to advance, and Nehring's 5th and 8th Panzer Regiments responded with a rapid counter-attack which quickly overran the New Zealand infantry in the open, inflicting more than 900 casualties on the New Zealanders. 2nd Armoured Brigade sent forward two regiments to help, but they were halted by mines and anti-tank fire. The attack by the Indian 161st Brigade had mixed fortunes. On the left, the initial attempt to clear the western end of Ruweisat failed, but at 08:00 a renewed attack by the reserve battalion succeeded. On the right, the attacking battalion broke into the Deir el Shein position but was driven back in hand-to-hand fighting. Compounding the disaster at El Mreir, at 08:00 the commander of 23rd Armoured Brigade ordered his brigade forward, intent on following his orders to the letter.
Major-General Gatehouse—commanding 1st Armoured Division—had been unconvinced that a path had been adequately cleared in the minefields and had suggested the advance be cancelled. However, the XIII Corps commander—Lieutenant-General William Gott—rejected this and ordered the attack, but on a centre line south of the original plan, which he incorrectly believed was mine-free. These orders failed to get through and the attack went ahead as originally planned. The brigade found itself mired in minefields and under heavy fire. It was then counter-attacked by 21st Panzer at 11:00 and forced to withdraw. The 23rd Armoured Brigade was destroyed, with 40 tanks destroyed and 47 badly damaged. At 17:00, Gott ordered the 5th Indian Infantry Division to execute a night attack to capture the western half of Ruweisat ridge and Deir el Shein. The 3/14th Punjab Regiment from 9th Indian Infantry Brigade attacked at 02:00 on 23 July but failed after losing direction. A further attempt in daylight succeeded in breaking into the position, but intense fire from three sides resulted in control being lost as the commanding officer was killed and four of his senior officers were wounded or went missing.

Attack on Tel el Eisa resumed

To the north, the Australian 9th Division continued its attacks. At 06:00 on 22 July, the Australian 26th Brigade attacked Tel el Eisa and the Australian 24th Brigade attacked Tel el Makh Khad toward Miteirya (Ruin Ridge). It was during this fighting that Arthur Stanley Gurney performed the actions for which he was posthumously awarded the Victoria Cross. The fighting for Tel el Eisa was costly, but by the afternoon the Australians controlled the feature. That evening, the Australian 24th Brigade attacked Tel el Makh Khad with the tanks of 50th RTR in support. The tank unit had not been trained in close infantry support and failed to co-ordinate with the Australian infantry.
The result was that the infantry and armour advanced independently and, having reached the objective, 50th RTR lost 23 tanks for lack of infantry support. Once more, the Eighth Army had failed to destroy Rommel's forces, despite its overwhelming superiority in men and equipment. On the other hand, for Rommel the situation continued to be grave: despite successful defensive operations, his infantry had suffered heavy losses and he reported that "the situation is critical in the extreme".

Operation Manhood

On 26/27 July, Auchinleck launched Operation Manhood in the northern sector in a final attempt to break the Axis forces. XXX Corps was reinforced with 1st Armoured Division (less 22nd Armoured Brigade), 4th Light Armoured Brigade, and 69th Infantry Brigade. The plan was to break the enemy line south of Miteirya ridge and exploit north-west. The South Africans were to make and mark a gap in the minefields to the south-east of Miteirya by midnight on 26/27 July. By 01:00 on 27 July, the Australian 24th Infantry Brigade was to have captured the eastern end of the Miteirya ridge, from which it would exploit toward the north-west. The 69th Infantry Brigade would pass through the minefield gap created by the South Africans to Deir el Dhib and clear and mark gaps in further minefields. The 2nd Armoured Brigade would then pass through to El Wishka and would be followed by 4th Light Armoured Brigade, which would attack the Axis lines of communication. This was the third attempt to break through in the northern sector, and the Axis defenders were expecting the attack. Like the previous attacks, it was hurriedly and therefore poorly planned. The Australian 24th Brigade managed to take its objectives on Miteirya Ridge by 02:00 on 27 July. To the south, the British 69th Brigade set off at 01:30 and managed to take its objectives by about 08:00.
However, the supporting anti-tank units became lost in the darkness or were delayed by minefields, leaving the attackers isolated and exposed when daylight came. There followed a period during which reports from the battlefront regarding the minefield gaps were confused and conflicting. As a consequence, the advance of 2nd Armoured Brigade was delayed. Rommel launched an immediate counter-attack and the German armoured battlegroups overran the two forward battalions of 69th Brigade. Meanwhile, 50th RTR, supporting the Australians, was having difficulty locating the minefield gaps made by the Australian 2/24th Battalion. They failed to find a route through and in the process were caught by heavy fire and lost 13 tanks. The unsupported Australian 2/28th Battalion on the ridge was overrun. The 69th Brigade suffered 600 casualties and the Australians 400, for no gain. The Eighth Army was exhausted, and on 31 July Auchinleck ordered an end to offensive operations and the strengthening of the defences to meet a major counter-offensive. Rommel later blamed the failure to break through to the Nile on the drying up of his army's sources of supply. He complained bitterly about the failure of important Italian convoys to get desperately needed tanks and supplies through to him, always blaming the Italian Supreme Command and never suspecting British code breaking. According to Dr James Sadkovich and others, Rommel often displayed a distinct tendency to blame and scapegoat his Italian allies to cover up his own mistakes and deficiencies as a commander in the field. For example, while Rommel was a very good tactical commander, the Italian and German High Commands were concerned that he lacked operational awareness and a sense of strategic objectives.
Dr Sadkovich points out that he would often outrun his logistics and squander valuable (mostly Italian) military hardware and resources in battle after battle, without clear strategic goals or an appreciation of the limited logistics his Italian allies were desperately trying to provide him.

Aftermath

The battle was a stalemate, but it had halted the Axis advance on Alexandria (and thence Cairo and ultimately the Suez Canal). The Eighth Army had suffered over 13,000 casualties in July, including 4,000 in the 2nd New Zealand Division, 3,000 in the 5th Indian Infantry Division and 2,552 battle casualties in the 9th Australian Division, but had taken 7,000 prisoners and inflicted heavy damage on Axis men and machines. In his appreciation of 27 July, Auchinleck wrote that the Eighth Army would not be ready to attack again until mid-September at the earliest. He believed that because Rommel understood that the Allied situation would only improve with the passage of time, he would be compelled to attack as soon as possible and before the end of August, when he would have superiority in armour. Auchinleck therefore made plans for a defensive battle. In early August, Winston Churchill and General Sir Alan Brooke—the Chief of the Imperial General Staff (CIGS)—visited Cairo on their way to meet Joseph Stalin in Moscow. They decided to replace Auchinleck, appointing the XIII Corps commander, William Gott, to the Eighth Army command and General Sir Harold Alexander as C-in-C Middle East Command. Persia and Iraq were to be split from Middle East Command as a separate Persia and Iraq Command, and Auchinleck was offered the post of C-in-C (which he refused). Gott was killed on the way to take up his command when his aircraft was shot down. Lieutenant-General Bernard Montgomery was appointed in his place and took command on 13 August.
See also

List of German military equipment of World War II
List of Italian military equipment in World War II
List of British military equipment of World War II
List of Australian military equipment of World War II
North African campaign timeline
List of World War II Battles
List of World War II North Africa Airfields
Battle of Alam el Halfa
Second Battle of El Alamein
Alamein Memorial

External links

First Battle of El Alamein, from Italian "Comando Supremo"
Royal Engineers Museum: Royal Engineers and Second World War (deception and mine clearance at El Alamein)
Alam Halfa and Alamein, New Zealand Electronic Text Centre
Rommel's 621st Radio Intercept Company
https://en.wikipedia.org/wiki/First%20Italo-Ethiopian%20War
First Italo-Ethiopian War
The First Italo-Ethiopian War was fought between Italy and Ethiopia from 1895 to 1896. It originated from the disputed Treaty of Wuchale, which the Italians claimed turned Ethiopia into an Italian protectorate. Full-scale war broke out in 1895, with Italian troops from Italian Eritrea having initial success until Ethiopian troops counterattacked Italian positions and besieged the Italian fort of Mekele, forcing its surrender. The Italian defeat came about after the Battle of Adwa, where the Ethiopian army dealt the heavily outnumbered Italian soldiers and Eritrean askaris a decisive blow and forced their retreat back into Eritrea. Some Eritreans, regarded as traitors by the Ethiopians, were also captured and mutilated. The war concluded with the Treaty of Addis Ababa. Because this was one of the first decisive victories by African forces over a European colonial power, the war became a preeminent symbol of pan-Africanism and secured Ethiopia's sovereignty until 1937.

Background

The Khedive of Egypt, Isma'il Pasha, better known as "Isma'il the Magnificent", had conquered Eritrea as part of his efforts to give Egypt an African empire. Isma'il had tried to follow up that conquest by taking Ethiopia, but the Egyptian attempts to conquer that realm ended in humiliating defeat. After Egypt's bankruptcy in 1876, followed by the Ansar revolt under the leadership of the Mahdi in 1881, the Egyptian position in Eritrea was hopeless, with the Egyptian forces cut off and unpaid for years. By 1884 the Egyptians had begun to pull out of both Sudan and Eritrea. Egypt had been very much in the French sphere of influence until 1882, when the British occupied Egypt. A major goal of French foreign policy until 1904 was to diminish British influence in Egypt and restore it to its place in the French sphere of influence, and in 1883 the French created the colony of French Somaliland, which allowed for the establishment of a French naval base at Djibouti on the Red Sea.
The opening of the Suez Canal in 1869 had turned the Horn of Africa into a very strategic region, as a navy based in the Horn could interdict any shipping going up and down the Red Sea. By building naval bases on the Red Sea that could intercept British shipping, the French hoped to reduce the value of the Suez Canal for the British, and thus "lever" them out of Egypt. A French historian in 1900 wrote: "The importance of Djibouti lies almost solely in the uniqueness of its geographic position, which makes it a port of transit and natural entrepôt for areas infinitely more populated than its own territory...the rich provinces of central Ethiopia." The British historian Harold Marcus noted that for the French, "Ethiopia represented the entrance to the Nile valley; if she could obtain hegemony over Ethiopia, her dream of a west to east French African empire would be closer to reality". In response, Britain consistently supported Italian ambitions in the Horn of Africa as the best way of keeping the French out. On 3 June 1884, the Hewett Treaty was signed between Britain, Egypt and Ethiopia, which allowed the Ethiopians to occupy parts of Eritrea and allowed Ethiopian goods to pass in and out of Massawa duty-free. From the viewpoint of Britain, it was highly undesirable that the French replace the Egyptians in Eritrea, as that would allow the French more naval bases on the Red Sea that could interfere with British shipping using the Suez Canal; and as the British did not want the financial burden of ruling Eritrea, they looked for another power that would be interested in replacing the Egyptians. The Hewett Treaty seemed to suggest that Eritrea would fall into the Ethiopian sphere of influence as the Egyptians pulled out. After initially encouraging the Emperor Yohannes IV to move into Eritrea to replace the Egyptians, London decided to have the Italians move in instead.
In his history of Ethiopia, the British historian Augustus Wylde wrote: "England made use of King John [Emperor Yohannes] as long as he was of any service and then threw him over to the tender mercies of Italy...It is one of our worst bits of business out of the many we have been guilty of in Africa...one of the vilest bits of treachery". After the French had unexpectedly made Tunis into their protectorate in 1881, outraging opinion in Italy over the so-called "Schiaffo di Tunisi" (the "slap of Tunis"), Italian foreign policy had been extremely anti-French, and from the British viewpoint the best way of ensuring that the Eritrean ports on the Red Sea stayed out of French hands was to allow the staunchly anti-French Italians to move in. In 1882, Italy had joined the Triple Alliance, allying herself with Austria and Germany against France. On 5 February 1885, Italian troops landed at Massawa to replace the Egyptians. The Italian government for its part was more than happy to embark upon an imperialist policy to distract its people from the failings of post-Risorgimento Italy. In 1861, the unification of Italy was supposed to mark the beginning of a glorious new era in Italian life, and many Italians were gravely disappointed to find that not much had changed in the new Kingdom of Italy, with the vast majority of Italians still living in abject poverty. To compensate, a chauvinist mood was rampant among the upper classes in Italy, with the newspaper Il Diritto writing in an editorial: "Italy must be ready. The year 1885 will decide her fate as a great power. It is necessary to feel the responsibility of the new era; to become again strong men afraid of nothing, with the sacred love of the fatherland, of all Italy, in our hearts".
On the Ethiopian side, the wars that Emperor Yohannes had waged, first against the invading Egyptians in the 1870s and then more so against the Sudanese Mahdiyya state in the 1880s, had been presented by him to his subjects as holy wars in defense of Orthodox Christianity against Islam, reinforcing the Ethiopian belief that their country was an especially virtuous and holy land. The struggle against the Ansar from Sudan complicated Yohannes's relations with the Italians, whom he sometimes asked to provide him with guns to fight the Ansar, while at other times he resisted the Italians and proposed a truce with the Ansar. On 18 January 1887, at a village named Saati, an advancing Italian Army detachment defeated the Ethiopians in a skirmish, but the action ended with the numerically superior Ethiopians surrounding the Italians in Saati after the latter retreated in the face of the enemy's numbers. Some 500 Italian soldiers under Colonel de Christoforis, together with 50 Eritrean auxiliaries, were sent to support the besieged garrison at Saati. At Dogali, on his way to Saati, de Christoforis was ambushed by an Ethiopian force under Ras Alula, whose men, armed with spears, skillfully encircled the Italians, who retreated to one hill and then to another, higher hill. After the Italians ran out of ammunition, Ras Alula ordered his men to charge, and the Ethiopians swiftly overwhelmed the Italians in an action that pitted bayonets against spears. The Battle of Dogali ended with the Italians losing 23 officers and 407 other ranks killed. As a result of the defeat at Dogali, the Italians abandoned Saati and retreated back to the Red Sea coast. Italian newspapers called the battle a "massacre" and excoriated the Regio Esercito for not assigning de Christoforis enough ammunition.
Having at first encouraged Emperor Yohannes to move into Eritrea, and then having encouraged the Italians to do the same, London realised a war was brewing and decided to try to mediate, largely out of fear that the Italians might actually lose. The British consul in Zanzibar, Gerald Portal, was sent in 1887 to mediate between the Ethiopians and Italians before war broke out. Upon meeting the Emperor Yohannes on 4 December 1887, he presented him with gifts and a letter from Queen Victoria urging him to settle with the Italians. Portal reported: "What might have been possible in August or September was impossible in December, when the whole of the immense available forces in the country were already under arms; and that there now remains no hope of a satisfactory adjustment of the difficulties between Italy and Abyssinia [Ethiopia] until the question of the relative supremacy of these two nations has been decided by an appeal to the fortunes of war... No one who has once seen the nature of the gorges, ravines and mountain passes near the Abyssinian frontier can doubt for a moment that any advance by a civilised army in the face of the hostile Abyssinian hordes would be accomplished at the price of a fearful loss of life on both sides. ... The Abyssinians are savage and untrustworthy, but they are also redeemed by the possession of an unbounded courage, by a disregard of death, and by a national pride which leads them to look down on every human being who has not had the good fortune to be born an Abyssinian". Portal ended by writing that the Italians were making a mistake in preparing to go to war against Ethiopia: "It is the old, old story, contempt of a gallant enemy because his skin happens to be chocolate or brown or black, and because his men have not gone through orthodox courses of field-firing, battalion drill, or 'autumn maneuvers'".
The defeat at Dogali made the Italians cautious for a time, but on 10 March 1889 Emperor Yohannes died after being wounded in battle against the Ansar, and on his deathbed admitted that Ras Mengesha, the supposed son of his brother, was actually his own son, asking that he succeed him. The revelation that the emperor had slept with his brother's wife scandalised the intensely Orthodox Ethiopians, and instead the Negus Menelik was proclaimed emperor on 26 March 1889. Ras Mengesha, one of the most powerful Ethiopian noblemen, was unhappy about being by-passed in the succession and for a time allied himself with the Italians against the Emperor Menelik. Under the feudal Ethiopian system, there was no standing army; instead, the nobility raised armies on behalf of the Emperor. In December 1889, the Italians advanced inland again and took the cities of Asmara and Keren, and in January 1890 took Adowa.

Treaty of Wuchale

On 25 March 1889, the Shewa ruler Menelik II, having conquered Tigray and Amhara, declared himself Emperor of Ethiopia (or "Abyssinia", as it was commonly called in Europe at the time). Barely a month later, on 2 May, he signed the Treaty of Wuchale with the Italians, which apparently gave them control over Eritrea, the Red Sea coast to the northeast of Ethiopia, in return for recognition of Menelik's rule. Menelik II continued the policy of Tewodros II of integrating Ethiopia. However, the bilingual treaty did not say the same thing in Italian and Amharic; the Italian version did not give the Ethiopians the "significant autonomy" written into the Amharic translation. The Italian text stated that Ethiopia must conduct its foreign affairs through Italy (making it an Italian protectorate), but the Amharic version merely stated that Ethiopia could contact foreign powers and conduct foreign affairs using the embassy of Italy.
Italian diplomats, however, claimed that the original Amharic text included the clause and that Menelik had knowingly signed a modified copy of the treaty. In October 1889, the Italians informed all of the other European governments that, because of the Treaty of Wuchale, Ethiopia was now an Italian protectorate and that the other European nations therefore could not conduct diplomatic relations with Ethiopia. With the exceptions of the Ottoman Empire, which still maintained its claim to Eritrea, and Russia, which disliked the idea of an Orthodox nation being subjugated to a Roman Catholic nation, all of the European powers accepted the Italian claim to a protectorate. The Italian claim that Menelik was aware of Article XVII turning his nation into an Italian protectorate seems unlikely, given that the Emperor Menelik sent letters to Queen Victoria and Emperor Wilhelm II in late 1889 and was informed in the replies in early 1890 that neither Britain nor Germany could have diplomatic relations with Ethiopia on account of Article XVII of the Treaty of Wuchale, a revelation that came as a great shock to the Emperor. Victoria's letter was polite, whereas Wilhelm's was somewhat ruder, saying that King Umberto I was a great friend of Germany and that Menelik's violation of the supposed Italian protectorate was a grave insult to Umberto, adding that he never wanted to hear from Menelik again. Moreover, Menelik did not know Italian and only signed the Amharic text of the treaty, having been assured that there were no differences between the Italian and Amharic texts before he signed. The differences between the two texts were due to the Italian minister in Addis Ababa, Count Pietro Antonelli, who had been instructed by his government to gain as much territory as possible in negotiating with the Emperor Menelik.
However, knowing Menelik was now enthroned as the King of Kings and had a strong position, Antonelli was in the unenviable situation of negotiating a treaty that his own government might disallow. Therefore, he inserted the statement making Ethiopia give up its right to conduct its foreign affairs to Italy as a way of pleasing his superiors, who might otherwise have fired him for only making small territorial gains. Antonelli was fluent in Amharic, and given that Menelik only signed the Amharic text, he could not have been unaware that the Amharic version of Article XVII only stated that the King of Italy placed the services of his diplomats at the disposal of the Emperor of Ethiopia to represent him abroad if he so wished. When his subterfuge was exposed in 1890, with Menelik indignantly saying he would never sign away his country's independence to anybody, Antonelli, who left Addis Ababa in mid-1890, resorted to racism, telling his superiors in Rome that as Menelik was a black man, he was thus intrinsically dishonest and it was only natural that the Emperor would lie about the protectorate he had supposedly willingly turned his nation into. Francesco Crispi, the Italian Prime Minister, was an ultra-imperialist who believed the newly unified Italian state required "the grandeur of a second Roman empire". Crispi believed that the Horn of Africa was the best place for the Italians to start building the new Roman empire. The American journalist James Perry wrote that "Crispi was a fool, a bigot and a very dangerous man". Because of the Ethiopian refusal to abide by the Italian version of the treaty, and despite economic handicaps at home, the Italian government decided on a military solution to force Ethiopia to abide by the Italian version of the treaty. In doing so, they believed that they could exploit divisions within Ethiopia and rely on tactical and technological superiority to offset any inferiority in numbers. 
The efforts of Emperor Menelik, viewed as pro-French by London, to unify Ethiopia and thus bring the source of the Blue Nile under his control were perceived in Whitehall as a threat to British influence in Egypt. As Menelik became increasingly successful in unifying Ethiopia, the British government courted the Italians to counter Ethiopian expansion. There was a broader, European background as well: the Triple Alliance of Germany, Austria-Hungary, and Italy was under some stress, with Italy being courted by the British government. Two secret Anglo-Italian protocols were signed in 1891, leaving most of Ethiopia in Italy's sphere of influence. France, one of the members of the opposing Franco-Russian Alliance, had its own claims on Eritrea and was bargaining with Italy over giving up those claims in exchange for a more secure position in Tunisia. Meanwhile, Russia was supplying weapons and other aid to Ethiopia. It had been trying to gain a foothold in Ethiopia, and in 1894, after denouncing the Treaty of Wuchale in July, it received an Ethiopian mission in St. Petersburg and sent arms and ammunition to Ethiopia. This support continued after the war ended. The Russian travel writer Alexander Bulatovich, who went to Ethiopia to serve as a Red Cross volunteer with the Emperor Menelik, made a point of emphasizing in his books that the Ethiopians had converted to Christianity before any of the Europeans, described the Ethiopians as a deeply religious people like the Russians, and argued that the Ethiopians did not have the "low cultural level" of the other African peoples, making them equal to the Europeans. Germany and Austria supported Italy, their ally in the Triple Alliance, while France and Russia supported Ethiopia. 
Prelude & Beginning of Conflict In 1893, judging that his power over Ethiopia was secure, Menelik repudiated the treaty; in response the Italians ramped up the pressure on his domain in a variety of ways, including the annexation of small territories bordering their original claim under the Treaty of Wuchale, finally culminating in a military campaign across the Mareb River into Tigray (on the border with Eritrea) in December 1894. The Italians expected disaffected potentates like Negus Tekle Haymanot of Gojjam, Ras Mengesha Yohannes, and the Sultan of Aussa to join them; instead, all of the ethnic Tigrayan and Amharic peoples flocked to the Emperor Menelik's side in a display of both nationalism and anti-Italian feeling, while other peoples of dubious loyalty (e.g. the Sultan of Aussa) were watched by Imperial garrisons. In June 1894, Ras Mengesha and his generals had appeared in Addis Ababa carrying large stones which they dropped before the Emperor Menelik (a gesture that is a symbol of submission in Ethiopian culture). In Ethiopia, the popular saying at the time was: "Of a black snake's bite, you may be cured, but from the bite of a white snake, you will never recover." There was an overwhelming national unity in Ethiopia as various feuding noblemen rallied behind the emperor, who insisted that Ethiopia, unlike the other African nations, would retain its freedom and not be subjected to Italy. The ethnic rivalries between the Tigrayans and the Amhara that the Italians were counting upon did not prove to be a factor, as Menelik pointed out that the Italians held all ethnic Africans, regardless of their individual ethnic backgrounds, in contempt, noting that the segregation policies in Eritrea applied to all ethnic Africans. 
Further, Menelik had spent much of the previous four years building up a supply of modern weapons and ammunition, acquired from the French, British, and the Italians themselves, as the European colonial powers sought to keep each other's North African aspirations in check. They also used the Ethiopians as a proxy army against the Sudanese Mahdists. In December 1894, Bahta Hagos led a rebellion against the Italians in Akkele Guzay, claiming support of Mengesha. Units of General Oreste Baratieri's army under Major Pietro Toselli crushed the rebellion and killed Bahta at the Battle of Halai. The Italian army then occupied the Tigrian capital, Adwa. Baratieri suspected that Mengesha would invade Eritrea, and met him at the Battle of Coatit in January 1895. The victorious Italians chased the retreating Mengesha, capturing weapons and important documents proving his complicity with Menelik. The victory in this campaign, along with previous victories against the Sudanese Mahdists, led the Italians to underestimate the difficulties to overcome in a campaign against Menelik. At this point, Emperor Menelik turned to France, offering a treaty of alliance; the French response was to abandon the Emperor in order to secure Italian approval of the Treaty of Bardo which would secure French control of Tunisia. Virtually alone, on 17 September 1895, Emperor Menelik issued a proclamation calling up the men of Shewa to join his army at Were Ilu. As the Italians were poised to enter Ethiopian territory, the Ethiopians mobilised en masse all over the country. Helping it was the newly updated imperial fiscal and taxation system. As a result, a hastily mobilised army of 196,000 men gathered from all parts of Abyssinia, more than half of whom were armed with modern rifles, rallied at Addis Ababa in support of the Emperor and defence of their country. The only European ally of Ethiopia was Russia. The Ethiopian emperor sent his first diplomatic mission to St. Petersburg in 1895. 
In June 1895, the newspapers in St. Petersburg wrote, "Along with the expedition, Menelik II sent his diplomatic mission to Russia, including his princes and his bishop". Many citizens of the capital came to meet the train that brought Prince Damto, General Genemier, Prince Belyakio, Bishop of Harer Gabraux Xavier and other members of the delegation to St. Petersburg. On the eve of war, an agreement providing military help for Ethiopia was concluded. The next clash came at Amba Alagi on 7 December 1895, when Ethiopian soldiers overran the Italian positions dug in on the natural fortress, and forced the Italians to retreat back to Eritrea. The remaining Italian troops under General Giuseppe Arimondi reached the unfinished Italian fort at Mekele. Arimondi left there a small garrison of approximately 1,150 Askaris and 200 Italians, commanded by Major Giuseppe Galliano, and took the bulk of his troops to Adigrat, where Oreste Baratieri, the Italian Commander, was concentrating the Italian Army. The first Ethiopian troops reached Mekele in the following days. Ras Makonnen surrounded the fort at Mekele on 18 December, but the Italian Commander adroitly used promises of a negotiated surrender to prevent the Ras from attacking the fort. By the first days of January, Emperor Menelik, accompanied by his Queen Taytu Betul, had led large forces into Tigray, and besieged the Italians for sixteen days (6–21 January 1896), making several unsuccessful attempts to carry the fort by storm, until the Italians surrendered with permission from the Italian Headquarters. Menelik allowed them to leave Mekele with their weapons, and even provided the defeated Italians mules and pack animals to rejoin Baratieri. 
While some historians read this generous act as a sign that Emperor Menelik still hoped for a peaceful resolution to the war, Harold Marcus points out that this escort allowed him a tactical advantage: "Menelik craftily managed to establish himself in Hawzien, at Gendepata, near Adwa, where the mountain passes were not guarded by Italian fortifications." Heavily outnumbered, Baratieri refused to engage, knowing that due to their lack of infrastructure the Ethiopians could not keep large numbers of troops in the field much longer. However, Baratieri also never knew the true numerical strength of the Ethiopian army he faced, so he further fortified his positions in the Tigray instead of advancing. But the Italian government of Francesco Crispi was unable to accept being stymied by non-Europeans. The prime minister specifically ordered Baratieri to advance deep into enemy territory and bring about a battle. Battle of Adwa The decisive battle of the war was the Battle of Adwa on March 1, 1896, which took place in the mountainous country north of the actual town of Adwa (or Adowa). The Italian army comprised four brigades totaling approximately 17,700 men, with fifty-six artillery pieces; the Ethiopian army comprised several brigades numbering between 73,000 and 120,000 men (80–100,000 with firearms: according to Richard Pankhurst, the Ethiopians were armed with approximately 100,000 rifles of which about half were quick-firing), with almost fifty artillery pieces. General Baratieri planned to surprise the larger Ethiopian force with an early morning attack, expecting his enemy to be asleep. However, the Ethiopians had risen early for Church services and, upon learning of the Italian advance, promptly attacked. The Italian forces were hit by wave after wave of attacks, until Menelik released his reserve of 25,000 men, destroying an Italian brigade. Another brigade was cut off, and destroyed by a cavalry charge. The last two brigades were destroyed piecemeal. 
By noon, the Italian survivors were in full retreat. While Menelik's victory was in a large part due to the sheer force of numbers, his troops were well-armed because of his careful preparations. The Ethiopian army only had a feudal system of organisation but proved capable of properly executing the strategic plan drawn up in Menelik's headquarters. However, the Ethiopian army also had its problems. The first was the quality of its arms, as the Italian colonial authorities in Eritrea prevented the transportation of 30,000–60,000 modern Mosin–Nagant rifles and Berdan rifles from Russia into landlocked Ethiopia. The rest of the Ethiopian army was equipped with swords and spears. Secondly, the Ethiopian army's feudal organisation meant that nearly the entire force was composed of peasant militia. Russian military experts advising Menelik II suggested a full-contact battle with Italians, to neutralise the Italian fire superiority, instead of engaging in a campaign of harassment designed to nullify problems with arms, training, and organisation. Some Russian councillors of Menelik II and a team of fifty Russian volunteers participated in the battle, among them Nikolay Leontiev, an officer of the Kuban Cossack army. Russian support for Ethiopia also led to a Russian Red Cross mission, which arrived in Addis Ababa some three months after Menelik's Adwa victory. The Italians suffered about 7,000 killed and 1,500 wounded in the battle and subsequent retreat back into Eritrea, with 3,000 taken prisoner; Ethiopian losses have been estimated around 4,000 killed and 8,000 wounded. In addition, 2,000 Eritrean Askaris were killed or captured. Italian prisoners were treated as well as possible under difficult circumstances, but 800 captured Askaris, regarded as traitors by the Ethiopians, had their right hands and left feet amputated. 
Menelik, knowing that the war was very unpopular in Italy, with the Italian Socialists in particular condemning the policy of the Crispi government, chose to be a magnanimous victor, making it clear that he saw a difference between the Italian people and Crispi. Outcome and consequences Menelik retired in good order to his capital, Addis Ababa, and waited for the fallout of the victory to hit Italy. Riots broke out in several Italian cities, and within two weeks, the Crispi government collapsed amidst Italian disenchantment with "foreign adventures". Menelik secured the Treaty of Addis Ababa in October, which delineated the borders of Eritrea and forced Italy to recognise the independence of Ethiopia. Delegations from the United Kingdom and France—whose colonial possessions lay next to Ethiopia—soon arrived in the Ethiopian capital to negotiate their own treaties with this newly proven power. Owing to Russia's diplomatic support of her fellow Orthodox nation, Russia's prestige greatly increased in Ethiopia. The adventuresome Seljan brothers, Mirko and Stjepan, who were actually Catholic Croats, were warmly welcomed when they arrived in Ethiopia in 1899, having misinformed their hosts by saying they were Russians. As France supported Ethiopia with weapons, French influence increased markedly. Prince Henri of Orléans, the French traveller, wrote: "France gave rifles to this country and taking the hand of its Emperor like an elder sister has explained to him the old motto which has guided her across the centuries of greatness and glory: Honor and Country!". In December 1896, a French diplomatic mission arrived in Addis Ababa and on 20 March 1897 signed a treaty that was described as a "véritable traité d'alliance" (a genuine treaty of alliance). In turn, the increase in French influence in Ethiopia led to fears in London that the French would gain control of the Blue Nile and would be able to "lever" the British out of Egypt. 
To keep control of the Nile in Egypt, the British government decided in March 1896 to advance down the Nile from Egypt into the Sudan to conquer the Mahdiyya state. On 12 March 1896, upon hearing of the Italian defeat at the Battle of Adwa, the British Prime Minister, Lord Salisbury, gave instructions for the British forces in Egypt to occupy the Sudan before the French could conquer the Mahdiyya state, stating that no hostile power could be allowed to control the Nile. In 1935, Italy launched a second invasion, which resulted in an Italian victory and the annexation of Ethiopia to Italian East Africa until the Italians were defeated in the Second World War and expelled by the British Empire, with some assistance from Ethiopian arbegnoch guerrillas. The Italians subsequently waged a guerrilla war until 1943 in some areas of northern Ethiopia, supporting the rebellion of the Galla in 1942. Gallery See also Italo-Ethiopian War of 1887–1889 Second Italo-Ethiopian War Italian Empire Military history of Ethiopia Notes References Bibliography Italo-Ethiopian War, First 1895 in Ethiopia 1896 in Ethiopia 1895 in Italy 1896 in Italy Conflicts in 1895 Conflicts in 1896 African resistance to colonialism
https://en.wikipedia.org/wiki/Frederick%20Soddy
Frederick Soddy
Frederick Soddy FRS (2 September 1877 – 22 September 1956) was an English radiochemist who explained, with Ernest Rutherford, that radioactivity is due to the transmutation of elements, now known to involve nuclear reactions. He also proved the existence of isotopes of certain radioactive elements. He was a polymath who mastered chemistry, nuclear physics, statistical mechanics, finance and economics. Biography Soddy was born at 5 Bolton Road, Eastbourne, England, the son of Benjamin Soddy, corn merchant, and his wife Hannah Green. He went to school at Eastbourne College, before going on to study at University College of Wales at Aberystwyth and at Merton College, Oxford, where he graduated in 1898 with first class honours in chemistry. He was a researcher at Oxford from 1898 to 1900. Scientific career In 1900 he became a demonstrator in chemistry at McGill University in Montreal, Quebec, where he worked with Ernest Rutherford on radioactivity. He and Rutherford realized that the anomalous behaviour of radioactive elements was because they decayed into other elements. This decay also produced alpha, beta, and gamma radiation. When radioactivity was first discovered, no one was sure what the cause was. It needed careful work by Soddy and Rutherford to prove that atomic transmutation was in fact occurring. In 1903, with Sir William Ramsay at University College London, Soddy showed that the decay of radium produced helium gas. In the experiment a sample of radium was enclosed in a thin-walled glass envelope sited within an evacuated glass bulb. After leaving the experiment running for a long period of time, a spectral analysis of the contents of the former evacuated space revealed the presence of helium. 
Later, in 1907, Rutherford and Thomas Royds showed that the helium was first formed as positively charged nuclei of helium (He2+), identical to alpha particles, which could pass through the thin glass wall but were contained within the surrounding glass envelope. From 1904 to 1914, Soddy was a lecturer at the University of Glasgow. Ruth Pirret worked as his research assistant during this time. In May 1910 Soddy was elected a Fellow of the Royal Society. In 1914 he was appointed to a chair at the University of Aberdeen, where he worked on research related to World War I. In 1913, Soddy showed that an atom moves lower in atomic number by two places on alpha emission, higher by one place on beta emission. This was discovered at about the same time by Kazimierz Fajans, and is known as the radioactive displacement law of Fajans and Soddy, a fundamental step toward understanding the relationships among families of radioactive elements. In 1913 Soddy also described the phenomenon in which a radioactive element may have more than one atomic mass though the chemical properties are identical. He named this concept isotope, meaning "same place". The word was initially suggested to him by Margaret Todd. Later, J. J. Thomson showed that non-radioactive elements can also have multiple isotopes. The work that Soddy and his research assistant Ada Hitchins did at Glasgow and Aberdeen showed that uranium decays to radium. Soddy published The Interpretation of Radium (1909) and Atomic Transmutation (1953). In 1918, working with John Arnold Cranston, he announced the discovery of an isotope of the element later named protactinium. This slightly post-dated its discovery by Lise Meitner and Otto Hahn; however, it is said that Soddy and Cranston's discovery was actually made in 1915, its announcement delayed because Cranston's notes were locked away while he was on active service in the First World War. 
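The displacement law described above is simple bookkeeping on atomic number Z and mass number A. A minimal sketch (the function name and the example decay chain are illustrative choices, not from the original text):

```python
# Fajans–Soddy radioactive displacement law:
# alpha emission: Z decreases by 2 and A decreases by 4;
# beta-minus emission: Z increases by 1 and A is unchanged.

def displace(z, a, mode):
    """Return (Z, A) of the daughter nuclide after one decay event."""
    if mode == "alpha":
        return z - 2, a - 4
    if mode == "beta":
        return z + 1, a
    raise ValueError(f"unknown decay mode: {mode!r}")

# Uranium-238 (Z=92) alpha-decays to thorium-234 (Z=90)...
print(displace(92, 238, "alpha"))  # (90, 234)
# ...which beta-decays to protactinium-234 (Z=91), an isotope of the
# element whose discovery Soddy and Cranston announced in 1918.
print(displace(90, 234, "beta"))   # (91, 234)
```

An alpha decay followed by two beta decays returns a nuclide to its original atomic number at a mass number four lower, which is precisely how chemically identical isotopes (such as uranium-238 and uranium-234) arise within one decay series.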
In 1919 he moved to the University of Oxford as Dr Lee's Professor of Chemistry, where, in the period up to 1936, he reorganized the laboratories and the syllabus in chemistry. He received the 1921 Nobel Prize in Chemistry for his research in radioactive decay and particularly for his formulation of the theory of isotopes. His work and essays popularising the new understanding of radioactivity were the main inspiration for H. G. Wells's The World Set Free (1914), which features atomic bombs dropped from biplanes in a war set many years in the future. Wells's novel is also known as The Last War and imagines a peaceful world emerging from the chaos. In Wealth, Virtual Wealth and Debt, Soddy praises Wells's The World Set Free. He also says that radioactive processes probably power the stars. Economics In four books written from 1921 to 1934, Soddy carried on a "campaign for a radical restructuring of global monetary relationships", offering a perspective on economics rooted in physics – the laws of thermodynamics, in particular – and was "roundly dismissed as a crank". While most of his proposals – "to abandon the gold standard, let international exchange rates float, use federal surpluses and deficits as macroeconomic policy tools that could counter cyclical trends, and establish bureaus of economic statistics (including a consumer price index) in order to facilitate this effort" – are now conventional practice, his critique of fractional-reserve banking still "remains outside the bounds of conventional wisdom", although a recent IMF paper reinvigorated his proposals. Soddy wrote that financial debts grew exponentially at compound interest, but the real economy was based on exhaustible stocks of fossil fuels; energy obtained from fossil fuels could not be used again. This criticism of economic growth is echoed by his intellectual heirs in the now emergent field of ecological economics. 
The New Palgrave Dictionary of Economics, an influential reference text in economics, recognized Soddy as a "reformer" for his works on monetary reform. Political views In Wealth, Virtual Wealth and Debt, Soddy cited the Protocols of the Learned Elders of Zion as evidence for the belief, which was relatively widespread at the time, of a "financial conspiracy to enslave the world". The Protocols was widely disseminated by Henry Ford in the United States. He claimed that "A corrupt monetary system strikes at the very life of the nation." Later in life he published a pamphlet, Abolish Private Money, or Drown in Debt (1939). The influence of his writing can be gauged, for example, in this quote from Ezra Pound: "Professor Frederick Soddy states that the Gold Standard monetary system has wrecked a scientific age! ... The world's bankers ... have not been content to take their share of modern wealth production – great as it has been – but they have refused to allow the masses of mankind to receive theirs." Though some activists have accused Soddy of anti-Semitism, most of his biographers dispute this narrative and argue that Soddy's friends and students included Jews who held positive views of him. These friends included Kazimierz Fajans, a Polish-Jewish physical chemist who worked with both Ernest Rutherford and Soddy. Descartes' theorem He rediscovered Descartes' theorem in 1936 and published it as a poem, "The Kiss Precise", quoted at Problem of Apollonius. The kissing circles in this problem are sometimes known as Soddy circles. Honours and awards He received the Nobel Prize in Chemistry in 1921, and the same year he was elected a member of the International Atomic Weights Committee. A small crater on the far side of the Moon, as well as the radioactive uranium mineral soddyite, are named after him. 
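The Descartes' theorem that Soddy versified as "The Kiss Precise" relates the signed curvatures (reciprocal radii) of four mutually tangent circles:

```latex
% Descartes' circle theorem, with signed curvatures k_i = \pm 1/r_i
(k_1 + k_2 + k_3 + k_4)^2 = 2\,\bigl(k_1^2 + k_2^2 + k_3^2 + k_4^2\bigr)
```

Solving for the fourth curvature gives the two circles tangent to a given mutually tangent triple: $k_4 = k_1 + k_2 + k_3 \pm 2\sqrt{k_1 k_2 + k_2 k_3 + k_3 k_1}$, the "Soddy circles" mentioned above.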
Personal life In 1908, Soddy married Winifred Moller Beilby (1885–1936), the daughter of industrial chemist Sir George Beilby and Lady Emma Beilby, a philanthropist to women's causes. The couple worked together and co-published a paper in 1910 on the absorption of gamma rays from radium. He died in Brighton, England in 1956, twenty days after his 79th birthday. Bibliography Radioactivity (1904) The Interpretation of Radium (1909) Matter and Energy (1911), second edition (2015) The Chemistry of the Radio-elements (1915) Science and life: Aberdeen addresses (1920) Cartesian Economics: The Bearing of Physical Science upon State Stewardship (1921) Science and Life Wealth, Virtual Wealth, and Debt Money versus Man etc (1921) Nobel Lecture – The origins of the conception of isotopes (1922) Wealth, Virtual Wealth and Debt. The solution of the economic paradox (George Allen & Unwin, 1926) The wrecking of a scientific age (1927) The Interpretation of the Atom (1932) Money versus Man (1933) The Role of Money (London: George Routledge & Sons Ltd, 1934) at Internet Archive.org, second edition (2015) Money as nothing for something ; The gold "standard" snare (1935) Abolish Private Money, or Drown in Debt (1939) Present outlook, a warning : debasement of the currency, deflation and unemployment (1944) The Story of Atomic Energy (1949) Atomic Transmutation (1953) See also Ada Hitchins, who helped Soddy to discover the element protactinium Alfred J. Lotka Problem of Apollonius Oliver Sacks' autobiography Uncle Tungsten, in which Soddy, his work and his discoveries in atomic physics are extensively discussed and explained. References External links The Central Role of Energy in Soddy's Holistic and Critical Approach to Nuclear Science, Economics, and Social Responsibility Annotated bibliography for Frederick Soddy from the Alsos Digital Library for Nuclear Issues M. King Hubbert on the Nature of Growth. 
1974 A biography of Frederick Soddy by Arian Forrest Nevin The Frederick Soddy Trust including the Nobel Lecture, December 12, 1922 The Origins of the Conception of Isotopes Frederick Soddy Papers, 1920-1956 (inclusive). H MS c388. Harvard Medical Library, Francis A. Countway Library of Medicine, Boston, Mass. 1877 births 1956 deaths Academics of the University of Aberdeen Alumni of Merton College, Oxford Alumni of Aberystwyth University British Nobel laureates Fellows of the Royal Society Corresponding Members of the Russian Academy of Sciences (1917–1925) Corresponding Members of the USSR Academy of Sciences Nobel laureates in Chemistry People educated at Eastbourne College People from Eastbourne English chemists English Nobel laureates Dr Lee's Professors of Chemistry McGill University faculty People involved with the periodic table
https://en.wikipedia.org/wiki/Fur%20seal
Fur seal
Fur seals are any of nine species of pinnipeds belonging to the subfamily Arctocephalinae in the family Otariidae. They are much more closely related to sea lions than true seals, and share with them external ears (pinnae), relatively long and muscular foreflippers, and the ability to walk on all fours. They are marked by their dense underfur, which made them a long-time object of commercial hunting. Eight species belong to the genus Arctocephalus and are found primarily in the Southern Hemisphere, while a ninth species also sometimes called fur seal, the northern fur seal (Callorhinus ursinus), belongs to a different genus and inhabits the North Pacific. Taxonomy Fur seals and sea lions make up the family Otariidae. Along with the Phocidae and Odobenidae, otariids are pinnipeds descended from a common ancestor most closely related to modern bears (as hinted by the subfamily Arctocephalinae, meaning "bear-headed"). The name pinniped refers to mammals with front and rear flippers. Otariids arose about 15–17 million years ago in the Miocene, and were originally land mammals that rapidly diversified and adapted to a marine environment, giving rise to the semiaquatic marine mammals that thrive today. Fur seals and sea lions are closely related and commonly known together as the "eared seals". Until recently, fur seals were all grouped under a single subfamily of Pinnipedia, called the Arctocephalinae, to contrast them with the Otariinae – the sea lions – based on the most prominent common feature, namely the coat of dense underfur intermixed with guard hairs. Recent genetic evidence, however, suggests Callorhinus is more closely related to some sea lion species, and the fur seal/sea lion subfamily distinction has been eliminated from many taxonomies. Nonetheless, all fur seals have certain features in common: the fur, generally smaller sizes, farther and longer foraging trips, smaller and more abundant prey items, and greater sexual dimorphism. 
For these reasons, the distinction remains useful. Fur seals comprise two genera: Callorhinus, and Arctocephalus. Callorhinus is represented by just one species in the Northern Hemisphere, the northern fur seal (Callorhinus ursinus), and Arctocephalus is represented by eight species in the Southern Hemisphere. The southern fur seals comprising the genus Arctocephalus include Antarctic fur seals, Galapagos fur seals, Juan Fernandez fur seals, New Zealand fur seals, brown fur seals, South American fur seals, and subantarctic fur seals. Physical appearance Along with the previously mentioned thick underfur, fur seals are distinguished from sea lions by their smaller body structure, greater sexual dimorphism, smaller prey, and longer foraging trips during the feeding cycle. The physical appearance of fur seals varies with individual species, but the main characteristics remain constant. Fur seals are characterized by their external pinnae, dense underfur, vibrissae, and long, muscular limbs. They share with other otariids the ability to rotate their rear limbs forward, supporting their bodies and allowing them to ambulate on land. In water, their front limbs, typically measuring about a fourth of their body length, act as oars and can propel them forward for optimal mobility. The surfaces of these long, paddle-like fore limbs are leathery with small claws. Otariids have a dog-like head, sharp, well-developed canines, sharp eyesight, and keen hearing. They are extremely sexually dimorphic mammals, with the males often two to five times the size of the females, with proportionally larger heads, necks, and chests. Size ranges from about 1.5 m, 64 kg in the male Galapagos fur seal (also the smallest pinniped) to 2.5 m, 180 kg in the adult male New Zealand fur seal. Most fur seal pups are born with a black-brown coat that molts at 2–3 months, revealing a brown coat that typically gets darker with age. 
Some males and females within the same species have significant differences in appearance, further contributing to the sexual dimorphism. Females and juveniles often have a lighter colored coat overall or only on the chest, as seen in South American fur seals. In a northern fur seal population, the females are typically silvery-gray on the dorsal side and reddish-brown on their ventral side with a light gray patch on their chest. This makes them easily distinguished from the males with their brownish-gray to reddish-brown or black coats. Habitat Of the fur seal family, eight species are considered southern fur seals, and only one is found in the Northern Hemisphere. The southern group includes Antarctic, Galapagos, Guadalupe, Juan Fernandez, New Zealand, brown, South American, and subantarctic fur seals. They typically spend about 70% of their lives in subpolar, temperate, and equatorial waters. Colonies of fur seals can be seen throughout the Pacific and Southern Oceans from south Australia, Africa, and New Zealand, to the coast of Peru and north to California. They are typically nonmigrating mammals, with the exception of the northern fur seal, which has been known to travel distances up to 10,000 km. Fur seals are often found near isolated islands or peninsulas, and can be seen hauling out onto the mainland during winter. Although they are not migratory, they have been observed wandering hundreds of miles from their breeding grounds in times of scarce resources. For example, the subantarctic fur seal typically resides near temperate islands in the South Atlantic and Indian Oceans north of the Antarctic Polar Front, but juvenile males have been seen wandering as far north as Brazil and South Africa. Behavior and ecology Typically, fur seals gather during the summer in large rookeries at specific beaches or rocky outcrops to give birth and breed. All species are polygynous, meaning dominant males reproduce with more than one female. 
For most species, total gestation lasts about 11.5 months, including a several-month period of delayed implantation of the embryo. Northern fur seal males aggressively select and defend the specific females in their harems. Females typically reach sexual maturity around 3–4 years. The males reach sexual maturity around the same time, but do not become territorial or mate until 6–10 years. The breeding season typically begins in November and lasts 2–3 months. The northern fur seals begin their breeding season as early as June due to their region, climate, and resources. In all cases, the males arrive a few weeks early to fight for their territory and for groups of females with which to mate. They congregate at rocky, isolated breeding grounds and defend their territory through fighting and vocalization. Males typically do not leave their territory for the entirety of the breeding season, fasting and competing until all energy sources are depleted. The Juan Fernandez fur seals deviate from this typical behavior, using aquatic breeding territories not seen in other fur seals. They use rocky sites for breeding, but males fight for territory on land, on the shoreline, and in the water. Upon arriving at the breeding grounds, females give birth to their pups from the previous season. About a week later, the females mate again and shortly afterward begin their feeding cycle, which typically consists of foraging and feeding at sea for about 5 days, then returning to the breeding grounds to nurse the pups for about 2 days. Mothers and pups locate each other by call recognition during the nursing period. The Juan Fernandez fur seal has a particularly long feeding cycle, with about 12 days of foraging and feeding and 5 days of nursing. Most fur seals continue this cycle for about 9 months until they wean their pup. The exception to this is the Antarctic fur seal, whose feeding cycle lasts only 4 months.
During foraging trips, most female fur seals travel around 200 km from the breeding site, and can dive to around 200 m depending on food availability. The remainder of the year, fur seals lead a largely pelagic existence in the open sea, pursuing their prey wherever it is abundant. They feed on moderately sized fish, squid, and krill. Several species of the southern fur seal also have sea birds, especially penguins, as part of their diets. Fur seals, in turn, are preyed upon by sharks, killer whales, and occasionally by larger sea lions. These opportunistic mammals tend to feed and dive in shallow waters at night, when their prey are swimming near the surface. South American fur seals exhibit a different diet: adults feed almost exclusively on anchovies, while juveniles feed on demersal fish, most likely due to availability. When fur seals were hunted in the late 18th and early 19th centuries, they hauled out on remote islands where no predators were present. The hunters reported being able to club the unwary animals to death one after another, making the hunt profitable, though the price per seal skin was low.

Population and survival

The average lifespan of fur seals varies with different species from 13 to 25 years, with females typically living longer. Most populations continue to expand as they recover from previous commercial hunting and environmental threats. Many species were heavily exploited by commercial sealers, especially during the 19th century, when their fur was highly valued. Beginning in the 1790s, the ports of Stonington and New Haven, Connecticut, were leaders of the American fur seal trade, which primarily entailed clubbing fur seals to death on uninhabited South Pacific islands, skinning them, and selling the hides in China. Many populations, notably the Guadalupe fur seal, northern fur seal, and Cape fur seal, suffered dramatic declines and are still recovering.
Currently, most species are protected, and hunting is mostly limited to subsistence harvest. Globally, most populations can be considered healthy, mostly because they often prefer remote habitats that are relatively inaccessible to humans. Nonetheless, environmental degradation, competition with fisheries, and climate change potentially pose threats to some populations.

See also
Bering Sea Arbitration

Further reading
Gentry, R. L. (1998). Behavior and Ecology of the Northern Fur Seal. Princeton: Princeton University Press.
11781
https://en.wikipedia.org/wiki/Frisian
Frisian
Frisian most often refers to:
Frisia, a cross-border coastal region in Germany and the Netherlands
Frisians, the medieval and modern ethnic group inhabiting Frisia
Frisii, the ancient inhabitants of Frisia prior to 600 AD
Frisian languages, a group of West Germanic languages, including:
Old Frisian, spoken in Frisia from the 8th to the 16th century
Middle Frisian, spoken in Frisia from the 16th to the 19th century
North Frisian language, spoken in Schleswig-Holstein, Germany
Saterland Frisian language, spoken in Lower Saxony, Germany
West Frisian language, spoken in Friesland, Netherlands

Frisian or Friesian may also refer to:

Animal breeds
Friesian (chicken), a Dutch breed of chicken
East Friesian (sheep), a breed of sheep notable for its high production of milk
Friesian cross, a cross of the Friesian horse with any other breed
Friesian horse, a horse breed from Friesland
Friesian Sporthorse, a type of Friesian cross, bred specifically for sport horse disciplines
Friesian or Holstein (cattle), a widespread black-and-white breed of dairy cattle

Other uses
Friesan Fire, a horse that ran in the 2009 Kentucky Derby
Frisian horse or cheval de frise, a type of military barrier
Frisian School, a school of philosophy based on the works of Jakob Friedrich Fries
Frisian Solar Challenge, a solar-powered boat race

See also
East Frisian (disambiguation)
West Frisian (disambiguation)
11784
https://en.wikipedia.org/wiki/Fauna%20%28disambiguation%29
Fauna (disambiguation)
Fauna is a collective term for animal life. Fauna may also refer to:
Fauna (deity), an ancient Roman goddess
Fauna, Bloemfontein, a suburb of the South African city of Bloemfontein
Fauna (album), a 2008 album by Oh Land
Fauna (film), a 2020 Mexican/Canadian drama film
Fauna, a fictional character from Disney's Sleeping Beauty; see Flora, Fauna, and Merryweather
Fauna, a 2009 Spiel des Jahres-nominated board game
Fauna, a female character in Sweet Thursday, a novel by John Steinbeck
Florian-Ayala Fauna, American artist, musician, and music producer

See also
Avifauna (disambiguation)
11786
https://en.wikipedia.org/wiki/Federico%20Fellini
Federico Fellini
Federico Fellini (20 January 1920 – 31 October 1993) was an Italian film director and screenwriter known for his distinctive style, which blends fantasy and baroque images with earthiness. He is recognized as one of the greatest and most influential filmmakers of all time. His films have ranked highly in critical polls such as those of Cahiers du Cinéma and Sight & Sound, which lists his 1963 film 8½ as the 10th-greatest film. For La Dolce Vita Fellini won the Palme d'Or; additionally, he was nominated for twelve Academy Awards, and won four in the category of Best Foreign Language Film, the most for any director in the history of the Academy. He received an honorary award for Lifetime Achievement at the 65th Academy Awards in Los Angeles. His other well-known films include La Strada (1954), Nights of Cabiria (1957), 8½ (1963), Juliet of the Spirits (1965), the "Toby Dammit" segment of Spirits of the Dead (1968), Fellini Satyricon (1969), Roma (1972), Amarcord (1973), and Fellini's Casanova (1976). Fellini was ranked 2nd in the directors' poll and 7th in the critics' poll in Sight & Sound's 2002 list of the greatest directors of all time.

Early life and education

Rimini (1920–1938)

Fellini was born on 20 January 1920, to middle-class parents in Rimini, then a small town on the Adriatic Sea. On 25 January, at the San Nicolò church, he was baptized Federico Domenico Marcello Fellini. His father, Urbano Fellini (1894–1956), born to a family of Romagnol peasants and small landholders from Gambettola, moved to Rome in 1915 as a baker apprenticed to the Pantanella pasta factory. His mother, Ida Barbiani (1896–1984), came from a bourgeois Catholic family of Roman merchants. Despite her family's vehement disapproval, she had eloped with Urbano in 1917 to live at his parents' home in Gambettola. A civil marriage followed in 1918, with the religious ceremony held at Santa Maria Maggiore in Rome a year later.
The couple settled in Rimini, where Urbano became a traveling salesman and wholesale vendor. Fellini had two siblings: Riccardo (1921–1991), a documentary director for RAI Television, and Maria Maddalena (m. Fabbri; 1929–2002). In 1924, Fellini started primary school in an institute run by the nuns of San Vincenzo in Rimini, attending the Carlo Tonni public school two years later. An attentive student, he spent his leisure time drawing, staging puppet shows and reading Il corriere dei piccoli, the popular children's magazine that reproduced traditional American cartoons by Winsor McCay, George McManus and Frederick Burr Opper. (Opper's Happy Hooligan would provide the visual inspiration for Gelsomina in Fellini's 1954 film La Strada; McCay's Little Nemo would directly influence his 1980 film City of Women.) In 1926, he discovered the world of Grand Guignol, the circus with Pierino the Clown, and the movies. Guido Brignone's Maciste all'Inferno (1926), the first film he saw, would mark him in ways linked to Dante and the cinema throughout his entire career. Enrolled at the Ginnasio Giulio Cesare in 1929, he made friends with Luigi Titta Benzi, later a prominent Rimini lawyer (and the model for young Titta in Amarcord (1973)). In Mussolini's Italy, Fellini and Riccardo became members of the Avanguardista, the compulsory Fascist youth group for males. He visited Rome with his parents for the first time in 1933, the year of the maiden voyage of the transatlantic ocean liner SS Rex (which is shown in Amarcord). The sea creature found on the beach at the end of La Dolce Vita (1960) has its basis in a giant fish marooned on a Rimini beach during a storm in 1934. Although Fellini adapted key events from his childhood and adolescence in films such as I Vitelloni (1953), 8½ (1963), and Amarcord (1973), he insisted that such autobiographical memories were inventions.

In 1937, Fellini opened Febo, a portrait shop in Rimini, with the painter Demos Bonini.
His first humorous article appeared in the "Postcards to Our Readers" section of Milan's Domenica del Corriere. Deciding on a career as a caricaturist and gag writer, Fellini travelled to Florence in 1938, where he published his first cartoon in the weekly 420. According to a biographer, Fellini found school "exasperating" and, in one year, had 67 absences. Failing his military culture exam, he graduated from high school in July 1938 after doubling the exam.

Rome (1939)

In September 1939, he enrolled in law school at the University of Rome to please his parents. Biographer Hollis Alpert reports that "there is no record of his ever having attended a class". Installed in a family pensione, he met another lifelong friend, the painter Rinaldo Geleng. Desperately poor, they unsuccessfully joined forces to draw sketches of restaurant and café patrons. Fellini eventually found work as a cub reporter on the dailies Il Piccolo and Il Popolo di Roma, but quit after a short stint, bored by the local court news assignments. Four months after publishing his first article in Marc'Aurelio, the highly influential biweekly humour magazine, he joined the editorial board, achieving success with a regular column titled But Are You Listening? Described as "the determining moment in Fellini's life", the magazine gave him steady employment between 1939 and 1942, when he interacted with writers, gagmen, and scriptwriters. These encounters eventually led to opportunities in show business and cinema. Among his collaborators on the magazine's editorial board were the future director Ettore Scola, Marxist theorist and scriptwriter Cesare Zavattini, and Bernardino Zapponi, a future Fellini screenwriter. Conducting interviews for CineMagazzino also proved congenial: when asked to interview Aldo Fabrizi, Italy's most popular variety performer, he established such immediate personal rapport with the man that they collaborated professionally.
Specializing in humorous monologues, Fabrizi commissioned material from his young protégé.

Career and later life

Early screenplays (1940–1943)

Retained on business in Rimini, Urbano sent his wife and family to Rome in 1940 to share an apartment with his son. Fellini and Ruggero Maccari, also on the staff of Marc'Aurelio, began writing radio sketches and gags for films. Not yet twenty, and with Fabrizi's help, Fellini obtained his first screen credit as a comedy writer on Mario Mattoli's Il pirata sono io (The Pirate's Dream). Progressing rapidly to numerous collaborations on films at Cinecittà, his circle of professional acquaintances widened to include novelist Vitaliano Brancati and scriptwriter Piero Tellini. In the wake of Mussolini's declaration of war against France and Britain on 10 June 1940, Fellini discovered Kafka's The Metamorphosis, Gogol, John Steinbeck and William Faulkner, along with French films by Marcel Carné, René Clair, and Julien Duvivier. In 1941 he published Il mio amico Pasqualino, a 74-page booklet in ten chapters describing the absurd adventures of Pasqualino, an alter ego. Writing for radio while attempting to avoid the draft, Fellini met his future wife Giulietta Masina in a studio office at the Italian public radio broadcaster EIAR in the autumn of 1942. Well paid as the voice of Pallina in Fellini's radio serial Cico and Pallina, Masina was also well known for her musical-comedy broadcasts, which cheered an audience depressed by the war. In November 1942, Fellini was sent to Libya, occupied by Fascist Italy, to work on the screenplay of I cavalieri del deserto (Knights of the Desert, 1942), directed by Osvaldo Valenti and Gino Talamo. Fellini welcomed the assignment as it allowed him "to secure another extension on his draft order". Responsible for emergency re-writing, he also directed the film's first scenes.
When Tripoli fell under siege by British forces, he and his colleagues made a narrow escape by boarding a German military plane flying to Sicily. His African adventure, later published in Marc'Aurelio as "The First Flight", marked "the emergence of a new Fellini, no longer just a screenwriter, working and sketching at his desk, but a filmmaker out in the field". The apolitical Fellini was finally freed of the draft when an Allied air raid over Bologna destroyed his medical records. Fellini and Giulietta hid in her aunt's apartment until Mussolini's fall on 25 July 1943. After dating for nine months, the couple were married on 30 October 1943. Several months later, Masina fell down the stairs and suffered a miscarriage. She gave birth to a son, Pierfederico, on 22 March 1945, but the child died of encephalitis 11 days later, on 2 April 1945. The tragedy had enduring emotional and artistic repercussions.

Neorealist apprenticeship (1944–1949)

After the Allied liberation of Rome on 4 June 1944, Fellini and Enrico De Seta opened the Funny Face Shop, where they survived the postwar recession drawing caricatures of American soldiers. He became involved with Italian Neorealism when Roberto Rossellini, at work on Stories of Yesteryear (later Rome, Open City), met Fellini in his shop and proposed he contribute gags and dialogue for the script. Aware of Fellini's reputation as Aldo Fabrizi's "creative muse", Rossellini also requested that he try to convince the actor to play the role of Father Giuseppe Morosini, the parish priest executed by the SS on 4 April 1944. In 1947, Fellini and Sergio Amidei received an Oscar nomination for the screenplay of Rome, Open City. Working as both screenwriter and assistant director on Rossellini's Paisà (Paisan) in 1946, Fellini was entrusted to film the Sicilian scenes in Maiori. In February 1948, he was introduced to Marcello Mastroianni, then a young theatre actor appearing in a play with Giulietta Masina.
Establishing a close working relationship with Alberto Lattuada, Fellini co-wrote the director's Senza pietà (Without Pity) and Il mulino del Po (The Mill on the Po). Fellini also worked with Rossellini on the anthology film L'Amore (1948), co-writing the screenplay and, in a segment titled "The Miracle", acting opposite Anna Magnani. To play the role of a vagabond rogue mistaken by Magnani for a saint, Fellini had to bleach his black hair blond.

Early films (1950–1953)

In 1950 Fellini co-produced and co-directed with Alberto Lattuada Variety Lights (Luci del varietà), his first feature film. A backstage comedy set among the world of small-time travelling performers, it featured Giulietta Masina and Lattuada's wife, Carla Del Poggio. Its release to poor reviews and limited distribution proved disastrous for all concerned. The production company went bankrupt, leaving both Fellini and Lattuada with debts to pay for over a decade. In February 1950, Paisà received an Oscar nomination for the screenplay by Rossellini, Sergio Amidei, and Fellini. After travelling to Paris for a script conference with Rossellini on Europa '51, Fellini began production on The White Sheik in September 1951, his first solo-directed feature. Starring Alberto Sordi in the title role, the film is a revised version of a treatment first written by Michelangelo Antonioni in 1949 and based on the fotoromanzi, the photographed cartoon-strip romances popular in Italy at the time. Producer Carlo Ponti commissioned Fellini and Tullio Pinelli to write the script, but Antonioni rejected the story they developed. With Ennio Flaiano, they re-worked the material into a light-hearted satire about newlywed couple Ivan and Wanda Cavalli (Leopoldo Trieste, Brunella Bovo), in Rome to visit the Pope. Ivan's prissy mask of respectability is soon demolished by his wife's obsession with the White Sheik.
Highlighting the music of Nino Rota, the film was selected at Cannes (among the films in competition was Orson Welles's Othello) and then retracted. Screened at the 13th Venice International Film Festival, it was razzed by critics in "the atmosphere of a soccer match". One reviewer declared that Fellini had "not the slightest aptitude for cinema direction". In 1953, I Vitelloni found favour with the critics and public. Winning the Silver Lion Award in Venice, it secured Fellini his first international distributor.

Beyond neorealism (1954–1960)

Fellini directed La Strada based on a script completed in 1952 with Pinelli and Flaiano. During the last three weeks of shooting, Fellini experienced the first signs of severe clinical depression. Aided by his wife, he undertook a brief period of therapy with Freudian psychoanalyst Emilio Servadio. Fellini cast American actor Broderick Crawford to interpret the role of an aging swindler in Il Bidone. Based partly on stories told to him by a petty thief during production of La Strada, Fellini developed the script into a con man's slow descent towards a solitary death. To incarnate the role's "intense, tragic face", Fellini's first choice had been Humphrey Bogart, but after learning of the actor's lung cancer, he chose Crawford after seeing his face on the theatrical poster of All the King's Men (1949). The film shoot was fraught with difficulties stemming from Crawford's alcoholism. Savaged by critics at the 16th Venice International Film Festival, the film did miserably at the box office and did not receive international distribution until 1964. During the autumn, Fellini researched and developed a treatment based on a film adaptation of Mario Tobino's novel The Free Women of Magliano. Set in a mental institution for women, the project was abandoned when financial backers decided the subject had no potential.
While preparing Nights of Cabiria in spring 1956, Fellini learned of his father's death by cardiac arrest at the age of sixty-two. Produced by Dino De Laurentiis and starring Giulietta Masina, the film took its inspiration from news reports of a woman's severed head retrieved from a lake and stories by Wanda, a shantytown prostitute Fellini met on the set of Il Bidone. Pier Paolo Pasolini was hired to translate Flaiano and Pinelli's dialogue into Roman dialect and to supervise research in the vice-afflicted suburbs of Rome. The movie won the Academy Award for Best Foreign Language Film at the 30th Academy Awards and brought Masina the Best Actress Award at Cannes for her performance. With Pinelli, he developed Journey with Anita for Sophia Loren and Gregory Peck. An "invention born out of intimate truth", the script was based on Fellini's return to Rimini with a mistress to attend his father's funeral. Due to Loren's unavailability, the project was shelved and resurrected twenty-five years later as Lovers and Liars (1981), a comedy directed by Mario Monicelli with Goldie Hawn and Giancarlo Giannini. For Eduardo De Filippo, he co-wrote the script of Fortunella, tailoring the lead role to accommodate Masina's particular sensibility. The Hollywood on the Tiber phenomenon of 1958, in which American studios profited from the cheap studio labour available in Rome, provided the backdrop for photojournalists to steal shots of celebrities on the via Veneto. The scandal provoked by Turkish dancer Haish Nana's improvised striptease at a nightclub captured Fellini's imagination: he decided to end his latest script-in-progress, Moraldo in the City, with an all-night "orgy" at a seaside villa. Pierluigi Praturlon's photos of Anita Ekberg wading fully dressed in the Trevi Fountain provided further inspiration for Fellini and his scriptwriters.
Changing the title of the screenplay to La Dolce Vita, Fellini soon clashed with his producer on casting: the director insisted on the relatively unknown Mastroianni, while De Laurentiis wanted Paul Newman as a hedge on his investment. Reaching an impasse, De Laurentiis sold the rights to publishing mogul Angelo Rizzoli. Shooting began on 16 March 1959 with Anita Ekberg climbing the stairs to the cupola of Saint Peter's in a mammoth décor constructed at Cinecittà. The statue of Christ flown by helicopter over Rome to St. Peter's Square was inspired by an actual media event on 1 May 1956, which Fellini had witnessed. The film wrapped on 15 August on a deserted beach at Passo Oscuro with a bloated mutant fish designed by Piero Gherardi. La Dolce Vita broke all box office records. Despite scalpers selling tickets at 1000 lire, crowds queued in line for hours to see an "immoral movie" before the censors banned it. At an exclusive Milan screening on 5 February 1960, one outraged patron spat on Fellini while others hurled insults. After the film was denounced in parliament by right-wing conservatives, undersecretary Domenico Magrì of the Christian Democrats demanded tolerance for its controversial themes. The Vatican's official press organ, l'Osservatore Romano, lobbied for censorship while the Board of Roman Parish Priests and the Genealogical Board of Italian Nobility attacked the film. In one documented instance involving favourable reviews written by the Jesuits of San Fedele, defending La Dolce Vita had severe consequences. In competition at Cannes alongside Antonioni's L'Avventura, the film won the Palme d'Or awarded by presiding juror Georges Simenon. The Belgian writer was promptly "hissed at" by the disapproving festival crowd.

Art films and dreams (1961–1969)

A major discovery for Fellini after his Italian neorealism period (1950–1959) was the work of Carl Jung. After meeting Jungian psychoanalyst Dr.
Ernst Bernhard in early 1960, he read Jung's autobiography Memories, Dreams, Reflections (1963) and experimented with LSD. Bernhard also recommended that Fellini consult the I Ching and keep a record of his dreams. What Fellini formerly accepted as "his extrasensory perceptions" were now interpreted as psychic manifestations of the unconscious. Bernhard's focus on Jungian depth psychology proved to be the single greatest influence on Fellini's mature style and marked the turning point in his work from neorealism to filmmaking that was "primarily oneiric". As a consequence, Jung's seminal ideas on the anima and the animus, the role of archetypes, and the collective unconscious directly influenced such films as 8½ (1963), Juliet of the Spirits (1965), Fellini Satyricon (1969), Casanova (1976), and City of Women (1980). Other key influences on his work include Luis Buñuel, Charlie Chaplin, Sergei Eisenstein, Buster Keaton, Laurel and Hardy, the Marx Brothers, and Roberto Rossellini. Exploiting La Dolce Vita's success, financier Angelo Rizzoli set up Federiz in 1960, an independent film company, for Fellini and production manager Clemente Fracassi to discover and produce new talent. Despite the best intentions, their overcautious editorial and business skills forced the company to close down soon after cancelling Pasolini's project, Accattone (1961). Condemned as a "public sinner" for La Dolce Vita, Fellini responded with The Temptations of Doctor Antonio, a segment in the omnibus Boccaccio '70. His second colour film, it was the sole project green-lighted at Federiz. Infused with the surrealistic satire that characterized the young Fellini's work at Marc'Aurelio, the film ridiculed a crusader against vice, interpreted by Peppino De Filippo, who goes insane trying to censor a billboard of Anita Ekberg espousing the virtues of milk.
In an October 1960 letter to his colleague Brunello Rondi, Fellini first outlined his film ideas about a man suffering creative block: "Well then – a guy (a writer? any kind of professional man? a theatrical producer?) has to interrupt the usual rhythm of his life for two weeks because of a not-too-serious disease. It's a warning bell: something is blocking up his system." Unclear about the script, its title, and his protagonist's profession, he scouted locations throughout Italy "looking for the film", in the hope of resolving his confusion. Flaiano suggested La bella confusione (literally The Beautiful Confusion) as the movie's title. Under pressure from his producers, Fellini finally settled on 8½, a self-referential title referring principally (but not exclusively) to the number of films he had directed up to that time. Giving the order to start production in spring 1962, Fellini signed deals with his producer Rizzoli, fixed dates, had sets constructed, cast Mastroianni, Anouk Aimée, and Sandra Milo in lead roles, and did screen tests at the Scalera Studios in Rome. He hired cinematographer Gianni Di Venanzo, among other key personnel. But apart from naming his hero Guido Anselmi, he still couldn't decide what his character did for a living. The crisis came to a head in April when, sitting in his Cinecittà office, he began a letter to Rizzoli confessing he had "lost his film" and had to abandon the project. Interrupted by the chief machinist requesting he celebrate the launch of 8½, Fellini put aside the letter and went on the set. Raising a toast to the crew, he "felt overwhelmed by shame… I was in a no exit situation. I was a director who wanted to make a film he no longer remembers. And lo and behold, at that very moment everything fell into place. I got straight to the heart of the film. I would narrate everything that had been happening to me. I would make a film telling the story of a director who no longer knows what film he wanted to make".
The self-mirroring structure makes the entire film inseparable from its reflecting construction. Shooting began on 9 May 1962. Perplexed by the seemingly chaotic, incessant improvisation on the set, Deena Boyer, the director's American press officer at the time, asked for a rationale. Fellini told her that he hoped to convey the three levels "on which our minds live: the past, the present, and the conditional — the realm of fantasy". After shooting wrapped on 14 October, Nino Rota composed various circus marches and fanfares that would later become signature tunes of the maestro's cinema. Nominated for four Oscars, 8½ won awards for best foreign language film and best costume design in black-and-white. In California for the ceremony, Fellini toured Disneyland with Walt Disney the day after. Increasingly attracted to parapsychology, Fellini met the Turin antiquarian Gustavo Rol in 1963. Rol, a former banker, introduced him to the world of Spiritism and séances. In 1964, Fellini took LSD under the supervision of Emilio Servadio, his psychoanalyst during the 1954 production of La Strada. For years reserved about what actually occurred that Sunday afternoon, he admitted in 1992 that "... objects and their functions no longer had any significance. All I perceived was perception itself, the hell of forms and figures devoid of human emotion and detached from the reality of my unreal environment. I was an instrument in a virtual world that constantly renewed its own meaningless image in a living world that was itself perceived outside of nature. And since the appearance of things was no longer definitive but limitless, this paradisiacal awareness freed me from the reality external to my self. The fire and the rose, as it were, became one."
Fellini's hallucinatory insights were given full flower in his first colour feature, Juliet of the Spirits (1965), depicting Giulietta Masina as Juliet, a housewife who rightly suspects her husband's infidelity and succumbs to the voices of spirits summoned during a séance at her home. Her sexually voracious next-door neighbor Suzy (Sandra Milo) introduces Juliet to a world of uninhibited sensuality, but Juliet is haunted by childhood memories of her Catholic guilt and a teenaged friend who committed suicide. Complex and filled with psychological symbolism, the film is set to a jaunty score by Nino Rota.

Nostalgia, sexuality, and politics (1970–1980)

To help promote Satyricon in the United States, Fellini flew to Los Angeles in January 1970 for interviews with Dick Cavett and David Frost. He also met with film director Paul Mazursky, who wanted to star him alongside Donald Sutherland in his new film, Alex in Wonderland. In February, Fellini scouted locations in Paris for The Clowns, a docufiction both for cinema and television, based on his childhood memories of the circus and a "coherent theory of clowning." As he saw it, the clown "was always the caricature of a well-established, ordered, peaceful society. But today all is temporary, disordered, grotesque. Who can still laugh at clowns?... All the world plays a clown now." In March 1971, Fellini began production on Roma, a seemingly random collection of episodes informed by the director's memories and impressions of Rome. The "diverse sequences," writes Fellini scholar Peter Bondanella, "are held together only by the fact that they all ultimately originate from the director's fertile imagination." The film's opening scene anticipates Amarcord, while its most surreal sequence involves an ecclesiastical fashion show in which nuns and priests roller skate past shipwrecks of cobwebbed skeletons. Over a period of six months between January and June 1973, Fellini shot the Oscar-winning Amarcord.
Loosely based on the director's 1968 autobiographical essay My Rimini, the film depicts the adolescent Titta and his friends working out their sexual frustrations against the religious and Fascist backdrop of a provincial town in Italy during the 1930s. Produced by Franco Cristaldi, the seriocomic movie became Fellini's second biggest commercial success after La Dolce Vita. Circular in form, Amarcord avoids plot and linear narrative in a way similar to The Clowns and Roma. The director's overriding concern with developing a poetic form of cinema was first outlined in a 1965 interview he gave to The New Yorker journalist Lillian Ross: "I am trying to free my work from certain constrictions – a story with a beginning, a development, an ending. It should be more like a poem with metre and cadence." Late films and projects (1981–1990) Organized by his publisher Diogenes Verlag in 1982, the first major exhibition of 63 drawings by Fellini was held in Paris, Brussels, and the Pierre Matisse Gallery in New York. A gifted caricaturist, he found much of the inspiration for his sketches from his own dreams while the films-in-progress both originated from and stimulated drawings for characters, decor, costumes and set designs. Under the title, I disegni di Fellini (Fellini's Designs), he published 350 drawings executed in pencil, watercolours, and felt pens. On 6 September 1985 Fellini was awarded the Golden Lion for lifetime achievement at the 42nd Venice Film Festival. That same year, he became the first non-American to receive the Film Society of Lincoln Center's annual award for cinematic achievement. Long fascinated by Carlos Castaneda's The Teachings of Don Juan: A Yaqui Way of Knowledge, Fellini accompanied the Peruvian author on a journey to the Yucatán to assess the feasibility of a film. After first meeting Castaneda in Rome in October 1984, Fellini drafted a treatment with Pinelli titled Viaggio a Tulun. 
Producer Alberto Grimaldi, prepared to buy film rights to all of Castaneda's work, then paid for pre-production research taking Fellini and his entourage from Rome to Los Angeles and the jungles of Mexico in October 1985. When Castaneda inexplicably disappeared and the project fell through, Fellini's mystico-shamanic adventures were scripted with Pinelli and serialized in Corriere della Sera in May 1986. A barely veiled satirical interpretation of Castaneda's work, Viaggio a Tulun was published in 1989 as a graphic novel with artwork by Milo Manara and as Trip to Tulum in America in 1990. For Intervista, produced by Ibrahim Moussa and RAI Television, Fellini intercut memories of the first time he visited Cinecittà in 1939 with present-day footage of himself at work on a screen adaptation of Franz Kafka's Amerika. A meditation on the nature of memory and film production, it won the special 40th Anniversary Prize at Cannes and the 15th Moscow International Film Festival Golden Prize. In Brussels later that year, a panel of thirty professionals from eighteen European countries named Fellini the world's best director and 8½ the best European film of all time. In early 1989 Fellini began production on The Voice of the Moon, based on Ermanno Cavazzoni's novel, Il poema dei lunatici (The Lunatics' Poem). A small town was built at Empire Studios on the via Pontina outside Rome. Roberto Benigni starred as Ivo Salvini, a madcap poetic figure newly released from a mental institution; the character is a combination of La Strada's Gelsomina, Pinocchio, and the Italian poet Giacomo Leopardi. Fellini improvised as he filmed, using as a guide a rough treatment written with Pinelli. Despite its modest critical and commercial success in Italy, and its warm reception by French critics, it failed to interest North American distributors. Fellini won the Praemium Imperiale, an international prize in the visual arts given by the Japan Art Association in 1990. 
Final years (1991–1993) In July 1991 and April 1992, Fellini worked in close collaboration with Canadian filmmaker Damian Pettigrew to establish "the longest and most detailed conversations ever recorded on film". Described as the "Maestro's spiritual testament" by his biographer Tullio Kezich, excerpts culled from the conversations later served as the basis of their feature documentary, Fellini: I'm a Born Liar (2002) and the book, I'm a Born Liar: A Fellini Lexicon. Finding it increasingly difficult to secure financing for feature films, Fellini developed a suite of television projects whose titles reflect their subjects: Attore, Napoli, L'Inferno, L'opera lirica, and L'America. In April 1993 Fellini received his fifth Oscar, for lifetime achievement, "in recognition of his cinematic accomplishments that have thrilled and entertained audiences worldwide". On 16 June, he entered the Cantonal Hospital in Zürich for an angioplasty on his femoral artery but suffered a stroke at the Grand Hotel in Rimini two months later. Partially paralyzed, he was first transferred to Ferrara for rehabilitation and then to the Policlinico Umberto I in Rome to be near his wife, also hospitalized. He suffered a second stroke and fell into an irreversible coma. Death Fellini died in Rome on 31 October 1993, at the age of 73, a day after his 50th wedding anniversary, following a heart attack he had suffered a few weeks earlier. The memorial service, in Studio 5 at Cinecittà, was attended by an estimated 70,000 people. At Giulietta Masina's request, trumpeter Mauro Maur played Nino Rota's "Improvviso dell'Angelo" during the ceremony. Five months later, on 23 March 1994, Masina died of lung cancer. Fellini, Masina and their son, Pierfederico, are buried in a bronze sepulchre sculpted by Arnaldo Pomodoro. Designed as a ship's prow, the tomb is at the main entrance to the cemetery of Rimini. The Federico Fellini Airport in Rimini is named in his honour. 
Religious views Fellini was raised in a Roman Catholic family and considered himself a Catholic, but avoided formal activity in the Catholic Church. Fellini's films include Catholic themes; some celebrate Catholic teachings, while others criticize or ridicule church dogma. Political views While Fellini was for the most part indifferent to politics, he had a general dislike of authoritarian institutions, and is interpreted by Bondanella as believing in "the dignity and even the nobility of the individual human being". In a 1966 interview, he said, "I make it a point to see if certain ideologies or political attitudes threaten the private freedom of the individual. But for the rest, I am not prepared nor do I plan to become interested in politics." Despite various famous Italian actors favouring the Communists, Fellini was not left-wing. It is rumored that he supported Christian Democracy (DC). Bondanella writes that DC "was far too aligned with an extremely conservative and even reactionary pre-Vatican II church to suit Fellini's tastes", but Fellini opposed the '68 Movement and befriended Giulio Andreotti. Apart from satirizing Silvio Berlusconi and mainstream television in Ginger and Fred, Fellini rarely expressed political views in public and never directed an overtly political film. He directed two electoral television spots during the 1990s: one for DC and another for the Italian Republican Party (PRI). His slogan "Non si interrompe un'emozione" (Don't interrupt an emotion) was directed against the excessive use of TV advertisements. The Democratic Party of the Left also used the slogan in the referendums of 1995. Influence and legacy Personal and highly idiosyncratic visions of society, Fellini's films are a unique combination of memory, dreams, fantasy and desire. The adjectives "Fellinian" and "Felliniesque" are "synonymous with any kind of extravagant, fanciful, even baroque image in the cinema and in art in general". 
La Dolce Vita contributed the term paparazzi to the English language, derived from Paparazzo, the photographer friend of journalist Marcello Rubini (Marcello Mastroianni). Contemporary filmmakers such as Tim Burton, Terry Gilliam, Emir Kusturica, and David Lynch have cited Fellini's influence on their work. Polish director Wojciech Has, whose two best-received films, The Saragossa Manuscript (1965) and The Hour-Glass Sanatorium (1973), are examples of modernist fantasies, has been compared to Fellini for the sheer "luxuriance of his images". I Vitelloni inspired European directors Juan Antonio Bardem, Marco Ferreri, and Lina Wertmüller and influenced Martin Scorsese's Mean Streets (1973), George Lucas's American Graffiti (1973), Joel Schumacher's St. Elmo's Fire (1985), and Barry Levinson's Diner (1982), among many others. When the American magazine Cinema asked Stanley Kubrick in 1963 to name his ten favorite films, he ranked I Vitelloni number one. Nights of Cabiria was adapted as the Broadway musical Sweet Charity and the movie Sweet Charity (1969) by Bob Fosse starring Shirley MacLaine. City of Women was adapted for the Berlin stage by Frank Castorf in 1992. 8½ inspired, among others, Mickey One (Arthur Penn, 1965), Alex in Wonderland (Paul Mazursky, 1970), Beware of a Holy Whore (Rainer Werner Fassbinder, 1971), Day for Night (François Truffaut, 1973), All That Jazz (Bob Fosse, 1979), Stardust Memories (Woody Allen, 1980), Sogni d'oro (Nanni Moretti, 1981), Parad Planet (Vadim Abdrashitov, 1984), La Pelicula del rey (Carlos Sorin, 1986), Living in Oblivion (Tom DiCillo, 1995), 8½ Women (Peter Greenaway, 1999), Falling Down (Joel Schumacher, 1993), and the Broadway musical Nine (Maury Yeston and Arthur Kopit, 1982). Yo-Yo Boing! (1998), a Spanish novel by Puerto Rican writer Giannina Braschi, features a dream sequence with Fellini inspired by 8½. 
Fellini's work is referenced on the albums Fellini Days (2001) by Fish, Another Side of Bob Dylan (1964) by Bob Dylan with the song "Motorpsycho Nitemare", Funplex (2008) by the B-52's with the song "Juliet of the Spirits", and in the opening traffic jam of the music video for "Everybody Hurts" by R.E.M. American singer Lana Del Rey has cited Fellini as an influence. His work influenced the American TV shows Northern Exposure and Third Rock from the Sun. Wes Anderson's short film Castello Cavalcanti (2013) is in many places a direct homage to Fellini. In 1996, Entertainment Weekly ranked Fellini tenth on its "50 Greatest Directors" list. In 2002 MovieMaker magazine ranked Fellini No. 9 on their list of The 25 Most Influential Directors of All Time. In 2007, Total Film magazine ranked Fellini at No. 67 on its "100 Greatest Film Directors Ever" list. Various film-related material and personal papers of Fellini are in the Wesleyan University Cinema Archives, to which scholars and media experts have full access. In October 2009, the Jeu de Paume in Paris opened an exhibit devoted to Fellini that included ephemera, television interviews, behind-the-scenes photographs, Book of Dreams (based on 30 years of the director's illustrated dreams and notes), along with excerpts from La dolce vita and 8½. In 2014, the Blue Devils Drum and Bugle Corps of Concord, California, performed "Felliniesque", a show themed around Fellini's work, with which they won a record 16th Drum Corps International World Class championship with a record score of 99.650. That same year, the weekly entertainment-trade magazine Variety announced that French director Sylvain Chomet was moving forward with The Thousand Miles, a project based on various Fellini works, including his unpublished drawings and writings. 
Filmography As a director As a screenwriter Television commercials TV commercial for Campari Soda (1984) TV commercial for Barilla pasta (1984) Three TV commercials for Banca di Roma (1992) Awards and nominations Academy Awards Other awards Honors Documentaries on Fellini Ciao Federico (1969). Dir. Gideon Bachmann. (60') Federico Fellini – (2000). Dir. Paquito Del Bosco. (RAI TV, 68') Fellini: I'm a Born Liar (2002). Dir. Damian Pettigrew. Feature documentary. (Arte, Eurimages, Scottish Screen, 102') How Strange to Be Named Federico (2013). Dir. Ettore Scola. Fellini degli spiriti (2020). Dir. . See also Art film Notes References Sources Further reading External links Fellini Official site (in English) Fellini Foundation Official Rimini web site (in Italian) Fondation Fellini pour le cinéma Swiss web site (in French) Federico Fellini biography on Lambiek Comiclopedia Site commemorating Fellini's 100th birthday 1920 births 1993 deaths Academy Honorary Award recipients Directors of Best Foreign Language Film Academy Award winners Directors of Palme d'Or winners Best Production Design BAFTA Award winners BAFTA fellows European Film Awards winners (people) Italian film directors Television commercial directors Italian screenwriters Italian Roman Catholics People from Rimini David di Donatello winners Nastro d'Argento winners English-language film directors German-language film directors People from the Province of Rimini Analysands of Ernst Bernhard Recipients of the Praemium Imperiale Magic realism Italian-language film directors Italian cartoonists Italian comics artists Italian satirists Italian male screenwriters Italian surrealist artists Surrealist filmmakers 20th-century Italian screenwriters
Fleetwood Mac
Fleetwood Mac are a British-American rock band, formed in London in 1967. Fleetwood Mac were founded by guitarist Peter Green, drummer Mick Fleetwood and guitarist Jeremy Spencer, before bassist John McVie joined the line-up for their self-titled debut album. Danny Kirwan joined as a third guitarist in 1968. Keyboardist and vocalist Christine Perfect, who contributed as a session musician from the second album, married McVie and joined in 1970. Primarily a British blues band at first, Fleetwood Mac scored a UK number one with "Albatross", and had other hits such as the singles "Oh Well" and "Man of the World". All three guitarists left in succession during the early 1970s, to be replaced by guitarists Bob Welch and Bob Weston and vocalist Dave Walker. By 1974, Welch, Weston and Walker had all either departed or been dismissed, leaving the band without a male lead vocalist or guitarist. In late 1974, while Fleetwood was scouting studios in Los Angeles, he heard American folk-rock duo Lindsey Buckingham and Stevie Nicks, and asked Buckingham to be their new lead guitarist, and Buckingham agreed on condition that Nicks could also join the band. The addition of Buckingham and Nicks gave the band a more pop rock sound, and their 1975 self-titled album, Fleetwood Mac, reached No. 1 in the United States. Rumours (1977), Fleetwood Mac's second album after the arrival of Buckingham and Nicks, produced four U.S. Top 10 singles and remained at number one on the American albums chart for 31 weeks. It also reached the top spot in countries around the world and won a Grammy Award for Album of the Year in 1978. Rumours has sold over 40 million copies worldwide, making it one of the best-selling albums in history. Although each member of the band went through a breakup (John and Christine McVie, Buckingham and Nicks, and Fleetwood and his wife Jenny) while recording the album, they continued to write and record music together. 
The band's personnel remained stable through three more studio albums, but by the late 1980s began to disintegrate. After Buckingham and Nicks each left the band, they were replaced by a number of other guitarists and vocalists. A 1993 one-off performance for the first inauguration of Bill Clinton featured the line-up of Fleetwood, John McVie, Christine McVie, Nicks, and Buckingham back together for the first time in six years. A full reunion occurred four years later, and the group released their fourth U.S. No. 1 album, The Dance (1997), a live compilation of their hits, also marking the 20th anniversary of Rumours. Christine McVie left the band in 1998, but continued to work with the band in a session capacity. Meanwhile, the group remained together as a four-piece, releasing their most recent studio album, Say You Will, in 2003. Christine McVie rejoined the band full-time in 2014. In 2018, Buckingham was fired from the band and replaced by Mike Campbell, formerly of Tom Petty and the Heartbreakers, and Neil Finn of Split Enz and Crowded House. Fleetwood Mac have sold more than 120 million records worldwide, making them one of the world's best-selling bands. In 1979, the group were honoured with a star on the Hollywood Walk of Fame. In 1998 the band were inducted into the Rock and Roll Hall of Fame and received the Brit Award for Outstanding Contribution to Music. In 2018, the band received the MusiCares Person of the Year award from The Recording Academy in recognition of their artistic achievement in the music industry and dedication to philanthropy. History 1967–1970: Formation and early years Fleetwood Mac were formed in July 1967 in London, England, when Peter Green left the British blues band John Mayall & the Bluesbreakers. Green had previously replaced guitarist Eric Clapton in the Bluesbreakers and had received critical acclaim for his work on their album A Hard Road. 
Green had been in two bands with Mick Fleetwood, Peter B's Looners and the subsequent Shotgun Express (which featured a young Rod Stewart as vocalist), and suggested Fleetwood as a replacement for drummer Aynsley Dunbar when Dunbar left the Bluesbreakers to join the new Jeff Beck/Rod Stewart band. John Mayall agreed and Fleetwood joined the Bluesbreakers. The Bluesbreakers then consisted of Green, Fleetwood, John McVie and Mayall. Mayall gave Green free recording time as a gift, which Fleetwood, McVie and Green used to record five songs. The fifth song was an instrumental that Green named after the rhythm section, "Fleetwood Mac" ("Mac" being short for McVie). Soon after this, Green suggested to Fleetwood that they form a new band. The pair wanted McVie on bass guitar and named the band 'Fleetwood Mac' to entice him, but McVie opted to keep his steady income with Mayall rather than take a risk with a new band. In the meantime Peter Green and Mick Fleetwood had teamed up with slide guitarist Jeremy Spencer and bassist Bob Brunning. Brunning was in the band on the understanding that he would leave if McVie agreed to join. The Green, Fleetwood, Spencer, Brunning version of the band made its debut on 13 August 1967 at the Windsor Jazz and Blues Festival as 'Peter Green's Fleetwood Mac, also featuring Jeremy Spencer'. Brunning played only a few gigs with Fleetwood Mac. Within weeks of this show, John McVie agreed to join the band as permanent bassist. Fleetwood Mac's self-titled debut album was a blues rock album and was released by the Blue Horizon label in February 1968. There were no other players on the album (except on the song "Long Grey Mare", which was recorded with Brunning on bass). The album was successful in the UK and reached no. 4, although no tracks were released as singles. Later in the year the singles "Black Magic Woman" (later a big hit for Santana) and "Need Your Love So Bad" were released. The band's second studio album, Mr. 
Wonderful, was released in August 1968. Like their first album, it was all blues. The album was recorded live in the studio with miked amplifiers and a PA system, rather than being plugged into the board. They also added horns and featured a friend of the band on keyboards, Christine Perfect of Chicken Shack. Shortly after the release of Mr. Wonderful, Fleetwood Mac recruited 18-year-old guitarist Danny Kirwan. He was in the South London blues trio Boilerhouse, consisting of Kirwan (guitar), Trevor Stevens (bass) and Dave Terrey (drums). Green and Fleetwood had watched Boilerhouse rehearse in a basement boiler-room, and Green had been so impressed that he invited the band to play support slots for Fleetwood Mac. Green wanted Boilerhouse to become a professional band but Stevens and Terrey were not prepared to turn professional, so Green tried to find another rhythm section for Kirwan by placing an ad in Melody Maker. There were over 300 applicants, but when Green and Fleetwood ran auditions at the Nag's Head in Battersea (home of the Mike Vernon Blue Horizon Club) the hard-to-please Green could not find anyone good enough. Fleetwood invited Kirwan to join Fleetwood Mac as a third guitarist. Green was frustrated that Jeremy Spencer did not contribute to his songs. Kirwan, a talented self-taught guitarist, had a signature vibrato and a unique style that added a new dimension to the band's sound. In November 1968, with Kirwan in the band, they released their first number one single in Europe, "Albatross", on which Kirwan duetted with Green. Green said later that the success of 'Albatross' was thanks to Kirwan. "If it wasn't for Danny, I would never have had a number one hit record." In January 1969 they released their first compilation album English Rose, which contained half of Mr Wonderful plus new songs from Kirwan. 
Their next, more successful compilation album, The Pious Bird of Good Omen, was released in August and contained various singles, B-sides and tracks the band had done with Eddie Boyd. On tour in the US in January 1969, the band recorded Fleetwood Mac in Chicago (released in December as a double album) at the soon-to-close Chess Records Studio with some of the blues legends of Chicago, including Willie Dixon, Buddy Guy and Otis Spann. These were Fleetwood Mac's last all-blues recordings. Along with the change of style the band was also going through label changes. Up until that point they had been on the Blue Horizon label, but with Kirwan in the band the musical possibilities had become too diverse for a blues-only label. The band signed with Immediate Records and released the single "Man of the World", which became another British and European hit. For the B-side Spencer fronted Fleetwood Mac as "Earl Vince and the Valiants" and recorded "Somebody's Gonna Get Their Head Kicked In Tonite", typifying the more raucous rock 'n' roll side of the band. Immediate Records was in bad shape, however, and the band shopped around for a new deal. The Beatles wanted the band on Apple Records (Mick Fleetwood and George Harrison were brothers-in-law), but the band's manager Clifford Davis decided to go with Warner Bros. Records (through Reprise Records, a Frank Sinatra-founded label), the label they have stayed with ever since. Under the wing of Reprise, Fleetwood Mac released their third studio album, Then Play On, in September 1969. Although the initial pressing of the American release of this album was the same as the British version, it was altered to contain the song "Oh Well", which featured consistently in live performances from the time of its release through 1997 and again starting in 2009. Then Play On, the band's first rock album, was written by Kirwan and Green, plus a track each by Fleetwood and McVie. 
Jeremy Spencer, meanwhile, had recorded a solo album of 1950s-style rock and roll songs, backed by the rest of the band except Green. By 1970, Green, the frontman of the band, had become a user of LSD. During the band's European tour, he experienced a bad acid trip at a hippie commune in Munich. Clifford Davis, the band's manager, singled out this incident as the crucial point in Green's mental decline. He said: "The truth about Peter Green and how he ended up how he did is very simple. We were touring Europe in late 1969. When we were in Germany, Peter told me he had been invited to a party. I knew there were going to be a lot of drugs around and I suggested that he didn't go. But he went anyway and I understand from him that he took what turned out to be very bad, impure LSD. He was never the same again." German author and filmmaker Rainer Langhans stated in his autobiography that he and Uschi Obermaier met Green in Munich and invited him to their Highfisch-Kommune, where the drinks were spiked with acid. Langhans and Obermaier were planning to organise an open-air "Bavarian Woodstock", for which they wanted Jimi Hendrix and The Rolling Stones to be the main acts, and they hoped Green would help them to get in contact with The Rolling Stones. Green's last hit with Fleetwood Mac was "The Green Manalishi (With the Two-Prong Crown)". The track was recorded at Warner-Reprise's studios in Hollywood on the band's third US tour in April 1970, a few weeks before Green left the band. A live performance was recorded at the Boston Tea Party in February 1970, and the song was later recorded by Judas Priest. "Green Manalishi" was released as Green's mental stability deteriorated. He wanted the band to give all their money to charity, but the other members of the band disagreed. In April, Green decided to quit the band after the completion of their European tour. His last show with Fleetwood Mac was on 20 May 1970. 
During that show the band went past their allotted time and the power was shut off, although Mick Fleetwood kept drumming. Some of the Boston Tea Party recordings (5/6/7 February 1970) were eventually released in the 1980s as the Live in Boston album. A more complete remastered three-volume compilation was released by Snapper Music in the late 1990s. 1970–1974: Transitional era Kirwan and Spencer were left with the task of replacing Green in their live shows and on their recordings. In September 1970, Fleetwood Mac released their fourth studio album, Kiln House. Kirwan's songs on the album moved the band in the direction of rock, while Spencer's contributions focused on re-creating the country-tinged "Sun Sound" of the late 1950s. Christine Perfect, who had retired from the music business after one unsuccessful solo album, contributed (uncredited) to Kiln House, singing backup vocals and playing keyboards. She also drew the album cover. After Kiln House, Fleetwood Mac were progressing and developing a new sound, and she was invited to join the band to help fill in the rhythm section. They released a single, Danny Kirwan's "Dragonfly" b/w "The Purple Dancer" in the UK and certain European countries, but despite good notices in the press it was not a success. The B-side has been reissued only once, on a Reprise German and Dutch-only "Best of" album. The single was re-issued on 19 April 2014 for Record Store Day (RSD) 2014 in Europe on Blue Vinyl and in the U.S. on translucent purple vinyl. Christine Perfect, who by this point had married bassist John McVie, made her first appearance with the band as Christine McVie at Bristol University, England, in May 1969, just as she was leaving Chicken Shack. She had had success with the Etta James classic "I'd Rather Go Blind" and was twice voted female artist of the year in England. Christine McVie played her first gig as an official member of Fleetwood Mac on 1 August 1970 in New Orleans, Louisiana. 
CBS Records, which now owned Blue Horizon (except in the US and Canada), released the band's fourth compilation album, The Original Fleetwood Mac, containing previously unreleased material. The album was relatively successful, and the band continued to gain popularity. While on tour in February 1971, Jeremy Spencer said he was going out to "get a magazine" but never returned. After several days of frantic searching the band discovered that Spencer had joined a religious group, the Children of God. The band were liable for the remaining shows on the tour and asked Peter Green to step in as a replacement. Green brought along his friend Nigel Watson, who played the congas. (Twenty-five years later Green and Watson collaborated again to form the Peter Green Splinter Group). Green was only back with Fleetwood Mac temporarily and the band began a search for a new guitarist. Green insisted on playing only new material and none that he had written. He and Watson played only the last week of shows. The San Bernardino show on 20 February was taped. In the summer of 1971, the band held auditions for a replacement guitarist at their large country home, "Benifold", which they had jointly bought with their manager Davis for £23,000 prior to the Kiln House tour. A friend of the band, Judy Wong, recommended her high school friend Bob Welch, who was living in Paris, France, at the time. The band held a few meetings with Welch and decided to hire him, without actually playing with him, after they heard a tape of his songs. In September 1971, the band released their fifth studio album, Future Games. As a result of Welch's arrival and Spencer's departure, the album was different from anything they had done previously. While it became the band's first studio album to miss the charts in the UK, it helped to expand the band's appeal in the United States. 
In Europe CBS released Fleetwood Mac's first Greatest Hits album, which mostly consisted of songs by Peter Green, with one song by Spencer and one by Kirwan. In 1972, six months after the release of Future Games, the band released their sixth studio album, Bare Trees. Mostly composed by Kirwan, Bare Trees featured the Welch-penned single "Sentimental Lady", which would be a much bigger hit for Welch five years later when he re-recorded it for his solo album French Kiss, backed by Mick Fleetwood and Christine McVie. Bare Trees also featured "Spare Me a Little of Your Love", a bright Christine McVie song that became a staple of the band's live act throughout the early to mid-1970s. While the band was doing well in the studio, their tours started to be problematic. By 1972 Danny Kirwan had developed an alcohol dependency and was becoming alienated from Welch and the McVies. When Kirwan smashed his Gibson Les Paul Custom guitar before a concert on a US tour in August 1972, refused to go on stage and criticised the band afterwards, Fleetwood fired him. Fleetwood said later that the pressure had become too much for Kirwan, and he had suffered a breakdown. The band's line-up changed constantly across the three albums released in this period. In September 1972 the band added guitarist Bob Weston and vocalist Dave Walker, formerly of Savoy Brown and Idle Race. Bob Weston was well known as a slide guitarist and had known the band from his touring period with Long John Baldry. Fleetwood Mac also hired Savoy Brown's road manager, John Courage. Fleetwood, the McVies, Welch, Weston and Walker recorded the band's seventh studio album, Penguin, which was released in January 1973. After the tour the band fired Walker because they felt his vocal style and attitude did not fit well with the rest of the band. The remaining five members carried on and recorded the band's eighth studio album, Mystery to Me, six months later. 
This album contained Welch's song "Hypnotized", which received a great amount of airplay on the radio and became one of the band's most successful songs to date in the US. The band was proud of the new album and anticipated that it would be a smash hit. While it did eventually go Gold, personal problems within the band emerged. The McVies' marriage was under a lot of stress, which was aggravated by their constant working with each other and by John McVie's considerable alcohol abuse. Subsequent lack of touring meant that the album was unable to chart as high as the previous one. During the 1973 US tour to promote Mystery to Me, Weston had an affair with Fleetwood's wife Jenny Boyd Fleetwood, sister of Pattie Boyd Harrison. Fleetwood was said to have been emotionally devastated by this, and could not continue with the tour. Courage fired Weston, and two weeks in, with another 26 concerts still scheduled, the tour was cancelled. The last date played was Lincoln, Nebraska, on 20 October 1973. In a late-night meeting after that show, the band told their sound engineer that the tour was over and Fleetwood Mac was splitting up. 1974: Name dispute and 'fake Fleetwood Mac' In late 1973, after the collapse of the US tour, the band's manager, Clifford Davis, was left with major touring commitments to fulfil and no band. Fleetwood Mac had "temporarily disbanded" in Nebraska and its members had gone their separate ways. Davis was concerned that failing to complete the tour would destroy his reputation with bookers and promoters. He sent the band a letter in which he said he "hadn't slaved for years to be brought down by the whims of irresponsible musicians". Davis claimed that he owned the name 'Fleetwood Mac' and the right to choose the band members, and he recruited members of the band Legs, which had recently issued one single under Davis's management, to tour the US in early 1974 under the name 'The New Fleetwood Mac' and perform the rescheduled dates. 
This band, which former guitarist Dave Walker said was "very good", consisted of Elmer Gantry (Dave Terry, formerly of Velvet Opera: vocals, guitar), Kirby Gregory (formerly of Curved Air: guitar), Paul Martinez (formerly of the Downliners Sect: bass), John Wilkinson (also known as Dave Wilkinson: keyboards) and Australian drummer Craig Collinge (formerly of Manfred Mann Ch III, the Librettos, Procession and Third World War). The members of this group were told that Fleetwood would join them after the tour had started, to validate the use of the name, and claimed that he had been involved in planning it. Davis and others stated that Fleetwood had committed himself to the project and had given instructions to hire musicians and rehearse the band. Davis said Collinge had been hired only as a temporary stand-in drummer for rehearsals and the first two gigs, and that Fleetwood had agreed to appear on the rest of the tour but had then backed out after the tour started. Fleetwood said later that he had not promised to appear on the tour. The 'New Fleetwood Mac' tour began on 16 January 1974 at the Syria Mosque in Pittsburgh, Pennsylvania, and was initially successful. One of the band members said the first concert "went down a storm". The promoter was dubious at first, but said later that the crowd had loved the band and they were "actually really good". More successful gigs followed, but then word got around that this was not the real Fleetwood Mac and audiences became hostile. The band was turned away from several gigs and the next half-dozen were pulled by promoters. The band struggled on, playing further dates in the face of increasing hostility and heckling; more dates were pulled, the keyboard player quit, and after a concert in Edmonton where bottles were thrown at the stage, the tour collapsed. The band dissolved and the remainder of the tour was cancelled.
The lawsuit that followed over the rights to the name 'Fleetwood Mac' put the original Fleetwood Mac on hiatus for almost a year. Although the band was named after Mick Fleetwood and John McVie, they had apparently signed contracts in which they had forfeited the rights to the name. Their record company, Warner Bros. Records, when appealed to, said they did not know who owned it. The dispute was eventually settled out of court, four years later, in what was described as "a reasonable settlement not unfair to either party". In later years Fleetwood said that, in the end, he was grateful to Davis because the lawsuit was the reason the band moved to California. Nobody from the alternative line-up was ever made a part of the real Fleetwood Mac, although some of them later played in Danny Kirwan's studio band. Gantry and Gregory went on to become members of Stretch, whose 1975 UK hit single "Why Did You Do It" was written about the touring debacle. Gantry later collaborated with the Alan Parsons Project. Martinez went on to play with the Deep Purple offshoot Paice Ashton Lord, as well as Robert Plant's backing band.

1974: Return of the authentic Fleetwood Mac

While the other band had been on tour, Welch stayed in Los Angeles and connected with entertainment attorneys. He realised that the original Fleetwood Mac was being neglected by Warner Bros and that they would need to change their base of operation from England to America, to which the rest of the band agreed. Rock promoter Bill Graham wrote a letter to Warner Bros to convince them that the real Fleetwood Mac was, in fact, Fleetwood, Welch, and the McVies. This did not end the legal battle, but the band was able to record as Fleetwood Mac again. Instead of hiring another manager, the re-formed Fleetwood Mac became the only major rock band managed by the artists themselves. In September 1974, Fleetwood Mac signed a new recording contract with Warner Bros, but remained on the Reprise label.
In the same month the band released their ninth studio album, Heroes Are Hard to Find. This was the first time Fleetwood Mac had featured only one guitarist. While on tour they added a second keyboardist, Doug Graves, who had been an engineer on Heroes Are Hard to Find. In late 1974 Graves was preparing to become a permanent member of the band by the end of their US tour, but he ultimately did not join full-time, a decision Christine McVie explained in a 1980 interview. Robert ("Bobby") Hunt, who had been in the band Head West with Bob Welch back in 1970, replaced Graves. Neither musician proved to be a long-term addition to the line-up. Welch left soon after the tour ended (on 5 December 1974 at Cal State University), having grown tired of touring and legal struggles. Nevertheless, the tour had enabled the Heroes album to reach a higher position on the American charts than any of the band's previous records.

1975–1987: Addition of Buckingham and Nicks, and global success

After Welch decided to leave the band, Fleetwood began searching for a replacement. While he was visiting Sound City Studios in Los Angeles, the house engineer, Keith Olsen, played him a track he had recorded, "Frozen Love", from the album Buckingham Nicks (1973). Fleetwood liked it and was introduced to the guitarist from the band, Lindsey Buckingham, who was at Sound City that day recording demos. Fleetwood asked him to join Fleetwood Mac and Buckingham agreed, on the condition that his music partner and girlfriend, Stevie Nicks, be included. Buckingham and Nicks joined the band on New Year's Eve 1974, within four weeks of the previous incarnation splitting. In 1975, the new line-up released another self-titled album, their tenth studio album. The album was a breakthrough for the band and became a huge hit, reaching No. 1 in the US and selling over 7 million copies.
Among the hit singles from this album were Christine McVie's "Over My Head" and "Say You Love Me" and Stevie Nicks's "Rhiannon", as well as the much-played album track "Landslide", a live rendition of which became a hit two decades later on The Dance album. In 1976, the band was suffering from severe stress. With success came the end of John and Christine McVie's marriage, as well as Buckingham and Nicks's long-term romantic relationship. Fleetwood, meanwhile, was in the midst of divorce proceedings from his wife, Jenny. The pressure on Fleetwood Mac to release a successful follow-up album, combined with their new-found wealth, led to creative and personal tensions which were allegedly fuelled by high consumption of drugs and alcohol. The band's eleventh studio album, Rumours (the band's first release on the main Warner label after Reprise was retired and all of its acts were reassigned to the parent label), was released in the spring of 1977. On this album, the band members laid bare the emotional turmoil they were experiencing at the time. Rumours was critically acclaimed and won the Grammy Award for Album of the Year in 1978. The album generated four Top Ten singles: Buckingham's "Go Your Own Way", Nicks's US No. 1 "Dreams" and Christine McVie's "Don't Stop" and "You Make Loving Fun". Buckingham's "Second Hand News", Nicks's "Gold Dust Woman" and "The Chain" (the only song written by all five band members) also received significant radio airplay. By 2003 Rumours had sold over 19 million copies in the US alone (certified as a diamond album by the RIAA) and a total of 40 million copies worldwide, making it the eighth best-selling album of all time. Fleetwood Mac supported the album with a lucrative tour. On 10 October 1979, Fleetwood Mac were honoured with a star on the Hollywood Walk of Fame, at 6608 Hollywood Boulevard, for their contributions to the music industry.
Buckingham convinced Fleetwood to let his work on their next album be more experimental, and to be allowed to work on tracks at home before bringing them to the rest of the band in the studio. The result of this, the band's twelfth studio album Tusk, was a 20-track double album released in 1979. It produced three hit singles: Buckingham's "Tusk" (US No. 8), which featured the USC Trojan Marching Band, Christine McVie's "Think About Me" (US No. 20), and Nicks's six-and-a-half-minute opus "Sara" (US No. 7). "Sara" was cut to four-and-a-half minutes for both the hit single and the first CD release of the album, but the unedited version has since been restored on the 1988 greatest hits compilation, the 2004 reissue of Tusk and Fleetwood Mac's 2002 release of The Very Best of Fleetwood Mac. Original guitarist Peter Green also took part in the sessions for Tusk, although his playing, on the Christine McVie track "Brown Eyes", is not credited on the album. In an interview in 2019 Fleetwood described Tusk as his "personal favourite" and said, "Kudos to Lindsey ... for us not doing a replica of Rumours." Tusk sold four million copies worldwide. Fleetwood blamed the album's relative lack of commercial success on the RKO radio chain having played the album in its entirety prior to release, thereby allowing mass home taping. The band embarked on an 11-month tour to support and promote Tusk. They travelled around the world, including the US, Australia, New Zealand, Japan, France, Belgium, Germany, the Netherlands, and the United Kingdom. In Germany, they shared the bill with reggae superstar Bob Marley. On this world tour, the band recorded music for their first live album, which was released at the end of 1980. The band's thirteenth studio album, Mirage, was released in 1982. Following 1981 solo albums by Nicks (Bella Donna), Fleetwood (The Visitor), and Buckingham (Law and Order), there was a return to a more conventional approach.
Buckingham had been chided by critics, fellow band members and music business managers for the lesser commercial success of Tusk. Recorded at Château d'Hérouville in France and produced by Richard Dashut, Mirage was an attempt to recapture the huge success of Rumours. Its hits included Christine McVie's "Hold Me" and "Love in Store" (co-written by Robbie Patton and Jim Recor, respectively), Nicks's "Gypsy", and Buckingham's "Oh Diane", which made the Top 10 in the UK. Buckingham's "Eyes of the World" and "Can't Go Back" were minor hits. In contrast to the Tusk Tour, the band embarked on only a short tour of 18 American cities, the Los Angeles show being recorded and released on video. They also headlined the first US Festival, on 5 September 1982, for which the band was paid $500,000. Mirage was certified double platinum in the US. Following Mirage the band went on hiatus, which allowed members to pursue solo careers. Nicks released two more solo albums (1983's The Wild Heart and 1985's Rock a Little). Buckingham issued Go Insane in 1984, the same year that Christine McVie made an eponymous album (yielding the Top 10 hit "Got a Hold on Me" and the Top 40 hit "Love Will Show Us How"). All three met with success, with Nicks's being the most popular. During this period Fleetwood had filed for bankruptcy, Nicks was admitted to the Betty Ford Clinic for addiction problems and John McVie suffered an addiction-related seizure, all of which were attributed to the lifestyle of excess afforded to them by their worldwide success. It was rumoured that Fleetwood Mac had disbanded, but Buckingham commented that he was unhappy to allow Mirage to remain as the band's last effort. The Rumours line-up of Fleetwood Mac recorded one more album, their fourteenth studio album, Tango in the Night, in 1987. As with various other Fleetwood Mac albums, the material started off as a Buckingham solo album before becoming a group project.
The album went on to become their best-selling release since Rumours, especially in the UK, where it hit No. 1 three times over the following year. The album sold three million copies in the US and contained four hits: Christine McVie's "Little Lies" and "Everywhere" ("Little Lies" being co-written with McVie's new husband Eddy Quintela), Sandy Stewart and Nicks's "Seven Wonders", and Buckingham's "Big Love". "Family Man" (Buckingham and Richard Dashut) and "Isn't It Midnight" (Christine McVie) were also released as singles, with less success.

1987–1995: Departure of Buckingham and Nicks

With a ten-week tour scheduled, Buckingham held back at the last minute, saying he felt his creativity was being stifled. A group meeting at Christine McVie's house on 7 August 1987 resulted in turmoil, with tensions coming to a head. Fleetwood said in his autobiography that there was a physical altercation between Buckingham and Nicks. Buckingham left the band the following day. After Buckingham's departure Fleetwood Mac added two new guitarists, Billy Burnette and Rick Vito, again without auditions. Burnette was the son of Dorsey Burnette and nephew of Johnny Burnette, both of The Rock and Roll Trio. He had already worked with Fleetwood in Zoo and with Christine McVie as part of her solo band, had done some session work with Nicks, and had backed Buckingham on Saturday Night Live. Fleetwood and Christine McVie had played on his Try Me album in 1985. Vito, a Peter Green admirer, had played with many artists, from Bonnie Raitt to John Mayall to Roger McGuinn in Thunderbyrd, and had worked with John McVie on two Mayall albums. The 1987–88 "Shake the Cage" tour was the first outing for this line-up. It was successful enough to warrant the release of a concert video, entitled Tango in the Night, which was filmed at San Francisco's Cow Palace arena in December 1987. Capitalising on the success of Tango in the Night, the band released a Greatest Hits album in 1988.
It featured singles from the 1975–1988 era and included two new compositions: "No Questions Asked", written by Nicks, and "As Long as You Follow", written by Christine McVie and Quintela. "As Long as You Follow" was released as a single in 1988 but only made No. 43 in the US and No. 66 in the UK, although it reached No. 1 on the US Adult Contemporary chart. The Greatest Hits album, which peaked at No. 3 in the UK and No. 14 in the US (though it has since sold over 8 million copies there), was dedicated by the band to Buckingham, with whom they were now reconciled. In 1990, Fleetwood Mac released their fifteenth studio album, Behind the Mask. With this album the band veered away from the stylised sound that Buckingham had evolved during his tenure in the band (which was also evident in his solo work) and developed a more adult contemporary style with producer Greg Ladanyi. The album yielded only one Top 40 hit, Christine McVie's "Save Me". Behind the Mask only achieved Gold album status in the US, peaking at No. 18 on the Billboard album chart, though it entered the UK Albums Chart at No. 1. It received mixed reviews and was seen by some music critics as a low point for the band in the absence of Buckingham (who had actually made a guest appearance playing on the title track). But Rolling Stone magazine said that Vito and Burnette were "the best thing to ever happen to Fleetwood Mac". The subsequent "Behind the Mask" tour saw the band play sold-out shows at London's Wembley Stadium. At the final show in Los Angeles, Buckingham joined the band on stage. The two women of the band, McVie and Nicks, had decided that the tour would be their last (McVie's father had died during the tour), although both stated that they would still record with the band. In 1991, however, Nicks and Rick Vito left Fleetwood Mac altogether.
In 1992, Fleetwood arranged a 4-disc box set, spanning highlights from the band's 25-year history, entitled 25 Years – The Chain (an edited 2-disc set was also available). A notable inclusion in the box set was "Silver Springs", a Nicks composition that was recorded during the Rumours sessions but was omitted from the album and used as the B-side of "Go Your Own Way". Nicks had requested use of this track for her 1991 best-of compilation TimeSpace, but Fleetwood had refused as he had planned to include it in this collection as a rarity. The disagreement between Nicks and Fleetwood garnered press coverage and was believed to have been the main reason for Nicks leaving the band in 1991. The box set also included a new Nicks/Rick Vito composition, "Paper Doll", which was released in the US as a single and produced by Buckingham and Richard Dashut. There were also two new Christine McVie compositions, "Heart of Stone" and "Love Shines". "Love Shines" was released as a single in the UK and elsewhere. Buckingham also contributed a new song, "Make Me a Mask". Fleetwood also released a deluxe hardcover companion book to coincide with the release of the box set, titled My 25 Years in Fleetwood Mac. The volume featured notes written by Fleetwood detailing the band's 25-year history and many rare photographs. The Buckingham/Nicks/McVie/McVie/Fleetwood line-up reunited in 1993 at the request of US President Bill Clinton for his first Inaugural Ball. Clinton had made Fleetwood Mac's "Don't Stop" his campaign theme song. His request for it to be performed at the Inauguration Ball was met with enthusiasm by the band, although this line-up had no intention of reuniting again. Inspired by the new interest in the band, Mick Fleetwood, John McVie, and Christine McVie recorded another album as Fleetwood Mac, with Billy Burnette taking lead guitar duties. 
Burnette left in March 1993 to record a country album and pursue an acting career, and Bekka Bramlett, who had worked a year earlier with Fleetwood's Zoo, was recruited to take his place. Solo singer-songwriter, guitarist and Traffic member Dave Mason, who had worked with Bekka's parents Delaney & Bonnie twenty-five years earlier, was subsequently added. In March 1994 Billy Burnette, a good friend of and co-songwriter with Delaney Bramlett, returned to the band with Fleetwood's blessing. The band, minus Christine McVie, toured in 1994, opening for Crosby, Stills & Nash, and in 1995 as part of a package with REO Speedwagon and Pat Benatar. This tour saw the band perform classic Fleetwood Mac songs from their 1967–1974 era. In 1995, at a concert in Tokyo, the band was greeted by former member Jeremy Spencer, who performed a few songs with them. On 10 October 1995, Fleetwood Mac released their sixteenth studio album, Time, which was not a success. Although it spent one week in the UK Top 60, the album made no impact in the US, failing to reach the Billboard Top 200 albums chart, a reversal for a band that had been a mainstay on that chart for most of the previous two decades. Shortly after the album's release, Christine McVie informed the band that it would be her last. Bramlett and Burnette subsequently formed a country music duo, Bekka & Billy.

1995–2007: Re-formation, reunion and Christine McVie's departure

Just weeks after disbanding Fleetwood Mac, Mick Fleetwood started working with Lindsey Buckingham again. John McVie was added to the sessions, and later Christine McVie. Stevie Nicks also enlisted Buckingham to produce a song for a soundtrack. In May 1996 Fleetwood, John McVie, Christine McVie, and Nicks performed together at a private party in Louisville, Kentucky, prior to the Kentucky Derby, with Steve Winwood filling in for Buckingham.
A week later the Twister film soundtrack was released, which featured the Nicks-Buckingham duet "Twisted", with Fleetwood on drums. This eventually led to a full reunion of the Rumours line-up, which officially reformed in March 1997. The regrouped Fleetwood Mac performed a live concert on a soundstage at Warner Bros. Burbank, California, on 22 May 1997. The concert was recorded, and from this performance came the 1997 live album The Dance, which brought the band back to the top of the US album charts for the first time in 10 years. The Dance returned Fleetwood Mac to a superstar status they had not enjoyed since Tango in the Night. The album was certified 5 million units by the RIAA. An arena tour followed the MTV premiere of The Dance and kept the reunited Fleetwood Mac on the road throughout much of 1997, the 20th anniversary of Rumours. With additional musicians Neale Heywood on guitar, Brett Tuggle on keyboards, Lenny Castro on percussion and Sharon Celani (who had toured with the band in the late 1980s) and Mindy Stein on backing vocals, this would be the final appearance of the classic line-up including Christine McVie for 16 years. Neale Heywood and Sharon Celani remain touring members to this day. In 1998 Fleetwood Mac were inducted into the Rock and Roll Hall of Fame. Members inducted included the original band, Mick Fleetwood, John McVie, Peter Green, Jeremy Spencer and Danny Kirwan, and Rumours-era members Christine McVie, Stevie Nicks and Lindsey Buckingham. Bob Welch was not included, despite his key role in keeping the band alive during the early 1970s. The Rumours-era version of the band performed both at the induction ceremony and at the Grammy Awards programme that year. Peter Green attended the induction ceremony but did not perform with his former bandmates, opting instead to perform his composition "Black Magic Woman" with Santana, who were inducted the same night. Neither Jeremy Spencer nor Danny Kirwan attended. 
Fleetwood Mac also received the "Outstanding Contribution to Music" award at the Brit Awards (British Phonographic Industry Awards) the same year. In 1998 Christine McVie left the band. Her departure left Buckingham and Nicks to sing all the lead vocals on the band's seventeenth album, Say You Will, released in 2003, although Christine contributed some backing vocals and keyboards. The album debuted at No. 3 on the Billboard 200 chart (No. 6 in the UK) and yielded chart hits with "Peacekeeper" and the title track, as well as a successful world arena tour that lasted through 2004. The tour grossed $27,711,129 and was ranked No. 21 among the top 25 grossing tours of 2004. Around 2004–05 there were rumours of a reunion of the early line-up of Fleetwood Mac involving Peter Green and Jeremy Spencer, though these two apparently remained unconvinced. In April 2006 bassist John McVie commented on the reunion idea during a question-and-answer session on the Penguin Fleetwood Mac fan website. In interviews given in November 2006 to support his solo album Under the Skin, Buckingham stated that plans for the band to reunite once more for a 2008 tour were still on the cards, though recording plans had been put on hold for the foreseeable future. In an interview Nicks gave to the UK newspaper The Daily Telegraph in September 2007, she stated that she was unwilling to carry on with the band unless Christine McVie returned.

2008–2013: Unleashed tour and Extended Play

In March 2008, it was mooted that Sheryl Crow might work with Fleetwood Mac in 2009. Crow and Stevie Nicks had collaborated in the past, and Crow had stated that Nicks had been a great teacher and inspiration to her. Later, Buckingham said that the potential collaboration with Crow had "lost its momentum" and the idea was abandoned. In March 2009, Fleetwood Mac started their "Unleashed" tour, again without Christine McVie. It was a greatest hits show, although album tracks such as "Storms" and "I Know I'm Not Wrong" were also played.
During their show on 20 June 2009 in New Orleans, Louisiana, Stevie Nicks premiered part of a new song that she had written about Hurricane Katrina. The song was later released as "New Orleans" on Nicks's 2011 album In Your Dreams, with Mick Fleetwood on drums. In October and November 2009 the band toured Europe, followed by Australia and New Zealand in December. In October, The Very Best of Fleetwood Mac was re-released in an extended two-disc format (this format having been released in the US in 2002), entering at number six on the UK Albums Chart. On 1 November 2009 a one-hour documentary, Fleetwood Mac: Don't Stop, was broadcast in the UK on BBC One, featuring recent interviews with all four current band members. During the documentary Nicks gave a candid summary of the current state of her relationship with Buckingham, saying "Maybe when we're 75 and Fleetwood Mac is a distant memory, we might be friends." On 6 November 2009, Fleetwood Mac played the last show of the European leg of their Unleashed tour at London's Wembley Arena. Christine McVie was present in the audience. Nicks paid tribute to her from the stage to a standing ovation, saying that she thought about her former bandmate "every day", and dedicated that night's performance of "Landslide" to her. On 19 December 2009 Fleetwood Mac played the second-to-last show of their Unleashed tour to a sell-out crowd in New Zealand, at what was originally intended to be a one-off event at the TSB Bowl of Brooklands in New Plymouth. Tickets, after pre-sales, sold out within twelve minutes of public release. Another date, Sunday 20 December, was added and also sold out. The tour grossed $84,900,000 and was ranked No. 13 among the highest-grossing worldwide tours of 2009. On 19 October 2010, Fleetwood Mac played a private show at the Phoenician Hotel in Scottsdale, Arizona, for TPG (Texas Pacific Group).
On 3 May 2011, the Fox Network broadcast an episode of Glee entitled "Rumours" that featured six songs from the band's 1977 album. The show sparked renewed interest in the band and its most commercially successful album, and Rumours re-entered the Billboard 200 chart at No. 11 in the same week that Nicks's new solo album In Your Dreams debuted at No. 6. (She was quoted by Billboard as saying that her new album was "my own little Rumours.") The two recordings sold about 30,000 and 52,000 units respectively. Music downloads accounted for 91 per cent of the Rumours sales, and the spike represented a 1,951% increase in sales for the album. It was the highest chart entry by a previously issued album since the Rolling Stones' reissue of Exile on Main St. re-entered the chart at No. 2 on 5 June 2010. In an interview in July 2012 Nicks confirmed that the band would reunite for a tour in 2013. Original Fleetwood Mac bassist Bob Brunning died on 18 October 2011 at the age of 68. Former guitarist and singer Bob Weston was found dead on 3 January 2012 at the age of 64. Former singer and guitarist Bob Welch was found dead from a self-inflicted gunshot wound on 7 June 2012 at the age of 66. Don Aaron, a spokesman at the scene, stated, "He died from an apparent self-inflicted gunshot wound to the chest." A suicide note was found. Welch had been struggling with health issues and dealing with depression; his wife discovered his body. The band's 2013 tour, which took place in 34 cities, started on 4 April in Columbus, Ohio. The band performed two new songs ("Sad Angel" and "Without You"), which Buckingham described as some of the most "Fleetwood Mac-ey" sounding songs since Mirage. "Without You" was a re-recording of a song from the Buckingham Nicks era. The band released their first new studio material in ten years, Extended Play, on 30 April 2013. The EP debuted and peaked at No. 48 in the US and produced one single, "Sad Angel".
On 25 and 27 September 2013, the second and third nights of the band's London O2 shows, Christine McVie joined them on stage for "Don't Stop". On 27 October 2013, the band cancelled their New Zealand and Australian performances after John McVie had been diagnosed with cancer, so that he could undergo treatment. They said: "We are sorry not to be able to play these Australian and New Zealand dates. We hope our Australian and New Zealand fans as well as Fleetwood Mac fans everywhere will join us in wishing John and his family all the best." Also in October 2013, Stevie Nicks appeared in American Horror Story: Coven, with Fleetwood Mac's song "Seven Wonders" playing in the background. In November 2013, Christine McVie expressed interest in a return to Fleetwood Mac, and also affirmed that John McVie's prognosis was "really good".

2014–present: Return of McVie and departure of Buckingham

On 11 January 2014, Mick Fleetwood confirmed that Christine McVie would be rejoining Fleetwood Mac. On with the Show, a 33-city North American tour, opened in Minneapolis, Minnesota, on 30 September 2014. A series of May–June 2015 arena dates in the United Kingdom went on sale on 14 November, selling out in minutes. Due to high demand, additional dates were added to the tour, including an Australian leg. In January 2015, Buckingham suggested that the new album and tour might be Fleetwood Mac's last, and that the band would cease operations in 2015 or soon afterwards. He concluded: "We're going to continue working on the new album and the solo stuff will take a back seat for a year or two. A beautiful way to wrap up this last act." But Mick Fleetwood stated that the new album might take a few years to complete and that they were waiting for contributions from Nicks, who had been ambivalent about committing to a new record. In August 2016, Fleetwood revealed that while the band had "a huge amount of recorded music", virtually none of it featured Nicks.
Buckingham and Christine McVie, however, had contributed multiple songs to the new project. Fleetwood told Ultimate Classic Rock: "She [McVie] ... wrote up a storm ... She and Lindsey could probably have a mighty strong duet album if they want. In truth, I hope it will come to more than that. There really are dozens of songs. And they’re really good. So we’ll see." Nicks explained her reluctance to record another album with Fleetwood Mac. "Is it possible that Fleetwood Mac might do another record? I can never tell you yes or no, because I don't know. I honestly don't know... It's like, do you want to take a chance of going in and setting up in a room for like a year [to record an album] and having a bunch of arguing people? And then not wanting to go on tour because you just spent a year arguing?". She also emphasised that people do not buy as many records as they used to. On 9 June 2017, Buckingham and Christine McVie released a new album, titled Lindsey Buckingham/Christine McVie, which included contributions from Mick Fleetwood and John McVie. The album was preceded by the single "In My World". A 38-date tour began on 21 June and concluded 16 November. Fleetwood Mac also planned to embark on another tour in 2018. The band headlined the second night of the Classic West concert (on 16 July 2017 at Dodger Stadium in Los Angeles) and the second night of the Classic East concert (at New York City's Citi Field on 30 July 2017). The band received the MusiCares Person of the Year award in 2018 and reunited to perform several songs at the Grammy-hosted gala honouring them. Artists including Lorde, Harry Styles, Little Big Town and Miley Cyrus also performed. In April 2018, the song "Dreams" re-entered the Hot Rock Songs chart at No. 16 after a viral meme had featured the song. This chart re-entry came 40 years after the song had topped the Hot 100. 
The song's streaming totals also translated into 7,000 "equivalent album units", a jump of 12 per cent, which helped Rumours to go from No. 21 to No. 13 on the Top Rock Albums chart. That month Buckingham departed from the group a second time, having reportedly been dismissed. The reason was said to have been a disagreement about the nature of the tour, and in particular the question of whether newer or less well-known material would be included, as Buckingham wanted. Mick Fleetwood and the band appeared on CBS This Morning on 25 April 2018 and said that Buckingham would not sign off on a tour that the group had been planning for a year and a half and they had reached a "huge impasse" and "hit a brick wall". When asked if Buckingham had been fired, he said, "Well, we don't use that word because I think it's ugly." He also said that "Lindsey has huge amounts of respect and kudos to what he's done within the ranks of Fleetwood Mac and always will." In October 2018, Buckingham filed a lawsuit against Fleetwood Mac for breach of fiduciary duty, breach of oral contract and intentional interference with prospective economic advantage, among other charges. He stated that they eventually came to a settlement, which he would not share the terms of, but claimed he was "happy enough with it". Buckingham also told his version of what had led to his departure from the band. Two days after their performance at the MusiCares event he got a phone call from the band's manager Irving Azoff, who had a list of things that, as Buckingham puts it, “Stevie took issue with” that evening, including the guitarist’s outburst just before the band’s set over the intro music [for their acceptance speech being] the studio recording of Nicks’ “Rhiannon” — and the way he “smirked” during Nicks’ thank-you speech. Buckingham concedes the first point. “It wasn’t about it being ‘Rhiannon,’ ” he says. “It just undermined the impact of our entrance. 
That’s me being very specific about the right and wrong way to do something.” As for smirking, “The irony is that we have this standing joke that Stevie, when she talks, goes on a long time,” Buckingham says. “I may or may not have smirked. But I look over and Christine and Mick are doing the waltz behind her as a joke.” At the end of that call, Buckingham assumed Nicks was quitting Fleetwood Mac. He wrote an e-mail to Fleetwood assuring the drummer that the group could continue. There was no reply. A couple of days later, Buckingham says, “I called Irving and said, ‘This feels funny. Is Stevie leaving the band, or am I getting kicked out?’ ” Azoff told the guitarist he was “getting ousted” and that Nicks gave the rest of the band “an ultimatum: Either you go or she’s gonna go.” Former Tom Petty and the Heartbreakers guitarist Mike Campbell and Neil Finn of Crowded House were named to replace Buckingham. On CBS This Morning, Fleetwood said that Fleetwood Mac had been reborn and that "This is the new lineup of Fleetwood Mac." Aside from touring, the band plans to record new music with Campbell and Finn in the future. The band's "An Evening with Fleetwood Mac" tour started in October 2018. The band launched the tour at the iHeartRadio Music Festival on 21 September 2018 at the T-Mobile Arena in Las Vegas, NV. On 8 June 2018, former Fleetwood Mac guitarist Danny Kirwan died aged 68, having contracted pneumonia earlier in the year. The British music magazine Mojo quoted Christine McVie as saying: "Danny Kirwan was the white English blues guy. Nobody else could play like him. He was a one-off ... Danny and Peter [Green] gelled so well together. Danny had a very precise, piercing vibrato – a unique sound ... He was a perfectionist; a fantastic musician and a fantastic writer." Kirwan's song "Tell Me All the Things You Do" from Kiln House was included in the set of the 2018–19 An Evening with Fleetwood Mac tour. 
On 28 May 2020, Neil Finn, featuring Nicks and McVie, with Campbell on guitar, released the song “Find Your Way Back Home” in aid of the Auckland City Mission, a homeless shelter in Auckland, New Zealand. Founding member Peter Green died on 25 July 2020 at the age of 73. In October 2020, Rumours again entered the Billboard top 10. The album received 30.6 million streams on streaming platforms the week of 15 October.

Tours
Kiln House Tour – 1970
Future Games Tour – 1971
British Are Coming Tour – 1972
Bare Trees Tour – 1972
Penguin Tour – early 1973
Mystery to Me Tour – mid-1973
Heroes Are Hard to Find Tour – 1974
Fleetwood Mac Tour – 1975
Rumours Tour – 1977
Tusk Tour – 1979–1980
Mirage Tour – 1982
Shake the Cage Tour – 1987–1988
Behind the Mask Tour – 1990
Another Link in the Chain Tour – 1994–1995
The Dance – 1997
Say You Will Tour – 2003–2004
Unleashed Tour – 2009
Fleetwood Mac Live – 2013
On with the Show – 2014–2015
An Evening with Fleetwood Mac – 2018–2019

Band members
Mick Fleetwood – drums, percussion (1967–1995, 1997–present)
John McVie – bass (1967–1995, 1997–present)
Christine McVie – vocals, keyboards (1970–1995, 1997–1998, 2014–present)
Stevie Nicks – vocals (1975–1991, 1997–present)
Mike Campbell – lead guitar (2018–present)
Neil Finn – vocals, rhythm guitar (2018–present)

Discography

Studio albums
Fleetwood Mac (1968) [also known as Peter Green's Fleetwood Mac]
Mr. Wonderful (1968)
Then Play On (1969)
Kiln House (1970)
Future Games (1971)
Bare Trees (1972)
Penguin (1973)
Mystery to Me (1973)
Heroes Are Hard to Find (1974)
Fleetwood Mac (1975) [also known as "The White Album"]
Rumours (1977)
Tusk (1979)
Mirage (1982)
Tango in the Night (1987)
Behind the Mask (1990)
Time (1995)
Say You Will (2003)

Extended plays
Extended Play (2013)

Awards and nominations
Fleetwood Mac have received numerous awards and nominations over their career.

Further reading
Silver, Murray. When Elvis Meets the Dalai Lama (Bonaventure Books, Savannah, 2005), in which the author recounts his days as a concert promoter in Atlanta, Georgia, and having brought Fleetwood Mac to town for the first time in December 1969.
Erlewine, Stephen Thomas. Allmusic.
The Rolling Stone Encyclopedia of Rock & Roll (Simon & Schuster, 2001).
https://en.wikipedia.org/wiki/Frederick%20I%2C%20Margrave%20of%20Brandenburg-Ansbach
Frederick I, Margrave of Brandenburg-Ansbach
Frederick I of Ansbach and Bayreuth (also known as Frederick V; 8 May 1460 – 4 April 1536) was born at Ansbach as the eldest son of Albert III, Margrave of Brandenburg, by his second wife Anna, daughter of Frederick II, Elector of Saxony. His elder half-brother was the Elector John Cicero of Brandenburg. Frederick succeeded his father as Margrave of Ansbach in 1486 and his younger brother Siegmund as Margrave of Bayreuth in 1495.

Life
After depleting the finances of the margraviate with his lavish lifestyle, Frederick I was deposed by his two elder sons, Casimir and George, in 1515. His eldest son Casimir then confined him at Plassenburg Castle, in a tower room from which he could not escape, for twelve years. Casimir took up the rule of the Margraviate of Bayreuth (Kulmbach), and George took up the rule of the Margraviate of Ansbach. Frederick's overthrow outraged his other, younger sons, however, and led to far-reaching political countermeasures. When Elector Joachim I of Brandenburg stopped at Kulmbach on his journey to Augsburg, intending to plead for Frederick's release, he was denied entry to Plassenburg Castle. The dispute was finally settled by an agreement reached in 1522, which met the demands of Frederick's younger sons.

Family and children
On 14 February 1479, at Frankfurt (Oder), Frederick I married Princess Sophia of Poland (6 April 1464 – 5 October 1512), daughter of King Casimir IV of Poland by his wife Elisabeth of Austria, and sister of King Sigismund I of Poland. They had seventeen children:
Casimir, Margrave of Brandenburg-Kulmbach (27 September 1481, Ansbach – 21 September 1527, Buda).
Elisabeth, died young.
Margarete of Brandenburg-Ansbach-Kulmbach (10 January 1483, Ansbach – 10 July 1532).
George, Margrave of Brandenburg-Ansbach (4 March 1484, Ansbach – 27 December 1543, Ansbach).
Sophie of Brandenburg-Ansbach-Kulmbach (10 March 1485, Ansbach – 24 May 1537, Liegnitz), married on 14 November 1518 to Duke Frederick II of Legnica.
Anna of Brandenburg-Ansbach-Kulmbach (5 May 1487, Ansbach – 7 February 1539), married on 1 December 1518 to Duke Wenceslaus II of Cieszyn.
Barbara, died young.
Albert, 1st Duke of Prussia (17 May 1490, Ansbach – 20 March 1568, Castle Tapiau), Grand Master of the Teutonic Order from 1511 to 1525, and first Duke of Prussia from 1525.
Frederick of Brandenburg-Ansbach-Kulmbach (13 June 1491, Ansbach – ca. 1497).
Johann, Viceroy of Valencia (9 January 1493, Plassenburg – 5 July 1525, Valencia).
Elisabeth of Brandenburg-Ansbach-Kulmbach (25 March 1494, Ansbach – 31 May 1518, Pforzheim), married in Pforzheim on 29 September 1510 to Margrave Ernest of Baden-Durlach.
Barbara of Brandenburg-Ansbach-Kulmbach (24 September 1495, Ansbach – 23 September 1552), married in Plassenburg on 26 July 1528 to Landgrave George III of Leuchtenberg.
Frederick of Brandenburg-Ansbach-Kulmbach (17 January 1497, Ansbach – 20 August 1536, Genoa), a canon in Würzburg and Salzburg.
Wilhelm, Archbishop of Riga (30 June 1498, Ansbach – 4 February 1563, Riga).
John Albert, Archbishop of Magdeburg (20 September 1499, Ansbach – 17 May 1550, Halle).
Frederick Albert, died young.
Gumprecht of Brandenburg-Ansbach-Kulmbach (16 July 1503, Ansbach – 25 June 1528, Naples), a canon in Bamberg.
https://en.wikipedia.org/wiki/F-Zero%3A%20Maximum%20Velocity
F-Zero: Maximum Velocity
F-Zero: Maximum Velocity is a futuristic racing video game developed by NDcube and published by Nintendo as a launch title for the Game Boy Advance. The game was released in Japan, North America and Europe in 2001. It is the first F-Zero game to be released on a handheld game console. Maximum Velocity takes place twenty-five years after F-Zero, in another F-Zero Grand Prix. The pilots of past generations had "piloted their way to fame", making this the second F-Zero game, after BS F-Zero Grand Prix 2, to feature none of Captain Falcon, Samurai Goroh, Pico, or Dr. Stewart. Players control fast hovering craft and use their speed-boosting abilities to navigate the courses as quickly as possible.

Gameplay
Every race consists of five laps around a race track. Players lose the race if their machine explodes from taking too much damage, if they land outside of the track, are ejected from the race for falling to 20th place, complete a lap with a rank outside that lap's rank limit, or forfeit. In the single player Grand Prix mode, all of these conditions require the player to possess and use an extra machine to try again. For each lap completed the player is rewarded with a speed boost, to be used once at any time; one of the "SSS" marks will be shaded green to indicate that it can be used. A boost dramatically increases a player's speed but decreases their ability to turn. A boost used before a jump will make the player jump farther, which could allow the player to use a shortcut with the right vehicle. Boost time and speed vary according to the machine, and are usually tuned for balance. For example, one machine boasts a boost time of twelve seconds, yet has the slowest boost speed in the entire game. Players can also take advantage of the varying deceleration of each vehicle. Some vehicles, such as the Jet Vermilion, take longer than others to decelerate from top boost speed to normal speed once the boost has been used up.
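The lap-and-boost bookkeeping described above can be illustrated with a toy model. This is only a sketch: the three-slot cap (suggested by the "SSS" display), the class and field names, and all numeric values are my own assumptions, not values taken from the game:

```python
# Toy model of Maximum Velocity's boost rules as described in the text.
# The cap of three stored boosts (the "SSS" marks) is an assumption.

class Machine:
    BOOST_CAP = 3  # assumed size of the "SSS" display

    def __init__(self, boost_duration_s: float, boost_speed: float):
        self.boost_duration_s = boost_duration_s  # varies per machine
        self.boost_speed = boost_speed            # varies per machine
        self.stored_boosts = 0                    # earned, unspent boosts

    def complete_lap(self) -> None:
        """Completing a lap earns one boost, up to the display cap."""
        self.stored_boosts = min(self.stored_boosts + 1, self.BOOST_CAP)

    def use_boost(self) -> bool:
        """Spend a stored boost; returns False if none are available."""
        if self.stored_boosts == 0:
            return False
        self.stored_boosts -= 1
        return True
```

A machine tuned like the twelve-second example in the text would trade a long `boost_duration_s` for a low `boost_speed`, matching the balance trade-off the article describes.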
Players can also take advantage of this effect on boost pads. The Grand Prix is the main single player component of Maximum Velocity. It consists of four series named after chess pieces: "Pawn", "Knight", "Bishop" and "Queen". The last of these is unlocked by winning the other three on "Expert" mode. Each series has five races and four difficulty settings; "Master" mode is unlocked by winning "Expert" mode in each series, and completing it rewards the player with a new machine. The player needs to finish in the top three at the end of the last lap in order to continue to the next race. If the player is unable to continue, the player loses a machine and can try the race again. If the player runs out of machines, the game ends, and the player has to start the series from the beginning. Championship is another single player component. It is essentially the same as a "Time Attack" mode, except the player can only race on one special course: the Synobazz Championship Circuit. This special course is not selectable in any other mode.

Multiplayer
Maximum Velocity can be played in two multiplayer modes using the Game Boy Advance link cable, with either a single cartridge or one cartridge per player. Two to four players can play in both modes. In single cart, only one player needs to have a cartridge. The other players boot off the link cable network from the player with the cart, using the GBA's netboot capability. All players drive a generic craft, and the game can only be played on one level, Silence. Silence and Fire Field are the only areas to return from previous games. Aptly, Silence in Maximum Velocity has no background music, unlike in most other F-Zero games. In multi cart, each player needs to have a cartridge to play. This has many advantages over single cart: All players can use any machine in the game that has been unlocked by another player. Players can select any course in the game.
After the race is finished, all of the players' ranking data are mixed and shared (a "Mixed ranking" stored in each cart).

Development
F-Zero: Maximum Velocity is one of the first titles to have been developed by NDcube. Like the original F-Zero for the SNES, Maximum Velocity implements a pseudo-3D visual technique based on the scaling and rotation effects of bitmap graphics. In this game, the technique consists of a double layer, one of which gives the illusion of depth.

Release
Maximum Velocity is one of ten Game Boy Advance games released on December 16, 2011, to Nintendo 3DS Ambassadors, a program that gave free downloadable games to early adopters who bought a Nintendo 3DS before its price drop. It was also released on the Wii U Virtual Console on April 3, 2014, in Japan, and on April 17 in North America and Europe.

Reception
On release, Famitsu magazine scored the game 31 out of 40. F-Zero: Maximum Velocity went on to sell 334,145 copies in Japan and 273,229 copies in the U.S. as of 2005. The game has total sales of over 1 million copies worldwide, and holds an overall score of 86% on Metacritic and 83.37% on GameRankings.

External links
Official website
https://en.wikipedia.org/wiki/Frederick%20William%20I%20of%20Prussia
Frederick William I of Prussia
Frederick William I (; 14 August 1688 – 31 May 1740), known as the "Soldier King" (), was the king in Prussia and elector of Brandenburg from 1713 until his death in 1740, as well as prince of Neuchâtel. He was succeeded by his son, Frederick the Great. Reign He was born in Berlin to King Frederick I of Prussia and Princess Sophia Charlotte of Hanover. During his first years, he was raised by the Huguenot governess Marthe de Roucoulle. His father had successfully acquired the title of king for the margraves of Brandenburg. On ascending the throne in 1713 (the year before his maternal grandmother’s death and the ascension of his maternal uncle George I of Great Britain to the British throne), the new king sold most of his father's horses, jewels and furniture; he did not intend to treat the treasury as his personal source of revenue the way Frederick I and many of the other German princes had. Throughout his reign, Frederick William was characterized by his frugal, austere and martial lifestyle, as well as his devout Calvinist faith. He practiced rigid management of the treasury, never started a war, and led a simple and austere lifestyle, in contrast to the lavish court his father had presided over. At his death, Prussia had a sound exchequer and a full treasury, in contrast to the other German states. Frederick William I did much to improve Prussia economically and militarily. He replaced mandatory military service among the middle class with an annual tax, and he established schools and hospitals. The king encouraged farming, reclaimed marshes, stored grain in good times and sold it in bad times. He dictated the manual of Regulations for State Officials, containing 35 chapters and 297 paragraphs in which every public servant in Prussia could find his duties precisely set out: a minister or councillor failing to attend a committee meeting, for example, would lose six months' pay; if he absented himself a second time, he would be discharged from the royal service. 
In short, Frederick William I concerned himself with every aspect of his relatively small country, ruling an absolute monarchy with great energy and skill. In 1732, the king invited the Salzburg Protestants to settle in East Prussia, which had been depopulated by plague in 1709. Under the terms of the Peace of Augsburg, the prince-archbishop of Salzburg could require his subjects to practice the Catholic faith, but Protestants had the right to emigrate to a Protestant state. Prussian commissioners accompanied 20,000 Protestants to their new homes on the other side of Germany. Frederick William I personally welcomed the first group of migrants and sang Protestant hymns with them. Frederick William intervened briefly in the Great Northern War, allied with Peter the Great of Russia, in order to gain a small portion of Swedish Pomerania; this gave Prussia new ports on the Baltic Sea coast. More significantly, aided by his close friend Prince Leopold of Anhalt-Dessau, the "Soldier King" made considerable reforms to the Prussian army's training, tactics and conscription program, introducing the canton system and greatly increasing the Prussian infantry's rate of fire through the introduction of the iron ramrod. Frederick William's reforms left his son Frederick with the most formidable army in Europe, which Frederick used to increase Prussia's power. The observation that "the pen is mightier than the sword" has sometimes been attributed to him. (See also: Prussian virtues.) Although a highly effective ruler, Frederick William had a perpetually short temper, which sometimes drove him to physically attack servants (or even his own children) with a cane at the slightest perceived provocation. His violent, harsh nature was further exacerbated by his inherited porphyria, which gave him gout, obesity and frequent crippling stomach pains.
He also had a notable contempt for France and would sometimes fly into a rage at the mere mention of that country, although this did not stop him from encouraging the immigration of French Huguenot refugees to Prussia.

Burial and reburials
Frederick William died in 1740 at age 51 and was interred at the Garrison Church in Potsdam. During World War II, in order to protect them from advancing Allied forces, Hitler ordered the king's coffin, as well as those of Frederick the Great and Paul von Hindenburg, into hiding, first in Berlin and later in a salt mine outside Bernterode. The coffins were later discovered by occupying American forces, who re-interred the bodies in St. Elisabeth's Church in Marburg in 1946. In 1953 the coffin was moved to Burg Hohenzollern, where it remained until 1991, when it was finally laid to rest on the steps of the altar in the Kaiser Friedrich Mausoleum in the Church of Peace on the palace grounds of Sanssouci. The original black marble sarcophagus collapsed at Burg Hohenzollern; the current one is a copper copy.

Relationship with Frederick II
His eldest surviving son was Frederick II (Fritz), born in 1712. Frederick William wanted him to become a fine soldier. As a small child, Fritz was awakened each morning by the firing of a cannon. At the age of six, he was given his own regiment of children to drill as cadets, and a year later he was given a miniature arsenal. The love and affection Frederick William initially had for his heir was soon destroyed by their increasingly different personalities. Frederick William ordered Fritz to undergo a minimal education, live a simple Protestant lifestyle, and focus on the army and statesmanship as he had. However, the intellectual Fritz was more interested in music, books and French culture, which his father forbade as decadent and unmanly.
As Fritz's defiance of his father's rules increased, Frederick William would frequently beat or humiliate Fritz (he preferred Fritz's younger brother Augustus William). Fritz was beaten for being thrown off a bolting horse and for wearing gloves in cold weather. After the prince attempted to flee to England with his friend Hans Hermann von Katte, the enraged king had Katte beheaded before the eyes of the prince, who himself was court-martialled. The court declared itself not competent in this case. Whether it was the king's intention to have his son executed as well (as Voltaire claims) is not clear. However, the Holy Roman Emperor Charles VI intervened, claiming that a prince could only be tried by the Imperial Diet of the Holy Roman Empire itself. Frederick was imprisoned in the Fortress of Küstrin from 2 September to 19 November 1731 and exiled from court until February 1732, during which time he was rigorously schooled in matters of state. After achieving a measure of reconciliation, Frederick William had his son married to Princess Elizabeth of Brunswick-Wolfenbüttel, whom Frederick despised, but then grudgingly allowed him to indulge in his musical and literary interests again. He also gave him a stud farm in East Prussia, and Rheinsberg Palace. By the time of Frederick William's death in 1740, he and Frederick were on at least reasonable terms with each other. Although the relationship between Frederick William and Frederick was clearly hostile, Frederick himself later wrote that his father "penetrated and understood great objectives, and knew the best interests of his country better than any minister or general."

Marriage and family
Frederick William married his first cousin Sophia Dorothea of Hanover, George II's younger sister (daughter of his uncle, King George I of Great Britain, and of Sophia Dorothea of Celle), on 28 November 1706.
Frederick William was faithful and loving to his wife, but they did not have a happy relationship: Sophia Dorothea feared his unpredictable temper and resented him, both for allowing her no influence or independence at court and for refusing to marry her children to their English cousins. She also abhorred his cruelty towards their son and heir Frederick (with whom she was close), although rather than trying to mend the relationship between father and son she frequently spurred Frederick on in his defiance. They had fourteen children. Frederick William was the godfather of the Prussian envoy Friedrich Wilhelm von Thulemeyer and of his grand-nephew, Prince Edward Augustus of Great Britain.

See also
Prussian virtues

Further reading
Dorwart, Reinhold A. The Administrative Reforms of Frederick William I of Prussia (Harvard University Press, 2013).
Fann, Willerd R. "Peacetime Attrition in the Army of Frederick William I, 1713–1740." Central European History 11.4 (1978): 323–334.
Gothelf, Rodney. "Frederick William I and the beginnings of Prussian absolutism, 1713–1740." In The Rise of Prussia 1700–1830 (Routledge, 2014), pp. 47–67.

External links
King Frederick William I of Prussia and his “obsession”
https://en.wikipedia.org/wiki/Felsic
Felsic
In geology, felsic is an adjective describing igneous rocks that are relatively rich in the elements that form feldspar and quartz. It is contrasted with mafic rocks, which are relatively richer in magnesium and iron. Felsic refers to silicate minerals, magma, and rocks which are enriched in the lighter elements, such as silicon, oxygen, aluminium, sodium, and potassium. Felsic magma or lava is higher in viscosity than mafic magma or lava. Felsic rocks are usually light in color and have specific gravities less than 3. The most common felsic rock is granite. Common felsic minerals include quartz, muscovite, orthoclase, and the sodium-rich (albite-rich) plagioclase feldspars.

Terminology
In modern usage, the term acid rock, although sometimes used as a synonym, normally now refers specifically to a high-silica-content (greater than 63% SiO2 by weight) volcanic rock, such as rhyolite. The older, broader usage is now considered archaic. That usage, with the contrasting term "basic rock" (MgO, FeO, mafic), was based on a 19th-century concept that "silicic acid" (H4SiO4, or Si(OH)4) was the chief form of silicon occurring in siliceous rocks. Although this intuition makes sense from an acid-base perspective in aquatic chemistry, considering water-rock interactions and silica dissolution, siliceous rocks are not formed from this protonated monomeric species but from a three-dimensional network of SiO4 tetrahedra connected to one another. Once released into water and hydrolyzed, these silica entities can indeed form silicic acid in aqueous solution. The term "felsic" combines the words "feldspar" and "silica". The similarity of the resulting term felsic to the German felsig, "rocky" (from Fels, "rock"), is purely accidental: "feldspar" is itself a borrowing of German Feldspat, so its German link is to Feld, meaning "field".
Classification of felsic rocks
For a rock to be classified as felsic, it generally needs to contain more than 75% felsic minerals, namely quartz, orthoclase and plagioclase. Rocks with greater than 90% felsic minerals can also be called leucocratic, from the Greek words for white and dominance. Felsite is a petrologic field term used to refer to very fine-grained or aphanitic, light-colored volcanic rocks which might be reclassified later after a more detailed microscopic or chemical analysis. In some cases, felsic volcanic rocks may contain phenocrysts of mafic minerals, usually hornblende, pyroxene or a feldspar mineral, and may need to be named after their phenocryst mineral, such as 'hornblende-bearing felsite'. The chemical name of a felsic rock is given according to the TAS classification of Le Maitre (1975); however, this only applies to volcanic rocks. If the rock is analyzed and found to be felsic but is metamorphic and has no definite volcanic protolith, it may be sufficient to simply call it a 'felsic schist'. There are known examples of highly sheared granites which can be mistaken for rhyolites. For phaneritic felsic rocks, the QAPF diagram should be used, and a name given according to the granite nomenclature. Often the species of mafic minerals is included in the name, for instance hornblende-bearing granite, pyroxene tonalite or augite megacrystic monzonite, because the term "granite" already assumes content of feldspar and quartz. The rock texture thus determines the basic name of a felsic rock.

See also
QAPF diagram
List of minerals
List of rock types
Bowen's reaction series
Archean felsic volcanic rocks

References
Le Maitre, L. E., ed. (2002). Igneous Rocks: A Classification and Glossary of Terms, 2nd edition, Cambridge.
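The percentage thresholds described in this section can be expressed as a short sketch. The function names below are hypothetical, and the strict ">" handling of the boundary values is an assumption:

```python
# Illustrative sketch of the thresholds in this article: >75% felsic
# minerals -> "felsic", >90% -> "leucocratic", and the modern narrow
# sense of "acid rock" as a volcanic rock with >63% SiO2 by weight.

def classify_by_felsic_minerals(felsic_pct: float) -> str:
    """Classify a rock by its total modal content of felsic minerals
    (quartz, orthoclase and plagioclase), in percent."""
    if felsic_pct > 90:
        return "leucocratic"
    if felsic_pct > 75:
        return "felsic"
    return "not felsic"

def is_acid_rock(sio2_wt_pct: float) -> bool:
    """'Acid rock' in the modern, narrow sense: a volcanic rock with
    more than 63% SiO2 by weight, such as rhyolite."""
    return sio2_wt_pct > 63
```

A granite-like mode of roughly 80% quartz plus feldspars would classify as "felsic" here, while a rhyolite at 70% SiO2 by weight would also count as an acid rock.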
https://en.wikipedia.org/wiki/Frisians
Frisians
The Frisians are a Germanic ethnic group indigenous to the coastal regions of the Netherlands and northwestern Germany. They inhabit an area known as Frisia and are concentrated in the Dutch provinces of Friesland and Groningen and, in Germany, East Frisia and North Frisia (which was a part of Denmark until 1864). The Frisian languages are spoken by more than 500,000 people; West Frisian is officially recognised in the Netherlands (in Friesland), and North Frisian and Saterland Frisian are recognised as regional languages in Germany. History The ancient Frisii enter recorded history in the Roman account of Drusus's 12 BC war against the Rhine Germans and the Chauci. They occasionally appear in the accounts of Roman wars against the Germanic tribes of the region, up to and including the Revolt of the Batavi around 70 AD. Frisian mercenaries were hired to assist the Roman invasion of Britain in the capacity of cavalry. They are not mentioned again until 296, when they were deported into Roman territory as laeti (i.e., Roman-era serfs; see Binchester Roman Fort and Cuneus Frisionum). The discovery of a type of earthenware unique to fourth century Frisia, called terp Tritzum, shows that an unknown number of them were resettled in Flanders and Kent, probably as laeti under Roman coercion. From the third through the fifth centuries Frisia suffered marine transgressions that made most of the land uninhabitable, aggravated by a change to a cooler and wetter climate. Whatever population may have remained dropped dramatically, and the coastal lands remained largely unpopulated for the next two centuries. When conditions improved, Frisia received an influx of new settlers, mostly Angles and Saxons. These people would eventually be referred to as 'Frisians', though they were not necessarily descended from the ancient Frisii. It is these 'new Frisians' who are largely the ancestors of the medieval and modern Frisians. 
By the end of the sixth century, Frisian territory had expanded westward to the North Sea coast and, in the seventh century, southward down to Dorestad. This farthest extent of Frisian territory is sometimes referred to as Frisia Magna. Early Frisia was ruled by a High King, with the earliest reference to a 'Frisian King' being dated 678. In the early eighth century the Frisian nobles came into increasing conflict with the Franks to their south, resulting in a series of wars in which the Frankish Empire eventually subjugated Frisia in 734. These wars benefited attempts by Anglo-Irish missionaries (which had begun with Saint Boniface) to convert the Frisian populace to Christianity, in which Saint Willibrord largely succeeded. Some time after the death of Charlemagne, the Frisian territories were in theory under the control of the Count of Holland, but in practice the Hollandic counts, starting with Count Arnulf in 993, were unable to assert themselves as the sovereign lords of Frisia. The resulting stalemate resulted in a period of time called the 'Frisian freedom', a period in which feudalism and serfdom (as well as central or judicial administration) did not exist, and in which the Frisian lands only owed their allegiance to the Holy Roman Emperor. During the 13th century, however, the counts of Holland became increasingly powerful and, starting in 1272, sought to reassert themselves as rightful lords of the Frisian lands in a series of wars, which (with a series of lengthy interruptions) ended in 1422 with the Hollandic conquest of Western Frisia and with the establishment of a more powerful noble class in Central and Eastern Frisia. In 1524, Frisia became part of the Seventeen Provinces and in 1568 joined the Dutch revolt against Philip II, king of Spain, heir of the Burgundian territories; Central Frisia has remained a part of the Netherlands ever since. The eastern periphery of Frisia would become part of various German states (later Germany) and Denmark. 
An old tradition of peatland exploitation existed in the region.

Migration to England and Scotland

Though it is impossible to know exact numbers and migration patterns, research has indicated that many Frisians were part of the wave of ethnic groups that colonised areas of present-day England alongside the Angles, Saxons and Jutes, starting from around the fifth century, when Frisians arrived along the coastline of Kent. One study found the DNA of people tested in Central England to be "indistinguishable" from that of Frisians. Frisians principally settled in modern-day Kent, East Anglia, the East Midlands, North East England, and Yorkshire. Across these areas, evidence of their settlement includes place names of Frisian origin, such as Frizinghall in Bradford and Frieston in Lincolnshire. Similarities in dialect between Great Yarmouth and Friesland have been noted, originating from trade between these areas during the Middle Ages. Frisians are also known to have founded the Freston area of Ipswich. In Scotland, historians have noted that colonies of Angles and Frisians settled as far north as the River Forth. This corresponds to those areas of Scotland which historically constituted part of Northumbria.

Language

As both the Anglo-Saxons of England and the early Frisians were formed from similar tribal confederacies, their respective languages were very similar, together forming the Anglo-Frisian family. Old Frisian is the attested language most closely related to Old English, and the modern Frisian dialects are in turn the closest related languages to contemporary English that do not themselves derive from Old English (although modern Frisian and English are not mutually intelligible).
The Frisian language group is divided into three mutually unintelligible languages:

West Frisian, spoken in the Dutch province of Friesland
Saterland Frisian, spoken in the German municipality of Saterland just south of East Frisia
North Frisian, spoken in the German region of North Frisia (within the Kreis of Nordfriesland) on the west coast of Jutland

Of these three languages, both Saterland Frisian (2,000 speakers) and North Frisian (10,000 speakers) are endangered. West Frisian is spoken by around 350,000 native speakers in Friesland, and as many as 470,000 when including speakers in the neighbouring province of Groningen. West Frisian is not listed as threatened, although research published by Radboud University in 2016 has challenged that assumption.

Identity

Today there exists a tripartite division, of North, East and West Frisians, caused by Frisia's continual loss of territory in the Middle Ages. The West Frisians, in general, do not see themselves as part of a larger group of Frisians and, according to a 1970 poll, identify themselves more with the Dutch than with the East or North Frisians. Therefore, the term 'Frisian', when applied to the speakers of all three Frisian languages, is a linguistic, ethnic and/or cultural concept, not a political one.

See also

Anglo-Frisian languages
Frisian Americans
Frisian church in Rome
Frisian Islands
Frisian languages
East Frisian (Saterland Frisian)
North Frisian
West Frisian
Friso-Saxon dialects
East Frisian Low Saxon
Gronings
Stellingwarfs
Ingvaeonic languages
List of Frisians
List of Germanic tribes

Further reading

Greg Woolf, "Cruptorix and his kind. Talking ethnicity on the middle ground", in Ton Derks, Nico Roymans (eds.), Ethnic Constructs in Antiquity: The Role of Power and Tradition (Amsterdam: Amsterdam University Press, 2009) (Amsterdam Archaeological Studies, 13), 207–218.
Jos Bazelmans, "The early-medieval use of ethnic names from classical antiquity. The case of the Frisians", in Ton Derks, Nico Roymans (eds.), Ethnic Constructs in Antiquity: The Role of Power and Tradition (Amsterdam: Amsterdam University Press, 2009) (Amsterdam Archaeological Studies, 13), 321–329.

External links

Fryske Akademy, the Frisian Academy
Lex Frisionum in Latin, Dutch and English
History of the Frisian folk

Geographic history of Denmark
German tribes
Ethnic groups divided by international borders
Germanic ethnic groups
11800
https://en.wikipedia.org/wiki/Futurism%20%28disambiguation%29
Futurism (disambiguation)
Futurism is an artistic and social movement that originated in Italy in the early 20th century. Futures studies, also known as futurology, is the study of possible futures. Futurists are people specializing or interested in such study.

Futurism or futurist may also refer to:

Cultural movements
Futurism (painting), a modern art school of painting and sculpture in the early 1900s
Futurism (literature), a modernist avant-garde movement in literature
Futurist architecture, an architectural movement begun in Italy in 1904
Africanfuturism, an African subculture and literature genre
Afrofuturism, an African-American and African diaspora subculture
Cubo-Futurism, the main school of painting and sculpture practiced by the Russian Futurists
Ego-Futurism, a Russian literary movement of the 1910s
Indigenous Futurism, a movement consisting of art, literature, comics and games
Neo-futurism, a contemporary art and architecture movement
Retrofuturism, a modern art movement
Russian Futurism, a movement of Russian poets and artists

Religion
Futurism (Christianity), an interpretation of the Bible in Christian eschatology
Futurism (Judaism), used in three different contexts: religious, artistic and futures studies

Music
Futurism (music), a movement in music

Albums
Musica Futurista, an album of Futurist music
Futurist (Alec Empire album), 2005
Futurist (Keeno album), 2016
The Futurist (Robert Downey Jr. album), 2004
The Futurist (Shellac album), 1997
Futurism, an album by Danny Tenaglia

Songs
"Futurism", a bonus track from the Muse album Origin of Symmetry
"Futurism", from the Deerhunter album Why Hasn't Everything Already Disappeared?

Other uses
Futurism.com, a science and tech website formerly owned by Singularity University
Futurist (comics), a Marvel Comics character
Retro Futurism, a Korean play
Futurist (magazine), published by the World Future Society
Futurist Theatre, a theatre and cinema in Scarborough, North Yorkshire, England

See also
The Futurist (disambiguation)
Future (disambiguation)
11801
https://en.wikipedia.org/wiki/Filippo%20Tommaso%20Marinetti
Filippo Tommaso Marinetti
Filippo Tommaso Emilio Marinetti (22 December 1876 – 2 December 1944) was an Italian poet, editor, art theorist, and founder of the Futurist movement. He was associated with the utopian and Symbolist artistic and literary community Abbaye de Créteil between 1907 and 1908. Marinetti is best known as the author of the first Futurist Manifesto, which was written and published in 1909, and as a co-author of the Fascist Manifesto, in 1919.

Childhood and adolescence

Emilio Angelo Carlo Marinetti (some documents give his name as "Filippo Achille Emilio Marinetti") spent the first years of his life in Alexandria, Egypt, where his father (Enrico Marinetti) and his mother (Amalia Grolli) lived together more uxorio (as if married). Enrico was a lawyer from Piedmont, and his mother was the daughter of a literary professor from Milan. They had come to Egypt in 1865, at the invitation of Khedive Isma'il Pasha, to act as legal advisers for foreign companies that were taking part in his modernization program. Marinetti's love for literature developed during his school years. His mother was an avid reader of poetry, and introduced the young Marinetti to the Italian and European classics. At age seventeen he started his first school magazine, Papyrus; the Jesuits threatened to expel him for publicizing Émile Zola's scandalous novels in the school. He first studied in Egypt, then in Paris, obtaining a baccalauréat degree in 1894 at the Sorbonne, and in Italy, graduating in law at the University of Pavia in 1899. He decided not to be a lawyer but to develop a literary career. He experimented with every type of literature (poetry, narrative, theatre, words in liberty), signing everything "Filippo Tommaso Marinetti".

Futurism

Marinetti and Constantin Brâncuși were visitors of the Abbaye de Créteil c. 1908, along with young writers like Roger Allard (one of the first to defend Cubism), Pierre Jean Jouve, and Paul Castiaux, who wanted to publish their works through the Abbaye.
The Abbaye de Créteil was a phalanstère community founded in the autumn of 1906 by the painter Albert Gleizes, and the poets René Arcos, Henri-Martin Barzun, Alexandre Mercereau and Charles Vildrac. The movement drew its inspiration from the Abbaye de Thélème, a fictional creation by Rabelais in his novel Gargantua. It was closed down by its members early in 1908. Marinetti is known best as the author of the Futurist Manifesto, which he wrote in 1909. It was published in French on the front page of the most prestigious French daily newspaper, Le Figaro, on 20 February 1909. In The Founding and Manifesto of Futurism, Marinetti declared that "Art, in fact, can be nothing but violence, cruelty, and injustice." Georges Sorel, who influenced the entire political spectrum from anarchism to Fascism, also argued for the importance of violence. Futurism had both anarchist and Fascist elements; Marinetti later became an active supporter of Benito Mussolini. Marinetti, who admired speed, had a minor car accident outside Milan in 1908 when he veered into a ditch to avoid two cyclists. He referred to the accident in the Futurist Manifesto: the Marinetti who was helped out of the ditch was a new man, determined to end the pretense and decadence of the prevailing Liberty style. He discussed a new and strongly revolutionary programme with his friends, in which they should end every artistic relationship with the past, "destroy the museums, the libraries, every type of academy". Together, he wrote, "We will glorify war—the world's only hygiene—militarism, patriotism, the destructive gesture of freedom-bringers, beautiful ideas worth dying for, and scorn for woman". The Futurist Manifesto was read and debated all across Europe, but Marinetti's first 'Futurist' works were not as successful. In April, the opening night of his drama Le Roi bombance (The Feasting King), written in 1905, was interrupted by loud, derisive whistling by the audience... 
and by Marinetti himself, who thus introduced another element of Futurism, "the desire to be heckled." Marinetti did, however, fight a duel with a critic he considered too harsh. His drama La donna è mobile (Poupées électriques), first presented in Turin, was not successful either. Nowadays, the play is remembered through a later version, named Elettricità sessuale (Sexual Electricity), and mainly for the appearance onstage of humanoid automatons, ten years before the Czech writer Karel Čapek invented the term robot. In 1910 his first novel, Mafarka il futurista, was cleared of all charges by an obscenity trial. That year, Marinetti discovered some allies in three young painters (Umberto Boccioni, Carlo Carrà, Luigi Russolo), who adopted the Futurist philosophy. Together with them (and with poets such as Aldo Palazzeschi), Marinetti began a series of Futurist Evenings, theatrical spectacles in which Futurists declaimed their manifestos in front of a crowd that in part attended the performances to throw vegetables at them. The most successful "happening" of that period was the publicization of the "Manifesto Against Past-Loving Venice" in Venice. In the flier, Marinetti demands "fill(ing) the small, stinking canals with the rubble from the old, collapsing and leprous palaces" to "prepare for the birth of an industrial and militarized Venice, capable of dominating the great Adriatic, a great Italian lake." In 1911, the Italo-Turkish War began and Marinetti departed for Libya as war correspondent for a French newspaper. His articles were eventually collected and published in The Battle of Tripoli. He then covered the First Balkan War of 1912–13, witnessing the surprise success of Bulgarian troops against the Ottoman Empire in the Siege of Adrianople. In this period he also made a number of visits to London, which he considered 'the Futurist city par excellence', and where a number of exhibitions, lectures and demonstrations of Futurist music were staged. 
However, although a number of artists, including Wyndham Lewis, were interested in the new movement, only one British convert was made, the young artist C.R.W. Nevinson. Nevertheless, Futurism was an important influence upon Lewis's Vorticist philosophy. About the same time Marinetti worked on a very anti-Roman Catholic and anti-Austrian verse-novel, Le monoplan du Pape (The Pope's Aeroplane, 1912), and edited an anthology of futurist poets. But his attempts to renew the style of poetry did not satisfy him, so much so that, in his foreword to the anthology, he declared a new revolution: it was time to be done with traditional syntax and to use "words in freedom" (parole in libertà). His sound-poem Zang Tumb Tumb, an account of the Battle of Adrianople, exemplifies words in freedom. Recordings can be heard of Marinetti reading some of his sound poems: Battaglia, Peso + Odore (1912); Dune, parole in libertà (1914); La Battaglia di Adrianopoli (1926) (recorded 1935).

Wartime

Marinetti agitated for Italian involvement in World War I, and once Italy was engaged, promptly volunteered for service. In the fall of 1915 he and several other Futurists who were members of the Lombard Volunteer Cyclists were stationed at Lake Garda, in Trentino province, high in the mountains along the Italo-Austrian border. They endured several weeks of fighting in harsh conditions before the cyclist units, deemed inappropriate for mountain warfare, were disbanded. Marinetti spent most of 1916 supporting Italy's war effort with speeches, journalism, and theatrical work, then returned to military service as a regular army officer in 1917. In May of that year he was seriously wounded while serving with an artillery battalion on the Isonzo front; he returned to service after a long recovery, and participated in the decisive Italian victory at Vittorio Veneto in October 1918.
Marriage

After an extended courtship, in 1923 Marinetti married Benedetta Cappa (1897–1977), a writer and painter and a pupil of Giacomo Balla. Born in Rome, she had joined the Futurists in 1917. They had met in 1918, moved in together in Rome, and chose to marry only to avoid legal complications on a lecture tour of Brazil. They had three daughters: Vittoria, Ala, and Luce. Cappa and Marinetti collaborated on a genre of mixed-media assemblages in the mid-1920s that they called tattilismo ("Tactilism"), and she was a strong proponent and practitioner of the aeropittura movement after its inception in 1929. She also produced three experimental novels. Cappa's major public work is likely a series of five murals at the Palermo Post Office (1926–1935) for the Fascist public-works architect Angiolo Mazzoni.

Marinetti and Fascism

In early 1918 he founded the Partito Politico Futurista or Futurist Political Party, which only a year later merged with Benito Mussolini's Fasci Italiani di Combattimento. Marinetti was one of the first affiliates of the Italian Fascist Party. In 1919 he co-wrote with Alceste De Ambris the Fascist Manifesto, the original manifesto of Italian Fascism. He opposed Fascism's later exaltation of existing institutions, terming them "reactionary," and, after walking out of the 1920 Fascist party congress in disgust, withdrew from politics for three years. However, he remained a notable force in developing the party philosophy throughout the regime's existence. For example, at the end of the Congress of Fascist Culture that was held in Bologna on 30 March 1925, Giovanni Gentile addressed Sergio Panunzio on the need to define Fascism more purposefully by way of Marinetti's opinion, stating, "Great spiritual movements make recourse to precision when their primitive inspirations—what F. T.
Marinetti identified this morning as artistic, that is to say, the creative and truly innovative ideas, from which the movement derived its first and most potent impulse—have lost their force. We today find ourselves at the very beginning of a new life and we experience with joy this obscure need that fills our hearts—this need that is our inspiration, the genius that governs us and carries us with it." As part of his campaign to overturn tradition, Marinetti also attacked traditional Italian food. His Manifesto of Futurist Cooking was published in the Turin Gazzetta del Popolo on 28 December 1930. Arguing that "People think, dress and act in accordance with what they drink and eat", Marinetti proposed wide-ranging changes to diet. He condemned pasta, blaming it for lassitude, pessimism and lack of virility, and promoted the eating of Italian-grown rice. In this, as in other ways, his proposed Futurist cooking was nationalistic, rejecting foreign foods and food names. It was also militaristic, seeking to stimulate men to be fighters. Marinetti also sought to increase creativity. His attraction to whatever was new made scientific discoveries appealing to him, but his views on diet were not scientifically based. He was fascinated with the idea of processed food, predicting that someday pills would replace food as a source of energy, and calling for the creation of "plastic complexes" to replace natural foods. Food, in turn, would become a matter of artistic expression. Many of the meals Marinetti described and ate resemble performance art, such as the "Tactile Dinner", recreated in 2014 for an exhibit at the Guggenheim Museum. Participants wore pajamas decorated with sponge, sandpaper, and aluminum, and ate salads without using cutlery. During the Fascist regime Marinetti sought to make Futurism the official state art of Italy but failed to do so. 
Mussolini was personally uninterested in art and chose to give patronage to numerous styles to keep artists loyal to the regime. Opening the exhibition of art by the Novecento Italiano group in 1923, he said: "I declare that it is far from my idea to encourage anything like a state art. Art belongs to the domain of the individual. The state has only one duty: not to undermine art, to provide humane conditions for artists, to encourage them from the artistic and national point of view." Mussolini's mistress, Margherita Sarfatti, successfully promoted the rival Novecento Group, and even persuaded Marinetti to be part of its board. In Fascist Italy, modern art was tolerated and even approved by the Fascist hierarchy. Towards the end of the 1930s, some Fascist ideologues (for example, the ex-Futurist Ardengo Soffici) wished to import the concept of "degenerate art" from Germany to Italy and condemned modernism, although their demands were ignored by the regime. In 1938, hearing that Adolf Hitler wanted to include Futurism in a traveling exhibition of degenerate art, Marinetti persuaded Mussolini to refuse to let it enter Italy. On 17 November 1938, Italy passed The Racial Laws, discriminating against Italian Jews, much as the discrimination pronounced in the Nuremberg Laws. The antisemitic trend in Italy resulted in attacks against modern art, judged too foreign, too radical and anti-nationalist. In the 11 January 1939 issue of the Futurist journal, Artecrazia Marinetti expressed his condemnation of such attacks on modern art, noting Futurism is both Italian and nationalist, not foreign, and stating that there were no Jews in Futurism. Furthermore, he claimed Jews were not active in the development of modern art. Regardless, the Italian state shut down Artecrazia. Marinetti made numerous attempts to ingratiate himself with the regime, becoming less radical and avant garde with each attempt. He relocated from Milan to Rome. 
He became an academician despite his condemnation of academies, saying, "It is important that Futurism be represented in the Academy." He was an atheist, but by the mid-1930s he had come to accept the influence of the Catholic Church on Italian society. In Gazzetta del Popolo, 21 June 1931, Marinetti proclaimed that "Only Futurist artists...are able to express clearly...the simultaneous dogmas of the Catholic faith, such as the Holy Trinity, the Immaculate Conception and Christ's Calvary." In his last works, written just before his death in 1944, L'aeropoema di Gesù ("The Aeropoem of Jesus") and Quarto d'ora di poesia per la X Mas ("A Fifteen Minutes' Poem for the X Mas"), Marinetti sought to reconcile his newfound love for God and his passion for the action that accompanied him throughout his life. There were other contradictions in his character: despite his nationalism, he was international, educated in Egypt and France, writing his first poems in French, publishing the Futurist Manifesto in a French newspaper and traveling to promote his ideas. Marinetti volunteered for active service in the Second Italo-Abyssinian War and the Second World War, serving on the Eastern Front for a few weeks in the summer and autumn of 1942 at the age of 65. He died of cardiac arrest in Bellagio on 2 December 1944 while working on a collection of poems praising the wartime achievements of the Decima Flottiglia MAS.

Writings

Marinetti, Filippo Tommaso, Il Fascino dell'Egitto (The Charm of Egypt), A. Mondadori – Editore, 1933, https://archive.org/details/marinetti_fascino_1933A/page/n3/mode/2up
Marinetti, Filippo Tommaso: Mafarka the Futurist. An African novel, Middlesex University Press, 1998
Marinetti, Filippo Tommaso: Selected Poems and Related Prose, Yale University Press, 2002
Marinetti, Filippo Tommaso: Critical Writings, ed. by Günter Berghaus, New York: Farrar, Straus, and Giroux, 2006, 549 pp.; pocket edition 2008
Carlo Schirru, Per un'analisi interlinguistica d'epoca: Grazia Deledda e contemporanei, Rivista Italiana di Linguistica e di Dialettologia, Fabrizio Serra editore, Pisa-Roma, Anno XI, 2009, pp. 9–32
Filippo Tommaso Marinetti, Le Futurisme, textes annotés et préfacés par Giovanni Lista, L'Age d'Homme, Lausanne, 1980
Filippo Tommaso Marinetti, Les Mots en liberté futuristes, préfacés par Giovanni Lista, L'Age d'Homme, Lausanne, 1987
Giovanni Lista, F. T. Marinetti, Éditions Seghers, Paris, 1976
Marinetti et le futurisme, poèmes, études, documents, iconographie, réunis et préfacés par Giovanni Lista, bibliographie établie par Giovanni Lista, L'Age d'Homme, Lausanne, 1977
Giovanni Lista, F. T. Marinetti, l'anarchiste du futurisme, Éditions Séguier, Paris, 1995
Giovanni Lista, Le Futurisme : création et avant-garde, Éditions L'Amateur, Paris, 2001
Giovanni Lista, Le Futurisme, une avant-garde radicale, coll. "Découvertes Gallimard" (n° 533), Éditions Gallimard, Paris, 2008
Giovanni Lista, Journal des Futurismes, Éditions Hazan, coll. "Bibliothèque", Paris, 2008
Antonino Reitano, L'onore, la patria e la fede nell'ultimo Marinetti, Angelo Parisi Editore, 2006
Barbara Meazzi, Il fantasma del romanzo. Le futurisme italien et l'écriture romanesque (1909–1929), Chambéry, Presses universitaires Savoie Mont Blanc, 2021, 430 pp.

Further reading

Robbins, Daniel, "Sources of Cubism and Futurism", Art Journal, Vol. 41, No. 4 (Winter 1981), pp. 324–327, College Art Association

External links

ItalianFuturism.org: news, exhibitions, and scholarship pertaining to the Futurist Movement
Image of Le Figaro with Le Futurisme (1909)
Score to the sound poem Dune, parole in libertà (1914)
Marinetti's "La Battaglia di Adrianopoli" (1926), recorded by Marinetti in 1935, published at LTM
Filippo Tommaso Marinetti Papers. General Collection, Beinecke Rare Book and Manuscript Library.
Filippo Tommaso Marinetti's Libroni on Futurism Images derived from slides taken of seven scrapbooks compiled by Marinetti between 1905 and 1944 from the Beinecke Rare Book and Manuscript Library at Yale University Italian male classical composers Italian classical composers Italian male poets Italian fascists Italian anti-communists Italian Futurism Italian military personnel of World War I Italian military personnel of the Second Italo-Ethiopian War Italian military personnel of World War II Italian Roman Catholics Converts to Roman Catholicism from atheism or agnosticism 1876 births 1944 deaths Futurism Futurist composers Futurist writers Italian writers in French Italian magazine editors Italian art critics Members of the Royal Academy of Italy Modernist theatre Modernist writers People from Alexandria People of the Italian Social Republic War correspondents of the Balkan Wars Burials at the Cimitero Monumentale di Milano 19th-century classical composers 20th-century classical composers 19th-century Italian composers 20th-century Italian composers 19th-century Italian poets 20th-century Italian poets 19th-century Italian writers 20th-century Italian male writers 19th-century Italian male writers Anti-Masonry Italian male non-fiction writers 20th-century Italian male musicians 19th-century Italian male musicians
11803
https://en.wikipedia.org/wiki/Franz%20Mesmer
Franz Mesmer
Franz Anton Mesmer (23 May 1734 – 5 March 1815) was a German physician with an interest in astronomy. He theorised the existence of a natural energy transference occurring between all animated and inanimate objects; this he called "animal magnetism", sometimes later referred to as mesmerism. Mesmer's theory attracted a wide following between about 1780 and 1850, and continued to have some influence until the end of the 19th century. In 1843, the Scottish doctor James Braid proposed the term "hypnotism" for a technique derived from animal magnetism; today the word "mesmerism" generally functions as a synonym of "hypnosis". Mesmer also supported the arts, specifically music; he was on friendly terms with Haydn and Mozart.

Early life

Mesmer was born in the village of Iznang (nowadays part of the municipality of Moos), on the shore of Lake Constance in Swabia, a son of master forester Anton Mesmer (1701 – after 1747) and his wife, Maria/Ursula (née Michel; 1701–1770). After studying at the Jesuit universities of Dillingen and Ingolstadt, he took up the study of medicine at the University of Vienna in 1759. In 1766 he published a doctoral dissertation with the Latin title De planetarum influxu in corpus humanum (On the Influence of the Planets on the Human Body), which discussed the influence of the moon and the planets on the human body and on disease. This was not medical astrology. Building largely on Isaac Newton's theory of the tides, Mesmer expounded on certain tides in the human body that might be accounted for by the movements of the sun and moon. Evidence assembled by Frank A. Pattie suggests that Mesmer plagiarized a part of his dissertation from a work by Richard Mead, an eminent English physician and Newton's friend. However, in Mesmer's day doctoral theses were not expected to be original. In January 1768, Mesmer married Anna Maria von Posch, a wealthy widow, and established himself as a doctor in Vienna.
In the summers he lived on a splendid estate and became a patron of the arts. In 1768, when court intrigue prevented the performance of La finta semplice (K. 51), for which the twelve-year-old Wolfgang Amadeus Mozart had composed 500 pages of music, Mesmer is said to have arranged a performance in his garden of Mozart's Bastien und Bastienne (K. 50), a one-act opera, though Mozart's biographer Nissen found no proof that this performance actually took place. Mozart later immortalized his former patron by including a comedic reference to Mesmer in his opera Così fan tutte.

Animal magnetism

In 1774, Mesmer produced an "artificial tide" in a patient, Francisca Österlin, who suffered from hysteria, by having her swallow a preparation containing iron and then attaching magnets to various parts of her body. She reported feeling streams of a mysterious fluid running through her body and was relieved of her symptoms for several hours. Mesmer did not believe that the magnets had achieved the cure on their own. He felt that he had contributed animal magnetism, which had accumulated in his work, to her. He soon stopped using magnets as a part of his treatment. In the same year Mesmer collaborated with Maximilian Hell. In 1775, Mesmer was invited to give his opinion before the Munich Academy of Sciences on the exorcisms carried out by Johann Joseph Gassner (Gaßner), a priest and healer who grew up in Vorarlberg, Austria. Mesmer said that while Gassner was sincere in his beliefs, his cures resulted because he possessed a high degree of animal magnetism. This confrontation between Mesmer's secular ideas and Gassner's religious beliefs marked the end of Gassner's career as well as, according to Henri Ellenberger, the emergence of dynamic psychiatry. The scandal that followed Mesmer's only partial success in curing the blindness of an 18-year-old musician, Maria Theresia Paradis, led him to leave Vienna in 1777.
In February 1778 Mesmer moved to Paris, rented an apartment in a part of the city preferred by the wealthy and powerful, and established a medical practice. There he was reunited with Mozart, who often visited him. Paris soon divided into those who thought he was a charlatan who had been forced to flee from Vienna and those who thought he had made a great discovery. In his first years in Paris, Mesmer tried and failed to get either the Royal Academy of Sciences or the Royal Society of Medicine to provide official approval for his doctrines. He found only one physician of high professional and social standing, Charles d'Eslon, to become a disciple. In 1779, with d'Eslon's encouragement, Mesmer wrote an 88-page book, Mémoire sur la découverte du magnétisme animal, to which he appended his famous 27 Propositions. These propositions outlined his theory at that time. Some contemporary scholars equate Mesmer's animal magnetism with the Qi (chi) of Traditional Chinese Medicine, and mesmerism with medical Qigong practices. According to d'Eslon, Mesmer understood health as the free flow of the process of life through thousands of channels in our bodies. Illness was caused by obstacles to this flow. Overcoming these obstacles and restoring flow produced crises, which restored health. When Nature failed to do this spontaneously, contact with a conductor of animal magnetism was a necessary and sufficient remedy. Mesmer aimed to aid or provoke the efforts of Nature. To cure an insane person, for example, involved causing a fit of madness. The advantage of magnetism involved accelerating such crises without danger.

Procedure

Mesmer treated patients both individually and in groups. With individuals he would sit in front of his patient with his knees touching the patient's knees, pressing the patient's thumbs in his hands, looking fixedly into the patient's eyes. Mesmer made "passes", moving his hands from patients' shoulders down along their arms.
He then pressed his fingers on the patient's hypochondrium region (the area below the diaphragm), sometimes holding his hands there for hours. Many patients felt peculiar sensations or had convulsions that were regarded as crises and supposed to bring about the cure. Mesmer would often conclude his treatments by playing some music on a glass armonica. By 1780 Mesmer had more patients than he could treat individually and he established a collective treatment known as the "baquet." An English doctor who observed Mesmer described the treatment as follows: In the middle of the room is placed a vessel of about a foot and a half high which is called here a "baquet". It is so large that twenty people can easily sit round it; near the edge of the lid which covers it, there are holes pierced corresponding to the number of persons who are to surround it; into these holes are introduced iron rods, bent at right angles outwards, and of different heights, so as to answer to the part of the body to which they are to be applied. Besides these rods, there is a rope which communicates between the baquet and one of the patients, and from him is carried to another, and so on the whole round. The most sensible effects are produced on the approach of Mesmer, who is said to convey the fluid by certain motions of his hands or eyes, without touching the person. I have talked with several who have witnessed these effects, who have convulsions occasioned and removed by a movement of the hand... Investigation In 1784, without Mesmer requesting it, King Louis XVI appointed four members of the Faculty of Medicine as commissioners to investigate animal magnetism as practiced by d'Eslon. At the request of these commissioners, the king appointed five additional commissioners from the Royal Academy of Sciences. These included the chemist Antoine Lavoisier, the doctor Joseph-Ignace Guillotin, the astronomer Jean Sylvain Bailly, and the American ambassador Benjamin Franklin.
The commission conducted a series of experiments aimed not at determining whether Mesmer's treatment worked, but whether he had discovered a new physical fluid. The commission concluded that there was no evidence for such a fluid. Whatever benefit the treatment produced was attributed to "imagination". One of the commissioners, the botanist Antoine Laurent de Jussieu, took exception to the official reports. He wrote a dissenting opinion that declared Mesmer's theory credible and worthy of further investigation. The commission did not examine Mesmer, but investigated the practice of d'Eslon. Because it used blind trials in its investigation, the commission found that mesmerism seemed to work only when the subject was aware of the treatment. The commission attributed the effect to "imagination", but its findings are considered the first observation of the placebo effect. Mesmer was driven into exile soon after the investigations on animal magnetism, although his influential student, Armand-Marie-Jacques de Chastenet, Marquis de Puységur (1751–1825), continued to have many followers until his death. Mesmer continued to practice in Frauenfeld, Switzerland, for a number of years and died in 1815 in Meersburg. Abbé Faria, an Indo-Portuguese monk in Paris and a contemporary of Mesmer, claimed that "nothing comes from the magnetizer; everything comes from the subject and takes place in his imagination, i.e. autosuggestion generated from within the mind." Works De planetarum influxu in corpus humanum (Über den Einfluss der Gestirne auf den menschlichen Körper) [The Influence of the Planets on the Human Body] (1766). Mémoire sur la découverte du magnetisme animal, Didot, Genf und Paris (1779). View at Gallica, from the Bibliothèque nationale de France (BnF). Sendschreiben an einen auswärtigen Arzt über die Magnetkur [Open letter to an out-of-town physician about the magnetic cure] (1775). Théorie du monde et des êtres organisés suivant les principes de M…., Paris (1784).
View at Gallica, BnF. Mémoire de F. A. Mesmer,... sur ses découvertes (1798–1799). View at Gallica, BnF. Mesmerismus oder System der Wechselwirkungen. Theorie und Anwendung des thierischen Magnetismus als die allgemeine Heilkunde zur Erhaltung des Menschen [Mesmerism or the system of inter-relations. Theory and applications of animal magnetism as general medicine for the preservation of man]. Edited by . Nikolai, Berlin (1814). View at Munich Digitization Center, from the Bavarian State Library. See also Animal magnetism Royal Commission on Animal Magnetism Notes References Bailly, J-S., "Secret Report on Mesmerism or Animal Magnetism", International Journal of Clinical and Experimental Hypnosis, Vol. 50, No. 4, (October 2002), pp. 364–68. doi:10.1080/00207140208410110 Franklin, B., Majault, M. J., Le Roy, J. B., Sallin, C. L., Bailly, J-S., d'Arcet, J., de Bory, G., Guillotin, J-I., and Lavoisier, A., "Report of the Commissioners charged by the King with the Examination of Animal Magnetism", International Journal of Clinical and Experimental Hypnosis, Vol. 50, No. 4, (October 2002), pp. 332–63. doi:10.1080/00207140208410109 Buranelli, V., The Wizard from Vienna: Franz Anton Mesmer, Coward, McCann & Geoghegan, (New York), 1975. Crabtree, Adam (1988). Animal Magnetism, Early Hypnotism, and Psychical Research, 1766–1925 – An Annotated Bibliography. White Plains, NY: Kraus International. Donaldson, I.M.L., "Mesmer's 1780 Proposal for a Controlled Trial to Test his Method of Treatment Using 'Animal Magnetism'", Journal of the Royal Society of Medicine, Vol. 98, No. 12, (December 2005), pp. 572–575. Goldsmith, M., Franz Anton Mesmer: A History of Mesmerism, Doubleday, Doran & Co., (New York), 1934. Harte, R., Hypnotism and the Doctors, Volume I: Animal Magnetism: Mesmer/De Puysegur, L.N. Fowler & Co., (London), 1902.
Pattie, F.A., "Mesmer's Medical Dissertation and Its Debt to Mead's De Imperio Solis ac Lunae", Journal of the History of Medicine and Allied Sciences, Vol. 11, (July 1956), pp. 275–287. http://www.deutsche-biographie.de/pnd118581309.html Winter, A., Mesmerized: Powers of Mind in Victorian Britain, The University of Chicago Press, (Chicago), 1998. Wyckoff, J., Franz Anton Mesmer: Between God and Devil, Prentice-Hall, (Englewood Cliffs), 1975. External links "Condorcet and mesmerism: a record in the history of scepticism", Condorcet manuscript (1784), online and analyzed on Bibnum [click 'à télécharger' for English version]. 1734 births 1815 deaths 18th-century German physicians 18th-century German writers 18th-century German male writers 19th-century German writers 19th-century German male writers German astrologers German hypnotists Animal magnetism New Age predecessors Rosicrucians People from Konstanz (district) 18th-century occultists 19th-century occultists
https://en.wikipedia.org/wiki/Foix%E2%80%93Alajouanine%20syndrome
Foix–Alajouanine syndrome
Foix–Alajouanine syndrome, also called subacute ascending necrotizing myelitis, is a disease caused by an arteriovenous malformation of the spinal cord. In particular, most cases involve dural arteriovenous malformations that present in the lower thoracic or lumbar spinal cord. Patients can present with symptoms indicating spinal cord involvement, such as paralysis of the arms and legs, numbness, loss of sensation, and sphincter dysfunction, and pathological examination reveals disseminated nerve cell death in the spinal cord. The condition is named after Charles Foix and Théophile Alajouanine, who first described it in 1926. Diagnosis Clinically, the patient may present with neurological symptoms such as numbness, weakness, loss of reflexes, or even sudden or progressive paralysis. The affected portion of the body will correlate to where the lesion lies within the spinal cord. The disease typically has an insidious onset, but symptoms may manifest suddenly. A thorough physical exam may lead a physician toward targeted imaging, with MRI being the most appropriate imaging modality for initial diagnosis. A spinal MRA is a superior technique for visualizing the extent of the arteriovenous malformation within the cord and may be especially useful if surgical treatment is attempted. Treatment Surgical treatment may be attempted with endovascular embolization or ligation of the arteriovenous malformation within the spinal cord. Corticosteroids may be used acutely to help slow the progression of symptoms, or they may be used chronically in a poor surgical candidate. In either case, physical therapy will be an important part of the recovery process in helping the patient regain strength and coordination. See also Vascular myelopathy References External links Spinal cord disorders Syndromes affecting the nervous system
https://en.wikipedia.org/wiki/Ferromagnetism
Ferromagnetism
Ferromagnetism is the basic mechanism by which certain materials (such as iron) form permanent magnets, or are attracted to magnets. In physics, several different types of magnetism are distinguished. Ferromagnetism (along with the similar effect ferrimagnetism) is the strongest type and is responsible for the common phenomenon of magnetism in magnets encountered in everyday life. Substances respond weakly to magnetic fields with three other types of magnetism—paramagnetism, diamagnetism, and antiferromagnetism—but the forces are usually so weak that they can be detected only by sensitive instruments in a laboratory. An everyday example of ferromagnetism is a refrigerator magnet used to hold notes on a refrigerator door. The attraction between a magnet and ferromagnetic material is "the quality of magnetism first apparent to the ancient world, and to us today". Permanent magnets (materials that can be magnetized by an external magnetic field and remain magnetized after the external field is removed) are either ferromagnetic or ferrimagnetic, as are the materials that are noticeably attracted to them. Only a few substances are ferromagnetic. The common ones are iron, cobalt, nickel and most of their alloys, and some compounds of rare earth metals. Ferromagnetism is very important in industry and modern technology, and is the basis for many electrical and electromechanical devices such as electromagnets, electric motors, generators, transformers, and magnetic storage devices such as tape recorders and hard disks, as well as for the nondestructive testing of ferrous materials. Ferromagnetic materials can be divided into magnetically "soft" materials like annealed iron, which can be magnetized but do not tend to stay magnetized, and magnetically "hard" materials, which do.
Permanent magnets are made from "hard" ferromagnetic materials such as alnico, and ferrimagnetic materials such as ferrite that are subjected to special processing in a strong magnetic field during manufacture to align their internal microcrystalline structure, making them very hard to demagnetize. To demagnetize a saturated magnet, a certain magnetic field must be applied, and this threshold depends on coercivity of the respective material. "Hard" materials have high coercivity, whereas "soft" materials have low coercivity. The overall strength of a magnet is measured by its magnetic moment or, alternatively, the total magnetic flux it produces. The local strength of magnetism in a material is measured by its magnetization. History and distinction from ferrimagnetism Historically, the term ferromagnetism was used for any material that could exhibit spontaneous magnetization: a net magnetic moment in the absence of an external magnetic field; that is any material that could become a magnet. This general definition is still in common use. However, in a landmark paper in 1948, Louis Néel showed there are two levels of magnetic alignment that result in this behavior. One is ferromagnetism in the strict sense, where all the magnetic moments are aligned. The other is ferrimagnetism, where some magnetic moments point in the opposite direction but have a smaller contribution, so there is still a spontaneous magnetization. In the special case where the opposing moments balance completely, the alignment is known as antiferromagnetism. Therefore antiferromagnets do not have a spontaneous magnetization. Ferromagnetic materials Ferromagnetism is an unusual property that occurs in only a few substances. The common ones are the transition metals iron, nickel, cobalt and their alloys, and alloys of rare earth metals. It is a property not just of the chemical make-up of a material, but of its crystalline structure and microstructure. 
Their ferromagnetism results from having many unpaired electrons in their d-block in the case of iron and its relatives, or the f-block in the case of the rare earth metals, a result of Hund's rule of maximum multiplicity. There are ferromagnetic metal alloys whose constituents are not themselves ferromagnetic, called Heusler alloys, named after Fritz Heusler. Conversely there are non-magnetic alloys, such as types of stainless steel, composed almost exclusively of ferromagnetic metals. Amorphous (non-crystalline) ferromagnetic metallic alloys can be made by very rapid quenching (cooling) of a liquid alloy. These have the advantage that their properties are nearly isotropic (not aligned along a crystal axis); this results in low coercivity, low hysteresis loss, high permeability, and high electrical resistivity. One such typical material is a transition metal-metalloid alloy, made from about 80% transition metal (usually Fe, Co, or Ni) and a metalloid component (B, C, Si, P, or Al) that lowers the melting point. A relatively new class of exceptionally strong ferromagnetic materials are the rare-earth magnets. They contain lanthanide elements that are known for their ability to carry large magnetic moments in well-localized f-orbitals. The table lists a selection of ferromagnetic and ferrimagnetic compounds, along with the temperature above which they cease to exhibit spontaneous magnetization (see Curie temperature). Unusual materials Most ferromagnetic materials are metals, since the conducting electrons are often responsible for mediating the ferromagnetic interactions. It is therefore a challenge to develop ferromagnetic insulators, especially multiferroic materials, which are both ferromagnetic and ferroelectric. A number of actinide compounds are ferromagnets at room temperature or exhibit ferromagnetism upon cooling. 
PuP is a paramagnet with cubic symmetry at room temperature, but it undergoes a structural transition into a tetragonal state with ferromagnetic order when cooled below its TC = 125 K. In its ferromagnetic state, PuP's easy axis is in the <100> direction. In NpFe2 the easy axis is <111>. Above its Curie temperature, NpFe2 is also paramagnetic and cubic. Cooling below the Curie temperature produces a rhombohedral distortion wherein the rhombohedral angle changes from 60° (cubic phase) to 60.53°. An alternate description of this distortion is to consider the length c along the unique trigonal axis (after the distortion has begun) and a as the distance in the plane perpendicular to c. In the cubic phase this ratio reduces to c/a = 1. Below the Curie temperature the distortion produces the largest strain in any actinide compound. NpNi2 undergoes a similar lattice distortion below its Curie temperature, with a strain of (43 ± 5) × 10−4. NpCo2 is a ferrimagnet below 15 K. In 2009, a team of MIT physicists demonstrated that a lithium gas cooled to less than one kelvin can exhibit ferromagnetism. The team cooled fermionic lithium-6 to less than 150 nanokelvins (150 billionths of one kelvin) using infrared laser cooling. This demonstration is the first time that ferromagnetism has been demonstrated in a gas. In 2018, a team of University of Minnesota physicists demonstrated that body-centered tetragonal ruthenium exhibits ferromagnetism at room temperature. Electrically-induced ferromagnetism Recent research has shown evidence that ferromagnetism can be induced in some materials by an electric current or voltage. Antiferromagnetic LaMnO3 and SrCoO have been switched to ferromagnetic by a current. In July 2020 scientists reported inducing ferromagnetism in the abundant diamagnetic material iron pyrite ("fool's gold") by an applied voltage. In these experiments the ferromagnetism was limited to a thin surface layer.
Explanation The Bohr–Van Leeuwen theorem, discovered in the 1910s, showed that classical physics theories are unable to account for any form of magnetism, including ferromagnetism. Magnetism is now regarded as a purely quantum mechanical effect. Ferromagnetism arises due to two effects from quantum mechanics: spin and the Pauli exclusion principle. Origin of magnetism One of the fundamental properties of an electron (besides that it carries charge) is that it has a magnetic dipole moment, i.e., it behaves like a tiny magnet, producing a magnetic field. This dipole moment comes from the more fundamental property of the electron that it has quantum mechanical spin. Due to its quantum nature, the spin of the electron can be in one of only two states; with the magnetic field either pointing "up" or "down" (for any choice of up and down). The spin of the electrons in atoms is the main source of ferromagnetism, although there is also a contribution from the orbital angular momentum of the electron about the nucleus. When these magnetic dipoles in a piece of matter are aligned, (point in the same direction) their individually tiny magnetic fields add together to create a much larger macroscopic field. However, materials made of atoms with filled electron shells have a total dipole moment of zero: because the electrons all exist in pairs with opposite spin, every electron's magnetic moment is cancelled by the opposite moment of the second electron in the pair. Only atoms with partially filled shells (i.e., unpaired spins) can have a net magnetic moment, so ferromagnetism occurs only in materials with partially filled shells. Because of Hund's rules, the first few electrons in a shell tend to have the same spin, thereby increasing the total dipole moment. These unpaired dipoles (often called simply "spins" even though they also generally include orbital angular momentum) tend to align in parallel to an external magnetic field, an effect called paramagnetism. 
Ferromagnetism involves an additional phenomenon, however: in a few substances the dipoles tend to align spontaneously, giving rise to a spontaneous magnetization, even when there is no applied field. Exchange interaction When two nearby atoms have unpaired electrons, whether the electron spins are parallel or antiparallel affects whether the electrons can share the same orbit as a result of the quantum mechanical effect called the exchange interaction. This in turn affects the electron location and the Coulomb (electrostatic) interaction and thus the energy difference between these states. The exchange interaction is related to the Pauli exclusion principle, which says that two electrons with the same spin cannot also be in the same spatial state (orbital). This is a consequence of the spin-statistics theorem and that electrons are fermions. Therefore, under certain conditions, when the orbitals of the unpaired outer valence electrons from adjacent atoms overlap, the distributions of their electric charge in space are farther apart when the electrons have parallel spins than when they have opposite spins. This reduces the electrostatic energy of the electrons when their spins are parallel compared to their energy when the spins are anti-parallel, so the parallel-spin state is more stable. This difference in energy is called the exchange energy. In simple terms, the outer electrons of adjacent atoms, which repel each other, can move further apart by aligning their spins in parallel, so the spins of these electrons tend to line up. This energy difference can be orders of magnitude larger than the energy differences associated with the magnetic dipole-dipole interaction due to dipole orientation, which tends to align the dipoles antiparallel. In certain doped semiconductor oxides RKKY interactions have been shown to bring about periodic longer-range magnetic interactions, a phenomenon of significance in the study of spintronic materials. 
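The energy bookkeeping described above is commonly summarized by the Heisenberg model, in which the exchange energy of two neighbouring spins is −J S_i·S_j, with J > 0 for ferromagnetic coupling. The following minimal numeric sketch illustrates this (the chain length and the value J = 1 are arbitrary illustrative choices, not material constants):

```python
def heisenberg_energy(spins, J=1.0):
    """Exchange energy of a 1-D chain of classical unit spins:
    E = -J * sum over nearest-neighbour pairs of S_i . S_j."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    return -J * sum(dot(a, b) for a, b in zip(spins, spins[1:]))

up = [(0.0, 0.0, 1.0)] * 4                                 # all four spins parallel (+z)
alternating = [(0.0, 0.0, (-1.0) ** i) for i in range(4)]  # antiparallel pattern

# With J > 0 the parallel (ferromagnetic) arrangement is energetically favoured.
print(heisenberg_energy(up))           # -3.0
print(heisenberg_energy(alternating))  #  3.0
```

With J > 0 the parallel configuration minimizes the energy, which is the exchange-driven preference for aligned spins described in the text; flipping the sign of J would instead favour the antiparallel (antiferromagnetic) arrangement.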
The materials in which the exchange interaction is much stronger than the competing dipole-dipole interaction are frequently called magnetic materials. For instance, in iron (Fe) the exchange force is about 1000 times stronger than the dipole interaction. Therefore, below the Curie temperature virtually all of the dipoles in a ferromagnetic material will be aligned. In addition to ferromagnetism, the exchange interaction is also responsible for the other types of spontaneous ordering of atomic magnetic moments occurring in magnetic solids, antiferromagnetism and ferrimagnetism. There are different exchange interaction mechanisms which create the magnetism in different ferromagnetic, ferrimagnetic, and antiferromagnetic substances. These mechanisms include direct exchange, RKKY exchange, double exchange, and superexchange. Magnetic anisotropy Although the exchange interaction keeps spins aligned, it does not align them in a particular direction. Without magnetic anisotropy, the spins in a magnet randomly change direction in response to thermal fluctuations and the magnet is superparamagnetic. There are several kinds of magnetic anisotropy, the most common of which is magnetocrystalline anisotropy. This is a dependence of the energy on the direction of magnetization relative to the crystallographic lattice. Another common source of anisotropy, inverse magnetostriction, is induced by internal strains. Single-domain magnets also can have a shape anisotropy due to the magnetostatic effects of the particle shape. As the temperature of a magnet increases, the anisotropy tends to decrease, and there is often a blocking temperature at which a transition to superparamagnetism occurs. Magnetic domains The above would seem to suggest that every piece of ferromagnetic material should have a strong magnetic field, since all the spins are aligned, yet iron and other ferromagnets are often found in an "unmagnetized" state. 
The reason for this is that a bulk piece of ferromagnetic material is divided into tiny regions called magnetic domains (also known as Weiss domains). Within each domain, the spins are aligned, but (if the bulk material is in its lowest energy configuration; i.e. unmagnetized), the spins of separate domains point in different directions and their magnetic fields cancel out, so the object has no net large scale magnetic field. Ferromagnetic materials spontaneously divide into magnetic domains because the exchange interaction is a short-range force, so over long distances of many atoms the tendency of the magnetic dipoles to reduce their energy by orienting in opposite directions wins out. If all the dipoles in a piece of ferromagnetic material are aligned parallel, it creates a large magnetic field extending into the space around it. This contains a lot of magnetostatic energy. The material can reduce this energy by splitting into many domains pointing in different directions, so the magnetic field is confined to small local fields in the material, reducing the volume of the field. The domains are separated by thin domain walls a number of molecules thick, in which the direction of magnetization of the dipoles rotates smoothly from one domain's direction to the other. Magnetized materials Thus, a piece of iron in its lowest energy state ("unmagnetized") generally has little or no net magnetic field. However, the magnetic domains in a material are not fixed in place; they are simply regions where the spins of the electrons have aligned spontaneously due to their magnetic fields, and thus can be altered by an external magnetic field. 
If a strong enough external magnetic field is applied to the material, the domain walls will move by the process of the spins of the electrons in atoms near the wall in one domain turning under the influence of the external field to face in the same direction as the electrons in the other domain, thus reorienting the domains so more of the dipoles are aligned with the external field. The domains will remain aligned when the external field is removed, creating a magnetic field of their own extending into the space around the material, thus creating a "permanent" magnet. The domains do not go back to their original minimum energy configuration when the field is removed because the domain walls tend to become 'pinned' or 'snagged' on defects in the crystal lattice, preserving their parallel orientation. This is shown by the Barkhausen effect: as the magnetizing field is changed, the magnetization changes in thousands of tiny discontinuous jumps as the domain walls suddenly "snap" past defects. This magnetization as a function of the external field is described by a hysteresis curve. Although this state of aligned domains found in a piece of magnetized ferromagnetic material is not a minimal-energy configuration, it is metastable, and can persist for long periods, as shown by samples of magnetite from the sea floor which have maintained their magnetization for millions of years. Heating and then cooling (annealing) a magnetized material, subjecting it to vibration by hammering it, or applying a rapidly oscillating magnetic field from a degaussing coil tends to release the domain walls from their pinned state, and the domain boundaries tend to move back to a lower energy configuration with less external magnetic field, thus demagnetizing the material. 
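The competition between exchange alignment and thermal agitation that underlies this behavior is often studied with the simplified two-dimensional Ising model. Below is a toy Metropolis Monte Carlo sketch (lattice size, temperatures, sweep count, and seed are arbitrary illustrative choices, not tied to any real material); well below the critical temperature the ordered state survives, while far above it the net magnetization collapses:

```python
import math
import random

def ising_magnetization(L=16, T=1.0, sweeps=400, seed=1):
    """Metropolis simulation of the 2-D Ising model (units with J = kB = 1).
    Returns the absolute magnetization per spin after `sweeps` lattice sweeps."""
    rng = random.Random(seed)
    s = [[1] * L for _ in range(L)]            # start fully ordered
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # Sum of the four nearest neighbours (periodic boundaries).
        nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
              + s[i][(j + 1) % L] + s[i][(j - 1) % L])
        dE = 2 * s[i][j] * nb                  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s[i][j] = -s[i][j]
    m = sum(sum(row) for row in s) / (L * L)
    return abs(m)

# Deep in the ferromagnetic phase (T well below Tc ~ 2.27 in these units)
# order persists; far above Tc thermal fluctuations destroy the magnetization.
print(ising_magnetization(T=0.5))   # close to 1
print(ising_magnetization(T=5.0))   # close to 0
```

This is only a qualitative illustration: the flip-acceptance rule embodies the same trade-off the text describes, with the exchange term favouring alignment and the temperature T controlling how often thermally unfavourable flips are accepted.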
Commercial magnets are made of "hard" ferromagnetic or ferrimagnetic materials with very large magnetic anisotropy such as alnico and ferrites, which have a very strong tendency for the magnetization to be pointed along one axis of the crystal, the "easy axis". During manufacture the materials are subjected to various metallurgical processes in a powerful magnetic field, which aligns the crystal grains so their "easy" axes of magnetization all point in the same direction. Thus the magnetization, and the resulting magnetic field, is "built in" to the crystal structure of the material, making it very difficult to demagnetize. Curie temperature As the temperature increases, thermal motion, or entropy, competes with the ferromagnetic tendency for dipoles to align. When the temperature rises beyond a certain point, called the Curie temperature, there is a second-order phase transition and the system can no longer maintain a spontaneous magnetization, so its ability to be magnetized or attracted to a magnet disappears, although it still responds paramagnetically to an external field. Below that temperature, there is a spontaneous symmetry breaking and magnetic moments become aligned with their neighbors. The Curie temperature itself is a critical point, where the magnetic susceptibility is theoretically infinite and, although there is no net magnetization, domain-like spin correlations fluctuate at all length scales. The study of ferromagnetic phase transitions, especially via the simplified Ising spin model, had an important impact on the development of statistical physics. There, it was first clearly shown that mean field theory approaches failed to predict the correct behavior at the critical point (which was found to fall under a universality class that includes many other systems, such as liquid-gas transitions), and had to be replaced by renormalization group theory. See also References External links Electromagnetism – ch. 
11, from an online textbook Detailed nonmathematical description of ferromagnetic materials with illustrations Magnetism: Models and Mechanisms in E. Pavarini, E. Koch, and U. Schollwöck: Emergent Phenomena in Correlated Matter, Jülich 2013, Quantum phases Magnetic ordering Hysteresis Physical phenomena
https://en.wikipedia.org/wiki/Francesco%20Cossiga
Francesco Cossiga
Francesco Maurizio Cossiga (1928 – 2010) was an Italian politician. A member of the Christian Democratic Party of Italy, he was prime minister of Italy from 1979 to 1980 and the president of Italy from 1985 to 1992. Cossiga is widely considered one of the most prominent and influential politicians of the First Republic. Cossiga also served as minister on several occasions, most notably as Minister of the Interior. In that position he re-structured the Italian police, civil protection and secret services. Due to his repressive approach to public protests, he has been described as a strongman and labeled "Iron Minister". He was in office at the time of the kidnapping and murder of Aldo Moro by the Red Brigades, and resigned as Minister of the Interior when Moro was found dead in 1978. Cossiga was Prime Minister during the Bologna station bombing in 1980. Before his political career, Cossiga was a professor of constitutional law at the University of Sassari. Early life Francesco Cossiga was born in Sassari on 26 July 1928, into a republican and anti-fascist middle-class family. His parents were Giuseppe Cossiga and Maria "Mariuccia" Zanfarino. He was a second cousin of the brothers Enrico and Giovanni Berlinguer (whose parents were Mario Berlinguer and Maria "Mariuccia" Loriga), because their respective maternal grandfathers, Antonio Zanfarino and Giovanni Loriga, were half-brothers on their mother's side. Although he was commonly called "Cossìga", the original pronunciation of the surname is "Còssiga". His surname in Sardinian and Sassarese means "Corsica", likely pointing to the family's origin. At the age of sixteen he graduated from the classical lyceum Domenico Alberto Azuni, three years early. The following year he joined Christian Democracy, and three years later, at only 19 years old, he graduated in law and began a university career as a professor of constitutional law in the faculty of jurisprudence of the University of Sassari.
During his period at the university he became a member of the Catholic Federation of University Students (FUCI), becoming the association's leader for Sassari. Beginnings of his political career After the 1958 general election Cossiga was elected to the Chamber of Deputies for the first time, representing the constituency of Cagliari–Sassari. In February 1966 he became the youngest Undersecretary of the Ministry of Defence, in the government of Aldo Moro. In this role he had to face the aftermath of Piano Solo, a planned Italian coup d'état requested by then-President Antonio Segni two years before. From November 1974 to February 1976 Cossiga was Minister of Public Administration in Moro's fourth government. Minister of the Interior On 12 February 1976, Cossiga was appointed Minister of the Interior by Prime Minister Moro. During his term he re-structured the Italian police, civil protection and secret services. Cossiga has often been described as a strongman and labeled the "iron minister" for repressing public protests. Moreover, during his tenure his surname was often stylized by protesters as "Kossiga", with the double "s" written as the runic insignia of the SS. 1977 protests and riots In 1977 the city of Bologna was the scene of violent street clashes. In particular, on 11 March a militant of the far-left organization Lotta Continua, Francesco Lorusso, was killed by a gunshot to the back (probably fired by a policeman) when police dispersed protesters against a mass meeting of Communion and Liberation, which was being held that morning at the University. This event served as a detonator for a long series of clashes with security forces over two days, which affected the entire city of Bologna. Cossiga sent armored vehicles into the university area and other hot spots of the city to quell what he perceived as guerrilla warfare. Clashes with the police caused numerous casualties among people who got caught up in the riots, including uninvolved locals.
None of the traditional leftist parties, except the Youth Socialist Federation, led by local secretary Emilio Lonardo, took part in the funeral of the student Lorusso, showing the dramatic split between the movement and the historical left parties. Turin was also the scene of bloody clashes and attacks. On 1 October 1977, after a procession had begun with an attack on the headquarters of the Italian Social Movement (MSI), a group of Lotta Continua militants reached a downtown bar, L'angelo azzurro (The Blue Angel), frequented by young right-wing activists. They threw two Molotov cocktails, and Roberto Crescenzio, a completely apolitical student, died of his burns. The perpetrators of the murder were never identified. Lotta Continua leader Silvio Viale called it a "tragic accident". Another innocent victim of that year's riots was Giorgiana Masi, who was killed in Rome by a gunshot during an event organized by the Radical Party to celebrate the third anniversary of the victory in the divorce referendum. As the perpetrators of the murder remained unknown, the movement attributed responsibility for the crime to plainclothes police officers, who were photographed at the time dressed in the style of the young people of the movement. Kidnapping of Aldo Moro Cossiga was in office at the time of the kidnapping and murder of the Christian Democratic leader Aldo Moro by the Marxist-Leninist extreme-left terrorist group, the Red Brigades. On the morning of 16 March 1978, the day on which the new cabinet led by Giulio Andreotti was to undergo a confidence vote in the Italian Parliament, the car of Moro, former prime minister and then president of the DC, was ambushed by a group of Red Brigades terrorists in Via Fani in Rome. Firing automatic weapons, the terrorists killed Moro's bodyguards (two Carabinieri in Moro's car and three policemen in the following car) and kidnapped him. Cossiga immediately formed two "crisis committees".
The first was a technical-operational-political committee, chaired by Cossiga himself and, in his absence, by undersecretary Nicola Lettieri. Other members included the supreme commanders of the Italian police forces, the Carabinieri and the Guardia di Finanza, the recently appointed directors of SISMI and SISDE (respectively, Italy's military and civil intelligence services), the national secretary of CESIS (a secret information agency), the director of UCIGOS and the police prefect of Rome. The second was an information committee, including members of CESIS, SISDE, SISMI and SIOS, another military intelligence office. A third, unofficial committee was also created, which never met officially, called the comitato di esperti ("committee of experts"). Its existence was not disclosed until 1981, by Cossiga himself, during his interrogation by the Italian Parliament's commission on the Moro affair; he did not, however, reveal the committee's decisions and activities. This committee included Steve Pieczenik, a psychologist of the anti-terrorism section of the US State Department, and notable Italian criminologists. Pieczenik later declared that there had been numerous leaks of the committee's discussions, for which he blamed Cossiga. On 9 May 1978, after 55 days of imprisonment, during which Moro was subjected to a political trial by the so-called "people's court" set up by the Red Brigades and the Italian government was asked for an exchange of prisoners, Moro's body was found in the trunk of a Renault 4 in Via Caetani. Despite the common interpretation, the car's location in Via Caetani was not halfway between the national offices of the DC and of the Italian Communist Party (PCI) in Rome. Two days later, Cossiga resigned as Minister of the Interior. According to Italian journalist Enrico Deaglio, Cossiga, to justify his lack of action, "accused the leaders of CGIL and of the Communist Party of knowing where Moro was detained".
Cossiga was also accused by Moro himself, who wrote in the letters composed during his detention that his blood would fall upon Cossiga. Prime Minister of Italy One year after Moro's death and Cossiga's subsequent resignation as Interior Minister, he was appointed Prime Minister of Italy. He led a coalition government composed of Christian Democrats, Socialists, Democratic Socialists, Republicans and Liberals. Bologna massacre Cossiga was head of the government at the time of the Bologna massacre, a terrorist bombing of the Bologna Central Station on the morning of 2 August 1980, which killed 85 people and wounded more than 200. The attack was attributed to the neo-fascist terrorist organization Nuclei Armati Rivoluzionari (Armed Revolutionary Nuclei), which always denied any involvement; other theories have been proposed, especially in connection with the strategy of tension. Cossiga initially assumed the explosion had been caused by an accident (the explosion of an old boiler located in the basement of the station). However, the evidence gathered at the site soon made it clear that the attack was an act of terrorism. L'Unità, the newspaper of the Communist Party, attributed responsibility for the attack to neo-fascists as early as 3 August. Later, in a special session of the Senate, Cossiga supported the theory that neo-fascists were behind the attack, stating that "unlike leftist terrorism, which strikes at the heart of the state through its representatives, black terrorism prefers the massacre because it promotes panic and impulsive reactions." Later, according to media reports in 2004, taken up again in 2007, Cossiga, in a letter addressed to Enzo Fragalà, leader of the National Alliance section in the Mitrokhin Committee, suggested the involvement of George Habash's Popular Front for the Liberation of Palestine (PFLP) and the Separat group of Ilich Ramírez Sánchez, known as "Carlos the Jackal".
In addition, in 2008 Cossiga gave an interview to the BBC in which he reaffirmed his belief that the massacre was attributable not to black terrorism but to an "incident" involving Palestinian resistance groups operating in Italy. He also declared himself convinced of the innocence of Francesca Mambro and Giuseppe Valerio Fioravanti, the two neo-fascist terrorists accused of the massacre. The PFLP has always denied responsibility. Resignation In October 1980, Cossiga resigned as Prime Minister after the rejection of the annual budget bill by the Italian Parliament. Following the 1983 general election, Cossiga became a member of the Italian Senate; on 12 July, he was elected President of the Senate. President of Italy In the 1985 presidential election, Cossiga was elected President of Italy with 752 votes out of 977. His candidacy was endorsed by Christian Democracy and also supported by the communists, socialists, social democrats, liberals and republicans. This was the first time an Italian presidential candidate had won the election on the first ballot, in which a two-thirds majority is necessary. He took office on 29 June 1985 on an interim basis after the resignation of outgoing President Sandro Pertini, but was not sworn in until a few days later, on 3 July. The Cossiga presidency was essentially divided into two phases, reflecting the changing attitudes of the head of state. In his first five years, Cossiga exercised his role in a traditional way, safeguarding the republican institutions under the Constitution, which makes the President of the Republic a kind of arbiter in relations between the powers of the state. "Pickaxe-wielder" president It was in his last two years as president that Cossiga began to express some unusual opinions regarding the Italian political system.
He argued that the Italian parties, especially the Christian Democrats and the Communists, had to take into account the deep changes brought about by the fall of the Berlin Wall and the end of the Cold War. In his view, the DC and the PCI were bound to be seriously affected by this change, but the parties and the institutions themselves refused to recognize it. Thus began a period of conflict and political controversy, often provocative and deliberately excessive, and with very strong media exposure. These statements, soon dubbed "esternazioni", or "pickaxe blows" (picconate), were considered by many to be inappropriate for a President and often beyond his constitutional powers; his mental health was also questioned, prompting Cossiga to declare, "I am the fake madman who speaks the truth." Cossiga suffered from bipolar disorder and depression in the last years of his life. Among the President's statements were allegations of excessive politicization of the judiciary, and criticism of the fact that young magistrates, just entering service, were immediately assigned to Sicilian prosecutors' offices to conduct mafia proceedings. For his changed attitude, Cossiga was criticized by almost every party, with the exception of the Italian Social Movement, which stood beside him in defense of the "picconate". He would, among other things, come to be considered one of the first to legitimize the MSI by recognizing it as a constitutional and democratic force. Revelation of Gladio and resignation Tension developed between Cossiga and Prime Minister Giulio Andreotti. This tension emerged when Andreotti revealed the existence of Gladio, a stay-behind organization with the official aim of countering a possible Soviet invasion through sabotage and guerrilla warfare behind enemy lines. Cossiga acknowledged his involvement in the establishment of the organization.
The Democratic Party of the Left (successor to the Communist Party) started an impeachment procedure (Presidents of Italy can be impeached only for high treason against the state or for an attempt to overthrow the Constitution). Although Cossiga threatened to prevent the procedure by dissolving Parliament, the impeachment request was ultimately dismissed. Cossiga resigned two months before the end of his term, on 25 April 1992. In his last speech as president he stated: "To young people I want to say to love the fatherland, to honor the nation, to serve the Republic, to believe in freedom and to believe in our country". After the presidency In accordance with the Italian Constitution, after his resignation from the office of President, Cossiga became a senator for life, joining his predecessors in the upper house of Parliament, with whom he also shared the title of President Emeritus of the Italian Republic. On 12 January 1997, Cossiga survived unscathed a railway accident near Piacenza, when a high-speed train on which he was traveling from Milan to Rome derailed. In February 1998, Cossiga founded the Democratic Union for the Republic (UDR), a Christian democratic political party, declaring it to be a centrist force. The UDR was a crucial component of the majority that supported the Massimo D'Alema cabinet in October 1998, after the fall of Romano Prodi's government, which had lost a confidence vote. Cossiga declared that his support for D'Alema was intended to end the conventional exclusion of former communist leaders from the premiership in Italy. In 1999 the UDR was dissolved and Cossiga returned to his activities as a senator, with responsibilities in the Military Affairs Commission. In May 2006, Cossiga gave his support to the formation of Prodi's second government.
In the same month, he introduced a bill that would allow the region of South Tyrol to hold a referendum in which the local electorate could decide whether to remain within the Republic of Italy, become independent, or become part of Austria again. On 27 November 2006, he resigned from his position as senator for life; his resignation was, however, rejected on 31 January 2007 by a vote of the Senate. In May 2008, Cossiga voted in favor of the government of Silvio Berlusconi. Death and legacy Cossiga died on 17 August 2010 from respiratory problems at the Agostino Gemelli Polyclinic in Rome. After his death, four letters written by Cossiga were sent to the four highest authorities of the state in office at the time of his death: President of the Republic Giorgio Napolitano, President of the Senate Renato Schifani, President of the Chamber of Deputies Gianfranco Fini and Prime Minister Silvio Berlusconi. The funeral took place in his hometown, Sassari, at the Church of San Giuseppe. Cossiga is buried in the public cemetery of Sassari, in the family tomb, not far from one of his predecessors as President of Italy, Antonio Segni. In 2020, Cossiga was depicted in the film Rose Island, which told the story of the Republic of Rose Island; he was played by Luca Della Bianca. Controversies In 2007, Cossiga sarcastically referred to the September 11 attacks of 2001 as a false flag: "all democratic circles in America and of Europe, especially those of the Italian centre-left, now know that the disastrous attack was planned and realized by the American CIA and Mossad with the help of the Zionist world, to place the blame on Arab countries and to persuade the Western powers to intervene in Iraq and Afghanistan". The previous year Cossiga had stated that he rejected conspiracy theories and that it "seems unlikely that September 11 was the result of an American plot."
In the statement, Cossiga was in fact mocking Italian media reports claiming that a video tape circulated by Osama bin Laden's al-Qaeda, containing threats against Silvio Berlusconi, had been "produced in the studios of Mediaset in Milan" and forwarded to the "Islamist Al-Jazeera television network". According to those reports, the purpose of the tape (which was actually an audio tape) was to raise "a wave of solidarity" with Berlusconi, who was at the time facing political difficulties. In 2008, Cossiga said that Mario Draghi was "a craven moneyman". Cossiga blamed the loss of Itavia Flight 870, a passenger jet that crashed in 1980 with the loss of all 81 people on board, on a missile fired from a French Navy aircraft. On 23 January 2013, Italy's top criminal court ruled that there was "abundantly" clear evidence that the flight had indeed been brought down by a missile. Honours and awards As President of the Republic, Cossiga was Head (and also Knight Grand Cross with Grand Cordon) of the Order of Merit of the Italian Republic (from 3 July 1985 to 28 April 1992), the Military Order of Italy, the Order of the Star of Italian Solidarity, the Order of Merit for Labour and the Order of Vittorio Veneto, and held the Grand Cross of Merit of the Italian Red Cross. He also received honours and awards from other countries.
References Notes Sources (on links between Cossiga, Licio Gelli and the Propaganda Due masonic lodge; Massera, part of Videla's junta in Argentina, is also named) Obituary – Fox News External links 1928 births 2010 deaths People from Sassari Italian Roman Catholics Christian Democracy (Italy) politicians Italian People's Party (1994) politicians Democratic Union for the Republic politicians Union of the Centre (2002) politicians Presidents of Italy Prime Ministers of Italy Italian Ministers of the Interior Presidents of the Italian Senate Deputies of Legislature III of Italy Deputies of Legislature IV of Italy Deputies of Legislature V of Italy Deputies of Legislature VI of Italy Deputies of Legislature VII of Italy Deputies of Legislature VIII of Italy Senators of Legislature IX of Italy Senators of Legislature XVI of Italy Italian life senators Politicians of Sardinia Amateur radio people University of Sassari alumni University of Sassari faculty Knights Grand Cross with Collar of the Order of Merit of the Italian Republic Recipients of the Military Order of Italy Recipients of the Order of Merit for Labour Grand Collars of the Order of Prince Henry Honorary Knights Grand Cross of the Order of St Michael and St George Honorary Knights Grand Cross of the Order of the Bath Knights Grand Cross of the Order of Orange-Nassau Bailiffs Grand Cross of the Order of St John Grand Crosses of the Order of the Dannebrog Grand Crosses of the Order of Christ (Portugal) Grand Cordons of the Order of Merit of the Republic of Poland Grand Croix of the Légion d'honneur Commanders with Star of the Order of Polonia Restituta Recipients of the Order of the Sun of Peru Recipients of the Order of the Liberator General San Martin Grand Crosses Special Class of the Order of Merit of the Federal Republic of Germany 9/11 conspiracy theorists Italian anti-communists Italian conspiracy theorists
https://en.wikipedia.org/wiki/Lockheed%20Martin%20F-35%20Lightning%20II
Lockheed Martin F-35 Lightning II
The Lockheed Martin F-35 Lightning II is an American family of single-seat, single-engine, all-weather stealth multirole combat aircraft that is intended to perform both air superiority and strike missions. It is also able to provide electronic warfare and intelligence, surveillance, and reconnaissance capabilities. Lockheed Martin is the prime F-35 contractor, with principal partners Northrop Grumman and BAE Systems. The aircraft has three main variants: the conventional takeoff and landing (CTOL) F-35A, the short take-off and vertical-landing (STOVL) F-35B, and the carrier-based (CV/CATOBAR) F-35C. The aircraft descends from the Lockheed Martin X-35, which in 2001 beat the Boeing X-32 to win the Joint Strike Fighter (JSF) program. Its development is principally funded by the United States, with additional funding from program partner countries from NATO and close U.S. allies, including the United Kingdom, Australia, Canada, Italy, Norway, Denmark, the Netherlands, and formerly Turkey. Several other countries have ordered, or are considering ordering, the aircraft. The program has drawn much scrutiny and criticism for its unprecedented size, complexity, ballooning costs, and much-delayed deliveries, with numerous technical flaws still being corrected. The acquisition strategy of concurrent production of the aircraft while it was still in development and testing led to expensive design changes and retrofits. The F-35B entered service with the U.S. Marine Corps in July 2015, followed by the U.S. Air Force F-35A in August 2016 and the U.S. Navy F-35C in February 2019. The F-35 was first used in combat in 2018 by the Israeli Air Force. The U.S. plans to buy 2,456 F-35s through 2044, which will represent the bulk of the crewed tactical airpower of the U.S. Air Force, Navy, and Marine Corps for several decades. The aircraft is projected to operate until 2070. 
Development Program origins The F-35 was the product of the Joint Strike Fighter (JSF) program, which was the merger of various combat aircraft programs from the 1980s and 1990s. One progenitor program was the Defense Advanced Research Projects Agency (DARPA) Advanced Short Take-Off/Vertical Landing (ASTOVL) which ran from 1983 to 1994; ASTOVL aimed to develop a Harrier Jump Jet replacement for the U.S. Marine Corps (USMC) and the U.K. Royal Navy. Under one of ASTOVL's classified programs, the Supersonic STOVL Fighter (SSF), Lockheed Skunk Works conducted research for a stealthy supersonic STOVL fighter intended for both U.S. Air Force (USAF) and USMC; a key technology explored was the shaft-driven lift fan (SDLF) system. Lockheed's concept was a single-engine canard delta aircraft weighing about empty. ASTOVL was rechristened as the Common Affordable Lightweight Fighter (CALF) in 1993 and involved Lockheed, McDonnell Douglas, and Boeing. In 1993, the Joint Advanced Strike Technology (JAST) program emerged following the cancellation of the USAF's Multi-Role Fighter (MRF) and U.S. Navy's (USN) Advanced Fighter-Attack (A/F-X) programs. MRF, a program for a relatively affordable F-16 replacement, was scaled back and delayed due to post–Cold War defense posture easing F-16 fleet usage and thus extending its service life as well as increasing budget pressure from the F-22 program. The A/F-X, initially known as the Advanced-Attack (A-X), began in 1991 as the USN's follow-on to the Advanced Tactical Aircraft (ATA) program for an A-6 replacement; the ATA's resulting A-12 Avenger II had been canceled due to technical problems and cost overruns in 1991. In the same year, the termination of the Naval Advanced Tactical Fighter (NATF), an offshoot of USAF's Advanced Tactical Fighter (ATF) program to replace the F-14, resulted in additional fighter capability being added to A-X, which was then renamed A/F-X. 
Amid increased budget pressure, the Department of Defense's (DoD) Bottom-Up Review (BUR) in September 1993 announced MRF's and A/F-X's cancellations, with applicable experience brought to the emerging JAST program. JAST was not meant to develop a new aircraft, but rather to develop requirements, maturing technologies, and demonstrating concepts for advanced strike warfare. As JAST progressed, the need for concept demonstrator aircraft by 1996 emerged, which would coincide with the full-scale flight demonstrator phase of ASTOVL/CALF. Because the ASTOVL/CALF concept appeared to align with the JAST charter, the two programs were eventually merged in 1994 under the JAST name, with the program now serving the USAF, USMC, and USN. JAST was subsequently renamed to Joint Strike Fighter (JSF) in 1995, with STOVL submissions by McDonnell Douglas, Northrop Grumman, Lockheed Martin, and Boeing. The JSF was expected to eventually replace large numbers of multi-role and strike fighters in the inventories of the US and its allies, including the Harrier, F-16, F/A-18, A-10, and F-117. International participation is a key aspect of the JSF program, starting with United Kingdom participation in the ASTOVL program. Many international partners requiring modernization of their air forces were interested in the JSF. The United Kingdom joined JAST/JSF as a founding member in 1995 and thus became the only Tier 1 partner of the JSF program; Italy, the Netherlands, Denmark, Norway, Canada, Australia, and Turkey joined the program during the Concept Demonstration Phase (CDP), with Italy and the Netherlands being Tier 2 partners and the rest Tier 3. Consequently, the aircraft was developed in cooperation with international partners and available for export. 
JSF competition Boeing and Lockheed Martin were selected in early 1997 for CDP, with their concept demonstrator aircraft designated X-32 and X-35 respectively; the McDonnell Douglas team was eliminated and Northrop Grumman and British Aerospace joined the Lockheed Martin team. Each firm would produce two prototype air vehicles to demonstrate conventional takeoff and landing (CTOL), carrier takeoff and landing (CV), and STOVL. Lockheed Martin's design would make use of the work on the SDLF system conducted under the ASTOVL/CALF program. The key aspect of the X-35 that enabled STOVL operation, the SDLF system consists of a lift fan in the forward center fuselage that is activated by engaging a clutch connecting a driveshaft to the engine's turbines, thus augmenting the thrust from the engine's swivel nozzle. Research from prior aircraft incorporating similar systems, such as the Convair Model 200, Rockwell XFV-12, and Yakovlev Yak-141, was also taken into consideration. By contrast, Boeing's X-32 employed a direct lift system, in which the augmented turbofan would be reconfigured for STOVL operation. Lockheed Martin's commonality strategy was to replace the STOVL variant's SDLF with a fuel tank and the aft swivel nozzle with a two-dimensional thrust vectoring nozzle for the CTOL variant. This would enable an identical aerodynamic configuration for the STOVL and CTOL variants, while the CV variant would have an enlarged wing in order to reduce landing speed for carrier recovery. Due to aerodynamic characteristics and carrier recovery requirements from the JAST merger, the design configuration settled on a conventional tail compared to the canard delta design from the ASTOVL/CALF; notably, the conventional tail configuration offers much lower risk for carrier recovery compared to the ASTOVL/CALF canard configuration, which was designed without carrier compatibility in mind.
This enabled greater commonality between all three variants, as the commonality goal was important at this design stage. Lockheed Martin's prototypes would consist of the X-35A for demonstrating CTOL before converting it to the X-35B for STOVL demonstration and the larger-winged X-35C for CV compatibility demonstration. The X-35A first flew on 24 October 2000 and conducted flight tests for subsonic and supersonic flying qualities, handling, range, and maneuver performance. After 28 flights, the aircraft was then converted into the X-35B for STOVL testing, with key changes including the addition of the SDLF, the three-bearing swivel module (3BSM), and roll-control ducts. The X-35B would successfully demonstrate the SDLF system by performing stable hover, vertical landing, and short takeoff in less than . The X-35C first flew on 16 December 2000 and conducted field landing carrier practice tests. On 26 October 2001, Lockheed Martin was declared the winner and was awarded the System Development and Demonstration (SDD) contract; Pratt & Whitney was separately awarded a development contract for the F135 engine for the JSF. The F-35 designation, which was out of sequence with standard DoD numbering, was allegedly determined on the spot by program manager Major General Mike Hough; this came as a surprise even to Lockheed Martin, which had expected the "F-24" designation for the JSF. Design and production As the JSF program moved into the System Development and Demonstration phase, the X-35 demonstrator design was modified to create the F-35 combat aircraft. The forward fuselage was lengthened by to make room for mission avionics, while the horizontal stabilizers were moved aft to retain balance and control. The diverterless supersonic inlet changed from a four-sided to a three-sided cowl shape and was moved aft. The fuselage section was fuller, the top surface raised by along the centerline to accommodate weapons bays. 
Following the designation of the X-35 prototypes, the three variants were designated F-35A (CTOL), F-35B (STOVL), and F-35C (CV). Prime contractor Lockheed Martin performs overall systems integration and final assembly and checkout (FACO), while Northrop Grumman and BAE Systems supply components for mission systems and airframe. Adding the systems of a fighter aircraft added weight. The F-35B gained the most, largely due to a 2003 decision to enlarge the weapons bays for commonality between variants; the total weight growth was reportedly up to , over 8%, causing all STOVL key performance parameter (KPP) thresholds to be missed. In December 2003, the STOVL Weight Attack Team (SWAT) was formed to reduce the weight increase; changes included more engine thrust, thinned airframe members, smaller weapons bays and vertical stabilizers, less thrust fed to the roll-post outlets, and redesigning the wing-mate joint, electrical elements, and the airframe immediately aft of the cockpit. Many changes from the SWAT effort were applied to all three variants for commonality. By September 2004, these efforts had reduced the F-35B's weight by over , while the F-35A and F-35C were reduced in weight by and respectively. The weight reduction work cost $6.2 billion and caused an 18-month delay. The first F-35A, designated AA-1, was rolled out in Fort Worth, Texas, on 19 February 2006 and first flew on 15 December 2006. In 2006, the F-35 was given the name "Lightning II" after the Lockheed P-38 Lightning of World War II. Some USAF pilots have nicknamed the aircraft "Panther" instead. The aircraft's software was developed as six releases, or Blocks, for SDD. The first two Blocks, 1A and 1B, readied the F-35 for initial pilot training and multi-level security. Block 2A improved the training capabilities, while 2B was the first combat-ready release planned for the USMC's Initial Operating Capability (IOC). 
Block 3i retains the capabilities of 2B while having new hardware and was planned for the USAF's IOC. The final release for SDD, Block 3F, would have full flight envelope and all baseline combat capabilities. Alongside software releases, each block also incorporates avionics hardware updates and air vehicle improvements from flight and structural testing. In what is known as "concurrency", some low rate initial production (LRIP) aircraft lots would be delivered in early Block configurations and eventually upgraded to Block 3F once development is complete. After 17,000 flight test hours, the final flight for the SDD phase was completed in April 2018. Like the F-22, the F-35 has been targeted by cyberattacks and technology theft efforts, as well as potential vulnerabilities in the integrity of the supply chain. Testing found several major problems: early F-35B airframes had premature cracking, the F-35C arrestor hook design was unreliable, fuel tanks were too vulnerable to lightning strikes, the helmet display had problems, and more. Software was repeatedly delayed due to its unprecedented scope and complexity. In 2009, the DoD Joint Estimate Team (JET) estimated that the program was 30 months behind the public schedule. In 2011, the program was "re-baselined"; that is, its cost and schedule goals were changed, pushing the IOC from the planned 2010 to July 2015. The decision to simultaneously test, fix defects, and begin production was criticized as inefficient; in 2014, Under Secretary of Defense for Acquisition Frank Kendall called it "acquisition malpractice". The three variants shared just 25% of their parts, far below the anticipated commonality of 70%. The program received considerable criticism for cost overruns and for the total projected lifetime cost, as well as quality management shortcomings by contractors. The JSF program was expected to cost about $200 billion for acquisition in base-year 2002 dollars when SDD was awarded in 2001. 
As early as 2005, the Government Accountability Office (GAO) had identified major program risks in cost and schedule. The costly delays strained the relationship between the Pentagon and contractors. By 2017, delays and cost overruns had pushed the F-35 program's expected acquisition costs to $406.5 billion, with the total lifetime cost (i.e., through 2070) rising to $1.5 trillion in then-year dollars, a figure that also includes operations and maintenance. The unit cost of LRIP lot 13 F-35A was $79.2 million. Delays in development and operational test and evaluation pushed full-rate production to 2021. Upgrades and further development The first combat-capable Block 2B configuration, which had basic air-to-air and strike capabilities, was declared ready by the USMC in July 2015. The Block 3F configuration began operational test and evaluation (OT&E) in December 2018, the completion of which will conclude SDD. The F-35 program is also conducting sustainment and upgrade development, with early LRIP aircraft gradually upgraded to the baseline Block 3F standard by 2021. The F-35 is expected to be continually upgraded over its lifetime. The first upgrade program, called Continuous Capability Development and Delivery (C2D2) began in 2019 and is currently planned to run to 2024. The near-term development priority of C2D2 is Block 4, which would integrate additional weapons, including those unique to international customers, refresh the avionics, improve ESM capabilities, and add Remotely Operated Video Enhanced Receiver (ROVER) support. C2D2 also places greater emphasis on agile software development to enable quicker releases.
In 2018, the Air Force Life Cycle Management Center (AFLCMC) awarded contracts to General Electric and Pratt & Whitney to develop more powerful and efficient adaptive cycle engines for potential application in the F-35, leveraging the research done under the Adaptive Engine Transition Program (AETP); in 2022, the F-35 Adaptive Engine Replacement (FAER) program was launched to integrate adaptive cycle engines into the aircraft by 2028. Defense contractors have offered upgrades to the F-35 outside of official program contracts. In 2013, Northrop Grumman disclosed its development of a directional infrared countermeasures suite, named Threat Nullification Defensive Resource (ThNDR). The countermeasure system would share the same space as the Distributed Aperture System (DAS) sensors and act as a laser missile jammer to protect against infrared-homing missiles. Israel wants more access to the core avionics in order to include its own equipment. Procurement and international participation The United States is the primary customer and financial backer, with planned procurement of 1,763 F-35As for the USAF, 353 F-35Bs and 67 F-35Cs for the USMC, and 273 F-35Cs for the USN. Additionally, the United Kingdom, Italy, the Netherlands, Turkey, Australia, Norway, Denmark and Canada have agreed to contribute US$4.375 billion towards development costs, with the United Kingdom contributing about 10% of the planned development costs as the sole Tier 1 partner. The initial plan was that the U.S. and eight major partner nations would acquire over 3,100 F-35s through 2035. The three tiers of international participation generally reflect financial stake in the program, the amount of technology transfer and subcontracts open for bid by national companies, and the order in which countries can obtain production aircraft. Alongside program partner countries, Israel and Singapore have joined as Security Cooperative Participants (SCP).
Sales to SCP and non-partner nations, including Belgium, Japan, and South Korea, are made through the Pentagon's Foreign Military Sales program. Turkey was removed from the F-35 program in July 2019 over security concerns. In December 2011, Japan announced its intention to purchase 42 F-35s to replace the F-4 Phantom II, with 38 to be assembled domestically and deliveries beginning in 2016. Due to delays in development and testing, many initial orders have been postponed. Italy reduced its order from 131 to 90 F-35s in 2012. Australia decided to buy the F/A-18F Super Hornet in 2006 and the EA-18G Growler in 2013 as interim measures. On 3 April 2012, the Auditor General of Canada published a report outlining problems with Canada's F-35 procurement; the report stated that the government knowingly understated the final cost of 65 F-35s by $10 billion. Following the 2015 federal election, the Canadian government under the Liberal Party decided not to proceed with a sole-sourced purchase and launched a competition to choose an aircraft. In January 2019, Singapore announced its plan to buy a small number of F-35s for an evaluation of capabilities and suitability before deciding on more to replace its F-16 fleet. In May 2019, Poland announced plans to buy 32 F-35As to replace its Soviet-era jets; the contract was signed in January 2020. In June 2021, the Swiss government decided to propose to Parliament to buy 36 F-35As for $5.4 billion. The Swiss anti-military group GSoA, supported by the Greens and Social Democrats, intends to contest the purchase through a people's initiative that would constitutionally prohibit the deal. In December 2021, Finland announced its decision to buy 64 F-35As.

Design

Overview

The F-35 is a family of single-engine, supersonic, stealth multirole fighters.
The second fifth generation fighter to enter US service and the first operational supersonic STOVL stealth fighter, the F-35 emphasizes low observables, advanced avionics and sensor fusion that enable a high level of situational awareness and long range lethality; the USAF considers the aircraft its primary strike fighter for conducting suppression of enemy air defense (SEAD) missions, owing to the advanced sensors and mission systems. The F-35 has a wing-tail configuration with two vertical stabilizers canted for stealth. Flight control surfaces include leading-edge flaps, flaperons, rudders, and all-moving horizontal tails (stabilators); leading edge root extensions also run forwards to the inlets. The relatively short 35-foot wingspan of the F-35A and F-35B is set by the requirement to fit inside USN amphibious assault ship parking areas and elevators; the F-35C's larger wing is more fuel efficient. The fixed diverterless supersonic inlets (DSI) use a bumped compression surface and forward-swept cowl to shed the boundary layer of the forebody away from the inlets, which form a Y-duct for the engine. Structurally, the F-35 drew upon lessons from the F-22; composites comprise 35% of airframe weight, with the majority being bismaleimide and composite epoxy materials as well as some carbon nanotube-reinforced epoxy in later production lots. The F-35 is considerably heavier than the lightweight fighters it replaces, with the lightest variant having an empty weight of ; much of the weight can be attributed to the internal weapons bays and the extensive avionics carried. While lacking the raw performance of the larger twin-engine F-22, the F-35 has kinematics competitive with fourth generation fighters such as the F-16 and F/A-18, especially with ordnance mounted because the F-35's internal weapons carriage eliminates parasitic drag from external stores. All variants have a top speed of Mach 1.6, attainable with full internal payload. 
The powerful F135 engine gives good subsonic acceleration and energy, with supersonic dash in afterburner. The large stabilators, leading edge extensions and flaps, and canted rudders provide excellent high alpha (angle-of-attack) characteristics, with a trimmed alpha of 50°. Relaxed stability and fly-by-wire controls provide excellent handling qualities and departure resistance. Having over double the F-16's internal fuel, the F-35 has a considerably greater combat radius, while stealth also enables a more efficient mission flight profile.

Sensors and avionics

The F-35's mission systems are among the most complex aspects of the aircraft. The avionics and sensor fusion are designed to enhance the pilot's situational awareness and command and control capabilities and facilitate network-centric warfare. Key sensors include the Northrop Grumman AN/APG-81 active electronically scanned array (AESA) radar, BAE Systems AN/ASQ-239 Barracuda electronic warfare system, Northrop Grumman/Raytheon AN/AAQ-37 Distributed Aperture System (DAS), Lockheed Martin AN/AAQ-40 Electro-Optical Targeting System (EOTS) and Northrop Grumman AN/ASQ-242 Communications, Navigation, and Identification (CNI) suite. The F-35's sensors were designed to intercommunicate, providing a cohesive image of the local battlespace, with each sensor available for use in combination with the others; for example, the APG-81 radar also acts as a part of the electronic warfare system. Much of the F-35's software was developed in the C and C++ programming languages, while Ada83 code from the F-22 was also used; the Block 3F software has 8.6 million lines of code. The Green Hills Software Integrity DO-178B real-time operating system (RTOS) runs on the integrated core processors (ICPs); data networking includes the IEEE 1394b and Fibre Channel buses.
To enable fleet software upgrades for the software-defined radio systems and greater upgrade flexibility and affordability, the avionics leverage commercial off-the-shelf (COTS) components when practical. The mission systems software, particularly for sensor fusion, was one of the program's most difficult parts and responsible for substantial program delays. The APG-81 radar uses electronic scanning for rapid beam agility and incorporates passive and active air-to-air modes, strike modes, and synthetic aperture radar (SAR) capability, with multiple target track-while-scan at ranges in excess of . The antenna is tilted backwards for stealth. Complementing the radar is the AAQ-37 DAS, which consists of six infrared sensors that provide all-aspect missile launch warning and target tracking; the DAS acts as a situational awareness infrared search-and-track (SAIRST) and gives the pilot spherical infrared and night-vision imagery on the helmet visor. The ASQ-239 Barracuda electronic warfare system has ten radio frequency antennas embedded into the edges of the wing and tail for all-aspect radar warning receiver (RWR). It also provides sensor fusion of radio frequency and infrared tracking functions, geolocation threat targeting, and multispectral image countermeasures for self-defense against missiles. The electronic warfare system is capable of detecting and jamming hostile radars. The AAQ-40 EOTS is mounted internally behind a faceted low-observable window under the nose and performs laser targeting, forward-looking infrared (FLIR), and long range IRST functions. The ASQ-242 CNI suite uses a half dozen different physical links, including the Multifunction Advanced Data Link (MADL), for covert CNI functions. Through sensor fusion, information from radio frequency receivers and infrared sensors are combined to form a single tactical picture for the pilot. 
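The sensor-fusion concept described above — multiple sensors contributing reports that are correlated into a single tactical picture — can be illustrated with a minimal, hypothetical track-correlation sketch. Nothing here reflects the F-35's actual algorithms; the `Track` type, the greedy angular-gating approach, and the 2° threshold are all invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Track:
    sensor: str      # e.g. "radar" or "ir" (illustrative labels)
    bearing: float   # degrees
    elevation: float # degrees

def fuse_tracks(tracks, max_sep=2.0):
    """Greedy track-level fusion: reports from different sensors that
    point within max_sep degrees of each other are treated as the same
    object, yielding one fused track list (the 'single tactical picture')."""
    fused = []
    for t in tracks:
        for f in fused:
            if (abs(f["bearing"] - t.bearing) <= max_sep
                    and abs(f["elevation"] - t.elevation) <= max_sep):
                f["sensors"].add(t.sensor)  # correlate with existing object
                break
        else:
            fused.append({"bearing": t.bearing,
                          "elevation": t.elevation,
                          "sensors": {t.sensor}})
    return fused

reports = [Track("radar", 45.0, 10.0), Track("ir", 44.5, 10.3),
           Track("ir", 120.0, 5.0)]
picture = fuse_tracks(reports)
# Two fused objects: one seen by both sensors, one by the IR sensor alone.
```

Real multi-sensor fusion uses statistical data association and track filtering rather than a fixed angular gate; the sketch only conveys the idea of merging heterogeneous reports into one picture.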
The all-aspect target direction and identification can be shared via MADL to other platforms without compromising low observability, while Link 16 is present for communication with legacy systems. The F-35 was designed from the outset to incorporate improved processors, sensors, and software enhancements over its lifespan. Technology Refresh 3, which includes a new core processor and a new cockpit display, is planned for Lot 15 aircraft. Lockheed Martin has offered the Advanced EOTS for the Block 4 configuration; the improved sensor fits into the same area as the baseline EOTS with minimal changes. In June 2018, Lockheed Martin picked Raytheon to supply an improved DAS. The USAF has studied the potential for the F-35 to orchestrate attacks by unmanned combat aerial vehicles (UCAVs) via its sensors and communications equipment.

Stealth and signatures

Stealth is a key aspect of the F-35's design, and its radar cross-section (RCS) is minimized through careful shaping of the airframe and the use of radar-absorbent materials (RAM); visible measures to reduce RCS include alignment of edges, serration of skin panels, and the masking of the engine face and turbine. Additionally, the F-35's diverterless supersonic inlet (DSI) uses a compression bump and forward-swept cowl rather than a splitter gap or bleed system to divert the boundary layer away from the inlet duct, eliminating the diverter cavity and further reducing radar signature. The RCS of the F-35 has been characterized as lower than that of a metal golf ball at certain frequencies and angles; in some conditions, the F-35 compares favorably to the F-22 in stealth. For maintainability, the F-35's stealth design took lessons learned from prior stealth aircraft such as the F-22; the F-35's radar-absorbent fibermat skin is more durable and requires less maintenance than older topcoats. The aircraft also has reduced infrared and visual signatures as well as strict controls of radio frequency emitters to prevent their detection.
The F-35's stealth design is primarily focused on high-frequency X-band wavelengths; low-frequency radars can spot stealthy aircraft due to Rayleigh scattering, but such radars are also conspicuous, susceptible to clutter, and lack precision. To disguise its RCS, the aircraft can mount four Luneburg lens reflectors. Noise from the F-35 caused concerns in residential areas near potential bases for the aircraft, and residents near two such bases—Luke Air Force Base, Arizona, and Eglin Air Force Base (AFB), Florida—requested environmental impact studies in 2008 and 2009 respectively. Although the noise level in decibels was comparable to that of prior fighters such as the F-16, the sound power of the F-35 is stronger, particularly at lower frequencies. Subsequent surveys and studies have indicated that the noise of the F-35 was not perceptibly different from that of the F-16 and F/A-18E/F, though the greater low-frequency noise was noticeable for some observers.

Cockpit

The glass cockpit was designed to give the pilot good situational awareness. The main display is a 20- by 8-inch (50 by 20 cm) panoramic touchscreen, which shows flight instruments, stores management, CNI information, and integrated caution and warnings; the pilot can customize the arrangement of the information. Below the main display is a smaller stand-by display. The cockpit has a speech-recognition system developed by Adacel. The F-35 does not have a head-up display; instead, flight and combat information is displayed on the visor of the pilot's helmet in a helmet-mounted display system (HMDS). The one-piece tinted canopy is hinged at the front and has an internal frame for structural strength. The Martin-Baker US16E ejection seat is launched by a twin-catapult system housed on side rails. There is a right-hand side stick and throttle in a hands-on throttle-and-stick (HOTAS) arrangement.
For life support, an onboard oxygen-generation system (OBOGS) is fitted and powered by the Integrated Power Package (IPP), with an auxiliary oxygen bottle and backup oxygen system for emergencies. The Vision Systems International helmet display is a key piece of the F-35's human-machine interface. Instead of the head-up display mounted atop the dashboard of earlier fighters, the HMDS puts flight and combat information on the helmet visor, allowing the pilot to see it no matter which way he or she is facing. Infrared and night vision imagery from the Distributed Aperture System can be displayed directly on the HMDS, enabling the pilot to "see through" the aircraft. The HMDS allows an F-35 pilot to fire missiles at targets even when the nose of the aircraft is pointing elsewhere by cuing missile seekers at high angles off-boresight. Each helmet costs $400,000. The HMDS weighs more than traditional helmets, and there is concern that it can endanger lightweight pilots during ejection. Due to the HMDS's vibration, jitter, night-vision and sensor display problems during development, Lockheed Martin and Elbit issued a draft specification in 2011 for an alternative HMDS based on the AN/AVS-9 night vision goggles as a backup, with BAE Systems chosen later that year. A cockpit redesign would be needed to adopt an alternative HMDS. Following progress on the baseline helmet, development of the alternative HMDS was halted in October 2013. In 2016, the Gen 3 helmet with an improved night vision camera, new liquid crystal displays, automated alignment and software enhancements was introduced with LRIP lot 7.

Armament

To preserve its stealth shaping, the F-35 has two internal weapons bays with four weapons stations. The two outboard weapon stations each can carry ordnance up to , or for F-35B, while the two inboard stations carry air-to-air missiles.
Air-to-surface weapons for the outboard station include the Joint Direct Attack Munition (JDAM), the Paveway series of bombs, the Joint Standoff Weapon (JSOW), and cluster munitions (Wind Corrected Munitions Dispenser). The station can also carry multiple smaller munitions such as the GBU-39 Small Diameter Bomb (SDB), GBU-53/B SDB II, and SPEAR 3 anti-tank missiles; up to four SDBs can be carried per station for the F-35A and F-35C, and three for the F-35B. The inboard station can carry the AIM-120 AMRAAM. Two compartments behind the weapons bays contain flares, chaff, and towed decoys. The aircraft can use six external weapons stations for missions that do not require stealth. The wingtip pylons each can carry an AIM-9X or AIM-132 ASRAAM and are canted outwards to reduce their radar cross-section. Additionally, each wing has an inboard station and a middle station, or for F-35B. The external wing stations can carry large air-to-surface weapons that would not fit inside the weapons bays, such as the AGM-158 Joint Air to Surface Stand-off Missile (JASSM) cruise missile. An air-to-air missile load of eight AIM-120s and two AIM-9s is possible using internal and external weapons stations; a configuration of six bombs, two AIM-120s and two AIM-9s can also be arranged. The F-35A is armed with a 25 mm GAU-22/A rotary cannon mounted internally near the left wing root with 182 rounds carried; the gun is more effective against ground targets than the 20 mm cannon carried by other USAF fighters. The F-35B and F-35C have no internal gun and instead can use a Terma A/S multi-mission pod (MMP) carrying the GAU-22/A and 220 rounds; the pod is mounted on the centerline of the aircraft and shaped to reduce its radar cross-section. In lieu of the gun, the pod can also be used for different equipment and purposes, such as electronic warfare, aerial reconnaissance, or a rear-facing tactical radar.
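The station arithmetic behind the loadouts above can be tallied in a short sketch. The station model below is a simplification built only from the counts given in the text (four internal stations, two wingtip pylons, four wing pylons), not an authoritative stores-clearance table; the `amraam_per_outboard` parameter is an invented knob that mimics a Sidekick-style dual rack:

```python
# Station counts per the text: two internal inboard stations (AIM-120),
# two internal outboard stations, four wing pylons, two wingtip stations
# (AIM-9X / ASRAAM class). Simplified for illustration only.
STATIONS = {
    "internal_inboard": 2,
    "internal_outboard": 2,
    "wing_pylon": 4,
    "wingtip": 2,
}

def max_aam_loadout(amraam_per_outboard=1):
    """Tally a maximum air-to-air load: AIM-120s on the internal inboard,
    internal outboard, and wing-pylon stations; AIM-9s on the wingtips."""
    aim120 = (STATIONS["internal_inboard"]
              + STATIONS["internal_outboard"] * amraam_per_outboard
              + STATIONS["wing_pylon"])
    aim9 = STATIONS["wingtip"]
    return aim120, aim9

baseline = max_aam_loadout()                         # (8, 2)
with_dual_racks = max_aam_loadout(amraam_per_outboard=2)  # (10, 2)
```

The baseline result matches the "eight AIM-120s and two AIM-9s" load in the text, and doubling the outboard racks gives six internal missiles (2 + 4), consistent with the Sidekick upgrade described below.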
Lockheed Martin is developing a weapon rack called Sidekick that would enable the internal outboard station to carry two AIM-120s, thus increasing the internal air-to-air payload to six missiles; it is currently offered for Block 4. Block 4 will also have a rearranged hydraulic line and bracket to allow the F-35B to carry four SDBs per internal outboard station; integration of the MBDA Meteor is also planned. The USAF and USN are planning to integrate the AGM-88G AARGM-ER internally in the F-35A and F-35C. Norway and Australia are funding an adaptation of the Naval Strike Missile (NSM) for the F-35; designated Joint Strike Missile (JSM), two missiles can be carried internally with an additional four externally. Nuclear weapons delivery via internal carriage of the B61 nuclear bomb is planned for Block 4B in 2024. Both hypersonic missiles and directed-energy weapons such as solid-state lasers are currently being considered as future upgrades. Lockheed Martin is studying the integration of a fiber laser that uses spectral beam combining of multiple individual laser modules into a single high-power beam, which can be scaled to various power levels. The USAF plans for the F-35A to take up the close air support (CAS) mission in contested environments; amid criticism that it is not as well suited as a dedicated attack platform, USAF chief of staff Mark Welsh placed a focus on weapons for CAS sorties, including guided rockets, fragmentation rockets that shatter into individual projectiles before impact, and more compact ammunition for higher-capacity gun pods. Fragmentation rocket warheads create greater effects than cannon shells, as each rocket creates a "thousand-round burst", delivering more projectiles than a strafing run.

Engine

The single-engine aircraft is powered by the Pratt & Whitney F135 low-bypass augmented turbofan with rated thrust of .
Derived from the Pratt & Whitney F119 used by the F-22, the F135 has a larger fan and higher bypass ratio to increase subsonic fuel efficiency, and unlike the F119, is not optimized for supercruise. The engine contributes to the F-35's stealth by having a low-observable augmenter, or afterburner, that incorporates fuel injectors into thick curved vanes; these vanes are covered by ceramic radar-absorbent materials and mask the turbine. The stealthy augmenter had problems with pressure pulsations, or "screech", at low altitude and high speed early in its development. The low-observable axisymmetric nozzle consists of 15 partially overlapping flaps that create a sawtooth pattern at the trailing edge, which reduces radar signature and creates shed vortices that reduce the infrared signature of the exhaust plume. Due to the engine's large dimensions, the USN had to modify its underway replenishment system to facilitate at-sea logistics support. The F-35's Integrated Power Package (IPP) performs power and thermal management and integrates environment control, auxiliary power unit, engine starting, and other functions into a single system. The F135-PW-600 variant for the F-35B incorporates the SDLF to allow STOVL operations. Designed by Lockheed Martin and developed by Rolls-Royce, the SDLF, also known as the Rolls-Royce LiftSystem, consists of the lift fan, drive shaft, two roll posts, and a "three-bearing swivel module" (3BSM). The thrust-vectoring 3BSM nozzle allows the main engine exhaust to be deflected downward at the tail of the aircraft and is moved by a "fueldraulic" actuator that uses pressurized fuel as the working fluid. Unlike the Harrier's Pegasus engine, which entirely uses direct engine thrust for lift, the F-35B's system augments the swivel nozzle's thrust with the lift fan; the fan is powered by the low-pressure turbine through a drive shaft when engaged with a clutch, and is placed near the front of the aircraft to provide a counterbalancing thrust.
Roll control during slow flight is achieved by diverting unheated engine bypass air through wing-mounted thrust nozzles called roll posts. An alternative engine, the General Electric/Rolls-Royce F136, was being developed in the 2000s; originally, F-35 engines from Lot 6 onward were to be competitively tendered. Using technology from the General Electric YF120, the F136 was claimed to have a greater temperature margin than the F135. The F136 was canceled in December 2011 due to lack of funding. The F-35 is expected to receive propulsion upgrades over its lifecycle in order to adapt to emerging threats and enable additional capabilities. In 2016, the Adaptive Engine Transition Program (AETP) was launched to develop and test adaptive cycle engines, with one major potential application being the re-engining of the F-35; in 2018, both GE and P&W were awarded contracts to develop thrust class demonstrators, with the designations XA100 and XA101 respectively. In addition to potential re-engining, P&W also plans to improve the baseline F135; in 2017, P&W announced the F135 Growth Option 1.0 and 2.0: Growth Option 1.0 was a drop-in power module upgrade that offered 6–10% thrust improvement and 5–6% fuel burn reduction, while Growth Option 2.0 would be the adaptive cycle XA101. In 2020, P&W shifted its F135 upgrade plan from the Growth Options to a series of Engine Enhancement Packages along with some additional capabilities, while the XA101 became a separate clean-sheet design. The capability packages are planned to be incorporated in two-year increments starting in the mid-2020s.

Maintenance and logistics

The F-35 is designed to require less maintenance than earlier stealth aircraft. Some 95% of all field-replaceable parts are "one deep"—that is, nothing else need be removed to reach the desired part; for instance, the ejection seat can be replaced without removing the canopy.
The F-35 has a fibermat radar-absorbent material (RAM) baked into the skin, which is more durable, easier to work with, and faster to cure than older RAM coatings; similar coatings are being considered for application on older stealth aircraft such as the F-22. Skin corrosion on the F-22 led the F-35's designers to use a less galvanic-corrosion-inducing skin gap filler, fewer gaps in the airframe skin needing filler, and better drainage. The flight control system uses electro-hydrostatic actuators rather than traditional hydraulic systems; these controls can be powered by lithium-ion batteries in case of emergency. Commonality between the different variants allowed the USMC to create its first aircraft maintenance Field Training Detachment to apply the USAF's lessons to its F-35 operations. The F-35 was intended to be supported by a computerized maintenance management system named the Autonomic Logistics Information System (ALIS). In concept, any aircraft could be serviced at any F-35 maintenance facility, and all parts would be globally tracked and shared as needed. Due to numerous problems, such as unreliable diagnoses, excessive connectivity requirements, and security vulnerabilities, program officials plan to replace ALIS with the cloud-based Operational Data Integrated Network (ODIN) by 2022. ODIN base kits (OBKs) are new computer hardware that replace ALIS's Standard Operating Unit unclassified (SOU-U) server hardware. Beginning in September 2020, OBKs ran the ALIS software as well as ODIN software, first at Marine Corps Air Station (MCAS) Yuma, Arizona, then at Naval Air Station Lemoore, California, in support of Strike Fighter Squadron (VFA) 125 on 16 July 2021, and then at Nellis Air Force Base, Nevada, in support of the 422nd Test and Evaluation Squadron (TES) on 6 August 2021.
In 2022, OBKs at over a dozen more installation sites will replace the ALIS SOU-U servers; the new hardware can run the legacy ALIS software as well as its replacement, ODIN. So far, performance on the OBK has been double that of the ALIS hardware.

Operational history

Testing

The first F-35A, AA-1, conducted its engine run in September 2006 and first flew on 15 December 2006. Unlike all subsequent aircraft, AA-1 did not have the weight optimization from SWAT; consequently, it mainly tested subsystems common to subsequent aircraft, such as the propulsion, electrical system, and cockpit displays. This aircraft was retired from flight testing in December 2009 and was used for live-fire testing at NAS China Lake. The first F-35B, BF-1, flew on 11 June 2008, while the first weight-optimized F-35A and F-35C, AF-1 and CF-1, flew on 14 November 2009 and 6 June 2010 respectively. The F-35B's first hover was on 17 March 2010, followed by its first vertical landing the next day. The F-35 Integrated Test Force (ITF) consisted of 18 aircraft at Edwards Air Force Base and Naval Air Station Patuxent River. Nine aircraft at Edwards—five F-35As, three F-35Bs, and one F-35C—performed flight sciences testing, such as F-35A envelope expansion, flight loads, and stores separation, as well as mission systems testing. The other nine aircraft at Patuxent River—five F-35Bs and four F-35Cs—were responsible for F-35B and C envelope expansion and STOVL and CV suitability testing. Additional carrier suitability testing was conducted at the Naval Air Warfare Center Aircraft Division at Lakehurst, New Jersey. Two non-flying aircraft of each variant were used to test static loads and fatigue. For testing avionics and mission systems, a modified Boeing 737-300 with a duplication of the cockpit, the Lockheed Martin CATBird, has been used. Field testing of the F-35's sensors was conducted during Exercise Northern Edge 2009 and 2011, serving as significant risk-reduction steps.
Flight tests revealed several serious deficiencies that required costly redesigns, caused delays, and resulted in several fleet-wide groundings. In 2011, the F-35C failed to catch the arresting wire in all eight landing tests; a redesigned tail hook was delivered two years later. By June 2009, many of the initial flight test targets had been accomplished but the program was behind schedule. Software and mission systems were among the biggest sources of delays for the program, with sensor fusion proving especially challenging. In fatigue testing, the F-35B suffered several premature cracks, requiring a redesign of the structure. A third non-flying F-35B is currently planned to test the redesigned structure. The F-35B and C also had problems with the horizontal tails suffering heat damage from prolonged afterburner use. Early flight control laws had problems with "wing drop" and also made the airplane sluggish, with high angles-of-attack tests in 2015 against an F-16 showing a lack of energy. At-sea testing of the F-35B was first conducted aboard . In October 2011, two F-35Bs conducted three weeks of initial sea trials, called Development Test I. The second F-35B sea trials, Development Test II, began in August 2013, with tests including nighttime operations; two aircraft completed 19 nighttime vertical landings using DAS imagery. The first operational testing involving six F-35Bs was done on the Wasp in May 2015. The final Development Test III on involving operations in high sea states was completed in late 2016. A Royal Navy F-35 conducted the first "rolling" landing on board in October 2018. After the redesigned tail hook arrived, the F-35C's carrier-based Development Test I began in November 2014 aboard and focused on basic day carrier operations and establishing launch and recovery handling procedures. Development Test II, which focused on night operations, weapons loading, and full power launches, took place in October 2015. 
The final Development Test III was completed in August 2016, and included tests of asymmetric loads and certifying systems for landing qualifications and interoperability. Operational testing of the F-35C began in 2018. The F-35's reliability and availability have fallen short of requirements, especially in the early years of testing. The ALIS maintenance and logistics system was plagued by excessive connectivity requirements and faulty diagnoses. In late 2017, the GAO reported that the time needed to repair an F-35 part averaged 172 days, which was "twice the program's objective", and that a shortage of spare parts was degrading readiness. In 2019, while individual F-35 units achieved mission-capable rates of over the target of 80% for short periods during deployed operations, fleet-wide rates remained below target. The fleet availability goal of 65% was also not met, although the trend shows improvement. Gun accuracy of the F-35A remains unacceptable. As of 2020, the number of the program's most serious issues had been halved. Operational test and evaluation (OT&E) with Block 3F, the final configuration for SDD, began in December 2018.

United States

The F-35A and F-35B were cleared for basic flight training in early 2012. However, lack of system maturity at the time led to concerns over safety as well as concerns by the Director of Operational Test & Evaluation (DOT&E) over electronic warfare testing, budget, and concurrency for the Operational Test and Evaluation master plan. Nevertheless, on 10 September 2012, the USAF began an operational utility evaluation (OUE) of the F-35A, including logistical support, maintenance, personnel training, and pilot execution. OUE flights began on 26 October and were completed on 14 November after 24 flights, each pilot having completed six flights. On 16 November 2012, the USMC received the first F-35B at MCAS Yuma, although Marine pilots had several flight restrictions.
During the Low Rate Initial Production (LRIP) phase, the three U.S. military services jointly developed tactics and procedures using flight simulators, testing effectiveness, discovering problems and refining design. In January 2013, training began at Eglin AFB with capacity for 100 pilots and 2,100 maintainers at once. On 8 January 2015, RAF Lakenheath in the UK was chosen as the first base in Europe to station two USAF F-35 squadrons, with 48 aircraft adding to the 48th Fighter Wing's existing F-15C and F-15E squadrons. The USMC declared Initial Operational Capability (IOC) for the F-35B in the Block 2B configuration on 31 July 2015 after operational trials. However, limitations remained in night operations, communications, software and weapons carriage capabilities. USMC F-35Bs participated in their first Red Flag exercise in July 2016 with 67 sorties conducted. USAF F-35A in the Block 3i configuration achieved IOC with the USAF's 34th Fighter Squadron at Hill Air Force Base, Utah on 2 August 2016. The USN achieved operational status with the F-35C in Block 3F on 28 February 2019. USAF F-35As conducted their first Red Flag exercise in 2017; system maturity had improved and the aircraft scored a kill ratio of 15:1 against an F-16 aggressor squadron in a high-threat environment. The F-35's operating cost is higher than some older fighters. In fiscal year 2018, the F-35A's cost per flight hour (CPFH) was $44,000, a number that was reduced to $35,000 in 2019. For comparison, in 2015 the CPFH of the A-10 was $17,716; the F-15C, $41,921; and the F-16C, $22,514. Lockheed Martin hopes to reduce it to $25,000 by 2025 through performance-based logistics and other measures. The USMC plans to disperse its F-35Bs among forward-deployed bases to enhance survivability while remaining close to a battlespace, similar to RAF Harrier deployment in the Cold War, which relied on the use of off-base locations that offered short runways, shelter, and concealment. 
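The cost-per-flight-hour (CPFH) comparison above is simple arithmetic and can be checked with a short sketch; the figures are taken from the text, while the helper function and its rounding convention are illustrative additions:

```python
# Cost per flight hour in US dollars, as reported in the text.
cpfh = {
    "F-35A (FY2018)": 44_000,
    "F-35A (2019)":   35_000,
    "A-10 (2015)":    17_716,
    "F-15C (2015)":   41_921,
    "F-16C (2015)":   22_514,
}

TARGET = 25_000  # Lockheed Martin's stated goal for 2025

def pct_cut_needed(current, goal):
    """Percentage reduction required to move from `current` to `goal`."""
    return round((current - goal) / current * 100, 1)

cut = pct_cut_needed(cpfh["F-35A (2019)"], TARGET)
# Going from $35,000 to $25,000 per hour is roughly a further 28.6% cut.
```

The same helper shows the 2018-to-2019 improvement was about a 20.5% reduction, so the remaining gap to the 2025 target is larger than the progress already made.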
Known as distributed STOVL operations (DSO), F-35Bs would operate from temporary bases in allied territory within range of hostile missiles and move between temporary locations inside the enemy's 24- to 48-hour targeting cycle; this strategy accounts for the F-35B's short range, the shortest of the three variants, with mobile forward arming and refueling points (M-Farps) accommodating KC-130 and MV-22 Osprey aircraft to rearm and refuel the jets, as well as littoral areas for sea links of mobile distribution sites. M-Farps can be based on small airfields, multi-lane roads, or damaged main bases, while F-35Bs return to rear-area friendly bases or ships for scheduled maintenance. Helicopter-portable metal planking is needed to protect unprepared roads from the F-35B's exhaust; the USMC is studying lighter heat-resistant options. The first U.S. combat employment began in July 2018 with USMC F-35Bs from the amphibious assault ship , with the first combat strike on 27 September 2018 against a Taliban target in Afghanistan. This was followed by a USAF deployment to Al Dhafra Air Base, UAE, on 15 April 2019. On 27 April 2019, USAF F-35As were first used in combat in an airstrike on an Islamic State tunnel network in northern Iraq. On 2 August 2021, the F-35C embarked on its maiden deployment on board the USS Carl Vinson, alongside the debut deployment of the CMV-22 Osprey.

United Kingdom

The United Kingdom's Royal Air Force and Royal Navy both operate the F-35B, known simply as the Lightning in British service; it has replaced the Harrier GR9, which was retired in 2010, and the Tornado GR4, which was retired in 2019. The F-35 is to be Britain's primary strike aircraft for the next three decades. One of the Royal Navy's requirements for the F-35B was a Shipborne Rolling Vertical Landing (SRVL) mode to increase maximum landing weight by using wing lift during landing.
In July 2013, Chief of the Air Staff, Air Chief Marshal Sir Stephen Dalton announced that No. 617 (The Dambusters) Squadron would be the RAF's first operational F-35 squadron. The second operational squadron will be the Fleet Air Arm's 809 Naval Air Squadron, which will stand up in April 2023 or later. No. 17 (Reserve) Test and Evaluation Squadron (TES) stood up on 12 April 2013 as the Operational Evaluation Unit for the Lightning, becoming the first British squadron to operate the type. By June 2013, the RAF had received three F-35s of the 48 on order, all initially based at Eglin Air Force Base. In June 2015, the F-35B undertook its first launches from a ski-jump at NAS Patuxent River. When operating at sea, British F-35Bs use ski-jumps fitted to the flight decks of aircraft carriers HMS Queen Elizabeth (R08) and HMS Prince of Wales (R09). The Italian Navy will use the same process. British F-35Bs are not intended to receive the Brimstone 2 missile. On 5 July 2017, it was announced that the second UK-based RAF squadron would be No. 207 Squadron, which reformed on 1 August 2019 as the Lightning Operational Conversion Unit. No. 617 Squadron reformed on 18 April 2018 during a ceremony in Washington, D.C., US, becoming the first RAF front-line squadron to operate the type, receiving its first four F-35Bs on 6 June and flying them from MCAS Beaufort to RAF Marham. Both No. 617 Squadron and its F-35s were declared combat ready on 10 January 2019. In April 2019, No. 617 Squadron deployed to RAF Akrotiri, Cyprus, the type's first overseas deployment. On 25 June 2019, the first combat use of an RAF F-35B was reportedly undertaken as armed reconnaissance flights searching for Islamic State targets in Iraq and Syria. In October 2019, the Dambusters and No. 17 TES F-35s were embarked on HMS Queen Elizabeth for the first time. No. 617 Squadron departed RAF Marham on 22 January 2020 for their first Exercise Red Flag with the Lightning. 
Australia Australia's first F-35, designated A35-001, was manufactured in 2014, with flight training provided through the international Pilot Training Centre (PTC) at Luke Air Force Base in Arizona. The first two F-35s were unveiled to the Australian public on 3 March 2017 at the Avalon Airshow. By 2021, the Royal Australian Air Force had accepted 26 F-35A aircraft, with nine in the US and 17 operating with No. 3 Squadron and No. 2 Operational Conversion Unit at RAAF Base Williamtown. With 41 trained RAAF pilots and 225 trained technicians for maintenance, the fleet was declared ready to deploy on operations. It is expected that Australia will receive all 72 of the F-35s by 2023. Israel The Israeli Air Force (IAF) declared the F-35 operationally capable on 6 December 2017. According to Kuwaiti newspaper Al Jarida, in July 2018, a test mission of at least three IAF F-35s flew to Iran's capital Tehran and back from Tel Aviv. While publicly unconfirmed, regional leaders acted on the report; Iran's supreme leader Ali Khamenei reportedly fired the air force chief and commander of Iran's Revolutionary Guard Corps over the mission. On 22 May 2018, IAF chief Amikam Norkin said that the service had employed their F-35Is in two attacks on two battle fronts, marking the first combat operation of an F-35 by any country. Norkin said it had been flown "all over the Middle East", and showed photos of an F-35I flying over Beirut in daylight. In July 2019, Israel expanded its strikes against Iranian missile shipments; IAF F-35Is allegedly struck Iranian targets in Iraq twice. In November 2020, the IAF announced the delivery of an F-35I Testbed aircraft amongst a delivery of four aircraft received in August. This example will be used to test and integrate Israeli-produced weapons and electronic systems on future F-35s received. This is the only example of a testbed F-35 delivered to an air force outside of the United States. 
On 11 May 2021, eight IAF F-35Is took part in an attack on 150 terrorist targets in Hamas' rocket array, including 50-70 launch pits in the northern Gaza Strip, as part of Operation Guardian of the Walls. Italy Italy's F-35As were declared to have reached initial operational capability (IOC) on 30 November 2018. At the time, Italy had taken delivery of 10 F-35As and one F-35B; two F-35As and the F-35B were stationed in the U.S. for training, while the remaining eight F-35As were stationed at Amendola. Norway On 6 November 2019, Norway declared initial operational capability (IOC) for its fleet of 15 F-35As out of a planned 52 F-35As. On 6 January 2022, Norway's F-35As replaced its F-16s for the NATO quick reaction alert mission in the high north. Netherlands On 27 December 2021, the Netherlands declared initial operational capability (IOC) for its fleet of 24 F-35As that it has received to date from its order for 46 F-35As. Variants The F-35 was designed with three initial variants: the F-35A, a CTOL land-based version; the F-35B, a STOVL version capable of use either on land or on aircraft carriers; and the F-35C, a CATOBAR carrier-based version. Since then, there has been work on the design of nationally specific versions for Israel and Canada, as well as initial concept design work for an updated version of the F-35A, which would become the F-35D. F-35A The F-35A is the conventional takeoff and landing (CTOL) variant intended for the USAF and other air forces. It is the smallest, lightest version and capable of 9 g, the highest of all variants. Although the F-35A currently conducts aerial refueling via the boom and receptacle method, the aircraft can be modified for probe-and-drogue refueling if needed by the customer. A drag chute pod can be installed on the F-35A, with the Royal Norwegian Air Force being the first operator to adopt it. F-35B The F-35B is the short takeoff and vertical landing (STOVL) variant of the aircraft. 
Similar in size to the A variant, the B sacrifices about a third of the A variant's fuel volume to accommodate the shaft-driven lift fan (SDLF). This variant is limited to 7 g. Unlike other variants, the F-35B has no landing hook. The "STOVL/HOOK" control instead engages conversion between normal and vertical flight. The F-35B can also perform vertical and/or short take-off and landing (V/STOL). F-35C The F-35C variant is designed for catapult-assisted take-off but arrested recovery operations from aircraft carriers. Compared to the F-35A, the F-35C features larger wings with foldable wingtip sections, larger control surfaces for improved low-speed control, stronger landing gear for the stresses of carrier arrested landings, a twin-wheel nose gear, and a stronger tailhook for use with carrier arrestor cables. The larger wing area allows for decreased landing speed while increasing both range and payload. The F-35C is limited to 7.5 g. F-35I "Adir" The F-35I Adir (meaning "Awesome" or "Mighty One") is an F-35A with unique Israeli modifications. The US initially refused to allow such changes before permitting Israel to integrate its own electronic warfare systems, including sensors and countermeasures. The main computer has a plug-and-play function for add-on systems; proposals include an external jamming pod, and new Israeli air-to-air missiles and guided bombs in the internal weapon bays. A senior IAF official said that the F-35's stealth may be partly overcome within 10 years despite a 30 to 40 year service life, thus Israel's insistence on using their own electronic warfare systems. Israel Aerospace Industries (IAI) has considered a two-seat F-35 concept; an IAI executive noted: "There is a known demand for two seats not only from Israel but from other air forces". IAI plans to produce conformal fuel tanks. Proposed variants F-35D A study for a possible upgrade of the F-35A to be fielded by the 2035 target date of the USAF's Future Operating Concept. 
CF-35 The Canadian CF-35 is a proposed variant that would differ from the F-35A through the addition of a drogue parachute and may include an F-35B/C-style refueling probe. In 2012, it was revealed that the CF-35 would employ the same boom refueling system as the F-35A. One alternative proposal would have been the adoption of the F-35C for its probe refueling and lower landing speed; however, the Parliamentary Budget Officer's report cited the F-35C's limited performance and payload as being too high a price to pay. Following the 2015 federal election, the Liberal Party, whose campaign had included a pledge to cancel the F-35 procurement, formed a new government and commenced an open competition to replace the existing CF-18 Hornet. New Export variant In December 2021, it was reported that Lockheed Martin was developing a new variant for an unspecified foreign customer. The Department of Defense released US$49 million in funding for this work. Operators Royal Australian Air Force – 44 F-35As delivered as of November 2021, of 72 ordered. Belgian Air Component – 34 F-35As planned. Royal Danish Air Force – 4 F-35As delivered of the 27 planned. Finnish Air Force – F-35A Block 4 selected via the HX fighter program to replace the current F/A-18 Hornets. 64 F-35As on order. Israeli Air Force – 30 delivered as of September 2021 (F-35I "Adir"). Includes one F-35 testbed aircraft for indigenous Israeli weapons, electronics and structural upgrades, designated (AS-15). A total of 75 ordered with 75 planned. Italian Air Force – 12 F-35As delivered as of May 2020. 1 F-35B delivered as of October 2020, at which point Italy planned to order 60 F-35As and 15 F-35Bs for the Italian Air Force. Italian Navy – 2 had been delivered as of October 2020. 15 F-35Bs planned for the Italian Navy. Japan Air Self-Defense Force – 23 F-35As operational as of December 2021 with a total order of 147, including 42 F-35Bs. 
Royal Netherlands Air Force – 24 F-35As delivered and operational out of 46 ordered. Royal Norwegian Air Force – 31 F-35As delivered and operational, of which 21 are in Norway and 10 are based in the US for training as of 11 August 2021, of 52 F-35As planned in total. They differ from other F-35As through the addition of a drogue parachute. Polish Air Force – 32 F-35As on order. Option for additional 16. Republic of Korea Air Force – 40 F-35As delivered as of January 2022, with 20 more on order. Republic of Korea Navy – about 20 F-35Bs planned. Republic of Singapore Air Force – four F-35Bs to be ordered with an option to order eight more as of March 2019. Royal Air Force and Royal Navy (owned by the RAF but jointly operated) – 27 F-35Bs received with 23 in the UK after the loss of one aircraft in November 2021; the other three are in the US where they are used for testing and training. 42 (24 FOC fighters and 18 training aircraft) to be fast-tracked by 2023; a total of 48 ordered as of 2021; a total of 60 to 80 F-35Bs are planned to be ordered. United States Air Force – 1,763 F-35As planned. United States Marine Corps – 353 F-35Bs and 67 F-35Cs planned. United States Navy – 273 F-35Cs planned. Order and approval cancellations Turkish Air Force – Four F-35As delivered to Luke Air Force Base for training in July 2018. 30 were ordered, of up to 120 total planned. Future purchases have been banned by the U.S. with contracts canceled by early 2020. All four F-35As have been withheld at Luke Air Force Base and not sent to Turkey. Accidents and notable incidents On 23 June 2014, an F-35A's engine caught fire at Eglin AFB. The pilot escaped unharmed, while the aircraft sustained an estimated US$50 million in damage. The accident caused all flights to be halted on 3 July. The fleet returned to flight on 15 July with flight envelope restrictions. 
In June 2015, the USAF Air Education and Training Command (AETC) issued its official report, which blamed the failure on the third stage rotor of the engine's fan module, pieces of which cut through the fan case and upper fuselage. Pratt & Whitney applied an extended "rub-in" to increase the gap between the second stator and the third rotor integral arm seal, as well as design alterations to pre-trench the stator by early 2016. On 28 September 2018, the first crash occurred involving a USMC F-35B near Marine Corps Air Station Beaufort, South Carolina; the pilot ejected safely. The crash was attributed to a faulty fuel tube; all F-35s were grounded on 11 October pending a fleet-wide inspection of the tubes. The next day, most USAF and USN F-35s returned to flight status following the inspection. On 9 April 2019, a JASDF F-35A attached to Misawa Air Base disappeared from radar about 84 miles (135 km) east of the Aomori Prefecture during a training mission over the Pacific Ocean. The pilot, Major Akinori Hosomi, had radioed his intention to abort the drill before disappearing. The US and Japanese navies searched for the missing aircraft and pilot, finding debris on the water that confirmed its crash; Hosomi's remains were recovered in June. In response, Japan grounded its 12 F-35As. There was speculation that China or Russia might attempt to salvage it; the Japanese Defense Ministry announced there had been no "reported activities" from either country. The F-35 reportedly did not send a distress signal nor did the pilot attempt any recovery maneuvers as it descended at a rapid rate. The accident report attributed the cause to the pilot's spatial disorientation. On 19 May 2020, a USAF F-35A from the 58th Fighter Squadron crashed while landing at Eglin AFB. The pilot ejected and was in stable condition. 
The accident was attributed to a combination of pilot error induced by fatigue, a design issue with the oxygen system and the aircraft's more complex nature being distracting, as well as a malfunctioning head-mounted display and an unresponsive flight control system. On 29 September 2020, a USMC F-35B crashed in Imperial County, California, after colliding with a Marine Corps KC-130 during air-to-air refuelling. The F-35B pilot was injured in the ejection, and the KC-130 crash-landed gear up in a field. On 17 November 2021, a Royal Air Force F-35B crashed during routine operations in the Mediterranean. The pilot was safely recovered to . Early reports suggested some of "the covers and engine blanks" had not been removed before takeoff. The wreckage, including all security-sensitive equipment, was largely recovered with the assistance of U.S. and Italian forces. On 4 January 2022, a South Korean Air Force F-35A made a belly landing after all systems failed except the flight controls and the engine. The pilot heard a series of bangs during low-altitude flight, and various systems stopped working. The control tower suggested that the pilot eject, but he managed to land the plane without deploying the landing gear, walking away uninjured. On 24 January 2022, a USN F-35C with VFA-147 suffered a ramp strike while landing on the and was lost overboard in the South China Sea, injuring seven crew members. The pilot ejected safely and was recovered from the water. Plans to recover the fighter are underway. Specifications (F-35A) Differences between variants Appearances in media See also Notes References Bibliography Lake, Jon. "The West's Great Hope". AirForces Monthly, December 2010. Further reading External links Official JSF web site, Official JSF videos Official F-35 Team web site F-35 page on U.S. 
Naval Air Systems Command site F-35 – Royal Air Force F-35 Lightning II 2000s United States fighter aircraft Single-engined jet aircraft Lift fan Carrier-based aircraft Stealth aircraft Mid-wing aircraft Aircraft first flown in 2006
https://en.wikipedia.org/wiki/Food%20additive
Food additive
Food additives are substances added to food to preserve flavor or enhance taste, appearance, or other sensory qualities. Some additives have been used for centuries as part of an effort to preserve food, for example vinegar (pickling), salt (salting), smoke (smoking), sugar (crystallization), etc. This allows for longer-lasting foods such as bacon, sweets or wines. With the advent of processed foods in the second half of the twentieth century, many additives have been introduced, of both natural and artificial origin. Food additives also include substances that may be introduced to food indirectly (called "indirect additives") in the manufacturing process, through packaging, or during storage or transport. Numbering To regulate these additives and inform consumers, each additive is assigned a unique number called an "E number", which is used in Europe for all approved additives. This numbering scheme has now been adopted and extended by the Codex Alimentarius Commission to internationally identify all additives, regardless of whether they are approved for use. E numbers are all prefixed by "E", but countries outside Europe use only the number, whether the additive is approved in Europe or not. For example, acetic acid is written as E260 on products sold in Europe, but is simply known as additive 260 in some countries. Additive 103, alkannin, is not approved for use in Europe so does not have an E number, although it is approved for use in Australia and New Zealand. Since 1987, Australia has had an approved system of labelling for additives in packaged foods. Each food additive has to be named or numbered. The numbers are the same as in Europe, but without the prefix "E". The United States Food and Drug Administration (FDA) lists these items as "generally recognized as safe" (GRAS); they are listed under both their Chemical Abstracts Service number and FDA regulation under the United States Code of Federal Regulations. 
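The regional labelling conventions above (an "E" prefix on European packaging, the bare number elsewhere, and no E number for additives not approved in the EU) can be sketched in a few lines of Python. This is an illustration only: the `ADDITIVES` table, `EU_APPROVED` set, and `label()` helper are hypothetical, not part of any real food-standards library.

```python
# Illustrative only: ADDITIVES, EU_APPROVED, and label() are hypothetical
# examples, not a real food-standards API.
ADDITIVES = {
    260: "acetic acid",  # EU-approved, labelled "E260" in Europe
    103: "alkannin",     # not EU-approved; used as "103" in Australia/NZ
}

EU_APPROVED = {260}  # hypothetical subset for this sketch

def label(code: int, region: str) -> str:
    """Return the on-package code for an additive in a given region."""
    if region == "EU":
        # Only EU-approved additives carry an E number on European labels.
        if code not in EU_APPROVED:
            raise ValueError(f"additive {code} is not approved in the EU")
        return f"E{code}"
    # Outside Europe the bare Codex number is used, EU-approved or not.
    return str(code)

print(label(260, "EU"))         # E260  (acetic acid in Europe)
print(label(260, "Australia"))  # 260   (same additive, no prefix)
print(label(103, "Australia"))  # 103   (alkannin, despite no EU approval)
```

The key point the sketch captures is that the Codex number identifies the additive everywhere, while the "E" prefix additionally signals EU approval.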
See list of food additives for a complete list of all the names. Categories Food additives can be divided into several groups, although there is some overlap because some additives exert more than one effect. For example, salt is both a preservative and a flavor. Acidulants Acidulants confer sour or acid taste. Common acidulants include vinegar, citric acid, tartaric acid, malic acid, fumaric acid, and lactic acid. Acidity regulators Acidity regulators are used for controlling the pH of foods for stability or to affect activity of enzymes. Anticaking agents Anticaking agents keep powders such as milk powder from caking or sticking. Antifoaming and foaming agents Antifoaming agents reduce or prevent foaming in foods. Foaming agents do the reverse. Antioxidants Antioxidants such as vitamin C act as preservatives by inhibiting the degradation of food by oxygen. Bulking agents Bulking agents such as starch are additives that increase the bulk of a food without affecting its taste. Food coloring Colorings are added to food to replace colors lost during preparation or to make food look more attractive. Fortifying agents Vitamins, minerals, and dietary supplements added to increase the nutritional value. Color retention agents In contrast to colorings, color retention agents are used to preserve a food's existing color. Emulsifiers Emulsifiers allow water and oils to remain mixed together in an emulsion, as in mayonnaise, ice cream, and homogenized milk. Flavors* Flavors are additives that give food a particular taste or smell, and may be derived from natural ingredients or created artificially. *In the EU, flavors do not have an E-code and are not considered food additives. Flavor enhancers Flavor enhancers enhance a food's existing flavors. A popular example is monosodium glutamate. Some flavor enhancers have their own flavors that are independent of the food. Flour treatment agents Flour treatment agents are added to flour to improve its color or its use in baking. 
Glazing agents Glazing agents provide a shiny appearance or protective coating to foods. Humectants Humectants prevent foods from drying out. Tracer gas Tracer gas allow for package integrity testing to prevent foods from being exposed to atmosphere, thus guaranteeing shelf life. Preservatives Preservatives prevent or inhibit spoilage of food due to fungi, bacteria and other microorganisms. Stabilizers Stabilizers, thickeners and gelling agents, like agar or pectin (used in jam for example) give foods a firmer texture. While they are not true emulsifiers, they help to stabilize emulsions. Sweeteners Sweeteners are added to foods for flavoring. Sweeteners other than sugar are added to keep the food energy (calories) low, or because they have beneficial effects regarding diabetes mellitus, tooth decay, or diarrhea. Thickeners Thickening agents are substances which, when added to the mixture, increase its viscosity without substantially modifying its other properties. Packaging Bisphenols, phthalates, and perfluoroalkyl chemicals (PFCs) are indirect additives used in manufacturing or packaging. In July 2018 the American Academy of Pediatrics called for more careful study of those three substances, along with nitrates and food coloring, as they might harm children during development. Safety and regulation With the increasing use of processed foods since the 19th century, food additives are more widely used. Many countries regulate their use. For example, boric acid was widely used as a food preservative from the 1870s to the 1920s, but was banned after World War I due to its toxicity, as demonstrated in animal and human studies. During World War II, the urgent need for cheap, available food preservatives led to it being used again, but it was finally banned in the 1950s. Such cases led to a general mistrust of food additives, and an application of the precautionary principle led to the conclusion that only additives that are known to be safe should be used in foods. 
In the United States, this led to the adoption of the Delaney clause, an amendment to the Federal Food, Drug, and Cosmetic Act of 1938, stating that no carcinogenic substances may be used as food additives. However, after the banning of cyclamates in the United States and Britain in 1969, saccharin, the only remaining legal artificial sweetener at the time, was found to cause cancer in rats. Widespread public outcry in the United States, partly communicated to Congress by postage-paid postcards supplied in the packaging of sweetened soft drinks, led to the retention of saccharin, despite its violation of the Delaney clause. However, in 2000, saccharin was found to be carcinogenic in rats due only to their unique urine chemistry. In 2007, Food Standards Australia New Zealand published official shoppers' guidance addressing concerns about food additives and their labeling. In the EU it can take 10 years or more to obtain approval for a new food additive. This includes five years of safety testing, followed by two years for evaluation by the European Food Safety Authority (EFSA) and another three years before the additive receives an EU-wide approval for use in every country in the European Union. Apart from testing and analyzing food products during the whole production process to ensure safety and compliance with regulatory standards, Trading Standards officers (in the UK) protect the public from any illegal use or potentially dangerous mis-use of food additives by performing random testing of food products. There has been significant controversy associated with the risks and benefits of food additives. Natural additives may be similarly harmful or be the cause of allergic reactions in certain individuals. For example, safrole was used to flavor root beer until it was shown to be carcinogenic. Due to the application of the Delaney clause, it may not be added to foods, even though it occurs naturally in sassafras and sweet basil. 
Hyperactivity Periodically, concerns have been expressed about a linkage between additives and hyperactivity; however, "no clear evidence of ADHD was provided". Toxicity In 2012, the EFSA proposed the tier approach to evaluate the potential toxicity of food additives. It is based on four dimensions: toxicokinetics (absorption, distribution, metabolism and excretion); genotoxicity; subchronic (at least 90 days) and chronic toxicity and carcinogenicity; and reproductive and developmental toxicity. Micronutrients A subset of food additives, micronutrients added in food fortification processes preserve nutrient value by providing vitamins and minerals to foods such as flour, cereal, margarine and milk which normally would not retain such high levels. Added ingredients, such as air, bacteria, fungi, and yeast, also contribute manufacturing and flavor qualities, and reduce spoilage. Food Additive Approval in the United States The United States Food and Drug Administration (FDA) defines a food additive as "any substance the intended use of which results or may reasonably be expected to result directly or indirectly in its becoming a component or otherwise affecting the characteristics of any food". In order for a novel food additive to be approved in the U.S., a food additive approval petition (FAP) must be submitted to the FDA. The identity of the ingredient, the proposed use in the food system, the technical effect of the ingredient, a method of analysis for the ingredient in foods, information on the manufacturing process, and full safety reports must be defined in a FAP. For FDA approval of a FAP, the FDA evaluates the chemical composition of the ingredient, the quantities that would be typically consumed, acute and chronic health impacts, and other safety factors. The FDA reviews the petition prior to market approval of the additive. 
Standardization of its derived products ISO has published a series of standards regarding the topic and these standards are covered by ICS 67.220. See also Color retention agent Delaney clause Dietary supplement E number Food Chemicals Codex Food fortification Food industry Food processing Food supplements Joint FAO/WHO Expert Committee on Food Additives List of food additives List of food additives, Codex Alimentarius List of food labeling regulations List of phytochemicals in food Organic fertilizer Pink slime Processing aid Smoking Sugar substitute References Additional sources U.S. Food and Drug Administration. (1991). Everything Added to Food in the United States. Boca Raton, Florida: C.K. Smoley (c/o CRC Press, Inc.). The Food Labelling Regulations (1984) Advanced Modular Science, Nelson, Food and Health, by John Adds, Erica Larkcom and Ruth Miller External links Food Trade's Juicy Secrets by John Triggs in the Daily Express July 17, 2007 Everything Added to Food in the United States (EAFUS) i.e. Castor oil, etc. EU legislation on food additives CSPI's guide to food additives, (PDF) Food Standards Australia and New Zealand page on food additives Evaluation of certain Food Additives and Contaminants; Sixty-first report of the Joint FAO/WHO Expert Committee on Food Additives Food science
https://en.wikipedia.org/wiki/Fridtjof%20Nansen
Fridtjof Nansen
Fridtjof Wedel-Jarlsberg Nansen (10 October 1861 – 13 May 1930) was a Norwegian polymath and Nobel Peace Prize laureate. He gained prominence at various points in his life as an explorer, scientist, diplomat and humanitarian. He led the team that made the first crossing of the Greenland interior in 1888, traversing the island on cross-country skis. He won international fame after reaching a record northern latitude of 86°14′ during his Fram expedition of 1893–1896. Although he retired from exploration after his return to Norway, his techniques of polar travel and his innovations in equipment and clothing influenced a generation of subsequent Arctic and Antarctic expeditions. Nansen studied zoology at the Royal Frederick University in Christiania and later worked as a curator at the University Museum of Bergen where his research on the central nervous system of lower marine creatures earned him a doctorate and helped establish neuron doctrine. Later, neuroscientist Santiago Ramón y Cajal won the 1906 Nobel Prize in Medicine for his research on the same subject. After 1896 his main scientific interest switched to oceanography; in the course of his research he made many scientific cruises, mainly in the North Atlantic, and contributed to the development of modern oceanographic equipment. As one of his country's leading citizens, in 1905 Nansen spoke out for the ending of Norway's union with Sweden, and was instrumental in persuading Prince Carl of Denmark to accept the throne of the newly independent Norway. Between 1906 and 1908 he served as the Norwegian representative in London, where he helped negotiate the Integrity Treaty that guaranteed Norway's independent status. In the final decade of his life, Nansen devoted himself primarily to the League of Nations, following his appointment in 1921 as the League's High Commissioner for Refugees. 
In 1922 he was awarded the Nobel Peace Prize for his work on behalf of the displaced victims of World War I and related conflicts. Among the initiatives he introduced was the "Nansen passport" for stateless persons, a certificate that used to be recognized by more than 50 countries. He worked on behalf of refugees until his sudden death in 1930, after which the League established the Nansen International Office for Refugees to ensure that his work continued. This office received the Nobel Peace Prize in 1938. His name is commemorated in numerous geographical features, particularly in the polar regions. Family background and childhood The Nansen family originated in Denmark. Hans Nansen (1598–1667), a trader, was an early explorer of the White Sea region of the Arctic Ocean. In later life he settled in Copenhagen, becoming the city's borgmester in 1654. Later generations of the family lived in Copenhagen until the mid-18th century, when Ancher Antoni Nansen moved to Norway (then in a union with Denmark). His son, Hans Leierdahl Nansen (1764–1821), was a magistrate first in the Trondheim district, later in Jæren. After Norway's separation from Denmark in 1814, he entered national political life as the representative for Stavanger in the first Storting, and became a strong advocate of union with Sweden. After suffering a paralytic stroke in 1821 Hans Leierdahl Nansen died, leaving a four-year-old son, Baldur Fridtjof Nansen, the explorer's father. Baldur was a lawyer without ambitions for public life, who became Reporter to the Supreme Court of Norway. He married twice, the second time to Adelaide Johanne Thekla Isidore Bølling Wedel-Jarlsberg from Bærum, a niece of Herman Wedel-Jarlsberg who had helped frame the Norwegian constitution of 1814 and was later the Swedish king's Norwegian Viceroy. Baldur and Adelaide settled at Store Frøen, an estate at Aker, a few kilometres north of Norway's capital city, Christiania (since renamed Oslo). 
The couple had three children; the first died in infancy, the second, born 10 October 1861, was Fridtjof Wedel-Jarlsberg Nansen. Store Frøen's rural surroundings shaped the nature of Nansen's childhood. In the short summers the main activities were swimming and fishing, while in the autumn the chief pastime was hunting for game in the forests. The long winter months were devoted mainly to skiing, which Nansen began to practice at the age of two, on improvised skis. At the age of 10 he defied his parents and attempted the ski jump at the nearby Huseby installation. This exploit had near-disastrous consequences, as on landing the skis dug deep into the snow, pitching the boy forward: "I, head first, described a fine arc in the air ... [W]hen I came down again I bored into the snow up to my waist. The boys thought I had broken my neck, but as soon as they saw there was life in me ... a shout of mocking laughter went up." Nansen's enthusiasm for skiing was undiminished, though as he records, his efforts were overshadowed by those of the skiers from the mountainous region of Telemark, where a new style of skiing was being developed. "I saw this was the only way", wrote Nansen later. At school, Nansen worked adequately without showing any particular aptitude. Studies took second place to sports, or to expeditions into the forests where he would live "like Robinson Crusoe" for weeks at a time. Through such experiences Nansen developed a marked degree of self-reliance. He became an accomplished skier and a highly proficient skater. Life was disrupted when, in the summer of 1877, Adelaide Nansen died suddenly. Distressed, Baldur Nansen sold the Store Frøen property and moved with his two sons to Christiania. Nansen's sporting prowess continued to develop; at 18 he broke the world one-mile (1.6 km) skating record, and in the following year won the national cross-country skiing championship, a feat he would repeat on 11 subsequent occasions. 
Student and adventurer

In 1880 Nansen passed his university entrance examination, the examen artium. He decided to study zoology, claiming later that he chose the subject because he thought it offered the chance of a life in the open air. He began his studies at the Royal Frederick University in Christiania early in 1881. Early in 1882 Nansen took "...the first fatal step that led me astray from the quiet life of science." Professor Robert Collett of the university's zoology department proposed that Nansen take a sea voyage, to study Arctic zoology at first hand. Nansen was enthusiastic, and made arrangements through a recent acquaintance, Captain Axel Krefting, commander of the sealer Viking. The voyage began on 11 March 1882 and extended over the following five months. In the weeks before sealing started, Nansen was able to concentrate on scientific studies. From water samples he showed that, contrary to previous assumption, sea ice forms on the surface of the water rather than below. His readings also demonstrated that the Gulf Stream flows beneath a cold layer of surface water. Through the spring and early summer Viking roamed between Greenland and Spitsbergen in search of seal herds. Nansen became an expert marksman, and on one day proudly recorded that his team had shot 200 seal. In July, Viking became trapped in the ice close to an unexplored section of the Greenland coast; Nansen longed to go ashore, but this was impossible. However, he began to develop the idea that the Greenland icecap might be explored, or even crossed. On 17 July the ship broke free from the ice, and early in August was back in Norwegian waters. Nansen did not resume formal studies at the university. Instead, on Collett's recommendation, he accepted a post as curator in the zoological department of the Bergen Museum.
He was to spend the next six years of his life there—apart from a six-month sabbatical tour of Europe—working and studying with leading figures such as Gerhard Armauer Hansen, the discoverer of the leprosy bacillus, and Daniel Cornelius Danielssen, the museum's director who had turned it from a backwater collection into a centre of scientific research and education. Nansen's chosen area of study was the then relatively unexplored field of neuroanatomy, specifically the central nervous system of lower marine creatures. Before leaving for his sabbatical in February 1886 he published a paper summarising his research to date, in which he stated that "anastomoses or unions between the different ganglion cells" could not be demonstrated with certainty. This unorthodox view was confirmed by the simultaneous researches of the embryologist Wilhelm His and the psychiatrist August Forel. Nansen is considered the first Norwegian defender of the neuron theory, originally proposed by Santiago Ramón y Cajal. His subsequent paper, The Structure and Combination of Histological Elements of the Central Nervous System, published in 1887, became his doctoral thesis.

Crossing of Greenland

Planning

The idea of an expedition across the Greenland icecap grew in Nansen's mind throughout his Bergen years. In 1887, after the submission of his doctoral thesis, he finally began organising this project. Before then, the two most significant penetrations of the Greenland interior had been those of Adolf Erik Nordenskiöld in 1883, and Robert Peary in 1886. Both had set out from Disko Bay on the western coast, and had travelled about eastward before turning back. By contrast, Nansen proposed to travel from east to west, ending rather than beginning his trek at Disko Bay. A party setting out from the inhabited west coast would, he reasoned, have to make a return trip, as no ship could be certain of reaching the dangerous east coast and picking them up.
By starting from the east—assuming that a landing could be made there—Nansen's would be a one-way journey towards a populated area. The party would have no line of retreat to a safe base; the only way to go would be forward, a situation that fitted Nansen's philosophy completely. Nansen rejected the complex organisation and heavy manpower of other Arctic ventures, and instead planned his expedition for a small party of six. Supplies would be manhauled on specially designed lightweight sledges. Much of the equipment, including sleeping bags, clothing and cooking stoves, also needed to be designed from scratch. These plans received a generally poor reception in the press; one critic had no doubt that "if [the] scheme be attempted in its present form ... the chances are ten to one that he will ... uselessly throw his own and perhaps others' lives away". The Norwegian parliament refused to provide financial support, believing that such a potentially risky undertaking should not be encouraged. The project was eventually launched with a donation from a Danish businessman, Augustin Gamél; the rest came mainly from small contributions from Nansen's countrymen, through a fundraising effort organised by students at the university. Despite the adverse publicity, Nansen received numerous applications from would-be adventurers. He wanted expert skiers, and attempted to recruit from the skiers of Telemark, but his approaches were rebuffed. Nordenskiöld had advised Nansen that Sami people, from Finnmark in the far north of Norway, were expert snow travellers, so Nansen recruited a pair, Samuel Balto and Ole Nielsen Ravna. The remaining places went to Otto Sverdrup, a former sea-captain who had more recently worked as a forester; Oluf Christian Dietrichson, an army officer, and Kristian Kristiansen, an acquaintance of Sverdrup's. All had experience of outdoor life in extreme conditions, and were experienced skiers. 
Just before the party's departure, Nansen attended a formal examination at the university, which had agreed to receive his doctoral thesis. In accordance with custom he was required to defend his work before appointed examiners acting as "devil's advocates". He left before knowing the outcome of this process.

Expedition

The sealer Jason picked up Nansen's party on 3 June 1888 from the Icelandic port of Ísafjörður. They sighted the Greenland coast a week later, but thick pack ice hindered progress. With the coast still away, Nansen decided to launch the small boats. They were within sight of Sermilik Fjord on 17 July; Nansen believed it would offer a route up the icecap. The expedition left Jason "in good spirits and with the highest hopes of a fortunate result." Days of extreme frustration followed as they drifted south. Weather and sea conditions prevented them from reaching the shore. They spent most time camping on the ice itself—it was too dangerous to launch the boats. By 29 July, they found themselves south of the point where they left the ship. That day they finally reached land but were too far south to begin the crossing. After a brief rest, Nansen ordered the team back into the boats to begin rowing north. The party battled northward along the coast through the ice floes for the next 12 days. They encountered a large Eskimo encampment on the first day, near Cape Steen Bille. Occasional contacts with the nomadic native population continued as the journey progressed. The party reached Umivik Bay on 11 August, after covering . Nansen decided they needed to begin the crossing; although they were still far south of his intended starting place, the season was becoming too advanced. After they landed at Umivik, they spent the next four days preparing for their journey. They set out on the evening of 15 August, heading north-west towards Christianhaab on the western shore of Disko Bay— away. Over the next few days, the party struggled to ascend.
The inland ice had a treacherous surface with many hidden crevasses and the weather was bad. At one point progress stopped for three days because of violent storms and continuous rain. On 26 August, Nansen concluded that they could not reach Christianhaab before the last ship of the season, which was due to leave by mid-September. He ordered a change of course due west, towards Godthaab; a shorter journey by at least . The rest of the party, according to Nansen, "hailed the change of plan with acclamation." They continued climbing until 11 September and reached a height of above sea level. Temperatures on the summit of the icecap dropped to at night. From then on the downward slope made travelling easier. Yet the terrain was rugged and the weather remained hostile. Progress was slow: fresh snowfalls made dragging the sledges like pulling them through sand. On 26 September, they battled their way down the edge of a fjord westward towards Godthaab. Sverdrup constructed a makeshift boat out of parts of the sledges, willows, and their tent. Three days later, Nansen and Sverdrup began the last stage of the journey, rowing down the fjord. On 3 October, they reached Godthaab, where the Danish town representative greeted them. He first informed Nansen that he had secured his doctorate, a matter that "could not have been more remote from [Nansen's] thoughts at that moment." The team accomplished their crossing in 49 days. Throughout the journey, they maintained meteorological, geographical and other records relating to the previously unexplored interior. The rest of the team arrived in Godthaab on 12 October. Nansen soon learned that no ship was likely to call at Godthaab until the following spring. Still, they were able to send letters back to Norway via a boat leaving Ivigtut at the end of October. He and his party spent the next seven months in Greenland. On 15 April 1889, the Danish ship Hvidbjørnen finally entered the harbour.
Nansen recorded: "It was not without sorrow that we left this place and these people, among whom we had enjoyed ourselves so well."

Interlude and marriage

Hvidbjørnen reached Copenhagen on 21 May 1889. News of the crossing had preceded its arrival, and Nansen and his companions were feted as heroes. This welcome, however, was dwarfed by the reception in Christiania a week later, when crowds of between thirty and forty thousand—a third of the city's population—thronged the streets as the party made its way to the first of a series of receptions. The interest and enthusiasm generated by the expedition's achievement led directly to the formation that year of the Norwegian Geographical Society. Nansen accepted the position of curator of the Royal Frederick University's zoology collection, a post which carried a salary but involved no duties; the university was satisfied by the association with the explorer's name. Nansen's main task in the following weeks was writing his account of the expedition, but he found time late in June to visit London, where he met the Prince of Wales (the future Edward VII), and addressed a meeting of the Royal Geographical Society (RGS). The RGS president, Sir Mountstuart Elphinstone Grant Duff, said that Nansen had claimed "the foremost place amongst northern travellers", and later awarded him the Society's prestigious Founder's Medal. This was one of many honours Nansen received from institutions all over Europe. He was invited by a group of Australians to lead an expedition to Antarctica, but declined, believing that Norway's interests would be better served by a North Pole conquest. On 11 August 1889 Nansen announced his engagement to Eva Sars, the daughter of Michael Sars, a zoology professor who had died when Eva was 11 years old. The couple had met some years previously, at the skiing resort of Frognerseteren, where Nansen recalled seeing "two feet sticking out of the snow".
Eva was three years older than Nansen, and despite the evidence of this first meeting, was an accomplished skier. She was also a celebrated classical singer who had been coached in Berlin by Désirée Artôt, one-time paramour of Tchaikovsky. The engagement surprised many; since Nansen had previously expressed himself forcefully against the institution of marriage, Otto Sverdrup assumed he had read the message wrongly. The wedding took place on 6 September 1889, less than a month after the engagement.

Fram expedition

Planning

Nansen first began to consider the possibility of reaching the North Pole after reading meteorologist Henrik Mohn's theory on transpolar drift in 1884. Artefacts found on the coast of Greenland were identified as having come from the Jeannette expedition. In June 1881, Jeannette was crushed and sunk off the Siberian coast—the opposite side of the Arctic Ocean. Mohn surmised that the location of the artefacts indicated the existence of an ocean current from east to west, all the way across the polar sea and possibly over the pole itself. The idea remained fixed in Nansen's mind for the next couple of years. He developed a detailed plan for a polar venture after his triumphant return from Greenland. He made his idea public in February 1890, at a meeting of the newly formed Norwegian Geographical Society. Previous expeditions, he argued, approached the North Pole from the west and failed because they were working against the prevailing east–west current; the secret was to work with the current. A workable plan would require a sturdy and manoeuvrable small ship, capable of carrying fuel and provisions for twelve men for five years. This ship would enter the ice pack close to the approximate location of Jeannette's sinking, drifting west with the current towards the pole and beyond it—eventually reaching the sea between Greenland and Spitsbergen. Experienced polar explorers were dismissive: Adolphus Greely called the idea "an illogical scheme of self-destruction".
Equally dismissive were Sir Allen Young, a veteran of the searches for Franklin's lost expedition, and Sir Joseph Dalton Hooker, who had sailed to the Antarctic on the Ross expedition. Nansen still managed to secure a grant from the Norwegian parliament after an impassioned speech. Additional funding was secured through a national appeal for private donations.

Preparations

Nansen chose naval engineer Colin Archer to design and build a ship. Archer designed an extraordinarily sturdy vessel with an intricate system of crossbeams and braces of the toughest oak timbers. Its rounded hull was designed to push the ship upwards when beset by pack ice. Speed and manoeuvrability were to be secondary to its ability as a safe and warm shelter during their predicted confinement. The length-to-beam ratio— and —gave it a stubby appearance, justified by Archer: "A ship that is built with exclusive regard to its suitability for [Nansen's] object must differ essentially from any known vessel." It was christened Fram and launched on 6 October 1892. Nansen selected a party of twelve from thousands of applicants. Otto Sverdrup, who had taken part in Nansen's earlier Greenland expedition, was appointed as the expedition's second-in-command. Competition was so fierce that army lieutenant and dog-driving expert Hjalmar Johansen signed on as ship's stoker, the only position still available.

Into the ice

Fram left Christiania on 24 June 1893, cheered on by thousands of well-wishers. After a slow journey around the coast, the final port of call was Vardø, in the far north-east of Norway. Fram left Vardø on 21 July, following the North-East Passage route pioneered by Nordenskiöld in 1878–1879, along the northern coast of Siberia. Progress was impeded by fog and ice conditions in the mainly uncharted seas. The crew also experienced the dead water phenomenon, where a ship's forward progress is impeded by friction caused by a layer of fresh water lying on top of heavier salt water.
Nevertheless, Cape Chelyuskin, the most northerly point of the Eurasian continental mass, was passed on 10 September. Heavy pack ice was sighted ten days later at around latitude 78°N, as Fram approached the area in which Jeannette was crushed. Nansen followed the line of the pack northwards to a position recorded as , before ordering engines stopped and the rudder raised. From this point Fram's drift began. The first weeks in the ice were frustrating, as the drift moved unpredictably; sometimes north, sometimes south. By 19 November, Fram's latitude was south of that at which she had entered the ice. Only after the turn of the year, in January 1894, did the northerly direction become generally settled; the 80°N mark was finally passed on 22 March. Nansen calculated that, at this rate, it might take the ship five years to reach the pole. As the ship's northerly progress continued at a rate rarely above a kilometre and a half per day, Nansen began privately to consider a new plan—a dog sledge journey towards the pole. With this in mind, he began to practice dog-driving, making many experimental journeys over the ice. In November, Nansen announced his plan: when the ship passed latitude 83°N, he and Hjalmar Johansen would leave the ship with the dogs and make for the pole while Fram, under Sverdrup, continued its drift until it emerged from the ice in the North Atlantic. After reaching the pole, Nansen and Johansen would make for the nearest known land, the recently discovered and sketchily mapped Franz Josef Land. They would then cross to Spitsbergen where they would find a ship to take them home. The crew spent the rest of the winter of 1894 preparing clothing and equipment for the forthcoming sledge journey. Kayaks were built, to be carried on the sledges until needed for the crossing of open water. Preparations were interrupted early in January when violent tremors shook the ship.
The crew disembarked, fearing the vessel would be crushed, but Fram proved herself equal to the danger. On 8 January 1895, the ship's position was 83°34′N, above Greely's previous record of 83°24′N.

Dash for the pole

With the ship's latitude at 84°4′N and after two false starts, Nansen and Johansen began their journey on 14 March 1895. Nansen allowed 50 days to cover the distance to the pole, an average daily journey of . After a week of travel, a sextant observation indicated they averaged per day, which put them ahead of schedule. However, uneven surfaces made skiing more difficult, and their speeds slowed. They also realised they were marching against a southerly drift, and that distances travelled did not necessarily equate to distance progressed. On 3 April, Nansen began to doubt whether the pole was attainable. Unless their speed improved, their food would not last them to the pole and back to Franz Josef Land. He confided in his diary: "I have become more and more convinced we ought to turn before time." Four days later, after making camp, he observed the way ahead was "... a veritable chaos of iceblocks stretching as far as the horizon." Nansen recorded their latitude as 86°13′6″N—almost three degrees beyond the previous record—and decided to turn around and head back south.

Retreat

At first Nansen and Johansen made good progress south, but they suffered a serious setback on 13 April when, in their eagerness to break camp, they forgot to wind their chronometers, which made it impossible to calculate their longitude and accurately navigate to Franz Josef Land. They restarted the watches based on Nansen's guess that they were at 86°E. From then on they were uncertain of their true position. The tracks of an Arctic fox were observed towards the end of April. It was the first trace of a living creature other than their dogs since they left Fram. They soon saw bear tracks and by the end of May saw evidence of nearby seals, gulls and whales.
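The navigational consequence of the stopped chronometers follows from simple arithmetic: the Earth rotates through 15° of longitude per hour, so longitude at sea was found by comparing the observed time of local apparent noon with Greenwich time kept by the chronometer. A minimal illustrative sketch of that calculation (the function name and sample figures are hypothetical, not taken from the expedition's records):

```python
def longitude_from_chronometer(local_noon_utc_hours):
    """Longitude in degrees (+E / -W) from the chronometer (Greenwich) time
    at which the sun crosses the local meridian (local apparent noon).
    The Earth turns 360 degrees in 24 hours, i.e. 15 degrees per hour;
    east of Greenwich the sun culminates before 12:00 Greenwich time."""
    return (12.0 - local_noon_utc_hours) * 15.0

# On the Greenwich meridian, local noon falls at 12:00 Greenwich time:
print(longitude_from_chronometer(12.0))              # 0.0
# Local noon observed at 06:16 Greenwich time corresponds to 86 deg E,
# the longitude Nansen guessed when restarting the watches:
print(longitude_from_chronometer(6.0 + 16.0 / 60.0))  # 86.0
```

A watch error of only four minutes shifts the computed position by a full degree of longitude, which is why the unwound chronometers left the pair uncertain of their true position for the rest of the retreat.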
On 31 May, Nansen calculated they were only from Cape Fligely, Franz Josef Land's northernmost point. Travel conditions worsened as increasingly warmer weather caused the ice to break up. On 22 June, the pair decided to rest on a stable ice floe while they repaired their equipment and gathered strength for the next stage of their journey. They remained on the floe for a month. The day after leaving this camp, Nansen recorded: "At last the marvel has come to pass—land, land, and after we had almost given up our belief in it!" Whether this still-distant land was Franz Josef Land or a new discovery they did not know—they had only a rough sketch map to guide them. The edge of the pack ice was reached on 6 August and they shot the last of their dogs—since 24 April they had regularly killed the weakest to feed the others. The two kayaks were lashed together, a sail was raised, and they made for the land. It soon became clear this land was part of an archipelago. As they moved southwards, Nansen tentatively identified a headland as Cape Felder on the western edge of Franz Josef Land. Towards the end of August, as the weather grew colder and travel became increasingly difficult, Nansen decided to camp for the winter. In a sheltered cove, with stones and moss for building materials, the pair erected a hut which was to be their home for the next eight months. With ready supplies of bear, walrus and seal to keep their larder stocked, their principal enemy was not hunger but inactivity. After muted Christmas and New Year celebrations, in slowly improving weather, they began to prepare to leave their refuge, but it was 19 May 1896 before they were able to resume their journey.

Rescue and return

On 17 June, during a stop for repairs after the kayaks had been attacked by a walrus, Nansen thought he heard a dog barking as well as human voices. He went to investigate, and a few minutes later saw the figure of a man approaching.
It was the British explorer Frederick Jackson, who was leading an expedition to Franz Josef Land and was camped at Cape Flora on nearby Northbrook Island. The two were equally astonished by their encounter; after some awkward hesitation Jackson asked: "You are Nansen, aren't you?", and received the reply "Yes, I am Nansen." Johansen was picked up and the pair were taken to Cape Flora where, during the following weeks, they recuperated from their ordeal. Nansen later wrote that he could "still scarcely grasp" their sudden change of fortune; had it not been for the walrus attack that caused the delay, the two parties might have been unaware of each other's existence. On 7 August, Nansen and Johansen boarded Jackson's supply ship Windward, and sailed for Vardø where they arrived on the 13th. They were greeted by Hans Mohn, the originator of the polar drift theory, who was in the town by chance. The world was quickly informed by telegram of Nansen's safe return, but as yet there was no news of Fram. Taking the weekly mail steamer south, Nansen and Johansen reached Hammerfest on 18 August, where they learned that Fram had been sighted. She had emerged from the ice north and west of Spitsbergen, as Nansen had predicted, and was now on her way to Tromsø. She had not passed over the pole, nor exceeded Nansen's northern mark. Without delay Nansen and Johansen sailed for Tromsø, where they were reunited with their comrades. The homeward voyage to Christiania was a series of triumphant receptions at every port. On 9 September, Fram was escorted into Christiania's harbour and welcomed by the largest crowds the city had ever seen. The crew were received by King Oscar, and Nansen, reunited with family, remained at the palace for several days as special guests. 
Tributes arrived from all over the world; typical was that from the British mountaineer Edward Whymper, who wrote that Nansen had made "almost as great an advance as has been accomplished by all other voyages in the nineteenth century put together".

National figure

Scientist and polar oracle

Nansen's first task on his return was to write his account of the voyage. This he did remarkably quickly, producing 300,000 words of Norwegian text by November 1896; the English translation, titled Farthest North, was ready in January 1897. The book was an instant success, and secured Nansen's long-term financial future. Nansen included without comment the one significant adverse criticism of his conduct, that of Greely, who had written in Harper's Weekly on Nansen's decision to leave Fram and strike for the pole: "It passes comprehension how Nansen could have thus deviated from the most sacred duty devolving on the commander of a naval expedition." During the 20 years following his return from the Arctic, Nansen devoted most of his energies to scientific work. In 1897 he accepted a professorship in zoology at the Royal Frederick University, which gave him a base from which he could tackle the major task of editing the reports of the scientific results of the Fram expedition. This was a much more arduous task than writing the expedition narrative. The results were eventually published in six volumes, and according to a later polar scientist, Robert Rudmose-Brown, "were to Arctic oceanography what the Challenger expedition results had been to the oceanography of other oceans." In 1900, Nansen became director of the Christiania-based International Laboratory for North Sea Research, and helped found the International Council for the Exploration of the Sea.
Through his connection with the latter body, in the summer of 1900 Nansen embarked on his first visit to Arctic waters since the Fram expedition, a cruise to Iceland and Jan Mayen Land on the oceanographic research vessel Michael Sars, named after Eva's father. Shortly after his return he learned that his Farthest North record had been passed, by members of the Duke of the Abruzzi's Italian expedition. They had reached 86°34′N on 24 April 1900, in an attempt to reach the North Pole from Franz Josef Land. Nansen received the news philosophically: "What is the value of having goals for their own sake? They all vanish ... it is merely a question of time." Nansen was now considered an oracle by all would-be explorers of the north and south polar regions. Abruzzi had consulted him, as had the Belgian Adrien de Gerlache, each of whom took expeditions to the Antarctic. Although Nansen refused to meet his own countryman and fellow-explorer Carsten Borchgrevink (whom he considered a fraud), he gave advice to Robert Falcon Scott on polar equipment and transport, prior to the 1901–04 Discovery expedition. At one point Nansen seriously considered leading a South Pole expedition himself, and asked Colin Archer to design two ships. However, these plans remained on the drawing board. By 1901 Nansen's family had expanded considerably. A daughter, Liv, had been born just before Fram set out; a son, Kåre, was born in 1897, followed by a daughter, Irmelin, in 1900 and a second son, Odd, in 1901. The family home, which Nansen had built in 1891 from the profits of his Greenland expedition book, was now too small. Nansen acquired a plot of land in the Lysaker district and built, substantially to his own design, a large and imposing house which combined some of the characteristics of an English manor house with features from the Italian renaissance.
The house was ready for occupation by April 1902; Nansen called it Polhøgda (in English "polar heights"), and it remained his home for the rest of his life. A fifth and final child, son Asmund, was born at Polhøgda in 1903.

Politician and diplomat

The union between Norway and Sweden, imposed by the Great Powers in 1814, had been under considerable strain through the 1890s, the chief issue in question being Norway's rights to its own consular service. Nansen, although not by inclination a politician, had spoken out on the issue on several occasions in defence of Norway's interests. It seemed, early in the 20th century, that agreement between the two countries might be possible, but hopes were dashed when negotiations broke down in February 1905. The Norwegian government fell, and was replaced by one led by Christian Michelsen, whose programme was one of separation from Sweden. In February and March Nansen published a series of newspaper articles which placed him firmly in the separatist camp. The new prime minister wanted Nansen in the cabinet, but Nansen had no political ambitions. However, at Michelsen's request he went to Berlin and then to London where, in a letter to The Times, he presented Norway's legal case for a separate consular service to the English-speaking world. On 17 May 1905, Norway's Constitution Day, Nansen addressed a large crowd in Christiania, saying: "Now have all ways of retreat been closed. Now remains only one path, the way forward, perhaps through difficulties and hardships, but forward for our country, to a free Norway". He also wrote a book, Norway and the Union with Sweden, to promote Norway's case abroad. On 23 May the Storting passed the Consulate Act establishing a separate consular service. King Oscar refused his assent; on 27 May the Norwegian cabinet resigned, but the king would not recognise this step. On 7 June the Storting unilaterally announced that the union with Sweden was dissolved.
In a tense situation the Swedish government agreed to Norway's request that the dissolution should be put to a referendum of the Norwegian people. This was held on 13 August 1905 and resulted in an overwhelming vote for independence, at which point King Oscar relinquished the crown of Norway while retaining the Swedish throne. A second referendum, held in November, determined that the new independent state should be a monarchy rather than a republic. In anticipation of this, Michelsen's government had been considering the suitability of various princes as candidates for the Norwegian throne. Faced with King Oscar's refusal to allow anyone from his own House of Bernadotte to accept the crown, the favoured choice was Prince Charles of Denmark. In July 1905 Michelsen sent Nansen to Copenhagen on a secret mission to persuade Charles to accept the Norwegian throne. Nansen was successful; shortly after the second referendum Charles was proclaimed king, taking the name Haakon VII. He and his wife, the British princess Maud, were crowned in the Nidaros Cathedral in Trondheim on 22 June 1906. In April 1906 Nansen was appointed Norway's first Minister in London. His main task was to work with representatives of the major European powers on an Integrity Treaty which would guarantee Norway's position. Nansen was popular in England, and got on well with King Edward, though he found court functions and diplomatic duties disagreeable; "frivolous and boring" was his description. However, he was able to pursue his geographical and scientific interests through contacts with the Royal Geographical Society and other learned bodies. The Treaty was signed on 2 November 1907, and Nansen considered his task complete. Resisting the pleas of, among others, King Edward that he should remain in London, on 15 November Nansen resigned his post. A few weeks later, still in England as the king's guest at Sandringham, Nansen received word that Eva was seriously ill with pneumonia. 
On 8 December he set out for home, but before he reached Polhøgda he learned, from a telegram, that Eva had died.

Oceanographer and traveller

After a period of mourning, Nansen returned to London. He had been persuaded by his government to rescind his resignation until after King Edward's state visit to Norway in April 1908. His formal retirement from the diplomatic service was dated 1 May 1908, the same day on which his university professorship was changed from zoology to oceanography. This new designation reflected the general character of Nansen's more recent scientific interests. In 1905, he had supplied the Swedish physicist Walfrid Ekman with the data which established the principle in oceanography known as the Ekman spiral. Based on Nansen's observations of ocean currents recorded during the Fram expedition, Ekman concluded that the effect of wind on the sea's surface produced currents which "formed something like a spiral staircase, down towards the depths". In 1909 Nansen combined with Bjørn Helland-Hansen to publish an academic paper, The Norwegian Sea: its Physical Oceanography, based on the Michael Sars voyage of 1900. Nansen had by now retired from polar exploration, the decisive step being his release of Fram to fellow Norwegian Roald Amundsen, who was planning a North Pole expedition. When Amundsen made his controversial change of plan and set out for the South Pole, Nansen stood by him. Between 1910 and 1914, Nansen participated in several oceanographic voyages. In 1910, aboard the Norwegian naval vessel Fridtjof, he carried out researches in the northern Atlantic, and in 1912 he took his own yacht, Veslemøy, to Bear Island and Spitsbergen. The main objective of the Veslemøy cruise was the investigation of salinity in the North Polar Basin.
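Ekman's "spiral staircase" has a closed-form idealised solution: the surface current sets off at 45° to the right of the wind (in the Northern hemisphere), then rotates further to the right and decays exponentially with depth. A rough sketch of that classical profile (the function name and parameter values are arbitrary illustrations, not data from Nansen's observations):

```python
import math

def ekman_velocity(z, surface_speed=0.1, ekman_depth=50.0):
    """Horizontal velocity (u, v) in m/s at depth z (metres, z <= 0) for the
    classical Ekman spiral. The surface current points 45 degrees to the
    right of the wind (taken as the x-axis); with depth the direction rotates
    further right while the speed decays as exp(pi * z / ekman_depth)."""
    a = math.pi / ekman_depth
    speed = surface_speed * math.exp(a * z)          # exponential decay
    angle = math.pi / 4 + a * z                      # clockwise rotation
    return speed * math.cos(angle), speed * math.sin(angle)

u0, v0 = ekman_velocity(0.0)      # surface current
ud, vd = ekman_velocity(-50.0)    # current at the Ekman depth
```

At z equal to minus the Ekman depth the direction has swung through 180° and the speed has fallen to e^(-π), about 4% of the surface value, giving the spiral-staircase picture Ekman described.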
One of Nansen's lasting contributions to oceanography was his work designing instruments and equipment; the "Nansen bottle" for taking deep water samples remained in use into the 21st century, in a version updated by Shale Niskin. At the request of the Royal Geographical Society, Nansen began work on a study of Arctic discoveries, which developed into a two-volume history of the exploration of the northern regions up to the beginning of the 16th century. This was published in 1911 as Nord i Tåkeheimen ("In Northern Mists"). That year he renewed an acquaintance with Kathleen Scott, wife of Robert Falcon Scott, whose Terra Nova Expedition had sailed for Antarctica in 1910. Biographer Roland Huntford has asserted, without any compelling evidence, that Nansen and Kathleen Scott had a brief love affair. Louisa Young, in her biography of Lady Scott, refutes the claim. Many women were attracted to Nansen, and he had a reputation as a womaniser. His personal life was troubled around this time; in January 1913 he received news of the suicide of Hjalmar Johansen, who had returned in disgrace from Amundsen's successful South Pole expedition. In March 1913, Nansen's youngest son Asmund died after a long illness. In the summer of 1913, Nansen travelled to the Kara Sea, at the invitation of Jonas Lied, as part of a delegation investigating a possible trade route between Western Europe and the Siberian interior. The party then took a steamer up the Yenisei River to Krasnoyarsk, and travelled on the Trans-Siberian Railway to Vladivostok before turning for home. Nansen published a report of the trip in Through Siberia. The life and culture of the Russian peoples aroused in Nansen an interest and sympathy he would carry through to his later life. Immediately before the First World War, Nansen joined Helland-Hansen in an oceanographic cruise in eastern Atlantic waters. 
Statesman and humanitarian League of Nations On the outbreak of war in 1914, Norway declared its neutrality, alongside Sweden and Denmark. Nansen was appointed as the president of the Norwegian Union of Defence, but had few official duties, and continued with his professional work as far as circumstances permitted. As the war progressed, the loss of Norway's overseas trade led to acute shortages of food in the country, which became critical in April 1917, when the United States entered the war and placed extra restrictions on international trade. Nansen was dispatched to Washington by the Norwegian government; after months of discussion, he secured food and other supplies in return for the introduction of a rationing system. When his government hesitated over the deal, he signed the agreement on his own initiative. Within a few months of the war's end in November 1918, a draft agreement had been accepted by the Paris Peace Conference to create a League of Nations, as a means of resolving disputes between nations by peaceful means. The foundation of the League at this time was providential as far as Nansen was concerned, giving him a new outlet for his restless energy. He became president of the Norwegian League of Nations Society, and although the Scandinavian nations with their traditions of neutrality initially held themselves aloof, his advocacy helped to ensure that Norway became a full member of the League in 1920, and he became one of its three delegates to the League's General Assembly. In April 1920, at the League's request, Nansen began organising the repatriation of around half a million prisoners of war, stranded in various parts of the world. Of these, 300,000 were in Russia which, gripped by revolution and civil war, had little interest in their fate. Nansen was able to report to the Assembly in November 1920 that around 200,000 men had been returned to their homes. 
"Never in my life", he said, "have I been brought into touch with so formidable an amount of suffering." Nansen continued this work for a further two years until, in his final report to the Assembly in 1922, he was able to state that 427,886 prisoners had been repatriated to around 30 different countries. In paying tribute to his work, the responsible committee recorded that the story of his efforts "would contain tales of heroic endeavour worthy of those in the accounts of the crossing of Greenland and the great Arctic voyage." Russian famine Even before this work was complete, Nansen was involved in a further humanitarian effort. On 1 September 1921, prompted by the British delegate Philip Noel-Baker, he accepted the post of the League's High Commissioner for Refugees. His main brief was the resettlement of around two million Russian refugees displaced by the upheavals of the Russian Revolution. At the same time he tried to tackle the urgent problem of famine in Russia; following a widespread failure of crops, around 30 million people were threatened with starvation and death. Despite Nansen's pleas on behalf of the starving, Russia's revolutionary government was feared and distrusted internationally, and the League was reluctant to come to its people's aid. Nansen had to rely largely on fundraising from private organisations, and his efforts met with limited success. Later he was to express himself bitterly on the matter. A major problem impeding Nansen's work on behalf of refugees was that most of them lacked documentary proof of identity or nationality. Without legal status in their country of refuge, their lack of papers meant they were unable to go anywhere else. To overcome this, Nansen devised a document that became known as the "Nansen passport", a form of identity for stateless persons that was in time recognised by more than 50 governments, and which allowed refugees to cross borders legally. 
Although the passport was created initially for refugees from Russia, it was extended to cover other groups. While attending the Conference of Lausanne in November 1922, Nansen learned that he had been awarded the Nobel Peace Prize for 1922. The citation referred to "his work for the repatriation of the prisoners of war, his work for the Russian refugees, his work to bring succour to the millions of Russians afflicted by famine, and finally his present work for the refugees in Asia Minor and Thrace". Nansen donated the prize money to international relief efforts. Greco-Turkish resettlement After the Greco-Turkish War of 1919–1922, Nansen travelled to Constantinople to negotiate the resettlement of hundreds of thousands of refugees, mainly ethnic Greeks who had fled from Turkey after the defeat of the Greek Army. The impoverished Greek state was unable to take them in, and so Nansen devised a scheme for a population exchange whereby half a million Turks in Greece were returned to Turkey, with full financial compensation, while further loans facilitated the absorption of the refugee Greeks into their homeland. Despite some controversy over the principle of a population exchange, the plan was implemented successfully over a period of several years. Armenian genocide From 1925 onwards, Nansen devoted much time to trying to help Armenian refugees, victims of the Armenian genocide at the hands of the Ottoman Empire during the First World War and of further ill-treatment thereafter. His goal was the establishment of a national home for these refugees, within the borders of Soviet Armenia. His main assistant in this endeavour was Vidkun Quisling, the future Nazi collaborator and head of a Norwegian puppet government during the Second World War. After visiting the region, Nansen presented the Assembly with a modest plan for irrigating an area of land on which 15,000 refugees could be settled. 
The plan ultimately failed, because even with Nansen's unremitting advocacy the money to finance the scheme was not forthcoming. Despite this failure, his reputation among the Armenian people remains high. Nansen wrote Armenia and the Near East (1923), in which he describes the plight of the Armenians in the wake of their country's loss of independence to the Soviet Union. The book was translated into many languages. After his visit to Armenia, Nansen wrote two additional books: Across Armenia (1927) and Through the Caucasus to the Volga (1930). Within the League's Assembly, Nansen spoke out on many issues besides those related to refugees. He believed that the Assembly gave smaller countries such as Norway a "unique opportunity for speaking in the councils of the world." He believed that the extent of the League's success in reducing armaments would be the greatest test of its credibility. He was a signatory to the Slavery Convention of 25 September 1926, which sought to outlaw the use of forced labour. He supported a settlement of the post-war reparations issue and championed Germany's membership of the League, which was granted in September 1926 after intensive preparatory work by Nansen. Later life On 17 January 1919 Nansen married Sigrun Munthe, a long-time friend with whom he had had a love affair in 1905, while Eva was still alive. The marriage was resented by the Nansen children, and proved unhappy; an acquaintance writing of them in the 1920s said Nansen appeared unbearably miserable and Sigrun steeped in hate. Nansen's League of Nations commitments through the 1920s meant that he was mostly absent from Norway, and was able to devote little time to scientific work. Nevertheless, he continued to publish occasional papers. He entertained the hope that he might travel to the North Pole by airship, but could not raise sufficient funding. In any event he was forestalled in this ambition by Amundsen, who flew over the pole in Umberto Nobile's airship Norge in May 1926. 
Two years later Nansen broadcast a memorial oration to Amundsen, who had disappeared in the Arctic while organising a rescue party for Nobile whose airship had crashed during a second polar voyage. Nansen said of Amundsen: "He found an unknown grave under the clear sky of the icy world, with the whirring of the wings of eternity through space." In 1926 Nansen was elected Rector of the University of St Andrews in Scotland, the first foreigner to hold this largely honorary position. He used the occasion of his inaugural address to review his life and philosophy, and to deliver a call to the youth of the next generation. He ended: We all have a Land of Beyond to seek in our life—what more can we ask? Our part is to find the trail that leads to it. A long trail, a hard trail, maybe; but the call comes to us, and we have to go. Rooted deep in the nature of every one of us is the spirit of adventure, the call of the wild—vibrating under all our actions, making life deeper and higher and nobler. Nansen largely avoided involvement in domestic Norwegian politics, but in 1924 he was persuaded by the long-retired former Prime Minister Christian Michelsen to take part in a new anti-communist political grouping, the Fatherland League. There were fears in Norway that should the Marxist-oriented Labour Party gain power it would introduce a revolutionary programme. At the inaugural rally of the League in Oslo (as Christiania had now been renamed), Nansen declared: "To talk of the right of revolution in a society with full civil liberty, universal suffrage, equal treatment for everyone ... [is] idiotic nonsense." Following continued turmoil between the centre-right parties, there was even an independent petition in 1926 gaining some momentum that proposed for Nansen to head a centre-right national unity government on a balanced budget program, an idea he did not reject. 
He was the headline speaker at the single largest Fatherland League rally, with 15,000 attendees, in Tønsberg in 1928. In 1929 he went on his final tour for the League on the ship Stella Polaris, giving speeches from Bergen to Hammerfest. In between his various duties and responsibilities, Nansen had continued to take skiing holidays when he could. In February 1930, aged 68, he took a short break in the mountains with two old friends, who noted that Nansen was slower than usual and appeared to tire easily. On his return to Oslo he was laid up for several months, with influenza and later phlebitis, and was visited on his sickbed by King Haakon VII. Nansen was a close friend of a clergyman named Wilhelm; Nansen himself was an atheist. Death and legacy Nansen died of a heart attack on 13 May 1930. He was given a non-religious state funeral before cremation, after which his ashes were laid under a tree at Polhøgda. Nansen's daughter Liv recorded that there were no speeches, just music: Schubert's Death and the Maiden, which Eva used to sing. In his lifetime and thereafter, Nansen received honours and recognition from many countries. Among the many tributes paid to him subsequently was that of Lord Robert Cecil, a fellow League of Nations delegate, who spoke of the range of Nansen's work, done with no regard for his own interests or health: "Every good cause had his support. He was a fearless peacemaker, a friend of justice, an advocate always for the weak and suffering." Nansen was a pioneer and innovator in many fields. As a young man he embraced the revolution in skiing methods that transformed it from a means of winter travel to a universal sport, and quickly became one of Norway's leading skiers. He was later able to apply this expertise to the problems of polar travel, in both his Greenland and his Fram expeditions. 
He invented the "Nansen sledge" with broad, ski-like runners, the "Nansen cooker" to improve the heat efficiency of the standard spirit stoves then in use, and the layer principle in polar clothing, whereby the traditionally heavy, awkward garments were replaced by layers of lightweight material. In science, Nansen is recognised both as one of the founders of modern neurology, and as a significant contributor to early oceanographical science, in particular for his work in establishing the Central Oceanographic Laboratory in Christiania. Through his work on behalf of the League of Nations, Nansen helped to establish the principle of international responsibility for refugees. Immediately after his death the League set up the Nansen International Office for Refugees, a semi-autonomous body under the League's authority, to continue his work. The Nansen Office faced great difficulties, in part arising from the large numbers of refugees from the European dictatorships during the 1930s. Nevertheless, it secured the agreement of 14 countries (including a reluctant Great Britain) to the Refugee Convention of 1933. It also helped to repatriate 10,000 Armenians to Yerevan in Soviet Armenia, and to find homes for a further 40,000 in Syria and Lebanon. In 1938, the year in which it was superseded by a wider-ranging body, the Nansen Office was awarded the Nobel Peace Prize. In 1954, the League's successor body, the United Nations, established the Nansen Medal, later named the Nansen Refugee Award, given annually by the United Nations High Commissioner for Refugees to an individual, group or organisation "for outstanding work on behalf of the forcibly displaced". 
Numerous geographical features bear his name: the Nansen Basin and the Nansen-Gakkel Ridge in the Arctic Ocean; Mount Nansen in the Yukon region of Canada; Mount Nansen, Mount Fridtjof Nansen and Nansen Island, all in Antarctica; as well as Nansen Island in the Kara Sea, Nansen Land in Greenland and Nansen Island in Franz Josef Land; 853 Nansenia, an asteroid; Nansen crater at the Moon's north pole and Nansen crater on Mars. His Polhøgda mansion is now home to the Fridtjof Nansen Institute, an independent foundation which engages in research on environmental, energy and resource management politics. A 1968 Norwegian/Soviet biographical film Just a Life: the Story of Fridtjof Nansen was released with Knut Wigert as Nansen. The Royal Norwegian Navy launched the first of a series of five Fridtjof Nansen-class frigates in 2004, with HNoMS Fridtjof Nansen as the lead ship. The cruise ship MS Fridtjof Nansen was launched in 2020. Works Paa ski over Grønland. En skildring af Den norske Grønlands-ekspedition 1888–89. Aschehoug, Kristiania 1890. Tr. as The First Crossing of Greenland, 1890. Eskimoliv. Aschehoug, Kristiania 1891. Tr. as Eskimo Life, (London: Longmans, Green & Co., 1893). Fram over Polhavet. Den norske polarfærd 1893–1896. Aschehoug, Kristiania 1897. Tr. as Farthest North, 1897. The Norwegian North Polar Expedition, 1893–1896; Scientific Results (6 volumes, 1901). Norge og foreningen med Sverige. Jacob Dybwads Forlag, Kristiania 1905. Tr. as Norway and the Union With Sweden, 1905. Northern Waters: Captain Roald Amundsen's Oceanographic Observations in the Arctic Seas in 1901. Jacob Dybwads Forlag, Kristiania, 1906. Nord i tåkeheimen. Utforskningen av jordens nordlige strøk i tidlige tider. Jacob Dybwads Forlag, Kristiania 1911. Tr. as In Northern Mists: Arctic Exploration in Early Times, 1911. Gjennem Sibirien. Jacob Dybwads forlag, Kristiania, 1914. Tr. as Through Siberia the Land of the Future, 1914. Frilufts-liv. Jacob Dybwads Forlag, Kristiania, 1916. En ferd til Spitsbergen. 
Jacob Dybwads Forlag, Kristiania, 1920. Rusland og freden. Jacob Dybwads Forlag, Kristiania, 1923. Blant sel og bjørn. Min første Ishavs-ferd. Jacob Dybwads Forlag, Kristiania, 1924. Gjennem Armenia. Jacob Dybwads Forlag, Oslo, 1927. Gjennem Kaukasus til Volga. Jacob Dybwads Forlag, Oslo, 1929. Tr. as Through The Caucasus To The Volga, 1931. English translations Armenia and the Near East. Publisher: J.C. & A.L. Fawcett, Inc., New York, 1928. (excerpts). See also Arctic exploration List of polar explorers Nansen Ski Club Nansen Ski Jump Notes References Inline citations Sources referenced Brøgger, Waldemar Christofer and Rolfsen, Nordahl (translated by William Archer (1896)). Fridtiof Nansen 1861–1893. New York. Longmans Green & Co. (First published in 1997 by Gerald Duckworth) (in Norwegian) Further reading Jones, Max (1 March 2021). "Exploration, Celebrity, and the Making of a Transnational Hero: Fridtjof Nansen and the Fram Expedition". The Journal of Modern History. 93 (1): 68–108. External links Portrait of Fridtjof Nansen published on ICRC Library and Archives blog CROSS-files Fridtjof Nansen Collection at Dartmouth College Library including the Nobel Lecture. 
19 December 1922: The Suffering People of Europe
https://en.wikipedia.org/wiki/Frederick%20Augustus%20II%20of%20Saxony
Frederick Augustus II of Saxony
Frederick Augustus II (18 May 1797, Dresden – 9 August 1854, Brennbüchel, Karrösten, Tyrol) was King of Saxony and a member of the House of Wettin. He was the eldest son of Maximilian, Prince of Saxony – younger son of the Elector Frederick Christian of Saxony – by his first wife, Caroline of Bourbon, Princess of Parma. Life Early years From his birth, it was clear that one day Frederick Augustus would become the ruler of Saxony. His father was the only son of the Elector Frederick Christian of Saxony who left surviving male issue. When King Frederick Augustus I died (1827) and Anton succeeded him as King, Frederick Augustus became second in line to the throne, preceded only by his father Maximilian. He was an officer in the War of the Sixth Coalition, but had little interest in military affairs. Co-Regent to the Kingdom The July Revolution of 1830 in France marked the beginning of disturbances in Saxony that autumn. The people demanded a change in the constitution and a young regent to share the government with King Anton. On 1 September the Prince Maximilian renounced his rights of succession in favor of his son Frederick Augustus, who was proclaimed Prince Co-Regent (de: Prinz-Mitregenten) of Saxony. On 2 February 1832 Frederick Augustus granted the cities free autonomy, and by an edict of 17 March of that year the farmers were freed from the corvée and hereditary submission. King of Saxony On 6 June 1836, King Anton died and Frederick Augustus succeeded him. An intelligent man, he quickly became popular with the people, as he had been since the time of his regency. The new king addressed political questions only from a sense of duty, preferring to leave such matters in the hands of his ministers. The Criminal Code of 1836 created a standardized jurisdiction for Saxony. 
During the Revolutionary disturbances of 1848 (March Revolution), he appointed liberal ministers in the government, lifted censorship, and introduced a liberal electoral law. Later his attitude changed. On 28 April Frederick Augustus II dissolved the Parliament. In 1849, Frederick Augustus was forced to flee to the Königstein Fortress. The May Uprising was crushed by Saxon and Prussian troops, and Frederick Augustus was able to return after only a few days. Journey through England and Scotland In 1844 Frederick Augustus, accompanied by his personal physician Carl Gustav Carus, made an informal (incognito) visit to England and Scotland. Among places they visited were Lyme Regis where he purchased from the local fossil collector and dealer, Mary Anning, an ichthyosaur skeleton for his own extensive natural history collection. It was not a state visit, but the King was the guest of Queen Victoria and Prince Albert at Windsor Castle, visited many of the sights in London and in the university cities of Oxford and Cambridge, and toured widely in England, Wales and Scotland. Accidental Death During a journey in Tyrol, he had an accident in Brennbüchel in which he fell in front of a horse that stepped on his head. On 9 August 1854, he died in the Gasthof Neuner. He was buried on 16 August in the Katholische Hofkirche of Dresden. In his memory, the Dowager Queen Maria arranged to establish the Königskapelle (King's Chapel) at the site of the accident; it was consecrated one year later. Some of the last members of the Saxon royal family, including Maria Emanuel, Margrave of Meissen, are buried beside the chapel. Marriages In Vienna on 26 September 1819 (by proxy) and again in Dresden on 7 October 1819 (in person), Frederick Augustus married firstly Archduchess Maria Caroline of Austria (Maria Karoline Ferdinande Theresia Josephine Demetria), daughter of Emperor Francis I of Austria. They had no children. 
In Dresden on 24 April 1833 Frederick Augustus married secondly Princess Maria Anna of Bavaria (Maria Anna Leopoldine Elisabeth Wilhelmine), daughter of King Maximilian I Joseph of Bavaria. Like his first marriage, this was childless. The musician Theodor Uhlig (1822–1853) was an illegitimate son of Frederick Augustus. Having no legitimate issue, Frederick Augustus was succeeded after his death by his younger brother, Johann.
https://en.wikipedia.org/wiki/Free%20market
Free market
In economics, a free market is a system in which the prices for goods and services are self-regulated by buyers and sellers negotiating in an open market without market coercions. In a free market, the laws and forces of supply and demand are free from any intervention by a government or other authority, other than those interventions made to prohibit market coercions. Examples of such prohibited market coercions include economic privilege, monopolies, and artificial scarcities. Proponents of the concept of a free market contrast it with a regulated market, in which a government intervenes in the exchange of property for any reason other than reducing market coercions. Scholars contrast the concept of a free market with the concept of a coordinated market in fields of study such as political economy, new institutional economics, economic sociology and political science. All of these fields emphasize the importance, in currently existing market systems, of rule-making institutions external to the simple forces of supply and demand which create space for those forces to operate to control productive output and distribution. Although free markets are commonly associated with capitalism in contemporary usage and popular culture, free markets have also been components in some forms of market socialism. Critics of the theoretical concept point to the practical difficulty of regulating systems to prevent significant market dominance, inequality of bargaining power, or information asymmetry, so as to allow markets to function more freely. Historically, free market has also been used synonymously with other economic policies. For instance, proponents of laissez-faire capitalism may refer to it as free-market capitalism because they claim it achieves the greatest economic freedom. Economic systems Capitalism Capitalism is an economic system based on the private ownership of the means of production and their operation for profit. 
Central characteristics of capitalism include capital accumulation, competitive markets, a price system, private property and the recognition of property rights, voluntary exchange and wage labor. In a capitalist market economy, decision-making and investments are determined by every owner of wealth, property or production ability in capital and financial markets whereas prices and the distribution of goods and services are mainly determined by competition in goods and services markets. Economists, historians, political economists and sociologists have adopted different perspectives in their analyses of capitalism and have recognized various forms of it in practice. These include laissez-faire or free-market capitalism, state capitalism and welfare capitalism. Different forms of capitalism feature varying degrees of free markets, public ownership, obstacles to free competition and state-sanctioned social policies. The degree of competition in markets and the role of intervention and regulation as well as the scope of state ownership vary across different models of capitalism. The extent to which different markets are free and the rules defining private property are matters of politics and policy. Most of the existing capitalist economies are mixed economies that combine elements of free markets with state intervention and in some cases economic planning. Market economies have existed under many forms of government and in many different times, places and cultures. Modern capitalist societies—marked by a universalization of money-based social relations, a consistently large and system-wide class of workers who must work for wages (the proletariat) and a capitalist class which owns the means of production—developed in Western Europe in a process that led to the Industrial Revolution. Capitalist systems with varying degrees of direct government intervention have since become dominant in the Western world and continue to spread. 
Capitalism has been shown to be strongly correlated with economic growth. Georgism For classical economists such as Adam Smith, the term free market refers to a market free from all forms of economic privilege, monopolies and artificial scarcities. They say this implies that economic rents, which they describe as profits generated from a lack of perfect competition, must be reduced or eliminated as much as possible through free competition. Economic theory suggests the returns to land and other natural resources are economic rents that cannot be reduced in such a way because of their perfectly inelastic supply. Some economic thinkers emphasize the need to share those rents as an essential requirement for a well-functioning market. It is suggested this would both eliminate the need for regular taxes that have a negative effect on trade (see deadweight loss) and release land and resources that are speculated upon or monopolised, two changes that would improve competition and free-market mechanisms. Winston Churchill supported this view with the statement: "Land is the mother of all monopoly". The American economist and social philosopher Henry George, the most famous proponent of this thesis, wanted to accomplish this through a high land value tax that replaces all other taxes. Followers of his ideas are often called Georgists or geoists and geolibertarians. Léon Walras, one of the founders of neoclassical economics who helped formulate the general equilibrium theory, had a very similar view. He argued that free competition could only be realized under conditions of state ownership of natural resources and land. Additionally, income taxes could be eliminated because the state would receive income to finance public services through owning such resources and enterprises. 
Laissez-faire The laissez-faire principle expresses a preference for an absence of non-market pressures on prices and wages such as those from discriminatory government taxes, subsidies, tariffs, regulations, or government-granted monopolies. In The Pure Theory of Capital, Friedrich Hayek argued that the goal is the preservation of the unique information contained in the price itself. According to Karl Popper, the idea of the free market is paradoxical, as it requires interventions towards the goal of preventing interventions. Although laissez-faire has been commonly associated with capitalism, there is a similar economic theory associated with socialism called left-wing or socialist laissez-faire, also known as free-market anarchism, free-market anti-capitalism and free-market socialism to distinguish it from laissez-faire capitalism. Critics of laissez-faire as commonly understood argue that a truly laissez-faire system would be anti-capitalist and socialist. American individualist anarchists such as Benjamin Tucker saw themselves as economic free-market socialists and political individualists while arguing that their "anarchistic socialism" or "individual anarchism" was "consistent Manchesterism". Socialism Various forms of socialism based on free markets have existed since the 19th century. Early notable socialist proponents of free markets include Pierre-Joseph Proudhon, Benjamin Tucker and the Ricardian socialists. These economists believed that genuinely free markets and voluntary exchange could not exist within the exploitative conditions of capitalism. These proposals ranged from various forms of worker cooperatives operating in a free-market economy such as the mutualist system proposed by Proudhon, to state-owned enterprises operating in unregulated and open markets. These models of socialism are not to be confused with other forms of market socialism (e.g. 
the Lange model) where publicly owned enterprises are coordinated by various degrees of economic planning, or where capital good prices are determined through marginal cost pricing. Advocates of free-market socialism such as Jaroslav Vanek argue that genuinely free markets are not possible under conditions of private ownership of productive property. Instead, he contends that the class differences and inequalities in income and power that result from private ownership enable the interests of the dominant class to skew the market in their favor, either in the form of monopoly and market power, or by utilizing their wealth and resources to legislate government policies that benefit their specific business interests. Additionally, Vanek states that workers in a socialist economy based on cooperative and self-managed enterprises have stronger incentives to maximize productivity because they would receive a share of the profits (based on the overall performance of their enterprise) in addition to receiving their fixed wage or salary. The stronger incentives to maximize productivity that he conceives as possible in a socialist economy based on cooperative and self-managed enterprises might be accomplished in a free-market economy if employee-owned companies were the norm, as envisioned by various thinkers including Louis O. Kelso and James S. Albus. Socialists also assert that free-market capitalism leads to an excessively skewed distribution of income and to economic instabilities which in turn lead to social instability. Corrective measures in the form of social welfare, redistributive taxation and regulation, together with their associated administrative costs, create agency costs for society; these costs, they argue, would not be required in a self-managed socialist economy.
Concepts

Economic equilibrium

The general equilibrium theory has demonstrated that, under certain theoretical conditions of perfect competition, the law of supply and demand influences prices toward an equilibrium that balances the demands for the products against the supplies. At these equilibrium prices, the market distributes the products to the purchasers according to each purchaser's preference or utility for each product and within the relative limits of each buyer's purchasing power. This result is described as market efficiency, or more specifically a Pareto optimum.

Low barriers to entry

A free market does not directly require the existence of competition; however, it does require a framework that freely allows new market entrants. Hence, competition in a free market is a consequence of the conditions of a free market, including that market participants not be obstructed from following their profit motive.

Perfect competition and market failure

An absence of any of the conditions of perfect competition is considered a market failure. Regulatory intervention may provide a substitute force to counter a market failure, which leads some economists to believe that some forms of market regulation may be better than an unregulated market at providing a free market.

Spontaneous order

Friedrich Hayek popularized the view that market economies promote spontaneous order which results in a better "allocation of societal resources than any design could achieve". According to this view, market economies are characterized by the formation of complex transactional networks that produce and distribute goods and services throughout the economy. These networks are not designed, but they nevertheless emerge as a result of decentralized individual economic decisions. The idea of spontaneous order is an elaboration on the invisible hand proposed by Adam Smith in The Wealth of Nations.
About the individual, Smith wrote: By preferring the support of domestic to that of foreign industry, he intends only his own security; and by directing that industry in such a manner as its produce may be of the greatest value, he intends only his own gain, and he is in this, as in many other cases, led by an invisible hand to promote an end which was no part of his intention. Nor is it always the worse for society that it was no part of it. By pursuing his own interest, he frequently promotes that of the society more effectually than when he really intends to promote it. I have never known much good done by those who affected to trade for the public good. Smith pointed out that one does not get one's dinner by appealing to the brother-love of the butcher, the farmer or the baker. Rather, one appeals to their self-interest and pays them for their labor, arguing: It is not from the benevolence of the butcher, the brewer or the baker, that we expect our dinner, but from their regard to their own self-interest. We address ourselves, not to their humanity but to their self-love, and never talk to them of our own necessities but of their advantages. Supporters of this view claim that spontaneous order is superior to any order that does not allow individuals to make their own choices of what to produce, what to buy, what to sell and at what prices due to the number and complexity of the factors involved. They further believe that any attempt to implement central planning will result in more disorder, or a less efficient production and distribution of goods and services. 
Critics such as political economist Karl Polanyi question whether a spontaneously ordered market can exist, completely free of distortions of political policy, claiming that even the ostensibly freest markets require a state to exercise coercive power in some areas, namely to enforce contracts, govern the formation of labor unions, spell out the rights and obligations of corporations, shape who has standing to bring legal actions and define what constitutes an unacceptable conflict of interest.

Supply and demand

Demand for an item (such as goods or services) refers to the economic market pressure from people trying to buy it. Buyers have a maximum price they are willing to pay for an item, and sellers have a minimum price at which they are willing to offer their product. The point at which the supply and demand curves meet is the equilibrium price of the good and quantity demanded. Sellers willing to offer their goods at a lower price than the equilibrium price receive the difference as producer surplus. Buyers willing to pay for goods at a higher price than the equilibrium price receive the difference as consumer surplus. The model is commonly applied to wages in the market for labor, where the typical roles of supplier and consumer are reversed: the suppliers are individuals, who try to sell (supply) their labor for the highest price, and the consumers are businesses, which try to buy (demand) the type of labor they need at the lowest price. As more people offer their labor in that market, the equilibrium wage decreases and the equilibrium level of employment increases as the supply curve shifts to the right. The opposite happens if fewer people offer their labor in the market, as the supply curve shifts to the left. In a free market, individuals and firms taking part in these transactions have the liberty to enter, leave and participate in the market as they so choose.
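The equilibrium mechanics described above can be illustrated numerically. The following is a minimal sketch with hypothetical linear demand and supply curves (all coefficients are invented for illustration, not taken from the article):

```python
# Hypothetical linear curves (illustrative coefficients only):
#   demand:  Qd = a - b * p   (buyers demand less as the price rises)
#   supply:  Qs = c + d * p   (sellers offer more as the price rises)
# Setting Qd = Qs gives the equilibrium price p* = (a - c) / (b + d).

def equilibrium(a, b, c, d):
    """Return (price, quantity) where the demand and supply curves meet."""
    p = (a - c) / (b + d)
    q = a - b * p
    return p, q

p_star, q_star = equilibrium(a=100.0, b=2.0, c=10.0, d=1.0)
# p* = (100 - 10) / (2 + 1) = 30.0, q* = 100 - 2 * 30 = 40.0
print(p_star, q_star)  # 30.0 40.0
```

A price floor above p* (such as a minimum wage in the labor version of the model) would leave quantity supplied exceeding quantity demanded at that price, which is the surplus/shortage logic the following paragraphs rely on.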
Prices and quantities are allowed to adjust according to economic conditions in order to reach equilibrium and allocate resources. However, in many countries around the world governments seek to intervene in the free market in order to achieve certain social or political agendas. Governments may attempt to create social equality or equality of outcome by intervening in the market through actions such as imposing a minimum wage (price floor) or erecting price controls (price ceiling). Other lesser-known goals are also pursued, such as in the United States, where the federal government subsidizes owners of fertile land not to grow crops in order to prevent the supply curve from further shifting to the right and decreasing the equilibrium price. This is done under the justification of maintaining farmers' profits; due to the relative inelasticity of demand for crops, increased supply would lower the price but not significantly increase quantity demanded, thus placing pressure on farmers to exit the market. Such interventions are often done in the name of maintaining basic assumptions of free markets, such as the idea that the costs of production must be included in the price of goods. Pollution and depletion costs are sometimes not included in the cost of production (for example, a manufacturer that withdraws water at one location then discharges it polluted downstream avoids the cost of treating the water), so governments may opt to impose regulations in an attempt to internalize all of the costs of production and ultimately include them in the price of the goods.
Advocates of the free market contend that government intervention hampers economic growth by disrupting the efficient allocation of resources according to supply and demand, while critics of the free market contend that government intervention is sometimes necessary to protect a country's economy from better-developed and more influential economies, while providing the stability necessary for wise long-term investment. Milton Friedman argued against central planning, price controls and state-owned corporations, particularly as practiced in the Soviet Union and China, while Ha-Joon Chang cites the examples of post-war Japan and the growth of South Korea's steel industry as positive examples of government intervention.

Criticism

Critics of a laissez-faire free market have argued that in real-world situations it has proven to be susceptible to the development of price-fixing monopolies. Such reasoning has led to government intervention, e.g. the United States antitrust law. Two prominent Canadian authors argue that government at times has to intervene to ensure competition in large and important industries. Naomi Klein illustrates this roughly in her work The Shock Doctrine and John Ralston Saul more humorously illustrates this through various examples in The Collapse of Globalism and the Reinvention of the World. While its supporters argue that only a free market can create healthy competition and therefore more business and reasonable prices, opponents say that a free market in its purest form may result in the opposite. According to Klein and Ralston, the merging of companies into giant corporations or the privatization of government-run industry and national assets often results in monopolies or oligopolies requiring government intervention to force competition and reasonable prices. Another form of market failure is speculation, where transactions are made to profit from short-term fluctuation, rather than from the intrinsic value of the companies or products.
This criticism has been challenged by historians such as Lawrence Reed, who argued that monopolies have historically failed to form even in the absence of antitrust law. This is because monopolies are inherently difficult to maintain: a company that tries to maintain its monopoly by buying out new competitors, for instance, is incentivizing newcomers to enter the market in hope of a buy-out. Furthermore, according to writer Walter Lippmann and economist Milton Friedman, historical analysis of the formation of monopolies reveals that, contrary to popular belief, these were the result not of unfettered market forces, but of legal privileges granted by government. American philosopher and author Cornel West has derisively termed what he perceives as dogmatic arguments for laissez-faire economic policies as free-market fundamentalism. West has contended that such a mentality "trivializes the concern for public interest" and "makes money-driven, poll-obsessed elected officials deferential to corporate goals of profit – often at the cost of the common good". American political philosopher Michael J. Sandel contends that in the last thirty years the United States has moved beyond just having a market economy and has become a market society where literally everything is for sale, including aspects of social and civic life such as education, access to justice and political influence. The economic historian Karl Polanyi was highly critical of the idea of the market-based society in his book The Great Transformation, noting that any attempt at its creation would undermine human society and the common good. David McNally of the University of Houston argues in the Marxist tradition that the logic of the market inherently produces inequitable outcomes and leads to unequal exchanges, arguing that Adam Smith's moral intent and moral philosophy espousing equal exchange was undermined by the practice of the free market he championed.
According to McNally, the development of the market economy involved coercion, exploitation and violence that Smith's moral philosophy could not countenance. McNally also criticizes market socialists for believing in the possibility of fair markets based on equal exchanges to be achieved by purging parasitical elements from the market economy, such as private ownership of the means of production, arguing that market socialism is an oxymoron when socialism is defined as an end to wage labour. Some would argue that only one known example of a true free market exists, namely the black market. The black market is under constant threat by the police, but under no circumstances do the police regulate the substances that are being created. The black market produces wholly unregulated goods, which are purchased and consumed unregulated: anyone can produce anything at any time and anyone can purchase anything available at any time. The alternative view is that the black market is not a free market at all, since high prices and natural monopolies are often enforced through murder, theft and destruction. Black markets can only exist peripheral to regulated markets where laws are being regularly enforced.

See also

Binary economics
Crony capitalism
Economic liberalism
Freedom of choice
Free price system
Grey market
Left-wing market anarchism
Market economy
Neoliberalism
Participatory economics
Quasi-market
Self-managed economy
Transparency (market)

Notes

Further reading

Block, Fred and Somers, Margaret R (2014). The Power of Market Fundamentalism: Karl Polanyi's Critique. Harvard University Press.
Boettke, Peter J. "What Went Wrong with Economics?", Critical Review Vol. 11, No. 1, pp. 35, 58.
Harcourt, Bernard (2012). The Illusion of Free Markets: Punishment and the Myth of Natural Order. Harvard University Press.
Cox, Harvey (2016). The Market as God. Harvard University Press.
Hayek, Friedrich A. (1948). Individualism and Economic Order.
Chicago: University of Chicago Press. vii, 271, [1].
Palda, Filip (2011). Pareto's Republic and the New Science of Peace, chapters online. Published by Cooper-Wolfling.
Robin, Ron. "Castrophobia and the Free Market: The Wohlstetters' Moral Economy." The Cold World They Made: The Strategic Legacy of Roberta and Albert Wohlstetter, Harvard University Press, 2016, pp. 118–38.
Sandel, Michael J. (2013). What Money Can't Buy: The Moral Limits of Markets. Farrar, Straus and Giroux.
Stiglitz, Joseph (1994). Whither Socialism? Cambridge, Massachusetts: MIT Press.
Verhaeghe, Paul (2014). What About Me? The Struggle for Identity in a Market-Based Society. Scribe Publications.
Kuttner, Robert. "The Man from Red Vienna" (review of Gareth Dale, Karl Polanyi: A Life on the Left, Columbia University Press, 381 pp.), The New York Review of Books, vol. LXIV, no. 20 (21 December 2017), pp. 55–57. "In sum, Polanyi got some details wrong, but he got the big picture right. Democracy cannot survive an excessively free market; and containing the market is the task of politics. To ignore that is to court fascism." (Robert Kuttner, p. 57).
Philippon, Thomas. "The Rise in Market Power." The Great Reversal: How America Gave Up on Free Markets, Harvard University Press, 2019, pp. 45–61.
Noriega, Roger F., and Andrés Martínez-Fernández. The Free-Market Moment: Making Grassroots Capitalism Succeed Where Populism Has Failed. American Enterprise Institute, 2016.
Cremers, Jan, and Ronald Dekker. "Labour Arbitrage on European Labour Markets: Free Movement and the Role of Intermediaries." Towards a Decent Labour Market for Low Waged Migrant Workers, edited by Conny Rijken and Tesseltje de Lange, Amsterdam University Press, 2018, pp. 109–28.
Jónsson, Örn D., and Rögnvaldur J. Sæmundsson. "Free Market Ideology, Crony Capitalism, and Social Resilience." Gambling Debt: Iceland's Rise and Fall in the Global Economy, edited by E. Paul Durrenberger and Gisli Palsson, University Press of Colorado, 2015, pp. 23–32.
Mittermaier, Karl, and Isabella Mittermaier. "Free-Market Dogmatism and Pragmatism." In The Hand Behind the Invisible Hand: Dogmatic and Pragmatic Views on Free Markets and the State of Economic Theory, 1st ed., pp. 23–26. Bristol University Press, 2020.
Sloman, Peter. "Welfare in a Neoliberal Age: The Politics of Redistributive Market Liberalism." In The Neoliberal Age?: Britain since the 1970s, edited by Aled Davies, Ben Jackson, and Florence Sutcliffe-Braithwaite, pp. 75–93. UCL Press, 2021.
Orłowska, Agnieszka. "Toward Mutual Understanding, Respect, and Trust: On Past and Present Dog Training in Poland." Free Market Dogs: The Human-Canine Bond in Post-Communist Poland, edited by Michał Piotr Pręgowski and Justyna Włodarczyk, Purdue University Press, 2016, pp. 35–60.
Block, Fred, and Margaret R. Somers. "Turning the Tables: Polanyi's Critique of Free Market Utopianism." The Power of Market Fundamentalism, Harvard University Press, 2014, pp. 98–113.
Tomasi, John. "Free Market Fairness." Free Market Fairness, STU-Student edition, Princeton University Press, 2012, pp. 226–66.
Hoopes, James. "Corporations as Enemies of the Free Market." Corporate Dreams: Big Business in American Democracy from the Great Depression to the Great Recession, Rutgers University Press, 2011, pp. 27–32.
Althammer, Jörg. "Economic Efficiency and Solidarity: The Idea of a Social Market Economy." Free Markets with Sustainability and Solidarity, edited by Martin Schlag and Juan A. Mercado, Catholic University of America Press, 2016, pp. 199–216.
Chua, Beng Huat. "Disrupting Free Market: State Capitalism and Social Distribution." Liberalism Disavowed: Communitarianism and State Capitalism in Singapore, Cornell University Press, 2017, pp. 98–122.
Holland, Eugene W. "Free-Market Communism." Nomad Citizenship: Free-Market Communism and the Slow-Motion General Strike, NED-New edition, University of Minnesota Press, 2011, pp. 99–140.
Newland, Carlos. "Is Support for Capitalism Declining around the World? A Free-Market Mentality Index, 1990–2012." The Independent Review, vol. 22, no. 4, Independent Institute, 2018, pp. 569–83.
Taylor, Lance. "Keynesianism and the Crisis." Maynard's Revenge: The Collapse of Free Market Macroeconomics, Harvard University Press, 2010, pp. 337–58.
Adler, Jonathan H. "Excerpts from 'About Free-Market Environmentalism.'" In Environment and Society: A Reader, edited by Christopher Schlottmann, Dale Jamieson, Colin Jerolmack, Anne Rademacher, and Maria Damon, pp. 259–64. New York University Press, 2017.
Symons, Michael. "Free the Market! (It's Been Captured by Capitalism)." Meals Matter: A Radical Economics Through Gastronomy, Columbia University Press, 2020, pp. 225–46.
Higgs, Kerryn. "The Rise of Free Market Fundamentalism." Collision Course: Endless Growth on a Finite Planet, The MIT Press, 2014, pp. 79–104.
Sim, Stuart. "Neoliberalism, Financial Crisis, and Profit." Addicted to Profit: Reclaiming Our Lives from the Free Market, Edinburgh University Press, 2012, pp. 70–95.
Singer, Joseph William. "Why Consumer Protection Promotes the Free Market." No Freedom without Regulation: The Hidden Lesson of the Subprime Crisis, Yale University Press, 2015, pp. 58–94.
Roberts, Alasdair. "The Market Comes Back." The End of Protest: How Free-Market Capitalism Learned to Control Dissent, Cornell University Press, 2013, pp. 41–57.
Ott, Julia C. "The 'Free and Open Market' Responds." When Wall Street Met Main Street, Harvard University Press, 2011, pp. 36–54.
Zeitlin, Steve, and Bob Holman. "Free Market Flavor: Poetry of the Palate." The Poetry of Everyday Life: Storytelling and the Art of Awareness, 1st ed., Cornell University Press, 2016, pp. 127–31.
Baradaran, Mehrsa.
"The Free Market Confronts Black Poverty." The Color of Money: Black Banks and the Racial Wealth Gap, Harvard University Press, 2017, pp. 215–46.

External links

"Free market" at Encyclopædia Britannica
"Free Enterprise: The Economics of Cooperation" looks at how communication, coordination and cooperation interact to make free markets work

Free-market anarchism Capitalism Classical liberalism Economic ideologies Economic liberalism Economic systems Georgism Libertarianism Libertarian theory Market (economics) Market socialism
Ford GT40
The Ford GT40 is a high-performance endurance racing car commissioned by the Ford Motor Company. It grew out of the "Ford GT" (for Grand Touring) project, an effort to compete in European long-distance sports car races against Ferrari, which won every 24 Hours of Le Mans race from 1960 to 1965. Ford succeeded with the GT40, winning the 1966 through 1969 races. The effort began in the early 1960s when Ford Advanced Vehicles began to build the GT40 Mk I, based upon the Lola Mk6, at their base in Slough, UK. After disappointing race results, the engineering team was moved in 1964 to Dearborn, Michigan (Kar Kraft). The range was powered by a series of American-built Ford V8 engines modified for racing. In 1966, the GT40 Mk II broke Ferrari's streak at Le Mans, notching the first win for an American manufacturer in a major European race since Jimmy Murphy's triumph with Duesenberg at the 1921 French Grand Prix. In 1967, the Mk IV became the only car designed and built entirely in the United States to achieve the overall win at Le Mans. The Mk I, the oldest of the cars, won in 1968 and 1969, the second chassis to win Le Mans more than once. (This Ford/Shelby chassis, #P-1075, was believed to have been the first until the Ferrari 275P chassis 0816 was revealed to have won the 1964 race after winning the 1963 race in 250P configuration and with a 0814 chassis plate). Its American Ford V8 engine, originally of 4.7-liter displacement capacity (289 cubic inches), was enlarged to 4.9 liters (302 cubic inches), with custom alloy Gurney–Weslake cylinder heads. The "40" represented its height of 40 inches (1.02 m), measured at the windshield, the minimum allowed. The first 12 "prototype" vehicles carried serial numbers GT-101 to GT-112. Once "production" began, the Mk I, Mk II, Mk III, and Mk IV were numbered GT40P/1000 through GT40P/1145, and thus officially "GT40s". The Mk IVs were numbered J1-J12. The contemporary Ford GT is a modern homage to the GT40. 
History

Henry Ford II had wanted a Ford at Le Mans since the early 1960s. In early 1963, Ford reportedly received word through a European intermediary that Enzo Ferrari was interested in selling to Ford Motor Company. Ford reportedly spent several million dollars on an audit of Ferrari factory assets and on legal negotiations, only to have Ferrari unilaterally cut off talks at a late stage due to disputes about control of open-wheel racing. Ferrari, who wanted to remain the sole operator of his company's motorsports division, was angered when he was told that he would not be allowed to race at the Indianapolis 500 if the deal went through, since Ford fielded Indy cars using its own engine and did not want competition from Ferrari. Enzo cut the deal off out of spite, and Henry Ford II, enraged, directed his racing division to find a company that could build a Ferrari-beater on the world endurance-racing circuit. To this end, Ford began negotiations with Lotus, Lola, and Cooper. Cooper had no experience in GT or prototype racing, and its performances in Formula One were declining. The Lola proposal was chosen since Lola had used a Ford V8 engine in its mid-engined Lola Mk6 (also known as the Lola GT). It was one of the most advanced racing cars of the time and made a noted performance at Le Mans in 1963, even though the car did not finish due to low gearing and slow revving out on the Mulsanne Straight. However, Eric Broadley, Lola Cars' owner and chief designer, agreed on a short-term personal contribution to the project without involving Lola Cars. The agreement with Broadley included a one-year collaboration between Ford and Broadley, and the sale of the two Lola Mk 6 chassis builds to Ford. To form the development team, Ford also hired the ex-Aston Martin team manager John Wyer. Ford Motor Co. engineer Roy Lunn was sent to England; he had designed the mid-engined Mustang I concept car powered by a 1.7-liter V4.
Despite the small engine of the Mustang I, Lunn was the only Dearborn engineer with any experience of a mid-engined car. Overseen by Harley Copp, the team of Broadley, Lunn, and Wyer began working on the new car at the Lola factory in Bromley. At the end of 1963, the team moved to Slough, near Heathrow Airport. Ford then established Ford Advanced Vehicles (FAV) Ltd, a new subsidiary under the direction of Wyer, to manage the project. The first chassis, built by Abbey Panels of Coventry, was delivered on 16 March 1964, with fiberglass moldings produced by Fibre Glass Engineering Ltd of Farnham. The first "Ford GT", the GT/101, was unveiled in England on 1 April and soon after exhibited in New York. The purchase price of the completed car for competition use was £5,200. It was powered by the 4.7 L (289 cu in) Fairlane engine with a Colotti transaxle. An aluminium-block DOHC version, known as the Ford Indy Engine, was used in later years at Indy, winning in 1965 in the Lotus 38.

Racing history

The Ford GT40 was first raced in May 1964 at the Nürburgring 1000 km race, where it retired with suspension failure after holding second place early in the event. Three weeks later at the 24 Hours of Le Mans, all three entries retired, although the Ginther/Gregory car led the field from the second lap until its first pit stop. After a season-long series of dismal results under John Wyer in 1964, the program was handed over to Carroll Shelby after the 1964 Nassau race. The cars were sent directly to Shelby, still bearing the dirt and damage from the Nassau race. Carroll Shelby was noted for complaining that the cars were poorly maintained when he received them, but later information revealed the cars were packed up as soon as the race was over, and FAV never had a chance to clean and organize the cars before they were transported to Shelby.
Shelby's first victory came in their maiden race with the Ford program, with Ken Miles and Lloyd Ruby taking a Shelby American-entered Ford GT40 to victory in the Daytona 2000 in February 1965. One month later, Ken Miles and Bruce McLaren came in second overall (to the winning Chaparral in the sports class) and first in the prototype class at the Sebring 12-hour race. The rest of the season, however, was a disappointment. The experience gained in 1964 and 1965 allowed the 7-liter Mk II to dominate the following year. In February, the GT40 again won at Daytona. This was the first year Daytona was run in the 24-hour format, and Mk IIs finished 1st, 2nd, and 3rd. In March, at the 1966 12 Hours of Sebring, GT40s again took all three top finishes, with the X-1 Roadster first, a Mk II second, and a Mk I third. Then in June, at the 24 Hours of Le Mans, the GT40 achieved yet another 1–2–3 result. The Le Mans finish, however, was clouded in controversy. The No. 1 car of Ken Miles and Denny Hulme held a four-lap lead over the No. 2 car of Bruce McLaren and Chris Amon. This lead disintegrated when the No. 1 car was forced to make a pit stop to replace its brake rotors, an incorrect set having been fitted during a scheduled rotor change a lap earlier; the correct rotors, it turned out, had been taken by the No. 2 crew. This meant that in the final few hours, the Ford GT40 of New Zealanders Bruce McLaren and Chris Amon closely trailed the leading Ford GT40 driven by Englishman Ken Miles and New Zealander Denny Hulme. With a multimillion-dollar program finally on the very brink of success, Ford team officials faced a difficult choice.
They could allow the drivers to settle the outcome by racing each other, risking one or both cars breaking down or crashing; they could dictate a finishing order to the drivers, guaranteeing that one set of drivers would be extremely unhappy; or they could arrange a tie, with the McLaren/Amon and Miles/Hulme cars crossing the line side by side. The team chose the latter and informed Shelby, who told McLaren and Miles of the decision just before the two got into their cars for the final stint. Then, not long before the finish, the Automobile Club de l'Ouest (ACO), organizers of the Le Mans event, informed Ford that the geographical difference in starting positions would be taken into account at a close finish. This meant that the McLaren/Amon vehicle, which had started further back on the grid than the Miles/Hulme car, would have covered slightly more ground over the 24 hours and would, in the event of a tie for first place, be the winner. Secondly, Ford officials admitted later, the company's contentious relationship with Miles, its top contract driver, placed executives in a difficult position. They could reward an outstanding driver who had at times been extremely difficult to work with, or they could decide in favor of drivers (McLaren/Amon) who had committed less to the Ford program but who had been easier to deal with. Ford stuck with the orchestrated photo finish. What happened on the last lap remains the subject of speculation: either Miles, deeply bitter over this decision after his dedication to the program, issued his own protest by suddenly slowing just yards from the finish and letting McLaren across the line first, or McLaren accelerated just before the finish line, robbing Miles of his victory. Either way, McLaren was declared the victor. Miles died in a testing accident in the J-car (later to become the Mk IV) at Riverside Raceway in California just two months later.
Miles' death occurred at the wheel of the Ford "J-car", an iteration of the GT40 that included several unique features: an aluminum honeycomb chassis construction and a "bread van" body design that experimented with "Kammback" aerodynamic theories. The fatal accident was attributed at least partly to the unproven aerodynamics of the J-car design, as well as to the strength of the experimental chassis. The team embarked on a complete redesign of the car, which became known as the Mk IV. The newer Mk IV design, with a Mk II engine but a different chassis and a different body, won the following year at Le Mans (when four Mark IVs, three Mark IIs, and three Mark Is raced). The high speeds achieved in that race caused a rule change, which came into effect in 1968: prototypes were limited to a capacity of 3.0 liters, the same as in Formula One. This took out the V12-powered Ferrari 330P as well as the Chaparral and the Mk IV. Sports cars like the GT40 and the Lola T70 were still allowed, with a maximum of 5.0 L, provided at least 50 cars had been built. John Wyer's revised 4.7-liter Mk I (bored to 4.9 liters, with O-rings cut and installed between the block and head to prevent head gasket failure, a common problem with the 4.7 engine) won the 24 Hours of Le Mans in 1968 against the fragile smaller prototypes. This result, added to four other round wins for the GT40, gave Ford victory in the 1968 International Championship for Makes. The GT40's intended 3.0 L replacement, the Ford P68, and the Mirage cars proved dismal failures. While facing more experienced prototypes and the new yet still unreliable 4.5 L flat-12-powered Porsche 917s, Wyer's 1969 24 Hours of Le Mans winners Jacky Ickx/Jackie Oliver managed to beat the remaining 3.0-liter Porsche 908 by just a few seconds with the already outdated GT40 Mk I, in the very car that had won in 1968, the legendary GT40P/1075.
Apart from brake wear in the Porsche and the decision not to change its brake pads so close to the race end, the winning combination was relaxed driving by both GT40 drivers and heroic efforts at the right time by Ickx, then a Le Mans rookie, who won Le Mans five more times in later years.

Le Mans 24 Hours victories

International titles

In addition to four consecutive overall Le Mans victories, Ford also won the following four FIA international titles (at what was then unofficially known as the World Sportscar Championship) with the GT40:

1966 International Manufacturers Championship – Over 2000cc
1966 International Championship for Sports Cars – Division III (Over 2000cc)
1967 International Championship for Sports Cars – Division III (Over 2000cc)
1968 International Championship for Makes

Versions

Mk I

The Mk I was the original Ford GT40. Early prototypes were powered by 255 cu in (4.2 L) alloy V8 engines, and production models were powered by engines as used in the Ford Mustang. Five prototype models were built with roadster bodywork, including the Ford X-1. Two lightweight cars (of a planned five), AMGT40/1 and AMGT40/2, were built by Alan Mann Racing in 1966, with light alloy bodies and other weight-saving modifications. The Mk I met with little success in its initial tune for the 1964 and 1965 Le Mans races. After the cars' poor showing, culminating at the Nassau Speed Week in November 1964, the racing program was handed over to Carroll Shelby, whose team modified the Ford GT40 and achieved its first win at Daytona in February 1965. Many cars were later modified and run by John Wyer in 1968 and 1969, winning Le Mans in both those years and Sebring in 1969. The Mk II and Mk IV were both obsolete after the FIA changed the rules to ban unlimited-capacity engines, ruling out the 427 cu in (7 L) Ford V8. The Mk I, however, with its smaller engine, was legally able to race as a homologated sports car because of its production numbers.
In 1968, competition came from the Porsche 908, the first prototype built for the 3-liter Group 6 class. The 1968 season ended in resounding success at the 24 Hours of Le Mans, with Pedro Rodríguez and Lucien Bianchi holding a clear lead over the Porsches in the 'almighty' #9 car in Gulf Oil colors (Primotipo.com, "Ford GT40 1076", May 9, 2016). The season had begun slowly for JW, with losses at Sebring and Daytona before the team took its first win at the BOAC International 500 at Brands Hatch. Later victories included the Grand Prix de Spa, the 21st Annual Watkins Glen Sports Car Road Race and the 1000 km di Monza. The engine installed in this car was a naturally aspirated Windsor V8 with a compression ratio of 10.6:1, fed by four 2-barrel 48 IDA Weber carburetors, producing peak power at 6,000 rpm and maximum torque at 4,750 rpm. A total of 31 Mk I cars were built at the Slough factory in "road" trim, which differed little from the race versions. Wire wheels, carpet, ruched fabric map pockets in the doors and a cigarette lighter made up most of the changes. Some cars deleted the ventilated seats, and at least one (chassis 1049) was built with the opening, metal-framed windows from the Mk III.

X-1 Roadster

The X-1 was a roadster built to contest the fall 1965 North American Pro Series, a forerunner of Can-Am, entered by the Bruce McLaren team and driven by Chris Amon. The car had an aluminum chassis built at Abbey Panels and was originally powered by a 289 cu in (4.7 L) engine. The real purpose of this car was to test several improvements originating from Kar Kraft, Shelby, and McLaren. Several gearboxes were used: a Hewland LG500 and at least one automatic.
It was later upgraded to Mk II specification with a 427 cu in (7 L) engine and a standard four-speed Kar Kraft (a Ford subsidiary) gearbox; however, the car kept specific features such as its open roof and lightweight aluminum chassis. The car went on to win the 12 Hours of Sebring in 1966. The X-1 was a one-off and, having been built in the United Kingdom and being liable for United States tariffs, was later ordered to be destroyed by United States customs officials.

Mk II

The Mk II was rebuilt by Holman Moody to handle the 7.0-liter FE (427 cu in) engine from the Ford Galaxie, used in NASCAR at the time and modified for road-course use. The car's chassis was similar to the British-built Mk I chassis, but it and other parts of the car had to be redesigned and modified by Holman Moody to accommodate the larger and heavier 427 engine. A new Kar Kraft-built four-speed gearbox replaced the ZF five-speed used in the Mk I. This car is sometimes called the Ford Mk II. In 1966, the three teams racing the Mk II (Chris Amon and Bruce McLaren, Denny Hulme and Ken Miles, and Dick Hutcherson and Ronnie Bucknum) dominated Le Mans, taking European audiences by surprise and beating Ferrari to a 1-2-3 finish. The Ford GT40 went on to win the race for the next three years. For 1967, the Mk IIs were upgraded to "B" specification, with redesigned bodywork and twin Holley carburetors for additional power. A batch of improperly heat-treated input shafts in the transaxles sidelined virtually every Ford in the race at Daytona, however, and Ferrari finished 1-2-3. The Mk IIBs were also used at Sebring and Le Mans that year and won the Reims 12 Hours in France. For the Daytona 24 Hours, two Mk II models (chassis 1016 and 1047) had their engines re-badged as Mercury engines, Ford seeing a good opportunity to advertise that division of the company.
In 2018, a Mk II that finished third overall at the 1966 Le Mans 24 Hours was sold by RM Sotheby's for $9,795,000 (£7,624,344), the highest price achieved for a GT40 at auction.

Mk III

The Mk III was a road car only, of which seven were built. The car had four headlamps, the rear part of the body was expanded to make room for luggage, the 4.7-liter engine was detuned, the shock absorbers were softened, the shift lever was moved to the center, an ashtray was added, and the car was available with the steering wheel on the left side. As the Mk III looked significantly different from the racing models, many customers interested in buying a GT40 for road use chose instead a Mk I, which was available from Wyer Ltd. Of the seven Mk IIIs produced, four were left-hand drive.

J-car

In an effort to develop a car with better aerodynamics (potentially resulting in superior control and speed compared to competitors), the decision was made to re-conceptualize and redesign everything about the vehicle other than its powerful 7-liter engine, abandoning the original Mk I/Mk II chassis. To bring the car into alignment with Ford's "in house" ideology at the time, the partnerships with English firms were scaled back, resulting in the sale of Ford Advanced Vehicles (acquired by John Wyer); the new vehicle would be designed by Ford's studios and produced by Ford's subsidiary Kar Kraft under Ed Hull. There was also a partnership with the Brunswick Aircraft Corporation for expertise on the novel use of aluminum honeycomb panels bonded together to form a lightweight, rigid "tub". The car was designated the J-car, as it was constructed to meet the new Appendix J regulations introduced by the FIA in 1966. The first J-car was completed in March 1966 and set the fastest time at the Le Mans trials that year.
The tub and the complete car were both substantially lighter than the Mk II. It was decided to run the Mk IIs due to their proven reliability, however, and little or no development was done on the J-car for the rest of the season. Following Le Mans, the development program for the J-car was resumed, and a second car was built. During a test session at Riverside International Raceway in August 1966, with Ken Miles driving, the car suddenly went out of control at the end of Riverside's high-speed, 1-mile-long back straight. The aluminum honeycomb chassis did not live up to its design goal, shattering upon impact, and the car burst into flames, killing Miles. It was determined that the unique, flat-topped "bread van" aerodynamics of the car, lacking any sort of spoiler, generated excess lift. A conventional but significantly more aerodynamic body was therefore designed for the subsequent development of the J-car, which was officially known as the Mk IV. A total of nine cars were constructed with J-car chassis numbers, although six were designated as Mk IVs and one as the G7A.

Mk IV

The Mk IV was built around a reinforced J chassis powered by the same 7.0 L engine as the Mk II. Excluding the engine, gearbox, some suspension parts and the brakes from the Mk II, the Mk IV was totally different from other GT40s, using an all-new chassis and bodywork. It was undoubtedly the most radical and most American variant of all the GT40s over the years. As a direct result of the Miles accident, the team installed a NASCAR-style steel-tube roll cage in the Mk IV, which made it much safer, but the roll cage was so heavy that it negated most of the weight saving of the highly advanced honeycomb-panel construction. The Mk IV had a long, streamlined shape, which gave it exceptional top speed, crucial to doing well at Le Mans in those days (a circuit made up predominantly of straights), the race it was ultimately built for.
A 2-speed automatic gearbox was tried, but during the extensive testing of the J-car in 1966 and 1967, it was decided that the 4-speed from the Mk II would be retained. Dan Gurney often complained about the weight of the Mk IV, since the car was heavier than the Ferrari 330 P4. During practice at Le Mans in 1967, in an effort to preserve the highly stressed brakes, Gurney developed a strategy (also adopted by co-driver A. J. Foyt) of backing completely off the throttle several hundred yards before the approach to the Mulsanne hairpin and virtually coasting into the braking area. This technique saved the brakes, but the resulting increase in the car's recorded lap times during practice led to speculation within the Ford team that Gurney and Foyt, in an effort to compromise on chassis settings, had hopelessly "dialed out" their car. The car proved to be the fastest in a straight line that year, thanks to its streamlined aerodynamics, achieving 212 mph on the 3.6-mile Mulsanne Straight. The Mk IV ran in only two races, the 1967 12 Hours of Sebring and the 1967 24 Hours of Le Mans, and won both events. Only one Mk IV was completed for Sebring; the pressure from Ford had increased considerably after the company's humiliation at Daytona two months earlier. Mario Andretti and Bruce McLaren won Sebring, and Dan Gurney and A. J. Foyt won Le Mans, in the Mk IV apparently least likely to win; the Ford-representing Shelby American and Holman & Moody teams had each brought two Mk IVs to Le Mans. The installation of the roll cage was ultimately credited by many with saving the life of Andretti, who crashed violently at the Esses during the 1967 Le Mans 24 Hours but escaped with minor injuries. Unlike the earlier Mk I to Mk III cars, which were built in England, the Mk IVs were built in the United States by Kar Kraft. Le Mans 1967 remains the only all-American victory in Le Mans history: American drivers, team, chassis, engine, and tires.
A total of six Mk IVs were constructed. One of the Mk IVs was rebuilt as the Ford G7 in 1968 and used in the Can-Am series in 1969 and 1970, but with no success. This car is sometimes called the Ford Mk IV.

Mk V

For years Peter Thorp had searched for a GT40 in good condition. Most of the cars had problems, including the dreaded rust issue. His company, Safir Engineering, was building and fielding Formula 3 race cars and also owned a Token Formula One car purchased from Ron Dennis's company, Rondel Racing. Formula One events in which Safir Engineering competed included Brands Hatch and Silverstone. Safir was also modifying Range Rovers to six-wheel drive and exporting them. Safir's technical capabilities were such that it could rebuild GT40s. It was with this in mind that Thorp approached John Willment for his thoughts. It was soon decided that there would be a further limited run of the GT40. JW Engineering would oversee the build, and Safir was to do the work. The continued JW Engineering/Safir Engineering production would use sequential serial numbers starting at the last used GT40 serial number. Maintaining the GT40 Mark nomenclature, this continued production would be named GT40 Mk V. JW Engineering wished to complete GT40 chassis numbers GT40P-1087, 1088 and 1089. This was supposed to take place prior to the beginning of Safir production; however, the completion of these three chassis was much delayed. Ford's Len Bailey was hired to inspect the proposed build and engineer any changes he thought prudent to ensure the car was safe, as well as to minimize problems experienced in the past. Bailey changed the front suspension to Alan Mann specifications, which minimized nose-dive under braking. Zinc-coated steel replaced the previously uncoated, rust-prone sheet metal. The vulnerable drive donuts were replaced with CV joints, and the leak-prone rubber fuel tanks were replaced with aluminum tanks.
The GT40 chassis was upgraded without any major changes. Tennant Panels supplied the roof structure, and the balance of the chassis was completed by Safir. Bill Pink, noted for his electrical experience and the wiring installation of previous GT40s, was brought in, and Jim Rose was hired for his experience working at both Alan Mann and Shelby. After the manufacture of chassis 1120, John Etheridge was hired to manage the GT40 build, with chassis supplied by Adams McCall Engineering and parts by Tennant Panels. For the most part, the Mk V very closely resembled the Mk I car, although there were a few changes, and, as with the 1960s production, very few cars were identical. The first car, GT40P-1090, had an open top in place of roofed doors. Most engines were Ford small blocks with Weber or four-barrel carburetors. Safir produced five big-block GT40s, serial numbers GT40P-1128 to GT40P-1132; these aluminum big-block cars all had easily removable door roof sections. Most GT40s were high-performance street cars; however, some of the Mk V production can be described as full race cars. Two road cars, GT40P-1133 (a roadster) and GT40P-1142 (with roofed doors), were built as lightweights, which included an aluminum honeycomb chassis and carbon fiber bodywork.

Continuation models, replicas and modernizations

Several kit cars and replicas inspired by the Ford GT40 have been built, generally intended for assembly in a home workshop or garage. There are two alternatives to the kit-car approach: continuation models (exact, licensed replicas true to the original GT40) and modernizations (replicas with upgraded components, ergonomics and trim for improved usability, drivability, and performance).

GT40/R Competition, United States: Authentic GT40 built by Superformance and co-designed with Pathfinder Motorsports. This is the only GT40 continuation licensed by Safir GT40 Spares LLC, the holders of the GT40 trademark.
A GT40/R (GT40P/2094) campaigned by Pathfinder Motorsports with an engine built by Holman Moody won both the 2009 US Vintage Grand Prix and the 2009 Governor's Cup at Watkins Glen.

Southern GT: Built in Swanmore, Southampton, UK, specializing in the GT40 Mk I and Mk II as well as the Lola T70, available in kit form or fully built to customer specifications.

CAV GT: Originally designed for customers to build as a kit, the CAV GT has evolved into a modernized replica that is now factory-built in Cape Town, South Africa.

Holman Moody: Its GT40 Mark II finished third at Le Mans in 1966, and the company can still manufacture a Holman GT from the 1966 blueprints.

GT40 Spyder, United States: Built by E.R.A. Replica Automobiles in New Britain, CT, the Spyder is a Mk II Canadian-American (Can-Am) racing replica. As of October 3, 2021, the ERA GT is listed as "no longer available" on the company's website.

Ford GT

At the 1995 North American International Auto Show, the Ford GT90 concept was shown, and at the 2002 show, a new GT40 Concept was unveiled by Ford. While similar in appearance to the original cars, it was bigger, wider, and 3 inches (76 mm) taller than the original 40 inches (1020 mm). Three production prototype cars were shown in 2003 as part of Ford's centenary, and delivery of the production Ford GT began in the fall of 2004. The Ford GT was assembled in the Ford Wixom plant and painted by Saleen, Incorporated at their Saleen Special Vehicles plant in Troy, Michigan. A British company, Safir Engineering, which continued to produce a limited number of GT40s (the Mk V) in the 1980s under an agreement with Walter Hayes of Ford and John Willment of J.W. Automotive Engineering, owned the GT40 trademark at that time, and when it completed production, it sold the excess parts, tooling, design, and trademark to a small American company called Safir GT40 Spares, Limited, based in Ohio.
Safir GT40 Spares licensed the use of the GT40 trademark to Ford for the initial 2002 show car, but when Ford decided to make the production vehicle, negotiations between the two failed, and as a result, the new Ford GT does not wear the GT40 badge. Bob Wood, one of three partners who own Safir GT40 Spares, said: "When we talked with Ford, they asked what we wanted. We said that Ford owns Beanstalk in New York, the company that licenses the Blue Oval for Ford on such things as T-shirts. Since Beanstalk gets 7.5 percent of the retail cost of the item for licensing the name, we suggested 7.5 percent on each GT40 sold." In this instance, Ford wished to purchase, not just license, the GT40 trademark. At the then-estimated $125,000 per copy, 7.5% of 4,500 vehicles would have totalled approximately $42,187,500. It was widely and erroneously reported, following an Automotive News Weekly story, that Safir "demanded" $40 million for the sale of the trademark. Discussions between Safir and Ford ensued; however, the Ford Motor Company in fact never made a written offer to purchase the famed GT40 trademark. Later models and prototypes have also been called the Ford GT but have carried different numbering, such as the Ford GT90 and the Ford GT70. The GT40 name and trademark are currently licensed to Superformance in the USA. A second-generation Ford GT was unveiled at the 2015 North American International Auto Show. It features a 3.5 L twin-turbocharged V6 engine, a carbon fiber monocoque and body panels, pushrod suspension, and active aerodynamics. It entered the 2016 season of the FIA World Endurance Championship and the United SportsCar Championship, and a street-legal version went on sale at Ford dealerships in 2017.
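The royalty figure quoted above follows directly from Wood's numbers; a quick arithmetic sketch (illustrative only — variable names are mine, and the deal was never concluded):

```python
# 7.5% of the then-estimated $125,000 retail price, across 4,500 vehicles.
# Integer arithmetic avoids floating-point rounding: 7.5% = 75/1000.
rate_permille = 75
total_royalty = 125_000 * 4_500 * rate_permille // 1000
print(total_royalty)  # 42187500, matching the ~$42,187,500 in the text
```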
See also

Ford Supervan, a van-bodied variant
Bundle of Snakes, characteristic exhaust system
Colotti Trasmissioni, transmission of the initial, early models
AC Cobra, a car of similar Anglo-American parentage
Ford v Ferrari, 2019 film about the GT40's development

References

Further reading

"17 Ford GT40s Stampede into Pebble Beach! We Dive into Their Histories" (with historic and modern photo gallery), by Don Sherman, Car and Driver, August 2016.
"An American Challenge", Ford press release, 1966.
Auto Passion no. 49, July 1991 (in French)
La Revue de l'Automobile Historique no. 7, March/April 2001 (in French)
Ford: The Dust and the Glory: A Motor Racing History, by Leo Levine, 1968
Ford vs. Ferrari: The Battle for Le Mans, by Anthony Pritchard, 1984, Zuma Marketing
Ford GT-40: An Individual History and Race Record, by Ronnie Spain, 1986
Go Like Hell: Ford, Ferrari, and Their Battle for Speed and Glory at Le Mans, by A. J. Baime
12 Hours of Sebring 1965, by Harry Hurst and Dave Friedman
Ford GT40 Manual: An Insight into Owning, Racing and Maintaining Ford's Legendary Sports Racing Car (Haynes Owners' Workshop Manuals), by Gordon Bruce

External links

GT40

Rear mid-engine, rear-wheel-drive vehicles Sports cars Group 4 cars 24 Hours of Le Mans Cars introduced in 1965 24 Hours of Le Mans race cars Le Mans winning cars Cars of England
Glycine
Glycine (symbol Gly or G) is an amino acid that has a single hydrogen atom as its side chain. It is the simplest stable amino acid (carbamic acid is unstable), with the chemical formula NH2‐CH2‐COOH. Glycine is one of the proteinogenic amino acids and is encoded by all the codons starting with GG (GGU, GGC, GGA, GGG). Glycine is integral to the formation of alpha-helices in secondary protein structure due to its compact form. For the same reason, it is the most abundant amino acid in collagen triple-helices. Glycine is also an inhibitory neurotransmitter; interference with its release within the spinal cord (such as during a Clostridium tetani infection) can cause spastic paralysis due to uninhibited muscle contraction. It is the only achiral proteinogenic amino acid, and it can fit into hydrophilic or hydrophobic environments owing to its minimal side chain of only one hydrogen atom.

History and etymology

Glycine was discovered in 1820 by the French chemist Henri Braconnot when he hydrolyzed gelatin by boiling it with sulfuric acid. He originally called it "sugar of gelatin", but the French chemist Jean-Baptiste Boussingault showed that it contained nitrogen. The American scientist Eben Norton Horsford, then a student of the German chemist Justus von Liebig, proposed the name "glycocoll"; however, the Swedish chemist Berzelius suggested the simpler name "glycine". The name comes from the Greek word γλυκύς, "sweet-tasting" (which is also related to the prefixes glyco- and gluco-, as in glycoprotein and glucose). In 1858, the French chemist Auguste Cahours determined that glycine was an amine of acetic acid.

Production

Although glycine can be isolated from hydrolyzed protein, this is not used for industrial production, as it can be manufactured more conveniently by chemical synthesis.
The two main processes are amination of chloroacetic acid with ammonia, giving glycine and ammonium chloride, and the Strecker amino acid synthesis, which is the main synthetic method in the United States and Japan. About 15 thousand tonnes are produced annually in this way. Glycine is also cogenerated as an impurity in the synthesis of EDTA, arising from reactions of the ammonia coproduct.

Chemical reactions

Glycine's acid–base properties are its most important. In aqueous solution, glycine is amphoteric: below pH 2.4, it converts to the ammonium cation called glycinium, and above about pH 9.6, it converts to glycinate. Glycine functions as a bidentate ligand for many metal ions, forming amino acid complexes. A typical complex is Cu(glycinate)2, i.e. Cu(H2NCH2CO2)2, which exists in both cis and trans isomers. With acid chlorides, glycine converts to amidocarboxylic acids, such as hippuric acid and acetylglycine. With nitrous acid, one obtains glycolic acid (van Slyke determination). With methyl iodide, the amine becomes quaternized to give trimethylglycine, a natural product: H2NCH2COOH + 3 CH3I → (CH3)3N+CH2COO− + 3 HI. Glycine condenses with itself to give peptides, beginning with the formation of glycylglycine: 2 H2NCH2COOH → H2NCH2C(O)NHCH2COOH + H2O. Pyrolysis of glycine or glycylglycine gives 2,5-diketopiperazine, the cyclic diamide. Glycine forms esters with alcohols, often isolated as their hydrochlorides, e.g., glycine methyl ester hydrochloride; otherwise the free ester tends to convert to diketopiperazine. As a bifunctional molecule, glycine reacts with many reagents; these reactions can be classified into N-centered and carboxylate-centered reactions.

Metabolism

Biosynthesis

Glycine is not essential to the human diet, as it is biosynthesized in the body from the amino acid serine, which is in turn derived from 3-phosphoglycerate; however, the metabolic capacity for glycine biosynthesis does not satisfy the need for collagen synthesis.
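The amphoteric behaviour described above can be sketched numerically. The following is an illustrative estimate only, not from the article: the function name is hypothetical, the approach uses successive Henderson-Hasselbalch ratios, and the transition values (about 2.4 and 9.6) are those quoted in the text:

```python
def glycine_fractions(ph: float, pka1: float = 2.4, pka2: float = 9.6):
    """Return approximate (glycinium, zwitterion, glycinate) mole fractions
    at a given pH, via successive Henderson-Hasselbalch ratios."""
    r1 = 10 ** (ph - pka1)   # [zwitterion] / [glycinium cation]
    r2 = 10 ** (ph - pka2)   # [glycinate anion] / [zwitterion]
    cation, zwitterion, anion = 1.0, r1, r1 * r2
    total = cation + zwitterion + anion
    return cation / total, zwitterion / total, anion / total

# At physiological pH the zwitterion dominates:
_, zwit, _ = glycine_fractions(7.0)
assert zwit > 0.99
```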
In most organisms, the enzyme serine hydroxymethyltransferase catalyses the conversion of serine to glycine via the cofactor pyridoxal phosphate: serine + tetrahydrofolate → glycine + N5,N10-methylene tetrahydrofolate + H2O. In the liver of vertebrates, glycine synthesis is catalyzed by glycine synthase (also called glycine cleavage enzyme). This conversion is readily reversible: CO2 + NH4+ + N5,N10-methylene tetrahydrofolate + NADH + H+ ⇌ glycine + tetrahydrofolate + NAD+. In addition to being synthesized from serine, glycine can also be derived from threonine, choline or hydroxyproline via inter-organ metabolism of the liver and kidneys.

Degradation

Glycine is degraded via three pathways. The predominant pathway in animals and plants is the reverse of the glycine synthase pathway mentioned above; in this context, the enzyme system involved is usually called the glycine cleavage system: glycine + tetrahydrofolate + NAD+ ⇌ CO2 + NH4+ + N5,N10-methylene tetrahydrofolate + NADH + H+. In the second pathway, glycine is degraded in two steps: the first is the reverse of glycine biosynthesis from serine by serine hydroxymethyltransferase, and serine is then converted to pyruvate by serine dehydratase. In the third pathway, glycine is converted to glyoxylate by D-amino acid oxidase; glyoxylate is then oxidized by hepatic lactate dehydrogenase to oxalate in an NAD+-dependent reaction. The half-life of glycine and its elimination from the body vary significantly with dose; in one study, the half-life was between 0.5 and 4.0 hours. Blood glycine levels are extremely sensitive to antibiotics that target folate, dropping severely within a minute of antibiotic injection; some antibiotics can deplete more than 90% of glycine within a few minutes of administration.

Physiological function

The principal function of glycine is to act as a precursor to proteins.
Most proteins incorporate only small quantities of glycine, a notable exception being collagen, which contains about 35% glycine due to its periodically repeated role in the formation of collagen's helix structure in conjunction with hydroxyproline. In the genetic code, glycine is coded by all codons starting with GG, namely GGU, GGC, GGA and GGG.

As a biosynthetic intermediate

In higher eukaryotes, δ-aminolevulinic acid, the key precursor to porphyrins, is biosynthesized from glycine and succinyl-CoA by the enzyme ALA synthase. Glycine provides the central C2N subunit of all purines.

As a neurotransmitter

Glycine is an inhibitory neurotransmitter in the central nervous system, especially in the spinal cord, brainstem, and retina. When glycine receptors are activated, chloride enters the neuron via ionotropic receptors, causing an inhibitory postsynaptic potential (IPSP). Strychnine is a strong antagonist at ionotropic glycine receptors, whereas bicuculline is a weak one. Glycine is also a required co-agonist along with glutamate for NMDA receptors; in contrast to its inhibitory role in the spinal cord, glycine facilitates activity at these excitatory glutamatergic receptors. The median lethal dose (LD50) of glycine is 7930 mg/kg in rats (oral), and death usually results from hyperexcitability.

Uses

In the US, glycine is typically sold in two grades: United States Pharmacopeia ("USP") grade and technical grade. USP grade sales account for approximately 80 to 85 percent of the U.S. market for glycine. If purity greater than the USP standard is needed, for example for intravenous injections, a more expensive pharmaceutical-grade glycine can be used. Technical-grade glycine, which may or may not meet USP standards, is sold at a lower price for use in industrial applications, e.g., as an agent in metal complexing and finishing.

Animal and human foods

Glycine is not widely used in foods for its nutritional value, except in infusions.
Instead, glycine's role in food chemistry is as a flavorant. It is mildly sweet, and it counters the aftertaste of saccharin. It also has preservative properties, perhaps owing to its complexation with metal ions. Metal glycinate complexes, e.g. copper(II) glycinate, are used as supplements for animal feeds.

Chemical feedstock

Glycine is an intermediate in the synthesis of a variety of chemical products. It is used in the manufacture of the herbicides glyphosate, iprodione, glyphosine, imiprothrin, and eglinazine, and as an intermediate in the synthesis of medicines such as thiamphenicol.

Laboratory research

Glycine is a significant component of some solutions used in the SDS-PAGE method of protein analysis, serving as a buffering agent that maintains pH and prevents sample damage during electrophoresis. Glycine is also used to remove protein-labeling antibodies from Western blot membranes so that numerous proteins of interest can be probed from the same SDS-PAGE gel. This allows more data to be drawn from the same specimen, increasing the reliability of the data and reducing the amount of sample processing and the number of samples required. This process is known as stripping.

Presence in space

The presence of glycine outside the Earth was confirmed in 2009, based on the analysis of samples that had been taken in 2004 by the NASA spacecraft Stardust from comet Wild 2 and subsequently returned to Earth. Glycine had previously been identified in the Murchison meteorite in 1970. The discovery of glycine in outer space bolstered the hypothesis of so-called soft panspermia, which holds that the "building blocks" of life are widespread throughout the universe. In 2016, the detection of glycine within comet 67P/Churyumov–Gerasimenko by the Rosetta spacecraft was announced. The detection of glycine outside the Solar System, in the interstellar medium, has been debated.
In 2008, the Max Planck Institute for Radio Astronomy discovered the spectral lines of a glycine precursor (aminoacetonitrile) in the Large Molecule Heimat, a giant gas cloud near the galactic center in the constellation Sagittarius.

Evolution

Several independent evolutionary studies using different types of data have suggested that glycine belongs to a group of amino acids that constituted the early genetic code. For example, low-complexity regions in proteins, which may resemble the proto-peptides of the early genetic code, are highly enriched in glycine.

Presence in foods

See also

Trimethylglycine
Amino acid neurotransmitter

References

Further reading

External links

Glycine MS Spectrum
Glycine at PDRHealth.com
Glycine cleavage system
Glycine Therapy - A New Direction for Schizophrenia Treatment?
ChemSub Online (Glycine)
NASA scientists have discovered glycine, a fundamental building block of life, in samples of comet Wild 2 returned by NASA's Stardust spacecraft.

Flavor enhancers Glucogenic amino acids Neurotransmitters Proteinogenic amino acids Glycine receptor agonists NMDA receptor agonists E-number additives
GeekSpeak
GeekSpeak is a podcast with two to four hosts who focus on technology and technology news of the week. Though originally a radio tech call-in program, which first aired in 1998 on KUSP, GeekSpeak has been a weekly podcast since 2004. The program's slogan is "Bridging the gap between geeks and the rest of humanity".

History

GeekSpeak was created and originally broadcast on KUSP by Chris Neklason of Cruzio, Steve Schaefer of Guenther Computer, and board operator Ray Price of KUSP. Shortly thereafter, Mark Hanford of Cruzio joined the program. The current host/producer is Lyle Troxell, who took over in September 2000. In April 2016, citing financial difficulties, KUSP stopped broadcasting GeekSpeak, with its final broadcast on May 5, 2016. GeekSpeak episodes have been distributed as an archive on the internet since 2001, and the podcast went live with its first episode on December 3, 2004.

See also

Computer jargon
Technobabble

External links

GeekSpeak Website
iTunes Podcast Reference List

American talk radio programs Presentation
11846
https://en.wikipedia.org/wiki/Guitar
Guitar
The guitar is a fretted musical instrument that typically has six strings. It is held flat against the player's body and played by strumming or plucking the strings with the dominant hand, while simultaneously pressing selected strings against frets with the fingers of the opposite hand. A plectrum or individual finger picks may also be used to strike the strings. The sound of the guitar is projected either acoustically, by means of a resonant chamber on the instrument, or amplified by an electronic pickup and an amplifier. The guitar is classified as a chordophone – meaning the sound is produced by a vibrating string stretched between two fixed points. Historically, a guitar was constructed from wood with its strings made of catgut. Steel guitar strings were introduced near the end of the nineteenth century in the United States; nylon strings came in the 1940s. The guitar's ancestors include the gittern, the vihuela, the four-course Renaissance guitar, and the five-course baroque guitar, all of which contributed to the development of the modern six-string instrument. There are three main types of modern guitar: the classical guitar (Spanish guitar/nylon-string guitar); the steel-string acoustic guitar or electric guitar; and the Hawaiian guitar (played across the player's lap). Traditional acoustic guitars include the flat top guitar (typically with a large sound hole) or an archtop guitar, which is sometimes called a "jazz guitar". The tone of an acoustic guitar is produced by the strings' vibration, amplified by the hollow body of the guitar, which acts as a resonating chamber. The classical Spanish guitar is often played as a solo instrument using a comprehensive fingerstyle technique where each string is plucked individually by the player's fingers, as opposed to being strummed. The term "finger-picking" can also refer to a specific tradition of folk, blues, bluegrass, and country guitar playing in the United States. 
Electric guitars, first patented in 1937, use a pickup and amplifier that make the instrument loud enough to be heard, and they also made it practical to build guitars from a solid block of wood needing no resonant chamber. A wide array of electronic effects units became possible including reverb and distortion (or "overdrive"). Solid-body guitars began to dominate the guitar market during the 1960s and 1970s; they are less prone to unwanted acoustic feedback. As with acoustic guitars, there are a number of types of electric guitars, including hollowbody guitars, archtop guitars (used in jazz guitar, blues and rockabilly) and solid-body guitars, which are widely used in rock music. The loud, amplified sound and sonic power of the electric guitar played through a guitar amp has played a key role in the development of blues and rock music, both as an accompaniment instrument (playing riffs and chords) and performing guitar solos, and in many rock subgenres, notably heavy metal music and punk rock. The electric guitar has had a major influence on popular culture. The guitar is used in a wide variety of musical genres worldwide. It is recognized as a primary instrument in genres such as blues, bluegrass, country, flamenco, folk, jazz, jota, mariachi, metal, punk, reggae, rock, soul, and pop. History Before the development of the electric guitar and the use of synthetic materials, a guitar was defined as being an instrument having "a long, fretted neck, flat wooden soundboard, ribs, and a flat back, most often with incurved sides." The term is used to refer to a number of chordophones that were developed and used across Europe, beginning in the 12th century and, later, in the Americas. 
A 3,300-year-old stone carving of a Hittite bard playing a stringed instrument is the oldest iconographic representation of a chordophone, and clay plaques from Babylonia show people playing an instrument that has a strong resemblance to the guitar, indicating a possible Babylonian origin for the guitar. The modern word guitar, and its antecedents, has been applied to a wide variety of chordophones since classical times and as such causes confusion. The English word guitar, the German Gitarre, and the French guitare were all adopted from the Spanish guitarra, which comes from the Andalusian Arabic qīthārah and the Latin cithara, which in turn came from the Ancient Greek kithara. Kithara appears in the Bible four times (1 Cor. 14:7, Rev. 5:8, 14:2 and 15:2), and is usually translated into English as harp. Many influences are cited as antecedents to the modern guitar. Although the development of the earliest "guitars" is lost in the history of medieval Spain, two instruments are commonly cited as their most influential predecessors, the European lute and its cousin, the four-string oud; the latter was brought to Iberia by the Moors in the 8th century. At least two instruments called "guitars" were in use in Spain by 1200: the guitarra latina (Latin guitar) and the so-called guitarra morisca (Moorish guitar). The guitarra morisca had a rounded back, wide fingerboard, and several sound holes. The guitarra latina had a single sound hole and a narrower neck. By the 14th century the qualifiers "moresca" or "morisca" and "latina" had been dropped, and these two chordophones were simply referred to as guitars. The Spanish vihuela, called in Italian the "viola da mano", a guitar-like instrument of the 15th and 16th centuries, is widely considered to have been the single most important influence in the development of the baroque guitar. It had six courses (usually), lute-like tuning in fourths and a guitar-like body, although early representations reveal an instrument with a sharply cut waist. It was also larger than the contemporary four-course guitars. 
By the 16th century, the vihuela's construction had more in common with the modern guitar, with its curved one-piece ribs, than with the viols, and was more like a larger version of the contemporary four-course guitars. The vihuela enjoyed only a relatively short period of popularity in Spain and Italy during an era dominated elsewhere in Europe by the lute; the last surviving published music for the instrument appeared in 1576. Meanwhile, the five-course baroque guitar, which was documented in Spain from the middle of the 16th century, enjoyed popularity, especially in Spain, Italy and France from the late 16th century to the mid-18th century. In Portugal, the word viola referred to the guitar, as guitarra meant the "Portuguese guitar", a variety of cittern. Many different plucked instruments were invented and used in Europe during the Middle Ages, and by the 16th century most of these forms had fallen out of use, never to be seen again. Midway through the 16th century, however, the five-course guitar became established, though not by a straightforward process. There were two types of five-course guitars; they differed in the location of the major third and in the interval pattern. The fifth course was added so that the instrument could command the seventeen or more notes its repertoire demanded. In one arrangement the strings of each course were tuned in unison; in the other, the strings of the lower courses were strung an octave apart, which required a different method of tuning. Because the new instrument differed so markedly from its predecessors, there was major controversy as to who created the five-course guitar. A literary source, Lope de Vega's Dorotea, gives the credit to the poet and musician Vicente Espinel. 
This claim was also repeated by Nicolas Doizi de Velasco in 1640; however, it has been refuted by others, who point out that Espinel's birth year (1550) makes it impossible for him to be responsible for the tradition. Velasco believed that the tuning was the reason the instrument became known as the Spanish guitar in Italy. Even later in the same century, Gaspar Sanz wrote that other nations such as Italy or France added to the Spanish guitar, and these nations even imitated the five-course guitar by "recreating" their own. Finally, circa 1850, the form and structure of the modern guitar were established by Spanish makers such as Manuel de Soto y Solares and, perhaps the most important of all guitar makers, Antonio Torres Jurado, who increased the size of the guitar body, altered its proportions, and invented the breakthrough fan-braced pattern. Bracing, which refers to the internal pattern of wood reinforcements used to secure the guitar's top and back and prevent the instrument from collapsing under tension, is an important factor in how the guitar sounds. Torres' design greatly improved the volume, tone, and projection of the instrument, and it has remained essentially unchanged since. Types Guitars can be divided into two broad categories, acoustic and electric guitars. Within each of these categories, there are also further sub-categories. For example, an electric guitar can be purchased in a six-string model (the most common model) or in seven- or twelve-string models. Acoustic Acoustic guitars form several notable subcategories within the acoustic guitar group: classical and flamenco guitars; steel-string guitars, which include the flat-topped, or "folk", guitar; twelve-string guitars; and the arched-top guitar. The acoustic guitar group also includes unamplified guitars designed to play in different registers, such as the acoustic bass guitar, which has a similar tuning to that of the electric bass guitar. 
Renaissance and Baroque Renaissance and Baroque guitars are the ancestors of the modern classical and flamenco guitar. They are substantially smaller, more delicate in construction, and generate less volume. The strings are paired in courses as in a modern 12-string guitar, but they only have four or five courses of strings rather than six single strings normally used now. They were more often used as rhythm instruments in ensembles than as solo instruments, and can often be seen in that role in early music performances. (Gaspar Sanz's Instrucción de Música sobre la Guitarra Española of 1674 contains his whole output for the solo guitar.) Renaissance and Baroque guitars are easily distinguished, because the Renaissance guitar is very plain and the Baroque guitar is very ornate, with ivory or wood inlays all over the neck and body, and a paper-cutout inverted "wedding cake" inside the hole. Classical Classical guitars, also known as "Spanish" guitars, are typically strung with nylon strings, plucked with the fingers, played in a seated position and are used to play a diversity of musical styles including classical music. The classical guitar's wide, flat neck allows the musician to play scales, arpeggios, and certain chord forms more easily and with less adjacent string interference than on other styles of guitar. Flamenco guitars are very similar in construction, but they are associated with a more percussive tone. In Portugal, the same instrument is often used with steel strings particularly in its role within fado music. The guitar is called viola, or violão in Brazil, where it is often used with an extra seventh string by choro musicians to provide extra bass support. In Mexico, the popular mariachi band includes a range of guitars, from the small requinto to the guitarrón, a guitar larger than a cello, which is tuned in the bass register. 
In Colombia, the traditional quartet includes a range of instruments too, from the small bandola (sometimes used when traveling or in confined rooms or spaces), to the slightly larger tiple, to the full-sized classical guitar. The requinto also appears in other Latin-American countries as a complementary member of the guitar family, with its smaller size and scale, permitting more projection for the playing of single-lined melodies. Modern dimensions of the classical instrument were established by the Spaniard Antonio de Torres Jurado (1817–1892). Flat-top Flat-top guitars with steel strings are similar to the classical guitar; however, the flat-top body size is usually significantly larger than that of a classical guitar, and it has a narrower, reinforced neck and stronger structural design. The robust X-bracing typical of flat-top guitars was developed in the 1840s by German-American luthiers, of whom Christian Friedrich "C. F." Martin is the best known. Originally used on gut-strung instruments, the strength of the system allowed the later guitars to withstand the additional tension of steel strings. Steel strings produce a brighter tone and a louder sound. The acoustic guitar is used in many kinds of music including folk, country, bluegrass, pop, jazz, and blues. Many variations are possible from the roughly classical-sized OO and Parlour to the large Dreadnought (the most commonly available type) and Jumbo. Ovation makes a modern variation, with a rounded back/side assembly molded from artificial materials. Archtop Archtop guitars are steel-string instruments in which the top (and often the back) of the instrument are carved, from a solid billet, into a curved, rather than a flat, shape. This violin-like construction is usually credited to the American Orville Gibson. Lloyd Loar of the Gibson Mandolin-Guitar Mfg. 
Co introduced the violin-inspired "F"-shaped hole design now usually associated with archtop guitars, after designing a style of mandolin of the same type. The typical archtop guitar has a large, deep, hollow body whose form is much like that of a mandolin or a violin-family instrument. Nowadays, most archtops are equipped with magnetic pickups, and they are therefore both acoustic and electric. F-hole archtop guitars were immediately adopted, upon their release, by both jazz and country musicians, and have remained particularly popular in jazz music, usually with flatwound strings. Resonator, resophonic or Dobros All three principal types of resonator guitars were invented by the Slovak-American John Dopyera (1893–1988) for the National and Dobro (Dopyera Brothers) companies. Similar to the flat top guitar in appearance, but with a body that may be made of brass, nickel-silver, or steel as well as wood, the sound of the resonator guitar is produced by one or more aluminum resonator cones mounted in the middle of the top. The physical principle of the guitar is therefore similar to the loudspeaker. The original purpose of the resonator was to produce a very loud sound; this purpose has been largely superseded by electrical amplification, but the resonator guitar is still played because of its distinctive tone. Resonator guitars may have either one or three resonator cones. The method of transmitting sound resonance to the cone is either a "biscuit" bridge, made of a small piece of hardwood at the vertex of the cone (Nationals), or a "spider" bridge, made of metal and mounted around the rim of the (inverted) cone (Dobros). Three-cone resonators always use a specialized metal bridge. The type of resonator guitar with a neck with a square cross-section—called "square neck" or "Hawaiian"—is usually played face up, on the lap of the seated player, and often with a metal or glass slide. 
The round neck resonator guitars are normally played in the same fashion as other guitars, although slides are also often used, especially in blues. Steel guitar A steel guitar is any guitar played while moving a polished steel bar or similar hard object against plucked strings. The bar itself is called a "steel" and is the source of the name "steel guitar". The instrument differs from a conventional guitar in that it does not use frets; conceptually, it is somewhat akin to playing a guitar with one finger (the bar). Known for its portamento capabilities, gliding smoothly over every pitch between notes, the instrument can produce a sinuous crying sound and deep vibrato emulating the human singing voice. Typically, the strings are plucked (not strummed) by the fingers of the dominant hand, while the steel tone bar is pressed lightly against the strings and moved by the opposite hand. The instrument is played while sitting, placed horizontally across the player's knees or otherwise supported. The horizontal playing style is called "Hawaiian style". Twelve-string The twelve-string guitar usually has steel strings, and it is widely used in folk music, blues, and rock and roll. Rather than having only six strings, the 12-string guitar has six courses made up of two strings each, like a mandolin or lute. The highest two courses are tuned in unison, while the others are tuned in octaves. The 12-string guitar is also made in electric forms. The chime-like sound of the 12-string electric guitar was the basis of jangle pop. Acoustic bass The acoustic bass guitar is a bass instrument with a hollow wooden body similar to, though usually somewhat larger than, that of a 6-string acoustic guitar. Like the traditional electric bass guitar and the double bass, the acoustic bass guitar commonly has four strings, which are normally tuned E-A-D-G, an octave below the lowest four strings of the 6-string guitar, which is the same tuning pitch as an electric bass guitar. 
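The octave relationship between the bass and guitar tunings described above can be checked numerically. The following is a minimal sketch, assuming twelve-tone equal temperament and the A4 = 440 Hz concert-pitch standard; the `freq` helper and the offset table are illustrative, not from the source:

```python
# Equal-temperament pitch frequencies, assuming A4 = 440 Hz.
# Each semitone multiplies the frequency by 2**(1/12), so twelve
# semitones (one octave) exactly doubles it.

A4 = 440.0

# Semitone offsets from A4 (negative = below A4).
NOTE_OFFSETS = {"E1": -41, "A1": -36, "D2": -31, "G2": -26,
                "E2": -29, "A2": -24, "D3": -19, "G3": -14}

def freq(note):
    """Frequency in Hz of a named pitch, from its semitone offset."""
    return A4 * 2 ** (NOTE_OFFSETS[note] / 12)

bass = ["E1", "A1", "D2", "G2"]      # four-string bass / acoustic bass
guitar = ["E2", "A2", "D3", "G3"]    # lowest four strings of a guitar

for b, g in zip(bass, guitar):
    # Each bass string sounds one octave (half the frequency)
    # below the corresponding guitar string.
    print(f"{b}: {freq(b):6.2f} Hz   {g}: {freq(g):6.2f} Hz")
```

Running this shows, for example, A1 at exactly 55 Hz against A2 at 110 Hz, with every guitar/bass pair in a precise 2:1 frequency ratio.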
It can, more rarely, be found with 5 or 6 strings, which provides a wider range of notes to be played with less movement up and down the neck. Electric Electric guitars can have solid, semi-hollow, or hollow bodies; solid bodies produce little sound without amplification. In contrast to a standard acoustic guitar, electric guitars instead rely on electromagnetic pickups, and sometimes piezoelectric pickups, that convert the vibration of the steel strings into signals, which are fed to an amplifier through a patch cable or radio transmitter. The sound is frequently modified by other electronic devices (effects units) or the natural distortion of valves (vacuum tubes) or the pre-amp in the amplifier. There are two main types of magnetic pickups, single- and double-coil (or humbucker), each of which can be passive or active. The electric guitar is used extensively in jazz, blues, R & B, and rock and roll. The first successful magnetic pickup for a guitar was invented by George Beauchamp, and incorporated into the 1931 Ro-Pat-In (later Rickenbacker) "Frying Pan" lap steel; other manufacturers, notably Gibson, soon began to install pickups in archtop models. After World War II the completely solid-body electric was popularized by Gibson in collaboration with Les Paul, and independently by Leo Fender of Fender Music. The lower fretboard action (the height of the strings from the fingerboard), lighter (thinner) strings, and its electrical amplification lend the electric guitar to techniques less frequently used on acoustic guitars. These include tapping, extensive use of legato through pull-offs and hammer-ons (also known as slurs), pinch harmonics, volume swells, and use of a tremolo arm or effects pedals. Some electric guitar models feature piezoelectric pickups, which function as transducers to provide a sound closer to that of an acoustic guitar with the flip of a switch or knob, rather than switching guitars. 
Those that combine piezoelectric pickups and magnetic pickups are sometimes known as hybrid guitars. Hybrids of acoustic and electric guitars are also common. There are also more exotic varieties, such as guitars with two, three, or rarely four necks, all manner of alternate string arrangements, fretless fingerboards (used almost exclusively on bass guitars, meant to emulate the sound of a stand-up bass), 5.1 surround guitar, and such. Seven-string and eight-string Solid-body seven-string guitars were popularized in the 1980s and 1990s. Other artists go a step further, by using an eight-string guitar with two extra low strings. Although the most common seven-string has a low B string, Roger McGuinn (of The Byrds and Rickenbacker) uses an octave G string paired with the regular G string as on a 12-string guitar, allowing him to incorporate chiming 12-string elements in standard six-string playing. In 1982 Uli Jon Roth developed the "Sky Guitar", with a vastly extended number of frets, which was the first guitar to venture into the upper registers of the violin. Roth's seven-string and "Mighty Wing" guitar features a wider octave range. Electric bass The bass guitar (also called an "electric bass", or simply a "bass") is similar in appearance and construction to an electric guitar, but with a longer neck and scale length, and four to six strings. The four-string bass, by far the most common, is usually tuned the same as the double bass, which corresponds to pitches one octave lower than the four lowest pitched strings of a guitar (E, A, D, and G). The bass guitar is a transposing instrument, as it is notated in bass clef an octave higher than it sounds (as is the double bass) to avoid excessive ledger lines being required below the staff. Like the electric guitar, the bass guitar has pickups and it is plugged into an amplifier and speaker for live performances. Construction Handedness Modern guitars can be constructed to suit both left- and right-handed players. 
Typically the dominant hand is used to pluck or strum the strings. This is similar to the violin family of instruments where the dominant hand controls the bow. Left-handed players usually play a mirror image instrument manufactured especially for left-handed players. There are other options, some unorthodox, including learning to play a right-handed guitar as if the player were right-handed, or playing an unmodified right-handed guitar reversed. Guitarist Jimi Hendrix played a right-handed guitar strung in reverse (the treble strings and bass strings reversed). The problem with doing this is that it reverses the guitar's saddle angle. The saddle is the strip of material on top of the bridge where the strings rest. It is normally slanted slightly, making the bass strings longer than the treble strings. In part, the reason for this is the difference in the thickness of the strings. Physical properties of the thicker bass strings require them to be slightly longer than the treble strings to correct intonation. Reversing the strings, therefore, reverses the orientation of the saddle, adversely affecting intonation. Components Head The headstock is located at the end of the guitar neck farthest from the body. It is fitted with machine heads that adjust the tension of the strings, which in turn affects the pitch. The traditional tuner layout is "3+3", in which each side of the headstock has three tuners (such as on Gibson Les Pauls). In this layout, the headstocks are commonly symmetrical. Many guitars feature other layouts, including six-in-line tuners (featured on Fender Stratocasters) or even "4+2" (e.g. Ernie Ball Music Man). Some guitars (such as Steinbergers) do not have headstocks at all, in which case the tuning machines are located elsewhere, either on the body or the bridge. The nut is a small strip of bone, plastic, brass, corian, graphite, stainless steel, or other medium-hard material, at the joint where the headstock meets the fretboard. 
Its grooves guide the strings onto the fretboard, giving consistent lateral string placement. It is one of the endpoints of the strings' vibrating length. It must be accurately cut, or it can contribute to tuning problems due to string slippage or string buzz. To reduce string friction in the nut, which can adversely affect tuning stability, some guitarists fit a roller nut. Some instruments use a zero fret just in front of the nut. In this case the nut is used only for lateral alignment of the strings, the string height and length being dictated by the zero fret. Neck A guitar's frets, fretboard, tuners, headstock, and truss rod, all attached to a long wooden extension, collectively constitute its neck. The wood used to make the fretboard usually differs from the wood in the rest of the neck. The bending stress on the neck is considerable, particularly when heavier gauge strings are used (see Tuning), and the ability of the neck to resist bending (see Truss rod) is important to the guitar's ability to hold a constant pitch during tuning or when strings are fretted. The rigidity of the neck with respect to the body of the guitar is one determinant of a good instrument versus a poor-quality one. The cross-section of the neck can also vary, from a gentle "C" curve to a more pronounced "V" curve. There are many different types of neck profiles available, giving the guitarist many options. Some aspects to consider in a guitar neck may be the overall width of the fretboard, scale (distance between the frets), the neck wood, the type of neck construction (for example, the neck may be glued in or bolted on), and the shape (profile) of the back of the neck. Other types of material used to make guitar necks are graphite (Steinberger guitars), aluminum (Kramer Guitars, Travis Bean and Veleno guitars), or carbon fiber (Modulus Guitars and ThreeGuitars). Double neck electric guitars have two necks, allowing the musician to quickly switch between guitar sounds. 
The neck joint or heel is the point at which the neck is either bolted or glued to the body of the guitar. Almost all acoustic steel-string guitars, with the primary exception of Taylors, have glued (otherwise known as set) necks, while electric guitars are constructed using both types. Most classical guitars have a neck and headblock carved from one piece of wood, known as a "Spanish heel". Commonly used set neck joints include mortise and tenon joints (such as those used by C. F. Martin & Co.), dovetail joints (also used by C. F. Martin on the D-28 and similar models) and Spanish heel neck joints, which are named after the shoe they resemble and commonly found in classical guitars. All three types offer stability. Bolt-on necks, though they are historically associated with cheaper instruments, do offer greater flexibility in the guitar's set-up, and allow easier access for neck joint maintenance and repairs. Another type of neck, only available for solid-body electric guitars, is the neck-through-body construction. These are designed so that everything from the machine heads down to the bridge is located on the same piece of wood. The sides (also known as wings) of the guitar are then glued to this central piece. Some luthiers prefer this method of construction as they claim it allows better sustain of each note. Some instruments may not have a neck joint at all, having the neck and sides built as one piece and the body built around it. The fingerboard, also called the fretboard, is a piece of wood embedded with metal frets that comprises the top of the neck. It is flat on classical guitars and slightly curved crosswise on acoustic and electric guitars. The curvature of the fretboard is measured by the fretboard radius, which is the radius of a hypothetical circle of which the fretboard's surface constitutes a segment. The smaller the fretboard radius, the more noticeably curved the fretboard is. 
Most modern guitars feature a 12" neck radius, while older guitars from the 1960s and 1970s usually feature a 6-8" neck radius. Pinching a string against a fret on the fretboard effectively shortens the vibrating length of the string, producing a higher pitch. Fretboards are most commonly made of rosewood, ebony, maple, and sometimes manufactured using composite materials such as HPL or resin. See the section "Neck" below for the importance of the length of the fretboard in connection to other dimensions of the guitar. The fingerboard plays an essential role in the treble tone for acoustic guitars. The quality of vibration of the fingerboard is the principal characteristic for generating the best treble tone. For that reason, ebony wood is better, but because of high use, ebony has become rare and extremely expensive. Most guitar manufacturers have adopted rosewood instead of ebony. Frets Almost all guitars have frets, which are metal strips (usually nickel alloy or stainless steel) embedded along the fretboard and located at exact points that divide the scale length in accordance with a specific mathematical formula. The exceptions include fretless bass guitars and very rare fretless guitars. Pressing a string against a fret determines the string's vibrating length and therefore its resultant pitch. The pitch of each consecutive fret is defined at a half-step interval on the chromatic scale. Standard classical guitars have 19 frets and electric guitars between 21 and 24 frets, although guitars have been made with as many as 27 frets. Frets are laid out to accomplish an equal tempered division of the octave. Each set of twelve frets represents an octave. The twelfth fret divides the scale length exactly into two halves, and the 24th fret position divides one of those halves in half again. The ratio of the spacing of two consecutive frets is the twelfth root of two, 2^(1/12) ≈ 1.0595. In practice, luthiers determine fret positions using the constant 17.817, an approximation to 1/(1 − 2^(−1/12)). 
If the nth fret is a distance x from the bridge, then the distance from the (n+1)th fret to the bridge is x-(x/17.817). Frets are available in several different gauges and can be fitted according to player preference. Among these are "jumbo" frets, which have a much thicker gauge, allowing for use of a slight vibrato technique from pushing the string down harder and softer. "Scalloped" fretboards, where the wood of the fretboard itself is "scooped out" between the frets, allow a dramatic vibrato effect. Fine frets, much flatter, allow a very low string-action, but require that other conditions, such as curvature of the neck, be well-maintained to prevent buzz. Truss rod The truss rod is a thin, strong metal rod that runs along the inside of the neck. It is used to correct changes to the neck's curvature caused by aging of the neck timbers, changes in humidity, or to compensate for changes in the tension of strings. The tension of the rod and neck assembly is adjusted by a hex nut or an allen-key bolt on the rod, usually located either at the headstock, sometimes under a cover, or just inside the body of the guitar underneath the fretboard and accessible through the sound hole. Some truss rods can only be accessed by removing the neck. The truss rod counteracts the immense amount of tension the strings place on the neck, bringing the neck back to a straighter position. Turning the truss rod clockwise tightens it, counteracting the tension of the strings and straightening the neck or creating a backward bow. Turning the truss rod counter-clockwise loosens it, allowing string tension to act on the neck and creating a forward bow. Adjusting the truss rod affects the intonation of a guitar as well as the height of the strings from the fingerboard, called the action. 
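The fret-spacing rule above (the 17.817 constant and the halving of the scale at the twelfth fret) is easy to verify numerically. This is a minimal sketch, assuming a hypothetical 648 mm (25.5 in) scale length; the function and variable names are illustrative:

```python
# Fret positions from equal temperament, for a 648 mm scale length.

SCALE_MM = 648.0
RATIO = 2 ** (1 / 12)                  # one semitone per fret
LUTHIER_CONST = 1 / (1 - 1 / RATIO)    # ≈ 17.817

def fret_distance_from_nut(n, scale=SCALE_MM):
    """Distance from the nut to the nth fret, in mm."""
    return scale - scale / (RATIO ** n)

# The 12th fret should sit at exactly half the scale length (one octave).
print(round(fret_distance_from_nut(12), 3))   # 324.0

# Iterative luthier's method: each new fret lies 1/17.817 of the
# remaining string length closer to the bridge.
remaining = SCALE_MM
positions = []
for _ in range(12):
    remaining -= remaining / LUTHIER_CONST
    positions.append(SCALE_MM - remaining)

# Both methods agree to well under a tenth of a millimetre.
print(abs(positions[11] - fret_distance_from_nut(12)) < 0.1)  # True
```

The agreement is no accident: 1 − 1/17.817 is itself an approximation of 2^(−1/12), so repeatedly shaving off 1/17.817 of the remaining length reproduces the semitone ratio fret by fret.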
Some truss rod systems, called double action truss systems, tighten both ways, pushing the neck both forward and backward (standard truss rods can only release to a point beyond which the neck is no longer compressed and pulled backward). The artist and luthier Irving Sloane pointed out, in his book Steel-String Guitar Construction, that truss rods are intended primarily to remedy concave bowing of the neck, but cannot correct a neck with "back bow" or one that has become twisted. Classical guitars do not require truss rods, as their nylon strings exert a lower tensile force with lesser potential to cause structural problems. However, their necks are often reinforced with a strip of harder wood, such as an ebony strip that runs down the back of a cedar neck. There is no tension adjustment on this form of reinforcement. Inlays Inlays are visual elements set into the exterior surface of a guitar, both for decoration and artistic purposes and, in the case of the markings on the 3rd, 5th, 7th and 12th fret (and in higher octaves), to provide guidance to the performer about the location of frets on the instrument. The typical locations for inlay are on the fretboard, headstock, and on acoustic guitars around the soundhole, known as the rosette. Inlays range from simple plastic dots on the fretboard to intricate works of art covering the entire exterior surface of a guitar (front and back). Some guitar players have used LEDs in the fretboard to produce unique lighting effects onstage. Fretboard inlays are most commonly shaped like dots, diamond shapes, parallelograms, or large blocks in between the frets. Dots are usually inlaid into the upper edge of the fretboard in the same positions, small enough to be visible only to the player. These usually appear on the odd-numbered frets, but also on the 12th fret (the one-octave mark) instead of the 11th and 13th frets. 
Some older or high-end instruments have inlays made of mother of pearl, abalone, ivory, colored wood or other exotic materials and designs. Simpler inlays are often made of plastic or painted. High-end classical guitars seldom have fretboard inlays as a well-trained player is expected to know his or her way around the instrument. In addition to fretboard inlay, the headstock and soundhole surround are also frequently inlaid. The manufacturer's logo or a small design is often inlaid into the headstock. Rosette designs vary from simple concentric circles to delicate fretwork mimicking the historic rosette of lutes. Bindings that edge the finger and soundboards are sometimes inlaid. Some instruments have a filler strip running down the length and behind the neck, used for strength or to fill the cavity through which the truss rod was installed in the neck. Body In acoustic guitars, string vibration is transmitted through the bridge and saddle to the body via sound board. The sound board is typically made of tonewoods such as spruce or cedar. Timbers for tonewoods are chosen for both strength and ability to transfer mechanical energy from the strings to the air within the guitar body. Sound is further shaped by the characteristics of the guitar body's resonant cavity. In expensive instruments, the entire body is made of wood. In inexpensive instruments, the back may be made of plastic. In an acoustic instrument, the body of the guitar is a major determinant of the overall sound quality. The guitar top, or soundboard, is a finely crafted and engineered element made of tonewoods such as spruce and red cedar. This thin piece of wood, often only 2 or 3 mm thick, is strengthened by differing types of internal bracing. Many luthiers consider the top the dominant factor in determining the sound quality. The majority of the instrument's sound is heard through the vibration of the guitar top as the energy of the vibrating strings is transferred to it. 
The body of an acoustic guitar has a sound hole through which sound projects. The sound hole is usually a round hole in the top of the guitar under the strings. The air inside the body vibrates as the guitar top and body is vibrated by the strings, and the response of the air cavity at different frequencies is characterized, like the rest of the guitar body, by a number of resonance modes at which it responds more strongly. The top, back and ribs of an acoustic guitar body are very thin (1–2 mm), so a flexible piece of wood called lining is glued into the corners where the rib meets the top and back. This interior reinforcement provides 5 to 20 mm of solid gluing area for these corner joints. Solid linings are often used in classical guitars, while kerfed lining is most often found in steel-string acoustics. Kerfed lining is also called kerfing because it is scored, or "kerfed" (incompletely sawn through), to allow it to bend with the shape of the rib. During final construction, a small section of the outside corners is carved or routed out and filled with binding material on the outside corners and decorative strips of material next to the binding, which is called purfling. This binding serves to seal off the end grain of the top and back. Purfling can also appear on the back of an acoustic guitar, marking the edge joints of the two or three sections of the back. Binding and purfling materials are generally made of either wood or plastic. Body size, shape and style have changed over time. 19th-century guitars, now known as salon guitars, were smaller than modern instruments. Differing patterns of internal bracing have been used over time by luthiers. Torres, Hauser, Ramirez, Fleta, and C. F. Martin were among the most influential designers of their time. Bracing not only strengthens the top against potential collapse due to the stress exerted by the tensioned strings but also affects the resonance characteristics of the top. 
The back and sides are made out of a variety of timbers such as mahogany, Indian rosewood and highly regarded Brazilian rosewood (Dalbergia nigra). Each is chosen primarily for its aesthetic effect and may be decorated with inlays and purfling. Instruments with larger areas for the guitar top were introduced by Martin in an attempt to create greater volume levels. The popularity of the larger "dreadnought" body size amongst acoustic performers is related to the greater sound volume produced. Most electric guitar bodies are made of wood and include a plastic pickguard. Boards wide enough to use as a solid body are very expensive due to the worldwide depletion of hardwood stock since the 1970s, so the wood is rarely one solid piece. Most bodies are made from two pieces of wood with some of them including a seam running down the center line of the body. The most common woods used for electric guitar body construction include maple, basswood, ash, poplar, alder, and mahogany. Many bodies consist of good-sounding, but inexpensive woods, like ash, with a "top", or thin layer of another, more attractive wood (such as maple with a natural "flame" pattern) glued to the top of the basic wood. Guitars constructed like this are often called "flame tops". The body is usually carved or routed to accept the other elements, such as the bridge, pickup, neck, and other electronic components. Most electrics have a polyurethane or nitrocellulose lacquer finish. Other alternative materials to wood are used in guitar body construction. Some of these include carbon composites, plastic material, such as polycarbonate, and aluminum alloys. Bridge The main purpose of the bridge on an acoustic guitar is to transfer the vibration from the strings to the soundboard, which vibrates the air inside of the guitar, thereby amplifying the sound produced by the strings. On electric, acoustic and other guitars, the bridge holds the strings in place on the body. 
There are many varied bridge designs. There may be some mechanism for raising or lowering the bridge saddles to adjust the distance between the strings and the fretboard (action), or fine-tuning the intonation of the instrument. Some are spring-loaded and feature a "whammy bar", a removable arm that lets the player modulate the pitch by changing the tension on the strings. The whammy bar is sometimes also called a "tremolo bar". (The effect of rapidly changing pitch is properly called "vibrato". See Tremolo for further discussion of this term.) Some bridges also allow for alternate tunings at the touch of a button. On almost all modern electric guitars, the bridge has saddles that are adjustable for each string so that intonation stays correct up and down the neck. If the open string is in tune, but sharp or flat when frets are pressed, the bridge saddle position can be adjusted with a screwdriver or hex key to remedy the problem. In general, flat notes are corrected by moving the saddle forward and sharp notes by moving it backward. On an instrument correctly adjusted for intonation, the actual length of each string from the nut to the bridge saddle is slightly, but measurably longer than the scale length of the instrument. This additional length is called compensation, which flattens all notes a bit to compensate for the sharping of all fretted notes caused by stretching the string during fretting. Saddle The saddle of a guitar is the part of the bridge that physically supports the strings. It may be one piece (typically on acoustic guitars) or separate pieces, one for each string (electric guitars and basses). The saddle's basic purpose is to provide the endpoint for the string's vibration at the correct location for proper intonation, and on acoustic guitars to transfer the vibrations through the bridge into the top wood of the guitar. 
Saddles are typically made of plastic or bone for acoustic guitars, though synthetics and some exotic animal tooth variations (e.g. fossilized tooth, ivory, etc.) have become popular with some players. Electric guitar saddles are typically metal, though some synthetic saddles are available. Pickguard The pickguard, also known as the scratchplate, is usually a piece of laminated plastic or other material that protects the finish of the top of the guitar from damage due to the use of a plectrum ("pick") or fingernails. Electric guitars sometimes mount pickups and electronics on the pickguard. It is a common feature on steel-string acoustic guitars. Some performance styles that use the guitar as a percussion instrument (tapping the top or sides between notes, etc.), such as flamenco, require that a scratchplate or pickguard be fitted to nylon-string instruments. Strings The standard guitar has six strings, but four-, seven-, eight-, nine-, ten-, eleven-, twelve-, thirteen- and eighteen-string guitars are also available. Classical and flamenco guitars historically used gut strings, but these have been superseded by polymer materials, such as nylon and fluorocarbon. Modern guitar strings are constructed from metal, polymers, or animal or plant product materials. Instruments utilizing "steel" strings may have strings made from alloys incorporating steel, nickel or phosphor bronze. Bass strings for both instruments are wound rather than monofilament. Pickups and electronics Pickups are transducers attached to a guitar that detect (or "pick up") string vibrations and convert the mechanical energy of the string into electrical energy. The resultant electrical signal can then be electronically amplified. The most common type of pickup is electromagnetic in design. These contain magnets that are within a coil, or coils, of copper wire. Such pickups are usually placed directly underneath the guitar strings. 
Electromagnetic pickups work on the same principles and in a similar manner to an electric generator. The vibration of the strings creates a small electric current in the coils surrounding the magnets. This signal current is carried to a guitar amplifier that drives a loudspeaker. Traditional electromagnetic pickups are either single-coil or double-coil. Single-coil pickups are susceptible to noise induced by stray electromagnetic fields, usually mains-frequency (60 or 50 hertz) hum. The introduction of the double-coil humbucker in the mid-1950s solved this problem through the use of two coils, one of which is wired in opposite polarity to cancel or "buck" stray fields. The types and models of pickups used can greatly affect the tone of the guitar. Typically, humbuckers, which are two magnet-coil assemblies attached to each other, are traditionally associated with a heavier sound. Single-coil pickups, one magnet wrapped in copper wire, are used by guitarists seeking a brighter, twangier sound with greater dynamic range. Modern pickups are tailored to the sound desired. A commonly applied approximation used in the selection of a pickup is that less wire (lower electrical impedance) gives a brighter sound, more wire gives a "fat" tone. Other options include specialized switching that produces coil-splitting, in/out of phase and other effects. Guitar circuits are either active, needing a battery to power their circuit, or, as in most cases, equipped with a passive circuit. Fender Stratocaster-type guitars generally utilize three single-coil pickups, while most Gibson Les Paul types use humbucker pickups. Piezoelectric, or piezo, pickups represent another class of pickup. These employ piezoelectricity to generate the musical signal and are popular in hybrid electro-acoustic guitars. A crystal is located under each string, usually in the saddle. 
When the string vibrates, the shape of the crystal is distorted, and the stresses associated with this change produce tiny voltages across the crystal that can be amplified and manipulated. Piezo pickups usually require a powered pre-amplifier to lift their output to match that of electromagnetic pickups. Power is typically delivered by an on-board battery. Most pickup-equipped guitars feature onboard controls, such as volume or tone, or pickup selection. At their simplest, these consist of passive components, such as potentiometers and capacitors, but may also include specialized integrated circuits or other active components requiring batteries for power, for preamplification and signal processing, or even for electronic tuning. In many cases, the electronics have some sort of shielding to prevent pickup of external interference and noise. Guitars may be shipped or retrofitted with a hexaphonic pickup, which produces a separate output for each string, usually from a discrete piezoelectric or magnetic pickup. This arrangement lets on-board or external electronics process the strings individually for modeling or Musical Instrument Digital Interface (MIDI) conversion. Roland makes "GK" hexaphonic pickups for guitar and bass, and a line of guitar modeling and synthesis products. Line 6's hexaphonic-equipped Variax guitars use on-board electronics to model the sound after various vintage instruments, and vary pitch on individual strings. MIDI converters use a hexaphonic guitar signal to determine pitch, duration, attack, and decay characteristics. The MIDI sends the note information to an internal or external sound bank device. The resulting sound closely mimics numerous instruments. The MIDI setup can also let the guitar be used as a game controller (i.e., Rock Band Squier) or as an instructional tool, as with the Fretlight Guitar. 
Tuning Standard By the 16th century, the guitar tuning of ADGBE had already been adopted in Western culture; a lower E was later added on the bottom as a sixth string. The result, known as "standard tuning", has the strings tuned from a low E, to a high E, traversing a two-octave range—EADGBE. This tuning is a series of ascending fourths (and a single major third) from low to high. The reason for ascending fourths is to accommodate four fingers on four frets up a scale before moving to the next string. This is musically convenient and physically comfortable, and it eases the transition between fingering chords and playing scales. If the tuning contained all perfect fourths, the range would end up being two octaves plus one semitone; the high string would be an F, which forms a dissonant half-step relative to the low E and sounds out of place. The pitches of the open strings, from lowest to highest, are E2, A2, D3, G3, B3 and E4. The table below shows a pitch's name found over the six strings of a guitar in standard tuning, from the nut (zero), to the twelfth fret. For four strings, the 5th fret on one string is the same note as the next open string; for example, a 5th-fret note on the sixth string is the same note as the open fifth string. However, between the second and third strings, an irregularity occurs: The 4th-fret note on the third string is equivalent to the open second string. Alternative Standard tuning has evolved to provide a good compromise between simple fingering for many chords and the ability to play common scales with reasonable left-hand movement. There are also a variety of commonly used alternative tunings, for example, the classes of open, regular, and dropped tunings. Open tuning refers to a guitar tuned so that strumming the open strings produces a chord, typically a major chord. The base chord consists of at least 3 notes and may include all the strings or a subset. The tuning is named for the open chord; open D, open G, and open A are popular tunings. 
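The fret relationships of standard tuning described above (the 5th-fret rule and its 4th-fret exception between the G and B strings) can be sketched directly. The note-naming scheme below (sharps only, octaves ignored) and the helper name are illustrative assumptions:

```python
# Sketch: note names per string/fret in standard tuning (EADGBE).
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
OPEN_STRINGS = ["E", "A", "D", "G", "B", "E"]  # low (6th) to high (1st)

def note_at(open_note, fret):
    # Each fret raises the pitch by one semitone.
    return NOTES[(NOTES.index(open_note) + fret) % 12]

# The 5th fret of one string matches the next open string --
# except between G and B, where the matching fret is the 4th.
print(note_at("E", 5))  # A, the open 5th string
print(note_at("G", 4))  # B, the open 2nd string
```

Running the check across all six strings reproduces the irregularity the text describes: only the G-to-B pair is a major third (4 frets) rather than a fourth (5 frets).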
All similar chords in the chromatic scale can then be played by barring a single fret. Open tunings are common in blues music and folk music, and they are used in the playing of slide and bottleneck guitars. Many musicians use open tunings when playing slide guitar. For the standard tuning, there is exactly one interval of a major third between the second and third strings, and all the other intervals are fourths. The irregularity has a price – chords cannot be shifted around the fretboard in the standard tuning E-A-D-G-B-E, which requires four chord-shapes for the major chords. There are separate chord-forms for chords having their root note on the third, fourth, fifth, and sixth strings. In contrast, regular tunings have equal intervals between the strings, and so they have symmetrical scales all along the fretboard. This makes it simpler to translate chords. For the regular tunings, chords may be moved diagonally around the fretboard. The diagonal movement of chords is especially simple for the regular tunings that are repetitive, in which case chords can be moved vertically: Chords can be moved three strings up (or down) in major-thirds tuning and chords can be moved two strings up (or down) in augmented-fourths tuning. Regular tunings thus appeal to new guitarists and also to jazz-guitarists, whose improvisation is simplified by regular intervals. On the other hand, some chords are more difficult to play in a regular tuning than in standard tuning. It can be difficult to play conventional chords, especially in augmented-fourths tuning and all-fifths tuning, in which the large spacings require hand stretching. Some chords, which are conventional in folk music, are difficult to play even in all-fourths and major-thirds tunings, which do not require more hand-stretching than standard tuning. In major-thirds tuning, the interval between open strings is always a major third. Consequently, four frets suffice to play the chromatic scale. 
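The contrast between standard tuning's lone major third and the uniform intervals of regular tunings can be made concrete. The pitch numbers below are MIDI-style semitone values (40 = E2), an assumption used only for illustration:

```python
# Sketch: semitone intervals between adjacent open strings for the
# tunings discussed in this section.
def intervals(pitches):
    return [b - a for a, b in zip(pitches, pitches[1:])]

standard     = [40, 45, 50, 55, 59, 64]   # E A D G B E
all_fourths  = [40, 45, 50, 55, 60, 65]   # E A D G C F
major_thirds = [40, 44, 48, 52, 56, 60]   # each string a major third apart

print(intervals(standard))      # [5, 5, 5, 4, 5] -- the lone major third
print(intervals(all_fourths))   # [5, 5, 5, 5, 5]
print(intervals(major_thirds))  # [4, 4, 4, 4, 4]
```

The uniform interval lists are what make chord shapes movable across strings in the regular tunings, while the single 4 in standard tuning is the irregularity that forces separate chord forms.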
Chord inversion is especially simple in major-thirds tuning. Chords are inverted simply by raising one or two notes by three strings. The raised notes are played with the same finger as the original notes. In contrast, in standard tuning, the shape of inversions depends on the involvement of the irregular major-third. All-fourths tuning replaces the major third between the third and second strings with a fourth, extending the conventional tuning of a bass guitar. With all-fourths tuning, playing the triads is more difficult, but improvisation is simplified because chord-patterns remain constant when moved around the fretboard. Jazz guitarist Stanley Jordan uses the all-fourths tuning EADGCF. Invariant chord-shapes are an advantage of other regular tunings, such as major-thirds and all-fifths tunings. Extending the tunings of violins and cellos, all-fifths tuning offers an expanded range CGDAEB, which however has been impossible to implement on a conventional guitar. All-fifths tuning is used for the lowest five strings of the new standard tuning of Robert Fripp and his former students in Guitar Craft courses; new standard tuning has a high G on its last string CGDAE-G. Another class of alternative tunings is called drop tunings, because the tuning drops down the lowest string. Dropping down the lowest string a whole tone results in the "drop-D" (or "dropped D") tuning. Its open-string notes DADGBE (from low to high) allow for a deep bass D note, which can be used in keys such as D major, d minor and G major. It simplifies the playing of simple fifths (powerchords). Many contemporary rock bands re-tune all strings down, making, for example, Drop-C or Drop-B tunings. Scordatura Many scordatura (alternate tunings) modify the standard tuning of the lute, especially when playing Renaissance music repertoire originally written for that instrument. Some scordatura drop the pitch of one or more strings, giving access to new lower notes. 
Some scordatura makes it easier to play in unusual keys. Accessories Though a guitar may be played on its own, there are a variety of common accessories used for holding and playing the guitar. Capotasto A capo (short for capotasto) is used to change the pitch of open strings. Capos are clipped onto the fretboard with the aid of spring tension or, in some models, elastic tension. To raise the guitar's pitch by one semitone, the player would clip the capo onto the fretboard just below the first fret. Its use allows players to play in different keys without having to change the chord formations they use. For example, if a folk guitar player wanted to play a song in the key of B Major, they could put a capo on the second fret of the instrument, and then play the song as if it were in the key of A Major, but with the capo the instrument would make the sounds of B Major. This is because, with the capo barring the entire second fret, open chords would all sound two semitones (in other words, one tone) higher in pitch. For example, if a guitarist played an open A Major chord (a very common open chord), it would sound like a B Major chord. All of the other open chords would be similarly modified in pitch. Because of the ease with which they allow guitar players to change keys, they are sometimes referred to with pejorative names, such as "cheaters" or the "hillbilly crutch". Despite this negative viewpoint, another benefit of the capo is that it enables guitarists to obtain the ringing, resonant sound of the common keys (C, G, A, etc.) in "harder" and less-commonly used keys. Classical performers are known to use them to enable modern instruments to match the pitch of historical instruments such as the Renaissance music lute. Slides A slide (neck of a bottle, knife blade or round metal or glass bar or cylinder) is used in blues and rock to create a glissando or "Hawaiian" effect. The slide is used to fret notes on the neck, instead of using the fretting hand's fingers. 
The characteristic use of the slide is to move up to the intended pitch by, as the name implies, sliding up the neck to the desired note. The necks of bottles were often used in blues and country music as improvised slides. Modern slides are constructed of glass, plastic, ceramic, chrome, brass or steel bars or cylinders, depending on the weight and tone desired (and the amount of money a guitarist can spend). An instrument that is played exclusively in this manner (using a metal bar) is called a steel guitar or pedal steel. Slide playing to this day is very popular in blues music and country music. Some slide players use a so-called Dobro guitar. Some performers who have become famous for playing slide are Robert Johnson, Elmore James, Ry Cooder, George Harrison, Bonnie Raitt, Derek Trucks, Warren Haynes, Duane Allman, Muddy Waters, Rory Gallagher, and George Thorogood. Plectrum A "guitar pick" or "plectrum" is a small piece of hard material generally held between the thumb and first finger of the picking hand and is used to "pick" the strings. Though most classical players pick with a combination of fingernails and fleshy fingertips, the pick is most often used for electric and steel-string acoustic guitars. Though today they are mainly plastic, variations do exist, such as bone, wood, steel or tortoise shell. Tortoise shell was the most commonly used material in the early days of pick-making, but as tortoises and turtles became endangered, the practice of using their shells for picks or anything else was banned. Tortoise-shell picks made before the ban are often coveted for a supposedly superior tone and ease of use, and their scarcity has made them valuable. Picks come in many shapes and sizes. Picks vary from the small jazz pick to the large bass pick. The thickness of the pick often determines its use. 
A thinner pick (between 0.2 and 0.5 mm) is usually used for strumming or rhythm playing, whereas thicker picks (between 0.7 and 1.5+ mm) are usually used for single-note lines or lead playing. The distinctive guitar sound of Billy Gibbons is attributed to using a quarter or peso as a pick. Similarly, Brian May is known to use a sixpence coin as a pick, while noted 1970s and early 1980s session musician David Persons is known for using old credit cards, cut to the correct size, as plectrums. Thumb picks and finger picks that attach to the fingertips are sometimes employed in finger-picking styles on steel strings. These allow the fingers and thumb to operate independently, whereas a flat pick requires the thumb and one or two fingers to manipulate. Straps A guitar strap is a strip of material with an attachment mechanism on each end, made to hold a guitar via the shoulders at an adjustable length. Guitars have varying accommodations for attaching a strap. The most common are strap buttons, also called strap pins, which are flanged steel posts anchored to the guitar with screws. Two strap buttons come pre-attached to virtually all electric guitars, and many steel-string acoustic guitars. Strap buttons are sometimes replaced with "strap locks", which connect the guitar to the strap more securely. The lower strap button is usually located at the bottom (bridge end) of the body. The upper strap button is usually located near or at the top (neck end) of the body: on the upper body curve, at the tip of the upper "horn" (on a double cutaway), or at the neck joint (heel). Some electrics, especially those with odd-shaped bodies, have one or both strap buttons on the back of the body. Some Steinberger electric guitars, owing to their minimalist and lightweight design, have both strap buttons at the bottom of the body. Rarely, on some acoustics, the upper strap button is located on the headstock. 
Some acoustic and classical guitars only have a single strap button at the bottom of the body—the other end must be tied onto the headstock, above the nut and below the machine heads. Amplifiers, effects and speakers Electric guitars and bass guitars have to be used with a guitar amplifier and loudspeaker or a bass amplifier and speaker, respectively, in order to make enough sound to be heard by the performer and audience. Electric guitars and bass guitars almost always use magnetic pickups, which generate an electric signal when the musician plucks, strums or otherwise plays the instrument. The amplifier and speaker strengthen this signal using a power amplifier and a loudspeaker. Acoustic guitars that are equipped with a piezoelectric pickup or microphone can also be plugged into an instrument amplifier, acoustic guitar amp or PA system to make them louder. With electric guitar and bass, the amplifier and speaker are not just used to make the instrument louder; by adjusting the equalizer controls, the preamplifier, and any onboard effects units (reverb, distortion/overdrive, etc.) the player can also modify the tone (also called the timbre or "colour") and sound of the instrument. Acoustic guitar players can also use the amp to change the sound of their instrument, but in general, acoustic guitar amps are used to make the natural acoustic sound of the instrument louder without significantly changing its sound. 
Gnutella
Gnutella is a peer-to-peer network protocol. Released in 2000, it was the first decentralized peer-to-peer network of its kind, leading to other, later networks adopting the model. In June 2005, Gnutella's population was 1.81 million computers, increasing to over three million nodes by January 2006. In late 2007, it was the most popular file-sharing network on the Internet with an estimated market share of more than 40%. History The first client (also called Gnutella) from which the network got its name was developed by Justin Frankel and Tom Pepper of Nullsoft in early 2000, soon after the company's acquisition by AOL. On March 14, the program was made available for download on Nullsoft's servers. The event was prematurely announced on Slashdot, and thousands downloaded the program that day. The source code was to be released later, under the GNU General Public License (GPL); however, the original developers never got the chance to do so. The next day, AOL stopped the availability of the program over legal concerns and restrained Nullsoft from doing any further work on the project. This did not stop Gnutella; after a few days, the protocol had been reverse engineered, and compatible free and open source clones began to appear. This parallel development of different clients by different groups remains the modus operandi of Gnutella development today. Among the first independent Gnutella pioneers were Gene Kan and Spencer Kimball, who launched the first portal aimed at assembling the open-source community to work on Gnutella and also developed "GNUbile", one of the first open-source (GNU-GPL) programs to implement the Gnutella protocol. The Gnutella network is a fully distributed alternative to such semi-centralized systems as FastTrack (KaZaA) and the original Napster. The initial popularity of the network was spurred on by Napster's threatened legal demise in early 2001. 
This growing surge in popularity revealed the limits of the initial protocol's scalability. In early 2001, variations on the protocol (first implemented in proprietary and closed source clients) allowed an improvement in scalability. Instead of treating every user as client and server, some users were now treated as ultrapeers, routing search requests and responses for users connected to them. This allowed the network to grow in popularity. In late 2001, the Gnutella client LimeWire Basic became free and open source. In February 2002, Morpheus, a commercial file sharing group, abandoned its FastTrack-based peer-to-peer software and released a new client based on the free and open source Gnutella client Gnucleus. The word Gnutella today refers not to any one project or piece of software, but to the open protocol used by the various clients. The name is a portmanteau of GNU and Nutella, the brand name of an Italian hazelnut flavored spread: supposedly, Frankel and Pepper ate a lot of Nutella working on the original project, and intended to license their finished program under the GNU General Public License. Gnutella is not associated with the GNU project or GNU's own peer-to-peer network, GNUnet. On October 26, 2010, the popular Gnutella client LimeWire was ordered shut down by Judge Kimba Wood of the United States District Court for the Southern District of New York when she signed a Consent Decree to which recording industry plaintiffs and LimeWire had agreed. This event was the likely cause of a notable drop in the size of the network, because, while negotiating the injunction, LimeWire staff had inserted remote-disabling code into the software. As the injunction came into force, users who had installed affected versions (newer than 5.5.10) were cut off from the P2P network. Since LimeWire was free software, nothing had prevented the creation of forks that omitted the disabling code, as long as LimeWire trademarks were not used. 
The shutdown did not affect, for example, FrostWire, a fork of LimeWire created in 2004 that carries neither the remote-disabling code nor adware. On November 9, 2010, LimeWire was resurrected by a secret team of developers and named LimeWire Pirate Edition. It was based on LimeWire 5.6 BETA. This version had its server dependencies removed and all the PRO features enabled for free. Design To envision how Gnutella originally worked, imagine a large circle of users (called nodes), each of whom has Gnutella client software. On initial startup, the client software must bootstrap and find at least one other node. Various methods have been used for this, including a pre-existing address list of possibly working nodes shipped with the software, using updated web caches of known nodes (called Gnutella Web Caches), UDP host caches and, rarely, even IRC. Once connected, the client requests a list of working addresses. The client tries to connect to the nodes it was shipped with, as well as nodes it receives from other clients until it reaches a certain quota. It connects to only that many nodes, locally caching the addresses which it has not yet tried and discarding the addresses which it tried and found to be invalid. When the user wants to do a search, the client sends the request to each actively connected node. In version 0.4 of the protocol, the number of actively connected nodes for a client was quite small (around 5). In that version of the protocol, each node forwards the request to all its actively connected nodes, who, in turn, forward the request. This continues until the packet has reached a predetermined number of hops from the sender (maximum 7). Since version 0.6 (2002), Gnutella is a composite network made of leaf nodes and ultra nodes (also called ultrapeers). The leaf nodes are connected to a small number of ultrapeers (typically 3) while each ultrapeer is connected to more than 32 other ultrapeers. 
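The hop-limited query flooding used since version 0.4 of the protocol (and retained, with the lower hop limit noted below, between ultrapeers) might be sketched as follows. This is a deliberately simplified in-process model, not the wire protocol; all names are illustrative:

```python
# Sketch (assumed simplification, not a real Gnutella implementation):
# each node forwards a query to its neighbors until the hop limit
# (at most 7 in protocol v0.4) is exhausted. Duplicate queries are
# dropped via a per-node "seen" set, as real clients do.
def flood(node, query_id, ttl, seen, graph, hits, files):
    if query_id in seen.get(node, set()) or ttl < 0:
        return
    seen.setdefault(node, set()).add(query_id)
    if query_id in files.get(node, set()):
        hits.append(node)  # in the real network, a QueryHit is sent back
    for neighbor in graph.get(node, []):
        flood(neighbor, query_id, ttl - 1, seen, graph, hits, files)

# Tiny example network: A-B-C-D in a line, "song" stored at D.
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
files = {"D": {"song"}}
hits = []
flood("A", "song", 7, {}, graph, hits, files)
print(hits)  # ['D']
```

The sketch also shows why flooding scales poorly: every node within the hop radius handles every query, which is the traffic problem ultrapeers and query routing were introduced to contain.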
With ultrapeers' higher outdegree, the maximum number of hops a query can travel was lowered to 4. Leaves and ultrapeers use the Query Routing Protocol to exchange a Query Routing Table (QRT), a table of hashed keywords with 64 Ki slots (expandable to up to 2 Mi slots). A leaf node sends its QRT to each of the ultrapeers to which it is connected, and ultrapeers merge the QRT of all their leaves (downsized to 128 Ki slots) plus their own QRT (if they share files) and exchange that with their own neighbors. Query routing is then done by hashing the words of the query and seeing whether all of them match in the QRT. Ultrapeers do that check before forwarding a query to a leaf node, and also before forwarding the query to a peer ultra node, provided this is the last hop the query can travel. If a search request turns up a result, the node that has the result contacts the searcher. In the classic Gnutella protocol, response messages were sent back along the route taken by the query, as the query itself did not contain identifying information for the node. This scheme was later revised to deliver search results over UDP directly to the node that initiated the search (usually an ultrapeer of that node). Thus, in the current protocol, the queries carry the IP address and port number of the initiating node or its ultrapeer. This lowers the amount of traffic routed through the Gnutella network, making it significantly more scalable. If the user decides to download the file, they negotiate the file transfer. If the node which has the requested file is not firewalled, the querying node can connect to it directly. However, if the node is firewalled, stopping the source node from receiving incoming connections, the client wanting to download the file sends the server a so-called push request, asking the remote client to initiate the connection instead (to push the file). At first, these push requests were routed along the chain originally used to send the query.
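The QRT keyword check described above can be sketched as a keyword-hash table. This is a simplification under stated assumptions: the real QRP hash function, slot counts, downsizing, and wire encoding follow the specification, whereas this sketch uses CRC32, plain Python sets, and naive word splitting purely for illustration.

```python
import zlib

TABLE_SIZE = 1 << 20  # real tables range from 64 Ki up to 2 Mi slots

def slot(word):
    """Stand-in for the real QRP hash function (illustrative only)."""
    return zlib.crc32(word.lower().encode()) % TABLE_SIZE

def build_qrt(shared_names):
    """A leaf marks one slot per keyword of its shared file names."""
    table = set()  # occupied slots (a compact bit array in practice)
    for name in shared_names:
        for word in name.replace(".", " ").replace("_", " ").split():
            table.add(slot(word))
    return table

def merge_qrts(tables):
    """An ultrapeer merges the QRTs of all its leaves plus its own."""
    merged = set()
    for table in tables:
        merged |= table
    return merged

def may_match(qrt, query):
    """Forward a query only if ALL of its words hit occupied slots."""
    return all(slot(w) in qrt for w in query.split())

leaf = build_qrt(["free_music_song.mp3", "holiday photo.jpg"])
```

An ultrapeer holding this table would forward the query "music song" to the leaf but silently drop "music video", since "video" hashes to an unoccupied slot (barring a hash collision, which QRP tolerates as an occasional false positive).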
Routing push requests along the original query path was rather unreliable, because routes would often break and routed packets are always subject to flow control. Push proxies were introduced to address this problem. These are usually the ultrapeers of a leaf node, and they are announced in search results. The client connects to one of these push proxies using an HTTP request, and the proxy sends a push request to the leaf on behalf of the client. Normally, it is also possible to send a push request over UDP to the push proxy, which is more efficient than using TCP. Push proxies have two advantages: First, ultrapeer-leaf connections are more stable than routes, which makes push requests much more reliable. Second, they reduce the amount of traffic routed through the Gnutella network. Finally, when a user disconnects, the client software saves a list of known nodes. This contains the nodes to which the client was connected and the nodes learned from pong packets. The client uses this as its seed list when it next starts, thus becoming independent of bootstrap services. In practice, this method of searching on the Gnutella network was often unreliable. Each node is run by a regular computer user; as such, nodes are constantly connecting and disconnecting, so the network is never completely stable. Also, the bandwidth cost of searching on Gnutella grew exponentially with the number of connected users, often saturating connections and rendering slower nodes useless. Therefore, search requests would often be dropped, and most queries reached only a very small part of the network. This observation identified the Gnutella network as an unscalable distributed system, and inspired the development of distributed hash tables, which are much more scalable but support only exact-match, rather than keyword, search. To address the problems of bottlenecks, Gnutella developers implemented a tiered system of ultrapeers and leaves.
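The bandwidth blow-up mentioned above is easy to quantify: in a loop-free flood, the origin sends a query to each of its d neighbors, and every recipient relays it to d − 1 further nodes until the hop limit, so one query can trigger on the order of d·(d−1)^(h−1) forwards. The sketch below is a back-of-the-envelope upper bound, not a measurement; real topologies contain cycles, and duplicates are dropped.

```python
def worst_case_forwards(degree, max_hops):
    """Upper bound on message forwards for one query in a loop-free
    flood: the origin sends `degree` copies; every later recipient
    relays `degree - 1` copies, until the hop limit is reached."""
    total, frontier = 0, degree
    for _ in range(max_hops):
        total += frontier
        frontier *= degree - 1  # each recipient relays onward
    return total

# The classic 0.4 defaults (about 5 connections, 7 hops) already
# put tens of thousands of copies of a single query on the wire.
classic = worst_case_forwards(5, 7)
```

Worse, every one of those messages is sent for every search by every user, which is why query traffic grew with the user count until slower links saturated.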
In this tiered design, instead of all nodes being considered equal, nodes entering the network were kept at the 'edge' of the network, as leaves. Leaves do not provide routing. Nodes which are capable of routing messages are promoted to ultrapeers. Ultrapeers accept leaf connections and route searches and network maintenance messages. This allows searches to propagate further through the network and allows for numerous alterations in topology. This greatly improved efficiency and scalability. Additionally, Gnutella adopted a number of other techniques to reduce traffic overhead and make searches more efficient. Most notable are Query Routing Protocol (QRP) and Dynamic Querying (DQ). With QRP, a search reaches only those clients which are likely to have the files, so searches for rare files become far more efficient. With DQ, the search stops as soon as the program has acquired enough search results. This vastly reduces the amount of traffic caused by popular searches. One of the benefits of having Gnutella so decentralized is that it is very difficult to shut the network down, and the users are the only ones who can decide which content will be available. Unlike Napster, where the entire network relied on the central server, Gnutella cannot be shut down by shutting down any one node. A decentralized network also prevents bad actors from taking control of the contents of the network or manipulating data by controlling a central server.
Protocol features and extensions
Gnutella once operated on a purely query flooding-based protocol. The outdated Gnutella version 0.4 network protocol employs five different packet types, namely:
ping: discover hosts on network
pong: reply to ping
query: search for a file
query hit: reply to query
push: download request for firewalled servants
These packets facilitate searches. File transfers are instead handled by HTTP. The development of the Gnutella protocol is currently led by the Gnutella Developers Forum (The GDF).
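To make the five 0.4 message types concrete, here is a hedged sketch of the 23-byte descriptor header that precedes every message: a 16-byte GUID, a 1-byte payload type, 1-byte TTL, 1-byte hop count, and a 4-byte little-endian payload length. The field layout and type codes follow the classic 0.4 specification as commonly documented; the helper names are our own.

```python
import struct
import uuid

# Payload descriptor codes from the v0.4 specification
PING, PONG, PUSH, QUERY, QUERY_HIT = 0x00, 0x01, 0x40, 0x80, 0x81

def make_header(ptype, ttl, hops, payload_len, guid=None):
    """Build the 23-byte descriptor header: 16-byte GUID, 1-byte
    payload type, 1-byte TTL, 1-byte hop count, then a 4-byte
    little-endian payload length."""
    guid = guid or uuid.uuid4().bytes
    return guid + struct.pack("<BBBI", ptype, ttl, hops, payload_len)

def parse_header(raw):
    """Split a received header back into its five fields."""
    guid = raw[:16]
    ptype, ttl, hops, length = struct.unpack("<BBBI", raw[16:23])
    return guid, ptype, ttl, hops, length

# A fresh query leaves the origin with full TTL and zero hops.
header = make_header(QUERY, ttl=7, hops=0, payload_len=42)
```

A relaying servent decrements TTL and increments hops before forwarding, so TTL + hops stays constant along a route; the payload (search string, result set, and so on) follows immediately after these 23 bytes.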
Many protocol extensions have been and are being developed by the software vendors and by the free Gnutella developers of the GDF. These extensions include intelligent query routing, SHA-1 checksums, query hit transmission via UDP, querying via UDP, dynamic queries via TCP, file transfers via UDP, XML metadata, source exchange (also termed the download mesh) and parallel downloading in slices (swarming). There are efforts to finalize these protocol extensions in the Gnutella 0.6 specification, at the Gnutella protocol development website. The Gnutella 0.4 standard is outdated but it remains the latest protocol specification, because all extensions, so far, exist as proposals. In fact, it is hard or impossible to connect today with 0.4 handshakes. According to developers in the GDF, version 0.6 is what new developers should pursue, using the work-in-progress specifications. The Gnutella protocol remains under development. Despite attempts to make a clean break with the complexity inherited from the old Gnutella 0.4 and to design a clean new message architecture, it remains one of the most successful file-sharing protocols to date.
Software
The following tables compare general and technical information for a number of applications supporting the Gnutella network. The tables do not attempt to give a complete list of Gnutella clients. The tables are limited to clients that can participate in the current Gnutella network.
General specifications
Gnutella features
Notes
Morpheus differs significantly and may have completely independent code from the GnucDNA engine. Morpheus can function as a modern ultrapeer whereas other GnucDNA clients cannot. Gnucleus and Kiwi Alpha use the GnucDNA engine. BearFlix, a functionally limited version of the BearShare 5.2 series, can search only for images or videos, and shared videos are limited to a relatively short length.
giFTcurs, Apollon, FilePipe, giFToxic, giFTui, giFTwin32, KCeasy, Poisoned, and Xfactor are GUI front-ends for the giFT engine. etomi uses outdated Shareaza networking code. MP3 Rocket, 360Share, LemonWire, MP3Torpedo, and DexterWire are variants of LimeWire. FrostWire (up to version 4.21.8) is nearly identical to LimeWire 4.18, but versions greater than 5.00 no longer use Gnutella. Acquisition and Cabos are custom front-ends overlaying the LimeWire engine. LimeWire Pirate Edition (5.6.2) is a resurrected version of the unreleased LimeWire 5.6.1 alpha with similar features, minus the automatic updates (with nags) and the centralized remote controls that could disable core functions like searches and downloads.
Gnutella2
The Gnutella2 protocol (often referred to as G2), despite its name, is not a successor protocol of Gnutella nor related to the original Gnutella project, but rather is a completely different protocol that forked from the original project and piggybacked on the Gnutella name. A sore point with many Gnutella developers is that the Gnutella2 name conveys an upgrade or superiority, which led to a flame war. Other criticism included the use of the Gnutella network to bootstrap G2 peers and poor documentation of the G2 protocol. Additionally, the more frequent search retries of the Shareaza client, one of the initial G2 clients, could unnecessarily burden the Gnutella network. Both protocols have undergone significant changes since the fork in 2002. G2 has advantages and disadvantages compared to Gnutella. An advantage often cited is that Gnutella2's hybrid search is more efficient than the original Gnutella's query flooding. However, Gnutella replaced query flooding with more efficient search methods, starting with Query Routing in 2002, which LimeWire developers had proposed in 2001. An advantage of Gnutella is its large user base, which numbers in the millions. The G2 network is approximately an order of magnitude smaller.
It is difficult to compare the protocols in their current form. The choice of client, on either network, probably affects the end user just as much.
See also
Bitzi
Gnutella crawler
GNUnet
External links
Gnutella Forums: official user support boards
Gnutella Protocol Development Wiki (on Internet Archive, 2009)
Gnutella Protocol Development Portal (on Internet Archive)
Gnutella official website (on Internet Archive)
GnuFU, Gnutella For Users: a description of the inner workings of the Gnutella network in user-friendly style
Regarding Gnutella by GNU
Glasnost test for Gnutella traffic shaping (Max Planck Institute for Software Systems)
File sharing networks
Application layer protocols
Hash based data structures
https://en.wikipedia.org/wiki/George%20Lucas
George Lucas
George Walton Lucas Jr. (born May 14, 1944) is an American film director, producer, screenwriter, and entrepreneur. Lucas is best known for creating the Star Wars and Indiana Jones franchises and founding Lucasfilm, Lucasfilm Games, and Industrial Light & Magic. He served as chairman of Lucasfilm before selling it to The Walt Disney Company in 2012. Lucas is one of history's most financially successful filmmakers and has been nominated for four Academy Awards. His films are among the 100 highest-grossing movies at the North American box office, adjusted for ticket-price inflation. Lucas is considered one of the most significant figures of the 20th-century New Hollywood movement, and a pioneer of the modern blockbuster. After graduating from the University of Southern California in 1967, Lucas co-founded American Zoetrope with filmmaker Francis Ford Coppola. Lucas wrote and directed THX 1138 (1971), based on his student short Electronic Labyrinth: THX 1138 4EB, which was a critical success but a financial failure. His next work as a writer-director was the film American Graffiti (1973), inspired by his youth in the early 1960s Modesto, California, and produced through the newly founded Lucasfilm. The film was critically and commercially successful and received five Academy Award nominations, including Best Director and Best Picture. Lucas's next film, the epic space opera Star Wars (1977), had a troubled production but was a surprise hit, becoming the highest-grossing film at the time, winning six Academy Awards and sparking a cultural phenomenon. Lucas produced and co-wrote the sequels The Empire Strikes Back (1980) and Return of the Jedi (1983). With director Steven Spielberg, he created, produced, and co-wrote the Indiana Jones films Raiders of the Lost Ark (1981), The Temple of Doom (1984), The Last Crusade (1989) and The Kingdom of the Crystal Skull (2008). 
Lucas is also known for his collaboration with composer John Williams, who was recommended to him by Spielberg, and with whom he has worked for all the films in both of these franchises. He also produced and wrote a variety of films and television series through Lucasfilm between the 1970s and the 2010s. In 1997, Lucas re-released the Star Wars Trilogy as part of a special edition featuring several alterations; home media versions with further changes were released in 2004 and 2011. He returned to directing with a Star Wars prequel trilogy comprising Star Wars: Episode I – The Phantom Menace (1999), Star Wars: Episode II – Attack of the Clones (2002) and Star Wars: Episode III – Revenge of the Sith (2005). He last collaborated on the CGI-animated television series Star Wars: The Clone Wars (2008–2014, 2020), the war film Red Tails (2012), and the CGI film Strange Magic (2015).
Early life
Lucas was born and raised in Modesto, California, the son of Dorothy Ellinore Lucas (née Bomberger) and George Walton Lucas Sr., and is of German, Swiss-German, English, Scottish, and distant Dutch and French descent. His family attended Disneyland during its opening week in July 1955, and Lucas would remain enthusiastic about the park. He was interested in comics and science fiction, including television programs such as the Flash Gordon serials. Long before Lucas began making films, he yearned to be a racecar driver, and he spent most of his high school years racing on the underground circuit at fairgrounds and hanging out at garages. On June 12, 1962, a few days before his high school graduation, Lucas was driving his souped-up Autobianchi Bianchina when another driver broadsided him, flipping his car several times before it crashed into a tree; Lucas's seatbelt had snapped, ejecting him and thereby saving his life. However, his lungs were bruised from severe hemorrhaging and he required emergency medical treatment.
This incident caused him to lose interest in racing as a career, but also inspired him to pursue his other interests. Lucas's father owned a stationery store, and had wanted George to work for him when he turned 18. Lucas had been planning to go to art school, and declared upon leaving home that he would be a millionaire by the age of 30. He attended Modesto Junior College, where he studied anthropology, sociology, and literature, amongst other subjects. He also began shooting with an 8 mm camera, including filming car races. At this time, Lucas and his friend John Plummer became interested in Canyon Cinema: screenings of underground, avant-garde 16 mm filmmakers like Jordan Belson, Stan Brakhage, and Bruce Conner. Lucas and Plummer also saw classic European films of the time, including Jean-Luc Godard's Breathless, François Truffaut's Jules et Jim, and Federico Fellini's 8½. "That's when George really started exploring," Plummer said. Through his interest in autocross racing, Lucas met renowned cinematographer Haskell Wexler, another race enthusiast. Wexler, later to work with Lucas on several occasions, was impressed by Lucas's talent. "George had a very good eye, and he thought visually," he recalled. At Plummer's recommendation, Lucas then transferred to the University of Southern California (USC) School of Cinematic Arts. USC was one of the earliest universities to have a school devoted to motion picture film. During the years at USC, Lucas shared a dorm room with Randal Kleiser. Along with classmates such as Walter Murch, Hal Barwood, and John Milius, they became a clique of film students known as The Dirty Dozen. He also became good friends with fellow acclaimed student filmmaker and future Indiana Jones collaborator, Steven Spielberg. Lucas was deeply influenced by the Filmic Expression course taught at the school by filmmaker Lester Novros which concentrated on the non-narrative elements of Film Form like color, light, movement, space, and time. 
Another inspiration was the Serbian montagist (and dean of the USC Film Department) Slavko Vorkapić, a film theoretician who made stunning montage sequences for Hollywood studio features at MGM, RKO, and Paramount. Vorkapić taught the autonomous nature of the cinematic art form, emphasizing the kinetic energy inherent in motion pictures.
Film career
1965–1969: Early career
Lucas saw many inspiring films in class, particularly the visual films coming out of the National Film Board of Canada like Arthur Lipsett's 21-87, the French-Canadian cameraman Jean-Claude Labrecque's cinéma vérité 60 Cycles, the work of Norman McLaren, and the documentaries of Claude Jutra. Lucas fell madly in love with pure cinema and quickly became prolific at making 16 mm non-story, non-character visual tone poems and cinéma vérité with such titles as Look at Life, Herbie, 1:42.08, The Emperor, Anyone Lived in a Pretty (how) Town, Filmmaker, and 6-18-67. He was passionate and interested in camerawork and editing, defining himself as a filmmaker as opposed to being a director, and he loved making abstract visual films that created emotions purely through cinema. After graduating with a bachelor of fine arts in film in 1967, he tried joining the United States Air Force as an officer, but he was immediately turned down because of his numerous speeding tickets. He was later drafted by the Army for military service in Vietnam, but he was exempted from service after medical tests showed he had diabetes, the disease that killed his paternal grandfather. In 1967, Lucas re-enrolled as a USC graduate student in film production. He began working under Verna Fields for the United States Information Agency, where he met his future wife Marcia Griffin. Working as a teaching instructor for a class of U.S. Navy students who were being taught documentary cinematography, Lucas directed the short film Electronic Labyrinth: THX 1138 4EB, which won first prize at the 1967–68 National Student Film Festival.
Lucas was awarded a student scholarship by Warner Bros. to observe and work on the making of a film of his choosing. The film he chose was Finian's Rainbow (1968), which was being directed by Francis Ford Coppola, who was revered among film school students of the time as a cinema graduate who had "made it" in Hollywood. In 1969, Lucas was one of the camera operators on the classic Rolling Stones concert film Gimme Shelter.
1969–1977: THX 1138, American Graffiti, and Star Wars
In 1969, Lucas co-founded the studio American Zoetrope with Coppola, hoping to create a liberating environment for filmmakers to direct outside the perceived oppressive control of the Hollywood studio system. Coppola thought Lucas's Electronic Labyrinth could be adapted into his first full-length feature film, which was produced by American Zoetrope as THX 1138, but was not a success. Lucas then created his own company, Lucasfilm, Ltd., and directed the successful American Graffiti (1973). Lucas then set his sights on adapting Flash Gordon, an adventure serial from his childhood that he fondly remembered. When he was unable to obtain the rights, he set out to write an original space adventure that would eventually become Star Wars. Despite his success with his previous film, all but one studio turned Star Wars down. It was only because Alan Ladd, Jr., at 20th Century Fox liked American Graffiti that he forced through a production and distribution deal for the film, which ended up restoring Fox to financial stability after a number of flops. Star Wars was significantly influenced by the samurai films of Akira Kurosawa, Spaghetti Westerns, as well as classic sword and sorcery fantasy stories. Star Wars quickly became the highest-grossing film of all time, displaced five years later by Spielberg's E.T. the Extra-Terrestrial.
After the success of American Graffiti and prior to the beginning of filming on Star Wars, Lucas was encouraged to renegotiate for a higher fee for writing and directing Star Wars than the agreed $150,000. He declined to do so, instead negotiating for advantage in some of the as-yet-unspecified parts of his contract with Fox, in particular, ownership of licensing and merchandising rights (for novelizations, clothing, toys, etc.) and contractual arrangements for sequels. Lucasfilm has earned hundreds of millions of dollars from licensed games, toys, and collectibles created for the franchise. The original Star Wars film went through a tumultuous production, and during editing, Lucas suffered chest pains initially feared to be a heart attack, but actually a fit of hypertension and exhaustion. The effort that Lucas exerted during post-production for the film, and its subsequent sequels, caused strains on his relationship with his wife Marcia Lucas, and was a contributing factor to their divorce at the end of the trilogy. The success of the first Star Wars film also resulted in more attention focused on Lucas, both positive and negative, attracting wealth and fame but also many people who wanted Lucas's financial backing, or who simply wanted to threaten him.
1977–1993: Hiatus from directing, Indiana Jones
Following the release of the first Star Wars film, Lucas worked extensively as a writer and producer, including on the many Star Wars spinoffs made for film, television, and other media. Lucas acted as executive producer for the next two Star Wars films, commissioning Irvin Kershner to direct The Empire Strikes Back, and Richard Marquand to direct Return of the Jedi, while receiving a story credit on the former and sharing a screenwriting credit with Lawrence Kasdan on the latter. He also acted as story writer and executive producer on all four of the Indiana Jones films, which his colleague and good friend Steven Spielberg directed.
Other successful projects on which Lucas was credited as executive producer and sometimes story writer in this period include Kurosawa's Kagemusha (1980), Twice Upon A Time (1983), Ewoks: Caravan of Courage (1984), Ewoks: Battle for Endor (1985), Mishima: A Life in Four Chapters (1985), Jim Henson's Labyrinth (1986), Don Bluth's The Land Before Time (1988), and the Indiana Jones television spinoff The Young Indiana Jones Chronicles (1992–96). There were unsuccessful projects, however, including More American Graffiti (1979), Willard Huyck's Howard the Duck (1986), which was the biggest flop of Lucas's career, Ron Howard's Willow (1988), Coppola's Tucker: The Man and His Dream (1988), and Mel Smith's Radioland Murders (1994). The animation studio Pixar was founded in 1979 as the Graphics Group, one third of the Computer Division of Lucasfilm. Pixar's early computer graphics research resulted in groundbreaking effects in films such as Star Trek II: The Wrath of Khan and Young Sherlock Holmes, and the group was purchased in 1986 by Steve Jobs shortly after he left Apple Computer. Jobs paid Lucas US$5 million and put US$5 million as capital into the company. The sale reflected Lucas's desire to stop the cash flow losses from his 7-year research projects associated with new entertainment technology tools, as well as his company's new focus on creating entertainment products rather than tools. As of June 1983, Lucas was worth US$60 million, but he met cash-flow difficulties following his divorce that year, concurrent with the sudden dropoff in revenues from Star Wars licenses following the theatrical run of Return of the Jedi. At this point, Lucas had no desire to return to Star Wars, and had unofficially canceled the sequel trilogy. Also in 1983, Lucas and Tomlinson Holman founded the audio company THX Ltd. The company was formerly owned by Lucasfilm and contains equipment for stereo, digital, and theatrical sound for film and music.
Skywalker Sound and Industrial Light & Magic are the sound and visual effects subdivisions of Lucasfilm, while Lucasfilm Games, later renamed LucasArts, produces products for the gaming industry.
1993–2012: Return to directing, Star Wars and Indiana Jones
Having lost much of his fortune in a divorce settlement in 1987, Lucas was reluctant to return to Star Wars. However, the prequels, which were still only a series of basic ideas partially pulled from his original drafts of "The Star Wars", continued to tantalize him with technical possibilities that would make it worthwhile to revisit his older material. When Star Wars became popular once again, in the wake of Dark Horse's comic book line and Timothy Zahn's trilogy of spin-off novels, Lucas realized that there was still a large audience. His children were older, and with the explosion of CGI technology he began to consider directing once again. By 1993, it was announced, in Variety among other sources, that Lucas would be making the prequels. He began adding more to the story, indicating that the series would be a tragic one, examining Anakin Skywalker's fall to the dark side. Lucas also began to change the status of the prequels relative to the originals; at first, they were supposed to be a "filling-in" of history tangential to the originals, but now he saw that they could form the beginning of one long story that started with Anakin's childhood and ended with his death. This was the final step towards turning the film series into a "Saga". In 1994, Lucas began work on the screenplay of the first prequel, tentatively titled Episode I: The Beginning. In 1997, to celebrate the 20th anniversary of Star Wars, Lucas returned to the original trilogy and made numerous modifications using newly available digital technology, releasing them in theaters as the Star Wars Special Edition.
For DVD releases in 2004 and Blu-ray releases in 2011, the trilogy received further revisions to make it congruent with the prequel trilogy. Besides the additions to the Star Wars franchise, Lucas released a Director's Cut of THX 1138 in 2004, with the film re-cut and containing a number of CGI revisions. The first Star Wars prequel was finished and released in 1999 as Episode I – The Phantom Menace, which would be the first film Lucas had directed in over two decades. Following the release of the first prequel, Lucas announced that he would also be directing the next two, and began working on Episode II. The first draft of Episode II was completed just weeks before principal photography, and Lucas hired Jonathan Hales, a writer from The Young Indiana Jones Chronicles, to polish it. It was completed and released in 2002 as Attack of the Clones. The final prequel, Episode III – Revenge of the Sith, began production in 2002 and was released in 2005. Numerous fans and critics considered the prequels inferior to the original trilogy, though they were box office successes. In 2004, Lucas reflected that his transition from independent to corporate filmmaker mirrored the story of Star Wars character Darth Vader in some ways. Lucas collaborated with Jeff Nathanson as a writer of the 2008 film Indiana Jones and the Kingdom of the Crystal Skull, directed by Steven Spielberg. Like the Star Wars prequels, the reception was mixed, with numerous fans and critics once again considering it inferior to its predecessors. From 2008 to 2014, Lucas also served as the creator and executive producer of a second Star Wars animated series on Cartoon Network, Star Wars: The Clone Wars, which premiered with a feature film of the same name before airing its first episode. The supervising director for this series was Dave Filoni, who was chosen by Lucas and closely collaborated with him on its development. The series bridged the events between Attack of the Clones and Revenge of the Sith.
The animated series also featured the last Star Wars stories in which Lucas was heavily involved. In 2012, Lucas served as executive producer for Red Tails, a war film based on the exploits of the Tuskegee Airmen during World War II. He also took over direction of reshoots while director Anthony Hemingway worked on other projects.
2012–present: Semi-retirement
In January 2012, Lucas announced his retirement from producing large blockbuster films, refocusing his career instead on smaller, independently budgeted features. In June 2012, it was announced that producer Kathleen Kennedy, a long-term collaborator with Steven Spielberg and a producer of the Indiana Jones films, had been appointed as co-chair of Lucasfilm Ltd. It was reported that Kennedy would work alongside Lucas, who would remain chief executive and serve as co-chairman for at least one year, after which she would succeed him as the company's sole leader. With the sale of Lucasfilm to Disney, Lucas is currently Disney's second-largest single shareholder after the estate of Steve Jobs. Lucas worked as a creative consultant on the Star Wars sequel trilogy's first film, The Force Awakens. As creative consultant on the film, Lucas's involvement included attending early story meetings; according to Lucas, "I mostly say, 'You can't do this. You can do that.' You know, 'The cars don't have wheels. They fly with antigravity.' There's a million little pieces ... I know all that stuff." Lucas's son Jett told The Guardian that his father was "very torn" about having sold the rights to the franchise, despite having hand-picked Abrams to direct, and that his father was "there to guide" but that "he wants to let it go and become its new generation." Among the materials turned over to the production team were rough story treatments Lucas developed when he considered creating episodes VII–IX himself years earlier; in January 2015, Lucas stated that Disney had discarded his story ideas.
The Force Awakens, directed by J. J. Abrams, was released on December 18, 2015. Kathleen Kennedy executive produced the film and its sequels. The new sequel trilogy was jointly produced by Lucasfilm and The Walt Disney Company, which had acquired Lucasfilm in 2012. During an interview with talk show host and journalist Charlie Rose that aired on December 24, 2015, Lucas likened his decision to sell Lucasfilm to Disney to a divorce and outlined the creative differences between him and the producers of The Force Awakens. Lucas described the previous six Star Wars films as his "children" and defended his vision for them, while criticizing The Force Awakens for having a "retro feel", saying, "I worked very hard to make them completely different, with different planets, with different spaceships – you know, to make it new." Lucas also drew some criticism and subsequently apologized for his remark likening Disney to "white slavers". In 2015, Lucas wrote the CGI film Strange Magic, his first musical; the film was produced at Skywalker Ranch and directed by Gary Rydstrom. Around the time the sequel trilogy was announced, a fifth installment of the Indiana Jones series also entered the pre-development phase, with Harrison Ford and Steven Spielberg set to return. Lucas originally did not specify whether the sale of Lucasfilm would affect his involvement with the film. In October 2016, Lucas announced that he would not be involved in the story of the film but would remain an executive producer. In 2016, Rogue One: A Star Wars Story, the first film of a Star Wars anthology series, was released. It told the story of the rebels who stole the plans for the Death Star featured in the original Star Wars film, and it was reported that Lucas liked it more than The Force Awakens. The Last Jedi, the second film in the sequel trilogy, was released in 2017; Lucas described the film as "beautifully made".
Lucas has had cursory involvement with Solo: A Star Wars Story (2018), the Star Wars streaming series The Mandalorian, and the premiere of the eighth season of Game of Thrones. Lucas met with J. J. Abrams before the latter began writing the script to the sequel trilogy's final film, The Rise of Skywalker, which was released in 2019. Filmmaking Collaboration with John Williams Lucas was also heavily involved and invested in the scoring process for the original Star Wars soundtrack, which was composed by John Williams on the recommendation of his friend and colleague Steven Spielberg. Lucas initially wanted to use existing classical pieces as the film's music, in a similar manner to 2001: A Space Odyssey, which had served as an inspiration for the film. Williams advised against this and instead proposed a system of recurring themes (or leitmotifs) to enhance the story, in the style of classical composers Gustav Holst, William Walton, and Igor Stravinsky, whose works Lucas had used as "temp tracks" for Williams to draw inspiration from. The film, and its subsequent sequels and prequels, repeatedly make use of the Main Title Theme, the Force Theme (less commonly referred to as Obi-Wan Kenobi's Theme), the Rebel Alliance Theme, and Princess Leia's Theme, all introduced in this film. Subsequent films also added to the catalogue of themes for different characters, factions, and locations. The score was released to critical acclaim and won Williams his third Academy Award for Best Original Score. The score was listed by the American Film Institute in 2005 as the greatest film score of all time. The professional relationship formed by Lucas and Williams extended through to Williams working on all of Lucas's blockbuster franchise movies: the remaining two films of the Star Wars original trilogy; all three films of the prequel trilogy developed over fifteen years later; and the four (to be five) films of the Indiana Jones franchise, in which Williams reunited with his long-time collaborator Spielberg.
In his collaborations with Lucas, Williams received six of his fifty-two Academy Award nominations (Star Wars, The Empire Strikes Back, Return of the Jedi, Raiders of the Lost Ark, Indiana Jones and the Temple of Doom, and Indiana Jones and the Last Crusade). After Lucas sold Lucasfilm to Disney, Williams stayed on board with the franchise and continued to score the remaining three films of the "Skywalker saga" (The Force Awakens, The Last Jedi, and The Rise of Skywalker, for which he received a further three Oscar nominations), after which he announced his "retirement" from the series. Lucas was in attendance for a ceremony honouring Williams as the 44th recipient of the AFI Life Achievement Award, the first composer to receive the honour, and gave a speech in praise of their relationship and his work. In interviews, and most famously at the 40th Anniversary Star Wars Celebration convention, Lucas has repeatedly reaffirmed the importance of Williams to the Star Wars saga, affectionately referring to him as the "secret sauce" of his movies. Philanthropy Lucas is one of the wealthiest celebrities in the world. Lucas has pledged to give half of his fortune to charity as part of The Giving Pledge, an effort led by Bill Gates and Warren Buffett to persuade America's richest individuals to donate their financial wealth to charities. George Lucas Educational Foundation In 1991, The George Lucas Educational Foundation was founded as a nonprofit operating foundation to celebrate and encourage innovation in schools. The Foundation's content is available under the Edutopia brand via an award-winning website, social media, and documentary films. Lucas, through his foundation, was one of the leading proponents of the E-rate program in the universal service fund, which was enacted as part of the Telecommunications Act of 1996.
On June 24, 2008, Lucas testified before the United States House of Representatives subcommittee on Telecommunications and the Internet as the head of his Foundation to advocate for a free wireless broadband educational network. Proceeds from the sale of Lucasfilm to Disney In 2012, Lucas sold Lucasfilm to The Walt Disney Company for a reported sum of $4.05 billion. It was widely reported at the time that Lucas intended to give the majority of the proceeds from the sale to charity. A spokesperson for Lucasfilm said, "George Lucas has expressed his intention, in the event the deal closes, to donate the majority of the proceeds to his philanthropic endeavors." Lucas also spoke on the matter: "For 41 years, the majority of my time and money has been put into the company. As I start a new chapter in my life, it is gratifying that I have the opportunity to devote more time and resources to philanthropy." Lucas Museum of Narrative Art By June 2013, Lucas was considering establishing a museum, the Lucas Cultural Arts Museum, to be built on Crissy Field near the Golden Gate Bridge in San Francisco, which would display his collection of illustrations and pop art, with an estimated value of more than $1 billion. Lucas offered to pay the estimated $300 million cost of constructing the museum, and would endow it with $400 million when it opened, eventually adding an additional $400 million to its endowment. After being unable to reach an agreement with The Presidio Trust, Lucas turned to Chicago. A potential lakefront site on Museum Campus in Chicago was proposed in May 2014. By June 2014, Chicago had been selected, pending approval of the Chicago Plan Commission, which was granted. The museum project was renamed the Lucas Museum of Narrative Art. On June 24, 2016, Lucas announced that he was abandoning his plans to locate the museum in Chicago, due to a lawsuit by a local preservation group, Friends of the Parks, and would instead build the museum in California.
On January 17, 2017, Lucas announced that the museum would be constructed in Exposition Park, Los Angeles, California. Other initiatives In 2005, Lucas gave US$1 million to help build the Martin Luther King Jr. Memorial on the National Mall in Washington, D.C. On September 19, 2006, the University of Southern California announced that Lucas had donated $175–180 million to his alma mater to expand the film school. It is the largest single donation to USC and the largest gift to a film school anywhere. Previous donations led to the already-existing George Lucas Instructional Building and Marcia Lucas Post-Production building. In 2013, Lucas and his wife Mellody Hobson donated $25 million to the Chicago-based not-for-profit After School Matters, of which Hobson is the chair. On April 15, 2016, it was reported that Lucas had donated between $501,000 and $1 million through the Lucas Family Foundation to the Obama Foundation, which is charged with overseeing the construction of the Barack Obama Presidential Center on Chicago's South Side. Personal life In 1969, Lucas married film editor Marcia Lou Griffin, who went on to win an Academy Award for her editing work on the original Star Wars film. They adopted a daughter, Amanda Lucas, in 1981, and divorced in 1983. Lucas subsequently adopted two more children as a single parent: daughter Katie Lucas, born in 1988, and son Jett Lucas, born in 1993. His three eldest children all appeared in the three Star Wars prequels, as did Lucas himself. Following his divorce, Lucas was in a relationship with singer Linda Ronstadt in the 1980s. Lucas began dating Mellody Hobson, president of Ariel Investments and chair of DreamWorks Animation, in 2006. Lucas and Hobson announced their engagement in January 2013, and married on June 22, 2013, at Lucas's Skywalker Ranch in Marin County, California. They have one daughter together, born via gestational carrier in August 2013.
Lucas was born and raised in a Methodist family. The religious and mythical themes in Star Wars were inspired by Lucas's interest in the writings of mythologist Joseph Campbell, and he would eventually come to identify strongly with the Eastern religious philosophies he studied and incorporated into his films, which were a major inspiration for "the Force". Lucas has come to state that his religion is "Buddhist Methodist". He resides in Marin County. Lucas is a major collector of the work of the American illustrator and painter Norman Rockwell. A collection of 57 Rockwell paintings and drawings owned by Lucas and fellow Rockwell collector and film director Steven Spielberg was displayed at the Smithsonian American Art Museum from July 2, 2010, to January 2, 2011, in an exhibition titled Telling Stories. Lucas has said that he is a fan of Seth MacFarlane's hit TV show Family Guy. MacFarlane has said that Lucasfilm was extremely helpful when the Family Guy crew wanted to parody their works. Lucas supported Democratic candidate Hillary Clinton in the run-up to the 2016 U.S. presidential election. Awards and honors In 1977, Lucas was awarded the Inkpot Award. The American Film Institute awarded Lucas its Life Achievement Award on June 9, 2005. This was shortly after the release of Star Wars: Episode III – Revenge of the Sith, about which he joked that, since he views the entire Star Wars series as one film, he could actually receive the award now that he had finally "gone back and finished the movie." Lucas was nominated for four Academy Awards: Best Director and Best Original Screenplay, for both American Graffiti and Star Wars. He received the Academy's Irving G. Thalberg Award in 1991. He appeared at the 79th Academy Awards ceremony in 2007 with Steven Spielberg and Francis Ford Coppola to present the Best Director award to their friend Martin Scorsese.
During the speech, Spielberg and Coppola talked about the joy of winning an Oscar, making fun of Lucas, who has not won a competitive Oscar. The Science Fiction Hall of Fame inducted Lucas in 2006, its second "Film, Television, and Media" contributor, after Spielberg. The Discovery Channel named him one of the 100 "Greatest Americans" in September 2008. Lucas served as Grand Marshal for the Tournament of Roses Parade and made the ceremonial coin toss at the Rose Bowl on New Year's Day 2007. In 2009, he was one of 13 California Hall of Fame inductees in The California Museum's yearlong exhibit. In July 2013, Lucas was awarded the National Medal of Arts by President Barack Obama for his contributions to American cinema. In October 2014, Lucas received honorary membership of the Society of Motion Picture and Television Engineers. In August 2015, Lucas was inducted as a Disney Legend, and on December 6, 2015, he was an honoree at the Kennedy Center Honors. In 2021, coinciding with Lucasfilm's 50th anniversary, an action figure of Lucas in stormtrooper disguise was released as part of Hasbro's Star Wars: The Black Series.
Filmography
Bibliography
1980: Alan Arnold: A Journal of the Making of "The Empire Strikes Back" (contributor)
1983: Dale Pollock: Sky Walking: The Life and Films of George Lucas (contributor)
1995: George Lucas, Chris Claremont: Shadow Moon (story)
1996: Chris Claremont: Shadow Dawn (story)
1997: Laurent Bouzereau: Star Wars: The Annotated Screenplays (contributor)
2000: Terry Brooks: Star Wars: Episode I – The Phantom Menace (novelization, contributor), Del Rey Books
2000: Chris Claremont: Shadow Star (story)
2003: R. A. Salvatore: Star Wars: Episode II – Attack of the Clones (novelization, contributor), Del Rey
2004: Matthew Stover: Shatterpoint (novel, prologue), Del Rey
2005: James Luceno: Labyrinth of Evil (novel, contributor), Del Rey
2005: Matthew Stover: Star Wars: Episode III – Revenge of the Sith (novelization, contributor & line editor), Del Rey
2007: J. W. Rinzler: The Making of "Star Wars": The Definitive Story Behind the Original Film (contributor)
2012: James Luceno: Star Wars: Darth Plagueis (novel, contributor), Del Rey
2020: Paul Duncan: The Star Wars Archives: 1999–2005 (contributor), Taschen
See also
The Making of Star Wars
External links
George Lucas biography at Lucasfilm.com
George Lucas at World of Business Ideas
https://en.wikipedia.org/wiki/Gothenburg
Gothenburg
Gothenburg (abbreviated Gbg) is the second-largest city in Sweden, fifth-largest in the Nordic countries, and capital of Västra Götaland County. It is situated by the Kattegat, on the west coast of Sweden, and has a population of approximately 570,000 in the city proper and about 1 million inhabitants in the metropolitan area. Gothenburg was founded by royal charter in 1621 by King Gustavus Adolphus as a heavily fortified, primarily Dutch trading colony. In addition to the generous privileges (e.g. tax relaxation) given to his Dutch allies from the then-ongoing Thirty Years' War, the king also attracted significant numbers of his German and Scottish allies to populate his only town on the western coast. At a key strategic location at the mouth of the Göta älv, where Scandinavia's largest drainage basin enters the sea, the Port of Gothenburg is now the largest port in the Nordic countries. Gothenburg is home to many students, as the city includes the University of Gothenburg and Chalmers University of Technology. Volvo was founded in Gothenburg in 1927. The original parent Volvo Group and the now separate Volvo Car Corporation are still headquartered on the island of Hisingen in the city. Other key companies are SKF and AstraZeneca. Gothenburg is served by Göteborg Landvetter Airport southeast of the city center. The smaller Göteborg City Airport, from the city center, was closed to regular airline traffic in 2015. The city hosts the Gothia Cup, the world's largest youth football tournament, and the Göteborg Basketball Festival, Europe's largest youth basketball tournament, alongside some of the largest annual events in Scandinavia. The Gothenburg Film Festival, held in January since 1979, is the leading Scandinavian film festival with over 155,000 visitors each year. In summer, a wide variety of music festivals are held in the city, including the popular Way Out West Festival. During 2020, Gothenburg's population increased by 3,775 inhabitants.
Name The city was named Göteborg in the city's charter in 1621 and simultaneously given the German and English name Gothenburg. The Swedish name was derived from the Göta älv, called the Göta River in English, on the pattern of other cities ending in -borg. Both the Swedish and German/English names were in use before 1621 and had already been used for the previous city founded in 1604 that burned down in 1611. Gothenburg is one of few Swedish cities to still have an official and widely used exonym. The city council of 1641 consisted of four Swedish, three Dutch, three German, and two Scottish members. In Dutch, Scots, English, and German, all languages with a long history in this trade and maritime-oriented city, the name Gothenburg is or was (in the case of German) used for the city. Variations of the official German/English name Gothenburg in the city's 1621 charter existed or exist in many languages. The French form of the city name is Gothembourg, but in French texts, the Swedish name Göteborg is more frequent. "Gothenburg" can also be seen in some older English texts. In Spanish and Portuguese the city is called Gotemburgo. These traditional forms are sometimes replaced with the use of the Swedish Göteborg, for example by The Göteborg Opera and the Göteborg Ballet. However, Göteborgs universitet, previously known as Göteborg University in English, changed its name to the University of Gothenburg in 2008. The Gothenburg municipality has also reverted to the use of the English name in international contexts. In 2009, the city council launched a new logotype for Gothenburg. Since the name "Göteborg" contains the Swedish letter "ö", they planned to make the name more "international" and "up to date" by turning the "ö" sideways; the name is now spelled "Go:teborg" on a large number of signs in the city.
History In the early modern period, the configuration of Sweden's borders made Gothenburg strategically critical as the only Swedish gateway to the Skagerrak, the North Sea and the Atlantic, situated on the west coast in a very narrow strip of Swedish territory between Danish Halland in the south and Norwegian Bohuslän in the north. After several failed attempts, Gothenburg was successfully founded in 1621 by King Gustavus Adolphus (Gustaf II Adolf). The site of the first church built in Gothenburg, subsequently destroyed by Danish invaders, is marked by a stone near the north end of the Älvsborg Bridge in the Färjenäs Park. The church was built in 1603 and destroyed in 1611. The city was heavily influenced by the Dutch, Germans, and Scots, and Dutch planners and engineers were contracted to construct the city as they had the skills needed to drain and build in the marshy areas chosen for the city. The town was designed like Dutch cities such as Amsterdam, Batavia (Jakarta) and New Amsterdam (Manhattan). The planning of the streets and canals of Gothenburg closely resembled that of Jakarta, which was built by the Dutch around the same time. The Dutchmen initially won political power, and it was not until 1652, when the last Dutch politician in the city's council died, that Swedes acquired political power over Gothenburg. During the Dutch period, the town followed Dutch town laws and Dutch was proposed as the official language in the town. Robust city walls were built during the 17th century. In 1807, a decision was made to tear down most of the city's wall. The work started in 1810 and was carried out by 150 soldiers from the Bohus regiment. Along with the Dutch, the town also was heavily influenced by Scots who settled in Gothenburg. Many became high-profile figures. William Chalmers, the son of a Scottish immigrant, donated his fortunes to set up what later became the Chalmers University of Technology.
In 1841, the Scotsman Alexander Keiller founded the Götaverken shipbuilding company that was in business until 1989. His son James Keiller donated Keiller Park to the city in 1906. The Gothenburg coat of arms was based on the lion of the coat of arms of Sweden, symbolically holding a shield with the national emblem, the Three Crowns, to defend the city against its enemies. In the Treaty of Roskilde (1658), Denmark–Norway ceded the then Danish province Halland, in the south, and the Norwegian province of Bohus County or Bohuslän in the north, leaving Gothenburg less exposed. Gothenburg was able to grow into a significant port and trade centre on the west coast because it and Marstrand were the only cities on the west coast granted the rights to trade with merchants from other countries. In the 18th century, fishing was the most important industry. However, in 1731, the Swedish East India Company was founded, and the city flourished due to its foreign trade with highly profitable commercial expeditions to China. The harbour developed into Sweden's main harbour for trade towards the west, and when Swedish emigration to the United States increased, Gothenburg became Sweden's main point of departure for these travellers. The impact of Gothenburg as a main port of embarkation for Swedish emigrants is reflected by Gothenburg, Nebraska, a small Swedish settlement in the United States. In the 19th century, Gothenburg evolved into a modern industrial city, and this growth continued into the 20th century. The population increased tenfold in the century, from 13,000 (1800) to 130,000 (1900). In the 20th century, major companies that developed included SKF (1907) and Volvo (1927). Geography Gothenburg is located on the west coast, in southwestern Sweden, about halfway between the capitals Copenhagen, Denmark, and Oslo, Norway.
The location at the mouth of the Göta älv, which feeds into the Kattegat, an arm of the North Sea, has helped the city grow in significance as a trading city. The archipelago of Gothenburg consists of rough, barren rocks and cliffs, which is also typical of the coast of Bohuslän. Due to the Gulf Stream, the city has a mild climate and moderately heavy precipitation. It is the second-largest city in Sweden after the capital Stockholm. The Gothenburg Metropolitan Area (Stor-Göteborg) has 982,360 inhabitants and extends to the municipalities of Ale, Alingsås, Göteborg, Härryda, Kungälv, Lerum, Lilla Edet, Mölndal, Partille, Stenungsund, Tjörn, and Öckerö within Västra Götaland County, and Kungsbacka within Halland County. Angered, a suburb outside Gothenburg, consists of Hjällbo, Eriksbo, Rannebergen, Hammarkullen, Gårdsten, and Lövgärdet. It is a Million Programme part of Gothenburg, like Rosengård in Malmö and Botkyrka in Stockholm. Angered had about 50,000 inhabitants in 2015. It lies north of Gothenburg and is isolated from the rest of the city. Bergsjön is another Million Programme suburb north of Gothenburg; it has 14,000 inhabitants. Biskopsgården is the biggest multicultural suburb on the island of Hisingen, which is a part of Gothenburg but separated from the city by the river. Climate Gothenburg has an oceanic climate (Cfb according to the Köppen climate classification). Despite its northerly latitude, temperatures are quite mild throughout the year and warmer than in places at a similar latitude, such as Stockholm; this is mainly because of the moderating influence of the Gulf Stream. During the summer, daylight lasts 18 hours and 5 minutes, but only 6 hours and 32 minutes in late December. The climate has become significantly milder in later decades, particularly in summer and winter; July temperatures used to be below Stockholm's 1961–1990 averages, but have since been warmer than that benchmark.
Summers are warm and pleasant with average high temperatures of and lows of , but temperatures of occur on many days during the summer. Winters are cold and windy with temperatures of around , though it rarely drops below . Precipitation is regular but generally moderate throughout the year. Snow mainly occurs from December to March, but is not unusual in November and April and can sometimes occur even in October and May. Parks and nature Gothenburg has several parks and nature reserves ranging in size from tens of square metres to hundreds of hectares. It also has many green areas that are not designated as parks or reserves. Selection of parks:
Kungsparken, built between 1839 and 1861, surrounds the canal that circles the city centre.
Garden Society of Gothenburg, a park and horticultural garden, is located next to Kungsportsavenyen. Founded in 1842 by the Swedish king Carl XIV Johan on the initiative of the amateur botanist Henric Elof von Normann, the park has a noted rose garden with some 4,000 roses of 1,900 cultivars.
Slottsskogen was created in 1874 by August Kobb. It has a free "open" zoo that includes harbor seals, penguins, horses, pigs, deer, moose, goats, and many birds. The Natural History Museum (Naturhistoriska Museet) and the city's oldest observatory are located in the park. The annual Way Out West festival is held in the park.
Änggårdsbergens naturreservat was bought in 1840 by pharmacist Arvid Gren, and donated in 1963 to the city by Sven and Carl Gren Broberg, who stipulated that the area must remain a nature and bird reserve. It lies partly in Mölndal.
Delsjöområdets naturreservat has been in use since the 17th century as a farming area; significant forest management was carried out in the late 19th century. Skatås gym and motionscentrum is situated here.
Rya Skogs Naturreservat became a protected area in 1928. It contains remnants of a defensive wall built in the mid- to late-17th century.
Keillers park was donated by James Keiller in 1906. He was the son of the Scot Alexander Keiller, who founded the Götaverken shipbuilding company.
S A Hedlunds park: Sven Adolf Hedlund, newspaper publisher and politician, bought the Bjurslätt farm in 1857, and in 1928 it was given to the city.
Hisingsparken is Gothenburg's largest park.
Flunsåsparken, built in 1950, has many free activities during the summer such as concerts and theatre.
Gothenburg Botanical Garden opened in 1923. It won an award in 2003, and in 2006 came third in "The most beautiful garden in Europe" competition. It has around 16,000 species of plants and trees. The greenhouses contain around 4,500 species including 1,600 orchids. It is considered to be one of the most important botanical gardens in Europe, with three stars in the French Guide Rouge.
Architecture Very few houses are left from the 17th century, when the city was founded, since all but the military and royal houses were built of wood. A rare exception is the Skansen Kronan. The first major architecturally interesting period is the 18th century, when the East India Company made Gothenburg an important trade city. Imposing stone houses in Neo-Classical style were erected around the canals. One example from this period is the East India House, which today houses the Göteborg City Museum. In the 19th century, the wealthy bourgeoisie began to move outside the city walls which had protected the city. The style now was an eclectic, academic, somewhat overdecorated style favoured by the middle class. The working class lived in the overcrowded city district of Haga in wooden houses. In the 19th century, the first comprehensive town plan after the founding of the city was created, which led to the construction of the main street, Kungsportsavenyen. Perhaps the most significant type of house in the city, Landshövdingehusen, were built at the end of the 19th century – three-storey houses with the first floor in stone and the other two in wood.
The early 20th century, characterized by the National Romantic style, was rich in architectural achievements. Masthugg Church is a noted example of the style of this period. In the early 1920s, on the city's 300th anniversary, the Götaplatsen square with its Neoclassical look was built. After this, the predominant style in Gothenburg and the rest of Sweden was Functionalism, which especially dominated suburbs such as Västra Frölunda and Bergsjön. The Swedish functionalist architect Uno Åhrén served as city planner from 1932 through 1943. In the 1950s, the big stadium Ullevi was built when Sweden hosted the 1958 FIFA World Cup. The modern architecture of the city has been formed by such architects as Gert Wingårdh, who started as a Post-modernist in the 1980s. Gustaf Adolf Square is a town square located in central Gothenburg. Noted buildings on the square include Gothenburg City Hall (formerly the stock exchange, opened in 1849) and the Nordic Classicism law court. The main canal of Gothenburg also flanks the square. Characteristic buildings The Gothenburg Central Station is in the centre of the city, next to Nordstan and Drottningtorget. The building has been renovated and expanded numerous times since the grand opening in October 1858. In 2003, a major reconstruction was finished which brought the 19th-century building into the 21st century, expanding the capacity for trains, travellers, and shopping. Not far from the central station is the Skanskaskrapan, more commonly known as "The Lipstick". It is high with 22 floors and coloured in red-white stripes. The skyscraper was designed by Ralph Erskine and built by Skanska in the late 1980s as the headquarters for the company. By the shore of the Göta Älv at Lilla Bommen is The Göteborg Opera. It was completed in 1994. The architect Jan Izikowitz was inspired by the landscape and described his vision as "Something that makes your mind float over the squiggling landscape like the wings of a seagull."
Feskekörka, or Fiskhallen, is an indoor fishmarket by the Rosenlundskanalen in central Gothenburg. Feskekörkan was opened on 1 November 1874 and takes its name from the building's resemblance to a Gothic church. The Gothenburg city hall is in the Beaux-Arts architectural style. The Gothenburg Synagogue at Stora Nygatan, near Drottningtorget, was built in 1855 according to the designs of the German architect August Krüger. The Gunnebo House is a country house located to the south of Gothenburg, in Mölndal. It was built in a neoclassical style towards the end of the 18th century. The Vasa Church was created in the early 1900s. It is located in Vasastan and is built of granite in a neo-Romanesque style. Another noted construction is Brudaremossen TV Tower, one of the few partially guyed towers in the world. Culture The sea, trade, and industrial history of the city are evident in the cultural life of Gothenburg. It is also a popular destination for tourists on the Swedish west coast. Museums Many of the cultural institutions, as well as hospitals and the university, were created by donations from rich merchants and industrialists, for example the Röhsska Museum. On 29 December 2004, the Museum of World Culture opened near Korsvägen. Museums include the Gothenburg Museum of Art, and several museums of sea and navigation history, natural history, the sciences, and East India. Aeroseum, close to the Göteborg City Airport, is an aircraft museum in a former military underground air force base. The Volvo Museum has exhibits on the history of Volvo and its development from 1927 until today. Products shown include cars, trucks, marine engines, and buses. Universeum is a public science centre that opened in 2001, the largest of its kind in Scandinavia. It is divided into six sections, each containing experimental workshops and a collection of reptiles, fish, and insects.
Universeum occasionally hosts debates between Swedish secondary-school students and Nobel Prize laureates or other scholars. Leisure and entertainment The most noted attraction is the amusement park Liseberg, located in the central part of the city. It is the largest amusement park in Scandinavia by number of rides, and was chosen as one of the top ten amusement parks in the world (2005) by Forbes. It is the most popular attraction in Sweden by number of visitors per year (more than 3 million). There are a number of independent theatre ensembles in the city, besides institutions such as Gothenburg City Theatre, Backa Theatre (youth theatre), and Folkteatern. The main boulevard is called Kungsportsavenyn (commonly known as Avenyn, "The Avenue"). It is about long and starts at Götaplatsen – which is the location of the Gothenburg Museum of Art, the city's theatre, and the city library, as well as the concert hall – and stretches all the way to Kungsportsplatsen in the old city centre of Gothenburg, crossing a canal and a small park. The Avenyn was created in the 1860s and 1870s as a result of an international architecture contest, and is the product of a period of extensive town planning and remodelling. Avenyn has Gothenburg's highest concentration of pubs and clubs. Gothenburg's largest shopping centre (8th largest in Sweden), Nordstan, is located in central Gothenburg. Gothenburg's Haga district is known for its picturesque wooden houses and its cafés serving the well-known Haga bulle – a large cinnamon roll similar to the kanelbulle. Restaurants in Gothenburg holding a star in the 2008 Michelin Guide included 28 +, Basement, Fond, Kock & Vin, Fiskekrogen, and Sjömagasinet. The city has a number of star chefs – over the past decade, seven of the Swedish Chef of the Year awards have been won by people from Gothenburg.
The Gustavus Adolphus pastry, eaten every 6 November in Sweden on Gustavus Adolphus Day, is especially connected to, and appreciated in, Gothenburg because the city was founded by King Gustavus Adolphus. One of Gothenburg's most popular natural tourist attractions is the southern Gothenburg archipelago, a set of several islands that can be reached by ferry boats mainly operating from Saltholmen. Within the archipelago are the Älvsborg fortress and the islands of Vinga and Styrsö. Festivals and fairs The annual Gothenburg Film Festival is the largest film festival in Scandinavia. The Gothenburg Book Fair is held each year in September; it is the largest literary festival in Scandinavia and the second-largest book fair in Europe. A radical book fair is held at the same time at the Syndikalistiskt Forum. The International Science Festival in Gothenburg, an annual festival held in central Gothenburg since April 1997, offers thought-provoking science activities for the public. The festival is visited by about people each year. This makes it the largest popular-science event in Sweden and one of the leading popular-science events in Europe. Citing the financial crisis, the International Federation of Library Associations and Institutions moved the 2010 World Library and Information Congress, previously scheduled to be held in Brisbane, Australia, to Gothenburg. The event took place on 10–15 August 2010. Music Gothenburg has a diverse music community; the Gothenburg Symphony Orchestra is the best known in classical music. Gothenburg was also the birthplace of the Swedish composer Kurt Atterberg. The first internationally successful Swedish group, the instrumental rock band The Spotnicks, came from Gothenburg. Bands such as The Soundtrack of Our Lives and Ace of Base are well-known pop representatives of the city. During the 1970s, Gothenburg had strong roots in the Swedish progressive movement (progg) with such groups as Nationalteatern, Nynningen, and Motvind. 
The record company Nacksving and the editorial office of the magazine Musikens Makt, which were also part of the progg movement, were located in Gothenburg during this time as well. There is also an active indie scene in Gothenburg. For example, the musician Jens Lekman was born in the suburb of Angered and named his 2007 release Night Falls Over Kortedala after another suburb, Kortedala. Other internationally acclaimed indie artists include the electropop duos Studio, The Knife, Air France, and The Tough Alliance, the indie rock band Love is All, songwriter José González, and pop singer El Perro del Mar, as well as the genre-bending quartet Little Dragon, fronted by vocalist Yukimi Nagano. Another son of the city is one of Sweden's most popular singers, Håkan Hellström, who often includes many places from the city in his songs. The glam rock group Supergroupies also comes from Gothenburg. Gothenburg's own commercially successful At the Gates, In Flames, and Dark Tranquillity are credited with pioneering melodic death metal. Other well-known bands of the Gothenburg scene are the thrash metal band The Haunted, the progressive power metal band Evergrey, and the power metal bands HammerFall and Dream Evil. Many music festivals take place in the city every year. The Metaltown Festival is a two-day festival featuring heavy metal bands, held in Gothenburg. It has been arranged annually since 2004, taking place at the Frihamnen venue. In June 2012, the festival included bands such as In Flames, Marilyn Manson, Slayer, Lamb of God, and Mastodon. Another popular festival, Way Out West, focuses more on the rock, electronic, and hip-hop genres. Sports As in all of Sweden, a variety of sports are followed, including football, ice hockey, basketball, handball, floorball, baseball, and figure skating. A varied amateur and professional sports club scene exists. Gothenburg is the birthplace of football in Sweden, as the first football match in Sweden was played there in 1892. 
The city's three major football clubs, IFK Göteborg, Örgryte IS, and GAIS, share a total of 34 Swedish championships between them. IFK has also won the UEFA Cup twice. Other notable clubs include BK Häcken (football), Göteborg HC (women's ice hockey), Pixbo Wallenstam IBK (floorball), multiple national handball champion Redbergslids IK, and four-time national ice hockey champion Frölunda HC. Gothenburg had a professional basketball team, Gothia Basket, until it ceased in 2010. The bandy department of GAIS, GAIS Bandy, played its first season in the highest division, Elitserien, last season. The group-stage match between the main rivals Sweden and Russia in the 2013 Bandy World Championship was played at Arena Heden in central Gothenburg. The city's most notable sports venues are Scandinavium and Ullevi (multisport) and the newly built Gamla Ullevi (football). The 2003 World Allround Speed Skating Championships were held in Rudhallen, Sweden's only indoor speed-skating arena. It is a part of Ruddalens IP, which also has a bandy field and several football fields. The only Swedish heavyweight boxing champion of the world, Ingemar Johansson, who took the title from Floyd Patterson in 1959, was from Gothenburg. Gothenburg has hosted a number of international sporting events, including the 1958 FIFA World Cup, the 1983 European Cup Winners' Cup Final, an NFL preseason game on 14 August 1988 between the Chicago Bears and the Minnesota Vikings, the 1992 European Football Championship, the 1993 and 2002 World Men's Handball Championships, the 1995 World Championships in Athletics, the 1997 World Championships in Swimming (short track), the 2002 Ice Hockey World Championships, the 2004 UEFA Cup final, the 2006 European Championships in Athletics, and the 2008 World Figure Skating Championships. Annual events held in the city are the Gothia Cup and the Göteborgsvarvet. 
The annual Gothia Cup is the world's largest football tournament by number of participants: in 2011, a total of 35,200 players from 1,567 teams and 72 nations took part. Gothenburg hosted the XIII FINA World Masters Championships in 2010. Diving, swimming, synchronized swimming and open-water competitions were held from 28 July to 7 August. The water polo events were played in the neighboring city of Borås. Gothenburg is also home to the Gothenburg Sharks, a professional baseball team in the Elitserien division of baseball in Sweden. With around 25,000 sailboats and yachts scattered about the city, sailing is a popular sports activity in the region, particularly because of the nearby Gothenburg archipelago. In June 2015, the Volvo Ocean Race, professional sailing's leading crewed offshore race, concluded in Gothenburg, which also hosted an event in the 2015–2016 America's Cup World Series in August 2015. The Gothenburg Amateur Diving Club (Göteborgs amatördykarklubb) has been operating since October 1938. Economy Due to Gothenburg's advantageous location in the centre of Scandinavia, trade and shipping have always played a major role in the city's economic history, and they continue to do so. Gothenburg's port has become the largest harbour in Scandinavia. Apart from trade, the second pillar of Gothenburg's economy has traditionally been manufacturing and industry, which significantly contribute to the city's wealth. Major companies operating plants in the area include SKF, Volvo (both cars and trucks), and Ericsson. Volvo Cars is the largest employer in Gothenburg, not including jobs in supply companies. The blue-collar industries that have long dominated the city are still important factors in its economy, but they are gradually being replaced by high-tech industries. Banking and finance are also important, as well as the event and tourist industries. 
Gothenburg is the terminus of the Valdemar-Göteborg gas pipeline, which brings natural gas from the North Sea fields to Sweden through Denmark. Historically, Gothenburg was the home base of the Swedish East India Company from the 18th century. From its founding until the late 1970s, the city was a world leader in shipbuilding, with such shipyards as Eriksbergs Mekaniska Verkstad, Götaverken, Arendalsvarvet, and Lindholmens varv. Gothenburg is classified as a global city by GaWC, with a ranking of Gamma. The city has been ranked as the 12th-most inventive city in the world by Forbes. Government Gothenburg became a city municipality with an elected city council when the first Swedish local government acts were implemented in 1863. The municipality has an assembly consisting of 81 members, elected every fourth year. Political decisions depend on citizens considering them legitimate. Political legitimacy can be based on various factors: legality, due process, and equality before the law, as well as the efficiency and effectiveness of public policy. One method used to achieve greater legitimacy for controversial policy reforms such as congestion charges is to allow citizens to decide or advise on the issue in public referendums. In December 2010, a petition for a local referendum on the congestion tax, signed by 28,000 citizens, was submitted to the City Council. This right to submit so-called "people's initiatives" was inscribed in the Local Government Act, which obliged local governments to hold a local referendum if petitioned by 5% of the citizens, unless the issue was deemed to be outside their area of jurisdiction or a majority in the City Council voted against holding such a referendum. A second petition for a referendum, signed by 57,000 citizens, was submitted to the local government in February 2013. 
This petition followed a campaign organised by a local newspaper – Göteborgs Tidningen – whose editor-in-chief argued that the paper's involvement was justified by the large public response to a series of articles on the congestion tax, as well as by concern for local democracy. Proportion of foreign born In 2019, approximately 28% (159,342 residents) of the population of Gothenburg was foreign born, and approximately 46% (265,019 residents) had at least one parent born abroad. In addition, approximately 12% (69,263 residents) were foreign citizens. In 2016, 45% of Gothenburg's immigrant population was from other parts of Europe, and 10% of the total population was from another Nordic country. Education Gothenburg has two universities, both of which started as colleges founded by private donations in the 19th century. The University of Gothenburg has about 38,000 students and is one of the largest universities in Scandinavia, and one of the most versatile in Sweden. Chalmers University of Technology is a well-known university located in Johanneberg, south of the inner city, and lately also established at Lindholmen in Norra Älvstranden on Hisingen. In 2015, there were ten adult education centres in Gothenburg: Agnesbergs folkhögskola, Arbetarrörelsens folkhögskola i Göteborg, Finska folkhögskolan, Folkhögskolan i Angered, Göteborgs folkhögskola, Kvinnofolkhögskolan, Mo Gård folkhögskola, S:ta Birgittas folkhögskola, Västra Götalands folkhögskolor and Wendelsbergs folkhögskola. In 2015, there were 49 high schools in Gothenburg. Some of the more notable schools are Hvitfeldtska gymnasiet, Göteborgs Högre Samskola, Sigrid Rudebecks gymnasium and Polhemsgymnasiet. Some high schools are also connected to large Swedish corporations, such as the SKF technical high school owned by SKF and Gothenburg's technical high school jointly owned by Volvo, Volvo Cars and Gothenburg municipality. There are two folkhögskolor that teach fine arts: Domen and Göteborgs folkhögskola. 
Transport Public transport With over of double track, the Gothenburg tram network covers most of the city and is the largest tram/light rail network in Scandinavia. Gothenburg also has a bus network. Boat and ferry services connect the Gothenburg archipelago to the mainland. The lack of a subway is due to the soft ground on which Gothenburg is situated; tunneling is very expensive in such conditions. The Gothenburg commuter rail network, with three lines, serves some nearby cities and towns. Public transport on the Göta älv river is provided by the Älvsnabben ferry line, operated by Styrsöbolaget on commission from Västtrafik. Rail and intercity bus Other major transportation hubs are Centralstationen (Gothenburg Central Station) and the Nils Ericson Terminal, with trains and buses to various destinations in Sweden, as well as connections to Oslo and Copenhagen (via Malmö). Air Gothenburg is served by Göteborg Landvetter Airport, located about 20 km (12 mi) east of the city centre and named after the nearby locality of Landvetter. Flygbussarna offers frequent bus connections to and from Gothenburg, with a travel time of 20–30 minutes. Swebus, Flixbus and Nettbuss also serve the airport with several daily departures to Gothenburg, Borås and other destinations along European route E4. Västtrafik, the local public transport provider in the area, offers additional connections to Landvetter. The airport is operated by the Swedish national airport operator Swedavia and, with 6.8 million passengers served in 2017, is Sweden's second-largest airport after Stockholm Arlanda. It serves as a base for several domestic and international airlines, e.g. Scandinavian Airlines, Norwegian Air Shuttle and Ryanair. Göteborg Landvetter, however, does not serve as a hub for any airline. In total, there are about 50 destinations with scheduled direct flights to and from Gothenburg, most of them European. An additional 40 destinations are served via charter. 
The second airport in the area, Göteborg City Airport, is closed. On 13 January 2015, the Swedish airport operator Swedavia announced that Göteborg City Airport would not reopen for commercial services following an extensive rebuild of the airport started in November 2014, citing that the cost of making the airport viable for commercial operations again was too high, at 250 million kronor ($31 million). Commercial operations were gradually wound down. The airport, formerly known as Säve Flygplats, was located northwest of the city centre, within the borders of Gothenburg Municipality. In addition to commercial airlines, the airport also hosted a number of rescue services, including the Swedish Coast Guard, and was used for other general aviation. Most civil air traffic to Göteborg City Airport was via low-cost airlines such as Ryanair and Wizz Air, which have since relocated to Landvetter Airport. Sea The Swedish company Stena Line operates ferries between Gothenburg and Frederikshavn in Denmark and between Gothenburg and Kiel in Germany. The "England ferry" (Englandsfärjan) to Newcastle via Kristiansand, run by the Danish company DFDS Seaways, ceased at the end of October 2006, after being a Gothenburg institution since the 19th century. DFDS Seaways' sister company, DFDS Tor Line, continues to run scheduled cargo ships between Gothenburg and several English ports, and these used to have limited capacity for passengers and their private vehicles. Freight ships to North America and East Asia also leave from the port. Freight Gothenburg is an intermodal logistics hub, and Gothenburg harbour has rail and truck access to the rest of Sweden and to Norway. Gothenburg harbour is the largest port in Scandinavia, with a cargo turnover of 36.9 million tonnes per year in 2004. Notable people Two of the noted people from Gothenburg are fictional, but have become synonymous with "people from Gothenburg". 
They are a working-class couple called Kal and Ada, featured in "Gothenburg jokes" (göteborgsvitsar), songs, plays and names of events. Each year, two persons who have significantly contributed to culture in the city are given the honorary titles of "Kal and Ada". A bronze statue of the couple, made by Svenrobert Lundquist, was placed outside the entrance to Liseberg in 1995. Some of the noted people from Gothenburg are Academy Award-winning actress Alicia Vikander, cookbook author Sofia von Porat, footballer Gunnar Gren, artist Evert Taube, golfer Helen Alfredsson, industrialist Victor Hasselblad, singer-songwriter Björn Ulvaeus, diplomat Jan Eliasson, British Open winner and professional golfer Henrik Stenson, Margareta Arvidsson, Miss Sweden 1966 and winner of Miss Universe 1966, YouTuber PewDiePie (Felix Kjellberg), the most subscribed-to individual on the platform with over 100 million subscribers, and YouTuber RoomieOfficial (Joel Berghult). International rankings Gothenburg has performed well in international rankings, some of which are mentioned below: The Global Destination Sustainability Index has named Gothenburg the world's most sustainable destination every year since 2016. In 2019, Gothenburg was selected by the EU as one of the top 2020 European Capitals of Smart Tourism. In 2020, Business Region Göteborg received the 'European Entrepreneurial Region Award 2020' (EER Award 2020) from the EU. International relations The Gothenburg Award is the city's international prize that recognises and supports work to achieve sustainable development – in the Gothenburg region and from a global perspective. The award, which is one million Swedish crowns, is administered and funded by a coalition of the City of Gothenburg and 12 companies. Past winners of the award have included Kofi Annan, Al Gore, and Michael Biddle. 
Twin towns and sister cities Gothenburg is twinned with: Oslo, Norway; Aarhus, Denmark (1946); Chicago, United States; Turku, Finland (1946); Tallinn, Estonia; St. Petersburg, Russia (1962); Bergen, Norway (1946); Kraków, Poland (1990); Rostock, Germany (1965); Badalona, Spain (1990); and Port Elizabeth, South Africa. With Lyon (France) there is no formal partnership, but "a joint willingness to cooperate". Gothenburg signed an agreement with Shanghai in 1986, which was upgraded in 2003 to include exchanges in culture, economics, trade and sport. The agreement was allowed to lapse in 2020. See also Gothenburg archipelago Gothenburg Protocol (on acidification, eutrophication and ground-level ozone) Gothenburg quadricentennial jubilee Largest cities of the European Union by population within city limits List of metropolitan areas in Europe Metropolitan Gothenburg Göteborgs Rapé References External links Goteborg.se – Official site for the city of Gothenburg Goteborg.se/english – Official web page with a short English description of the content of the city of Gothenburg site International.Goteborg.se – Official international site for the city of Gothenburg Goteborg.com – Gothenburg tourism portal VisitSweden – VisitSweden's profile of Gothenburg Virtual Tour Panoramas of Goteborg Metropolitan Gothenburg County seats in Sweden Municipal seats of Västra Götaland County Swedish municipal seats Populated places in Västra Götaland County Populated places in Gothenburg Municipality Populated places in Härryda Municipality Populated places in Mölndal Municipality Populated places in Partille Municipality Coastal cities and towns in Sweden Geats Port cities in Sweden Port cities and towns of the North Sea Populated places established in 1621 1621 establishments in Sweden Planned cities Skagerrak
https://en.wikipedia.org/wiki/Gotland%20County
Gotland County
Gotland County () is a county or län of Sweden. Gotland is located in the Baltic Sea to the east of Öland, and is the largest of Sweden's islands. Counties are usually sub-divided into municipalities, but Gotland County consists of only one municipality: Region Gotland. Gotland County is the only county in Sweden that is not governed by a county council. The municipality handles the tasks that are otherwise handled by the county council, mainly health care and public transport. Like other counties, Gotland has a County Administrative Board that oversees the implementation of Swedish state government policy. Both the County Administrative Board and the municipality have their seat in the largest city, Visby, with over 22,000 inhabitants. Province The provinces of Sweden are no longer officially administrative units, but are used when reporting population size, politics, etc. In this case the province, the county and the municipality all have identical borders and cover an area of 3,151 km². Administration Gotland is the only Swedish county that is not administered by a county council. Instead, the municipality is tasked with the responsibilities of a county, including public health care and public transport. The main aim of the County Administrative Board is to fulfil the goals set in national politics by the Riksdag and the Government, to coordinate the interests and promote the development of the county, to establish regional goals and to safeguard due process of law in the handling of each case. The County Administrative Board is a government agency headed by a Governor. Mats Löfving is the regional police chief for both Stockholm and Gotland Counties. Politics During a trial period, the county council provisions for Gotland have evolved into provisions for a regional council, meaning that it has assumed certain tasks from the County Administrative Board. Similar provisions apply to the counties of Västra Götaland and Skåne during the trial period. 
Governors Localities in order of size The five most populous localities of Gotland County in 2010: Foreign background SCB has collected statistics on the backgrounds of residents since 2002. These figures cover all residents who have two foreign-born parents or were born abroad themselves. The chart lists election years and, separately, the last year on record. Heraldry Gotland County inherited its coat of arms from the province of Gotland. When it is shown with a royal crown it represents the County Administrative Board. References External links Gotland County Administrative Board Region Gotland County Counties of Sweden
https://en.wikipedia.org/wiki/Global%20Positioning%20System
Global Positioning System
The Global Positioning System (GPS), originally Navstar GPS, is a satellite-based radionavigation system owned by the United States government and operated by the United States Space Force. It is one of the global navigation satellite systems (GNSS) that provides geolocation and time information to a GPS receiver anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites. Obstacles such as mountains and buildings can block the relatively weak GPS signals. The GPS does not require the user to transmit any data, and it operates independently of any telephonic or Internet reception, though these technologies can enhance the usefulness of the GPS positioning information. The GPS provides critical positioning capabilities to military, civil, and commercial users around the world. The United States government created the system, maintains and controls it, and makes it freely accessible to anyone with a GPS receiver. The GPS project was started by the U.S. Department of Defense in 1973. The first prototype spacecraft was launched in 1978 and the full constellation of 24 satellites became operational in 1993. Originally limited to use by the United States military, civilian use was allowed from the 1980s following an executive order from President Ronald Reagan after the Korean Air Lines Flight 007 incident. Advances in technology and new demands on the existing system have now led to efforts to modernize the GPS and implement the next generation of GPS Block IIIA satellites and Next Generation Operational Control System (OCX). Announcements from Vice President Al Gore and the Clinton Administration in 1998 initiated these changes, which were authorized by the U.S. Congress in 2000. During the 1990s, GPS quality was degraded by the United States government in a program called "Selective Availability"; this was discontinued on May 1, 2000, in accordance with a law signed by President Bill Clinton. 
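A fix needs at least four satellites because the receiver solves for four unknowns: three position coordinates plus its own clock bias. The sketch below illustrates that solve with a Gauss-Newton iteration on synthetic pseudoranges; the satellite coordinates and receiver state are made-up illustrative values, not real ephemerides.

```python
import numpy as np

c = 299_792_458.0  # speed of light, m/s

# Illustrative satellite positions (ECEF-style metres) and an assumed true
# receiver state, used only to synthesise example pseudoranges.
sats = np.array([
    [15600e3,  7540e3, 20140e3],
    [18760e3,  2750e3, 18610e3],
    [17610e3, 14630e3, 13480e3],
    [19170e3,   610e3, 18390e3],
])
true_pos = np.array([6371e3, 0.0, 0.0])   # receiver on Earth's surface
true_bias = 1e-4                           # receiver clock error, seconds
rho = np.linalg.norm(sats - true_pos, axis=1) + c * true_bias

# Gauss-Newton solve for (x, y, z, clock bias): four unknowns, hence the
# need for at least four satellites.
x = np.zeros(4)                            # start at Earth's centre, zero bias
for _ in range(20):
    pos, b = x[:3], x[3]
    ranges = np.linalg.norm(sats - pos, axis=1)
    resid = rho - (ranges + c * b)         # measured minus modelled
    # Jacobian: unit vectors from satellites toward the receiver, plus c
    G = np.column_stack([(pos - sats) / ranges[:, None], np.full(4, c)])
    x = x + np.linalg.lstsq(G, resid, rcond=None)[0]

print(x[:3] - true_pos, x[3] - true_bias)  # both differences shrink toward zero
```

With more than four satellites in view, the same least-squares step simply becomes overdetermined, which is how real receivers use extra satellites to improve accuracy.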
The GPS service is controlled by the United States government, which can selectively deny access to the system, as happened to the Indian military in 1999 during the Kargil War, or degrade the service at any time. As a result, several countries have developed or are in the process of setting up other global or regional satellite navigation systems. The Russian Global Navigation Satellite System (GLONASS) was developed contemporaneously with GPS, but suffered from incomplete coverage of the globe until the mid-2000s. GLONASS can be added to GPS devices, making more satellites available and enabling positions to be fixed more quickly and accurately, to within . China's BeiDou Navigation Satellite System began global services in 2018, and finished its full deployment in 2020. There are also the European Union Galileo navigation satellite system, and India's NavIC. Japan's Quasi-Zenith Satellite System (QZSS) is a GPS satellite-based augmentation system to enhance GPS's accuracy in Asia-Oceania, with satellite navigation independent of GPS scheduled for 2023. When selective availability was lifted in 2000, GPS had about a accuracy. GPS receivers that use the L5 band can have much higher accuracy, pinpointing to within , while high-end users (typically engineering and land surveying applications) are able to have accuracy on several of the bandwidth signals to within two centimeters, and even sub-millimeter accuracy for long-term measurements. , 16 GPS satellites are broadcasting L5 signals, and the signals are considered pre-operational, scheduled to reach 24 satellites by approximately 2027. History The GPS project was launched in the United States in 1973 to overcome the limitations of previous navigation systems, combining ideas from several predecessors, including classified engineering design studies from the 1960s. The U.S. 
Department of Defense developed the system, which originally used 24 satellites, for use by the United States military, and it became fully operational in 1995. Civilian use was allowed from the 1980s. Roger L. Easton of the Naval Research Laboratory, Ivan A. Getting of The Aerospace Corporation, and Bradford Parkinson of the Applied Physics Laboratory are credited with inventing it. The work of Gladys West is credited as instrumental in the development of computational techniques for detecting satellite positions with the precision needed for GPS. The design of GPS is based partly on similar ground-based radio-navigation systems, such as LORAN and the Decca Navigator, developed in the early 1940s. In 1955, Friedwardt Winterberg proposed a test of general relativity – detecting time slowing in a strong gravitational field using accurate atomic clocks placed in orbit inside artificial satellites. Special and general relativity predict that the clocks on the GPS satellites would be seen by the Earth's observers to run 38 microseconds faster per day than clocks on the Earth. The design of GPS corrects for this difference; without doing so, GPS-calculated positions would accumulate up to of error. Predecessors Dutch naval officer Wijnand Langeraar submitted a patent application for a radio-based long-range navigation system to the US Patent Office on 16 February 1955, and was granted Patent US2980907A on 18 April 1961. When the Soviet Union launched the first artificial satellite (Sputnik 1) in 1957, two American physicists, William Guier and George Weiffenbach, at Johns Hopkins University's Applied Physics Laboratory (APL) decided to monitor its radio transmissions. Within hours they realized that, because of the Doppler effect, they could pinpoint where the satellite was along its orbit. The Director of the APL gave them access to their UNIVAC to do the heavy calculations required. 
Early the next year, Frank McClure, the deputy director of the APL, asked Guier and Weiffenbach to investigate the inverse problem—pinpointing the user's location, given the satellite's. (At the time, the Navy was developing the submarine-launched Polaris missile, which required them to know the submarine's location.) This led them and APL to develop the TRANSIT system. In 1959, ARPA (renamed DARPA in 1972) also played a role in TRANSIT. TRANSIT was first successfully tested in 1960. It used a constellation of five satellites and could provide a navigational fix approximately once per hour. In 1967, the U.S. Navy developed the Timation satellite, which proved the feasibility of placing accurate clocks in space, a technology required for GPS. In the 1970s, the ground-based OMEGA navigation system, based on phase comparison of signal transmission from pairs of stations, became the first worldwide radio navigation system. Limitations of these systems drove the need for a more universal navigation solution with greater accuracy. Although there were wide needs for accurate navigation in military and civilian sectors, almost none of those was seen as justification for the billions of dollars it would cost in research, development, deployment, and operation of a constellation of navigation satellites. During the Cold War arms race, the nuclear threat to the existence of the United States was the one need that did justify this cost in the view of the United States Congress. This deterrent effect is why GPS was funded. It is also the reason for the ultra-secrecy at that time. The nuclear triad consisted of the United States Navy's submarine-launched ballistic missiles (SLBMs) along with United States Air Force (USAF) strategic bombers and intercontinental ballistic missiles (ICBMs). Considered vital to the nuclear deterrence posture, accurate determination of the SLBM launch position was a force multiplier. 
Precise navigation would enable United States ballistic missile submarines to get an accurate fix of their positions before they launched their SLBMs. The USAF, with two thirds of the nuclear triad, also had requirements for a more accurate and reliable navigation system. The U.S. Navy and U.S. Air Force were developing their own technologies in parallel to solve what was essentially the same problem. To increase the survivability of ICBMs, there was a proposal to use mobile launch platforms (comparable to the Soviet SS-24 and SS-25) and so the need to fix the launch position had similarity to the SLBM situation. In 1960, the Air Force proposed a radio-navigation system called MOSAIC (MObile System for Accurate ICBM Control) that was essentially a 3-D LORAN. A follow-on study, Project 57, was performed in 1963 and it was "in this study that the GPS concept was born." That same year, the concept was pursued as Project 621B, which had "many of the attributes that you now see in GPS" and promised increased accuracy for Air Force bombers as well as ICBMs. Updates from the Navy TRANSIT system were too slow for the high speeds of Air Force operation. The Naval Research Laboratory (NRL) continued making advances with their Timation (Time Navigation) satellites, first launched in 1967, second launched in 1969, with the third in 1974 carrying the first atomic clock into orbit and the fourth launched in 1977. Another important predecessor to GPS came from a different branch of the United States military. In 1964, the United States Army orbited its first Sequential Collation of Range (SECOR) satellite used for geodetic surveying. The SECOR system included three ground-based transmitters at known locations that would send signals to the satellite transponder in orbit. A fourth ground-based station, at an undetermined position, could then use those signals to fix its location precisely. The last SECOR satellite was launched in 1969. 
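The Doppler technique that Guier and Weiffenbach applied to Sputnik, and that underlies TRANSIT, can be illustrated with a toy satellite pass: the received frequency crosses the transmit frequency at closest approach, and the slope of that crossing encodes the distance to the satellite. All numbers below are made up for illustration (the straight-line geometry and chosen distances are simplifying assumptions).

```python
import numpy as np

# Assumed toy geometry: a satellite on a straight track at constant speed v,
# passing a stationary observer at closest (slant) distance d_true.
f0 = 400e6        # transmit frequency, Hz (TRANSIT broadcast near 400 MHz)
c = 3e8           # speed of light, m/s
v = 7.3e3         # orbital speed, m/s
d_true = 1200e3   # closest-approach distance, m

t = np.linspace(-300, 300, 6001)   # seconds around closest approach
x = v * t                           # along-track position
r = np.hypot(x, d_true)             # slant range to the observer
v_radial = v * x / r                # range rate
f_rx = f0 * (1 - v_radial / c)      # first-order Doppler-shifted frequency

# At the zero crossing of the Doppler shift the satellite is at closest
# approach; the slope there is -f0*v^2/(c*d), so a steeper crossing means
# a closer pass, letting distance be recovered from the frequency record.
i0 = np.argmin(np.abs(f_rx - f0))
slope = np.gradient(f_rx, t)[i0]
d_est = -f0 * v**2 / (c * slope)
print(round(d_est / 1e3), "km")     # recovers roughly 1200 km
```

Running the inverse of this fit over a full orbit is, in miniature, the problem Guier and Weiffenbach solved for Sputnik, and inverting it again (known satellite, unknown observer) is what TRANSIT did for submarines.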
Development

With these parallel developments in the 1960s, it was realized that a superior system could be developed by synthesizing the best technologies from 621B, Transit, Timation, and SECOR in a multi-service program. Satellite orbital position errors, induced by variations in the gravity field and radar refraction among others, had to be resolved. A team led by Harold L. Jury of Pan Am Aerospace Division in Florida from 1970 to 1973 used real-time data assimilation and recursive estimation to do so, reducing systematic and residual errors to a manageable level that permitted accurate navigation. During Labor Day weekend in 1973, a meeting of about twelve military officers at the Pentagon discussed the creation of a Defense Navigation Satellite System (DNSS). It was at this meeting that the real synthesis that became GPS was created. Later that year, the DNSS program was named Navstar. Navstar is often erroneously considered an acronym for "NAVigation System Using Timing and Ranging" but was never considered as such by the GPS Joint Program Office (TRW may have once advocated for a different navigational system that used that acronym). With the individual satellites being associated with the name Navstar (as with the predecessors Transit and Timation), a more fully encompassing name was used to identify the constellation of Navstar satellites: Navstar-GPS. Ten "Block I" prototype satellites were launched between 1978 and 1985 (an additional unit was destroyed in a launch failure). The effect of the ionosphere on radio transmission was investigated in a geophysics laboratory of the Air Force Cambridge Research Laboratory, renamed the Air Force Geophysical Research Lab (AFGRL) in 1974. AFGRL developed the Klobuchar model for computing ionospheric corrections to GPS location. Of note is work done by Australian space scientist Elizabeth Essex-Cohen at AFGRL in 1974.
She was concerned with the curving of the paths of radio waves (atmospheric refraction) traversing the ionosphere from NavSTAR satellites. After Korean Air Lines Flight 007, a Boeing 747 carrying 269 people, was shot down in 1983 after straying into the USSR's prohibited airspace in the vicinity of Sakhalin and Moneron Islands, President Ronald Reagan issued a directive making GPS freely available for civilian use, once it was sufficiently developed, as a common good. The first Block II satellite was launched on February 14, 1989, and the 24th satellite was launched in 1994. The GPS program cost at this point, not including the cost of the user equipment but including the costs of the satellite launches, has been estimated at US$5 billion. Initially, the highest-quality signal was reserved for military use, and the signal available for civilian use was intentionally degraded, in a policy known as Selective Availability. This changed on May 1, 2000, when President Bill Clinton signed a policy directive to turn off Selective Availability, providing the same accuracy to civilians that was afforded to the military. The directive was proposed by the U.S. Secretary of Defense, William Perry, in view of the widespread growth of differential GPS services by private industry to improve civilian accuracy. Moreover, the U.S. military was actively developing technologies to deny GPS service to potential adversaries on a regional basis. Since its deployment, the U.S. has implemented several improvements to the GPS service, including new signals for civil use and increased accuracy and integrity for all users, all the while maintaining compatibility with existing GPS equipment. Modernization of the satellite system has been an ongoing initiative by the U.S. Department of Defense through a series of satellite acquisitions to meet the growing needs of the military, civilians, and the commercial market.
As of early 2015, high-quality, FAA-grade Standard Positioning Service (SPS) GPS receivers provided horizontal accuracy of a few meters, although many factors such as receiver and antenna quality and atmospheric issues can affect this accuracy. GPS is owned and operated by the United States government as a national resource. The Department of Defense is the steward of GPS. The Interagency GPS Executive Board (IGEB) oversaw GPS policy matters from 1996 to 2004. After that, the National Space-Based Positioning, Navigation and Timing Executive Committee was established by presidential directive in 2004 to advise and coordinate federal departments and agencies on matters concerning the GPS and related systems. The executive committee is chaired jointly by the Deputy Secretaries of Defense and Transportation. Its membership includes equivalent-level officials from the Departments of State, Commerce, and Homeland Security, the Joint Chiefs of Staff and NASA. Components of the executive office of the president participate as observers to the executive committee, and the FCC chairman participates as a liaison. The U.S. Department of Defense is required by law to "maintain a Standard Positioning Service (as defined in the federal radio navigation plan and the standard positioning service signal specification) that will be available on a continuous, worldwide basis," and "develop measures to prevent hostile use of GPS and its augmentations without unduly disrupting or degrading civilian uses."

Timeline and modernization

In 1972, the USAF Central Inertial Guidance Test Facility (Holloman AFB) conducted developmental flight tests of four prototype GPS receivers in a Y configuration over White Sands Missile Range, using ground-based pseudo-satellites. In 1978, the first experimental Block-I GPS satellite was launched.
In 1983, after Soviet interceptor aircraft shot down the civilian airliner KAL 007 that strayed into prohibited airspace because of navigational errors, killing all 269 people on board, U.S. President Ronald Reagan announced that GPS would be made available for civilian uses once it was completed, although it had been previously published [in Navigation magazine], and that the CA code (Coarse/Acquisition code) would be available to civilian users. By 1985, ten more experimental Block-I satellites had been launched to validate the concept. Beginning in 1988, command and control of these satellites was moved from Onizuka AFS, California to the 2nd Satellite Control Squadron (2SCS) located at Falcon Air Force Station in Colorado Springs, Colorado. On February 14, 1989, the first modern Block-II satellite was launched. The Gulf War from 1990 to 1991 was the first conflict in which the military widely used GPS. In 1991, a project to create a miniature GPS receiver successfully ended, replacing the previous military receivers with a handheld receiver. In 1992, the 2nd Space Wing, which originally managed the system, was inactivated and replaced by the 50th Space Wing. By December 1993, GPS achieved initial operational capability (IOC), with a full constellation (24 satellites) available and providing the Standard Positioning Service (SPS). Full Operational Capability (FOC) was declared by Air Force Space Command (AFSPC) in April 1995, signifying full availability of the military's secure Precise Positioning Service (PPS). In 1996, recognizing the importance of GPS to civilian users as well as military users, U.S. President Bill Clinton issued a policy directive declaring GPS a dual-use system and establishing an Interagency GPS Executive Board to manage it as a national asset. 
In 1998, United States Vice President Al Gore announced plans to upgrade GPS with two new civilian signals for enhanced user accuracy and reliability, particularly with respect to aviation safety, and in 2000 the United States Congress authorized the effort, referring to it as GPS III. On May 2, 2000, "Selective Availability" was discontinued as a result of the 1996 executive order, allowing civilian users to receive a non-degraded signal globally. In 2004, the United States government signed an agreement with the European Community establishing cooperation related to GPS and Europe's Galileo system. Also in 2004, United States President George W. Bush updated the national policy and replaced the executive board with the National Executive Committee for Space-Based Positioning, Navigation, and Timing. In November 2004, Qualcomm announced successful tests of assisted GPS for mobile phones. In 2005, the first modernized GPS satellite was launched and began transmitting a second civilian signal (L2C) for enhanced user performance. On September 14, 2007, the aging mainframe-based Ground Segment Control System was transferred to the new Architecture Evolution Plan. On May 19, 2009, the United States Government Accountability Office issued a report warning that some GPS satellites could fail as soon as 2010. On May 21, 2009, the Air Force Space Command allayed fears of GPS failure, saying "There's only a small risk we will not continue to exceed our performance standard." On January 11, 2010, an update of ground control systems caused a software incompatibility with 8,000 to 10,000 military receivers manufactured by a division of Trimble Navigation Limited of Sunnyvale, Calif. On February 25, 2010, the U.S. Air Force awarded the contract to develop the GPS Next Generation Operational Control System (OCX) to improve the accuracy and availability of GPS navigation signals and serve as a critical part of GPS modernization.
Awards

On February 10, 1993, the National Aeronautic Association selected the GPS Team as winners of the 1992 Robert J. Collier Trophy, the US's most prestigious aviation award. The team combined researchers from the Naval Research Laboratory, the USAF, the Aerospace Corporation, Rockwell International Corporation, and IBM Federal Systems Company. The citation honors them "for the most significant development for safe and efficient navigation and surveillance of air and spacecraft since the introduction of radio navigation 50 years ago." Two GPS developers received the National Academy of Engineering Charles Stark Draper Prize for 2003: Ivan Getting, emeritus president of The Aerospace Corporation and an engineer at MIT, who established the basis for GPS by improving on the World War II land-based radio system called LORAN (Long-range Radio Aid to Navigation); and Bradford Parkinson, professor of aeronautics and astronautics at Stanford University, who conceived the present satellite-based system in the early 1960s and developed it in conjunction with the U.S. Air Force. Parkinson served twenty-one years in the Air Force, from 1957 to 1978, and retired with the rank of colonel. GPS developer Roger L. Easton received the National Medal of Technology on February 13, 2006. Francis X. Kane (Col. USAF, ret.) was inducted into the U.S. Air Force Space and Missile Pioneers Hall of Fame at Lackland A.F.B., San Antonio, Texas, on March 2, 2010, for his role in space technology development and the engineering design concept of GPS conducted as part of Project 621B. In 1998, GPS technology was inducted into the Space Foundation Space Technology Hall of Fame. On October 4, 2011, the International Astronautical Federation (IAF) awarded the Global Positioning System (GPS) its 60th Anniversary Award, nominated by IAF member the American Institute for Aeronautics and Astronautics (AIAA).
The IAF Honors and Awards Committee recognized the uniqueness of the GPS program and the exemplary role it has played in building international collaboration for the benefit of humanity. On December 6, 2018, Gladys West was inducted into the Air Force Space and Missile Pioneers Hall of Fame in recognition of her work on an extremely accurate geodetic Earth model, which was ultimately used to determine the orbit of the GPS constellation. On February 12, 2019, four founding members of the project were awarded the Queen Elizabeth Prize for Engineering, with the chair of the awarding board stating "Engineering is the foundation of civilisation; there is no other foundation; it makes things happen. And that's exactly what today's Laureates have done - they've made things happen. They've re-written, in a major way, the infrastructure of our world."

Basic concept

Fundamentals

The GPS receiver calculates its own four-dimensional position in spacetime based on data received from multiple GPS satellites. Each satellite carries an accurate record of its position and time, and transmits that data to the receiver. The satellites carry very stable atomic clocks that are synchronized with one another and with ground clocks. Any drift from time maintained on the ground is corrected daily. In the same manner, the satellite locations are known with great precision. GPS receivers have clocks as well, but they are less stable and less precise. Since the speed of radio waves is constant and independent of the satellite speed, the time delay between when the satellite transmits a signal and the receiver receives it is proportional to the distance from the satellite to the receiver. At a minimum, four satellites must be in view of the receiver for it to compute four unknown quantities (three position coordinates and the deviation of its own clock from satellite time).
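The four-unknown solution described above can be sketched numerically. Given four satellite positions and the corresponding pseudoranges, a linearized least-squares (Gauss-Newton) iteration recovers the receiver position and clock bias. This is an illustrative sketch only, not production GPS processing; the satellite geometry and clock error below are made up.

```python
import math

def solve_fix(sats, pseudoranges, iters=10):
    """Solve the GPS navigation equations for receiver position (x, y, z)
    and clock bias b (all in meters) by iterating a linearized
    least-squares (Gauss-Newton) update, starting from the Earth's center."""
    x = y = z = b = 0.0
    for _ in range(iters):
        H, resid = [], []
        for (sx, sy, sz), pr in zip(sats, pseudoranges):
            dx, dy, dz = x - sx, y - sy, z - sz
            rho = math.sqrt(dx*dx + dy*dy + dz*dz)   # geometric range
            resid.append(pr - (rho + b))             # pseudorange residual
            H.append([dx/rho, dy/rho, dz/rho, 1.0])  # Jacobian row
        # Normal equations N * delta = H^T * resid, solved by elimination.
        m, n = len(H), 4
        N = [[sum(H[k][i] * H[k][j] for k in range(m)) for j in range(n)]
             + [sum(H[k][i] * resid[k] for k in range(m))] for i in range(n)]
        for i in range(n):
            for j in range(i + 1, n):
                f = N[j][i] / N[i][i]
                for k in range(i, n + 1):
                    N[j][k] -= f * N[i][k]
        delta = [0.0] * n
        for i in range(n - 1, -1, -1):
            delta[i] = (N[i][n] - sum(N[i][j] * delta[j]
                                      for j in range(i + 1, n))) / N[i][i]
        x += delta[0]; y += delta[1]; z += delta[2]; b += delta[3]
    return x, y, z, b

# Hypothetical geometry: four satellites at a GPS-like radius (made up).
R = 26_560e3
sats = [(R, 0, 0), (0, R, 0), (0, 0, R), (R / 3**0.5,) * 3]
truth = (6_378e3, 0.0, 0.0)          # receiver on the Earth's surface
bias = 30_000.0                      # receiver clock error, expressed in meters
prs = [math.dist(truth, s) + bias for s in sats]
x, y, z, b = solve_fix(sats, prs)    # recovers truth and bias
```

Note that the clock bias is carried in meters (seconds multiplied by the speed of light), which keeps all four unknowns in the same units.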
More detailed description

Each GPS satellite continually broadcasts a signal (carrier wave with modulation) that includes:

A pseudorandom code (sequence of ones and zeros) that is known to the receiver. By time-aligning a receiver-generated version and the receiver-measured version of the code, the time of arrival (TOA) of a defined point in the code sequence, called an epoch, can be found in the receiver clock time scale.
A message that includes the time of transmission (TOT) of the code epoch (in GPS time scale) and the satellite position at that time.

Conceptually, the receiver measures the TOAs (according to its own clock) of four satellite signals. From the TOAs and the TOTs, the receiver forms four time of flight (TOF) values, which (given the speed of light) are approximately equivalent to the receiver-satellite ranges plus the receiver's clock offset from GPS time multiplied by the speed of light; these are called pseudo-ranges. The receiver then computes its three-dimensional position and clock deviation from the four TOFs. In practice the receiver position (in three-dimensional Cartesian coordinates with origin at the Earth's center) and the offset of the receiver clock relative to GPS time are computed simultaneously, using the navigation equations to process the TOFs. The receiver's Earth-centered solution location is usually converted to latitude, longitude and height relative to an ellipsoidal Earth model. The height may then be further converted to height relative to the geoid, which is essentially mean sea level. These coordinates may be displayed, such as on a moving map display, or recorded or used by some other system, such as a vehicle guidance system.

User-satellite geometry

Although usually not formed explicitly in the receiver processing, the conceptual time differences of arrival (TDOAs) define the measurement geometry. Each TDOA corresponds to a hyperboloid of revolution (see Multilateration).
The line connecting the two satellites involved (and its extensions) forms the axis of the hyperboloid. The receiver is located at the point where three hyperboloids intersect. It is sometimes incorrectly said that the user location is at the intersection of three spheres. While simpler to visualize, this is the case only if the receiver has a clock synchronized with the satellite clocks (i.e., the receiver measures true ranges to the satellites rather than range differences). There are marked performance benefits to the user carrying a clock synchronized with the satellites. Foremost is that only three satellites are needed to compute a position solution. If it were an essential part of the GPS concept that all users needed to carry a synchronized clock, a smaller number of satellites could be deployed, but the cost and complexity of the user equipment would increase.

Receiver in continuous operation

The description above is representative of a receiver start-up situation. Most receivers have a track algorithm, sometimes called a tracker, that combines sets of satellite measurements collected at different times—in effect, taking advantage of the fact that successive receiver positions are usually close to each other. After a set of measurements are processed, the tracker predicts the receiver location corresponding to the next set of satellite measurements. When the new measurements are collected, the receiver uses a weighting scheme to combine the new measurements with the tracker prediction. In general, a tracker can (a) improve receiver position and time accuracy, (b) reject bad measurements, and (c) estimate receiver speed and direction. The disadvantage of a tracker is that changes in speed or direction can be computed only with a delay, and that derived direction becomes inaccurate when the distance traveled between two position measurements drops below or near the random error of position measurement.
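The prediction-weighting idea behind such a tracker can be illustrated with a toy one-dimensional alpha-beta filter, which blends each new position fix with the prediction from the previous estimate. This is an illustrative sketch only, not an actual receiver algorithm; the gains and data below are arbitrary.

```python
def alpha_beta_track(measurements, dt=1.0, alpha=0.5, beta=0.1):
    """Track position x and speed v from a sequence of position fixes by
    weighting each new measurement against a predicted position."""
    x, v = measurements[0], 0.0
    history = []
    for z in measurements[1:]:
        x_pred = x + v * dt          # predict where the next fix should be
        innov = z - x_pred           # innovation: measurement minus prediction
        x = x_pred + alpha * innov   # blend prediction and measurement
        v = v + (beta / dt) * innov  # nudge the speed estimate
        history.append((x, v))
    return history

# A receiver moving at a constant 2 m/s, with one fix per second:
fixes = [2.0 * t for t in range(20)]
est_x, est_v = alpha_beta_track(fixes)[-1]   # estimates settle near 38 m, 2 m/s
```

The filter exhibits exactly the trade-off noted above: the speed estimate lags the true speed for the first several fixes and only converges after the innovation has decayed.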
GPS units can use measurements of the Doppler shift of the signals received to compute velocity accurately. More advanced navigation systems use additional sensors like a compass or an inertial navigation system to complement GPS.

Non-navigation applications

GPS requires four or more satellites to be visible for accurate navigation. The solution of the navigation equations gives the position of the receiver along with the difference between the time kept by the receiver's on-board clock and the true time-of-day, thereby eliminating the need for a more precise, and possibly impractical, receiver-based clock. Applications for GPS such as time transfer, traffic signal timing, and synchronization of cell phone base stations make use of this cheap and highly accurate timing. Some GPS applications use this time for display or, other than for the basic position calculations, do not use it at all. Although four satellites are required for normal operation, fewer suffice in special cases. If one variable is already known, a receiver can determine its position using only three satellites. For example, a ship on the open ocean usually has a known elevation close to 0 m, and the elevation of an aircraft may be known. Some GPS receivers may use additional clues or assumptions such as reusing the last known altitude, dead reckoning, inertial navigation, or including information from the vehicle computer, to give a (possibly degraded) position when fewer than four satellites are visible.

Structure

The current GPS consists of three major segments: the space segment, the control segment, and the user segment. The U.S. Space Force develops, maintains, and operates the space and control segments. GPS satellites broadcast signals from space, and each GPS receiver uses these signals to calculate its three-dimensional location (latitude, longitude, and altitude) and the current time.
Space segment

The space segment (SS) is composed of 24 to 32 satellites, or Space Vehicles (SV), in medium Earth orbit, and also includes the payload adapters to the boosters required to launch them into orbit. The GPS design originally called for 24 SVs, eight each in three approximately circular orbits, but this was modified to six orbital planes with four satellites each. The six orbit planes have approximately 55° inclination (tilt relative to the Earth's equator) and are separated by 60° right ascension of the ascending node (angle along the equator from a reference point to the orbit's intersection). The orbital period is one-half a sidereal day, i.e., 11 hours and 58 minutes, so that the satellites pass over the same locations or almost the same locations every day. The orbits are arranged so that at least six satellites are always within line of sight from everywhere on the Earth's surface. A consequence of this objective is that the four satellites are not evenly spaced (90°) apart within each orbit. In general terms, the angular differences between satellites in each orbit are 30°, 105°, 120°, and 105°, which sum to 360°. Orbiting at an altitude of approximately 20,200 km (orbital radius of approximately 26,600 km), each SV makes two complete orbits each sidereal day, repeating the same ground track each day. This was very helpful during development because even with only four satellites, correct alignment means all four are visible from one spot for a few hours each day. For military operations, the ground track repeat can be used to ensure good coverage in combat zones. As of 2019, there are 31 satellites in the GPS constellation, 27 of which are in use at a given time with the rest allocated as stand-bys. A 32nd was launched in 2018, but as of July 2019 was still in evaluation. More decommissioned satellites are in orbit and available as spares. The additional satellites improve the precision of GPS receiver calculations by providing redundant measurements.
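The half-sidereal-day orbital period quoted above pins down the orbit size through Kepler's third law, which can be checked directly:

```python
import math

MU = 3.986004418e14        # Earth's gravitational parameter, m^3 s^-2
R_EARTH = 6_378_137.0      # Earth's equatorial radius, m
SIDEREAL_DAY = 86_164.1    # seconds

# T = 2*pi*sqrt(a^3 / mu)  =>  a = (mu * (T / (2*pi))**2) ** (1/3)
T = SIDEREAL_DAY / 2                            # orbital period, ~11 h 58 min
a = (MU * (T / (2 * math.pi)) ** 2) ** (1 / 3)  # semi-major axis, ~26,560 km
altitude = a - R_EARTH                          # ~20,200 km above the surface
```

The computed semi-major axis of roughly 26,600 km and altitude of roughly 20,200 km match the published GPS constellation figures.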
With the increased number of satellites, the constellation was changed to a nonuniform arrangement. Such an arrangement was shown to improve accuracy, but it also improves the reliability and availability of the system, relative to a uniform arrangement, when multiple satellites fail. With the expanded constellation, nine satellites are usually visible at any time from any point on the Earth with a clear horizon, ensuring considerable redundancy over the minimum four satellites needed for a position.

Control segment

The control segment (CS) is composed of a master control station (MCS), an alternative master control station, four dedicated ground antennas, and six dedicated monitor stations. The MCS can also access Satellite Control Network (SCN) ground antennas (for additional command and control capability) and NGA (National Geospatial-Intelligence Agency) monitor stations. The flight paths of the satellites are tracked by dedicated U.S. Space Force monitoring stations in Hawaii, Kwajalein Atoll, Ascension Island, Diego Garcia, Colorado Springs, Colorado and Cape Canaveral, along with shared NGA monitor stations operated in England, Argentina, Ecuador, Bahrain, Australia and Washington, DC. The tracking information is sent to the MCS at Schriever Space Force Base, ESE of Colorado Springs, which is operated by the 2nd Space Operations Squadron (2 SOPS) of the U.S. Space Force. Then 2 SOPS contacts each GPS satellite regularly with a navigational update using dedicated or shared (AFSCN) ground antennas (GPS dedicated ground antennas are located at Kwajalein, Ascension Island, Diego Garcia, and Cape Canaveral). These updates synchronize the atomic clocks on board the satellites to within a few nanoseconds of each other, and adjust the ephemeris of each satellite's internal orbital model. The updates are created by a Kalman filter that uses inputs from the ground monitoring stations, space weather information, and various other inputs.
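The Kalman filter mentioned above can be illustrated, in drastically simplified scalar form, as an estimator of a single satellite clock offset from a stream of monitor-station readings. This is a toy sketch; the operational control-segment filter jointly estimates orbits, clock states, and many other quantities, and the offset and noise values below are made up.

```python
import math

def kalman_clock(measurements, q=1e-16, r=9e-14):
    """Scalar Kalman filter: state x is the estimated clock offset (seconds),
    p its variance; q is the process-noise variance (clock drift per step),
    r the measurement-noise variance."""
    x, p = 0.0, 1.0
    for z in measurements:
        p += q                # predict: drift grows the uncertainty
        k = p / (p + r)       # Kalman gain: trust in the new measurement
        x += k * (z - x)      # correct with the measurement residual
        p *= 1.0 - k          # updated uncertainty shrinks
    return x

# Made-up data: a 3.2-microsecond offset seen through ~0.3-microsecond noise.
true_offset = 3.2e-6
meas = [true_offset + 0.3e-6 * math.sin(i) for i in range(200)]
est = kalman_clock(meas)      # converges close to true_offset
```

With a small process-noise variance the filter behaves like a long running average, so the estimate ends up far more precise than any single noisy reading.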
Satellite maneuvers are not precise by GPS standards—so to change a satellite's orbit, the satellite must be marked unhealthy, so receivers don't use it. After the satellite maneuver, engineers track the new orbit from the ground, upload the new ephemeris, and mark the satellite healthy again. The operation control segment (OCS) currently serves as the control segment of record. It provides the operational capability that supports GPS users and keeps the GPS operational and performing within specification. OCS successfully replaced the legacy 1970s-era mainframe computer at Schriever Air Force Base in September 2007. After installation, the system helped enable upgrades and provide a foundation for a new security architecture that supported U.S. armed forces. OCS will continue to be the ground control system of record until the new segment, Next Generation GPS Operation Control System (OCX), is fully developed and functional. The new capabilities provided by OCX will be the cornerstone for revolutionizing GPS's mission capabilities, enabling U.S. Space Force to greatly enhance GPS operational services to U.S. combat forces, civil partners and myriad domestic and international users. The GPS OCX program also will reduce cost, schedule and technical risk. It is designed to provide 50% sustainment cost savings through efficient software architecture and Performance-Based Logistics. In addition, GPS OCX is expected to cost millions less than the cost to upgrade OCS while providing four times the capability. The GPS OCX program represents a critical part of GPS modernization and provides significant information assurance improvements over the current GPS OCS program. OCX will have the ability to control and manage GPS legacy satellites as well as the next generation of GPS III satellites, while enabling the full array of military signals. 
Built on a flexible architecture that can rapidly adapt to the changing needs of today's and future GPS users, OCX allows immediate access to GPS data and constellation status through secure, accurate and reliable information. It provides the warfighter with more secure, actionable and predictive information to enhance situational awareness. It enables the new modernized signals (L1C, L2C, and L5) and has M-code capability, which the legacy system is unable to provide. It provides significant information assurance improvements over the current program, including detecting and preventing cyber attacks, while isolating, containing and operating during such attacks. It also supports higher-volume, near-real-time command and control capabilities. On September 14, 2011, the U.S. Air Force announced the completion of the GPS OCX Preliminary Design Review and confirmed that the OCX program is ready for the next phase of development. The GPS OCX program has missed major milestones and has pushed its launch into 2021, five years past the original deadline. According to the Government Accountability Office, even this new deadline looks shaky.

User segment

The user segment (US) is composed of hundreds of thousands of U.S. and allied military users of the secure GPS Precise Positioning Service, and tens of millions of civil, commercial and scientific users of the Standard Positioning Service. In general, GPS receivers are composed of an antenna, tuned to the frequencies transmitted by the satellites, receiver-processors, and a highly stable clock (often a crystal oscillator). They may also include a display for providing location and speed information to the user. A receiver is often described by its number of channels, which signifies how many satellites it can monitor simultaneously. Originally limited to four or five, this has progressively increased over the years so that receivers typically have between 12 and 20 channels.
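User-segment receivers commonly relay fixes as NMEA 0183 sentences: comma-separated fields terminated by an XOR checksum over the characters between "$" and "*". A minimal validator and GGA parser, using the widely circulated example sentence, looks like this (an illustrative sketch; real parsers handle many more sentence types and edge cases):

```python
def nmea_checksum_ok(sentence: str) -> bool:
    """XOR all characters between '$' and '*' and compare the result
    with the two hex digits that follow the '*'."""
    body, _, given = sentence.strip().lstrip('$').partition('*')
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return f"{calc:02X}" == given[:2].upper()

def parse_gga(sentence: str):
    """Pull latitude/longitude (decimal degrees) out of a GGA fix sentence.
    Minimal sketch: ignores fix quality, altitude, and other fields."""
    if not nmea_checksum_ok(sentence):
        raise ValueError("bad NMEA checksum")
    f = sentence.split(',')
    lat = float(f[2][:2]) + float(f[2][2:]) / 60.0   # ddmm.mmm -> degrees
    lon = float(f[4][:3]) + float(f[4][3:]) / 60.0   # dddmm.mmm -> degrees
    if f[3] == 'S':
        lat = -lat
    if f[5] == 'W':
        lon = -lon
    return lat, lon

example = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
lat, lon = parse_gga(example)   # ~48.1173 N, ~11.5167 E
```

Note the latitude/longitude fields encode degrees and decimal minutes, not decimal degrees, which is a common source of parsing mistakes.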
Though there are many receiver manufacturers, they almost all use one of the chipsets produced for this purpose. GPS receivers may include an input for differential corrections, using the RTCM SC-104 format. This is typically in the form of an RS-232 port at 4,800 bit/s speed. Data is actually sent at a much lower rate, which limits the accuracy of the signal sent using RTCM. Receivers with internal DGPS receivers can outperform those using external RTCM data. Even low-cost units now commonly include Wide Area Augmentation System (WAAS) receivers. Many GPS receivers can relay position data to a PC or other device using the NMEA 0183 protocol. Although this protocol is officially defined by the National Marine Electronics Association (NMEA), references to this protocol have been compiled from public records, allowing open source tools like gpsd to read the protocol without violating intellectual property laws. Other proprietary protocols exist as well, such as the SiRF and MTK protocols. Receivers can interface with other devices using methods including a serial connection, USB, or Bluetooth.

Applications

While originally a military project, GPS is considered a dual-use technology, meaning it has significant civilian applications as well. GPS has become a widely deployed and useful tool for commerce, scientific uses, tracking, and surveillance. GPS's accurate time facilitates everyday activities such as banking, mobile phone operations, and even the control of power grids by allowing well-synchronized hand-off switching.

Civilian

Many civilian applications use one or more of GPS's three basic components: absolute location, relative movement, and time transfer.

Amateur radio: clock synchronization required for several digital modes such as FT8, FT4 and JS8; also used with APRS for position reporting; often critical during emergency and disaster communications support.
Atmosphere: studying tropospheric delays (recovery of the water vapor content) and ionospheric delays (recovery of the number of free electrons); recovery of Earth surface displacements due to atmospheric pressure loading.
Astronomy: both positional and clock synchronization data is used in astrometry, celestial mechanics and precise orbit determination. GPS is also used in amateur astronomy with small telescopes as well as by professional observatories, for example for finding extrasolar planets.
Automated vehicle: applying location and routes for cars and trucks to function without a human driver.
Cartography: both civilian and military cartographers use GPS extensively.
Cellular telephony: clock synchronization enables time transfer, which is critical for synchronizing spreading codes with other base stations to facilitate inter-cell handoff and support hybrid GPS/cellular position detection for mobile emergency calls and other applications. The first handsets with integrated GPS launched in the late 1990s. The U.S. Federal Communications Commission (FCC) mandated the feature in either the handset or in the towers (for use in triangulation) in 2002 so emergency services could locate 911 callers. Third-party software developers later gained access to GPS APIs from Nextel upon launch, followed by Sprint in 2006, and Verizon soon thereafter.
Clock synchronization: the accuracy of GPS time signals (±10 ns) is second only to the atomic clocks they are based on, and is used in applications such as GPS disciplined oscillators.
Disaster relief/emergency services: many emergency services depend upon GPS for location and timing capabilities.
GPS-equipped radiosondes and dropsondes: measure and calculate the atmospheric pressure, wind speed and direction high above the Earth's surface.
Radio occultation: for weather and atmospheric science applications.
Fleet tracking: used to identify, locate and maintain contact reports with one or more fleet vehicles in real-time.
Geodesy: determination of Earth orientation parameters, including daily and sub-daily polar motion and length-of-day variability, the Earth's center-of-mass (geocenter) motion, and low-degree gravity field parameters.
Geofencing: vehicle tracking systems, person tracking systems, and pet tracking systems use GPS to locate devices that are attached to or carried by a person, vehicle, or pet. The application can provide continuous tracking and send notifications if the target leaves a designated (or "fenced-in") area.
Geotagging: applies location coordinates to digital objects such as photographs (in Exif data) and other documents for purposes such as creating map overlays with devices like the Nikon GP-1.
GPS aircraft tracking.
GPS for mining: the use of RTK GPS has significantly improved several mining operations such as drilling, shoveling, vehicle tracking, and surveying. RTK GPS provides centimeter-level positioning accuracy.
GPS data mining: it is possible to aggregate GPS data from multiple users to understand movement patterns, common trajectories and interesting locations.
GPS tours: location determines what content to display; for instance, information about an approaching point of interest.
Navigation: navigators value digitally precise velocity and orientation measurements, as well as precise positions in real-time with support of orbit and clock corrections.
Orbit determination: of low-orbiting satellites with GPS receivers installed on board, such as GOCE, GRACE, Jason-1, Jason-2, TerraSAR-X, TanDEM-X, CHAMP, Sentinel-3, and some cubesats, e.g., CubETH.
Phasor measurements: GPS enables highly accurate timestamping of power system measurements, making it possible to compute phasors.
Recreation: for example, Geocaching, Geodashing, GPS drawing, waymarking, and other kinds of location-based mobile games such as Pokémon Go.
Reference frames: realization and densification of the terrestrial reference frames in the framework of the Global Geodetic Observing System.
* Co-location in space of satellite laser ranging and microwave observations for deriving global geodetic parameters.
* Robotics: self-navigating, autonomous robots using GPS sensors that calculate latitude, longitude, time, speed, and heading.
* Sport: used in football and rugby for the control and analysis of the training load.
* Surveying: surveyors use absolute locations to make maps and determine property boundaries.
* Tectonics: GPS enables direct fault motion measurement of earthquakes. Between earthquakes, GPS can be used to measure crustal motion and deformation to estimate seismic strain buildup for creating seismic hazard maps.
* Telematics: GPS technology integrated with computers and mobile communications technology in automotive navigation systems.

Restrictions on civilian use

The U.S. government controls the export of some civilian receivers. All GPS receivers capable of functioning above a certain altitude above sea level and speed, or designed or modified for use with unmanned missiles and aircraft, are classified as munitions (weapons), which means they require State Department export licenses. This rule applies even to otherwise purely civilian units that only receive the L1 frequency and the C/A (Coarse/Acquisition) code. Disabling operation above these limits exempts the receiver from classification as a munition. Vendor interpretations differ: the rule refers to operation at both the target altitude and speed, but some receivers stop operating even when stationary. This has caused problems with some amateur radio balloon launches that regularly reach high altitudes.

These limits only apply to units or components exported from the United States. A growing trade in various components exists, including GPS units from other countries, expressly sold as ITAR-free.

Military

As of 2009, military GPS applications include:

* Navigation: Soldiers use GPS to find objectives, even in the dark or in unfamiliar territory, and to coordinate troop and supply movement.
In the United States armed forces, commanders use the Commander's Digital Assistant and lower ranks use the Soldier Digital Assistant.
* Target tracking: Various military weapons systems use GPS to track potential ground and air targets before flagging them as hostile. These weapon systems pass target coordinates to precision-guided munitions to allow them to engage targets accurately. Military aircraft, particularly in air-to-ground roles, use GPS to find targets.
* Missile and projectile guidance: GPS allows accurate targeting of various military weapons including ICBMs, cruise missiles, precision-guided munitions and artillery shells. Embedded GPS receivers able to withstand accelerations of 12,000 g have been developed for use in howitzer shells.
* Search and rescue.
* Reconnaissance: Patrol movement can be managed more closely.

GPS satellites carry a set of nuclear detonation detectors consisting of an optical sensor called a bhangmeter, an X-ray sensor, a dosimeter, and an electromagnetic pulse (EMP) sensor (W-sensor), which together form a major portion of the United States Nuclear Detonation Detection System. General William Shelton has stated that future satellites may drop this feature to save money.

GPS-type navigation was first used in war in the 1991 Persian Gulf War, before GPS was fully developed in 1995, to assist Coalition Forces in navigating and performing maneuvers. The war also demonstrated the vulnerability of GPS to jamming, when Iraqi forces installed jamming devices on likely targets that emitted radio noise, disrupting reception of the weak GPS signal. GPS's vulnerability to jamming is a threat that continues to grow as jamming equipment and experience grow. GPS signals have been reported to have been jammed many times over the years for military purposes.
Russia appears to have several objectives for this behavior, such as intimidating neighbors while undermining confidence in their reliance on American systems, promoting its GLONASS alternative, disrupting Western military exercises, and protecting assets from drones. China uses jamming to discourage US surveillance aircraft near the contested Spratly Islands. North Korea has mounted several major jamming operations near its border with South Korea and offshore, disrupting flights, shipping and fishing operations. The Iranian armed forces jammed the GPS of civilian airliner Flight PS752 when they shot down the aircraft.

Timekeeping

Leap seconds

While most clocks derive their time from Coordinated Universal Time (UTC), the atomic clocks on the satellites are set to "GPS time". The difference is that GPS time is not corrected to match the rotation of the Earth, so it does not contain leap seconds or other corrections that are periodically added to UTC. GPS time was set to match UTC in 1980, but has since diverged. The lack of corrections means that GPS time remains at a constant offset from International Atomic Time (TAI): TAI − GPS = 19 seconds. Periodic corrections are performed on the on-board clocks to keep them synchronized with ground clocks.

The GPS navigation message includes the difference between GPS time and UTC. GPS time is 18 seconds ahead of UTC because of the leap second added to UTC on December 31, 2016. Receivers subtract this offset from GPS time to calculate UTC and specific time zone values. New GPS units may not show the correct UTC time until after receiving the UTC offset message. The GPS-UTC offset field can accommodate 255 leap seconds (eight bits).
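The offset bookkeeping above can be sketched in a few lines; the 18-second GPS-UTC value is the one broadcast since the December 31, 2016 leap second, and in practice a receiver would read it from the navigation message rather than hard-code it:

```python
from datetime import datetime, timedelta

TAI_MINUS_GPS = 19  # fixed by the definition of GPS time
GPS_MINUS_UTC = 18  # broadcast in the navigation message; 18 s since 2017-01-01

def gps_to_utc(gps_time: datetime) -> datetime:
    """Convert a timestamp on the GPS time scale to UTC by subtracting
    the broadcast leap-second offset (valid for dates after 2016-12-31)."""
    return gps_time - timedelta(seconds=GPS_MINUS_UTC)

def gps_to_tai(gps_time: datetime) -> datetime:
    """GPS time runs a constant 19 s behind International Atomic Time."""
    return gps_time + timedelta(seconds=TAI_MINUS_GPS)
```

The constant TAI offset and the varying UTC offset differ by exactly the number of leap seconds inserted since 1980, which is why the navigation message must carry the UTC value.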
Accuracy

GPS time is theoretically accurate to about 14 nanoseconds, due to the clock drift relative to International Atomic Time that the atomic clocks in GPS transmitters experience. Most receivers lose some accuracy in their interpretation of the signals and are only accurate to about 100 nanoseconds.

Format

As opposed to the year, month, and day format of the Gregorian calendar, the GPS date is expressed as a week number and a seconds-into-week number. The week number is transmitted as a ten-bit field in the C/A and P(Y) navigation messages, and so it becomes zero again every 1,024 weeks (19.6 years). GPS week zero started at 00:00:00 UTC (00:00:19 TAI) on January 6, 1980, and the week number became zero again for the first time at 23:59:47 UTC on August 21, 1999 (00:00:19 TAI on August 22, 1999). It happened the second time at 23:59:42 UTC on April 6, 2019. To determine the current Gregorian date, a GPS receiver must be provided with the approximate date (to within 3,584 days) to correctly translate the GPS date signal. To address this concern in the future, the modernized GPS civil navigation (CNAV) message will use a 13-bit field that only repeats every 8,192 weeks (157 years), thus lasting until 2137 (157 years after GPS week zero).

Communication

The navigational signals transmitted by GPS satellites encode a variety of information including satellite positions, the state of the internal clocks, and the health of the network. These signals are transmitted on two separate carrier frequencies that are common to all satellites in the network. Two different encodings are used: a public encoding that enables lower resolution navigation, and an encrypted encoding used by the U.S. military.

Message format

{|class="wikitable" style="float:right; margin:0 0 0.5em 1em;" border="1"
|+
! Subframes !!
Description
|-
| 1 || Satellite clock, GPS time relationship
|-
| 2–3 || Ephemeris (precise satellite orbit)
|-
| 4–5 || Almanac component (satellite network synopsis, error correction)
|}

Each GPS satellite continuously broadcasts a navigation message on L1 (C/A and P/Y) and L2 (P/Y) frequencies at a rate of 50 bits per second (see bitrate). Each complete message takes 750 seconds (12.5 minutes) to complete. The message structure has a basic format of a 1500-bit-long frame made up of five subframes, each subframe being 300 bits (6 seconds) long. Subframes 4 and 5 are subcommutated 25 times each, so that a complete data message requires the transmission of 25 full frames. Each subframe consists of ten words, each 30 bits long. Thus, with 300 bits in a subframe times 5 subframes in a frame times 25 frames in a message, each message is 37,500 bits long. At a transmission rate of 50 bit/s, this gives 750 seconds to transmit an entire almanac message. Each 30-second frame begins precisely on the minute or half-minute as indicated by the atomic clock on each satellite.

The first subframe of each frame encodes the week number and the time within the week, as well as the data about the health of the satellite. The second and the third subframes contain the ephemeris, the precise orbit for the satellite. The fourth and fifth subframes contain the almanac, which contains coarse orbit and status information for up to 32 satellites in the constellation as well as data related to error correction. Thus, to obtain an accurate satellite location from this transmitted message, the receiver must demodulate the message from each satellite it includes in its solution for 18 to 30 seconds. To collect all transmitted almanacs, the receiver must demodulate the message for 732 to 750 seconds, or about 12.5 minutes.

All satellites broadcast at the same frequencies, encoding signals using unique code-division multiple access (CDMA) so receivers can distinguish individual satellites from each other.
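The ten-bit week arithmetic described under "Format" can be sketched as follows; as noted above, resolving the 1,024-week rollover requires an approximate date from outside the signal:

```python
from datetime import date, timedelta

GPS_EPOCH = date(1980, 1, 6)  # start of GPS week zero
WEEK_ROLLOVER = 1024          # ten-bit week field in the legacy navigation message

def resolve_week(broadcast_week: int, approx: date) -> int:
    """Recover the full GPS week number from the ten-bit broadcast value,
    given a date known to within half a rollover period (3,584 days)."""
    approx_week = (approx - GPS_EPOCH).days // 7
    # Pick the candidate congruent to the broadcast value (mod 1024)
    # that lies closest to the approximate week.
    k = round((approx_week - broadcast_week) / WEEK_ROLLOVER)
    return broadcast_week + k * WEEK_ROLLOVER

def week_start(full_week: int) -> date:
    """Gregorian date on which a given full GPS week begins (leap seconds ignored)."""
    return GPS_EPOCH + timedelta(weeks=full_week)
```

For example, a broadcast week of 0 seen in late August 1999 resolves to full week 1,024, matching the first rollover date given above; the same broadcast value seen in April 2019 resolves to week 2,048.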
The system uses two distinct CDMA encoding types: the coarse/acquisition (C/A) code, which is accessible by the general public, and the precise (P(Y)) code, which is encrypted so that only the U.S. military and other NATO nations who have been given access to the encryption code can access it.

The ephemeris is updated every 2 hours and is sufficiently stable for 4 hours, with provisions for updates every 6 hours or longer in non-nominal conditions. The almanac is typically updated every 24 hours. Additionally, data for a few weeks following is uploaded in case of transmission updates that delay data upload.

Satellite frequencies

{|class="wikitable" style="float:right; width:30em; margin:0 0 0.5em 1em;" border="1"
|+
! Band !! Frequency !! Description
|-
| L1 || 1575.42 MHz || Coarse-acquisition (C/A) and encrypted precision (P(Y)) codes, plus the L1 civilian (L1C) and military (M) codes on Block III and newer satellites.
|-
| L2 || 1227.60 MHz || P(Y) code, plus the L2C and military codes on Block IIR-M and newer satellites.
|-
| L3 || 1381.05 MHz || Used for nuclear detonation (NUDET) detection.
|-
| L4 || 1379.913 MHz || Being studied for additional ionospheric correction.
|-
| L5 || 1176.45 MHz || Used as a civilian safety-of-life (SoL) signal on Block IIF and newer satellites.
|}

All satellites broadcast at the same two frequencies, 1.57542 GHz (L1 signal) and 1.2276 GHz (L2 signal). The satellite network uses a CDMA spread-spectrum technique in which the low-bitrate message data is encoded with a high-rate pseudo-random noise (PRN) sequence that is different for each satellite. The receiver must be aware of the PRN codes for each satellite to reconstruct the actual message data. The C/A code, for civilian use, transmits data at 1.023 million chips per second, whereas the P code, for U.S. military use, transmits at 10.23 million chips per second.
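The C/A PRN sequences are Gold codes of length 1,023, generated by XORing the outputs of two 10-stage linear-feedback shift registers. A minimal sketch follows; the G2 output taps shown for PRN 1 follow the published C/A code assignments, and each other PRN uses a different tap pair:

```python
def ca_code(tap1: int, tap2: int) -> list[int]:
    """Generate one period (1,023 chips) of a C/A Gold code.

    G1 feedback taps: stages 3 and 10; G2 feedback taps: stages 2, 3, 6, 8, 9, 10.
    The satellite-specific chip is G1's stage-10 output XORed with two
    selected stages of G2 (stages 2 and 6 for PRN 1).
    """
    g1 = [1] * 10  # registers start in the all-ones state
    g2 = [1] * 10
    chips = []
    for _ in range(1023):
        chips.append(g1[9] ^ g2[tap1 - 1] ^ g2[tap2 - 1])
        fb1 = g1[2] ^ g1[9]
        fb2 = g2[1] ^ g2[2] ^ g2[5] ^ g2[7] ^ g2[8] ^ g2[9]
        g1 = [fb1] + g1[:-1]  # shift, feeding the new bit into stage 1
        g2 = [fb2] + g2[:-1]
    return chips

prn1 = ca_code(2, 6)  # G2 taps (2, 6) select PRN 1
```

The first ten chips of PRN 1 come out as 1100100000 (octal 1440), which is how receivers and the interface specification identify the sequence.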
The actual internal reference of the satellites is 10.22999999543 MHz to compensate for relativistic effects that make observers on the Earth perceive a different time reference with respect to the transmitters in orbit. The L1 carrier is modulated by both the C/A and P codes, while the L2 carrier is only modulated by the P code. The P code can be encrypted as a so-called P(Y) code that is only available to military equipment with a proper decryption key. Both the C/A and P(Y) codes impart the precise time-of-day to the user.

The L3 signal at a frequency of 1.38105 GHz is used to transmit data from the satellites to ground stations. This data is used by the United States Nuclear Detonation (NUDET) Detection System (USNDS) to detect, locate, and report nuclear detonations (NUDETs) in the Earth's atmosphere and near space. One usage is the enforcement of nuclear test ban treaties.

The L4 band at 1.379913 GHz is being studied for additional ionospheric correction.

The L5 frequency band at 1.17645 GHz was added in the process of GPS modernization. This frequency falls into an internationally protected range for aeronautical navigation, promising little or no interference under all circumstances. The first Block IIF satellite providing this signal was launched in May 2010. On February 5, 2016, the 12th and final Block IIF satellite was launched. The L5 signal consists of two carrier components that are in phase quadrature with each other. Each carrier component is bi-phase shift key (BPSK) modulated by a separate bit train. "L5, the third civil GPS signal, will eventually support safety-of-life applications for aviation and provide improved availability and accuracy."

In 2011, a conditional waiver was granted to LightSquared to operate a terrestrial broadband service near the L1 band.
Although LightSquared had applied for a license to operate in the 1525 to 1559 band as early as 2003, and this was put out for public comment, the FCC asked LightSquared to form a study group with the GPS community to test GPS receivers and identify issues that might arise due to the larger signal power from the LightSquared terrestrial network. The GPS community had not objected to the LightSquared (formerly MSV and SkyTerra) applications until November 2010, when LightSquared applied for a modification to its Ancillary Terrestrial Component (ATC) authorization. This filing (SAT-MOD-20101118-00239) amounted to a request to run several orders of magnitude more power in the same frequency band for terrestrial base stations, essentially repurposing what was supposed to be a "quiet neighborhood" for signals from space as the equivalent of a cellular network. Testing in the first half of 2011 demonstrated that the impact of the lower 10 MHz of spectrum on GPS devices is minimal (less than 1% of the total GPS devices are affected). The upper 10 MHz intended for use by LightSquared may have some impact on GPS devices. There is some concern that this may seriously degrade the GPS signal for many consumer uses. Aviation Week magazine reported that the latest testing (June 2011) confirmed "significant jamming" of GPS by LightSquared's system.

Demodulation and decoding

Because all of the satellite signals are modulated onto the same L1 carrier frequency, the signals must be separated after demodulation. This is done by assigning each satellite a unique binary sequence known as a Gold code. The signals are decoded after demodulation using addition of the Gold codes corresponding to the satellites monitored by the receiver. If the almanac information has previously been acquired, the receiver picks the satellites to listen for by their PRNs, unique numbers in the range 1 through 32.
If the almanac information is not in memory, the receiver enters a search mode until a lock is obtained on one of the satellites. To obtain a lock, it is necessary that there be an unobstructed line of sight from the receiver to the satellite. The receiver can then acquire the almanac and determine the satellites it should listen for. As it detects each satellite's signal, it identifies it by its distinct C/A code pattern. There can be a delay of up to 30 seconds before the first estimate of position because of the need to read the ephemeris data. Processing of the navigation message enables the determination of the time of transmission and the satellite position at this time. For more information see Demodulation and Decoding, Advanced.

Navigation equations

Problem description

The receiver uses messages received from satellites to determine the satellite positions and time sent. The x, y, and z components of satellite position and the time sent (s) are designated as [xi, yi, zi, si] where the subscript i denotes the satellite and has the value 1, 2, ..., n, where n ≥ 4. When the time of message reception indicated by the on-board receiver clock is t̃i, the true reception time is ti = t̃i − b, where b is the receiver's clock bias from the much more accurate GPS clocks employed by the satellites. The receiver clock bias is the same for all received satellite signals (assuming the satellite clocks are all perfectly synchronized). The message's transit time is t̃i − b − si, where si is the satellite time. Assuming the message traveled at the speed of light, c, the distance traveled is (t̃i − b − si)c.

For n satellites, the equations to satisfy are:

di = (t̃i − b − si)c,  i = 1, 2, ..., n

where di is the geometric distance or range between receiver and satellite i (the values without subscripts are the x, y, and z components of receiver position):

di = √((x − xi)² + (y − yi)² + (z − zi)²)

Defining pseudoranges as pi = (t̃i − si)c, we see they are biased versions of the true range:

pi = di + bc,  i = 1, 2, ..., n.
Since the equations have four unknowns [x, y, z, b]—the three components of GPS receiver position and the clock bias—signals from at least four satellites are necessary to attempt solving these equations. They can be solved by algebraic or numerical methods. Existence and uniqueness of GPS solutions are discussed by Abell and Chaffee. When n is greater than four, this system is overdetermined and a fitting method must be used.

The amount of error in the results varies with the received satellites' locations in the sky, since certain configurations (when the received satellites are close together in the sky) cause larger errors. Receivers usually calculate a running estimate of the error in the calculated position. This is done by multiplying the basic resolution of the receiver by quantities called the geometric dilution of precision (GDOP) factors, calculated from the relative sky directions of the satellites used. The receiver location is expressed in a specific coordinate system, such as latitude and longitude using the WGS 84 geodetic datum or a country-specific system.

Geometric interpretation

The GPS equations can be solved by numerical and analytical methods. Geometrical interpretations can enhance the understanding of these solution methods.

Spheres

The measured ranges, called pseudoranges, contain clock errors. In a simplified idealization in which the ranges are synchronized, these true ranges represent the radii of spheres, each centered on one of the transmitting satellites. The solution for the position of the receiver is then at the intersection of the surfaces of these spheres; see trilateration (more generally, true-range multilateration). Signals from at minimum three satellites are required, and their three spheres would typically intersect at two points. One of the points is the location of the receiver, and the other moves rapidly in successive measurements and would not usually be on Earth's surface.
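A minimal numerical sketch of solving the four-unknown pseudorange system described above, using a Gauss–Newton iteration on synthetic data; the satellite coordinates, receiver position, and clock bias below are made-up illustrative values, not real geometry:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def solve_linear(A, y):
    """Solve the square system A x = y by Gaussian elimination with partial pivoting."""
    n = len(y)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gn_fix(sats, pr, guess=(0.0, 0.0, 0.0, 0.0), iters=15):
    """Gauss-Newton on the residuals f_i = |s_i - x| + b*c - p_i
    for the four unknowns (x, y, z, b)."""
    x, y, z, b = guess
    for _ in range(iters):
        J, f = [], []
        for (sx, sy, sz), p in zip(sats, pr):
            d = math.dist((sx, sy, sz), (x, y, z))
            f.append(d + b * C - p)
            # Row of the Jacobian: unit vector from satellite toward receiver, then c.
            J.append([(x - sx) / d, (y - sy) / d, (z - sz) / d, C])
        dx = solve_linear(J, [-r for r in f])
        x, y, z, b = x + dx[0], y + dx[1], z + dx[2], b + dx[3]
    return x, y, z, b
```

With exactly four satellites the linearized system is square; with more, the same update would be computed by least squares, as the "Solution methods" discussion below notes.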
In practice, there are many sources of inaccuracy besides clock bias, including random errors as well as the potential for precision loss from subtracting numbers close to each other if the centers of the spheres are relatively close together. This means that the position calculated from three satellites alone is unlikely to be accurate enough. Data from more satellites can help because of the tendency for random errors to cancel out and also by giving a larger spread between the sphere centers. But at the same time, more spheres will not generally intersect at one point. Therefore, a near intersection gets computed, typically via least squares. The more signals available, the better the approximation is likely to be.

Hyperboloids

If the pseudorange between the receiver and satellite i and the pseudorange between the receiver and satellite j are subtracted, pi − pj, the common receiver clock bias (b) cancels out, resulting in a difference of distances di − dj. The locus of points having a constant difference in distance to two points (here, two satellites) is a hyperbola on a plane and a hyperboloid of revolution (more specifically, a two-sheeted hyperboloid) in 3D space (see multilateration). Thus, from four pseudorange measurements, the receiver can be placed at the intersection of the surfaces of three hyperboloids, each with foci at a pair of satellites. With additional satellites, the multiple intersections are not necessarily unique, and a best-fitting solution is sought instead.

Inscribed sphere

The receiver position can be interpreted as the center of an inscribed sphere (insphere) of radius bc, given by the receiver clock bias b (scaled by the speed of light c). The insphere location is such that it touches the other spheres. The circumscribing spheres are centered at the GPS satellites, whose radii equal the measured pseudoranges pi. This configuration is distinct from the one described above, in which the spheres' radii were the unbiased or geometric ranges di.
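The clock-bias cancellation behind the hyperboloid picture is easy to check numerically; in this sketch (made-up satellite and receiver coordinates) the pseudorange differences come out the same whatever receiver clock bias is assumed:

```python
import math

C = 299_792_458.0  # speed of light, m/s

# Made-up satellite positions and receiver location, in metres:
sats = [(26_560e3, 0.0, 0.0), (0.0, 26_560e3, 0.0), (0.0, 0.0, 26_560e3)]
receiver = (6_371e3, 1_000.0, 2_000.0)

def pseudoranges(bias_s: float) -> list[float]:
    """Geometric ranges plus the common offset b*c caused by a receiver clock bias."""
    return [math.dist(s, receiver) + bias_s * C for s in sats]

def satellite_differences(p: list[float]) -> list[float]:
    """p_i - p_0 for each satellite i: the quantities defining the hyperboloids."""
    return [pi - p[0] for pi in p[1:]]
```

Evaluating `satellite_differences(pseudoranges(b))` for different values of b returns the same numbers, which is why single-differencing between satellites removes the receiver clock as an unknown.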
Hypercones

The clock in the receiver is usually not of the same quality as the ones in the satellites and will not be accurately synchronized to them. This produces pseudoranges with large differences compared to the true distances to the satellites. Therefore, in practice, the time difference between the receiver clock and the satellite time is defined as an unknown clock bias b. The equations are then solved simultaneously for the receiver position and the clock bias. The solution space [x, y, z, b] can be seen as a four-dimensional spacetime, and signals from at minimum four satellites are needed. In that case each of the equations describes a hypercone (or spherical cone), with the cusp located at the satellite and the base a sphere around the satellite. The receiver is at the intersection of four or more such hypercones.

Solution methods

Least squares

When more than four satellites are available, the calculation can use the four best, or more than four simultaneously (up to all visible satellites), depending on the number of receiver channels, processing capability, and geometric dilution of precision (GDOP). Using more than four involves an over-determined system of equations with no unique solution; such a system can be solved by a least-squares or weighted least squares method.

Iterative

Both the equations for four satellites and the least squares equations for more than four are non-linear and need special solution methods. A common approach is iteration on a linearized form of the equations, such as the Gauss–Newton algorithm. The GPS was initially developed assuming use of a numerical least-squares solution method—i.e., before closed-form solutions were found.

Closed-form

One closed-form solution to the above set of equations was developed by S. Bancroft. Its properties are well known; in particular, proponents claim it is superior in low-GDOP situations, compared to iterative least squares methods.
Bancroft's method is algebraic, as opposed to numerical, and can be used for four or more satellites. When four satellites are used, the key steps are inversion of a 4x4 matrix and solution of a single-variable quadratic equation. Bancroft's method provides one or two solutions for the unknown quantities. When there are two (usually the case), only one is a near-Earth sensible solution. When a receiver uses more than four satellites for a solution, Bancroft uses the generalized inverse (i.e., the pseudoinverse) to find a solution. A case has been made that iterative methods, such as the Gauss–Newton algorithm approach for solving over-determined non-linear least squares (NLLS) problems, generally provide more accurate solutions. Leick et al. (2015) state that "Bancroft's (1985) solution is a very early, if not the first, closed-form solution." Other closed-form solutions were published afterwards, although their adoption in practice is unclear.

Error sources and analysis

GPS error analysis examines error sources in GPS results and the expected size of those errors. GPS makes corrections for receiver clock errors and other effects, but some residual errors remain uncorrected. Error sources include signal arrival time measurements, numerical calculations, atmospheric effects (ionospheric/tropospheric delays), ephemeris and clock data, multipath signals, and natural and artificial interference. The magnitude of residual errors from these sources depends on geometric dilution of precision. Artificial errors may result from jamming devices that threaten ships and aircraft, or from intentional signal degradation through selective availability, which limited accuracy until it was switched off on May 1, 2000.

Accuracy enhancement and surveying

Augmentation

Integrating external information into the calculation process can materially improve accuracy. Such augmentation systems are generally named or described based on how the information arrives.
Some systems transmit additional error information (such as clock drift, ephemeris, or ionospheric delay), others characterize prior errors, while a third group provides additional navigational or vehicle information. Examples of augmentation systems include the Wide Area Augmentation System (WAAS), European Geostationary Navigation Overlay Service (EGNOS), Differential GPS (DGPS), inertial navigation systems (INS) and Assisted GPS. The standard accuracy of about can be augmented to with DGPS, and to about with WAAS.

Precise monitoring

Accuracy can be improved through precise monitoring and measurement of existing GPS signals in additional or alternative ways. The largest remaining error is usually the unpredictable delay through the ionosphere. The spacecraft broadcast ionospheric model parameters, but some errors remain. This is one reason GPS spacecraft transmit on at least two frequencies, L1 and L2. Ionospheric delay is a well-defined function of frequency and the total electron content (TEC) along the path, so measuring the arrival time difference between the frequencies determines TEC and thus the precise ionospheric delay at each frequency.

Military receivers can decode the P(Y) code transmitted on both L1 and L2. Without decryption keys, it is still possible to use a codeless technique to compare the P(Y) codes on L1 and L2 to gain much of the same error information. This technique is slow, so it is currently available only on specialized surveying equipment. In the future, additional civilian codes are expected to be transmitted on the L2 and L5 frequencies. All users will then be able to perform dual-frequency measurements and directly compute ionospheric delay errors.

A second form of precise monitoring is called Carrier-Phase Enhancement (CPGPS). This corrects the error that arises because the pulse transition of the PRN is not instantaneous, and thus the correlation (satellite-receiver sequence matching) operation is imperfect.
CPGPS uses the L1 carrier wave, whose period is about one-thousandth of the C/A Gold code bit period, to act as an additional clock signal and resolve the uncertainty. The phase difference error in normal GPS amounts to a corresponding range ambiguity. CPGPS working to within 1% of perfect transition reduces this error to 1% of that ambiguity. By eliminating this error source, CPGPS coupled with DGPS normally realizes a substantial improvement in absolute accuracy.

Relative Kinematic Positioning (RKP) is a third alternative for a precise GPS-based positioning system. In this approach, the range signal can be resolved to high precision. This is done by resolving the number of cycles over which the signal is transmitted and received, using a combination of differential GPS (DGPS) correction data, transmitted GPS signal phase information, and ambiguity resolution techniques via statistical tests—possibly with processing in real time (real-time kinematic positioning, RTK).

Carrier phase tracking (surveying)

Another method that is used in surveying applications is carrier phase tracking. The period of the carrier frequency multiplied by the speed of light gives the wavelength, which is about for the L1 carrier. Accuracy within 1% of wavelength in detecting the leading edge reduces this component of pseudorange error to as little as . This compares to for the C/A code and for the P code. Such accuracy requires measuring the total phase—the number of waves multiplied by the wavelength plus the fractional wavelength—which requires specially equipped receivers. This method has many surveying applications. It is accurate enough for real-time tracking of the very slow motions of tectonic plates, typically per year.

Triple differencing followed by numerical root finding and the least squares technique can estimate the position of one receiver given the position of another.
First, compute the difference between satellites, then between receivers, and finally between epochs. Other orders of taking differences are equally valid. Detailed discussion of the errors is omitted.

The satellite carrier total phase can be measured with ambiguity as to the number of cycles. Let φ(i, j, k) denote the phase of the carrier of satellite j measured by receiver i at time tk. This notation shows the meaning of the subscripts i, j, and k. The receiver (r), satellite (s), and time (t) come in alphabetical order as arguments of φ, and to balance readability and conciseness, let φi,j,k = φ(i, j, k) be a concise abbreviation. Also we define three functions, Δr, Δs, Δt, which return differences between receivers, satellites, and time points, respectively; each takes as its argument a function of three integer subscripts. If F(i, j, k) is a function of the three integer arguments i, j, and k, then it is a valid argument for the functions Δr, Δs, Δt, with the values defined as

Δr(F)(i, j, k) = F(i + 1, j, k) − F(i, j, k),
Δs(F)(i, j, k) = F(i, j + 1, k) − F(i, j, k), and
Δt(F)(i, j, k) = F(i, j, k + 1) − F(i, j, k).

Also, if F and G are valid arguments for the three functions and a and b are constants, then aF + bG is a valid argument with values defined as

Δr(aF + bG) = a Δr(F) + b Δr(G),
Δs(aF + bG) = a Δs(F) + b Δs(G), and
Δt(aF + bG) = a Δt(F) + b Δt(G).

Receiver clock errors can be approximately eliminated by differencing the phases measured from satellite 1 with that from satellite 2 at the same epoch: the single difference Δs(φ). Double differencing computes the difference of receiver 1's satellite difference from that of receiver 2, which approximately eliminates satellite clock errors; this double difference is Δr(Δs(φ)). Triple differencing subtracts the receiver difference at time 1 from that at time 2. This eliminates the ambiguity associated with the integral number of wavelengths in carrier phase, provided this ambiguity does not change with time. Thus the triple difference result eliminates practically all clock bias errors and the integer ambiguity. Atmospheric delay and satellite ephemeris errors have also been significantly reduced.
Triple difference results can be used to estimate unknown variables. For example, if the position of receiver 1 is known but the position of receiver 2 unknown, it may be possible to estimate the position of receiver 2 using numerical root finding and least squares. Triple difference results for three independent time pairs may be sufficient to solve for receiver 2's three position components. This may require a numerical procedure. An approximation of receiver 2's position is required to use such a numerical method. This initial value can be provided from the navigation message and the intersection of sphere surfaces. Such a reasonable estimate can be key to successful multidimensional root finding. Iterating from three time pairs and a fairly good initial value produces one observed triple difference result for receiver 2's position. Processing additional time pairs can improve accuracy, overdetermining the answer with multiple solutions. Least squares can solve such an overdetermined system, determining the position of receiver 2 that best fits the observed triple difference results under the criterion of minimizing the sum of the squared residuals.

Regulatory spectrum issues concerning GPS receivers

In the United States, GPS receivers are regulated under the Federal Communications Commission's (FCC) Part 15 rules. As indicated in the manuals of GPS-enabled devices sold in the United States, as a Part 15 device, it "must accept any interference received, including interference that may cause undesired operation." With respect to GPS devices in particular, the FCC states that GPS receiver manufacturers "must use receivers that reasonably discriminate against reception of signals outside their allocated spectrum."
For the last 30 years, GPS receivers have operated next to the Mobile Satellite Service band, and have discriminated against reception of mobile satellite services, such as Inmarsat, without any issue. The spectrum allocated for GPS L1 use by the FCC is 1559 to 1610 MHz, while the spectrum allocated for satellite-to-ground use owned by LightSquared is the Mobile Satellite Service band. Since 1996, the FCC has authorized licensed use of the spectrum neighboring the GPS band of 1525 to 1559 MHz to the Virginia company LightSquared. On March 1, 2001, the FCC received an application from LightSquared's predecessor, Motient Services, to use their allocated frequencies for an integrated satellite-terrestrial service. In 2002, the U.S. GPS Industry Council came to an out-of-band-emissions (OOBE) agreement with LightSquared to prevent LightSquared's ground-based stations from emitting transmissions into the neighboring GPS band of 1559 to 1610 MHz. In 2004, the FCC adopted the OOBE agreement in its authorization for LightSquared to deploy a ground-based network ancillary to their satellite system, known as the Ancillary Tower Components (ATCs), stating: "We will authorize MSS ATC subject to conditions that ensure that the added terrestrial component remains ancillary to the principal MSS offering. We do not intend, nor will we permit, the terrestrial component to become a stand-alone service." This authorization was reviewed and approved by the U.S. Interdepartment Radio Advisory Committee, which includes the U.S. Department of Agriculture, U.S. Space Force, U.S. Army, U.S. Coast Guard, Federal Aviation Administration, National Aeronautics and Space Administration (NASA), U.S. Department of the Interior, and U.S. Department of Transportation.
In January 2011, the FCC conditionally authorized LightSquared's wholesale customers, such as Best Buy, Sharp, and C Spire, to purchase only an integrated satellite-ground-based service from LightSquared and re-sell that integrated service on devices that are equipped to use only the ground-based signal using LightSquared's allocated frequencies of 1525 to 1559 MHz. In December 2010, GPS receiver manufacturers expressed concerns to the FCC that LightSquared's signal would interfere with GPS receiver devices, although the FCC's policy considerations leading up to the January 2011 order did not pertain to any proposed changes to the maximum number of ground-based LightSquared stations or the maximum power at which these stations could operate. The January 2011 order made final authorization contingent upon studies of GPS interference issues carried out by a LightSquared-led working group along with GPS industry and Federal agency participation. On February 14, 2012, the FCC initiated proceedings to vacate LightSquared's Conditional Waiver Order based on the NTIA's conclusion that there was currently no practical way to mitigate potential GPS interference. GPS receiver manufacturers design GPS receivers to use spectrum beyond the GPS-allocated band. In some cases, GPS receivers are designed to use up to 400 MHz of spectrum in either direction of the L1 frequency of 1575.42 MHz, because mobile satellite services in those regions are broadcasting from space to ground, and at power levels commensurate with mobile satellite services. As regulated under the FCC's Part 15 rules, GPS receivers are not warranted protection from signals outside GPS-allocated spectrum. This is why GPS operates next to the Mobile Satellite Service band, and also why the Mobile Satellite Service band operates next to GPS. The symbiotic relationship of spectrum allocation ensures that users of both bands are able to operate cooperatively and freely.
The FCC adopted rules in February 2003 that allowed Mobile Satellite Service (MSS) licensees such as LightSquared to construct a small number of ancillary ground-based towers in their licensed spectrum to "promote more efficient use of terrestrial wireless spectrum." In those 2003 rules, the FCC stated "As a preliminary matter, terrestrial [Commercial Mobile Radio Service (“CMRS”)] and MSS ATC are expected to have different prices, coverage, product acceptance and distribution; therefore, the two services appear, at best, to be imperfect substitutes for one another that would be operating in predominantly different market segments... MSS ATC is unlikely to compete directly with terrestrial CMRS for the same customer base...". In 2004, the FCC clarified that the ground-based towers would be ancillary, noting that "We will authorize MSS ATC subject to conditions that ensure that the added terrestrial component remains ancillary to the principal MSS offering. We do not intend, nor will we permit, the terrestrial component to become a stand-alone service." In July 2010, the FCC stated that it expected LightSquared to use its authority to offer an integrated satellite-terrestrial service to "provide mobile broadband services similar to those provided by terrestrial mobile providers and enhance competition in the mobile broadband sector." GPS receiver manufacturers have argued that LightSquared's licensed spectrum of 1525 to 1559 MHz was never envisioned as being used for high-speed wireless broadband based on the 2003 and 2004 FCC ATC rulings making clear that the Ancillary Tower Component (ATC) would be, in fact, ancillary to the primary satellite component. To build public support of efforts to continue the 2004 FCC authorization of LightSquared's ancillary terrestrial component vs. a simple ground-based LTE service in the Mobile Satellite Service band, GPS receiver manufacturer Trimble Navigation Ltd. formed the "Coalition To Save Our GPS." 
The FCC and LightSquared have each made public commitments to solve the GPS interference issue before the network is allowed to operate. According to Chris Dancy of the Aircraft Owners and Pilots Association, airline pilots with the type of systems that would be affected "may go off course and not even realize it." The problems could also affect the Federal Aviation Administration upgrade to the air traffic control system, United States Defense Department guidance, and local emergency services including 911. On February 14, 2012, the FCC moved to bar LightSquared's planned national broadband network after being informed by the National Telecommunications and Information Administration (NTIA), the federal agency that coordinates spectrum uses for the military and other federal government entities, that "there is no practical way to mitigate potential interference at this time". LightSquared is challenging the FCC's action. Other systems Other notable satellite navigation systems in use or various states of development include: Beidou – a system deployed and operated by the People's Republic of China, which initiated global services in 2019. Galileo – a global system being developed by the European Union and other partner countries, which began operation in 2016, and is expected to be fully deployed by 2020. GLONASS – Russia's global navigation system. Fully operational worldwide. NavIC – a regional navigation system developed by the Indian Space Research Organisation. QZSS – a regional navigation system receivable in the Asia-Oceania regions, with a focus on Japan.
See also List of GPS satellites GPS satellite blocks GPS signals GPS navigation software GPS/INS GPS spoofing Indoor positioning system Local Area Augmentation System Local positioning system Military invention Mobile phone tracking Navigation paradox Notice Advisory to Navstar Users S-GPS Notes References Further reading Global Positioning System Open Courseware from MIT, 2012 External links FAA GPS FAQ GPS.gov – General public education website created by the U.S. Government 20th-century inventions Equipment of the United States Space Force Military equipment introduced in the 1970s
https://en.wikipedia.org/wiki/Germany
Germany
Germany (, ), officially the Federal Republic of Germany, is a country in Central Europe. It is the second most populous country in Europe after Russia, and the most populous member state of the European Union. Germany is situated between the Baltic and North seas to the north, and the Alps to the south; it covers an area of , with a population of over 83 million within its 16 constituent states. Germany borders Denmark to the north, Poland and the Czech Republic to the east, Austria and Switzerland to the south, and France, Luxembourg, Belgium, and the Netherlands to the west. The nation's capital and largest city is Berlin, and its financial centre is Frankfurt; the largest urban area is the Ruhr. Various Germanic tribes have inhabited the northern parts of modern Germany since classical antiquity. A region named Germania was documented before AD 100. In the 10th century, German territories formed a central part of the Holy Roman Empire. During the 16th century, northern German regions became the centre of the Protestant Reformation. Following the Napoleonic Wars and the dissolution of the Holy Roman Empire in 1806, the German Confederation was formed in 1815. In 1871, Germany became a nation-state when most of the German states unified into the Prussian-dominated German Empire. After World War I and the German Revolution of 1918–1919, the Empire was replaced by the semi-presidential Weimar Republic. The Nazi seizure of power in 1933 led to the establishment of a dictatorship, World War II, and the Holocaust. After the end of World War II in Europe and a period of Allied occupation, Germany was divided into the Federal Republic of Germany, generally known as West Germany, and the German Democratic Republic, East Germany. The Federal Republic of Germany was a founding member of the European Economic Community and the European Union, while the German Democratic Republic was a communist Eastern Bloc state and member of the Warsaw Pact. 
After the fall of communism, German reunification saw the former East German states join the Federal Republic of Germany on 3 October 1990—becoming a federal parliamentary republic. Germany is a great power with a strong economy; it has the largest economy in Europe, the world's fourth-largest economy by nominal GDP, and the fifth-largest by PPP. As a global leader in several industrial, scientific and technological sectors, it is both the world's third-largest exporter and importer of goods. As a developed country, which ranks very high on the Human Development Index, it offers social security and a universal health care system, environmental protections, and a tuition-free university education. Germany is a member of the United Nations, NATO, the G7, the G20, and the OECD. It has the third-greatest number of UNESCO World Heritage Sites. Etymology The English word Germany derives from the Latin , which came into use after Julius Caesar adopted it for the peoples east of the Rhine. The German term , originally ('the German lands') is derived from (cf. Dutch), descended from Old High German 'of the people' (from or 'people'), originally used to distinguish the language of the common people from Latin and its Romance descendants. This in turn descends from Proto-Germanic 'of the people' (see also the Latinised form ), derived from , descended from Proto-Indo-European * 'people', from which the word Teutons also originates. History Ancient humans were present in Germany at least 600,000 years ago. The first non-modern human fossil (the Neanderthal) was discovered in the Neander Valley. Similarly dated evidence of modern humans has been found in the Swabian Jura, including 42,000-year-old flutes which are the oldest musical instruments ever found, the 40,000-year-old Lion Man, and the 35,000-year-old Venus of Hohle Fels. The Nebra sky disk, created during the European Bronze Age, is attributed to a German site. 
Germanic tribes and Frankish Empire The Germanic tribes are thought to date from the Nordic Bronze Age or the Pre-Roman Iron Age. From southern Scandinavia and north Germany, they expanded south, east, and west, coming into contact with the Celtic, Iranian, Baltic, and Slavic tribes. Under Augustus, the Roman Empire began to invade lands inhabited by the Germanic tribes, creating a short-lived Roman province of Germania between the Rhine and Elbe rivers. In 9 AD, three Roman legions were defeated by Arminius. By 100 AD, when Tacitus wrote Germania, Germanic tribes had settled along the Rhine and the Danube (the Limes Germanicus), occupying most of modern Germany. However, Baden-Württemberg, southern Bavaria, southern Hesse and the western Rhineland had been incorporated into Roman provinces. Around 260, Germanic peoples broke into Roman-controlled lands. After the invasion of the Huns in 375, and with the decline of Rome from 395, Germanic tribes moved farther southwest: the Franks established the Frankish Kingdom and pushed east to subjugate Saxony and Bavaria, and areas of what is today eastern Germany were inhabited by Western Slavic tribes. East Francia and Holy Roman Empire Charlemagne founded the Carolingian Empire in 800; it was divided in 843 and the Holy Roman Empire emerged from the eastern portion. The territory initially known as East Francia stretched from the Rhine in the west to the Elbe River in the east and from the North Sea to the Alps. The Ottonian rulers (919–1024) consolidated several major duchies. In 996 Gregory V became the first German Pope, appointed by his cousin Otto III, whom he shortly after crowned Holy Roman Emperor. The Holy Roman Empire absorbed northern Italy and Burgundy under the Salian emperors (1024–1125), although the emperors lost power through the Investiture controversy. Under the Hohenstaufen emperors (1138–1254), German princes encouraged German settlement to the south and east ().
Members of the Hanseatic League, mostly north German towns, prospered in the expansion of trade. The population declined starting with the Great Famine in 1315, followed by the Black Death of 1348–50. The Golden Bull issued in 1356 provided the constitutional structure of the Empire and codified the election of the emperor by seven prince-electors. Johannes Gutenberg introduced moveable-type printing to Europe, laying the basis for the democratization of knowledge. In 1517, Martin Luther incited the Protestant Reformation and his translation of the Bible began the standardization of the language; the 1555 Peace of Augsburg tolerated the "Evangelical" faith (Lutheranism), but also decreed that the faith of the prince was to be the faith of his subjects (). From the Cologne War through the Thirty Years' War (1618–1648), religious conflict devastated German lands and significantly reduced the population. The Peace of Westphalia ended religious warfare among the Imperial Estates; their mostly German-speaking rulers were able to choose Roman Catholicism, Lutheranism, or the Reformed faith as their official religion. The legal system initiated by a series of Imperial Reforms (approximately 1495–1555) provided for considerable local autonomy and a stronger Imperial Diet. The House of Habsburg held the imperial crown from 1438 until the death of Charles VI in 1740. Following the War of the Austrian Succession and the Treaty of Aix-la-Chapelle, Charles VI's daughter Maria Theresa ruled as Empress Consort when her husband, Francis I, became Emperor. From 1740, dualism between the Austrian Habsburg Monarchy and the Kingdom of Prussia dominated German history. In 1772, 1793, and 1795, Prussia and Austria, along with the Russian Empire, agreed to the Partitions of Poland.
During the period of the French Revolutionary Wars, the Napoleonic era and the subsequent final meeting of the Imperial Diet, most of the Free Imperial Cities were annexed by dynastic territories; the ecclesiastical territories were secularised and annexed. In 1806 the was dissolved; France, Russia, Prussia and the Habsburgs (Austria) competed for hegemony in the German states during the Napoleonic Wars. German Confederation and Empire Following the fall of Napoleon, the Congress of Vienna founded the German Confederation, a loose league of 39 sovereign states. The appointment of the Emperor of Austria as the permanent president reflected the Congress's rejection of Prussia's rising influence. Disagreement within restoration politics partly led to the rise of liberal movements, followed by new measures of repression by Austrian statesman Klemens von Metternich. The , a tariff union, furthered economic unity. In light of revolutionary movements in Europe, intellectuals and commoners started the revolutions of 1848 in the German states, raising the German Question. King Frederick William IV of Prussia was offered the title of Emperor, but with a loss of power; he rejected the crown and the proposed constitution, a temporary setback for the movement. King William I appointed Otto von Bismarck as the Minister President of Prussia in 1862. Bismarck successfully concluded the war with Denmark in 1864; the subsequent decisive Prussian victory in the Austro-Prussian War of 1866 enabled him to create the North German Confederation which excluded Austria. After the defeat of France in the Franco-Prussian War, the German princes proclaimed the founding of the German Empire in 1871. Prussia was the dominant constituent state of the new empire; the King of Prussia ruled as its Kaiser, and Berlin became its capital. 
In the period following the unification of Germany, Bismarck's foreign policy as Chancellor of Germany secured Germany's position as a great nation by forging alliances and avoiding war. However, under Wilhelm II, Germany took an imperialistic course, leading to friction with neighbouring countries. A dual alliance was created with the multinational realm of Austria-Hungary; the Triple Alliance of 1882 included Italy. Britain, France and Russia also concluded alliances to protect against Habsburg interference with Russian interests in the Balkans or German interference against France. At the Berlin Conference in 1884, Germany claimed several colonies including German East Africa, German South West Africa, Togoland, and Kamerun. Later, Germany further expanded its colonial empire to include holdings in the Pacific and China. The colonial government in South West Africa (present-day Namibia), from 1904 to 1907, carried out the annihilation of the local Herero and Namaqua peoples as punishment for an uprising; this was the 20th century's first genocide. The assassination of Austria's crown prince on 28 June 1914 provided the pretext for Austria-Hungary to attack Serbia and trigger World War I. After four years of warfare, in which approximately two million German soldiers were killed, a general armistice ended the fighting. In the German Revolution (November 1918), Emperor Wilhelm II and the ruling princes abdicated their positions, and Germany was declared a federal republic. Germany's new leadership signed the Treaty of Versailles in 1919, accepting defeat by the Allies. Germans perceived the treaty as humiliating, which was seen by historians as influential in the rise of Adolf Hitler. Germany lost around 13% of its European territory and ceded all of its colonial possessions in Africa and the South Sea. Weimar Republic and Nazi Germany On 11 August 1919, President Friedrich Ebert signed the democratic Weimar Constitution. 
In the subsequent struggle for power, communists seized power in Bavaria, but conservative elements elsewhere attempted to overthrow the Republic in the Kapp Putsch. Street fighting in the major industrial centres, the occupation of the Ruhr by Belgian and French troops, and a period of hyperinflation followed. A debt restructuring plan and the creation of a new currency in 1924 ushered in the Golden Twenties, an era of artistic innovation and liberal cultural life. The worldwide Great Depression hit Germany in 1929. Chancellor Heinrich Brüning's government pursued a policy of fiscal austerity and deflation which caused unemployment of nearly 30% by 1932. The Nazi Party led by Adolf Hitler became the largest party in the Reichstag after a special election in 1932, and President Paul von Hindenburg appointed Hitler as Chancellor of Germany on 30 January 1933. After the Reichstag fire, a decree abrogated basic civil rights and the first Nazi concentration camp opened. The Enabling Act gave Hitler unrestricted legislative power, overriding the constitution; his government established a centralised totalitarian state, withdrew from the League of Nations, and dramatically increased the country's rearmament. A government-sponsored programme for economic renewal focused on public works, the most famous of which was the autobahn. In 1935, the regime withdrew from the Treaty of Versailles and introduced the Nuremberg Laws which targeted Jews and other minorities. Germany also reacquired control of the Saarland in 1935, remilitarised the Rhineland in 1936, annexed Austria in 1938, annexed the Sudetenland in 1938 with the Munich Agreement, and in violation of the agreement occupied Czechoslovakia in March 1939. (Night of Broken Glass) saw the burning of synagogues, the destruction of Jewish businesses, and mass arrests of Jewish people. In August 1939, Hitler's government negotiated the Molotov–Ribbentrop pact that divided Eastern Europe into German and Soviet spheres of influence.
On 1 September 1939, Germany invaded Poland, beginning World War II in Europe; Britain and France declared war on Germany on 3 September. In the spring of 1940, Germany conquered Denmark and Norway, the Netherlands, Belgium, Luxembourg, and France, forcing the French government to sign an armistice. The British repelled German air attacks in the Battle of Britain in the same year. In 1941, German troops invaded Yugoslavia, Greece and the Soviet Union. By 1942, Germany and its allies controlled most of continental Europe and North Africa, but following the Soviet victory at the Battle of Stalingrad, the allies' reconquest of North Africa and invasion of Italy in 1943, German forces suffered repeated military defeats. In 1944, the Soviets pushed into Eastern Europe; the Western allies landed in France and entered Germany despite a final German counteroffensive. Following Hitler's suicide during the Battle of Berlin, Germany surrendered on 8 May 1945, ending World War II in Europe. Following the end of the war, surviving Nazi officials were tried for war crimes at the Nuremberg trials. In what later became known as the Holocaust, the German government persecuted minorities, including interning them in concentration and death camps across Europe. In total 17 million people were systematically murdered, including 6 million Jews, at least 130,000 Romani, 275,000 disabled people, thousands of Jehovah's Witnesses, thousands of homosexuals, and hundreds of thousands of political and religious opponents. Nazi policies in German-occupied countries resulted in the deaths of an estimated 2.7 million Poles, 1.3 million Ukrainians, 1 million Belarusians and 3.5 million Soviet prisoners of war. German military casualties have been estimated at 5.3 million, and around 900,000 German civilians died. Around 12 million ethnic Germans were expelled from across Eastern Europe, and Germany lost roughly one-quarter of its pre-war territory. 
East and West Germany After Nazi Germany surrendered, the Allies partitioned Berlin and Germany's remaining territory into four occupation zones. The western sectors, controlled by France, the United Kingdom, and the United States, were merged on 23 May 1949 to form the Federal Republic of Germany (); on 7 October 1949, the Soviet Zone became the German Democratic Republic (; DDR). They were informally known as West Germany and East Germany. East Germany selected East Berlin as its capital, while West Germany chose Bonn as a provisional capital, to emphasise its stance that the two-state solution was temporary. West Germany was established as a federal parliamentary republic with a "social market economy". Starting in 1948 West Germany became a major recipient of reconstruction aid under the Marshall Plan. Konrad Adenauer was elected the first Federal Chancellor of Germany in 1949. The country enjoyed prolonged economic growth () beginning in the early 1950s. West Germany joined NATO in 1955 and was a founding member of the European Economic Community. East Germany was an Eastern Bloc state under political and military control by the USSR via occupation forces and the Warsaw Pact. Although East Germany claimed to be a democracy, political power was exercised solely by leading members () of the communist-controlled Socialist Unity Party of Germany, supported by the Stasi, an immense secret service. While East German propaganda was based on the benefits of the GDR's social programmes and the alleged threat of a West German invasion, many of its citizens looked to the West for freedom and prosperity. The Berlin Wall, built in 1961, prevented East German citizens from escaping to West Germany, becoming a symbol of the Cold War. Tensions between East and West Germany were reduced in the late 1960s by Chancellor Willy Brandt's . 
In 1989, Hungary decided to dismantle the Iron Curtain and open its border with Austria, causing the emigration of thousands of East Germans to West Germany via Hungary and Austria. This had devastating effects on the GDR, where regular mass demonstrations received increasing support. In an effort to help retain East Germany as a state, the East German authorities eased border restrictions, but this actually led to an acceleration of the reform process culminating in the Two Plus Four Treaty under which Germany regained full sovereignty. This permitted German reunification on 3 October 1990, with the accession of the five re-established states of the former GDR. The fall of the Wall in 1989 became a symbol of the Fall of Communism, the Dissolution of the Soviet Union, German Reunification and . Reunified Germany and the European Union United Germany was considered the enlarged continuation of West Germany so it retained its memberships in international organisations. Based on the Berlin/Bonn Act (1994), Berlin again became the capital of Germany, while Bonn obtained the unique status of a (federal city) retaining some federal ministries. The relocation of the government was completed in 1999, and modernisation of the east German economy was scheduled to last until 2019. Since reunification, Germany has taken a more active role in the European Union, signing the Maastricht Treaty in 1992 and the Lisbon Treaty in 2007, and co-founding the Eurozone. Germany sent a peacekeeping force to secure stability in the Balkans and sent German troops to Afghanistan as part of a NATO effort to provide security in that country after the ousting of the Taliban. In the 2005 elections, Angela Merkel became the first female chancellor. In 2009 the German government approved a €50 billion stimulus plan. 
Among the major German political projects of the early 21st century are the advancement of European integration, the energy transition () for a sustainable energy supply, the "Debt Brake" for balanced budgets, measures to increase the fertility rate (pronatalism), and high-tech strategies for the transition of the German economy, summarised as Industry 4.0. During the 2015 European migrant crisis, the country took in over a million refugees and migrants. Geography Germany is the seventh-largest country in Europe, bordering Denmark to the north, Poland and the Czech Republic to the east, Austria to the southeast, and Switzerland to the south-southwest. France, Luxembourg and Belgium are situated to the west, with the Netherlands to the northwest. Germany is also bordered by the North Sea and, at the north-northeast, by the Baltic Sea. German territory covers , consisting of of land and of water. Elevation ranges from the mountains of the Alps (highest point: the Zugspitze at ) in the south to the shores of the North Sea () in the northwest and the Baltic Sea () in the northeast. The forested uplands of central Germany and the lowlands of northern Germany (lowest point: in the municipality Neuendorf-Sachsenbande, Wilstermarsch at below sea level) are traversed by such major rivers as the Rhine, Danube and Elbe. Significant natural resources include iron ore, coal, potash, timber, lignite, uranium, copper, natural gas, salt, and nickel. Climate Most of Germany has a temperate climate, ranging from oceanic in the north to continental in the east and southeast. Winters range from cold in the Southern Alps to mild, and are generally overcast with limited precipitation, while summers can vary from hot and dry to cool and rainy. The northern regions have prevailing westerly winds that bring in moist air from the North Sea, moderating the temperature and increasing precipitation. Conversely, the southeast regions have more extreme temperatures.
From February 2019 – 2020, average monthly temperatures in Germany ranged from a low of in January 2020 to a high of in June 2019. Average monthly precipitation ranged from 30 litres per square metre in February and April 2019 to 125 litres per square metre in February 2020. Average monthly hours of sunshine ranged from 45 in November 2019 to 300 in June 2019. The highest temperature ever recorded in Germany was 42.6 °C on 25 July 2019 in Lingen and the lowest was −37.8 °C on 12 February 1929 in Wolnzach. Biodiversity The territory of Germany can be divided into five terrestrial ecoregions: Atlantic mixed forests, Baltic mixed forests, Central European mixed forests, Western European broadleaf forests, and Alps conifer and mixed forests. 51% of Germany's land area is devoted to agriculture, while 30% is forested and 14% is covered by settlements or infrastructure. Plants and animals include those generally common to Central Europe. According to the National Forest Inventory, beeches, oaks, and other deciduous trees constitute just over 40% of the forests; roughly 60% are conifers, particularly spruce and pine. There are many species of ferns, flowers, fungi, and mosses. Wild animals include roe deer, wild boar, mouflon (a subspecies of wild sheep), fox, badger, hare, and small numbers of the Eurasian beaver. The blue cornflower was once a German national symbol. The 16 national parks in Germany include the Jasmund National Park, the Vorpommern Lagoon Area National Park, the Müritz National Park, the Wadden Sea National Parks, the Harz National Park, the Hainich National Park, the Black Forest National Park, the Saxon Switzerland National Park, the Bavarian Forest National Park and the Berchtesgaden National Park. In addition, there are 17 Biosphere Reserves, and 105 nature parks. More than 400 zoos and animal parks operate in Germany. 
The Berlin Zoo, which opened in 1844, is the oldest in Germany, and claims the most comprehensive collection of species in the world. Politics Germany is a federal, parliamentary, representative democratic republic. Federal legislative power is vested in the parliament consisting of the (Federal Diet) and (Federal Council), which together form the legislative body. The is elected through direct elections using the mixed-member proportional representation system. The members of the represent and are appointed by the governments of the sixteen federated states. The German political system operates under a framework laid out in the 1949 constitution known as the (Basic Law). Amendments generally require a two-thirds majority of both the and the ; the fundamental principles of the constitution, as expressed in the articles guaranteeing human dignity, the separation of powers, the federal structure, and the rule of law, are valid in perpetuity. The president, currently Frank-Walter Steinmeier, is the head of state and invested primarily with representative responsibilities and powers. He is elected by the (federal convention), an institution consisting of the members of the and an equal number of state delegates. The second-highest official in the German order of precedence is the (President of the Bundestag), who is elected by the and responsible for overseeing the daily sessions of the body. The third-highest official and the head of government is the chancellor, who is appointed by the after being elected by the party or coalition with the most seats in the . The chancellor, currently Olaf Scholz, is the head of government and exercises executive power through his Cabinet. Since 1949, the party system has been dominated by the Christian Democratic Union and the Social Democratic Party of Germany. So far every chancellor has been a member of one of these parties. 
However, the smaller liberal Free Democratic Party and the Alliance '90/The Greens have also been junior partners in coalition governments. Since 2007, the left-wing populist party The Left has been a staple in the German Bundestag, though they have never been part of the federal government. In the 2017 German federal election, the right-wing populist Alternative for Germany gained enough votes to attain representation in the parliament for the first time. Constituent states Germany is a federal state and comprises sixteen constituent states which are collectively referred to as Länder. Each state (Land) has its own constitution, and is largely autonomous in regard to its internal organisation. Germany is divided into 401 districts (Kreise) at a municipal level; these consist of 294 rural districts and 107 urban districts. Law Germany has a civil law system based on Roman law with some references to Germanic law. The Bundesverfassungsgericht (Federal Constitutional Court) is the German supreme court responsible for constitutional matters, with power of judicial review. Germany's supreme court system is specialised: for civil and criminal cases, the highest court of appeal is the inquisitorial Federal Court of Justice, and for other affairs the courts are the Federal Labour Court, the Federal Social Court, the Federal Finance Court and the Federal Administrative Court. Criminal and private laws are codified on the national level in the Strafgesetzbuch and the Bürgerliches Gesetzbuch respectively. The German penal system seeks the rehabilitation of the criminal and the protection of the public. Except for petty crimes, which are tried before a single professional judge, and serious political crimes, all charges are tried before mixed tribunals on which lay judges (Schöffen) sit side by side with professional judges. Germany has a low murder rate, with 1.18 murders per 100,000 inhabitants. In 2018, the overall crime rate fell to its lowest since 1992. 
Foreign relations Germany has a network of 227 diplomatic missions abroad and maintains relations with more than 190 countries. Germany is a member of NATO, the OECD, the G7, the G20, the World Bank and the IMF. It has played an influential role in the European Union since its inception and has maintained a strong alliance with France and all neighbouring countries since 1990. Germany promotes the creation of a more unified European political, economic and security apparatus. The governments of Germany and the United States are close political allies. Cultural ties and economic interests have crafted a bond between the two countries resulting in Atlanticism. The development policy of Germany is an independent area of foreign policy. It is formulated by the Federal Ministry for Economic Cooperation and Development and carried out by the implementing organisations. The German government sees development policy as a joint responsibility of the international community. It was the world's second-biggest aid donor in 2019 after the United States. Military Germany's military, the Bundeswehr, is organised into the Heer (Army and special forces KSK), Marine (Navy), Luftwaffe (Air Force), Zentraler Sanitätsdienst (Joint Medical Service) and Streitkräftebasis (Joint Support Service) branches. In absolute terms, German military expenditure is the eighth-highest in the world. In 2018, military spending was at $49.5 billion, about 1.2% of the country's GDP, well below the NATO target of 2%. The Bundeswehr has a strength of 184,001 active soldiers and 80,947 civilians. Reservists are available to the armed forces and participate in defence exercises and deployments abroad. Until 2011, military service was compulsory for men at age 18, but this has been officially suspended and replaced with a voluntary service. Since 2001 women may serve in all functions of service without restriction. According to the Stockholm International Peace Research Institute, Germany was the fourth-largest exporter of major arms in the world from 2014 to 2018. 
In peacetime, the Bundeswehr is commanded by the Minister of Defence. In a state of defence, the Chancellor would become commander-in-chief of the Bundeswehr. The role of the Bundeswehr is described in the Constitution of Germany as defensive only. But after a ruling of the Federal Constitutional Court in 1994, the term "defence" has been defined to not only include protection of the borders of Germany, but also crisis reaction and conflict prevention, or more broadly as guarding the security of Germany anywhere in the world. The German military has about 3,600 troops stationed in foreign countries as part of international peacekeeping forces, including about 1,200 supporting operations against Daesh, 980 in the NATO-led Resolute Support Mission in Afghanistan, and 800 in Kosovo. Economy Germany has a social market economy with a highly skilled labour force, a low level of corruption, and a high level of innovation. It is the world's third-largest exporter and third-largest importer of goods, and has the largest economy in Europe, which is also the world's fourth-largest economy by nominal GDP and the fifth-largest by PPP. Its GDP per capita measured in purchasing power standards amounts to 121% of the EU27 average (100%). The service sector contributes approximately 69% of the total GDP, industry 31%, and agriculture 1%. The unemployment rate published by Eurostat amounts to 3.2%, which is the fourth-lowest in the EU. Germany is part of the European single market which represents more than 450 million consumers. In 2017, the country accounted for 28% of the Eurozone economy according to the International Monetary Fund. Germany introduced the common European currency, the euro, in 2002. Its monetary policy is set by the European Central Bank, which is headquartered in Frankfurt. Being home to the modern car, the automotive industry in Germany is regarded as one of the most competitive and innovative in the world, and is the fourth-largest by production. 
The top ten exports of Germany are vehicles, machinery, chemical goods, electronic products, electrical equipment, pharmaceuticals, transport equipment, basic metals, food products, and rubber and plastics. Of the world's 500 largest stock-market-listed companies measured by revenue in 2019, the Fortune Global 500, 29 are headquartered in Germany. 30 major Germany-based companies are included in the DAX, the German stock market index operated by the Frankfurt Stock Exchange. Well-known international brands include Mercedes-Benz, BMW, Volkswagen, Audi, Siemens, Allianz, Adidas, Porsche, Bosch and Deutsche Telekom. Berlin is a hub for startup companies and has become the leading location for venture capital funded firms in the European Union. Germany is recognised for its large portion of specialised small and medium enterprises, known as the Mittelstand model. These companies represent 48% of the global market leaders in their segments, labelled hidden champions. Research and development efforts form an integral part of the German economy. In 2018 Germany ranked fourth globally in terms of number of science and engineering research papers published. Germany was ranked 9th in the Global Innovation Index in 2019 and 2020. Research institutions in Germany include the Max Planck Society, the Helmholtz Association, the Fraunhofer Society and the Leibniz Association. Germany is the largest contributor to the European Space Agency. Infrastructure With its central position in Europe, Germany is a transport hub for the continent. Its road network is among the densest in Europe. The motorway (Autobahn) is widely known for having no general federally mandated speed limit for some classes of vehicles. The Intercity Express or ICE train network serves major German cities as well as destinations in neighbouring countries with speeds up to 300 km/h. The largest German airports are Frankfurt Airport and Munich Airport. 
The Port of Hamburg is one of the top twenty largest container ports in the world. Germany was the world's seventh-largest consumer of energy. The government and the nuclear power industry agreed to phase out all nuclear power plants by 2022. The country meets its power demands using 40% renewable sources. Germany is committed to the Paris Agreement and several other treaties promoting biodiversity, low emission standards, and water management. The country's household recycling rate is among the highest in the world, at around 65%. The country's greenhouse gas emissions per capita were the ninth-highest in the EU. The German energy transition (Energiewende) is the recognised move to a sustainable economy by means of energy efficiency and renewable energy. Tourism Germany is the ninth most visited country in the world, with 37.4 million visits. Berlin has become the third most visited city destination in Europe. Domestic and international travel and tourism combined directly contribute over €105.3 billion to German GDP. Including indirect and induced impacts, the industry supports 4.2 million jobs. Germany's most visited and popular landmarks include Cologne Cathedral, the Brandenburg Gate, the Reichstag, the Dresden Frauenkirche, Neuschwanstein Castle, Heidelberg Castle, the Wartburg, and Sanssouci Palace. The Europa-Park near Freiburg is Europe's second most popular theme park resort. Demographics With a population of 80.2 million according to the 2011 German Census, rising to 83.1 million, Germany is the most populous country in the European Union, the second most populous country in Europe after Russia, and the nineteenth most populous country in the world. Its population density stands at 227 inhabitants per square kilometre (588 per square mile). The overall life expectancy in Germany at birth is 80.19 years (77.93 years for males and 82.58 years for females). 
The fertility rate of 1.41 children born per woman (2011 estimates) is below the replacement rate of 2.1 and is one of the lowest fertility rates in the world. Since the 1970s, Germany's death rate has exceeded its birth rate. However, Germany has witnessed increased birth and migration rates since the beginning of the 2010s. Germany has the third-oldest population in the world, with an average age of 47.4 years. Four sizeable groups of people are referred to as "national minorities" because their ancestors have lived in their respective regions for centuries: there is a Danish minority in the northernmost state of Schleswig-Holstein; the Sorbs, a Slavic population, are in the Lusatia region of Saxony and Brandenburg; the Roma and Sinti live throughout the country; and the Frisians are concentrated in Schleswig-Holstein's western coast and in the north-western part of Lower Saxony. After the United States, Germany is the second most popular immigration destination in the world. The majority of migrants live in western Germany, in particular in urban areas. Of the country's residents, 18.6 million people (22.5%) were of immigrant or partially immigrant descent in 2016 (including persons descending or partially descending from ethnic German repatriates). In 2015, the Population Division of the United Nations Department of Economic and Social Affairs listed Germany as host to the second-highest number of international migrants worldwide, about 5% or 12 million of all 244 million migrants. Germany ranks seventh amongst EU countries in terms of the percentage of migrants in the country's population, at 13.1%. Germany has a number of large cities. There are 11 officially recognised metropolitan regions. The country's largest city is Berlin, while its largest urban area is the Ruhr. Religion According to the 2011 census, Christianity was the largest religion in Germany, with 66.8% of respondents identifying as Christian, of which 3.8% were not church members. 
31.7% declared themselves as Protestants, including members of the Evangelical Church in Germany (which encompasses Lutheran, Reformed, and administrative or confessional unions of both traditions) and the free churches (Freikirchen); 31.2% declared themselves as Roman Catholics, and Orthodox believers constituted 1.3%. According to data from 2016, the Catholic Church and the Evangelical Church claimed 28.5% and 27.5%, respectively, of the population. Islam is the second-largest religion in the country. In the 2011 census, 1.9% of respondents (1.52 million people) gave their religion as Islam, but this figure is deemed unreliable because a disproportionate number of adherents of this faith (and other religions, such as Judaism) are likely to have made use of their right not to answer the question. Most of the Muslims are Sunnis and Alevites from Turkey, but there are small numbers of Shi'ites, Ahmadiyyas and other denominations. Other religions comprise less than one percent of Germany's population. A study in 2018 estimated that 38% of the population are not members of any religious organisation or denomination, though up to a third may still consider themselves religious. Irreligion in Germany is strongest in the former East Germany, which used to be predominantly Protestant before the enforcement of state atheism, and in major metropolitan areas. Languages German is the official and predominant spoken language in Germany. It is one of 24 official and working languages of the European Union, and one of the three procedural languages of the European Commission. German is the most widely spoken first language in the European Union, with around 100 million native speakers. Recognised native minority languages in Germany are Danish, Low German, Low Rhenish, Sorbian, Romany, North Frisian and Saterland Frisian; they are officially protected by the European Charter for Regional or Minority Languages. 
The most used immigrant languages are Turkish, Arabic, Kurdish, Polish, the Balkan languages and Russian. Germans are typically multilingual: 67% of German citizens claim to be able to communicate in at least one foreign language and 27% in at least two. Education Responsibility for educational supervision in Germany is primarily organised within the individual states. Optional kindergarten education is provided for all children between three and six years old, after which school attendance is compulsory for at least nine years. Primary education usually lasts for four to six years. Secondary schooling is divided into tracks based on whether students pursue academic or vocational education. A system of apprenticeship called Duale Ausbildung (dual training) leads to a skilled qualification which is almost comparable to an academic degree. It allows students in vocational training to learn in a company as well as in a state-run trade school. This model is well regarded and reproduced all around the world. Most of the German universities are public institutions, and students traditionally study without fee payment. The general requirement for university is the Abitur. According to an OECD report in 2014, Germany is the world's third leading destination for international study. The established universities in Germany include some of the oldest in the world, with Heidelberg University (established in 1386) being the oldest. The Humboldt University of Berlin, founded in 1810 by the liberal educational reformer Wilhelm von Humboldt, became the academic model for many Western universities. In the contemporary era Germany has developed eleven Universities of Excellence. Health Germany's system of hospitals, called Krankenhäuser, dates from medieval times, and today, Germany has the world's oldest universal health care system, dating from Bismarck's social legislation of the 1880s. Since the 1880s, reforms and provisions have ensured a balanced health care system. 
The population is covered by a health insurance plan provided by statute, with criteria allowing some groups to opt for a private health insurance contract. According to the World Health Organization, Germany's health care system was 77% government-funded and 23% privately funded. In 2014, Germany spent 11.3% of its GDP on health care. Germany ranked 20th in the world in 2013 in life expectancy with 77 years for men and 82 years for women, and it had a very low infant mortality rate (4 per 1,000 live births). The principal cause of death was cardiovascular disease, at 37%. Obesity in Germany has been increasingly cited as a major health issue. A 2014 study showed that 52 percent of the adult German population was overweight or obese. Culture Culture in German states has been shaped by major intellectual and popular currents in Europe, both religious and secular. Historically, Germany has been called Das Land der Dichter und Denker ('the land of poets and thinkers'), because of the major role its scientists, writers and philosophers have played in the development of Western thought. A global opinion poll for the BBC revealed that Germany was recognised for having the most positive influence in the world in 2013 and 2014. Germany is well known for such folk festival traditions as Oktoberfest and Christmas customs, which include Advent wreaths, Christmas pageants, Christmas trees, Stollen cakes, and other practices. UNESCO inscribed 41 properties in Germany on the World Heritage List. There are a number of public holidays in Germany determined by each state; 3 October has been a national day of Germany since 1990, celebrated as the Tag der Deutschen Einheit (German Unity Day). Music German classical music includes works by some of the world's most well-known composers. Dieterich Buxtehude, Johann Sebastian Bach and Georg Friedrich Händel were influential composers of the Baroque period. Ludwig van Beethoven was a crucial figure in the transition between the Classical and Romantic eras. 
Carl Maria von Weber, Felix Mendelssohn, Robert Schumann and Johannes Brahms were significant Romantic composers. Richard Wagner was known for his operas. Richard Strauss was a leading composer of the late Romantic and early modern eras. Karlheinz Stockhausen and Wolfgang Rihm are important composers of the 20th and early 21st centuries. As of 2013, Germany was the second-largest music market in Europe, and fourth-largest in the world. German popular music of the 20th and 21st centuries includes the movements of Neue Deutsche Welle, pop, Ostrock, heavy metal/rock, punk, pop rock, indie, Volksmusik (folk music), schlager pop and German hip hop. German electronic music gained global influence, with Kraftwerk and Tangerine Dream pioneering in this genre. DJs and artists of the techno and house music scenes of Germany have become well known (e.g. Paul van Dyk, Felix Jaehn, Paul Kalkbrenner, Robin Schulz and Scooter). Art and design German painters have influenced Western art. Albrecht Dürer, Hans Holbein the Younger, Matthias Grünewald and Lucas Cranach the Elder were important German artists of the Renaissance, Johann Baptist Zimmermann of the Baroque, Caspar David Friedrich and Carl Spitzweg of Romanticism, Max Liebermann of Impressionism and Max Ernst of Surrealism. Several German art groups formed in the 20th century; Die Brücke (The Bridge) and Der Blaue Reiter (The Blue Rider) influenced the development of expressionism in Munich and Berlin. The New Objectivity arose in response to expressionism during the Weimar Republic. After World War II, broad trends in German art include neo-expressionism and the New Leipzig School. Architectural contributions from Germany include the Carolingian and Ottonian styles, which were precursors of Romanesque. Brick Gothic is a distinctive medieval style that evolved in Germany. Also in Renaissance and Baroque art, regional and typically German elements evolved (e.g. Weser Renaissance). 
Vernacular architecture in Germany is often identified by its timber framing (Fachwerk) traditions and varies across regions and among carpentry styles. When industrialisation spread across Europe, classicism and a distinctive style of historicism developed in Germany, sometimes referred to as Gründerzeit style. Expressionist architecture developed in the 1910s in Germany and influenced Art Deco and other modern styles. Germany was particularly important in the early modernist movement: it is the home of the Werkbund initiated by Hermann Muthesius (New Objectivity), and of the Bauhaus movement founded by Walter Gropius. Ludwig Mies van der Rohe became one of the world's most renowned architects in the second half of the 20th century; he conceived of the glass façade skyscraper. Renowned contemporary architects and offices include Pritzker Prize winners Gottfried Böhm and Frei Otto. German designers became early leaders of modern product design. The Berlin Fashion Week and the fashion trade fair Bread & Butter are held twice a year. Literature and philosophy German literature can be traced back to the Middle Ages and the works of writers such as Walther von der Vogelweide and Wolfram von Eschenbach. Well-known German authors include Johann Wolfgang von Goethe, Friedrich Schiller, Gotthold Ephraim Lessing and Theodor Fontane. The collections of folk tales published by the Brothers Grimm popularised German folklore on an international level. The Grimms also gathered and codified regional variants of the German language, grounding their work in historical principles; their Deutsches Wörterbuch, or German Dictionary, sometimes called the Grimm dictionary, was begun in 1838 and the first volumes published in 1854. Influential authors of the 20th century include Gerhart Hauptmann, Thomas Mann, Hermann Hesse, Heinrich Böll and Günter Grass. The German book market is the third-largest in the world, after the United States and China. 
The Frankfurt Book Fair is the most important in the world for international deals and trading, with a tradition spanning over 500 years. The Leipzig Book Fair also retains a major position in Europe. German philosophy is historically significant: Gottfried Leibniz's contributions to rationalism; the enlightenment philosophy of Immanuel Kant; the establishment of classical German idealism by Johann Gottlieb Fichte, Georg Wilhelm Friedrich Hegel and Friedrich Wilhelm Joseph Schelling; Arthur Schopenhauer's composition of metaphysical pessimism; the formulation of communist theory by Karl Marx and Friedrich Engels; Friedrich Nietzsche's development of perspectivism; Gottlob Frege's contributions to the dawn of analytic philosophy; Martin Heidegger's works on Being; Oswald Spengler's historical philosophy; and the development of the Frankfurt School have all been particularly influential. Media The largest internationally operating media companies in Germany are the Bertelsmann enterprise, Axel Springer SE and ProSiebenSat.1 Media. Germany's television market is the largest in Europe, with some 38 million TV households. Around 90% of German households have cable or satellite TV, with a variety of free-to-view public and commercial channels. There are more than 300 public and private radio stations in Germany; Germany's national radio network is the Deutschlandradio, and the public Deutsche Welle is the main German radio and television broadcaster in foreign languages. Germany's print market of newspapers and magazines is the largest in Europe. The papers with the highest circulation are , , and . The largest magazines include and . Germany has a large video gaming market, with over 34 million players nationwide. German cinema has made major technical and artistic contributions to film. The first works of the Skladanowsky Brothers were shown to an audience in 1895. 
The renowned Babelsberg Studio in Potsdam was established in 1912, making it the first large-scale film studio in the world. Early German cinema was particularly influential, with German expressionists such as Robert Wiene and Friedrich Wilhelm Murnau. Director Fritz Lang's Metropolis (1927) is referred to as the first major science-fiction film. After 1945, many of the films of the immediate post-war period can be characterised as Trümmerfilm (rubble film). East German film was dominated by state-owned film studio DEFA, while the dominant genre in West Germany was the Heimatfilm ("homeland film"). During the 1970s and 1980s, New German Cinema directors such as Volker Schlöndorff, Werner Herzog, Wim Wenders, and Rainer Werner Fassbinder brought West German auteur cinema to critical acclaim. The Academy Award for Best Foreign Language Film ("Oscar") went to the German production The Tin Drum (Die Blechtrommel) in 1979, to Nowhere in Africa (Nirgendwo in Afrika) in 2002, and to The Lives of Others (Das Leben der Anderen) in 2007. Various Germans have won an Oscar for their performances in other films. The European Film Awards ceremony is held every other year in Berlin, home of the European Film Academy. The Berlin International Film Festival, known as the "Berlinale", awarding the "Golden Bear" and held annually since 1951, is one of the world's leading film festivals. The "Lolas" are awarded annually in Berlin at the German Film Awards. Cuisine German cuisine varies from region to region, and often neighbouring regions share some culinary similarities (e.g. the southern regions of Bavaria and Swabia share some traditions with Switzerland and Austria). International varieties such as pizza, sushi, Chinese food, Greek food, Indian cuisine and doner kebab are also popular. Bread is a significant part of German cuisine, and German bakeries produce about 600 main types of bread and 1,200 types of pastries and rolls (Brötchen). German cheeses account for about 22% of all cheese produced in Europe. 
In 2012 over 99% of all meat produced in Germany was either pork, chicken or beef. Germans produce their ubiquitous sausages in almost 1,500 varieties, including Bratwursts and Weisswursts. The national alcoholic drink is beer. As of 2013, German beer consumption per person remained among the highest in the world. German beer purity regulations date back to the 16th century. Wine is becoming more popular in many parts of the country, especially close to German wine regions. In 2019, Germany was the ninth-largest wine producer in the world. The 2018 Michelin Guide awarded eleven restaurants in Germany three stars, giving the country a cumulative total of 300 stars. Sports Football is the most popular sport in Germany. With more than 7 million official members, the German Football Association (Deutscher Fußball-Bund) is the largest single-sport organisation worldwide, and the German top league, the Bundesliga, attracts the second-highest average attendance of all professional sports leagues in the world. The German men's national football team won the FIFA World Cup in 1954, 1974, 1990, and 2014, the UEFA European Championship in 1972, 1980 and 1996, and the FIFA Confederations Cup in 2017. Germany is one of the leading motor sports countries in the world. Constructors like BMW and Mercedes are prominent manufacturers in motor sport. Porsche has won the 24 Hours of Le Mans race 19 times, and Audi 13 times. The driver Michael Schumacher has set many motor sport records during his career, having won seven Formula One World Drivers' Championships. Sebastian Vettel is also among the top five most successful Formula One drivers of all time. Historically, German athletes have been successful contenders in the Olympic Games, ranking third in an all-time Olympic Games medal count (when combining East and West German medals). 
Germany was the last country to host both the summer and winter games in the same year, in 1936: the Berlin Summer Games and the Winter Games in Garmisch-Partenkirchen. Munich hosted the Summer Games of 1972. See also Index of Germany-related articles Outline of Germany Notes References Sources External links Official site of the Federal Government Official tourism website Germany from the BBC News Germany. The World Factbook. Central Intelligence Agency. Germany from the OECD Germany at the EU Articles containing video clips Central European countries Countries in Europe Federal republics G7 nations G20 nations German-speaking countries and territories Member states of NATO Member states of the European Union Member states of the Union for the Mediterranean Current member states of the United Nations States and territories established in 1871 States and territories established in 1949 States and territories established in 1990 Western European countries
Guatemala City
Guatemala City, locally known as Guatemala or Guate, officially Ciudad de Guatemala (art. 231 of the Political Constitution of the Republic of Guatemala), is the capital and largest city of Guatemala, and the most populous urban area in Central America. The city is located in the south-central part of the country, nestled in a mountain valley called Valle de la Ermita (Hermitage Valley). The city is the capital of the Municipality of Guatemala and of the Guatemala Department. Guatemala City is the site of the Mayan city of Kaminaljuyu, founded around 1500 BC. Following the Spanish conquest, a new town was established, and in 1776 it was made capital of the Kingdom of Guatemala. In 1821, Guatemala City was the scene of the declaration of independence of Central America from Spain, after which it became the capital of the newly established United Provinces of Central America (later the Federal Republic of Central America). In 1847, Guatemala declared itself an independent republic, with Guatemala City as its capital. The city was originally located in what is now Antigua Guatemala, and was moved to its current location in 1777. Guatemala City and the original location in Antigua Guatemala were almost completely destroyed by the 1917–18 earthquakes. Reconstructions following the earthquakes have resulted in a more modern architectural landscape. Today, Guatemala City is the political, cultural, and economic center of Guatemala. History Early history Human settlement on the present site of Guatemala City began with the Maya, who built a large ceremonial center at Kaminaljuyu. This large Maya settlement, the biggest outside the Maya lowlands in the Yucatan Peninsula, rose to prominence around 300 BC due to an increase in mining and trading of obsidian, a valuable commodity for pre-Columbian civilizations in Mesoamerica. Kaminaljuyu then collapsed for unknown reasons around 300 AD. 
During the Spanish conquest of Guatemala, settlers coming after the conquistador Pedro de Alvarado established a small town about 1 km south of the old ruins of Kaminaljuyu. This small town was made the capital city of the Captaincy General of Guatemala by the Spanish royal authorities in 1775 after a series of devastating earthquakes had left the old capital city, Antigua Guatemala, in ruins and unusable to the Spanish colonial authorities. During this period the central plaza, the Cathedral of Guatemala City and the Palace of the Captain-General were constructed. After Central American independence from Spain the city became the capital of the United Provinces of Central America in 1821. The 19th century saw the construction of the monumental Carrera Theater in the 1850s, and the modern-day Presidential Palace in the 1890s. At this time the city was expanding around the 30 de Junio Boulevard and elsewhere, displacing native settlements on the peripheries of the growing city. Earthquakes in 1917–1918 destroyed many historic structures. Under President Jorge Ubico in the 1930s a hippodrome and many new public buildings were constructed, although slums that had formed after the 1917–1918 earthquakes continued to lack basic amenities. During the Guatemalan Civil War, terror attacks beginning with the burning of the Spanish Embassy in 1980 led to widespread political repression and loss of life in the city. Guatemala City continues to be subject to natural disasters, with the latest being the two disasters that struck in May 2010: the eruption of the Pacaya volcano and, two days later, the torrential downpours from Tropical Storm Agatha. Contemporary history Guatemala City serves as the economic, governmental, and cultural epicenter of the nation of Guatemala. 
The city also functions as Guatemala's main transportation hub, hosting an international airport, La Aurora International Airport, and serving as the origination or end points for most of Guatemala's major highways. The city, with its robust economy, attracts hundreds of thousands of rural migrants from Guatemala's interior hinterlands and serves as the main entry point for most foreign immigrants seeking to settle in Guatemala. In addition to a wide variety of restaurants, hotels, shops, and a modern BRT transport system (Transmetro), the city is home to many art galleries, theaters, sports venues and museums (including some fine collections of Pre-Columbian art) and provides a growing number of cultural offerings. Guatemala City not only possesses a history and culture unique to the Central American region, it also furnishes all the modern amenities of a world-class city, ranging from an IMAX Theater to the Ícaro film festival (Festival Ícaro), where independent films produced in Guatemala and Central America are debuted. Structure and growth Guatemala City is located in the mountainous regions of the country, between the Pacific coastal plain to the south and the northern lowlands of the Peten region. The city's metropolitan area has recently grown very rapidly and has absorbed most of the neighboring municipalities of Villa Nueva, San Miguel Petapa, Mixco, San Juan Sacatepequez, San José Pinula, Santa Catarina Pinula, Fraijanes, San Pedro Ayampuc, Amatitlán, Villa Canales, Palencia and Chinautla, forming what is now known as the Guatemala City Metropolitan Area. The city is subdivided into 22 zones ("Zonas") designed by the urban engineer Raúl Aguilar Batres, each one with its own streets ("Calles"), avenues ("Avenidas") and sometimes "Diagonal" streets, making it relatively easy to find addresses in the city. Zones are numbered 1–25, with Zones 20, 22 and 23 not existing as they would have fallen in two other municipalities' territory. 
Addresses are assigned according to the street or avenue number, followed by a dash and the number of meters the building is away from the intersection. For example, the INGUAT office at "7a Av. 1-17, Zona 4" is a building located on Avenida 7, 17 meters from the intersection with Calle 1, toward Calle 2, in Zone 4. 7a Av. 1-17, Zona 4, and 7a Av. 1-17, Zona 10, are therefore two radically different addresses. Short streets and avenues do not get a new sequence number; for example, 6A Calle is a short street lying between 6a Calle and 7a Calle. Some "Avenidas" or "Calles" have a name in addition to their number, particularly if they are very wide; for example, Avenida la Reforma is the avenue that separates Zones 9 and 10, and Calle Montúfar is Calle 12 in Zone 9. Calle 1, Avenida 1, Zona 1 is the center of every city in Guatemala. Zone One is the Historic Center (Centro Histórico), lying in the very heart of the city, the location of many important historic buildings including the Palacio Nacional de la Cultura (National Palace of Culture), the Metropolitan Cathedral, the National Congress, the Casa Presidencial (Presidential House), the National Library and Plaza de la Constitución (Constitution Plaza, old Central Park). Efforts to revitalize this important part of the city have been undertaken by the municipal government. Besides the parks, the city offers a portfolio of entertainment in the region, focused on the so-called Zona Viva and the Calzada Roosevelt, as well as Cuatro Grados Norte ("Four Degrees North"). Casino activity is considerable, with several casinos located in different parts of the Zona Viva. The area around the East market is being redeveloped.
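The address scheme described above is regular enough to parse mechanically. The following is a minimal sketch, assuming the simplified format "<thoroughfare> <cross street>-<metres>, Zona <n>"; the helper name and regular expression are illustrative, not any official convention:

```python
import re

# Illustrative pattern for addresses like "7a Av. 1-17, Zona 4":
# the building sits <metres> metres from the intersection with the
# numbered cross street, inside the given zone.
ADDRESS_RE = re.compile(
    r"(?P<thoroughfare>\d+[aA]?\.?\s*(?:Avenida|Av|Calle)\.?)\s*"
    r"(?P<cross>\d+)-(?P<metres>\d+),\s*Zona\s*(?P<zone>\d+)"
)

def parse_address(address: str) -> dict:
    """Split an address such as '7a Av. 1-17, Zona 4' into its parts."""
    m = ADDRESS_RE.search(address)
    if m is None:
        raise ValueError(f"unrecognised address: {address!r}")
    return {
        "thoroughfare": m.group("thoroughfare").strip(),
        "cross_street": int(m.group("cross")),
        "metres_from_intersection": int(m.group("metres")),
        "zone": int(m.group("zone")),
    }

# The INGUAT example from the text: Avenida 7, 17 m past Calle 1, Zone 4.
print(parse_address("7a Av. 1-17, Zona 4"))
```

As the text notes, the same street-and-metres pair in a different zone is a completely different place, which is why the zone must be carried along with the rest of the address.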
Within the financial district are the tallest buildings in the country, including Club Premier, Tinttorento, the Atlantis building, Atrium, Tikal Futura, the Building of Finances, Towers Building Batteries, Torres Botticelli, Tadeus, the INTECAP building, Royal Towers, Towers Geminis, the Industrial Bank towers, the Holiday Inn Hotel and Premier of the Americas, among many others, used for offices, apartments and other purposes. Also included are projects such as Zona Pradera and Interamerica's World Financial Center. One of the most outstanding mayors was the engineer Martin Prado Vélez, who took over in 1949 and governed the city during the administrations of the reformist presidents Juan José Arévalo and Jacobo Arbenz Guzman, although he was not a member of the ruling party at the time and was elected due to his well-known capabilities. Of cobanero origin and married to Marta Cobos, he studied at the University of San Carlos; under his tenure, among other modernist works of the city, infrastructure projects included the El Incienso bridge, the construction of Roosevelt Avenue (the main east–west road axis of the city), the town hall building, and numerous road works which widened the colonial city, ordered it along the cardinal points, and produced a ring road with the first cloverleaf interchange in the city. In an attempt to control the rapid growth of the city, the municipal government (Municipalidad de Guatemala), headed by longtime mayor Álvaro Arzú, has implemented a plan to focus growth along important arterial roads and to apply transit-oriented development (TOD) characteristics. This plan, denominated POT (Plan de Ordenamiento Territorial), aims to allow taller mixed-use buildings to be built next to large arterial roads, with permitted height and density gradually declining farther from them. Because the airport lies in the south of the city, height limits based on aeronautical considerations have also been applied to the construction code.
This limits the maximum height for a building, at in Zone 10, up to in Zone 1. Climate Despite its location in the tropics, Guatemala City's relatively high altitude moderates average temperatures. The city has a tropical savanna climate (Köppen Aw) bordering on a subtropical highland climate (Cwb). Guatemala City is generally very warm, almost springlike, throughout the course of the year. It occasionally gets hot during the dry season, but not as hot and humid as in Central American cities at sea level. The hottest month is April. The rainy season extends from May to October, coinciding with the tropical storm and hurricane season in the western Atlantic Ocean and Caribbean Sea, while the dry season extends from November to April. The city can at times be windy, which also leads to lower ambient temperatures. The city's average annual temperature ranges are during the day and at night; its average relative humidity is 82% in the morning and 58% in the evening; and its average dew point is . Volcanic activity Four stratovolcanoes are visible from the city, two of them active. The nearest and most active is Pacaya, which at times erupts a considerable amount of ash. These volcanoes lie to the south of the Valle de la Ermita, providing a natural barrier between Guatemala City and the Pacific lowlands that define the southern regions of Guatemala. Agua, Fuego, Pacaya and Acatenango comprise a line of 33 stratovolcanoes that stretches across the breadth of Guatemala, from the Salvadorian border to the Mexican border. Earthquakes Lying on the Ring of Fire, the Guatemalan highlands and the Valle de la Ermita are frequently shaken by large earthquakes. The last large tremor to hit the Guatemala City region occurred in 1976, on the Motagua Fault, a left-lateral strike-slip fault that forms the boundary between the Caribbean Plate and the North American Plate. The 1976 event registered 7.5 on the moment magnitude scale.
Smaller, less severe tremors are frequently felt in Guatemala City and environs. Mudslides Torrential downpours, similar to the more famous monsoons, occur frequently in the Valle de la Ermita during the rainy season, leading to flash floods that sometimes inundate the city. Due to these heavy rainfalls, some of the slums perched on the steep edges of the canyons that criss-cross the Valle de la Ermita are washed away and buried under mudslides, as in October 2005. Tropical waves, tropical storms and hurricanes sometimes strike the Guatemalan highlands, which also bring torrential rains to the Guatemala City region and trigger these deadly mudslides. Piping pseudokarst In February 2007, a very large, deep circular hole with vertical walls opened in northeastern Guatemala City, killing five people. This sinkhole, which is classified by geologists as either a "piping feature" or "piping pseudokarst", was deep, and apparently was created by fluid from a sewer eroding the loose volcanic ash, limestone, and other pyroclastic deposits that underlie Guatemala City. As a result, one thousand people were evacuated from the area. This piping feature has since been mitigated by City Hall through proper maintenance of the sewerage collection system, and plans to develop the site have been proposed. However, critics believe municipal authorities have neglected needed maintenance on the city's aging sewerage system, and have speculated that more dangerous piping features are likely to develop unless action is taken. Three years later, the 2010 Guatemala City sinkhole opened. Demographics It is estimated that the population of Guatemala City proper is about 1 million, while its urban area is almost 3 million. The growth of the city's population has been robust, abetted by the mass migration of Guatemalans from the rural hinterlands to the largest and most vibrant regional economy in Guatemala.
The inhabitants of Guatemala City are incredibly diverse given the size of the city, with those of Spanish and Mestizo descent being the most numerous. Guatemala City also has sizable indigenous populations, divided among the 23 distinct Mayan groups present in Guatemala. The numerous Mayan languages are now spoken in certain quarters of Guatemala City, making the city a linguistically rich area. Foreigners and foreign immigrants comprise the final distinct group of Guatemala City inhabitants, representing a very small minority among the city's denizens. Due to mass migration from impoverished rural districts wracked with political instability, Guatemala City's population has exploded since the 1970s, severely straining the existing bureaucratic and physical infrastructure of the city. As a result, chronic traffic congestion, shortages of safe potable water in some areas of the city, and a sudden and prolonged surge in crime have become perennial problems. The infrastructure, although continuing to grow and improve in some areas, is lagging in relation to the increasing population of rural migrants, who tend to be poorer. Communications Guatemala City is headquarters to many communications and telecom companies, among them Tigo, Claro-Telgua, and Movistar-Telefónica. These companies also offer cable television, internet services and telephone access. Due to Guatemala City's large and concentrated consumer base in comparison to the rest of the country, these telecom and communications companies provide most of their services and offerings within the confines of the city. There are also seven local television channels, in addition to numerous international channels. The international channels range from children's programming, like Nickelodeon and the Disney Channel, to more adult offerings, such as E! and HBO. While international programming is dominated by entertainment from the United States, domestic programming is dominated by shows from Mexico. 
Due to its small and relatively income-restricted domestic market, Guatemala City produces very little in the way of its own programming outside of local news and sports. Economy and finance Guatemala City, as the capital, is home to Guatemala's central bank, from which Guatemala's monetary and fiscal policies are formulated and promulgated. Guatemala City is also headquarters to numerous regional private banks, among them CitiBank, Banco Agromercantil, Banco Promerica, Banco Industrial, Banco GyT Continental, Banco de Antigua, Banco Reformador, Banrural, Grupo Financiero de Occidente, BAC Credomatic, and Banco Internacional. By far the richest and most powerful regional economy within Guatemala, Guatemala City is the largest market for goods and services, which provides the greatest number of investment opportunities for public and private investors in all of Guatemala. Financing for these investments is provided by the regional private banks, as well as through foreign direct investment mostly coming from the United States. Guatemala City's ample consumer base and service sector are represented by the large department store chains present in the city, among them Siman, Hiper Paiz & Paiz (Walmart), Price Smart, ClubCo, Cemaco, Sears and Office Depot. Places of interest by zones Guatemala City is divided into 22 zones in accordance with the urban layout plan designed by Raúl Aguilar Batres. Each zone has its own streets and avenues, facilitating navigation within the city. Zones are numbered 1 through 25, although numbers 20, 22 and 23 have never been assigned, so those zones do not exist within the city proper. Transportation Renovated and expanded, La Aurora International Airport lies to the south of the city center. La Aurora serves as Guatemala's principal air hub. Public transport is provided by buses and supplemented by a BRT system. The three main highways that bisect and serve Guatemala start in the city.
(CA9, the Transoceanic Highway, from Puerto San Jose to Puerto Santo Tomas de Castilla; CA1, the Panamerican Highway, from the Mexican border to the Salvadorian border; and the highway to Peten.) Construction of freeways and underpasses by the municipal government, the implementation of reversible lanes during peak rush-hour traffic, as well as the establishment of the Department of Metropolitan Transit Police (PMT), has helped improve traffic flow in the city. Despite these municipal efforts, the Guatemala City metropolitan area still faces growing traffic congestion. A BRT (bus rapid transit) system called Transmetro, consisting of special-purpose lanes for high-capacity buses, began operating in 2007 and aimed to improve traffic flow in the city through the implementation of an efficient mass transit system. The system consists of five lines. It is expected to be expanded to around ten lines, with some over-capacity lines being considered for conversion to light metro or heavy metro. Traditional buses are now required to discharge passengers at transfer stations at the city's edge, where passengers board the Transmetro. This is being implemented as new Transmetro lines become established. In conjunction with the new mass transit implementation in the city, there is also a prepaid bus card system called Transurbano that is being implemented in the metro area to limit cash handling for the transportation system. A new fleet of buses tailored for this system has been purchased from a Brazilian firm. A light rail line known as Metro Riel is proposed. Universities and schools Guatemala City is home to ten universities, among them the oldest institution of higher education in Central America, the University of San Carlos of Guatemala. Founded in 1676, the Universidad de San Carlos is older than all North American universities except for Harvard University.
The other nine institutions of higher education to be found in Guatemala City include the Universidad Mariano Gálvez, the Universidad Panamericana, the Universidad Mesoamericana, the Universidad Rafael Landivar, the Universidad Francisco Marroquín, the Universidad del Valle, the Universidad del Istmo, Universidad Galileo, Universidad da Vinci and the Universidad Rural. Whereas these nine universities are private, the Universidad de San Carlos remains the only public institution of higher learning. Sports Guatemala City possesses several sports grounds and is home to many sports clubs. Football is the most popular sport, with CSD Municipal, Aurora F.C. and Comunicaciones being the main clubs. The Estadio Mateo Flores, located in Zone 5 of the city, is the largest stadium in the country, followed in capacity by the Estadio Cementos Progreso, the Estadio del Ejército and the Estadio El Trébol. An important multi-functional hall is the Domo Polideportivo de la CDAG. The city has hosted several promotional functions and some international sports events: in 1950 it hosted the VI Central American and Caribbean Games, and in 2000 the FIFA Futsal World Championship. On 4 July 2007 the International Olympic Committee gathered in Guatemala City and selected Sochi as the host of the 2014 Winter Olympics and Paralympics. In April 2010, it hosted the XIVth Pan-American Mountain Bike Championships. Guatemala City hosted the 2008 edition of the CONCACAF Futsal Championship, played at the Domo Polideportivo from 2 to 8 June 2008.
Panoramic views of Guatemala City 1875 2020 International relations International organizations with headquarters in Guatemala City Central American Parliament Twin towns – sister cities Guatemala City is twinned with: Notable residents Raúl Aguilar Batres, engineer, creator of Guatemala City's system of avenue/street notation María Dolores Bedoya, Central American independence activist Alejandro Giammattei, President of Guatemala Miguel Ángel Asturias, writer and diplomat, Nobel Prize laureate Ricardo Arjona, singer/songwriter Manuel Colom Argueta, former mayor of Guatemala City and politician Toti Fernández, triathlete and ultramarathon runner Juan José Gutiérrez, CEO of Pollo Campero and member of the board of directors of Corporación Multi Inversiones; featured on the cover of Newsweek as a "Super CEO" and named one of the Ten Big Thinkers for Big Business Ted Hendricks, Oakland Raiders NFL Hall of Fame linebacker and four-time Super Bowl champion Jorge de León, performance artist Carlos Mérida, painter Jimmy Morales, former President of Guatemala Gaby Moreno, singer/songwriter Carlos Peña, singer, winner of Latin American Idol 2007 Georgina Pontaza, actress and artistic director of the Teatro Abril and Teatro Fantasía Fernando Quevedo, theoretical physicist, professor of high energy physics at the University of Cambridge Rodolfo Robles, physician, discoverer of onchocercosis ("Robles' disease") Fabiola Rodas, winner of the third TV Azteca Desafío de Estrellas and second place in the last generation of La Academia Gabriela Asturias Ruiz, neuroscientist Carlos Ruíz, football/soccer player Shery, singer/songwriter Jaime Viñals, mountaineer who has scaled the seven highest peaks in the world Luis von Ahn, computer scientist, creator of CAPTCHA and researcher at Carnegie Mellon University Rodrigo Saravia, Guatemala national team footballer Sergio Custodio, professor and writer in logic and metaphysics Ricardo Cerna, a Guatemalan-American suspect who committed suicide inside a police station in the US See also 2007 Guatemala earthquake List of places in Guatemala Notes and references References Bibliography External links Official Website of the Municipalidad de Guatemala Municipalities of the Guatemala Department Capitals in Central America Capitals in North America Populated places established in 1773
GNU
GNU () is an extensive collection of free software (383 packages as of January 2022), which can be used as an operating system or can be used in parts with other operating systems. The use of the completed GNU tools led to the family of operating systems popularly known as Linux. Most of GNU is licensed under the GNU Project's own General Public License (GPL). GNU is also the project within which the free software concept originated. Richard Stallman, the founder of the project, views GNU as a "technical means to a social end". Relatedly, Lawrence Lessig states in his introduction to the second edition of Stallman's book Free Software, Free Society that in it Stallman has written about "the social aspects of software and how Free Software can create community and social justice". Name GNU is a recursive acronym for "GNU's Not Unix!", chosen because GNU's design is Unix-like, but differs from Unix by being free software and containing no Unix code. Stallman chose the name by using various plays on words, including the song The Gnu. History Development of the GNU operating system was initiated by Richard Stallman while he worked at MIT Artificial Intelligence Laboratory. It was called the GNU Project, and was publicly announced on September 27, 1983, on the net.unix-wizards and net.usoft newsgroups by Stallman. Software development began on January 5, 1984, when Stallman quit his job at the Lab so that they could not claim ownership or interfere with distributing GNU components as free software. The goal was to bring a completely free software operating system into existence. Stallman wanted computer users to be free to study the source code of the software they use, share software with other people, modify the behavior of software, and publish their modified versions of the software. This philosophy was published as the GNU Manifesto in March 1985. 
Richard Stallman's experience with the Incompatible Timesharing System (ITS), an early operating system written in assembly language that became obsolete due to the discontinuation of the PDP-10, the computer architecture for which ITS was written, led to a decision that a portable system was necessary. It was thus decided that the development would be started using C and Lisp as system programming languages, and that GNU would be compatible with Unix. At the time, Unix was already a popular proprietary operating system. The design of Unix was modular, so it could be reimplemented piece by piece. Much of the needed software had to be written from scratch, but existing compatible third-party free software components were also used, such as the TeX typesetting system, the X Window System, and the Mach microkernel that forms the basis of the GNU Mach core of GNU Hurd (the official kernel of GNU). With the exception of the aforementioned third-party components, most of GNU has been written by volunteers; some in their spare time, some paid by companies, educational institutions, and other non-profit organizations. In October 1985, Stallman set up the Free Software Foundation (FSF). In the late 1980s and 1990s, the FSF hired software developers to write the software needed for GNU. As GNU gained prominence, interested businesses began contributing to development or selling GNU software and technical support. The most prominent and successful of these was Cygnus Solutions, now part of Red Hat. Components The system's basic components include the GNU Compiler Collection (GCC), the GNU C library (glibc), and the GNU Core Utilities (coreutils), but also the GNU Debugger (GDB), the GNU Binary Utilities (binutils), and the GNU Bash shell. GNU developers have contributed to Linux ports of GNU applications and utilities, which are now also widely used on other operating systems such as BSD variants, Solaris and macOS.
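On a system with a GNU userland, the components listed above identify themselves in their `--version` banners. The following is a minimal sketch, with an illustrative tool list and helper name; it assumes the tools are on the PATH, and on BSD or macOS some of these names may be missing or resolve to non-GNU implementations:

```python
import shutil
import subprocess

def tool_version(tool: str) -> str:
    """Return the first line of `tool --version` output, or a placeholder."""
    if shutil.which(tool) is None:
        return "not installed"
    result = subprocess.run([tool, "--version"], capture_output=True, text=True)
    # GNU tools print an identifying banner as the first line of stdout.
    return result.stdout.splitlines()[0] if result.stdout else "unknown"

# A few of the components named in the text: Bash, GCC, GDB, binutils' ld.
for tool in ("bash", "gcc", "gdb", "ld"):
    print(f"{tool}: {tool_version(tool)}")
```

On a GNU/Linux distribution, each banner typically names the GNU project it belongs to, e.g. "GNU bash" or "GNU ld (GNU Binutils)", which is one quick way to see which parts of a given system actually come from GNU.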
Many GNU programs have been ported to other operating systems, including proprietary platforms such as Microsoft Windows and macOS. GNU programs have been shown to be more reliable than their proprietary Unix counterparts. As of January 2022, a total of 459 GNU packages (383 excluding decommissioned ones) are hosted on the official GNU development site. GNU as an operating system In its original meaning, and one still common in hardware engineering, the operating system is a basic set of functions to control the hardware and manage things like task scheduling and system calls. In modern terminology used by software developers, the collection of these functions is usually referred to as a kernel, while an 'operating system' is expected to have a more extensive set of programs. The GNU project maintains two kernels itself, allowing the creation of pure GNU operating systems, but the GNU toolchain is also used with non-GNU kernels. Due to the two different definitions of the term 'operating system', there is an ongoing debate concerning the naming of distributions of GNU packages with a non-GNU kernel. (See below.) With kernels maintained by GNU and FSF GNU Hurd The original kernel of the GNU Project is the GNU Hurd microkernel, which was the original focus of the Free Software Foundation (FSF). With the April 30, 2015 release of the Debian GNU/Hurd 2015 distribution, GNU now provides all required components to assemble an operating system that users can install and use on a computer. However, the Hurd kernel is not yet considered production-ready but rather a base for further development and non-critical application usage. Linux-libre As of 2012, a fork of the Linux kernel became officially part of the GNU Project in the form of Linux-libre, a variant of Linux with all proprietary components removed. The GNU Project has endorsed Linux-libre distributions, such as gNewSense, Trisquel and Parabola GNU/Linux-libre.
With non-GNU kernels Because of the development status of Hurd, GNU is usually paired with other kernels such as Linux or FreeBSD. Whether the combination of GNU libraries with external kernels is a GNU operating system with a kernel (e.g. GNU with Linux), because the GNU collection renders the kernel into a usable operating system as understood in modern software development, or whether the kernel is an operating system unto itself with a GNU layer on top (i.e. Linux with GNU), because the kernel can operate a machine without GNU, is a matter of ongoing debate. The FSF maintains that an operating system built using the Linux kernel and GNU tools and utilities should be considered a variant of GNU, and promotes the term GNU/Linux for such systems (leading to the GNU/Linux naming controversy). This view is not exclusive to the FSF. Notably, Debian, one of the biggest and oldest Linux distributions, refers to itself as Debian GNU/Linux. Copyright, GNU licenses, and stewardship The GNU Project recommends that contributors assign the copyright for GNU packages to the Free Software Foundation, though the Free Software Foundation considers it acceptable to release small changes to an existing project to the public domain. However, this is not required; package maintainers may retain copyright to the GNU packages they maintain, though since only the copyright holder may enforce the license used (such as the GNU GPL), the copyright holder in this case enforces it rather than the Free Software Foundation. For the development of needed software, Stallman wrote a license called the GNU General Public License (first called the Emacs General Public License), with the goal of guaranteeing users the freedom to share and change free software. Stallman wrote this license after his experience with James Gosling and a company called UniPress, over a controversy around software code use in the GNU Emacs program.
For most of the 1980s, each GNU package had its own license: the Emacs General Public License, the GCC General Public License, etc. In 1989, the FSF published a single license that it could use for all its software, and which could be used by non-GNU projects: the GNU General Public License (GPL). This license is now used by most GNU software, as well as a large number of free software programs that are not part of the GNU Project; it also historically has been the most commonly used free software license (though recently challenged by the MIT license). It gives all recipients of a program the right to run, copy, modify and distribute it, while forbidding them from imposing further restrictions on any copies they distribute. This idea is often referred to as copyleft. In 1991, the GNU Lesser General Public License (LGPL), then known as the Library General Public License, was written for the GNU C Library to allow it to be linked with proprietary software. 1991 also saw the release of version 2 of the GNU GPL. The GNU Free Documentation License (FDL), for documentation, followed in 2000. The GPL and LGPL were revised to version 3 in 2007, adding clauses to protect users against hardware restrictions that prevent them from running modified software on their own devices. Besides GNU's packages, the GNU Project's licenses are used by many unrelated projects, such as the Linux kernel, often used with GNU software. A minority of the software used by most Linux distributions, such as the X Window System, is licensed under permissive free software licenses. Logo The logo for GNU is a gnu head. It was originally drawn by Etienne Suvasa; a bolder and simpler version designed by Aurelio Heckert is now preferred. It appears in GNU software and in printed and electronic documentation for the GNU Project, and is also used in Free Software Foundation materials. There was also a modified version of the official logo.
It was created by the Free Software Foundation in September 2013 in order to commemorate the 30th anniversary of the GNU Project. See also Free software movement History of free and open-source software List of computing mascots References External links Ports of GNU utilities for Microsoft Windows The daemon, the GNU and the penguin Free software operating systems GNU Project GNU Project software Mach (kernel) Microkernel-based operating systems Unix variants Acronyms
Gradualism
Gradualism, from the Latin gradus ("step"), is a hypothesis, a theory or a tenet assuming that change comes about gradually or that variation is gradual in nature and happens over time as opposed to in large steps. Uniformitarianism, incrementalism, and reformism are similar concepts. Geology and biology In the natural sciences, gradualism is the theory which holds that profound change is the cumulative product of slow but continuous processes, often contrasted with catastrophism. The theory was proposed in 1795 by James Hutton, a Scottish geologist, and was later incorporated into Charles Lyell's theory of uniformitarianism. Tenets from both theories were applied to biology and formed the basis of early evolutionary theory. Charles Darwin was influenced by Lyell's Principles of Geology, which explained both uniformitarian methodology and theory. Using uniformitarianism, which states that one cannot make an appeal to any force or phenomenon which cannot presently be observed (see catastrophism), Darwin theorized that the evolutionary process must occur gradually, not in saltations, since saltations are not presently observed, and extreme deviations from the usual phenotypic variation would be more likely to be selected against. Gradualism is often confused with phyletic gradualism, a term coined by Stephen Jay Gould and Niles Eldredge to contrast with their model of punctuated equilibrium. Punctuated equilibrium is itself gradualist, but holds that most evolution is marked by long periods of evolutionary stability (called stasis), punctuated by rare instances of branching evolution.
In socialist politics and within the socialist movement, the concept of gradualism is frequently distinguished from reformism, with the former insisting that short-term goals need to be formulated and implemented in such a way that they inevitably lead into long-term goals. It is most commonly associated with the libertarian socialist concept of dual power and is seen as a middle way between reformism and revolutionism. Martin Luther King Jr. was opposed to the idea of gradualism as a method of eliminating segregation. The United States government wanted to try to integrate African-Americans and European-Americans slowly into the same society, but many believed it was a way for the government to put off actually doing anything about racial segregation. Linguistics and language change In linguistics, language change is seen as gradual, the product of chain reactions and subject to cyclic drift. The view that creole languages are the product of catastrophism is heavily disputed. Morality Buddhism, Theravada and Yoga Gradualism is the approach of certain schools of Buddhism and other Eastern philosophies (e.g. Theravada or Yoga), holding that enlightenment can be achieved step by step, through an arduous practice. The opposite approach, that insight is attained all at once, is called subitism. The debate on the issue was very important to the history of the development of Zen, which rejected gradualism, and to the establishment of the opposite approach within Tibetan Buddhism, after the Debate of Samye. It was continued in other schools of Indian and Chinese philosophy. Types Phyletic gradualism is a model of evolution which theorizes that most speciation is slow, uniform and gradual. When evolution occurs in this mode, it is usually by the steady transformation of a whole species into a new one (through a process called anagenesis).
In this view no clear line of demarcation exists between an ancestral species and a descendant species, unless splitting occurs. Punctuated gradualism is a microevolutionary hypothesis that refers to a species that has "relative stasis over a considerable part of its total duration [and] underwent periodic, relatively rapid, morphologic change that did not lead to lineage branching". It is one of the three common models of evolution. While the traditional model of palaeontology, the phylogenetic model, states that features evolved slowly without any direct association with speciation, the relatively newer and more controversial idea of punctuated equilibrium claims that major evolutionary changes do not happen over a gradual period but in localized, rare, rapid events of branching speciation. Punctuated gradualism is considered to be a variation of these models, lying somewhere in between the phyletic gradualism model and the punctuated equilibrium model. It states that speciation is not needed for a lineage to rapidly evolve from one equilibrium to another but may show rapid transitions between long-stable states. Contradictorial gradualism is the paraconsistent treatment of fuzziness developed by Lorenzo Peña which regards true contradictions as situations wherein a state of affairs enjoys only partial existence. Gradualism in social change implemented through reformist means is a moral principle to which the Fabian Society is committed. In a more general way, reformism is the assumption that gradual changes through and within existing institutions can ultimately change a society's fundamental economic system and political structures; and that an accumulation of reforms can lead to the emergence of an entirely different economic system and form of society than present-day capitalism. That hypothesis of social change grew out of opposition to revolutionary socialism, which contends that revolution is necessary for fundamental structural changes to occur.
In the terminology of New World Order (NWO) conspiracy theories, gradualism refers to the gradual implementation of a totalitarian world government. See also Evolution Uniformitarianism Incrementalism Reformism Catastrophism Saltation Punctuated equilibrium Accelerationism
https://en.wikipedia.org/wiki/Greek
Greek
Greek may refer to: Greece Anything of, from, or related to Greece, a country in Southern Europe: Greeks, an ethnic group Greek language, a branch of the Indo-European language family Proto-Greek language, the assumed last common ancestor of all known varieties of Greek Mycenaean Greek, most ancient attested form of the language (16th to 11th centuries BC) Ancient Greek, forms of the language used c. 1000–330 BC Koine Greek, common form of Greek spoken and written during Classical antiquity Medieval Greek or Byzantine Greek, language used between the Middle Ages and the Ottoman conquest of Constantinople Modern Greek, varieties spoken in the modern era (from 1453 AD) Greek alphabet, script used to write the Greek language Greek Orthodox Church, several Churches of the Eastern Orthodox Church Ancient Greece, the ancient civilization before the end of Antiquity Other uses Greek (play), 1980 play by Steven Berkoff Greek (opera), 1988 opera by Mark-Antony Turnage, based on Steven Berkoff's play Greek (TV series) (also stylized GRΣΣK), 2007 ABC Family channel's comedy-drama television series set at a fictitious college's fictional Greek system Greeks (finance), quantities representing the sensitivity of the price of derivatives Greeking, a style of displaying or rendering text or symbols in a computer display or typographic layout Greek-letter organizations (GLOs), social organizations for undergraduate students at North American colleges Greek Theatre (Los Angeles), a theatre located at Griffith Park in Los Angeles, California Greek Revival, an architectural movement of the late 18th and early 19th centuries Greek love, a term referring variously to male bonding, homosexuality, pederasty and anal sex The Greek, a fictional character on the HBO drama The Wire The Greeks (book), a 1951 non-fiction book on classical Greece by HDF Kitto Greeks, a group of scholars in 16th-century England who were part of the Grammarians' War See also Greeks (disambiguation) Greek dialects 
(disambiguation) Hellenic (disambiguation) Names of the Greeks, terms for the Greek people Name of Greece, names for the country Greek to me, an idiom for something not understandable
https://en.wikipedia.org/wiki/Germanic%20languages
Germanic languages
The Germanic languages are a branch of the Indo-European language family spoken natively by a population of about 515 million people mainly in Europe, North America, Oceania and Southern Africa. The most widely spoken Germanic language, English, is also the world's most widely spoken language with an estimated 2 billion speakers. All Germanic languages are derived from Proto-Germanic, spoken in Iron Age Scandinavia. The West Germanic languages include the three most widely spoken Germanic languages: English with around 360–400 million native speakers; German, with over 100 million native speakers; and Dutch, with 24 million native speakers. Other West Germanic languages include Afrikaans, an offshoot of Dutch, with over 7.1 million native speakers; Low German, considered a separate collection of unstandardized dialects, with roughly 4.35-7.15 million native speakers and probably 6.7–10 million people who can understand it (at least 2.2 million in Germany (2016) and 2.15 million in the Netherlands (2003)); Yiddish, once used by approximately 13 million Jews in pre-World War II Europe, now with approximately 1.5 million native speakers; Scots, with 1.5 million native speakers; Limburgish varieties with roughly 1.3 million speakers along the Dutch–Belgian–German border; and the Frisian languages with over 0.5 million native speakers in the Netherlands and Germany. The largest North Germanic languages are Swedish, Danish and Norwegian, which are in part mutually intelligible and have a combined total of about 20 million native speakers in the Nordic countries and an additional five million second language speakers; since the Middle Ages these languages have however been strongly influenced by the West Germanic language Middle Low German, and Low German words account for about 30–60% of their vocabularies according to various estimates. 
Other extant North Germanic languages are Faroese, Icelandic, and Elfdalian, which are more conservative languages with no significant Low German influence, more complex grammar and limited mutual intelligibility with the others today. The East Germanic branch included Gothic, Burgundian, and Vandalic, all of which are now extinct. The last to die off was Crimean Gothic, spoken until the late 18th century in some isolated areas of Crimea. The SIL Ethnologue lists 48 different living Germanic languages, 41 of which belong to the Western branch and six to the Northern branch; it places Riograndenser Hunsrückisch German in neither of the categories, but it is often considered a German dialect by linguists. The total number of Germanic languages throughout history is unknown as some of them, especially the East Germanic languages, disappeared during or after the Migration Period. Some of the West Germanic languages also did not survive past the Migration Period, including Lombardic. As a result of World War II and subsequent mass expulsion of Germans, the German language suffered a significant loss of Sprachraum, as well as moribundity and extinction of several of its dialects. In the 21st century, German dialects are dying out as Standard German gains primacy. The common ancestor of all of the languages in this branch is called Proto-Germanic, also known as Common Germanic, which was spoken in about the middle of the 1st millennium BC in Iron Age Scandinavia. Proto-Germanic, along with all of its descendants, notably has a number of unique linguistic features, most famously the consonant change known as "Grimm's law." Early varieties of Germanic entered history when the Germanic tribes moved south from Scandinavia in the 2nd century BC to settle in the area of today's northern Germany and southern Denmark. 
Modern status West Germanic languages English is an official language of Belize, Canada, Nigeria, Falkland Islands, Saint Helena, Malta, New Zealand, Ireland, South Africa, Philippines, Jamaica, Dominica, Guyana, Trinidad and Tobago, American Samoa, Palau, St. Lucia, Grenada, Barbados, St. Vincent and the Grenadines, Puerto Rico, Guam, Hong Kong, Singapore, Pakistan, India, Papua New Guinea, Namibia, Vanuatu, the Solomon Islands and former British colonies in Asia, Africa and Oceania. Furthermore, it is the de facto language of the United Kingdom, the United States and Australia, as well as a recognized language in Nicaragua and Malaysia. German is a language of Austria, Belgium, Germany, Liechtenstein, Luxembourg and Switzerland and has regional status in Italy, Poland, Namibia and Denmark. German also continues to be spoken as a minority language by immigrant communities in North America, South America, Central America, Mexico and Australia. A German dialect, Pennsylvania German, is still used among various populations in the American state of Pennsylvania in daily life. Dutch is an official language of Aruba, Belgium, Curaçao, the Netherlands, Sint Maarten, and Suriname. The Netherlands also colonized Indonesia, but Dutch was scrapped as an official language after Indonesian independence. Today, it is only used by older or traditionally educated people. Dutch was until 1984 an official language in South Africa but evolved into and was replaced by Afrikaans, a partially mutually intelligible daughter language of Dutch. Afrikaans is one of the 11 official languages in South Africa and is a lingua franca of Namibia. It is used in other Southern African nations, as well. Low German is a collection of very diverse dialects spoken in the northeast of the Netherlands and northern Germany. Scots is spoken in Lowland Scotland and parts of Ulster (where the local dialect is known as Ulster Scots). 
Frisian is spoken among half a million people who live on the southern fringes of the North Sea in the Netherlands and Germany. Luxembourgish is a Moselle Franconian dialect that is spoken mainly in the Grand Duchy of Luxembourg, where it is considered to be an official language. Similar varieties of Moselle Franconian are spoken in small parts of Belgium, France, and Germany. Yiddish, once a native language of some 11 to 13 million people, remains in use by some 1.5 million speakers in Jewish communities around the world, mainly in North America, Europe, Israel, and other regions with Jewish populations. Limburgish varieties are spoken in the Limburg and Rhineland regions, along the Dutch–Belgian–German border. North Germanic languages In addition to being the official language in Sweden, Swedish is also spoken natively by the Swedish-speaking minority in Finland, which is a large part of the population along the coast of western and southern Finland. Swedish is also one of the two official languages in Finland, along with Finnish, and the only official language in Åland. Swedish is also spoken by some people in Estonia. Danish is an official language of Denmark and in its overseas territory of the Faroe Islands, and it is a lingua franca and language of education in its other overseas territory of Greenland, where it was one of the official languages until 2009. Danish, a locally recognized minority language, is also natively spoken by the Danish minority in the German state of Schleswig-Holstein. Norwegian is the official language of Norway. Norwegian is also the official language in the overseas territories of Norway such as Svalbard, Jan Mayen, Bouvet Island, Queen Maud Land and Peter I Island. Icelandic is the official language of Iceland. Faroese is the official language of the Faroe Islands, and is also spoken by some people in Denmark.
Statistics History All Germanic languages are thought to be descended from a hypothetical Proto-Germanic, united by subjection to the sound shifts of Grimm's law and Verner's law. These probably took place during the Pre-Roman Iron Age of Northern Europe from c. 500 BC. Proto-Germanic itself was likely spoken after c. 500 BC, and Proto-Norse from the 2nd century AD and later is still quite close to reconstructed Proto-Germanic, but other common innovations separating Germanic from Proto-Indo-European suggest a common history of pre-Proto-Germanic speakers throughout the Nordic Bronze Age. From the time of their earliest attestation, the Germanic varieties are divided into three groups: West, East, and North Germanic. Their exact relation is difficult to determine from the sparse evidence of runic inscriptions. The western group would have formed in the late Jastorf culture, and the eastern group may be derived from the 1st-century variety of Gotland, leaving southern Sweden as the original location of the northern group. The earliest period of Elder Futhark (2nd to 4th centuries) predates the division in regional script variants, and linguistically essentially still reflects the Common Germanic stage. The Vimose inscriptions include some of the oldest datable Germanic inscriptions, starting in c. 160 AD. The earliest coherent Germanic text preserved is the 4th-century Gothic translation of the New Testament by Ulfilas. Early testimonies of West Germanic are in Old Frankish/Old Dutch (the 5th-century Bergakker inscription), Old High German (scattered words and sentences 6th century and coherent texts 9th century), and Old English (oldest texts 650, coherent texts 10th century). North Germanic is only attested in scattered runic inscriptions, as Proto-Norse, until it evolves into Old Norse by about 800. 
Longer runic inscriptions survive from the 8th and 9th centuries (Eggjum stone, Rök stone), longer texts in the Latin alphabet survive from the 12th century (Íslendingabók), and some skaldic poetry dates back to as early as the 9th century. By about the 10th century, the varieties had diverged enough to make mutual intelligibility difficult. The linguistic contact of the Viking settlers of the Danelaw with the Anglo-Saxons left traces in the English language and is suspected to have facilitated the collapse of Old English grammar that, combined with the influx of Romance Old French vocabulary after the Norman Conquest, resulted in Middle English from the 12th century. The East Germanic languages were marginalized from the end of the Migration Period. The Burgundians, Goths, and Vandals became linguistically assimilated by their respective neighbors by about the 7th century, with only Crimean Gothic lingering on until the 18th century. During the early Middle Ages, the West Germanic languages were separated by the insular development of Middle English on one hand and by the High German consonant shift on the continent on the other, resulting in Upper German and Low Saxon, with graded intermediate Central German varieties. By early modern times, the span had extended into considerable differences, ranging from Highest Alemannic in the South to Northern Low Saxon in the North, and, although both extremes are considered German, they are hardly mutually intelligible. The southernmost varieties had completed the second sound shift, while the northern varieties remained unaffected by the consonant shift. The North Germanic languages, on the other hand, remained unified until well past 1000 AD, and in fact the mainland Scandinavian languages still largely retain mutual intelligibility into modern times. 
The main split in these languages is between the mainland languages and the island languages to the west, especially Icelandic, which has maintained the grammar of Old Norse virtually unchanged, while the mainland languages have diverged greatly. Distinctive characteristics Germanic languages possess a number of defining features compared with other Indo-European languages. Some of the best-known are the following: The sound changes known as Grimm's Law and Verner's Law, which shifted the values of all the Indo-European stop consonants (for example, original *t, *d and *dʰ became Germanic *þ, *t and *d in most cases; compare three with Latin trēs, two with Latin duo, do with Sanskrit dádhāti). The recognition of these two sound laws was a seminal event in the understanding of the regular nature of linguistic sound change and the development of the comparative method, which forms the basis of modern historical linguistics. The development of a strong stress on the first syllable of the word, which triggered significant phonological reduction of all other syllables. This is responsible for the reduction of most of the basic English, Norwegian, Danish and Swedish words into monosyllables, and the common impression of modern English and German as consonant-heavy languages. Examples are the heavily reduced forms seen in English strength, ant, head and hear and in German Herbst "autumn, harvest" and Hexe "witch, hag", each descended from a longer Proto-Germanic word. A change known as Germanic umlaut, which modified vowel qualities when a high front vocalic segment (/i/, /ī/ or /j/) followed in the next syllable. Generally, back vowels were fronted, and front vowels were raised. In many languages, the modified vowels are indicated with a diaeresis (e.g., German ⟨ä⟩, ⟨ö⟩, ⟨ü⟩ for umlauted a, o, u). This change resulted in pervasive alternations in related words — still extremely prominent in modern German but present only in remnants in modern English (e.g., mouse/mice, goose/geese, broad/breadth, tell/told, old/elder, foul/filth, gold/gild). Large numbers of vowel qualities.
English has around 11–12 vowels in most dialects (not counting diphthongs), Standard Swedish has 17 pure vowels (monophthongs), standard German and Dutch 14, and Danish at least 11. The Amstetten dialect of Bavarian German has 13 distinctions among long vowels alone, one of the largest such inventories in the world. Verb second (V2) word order, which is uncommon cross-linguistically. Exactly one noun phrase or adverbial element must precede the verb; in particular, if an adverb or prepositional phrase precedes the verb, then the subject must immediately follow the finite verb. In modern English, this survives only in a few relics, known in the EFL classroom as "inversion": examples include some constructions with here or there (Here comes the sun; there are five continents), verbs of speech after a quote ("Yes", said John), sentences beginning with certain conjunctions (Hardly had he said this when...; Only much later did he realize...) and sentences beginning with certain adverbs of motion to create a sense of drama (Over went the boat; out ran the cat; Pop Goes The Weasel). However, it is common in all other modern Germanic languages. Other significant characteristics are: The reduction of the various tense and aspect combinations of the Indo-European verbal system into only two: the present tense and the past tense (also called the preterite). A large class of verbs that use a dental suffix (/d/ or /t/) instead of vowel alternation (Indo-European ablaut) to indicate past tense. These are called the Germanic weak verbs; the remaining verbs with vowel ablaut are the Germanic strong verbs. A distinction in definiteness of a noun phrase that is marked by different sets of inflectional endings for adjectives, the so-called strong and weak adjectives. A similar development happened in the Balto-Slavic languages. This distinction has been lost in modern English but was present in Old English and remains in all other Germanic languages to various degrees.
Some words with etymologies that are difficult to link to other Indo-European families but with variants that appear in almost all Germanic languages. See Germanic substrate hypothesis. Discourse particles, which are a class of short, unstressed words which speakers use to express their attitude towards the utterance or the hearer. This word category seems to be rare outside of the Germanic languages. English doesn't make extensive use of discourse particles; an example would be the word 'just', which the speaker can use to express surprise. Note that some of the above characteristics were not present in Proto-Germanic but developed later as areal features that spread from language to language: Germanic umlaut only affected the North and West Germanic languages (which represent all modern Germanic languages) but not the now-extinct East Germanic languages, such as Gothic, nor Proto-Germanic, the common ancestor of all Germanic languages. The large inventory of vowel qualities is a later development, due to a combination of Germanic umlaut and the tendency in many Germanic languages for pairs of long/short vowels of originally identical quality to develop distinct qualities, with the length distinction sometimes eventually lost. Proto-Germanic had only five distinct vowel qualities, although there were more actual vowel phonemes because length and possibly nasality were phonemic. In modern German, long-short vowel pairs still exist but are also distinct in quality. Proto-Germanic probably had a more general S-O-V-I word order. However, the tendency toward V2 order may have already been present in latent form and may be related to Wackernagel's Law, an Indo-European law dictating that sentence clitics must be placed second. Roughly speaking, Germanic languages differ in how conservative or how progressive each language is with respect to an overall trend toward analyticity. 
Some, such as Icelandic and, to a lesser extent, German, have preserved much of the complex inflectional morphology inherited from Proto-Germanic (and in turn from Proto-Indo-European). Others, such as English, Swedish, and Afrikaans, have moved toward a largely analytic type. Linguistic developments The subgroupings of the Germanic languages are defined by shared innovations. It is important to distinguish innovations from cases of linguistic conservatism. That is, if two languages in a family share a characteristic that is not observed in a third language, that is evidence of common ancestry of the two languages only if the characteristic is an innovation compared to the family's proto-language. The following innovations are common to the Northwest Germanic languages (all but Gothic): The lowering of /u/ to /o/ in initial syllables before /a/ in the following syllable, as in English bode and Icelandic boð "messages" ("a-Umlaut", traditionally called Brechung) "Labial umlaut" in unstressed medial syllables (the conversion of /a/ to /u/ and /ō/ to /ū/ before /m/, or /u/ in the following syllable) The conversion of /ē1/ into /ā/ (vs. Gothic /ē/) in stressed syllables. In unstressed syllables, West Germanic also has this change, but North Germanic has shortened the vowel to /e/, then raised it to /i/. This suggests it was an areal change. The raising of final /ō/ to /u/ (Gothic lowers it to /a/). It is kept distinct from the nasal /ǭ/, which is not raised. The monophthongization of /ai/ and /au/ to /ē/ and /ō/ in non-initial syllables (however, evidence for the development of /au/ in medial syllables is lacking). The development of an intensified demonstrative ending in /s/ (reflected in English "this" compared to "the") Introduction of a distinct ablaut grade in Class VII strong verbs, while Gothic uses reduplication (e.g.
Gothic haihait; ON, OE hēt, preterite of the Gmc verb *haitan "to be called") as part of a comprehensive reformation of the Gmc Class VII from a reduplicating to a new ablaut pattern, which presumably started in verbs beginning with vowel or /h/ (a development which continues the general trend of de-reduplication in Gmc); there are forms (such as OE dial. heht instead of hēt) which retain traces of reduplication even in West and North Germanic The following innovations are also common to the Northwest Germanic languages but represent areal changes: Proto-Germanic /z/ > /r/ (e.g. Gothic dius; ON dȳr, OHG tior, OE dēor, "wild animal"); note that this is not present in Proto-Norse and must be ordered after West Germanic loss of final /z/ Germanic umlaut The following innovations are common to the West Germanic languages: Loss of final /z/. In single-syllable words, Old High German retains it (as /r/), while it disappears in the other West Germanic languages. Change of [ð] (fricative allophone of /d/) to stop [d] in all environments. Change of /lþ/ to stop /ld/ (except word-finally). West Germanic gemination of consonants, except r, before /j/. This only occurred in short-stemmed words due to Sievers' law. Gemination of /p/, /t/, /k/ and /h/ is also observed before liquids. Labiovelar consonants become plain velar when non-initial. A particular type of umlaut /e-u-i/ > /i-u-i/. Changes to the 2nd person singular past-tense: Replacement of the past-singular stem vowel with the past-plural stem vowel, and substitution of the ending -t with -ī. Short forms (*stān, stēn, *gān, gēn) of the verbs for "stand" and "go"; but note that Crimean Gothic also has gēn. The development of a gerund. The following innovations are common to the Ingvaeonic subgroup of the West Germanic languages, which includes English, Frisian, and in a few cases Dutch and Low German, but not High German: The so-called Ingvaeonic nasal spirant law, with loss of /n/ before voiceless fricatives: e.g. 
*munþ, *gans > Old English mūþ, gōs > "mouth, goose", but German Mund, Gans. The loss of the Germanic reflexive pronoun (preserved in German sich). Dutch has reclaimed the reflexive pronoun zich from Middle High German sich. The reduction of the three Germanic verbal plural forms into one form ending in -þ. The development of Class III weak verbs into a relic class consisting of four verbs (*sagjan "to say", *hugjan "to think", *habjan "to have", *libjan "to live"; cf. the numerous Old High German verbs in -ēn). The split of the Class II weak verb ending *-ō- into *-ō-/-ōja- (cf. Old English -ian < -ōjan, but Old High German -ōn). Development of a plural ending *-ōs in a-stem nouns (note, Gothic also has -ōs, but this is an independent development, caused by terminal devoicing of *-ōz; Old Frisian has -ar, which is thought to be a late borrowing from Danish). Cf. modern English plural -(e)s, but German plural -e. Possibly, the monophthongization of Germanic *ai to ē/ā (this may represent independent changes in Old Saxon and Anglo-Frisian). The following innovations are common to the Anglo-Frisian subgroup of the Ingvaeonic languages: Raising of nasalized a, ā into o, ō. Anglo-Frisian brightening: Fronting of non-nasal a, ā to æ, ǣ when not followed by n or m. Metathesis of CrV into CVr, where C represents any consonant and V any vowel. Monophthongization of ai into ā. Common linguistic features Phonology The oldest Germanic languages all share a number of features, which are assumed to be inherited from Proto-Germanic. Phonologically, these include the important sound changes known as Grimm's Law and Verner's Law, which introduced a large number of fricatives; late Proto-Indo-European had only one, /s/. The main vowel developments are the merging (in most circumstances) of long and short /a/ and /o/, producing short /a/ and long /ō/. That likewise affected the diphthongs, with PIE /ai/ and /oi/ merging into /ai/ and PIE /au/ and /ou/ merging into /au/. PIE /ei/ developed into long /ī/.
PIE long /ē/ developed into a vowel denoted as /ē1/ (often assumed to be phonetically [ɛː]), while a new, fairly uncommon long vowel /ē2/ developed in varied and not completely understood circumstances. Proto-Germanic had no front rounded vowels, but all Germanic languages except for Gothic subsequently developed them through the process of i-umlaut. Proto-Germanic developed a strong stress accent on the first syllable of the root, but remnants of the original free PIE accent are visible due to Verner's Law, which was sensitive to this accent. That caused a steady erosion of vowels in unstressed syllables. In Proto-Germanic, that had progressed only to the point that absolutely-final short vowels (other than /i/ and /u/) were lost and absolutely-final long vowels were shortened, but all of the early literary languages show a more advanced state of vowel loss. This ultimately resulted in some languages (like Modern English) losing practically all vowels following the main stress and the consequent rise of a very large number of monosyllabic words. Table of outcomes The following table shows the main outcomes of Proto-Germanic vowels and consonants in the various older languages. For vowels, only the outcomes in stressed syllables are shown. Outcomes in unstressed syllables are quite different, vary from language to language and depend on a number of other factors (such as whether the syllable was medial or final, whether the syllable was open or closed and (in some cases) whether the preceding syllable was light or heavy). Notes: C- means before a vowel (word-initially, or sometimes after a consonant). -C- means between vowels. -C means after a vowel (word-finally or before a consonant). Word-final outcomes generally occurred after deletion of final short vowels, which occurred shortly after Proto-Germanic and is reflected in the history of all written languages except for Proto-Norse. The above three are given in the order C-, -C-, -C.
If one is omitted, the previous one applies. For example, f, -[v]- means that [v] occurs after a vowel regardless of what follows. Something like a(…u) means "a if /u/ occurs in the next syllable". Something like a(n) means "a if /n/ immediately follows". Something like (n)a means "a if /n/ immediately precedes". Morphology The oldest Germanic languages have the typical complex inflected morphology of old Indo-European languages, with four or five noun cases; verbs marked for person, number, tense and mood; multiple noun and verb classes; few or no articles; and rather free word order. The old Germanic languages are famous for having only two tenses (present and past), with three PIE past-tense aspects (imperfect, aorist, and perfect/stative) merged into one and no new tenses (future, pluperfect, etc.) developing. There were three moods: indicative, subjunctive (developed from the PIE optative mood) and imperative. Gothic verbs had a number of archaic features inherited from PIE that were lost in the other Germanic languages with few traces, including dual endings, an inflected passive voice (derived from the PIE mediopassive voice), and a class of verbs with reduplication in the past tense (derived from the PIE perfect). The complex tense system of modern English (e.g. In three months, the house will still be being built or If you had not acted so stupidly, we would never have been caught) is almost entirely due to subsequent developments (although paralleled in many of the other Germanic languages). 
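The Grimm's-law correspondences described above under Phonology (voiceless stops to fricatives, voiced stops to voiceless stops, voiced aspirates to voiced stops) amount to a regular, exceptionless substitution, which can be sketched with a toy one-pass mapping. This is only an illustrative sketch: the symbols are plain-ASCII stand-ins for the reconstructed PIE stops, and the real change was conditioned by environment (Verner's law, and stops in clusters such as *st- were exempt), which the sketch ignores.

```python
# Toy sketch of the Grimm's-law chain shift. "B", "D", "G" stand in for
# the PIE voiced aspirates *bh, *dh, *gh; notation is schematic.
GRIMM = str.maketrans({
    "p": "f", "t": "þ", "k": "h",   # voiceless stops -> voiceless fricatives
    "b": "p", "d": "t", "g": "k",   # voiced stops -> voiceless stops
    "B": "b", "D": "d", "G": "g",   # voiced aspirates -> voiced stops
})

def grimm(pie_root: str) -> str:
    # str.translate substitutes every character in a single pass, so the
    # output of one shift is never fed back into another -- the defining
    # property of a chain shift.
    return pie_root.translate(GRIMM)

print(grimm("treies"))  # þreies  (cf. Latin tres, English three)
print(grimm("duo"))     # tuo     (cf. Latin duo, English two)
print(grimm("pater"))   # faþer   (cf. Latin pater; Verner's law later
                        #          voices the medial fricative in "father")
```

The single-pass `translate` is the design point: applying the three sub-shifts one after another as separate passes would wrongly re-shift earlier outputs (e.g. *d → t → þ).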
Among the primary innovations in Proto-Germanic are the preterite present verbs, a special set of verbs whose present tense looks like the past tense of other verbs and which is the origin of most modal verbs in English; a past-tense ending (in the so-called "weak verbs", marked with -ed in English) that appears variously as /d/ or /t/, often assumed to be derived from the verb "to do"; and two separate sets of adjective endings, originally corresponding to a distinction between indefinite semantics ("a man", with a combination of PIE adjective and pronoun endings) and definite semantics ("the man", with endings derived from PIE n-stem nouns). Note that most modern Germanic languages have lost most of the inherited inflectional morphology as a result of the steady attrition of unstressed endings triggered by the strong initial stress. (Contrast, for example, the Balto-Slavic languages, which have largely kept the Indo-European pitch accent and consequently preserved much of the inherited morphology.) Icelandic and to a lesser extent modern German best preserve the Proto-Germanic inflectional system, with four noun cases, three genders, and well-marked verbs. English and Afrikaans are at the other extreme, with almost no remaining inflectional morphology. The following shows a typical masculine a-stem noun, Proto-Germanic *fiskaz ("fish"), and its development in the various old literary languages: Strong vs. weak nouns and adjectives Originally, adjectives in Proto-Indo-European followed the same declensional classes as nouns. The most common class (the o/ā class) used a combination of o-stem endings for masculine and neuter genders and ā-stem endings for feminine genders, but other common classes (e.g. the i class and u class) used endings from a single vowel-stem declension for all genders, and various other classes existed that were based on other declensions.
A quite different set of "pronominal" endings was used for pronouns, determiners, and words with related semantics (e.g., "all", "only"). An important innovation in Proto-Germanic was the development of two separate sets of adjective endings, originally corresponding to a distinction between indefinite semantics ("a man") and definite semantics ("the man"). The endings of indefinite adjectives were derived from a combination of pronominal endings with one of the common vowel-stem adjective declensions – usually the o/ā class (often termed the a/ō class in the specific context of the Germanic languages) but sometimes the i or u classes. Definite adjectives, however, had endings based on n-stem nouns. Originally both types of adjectives could be used by themselves, but already by Proto-Germanic times a pattern evolved whereby definite adjectives had to be accompanied by a determiner with definite semantics (e.g., a definite article, demonstrative pronoun, possessive pronoun, or the like), while indefinite adjectives were used in other circumstances (either accompanied by a word with indefinite semantics such as "a", "one", or "some" or unaccompanied). In the 19th century, the two types of adjectives – indefinite and definite – were respectively termed "strong" and "weak", names which are still commonly used. These names were based on the appearance of the two sets of endings in modern German. In German, the distinctive case endings formerly present on nouns have largely disappeared, with the result that the load of distinguishing one case from another is almost entirely carried by determiners and adjectives. 
Furthermore, due to regular sound change, the various definite (n-stem) adjective endings coalesced to the point where only two endings (-e and -en) remain in modern German to express the sixteen possible inflectional categories of the language (masculine/feminine/neuter/plural crossed with nominative/accusative/dative/genitive – modern German merges all genders in the plural). The indefinite (a/ō-stem) adjective endings were less affected by sound change, with six endings remaining (-, -e, -es, -er, -em, -en), cleverly distributed in a way that is capable of expressing the various inflectional categories without too much ambiguity. As a result, the definite endings were thought of as too "weak" to carry inflectional meaning and in need of "strengthening" by the presence of an accompanying determiner, while the indefinite endings were viewed as "strong" enough to indicate the inflectional categories even when standing alone. (This view is enhanced by the fact that modern German largely uses weak-ending adjectives when accompanying an indefinite article, and hence the indefinite/definite distinction no longer clearly applies.) By analogy, the terms "strong" and "weak" were extended to the corresponding noun classes, with a-stem and ō-stem nouns termed "strong" and n-stem nouns termed "weak". However, in Proto-Germanic – and still in Gothic, the most conservative Germanic language – the terms "strong" and "weak" are not clearly appropriate. For one thing, there were a large number of noun declensions. The a-stem, ō-stem, and n-stem declensions were the most common and represented targets into which the other declensions were eventually absorbed, but this process occurred only gradually. Originally the n-stem declension was not a single declension but a set of separate declensions (e.g., -an, -ōn, -īn) with related endings, and these endings were in no way any "weaker" than the endings of any other declensions. 
(For example, among the eight possible inflectional categories of a noun — singular/plural crossed with nominative/accusative/dative/genitive — masculine an-stem nouns in Gothic include seven endings, and feminine ōn-stem nouns include six endings, meaning there is very little ambiguity or "weakness" in these endings, and in fact much less than in the German "strong" endings.) Although it is possible to group the various noun declensions into three basic categories — vowel-stem, n-stem, and other-consonant-stem (a.k.a. "minor declensions") — the vowel-stem nouns do not display any sort of unity in their endings that supports grouping them together with each other but separate from the n-stem endings. It is only in later languages that the binary distinction between "strong" and "weak" nouns becomes more relevant. In Old English, the n-stem nouns form a single, clear class, but the masculine a-stem and feminine ō-stem nouns have little in common with each other, and neither has much similarity to the small class of u-stem nouns. Similarly, in Old Norse, the masculine a-stem and feminine ō-stem nouns have little in common with each other, and the continuations of the masculine an-stem and feminine ōn/īn-stem nouns are also quite distinct. It is only in Middle Dutch and modern German that the various vowel-stem nouns have merged to the point that a binary strong/weak distinction clearly applies. As a result, newer grammatical descriptions of the Germanic languages often avoid the terms "strong" and "weak" except in conjunction with German itself, preferring instead to use the terms "indefinite" and "definite" for adjectives and to distinguish nouns by their actual stem class. In English, both sets of adjective endings were lost entirely in the late Middle English period.
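The modern German situation described above can be made concrete with a small sketch. The tables and helper below summarize the standard attributive adjective paradigms of modern German; they are an illustrative aid of my own, not material reproduced from the article:

```python
# Illustrative sketch of the modern German adjective declensions discussed
# above: the "weak" (definite) declension uses only -e and -en, while the
# "strong" (indefinite) declension keeps several distinct endings.
GENDERS = ["m", "f", "n", "pl"]
CASES = ["nom", "acc", "dat", "gen"]

# Strong (indefinite) endings, indexed [gender][case], standard modern German.
STRONG = {
    "m":  {"nom": "er", "acc": "en", "dat": "em", "gen": "en"},
    "f":  {"nom": "e",  "acc": "e",  "dat": "er", "gen": "er"},
    "n":  {"nom": "es", "acc": "es", "dat": "em", "gen": "en"},
    "pl": {"nom": "e",  "acc": "e",  "dat": "en", "gen": "er"},
}

def weak(gender: str, case: str) -> str:
    """Weak endings: -e in the nominative singular (all genders) and in the
    feminine/neuter accusative singular; -en everywhere else."""
    if case == "nom" and gender != "pl":
        return "e"
    if case == "acc" and gender in ("f", "n"):
        return "e"
    return "en"

# The weak paradigm collapses the 16 gender/case cells into just two endings:
assert {weak(g, c) for g in GENDERS for c in CASES} == {"e", "en"}
# The attributive strong paradigm keeps five distinct endings (the article's
# count of six also includes the bare, endingless form):
assert {STRONG[g][c] for g in GENDERS for c in CASES} == {"er", "en", "em", "e", "es"}
```

The two assertions restate the article's point numerically: the weak endings are too ambiguous to carry the inflectional load alone, while the strong endings distinguish most of the categories by themselves.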
Classification

Divisions between and among subfamilies of Germanic are rarely precisely defined; most form continuous clines, with adjacent varieties being mutually intelligible and more separated ones not. Within the Germanic language family are East Germanic, West Germanic, and North Germanic. However, the East Germanic languages became extinct several centuries ago. All living Germanic languages belong either to the West Germanic or to the North Germanic branch. The West Germanic group is the larger by far, further subdivided into Anglo-Frisian on one hand and Continental West Germanic on the other. Anglo-Frisian notably includes English and all its variants, while Continental West Germanic includes German (standard register and dialects) as well as Dutch (standard register and dialects). East Germanic includes most notably the extinct Gothic and Crimean Gothic languages. The modern classification is as follows; for a full classification, see List of Germanic languages.

West Germanic
* High German languages (includes Standard German and its dialects)
  * Upper German
    * Alemannic German
    * Austro-Bavarian German
      * Mòcheno language
      * Cimbrian language
      * Hutterite German
  * Wymysorys
  * Hunsrik
  * Yiddish
  * High Franconian (a transitional dialect between Upper and Central German)
  * Central German
    * East Central German
    * West Central German
      * Luxembourgish
      * Pennsylvania German
* Low German
  * West Low German
  * East Low German
    * Plautdietsch (Mennonite Low German)
* Low Franconian
  * Dutch and its dialects
    * Afrikaans (a separate standard language)
  * Limburgish (an official minority language)
* Anglo-Frisian
  * Anglic (or English)
    * English and its dialects
    * Scots in Scotland and Ulster
  * Frisian
    * West Frisian
    * East Frisian
      * Saterland Frisian (last remaining dialect of East Frisian)
    * North Frisian

North Germanic
* West Scandinavian
  * Norwegian (of Western branch origin, but heavily influenced by the Eastern branch)
  * Icelandic
  * Faroese
  * Elfdalian
* East Scandinavian
  * Danish
  * Swedish
  * Dalecarlian dialects
  * Gutnish

East Germanic
* Gothic †
* Crimean Gothic † (relationship to earlier Gothic unclear)
* Burgundian †
* Vandalic †

Writing

The earliest evidence of Germanic languages comes from names recorded in the 1st century by Tacitus (especially from his work Germania), but the earliest Germanic writing occurs in a single instance in the 2nd century BC on the Negau helmet. From roughly the 2nd century AD, certain speakers of early Germanic varieties developed the Elder Futhark, an early form of the runic alphabet. Early runic inscriptions are also largely limited to personal names and are difficult to interpret. The Gothic language was written in the Gothic alphabet developed by Bishop Ulfilas for his translation of the Bible in the 4th century. Later, Christian priests and monks who spoke and read Latin in addition to their native Germanic varieties began writing the Germanic languages with slightly modified Latin letters. However, throughout the Viking Age, runic alphabets remained in common use in Scandinavia. Modern Germanic languages mostly use an alphabet derived from the Latin alphabet. In print, German used to be predominantly set in blackletter typefaces (e.g., fraktur or schwabacher) until the 1940s, while Kurrent and, since the early 20th century, Sütterlin were formerly used for German handwriting. Yiddish is written using an adapted Hebrew alphabet.

Vocabulary comparison

The table compares cognates in several different Germanic languages. In some cases, the meanings may not be identical in each language.
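As a stand-in for the comparison table (which is not reproduced here), a few well-known cognate sets can illustrate the idea; the particular words and languages below are my own illustrative choice, not the article's table:

```python
# A small illustrative sample of Germanic cognate sets, of the kind such a
# comparison table shows. Forms are from modern standard languages; as the
# text notes, meanings are not always identical across languages.
cognates = {
    #          German     Dutch     Swedish
    "hand":  ("Hand",   "hand",   "hand"),
    "house": ("Haus",   "huis",   "hus"),
    "book":  ("Buch",   "boek",   "bok"),
    "water": ("Wasser", "water",  "vatten"),
}

for english, (german, dutch, swedish) in cognates.items():
    print(f"{english:6} {german:8} {dutch:8} {swedish}")
```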
See also
* List of Germanic languages
* Language families and languages
* List of Germanic and Latinate equivalents
* Germanization
* Anglicization
* Germanic name
* Germanic verb and its various subordinated articles
* Germanic placename etymology
* German name
* German placename etymology
* Isogloss
* South Germanic languages

Footnotes

Notes

Sources
* Germanic languages in general
* Proto-Germanic
* Gothic
* Old Norse
* Old English
* Old High German

External links
* Germanic Lexicon Project
* 'Hover & Hear' pronunciations of the same Germanic words in dozens of Germanic languages and 'dialects', including English accents, and compare instantaneously side by side
* Bibliographie der Schreibsprachen: Bibliography of medieval written forms of High and Low German and Dutch
* Swadesh lists of Germanic basic vocabulary words (from Wiktionary's Swadesh-list appendix)
* Germanic languages fragments—YouTube (14:06)
German language
German (, ) is a West Germanic language mainly spoken in Central Europe. It is the most widely spoken and official or co-official language in Germany, Austria, Switzerland, Liechtenstein, and the Italian province of South Tyrol. It is also a co-official language of Luxembourg and Belgium, as well as a national language in Namibia. German is most similar to other languages within the West Germanic language branch, including Afrikaans, Dutch, English, the Frisian languages, Low German, Luxembourgish, Scots, and Yiddish. It also contains close similarities in vocabulary to some languages in the North Germanic group, such as Danish, Norwegian, and Swedish. German is the second most widely spoken Germanic language after English. German is one of the major languages of the world. It is the most spoken native language within the European Union. German is also widely taught as a foreign language, especially in continental Europe, where it is the third most taught foreign language (after English and French), and in the United States. The language has been influential in the fields of philosophy, theology, science, and technology. It is the second most commonly used scientific language and among the most widely used languages on websites. The German-speaking countries are ranked fifth in terms of annual publication of new books, with one-tenth of all books (including e-books) in the world being published in German. German is an inflected language, with four cases for nouns, pronouns, and adjectives (nominative, accusative, genitive, dative); three genders (masculine, feminine, neuter); and two numbers (singular, plural). It has strong and weak verbs. The majority of its vocabulary derives from the ancient Germanic branch of the Indo-European language family, while a smaller share is partly derived from Latin and Greek, along with fewer words borrowed from French and Modern English.
German is a pluricentric language; the three standardized variants are German, Austrian, and Swiss Standard High German. It is also notable for its broad spectrum of dialects, with many varieties existing in Europe and other parts of the world. Some of these non-standard varieties have become recognized and protected by regional or national governments.

Classification

Modern Standard German is a West Germanic language in the Germanic branch of the Indo-European languages. The Germanic languages are traditionally subdivided into three branches, North Germanic, East Germanic, and West Germanic. The first of these branches survives in modern Danish, Swedish, Norwegian, Faroese, and Icelandic, all of which are descended from Old Norse. The East Germanic languages are now extinct, and Gothic is the only language in this branch which survives in written texts. The West Germanic languages, however, have undergone extensive dialectal subdivision and are now represented in modern languages such as English, German, Dutch, Yiddish, Afrikaans, and others. Within the West Germanic language dialect continuum, the Benrath and Uerdingen lines (running through Düsseldorf-Benrath and Krefeld-Uerdingen, respectively) serve to distinguish the Germanic dialects that were affected by the High German consonant shift (south of Benrath) from those that were not (north of Uerdingen). The various regional dialects spoken south of these lines are grouped as High German dialects, while those spoken to the north comprise the Low German/Low Saxon and Low Franconian dialects. As members of the West Germanic language family, High German, Low German, and Low Franconian have been proposed to be further distinguished historically as Irminonic, Ingvaeonic, and Istvaeonic, respectively. This classification indicates their historical descent from dialects spoken by the Irminones (also known as the Elbe group), Ingvaeones (or North Sea Germanic group), and Istvaeones (or Weser-Rhine group).
Standard German is based on a combination of Thuringian-Upper Saxon and Upper Franconian dialects, which are Central German and Upper German dialects belonging to the High German dialect group. German is therefore closely related to the other languages based on High German dialects, such as Luxembourgish (based on Central Franconian dialects) and Yiddish. Also closely related to Standard German are the Upper German dialects spoken in the southern German-speaking countries, such as Swiss German (Alemannic dialects) and the various Germanic dialects spoken in the French region of Grand Est, such as Alsatian (mainly Alemannic, but also Central- and Upper Franconian dialects) and Lorraine Franconian (Central Franconian). After these High German dialects, standard German is less closely related to languages based on Low Franconian dialects (e.g. Dutch and Afrikaans) and Low German or Low Saxon dialects (spoken in northern Germany and southern Denmark), neither of which underwent the High German consonant shift. As has been noted, the former of these dialect types is Istvaeonic and the latter Ingvaeonic, whereas the High German dialects are all Irminonic; the differences between these languages and standard German are therefore considerable. Also related to German are the Frisian languages—North Frisian (spoken in Nordfriesland), Saterland Frisian (spoken in Saterland), and West Frisian (spoken in Friesland)—as well as the Anglic languages of English and Scots. These Anglo-Frisian dialects did not take part in the High German consonant shift.

History of High German

Old High German

The history of the German language begins with the High German consonant shift during the Migration Period, which separated Old High German dialects from Old Saxon. This sound shift involved a drastic change in the pronunciation of both voiced and voiceless stop consonants (b, d, g, and p, t, k, respectively). The primary effects of the shift were the following:
* Voiceless stops became long (geminated) voiceless fricatives following a vowel;
* Voiceless stops became affricates in word-initial position, or following certain consonants;
* Voiced stops became voiceless in certain phonetic settings.

While there is written evidence of the Old High German language in several Elder Futhark inscriptions from as early as the sixth century AD (such as the Pforzen buckle), the Old High German period is generally seen as beginning with the Abrogans (written c. 765–775), a Latin-German glossary supplying over 3,000 Old High German words with their Latin equivalents. After the Abrogans, the first coherent works written in Old High German appear in the ninth century, chief among them being the Muspilli, the Merseburg Charms, and the Hildebrandslied, and other religious texts (the Georgslied, the Ludwigslied, the Evangelienbuch, and translated hymns and prayers). The Muspilli is a Christian poem written in a Bavarian dialect offering an account of the soul after the Last Judgment, and the Merseburg Charms are transcriptions of spells and charms from the pagan Germanic tradition. Of particular interest to scholars, however, has been the Hildebrandslied, a secular epic poem telling the tale of an estranged father and son unknowingly meeting each other in battle. Linguistically this text is highly interesting due to the mixed use of Old Saxon and Old High German dialects in its composition. The written works of this period stem mainly from the Alamanni, Bavarian, and Thuringian groups, all belonging to the Elbe Germanic group (Irminones), which had settled in what is now southern-central Germany and Austria between the second and sixth centuries during the great migration. In general, the surviving texts of OHG show a wide range of dialectal diversity with very little written uniformity.
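The three effects listed above can be illustrated with familiar English–German cognate pairs, since English did not undergo the shift and preserves the older consonants. The pairs below are a hand-picked illustration of my own, not examples taken from the article:

```python
# English-German cognate pairs illustrating the High German consonant shift.
# English keeps the unshifted Proto-Germanic stops; German shows the results.
shift_examples = {
    # voiceless stop -> geminated fricative after a vowel (p/t/k -> ff/ss/ch)
    "ship":  "Schiff",
    "water": "Wasser",
    "make":  "machen",
    # voiceless stop -> affricate word-initially or after certain consonants
    "two":   "zwei",    # t -> z [ts]
    "apple": "Apfel",   # geminate pp -> pf
    # voiced stop -> voiceless (d -> t)
    "day":   "Tag",
    "door":  "Tür",
}

for english, german in shift_examples.items():
    print(f"{english:6} -> {german}")
```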
The early written tradition of OHG survived mostly through monasteries and scriptoria as local translations of Latin originals; as a result, the surviving texts are written in highly disparate regional dialects and exhibit significant Latin influence, particularly in vocabulary. At this point monasteries, where most written works were produced, were dominated by Latin, and German saw only occasional use in official and ecclesiastical writing. The German language through the OHG period was still predominantly a spoken language, with a wide range of dialects and a much more extensive oral tradition than a written one. Having just emerged from the High German consonant shift, OHG was also a relatively new and volatile language still undergoing a number of phonetic, phonological, morphological, and syntactic changes. The scarcity of written work, instability of the language, and widespread illiteracy of the time explain the lack of standardization up to the end of the OHG period in 1050.

Middle High German

While there is no complete agreement over the dates of the Middle High German (MHG) period, it is generally seen as lasting from 1050 to 1350. This was a period of significant expansion of the geographical territory occupied by Germanic tribes, and consequently of the number of German speakers. Whereas during the Old High German period the Germanic tribes extended only as far east as the Elbe and Saale rivers, the MHG period saw a number of these tribes expanding beyond this eastern boundary into Slavic territory (known as the Ostsiedlung). With the increasing wealth and geographic spread of the Germanic groups came greater use of German in the courts of nobles as the standard language of official proceedings and literature. A clear example of this is the mittelhochdeutsche Dichtersprache employed in the Hohenstaufen court in Swabia as a standardized supra-dialectal written language.
While these efforts were still regionally bound, German began to be used in place of Latin for certain official purposes, leading to a greater need for regularity in written conventions. While the major changes of the MHG period were socio-cultural, High German was still undergoing significant linguistic changes in syntax, phonetics, and morphology as well (e.g. diphthongization of certain vowel sounds: hus (OHG & MHG "house")→haus (regionally in later MHG)→Haus (NHG), and weakening of unstressed short vowels to schwa [ə]: taga (OHG "days")→tage (MHG)). A great wealth of texts survives from the MHG period. Significantly, these texts include a number of impressive secular works, such as the Nibelungenlied, an epic poem telling the story of the dragon-slayer Siegfried ( thirteenth century), and the Iwein, an Arthurian verse poem by Hartmann von Aue ( 1203), lyric poems, and courtly romances such as Parzival and Tristan. Also noteworthy is the Sachsenspiegel, the first book of laws written in Middle Low German ( 1220). The abundance and especially the secular character of the literature of the MHG period demonstrate the beginnings of a standardized written form of German, as well as the desire of poets and authors to be understood by individuals on supra-dialectal terms. The Middle High German period is generally seen as ending when the 1346–53 Black Death decimated Europe's population.

Early New High German

Modern High German begins with the Early New High German (ENHG) period, which the influential German philologist Wilhelm Scherer dates 1350–1650, terminating with the end of the Thirty Years' War. This period saw the further displacement of Latin by German as the primary language of courtly proceedings and, increasingly, of literature in the German states.
While these states were still part of the Holy Roman Empire, and far from any form of unification, the desire for a cohesive written language that would be understandable across the many German-speaking principalities and kingdoms was stronger than ever. As a spoken language German remained highly fractured throughout this period, with a vast number of often mutually incomprehensible regional dialects being spoken throughout the German states; the invention of the printing press 1440 and the publication of Luther's vernacular translation of the Bible in 1534, however, had an immense effect on standardizing German as a supra-dialectal written language. The ENHG period saw the rise of several important cross-regional forms of chancery German, one being gemeine tiutsch, used in the court of the Holy Roman Emperor Maximilian I, and the other being Meißner Deutsch, used in the Electorate of Saxony in the Duchy of Saxe-Wittenberg. Alongside these courtly written standards, the invention of the printing press led to the development of a number of printers' languages (Druckersprachen) aimed at making printed material readable and understandable across as many diverse dialects of German as possible. The greater ease of production and increased availability of written texts brought about increased standardization in the written form of German. One of the central events in the development of ENHG was the publication of Luther's translation of the Bible into High German (the New Testament was published in 1522; the Old Testament was published in parts and completed in 1534). Luther based his translation primarily on the Meißner Deutsch of Saxony, spending much time among the population of Saxony researching the dialect so as to make the work as natural and accessible to German speakers as possible. Copies of Luther's Bible featured a long list of glosses for each region, translating words which were unknown in the region into the regional dialect.
Luther said the following concerning his translation method:

One who would talk German does not ask the Latin how he shall do it; he must ask the mother in the home, the children on the streets, the common man in the market-place and note carefully how they talk, then translate accordingly. They will then understand what is said to them because it is German. When Christ says 'ex abundantia cordis os loquitur,' I would translate, if I followed the papists, aus dem Überflusz des Herzens redet der Mund. But tell me is this talking German? What German understands such stuff? No, the mother in the home and the plain man would say, Wesz das Herz voll ist, des gehet der Mund über.

With Luther's rendering of the Bible in the vernacular, German asserted itself against the dominance of Latin as a legitimate language for courtly, literary, and now ecclesiastical subject-matter. Furthermore, his Bible was ubiquitous in the German states: nearly every household possessed a copy. Nevertheless, even with the influence of Luther's Bible as an unofficial written standard, a widely accepted standard for written German did not appear until the middle of the eighteenth century.

Austrian Empire

German was the language of commerce and government in the Habsburg Empire, which encompassed a large area of Central and Eastern Europe. Until the mid-nineteenth century, it was essentially the language of townspeople throughout most of the Empire. Its use indicated that the speaker was a merchant or someone from an urban area, regardless of nationality. Prague () and Budapest (Buda, ), to name two examples, were gradually Germanized in the years after their incorporation into the Habsburg domain; others, like Pressburg (Pozsony, now Bratislava), were originally settled during the Habsburg period and were primarily German at that time. Prague, Budapest, Bratislava, and cities like Zagreb () or Ljubljana (), contained significant German minorities.
In the eastern provinces of Banat, Bukovina, and Transylvania (), German was the predominant language not only in the larger towns – like (Timișoara), (Sibiu) and (Brașov) – but also in many smaller localities in the surrounding areas.

Standardization

In 1901, the Second Orthographic Conference ended with a complete standardization of the Standard High German language in its written form, and the Duden Handbook was declared its standard definition. The () had established conventions for German pronunciation in theatres three years earlier; however, this was an artificial standard that did not correspond to any traditional spoken dialect. Rather, it was based on the pronunciation of Standard High German in Northern Germany, although it was subsequently often regarded as a general prescriptive norm, despite differing pronunciation traditions, especially in the Upper-German-speaking regions, that still characterise the dialect of the area today, especially the pronunciation of the ending as [ɪk] instead of [ɪç]. In Northern Germany, Standard German was a foreign language to most inhabitants, whose native dialects were subsets of Low German. It was usually encountered only in writing or formal speech; in fact, most of Standard High German was a written language, not identical to any spoken dialect, throughout the German-speaking area until well into the 19th century. Official revisions of some of the rules from 1901 were not issued until the controversial German orthography reform of 1996 was made the official standard by governments of all German-speaking countries. Media and written works are now almost all produced in Standard High German, which is understood in all areas where German is spoken.

Geographical distribution

As a result of the German diaspora, as well as the popularity of German taught as a foreign language, the geographical distribution of German speakers (or "Germanophones") spans all inhabited continents.
However, an exact, global number of native German speakers is complicated by the existence of several varieties whose status as separate "languages" or "dialects" is disputed for political and linguistic reasons, including quantitatively strong varieties like certain forms of Alemannic and Low German. With the inclusion or exclusion of certain varieties, it is estimated that approximately 90–95 million people speak German as a first language, 10–25 million speak it as a second language, and 75–100 million as a foreign language. This would imply the existence of approximately 175–220 million German speakers worldwide.

Europe

About 90 million people, or 16% of the European Union's population, spoke German as their mother tongue, making it the second most widely spoken language on the continent after Russian and the second biggest language in terms of overall speakers (after English), as well as the most spoken native language.

German Sprachraum

The area in central Europe where the majority of the population speaks German as a first language and has German as a (co-)official language is called the "German Sprachraum".
German is the official language of the following countries:
* Germany
* Austria
* 17 cantons of Switzerland
* Liechtenstein

German is a co-official language of the following countries:
* Belgium (as majority language only in the German-speaking Community, which represents 0.7% of the Belgian population)
* Luxembourg, along with French and Luxembourgish
* Switzerland, co-official at the federal level with French, Italian, and Romansh, and at the local level in four cantons: Bern (with French), Fribourg (with French), Grisons (with Italian and Romansh) and Valais (with French)
* Italy: Autonomous Province of South Tyrol (also majority language)

Outside the German Sprachraum

Although expulsions and (forced) assimilation after the two World Wars greatly diminished them, minority communities of mostly bilingual German native speakers exist in areas both adjacent to and detached from the Sprachraum. Within Europe, German is a recognized minority language in the following countries:
* Czech Republic (see also: Germans in the Czech Republic)
* Denmark (see also: North Schleswig Germans)
* Hungary (see also: Germans of Hungary)
* Poland (see also: German minority in Poland; German is an auxiliary and co-official language in 31 communes)
* Romania (see also: Germans of Romania)
* Russia (see also: Germans in Russia)
* Slovakia (see also: Carpathian Germans)

In France, the High German varieties of Alsatian and Moselle Franconian are identified as "regional languages", but the European Charter for Regional or Minority Languages of 1998 has not yet been ratified by the government.

Africa

Namibia

Namibia was a colony of the German Empire from 1884 to 1919. About 30,000 people still speak German as a native tongue today, mostly descendants of German colonial settlers. The period of German colonialism in Namibia also led to the evolution of a Standard German-based pidgin language called "Namibian Black German", which became a second language for parts of the indigenous population.
Although it is nearly extinct today, some older Namibians still have some knowledge of it. German remained a de facto official language of Namibia after the end of German colonial rule alongside English and Afrikaans, and had de jure co-official status from 1984 until its independence from South Africa in 1990. However, the Namibian government perceived Afrikaans and German as symbols of apartheid and colonialism, and decided English would be the sole official language upon independence, stating that it was a "neutral" language as there were virtually no English native speakers in Namibia at that time. German, Afrikaans, and several indigenous languages thus became "national languages" by law, identifying them as elements of the cultural heritage of the nation and ensuring that the state acknowledged and supported their presence in the country. Today, Namibia is considered to be the only German-speaking country outside of the Sprachraum in Europe. German is used in a wide variety of spheres throughout the country, especially in business, tourism, and public signage, as well as in education, churches (most notably the German-speaking Evangelical Lutheran Church in Namibia (GELK)), other cultural spheres such as music, and media (such as German language radio programs by the Namibian Broadcasting Corporation). The is one of the three biggest newspapers in Namibia and the only German-language daily in Africa.

South Africa

An estimated 12,000 people speak German or a German variety as a first language in South Africa, mostly originating from different waves of immigration during the 19th and 20th centuries. One of the largest communities consists of the speakers of "Nataler Deutsch", a variety of Low German concentrated in and around Wartburg. The South African constitution identifies German as a "commonly used" language and the Pan South African Language Board is obligated to promote and ensure respect for it.
North America

In the United States, German is the fifth most spoken language in terms of native and second language speakers after English, Spanish, French, and Chinese (with figures for Cantonese and Mandarin combined), with over 1 million total speakers. In the states of North Dakota and South Dakota, German is the most common language spoken at home after English. As a legacy of significant German immigration to the country, German geographical names can be found throughout the Midwest region, such as New Ulm and Bismarck (North Dakota's state capital). A number of German varieties have developed in the country and are still spoken today, such as Pennsylvania German and Texas German.

South America

In Brazil, the largest concentrations of German speakers are in the states of Rio Grande do Sul (where Riograndenser Hunsrückisch developed), Santa Catarina, and Espírito Santo. German dialects (namely Hunsrik and East Pomeranian) are recognized languages in the following municipalities in Brazil:
* Espírito Santo (statewide cultural language): Domingos Martins, Laranja da Terra, Pancas, Santa Maria de Jetibá, Vila Pavão
* Rio Grande do Sul (Riograndenser Hunsrückisch German is a designated cultural language in the state): Santa Maria do Herval, Canguçu
* Santa Catarina: Antônio Carlos, Pomerode (standard German recognized)

Small concentrations of German-speakers and their descendants are also found in Argentina, Chile, Paraguay, Venezuela, and Bolivia.

Oceania

In Australia, the state of South Australia experienced a pronounced wave of Prussian immigration in the 1840s (particularly from the Silesia region). With the prolonged isolation from other German speakers and contact with Australian English, a unique dialect known as Barossa German developed, spoken predominantly in the Barossa Valley near Adelaide. Usage of German sharply declined with the advent of World War I, due to the prevailing anti-German sentiment in the population and related government action.
It continued to be used as a first language into the 20th century, but its use is now limited to a few older speakers. As of the 2013 census, 36,642 people in New Zealand spoke German, mostly descendants of a small wave of 19th-century German immigrants, making it the third most spoken European language after English and French and overall the ninth most spoken language. A German-based creole was historically spoken in the former German colony of German New Guinea, modern-day Papua New Guinea. It is at high risk of extinction, with only about 100 speakers remaining, and is a topic of interest among linguists seeking to revive interest in the language.

As a foreign language

Like English, French, and Spanish, German has become a standard foreign language throughout the world, especially in the Western world. German ranks second on par with French among the best-known foreign languages in the European Union (EU) after English, as well as in Russia and Turkey. In terms of student numbers across all levels of education, German ranks third in the EU (after English and French) and in the United States (after Spanish and French). In 2020, approximately 15.4 million people were enrolled in learning German across all levels of education worldwide, down from a peak of 20.1 million in 2000. Within the EU, not counting countries where it is an official language, German as a foreign language is most popular in Eastern and Northern Europe, namely the Czech Republic, Croatia, Denmark, the Netherlands, Slovakia, Hungary, Slovenia, Sweden, Poland, and Bosnia and Herzegovina. German was once, and to some extent still is, a lingua franca in those parts of Europe.

Standard High German

The basis of Standard High German developed with the Luther Bible and the chancery language spoken by the Saxon court.
However, there are places where the traditional regional dialects have been replaced by new vernaculars based on Standard High German; that is the case in large stretches of Northern Germany but also in major cities in other parts of the country. Note, however, that colloquial Standard High German differs from the formal written language, especially in grammar and syntax, in which it has been influenced by dialectal speech. Standard High German differs regionally among German-speaking countries in vocabulary and some instances of pronunciation and even grammar and orthography. This variation must not be confused with the variation of local dialects. Even though the regional varieties of Standard High German are only somewhat influenced by the local dialects, they are very distinct. Standard High German is thus considered a pluricentric language. In most regions, speakers use a continuum from more dialectal varieties to more standard varieties depending on the circumstances.

Varieties

In German linguistics, German dialects are distinguished from varieties of Standard High German. The varieties of Standard High German refer to the different local varieties of the pluricentric Standard High German. They differ only slightly in lexicon and phonology. In certain regions, they have replaced the traditional German dialects, especially in Northern Germany. They are:
German Standard German
Austrian Standard German
Swiss Standard German
In the German-speaking parts of Switzerland, mixtures of dialect and standard are very seldom used, and the use of Standard High German is largely restricted to the written language. About 11% of Swiss residents speak Standard High German at home, but this is mainly due to German immigrants. This situation has been called a medial diglossia. Swiss Standard German is used in the Swiss education system, while Austrian German is officially used in the Austrian education system.
Dialects

The German dialects are the traditional local varieties of the language; many of them are not mutually intelligible with Standard German, and they have great differences in lexicon, phonology, and syntax. If a narrow definition of language based on mutual intelligibility is used, many German dialects are considered to be separate languages (for instance in the Ethnologue). However, such a point of view is unusual in German linguistics. The German dialect continuum is traditionally divided most broadly into High German and Low German, also called Low Saxon. However, historically, High German dialects and Low Saxon/Low German dialects do not belong to the same language. Nevertheless, in today's Germany, Low Saxon/Low German is often perceived as a dialectal variation of Standard German on a functional level even by many native speakers. The variation among the German dialects is considerable, with often only neighbouring dialects being mutually intelligible. Some dialects are not intelligible to people who know only Standard German. However, all German dialects belong to the dialect continuum of High German and Low Saxon.

Low German or Low Saxon

Middle Low German was the lingua franca of the Hanseatic League. It was the predominant language in Northern Germany until the 16th century. In 1534, the Luther Bible was published. It aimed to be understandable to a broad audience and was based mainly on Central and Upper German varieties. The Early New High German language gained more prestige than Low German and became the language of science and literature. Around the same time, the Hanseatic League, a confederation of northern ports, lost its importance as new trade routes to Asia and the Americas were established, and the most powerful German states of that period were located in Middle and Southern Germany. The 18th and 19th centuries were marked by mass education in Standard German in schools.
Gradually, Low German came to be politically viewed as a mere dialect spoken by the uneducated. The proportion of the population who can understand and speak it has decreased continuously since World War II. The major cities in the Low German area are Hamburg, Hanover, Bremen and Dortmund. Sometimes, Low Saxon and Low Franconian varieties are grouped together because both are unaffected by the High German consonant shift.

Low Franconian

In Germany, Low Franconian dialects are spoken in the northwest of North Rhine-Westphalia, along the Lower Rhine. The Low Franconian dialects spoken in Germany are referred to as Low Rhenish. In the north of the German Low Franconian language area, North Low Franconian dialects (also referred to as Cleverlands or as dialects of South Guelderish) are spoken. The South Low Franconian and Bergish dialects, which are spoken in the south of the German Low Franconian language area, are transitional dialects between Low Franconian and Ripuarian dialects. The Low Franconian dialects fall within a linguistic category used to classify a number of historical and contemporary West Germanic varieties most closely related to, and including, the Dutch language. Consequently, the vast majority of the Low Franconian dialects are spoken outside of the German language area, in the Netherlands and Belgium. During the Middle Ages and the Early Modern Period, the Low Franconian dialects now spoken in Germany used Middle Dutch or Early Modern Dutch as their literary language and Dachsprache. Following a 19th-century change in Prussian language policy, the use of Dutch as an official and public language was forbidden, resulting in Standard German taking its place as the region's official language. As a result, these dialects are now considered German dialects from a sociolinguistic point of view.
Nevertheless, typologically these dialects are structurally and phonologically far more similar to Dutch than to German, and they form both the smallest and most divergent dialect cluster within the contemporary German language area.

High German

The High German dialects consist of the Central German, High Franconian and Upper German dialects. The High Franconian dialects are transitional dialects between Central and Upper German. The High German varieties spoken by the Ashkenazi Jews have several unique features and are considered a separate language, Yiddish, written with the Hebrew alphabet.

Central German

The Central German dialects are spoken in Central Germany, from Aachen in the west to Görlitz in the east. They consist of Franconian dialects in the west (West Central German) and non-Franconian dialects in the east (East Central German). Modern Standard German is mostly based on Central German dialects. The Franconian, West Central German dialects are the Central Franconian dialects (Ripuarian and Moselle Franconian) and the Rhine Franconian dialects (Hessian and Palatine). These dialects are considered as
German in Germany and Belgium
Luxembourgish in Luxembourg
Lorraine Franconian (spoken in Moselle) and as a Rhine Franconian variant of Alsatian (spoken in Alsace bossue only) in France
Limburgish or Kerkrade dialect in the Netherlands.
Luxembourgish as well as the Transylvanian Saxon dialect spoken in Transylvania are based on Moselle Franconian dialects. The major cities in the Franconian Central German area are Cologne and Frankfurt. Further east, the non-Franconian, East Central German dialects are spoken (Thuringian, Upper Saxon and North Upper Saxon-South Markish, and earlier, in the then German-speaking parts of Silesia, also Silesian, and in then German southern East Prussia, also High Prussian). The major cities in the East Central German area are Berlin and Leipzig.
High Franconian

The High Franconian dialects are transitional dialects between Central and Upper German. They consist of the East and South Franconian dialects. The East Franconian dialect branch is one of the most spoken dialect branches in Germany. These dialects are spoken in the region of Franconia and in the central parts of Saxon Vogtland. Franconia consists of the Bavarian districts of Upper, Middle, and Lower Franconia, the region of South Thuringia (Thuringia), and the eastern parts of the region of Heilbronn-Franken (Tauber Franconia and Hohenlohe) in Baden-Württemberg. The major cities in the East Franconian area are Nuremberg and Würzburg. South Franconian is mainly spoken in northern Baden-Württemberg in Germany, but also in the northeasternmost part of the region of Alsace in France. In Baden-Württemberg, they are considered as dialects of German. The major cities in the South Franconian area are Karlsruhe and Heilbronn.

Upper German

The Upper German dialects are the Alemannic and Swabian dialects in the west and the Bavarian dialects in the east.

Alemannic and Swabian

Alemannic dialects are spoken in Switzerland (High Alemannic in the densely populated Swiss Plateau, in the south also Highest Alemannic, and Low Alemannic in Basel), Baden-Württemberg (Swabian and Low Alemannic, in the southwest also High Alemannic), Bavarian Swabia (Swabian, in the southwesternmost part also Low Alemannic), Vorarlberg (Low, High, and Highest Alemannic), Alsace (Low Alemannic, in the southernmost part also High Alemannic), Liechtenstein (High and Highest Alemannic), and in the Tyrolean district of Reutte (Swabian). The Alemannic dialects are considered as Alsatian in Alsace. The major cities in the Alemannic area are Stuttgart, Freiburg, Basel, Zürich, Lucerne and Bern.
Bavarian

Bavarian dialects are spoken in Austria (Vienna, Lower and Upper Austria, Styria, Carinthia, Salzburg, Burgenland, and in most parts of Tyrol), Bavaria (Upper and Lower Bavaria as well as Upper Palatinate), South Tyrol, southwesternmost Saxony (Southern Vogtländisch), and in the Swiss village of Samnaun. The major cities in the Bavarian area are Vienna, Munich, Salzburg, Regensburg, Graz and Bolzano.

Regiolects

Berlinian, the High German regiolect or dialect of Berlin with Low German substrate
Missingsch, a Low-German-coloured variety of High German
Ruhrdeutsch (Ruhr German), the High German regiolect of the Ruhr area

Grammar

German is a fusional language with a moderate degree of inflection, with three grammatical genders; as such, there can be a large number of words derived from the same root.

Noun inflection

German nouns inflect by case, gender, and number:
four cases: nominative, accusative, genitive, and dative.
three genders: masculine, feminine, and neuter. Word endings sometimes reveal grammatical gender: for instance, nouns ending in -ung (-ing), -schaft (-ship), or -heit/-keit (-hood, -ness) are feminine; nouns ending in -chen or -lein (diminutive forms) are neuter; and nouns ending in -ismus (-ism) are masculine. Others are more variable, sometimes depending on the region in which the language is spoken. And some endings are not restricted to one gender, for example -er, as in die Feier (feminine; celebration, party), der Arbeiter (masculine; labourer), and das Gewitter (neuter; thunderstorm).
two numbers: singular and plural.
This degree of inflection is considerably less than in Old High German and other old Indo-European languages such as Latin, Ancient Greek, and Sanskrit, and it is also somewhat less than, for instance, Old English, modern Icelandic, or Russian. The three genders have collapsed in the plural.
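The case and gender system described here can be made concrete with the paradigm of the definite article. The sketch below uses standard textbook forms (supplied for illustration, not taken from this article) and shows how a 4 × 4 grid of case and gender/number is covered by only six distinct surface forms:

```python
# Definite-article paradigm: 4 cases x (3 genders + plural) = 16 cells.
# Forms are standard textbook German, supplied here for illustration.
ARTICLES = {
    "nominative": {"masc": "der", "fem": "die", "neut": "das", "plural": "die"},
    "accusative": {"masc": "den", "fem": "die", "neut": "das", "plural": "die"},
    "dative":     {"masc": "dem", "fem": "der", "neut": "dem", "plural": "den"},
    "genitive":   {"masc": "des", "fem": "der", "neut": "des", "plural": "der"},
}

# Flatten the grid and count the distinct surface forms.
cells = [form for row in ARTICLES.values() for form in row.values()]
print(len(cells), sorted(set(cells)))
# 16 cells, but only six distinct forms: das, dem, den, der, des, die
```

Note how much of the case information is carried by the article rather than the noun itself, which is why noun endings can be so sparse.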
With four cases and three genders plus plural, there are 16 permutations of case and gender/number of the article (not the nouns), but there are only six forms of the definite article, which together cover all 16 permutations. In nouns, inflection for case is required in the singular for strong masculine and neuter nouns only in the genitive and in the dative (only in fixed or archaic expressions), and even this is losing ground to substitutes in informal speech. Weak masculine nouns share a common case ending for genitive, dative, and accusative in the singular. Feminine nouns are not declined in the singular. The plural has an inflection for the dative. In total, seven inflectional endings (not counting plural markers) exist in German. Like the other Germanic languages, German forms noun compounds in which the first noun modifies the category given by the second: Hundehütte ("dog hut"; specifically: "dog kennel"). Unlike English, whose newer compounds or combinations of longer nouns are often written "open" with separating spaces, German (like some other Germanic languages) nearly always uses the "closed" form without spaces, for example: Baumhaus ("tree house"). Like English, German allows arbitrarily long compounds in theory (see also English compounds). The longest German word verified to be actually in (albeit very limited) use is Rindfleischetikettierungsüberwachungsaufgabenübertragungsgesetz, which, literally translated, is "beef labelling supervision duties assignment law" [from Rind (cattle), Fleisch (meat), Etikettierung (labelling), Überwachung (supervision), Aufgaben (duties), Übertragung (assignment), Gesetz (law)]. However, examples like this are perceived by native speakers as excessively bureaucratic, stylistically awkward, or even satirical.

Verb inflection

The inflection of standard German verbs includes:
two main conjugation classes: weak and strong (as in English). Additionally, there is a third class, known as mixed verbs, whose conjugation combines features of both the strong and weak patterns.
three persons: first, second and third.
two numbers: singular and plural.
three moods: indicative, imperative and subjunctive (in addition to infinitive).
two voices: active and passive. The passive voice uses auxiliary verbs and is divisible into static and dynamic. Static forms show a constant state and use the verb "to be" (sein). Dynamic forms show an action and use the verb "to become" (werden).
two tenses without auxiliary verbs (present and preterite) and four tenses constructed with auxiliary verbs (perfect, pluperfect, future and future perfect).
the distinction between grammatical aspects is rendered by combined use of the subjunctive or preterite marking, so the plain indicative voice uses neither of those two markers; the subjunctive by itself often conveys reported speech; subjunctive plus preterite marks the conditional state; and the preterite alone shows either plain indicative (in the past) or functions as a (literal) alternative for either reported speech or the conditional state of the verb, when necessary for clarity.
the distinction between perfect and progressive aspect has, at every stage of development, been a productive category of the older language and of nearly all documented dialects, yet it is now rigorously excluded from written usage in its present normalised form.
disambiguation of completed vs. uncompleted forms is widely observed and regularly generated by common prefixes (blicken [to look], erblicken [to see – unrelated form: sehen]).

Verb prefixes

The meaning of basic verbs can be expanded and sometimes radically changed through the use of a number of prefixes. Some prefixes have a specific meaning; the prefix zer- refers to destruction, as in zerreißen (to tear apart), zerbrechen (to break apart), zerschneiden (to cut apart). Other prefixes have only the vaguest meaning in themselves; ver- is found in a number of verbs with a large variety of meanings, as in versuchen (to try) from suchen (to seek), vernehmen (to interrogate) from nehmen (to take), verteilen (to distribute) from teilen (to share), verstehen (to understand) from stehen (to stand).
Other examples include the following: haften (to stick), verhaften (to detain); kaufen (to buy), verkaufen (to sell); hören (to hear), aufhören (to cease); fahren (to drive), erfahren (to experience). Many German verbs have a separable prefix, often with an adverbial function. In finite verb forms, it is split off and moved to the end of the clause and is hence considered by some to be a "resultative particle". For example, mitgehen, meaning "to go along", would be split, giving Gehst du mit? (literally: "Go you with?"; idiomatically: "Are you going along?"). Indeed, several parenthetical clauses may occur between the prefix of a finite verb and its complement (ankommen = to arrive, er kam an = he arrived, er ist angekommen = he has arrived). A selectively literal translation of such a sentence, to illustrate the point, might look like this: He "came" on Friday evening, after a hard day at work and the usual annoyances that had time and again been troubling him for years now at his workplace, with questionable joy, to a meal which, as he hoped, his wife had already put on the table, finally home "to".

Word order

German word order generally follows the V2 restriction in main clauses, with an underlying SOV order. For yes-no questions, exclamations, and wishes, the finite verb always has the first position. In subordinate clauses, the verb occurs at the very end. German requires a verbal element (main verb or auxiliary verb) to appear second in the sentence. The verb is preceded by the topic of the sentence. The element in focus appears at the end of the sentence.
For a sentence without an auxiliary, there are several possibilities:
(The old man gave me yesterday the book; normal order)
(The book gave [to] me yesterday the old man)
(The book gave the old man [to] me yesterday)
(The book gave [to] me the old man yesterday)
(Yesterday gave [to] me the old man the book; normal order)
([To] me gave the old man the book yesterday (entailing: as for someone else, it was another date))
The position of a noun in a German sentence has no bearing on its being a subject, an object or another argument. In a declarative sentence in English, if the subject does not occur before the predicate, the sentence could well be misunderstood. However, German's flexible word order allows one to emphasise specific words:
Normal word order: The manager entered yesterday at 10 o'clock with an umbrella in the hand his office.
Object in front: His office entered the manager yesterday at 10 o'clock with an umbrella in the hand. The object (his office) is thus highlighted; it could be the topic of the next sentence.
Adverb of time in front: Yesterday entered the manager at 10 o'clock with an umbrella in the hand his office. (but today without umbrella)
Both time expressions in front: Yesterday at 10 o'clock entered the manager with an umbrella in the hand his office. The full time specification is highlighted.
Another possibility: Yesterday at 10 o'clock entered the manager his office with an umbrella in the hand. Both the time specification and the fact he carried an umbrella are accentuated.
Swapped adverbs: The manager entered with an umbrella in the hand yesterday at 10 o'clock his office. The phrase "with an umbrella in the hand" is highlighted.
Swapped object: The manager entered yesterday at 10 o'clock his office with an umbrella in the hand. The time specification and the object (his office) are lightly accentuated.
The flexible word order also allows one to use language "tools" (such as poetic meter and figures of speech) more freely.
Auxiliary verbs

When an auxiliary verb is present, it appears in second position, and the main verb appears at the end. This occurs notably in the creation of the perfect tense. Many word orders are still possible:
(The old man has me today the book given.)
(The book has the old man me today given.)
(Today has the old man me the book given.)
The main verb may appear in first position to put stress on the action itself. The auxiliary verb is still in second position.
(Given has me the old man the book today.) The bare fact that the book has been given is emphasized, as well as 'today'.

Modal verbs

Sentences using modal verbs place the infinitive at the end. For example, the English sentence "Should he go home?" would be rearranged in German to say "Should he (to) home go?" (Soll er nach Hause gehen?). Thus, in sentences with several subordinate or relative clauses, the infinitives are clustered at the end. Compare the similar clustering of prepositions in the following (highly contrived) English sentence: "What did you bring that book that I do not like to be read to out of up for?"

Multiple infinitives

German subordinate clauses have all verbs clustered at the end. Given that auxiliaries encode future, passive, modality, and the perfect, very long chains of verbs at the end of the sentence can occur. In these constructions, the past participle formed with ge- is often replaced by the infinitive.
One suspects that the deserter probably shot become be should. ("It is suspected that the deserter probably had been shot")
He knew not that the agent a picklock had make let / He knew not that the agent a picklock make let had ("He did not know that the agent had had a picklock made")
The order at the end of such strings is subject to variation, but the second one in the last example is unusual.

Vocabulary

Most German vocabulary is derived from the Germanic branch of the Indo-European language family.
However, there is a significant number of loanwords from other languages, in particular Latin, Greek, Italian, French, and most recently English. In the early 19th century, Joachim Heinrich Campe estimated that one fifth of the total German vocabulary was of French or Latin origin. Latin words were already imported into the predecessor of the German language during the Roman Empire and underwent all the characteristic phonetic changes in German. Their origin is thus no longer recognizable for most speakers (e.g. Pforte, Tafel, Mauer, Käse, Köln from Latin porta, tabula, murus, caseus, Colonia). Borrowing from Latin continued after the fall of the Roman Empire during Christianisation, mediated by the church and monasteries. Another important influx of Latin words can be observed during Renaissance humanism. In a scholarly context, the borrowings from Latin have continued until today, in the last few decades often indirectly through borrowings from English. During the 15th to 17th centuries, the influence of Italian was great, leading to many Italian loanwords in the fields of architecture, finance and music. The influence of the French language in the 17th to 19th centuries resulted in an even greater import of French words. The English influence was already present in the 19th century, but it did not become dominant until the second half of the 20th century. German has, however, long formed native equivalents for foreign words: Notker Labeo was able to translate Aristotelian treatises into pure (Old High) German in the decades after the year 1000. The tradition of loan translation was revitalized in the 17th and 18th centuries by poets like Philipp von Zesen and linguists like Joachim Heinrich Campe, who introduced close to 300 words that are still used in modern German. Even today, there are movements that promote the substitution of foreign words deemed unnecessary with German alternatives. As in English, there are many pairs of synonyms due to the enrichment of the Germanic vocabulary with loanwords from Latin and Latinized Greek.
These words often have different connotations from their Germanic counterparts and are usually perceived as more scholarly; such Latinate–Germanic synonym pairs exist, for example, for "history, historical", "humaneness, humane", "millennium", "perception", "vocabulary", "dictionary, wordbook", and "to try". The size of the vocabulary of German is difficult to estimate. The Deutsches Wörterbuch (German Dictionary), initiated by the Brothers Grimm (Jacob and Wilhelm Grimm) and the most comprehensive guide to the vocabulary of the German language, already contained over 330,000 headwords in its first edition. The modern German scientific vocabulary is estimated at nine million words and word groups (based on the analysis of 35 million sentences of a corpus in Leipzig, which as of July 2003 included 500 million words in total). The Duden is the de facto official dictionary of the Standard High German language, first published by Konrad Duden in 1880. The Duden is updated regularly, with new editions appearing every four or five years. As of 2017, it was in its 27th edition and in 12 volumes, each covering different aspects such as loanwords, etymology, pronunciation, synonyms, and so forth. The first of these volumes, Die deutsche Rechtschreibung (German Orthography), has long been the prescriptive source for the spelling of German. The Duden has become the bible of the German language, being the definitive set of rules regarding grammar, spelling, and usage of German. The Österreichisches Wörterbuch ("Austrian Dictionary") is the official dictionary of the German language in the Republic of Austria. It is edited by a group of linguists under the authority of the Austrian Federal Ministry of Education, Arts and Culture. It is the Austrian counterpart to the German Duden and contains a number of terms unique to Austrian German or more frequently used or differently pronounced there. A considerable amount of this "Austrian" vocabulary is also common in Southern Germany, especially Bavaria, and some of it is used in Switzerland as well.
Since the 39th edition in 2001, the orthography of the Austrian dictionary has been adjusted to the German spelling reform of 1996. The dictionary is also officially used in the Italian province of South Tyrol.

Orthography

Written texts in German are easily recognisable as such by distinguishing features such as umlauts and certain orthographical features – German is the only major language that capitalizes all nouns, a relic of a widespread practice in Northern Europe in the early modern era (including English for a while, in the 1700s) – and the frequent occurrence of long compounds. Because legibility and convenience set certain boundaries, compounds consisting of more than three or four nouns are almost exclusively found in humorous contexts. (In contrast, although English can also string nouns together, it usually separates the nouns with spaces. For example, "toilet bowl cleaner".) In German orthography, nouns are capitalised, which makes it easier for readers to determine the function of a word within a sentence. This convention is almost unique to German today (shared perhaps only by the closely related Luxembourgish language and several insular dialects of the North Frisian language), but it was historically common in other languages such as Danish (which abolished the capitalization of nouns in 1948) and English.

Present

Before the German orthography reform of 1996, ß replaced ss after long vowels and diphthongs and before consonants, word-, or partial-word endings. In reformed spelling, ß replaces ss only after long vowels and diphthongs. Since there is no traditional capital form of ß, it was replaced by SS (or SZ) when capitalization was required. For example, Maßband (tape measure) became MASSBAND in capitals. An exception was the use of ß in legal documents and forms when capitalizing names. To avoid confusion with similar names, lower case ß was sometimes maintained (thus "KREßLEIN" instead of "KRESSLEIN").
Capital ß (ẞ) was ultimately adopted into German orthography in 2017, ending a long orthographic debate. Umlaut vowels (ä, ö, ü) are commonly transcribed with ae, oe, and ue if the umlauts are not available on the keyboard or other medium used. In the same manner, ß can be transcribed as ss. Some operating systems use key sequences to extend the set of possible characters to include, amongst other things, umlauts; in Microsoft Windows this is done using Alt codes. German readers understand these transcriptions (although they appear unusual), but they are avoided if the regular umlauts are available, because they are a makeshift and not proper spelling. (In Westphalia and Schleswig-Holstein, city and family names exist where the extra e has a vowel-lengthening effect, e.g. Raesfeld, Coesfeld and Itzehoe, but this use of the letter e after a/o/u does not occur in the present-day spelling of words other than proper nouns.) There is no general agreement on where letters with umlauts occur in the sorting sequence. Telephone directories treat them by replacing them with the base vowel followed by an e. Some dictionaries sort each umlauted vowel as a separate letter after the base vowel, but more commonly words with umlauts are ordered immediately after the same word without umlauts. For example, in a telephone book a word with Ä is filed as if it were spelled Ae and sorted among the Ae- entries, whereas in a dictionary it usually comes immediately after its counterpart spelled with plain A; in some dictionaries, however, all words starting with Ä may occur after all words starting with A. In some older dictionaries or indexes, initial Sch and St are treated as separate letters and are listed as separate entries after S, but they are usually treated as S+C+H and S+T. Written German also typically uses an alternative opening inverted comma (quotation mark) at the bottom of the line: „…“.
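The fallback transcription and the telephone-directory sorting rule described above can be sketched as follows (the function names are illustrative, not from any standard library):

```python
# Fallback transcription used when umlauts/ß are unavailable:
# ä→ae, ö→oe, ü→ue, ß→ss (uppercase umlauts handled likewise).
FALLBACK = str.maketrans({
    "ä": "ae", "ö": "oe", "ü": "ue",
    "Ä": "Ae", "Ö": "Oe", "Ü": "Ue",
    "ß": "ss",
})

def transcribe(text: str) -> str:
    """ASCII-fallback spelling of a German word."""
    return text.translate(FALLBACK)

def phonebook_key(word: str) -> str:
    """Telephone directories sort umlauted vowels as base vowel + e."""
    return transcribe(word).lower()

print(transcribe("Maßband"))  # Massband
print(sorted(["Arzt", "Ärzte", "Buch"], key=phonebook_key))
# "Ärzte" sorts as "aerzte", i.e. before "Arzt"
```

A dictionary-style sort, by contrast, would key each word on its base vowels alone (ä treated as a), which is why the two orderings differ.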
Past

Until the early 20th century, German was printed in blackletter typefaces (in Fraktur, and in Schwabacher), and written in corresponding handwriting (for example Kurrent and Sütterlin). These variants of the Latin alphabet are very different from the serif or sans-serif Antiqua typefaces used today, and the handwritten forms in particular are difficult for the untrained to read. The Nazis initially promoted Fraktur and Schwabacher because they were considered Aryan, but they abolished them in 1941, claiming that these letters were Jewish. It is believed that the Nazi régime banned this script because it realized that Fraktur would inhibit communication in the territories occupied during World War II. The Fraktur script, however, remains present in everyday life in pub signs, beer brands and other forms of advertisement, where it is used to convey a certain rusticality and antiquity. A proper use of the long s (langes s), ſ, is essential for writing German text in Fraktur typefaces. Many Antiqua typefaces also include the long s. A specific set of rules applies for the use of long s in German text, but nowadays it is rarely used in Antiqua typesetting. Any lower case "s" at the beginning of a syllable would be a long s, as opposed to a terminal s or short s (the more common variation of the letter s), which marks the end of a syllable; for example, in differentiating between the words Wachſtube (guard-house) and Wachstube (tube of polish/wax). One can easily decide which "s" to use by appropriate hyphenation (Wach-Stube vs. Wachs-Tube). The long s only appears in lower case.

Consonant shifts

German does not have any dental fricatives (as English th). The th sound, which the English language still has, disappeared on the continent in German with the consonant shifts between the 8th and 10th centuries.
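The loss of th shows up in systematic correspondences between English and German cognates. The small check below is illustrative only; the German forms are supplied from general knowledge of the standard cognate pairs, not from this article:

```python
# English th frequently corresponds to German d; English gh to German ch.
# Pairs are well-known cognates, listed here purely for illustration.
TH_TO_D = {"thank": "Dank", "think": "denken", "thirsty": "durstig"}
GH_TO_CH = {"through": "durch", "high": "hoch", "daughter": "Tochter",
            "laugh": "lachen", "neighbour": "Nachbar"}

# Verify the correspondence holds in every sample pair.
for english, german in TH_TO_D.items():
    assert "th" in english and "d" in german.lower()
for english, german in GH_TO_CH.items():
    assert "gh" in english and "ch" in german.lower()
print("th→d and gh→ch hold for all sample pairs")
```

The rule is of course only a tendency: it helps spot cognates but cannot be applied mechanically to coin German words.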
It is sometimes possible to find parallels between English and German by replacing the English th with d in German: "thank" → Dank, "this" and "that" → dies and das, "thou" (old 2nd person singular pronoun) → du, "think" → denken, "thirsty" → durstig, and many other examples. Likewise, the gh in Germanic English words, pronounced in several different ways in modern English (as an f or not at all), can often be linked to German ch: "to laugh" → lachen, "through" → durch, "high" → hoch, "naught" → nichts, "light" → leicht or Licht, "sight" → Sicht, "daughter" → Tochter, "neighbour" → Nachbar.

Literature

The German language is used in German literature and can be traced back to the Middle Ages, with the most notable authors of the period being Walther von der Vogelweide and Wolfram von Eschenbach. The Nibelungenlied, whose author remains unknown, is also an important work of the epoch. The fairy tales collected and published by Jacob and Wilhelm Grimm in the 19th century became famous throughout the world. Reformer and theologian Martin Luther, who translated the Bible into High German, is widely credited for having set the basis for the modern Standard High German language. Among the best-known poets and authors in German are Lessing, Goethe, Schiller, Kleist, Hoffmann, Brecht, Heine and Kafka. Fourteen German-speaking people have won the Nobel Prize in Literature: Theodor Mommsen, Rudolf Christoph Eucken, Paul von Heyse, Gerhart Hauptmann, Carl Spitteler, Thomas Mann, Nelly Sachs, Hermann Hesse, Heinrich Böll, Elias Canetti, Günter Grass, Elfriede Jelinek, Herta Müller and Peter Handke, making it the second most awarded linguistic region (together with French) after English.
See also Outline of German language Denglisch Deutsch (disambiguation) German family name etymology German toponymy Germanism (linguistics) List of German exonyms List of German expressions in English List of German words of French origin List of pseudo-German words adapted to English List of terms used for Germans List of territorial entities where German is an official language Names for the German language DDR German Notes References Bibliography External links Dissemination of the German language in Europe around 1913 (map, 300 dpi) Fusional languages High German languages Languages of Austria Languages of Belgium Languages of Germany Languages of Liechtenstein Languages of Luxembourg Languages of Namibia Languages of Switzerland Languages of Trentino-Alto Adige/Südtirol Stress-timed languages Verb-second languages
11887
https://en.wikipedia.org/wiki/Greek%20language
Greek language
Greek (; ) is an independent branch of the Indo-European family of languages, native to Greece, Cyprus, Albania, and other regions of the Balkans, the Black Sea coast, and the Eastern Mediterranean. It has the longest documented history of any Indo-European language, spanning at least 3,400 years of written records. Its writing system is the Greek alphabet, which has been used for approximately 2,800 years; previously, Greek was recorded in writing systems such as Linear B and the Cypriot syllabary. The alphabet arose from the Phoenician script and was in turn the basis of the Latin, Cyrillic, Armenian, Coptic, Gothic, and many other writing systems. The Greek language holds an important place in the history of the Western world. Beginning with the epics of Homer, ancient Greek literature includes many works of lasting importance in the European canon. Greek is also the language in which many of the foundational texts in science and philosophy were originally composed. The New Testament of the Christian Bible was also originally written in Greek. Together with the Latin texts and traditions of the Roman world, the Greek texts and Greek societies of antiquity constitute the objects of study of the discipline of Classics. During antiquity, Greek was by far the most widely spoken lingua franca in the Mediterranean world. It eventually became the official language of the Byzantine Empire and developed into Medieval Greek. In its modern form, Greek is the official language of Greece and Cyprus and one of the 24 official languages of the European Union. It is spoken by at least 13.5 million people today in Greece, Cyprus, Italy, Albania, Turkey, and the many other countries of the Greek diaspora. Greek roots have been widely used for centuries and continue to be widely used to coin new words in other languages; Greek and Latin are the predominant sources of international scientific vocabulary. 
History Greek has been spoken in the Balkan peninsula since around the 3rd millennium BC, or possibly earlier. The earliest written evidence is a Linear B clay tablet found in Messenia that dates to between 1450 and 1350 BC, making Greek the world's oldest recorded living language. Among the Indo-European languages, its date of earliest written attestation is matched only by the now-extinct Anatolian languages. Periods The Greek language is conventionally divided into the following periods: Proto-Greek: the unrecorded but assumed last ancestor of all known varieties of Greek. The unity of Proto-Greek would have ended as Hellenic migrants entered the Greek peninsula sometime in the Neolithic era or the Bronze Age. Mycenaean Greek: the language of the Mycenaean civilization. It is recorded in the Linear B script on tablets dating from the 15th century BC onwards. Ancient Greek: in its various dialects, the language of the Archaic and Classical periods of the ancient Greek civilization. It was widely known throughout the Roman Empire. Ancient Greek fell into disuse in western Europe in the Middle Ages, but remained officially in use in the Byzantine world and was reintroduced to the rest of Europe with the Fall of Constantinople and Greek migration to western Europe. Koine Greek: The fusion of Ionian with Attic, the dialect of Athens, began the process that resulted in the creation of the first common Greek dialect, which became a lingua franca across the Eastern Mediterranean and Near East. Koine Greek can be initially traced within the armies and conquered territories of Alexander the Great and after the Hellenistic colonization of the known world, it was spoken from Egypt to the fringes of India. After the Roman conquest of Greece, an unofficial bilingualism of Greek and Latin was established in the city of Rome and Koine Greek became a first or second language in the Roman Empire. 
The origin of Christianity can also be traced through Koine Greek, because the Apostles used this form of the language to spread Christianity. It is also known as Hellenistic Greek, New Testament Greek, and sometimes Biblical Greek because it was the original language of the New Testament and the Old Testament was translated into the same language via the Septuagint. Medieval Greek, also known as Byzantine Greek: the continuation of Koine Greek, up to the demise of the Byzantine Empire in the 15th century. Medieval Greek is a cover phrase for a whole continuum of different speech and writing styles, ranging from vernacular continuations of spoken Koine that were already approaching Modern Greek in many respects, to highly learned forms imitating classical Attic. Much of the written Greek that was used as the official language of the Byzantine Empire was an eclectic middle-ground variety based on the tradition of written Koine. Modern Greek (Neo-Hellenic): Stemming from Medieval Greek, Modern Greek usages can be traced to the Byzantine period, as early as the 11th century. It is the language used by the modern Greeks, and, apart from Standard Modern Greek, there are several dialects of it. Diglossia In the modern era, the Greek language entered a state of diglossia: the coexistence of vernacular and archaizing written forms of the language. What came to be known as the Greek language question was a polarization between two competing varieties of Modern Greek: Dimotiki, the vernacular form of Modern Greek proper, and Katharevousa, meaning 'purified', a compromise between Dimotiki and Ancient Greek, which was developed in the early 19th century, and was used for literary and official purposes in the newly formed Greek state. In 1976, Dimotiki was declared the official language of Greece, having incorporated features of Katharevousa and giving birth to Standard Modern Greek, which is used today for all official purposes and in education. 
Historical unity The historical unity and continuing identity between the various stages of the Greek language are often emphasized. Although Greek has undergone morphological and phonological changes comparable to those seen in other languages, never since classical antiquity has its cultural, literary, and orthographic tradition been interrupted to the extent that one can speak of a new language emerging. Greek speakers today still tend to regard literary works of ancient Greek as part of their own rather than a foreign language. It is also often stated that the historical changes have been relatively slight compared with some other languages. According to one estimation, "Homeric Greek is probably closer to Demotic than 12th-century Middle English is to modern spoken English". Geographic distribution Greek is spoken today by at least 13 million people, principally in Greece and Cyprus along with a sizable Greek-speaking minority in Albania near the Greek-Albanian border. A significant percentage of Albania's population has some basic knowledge of the Greek language due in part to the Albanian wave of immigration to Greece in the 1980s and '90s. Prior to the Greco-Turkish War and the resulting population exchange in 1923 a very large population of Greek-speakers also existed in Turkey, though very few remain today. A small Greek-speaking community is also found in Bulgaria near the Greek-Bulgarian border. Greek is also spoken worldwide by the sizable Greek diaspora which has notable communities in the United States, Australia, Canada, South Africa, Chile, Brazil, Argentina, Russia, Ukraine, the United Kingdom, and throughout the European Union, especially in Germany. 
Historically, significant Greek-speaking communities and regions were found throughout the Eastern Mediterranean, in what are today Southern Italy, Turkey, Cyprus, Syria, Lebanon, Palestine, Israel, Egypt, and Libya; in the area of the Black Sea, in what are today Turkey, Bulgaria, Romania, Ukraine, Russia, Georgia, Armenia, and Azerbaijan; and, to a lesser extent, in the Western Mediterranean in and around colonies such as Massalia, Monoikos, and Mainake. It was also used as a liturgical language in the Christian Nubian kingdom of Makuria, in modern-day Sudan. Official status Greek, in its modern form, is the official language of Greece, where it is spoken by almost the entire population. It is also the official language of Cyprus (nominally alongside Turkish). Because of the membership of Greece and Cyprus in the European Union, Greek is one of the organization's 24 official languages. Furthermore, Greek is recognized as an official language in Dropull and Himara (Albania), and as a minority language all over Albania. It is also recognized as an official minority language in the regions of Apulia and Calabria in Italy. In the framework of the European Charter for Regional or Minority Languages, Greek is protected and promoted officially as a regional and minority language in Armenia, Hungary, Romania, and Ukraine. Characteristics The phonology, morphology, syntax, and vocabulary of the language show both conservative and innovative tendencies across the entire attestation of the language from the ancient to the modern period. The division into conventional periods is, as with all such periodizations, relatively arbitrary, especially because at all periods, Ancient Greek has enjoyed high prestige, and the literate borrowed heavily from it. Phonology Across its history, the syllabic structure of Greek has varied little: Greek shows a mixed syllable structure, permitting complex syllabic onsets but very restricted codas. 
It has only oral vowels and a fairly stable set of consonantal contrasts. The main phonological changes occurred during the Hellenistic and Roman period (see Koine Greek phonology for details): replacement of the pitch accent with a stress accent. simplification of the system of vowels and diphthongs: loss of vowel length distinction, monophthongisation of most diphthongs and several steps in a chain shift of vowels towards (iotacism). development of the voiceless aspirated plosives and to the voiceless fricatives and , respectively; the similar development of to may have taken place later (the phonological changes are not reflected in the orthography, and both earlier and later phonemes are written with φ, θ, and χ). development of the voiced plosives , , and to their voiced fricative counterparts (later ), , and . Morphology In all its stages, the morphology of Greek shows an extensive set of productive derivational affixes, a limited but productive system of compounding and a rich inflectional system. Although its morphological categories have been fairly stable over time, morphological changes are present throughout, particularly in the nominal and verbal systems. The major change in the nominal morphology since the classical stage was the disuse of the dative case (its functions being largely taken over by the genitive). The verbal system has lost the infinitive, the synthetically formed future and perfect tenses, and the optative mood. Many have been replaced by periphrastic (analytical) forms. Nouns and adjectives Pronouns show distinctions in person (1st, 2nd, and 3rd), number (singular, dual, and plural in the ancient language; singular and plural alone in later stages), and gender (masculine, feminine, and neuter), and decline for case (from six cases in the earliest forms attested to four in the modern language). Nouns, articles, and adjectives show all the distinctions except for person. Both attributive and predicative adjectives agree with the noun. 
Verbs The inflectional categories of the Greek verb have likewise remained largely the same over the course of the language's history but with significant changes in the number of distinctions within each category and their morphological expression. Greek verbs have synthetic inflectional forms for a number of grammatical categories. Syntax Many aspects of the syntax of Greek have remained constant: verbs agree with their subject only, the use of the surviving cases is largely intact (nominative for subjects and predicates, accusative for objects of most verbs and many prepositions, genitive for possessors), articles precede nouns, adpositions are largely prepositional, relative clauses follow the noun they modify and relative pronouns are clause-initial. However, the morphological changes also have their counterparts in the syntax, and there are also significant differences between the syntax of the ancient and that of the modern form of the language. Ancient Greek made great use of participial constructions and of constructions involving the infinitive, and the modern variety lacks the infinitive entirely (employing a raft of new periphrastic constructions instead) and uses participles more restrictively. The loss of the dative led to a rise of prepositional indirect objects (and the use of the genitive to directly mark these as well). Ancient Greek tended to be verb-final, but neutral word order in the modern language is VSO or SVO. Vocabulary Modern Greek inherits most of its vocabulary from Ancient Greek, which in turn is an Indo-European language, but also includes a number of borrowings from the languages of the populations that inhabited Greece before the arrival of Proto-Greeks, some documented in Mycenaean texts; they include a large number of Greek toponyms. The form and meaning of many words have evolved. Loanwords (words of foreign origin) have entered the language, mainly from Latin, Venetian, and Turkish. 
During the older periods of Greek, loanwords into Greek acquired Greek inflections, thus leaving only a foreign root word. Modern borrowings (from the 20th century on), especially from French and English, are typically not inflected; other modern borrowings are derived from South Slavic (Macedonian/Bulgarian) and Eastern Romance languages (Aromanian and Megleno-Romanian). Greek loanwords in other languages Greek words have been widely borrowed into other languages, including English. Example words include: mathematics, physics, astronomy, democracy, philosophy, athletics, theatre, rhetoric, baptism, evangelist, etc. Moreover, Greek words and word elements continue to be productive as a basis for coinages: anthropology, photography, telephony, isomer, biomechanics, cinematography, etc. Together with Latin words, they form the foundation of international scientific and technical vocabulary. For example, all words ending in –logy ("discourse"). There are many English words of Greek origin. Classification Greek is an independent branch of the Indo-European language family. The ancient language most closely related to it may be ancient Macedonian, which most scholars suggest may have been a dialect of Greek itself, but it is so poorly attested that it is difficult to draw firm conclusions. Independently of the Macedonian question, some scholars have grouped Greek into Graeco-Phrygian, as Greek and the extinct Phrygian share features that are not found in other Indo-European languages. Among living languages, some Indo-Europeanists suggest that Greek may be most closely related to Armenian (see Graeco-Armenian) or the Indo-Iranian languages (see Graeco-Aryan), but little definitive evidence has been found for grouping the living branches of the family. In addition, Albanian has also been considered somewhat related to Greek and Armenian by some linguists. If proven and recognized, the three languages would form a new Balkan sub-branch with other dead European languages. 
Writing system Linear B Linear B, attested as early as the late 15th century BC, was the first script used to write Greek. It is basically a syllabary, which was finally deciphered by Michael Ventris and John Chadwick in the 1950s (its precursor, Linear A, has not been deciphered and most likely encodes a non-Greek language). The language of the Linear B texts, Mycenaean Greek, is the earliest known form of Greek. Cypriot syllabary Another similar system used to write the Greek language was the Cypriot syllabary (also a descendant of Linear A via the intermediate Cypro-Minoan syllabary), which is closely related to Linear B but uses somewhat different syllabic conventions to represent phoneme sequences. The Cypriot syllabary is attested in Cyprus from the 11th century BC until its gradual abandonment in the late Classical period, in favor of the standard Greek alphabet. Greek alphabet Greek has been written in the Greek alphabet since approximately the 9th century BC. It was created by modifying the Phoenician alphabet, with the innovation of adapting certain letters to represent the vowels. The variant of the alphabet in use today is essentially the late Ionic variant, introduced for writing classical Attic in 403 BC. In classical Greek, as in classical Latin, only upper-case letters existed. The lower-case Greek letters were developed much later by medieval scribes to permit a faster, more convenient cursive writing style with the use of ink and quill. The Greek alphabet consists of 24 letters, each with an uppercase (majuscule) and lowercase (minuscule) form. 
The letter sigma has an additional lowercase form (ς) used in word-final position. Diacritics In addition to the letters, the Greek alphabet features a number of diacritical signs: three different accent marks (acute, grave, and circumflex), originally denoting different shapes of pitch accent on the stressed vowel; the so-called breathing marks (rough and smooth breathing), originally used to signal presence or absence of word-initial /h/; and the diaeresis, used to mark the full syllabic value of a vowel that would otherwise be read as part of a diphthong. These marks were introduced during the course of the Hellenistic period. Actual usage of the grave in handwriting saw a rapid decline in favor of uniform usage of the acute during the late 20th century, and it has only been retained in typography. After the writing reform of 1982, most diacritics are no longer used. Since then, Greek has been written mostly in the simplified monotonic orthography (or monotonic system), which employs only the acute accent and the diaeresis. The traditional system, now called the polytonic orthography (or polytonic system), is still used internationally for the writing of Ancient Greek. Punctuation In Greek, the question mark is written as the English semicolon, while the functions of the colon and semicolon are performed by a raised point (·), known as the ano teleia (). In Greek the comma also functions as a silent letter in a handful of Greek words, principally distinguishing (ó,ti, 'whatever') from (óti, 'that'). Ancient Greek texts often used scriptio continua ('continuous writing'), which means that ancient authors and scribes would write word after word with no spaces or punctuation between words to differentiate or mark boundaries. Boustrophedon, or bi-directional text, was also used in Ancient Greek. Latin alphabet Greek has occasionally been written in the Latin script, especially in areas under Venetian rule or by Greek Catholics. 
The term / applies when the Latin script is used to write Greek in the cultural ambit of Catholicism (because / is an older Greek term for West-European dating to when most of (Roman Catholic Christian) West Europe was under the control of the Frankish Empire). / (meaning 'Catholic Chiot') alludes to the significant presence of Catholic missionaries based on the island of Chios. Additionally, the term Greeklish is often used when the Greek language is written in a Latin script in online communications. The Latin script is nowadays used by the Greek-speaking communities of Southern Italy. Hebrew alphabet The Yevanic dialect was written by Romaniote and Constantinopolitan Karaite Jews using the Hebrew alphabet. Arabic alphabet Some Greek Muslims from Crete wrote their Cretan Greek in the Arabic alphabet. The same happened among Epirote Muslims in Ioannina. This usage is sometimes called aljamiado as when Romance languages are written in the Arabic alphabet. Example text Article 1 of the Universal Declaration of Human Rights in Greek: Όλοι οι άνθρωποι γεννιούνται ελεύθεροι και ίσοι στην αξιοπρέπεια και τα δικαιώματα. Είναι προικισμένοι με λογική και συνείδηση, και οφείλουν να συμπεριφέρονται μεταξύ τους με πνεύμα αδελφοσύνης. Transcription of the example text into the Latin alphabet: Óloi oi ánthropoi gennioúntai eléftheroi kai ísoi stin axioprépeia kai ta dikaiómata. Eínai proikisménoi me logikí kai syneídisi, kai ofeíloun na symperiférontai metaxý tous me pnévma adelfosýnis. Article 1 of the Universal Declaration of Human Rights in English: All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood. 
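The word-final sigma and the monotonic diacritics described above are also encoded in the Unicode case-mapping and normalization rules. As an illustrative aside (not drawn from the article's sources), a minimal Python sketch using the standard unicodedata module shows both behaviours on the sample text:

```python
import unicodedata

# Unicode's Final_Sigma rule: a capital sigma (Σ) lowercases to the
# word-final form ς (U+03C2) at the end of a word, and to the medial
# form σ (U+03C3) elsewhere. Python's str.lower() applies this rule.
assert "ΟΔΟΣ".lower()[-1] == "\u03c2"   # word-final sigma
assert "ΣΟΣ".lower()[0] == "\u03c3"     # non-final sigma

# Monotonic diacritics (the acute accent and diaeresis) become separate
# combining marks after canonical decomposition (NFD), so they can be
# stripped to obtain unaccented text:
def strip_accents(text: str) -> str:
    return "".join(c for c in unicodedata.normalize("NFD", text)
                   if not unicodedata.combining(c))

print(strip_accents("Όλοι οι άνθρωποι"))  # Ολοι οι ανθρωποι
```

This is only a demonstration of the Unicode rules, not a transliteration scheme; polytonic breathing marks and the iota subscript are likewise combining marks under NFD and would be stripped the same way.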
See also Modern Greek Varieties of Modern Greek Medieval Greek Ancient Greek Ancient Greek dialects Hellenic languages List of Greek and Latin roots in English List of medical roots, suffixes and prefixes Notes References Citations Sources Further reading External links General background Greek Language, Columbia Electronic Encyclopedia. The Greek Language and Linguistics Gateway, useful information on the history of the Greek language, application of modern Linguistics to the study of Greek, and tools for learning Greek. Aristotle University of Thessaloniki, The Greek Language Portal, a portal for Greek language and linguistic education. The Perseus Project has many useful pages for the study of classical languages and literatures, including dictionaries. Ancient Greek Tutorials, Berkeley Language Center of the University of California, Berkeley Language learning Hellenistic Greek Lessons Greek-Language.com provides a free online grammar of Hellenistic Greek. komvos.edu.gr, a website for the support of people who are being taught the Greek language. New Testament Greek Three graduated courses designed to help students learn to read the Greek New Testament Books on Greek language that are taught at schools in Greece (page in Greek) Greek Swadesh list of basic vocabulary words (from Wiktionary's Swadesh list appendix) USA Foreign Service Institute Modern Greek basic course Identifies the grammatical functions of all the words in sentences entered, using Perseus. Dictionaries Greek Lexical Aids, descriptions of both online lexicons (with appropriate links) and Greek Lexicons in Print. The Greek Language Portal, dictionaries of all forms of Greek (Ancient, Hellenistic, Medieval, Modern) scanned images from S. C. 
Woodhouse's English–Greek dictionary, 1910 Literature Center for Neo-Hellenic Studies, a non-profit organization that promotes modern Greek literature and culture Research lab of modern Greek philosophy, a large e-library of modern Greek texts/books Fusional languages Greek alphabet Languages of Albania Languages of Apulia Languages of Armenia Languages of Calabria Languages of Cyprus Languages of Georgia (country) Languages of Greece Languages of Hungary Languages of Romania Languages of Sicily Languages of Turkey Languages of Ukraine Subject–verb–object languages
11888
https://en.wikipedia.org/wiki/Golem
Golem
A golem ( ; , gōlem) is an animated anthropomorphic being in Jewish folklore which is entirely created from inanimate matter (usually clay or mud). In the Psalms and medieval writings, the word golem was used as a term for an amorphous, unformed material. The most famous golem narrative involves Judah Loew ben Bezalel, the late-16th-century rabbi of Prague. Many tales differ on how the golem was brought to life and controlled. According to Moment Magazine, "the golem is a highly mutable metaphor with seemingly limitless symbolism. It can be a victim or villain, Jew or non-Jew, man or woman—or sometimes both. Over the centuries, it has been used to connote war, community, isolation, hope, and despair." Etymology The word golem occurs once in the Bible in Psalm 139:16, which uses the word (; my golem), that means "my light form", "raw" material, connoting the unfinished human being before God's eyes. The Mishnah uses the term for an uncultivated person: "Seven characteristics are in an uncultivated person, and seven in a learned one", () (Pirkei Avot 5:7 in the Hebrew text; English translations vary). In Modern Hebrew, is used to mean "dumb" or "helpless". Similarly, it is often used today as a metaphor for a mindless lunk or entity who serves a man under controlled conditions, but is hostile to him under other conditions. "Golem" passed into Yiddish as to mean someone who is lethargic or beneath a stupor. History Earliest stories The oldest stories of golems date to early Judaism. In the Talmud (Tractate Sanhedrin 38b), Adam was initially created as a golem () when his dust was "kneaded into a shapeless husk". Like Adam, all golems are created from mud by those close to divinity, but no anthropogenic golem is fully human. Early on, the main disability of the golem was its inability to speak. Sanhedrin 65b describes Rava creating a man (). He sent the man to Rav Zeira. Rav Zeira spoke to him, but he did not answer. 
Rav Zeira said, "You were created by the sages; return to your dust". During the Middle Ages, passages from the Sefer Yetzirah (Book of Creation) were studied as a means to create and animate a golem, although there is little in the writings of Jewish mysticism that supports this belief. It was believed that golems could be activated by an ecstatic experience induced by the ritualistic use of various letters of the Hebrew alphabet forming a "" (any one of the Names of God), wherein the was written on a piece of paper and inserted in the mouth or in the forehead of the golem. A golem is inscribed with Hebrew words in some tales (for example, some versions of Chełm and Prague, as well as in Polish tales and versions of the Brothers Grimm), such as the word (, "truth" in Hebrew) written on its forehead. The golem could then be deactivated by removing the aleph (א) in , thus changing the inscription from "truth" to "death" ( , meaning "dead"). Samuel of Speyer (12th century) was said to have created a golem. Rabbi Jacob ben Shalom arrived at Barcelona from Germany in 1325 and remarked that the law of destruction is the reversal of the law of creation. One source credits 11th century Solomon ibn Gabirol with creating a golem, possibly female, for household chores. In 1625, Joseph Delmedigo wrote that "many legends of this sort are current, particularly in Germany." The earliest known written account of how to create a golem can be found in Sodei Razayya by Eleazar ben Judah of Worms of the late 12th and early 13th century. The Golem of Chełm The oldest description of the creation of a golem by a historical figure is included in a tradition connected to Rabbi Eliyahu of Chełm (1550–1583). A Polish Kabbalist, writing in about 1630–1650, reported the creation of a golem by Rabbi Eliyahu thus: "And I have heard, in a certain and explicit way, from several respectable persons that one man [living] close to our time, whose name is R. 
Eliyahu, the master of the name, who made a creature out of matter [Heb. Golem] and form [Heb. tzurah] and it performed hard work for him, for a long period, and the name of emet was hanging upon his neck until he finally removed it for a certain reason, the name from his neck and it turned to dust." A similar account was reported by a Christian author, Christoph Arnold, in 1674. Rabbi Jacob Emden (d. 1776) elaborated on the story in a book published in 1748: "As an aside, I'll mention here what I heard from my father's holy mouth regarding the Golem created by his ancestor, the Gaon R. Eliyahu Ba'al Shem of blessed memory. When the Gaon saw that the Golem was growing larger and larger, he feared that the Golem would destroy the universe. He then removed the Holy Name that was embedded on his forehead, thus causing him to disintegrate and return to dust. Nonetheless, while he was engaged in extracting the Holy Name from him, the Golem injured him, scarring him on the face." According to the Polish Kabbalist, "the legend was known to several persons, thus allowing us to speculate that the legend had indeed circulated for some time before it was committed to writing and, consequently, we may assume that its origins are to be traced to the generation immediately following the death of R. Eliyahu, if not earlier." The classic narrative: The Golem of Prague The most famous golem narrative involves Judah Loew ben Bezalel, the late 16th century rabbi of Prague, also known as the Maharal, who reportedly "created a golem out of clay from the banks of the Vltava River and brought it to life through rituals and Hebrew incantations to defend the Prague ghetto from anti-Semitic attacks" and pogroms. Depending on the version of the legend, the Jews in Prague were to be either expelled or killed under the rule of Rudolf II, the Holy Roman Emperor. The Golem was called Josef and was known as Yossele. 
It was said that he could make himself invisible and summon spirits from the dead. Rabbi Loew deactivated the Golem on Friday evenings by removing the shem before the Sabbath (Saturday) began, so as to let it rest on Sabbath. One Friday evening, Rabbi Loew forgot to remove the shem, and feared that the Golem would desecrate the Sabbath. A different story tells of a golem that fell in love, and when rejected, became the violent monster seen in most accounts. Some versions have the golem eventually going on a murderous rampage. The rabbi then managed to pull the shem from his mouth and immobilize him in front of the synagogue, whereupon the golem fell in pieces. The Golem's body was stored in the attic genizah of the Old New Synagogue, where it would be restored to life again if needed. Rabbi Loew then forbade anyone except his successors from going into the attic. Rabbi Yechezkel Landau, a successor of Rabbi Loew, reportedly wanted to go up the steps to the attic when he was Chief Rabbi of Prague to verify the tradition. Rabbi Landau fasted and immersed himself in a mikveh, wrapped himself in phylacteries and a prayer-shawl and started ascending the steps. At the top of the steps, he hesitated and then came immediately back down, trembling and frightened. He then re-enacted Rabbi Loew's original warning. According to legend, the body of Rabbi Loew's Golem still lies in the synagogue's attic. When the attic was renovated in 1883, no evidence of the Golem was found. Some versions of the tale state that the Golem was stolen from the genizah and entombed in a graveyard in Prague's Žižkov district, where the Žižkov Television Tower now stands. A recent legend tells of a Nazi agent ascending to the synagogue attic, but he died instead under suspicious circumstances. The attic is not open to the general public. Some Orthodox Jews believe that the Maharal did actually create a golem. 
The evidence for this belief has been analyzed from an Orthodox Jewish perspective by Shnayer Z. Leiman. Sources of the Prague narrative The general view of historians and critics is that the story of the Golem of Prague was a German literary invention of the early 19th century. According to John Neubauer, the first writers on the Prague Golem were:
1837: Berthold Auerbach, Spinoza
1841: Gustav Philippson, Der Golam, eine Legende
1841: Franz Klutschak, Der Golam des Rabbi Löw
1842: Adam Tendlau, Der Golem des Hoch-Rabbi-Löw
1847: Leopold Weisel, Der Golem
However, there are in fact a couple of slightly earlier examples, in 1834 and 1836. All of these early accounts of the Golem of Prague are in German by Jewish writers. It has been suggested that they emerged as part of a Jewish folklore movement parallel with the contemporary German folklore movement. The origins of the story have been obscured by attempts to exaggerate its age and to pretend that it dates from the time of the Maharal. It has been said that Rabbi Yudel Rosenberg (1859–1935) of Tarłów (before moving to Canada where he became one of its most prominent rabbis) originated the idea that the narrative dates from the time of the Maharal. Rosenberg published Nifl'os Maharal (Wonders of Maharal) (Piotrków, 1909) which purported to be an eyewitness account by the Maharal's son-in-law, who had helped to create the Golem. Rosenberg claimed that the book was based upon a manuscript that he found in the main library in Metz. Wonders of Maharal "is generally recognized in academic circles to be a literary hoax". Gershom Scholem observed that the manuscript "contains not ancient legends but modern fiction". Rosenberg's claim was further disseminated in Chayim Bloch's (1881–1973) The Golem: Legends of the Ghetto of Prague (English edition 1925). The Jewish Encyclopedia of 1906 cites the historical work Zemach David by David Gans, a disciple of the Maharal, published in 1592.
In it, Gans writes of an audience between the Maharal and Rudolph II: "Our lord the emperor ... Rudolph ... sent for and called upon our master Rabbi Low ben Bezalel and received him with a welcome and merry expression, and spoke to him face to face, as one would to a friend. The nature and quality of their words are mysterious, sealed and hidden." But it has been said of this passage, "Even when [the Maharal is] eulogized, whether in David Gans' Zemach David or on his epitaph …, not a word is said about the creation of a golem. No Hebrew work published in the 16th, 17th, and 18th centuries (even in Prague) is aware that the Maharal created a golem." Furthermore, the Maharal himself did not refer to the Golem in his writings. Rabbi Yedidiah Tiah Weil (1721–1805), a Prague resident, who described the creation of golems, including those created by Rabbis Avigdor Kara of Prague (died 1439) and Eliyahu of Chelm, did not mention the Maharal, and Rabbi Meir Perils' biography of the Maharal published in 1718 does not mention a golem. The Golem of Vilna There is a similar tradition relating to the Vilna Gaon or "the saintly genius from Vilnius" (1720–1797). Rabbi Chaim Volozhin (Lithuania 1749–1821) reported in an introduction to Sifra de Tzeniuta that he once presented to his teacher, the Vilna Gaon, ten different versions of a certain passage in the Sefer Yetzira and asked the Gaon to determine the correct text. The Gaon immediately identified one version as the accurate rendition of the passage. The amazed student then commented to his teacher that, with such clarity, he should easily be able to create a live human. The Gaon affirmed Rabbi Chaim's assertion and said that he once began to create a person when he was a child, under the age of 13, but during the process, he received a sign from Heaven ordering him to desist because of his tender age. Theme of hubris The existence of a golem is sometimes a mixed blessing. 
Golems are not intelligent, and if commanded to perform a task, they will perform the instructions literally. In many depictions, Golems are inherently perfectly obedient. In its earliest known modern form, the Golem of Chełm became enormous and uncooperative. In one version of this story, the rabbi had to resort to trickery to deactivate it, whereupon it crumbled upon its creator and crushed him. There is a similar theme of hubris in Frankenstein, The Sorcerer's Apprentice, and some other stories in popular culture, such as The Terminator. The theme also manifests itself in R.U.R. (Rossum's Universal Robots), Karel Čapek's 1921 play which coined the term robot; the play was written in Prague, and while Čapek denied that he modeled the robot after the Golem, there are many similarities in the plot. Culture of the Czech Republic The Golem is a popular figure in the Czech Republic. There are several restaurants and other businesses whose names make reference to the creature, a Czech strongman (René Richter) goes by the nickname "Golem", and a Czech monster truck outfit calls itself the "Golem Team". Abraham Akkerman preceded his article on human automatism in the contemporary city with a short satirical poem on a pair of golems turning human. Clay Boy variation A Yiddish and Slavic folktale is the Clay Boy, which combines elements of the Golem and The Gingerbread Man, in which a lonely couple makes a child out of clay, with disastrous or comical consequences. In one common Russian version, an older couple, whose children have left home, make a boy out of clay and dry him by their hearth. The Clay Boy comes to life; at first, the couple is delighted and treats him like a real child, but the Clay Boy does not stop growing and eats all their food, then all their livestock, and then the Clay Boy eats his parents. The Clay Boy rampages through the village until he is smashed by a quick-thinking goat.
Golem in popular culture Film and television Golems are frequently depicted in movies and television shows. Programs with them in the title include:
The Golem (shown in the US as The Monster of Fate), a 1915 German silent horror film, written and directed by Paul Wegener and Henrik Galeen.
The Golem and the Dancing Girl, a 1917 German silent comedy-horror film, directed by Paul Wegener and Rochus Gliese.
The Golem: How He Came into the World (also referred to as Der Golem), a 1920 German silent horror film, directed by Paul Wegener and Carl Boese.
Le Golem, a 1936 Czechoslovakian monster movie directed by Julien Duvivier in French.
Other references to golems in popular culture include:
Daimajin, a 1966 Japanese kaiju film directed by Kimiyoshi Yasuda.
It!, a 1967 British horror film directed by Herbert J. Leder.
Music There have been a number of scores written to accompany or based on the 1920 film, including one by Daniel Hoffman, performed by the San Francisco-based ensemble Davka, and another by Karl-Ernst Sasse. In 1962, Abraham Ellstein's opera The Golem, commissioned by the New York City Opera, premiered there. In 1994, composer Richard Teitelbaum composed Golem, based on the Prague legend and combining music with electronics.
See also
Artificial intelligence
Brazen head
Czech folklore
Frankenstein's monster
The Gingerbread Man and Kolobok (edible golems)
Homunculus
Pinocchio
Prometheus
Pygmalion and Galatea (mythology)
Shabti
Talos
Tupilaq
Zombie
https://en.wikipedia.org/wiki/George%20Orwell
George Orwell
Eric Arthur Blair (25 June 1903 – 21 January 1950), known by his pen name George Orwell, was an English novelist, essayist, journalist and critic. His work is characterised by lucid prose, biting social criticism, total opposition to totalitarianism, and outspoken support of democratic socialism. Orwell produced literary criticism and poetry, fiction and polemical journalism. He is known for the allegorical novella Animal Farm (1945) and the dystopian novel Nineteen Eighty-Four (1949). His non-fiction works, including The Road to Wigan Pier (1937), documenting his experience of working-class life in the north of England, and Homage to Catalonia (1938), an account of his experiences soldiering for the Republican faction of the Spanish Civil War (1936–1939), are as critically respected as his essays on politics and literature, language and culture. In 2008, The Times ranked George Orwell second among "The 50 greatest British writers since 1945". Orwell's work remains influential in popular culture and in political culture, and the adjective "Orwellian"—describing totalitarian and authoritarian social practices—is part of the English language, like many of his neologisms, such as "Big Brother", "Thought Police", "Two Minutes Hate", "Room 101", "memory hole", "Newspeak", "doublethink", "unperson", and "thoughtcrime", as well as providing direct inspiration for the neologism "groupthink". Life Early years Eric Arthur Blair was born on 25 June 1903 in Motihari, Bengal, British India into what he described as a "lower-upper-middle class" family. His great-grandfather, Charles Blair, was a wealthy country gentleman and absentee owner of Jamaican plantations from Dorset who married Lady Mary Fane, daughter of the 8th Earl of Westmorland. 
His grandfather, Thomas Richard Arthur Blair, was an Anglican clergyman, and Orwell's father was Richard Walmesley Blair, who worked as a Sub-Deputy Opium Agent in the Opium Department of the Indian Civil Service, overseeing the production and storage of opium for sale to China. His mother, Ida Mabel Blair (née Limouzin), grew up in Moulmein, Burma, where her French father was involved in speculative ventures. Eric had two sisters: Marjorie, five years older; and Avril, five years younger. When Eric was one year old, his mother took him and Marjorie to England. In 2014 restoration work began on Orwell's birthplace and ancestral house in Motihari. In 1904, Ida Blair settled with her children at Henley-on-Thames in Oxfordshire. Eric was brought up in the company of his mother and sisters and, apart from a brief visit in mid-1907, he did not see his father until 1912. Aged five, Eric was sent as a day-boy to a convent school in Henley-on-Thames, which Marjorie also attended. It was a Roman Catholic convent run by French Ursuline nuns. His mother wanted him to have a public school education, but his family could not afford the fees. Through the social connections of Ida Blair's brother Charles Limouzin, Blair gained a scholarship to St Cyprian's School, Eastbourne, East Sussex. Arriving in September 1911, he boarded at the school for the next five years, returning home only for school holidays. Although he knew nothing of the reduced fees, he "soon recognised that he was from a poorer home". Blair hated the school and many years later wrote an essay "Such, Such Were the Joys", published posthumously, based on his time there. At St Cyprian's, Blair first met Cyril Connolly, who became a writer and who, as the editor of Horizon, published several of Orwell's essays. Before the First World War, the family moved to Shiplake, Oxfordshire, where Eric became friendly with the Buddicom family, especially their daughter Jacintha. 
When they first met, he was standing on his head in a field. Asked why, he said, "You are noticed more if you stand on your head than if you are right way up." Jacintha and Eric read and wrote poetry, and dreamed of becoming famous writers. He said that he might write a book in the style of H. G. Wells's A Modern Utopia. During this period, he also enjoyed shooting, fishing and birdwatching with Jacintha's brother and sister. While at St Cyprian's, Blair wrote two poems that were published in the Henley and South Oxfordshire Standard. He came second to Connolly in the Harrow History Prize, had his work praised by the school's external examiner, and earned scholarships to Wellington and Eton. But inclusion on the Eton scholarship roll did not guarantee a place, and none was immediately available for Blair. He chose to stay at St Cyprian's until December 1916, in case a place at Eton became available. In January 1917, Blair took up the place at Wellington, where he spent the Spring term. In May 1917 a place became available as a King's Scholar at Eton. At this time the family lived at Mall Chambers, Notting Hill Gate. Blair remained at Eton until December 1921, when he left midway between his 18th and 19th birthdays. Wellington was "beastly", Orwell told his childhood friend Jacintha Buddicom, but he said he was "interested and happy" at Eton. His principal tutor was A. S. F. Gow, Fellow of Trinity College, Cambridge, who also gave him advice later in his career. Blair was briefly taught French by Aldous Huxley. Steven Runciman, who was at Eton with Blair, noted that he and his contemporaries appreciated Huxley's linguistic flair. Cyril Connolly followed Blair to Eton, but because they were in separate years, they did not associate with each other.
Blair's academic performance reports suggest that he neglected his academic studies, but during his time at Eton he worked with Roger Mynors to produce a College magazine, The Election Times, joined in the production of other publications—College Days and Bubble and Squeak—and participated in the Eton Wall Game. His parents could not afford to send him to a university without another scholarship, and they concluded from his poor results that he would not be able to win one. Runciman noted that he had a romantic idea about the East, and the family decided that Blair should join the Imperial Police, the precursor of the Indian Police Service. For this he had to pass an entrance examination. In December 1921 he left Eton and travelled to join his retired father, mother, and younger sister Avril, who that month had moved to 40 Stradbroke Road, Southwold, Suffolk, the first of their four homes in the town. Blair was enrolled at a crammer there called Craighurst, and brushed up on his Classics, English, and History. He passed the entrance exam, coming seventh out of the 26 candidates who exceeded the pass mark. Policing in Burma Blair's maternal grandmother lived at Moulmein, so he chose a posting in Burma, then still a province of British India. In October 1922 he sailed on board SS Herefordshire via the Suez Canal and Ceylon to join the Indian Imperial Police in Burma. A month later, he arrived at Rangoon and travelled to the police training school in Mandalay. He was appointed an Assistant District Superintendent (on probation) on 29 November 1922, with effect from 27 November and at the pay of Rs. 525 per month. After a short posting at Maymyo, Burma's principal hill station, he was posted to the frontier outpost of Myaungmya in the Irrawaddy Delta at the beginning of 1924. Working as an imperial police officer gave him considerable responsibility while most of his contemporaries were still at university in England. 
When he was posted farther east in the Delta to Twante as a sub-divisional officer, he was responsible for the security of some 200,000 people. At the end of 1924, he was posted to Syriam, closer to Rangoon. Syriam had the refinery of the Burmah Oil Company, "the surrounding land a barren waste, all vegetation killed off by the fumes of sulphur dioxide pouring out day and night from the stacks of the refinery." But the town was near Rangoon, a cosmopolitan seaport, and Blair went into the city as often as he could, "to browse in a bookshop; to eat well-cooked food; to get away from the boring routine of police life". In September 1925 he went to Insein, the home of Insein Prison, the second largest prison in Burma. In Insein, he had "long talks on every conceivable subject" with Elisa Maria Langford-Rae (who later married Kazi Lhendup Dorjee). She noted his "sense of utter fairness in minutest details". By this time, Blair had completed his training and was receiving a monthly salary of Rs. 740, including allowances. In Burma, Blair acquired a reputation as an outsider. He spent much of his time alone, reading or pursuing non-pukka activities, such as attending the churches of the Karen ethnic group. A colleague, Roger Beadon, recalled (in a 1969 recording for the BBC) that Blair was fast to learn the language and that before he left Burma, "was able to speak fluently with Burmese priests in 'very high-flown Burmese'." Blair made changes to his appearance in Burma that remained for the rest of his life, including adopting a pencil moustache. Emma Larkin writes in the introduction to Burmese Days, "While in Burma, he acquired a moustache similar to those worn by officers of the British regiments stationed there. [He] also acquired some tattoos; on each knuckle he had a small untidy blue circle. Many Burmese living in rural areas still sport tattoos like this—they are believed to protect against bullets and snake bites." 
In April 1926 he moved to Moulmein, where his maternal grandmother lived. At the end of that year, he was assigned to Katha in Upper Burma, where he contracted dengue fever in 1927. Entitled to a leave in England that year, he was allowed to return in July due to his illness. While on leave in England and on holiday with his family in Cornwall in September 1927, he reappraised his life. Deciding against returning to Burma, he resigned from the Indian Imperial Police to become a writer, with effect from 12 March 1928 after five-and-a-half years of service. He drew on his experiences in the Burma police for the novel Burmese Days (1934) and the essays "A Hanging" (1931) and "Shooting an Elephant" (1936). London and Paris In England, he settled back in the family home at Southwold, renewing acquaintance with local friends and attending an Old Etonian dinner. He visited his old tutor Gow at Cambridge for advice on becoming a writer. In 1927 he moved to London. Ruth Pitter, a family acquaintance, helped him find lodgings, and by the end of 1927 he had moved into rooms in Portobello Road; a blue plaque commemorates his residence there. Pitter's involvement in the move "would have lent it a reassuring respectability in Mrs. Blair's eyes." Pitter had a sympathetic interest in Blair's writing, pointed out weaknesses in his poetry, and advised him to write about what he knew. In fact he decided to write of "certain aspects of the present that he set out to know" and ventured into the East End of London—the first of the occasional sorties he would make to discover for himself the world of poverty and the down-and-outers who inhabit it. He had found a subject. These sorties, explorations, expeditions, tours or immersions were made intermittently over a period of five years. In imitation of Jack London, whose writing he admired (particularly The People of the Abyss), Blair started to explore the poorer parts of London. 
On his first outing he set out to Limehouse Causeway, spending his first night in a common lodging house, possibly George Levy's "kip". For a while he "went native" in his own country, dressing like a tramp, adopting the name P.S. Burton and making no concessions to middle-class mores and expectations; he recorded his experiences of the low life for use in "The Spike", his first published essay in English, and in the second half of his first book, Down and Out in Paris and London (1933). In early 1928 he moved to Paris. He lived in the rue du Pot de Fer, a working class district in the 5th Arrondissement. His aunt Nellie Limouzin also lived in Paris and gave him social and, when necessary, financial support. He began to write novels, including an early version of Burmese Days, but nothing else survives from that period. He was more successful as a journalist and published articles in Monde, a political/literary journal edited by Henri Barbusse (his first article as a professional writer, "La Censure en Angleterre", appeared in that journal on 6 October 1928); G. K.'s Weekly, where his first article to appear in England, "A Farthing Newspaper", was printed on 29 December 1928; and Le Progrès Civique (founded by the left-wing coalition Le Cartel des Gauches). Three pieces appeared in successive weeks in Le Progrès Civique: discussing unemployment, a day in the life of a tramp, and the beggars of London, respectively. "In one or another of its destructive forms, poverty was to become his obsessive subject—at the heart of almost everything he wrote until Homage to Catalonia." He fell seriously ill in February 1929 and was taken to the Hôpital Cochin in the 14th arrondissement, a free hospital where medical students were trained. His experiences there were the basis of his essay "How the Poor Die", published in 1946. He chose not to identify the hospital, and indeed was deliberately misleading about its location. 
Shortly afterwards, he had all his money stolen from his lodging house. Whether through necessity or to collect material, he undertook menial jobs such as dishwashing in a fashionable hotel on the rue de Rivoli, which he later described in Down and Out in Paris and London. In August 1929, he sent a copy of "The Spike" to John Middleton Murry's New Adelphi magazine in London. The magazine was edited by Max Plowman and Sir Richard Rees, and Plowman accepted the work for publication. Southwold In December 1929 after nearly two years in Paris, Blair returned to England and went directly to his parents' house in Southwold, a coastal town in Suffolk, which remained his base for the next five years. The family was well established in the town, and his sister Avril was running a tea-house there. He became acquainted with many local people, including Brenda Salkeld, the clergyman's daughter who worked as a gym-teacher at St Felix Girls' School in the town. Although Salkeld rejected his offer of marriage, she remained a friend and regular correspondent for many years. He also renewed friendships with older friends, such as Dennis Collings, whose girlfriend Eleanor Jacques was also to play a part in his life. In early 1930 he stayed briefly in Bramley, Leeds, with his sister Marjorie and her husband Humphrey Dakin, who was as unappreciative of Blair as when they knew each other as children. Blair was writing reviews for Adelphi and acting as a private tutor to a disabled child at Southwold. He then became tutor to three young brothers, one of whom, Richard Peters, later became a distinguished academic. "His history in these years is marked by dualities and contrasts. There is Blair leading a respectable, outwardly eventless life at his parents' house in Southwold, writing; then in contrast, there is Blair as Burton (the name he used in his down-and-out episodes) in search of experience in the kips and spikes, in the East End, on the road, and in the hop fields of Kent." 
He went painting and bathing on the beach, and there he met Mabel and Francis Fierz, who later influenced his career. Over the next year he visited them in London, often meeting their friend Max Plowman. He also often stayed at the homes of Ruth Pitter and Richard Rees, where he could "change" for his sporadic tramping expeditions. One of his jobs was domestic work at a lodgings for half a crown (two shillings and sixpence, or one-eighth of a pound) a day. Blair now contributed regularly to Adelphi, with "A Hanging" appearing in August 1931. From August to September 1931 his explorations of poverty continued, and, like the protagonist of A Clergyman's Daughter, he followed the East End tradition of working in the Kent hop fields. He kept a diary about his experiences there. Afterwards, he lodged in the Tooley Street kip, but could not stand it for long, and with financial help from his parents moved to Windsor Street, where he stayed until Christmas. "Hop Picking", by Eric Blair, appeared in the October 1931 issue of New Statesman, whose editorial staff included his old friend Cyril Connolly. Mabel Fierz put him in contact with Leonard Moore, who became his literary agent in April 1932. At this time Jonathan Cape rejected A Scullion's Diary, the first version of Down and Out. On the advice of Richard Rees, he offered it to Faber and Faber, but their editorial director, T. S. Eliot, also rejected it. Blair ended the year by deliberately getting himself arrested, so that he could experience Christmas in prison, but after he was picked up and taken to Bethnal Green police station in the East End of London the authorities did not regard his "drunk and disorderly" behaviour as imprisonable, and after two days in a cell he returned home to Southwold. Teaching career In April 1932 Blair became a teacher at The Hawthorns High School, a school for boys, in Hayes, West London. 
This was a small school offering private schooling for children of local tradesmen and shopkeepers, and had only 14 or 16 boys aged between ten and sixteen, and one other master. While at the school he became friendly with the curate of the local parish church and became involved with activities there. Mabel Fierz had pursued matters with Moore, and at the end of June 1932, Moore told Blair that Victor Gollancz was prepared to publish A Scullion's Diary for a £40 advance, through his recently founded publishing house, Victor Gollancz Ltd, which was an outlet for radical and socialist works. At the end of the summer term in 1932, Blair returned to Southwold, where his parents had used a legacy to buy their own home. Blair and his sister Avril spent the holidays making the house habitable while he also worked on Burmese Days. He was also spending time with Eleanor Jacques, but her attachment to Dennis Collings remained an obstacle to his hopes of a more serious relationship. "Clink", an essay describing his failed attempt to get sent to prison, appeared in the August 1932 number of Adelphi. He returned to teaching at Hayes and prepared for the publication of his book, now known as Down and Out in Paris and London. He wished to publish under a different name to avoid any embarrassment to his family over his time as a "tramp". In a letter to Moore (dated 15 November 1932), he left the choice of pseudonym to Moore and to Gollancz. Four days later, he wrote to Moore, suggesting the pseudonyms P. S. Burton (a name he used when tramping), Kenneth Miles, George Orwell, and H. Lewis Allways. He finally adopted the nom de plume George Orwell because "It is a good round English name." The name George was inspired by the patron saint of England, and Orwell after the River Orwell in Suffolk, one of his favourite locations.
Down and Out in Paris and London was published by Victor Gollancz in London on 9 January 1933 and received favourable reviews, with Cecil Day-Lewis complimenting Orwell's "clarity and good sense", and The Times Literary Supplement comparing Orwell's eccentric characters to the characters of Dickens. Down and Out was modestly successful and was next published by Harper & Brothers in New York. In mid-1933 Blair left Hawthorns to become a teacher at Frays College, in Uxbridge, west London. This was a much larger establishment with 200 pupils and a full complement of staff. He acquired a motorcycle and took trips through the surrounding countryside. On one of these expeditions he became soaked and caught a chill that developed into pneumonia. He was taken to a cottage hospital in Uxbridge, where for a time his life was believed to be in danger. When he was discharged in January 1934, he returned to Southwold to convalesce and, supported by his parents, never returned to teaching. He was disappointed when Gollancz turned down Burmese Days, mainly on the grounds of potential suits for libel, but Harper were prepared to publish it in the United States. Meanwhile, Blair started work on the novel A Clergyman's Daughter, drawing upon his life as a teacher and on life in Southwold. Eleanor Jacques was now married and had gone to Singapore and Brenda Salkeld had left for Ireland, so Blair was relatively isolated in Southwold—working on the allotments, walking alone and spending time with his father. Eventually in October, after sending A Clergyman's Daughter to Moore, he left for London to take a job that had been found for him by his aunt Nellie Limouzin. Hampstead This job was as a part-time assistant in Booklovers' Corner, a second-hand bookshop in Hampstead run by Francis and Myfanwy Westrope, who were friends of Nellie Limouzin in the Esperanto movement. The Westropes were friendly and provided him with comfortable accommodation at Warwick Mansions, Pond Street. 
He was sharing the job with Jon Kimche, who also lived with the Westropes. Blair worked at the shop in the afternoons and had his mornings free to write and his evenings free to socialise. These experiences provided background for the novel Keep the Aspidistra Flying (1936). As well as the various guests of the Westropes, he was able to enjoy the company of Richard Rees and the Adelphi writers and Mabel Fierz. The Westropes and Kimche were members of the Independent Labour Party, although at this time Blair was not seriously politically active. He was writing for the Adelphi and preparing A Clergyman's Daughter and Burmese Days for publication. At the beginning of 1935 he had to move out of Warwick Mansions, and Mabel Fierz found him a flat in Parliament Hill. A Clergyman's Daughter was published on 11 March 1935. In early 1935 Blair met his future wife Eileen O'Shaughnessy, when his landlady, Rosalind Obermeyer, who was studying for a master's degree in psychology at University College London, invited some of her fellow students to a party. One of these students, Elizaveta Fen, a biographer and future translator of Chekhov, recalled Blair and his friend Richard Rees "draped" at the fireplace, looking, she thought, "moth-eaten and prematurely aged." Around this time, Blair had started to write reviews for The New English Weekly. In June, Burmese Days was published and Cyril Connolly's positive review in the New Statesman prompted Blair to re-establish contact with his old friend. In August, he moved into a flat, at 50 Lawford Road, Kentish Town, which he shared with Michael Sayers and Rayner Heppenstall. The relationship was sometimes awkward and Blair and Heppenstall even came to blows, though they remained friends and later worked together on BBC broadcasts. Blair was now working on Keep the Aspidistra Flying, and also tried unsuccessfully to write a serial for the News Chronicle. 
By October 1935 his flatmates had moved out and he was struggling to pay the rent on his own. He remained until the end of January 1936, when he stopped working at Booklovers' Corner. In 1980, English Heritage honoured Orwell with a blue plaque at his Kentish Town residence. The Road to Wigan Pier At this time, Victor Gollancz suggested Orwell spend a short time investigating social conditions in economically depressed Northern England. Two years earlier, J. B. Priestley had written about England north of the Trent, sparking an interest in reportage. The depression had also introduced a number of working-class writers from the North of England to the reading public. It was one of these working-class authors, Jack Hilton, whom Orwell sought for advice. Orwell had written to Hilton seeking lodging and asking for recommendations on his route. Hilton was unable to provide him lodging, but suggested that he travel to Wigan rather than Rochdale, "for there are the colliers and they're good stuff." On 31 January 1936, Orwell set out by public transport and on foot, reaching Manchester via Coventry, Stafford, the Potteries and Macclesfield. Arriving in Manchester after the banks had closed, he had to stay in a common lodging-house. The next day he picked up a list of contacts sent by Richard Rees. One of these, the trade union official Frank Meade, suggested Wigan, where Orwell spent February staying in dirty lodgings over a tripe shop. At Wigan, he visited many homes to see how people lived, took detailed notes of housing conditions and wages earned, went down Bryn Hall coal mine, and used the local public library to consult public health records and reports on working conditions in mines. During this time, he was distracted by concerns about style and possible libel in Keep the Aspidistra Flying. He made a quick visit to Liverpool and during March, stayed in south Yorkshire, spending time in Sheffield and Barnsley. 
As well as visiting mines, including Grimethorpe, and observing social conditions, he attended meetings of the Communist Party and of Oswald Mosley ("his speech the usual claptrap—The blame for everything was put upon mysterious international gangs of Jews") where he saw the tactics of the Blackshirts ("...one is liable to get both a hammering and a fine for asking a question which Mosley finds it difficult to answer."). He also made visits to his sister at Headingley, during which he visited the Brontë Parsonage at Haworth, where he was "chiefly impressed by a pair of Charlotte Brontë's cloth-topped boots, very small, with square toes and lacing up at the sides." Orwell needed somewhere he could concentrate on writing his book, and once again help was provided by Aunt Nellie, who was living at Wallington, Hertfordshire in a very small 16th-century cottage called the "Stores". Wallington was a tiny village 35 miles north of London, and the cottage had almost no modern facilities. Orwell took over the tenancy and moved in on 2 April 1936. He started work on The Road to Wigan Pier by the end of April, but also spent hours working on the garden and testing the possibility of reopening the Stores as a village shop. Keep the Aspidistra Flying was published by Gollancz on 20 April 1936. On 4 August, Orwell gave a talk at the Adelphi Summer School held at Langham, entitled An Outsider Sees the Distressed Areas; others who spoke at the school included John Strachey, Max Plowman, Karl Polanyi and Reinhold Niebuhr. The result of his journeys through the north was The Road to Wigan Pier, published by Gollancz for the Left Book Club in 1937. The first half of the book documents his social investigations of Lancashire and Yorkshire, including an evocative description of working life in the coal mines. 
The second half is a long essay on his upbringing and the development of his political conscience, which includes an argument for socialism (although he goes to lengths to balance the concerns and goals of socialism with the barriers it faced from the movement's own advocates at the time, such as "priggish" and "dull" socialist intellectuals and "proletarian" socialists with little grasp of the actual ideology). Gollancz feared the second half would offend readers and added a disculpatory preface to the book while Orwell was in Spain. Orwell's research for The Road to Wigan Pier led to him being placed under surveillance by the Special Branch from 1936, for 12 years, until one year before the publication of Nineteen Eighty-Four. Orwell married Eileen O'Shaughnessy on 9 June 1936. Shortly afterwards, the political crisis began in Spain and Orwell followed developments there closely. At the end of the year, concerned by Francisco Franco's military uprising (supported by Nazi Germany, Fascist Italy and local groups such as Falange), Orwell decided to go to Spain to take part in the Spanish Civil War on the Republican side. Under the erroneous impression that he needed papers from some left-wing organisation to cross the frontier, on John Strachey's recommendation he applied unsuccessfully to Harry Pollitt, leader of the British Communist Party. Pollitt was suspicious of Orwell's political reliability; he asked him whether he would undertake to join the International Brigade and advised him to get a safe-conduct from the Spanish Embassy in Paris. Not wishing to commit himself until he had seen the situation in situ, Orwell instead used his Independent Labour Party contacts to get a letter of introduction to John McNair in Barcelona. Spanish Civil War Orwell set out for Spain on about 23 December 1936, dining with Henry Miller in Paris on the way. 
Miller told Orwell that going to fight in the Civil War out of some sense of obligation or guilt was "sheer stupidity" and that the Englishman's ideas "about combating Fascism, defending democracy, etc., etc., were all baloney". A few days later in Barcelona, Orwell met John McNair of the Independent Labour Party (ILP) Office, who quoted him as saying: "I've come to fight against Fascism"; Orwell later wrote that if someone had asked him what he was fighting for, "I should have answered: 'Common decency'". Orwell stepped into a complex political situation in Catalonia. The Republican government was supported by a number of factions with conflicting aims, including the Workers' Party of Marxist Unification (POUM – Partido Obrero de Unificación Marxista), the anarcho-syndicalist Confederación Nacional del Trabajo (CNT) and the Unified Socialist Party of Catalonia (a wing of the Spanish Communist Party, which was backed by Soviet arms and aid). Orwell was at first exasperated by this "kaleidoscope" of political parties and trade unions, "with their tiresome names". The ILP was linked to the POUM, so Orwell joined the POUM. After a time at the Lenin Barracks in Barcelona he was sent to the relatively quiet Aragon Front under Georges Kopp. By January 1937 he was at Alcubierre, 1,500 feet above sea level, in the depth of winter. There was very little military action and Orwell was shocked by the lack of munitions, food and firewood as well as other extreme deprivations. With his Cadet Corps and police training, Orwell was quickly made a corporal. On the arrival of a British ILP Contingent about three weeks later, Orwell and the other English militiaman, Williams, were sent with them to Monte Oscuro. The newly arrived ILP contingent included Bob Smillie, Bob Edwards, Stafford Cottman and Jack Branthwaite. The unit was then sent on to Huesca.
Meanwhile, back in England, Eileen had been handling the issues relating to the publication of The Road to Wigan Pier before setting out for Spain herself, leaving Nellie Limouzin to look after The Stores. Eileen volunteered for a post in John McNair's office and with the help of Georges Kopp paid visits to her husband, bringing him English tea, chocolate and cigars. Orwell had to spend some days in hospital with a poisoned hand and had most of his possessions stolen by the staff. He returned to the front and saw some action in a night attack on the Nationalist trenches where he chased an enemy soldier with a bayonet and bombed an enemy rifle position. In April, Orwell returned to Barcelona. Wanting to be sent to the Madrid front, which meant he "must join the International Column", he approached a Communist friend attached to the Spanish Medical Aid and explained his case. "Although he did not think much of the Communists, Orwell was still ready to treat them as friends and allies. That would soon change." This was the time of the Barcelona May Days and Orwell was caught up in the factional fighting. He spent much of the time on a roof, with a stack of novels, but encountered Jon Kimche from his Hampstead days during the stay. The subsequent campaign of lies and distortion carried out by the Communist press, in which the POUM was accused of collaborating with the fascists, had a dramatic effect on Orwell. Instead of joining the International Brigades as he had intended, he decided to return to the Aragon Front. Once the May fighting was over, he was approached by a Communist friend who asked if he still intended transferring to the International Brigades. Orwell expressed surprise that they should still want him, because according to the Communist press he was a fascist. 
"No one who was in Barcelona then, or for months later, will forget the horrible atmosphere produced by fear, suspicion, hatred, censored newspapers, crammed jails, enormous food queues and prowling gangs of armed men." After his return to the front, he was wounded in the throat by a sniper's bullet. At 6 ft 2 in (1.88 m), Orwell was considerably taller than the Spanish fighters and had been warned against standing against the trench parapet. Unable to speak, and with blood pouring from his mouth, Orwell was carried on a stretcher to Siétamo, loaded on an ambulance and after a bumpy journey via Barbastro arrived at the hospital in Lleida. He recovered sufficiently to get up and on 27 May 1937 was sent on to Tarragona and two days later to a POUM sanatorium in the suburbs of Barcelona. The bullet had missed his main artery by the barest margin and his voice was barely audible. It had been such a clean shot that the wound immediately went through the process of cauterisation. He received electrotherapy treatment and was declared medically unfit for service. By the middle of June the political situation in Barcelona had deteriorated and the POUM—painted by the pro-Soviet Communists as a Trotskyist organisation—was outlawed and under attack. The Communist line was that the POUM were "objectively" Fascist, hindering the Republican cause. "A particularly nasty poster appeared, showing a head with a POUM mask being ripped off to reveal a Swastika-covered face beneath." Members, including Kopp, were arrested and others were in hiding. Orwell and his wife were under threat and had to lie low, although they broke cover to try to help Kopp. Finally with their passports in order, they escaped from Spain by train, diverting to Banyuls-sur-Mer for a short stay before returning to England. 
In the first week of July 1937 Orwell arrived back at Wallington; on 13 July 1937 a deposition was presented to the Tribunal for Espionage & High Treason in Valencia, charging the Orwells with "rabid Trotskyism", and being agents of the POUM. The trial of the leaders of the POUM and of Orwell (in his absence) took place in Barcelona in October and November 1938. Observing events from French Morocco, Orwell wrote that they were "only a by-product of the Russian Trotskyist trials and from the start every kind of lie, including flagrant absurdities, has been circulated in the Communist press." Orwell's experiences in the Spanish Civil War gave rise to Homage to Catalonia (1938). In his book, The International Brigades: Fascism, Freedom and the Spanish Civil War, Giles Tremlett writes that according to Soviet files, Orwell and his wife Eileen were spied on in Barcelona in May 1937. "The papers are documentary evidence that not only Orwell, but also his wife Eileen, were being watched closely". Rest and recuperation Orwell returned to England in June 1937, and stayed at the O'Shaughnessy home at Greenwich. He found his views on the Spanish Civil War out of favour. Kingsley Martin rejected two of his works and Gollancz was equally cautious. At the same time, the communist Daily Worker was running an attack on The Road to Wigan Pier, taking out of context Orwell writing that "the working classes smell"; a letter to Gollancz from Orwell threatening libel action brought a stop to this. Orwell was also able to find a more sympathetic publisher for his views in Fredric Warburg of Secker & Warburg. Orwell returned to Wallington, which he found in disarray after his absence. He acquired goats, a cockerel (rooster) he called Henry Ford and a poodle puppy he called Marx; and settled down to animal husbandry and writing Homage to Catalonia. There were thoughts of going to India to work on The Pioneer, a newspaper in Lucknow, but by March 1938 Orwell's health had deteriorated. 
He was admitted to Preston Hall Sanatorium at Aylesford, Kent, a British Legion hospital for ex-servicemen to which his brother-in-law Laurence O'Shaughnessy was attached. He was thought initially to be suffering from tuberculosis and stayed in the sanatorium until September. A stream of visitors came to see him, including Common, Heppenstall, Plowman and Cyril Connolly. Connolly brought with him Stephen Spender, a cause of some embarrassment as Orwell had referred to Spender as a "pansy friend" some time earlier. Homage to Catalonia was published by Secker & Warburg and was a commercial flop. In the latter part of his stay at the clinic, Orwell was able to go for walks in the countryside and study nature. The novelist L. H. Myers secretly funded a trip to French Morocco for half a year for Orwell to avoid the English winter and recover his health. The Orwells set out in September 1938 via Gibraltar and Tangier to avoid Spanish Morocco and arrived at Marrakech. They rented a villa on the road to Casablanca and during that time Orwell wrote Coming Up for Air. They arrived back in England on 30 March 1939 and Coming Up for Air was published in June. Orwell spent time in Wallington and Southwold working on a Dickens essay and it was in June 1939 that Orwell's father, Richard Blair, died. Second World War and Animal Farm At the outbreak of the Second World War, Orwell's wife Eileen started working in the Censorship Department of the Ministry of Information in central London, staying during the week with her family in Greenwich. Orwell also submitted his name to the Central Register for war work, but nothing transpired. "They won't have me in the army, at any rate at present, because of my lungs", Orwell told Geoffrey Gorer. He returned to Wallington, and in late 1939 he wrote material for his first collection of essays, Inside the Whale. For the next year he was occupied writing reviews for plays, films and books for The Listener, Time and Tide and New Adelphi. 
On 29 March 1940 his long association with Tribune began with a review of a sergeant's account of Napoleon's retreat from Moscow. At the beginning of 1940, the first edition of Connolly's Horizon appeared, and this provided a new outlet for Orwell's work as well as new literary contacts. In May the Orwells took lease of a flat in London at Dorset Chambers, Chagford Street, Marylebone. It was the time of the Dunkirk evacuation and the death in France of Eileen's brother Laurence caused her considerable grief and long-term depression. Throughout this period Orwell kept a wartime diary. Orwell was declared "unfit for any kind of military service" by the Medical Board in June, but soon afterwards found an opportunity to become involved in war activities by joining the Home Guard. He shared Tom Wintringham's socialist vision for the Home Guard as a revolutionary People's Militia. His lecture notes for instructing platoon members include advice on street fighting, field fortifications, and the use of mortars of various kinds. Sergeant Orwell managed to recruit Fredric Warburg to his unit. During the Battle of Britain he used to spend weekends with Warburg and his new Zionist friend, Tosco Fyvel, at Warburg's house at Twyford, Berkshire. At Wallington he worked on "England Your England" and in London wrote reviews for various periodicals. Visiting Eileen's family in Greenwich brought him face-to-face with the effects of the Blitz on East London. In mid-1940, Warburg, Fyvel and Orwell planned Searchlight Books. Eleven volumes eventually appeared, of which Orwell's The Lion and the Unicorn: Socialism and the English Genius, published on 19 February 1941, was the first.
Early in 1941 he began to write for the American Partisan Review which linked Orwell with The New York Intellectuals who were also anti-Stalinist, and contributed to the Gollancz anthology The Betrayal of the Left, written in the light of the Molotov–Ribbentrop Pact (although Orwell referred to it as the Russo-German Pact and the Hitler-Stalin Pact). He also applied unsuccessfully for a job at the Air Ministry. Meanwhile, he was still writing reviews of books and plays and at this time met the novelist Anthony Powell. He also took part in a few radio broadcasts for the Eastern Service of the BBC. In March the Orwells moved to a seventh-floor flat at Langford Court, St John's Wood, while at Wallington Orwell was "digging for victory" by planting potatoes. In August 1941, Orwell finally obtained "war work" when he was taken on full-time by the BBC's Eastern Service. When interviewed for the job he indicated that he "accept[ed] absolutely the need for propaganda to be directed by the government" and stressed his view that, in wartime, discipline in the execution of government policy was essential. He supervised cultural broadcasts to India to counter propaganda from Nazi Germany designed to undermine imperial links. This was Orwell's first experience of the rigid conformity of life in an office, and it gave him an opportunity to create cultural programmes with contributions from T. S. Eliot, Dylan Thomas, E. M. Forster, Ahmed Ali, Mulk Raj Anand, and William Empson among others. At the end of August he had a dinner with H. G. Wells which degenerated into a row because Wells had taken offence at observations Orwell made about him in a Horizon article. In October Orwell had a bout of bronchitis and the illness recurred frequently. David Astor was looking for a provocative contributor for The Observer and invited Orwell to write for him—the first article appearing in March 1942. 
In early 1942 Eileen changed jobs to work at the Ministry of Food and in mid-1942 the Orwells moved to a larger flat, a ground floor and basement, 10a Mortimer Crescent in Maida Vale/Kilburn—"the kind of lower-middle-class ambience that Orwell thought was London at its best." Around the same time Orwell's mother and sister Avril, who had found work in a sheet-metal factory behind King's Cross Station, moved into a flat close to George and Eileen. At the BBC, Orwell introduced Voice, a literary programme for his Indian broadcasts, and by now was leading an active social life with literary friends, particularly on the political left. Late in 1942, he started writing regularly for the left-wing weekly Tribune directed by Labour MPs Aneurin Bevan and George Strauss. In March 1943, Orwell's mother died, and around the same time he told Moore he was starting work on a new book, which turned out to be Animal Farm. In September 1943, Orwell resigned from the BBC post that he had occupied for two years. His resignation followed a report confirming his fears that few Indians listened to the broadcasts, but he was also keen to concentrate on writing Animal Farm. Just six days before his last day of service, on 24 November 1943, his adaptation of Hans Christian Andersen's fairy tale The Emperor's New Clothes was broadcast. It was a genre in which he was greatly interested and which appeared on Animal Farm's title page. At this time he also resigned from the Home Guard on medical grounds. In November 1943, Orwell was appointed literary editor at Tribune, where his assistant was his old friend Jon Kimche. Orwell was on staff until early 1945, writing over 80 book reviews and on 3 December 1943 started his regular personal column, "As I Please", usually addressing three or four subjects in each.
He was still writing reviews for other magazines, including Partisan Review, Horizon, and the New York Nation and becoming a respected pundit among left-wing circles but also a close friend of people on the right such as Powell, Astor and Malcolm Muggeridge. By April 1944 Animal Farm was ready for publication. Gollancz refused to publish it, considering it an attack on the Soviet regime, which was a crucial ally in the war. Other publishers (including T. S. Eliot at Faber and Faber) likewise refused it, until Jonathan Cape agreed to take it. In May the Orwells had the opportunity to adopt a child, thanks to the contacts of Eileen's sister Gwen O'Shaughnessy, then a doctor in Newcastle upon Tyne. In June a V-1 flying bomb struck Mortimer Crescent and the Orwells had to find somewhere else to live. Orwell had to scrabble around in the rubble for his collection of books, which he had finally managed to transfer from Wallington, carting them away in a wheelbarrow. Another blow was Cape's reversal of his plan to publish Animal Farm. The decision followed his personal visit to Peter Smollett, an official at the Ministry of Information. Smollett was later identified as a Soviet agent. The Orwells spent some time in the North East, near Carlton, County Durham, dealing with matters in the adoption of a boy whom they named Richard Horatio Blair. By September 1944 they had set up home in Islington, at 27b Canonbury Square. Baby Richard joined them there, and Eileen gave up her work at the Ministry of Food to look after her family. Secker & Warburg had agreed to publish Animal Farm, planned for the following March, although it did not appear in print until August 1945. By February 1945 David Astor had invited Orwell to become a war correspondent for The Observer. Orwell had been looking for the opportunity throughout the war, but his failed medical examinations had prevented him from being allowed anywhere near the action.
He went first to liberated Paris and then to Germany and Austria, to such cities as Cologne and Stuttgart. He was never in the front line and was never under fire, but he followed the troops closely, "sometimes entering a captured town within a day of its fall while dead bodies lay in the streets." Some of his reports were published in the Manchester Evening News. It was while he was there that Eileen went into hospital for a hysterectomy and died under anaesthetic on 29 March 1945. She had not given Orwell much notice about this operation because of worries about the cost and because she expected to make a speedy recovery. Orwell returned home for a while and then went back to Europe. He returned finally to London to cover the 1945 general election at the beginning of July. Animal Farm: A Fairy Story was published in Britain on 17 August 1945, and a year later in the US, on 26 August 1946. Jura and Nineteen Eighty-Four Animal Farm had particular resonance in the post-war climate and its worldwide success made Orwell a sought-after figure. For the next four years, Orwell mixed journalistic work—mainly for Tribune, The Observer and the Manchester Evening News, though he also contributed to many small-circulation political and literary magazines—with writing his best-known work, Nineteen Eighty-Four, which was published in 1949. He was a leading figure in the so-called Shanghai Club (named after a restaurant in Soho) of left-leaning and émigré journalists, among them E. H. Carr, Sebastian Haffner, Isaac Deutscher, Barbara Ward and Jon Kimche. In the year following Eileen's death he published around 130 articles and a selection of his Critical Essays, while remaining active in various political lobbying campaigns. He employed a housekeeper, Susan Watson, to look after his adopted son at the Islington flat, which visitors now described as "bleak".
In September he spent a fortnight on the island of Jura in the Inner Hebrides and saw it as a place to escape from the hassle of London literary life. David Astor was instrumental in arranging a place for Orwell on Jura. Astor's family owned Scottish estates in the area and a fellow Old Etonian, Robin Fletcher, had a property on the island. In late 1945 and early 1946 Orwell made several hopeless and unwelcome marriage proposals to younger women, including Celia Kirwan (who later became Arthur Koestler's sister-in-law); Ann Popham who happened to live in the same block of flats; and Sonia Brownell, one of Connolly's coterie at the Horizon office. Orwell suffered a tubercular haemorrhage in February 1946 but disguised his illness. In 1945 or early 1946, while still living at Canonbury Square, Orwell wrote an article on "British Cookery", complete with recipes, commissioned by the British Council. Given the post-war shortages, both parties agreed not to publish it. His sister Marjorie died of kidney disease in May, and soon afterwards, on 22 May 1946, Orwell set off to live on the Isle of Jura at a house known as Barnhill. This was an abandoned farmhouse with outbuildings near the northern end of the island, at the end of a five-mile (8 km) heavily rutted track from Ardlussa, where the owners lived. Conditions at the farmhouse were primitive but the natural history and the challenge of improving the place appealed to Orwell. His sister Avril accompanied him there and young novelist Paul Potts made up the party. In July Susan Watson arrived with Orwell's son Richard. Tensions developed and Potts departed after one of his manuscripts was used to light the fire. Orwell meanwhile set to work on Nineteen Eighty-Four. Later Susan Watson's boyfriend David Holbrook arrived. A fan of Orwell since school days, he found the reality very different, with Orwell hostile and disagreeable probably because of Holbrook's membership of the Communist Party. 
Watson could no longer stand being with Avril and she and her boyfriend left. Orwell returned to London in late 1946 and picked up his literary journalism again. Now a well-known writer, he was swamped with work. Apart from a visit to Jura in the new year he stayed in London for one of the coldest British winters on record and with such a national shortage of fuel that he burnt his furniture and his child's toys. The heavy smog in the days before the Clean Air Act 1956 did little to help his health, about which he was reticent, keeping clear of medical attention. Meanwhile, he had to cope with rival claims of publishers Gollancz and Warburg for publishing rights. About this time he co-edited a collection titled British Pamphleteers with Reginald Reynolds. As a result of the success of Animal Farm, Orwell was expecting a large bill from the Inland Revenue and he contacted a firm of accountants whose senior partner was Jack Harrison. The firm advised Orwell to establish a company to own his copyright and to receive his royalties and set up a "service agreement" so that he could draw a salary. Such a company, "George Orwell Productions Ltd" (GOP Ltd) was set up on 12 September 1947, although the service agreement was not then put into effect. Jack Harrison left the details at this stage to junior colleagues. Orwell left London for Jura on 10 April 1947. In July he ended the lease on the Wallington cottage. Back on Jura he worked on Nineteen Eighty-Four and made good progress. During that time his sister's family visited, and Orwell led a disastrous boating expedition, on 19 August, which nearly led to loss of life whilst trying to cross the notorious Gulf of Corryvreckan and gave him a soaking which was not good for his health. In December a chest specialist was summoned from Glasgow who pronounced Orwell seriously ill, and a week before Christmas 1947 he was in Hairmyres Hospital in East Kilbride, then a small village in the countryside, on the outskirts of Glasgow. 
Tuberculosis was diagnosed and the request for permission to import streptomycin to treat Orwell went as far as Aneurin Bevan, then Minister of Health. David Astor helped with supply and payment and Orwell began his course of streptomycin on 19 or 20 February 1948. By the end of July 1948 Orwell was able to return to Jura and by December he had finished the manuscript of Nineteen Eighty-Four. In January 1949, in a very weak condition, he set off for a sanatorium at Cranham, Gloucestershire, escorted by Richard Rees. The sanatorium at Cranham consisted of a series of small wooden chalets or huts in a remote part of the Cotswolds near Stroud. Visitors were shocked by Orwell's appearance and concerned by the shortcomings and ineffectiveness of the treatment. Friends were worried about his finances, but by now he was comparatively well off. He was writing to many of his friends, including Jacintha Buddicom, who had "rediscovered" him, and in March 1949, was visited by Celia Kirwan. Kirwan had just started working for a Foreign Office unit, the Information Research Department (IRD), set up by the Labour government to publish anti-communist propaganda, and Orwell gave her a list of people he considered to be unsuitable as IRD authors because of their pro-communist leanings. Orwell's list, not published until 2003, consisted mainly of writers but also included actors and Labour MPs. To further promote Animal Farm, the IRD commissioned cartoon strips, drawn by Norman Pett, to be placed in newspapers across the globe. Orwell received more streptomycin treatment and improved slightly. In June 1949 Nineteen Eighty-Four was published, to critical acclaim. Final months and death Orwell's health continued to decline after the diagnosis of tuberculosis in December 1947. In mid-1949, he courted Sonia Brownell, and they announced their engagement in September, shortly before he was removed to University College Hospital in London. 
Sonia took charge of Orwell's affairs and attended him diligently in the hospital. In September 1949, Orwell invited his accountant Harrison to visit him in hospital, and Harrison claimed that Orwell then asked him to become director of GOP Ltd and to manage the company, but there was no independent witness. Orwell's wedding took place in the hospital room on 13 October 1949, with David Astor as best man. Orwell was in decline and received a stream of visitors, including Muggeridge, Connolly, Lucian Freud, Stephen Spender, Evelyn Waugh, Paul Potts, Anthony Powell, and his Eton tutor Andrew Gow. Plans to go to the Swiss Alps were mooted. Further meetings were held with his accountant, at which Harrison and Mr and Mrs Blair were confirmed as directors of the company, and at which Harrison claimed that the "service agreement" was executed, giving copyright to the company. Orwell's health was in decline again by Christmas. On the evening of 20 January 1950, Potts visited Orwell and slipped away on finding him asleep. Jack Harrison visited later and claimed that Orwell gave him 25% of the company. Early on the morning of 21 January, an artery burst in Orwell's lungs, killing him at age 46. Orwell had requested to be buried in accordance with the Anglican rite in the graveyard of the closest church to wherever he happened to die. The graveyards in central London had no space, and so in an effort to ensure his last wishes could be fulfilled, his widow appealed to his friends to see whether any of them knew of a church with space in its graveyard. David Astor lived in Sutton Courtenay, Oxfordshire, and arranged for Orwell to be interred in the churchyard of All Saints' there. Orwell's gravestone bears the epitaph: "Here lies Eric Arthur Blair, born June 25th 1903, died January 21st 1950"; no mention is made on the gravestone of his more famous pen name. Orwell's adopted son, Richard Horatio Blair, was brought up by Orwell's sister Avril.
He is patron of The Orwell Society. In 1979, Sonia Brownell brought a High Court action against Harrison when he declared an intention to subdivide his 25 percent share of the company between his three children. For Sonia, the consequence of this manoeuvre would have been to make gaining overall control of the company three times more difficult. She was considered to have a strong case, but was becoming increasingly ill and eventually was persuaded to settle out of court on 2 November 1980. She died on 11 December 1980, aged 62. Literary career and legacy During most of his career, Orwell was best known for his journalism, in essays, reviews, columns in newspapers and magazines and in his books of reportage: Down and Out in Paris and London (describing a period of poverty in these cities), The Road to Wigan Pier (describing the living conditions of the poor in northern England, and class division generally) and Homage to Catalonia. According to Irving Howe, Orwell was "the best English essayist since Hazlitt, perhaps since Dr Johnson". Modern readers are more often introduced to Orwell as a novelist, particularly through his enormously successful titles Animal Farm and Nineteen Eighty-Four. The former is often thought to reflect degeneration in the Soviet Union after the Russian Revolution and the rise of Stalinism; the latter, life under totalitarian rule. Nineteen Eighty-Four is often compared to Brave New World by Aldous Huxley; both are powerful dystopian novels warning of a future world where the state machine exerts complete control over social life. In 1984, Nineteen Eighty-Four and Ray Bradbury's Fahrenheit 451 were honoured with the Prometheus Award for their contributions to dystopian literature. In 2011 he received it again for Animal Farm. In 2003, Nineteen Eighty-Four was listed at number eight on the BBC's The Big Read poll. In 2021, readers of the New York Times Book Review rated it third in a list of "The best books of the past 125 years."
Coming Up for Air, his last novel before World War II, is the most "English" of his novels; alarms of war mingle with images of the idyllic Thames-side Edwardian childhood of its protagonist, George Bowling. The novel is pessimistic; industrialism and capitalism have killed the best of Old England, and there were great, new external threats. In homely terms, Bowling posits the totalitarian hypotheses of Franz Borkenau, Orwell, Ignazio Silone and Koestler: "Old Hitler's something different. So's Joe Stalin. They aren't like these chaps in the old days who crucified people and chopped their heads off and so forth, just for the fun of it ... They're something quite new—something that's never been heard of before".

Literary influences

In an autobiographical piece that Orwell sent to the editors of Twentieth Century Authors in 1940, he wrote: "The writers I care about most and never grow tired of are: Shakespeare, Swift, Fielding, Dickens, Charles Reade, Flaubert and, among modern writers, James Joyce, T. S. Eliot and D. H. Lawrence. But I believe the modern writer who has influenced me most is W. Somerset Maugham, whom I admire immensely for his power of telling a story straightforwardly and without frills." Elsewhere, Orwell strongly praised the works of Jack London, especially his book The Road. Orwell's investigation of poverty in The Road to Wigan Pier strongly resembles that of Jack London's The People of the Abyss, in which the American journalist disguises himself as an out-of-work sailor to investigate the lives of the poor in London. In his essay "Politics vs. Literature: An Examination of Gulliver's Travels" (1946) Orwell wrote: "If I had to make a list of six books which were to be preserved when all others were destroyed, I would certainly put Gulliver's Travels among them." On H. G. Wells he wrote, "The minds of all of us, and therefore the physical world, would be perceptibly different if Wells had never existed."
Orwell was an admirer of Arthur Koestler and became a close friend during the three years that Koestler and his wife Mamaine spent at the cottage of Bwlch Ocyn, a secluded farmhouse that belonged to Clough Williams-Ellis, in the Vale of Ffestiniog. Orwell reviewed Koestler's Darkness at Noon for the New Statesman in 1941, saying:

Brilliant as this book is as a novel, and a piece of brilliant literature, it is probably most valuable as an interpretation of the Moscow "confessions" by someone with an inner knowledge of totalitarian methods. What was frightening about these trials was not the fact that they happened—for obviously such things are necessary in a totalitarian society—but the eagerness of Western intellectuals to justify them.

Other writers admired by Orwell included Ralph Waldo Emerson, George Gissing, Graham Greene, Herman Melville, Henry Miller, Tobias Smollett, Mark Twain, Joseph Conrad, and Yevgeny Zamyatin. He was both an admirer and a critic of Rudyard Kipling, praising Kipling as a gifted writer and a "good bad poet" whose work is "spurious" and "morally insensitive and aesthetically disgusting," but undeniably seductive and able to speak to certain aspects of reality more effectively than more enlightened authors. He had a similarly ambivalent attitude to G. K. Chesterton, whom he regarded as a writer of considerable talent who had chosen to devote himself to "Roman Catholic propaganda", and to Evelyn Waugh, who was, he wrote, "ab[ou]t as good a novelist as one can be (i.e. as novelists go today) while holding untenable opinions".

Orwell as literary critic

Throughout his life Orwell continually supported himself as a book reviewer. His reviews are well known and have had an influence on literary criticism. In the conclusion to his 1940 essay on Charles Dickens he wrote:

George Woodcock suggested that the last two sentences also describe Orwell. Orwell wrote a critique of George Bernard Shaw's play Arms and the Man.
He considered this Shaw's best play and the most likely to remain socially relevant, because of its theme that war is not, generally speaking, a glorious romantic adventure. His 1945 essay In Defence of P.G. Wodehouse contains an amusing assessment of Wodehouse's writing and also argues that his broadcasts from Germany (during the war) did not really make him a traitor. He accused the Ministry of Information of exaggerating Wodehouse's actions for propaganda purposes.

Food writing

In 1946, the British Council commissioned Orwell to write an essay on British food as part of a drive to promote British relations abroad. In the essay, titled British Cookery, Orwell described the British diet as "a simple, rather heavy, perhaps slightly barbarous diet", one where "hot drinks are acceptable at most hours of the day". He discusses the ritual of breakfast in the UK: "this is not a snack but a serious meal. The hour at which people have their breakfast is of course governed by the time at which they go to work." He wrote that high tea in the United Kingdom consisted of a variety of savoury and sweet dishes, but "no tea would be considered a good one if it did not include at least one kind of cake", before adding "as well as cakes, biscuits are much eaten at tea-time". Orwell included a recipe for marmalade, a popular British spread on bread. However, the British Council declined to publish the essay on the grounds that it was too problematic to write about food at the time of strict rationing in the UK. In 2019, the essay was discovered in the British Council's archives along with the rejection letter. The British Council issued an official apology to Orwell over the rejection of the commissioned essay.

Reception and evaluations of Orwell's works

Arthur Koestler said that Orwell's "uncompromising intellectual honesty made him appear almost inhuman at times". Ben Wattenberg stated: "Orwell's writing pierced intellectual hypocrisy wherever he found it".
According to historian Piers Brendon, "Orwell was the saint of common decency who would in earlier days, said his BBC boss Rushbrook Williams, 'have been either canonised—or burnt at the stake'". Raymond Williams in Politics and Letters: Interviews with New Left Review describes Orwell as a "successful impersonation of a plain man who bumps into experience in an unmediated way and tells the truth about it". Christopher Norris declared that Orwell's "homespun empiricist outlook—his assumption that the truth was just there to be told in a straightforward common-sense way—now seems not merely naïve but culpably self-deluding". The American scholar Scott Lucas has described Orwell as an enemy of the Left. John Newsinger has argued that Lucas could only do this by portraying "all of Orwell's attacks on Stalinism [...] as if they were attacks on socialism, despite Orwell's continued insistence that they were not". Orwell's work has taken a prominent place in the school literature curriculum in England, with Animal Farm a regular examination topic at the end of secondary education (GCSE), and Nineteen Eighty-Four a topic for subsequent examinations below university level (A Levels). A 2016 UK poll saw Animal Farm ranked the nation's favourite book from school. Historian John Rodden stated: "John Podhoretz did claim that if Orwell were alive today, he'd be standing with the neo-conservatives and against the Left. And the question arises, to what extent can you even begin to predict the political positions of somebody who's been dead three decades and more by that time?" In Orwell's Victory, Christopher Hitchens argues: "In answer to the accusation of inconsistency Orwell as a writer was forever taking his own temperature. In other words, here was someone who never stopped testing and adjusting his intelligence".
John Rodden points out the "undeniable conservative features in the Orwell physiognomy" and remarks on how "to some extent Orwell facilitated the kinds of uses and abuses by the Right that his name has been put to. In other ways there has been the politics of selective quotation." Rodden refers to the essay "Why I Write", in which Orwell refers to the Spanish Civil War as being his "watershed political experience", saying: "The Spanish War and other events in 1936–37, turned the scale. Thereafter I knew where I stood. Every line of serious work that I have written since 1936 has been written directly or indirectly against totalitarianism and for democratic socialism as I understand it." (emphasis in original) Rodden goes on to explain how, during the McCarthy era, the introduction to the Signet edition of Animal Farm, which sold more than 20 million copies, makes use of selective quotation: "[Introduction]: If the book itself, Animal Farm, had left any doubt of the matter, Orwell dispelled it in his essay Why I Write: 'Every line of serious work that I've written since 1936 has been written directly or indirectly against Totalitarianism ....' [Rodden]: dot, dot, dot, dot, the politics of ellipsis. 'For Democratic Socialism' is vaporized, just like Winston Smith did it at the Ministry of Truth, and that's very much what happened at the beginning of the McCarthy era and just continued, Orwell being selectively quoted."

Fyvel wrote about Orwell: "His crucial experience [...] was his struggle to turn himself into a writer, one which led through long periods of poverty, failure and humiliation, and about which he has written almost nothing directly. The sweat and agony was less in the slum-life than in the effort to turn the experience into literature."

In October 2015, Finlay Publisher, for the Orwell Society, published George Orwell: The Complete Poetry, compiled and presented by Dione Venables.
Influence on language and writing

In his essay "Politics and the English Language" (1946), Orwell wrote about the importance of precise and clear language, arguing that vague writing can be used as a powerful tool of political manipulation because it shapes the way we think. In that essay, Orwell provides six rules for writers:

1. Never use a metaphor, simile or other figure of speech which you are used to seeing in print.
2. Never use a long word where a short one will do.
3. If it is possible to cut a word out, always cut it out.
4. Never use the passive where you can use the active.
5. Never use a foreign phrase, a scientific word or a jargon word if you can think of an everyday English equivalent.
6. Break any of these rules sooner than say anything outright barbarous.

Orwell worked as a journalist at The Observer for seven years, and its editor David Astor gave a copy of this celebrated essay to every new recruit. In 2003, literary editor at the newspaper Robert McCrum wrote, "Even now, it is quoted in our style book". Journalist Jonathan Heawood noted: "Orwell's criticism of slovenly language is still taken very seriously." Andrew N. Rubin argues that "Orwell claimed that we should be attentive to how the use of language has limited our capacity for critical thought just as we should be equally concerned with the ways in which dominant modes of thinking have reshaped the very language that we use."

The adjective "Orwellian" connotes an attitude and a policy of control by propaganda, surveillance, misinformation, denial of truth and manipulation of the past. In Nineteen Eighty-Four, Orwell described a totalitarian government that controlled thought by controlling language, making certain ideas literally unthinkable. Several words and phrases from Nineteen Eighty-Four have entered popular language. "Newspeak" is a simplified and obfuscatory language designed to make independent thought impossible. "Doublethink" means holding two contradictory beliefs simultaneously.
The "Thought Police" are those who suppress all dissenting opinion. "Prolefeed" is homogenised, manufactured superficial literature, film and music used to control and indoctrinate the populace through docility. "Big Brother" is a supreme dictator who watches everyone. Orwell may have been the first to use the term "cold war" to refer to the state of tension between powers in the Western Bloc and the Eastern Bloc that followed World War II in his essay, "You and the Atom Bomb", published in Tribune on 19 October 1945. He wrote: "We may be heading not for general breakdown but for an epoch as horribly stable as the slave empires of antiquity. James Burnham's theory has been much discussed, but few people have yet considered its ideological implications—this is, the kind of world-view, the kind of beliefs, and the social structure that would probably prevail in a State which was at once unconquerable and in a permanent state of 'cold war' with its neighbours." Modern culture In 2014, a play written by playwright Joe Sutton titled Orwell in America was first performed by the Northern Stage theatre company in White River Junction, Vermont. It is a fictitious account of Orwell doing a book tour in the United States (something he never did in his lifetime). It moved to off-Broadway in 2016. Orwell's birthplace, a bungalow in Motihari, Bihar, India, was opened as a museum in May 2015. Statue A statue of George Orwell, sculpted by the British sculptor Martin Jennings, was unveiled on 7 November 2017 outside Broadcasting House, the headquarters of the BBC. The wall behind the statue is inscribed with the following phrase: "If liberty means anything at all, it means the right to tell people what they do not want to hear". These are words from his proposed preface to Animal Farm and a rallying cry for the idea of free speech in an open society. Personal life Childhood Jacintha Buddicom's account, Eric & Us, provides an insight into Blair's childhood. 
She quoted his sister Avril that "he was essentially an aloof, undemonstrative person" and said herself of his friendship with the Buddicoms: "I do not think he needed any other friends beyond the schoolfriend he occasionally and appreciatively referred to as 'CC'". She could not recall him having schoolfriends to stay and exchange visits as her brother Prosper often did in the holidays. Cyril Connolly provides an account of Blair as a child in Enemies of Promise. Years later, Blair mordantly recalled his prep school in the essay "Such, Such Were the Joys", claiming among other things that he "was made to study like a dog" to earn a scholarship, which he alleged was solely to enhance the school's prestige with parents. Jacintha Buddicom repudiated Orwell's schoolboy misery described in the essay, stating that "he was a specially happy child". She noted that he did not like his name because it reminded him of a book he greatly disliked—Eric, or, Little by Little, a Victorian boys' school story. Connolly remarked of him as a schoolboy, "The remarkable thing about Orwell was that alone among the boys he was an intellectual and not a parrot for he thought for himself". At Eton, John Vaughan Wilkes, his former headmaster's son at St Cyprian's, recalled that "he was extremely argumentative—about anything—and criticising the masters and criticising the other boys [...] We enjoyed arguing with him. He would generally win the arguments—or think he had anyhow." Roger Mynors concurs: "Endless arguments about all sorts of things, in which he was one of the great leaders. He was one of those boys who thought for himself." Blair liked to carry out practical jokes. Buddicom recalls him swinging from the luggage rack in a railway carriage like an orangutan to frighten a woman passenger out of the compartment. At Eton, he played tricks on John Crace, his housemaster, among which was to enter a spoof advertisement in a college magazine implying pederasty.
Gow, his tutor, said he "made himself as big a nuisance as he could" and "was a very unattractive boy". Later Blair was expelled from the crammer at Southwold for sending a dead rat as a birthday present to the town surveyor. In one of his As I Please essays he refers to a protracted joke when he answered an advertisement for a woman who claimed a cure for obesity. Blair had an interest in natural history which stemmed from his childhood. In letters from school he wrote about caterpillars and butterflies, and Buddicom recalls his keen interest in ornithology. He also enjoyed fishing and shooting rabbits, and conducting experiments as in cooking a hedgehog or shooting down a jackdaw from the Eton roof to dissect it. His zeal for scientific experiments extended to explosives—again Buddicom recalls a cook giving notice because of the noise. Later in Southwold, his sister Avril recalled him blowing up the garden. When teaching he enthused his students with his nature-rambles both at Southwold and at Hayes. His adult diaries are permeated with his observations on nature.

Relationships and marriage

Buddicom and Blair lost touch shortly after he went to Burma and she became unsympathetic towards him. She wrote that it was because of the letters he wrote complaining about his life, but an addendum to Eric & Us by Venables reveals that he may have lost her sympathy through an incident which was, at best, a clumsy attempt at seduction. Mabel Fierz, who later became Blair's confidante, said: "He used to say the one thing he wished in this world was that he'd been attractive to women. He liked women and had many girlfriends I think in Burma. He had a girl in Southwold and another girl in London. He was rather a womaniser, yet he was afraid he wasn't attractive." Brenda Salkield (Southwold) preferred friendship to any deeper relationship and maintained a correspondence with Blair for many years, particularly as a sounding board for his ideas.
She wrote: "He was a great letter writer. Endless letters, and I mean when he wrote you a letter he wrote pages." His correspondence with Eleanor Jacques (London) was more prosaic, dwelling on a closer relationship and referring to past rendezvous or planning future ones in London and Burnham Beeches. When Orwell was in the sanatorium in Kent, his wife's friend Lydia Jackson visited. He invited her for a walk and out of sight "an awkward situation arose." Jackson was to be the most critical of Orwell's marriage to Eileen O'Shaughnessy, but their later correspondence hints at a complicity. Eileen at the time was more concerned about Orwell's closeness to Brenda Salkield. Orwell had an affair with his secretary at Tribune which caused Eileen much distress, and others have been mooted. In a letter to Ann Popham he wrote: "I was sometimes unfaithful to Eileen, and I also treated her badly, and I think she treated me badly, too, at times, but it was a real marriage, in the sense that we had been through awful struggles together and she understood all about my work, etc." Similarly he suggested to Celia Kirwan that they had both been unfaithful. There are several testaments that it was a well-matched and happy marriage.Patrica Donahue in Stephen Wadhams Remembering Orwell In June 1944, Orwell and Eileen adopted a three-week-old boy they named Richard Horatio. According to Richard, Orwell was a wonderful father who gave him devoted, if rather rugged, attention and a great degree of freedom. After Orwell's death Richard went to live with Orwell's sister and her husband. Blair was very lonely after Eileen's death in 1945, and desperate for a wife, both as companion for himself and as mother for Richard. He proposed marriage to four women, including Celia Kirwan, and eventually Sonia Brownell accepted. Orwell had met her when she was assistant to Cyril Connolly, at Horizon literary magazine. They were married on 13 October 1949, only three months before Orwell's death. 
Some maintain that Sonia was the model for Julia in Nineteen Eighty-Four.

Social interactions

Orwell was noted for very close and enduring friendships with a few friends, but these were generally people with a similar background or with a similar level of literary ability. Ungregarious, he was out of place in a crowd and his discomfort was exacerbated when he was outside his own class. Though representing himself as a spokesman for the common man, he often appeared out of place with real working people. His brother-in-law Humphrey Dakin, a "Hail fellow, well met" type, who took him to a local pub in Leeds, said that he was told by the landlord: "Don't bring that bugger in here again." Adrian Fierz commented "He wasn't interested in racing or greyhounds or pub crawling or shove ha'penny. He just did not have much in common with people who did not share his intellectual interests." Awkwardness attended many of his encounters with working-class representatives, as with Pollitt and McNair, but his courtesy and good manners were often commented on. Jack Common observed on meeting him for the first time, "Right away manners, and more than manners—breeding—showed through." In his tramping days, he did domestic work for a time. His extreme politeness was recalled by a member of the family he worked for; she declared that the family referred to him as "Laurel" after the film comedian. With his gangling figure and awkwardness, Orwell's friends often saw him as a figure of fun. Geoffrey Gorer commented "He was awfully likely to knock things off tables, trip over things. I mean, he was a gangling, physically badly co-ordinated young man. I think his feeling [was] that even the inanimate world was against him." When he shared a flat with Heppenstall and Sayer, he was treated in a patronising manner by the younger men.
At the BBC in the 1940s, "everybody would pull his leg" and Spender described him as having real entertainment value "like, as I say, watching a Charlie Chaplin movie". A friend of Eileen's reminisced about her tolerance and humour, often at Orwell's expense. One biography of Orwell accused him of having had an authoritarian streak. In Burma, he struck out at a Burmese boy who, while "fooling around" with his friends, had "accidentally bumped into him" at a station, resulting in Orwell falling "heavily" down some stairs. One of his former pupils recalled being beaten so hard he could not sit down for a week. When sharing a flat with Orwell, Heppenstall came home late one night in an advanced stage of loud inebriation. The upshot was that Heppenstall ended up with a bloody nose and was locked in a room. When he complained, Orwell hit him across the legs with a shooting stick and Heppenstall then had to defend himself with a chair. Years later, after Orwell's death, Heppenstall wrote a dramatic account of the incident called "The Shooting Stick" and Mabel Fierz confirmed that Heppenstall came to her in a sorry state the following day. Orwell got on well with young people. The pupil he beat considered him the best of teachers and the young recruits in Barcelona tried to drink him under the table without success. His nephew recalled Uncle Eric laughing louder than anyone in the cinema at a Charlie Chaplin film. In the wake of his most famous works, he attracted many uncritical hangers-on, but many others who sought him found him aloof and even dull. With his soft voice, he was sometimes shouted down or excluded from discussions. At this time, he was severely ill; it was wartime or the austerity period after it; during the war his wife suffered from depression; and after her death he was lonely and unhappy. In addition to that, he always lived frugally and seemed unable to care for himself properly. As a result of all this, people found his circumstances bleak. 
Some, like Michael Ayrton, called him "Gloomy George", but others developed the idea that he was an "English secular saint". Although Orwell was frequently heard on the BBC for panel discussions and one-man broadcasts, no recorded copy of his voice is known to exist.

Lifestyle

Orwell was a heavy smoker, who rolled his own cigarettes from strong shag tobacco, despite his bronchial condition. His penchant for the rugged life often took him to cold and damp situations, both in the long term, as in Catalonia and Jura, and short term, for example, motorcycling in the rain and suffering a shipwreck. Described by The Economist as "perhaps the 20th century's best chronicler of English culture", Orwell considered fish and chips, football, the pub, strong tea, cut-price chocolate, the movies, and radio among the chief comforts for the working class. He advocated a patriotic defence of a British way of life that could not be trusted to intellectuals or, by implication, the state:

Orwell enjoyed strong tea—he had Fortnum & Mason's tea brought to him in Catalonia. His 1946 essay, "A Nice Cup of Tea", appeared in the London Evening Standard as an article on how to make tea, with Orwell writing, "tea is one of the mainstays of civilisation in this country and causes violent disputes over how it should be made", the main issue being whether to put tea in the cup first and add the milk afterward, or the other way round, on which he states, "in every family in Britain there are probably two schools of thought on the subject". He appreciated English beer, taken regularly and moderately, despised drinkers of lager and wrote about an imagined, ideal British pub in his 1946 Evening Standard article, "The Moon Under Water". Not as particular about food, he enjoyed the wartime "Victory Pie" and extolled canteen food at the BBC. He preferred traditional English dishes, such as roast beef and kippers.
His 1945 essay, "In Defence of English Cooking", included Yorkshire pudding, crumpets, muffins, innumerable biscuits, Christmas pudding, shortbread, various British cheeses and Oxford marmalade. Reports of his Islington days refer to the cosy afternoon tea table. His dress sense was unpredictable and usually casual. In Southwold, he had the best cloth from the local tailor but was equally happy in his tramping outfit. His attire in the Spanish Civil War, along with his size-12 boots, was a source of amusement (Jennie Lee in Peter Davison, Complete Works XI). David Astor described him as looking like a prep school master, while according to the Special Branch dossier, Orwell's tendency to dress "in Bohemian fashion" revealed that the author was "a Communist". Orwell's confusing approach to matters of social decorum—on the one hand expecting a working-class guest to dress for dinner, and on the other, slurping tea out of a saucer at the BBC canteen—helped stoke his reputation as an English eccentric.

Views

Religion

Orwell was an atheist who identified himself with the humanist outlook on life. Despite this, and despite his criticisms of both religious doctrine and religious organisations, he nevertheless regularly participated in the social and civic life of the church, including by attending Church of England Holy Communion. Acknowledging this contradiction, he once said: "It seems rather mean to go to HC [Holy Communion] when one doesn't believe, but I have passed myself off for pious & there is nothing for it but to keep up with the deception." He had two Anglican marriages and left instructions for an Anglican funeral. Orwell was also extremely well-read in Biblical literature and could quote lengthy passages from the Book of Common Prayer from memory. His extensive knowledge of the Bible came coupled with unsparing criticism of its philosophy, and as an adult he could not bring himself to believe in its tenets.
He said in part V of his essay, "Such, Such Were the Joys", that "Till about the age of fourteen I believed in God, and believed that the accounts given of him were true. But I was well aware that I did not love him." Orwell directly contrasted Christianity with secular humanism in his essay "Lear, Tolstoy and the Fool", finding the latter philosophy more palatable and less "self-interested". Literary critic James Wood wrote that in the struggle, as he saw it, between Christianity and humanism, "Orwell was on the humanist side, of course—basically an unmetaphysical, English version of Camus's philosophy of perpetual godless struggle." Orwell's writing was often explicitly critical of religion, and Christianity in particular. He found the church to be a "selfish [...] church of the landed gentry" with its establishment "out of touch" with the majority of its communicants and altogether a pernicious influence on public life. In their 1972 study, The Unknown Orwell, the writers Peter Stansky and William Abrahams noted that at Eton Blair displayed a "sceptical attitude" to Christian belief. Crick observed that Orwell displayed "a pronounced anti-Catholicism". Evelyn Waugh, writing in 1946, acknowledged Orwell's high moral sense and respect for justice but believed "he seems never to have been touched at any point by a conception of religious thought and life." His contradictory and sometimes ambiguous views about the social benefits of religious affiliation mirrored the dichotomies between his public and private lives: Stephen Ingle wrote that it was as if the writer George Orwell "vaunted" his unbelief while Eric Blair the individual retained "a deeply ingrained religiosity".

Politics

Orwell liked to provoke arguments by challenging the status quo, but he was also a traditionalist with a love of old English values.
He criticised and satirised, from the inside, the various social milieux in which he found himself—provincial town life in A Clergyman's Daughter; middle-class pretension in Keep the Aspidistra Flying; preparatory schools in "Such, Such Were the Joys"; and some socialist groups in The Road to Wigan Pier. In his Adelphi days, he described himself as a "Tory-anarchist". Of colonialism in Burmese Days, he portrays the English colonists as a "dull, decent people, cherishing and fortifying their dullness behind a quarter of a million bayonets."

In 1928, Orwell began his career as a professional writer in Paris at a journal owned by the French Communist Henri Barbusse. His first article, "La Censure en Angleterre" ("Censorship in England"), was an attempt to account for the "extraordinary and illogical" moral censorship of plays and novels then practised in Britain. His own explanation was that the rise of the "puritan middle class", who had stricter morals than the aristocracy, tightened the rules of censorship in the 19th century. Orwell's first published article in his home country, "A Farthing Newspaper", was a critique of the new French daily the Ami du Peuple. This paper was sold much more cheaply than most others, and was intended for ordinary people to read. Orwell pointed out that its proprietor François Coty also owned the right-wing dailies Le Figaro and Le Gaulois, which the Ami du Peuple was supposedly competing against. Orwell suggested that cheap newspapers were no more than a vehicle for advertising and anti-leftist propaganda, and predicted the world might soon see free newspapers which would drive legitimate dailies out of business. Writing for Le Progrès Civique, Orwell described the British colonial government in Burma and India:

Spanish Civil War and socialism

The Spanish Civil War played the most important part in defining Orwell's socialism.
He wrote to Cyril Connolly from Barcelona on 8 June 1937: "I have seen wonderful things and at last really believe in Socialism, which I never did before." Having witnessed the success of the anarcho-syndicalist communities, for example in Anarchist Catalonia, and the subsequent brutal suppression of the anarcho-syndicalists, anti-Stalin communist parties and revolutionaries by the Soviet Union-backed Communists, Orwell returned from Catalonia a staunch anti-Stalinist and joined the British Independent Labour Party, his card being issued on 13 June 1938. Although he was never a Trotskyist, he was strongly influenced by the Trotskyist and anarchist critiques of the Soviet regime, and by the anarchists' emphasis on individual freedom. In Part 2 of The Road to Wigan Pier, published by the Left Book Club, Orwell stated that "a real Socialist is one who wishes—not merely conceives it as desirable, but actively wishes—to see tyranny overthrown". Orwell stated in "Why I Write" (1946): "Every line of serious work that I have written since 1936 has been written, directly or indirectly, against totalitarianism and for democratic socialism, as I understand it."

Orwell's conception of socialism was of a planned economy alongside democracy, which was the common notion of socialism in the early and middle 20th century. Orwell's emphasis on "democracy" primarily referred to a strong emphasis on civil liberties within a socialist economy as opposed to majoritarian rule, though he was not necessarily opposed to majority rule. Orwell was a proponent of a federal socialist Europe, a position outlined in his 1947 essay "Toward European Unity", which first appeared in Partisan Review. According to biographer John Newsinger:

In his 1938 essay "Why I joined the Independent Labour Party," published in the ILP-affiliated New Leader, Orwell wrote:

Towards the end of the essay, he wrote: "I do not mean I have lost all faith in the Labour Party.
My most earnest hope is that the Labour Party will win a clear majority in the next General Election."

The Second World War
Orwell was opposed to rearmament against Nazi Germany and at the time of the Munich Agreement he signed a manifesto entitled "If War Comes We Shall Resist"—but he changed his view after the Molotov–Ribbentrop Pact and the outbreak of the war. He left the ILP because of its opposition to the war and adopted a political position of "revolutionary patriotism". On 21 March 1940 he wrote a review of Adolf Hitler's Mein Kampf for The New English Weekly, in which he analysed the dictator's psychology. According to Orwell, "a thing that strikes one is the rigidity of his mind, the way in which his world-view doesn't develop. It is the fixed vision of a monomaniac and not likely to be much affected by the temporary manoeuvres of power politics". Asking "how was it that he was able to put [his] monstrous vision across?", Orwell tried to understand why Hitler was worshipped by the German people: "The situation in Germany, with its seven million unemployed, was obviously favourable for demagogues. But Hitler could not have succeeded against his many rivals if it had not been for the attraction of his own personality, which one can feel even in the clumsy writing of Mein Kampf, and which is no doubt overwhelming when one hears his speeches ... The fact is that there is something deeply appealing about him. The initial, personal cause of his grievance against the universe can only be guessed at; but at any rate the grievance is here. He is the martyr, the victim, Prometheus chained to the rock, the self-sacrificing hero who fights single-handed against impossible odds. If he were killing a mouse he would know how to make it seem like a dragon." In December 1940 he wrote in Tribune (the Labour left's weekly): "We are in a strange period of history in which a revolutionary has to be a patriot and a patriot has to be a revolutionary."
During the war, Orwell was highly critical of the popular idea that an Anglo-Soviet alliance would be the basis of a post-war world of peace and prosperity. In 1942, commenting on The Times editor E. H. Carr's pro-Soviet views, Orwell stated that "all the appeasers, e.g. Professor E.H. Carr, have switched their allegiance from Hitler to Stalin". In his reply (dated 15 November 1943) to an invitation from the Duchess of Atholl to speak for the British League for European Freedom, he stated that he did not agree with their objectives. He admitted that what they said was "more truthful than the lying propaganda found in most of the press", but added that he could not "associate himself with an essentially Conservative body" that claimed to "defend democracy in Europe" but had "nothing to say about British imperialism". His closing paragraph stated: "I belong to the Left and must work inside it, much as I hate Russian totalitarianism and its poisonous influence in this country."

Tribune and post-war Britain
Orwell joined the staff of Tribune magazine as literary editor, and from then until his death was a left-wing (though hardly orthodox) Labour-supporting democratic socialist. On 1 September 1944, writing about the Warsaw uprising, Orwell used Tribune to express his hostility to the influence the alliance with the USSR exerted over the Allies: "Do remember that dishonesty and cowardice always have to be paid for. Do not imagine that for years on end you can make yourself the boot-licking propagandist of the Soviet regime, or any other regime, and then suddenly return to honesty and reason. Once a whore, always a whore." According to Newsinger, although Orwell "was always critical of the 1945–51 Labour government's moderation, his support for it began to pull him to the right politically. This did not lead him to embrace conservatism, imperialism or reaction, but to defend, albeit critically, Labour reformism." Between 1945 and 1947, with A. J.
Ayer and Bertrand Russell, he contributed a series of articles and essays to Polemic, a short-lived British "Magazine of Philosophy, Psychology, and Aesthetics" edited by the ex-Communist Humphrey Slater (Collini, Stefan. Absent Minds: Intellectuals in Britain. Oxford University Press. 2006). In early 1945 he wrote a long essay, "Antisemitism in Britain", for the Contemporary Jewish Record, in which he stated that antisemitism was on the increase in Britain and that it was "irrational and will not yield to arguments". He argued that it would be useful to discover why anti-Semites could "swallow such absurdities on one particular subject while remaining sane on others". He wrote: "For quite six years the English admirers of Hitler contrived not to learn of the existence of Dachau and Buchenwald. ... Many English people have heard almost nothing about the extermination of German and Polish Jews during the present war. Their own anti-Semitism has caused this vast crime to bounce off their consciousness." In Nineteen Eighty-Four, written shortly after the war, Orwell portrayed the Party as enlisting anti-Semitic passions against their enemy, Goldstein. Orwell publicly defended P. G. Wodehouse against charges of being a Nazi sympathiser—occasioned by his agreement to do some broadcasts over the German radio in 1941—a defence based on Wodehouse's lack of interest in and ignorance of politics. Special Branch, the intelligence division of the Metropolitan Police, maintained a file on Orwell for more than 20 years of his life. The dossier, published by The National Archives, states that, according to one investigator, Orwell had "advanced Communist views and several of his Indian friends say that they have often seen him at Communist meetings".
MI5, the intelligence department of the Home Office, noted: "It is evident from his recent writings—'The Lion and the Unicorn'—and his contribution to Gollancz's symposium The Betrayal of the Left that he does not hold with the Communist Party nor they with him."

Sexuality
Sexual politics plays an important role in Nineteen Eighty-Four. In the novel, people's intimate relationships are strictly governed by the party's Junior Anti-Sex League, which opposes sexual relations and instead encourages artificial insemination. Personally, Orwell disliked what he regarded as misguided middle-class revolutionary emancipatory views, expressing disdain for "every fruit-juice drinker, nudist, sandal-wearer, sex-maniac". Orwell was also openly against homosexuality, at a time when such prejudice was common. Speaking at the 2003 George Orwell Centenary Conference, Daphne Patai said: "Of course he was homophobic. That has nothing to do with his relations with his homosexual friends. Certainly, he had a negative attitude and a certain kind of anxiety, a denigrating attitude towards homosexuality. That is definitely the case. I think his writing reflects that quite fully." Orwell used the homophobic epithets "nancy" and "pansy", as in his expressions of contempt for what he called the "pansy Left" and "nancy poets", i.e. left-wing homosexual or bisexual writers and intellectuals such as Stephen Spender and W. H. Auden. The protagonist of Keep the Aspidistra Flying, Gordon Comstock, conducts an internal critique of his customers when working in a bookshop, and there is an extended passage of several pages in which he concentrates on a homosexual male customer, and sneers at him for his "nancy" characteristics, including a lisp, which he identifies in detail, with some disgust. Stephen Spender "thought Orwell's occasional homophobic outbursts were part of his rebellion against the public school".
Biographies of Orwell
Orwell's will requested that no biography of him be written, and his widow, Sonia Brownell, repelled every attempt by those who tried to persuade her to let them write about him. Various recollections and interpretations were published in the 1950s and 1960s, but Sonia saw the 1968 Collected Works as the record of his life. She did appoint Malcolm Muggeridge as official biographer, but later biographers have seen this as deliberate spoiling, as Muggeridge eventually gave up the work. In 1972, two American authors, Peter Stansky and William Abrahams, produced The Unknown Orwell, an unauthorised account of his early years that lacked any support or contribution from Sonia Brownell. Sonia Brownell then commissioned Bernard Crick, a professor of politics at the University of London, to complete a biography and asked Orwell's friends to co-operate. Crick collated a considerable amount of material in his work, which was published in 1980, but his questioning of the factual accuracy of Orwell's first-person writings led to conflict with Brownell, and she tried to suppress the book. Crick concentrated on the facts of Orwell's life rather than his character, and presented primarily a political perspective on Orwell's life and work. After Sonia Brownell's death, other works on Orwell were published in the 1980s, particularly in 1984. These included collections of reminiscences by Coppard and Crick, and by Stephen Wadhams. In 1991, Michael Shelden, an American professor of literature, published a biography. More concerned with the literary nature of Orwell's work, he sought explanations for Orwell's character and treated his first-person writings as autobiographical. Shelden introduced new information that sought to build on Crick's work. Shelden speculated that Orwell possessed an obsessive belief in his failure and inadequacy.
Peter Davison's publication of the Complete Works of George Orwell, completed in 2000, made most of the Orwell Archive accessible to the public. Jeffrey Meyers, a prolific American biographer, was first to take advantage of this and published a book in 2001 that investigated the darker side of Orwell and questioned his saintly image. Why Orwell Matters (released in the United Kingdom as Orwell's Victory) was published by Christopher Hitchens in 2002. In 2003, the centenary of Orwell's birth resulted in biographies by Gordon Bowker and D. J. Taylor, both academics and writers in the United Kingdom. Taylor notes the stage management which surrounds much of Orwell's behaviour and Bowker highlights the essential sense of decency which he considers to have been Orwell's main motivation. (Review: Orwell by D. J. Taylor and George Orwell by Gordon Bowker, The Observer, 1 June 2003.)

Bibliography

Novels
1934 – Burmese Days
1935 – A Clergyman's Daughter
1936 – Keep the Aspidistra Flying
1939 – Coming Up for Air
1945 – Animal Farm
1949 – Nineteen Eighty-Four

Nonfiction
1933 – Down and Out in Paris and London
1937 – The Road to Wigan Pier
1938 – Homage to Catalonia

Notes

References

Sources
Anderson, Paul (ed.). Orwell in Tribune: 'As I Please' and Other Writings. Methuen/Politico's. 2006.
Azurmendi, Joxe. "George Orwell. 1984: Reality exists in the human mind". Jakin, 32: 87–103. 1984.
Bounds, Philip. Orwell and Marxism: The Political and Cultural Thinking of George Orwell. I.B. Tauris. 2009.
Bowker, Gordon. George Orwell. Little Brown. 2003.
Buddicom, Jacintha. Eric & Us. Finlay Publisher. 2006.
Caute, David. Dr. Orwell and Mr. Blair. Weidenfeld & Nicolson.
Crick, Bernard. George Orwell: A Life. Penguin. 1982.
Davison, Peter; Angus, Ian; Davison, Sheila (eds.). A Kind of Compulsion. London: Random House. 2000.
Flynn, Nigel. George Orwell. The Rourke Corporation, Inc. 1990.
Haycock, David Boyd. I Am Spain: The Spanish Civil War and the Men and Women who went to Fight Fascism. Old Street Publishing. 2013.
Hitchens, Christopher. Why Orwell Matters. Basic Books. 2003.
Hollis, Christopher. A Study of George Orwell: The Man and His Works. Chicago: Henry Regnery Co. 1956.
Larkin, Emma. Secret Histories: Finding George Orwell in a Burmese Teashop. Penguin. 2005.
Lee, Robert A. Orwell's Fiction. University of Notre Dame Press. 1969.
Leif, Ruth Ann. Homage to Oceania: The Prophetic Vision of George Orwell. Ohio State University Press. 1969.
Meyers, Jeffrey. Orwell: Wintry Conscience of a Generation. W.W. Norton. 2000.
Newsinger, John. Orwell's Politics. Macmillan. 1999.
Rodden, John (ed.). The Cambridge Companion to George Orwell. Cambridge. 2007.
Shelden, Michael. Orwell: The Authorized Biography. HarperCollins. 1991.
Smith, D. & Mosher, M. Orwell for Beginners. London: Writers and Readers Publishing Cooperative. 1984.
Taylor, D. J. Orwell: The Life. Henry Holt and Company. 2003.
West, W. J. The Larger Evils: Nineteen Eighty-Four – The Truth Behind the Satire. Edinburgh: Canongate Press. 1992.
West, W. J. (ed.). George Orwell: The Lost Writings. New York: Arbor House. 1984.
Williams, Raymond. Orwell. Fontana/Collins. 1971.
Wood, James. "A Fine Rage". The New Yorker. 2009. 85(9):54.
Woodcock, George. The Crystal Spirit. Little Brown. 1966.

Further reading
Morgan, W. John. "Pacifism or Bourgeois Pacifism? Huxley, Orwell, and Caudwell". Chapter 5 in Morgan, W. John and Guilherme, Alexandre (eds.), Peace and War: Historical, Philosophical, and Anthropological Perspectives. Palgrave Macmillan. 2020. pp. 71–96.
Orwell, George. Diaries, edited by Peter Davison. W. W. Norton & Company. 2012. 597 pages; annotated edition of 11 diaries kept by Orwell, from August 1931 to September 1949.
Ostrom, Hans and Halton, William. Orwell's "Politics and the English Language" in the Age of Pseudocracy. New York: Routledge. 2018.
Wilson, S. M. and Huxtable, J.
Such, Such Were the Joys: A Graphic Novel. London: Pluto Press. Sept 2021.

External links
Blair, Eric Arthur (George Orwell) (1903–1950) at the Oxford Dictionary of National Biography
George Orwell at the British Library
Works: The complete works of George Orwell (george-orwell.org), a fan site
https://en.wikipedia.org/wiki/Goeldi%27s%20marmoset
Goeldi's marmoset
The Goeldi's marmoset or Goeldi's monkey (Callimico goeldii) is a small, South American New World monkey that lives in the upper Amazon basin region of Bolivia, Brazil, Colombia, Ecuador, and Peru. It is the only species classified in the genus Callimico, and the monkeys are sometimes referred to as "callimicos". Goeldi's marmosets are blackish or blackish-brown in color, and the hair on their head and tail sometimes has red, white, or silvery brown highlights. Their bodies are about long, and their tails are about long. Goeldi's marmoset was first described in 1904, making Callimico one of the more recently described monkey genera. In older classification schemes it was sometimes placed in its own family Callimiconidae and sometimes, along with the marmosets and tamarins, in the subfamily Callitrichinae in the family Cebidae. More recently, Callitrichinae has been (re-)elevated to family status as Callitrichidae. Females reach sexual maturity at 8.5 months, males at 16.5 months. The gestation period lasts from 140 to 180 days. Unlike other New World monkeys, they have the capacity to give birth twice a year. The mother carries a single baby per pregnancy, whereas most other species in the family Callitrichidae usually give birth to twins. For the first 2–3 weeks the mother acts as the primary caregiver, until the father takes over most of the responsibilities except for nursing. The infant is weaned after about 65 days. Females outnumber males by 2 to 1. The life expectancy in captivity is about 10 years. Goeldi's marmosets prefer to forage in dense scrubby undergrowth; perhaps because of this, they are rare, with groups living in separate patches of suitable habitat, separated by miles of unsuitable flora. In the wet season, their diet includes fruit, insects, spiders, lizards, frogs, and snakes. In the dry season, they feed on fungi, making them the only tropical primates known to depend on this food source.
They live in small social groups (approximately six individuals) that stay within a few feet of one another most of the time, staying in contact via high-pitched calls. They are also known to form polyspecific groups with tamarins such as the white-lipped tamarin and brown-mantled tamarin. This may be because Goeldi's marmosets lack the X-linked polymorphism that enables some individuals of other New World monkey species to see in full trichromatic vision. The species takes its name from its discoverer, the Swiss naturalist Emil August Goeldi.

References

External links
ARKive – images and movies of the Goeldi's monkey (Callimico goeldii)
Press release on recent research on Goeldi's monkey by scientists at the University of Washington
Primate Info Net Callimico goeldii Factsheet
Pictures of Goeldi's Monkey
https://en.wikipedia.org/wiki/Gambling
Gambling
Gambling (also known as betting or gaming) is the wagering of something of value ("the stakes") on an event with an uncertain outcome with the intent of winning something else of value. Gambling thus requires three elements to be present: consideration (an amount wagered), risk (chance), and a prize. The outcome of the wager is often immediate, such as a single roll of dice, a spin of a roulette wheel, or a horse crossing the finish line, but longer time frames are also common, allowing wagers on the outcome of a future sports contest or even an entire sports season. The term "gaming" in this context typically refers to instances in which the activity has been specifically permitted by law. The two words are not mutually exclusive; i.e., a "gaming" company offers (legal) "gambling" activities to the public and may be regulated by one of many gaming control boards, for example, the Nevada Gaming Control Board. However, this distinction is not universally observed in the English-speaking world. For instance, in the United Kingdom, the regulator of gambling activities is called the Gambling Commission (not the Gaming Commission). The word gaming has been used more frequently since the rise of computer and video games to describe activities that do not necessarily involve wagering, especially online gaming, with the new usage still not having displaced the old usage as the primary definition in common dictionaries. "Gaming" has also been used to circumvent laws against "gambling". The media and others have used one term or the other to frame conversations around the subjects, resulting in a shift of perceptions among their audiences. Gambling is also a major international commercial activity, with the legal gambling market totaling an estimated $335 billion in 2009. In other forms, gambling can be conducted with materials that have a value, but are not real money.
For example, players of marbles games might wager marbles, and likewise games of Pogs or Magic: The Gathering can be played with the collectible game pieces (respectively, small discs and trading cards) as stakes, resulting in a meta-game regarding the value of a player's collection of pieces.

History
Gambling dates back to the Paleolithic period, before written history. In Mesopotamia the earliest six-sided dice date to about 3000 BC. However, they were based on astragali dating back thousands of years earlier. In China, gambling houses were widespread in the first millennium BC, and betting on fighting animals was common. Lotto games and dominoes (precursors of Pai Gow) appeared in China as early as the 10th century. Playing cards appeared in the 9th century AD in China. Records trace gambling in Japan back at least as far as the 14th century. Poker, the most popular U.S. card game associated with gambling, derives from the Persian game As-Nas, dating back to the 17th century. The first known casino, the Ridotto, started operating in 1638 in Venice, Italy.

Great Britain
Gambling has been a main recreational activity in Great Britain for centuries. Horseracing has been a favorite theme for over three centuries. It has been heavily regulated. Historically much of the opposition comes from evangelical Protestants and from social reformers.

United States
Gambling has been a popular activity in the United States for centuries. It has also been suppressed by law in many areas for almost as long. By the early 20th century, gambling was almost uniformly outlawed throughout the U.S. and thus became a largely illegal activity, helping to spur the growth of the mafia and other criminal organizations. The late 20th century saw a softening in attitudes towards gambling and a relaxation of laws against it.

Regulation
Many jurisdictions, local as well as national, either ban gambling or heavily control it by licensing the vendors.
Such regulation generally leads to gambling tourism and illegal gambling in the areas where it is not allowed. The involvement of governments, through regulation and taxation, has led to a close connection between many governments and gaming organizations, where legal gambling provides significant government revenue, such as in Monaco and Macau, China. There is generally legislation requiring that gaming devices be statistically random, to prevent manufacturers from making some high-payoff results impossible. Since these high payoffs have very low probability, a house bias can quite easily be missed unless the devices are checked carefully. Most jurisdictions that allow gambling require participants to be above a certain age. In some jurisdictions, the gambling age differs depending on the type of gambling. For example, in many American states one must be over 21 to enter a casino, but may buy a lottery ticket after turning 18.

Insurance
Because contracts of insurance have many features in common with wagers, insurance contracts are often distinguished in law as agreements in which either party has an interest in the "bet-upon" outcome beyond the specific financial terms. For example, a "bet" with an insurer on whether one's house will burn down is not gambling, but rather insurance, as the homeowner has an obvious interest in the continued existence of his or her home independent of the purely financial aspects of the "bet" (i.e. the insurance policy). Nonetheless, both insurance and gambling contracts are typically considered aleatory contracts under most legal systems, though they are subject to different types of regulation.

Asset recovery
Under common law, particularly English law (English unjust enrichment), a gambling contract may not give a casino bona fide purchaser status, permitting the recovery of stolen funds in some situations.
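The regulatory point above, that a device suppressing a rare high-payoff outcome is hard to catch because of that outcome's low probability, can be made concrete with a small sketch. This is an illustration only; the function name and the 95% confidence figure are choices of this example, not anything drawn from an actual regulator's test procedure.

```python
import math

def trials_to_detect_missing_outcome(p: float, confidence: float = 0.95) -> int:
    """Smallest number of independent trials n such that an outcome with
    per-trial probability p would appear at least once with the given
    confidence. If it never shows up in that many trials, its absence is
    statistically suspicious: P(never seen) = (1 - p)**n <= 1 - confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

# A missing "heads" on a fair coin is exposed within a handful of flips...
print(trials_to_detect_missing_outcome(0.5))         # 5
# ...but a missing 1-in-10,000 jackpot takes roughly 30,000 plays:
print(trials_to_detect_missing_outcome(1 / 10_000))
```

The required sample grows roughly as 3/p for 95% confidence, which is why a casual audit of a few hundred spins cannot rule out a bias against a jackpot that should hit once in ten thousand.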
In Lipkin Gorman v Karpnale Ltd, where a solicitor used stolen funds to gamble at a casino, the House of Lords overruled the High Court's previous verdict, adjudicating that the casino return the stolen funds less those subject to any change of position defence. U.S. law precedents are somewhat similar. For case law on recovery of gambling losses where the loser had stolen the funds, see "Rights of owner of stolen money as against one who won it in gambling transaction from thief". An interesting question is what happens when the person trying to make recovery is the gambler's spouse, and the money or property lost was either the spouse's or was community property. This was a minor plot point in a Perry Mason novel, The Case of the Singing Skirt, and it cites an actual case, Novo v. Hotel Del Rio.

Religious views

Hinduism
Ancient Hindu poems like the Gambler's Lament and the Mahabharata testify to the popularity of gambling among ancient Indians. However, the text Arthashastra (c. 4th century BC) recommends taxation and control of gambling.

Judaism
Ancient Jewish authorities frowned on gambling, even disqualifying professional gamblers from testifying in court.

Christianity

Catholicism
The Catholic Church holds the position that there is no moral impediment to gambling, so long as it is fair, all bettors have a reasonable chance of winning, there is no fraud involved, and the parties involved do not have actual knowledge of the outcome of the bet (unless they have disclosed this knowledge), and as long as the following conditions are met: the gambler can afford to lose the bet, and stops when the limit is reached, and the motivation is entertainment and not personal gain leading to the "love of money" or making a living.
In general, Catholic bishops have opposed casino gambling on the grounds that it too often tempts people into problem gambling or addiction, and has particularly negative effects on poor people; they sometimes also cite secondary effects such as increases in loan sharking, prostitution, corruption, and general public immorality. Some parish pastors have also opposed casinos for the additional reason that they would take customers away from church bingo and annual festivals where games such as blackjack, roulette, craps, and poker are used for fundraising. St. Thomas Aquinas wrote that gambling should be especially forbidden where the losing bettor is underage or otherwise not able to consent to the transaction. Gambling has often been seen as having social consequences, as satirized by Balzac. For these social and religious reasons, most legal jurisdictions limit gambling, as advocated by Pascal.

Protestantism
Gambling views among Protestants vary, with some either discouraging or forbidding their members from participation in gambling. Methodists, in accordance with the doctrine of outward holiness, oppose gambling, which they believe is a sin that feeds on greed; examples are the United Methodist Church, the Free Methodist Church, the Evangelical Wesleyan Church, the Salvation Army, and the Church of the Nazarene. Other Protestants that oppose gambling include many Mennonites, Quakers, the Christian Reformed Church in North America, the Church of the Lutheran Confession, the Southern Baptist Convention, the Assemblies of God, and the Seventh-day Adventist Church.

Other Christian denominations
Other churches that oppose gambling include the Jehovah's Witnesses, The Church of Jesus Christ of Latter-day Saints, the Iglesia ni Cristo, and the Members Church of God International.
Islam
Although different interpretations of Shari‘ah (Islamic law) exist in the Muslim world, there is a consensus among the ‘Ulema’ (scholars of Islam) that gambling is haraam (sinful or forbidden). In assertions made during its prohibition, Muslim jurists describe gambling as being both un-Qur’anic and as being generally harmful to the Muslim Ummah (community). The Arabic term for gambling is Maisir. In parts of the world that implement full Shari‘ah, such as Aceh, punishments for Muslim gamblers can range up to 12 lashes or a one-year prison term and a fine for those who provide a venue for such practises. Some Islamic nations prohibit gambling; most other countries regulate it.

Bahá'í Faith
According to the Most Holy Book, paragraph 155, gambling is forbidden.

Types

Casino games
While almost any game can be played for money, and any game typically played for money can also be played just for fun, some games are generally offered in a casino setting. These include table games; electronic gaming such as online roulette, pachinko, sic bo, slot machines, video poker, and video bingo; and other gambling games such as bingo and keno.

Non-casino games
Gambling games that take place outside of casinos include bingo (as played in the US and UK), dead pool, lotteries, pull-tab games and scratchcards, and Mahjong. Other non-casino gambling games include: non-casino card games, including historical games like Basset, Ecarté, Lansquenet and Put (technically, a gambling card game is one in which the cards are not actually played but simply bet on).
Carnival games such as The Razzle or Hanky Pank; coin-tossing games such as Head and Tail or Two-up (though coin tossing is not usually played in a casino, Two-up has been known to be an official gambling game in some Australian casinos); confidence tricks such as Three-card Monte or the Shell game; and dice-based games such as Backgammon, Liar's dice, Passe-dix, Hazard, Threes, Pig, or Mexico (or Perudo).

Fixed-odds betting
Fixed-odds betting and parimutuel betting frequently occur at many types of sporting events and at political elections. In addition many bookmakers offer fixed odds on a number of non-sports related outcomes, for example the direction and extent of movement of various financial indices, the winner of television competitions such as Big Brother, and election results. Interactive prediction markets also offer trading on these outcomes, with "shares" of results trading on an open market.

Parimutuel betting
One of the most widespread forms of gambling involves betting on horse or greyhound racing. Wagering may take place through parimutuel pools, or bookmakers may take bets personally. Parimutuel wagers pay off at prices determined by support in the wagering pools, while bookmakers pay off either at the odds offered at the time of accepting the bet, or at the median odds offered by track bookmakers at the time the race started.

Sports betting
Betting on team sports has become an important service industry in many countries. For example, millions of people play the football pools every week in the United Kingdom. In addition to organized sports betting, both legal and illegal, there are many side-betting games played by casual groups of spectators, such as NCAA Basketball Tournament Bracket Pools, Super Bowl Squares, Fantasy Sports Leagues with monetary entry fees and winnings, and in-person spectator games like Moundball.
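The parimutuel mechanism described above can be sketched in a few lines: the combined pool, less the operator's take, is divided among those who backed the winner in proportion to their stakes. A minimal illustration, in which the 15% take and the pool figures are made-up numbers:

```python
def parimutuel_payout(pools: dict, winner: str, take: float = 0.15) -> float:
    """Return per unit staked on the winner: the whole pool, minus the
    operator's take, is shared pro rata among the winning tickets."""
    total = sum(pools.values())
    return total * (1.0 - take) / pools[winner]

# Hypothetical three-runner pool: 6,000 wagered in total, 2,000 of it on
# runner "A". If "A" wins, each unit staked on it returns 2.55
# (6,000 x 0.85 / 2,000), so the odds emerge from the betting itself.
print(parimutuel_payout({"A": 2000.0, "B": 3000.0, "C": 1000.0}, "A"))
```

This is why parimutuel prices are not fixed until the pool closes: every additional bet on a runner dilutes that runner's payout.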
Virtual sports
Based on sports betting, virtual sports are fantasy sporting events generated by software; because they are never actually played, they can be offered at any time, unaffected by external factors such as weather conditions.

Arbitrage betting
Arbitrage betting is a theoretically risk-free betting system in which every outcome of an event is bet upon so that a known profit will be made by the bettor upon completion of the event regardless of the outcome. Arbitrage betting is a combination of the ancient art of arbitrage trading and gambling, which has been made possible by the large numbers of bookmakers in the marketplace, creating occasional opportunities for arbitrage.

Other types of betting
One can also bet with another person that a statement is true or false, or that a specified event will happen (a "back bet") or will not happen (a "lay bet") within a specified time. This occurs in particular when two people have opposing but strongly held views on truth or events. Not only do the parties hope to gain from the bet, they place the bet also to demonstrate their certainty about the issue. Some means of determining the issue at stake must exist. Sometimes the amount bet remains nominal, demonstrating the outcome as one of principle rather than of financial importance.

Betting exchanges allow consumers to both back and lay at odds of their choice. Similar in some ways to a stock exchange, a bettor may want to back a horse (hoping it will win) or lay a horse (hoping it will lose, effectively acting as bookmaker).

Spread betting allows gamblers to wager on the outcome of an event where the pay-off is based on the accuracy of the wager, rather than a simple "win or lose" outcome. For example, a wager can be based on when a point is scored in the game, measured in minutes, with each minute away from the prediction increasing or reducing the payout.
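A minimal sketch of the arbitrage calculation described above, using hypothetical decimal odds collected from different bookmakers; an arbitrage exists only when the implied probabilities of all outcomes sum to less than one:

```python
def arbitrage_stakes(odds, bankroll):
    """Split a bankroll across all outcomes so the payout is the same
    whichever outcome occurs; returns None if no arbitrage exists."""
    implied = {o: 1 / d for o, d in odds.items()}  # implied probability per outcome
    book_sum = sum(implied.values())
    if book_sum >= 1:  # implied probabilities cover 100% or more: no arbitrage
        return None
    return {o: bankroll * p / book_sum for o, p in implied.items()}

# Hypothetical two-outcome event; best available odds happen to be 2.10 each way.
odds = {"home": 2.10, "away": 2.10}
stakes = arbitrage_stakes(odds, bankroll=100)
# Each outcome then pays stakes[o] * odds[o], about 105 here: a guaranteed ~5% profit.
```

In practice such opportunities are small and short-lived, which is why the text notes they arise only occasionally across many bookmakers.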
Staking systems
Many betting systems have been created in an attempt to "beat the house", but no system can make a mathematically unprofitable bet in terms of expected value profitable over time. Widely used systems include:
Card counting – Many systems exist for blackjack to keep track of the ratio of ten values to all others; when this ratio is high the player has an advantage and should increase the amount of their bets. Keeping track of cards dealt confers an advantage in other games as well.
Due-column betting – A variation on fixed-profits betting in which the bettor sets a target profit and then calculates a bet size that will make this profit, adding any losses to the target.
Fixed profits – The stakes vary based on the odds to ensure the same profit from each winning selection.
Fixed stakes – A traditional system of staking the same amount on each selection.
Kelly – The optimum level to bet to maximize the future median bank level.
Martingale – A system based on staking enough each time to recover losses from previous bet(s) until one wins.

Other uses of the term
Many risk-return choices are sometimes referred to colloquially as "gambling." Whether this terminology is acceptable is a matter of debate:
Emotional or physical risk-taking, where the risk-return ratio is not quantifiable (e.g., skydiving, campaigning for political office, asking someone for a date, etc.)
Insurance is a method of shifting risk from one party to another. Insurers use actuarial methods to calculate appropriate premiums, which is similar to calculating gambling odds. Insurers set their premiums to obtain a long-term positive expected return in the same manner that professional gamblers select which bets to make.
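The Kelly staking rule listed above has a simple closed form for a single bet. This sketch assumes an edge (a 55% win probability at even money) purely for illustration:

```python
def kelly_fraction(p, b):
    """Kelly criterion for a bet won with probability p that pays b units
    per unit staked: f* = (b*p - q) / b, where q = 1 - p is the loss
    probability.  A non-positive result means the bet has no edge and
    nothing should be staked."""
    q = 1 - p
    return max(0.0, (b * p - q) / b)

# A 55% chance of winning at even money (b = 1): stake ~10% of the bank.
print(kelly_fraction(0.55, 1))
```

Consistent with the section's caveat, the formula returns zero for any bet with non-positive expected value: no staking plan can turn such a bet profitable.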
While insurance is sometimes distinguished from gambling by the requirement of an insurable interest, the equivalent in gambling is simply betting against one's own best interests (e.g., a sports coach betting against his own team to mitigate the financial repercussions of a losing season).
Situations where the possible return is of secondary importance to the wager/purchase (e.g. entering a raffle in support of a charitable cause)

Investments are also usually not considered gambling, although some investments can involve significant risk. Examples of investments include stocks, bonds and real estate. Starting a business can also be considered a form of investment. Investments are generally not considered gambling when they meet the following criteria:
Economic utility
Positive expected returns (at least in the long term)
Underlying value independent of the risk being undertaken

Some speculative investment activities are particularly risky, but are sometimes perceived to be different from gambling:
Foreign currency exchange (forex) transactions
Prediction markets
Securities derivatives, such as options or futures, where the value of the derivative is dependent on the value of the underlying asset at a specific point in time (typically the derivative's associated expiration date)

Negative consequences
Studies show that though many people participate in gambling as a form of recreation or to earn an income, gambling, like any behavior involving variation in brain chemistry, can become a behavioral addiction. Behavioral addiction can occur with all the negative consequences in a person's life minus the physical issues faced by people who compulsively engage in drug and alcohol abuse. Problem gambling has multiple symptoms. Gamblers often gamble to try to win back money they have lost, and some gamble to relieve feelings of helplessness and anxiety.
In the United Kingdom, the Advertising Standards Authority has censured several betting firms for advertisements disguised as news articles suggesting falsely that a person had cleared debts and paid for medical expenses by gambling online. The firms face possible fines. A 2020 study of 32 countries found that the greater the amount of gambling activity in a given country, the more volatile that country's stock market prices are.

Psychological biases
Gamblers exhibit a number of cognitive and motivational biases that distort the perceived odds of events and that influence their preferences for gambles.
Preference for likely outcomes. When gambles are selected through a choice process – when people indicate which gamble they prefer from a set of gambles (e.g., win/lose, over/under) – people tend to prefer to bet on the outcome that is more likely to occur. Bettors tend to prefer to bet on favorites in athletic competitions, and sometimes will accept even bets on favorites when offered more favorable bets on the less likely outcome (e.g., an underdog team).
Optimism/desirability bias. Gamblers also exhibit optimism, overestimating the likelihood that desired events will occur. Fans of NFL underdog teams, for example, will prefer to bet on their teams at even odds than to bet on the favorite, whether the bet is $5 or $50.
Reluctance to bet against (hedge) desired outcomes. People are reluctant to bet against desired outcomes that are relevant to their identity. Gamblers exhibit reluctance to bet against the success of their preferred U.S. presidential candidates and Major League Baseball, National Football League, National Collegiate Athletic Association (NCAA) basketball, and NCAA hockey teams. More than 45% of NCAA fans in Studies 5 and 6, for instance, turned down a "free" real $5 bet against their team.
From a psychological perspective, such a "hedge" creates an interdependence dilemma – a motivational conflict between a short-term monetary gain and the long-term benefits accrued from feelings of identification with and loyalty to a position, person, or group whom the bettor desires to succeed. In economic terms, this conflicted decision can be modeled as a trade-off between the outcome utility gained by hedging (e.g., money) and the diagnostic costs it incurs (e.g., disloyalty). People make inferences about their beliefs and identity from their behavior. If a person is uncertain about an aspect of his or her identity, such as the extent to which he or she values a candidate or team, hedging may signal to him or her that he or she is not as committed to that candidate or team as he or she originally believed. If the diagnostic cost of this self-signal and the resulting identity change are substantial, it may outweigh the outcome utility of hedging, and he or she may reject even very generous hedges.
Ratio bias. Gamblers will prefer gambles with worse odds that are drawn from a large sample (e.g., drawing one red ball from an urn containing 89 red balls and 11 blue balls) to better odds that are drawn from a small sample (drawing one red ball from an urn containing 9 red balls and one blue ball).
Gambler's fallacy/positive recency bias.

See also

References

Further reading
Chambers, Kerry. Gambling for Profit: Lotteries, Gaming Machines, and Casinos in Cross-National Focus (U of Toronto Press, 2011).
Ferentzy, Peter, and Nigel Turner. "Gambling and organized crime – A review of the literature." Journal of Gambling Issues 23 (2009): 111–155.
Ferentzy, Peter, and Nigel E. Turner. A History of Problem Gambling (Springer-Verlag, 2013).
Haller, Mark H. "The changing structure of American gambling in the twentieth century." Journal of Social Issues 35.3 (1979): 87–114.
Richard, Brian.
"Diffusion of an economic development policy innovation: Explaining the international spread of casino gambling." Journal of Gambling Studies 26.2 (2010): 287–300.
Schwartz, David G. Roll the Bones: The History of Gambling (2006), scholarly history with global perspective.

External links
Center for Gaming Research – at the University of Nevada, Las Vegas
Institute for the Study of Gambling and Commercial Gaming at the University of Nevada, Reno
https://en.wikipedia.org/wiki/Game%20theory
Game theory
Game theory is the study of mathematical models of strategic interactions among rational agents. It has applications in all fields of social science, as well as in logic, systems science and computer science. Originally, it addressed two-person zero-sum games, in which each participant's gains or losses are exactly balanced by those of the other participants. In the 21st century, game theory applies to a wide range of behavioral relations; it is now an umbrella term for the science of logical decision making in humans, animals, and computers.

Modern game theory began with the idea of mixed-strategy equilibria in two-person zero-sum games and its proof by John von Neumann. Von Neumann's original proof used the Brouwer fixed-point theorem on continuous mappings into compact convex sets, which became a standard method in game theory and mathematical economics. His paper was followed by the 1944 book Theory of Games and Economic Behavior, co-written with Oskar Morgenstern, which considered cooperative games of several players. The second edition of this book provided an axiomatic theory of expected utility, which allowed mathematical statisticians and economists to treat decision-making under uncertainty.

Game theory was developed extensively in the 1950s by many scholars. It was explicitly applied to evolution in the 1970s, although similar developments go back at least as far as the 1930s. Game theory has been widely recognized as an important tool in many fields. As of 2020, with the Nobel Memorial Prize in Economic Sciences going to game theorists Paul Milgrom and Robert B. Wilson, fifteen game theorists have won the economics Nobel Prize. John Maynard Smith was awarded the Crafoord Prize for his application of evolutionary game theory.

History

Precursors
Discussions on the mathematics of games began long before the rise of modern mathematical game theory.
Cardano's work on games of chance in Liber de ludo aleae (Book on Games of Chance), which was written around 1564 but published posthumously in 1663, formulated some of the field's basic ideas. In the 1650s, Pascal and Huygens developed the concept of expectation in reasoning about the structure of games of chance, and Huygens published his gambling calculus in De ratiociniis in ludo aleæ (On Reasoning in Games of Chance) in 1657.

In 1713, a letter attributed to Charles Waldegrave analyzed a game called "le Her". He was an active Jacobite and uncle to James Waldegrave, a British diplomat. In this letter, Waldegrave provided a minimax mixed strategy solution to a two-person version of the card game le Her, and the problem is now known as the Waldegrave problem. In his 1838 Recherches sur les principes mathématiques de la théorie des richesses (Researches into the Mathematical Principles of the Theory of Wealth), Antoine Augustin Cournot considered a duopoly and presented a solution that is the Nash equilibrium of the game.

In 1913, Ernst Zermelo published Über eine Anwendung der Mengenlehre auf die Theorie des Schachspiels (On an Application of Set Theory to the Theory of the Game of Chess), which proved that the optimal chess strategy is strictly determined. This paved the way for more general theorems. In 1938, the Danish mathematical economist Frederik Zeuthen proved that the mathematical model had a winning strategy by using Brouwer's fixed point theorem. In his 1938 book Applications aux Jeux de Hasard and earlier notes, Émile Borel proved a minimax theorem for two-person zero-sum matrix games only when the pay-off matrix is symmetric and provided a solution to a non-trivial infinite game (known in English as the Blotto game). Borel conjectured the non-existence of mixed-strategy equilibria in finite two-person zero-sum games, a conjecture that was proved false by von Neumann.
Birth and early developments
Game theory did not exist as a unique field until John von Neumann published the paper On the Theory of Games of Strategy in 1928. Von Neumann's original proof used Brouwer's fixed-point theorem on continuous mappings into compact convex sets, which became a standard method in game theory and mathematical economics. His paper was followed by his 1944 book Theory of Games and Economic Behavior co-authored with Oskar Morgenstern. The second edition of this book provided an axiomatic theory of utility, which reincarnated Daniel Bernoulli's old theory of utility (of money) as an independent discipline. Von Neumann's work in game theory culminated in this 1944 book. This foundational work contains the method for finding mutually consistent solutions for two-person zero-sum games. Subsequent work focused primarily on cooperative game theory, which analyzes optimal strategies for groups of individuals, presuming that they can enforce agreements between them about proper strategies.

In 1950, the first mathematical discussion of the prisoner's dilemma appeared, and an experiment was undertaken by notable mathematicians Merrill M. Flood and Melvin Dresher, as part of the RAND Corporation's investigations into game theory. RAND pursued the studies because of possible applications to global nuclear strategy. Around this same time, John Nash developed a criterion for mutual consistency of players' strategies known as the Nash equilibrium, applicable to a wider variety of games than the criterion proposed by von Neumann and Morgenstern. Nash proved that every finite n-player, non-zero-sum (not just two-player zero-sum) non-cooperative game has what is now known as a Nash equilibrium in mixed strategies.

Game theory experienced a flurry of activity in the 1950s, during which the concepts of the core, the extensive form game, fictitious play, repeated games, and the Shapley value were developed.
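The Nash equilibrium criterion mentioned above can be illustrated with a small brute-force search over pure strategies; the payoff numbers below are a conventional prisoner's dilemma (years of prison lost, so less negative is better), chosen here for illustration rather than taken from the text:

```python
# Payoffs (row player, column player) for Cooperate ("C") / Defect ("D").
payoffs = {
    ("C", "C"): (-1, -1), ("C", "D"): (-3, 0),
    ("D", "C"): (0, -3),  ("D", "D"): (-2, -2),
}
strategies = ["C", "D"]

def is_nash(r, c):
    """A profile is a pure Nash equilibrium if neither player can gain
    by unilaterally switching to another strategy."""
    row_ok = all(payoffs[(r2, c)][0] <= payoffs[(r, c)][0] for r2 in strategies)
    col_ok = all(payoffs[(r, c2)][1] <= payoffs[(r, c)][1] for c2 in strategies)
    return row_ok and col_ok

equilibria = [(r, c) for r in strategies for c in strategies if is_nash(r, c)]
print(equilibria)  # [('D', 'D')]: mutual defection is the unique pure equilibrium
```

Mutual defection is the equilibrium even though mutual cooperation would leave both players better off, which is exactly why the game interested RAND's strategists.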
The 1950s also saw the first applications of game theory to philosophy and political science.

Prize-winning achievements
In 1965, Reinhard Selten introduced his solution concept of subgame perfect equilibria, which further refined the Nash equilibrium. Later he would introduce trembling hand perfection as well. In 1994 Nash, Selten and Harsanyi became Economics Nobel Laureates for their contributions to economic game theory.

In the 1970s, game theory was extensively applied in biology, largely as a result of the work of John Maynard Smith and his evolutionarily stable strategy. In addition, the concepts of correlated equilibrium, trembling hand perfection, and common knowledge were introduced and analyzed. In 2005, game theorists Thomas Schelling and Robert Aumann followed Nash, Selten, and Harsanyi as Nobel Laureates. Schelling worked on dynamic models, early examples of evolutionary game theory. Aumann contributed more to the equilibrium school, introducing equilibrium coarsening and correlated equilibria, and developing an extensive formal analysis of the assumption of common knowledge and of its consequences.

In 2007, Leonid Hurwicz, Eric Maskin, and Roger Myerson were awarded the Nobel Prize in Economics "for having laid the foundations of mechanism design theory". Myerson's contributions include the notion of proper equilibrium, and an important graduate text: Game Theory, Analysis of Conflict. Hurwicz introduced and formalized the concept of incentive compatibility. In 2012, Alvin E. Roth and Lloyd S. Shapley were awarded the Nobel Prize in Economics "for the theory of stable allocations and the practice of market design". In 2014, the Nobel went to game theorist Jean Tirole.

Game types

Cooperative / non-cooperative
A game is cooperative if the players are able to form binding commitments externally enforced (e.g. through contract law). A game is non-cooperative if players cannot form alliances or if all agreements need to be self-enforcing (e.g.
through credible threats).

Cooperative games are often analyzed through the framework of cooperative game theory, which focuses on predicting which coalitions will form, the joint actions that groups take, and the resulting collective payoffs. It is opposed to the traditional non-cooperative game theory which focuses on predicting individual players' actions and payoffs and analyzing Nash equilibria. The focus on individual payoff can result in a phenomenon known as the tragedy of the commons, where resources are used to a collectively inefficient level. The lack of formal negotiation leads to the deterioration of public goods through over-use and under-provision that stems from private incentives.

Cooperative game theory provides a high-level approach as it describes only the structure, strategies, and payoffs of coalitions, whereas non-cooperative game theory also looks at how bargaining procedures will affect the distribution of payoffs within each coalition. As non-cooperative game theory is more general, cooperative games can be analyzed through the approach of non-cooperative game theory (the converse does not hold) provided that sufficient assumptions are made to encompass all the possible strategies available to players due to the possibility of external enforcement of cooperation. While using a single theory may be desirable, in many instances insufficient information is available to accurately model the formal procedures available during the strategic bargaining process, or the resulting model would be too complex to offer a practical tool in the real world. In such cases, cooperative game theory provides a simplified approach that allows analysis of the game at large without having to make any assumption about bargaining powers.

Symmetric / asymmetric
A symmetric game is a game where the payoffs for playing a particular strategy depend only on the other strategies employed, not on who is playing them.
That is, if the identities of the players can be changed without changing the payoff to the strategies, then a game is symmetric. Many of the commonly studied 2×2 games are symmetric. The standard representations of chicken, the prisoner's dilemma, and the stag hunt are all symmetric games. Some scholars would consider certain asymmetric games as examples of these games as well. However, the most common payoffs for each of these games are symmetric.

The most commonly studied asymmetric games are games where there are not identical strategy sets for both players. For instance, the ultimatum game and similarly the dictator game have different strategies for each player. It is possible, however, for a game to have identical strategies for both players, yet be asymmetric. For example, the game pictured in this section's graphic is asymmetric despite having identical strategy sets for both players.

Zero-sum / non-zero-sum
Zero-sum games are a special case of constant-sum games in which choices by players can neither increase nor decrease the available resources. In zero-sum games, the total benefit to all players in the game, for every combination of strategies, always adds to zero (more informally, a player benefits only at the equal expense of others). Poker exemplifies a zero-sum game (ignoring the possibility of the house's cut), because one wins exactly the amount one's opponents lose. Other zero-sum games include matching pennies and most classical board games including Go and chess. Many games studied by game theorists (including the famed prisoner's dilemma) are non-zero-sum games, because the outcome has net results greater or less than zero. Informally, in non-zero-sum games, a gain by one player does not necessarily correspond with a loss by another. Constant-sum games correspond to activities like theft and gambling, but not to the fundamental economic situation in which there are potential gains from trade.
It is possible to transform any constant-sum game into a (possibly asymmetric) zero-sum game by adding a dummy player (often called "the board") whose losses compensate the players' net winnings.

Simultaneous / sequential
Simultaneous games are games where both players move simultaneously, or instead the later players are unaware of the earlier players' actions (making them effectively simultaneous). Sequential games (or dynamic games) are games where later players have some knowledge about earlier actions. This need not be perfect information about every action of earlier players; it might be very little knowledge. For instance, a player may know that an earlier player did not perform one particular action, while they do not know which of the other available actions the first player actually performed.

The difference between simultaneous and sequential games is captured in the different representations discussed above. Often, normal form is used to represent simultaneous games, while extensive form is used to represent sequential ones. The transformation of extensive to normal form is one way, meaning that multiple extensive form games correspond to the same normal form. Consequently, notions of equilibrium for simultaneous games are insufficient for reasoning about sequential games; see subgame perfection.

Cournot Competition
The Cournot competition model involves players independently and simultaneously choosing a quantity of a homogeneous product to produce, where marginal cost can be different for each firm and the firm's payoff is profit. The production costs are public information, and each firm aims to find its profit-maximising quantity based on what it believes the other firm will produce, so the firms behave like monopolies. In this game firms want to produce at the monopoly quantity but there is a high incentive to deviate and produce more, which decreases the market-clearing price.
For example, firms may be tempted to deviate from the monopoly quantity if there is a low monopoly quantity and high price, with the aim of increasing production to maximise profit. However this option does not provide the highest payoff, as a firm's ability to maximise profits depends on its market share and the elasticity of the market demand. The Cournot equilibrium is reached when each firm operates on its reaction function with no incentive to deviate, as it has the best response based on the other firm's output. Within the game, firms reach the Nash equilibrium when the Cournot equilibrium is achieved.

Bertrand Competition
Bertrand competition assumes homogeneous products and a constant marginal cost, and players choose prices. The equilibrium of price competition is where the price is equal to marginal costs, assuming complete information about the competitors' costs. Therefore, the firms have an incentive to deviate from the equilibrium because a homogeneous product with a lower price will gain all of the market share, known as a cost advantage.

Perfect information and imperfect information
An important subset of sequential games consists of games of perfect information. A game is one of perfect information if all players, at every move in the game, know the moves previously made by all other players. In reality, this can be applied to firms and consumers having information about the price and quality of all the available goods in a market. An imperfect information game is played when the players do not know all the moves already made by their opponents, as in a simultaneous move game. Most games studied in game theory are imperfect-information games. Examples of perfect-information games include tic-tac-toe, checkers, chess, and Go. Many card games are games of imperfect information, such as poker and bridge. Perfect information is often confused with complete information, which is a similar concept.
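Returning to the Cournot model described above, its reaction-function logic can be sketched numerically. The linear inverse demand P = a − (q1 + q2) and the cost figure c are assumptions chosen purely for illustration:

```python
def cournot_equilibrium(a=100.0, c=10.0, rounds=100):
    """Iterate each firm's best response q_i = (a - c - q_j) / 2 to the
    other firm's last output.  For linear demand and equal marginal costs
    this converges to the symmetric Cournot (Nash) equilibrium
    q_i* = (a - c) / 3, where neither firm can gain by deviating."""
    q1 = q2 = 0.0
    for _ in range(rounds):
        q1 = (a - c - q2) / 2  # firm 1's reaction to firm 2's output
        q2 = (a - c - q1) / 2  # firm 2's reaction to firm 1's output
    price = a - q1 - q2
    return q1, q2, price

q1, q2, price = cournot_equilibrium()
# Converges to q1 = q2 = 30 with price 40 for these illustrative numbers.
```

Each firm produces more than half the monopoly quantity, so the equilibrium price (40) sits below the monopoly price, illustrating the deviation incentive the text describes.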
Complete information requires that every player know the strategies and payoffs available to the other players but not necessarily the actions taken, whereas perfect information is knowledge of all aspects of the game and players. Games of incomplete information can be reduced, however, to games of imperfect information by introducing "moves by nature".

Bayesian game
One of the assumptions of the Nash equilibrium is that every player has correct beliefs about the actions of the other players. However, there are many situations in game theory where participants do not fully understand the characteristics of their opponents. Negotiators may be unaware of their opponent's valuation of the object of negotiation, companies may be unaware of their opponent's cost functions, combatants may be unaware of their opponent's strengths, and jurors may be unaware of their colleague's interpretation of the evidence at trial. In some cases, participants may know the character of their opponent well, but may not know how well their opponent knows his or her own character.

A Bayesian game means a strategic game with incomplete information. For a strategic game, decision makers are players, and every player has a group of actions. A core part of the imperfect information specification is the set of states. Every state completely describes a collection of characteristics relevant to the player, such as their preferences and details about them. There must be a state for every set of features that some player believes may exist. For example, where Player 1 is unsure whether Player 2 would rather date her or get away from her, while Player 2 understands Player 1's preferences as before.
To be specific, suppose that Player 1 believes that Player 2 wants to date her with probability 1/2 and to get away from her with probability 1/2 (this evaluation probably comes from Player 1's experience: she faces players who want to date her half of the time in such a case and players who want to avoid her half of the time). Due to the probability involved, the analysis of this situation requires understanding the player's preference for the draw, even though people are only interested in pure strategic equilibrium.

Combinatorial games
Games in which the difficulty of finding an optimal strategy stems from the multiplicity of possible moves are called combinatorial games. Examples include chess and Go. Games that involve imperfect information may also have a strong combinatorial character, for instance backgammon. There is no unified theory addressing combinatorial elements in games. There are, however, mathematical tools that can solve particular problems and answer general questions.

Games of perfect information have been studied in combinatorial game theory, which has developed novel representations, e.g. surreal numbers, as well as combinatorial and algebraic (and sometimes non-constructive) proof methods to solve games of certain types, including "loopy" games that may result in infinitely long sequences of moves. These methods address games with higher combinatorial complexity than those usually considered in traditional (or "economic") game theory. A typical game that has been solved this way is Hex. A related field of study, drawing from computational complexity theory, is game complexity, which is concerned with estimating the computational difficulty of finding optimal strategies.

Research in artificial intelligence has addressed both perfect and imperfect information games that have very complex combinatorial structures (like chess, go, or backgammon) for which no provable optimal strategies have been found.
The practical solutions involve computational heuristics, like alpha–beta pruning or use of artificial neural networks trained by reinforcement learning, which make games more tractable in computing practice.

Infinitely long games
Games, as studied by economists and real-world game players, are generally finished in finitely many moves. Pure mathematicians are not so constrained, and set theorists in particular study games that last for infinitely many moves, with the winner (or other payoff) not known until after all those moves are completed. The focus of attention is usually not so much on the best way to play such a game, but whether one player has a winning strategy. (It can be proven, using the axiom of choice, that there are games – even with perfect information, and where the only outcomes are "win" or "lose" – for which neither player has a winning strategy.) The existence of such strategies, for cleverly designed games, has important consequences in descriptive set theory.

Discrete and continuous games
Much of game theory is concerned with finite, discrete games that have a finite number of players, moves, events, outcomes, etc. Many concepts can be extended, however. Continuous games allow players to choose a strategy from a continuous strategy set. For instance, Cournot competition is typically modeled with players' strategies being any non-negative quantities, including fractional quantities.

Differential games
Differential games such as the continuous pursuit and evasion game are continuous games where the evolution of the players' state variables is governed by differential equations. The problem of finding an optimal strategy in a differential game is closely related to optimal control theory. In particular, there are two types of strategies: the open-loop strategies are found using the Pontryagin maximum principle while the closed-loop strategies are found using Bellman's dynamic programming method.
A particular case of differential games is that of games with a random time horizon. In such games, the terminal time is a random variable with a given probability distribution function. Therefore, the players maximize the mathematical expectation of the cost function. It was shown that the modified optimization problem can be reformulated as a discounted differential game over an infinite time interval.

Evolutionary game theory
Evolutionary game theory studies players who adjust their strategies over time according to rules that are not necessarily rational or farsighted. In general, the evolution of strategies over time according to such rules is modeled as a Markov chain with a state variable such as the current strategy profile or how the game has been played in the recent past. Such rules may feature imitation, optimization, or survival of the fittest. In biology, such models can represent evolution, in which offspring adopt their parents' strategies and parents who play more successful strategies (i.e. corresponding to higher payoffs) have a greater number of offspring. In the social sciences, such models typically represent strategic adjustment by players who play a game many times within their lifetime and, consciously or unconsciously, occasionally adjust their strategies.

Stochastic outcomes (and relation to other fields)
Individual decision problems with stochastic outcomes are sometimes considered "one-player games". These situations are not considered game theoretical by some authors. They may be modeled using similar tools within the related disciplines of decision theory, operations research, and areas of artificial intelligence, particularly AI planning (with uncertainty) and multi-agent systems. Although these fields may have different motivators, the mathematics involved are substantially the same, e.g. using Markov decision processes (MDP).
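Returning to the evolutionary adjustment rules described above, a minimal sketch is replicator dynamics applied to the standard Hawk–Dove game; the payoff values V = 2 (resource) and C = 4 (cost of fighting) are assumptions chosen for illustration:

```python
def hawk_dove_replicator(p=0.2, V=2.0, C=4.0, dt=0.01, steps=20000):
    """Evolve the population share p of hawks under replicator dynamics,
    dp/dt = p * (f_hawk - f_avg): strategies earning above-average payoff
    grow.  With C > V the interior rest point p* = V / C is the
    evolutionarily stable mix of hawks and doves."""
    for _ in range(steps):
        f_hawk = p * (V - C) / 2 + (1 - p) * V  # expected payoff to a hawk
        f_dove = (1 - p) * V / 2                # expected payoff to a dove
        f_avg = p * f_hawk + (1 - p) * f_dove   # population-average payoff
        p += dt * p * (f_hawk - f_avg)          # Euler step of the dynamic
    return p

print(hawk_dove_replicator())  # converges toward V / C = 0.5
```

No individual reasons about equilibrium here; the stable hawk–dove mix emerges purely from differential reproduction of the more successful strategy, which is the sense in which such rules need not be rational or farsighted.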
Stochastic outcomes can also be modeled in terms of game theory by adding a randomly acting player who makes "chance moves" ("moves by nature"). This player is not typically considered a third player in what is otherwise a two-player game, but merely serves to provide a roll of the dice where required by the game. For some problems, different approaches to modeling stochastic outcomes may lead to different solutions. For example, the difference in approach between MDPs and the minimax solution is that the latter considers the worst case over a set of adversarial moves, rather than reasoning in expectation about these moves given a fixed probability distribution. The minimax approach may be advantageous where stochastic models of uncertainty are not available, but may also overestimate extremely unlikely (but costly) events, dramatically swaying the strategy in such scenarios if it is assumed that an adversary can force such an event to happen. (See Black swan theory for more discussion on this kind of modeling issue, particularly as it relates to predicting and limiting losses in investment banking.) General models that include all elements of stochastic outcomes, adversaries, and partial or noisy observability (of moves by other players) have also been studied. The "gold standard" is considered to be the partially observable stochastic game (POSG), but few realistic problems are computationally feasible in POSG representation.

Metagames

These are games the play of which is the development of the rules for another game, the target or subject game. Metagames seek to maximize the utility value of the rule set developed. The theory of metagames is related to mechanism design theory. The term metagame analysis is also used to refer to a practical approach developed by Nigel Howard, whereby a situation is framed as a strategic game in which stakeholders try to realize their objectives by means of the options available to them.
Subsequent developments have led to the formulation of confrontation analysis.

Pooling games

These are games prevailing over all forms of society. Pooling games are repeated plays with changing payoff table in general over an experienced path, and their equilibrium strategies usually take the form of evolutionary social convention and economic convention. Pooling game theory emerges to formally recognize the interaction between optimal choice in one play and the emergence of forthcoming payoff table update path, identify the invariance existence and robustness, and predict variance over time. The theory is based upon topological transformation classification of payoff table update over time to predict variance and invariance, and is also within the jurisdiction of the computational law of reachable optimality for ordered system.

Mean field game theory

Mean field game theory is the study of strategic decision making in very large populations of small interacting agents. This class of problems was considered in the economics literature by Boyan Jovanovic and Robert W. Rosenthal, in the engineering literature by Peter E. Caines, and by mathematicians Pierre-Louis Lions and Jean-Michel Lasry.

Representation of games

The games studied in game theory are well-defined mathematical objects. To be fully defined, a game must specify the following elements: the players of the game, the information and actions available to each player at each decision point, and the payoffs for each outcome. (Eric Rasmusen refers to these four "essential elements" by the acronym "PAPI".) A game theorist typically uses these elements, along with a solution concept of their choosing, to deduce a set of equilibrium strategies for each player such that, when these strategies are employed, no player can profit by unilaterally deviating from their strategy.
These equilibrium strategies determine an equilibrium to the game—a stable state in which either one outcome occurs or a set of outcomes occur with known probability. Most cooperative games are presented in the characteristic function form, while the extensive and the normal forms are used to define noncooperative games.

Extensive form

The extensive form can be used to formalize games with a time sequencing of moves. Games here are played on trees (as pictured here). Here each vertex (or node) represents a point of choice for a player. The player is specified by a number listed by the vertex. The lines out of the vertex represent a possible action for that player. The payoffs are specified at the bottom of the tree. The extensive form can be viewed as a multi-player generalization of a decision tree. To solve any extensive form game, backward induction must be used. It involves working backward up the game tree to determine what a rational player would do at the last vertex of the tree, what the player with the previous move would do given that the player with the last move is rational, and so on until the first vertex of the tree is reached. The game pictured consists of two players. The way this particular game is structured (i.e., with sequential decision making and perfect information), Player 1 "moves" first by choosing either or (fair or unfair). Next in the sequence, Player 2, who has now seen Player 1's move, chooses to play either or . Once Player 2 has made their choice, the game is considered finished and each player gets their respective payoff. Suppose that Player 1 chooses and then Player 2 chooses : Player 1 then gets a payoff of "eight" (which in real-world terms can be interpreted in many ways, the simplest of which is in terms of money but could mean things such as eight days of vacation or eight countries conquered or even eight more opportunities to play the same game against other players) and Player 2 gets a payoff of "two".
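The backward-induction procedure described above can be sketched as a short recursion; the move labels and payoffs below are hypothetical stand-ins, not the elided labels of the pictured game:

```python
# Backward induction on a two-stage extensive-form game. Interior nodes are
# (player, {action: subtree}); leaves are (p1_payoff, p2_payoff). The labels
# and numbers are hypothetical, chosen only so that Player 1 ends up with 8
# and Player 2 with 2, as in the example above.

tree = ("P1", {
    "fair":   ("P2", {"accept": (8, 2), "reject": (0, 0)}),
    "unfair": ("P2", {"accept": (9, 1), "reject": (1, 5)}),
})

def solve(node):
    if node[0] in ("P1", "P2"):                   # interior node: someone moves
        player, moves = node
        idx = 0 if player == "P1" else 1
        # The mover picks the action whose solved subgame maximizes their own payoff.
        best = max(moves, key=lambda a: solve(moves[a])[idx])
        return solve(moves[best])
    return node                                   # leaf: payoff pair

print(solve(tree))  # prints (8, 2)
```

The recursion solves the last mover's choice first and works back up the tree, exactly as the prose description above outlines.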
The extensive form can also capture simultaneous-move games and games with imperfect information. To represent it, either a dotted line connects different vertices to represent them as being part of the same information set (i.e. the players do not know at which point they are), or a closed line is drawn around them. (See example in the imperfect information section.)

Normal form

The normal (or strategic form) game is usually represented by a matrix which shows the players, strategies, and payoffs (see the example to the right). More generally it can be represented by any function that associates a payoff for each player with every possible combination of actions. In the accompanying example there are two players; one chooses the row and the other chooses the column. Each player has two strategies, which are specified by the number of rows and the number of columns. The payoffs are provided in the interior. The first number is the payoff received by the row player (Player 1 in our example); the second is the payoff for the column player (Player 2 in our example). Suppose that Player 1 plays Up and that Player 2 plays Left. Then Player 1 gets a payoff of 4, and Player 2 gets 3. When a game is presented in normal form, it is presumed that each player acts simultaneously or, at least, without knowing the actions of the other. If players have some information about the choices of other players, the game is usually presented in extensive form. Every extensive-form game has an equivalent normal-form game; however, the transformation to normal form may result in an exponential blowup in the size of the representation, making it computationally impractical.

Characteristic function form

In games that possess transferable utility, separate rewards are not given; rather, the characteristic function decides the payoff of each coalition. The idea is that the coalition that is 'empty', so to speak, does not receive a reward at all.
The origin of this form is to be found in John von Neumann and Oskar Morgenstern's book; when looking at these instances, they observed that when a coalition forms, it plays against the complementary coalition as if two individuals were playing a normal game. The balanced payoff of C is a basic function. Although coalitional payoffs can be determined from normal-form games in various examples, not every game in characteristic function form can be derived from a normal-form game. Formally, a characteristic function is seen as (N, v), where N represents the group of players and v : 2^N → R assigns a payoff to each coalition. Such characteristic functions have expanded to describe games where there is no transferable utility.

Alternative game representations

Alternative game representation forms are used for some subclasses of games or adjusted to the needs of interdisciplinary research. In addition to classical game representations, some of the alternative representations also encode time related aspects.

General and applied uses

As a method of applied mathematics, game theory has been used to study a wide variety of human and animal behaviors. It was initially developed in economics to understand a large collection of economic behaviors, including behaviors of firms, markets, and consumers. The first use of game-theoretic analysis was by Antoine Augustin Cournot in 1838 with his solution of the Cournot duopoly. The use of game theory in the social sciences has expanded, and game theory has been applied to political, sociological, and psychological behaviors as well. Although pre-twentieth-century naturalists such as Charles Darwin made game-theoretic kinds of statements, the use of game-theoretic analysis in biology began with Ronald Fisher's studies of animal behavior during the 1930s. This work predates the name "game theory", but it shares many important features with this field. The developments in economics were later applied to biology largely by John Maynard Smith in his 1982 book Evolution and the Theory of Games.
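The Cournot duopoly mentioned above has a classic closed-form equilibrium: with linear inverse demand P = a - (q1 + q2) and constant marginal cost c, each firm produces (a - c)/3. A sketch via iterated best responses, with invented parameter values:

```python
# Cournot duopoly solved by iterated best responses. With inverse demand
# P = a - (q1 + q2) and constant marginal cost c, firm i's best response to
# its rival's quantity q_j is (a - c - q_j) / 2, and the unique equilibrium
# is q1 = q2 = (a - c) / 3. The parameter values below are invented.

a, c = 12.0, 0.0
q1 = q2 = 1.0
for _ in range(100):
    q1, q2 = (a - c - q2) / 2, (a - c - q1) / 2  # simultaneous best responses
print(round(q1, 3), round(q2, 3))  # prints 4.0 4.0, i.e. (a - c) / 3 each
```

The best-response map is a contraction here, so the iteration converges to the equilibrium from any starting quantities.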
In addition to being used to describe, predict, and explain behavior, game theory has also been used to develop theories of ethical or normative behavior and to prescribe such behavior. In economics and philosophy, scholars have applied game theory to help in the understanding of good or proper behavior. Game-theoretic arguments of this type can be found as far back as Plato. An alternative version of game theory, called chemical game theory, represents the player's choices as metaphorical chemical reactant molecules called "knowlecules". Chemical game theory then calculates the outcomes as equilibrium solutions to a system of chemical reactions.

Description and modeling

The primary use of game theory is to describe and model how human populations behave. Some scholars believe that by finding the equilibria of games they can predict how actual human populations will behave when confronted with situations analogous to the game being studied. This particular view of game theory has been criticized. It is argued that the assumptions made by game theorists are often violated when applied to real-world situations. Game theorists usually assume players act rationally, but in practice, human behavior often deviates from this model. Game theorists respond by comparing their assumptions to those used in physics. Thus while their assumptions do not always hold, they can treat game theory as a reasonable scientific ideal akin to the models used by physicists. However, empirical work has shown that in some classic games, such as the centipede game, guess 2/3 of the average game, and the dictator game, people regularly do not play Nash equilibria. There is an ongoing debate regarding the importance of these experiments and whether the analysis of the experiments fully captures all aspects of the relevant situation. Some game theorists, following the work of John Maynard Smith and George R. Price, have turned to evolutionary game theory in order to resolve these issues.
These models presume either no rationality or bounded rationality on the part of players. Despite the name, evolutionary game theory does not necessarily presume natural selection in the biological sense. Evolutionary game theory includes both biological as well as cultural evolution and also models of individual learning (for example, fictitious play dynamics).

Prescriptive or normative analysis

Some scholars see game theory not as a predictive tool for the behavior of human beings, but as a suggestion for how people ought to behave. Since a strategy corresponding to a Nash equilibrium of a game constitutes one's best response to the actions of the other players – provided they are in (the same) Nash equilibrium – playing a strategy that is part of a Nash equilibrium seems appropriate. This normative use of game theory has also come under criticism.

Economics and business

Game theory is a major method used in mathematical economics and business for modeling competing behaviors of interacting agents. Applications include a wide array of economic phenomena and approaches, such as auctions, bargaining, mergers and acquisitions pricing, fair division, duopolies, oligopolies, social network formation, agent-based computational economics, general equilibrium, mechanism design, and voting systems; and across such broad areas as experimental economics, behavioral economics, information economics, industrial organization, and political economy. This research usually focuses on particular sets of strategies known as "solution concepts" or "equilibria". A common assumption is that players act rationally. In non-cooperative games, the most famous of these is the Nash equilibrium. A set of strategies is a Nash equilibrium if each represents a best response to the other strategies. If all the players are playing the strategies in a Nash equilibrium, they have no unilateral incentive to deviate, since their strategy is the best they can do given what others are doing.
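The no-profitable-deviation condition that defines a Nash equilibrium can be checked by brute force in small finite games. The 2×2 bimatrix below is invented, except that its (Up, Left) entry matches the normal-form example given earlier:

```python
# Brute-force search for pure-strategy Nash equilibria of a 2x2 bimatrix game.
# The numbers are invented, except that (Up, Left) -> (4, 3) matches the
# normal-form example in the text.

from itertools import product

rows, cols = ["Up", "Down"], ["Left", "Right"]
payoff = {  # (row strategy, column strategy) -> (row payoff, column payoff)
    ("Up", "Left"): (4, 3), ("Up", "Right"): (1, 1),
    ("Down", "Left"): (0, 0), ("Down", "Right"): (3, 4),
}

def is_nash(r, c):
    u1, u2 = payoff[(r, c)]
    no_row_deviation = all(payoff[(r2, c)][0] <= u1 for r2 in rows)
    no_col_deviation = all(payoff[(r, c2)][1] <= u2 for c2 in cols)
    return no_row_deviation and no_col_deviation

print([rc for rc in product(rows, cols) if is_nash(*rc)])
# prints [('Up', 'Left'), ('Down', 'Right')]
```

At each of the two equilibria found, neither player can gain by unilaterally switching strategies, which is exactly the definition stated above.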
The payoffs of the game are generally taken to represent the utility of individual players. A prototypical paper on game theory in economics begins by presenting a game that is an abstraction of a particular economic situation. One or more solution concepts are chosen, and the author demonstrates which strategy sets in the presented game are equilibria of the appropriate type. Economists and business professors suggest two primary uses (noted above): descriptive and prescriptive. The Chartered Institute of Procurement & Supply (CIPS) promotes knowledge and use of game theory within the context of business procurement. CIPS and TWS Partners have conducted a series of surveys designed to explore the understanding, awareness and application of game theory among procurement professionals. Some of the main findings in their third annual survey (2019) include:

application of game theory to procurement activity has increased – at the time it was at 19% across all survey respondents
65% of participants predict that use of game theory applications will grow
70% of respondents say that they have "only a basic or a below basic understanding" of game theory
20% of participants had undertaken on-the-job training in game theory
50% of respondents said that new or improved software solutions were desirable
90% of respondents said that they do not have the software they need for their work.

Project management

Sensible decision-making is critical for the success of projects. In project management, game theory is used to model the decision-making process of players, such as investors, project managers, contractors, sub-contractors, governments and customers. Quite often, these players have competing interests, and sometimes their interests are directly detrimental to other players, making project management scenarios well-suited to be modeled by game theory. Piraveenan (2019) in his review provides several examples where game theory is used to model project management scenarios.
For instance, an investor typically has several investment options, and each option will likely result in a different project, and thus one of the investment options has to be chosen before the project charter can be produced. Similarly, any large project involving subcontractors, for instance, a construction project, has a complex interplay between the main contractor (the project manager) and subcontractors, or among the subcontractors themselves, which typically has several decision points. For example, if there is an ambiguity in the contract between the contractor and subcontractor, each must decide how hard to push their case without jeopardizing the whole project, and thus their own stake in it. Similarly, when projects from competing organizations are launched, the marketing personnel have to decide what is the best timing and strategy to market the project, or its resultant product or service, so that it can gain maximum traction in the face of competition. In each of these scenarios, the required decisions depend on the decisions of other players who, in some way, have competing interests to the interests of the decision-maker, and thus can ideally be modeled using game theory. Piraveenan summarises that two-player games are predominantly used to model project management scenarios, and based on the identity of these players, five distinct types of games are used in project management:

Government-sector–private-sector games (games that model public–private partnerships)
Contractor–contractor games
Contractor–subcontractor games
Subcontractor–subcontractor games
Games involving other players

In terms of types of games, both cooperative as well as non-cooperative, normal-form as well as extensive-form, and zero-sum as well as non-zero-sum are used to model various project management scenarios.
Political science

The application of game theory to political science is focused in the overlapping areas of fair division, political economy, public choice, war bargaining, positive political theory, and social choice theory. In each of these areas, researchers have developed game-theoretic models in which the players are often voters, states, special interest groups, and politicians. Early examples of game theory applied to political science are provided by Anthony Downs. In his 1957 book An Economic Theory of Democracy, he applies the Hotelling firm location model to the political process. In the Downsian model, political candidates commit to ideologies on a one-dimensional policy space. Downs first shows how the political candidates will converge to the ideology preferred by the median voter if voters are fully informed, but then argues that voters choose to remain rationally ignorant which allows for candidate divergence. Game theory was applied in 1962 to the Cuban Missile Crisis during the presidency of John F. Kennedy. It has also been proposed that game theory explains the stability of any form of political government. Taking the simplest case of a monarchy, for example, the king, being only one person, does not and cannot maintain his authority by personally exercising physical control over all or even any significant number of his subjects. Sovereign control is instead explained by the recognition by each citizen that all other citizens expect each other to view the king (or other established government) as the person whose orders will be followed. Coordinating communication among citizens to replace the sovereign is effectively barred, since conspiracy to replace the sovereign is generally punishable as a crime.
Thus, in a process that can be modeled by variants of the prisoner's dilemma, during periods of stability no citizen will find it rational to move to replace the sovereign, even if all the citizens know they would be better off if they were all to act collectively. A game-theoretic explanation for democratic peace is that public and open debate in democracies sends clear and reliable information regarding their intentions to other states. In contrast, it is difficult to know the intentions of nondemocratic leaders, what effect concessions will have, and if promises will be kept. Thus there will be mistrust and unwillingness to make concessions if at least one of the parties in a dispute is a non-democracy. However, game theory predicts that two countries may still go to war even if their leaders are cognizant of the costs of fighting. War may result from asymmetric information; two countries may have incentives to misrepresent the amount of military resources they have on hand, rendering them unable to settle disputes agreeably without resorting to fighting. Moreover, war may arise because of commitment problems: if two countries wish to settle a dispute via peaceful means, but each wishes to go back on the terms of that settlement, they may have no choice but to resort to warfare. Finally, war may result from issue indivisibilities. Game theory could also help predict a nation's responses when there is a new rule or law to be applied to that nation. One example is Peter John Wood's (2013) research looking into what nations could do to help reduce climate change. Wood thought this could be accomplished by making treaties with other nations to reduce greenhouse gas emissions. However, he concluded that this idea could not work because it would create a prisoner's dilemma for the nations.

Biology

Unlike those in economics, the payoffs for games in biology are often interpreted as corresponding to fitness.
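A standard fitness-payoff example is the Hawk–Dove game, a biological version of the game of chicken. The sketch below finds the stable mix of hawks in a population using a discrete replicator step; the resource value V and fight cost C are invented:

```python
# Hawk-Dove game with resource value V and fight cost C (invented values).
# The mixed evolutionarily stable state has a hawk share of V/C; a simple
# discrete-time replicator step moves the population toward it.

V, C = 2.0, 4.0
payoff = {("H", "H"): (V - C) / 2, ("H", "D"): V,
          ("D", "H"): 0.0, ("D", "D"): V / 2}

x = 0.9  # initial share of hawks in the population
for _ in range(2000):
    f_hawk = x * payoff[("H", "H")] + (1 - x) * payoff[("H", "D")]
    f_dove = x * payoff[("D", "H")] + (1 - x) * payoff[("D", "D")]
    f_mean = x * f_hawk + (1 - x) * f_dove
    x += 0.01 * x * (f_hawk - f_mean)  # replicator step: above-average types grow
print(round(x, 3))  # prints 0.5, the stable hawk share V/C
```

Because a hawk's payoff advantage vanishes exactly when hawks make up V/C of the population, the dynamics settle there regardless of the starting mix.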
In addition, the focus has been less on equilibria that correspond to a notion of rationality and more on ones that would be maintained by evolutionary forces. The best-known equilibrium in biology is known as the evolutionarily stable strategy (ESS), first introduced by John Maynard Smith and George R. Price in 1973. Although its initial motivation did not involve any of the mental requirements of the Nash equilibrium, every ESS is a Nash equilibrium. In biology, game theory has been used as a model to understand many different phenomena. It was first used to explain the evolution (and stability) of the approximate 1:1 sex ratios. Ronald Fisher suggested that the 1:1 sex ratios are a result of evolutionary forces acting on individuals who could be seen as trying to maximize their number of grandchildren. Additionally, biologists have used evolutionary game theory and the ESS to explain the emergence of animal communication. The analysis of signaling games and other communication games has provided insight into the evolution of communication among animals. For example, the mobbing behavior of many species, in which a large number of prey animals attack a larger predator, seems to be an example of spontaneous emergent organization. Ants have also been shown to exhibit feed-forward behavior akin to fashion (see Paul Ormerod's Butterfly Economics). Biologists have used the game of chicken to analyze fighting behavior and territoriality. According to Maynard Smith, in the preface to Evolution and the Theory of Games, "paradoxically, it has turned out that game theory is more readily applied to biology than to the field of economic behaviour for which it was originally designed". Evolutionary game theory has been used to explain many seemingly incongruous phenomena in nature. One such phenomenon is known as biological altruism. This is a situation in which an organism appears to act in a way that benefits other organisms and is detrimental to itself.
This is distinct from traditional notions of altruism because such actions are not conscious, but appear to be evolutionary adaptations to increase overall fitness. Examples can be found in species ranging from vampire bats that regurgitate blood they have obtained from a night's hunting and give it to group members who have failed to feed, to worker bees that care for the queen bee for their entire lives and never mate, to vervet monkeys that warn group members of a predator's approach, even when it endangers that individual's chance of survival. All of these actions increase the overall fitness of a group, but occur at a cost to the individual. Evolutionary game theory explains this altruism with the idea of kin selection. Altruists discriminate between the individuals they help and favor relatives. Hamilton's rule explains the evolutionary rationale behind this selection with the inequality c < rb, where the cost c to the altruist must be less than the benefit b to the recipient multiplied by the coefficient of relatedness r. The more closely related two organisms are, the higher the incidence of altruism becomes, because they share many of the same alleles. This means that the altruistic individual, by ensuring that the alleles of its close relative are passed on through survival of its offspring, can forgo the option of having offspring itself because the same number of alleles are passed on. For example, helping a sibling (in diploid animals) has a coefficient of 1/2, because (on average) an individual shares half of the alleles in its sibling's offspring. Ensuring that enough of a sibling's offspring survive to adulthood precludes the necessity of the altruistic individual producing offspring.
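Hamilton's rule itself reduces to a one-line comparison; the cost, benefit, and relatedness values below are invented:

```python
# Hamilton's rule: an altruistic act is favored when c < r * b, i.e. when the
# cost c to the altruist is outweighed by the benefit b to the recipient,
# discounted by the coefficient of relatedness r. Numbers are illustrative.

def altruism_favored(cost, benefit, relatedness):
    return cost < relatedness * benefit

print(altruism_favored(cost=1.0, benefit=3.0, relatedness=0.5))  # prints True (1 < 1.5)
print(altruism_favored(cost=2.0, benefit=3.0, relatedness=0.5))  # prints False (2 > 1.5)
```

With r = 1/2 for a full sibling, an act costing the altruist one unit of fitness is favored whenever it delivers more than two units of fitness to the sibling.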
The coefficient values depend heavily on the scope of the playing field; for example if the choice of whom to favor includes all genetic living things, not just all relatives, we assume the discrepancy between all humans only accounts for approximately 1% of the diversity in the playing field, a coefficient that was 1/2 in the smaller field becomes 0.995. Similarly if it is considered that information other than that of a genetic nature (e.g. epigenetics, religion, science, etc.) persisted through time the playing field becomes larger still, and the discrepancies smaller.

Computer science and logic

Game theory has come to play an increasingly important role in logic and in computer science. Several logical theories have a basis in game semantics. In addition, computer scientists have used games to model interactive computations. Also, game theory provides a theoretical basis to the field of multi-agent systems. Separately, game theory has played a role in online algorithms; in particular, the k-server problem, which has in the past been referred to as games with moving costs and request-answer games. Yao's principle is a game-theoretic technique for proving lower bounds on the computational complexity of randomized algorithms, especially online algorithms. The emergence of the Internet has motivated the development of algorithms for finding equilibria in games, markets, computational auctions, peer-to-peer systems, and security and information markets. Algorithmic game theory and within it algorithmic mechanism design combine computational algorithm design and analysis of complex systems with economic theory.

Philosophy

Game theory has been put to several uses in philosophy. Responding to two papers by W.V.O. Quine, David Lewis used game theory to develop a philosophical account of convention. In so doing, he provided the first analysis of common knowledge and employed it in analyzing play in coordination games.
In addition, he first suggested that one can understand meaning in terms of signaling games. This later suggestion has been pursued by several philosophers since Lewis. Following Lewis's game-theoretic account of conventions, Edna Ullmann-Margalit (1977) and Bicchieri (2006) have developed theories of social norms that define them as Nash equilibria that result from transforming a mixed-motive game into a coordination game. Game theory has also challenged philosophers to think in terms of interactive epistemology: what it means for a collective to have common beliefs or knowledge, and what are the consequences of this knowledge for the social outcomes resulting from the interactions of agents. Philosophers who have worked in this area include Bicchieri (1989, 1993), Skyrms (1990), and Stalnaker (1999). In ethics, some authors (most notably David Gauthier, Gregory Kavka, and Jean Hampton) have attempted to pursue Thomas Hobbes' project of deriving morality from self-interest. Since games like the prisoner's dilemma present an apparent conflict between morality and self-interest, explaining why cooperation is required by self-interest is an important component of this project. This general strategy is a component of the general social contract view in political philosophy. Other authors have attempted to use evolutionary game theory in order to explain the emergence of human attitudes about morality and corresponding animal behaviors. These authors look at several games including the prisoner's dilemma, stag hunt, and the Nash bargaining game as providing an explanation for the emergence of attitudes about morality.

Retail and consumer product pricing

Game theory applications are often used in the pricing strategies of retail and consumer markets, particularly for the sale of inelastic goods.
With retailers constantly competing against one another for consumer market share, it has become a fairly common practice for retailers to discount certain goods, intermittently, in the hopes of increasing foot-traffic in brick and mortar locations (website visits for e-commerce retailers) or increasing sales of ancillary or complementary products. Black Friday, a popular shopping holiday in the US, is when many retailers focus on optimal pricing strategies to capture the holiday shopping market. In the Black Friday scenario, retailers using game theory applications typically ask "what is the dominant competitor's reaction to me?" In such a scenario, the game has two players: the retailer, and the consumer. The retailer is focused on an optimal pricing strategy, while the consumer is focused on the best deal. In this closed system, there often is no dominant strategy as both players have alternative options. That is, retailers can find a different customer, and consumers can shop at a different retailer. Given the market competition that day, however, the dominant strategy for retailers lies in outperforming competitors. The open system assumes multiple retailers selling similar goods, and a finite number of consumers demanding the goods at an optimal price. A blog by a Cornell University professor provided an example of such a strategy, when Amazon priced a Samsung TV $100 below retail value, effectively undercutting competitors. Amazon made up part of the difference by increasing the price of HDMI cables, as it has been found that consumers are less price discriminatory when it comes to the sale of secondary items. Retail markets continue to evolve strategies and applications of game theory when it comes to pricing consumer goods.
Comparisons between simulations in controlled environments and real-world retail experience show that the applications of such strategies are more complex, as each retailer has to find an optimal balance between pricing, supplier relations, brand image, and the potential to cannibalize the sale of more profitable items.

Epidemiology

Since the decision to take a vaccine for a particular disease is often made by individuals, who may consider a range of factors and parameters in making this decision (such as the incidence and prevalence of the disease, perceived and real risks associated with contracting the disease, mortality rate, perceived and real risks associated with vaccination, and financial cost of vaccination), game theory has been used to model and predict vaccination uptake in a society.

In popular culture

Based on the 1998 book by Sylvia Nasar, the life story of game theorist and mathematician John Nash was turned into the 2001 biopic A Beautiful Mind, starring Russell Crowe as Nash. The 1959 military science fiction novel Starship Troopers by Robert A. Heinlein mentioned "games theory" and "theory of games". In the 1997 film of the same name, the character Carl Jenkins referred to his military intelligence assignment as being assigned to "games and theory". The 1964 film Dr. Strangelove satirizes game theoretic ideas about deterrence theory. For example, nuclear deterrence depends on the threat to retaliate catastrophically if a nuclear attack is detected. A game theorist might argue that such threats can fail to be credible, in the sense that they can lead to subgame imperfect equilibria. The movie takes this idea one step further, with the Soviet Union irrevocably committing to a catastrophic nuclear response without making the threat public.
The 1980s power pop band Game Theory was founded by singer/songwriter Scott Miller, who described the band's name as alluding to "the study of calculating the most appropriate action given an adversary... to give yourself the minimum amount of failure". Liar Game, a 2005 Japanese manga and 2007 television series, presents the main characters in each episode with a game or problem that is typically drawn from game theory, as demonstrated by the strategies applied by the characters. The 1974 novel Spy Story by Len Deighton explores elements of game theory in regard to Cold War army exercises. The 2008 novel The Dark Forest by Liu Cixin explores the relationship between extraterrestrial life, humanity, and game theory. The prime antagonist, the Joker, in the movie The Dark Knight presents game theory concepts, notably the prisoner's dilemma, in a scene where he asks passengers on two different ferries to bomb the other one to save their own. See also Applied ethics Chainstore paradox Collective intentionality Glossary of game theory Intra-household bargaining Kingmaker scenario Law and economics Outline of artificial intelligence Parrondo's paradox Precautionary principle Quantum refereed game Risk management Self-confirming equilibrium Tragedy of the commons Wilson doctrine (economics) Lists List of cognitive biases List of emerging technologies List of games in game theory Notes References and further reading Textbooks and general references Joseph E. Harrington (2008), Games, Strategies, and Decision Making, Worth. Textbook suitable for undergraduates in applied fields; numerous examples, fewer formalisms in concept presentation.
Maschler, Michael; Solan, Eilon; Zamir, Shmuel (2013), Game Theory, Cambridge University Press. Undergraduate textbook. Historically important texts Shapley, L.S. (1953), "A Value for n-Person Games", in: Contributions to the Theory of Games, Volume II, H. W. Kuhn and A. W. Tucker (eds.). Shapley, L.S. (1953), "Stochastic Games", Proceedings of the National Academy of Sciences, Vol. 39, pp. 1095–1100. "On the Theory of Games of Strategy" (English translation), in A. W. Tucker and R. D. Luce (eds.) (1959), Contributions to the Theory of Games, Vol. 4, p. 42, Princeton University Press. Other print references Allan Gibbard, "Manipulation of voting schemes: a general result", Econometrica, Vol. 41, No. 4 (1973), pp. 587–601. External links James Miller (2015): Introductory Game Theory Videos. Paul Walker: History of Game Theory Page. David Levine: Game Theory. Papers, lecture notes and more. Alvin Roth: Comprehensive list of links to game theory information on the Web. Adam Kalai: Game Theory and Computer Science — lecture notes on game theory and computer science. Mike Shor: GameTheory.net — lecture notes, interactive illustrations and other information. Jim Ratliff's Graduate Course in Game Theory (lecture notes). Don Ross: Review of Game Theory in the Stanford Encyclopedia of Philosophy. Bruno Verbeek and Christopher Morris: Game Theory and Ethics. Elmer G. Wiens: Game Theory — introduction, worked examples, play online two-person zero-sum games. Marek M. Kaminski: Game Theory and Politics — syllabuses and lecture notes for game theory and political science.
Websites on game theory and social interactions Kesten Green's — See Papers for evidence on the accuracy of forecasts from game theory and other methods. McKelvey, Richard D., McLennan, Andrew M., and Turocy, Theodore L. (2007), Gambit: Software Tools for Game Theory. Benjamin Polak: Open Course on Game Theory at Yale (videos of the course). Benjamin Moritz, Bernhard Könsgen, Danny Bures, Ronni Wiersch (2007), Spieltheorie-Software.de: an application for game theory implemented in Java. Antonin Kucera: Stochastic Two-Player Games. Yu-Chi Ho: What is Mathematical Game Theory; What is Mathematical Game Theory (#2); What is Mathematical Game Theory (#3); What is Mathematical Game Theory (#4) – Many-person game theory; What is Mathematical Game Theory (#5) – Finale, summing up, and my own view.
Demographics of Germany
The demography of Germany is monitored by the Statistisches Bundesamt (Federal Statistical Office of Germany). According to the most recent data, Germany's population is 83,222,442 (30 September 2021), making it the second-most populous country in Europe after Russia and the nineteenth-most populous country in the world. The total fertility rate was 1.53 in 2020, far below the replacement rate of 2.1. For a long time Germany had one of the world's lowest fertility rates, around 1.3 to 1.4, though there has been a small increase in recent years. Due to the low birth rate there have been more deaths than births in Germany every year since 1972, which means 2020 was the 49th consecutive year in which the German population would have decreased without immigration. It is the only country in the world to have such a long-term natural population decline. The decline has been somewhat mitigated by immigration: in 2019 the share of people with a foreign background was 26%. This category counts foreigners, naturalized citizens, ethnic German repatriates from eastern Europe, and their children. Until the early 20th century Germany was also a large emigrant nation, with 5 million people emigrating from within the Kaiserreich boundaries to the US alone in the 19th century and more than two million in the 20th century, plus additional emigrants to Latin America, Canada and eastern Europe. After World War II, however, immigration began to outweigh emigration, as around 14 million ethnic Germans were expelled from the former eastern provinces of the Reich and other areas in eastern Europe, of whom around 12 million made their way to present-day Germany and several hundred thousand to Austria and other countries, while several hundred thousand died. An additional 4.5 million or so ethnic Germans from eastern Europe repatriated after 1950, especially around the end of the Eastern Bloc and mostly from the former Soviet Union, Poland and Romania.
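The claim above, that only immigration keeps the population from shrinking, can be illustrated with a back-of-the-envelope check. The figures below are approximate 2020-scale magnitudes chosen for illustration, not official statistics.

```python
# Illustrative check of natural population change vs. net migration.
# Roughly 2020-scale figures (approximate, for illustration only).
births = 773_000
deaths = 986_000
net_migration = 220_000

natural_change = births - deaths                 # negative every year since 1972
total_change = natural_change + net_migration    # migration offsets the decline

assert natural_change < 0
print(natural_change)  # -213000
print(total_change)    # 7000
```

With deaths exceeding births by around 200,000, even a modest positive migration balance is enough to hold the total population roughly stable, which matches the article's description of immigration mitigating the natural decline.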
Large-scale immigration to the BRD began during the Wirtschaftswunder years from the 1950s to the early 1970s, when Germany had a shortage of workers and admitted workers from countries such as Turkey, Italy and Spain on a temporary basis as guest workers. The liberalisation of guest worker legislation allowed many to stay and build a life in the BRD. Another large wave of immigration happened around reunification, when a large group of German repatriates arrived, along with many refugees seeking asylum in Germany, mostly from the former Yugoslavia due to the Yugoslav War and Bosnian War, and from Turkey. The next large immigration wave began after the eastern enlargement of the European Union, when from 2011 Eastern Europeans were allowed to live and work in Germany without restriction. In 2015 Germany took in what was, in EU terms, a relatively large number of refugees fleeing the Syrian civil war as well as other conflicts in Iraq and Afghanistan: 476,649 asylum seekers in 2015, 745,545 in 2016, and declining numbers after that. Germany has one of the world's highest levels of education, technological development, and economic productivity. Since the end of World War II, the number of students entering university has more than tripled, and the trade and technical schools are among the world's best. With a per capita income of about €40,883 in 2018, Germany is a broadly middle-class society. However, there has been a strong increase in the number of children living in poverty. In 1965, one in 75 children was on the welfare rolls; by 2007 this had increased to one child in six. These children live in relative poverty, but not necessarily in absolute poverty. Germans are typically well-travelled, with millions travelling overseas each year. The social welfare system provides for universal health care, unemployment compensation, child benefits and other social programmes.
Germany's ageing population and struggling economy strained the welfare system in the 1990s, so the government adopted a wide-ranging programme of still-controversial belt-tightening reforms, Agenda 2010, including the labour-market reforms known as the Hartz concept. History The contemporary demographics of Germany are also measured by a series of full censuses, the most recent held in 1987. Since reunification, German authorities have relied on a microcensus. Total fertility rate from 1800 to 1899 The total fertility rate is the number of children born per woman. It is based on fairly good data for the entire period. Sources: Our World In Data and Gapminder Foundation. Life expectancy from 1875 to 2020 Sources: Our World In Data and the United Nations. 1875–1950 1950–2015 Source: UN World Population Prospects Statistics since 1817 Population statistics since 1817. Territorial changes of Germany occurred in 1866 (establishment of the North German Confederation), 1871 (German unification and annexation of Alsace-Lorraine), 1918/1919, 1921/1922, 1945/1946 and in 1990. In 2020, 586,421 (75.8%) children were born to mothers with German citizenship, while 186,723 (24.2%) children were born to mothers with foreign citizenship. Current vital statistics 1945–1990 After the World War II border shifts and expulsions, Germans from Central and Eastern Europe and the former eastern territories moved westward to post-war Germany. During the partition of Germany, many Germans from East Germany fled to West Germany for political and economic reasons. Since Germany's reunification, there has been ongoing migration from the eastern New Länder to the western Old Länder for economic reasons. The Federal Republic of Germany and the German Democratic Republic followed different paths when it came to demographics. The family policy of the German Democratic Republic was pronatalist, while that of the Federal Republic was compensatory. Fertility in the GDR was higher than in the FRG.
Demographic policy was only one of the reasons. Women in the GDR had fewer "biographic options"; young motherhood was expected of them. State-funded, cost-free childcare was available to all mothers. Mother's mean age at first birth in East and West Germany Note: Berlin is included in East Germany for the years 2002 and 2008. Source: Kreyenfeld (2002); Kreyenfeld et al. (2010); HFD Germany (2010) 1990–today About 1.7 million people have left the new federal states (the East) since the fall of the Berlin Wall, or 12% of the population; a disproportionately high number of them were women under 35. After 1990, the total fertility rate (TFR) in the East dropped to 0.772 in 1994. This has been attributed to a "demographic shock": people not only had fewer children, they were also less likely to marry or divorce after the end of the GDR; the biographic options of the citizens of the former GDR had increased. Young motherhood seemed less attractive and the age at first birth rose sharply. In the following years, the TFR in the East started to rise again, surpassing 1.0 in 1997 and 1.3 in 2004, and reaching the West's TFR (1.37) in 2007. In 2010, the East's fertility rate (1.459) clearly exceeded that of the West (1.385), while Germany's overall TFR had risen to 1.393, the highest value since 1990, which was still far below the replacement rate of 2.1 and the birth rates seen under communism. In 2016, the TFR was 1.64 in the East and 1.60 in the West. Between 1989 and 2009, about 2,000 schools closed because there were fewer children. In some regions the number of women between the ages of 20 and 30 has dropped by more than 30%. In 2004, in the age group 18–29 (statistically important for starting families), there were only 90 women for every 100 men in the new federal states (the East, including Berlin).
Until 2007 family policy in the Federal Republic was compensatory, meaning that poor families received more family benefits (such as the Erziehungsgeld) than rich ones. In 2007 the so-called Elterngeld was introduced. According to Christoph Butterwegge, the Elterngeld was meant to "motivate highly educated women to have more children"; the poor, on the other hand, were disadvantaged by the Elterngeld and now received lower child benefits than the middle classes. The very well-off (who earn more than €250,000 per annum) and those on welfare receive no Elterngeld payments. As of 2013 the following developments had been observed: The income of families with young children has risen. Persons holding a college degree, persons older than 30 years and parents with only one child benefited the most. Single parents and young parents did not benefit. Fathers are becoming more involved in parenting, and 28% of them now take some time off work (3.3 months on average) when their children are born. Mothers are more likely to work and as a result less likely to be economically deprived than they used to be. The birth rate of college-educated women has risen. In the new federal states the fertility rate of college-educated women is now higher than that of those without college degrees. Differences in value priorities and the better availability of childcare in the eastern states are discussed as possible reasons. In 2019, the non-profit Austrian Institute of Economic Research and the Bertelsmann Stiftung published a study about the economic impact of demographics. The researchers project a reduction in per capita income of €3,700 by 2040. Demographic statistics Demographic statistics according to the World Population Review: One birth every 43 seconds One death every 34 seconds Net gain of one person every 4 minutes One net migrant every 2 minutes Demographic statistics according to the CIA World Factbook, unless otherwise indicated.
Population 80,457,737 (July 2018 est.) 80,594,017 (July 2017 est.) 82,175,700 (2015 estimate) Age structure 0–14 years: 12.83% (male 5,299,798/female 5,024,184) 15–24 years: 9.98% (male 4,092,901/female 3,933,997) 25–54 years: 39.87% (male 16,181,931/female 15,896,528) 55–64 years: 14.96% (male 5,989,111/female 6,047,449) 65 years and over: 22.36% (male 7,930,590/female 10,061,248) (2018 est.) 0–14 years: 12.8% (male 5,304,341/female 5,028,776) 15–24 years: 10.1% (male 4,145,486/female 3,986,302) 25–54 years: 40.5% (male 16,467,975/female 16,133,964) 55–64 years: 14.6% (male 5,834,179/female 5,913,322) 65 years and over: 22.06% (male 7,822,221/female 9,957,451) (2017 est.) 0–14 years: 13.9% (male 5,894,724/female 5,590,373) 15–64 years: 66.1% (male 27,811,357/female 26,790,222) 65 years and over: 19.6% (male 6,771,972/female 9,542,348) (2015 est.) 0–14 years: 13.7% (male 5,768,366/female 5,470,516) 15–64 years: 66.1% (male 27,707,761/female 26,676,759) 65 years and over: 20.3% (male 7,004,805/female 9,701,551) (2010 est.) Median age total: 47.4 years. Country comparison to the world: 3rd male: 46.2 years female: 48.5 years (2018 est.) Birth rate 8.6 births/1,000 population (2018 est.) Country comparison to the world: 213th Death rate 11.8 deaths/1,000 population (2018 est.) Country comparison to the world: 19th 11.7 deaths/1,000 population (2017 est.) Total fertility rate 1.46 children born/woman (2018 est.) Country comparison to the world: 204th 1.43 children born/woman (2014) 1.42 children born/woman (2013) 1.38 children born/woman (2008) Net migration rate 1.5 migrant(s)/1,000 population (2018 est.) Country comparison to the world: 56th 1.5 migrant(s)/1,000 population (2017 est.) Population growth rate -0.17% (2018 est.) Country comparison to the world: 208th -0.16% (2017 est.) Mother's mean age at first birth 29.4 years (2015 est.) Life expectancy at birth total population: 80.8 years.
Country comparison to the world: 34th male: 78.5 years female: 83.3 years (2017 est.) Urbanization urban population: 77.3% of total population (2018) rate of urbanization: 0.27% annual rate of change (2015–20 est.) Infant mortality rate total: 3.4 deaths/1,000 live births. Country comparison to the world: 205th male: 3.7 deaths/1,000 live births female: 3.1 deaths/1,000 live births (2017 est.) 4.09 deaths per 1,000 live births (2007) total: 3.99 deaths/1,000 live births (2010) Life expectancy total population: 81 years (2015) 80 years (2013) Sex ratio at birth: 1.06 male(s)/female under 15 years: 1.05 male(s)/female 15–64 years: 1.04 male(s)/female 65 years and over: 0.72 male(s)/female total population: 0.97 male(s)/female (2010 est.) Dependency ratios total dependency ratio: 52.1 youth dependency ratio: 19.9 elderly dependency ratio: 32.1 potential support ratio: 3.1 (2015 est.) School life expectancy (primary to tertiary education) total: 17 years male: 17 years female: 17 years (2015) Unemployment, youth ages 15–24 total: 7.2% male: 7.9% female: 6.5% (2015 est.) Country comparison to the world: 139th Most childbirths in Germany happen within marriage. Of 778,080 births in 2019, 258,835 were to unmarried parents, which means that around 33%, or one third, of children were born out of wedlock, while two thirds were born within marriage. This percentage of births outside marriage had long been growing, reaching 33% in 2010, more than twice what it was in 1990; in recent years, however, it has started to stagnate or even decrease. The Mikrozensus carried out in 2008 revealed that the number of children a German woman aged 40 to 75 had was closely linked to her educational achievement. In Western Germany the most educated women were the most likely to be childless: 26% of that group stated they were childless, while 16% of those with an intermediate education and 11% of those with compulsory education stated the same.
In Eastern Germany, however, 9% of the most educated women of that age group and 7% of those with an intermediate education were childless, while 12% of those with only compulsory education were childless. The reason for this east–west difference is that the GDR had an "educated mother scheme" and actively tried to encourage first births among the more educated. It did so by propagandizing the opinion that every educated woman should "present at least one child to socialism" and also by financially rewarding its more educated citizens for becoming parents. The government especially tried to persuade students to become parents while still in college, and it was quite successful in doing so. In 1986, 38% of all women who were about to graduate from college were mothers of at least one child, and an additional 14% were pregnant; 43% of all men who were about to graduate from college were fathers of at least one child. There was a sharp decline in the birth rate, and especially in the birth rate of the educated, after the fall of the Berlin Wall. Nowadays, 5% of those about to graduate from college are parents. The more educated a Western German mother aged 40 to 75 was in 2008, the less likely she was to have a big family. The same was true for a mother living in Eastern Germany in 2008. In 2011, this trend was reversed in Eastern Germany, where more highly educated women now had a somewhat higher fertility rate than the rest of the population. Persons who said they had no religion tend to have fewer children than those who identify as Christians, and studies also found that conservative-leaning Christians had more children than liberal-leaning Christians. A study done in 2005 in the western German state of Nordrhein-Westfalen by the HDZ revealed that childlessness was especially widespread among scientists. It showed that 78% of the female scientists and 71% of the male scientists working in that state were childless.
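The dependency ratios quoted in the statistics above are related by simple arithmetic: the total dependency ratio is the sum of the youth and elderly ratios, and the potential support ratio is the number of working-age persons per person aged 65 and over, i.e. the inverse of the elderly ratio. The 2015 figures can be checked for consistency:

```python
# Consistency check of the 2015 dependency ratios quoted above.
youth_ratio = 19.9    # persons aged 0-14 per 100 persons of working age (15-64)
elderly_ratio = 32.1  # persons aged 65+ per 100 persons of working age

total_ratio = youth_ratio + elderly_ratio   # quoted as 52.1 (rounding in source)
support_ratio = 100 / elderly_ratio         # working-age persons per person 65+

print(round(total_ratio, 1))    # 52.0
print(round(support_ratio, 1))  # 3.1
```

The computed total (52.0) differs from the quoted 52.1 only because the source rounds the component ratios; the potential support ratio of 3.1 matches exactly.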
Ethnic minorities and migrant background (Migrationshintergrund) The Federal Statistical Office defines persons with a migrant background as all persons who migrated to the present area of the Federal Republic of Germany after 1949, plus all foreign nationals born in Germany and all persons born in Germany as German nationals with at least one parent who migrated to Germany or was born in Germany as a foreign national. The figures presented here are based on this definition only. In 2010, 2.3 million families with children under 18 years were living in Germany in which at least one parent had foreign roots. They represented 29% of the total of 8.1 million families with minor children. Compared with 2005 – the year when the microcensus started to collect detailed information on the population with a migrant background – the proportion of migrant families has risen by 2 percentage points. In 2019, 40% of children under 5 years old had a migrant background. Most of the families with a migrant background live in the western part of Germany. In 2010, the proportion of migrant families among all families was 32% in the former territory of the Federal Republic. This figure was more than double that in the new Länder (incl. Berlin), where it stood at 15%. Eastern Germany has a much lower proportion of immigrants than the West, as the GDR did not let in many guest workers, and Eastern Germany's economy has not performed as well as West Germany's, with a higher percentage of jobless persons until recently. However, in recent years the number of people with an immigrant background in East Germany has been growing, as refugees (as well as German repatriates) are distributed according to the Königsteiner Schlüssel, so that every German state has to take a share proportional to its population and economic strength.
In 2019, 19.036 million people, or 89.6% of people with an immigrant background, lived in Western Germany (excluding Berlin), making up 28.7% of its population; 1.016 million people with an immigrant background (4.8%) lived in the eastern states, making up 8.2% of their population; and 1.194 million people with an immigrant background (5.6%) lived in Berlin, making up 33.1% of its population. In 2019, 26% of Germans of any age group (up from 18.4% in 2008) and 39% of German children (up from 30% in 2008) had at least one parent born abroad. The average age of Germans with at least one parent born abroad was 35.6 years (up from 33.8 years in 2008), while that of Germans whose parents were both born in Germany was 47.3 years (up from 44.6 in 2008). The largest groups of people with an immigrant background in Germany are people from Turkey, Poland and Russia. , the population by background was as follows: Four other sizable groups of people are referred to as "national minorities" (nationale Minderheiten) because they have lived in their respective regions for centuries: Danes, Frisians, Roma and Sinti, and Sorbs. There is a Danish minority (about 50,000, according to government sources) in the northernmost state of Schleswig-Holstein. Eastern and Northern Frisians live on Schleswig-Holstein's western coast and in the north-western part of Lower Saxony. They are part of a wider community (Frisia) stretching from Germany to the northern Netherlands. The Sorbs, a Slavic people with about 60,000 members (according to government sources), are in the Lusatia region of Saxony and Brandenburg. They are the last remnants of the Slavs that lived in central and eastern Germany since the 7th century to have kept their traditions and not been completely integrated into the wider German nation. Until World War II the Poles were recognized as one of the national minorities.
In 1924 the Union of Poles in Germany initiated cooperation between all national minorities in Germany under the umbrella organization Association of National Minorities in Germany. Some of the union members wanted the Polish communities in easternmost Germany (now Poland) to join the newly established Polish nation after World War I. Even before the German invasion of Poland, leading anti-Nazi members of the Polish minority were deported to concentration camps; some were executed at the Piaśnica murder site. Minority rights for Poles in Germany were revoked by Hermann Göring's World War II decree of 27 February 1940, and their property was confiscated. After the war ended, the German government did not re-implement national minority rights for ethnic Poles. The reason for this is that the areas of Germany which formerly had a native Polish minority were annexed to Poland and the Soviet Union, while almost all of the native German populations (formerly the ethnic majority) in these areas subsequently fled or were expelled by force. With the mixed German–Polish territories now lost, the German government subsequently regarded ethnic Poles residing in what remained of Germany as immigrants, just like any other ethnic population with a recent history of arrival. In contrast, Germans living in Poland are recognized as a national minority and have been granted seats in the Polish Parliament. It must be said, however, that an overwhelming number of Germans in Poland have centuries-old historical ties to the lands they now inhabit, whether from living in territory that once belonged to the German state or from centuries-old communities. In contrast, most Poles in present-day Germany are recent immigrants, though there are some communities which have been present since the 19th and perhaps even the 18th centuries.
Despite protests by some in the older Polish-German communities, and despite Germany being now a signatory to the Framework Convention for the Protection of National Minorities, Germany has so far refused to re-implement minority rights for ethnic Poles, based on the fact that almost all areas of historically mixed German-Polish heritage (where the minority rights formerly existed) are no longer part of Germany and because the vast majority of ethnic Poles now residing in Germany are recent immigrants. Roma people have been in Germany since the Middle Ages. They were persecuted by the Nazis, and thousands of Roma living in Germany were killed by the Nazi regime. Nowadays, they are spread all over Germany, mostly living in major cities. It is difficult to estimate their exact number, as the German government counts them as "persons without migrant background" in their statistics. There are also many assimilated Sinti and Roma. A vague figure given by the German Department of the Interior is about 70,000. In contrast to the old-established Roma population, the majority of them do not have German citizenship, they are classified as immigrants or refugees. After World War II, 14 million ethnic Germans were expelled from the eastern territories of Germany and homelands outside the former German Empire. The accommodation and integration of these Heimatvertriebene in the remaining part of Germany, in which many cities and millions of apartments had been destroyed, was a major effort in the post-war occupation zones and later states of Germany. Since the 1960s, ethnic Germans from the People's Republic of Poland and Soviet Union (especially from Kazakhstan, Russia, and Ukraine), have come to Germany. During the time of Perestroika, and after the dissolution of the Soviet Union, the number of immigrants increased heavily. Some of these immigrants are of mixed ancestry. 
Between 1987 and 2001, a total of 1,981,732 ethnic Germans from the former Soviet Union (FSU) immigrated to Germany, along with more than a million of their non-German relatives. After 1997, however, ethnic Slavs and those of mixed Slavic–Germanic origin outnumbered those of solely German descent among the immigrants. The total number of people currently living in Germany with an FSU connection is around 4 to 4.5 million (including Germans, Slavs, Jews and those of mixed origins), of whom more than 50% are of German descent. Germany now has Europe's third-largest Jewish population. In 2004, twice as many Jews from former Soviet republics settled in Germany as in Israel, bringing the total inflow to more than 100,000 since 1991. Jews have a voice in German public life through the Central Council of Jews in Germany (Zentralrat der Juden in Deutschland). Some Jews from the former Soviet Union are of mixed heritage. In 2019 there was also a growing number of at least 529,000 black Afro-Germans, defined as people with an African migrant background. More than 400 thousand of them hold citizenship of a sub-Saharan African country, the others being German citizens. Most of them live in Berlin and Hamburg. Numerous persons from the North African countries of Tunisia and Morocco live in Germany. While they are considered members of a minority group, for the most part they do not consider themselves "Afro-Germans", nor are most of them perceived as such by the German people. However, Germany does not keep any statistics regarding ethnicity or race; hence, the exact number of Germans of African descent is unknown. Germany's biggest East Asian minorities are the Chinese in Germany, numbering 189,000, and the Vietnamese in Germany, numbering 188,000, many of whom live in Berlin and eastern Germany. There are also about 35,000 Japanese citizens residing in Germany, as well as groups of South Asian and Southeast Asian immigrants.
Around 163,000 Indians and 124,000 Pakistanis live in Germany. Additionally, some 30,000 Filipino citizens and more than 20,000 Indonesian citizens reside in Germany. Numerous descendants of the so-called Gastarbeiter live in Germany. The Gastarbeiter mostly came from Turkey, Italy, Greece, Spain, Morocco, Portugal, the former Yugoslavia, Tunisia and Chile. Until reunification in 1990, East Germany also recruited workers from Vietnam, Mongolia, North Korea, Angola, Mozambique and Cuba. The (socialist) German Democratic Republic (East Germany), however, had its guest workers stay in single-sex dormitories. Female guest workers had to sign contracts saying that they were not allowed to become pregnant during their stay; if they became pregnant nevertheless, they faced forced abortion or deportation. This is one of the reasons why the vast majority of ethnic minorities today live in western Germany, and also one of the reasons why minorities such as the Vietnamese have an unusual population pyramid, with nearly all second-generation Vietnamese Germans born after 1989. There is strong discrimination against Asian Germans in Germany. In a survey conducted by the Free University of Berlin between October and November 2020, 49% of Asian Germans said they had been discriminated against: 62% had been subjected to verbal insults, 11% to physical attacks such as being pushed, spat on, or sprayed with disinfectant, and 27% reported being turned away from medical clinics. Awareness of this remains low; most Germans are not aware that discrimination against Asians takes place in Germany. Foreign nationals in Germany , the most common groups of resident foreign nationals in Germany were as follows: This list does not include non-ethnic Germans with German nationality and foreign nationals without resident status.
Genetics of the German native people The most common Y-chromosome haplogroups among German males are Haplogroup R1b, followed by Haplogroup I1 and Haplogroup R1a. Geography With an estimated 83.2 million inhabitants in December 2020, Germany is the second-most populous country in Europe after Russia, and ranks as the 19th largest country in the world in terms of population. Its population density stands at 233 inhabitants per square kilometer. States Germany comprises sixteen states that are collectively referred to as Länder. Due to differences in size and population, the subdivision of these states varies, especially between city-states (Stadtstaaten) and states with larger territories (Flächenländer). For regional administrative purposes four states, namely Baden-Württemberg, Bavaria, Hesse and North Rhine-Westphalia, consist of a total of 19 government districts (Regierungsbezirke). As of 2019 Germany is divided into 400 districts (Kreise) at the municipal level; these consist of 294 rural districts and 106 urban districts. Cities Metropolitan regions Germany officially has eleven metropolitan regions. In 2005, Germany had 82 cities with more than 100,000 inhabitants. Immigration The United Nations Population Fund lists Germany as host to the third-highest number of international migrants worldwide, behind the United States and Saudi Arabia. The largest ethnic group of non-German origin are the Turks. Since the 1960s, West Germany and later reunified Germany have attracted immigrants primarily from Southern and Eastern Europe as well as Turkey, many of whom (or whose children) have acquired German citizenship over time. While most of these immigrants initially arrived as guest workers, changes to guest worker legislation allowed many to stay and build lives in Germany. Germany has signed special visa agreements with several countries in times of severe labour shortages or when particular skills were deficient within the country.
During the 1960s and 1970s, agreements were signed with the governments of Turkey, Yugoslavia, Italy and Spain to help Germany overcome its severe labour shortage. As of 2012, after Germany granted full freedom of movement to workers from the eastern EU member states, the largest sources of net immigration to Germany were other European countries, most importantly Poland, Romania, Bulgaria, Hungary, Italy, Spain and Greece. Notably, in the case of Turkey, German Turks moving to Turkey slightly outnumbered new immigrants in 2012; in recent years, however, immigration from Turkey has again exceeded emigration, including irregular Turkish migrants. In 2015, there was a large increase in asylum applications, mainly due to the violent conflicts in Syria, Iraq and Afghanistan: 476,649 asylum applications were counted that year. The number rose to 745,545 in 2016 and has declined since then.

Education
Responsibility for educational oversight in Germany lies primarily with the individual federated states. Since the 1960s, a reform movement has attempted to unify secondary education into a Gesamtschule (comprehensive school); several West German states later simplified their school systems to two or three tiers. A system of apprenticeship called Duale Ausbildung ("dual education") allows pupils in vocational training to learn in a company as well as in a state-run vocational school. Optional kindergarten education is provided for all children between three and six years old, after which school attendance is compulsory for at least nine years, depending on the state. Primary education usually lasts for four years, and public schools are not stratified at this stage.
In contrast, secondary education includes three traditional types of schools focused on different levels of academic ability: the Gymnasium enrols the most academically promising children and prepares students for university studies; the Realschule for intermediate students lasts six years; and the Hauptschule prepares pupils for vocational education. In addition, Germany has a comprehensive school known as the Gesamtschule. While some German schools such as the Gymnasium and the Realschule have rather strict entrance requirements, the Gesamtschule does not. It offers college-preparatory classes for students who are doing well, general-education classes for average students, and remedial courses for those who are doing less well. In most cases students attending a Gesamtschule may graduate with the Hauptschulabschluss, the Realschulabschluss or the Abitur, depending on how well they did in school. The percentage of students attending a Gesamtschule varies by Bundesland. In 2007, in the state of Brandenburg, more than 50% of all students attended a Gesamtschule, while in the state of Bavaria fewer than 1% did. The general entrance requirement for university is the Abitur, a qualification normally based on continuous assessment during the last few years at school and on final examinations; however, there are a number of exceptions, and precise requirements vary depending on the state, the university and the subject. Germany's universities are recognised internationally; in the Academic Ranking of World Universities (ARWU) for 2008, six of the top 100 universities in the world are in Germany, as are 18 of the top 200. Nearly all German universities are public institutions. Tuition fees in the range of €500 were introduced in some states after 2006 but were abolished again by 2014.
Percentage of jobholders holding Hauptschulabschluss, Realschulabschluss or Abitur in Germany

Literacy
Over 99% of those aged 15 and above are estimated to be able to read and write. However, a growing number of inhabitants are functionally illiterate, and the young are much more likely to be functionally illiterate than the old. According to a study by the University of Bremen in cooperation with the Bundesverband Alphabetisierung e.V., 10% of youngsters living in Germany are functionally illiterate and one quarter can understand only basic-level texts. Illiteracy rates among youngsters vary by ethnic group and by the parents' socioeconomic class.

Health
Life expectancy in Germany is 81.1 years (78.7 years for males, 83.6 years for females, 2020 est.). The principal cause of death was cardiovascular disease, at 42%, followed by malignant tumours, at 25%. About 82,000 Germans had been infected with HIV/AIDS and 26,000 had died from the disease (cumulatively, since 1982). According to a 2005 survey, 27% of German adults are smokers. A 2009 study shows Germany is near the median in terms of overweight and obese people in Europe.

Religion
The national constitutions of 1919 and 1949 guarantee freedom of faith and religion; earlier, these freedoms were mentioned only in state constitutions. The modern constitution of 1949 also states that no one may be discriminated against on account of their faith or religious opinions. A state church does not exist in Germany (see Freedom of religion in Germany). According to a 1990s poll by Der Spiegel, 45% of Germans believe in God, and a quarter in Jesus Christ. According to the Eurobarometer Poll 2010, 44% of German citizens responded that "they believe there is a God", 25% responded that "they believe there is some sort of spirit or life force" and 27% responded that "they don't believe there is any sort of spirit, God or life force". 4% gave no response.
Christianity is the largest religion in Germany, comprising an estimated 53.9% of the country's population. Smaller religious groups (each less than 1%) include Judaism, Buddhism and Hinduism. The two largest churches, the Roman Catholic Church and the Protestant Evangelical Church in Germany (EKD), have lost significant numbers of adherents. In 2020 the Catholic Church accounted for 26.7% and the Evangelical Church for 24.3% of the population; the Orthodox churches accounted for 1.9%, and other Christian churches and groups for a further 1.1%. Since the reunification of Germany, the number of non-religious people has grown, and an estimated 40.7% of the country's population are not affiliated with any church or religion. The other religions each make up less than 1% of the population: Buddhism has around 200,000 adherents (0.2%), Judaism around 200,000 (0.2%), Hinduism 90,000 (0.1%), Sikhism 75,000 (0.1%) and the Yazidi religion 45,000–60,000. All other religious communities in Germany have fewer than 50,000 adherents (<0.1%) each. Protestantism is concentrated in the north and east, and Roman Catholicism in the south and west. According to the last nationwide census, Protestantism is more widespread among the population with German citizenship; there are slightly more Catholics in total because of the Catholic immigrant population (including such groups as Poles and Italians). The former pope, Benedict XVI, was born in Bavaria. Non-religious people, including atheists and agnostics, might make up as much as 55% of the total population, and are especially numerous in the former East Germany and in major metropolitan areas. Of the roughly 4 million Muslims, most are Sunnis and Alevis from Turkey, but there are small numbers of Shias and other denominations. 1.9% of the country's overall population declare themselves Orthodox Christians, with Serbs, Greeks, Romanians, Ukrainians and Russians being the most numerous.
Germany has Europe's third-largest Jewish population (after France and the United Kingdom). In 2004, twice as many Jews from former Soviet republics settled in Germany as in Israel, bringing the total Jewish population to more than 200,000, compared with 30,000 prior to German reunification. Large cities with significant Jewish populations include Berlin, Frankfurt and Munich. Around 250,000 active Buddhists live in Germany; 50% of them are Asian immigrants.

2011 Census
Census results were as follows: Roman Catholic Church: 24,740,380 or 30.8% of the German population; Evangelical Church: 24,328,100 or 30.3% of the German population; other, atheist or not specified (including Protestants outside the EKD): 31,151,210 or 38.9% of the German population.

Languages
German is the only official and the most widely spoken language. Standard German is understood throughout the country.

Minority languages
Danish, Low German, Low Rhenish, the Sorbian languages (Lower Sorbian and Upper Sorbian), and the two Frisian languages, Saterfrisian and North Frisian, are officially recognized and protected as minority languages by the European Charter for Regional or Minority Languages in their respective regions. With speakers of Romany living in all parts of Germany, the federal government has promised to take action to protect the language. So far, only Hesse has followed the federal government's announcement and agreed to implement concrete measures to support Romany speakers. Implementation of the Charter is poor: the monitoring reports on charter implementation in Germany show that many provisions remain unfulfilled.

High German dialects
German dialects – some quite distinct from the standard language – are used in everyday speech, especially in rural regions. Many dialects, for example the Upper German varieties, are to some degree cultivated as symbols of regional identity and have their own literature, theatres and some TV programming.
While speaking a dialect outside its native region might be frowned upon, in their native regions some dialects are spoken by all social classes. Nevertheless, partly due to the prevalence of Standard German in the media, the use of dialects has declined over the past century, especially among the younger population. The social status of different German dialects varies greatly. The Alemannic and Bavarian dialects of the south are positively valued by their speakers and can be used in almost all social circumstances. The Saxonian and Thuringian dialects have less prestige and are subject to derision. While Bavarian and Alemannic have kept much of their distinctiveness, the Middle German dialects, which are closer to Standard German, have lost some of their distinctive lexical and grammatical features and tend to be only pronunciation variants of Standard German.

Low Saxon dialects
Low Saxon is officially recognized as a language in its own right, but despite this there is little official action to foster the language. Historically, one third of Germany's territory and population was Low Saxon-speaking. No data has ever been collected on the actual number of speakers, but today the number is estimated at around 5 million. Despite this relatively high number of speakers there is very little coverage in the media (mostly on NDR TV, with no regular programming) and very little education in or about the language. Low Saxon is not a fixed part of the school curriculum, and it is used as a medium of instruction in only one school in all of Germany (as a "model project" in a primary school, alongside education in Standard German). As a consequence, the younger generation has largely refused to adopt the native language of their parents. Language prevalence dropped from more than 90% (depending on the exact region) in the 1930s to less than 5% today. This accounts for a massive intergenerational gap in language use.
Older people regularly use the language and take private initiative to maintain it, but the lack of innovative potential in the younger generation hinders language maintenance. The language also has its own literature (around 150 books published every year) and there are many theatres (mostly lay stages, but some professional ones, such as the Ohnsorg-Theater). Use of Low Saxon is mainly restricted to acquaintances, such as family members, neighbours and friends. A meeting of a village council can be held almost completely in Low Saxon if all participants know each other (as long as the written protocols are in Standard German), but the presence of a single outsider can cause the whole meeting to switch to Standard German. The Low Saxon dialects also differ in status, with a north–south gradient in language maintenance. The southern dialects of Westphalian, Eastphalian and Brandenburgish have suffered much stronger speaker losses than the northern coastal dialects of Northern Low Saxon. While Eastphalian has lost speakers to Standard German, Westphalian has lost speakers to Standard German and to the Standard German-based regiolect of the Rhine-Ruhr area. Brandenburgish speakers mostly switched to the Standard German-based regiolect of Berlin, and Brandenburgish has been almost completely replaced by the Berlin regiolect. Northern Low Saxon speakers switched mostly to pure Standard German.

Foreign languages
English is the most common foreign language and is almost universally taught at the secondary level; it is also taught at the elementary level in some states. Other commonly taught languages are French, Italian, Spanish, Portuguese and Russian. Dutch is taught in states bordering the Netherlands, and Polish in the eastern states bordering Poland. Latin and Ancient Greek are part of the classical education syllabus offered in many secondary schools. According to a 2004 survey, two-thirds of Germany's citizens have at least basic knowledge of English.
About 20% consider themselves competent speakers of French, followed by speakers of Russian (7%), Italian (6.1%) and Spanish (5.6%). The relatively high number of Russian speakers is a result of almost a decade of continuous immigration from the former Soviet Union to Germany, and of Russian having been learned in school as a compulsory first foreign language by many older former East Germans.

See also
Germans
Census in Germany

Notes

References

External links
Homepage of the Federal Statistical Office Germany (in English)
German demographics in the Online-Databank HISTAT (in German, registration needed)
Dossier "The Aging Society" of the Goethe-Institut
Demographic Profile Germany: United in Decline, Allianz Knowledge
Economy of Germany
The economy of Germany is a highly developed social market economy. It has the largest national economy in Europe, the fourth-largest by nominal GDP in the world, and the fifth-largest by GDP (PPP). In 2017, the country accounted for 28% of the euro-area economy according to the International Monetary Fund (IMF). Germany is a founding member of the European Union and the Eurozone. In 2016, Germany recorded the highest trade surplus in the world, worth $310 billion, making it the biggest capital exporter globally. Germany is one of the largest exporters globally, with $1,810.93 billion worth of goods and services exported in 2019. The service sector contributes around 70% of the total GDP, industry 29.1%, and agriculture 0.9%. Exports accounted for 41% of national output. The top ten exports of Germany are vehicles, machinery, chemical goods, electronic products, electrical equipment, pharmaceuticals, transport equipment, basic metals, food products, and rubber and plastics. The economy of Germany is the largest manufacturing economy in Europe, and it is less likely to be affected by a financial downturn. Germany conducts applied research with practical industrial value and sees itself as a bridge between the latest university insights and industry-specific product and process improvements, generating a great deal of knowledge in its own laboratories. Germany is rich in timber, lignite, potash and salt. Some minor sources of natural gas are being exploited in the state of Lower Saxony. Until German reunification, the German Democratic Republic mined uranium in the Ore Mountains (see also: SAG/SDAG Wismut). Energy in Germany is sourced predominantly from fossil fuels (30%), with wind power in second place, followed by nuclear power, gas, solar, biomass (wood and biofuels) and hydro. Germany is the first major industrialized nation to commit to the renewable energy transition, called the Energiewende, and is the leading producer of wind turbines in the world.
Renewables produced 46% of the electricity consumed in Germany (as of 2019). Ninety-nine percent of all German companies belong to the German Mittelstand, small and medium-sized enterprises, which are mostly family-owned. Of the world's 2,000 largest publicly listed companies measured by revenue, the Forbes Global 2000, 53 are headquartered in Germany, the top ten being Allianz, Daimler, Volkswagen, Siemens, BMW, Deutsche Telekom, Bayer, BASF, Munich Re and SAP. Germany is the world's top location for trade fairs: around two-thirds of the world's leading trade fairs take place in Germany, and the largest annual international trade fairs and congresses are held in several German cities such as Hanover, Frankfurt, Cologne, Leipzig and Düsseldorf.

History

Age of Industrialization
The Industrial Revolution in Germany got underway approximately a century later than in the United Kingdom, France and Belgium, partly because Germany only became a unified country in 1871. The establishment of the Deutscher Zollverein (German Customs Union) in 1834 and the expansion of railway systems were the main drivers of Germany's industrial development and political union. From 1834, tariff barriers between increasing numbers of the Kleindeutschland German states were eliminated. In 1835 the first German railway linked the Franconian cities of Nuremberg and Fürth – it proved so successful that the decade of the 1840s saw "railway mania" in all the German states. Between 1845 and 1870 an extensive network of rail had been built, and in 1850 Germany was building its own locomotives. Over time, other German states joined the customs union and started linking their railroads, which began to connect the corners of Germany. The growth of free trade and of a rail system across Germany intensified economic development, which opened up new markets for local products, created a pool of middle managers, increased the demand for engineers, architects and skilled machinists, and stimulated investments in coal and iron.
Another factor that propelled German industry forward was the unification of the monetary system, made possible in part by political unification. The Mark, a new monetary coinage system backed by gold, was introduced in 1871. However, this system did not fully come into use, as silver coins retained their value until 1907. The victory of Prussia and her allies over Napoleon III of France in the Franco-Prussian War of 1870–1871 marked the end of French hegemony in Europe and resulted in the proclamation of the German Empire in 1871. The establishment of the empire presented Europe with a new, populous and industrializing polity possessing a considerable, and undeniably increasing, economic and diplomatic presence. The influence of French economic principles produced important institutional reforms in Germany, including the abolition of feudal restrictions on the sale of large landed estates, the reduction of the power of the guilds in the cities, and the introduction of a new, more efficient commercial law. Nonetheless, political decisions about the economy of the empire were still largely controlled by a coalition of "rye and iron", that is, the Prussian Junker landowners of the east and the Ruhr heavy industry of the west. In politics and society, between 1881 and 1889 Chancellor Otto von Bismarck promoted laws that provided social insurance and improved working conditions, instituting the world's first welfare state. Germany was the first country to introduce social insurance programs, including universal healthcare, compulsory education, sickness insurance, accident insurance, disability insurance and a retirement pension.
Moreover, the government's universal education policy bore fruit: Germany achieved the highest literacy rate in the world – 99% – education levels that provided the nation with more people skilled at handling numbers, more engineers, chemists, opticians, skilled workers for its factories, skilled managers, knowledgeable farmers and skilled military personnel. By 1900 Germany surpassed Britain and the United States in steel production. The German economic miracle was also intensified by unprecedented population growth, from 35 million in 1850 to 67 million in 1913. From 1895 to 1907, the number of workers engaged in machine building doubled from half a million to well over a million. Only 40 percent of Germans lived in rural areas by 1910, a drop from 67% at the birth of the Empire. Industry accounted for 60 percent of the gross national product in 1913. The German chemical industry became the most advanced in the world, and by 1914 the country was producing half the world's electrical equipment. The rapid advance to industrial maturity led to a drastic shift in Germany's economic situation – from a rural economy into a major exporter of finished goods. The ratio of finished products to total exports jumped from 38% in 1872 to 63% in 1912. By 1913 Germany had come to dominate all the European markets, and by 1914 it had become one of the biggest exporters in the world.

Weimar Republic and Third Reich
The Nazis rose to power while unemployment was very high, but achieved full employment later thanks to massive public works programs such as the Reichsbahn, Reichspost and Reichsautobahn projects. In 1935, rearmament in contravention of the Treaty of Versailles added to the economy. The expansionary fiscal policies pursued after the financial crisis of 1931 (possible because Germany was off the gold standard) were advised by the non-Nazi Minister of Economics, Hjalmar Schacht, who in 1933 became president of the central bank. Schacht later resigned from the post in 1938 and was replaced by Hermann Göring. The trading policies of the Third Reich aimed at self-sufficiency, but given its lack of raw materials Germany had to maintain trade links, on the basis of bilateral preferences, foreign exchange controls, import quotas and export subsidies, under what was called the "New Plan" (Neuer Plan) of 19 September 1934. The "New Plan" was based on trade with less developed countries that would trade raw materials for German industrial goods, conserving foreign currency. Southern Europe was preferable to Western Europe and North America, as there could be no trade blockades. This policy became known as the Grosswirtschaftsraum ("greater economic area") policy. Eventually, the Nazi party developed strong relationships with big business and abolished the trade unions in 1933, establishing in their place the National Labour Service (RAD), the German Labour Front (DAF), which set working hours, Beauty of Labour (SdA), which set working conditions, and Strength through Joy (KdF), which ensured sports clubs for workers.

West Germany
Beginning with the replacement of the Reichsmark with the Deutsche Mark as legal tender, a lasting period of low inflation and rapid industrial growth was overseen by the government of Chancellor Konrad Adenauer and his minister of economics, Ludwig Erhard, raising West Germany from total wartime devastation to one of the most developed nations in modern Europe. In 1953 it was decided that Germany was to repay $1.1 billion of the aid it had received. The last repayment was made in June 1971.
Apart from these factors, hard work and long hours at full capacity among the population in the 1950s, 1960s and early 1970s, and extra labour supplied by thousands of Gastarbeiter ("guest workers"), provided a vital base for the economic upturn.

East Germany
By the early 1950s, the Soviet Union had seized reparations in the form of agricultural and industrial products and demanded further heavy reparation payments. Silesia, with the Upper Silesian Coal Basin, and Stettin, a prominent natural port, were lost to Poland. Exports from West Germany exceeded $323 billion in 1988. In the same year, East Germany exported $30.7 billion worth of goods, 65% of it to other communist states. East Germany had zero unemployment. In 1976 the average annual GDP growth was roughly 5.9%.

Federal Republic
The German economy practically stagnated at the beginning of the 2000s. The worst growth figures were recorded in 2002 (+1.4%), 2003 (+1.0%) and 2005 (+1.4%). Unemployment was also chronically high. Together with Germany's aging population, these problems put the welfare system under considerable strain. This led the government to push through a wide-ranging program of belt-tightening reforms, Agenda 2010, including the labor market reforms known as Hartz I–IV. In the later part of the first decade of the 2000s, the world economy experienced high growth, from which Germany as a leading exporter also profited. Some credit the Hartz reforms with achieving high growth and declining unemployment, while others contend that they resulted in a massive decrease in standards of living and that their effects were limited and temporary. The nominal GDP of Germany contracted in the second and third quarters of 2008, putting the country in a technical recession as part of a global and European recession cycle. German industrial output dropped by 3.6% in September compared with August.
In January 2009 the German government under Angela Merkel approved a €50 billion ($70 billion) economic stimulus plan to protect several sectors from a downturn and a subsequent rise in unemployment rates. Germany exited the recession in the second and third quarters of 2009, mostly due to rebounding manufacturing orders and exports (primarily from outside the Eurozone) and relatively steady consumer demand. Germany is a founding member of the EU, the G8 and the G20, and was the world's largest exporter from 2003 to 2008. In 2011 it remained the third-largest exporter and third-largest importer. Most of the country's exports are in engineering, especially machinery, automobiles, chemical goods and metals. Germany is a leading producer of wind turbines and solar-power technology. Annual trade fairs and congresses are held in cities throughout Germany. 2011 was a record-breaking year for the German economy: German companies exported goods worth over €1 trillion ($1.3 trillion), the highest figure in history, and the number of people in work rose to 41.6 million, the highest recorded figure. Through 2012, Germany's economy continued to be stronger than those of its neighbouring nations.

Data
The following table shows the main economic indicators in 1980–2020 (with IMF staff estimates for 2021–2026). Inflation below 2% is in green.

Companies
Of the world's 500 largest stock-market-listed companies measured by revenue in 2010, the Fortune Global 500, 37 are headquartered in Germany. Thirty Germany-based companies are included in the DAX, the German stock market index. Well-known global brands are Mercedes-Benz, BMW, SAP, Siemens, Volkswagen, Adidas, Audi, Allianz, Porsche, Bayer, BASF, Bosch and Nivea. Germany is recognised for its specialised small and medium-sized enterprises, known as the Mittelstand model. SMEs account for more than 99 per cent of German companies.
Around 1,000 of these companies are global market leaders in their segment and are labelled hidden champions. From 1991 to 2010, 40,301 mergers and acquisitions involving German firms, with a total known value of €2,422 billion, were announced. The largest transactions since 1991 are the acquisition of Mannesmann by Vodafone for €204.8 billion in 1999 and the merger of Daimler-Benz with Chrysler to form DaimlerChrysler in 1998, valued at €36.3 billion. Berlin (see Economy of Berlin) has developed an international startup ecosystem and become a leading location for venture-capital-funded firms in the European Union. The list includes the largest German companies by revenue in 2011:

Mergers and acquisitions
Since German reunification, there have been 52,258 inbound or outbound merger or acquisition deals in Germany. The most active year in terms of value was 1999, with a cumulative value of €48 billion, twice as much as the runner-up, 2006, with €24 billion (see graphic "M&A in Germany"). Here is a list of the top ten deals (ranked by value) that include a German company. The Vodafone–Mannesmann deal is still the biggest deal in global history.

Economic region
Germany as a federation is a polycentric country and does not have a single economic center. The stock exchange is located in Frankfurt am Main; the largest media company (Bertelsmann SE & Co. KGaA) is headquartered in Gütersloh; and the largest car manufacturers are in Wolfsburg (Volkswagen), Stuttgart (Mercedes-Benz and Porsche) and Munich (Audi and BMW). Germany is an advocate of closer European economic and political integration. Its commercial policies are increasingly determined by agreements among European Union (EU) members and EU single-market legislation. Germany introduced the common European currency, the euro, on 1 January 1999. Its monetary policy is set by the European Central Bank in Frankfurt.
The southern states (Bundesländer), especially Bayern, Baden-Württemberg and Hessen, are economically stronger than the northern states. One of Germany's traditionally strongest (and at the same time oldest) economic regions is the Ruhr area in the west, between Duisburg and Dortmund; 27 of the country's 100 largest companies are located there. In recent years, however, the area, whose economy is based on natural resources and heavy industry, has seen a substantial rise in unemployment (2010: 8.7%). The economies of Bayern and Baden-Württemberg, the states with the lowest numbers of unemployed people (2018: 2.7% and 3.1%), on the other hand, are based on high-value products. Important sectors are automobiles, electronics, aerospace and biomedicine, among others. Baden-Württemberg is an industrial center, especially for the automobile and machine-building industries, and the home of brands like Mercedes-Benz (Daimler), Porsche and Bosch. With reunification on 3 October 1990, Germany began the major task of reconciling the economic systems of the two former republics. Interventionist economic planning ensured gradual development in eastern Germany up to the level of the former West Germany, but the standard of living and annual income remain significantly higher in the western German states. The modernization and integration of the eastern German economy continues to be a long-term process scheduled to last until the year 2019, with annual transfers from west to east amounting to roughly $80 billion. The overall unemployment rate has fallen consistently since 2005 and reached a 20-year low in 2012. In July 2014 the country began legislating to introduce a federally mandated minimum wage, which came into effect on 1 January 2015.

German states

Wealth
The following top-ten list of German billionaires is based on an annual assessment of wealth and assets compiled and published by Forbes magazine on 1 March 2016.

$27.9 billion Beate Heister (b. Albrecht) & Karl Albrecht Jr.
$20.3 billion Theo Albrecht Jr.
$18.5 billion Susanne Klatten
$18.1 billion Georg Schaeffler
$16.4 billion Dieter Schwarz
$15.6 billion Stefan Quandt
$15.4 billion Michael Otto
$11.7 billion Heinz Hermann Thiele
$10 billion Klaus-Michael Kühne
$9.5 billion Hasso Plattner

Wolfsburg is the city in Germany with the country's highest per capita GDP, at $128,000. The following top 10 list of German cities with the highest per capita GDP is based on a study by the Cologne Institute for Economic Research on 31 July 2013.

$128,000 Wolfsburg, Lower Saxony
$114,281 Frankfurt am Main, Hesse
$108,347 Schweinfurt, Bavaria
$104,000 Ingolstadt, Bavaria
$99,389 Regensburg, Bavaria
$92,525 Düsseldorf, North Rhine-Westphalia
$92,464 Ludwigshafen am Rhein, Rhineland-Palatinate
$91,630 Erlangen, Bavaria
$91,121 Stuttgart, Baden-Württemberg
$88,692 Ulm, Baden-Württemberg

Sectors
Germany has a social market economy characterised by a highly qualified labor force, a developed infrastructure, a large capital stock, a low level of corruption, and a high level of innovation. It has the largest national economy in Europe, the fourth largest by nominal GDP in the world, and ranked fifth by GDP (PPP) in 2015. The service sector contributes around 70% of the total GDP, industry 29.1%, and agriculture 0.9%.

Primary
In 2010 agriculture, forestry, and mining accounted for only 0.9% of Germany's gross domestic product (GDP) and employed only 2.4% of the population, down from 4% in 1991. Agriculture is extremely productive, and Germany can cover 90% of its nutritional needs with domestic production. Germany is the third-largest agricultural producer in the European Union after France and Italy. Germany's principal agricultural products are potatoes, wheat, barley, sugar beets, fruit, and cabbages. Despite the country's high level of industrialization, almost one-third of its territory is covered by forest.
The forestry industry covers about two-thirds of domestic consumption of wood and wood products, so Germany is a net importer of these items.

The German soil is relatively poor in raw materials. Only lignite (brown coal) and potash salt (Kalisalz) are available in significant quantities. However, the former GDR's Wismut mining company produced a total of 230,400 tonnes of uranium between 1947 and 1990, making East Germany the fourth-largest producer of uranium ore worldwide (the largest within the USSR's sphere of control) at the time. Oil, natural gas, and other resources are, for the most part, imported from other countries. Potash salt is mined in the center of the country (Niedersachsen, Sachsen-Anhalt and Thüringen). The most important producer is K+S (formerly Kali und Salz AG).

Germany's bituminous coal deposits were created more than 300 million years ago from swamps which extended from present-day southern England, over the Ruhr area, to Poland. Lignite deposits developed similarly, but during a later period, about 66 million years ago. Because the wood has not yet been completely transformed into coal, brown coal contains less energy than bituminous coal.

Lignite is extracted in the extreme western and eastern parts of the country, mainly in Nordrhein-Westfalen, Sachsen and Brandenburg. Considerable amounts are burned in coal plants near the mining areas to produce electricity; transporting lignite over long distances is not economically feasible, so the plants are located practically next to the extraction sites. Bituminous coal is mined in Nordrhein-Westfalen and Saarland. Most power plants burning bituminous coal operate on imported material, so these plants are located not only near the mining sites but throughout the country.
In 2019, the country was the world's 3rd largest producer of selenium, 5th largest producer of potash, 5th largest producer of boron, 7th largest producer of lime, 13th largest producer of fluorspar, 14th largest producer of feldspar, 17th largest producer of graphite, and 18th largest producer of sulfur, in addition to being the world's 4th largest producer of salt.

Industry

Industry and construction accounted for 30.7% of the gross domestic product in 2017 and employed 24.2% of the workforce. Germany excels in the production of automobiles, machinery, electrical equipment and chemicals. With the manufacture of 5.2 million vehicles in 2009, Germany was the world's fourth-largest producer and largest exporter of automobiles. German automotive companies enjoy an extremely strong position in the so-called premium segment, with a combined world market share of about 90%.

Small- to medium-sized manufacturing firms (Mittelstand companies) which specialize in technologically advanced niche products and are often family-owned form a major part of the German economy. It is estimated that about 1,500 German companies occupy a top-three position in their respective market segment worldwide. In about two-thirds of all industry sectors German companies belong to the top three competitors. Germany is the only country among the top five arms exporters that is not a permanent member of the United Nations Security Council.

Services

In 2017 services constituted 68.6% of gross domestic product (GDP), and the sector employed 74.3% of the workforce. The subcomponents of services are financial, renting, and business activities (30.5%); trade, hotels and restaurants, and transport (18%); and other service activities (21.7%). Germany is the seventh most visited country in the world, with a total of 407 million overnights during 2012. This number includes 68.83 million nights by foreign visitors.
In 2012, over 30.4 million international tourists arrived in Germany. Berlin has become the third most visited city destination in Europe. Additionally, more than 30% of Germans spend their holiday in their own country, with the biggest share going to Mecklenburg-Vorpommern. Domestic and international travel and tourism combined directly contribute over EUR 43.2 billion to German GDP. Including indirect and induced impacts, the industry contributes 4.5% of German GDP and supports 2 million jobs (4.8% of total employment). The largest annual international trade fairs and congresses are held in several German cities such as Hannover, Frankfurt, and Berlin.

Government finances

The debt-to-GDP ratio of Germany peaked in 2010, when it stood at 80.3%, and has decreased since then. According to Eurostat, the government gross debt of Germany amounted to €2,152.0 billion, or 71.9% of its GDP, in 2015. The federal government achieved a budget surplus of €12.1 billion ($13.1 billion) in 2015. Germany's credit rating by the credit rating agencies Standard & Poor's, Moody's and Fitch Ratings stood at the highest possible rating, AAA, with a stable outlook in 2016.

Germany's "debt clock" (Schuldenuhr) reversed for the first time in 20 years in January 2018. As of October 2020, it was increasing again, at €10,424 per second. Economists generally see Germany's current account surplus as undesirable.

Infrastructure

Energy

Germany is the world's fifth-largest consumer of energy, and two-thirds of its primary energy was imported in 2002. In the same year, Germany was Europe's largest consumer of electricity, totaling 512.9 terawatt-hours. Government policy promotes energy conservation and the development of renewable energy sources, such as solar, wind, biomass, hydroelectric, and geothermal energy. As a result of energy-saving measures, energy efficiency has been improving since the beginning of the 1970s.
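The per-second debt-clock rate quoted under Government finances can be annualized with simple arithmetic. The sketch below is purely illustrative: it takes the €10,424-per-second figure cited for October 2020 at face value and assumes a 365-day year.

```python
# Illustrative annualization of the Schuldenuhr rate (EUR 10,424/s, Oct 2020).
# The rate is the figure quoted in the text; the 365-day year is an assumption.
RATE_PER_SECOND = 10_424               # EUR per second, as quoted
SECONDS_PER_DAY = 24 * 60 * 60         # 86,400
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY

per_day = RATE_PER_SECOND * SECONDS_PER_DAY      # EUR added per day
per_year = RATE_PER_SECOND * SECONDS_PER_YEAR    # EUR added per year

print(f"per day:  EUR {per_day / 1e9:.2f} bn")   # ~0.90 bn per day
print(f"per year: EUR {per_year / 1e9:.1f} bn")  # ~328.7 bn per year
```

At that pace the clock implies roughly €0.9 billion of new debt per day, consistent with the scale of pandemic-era borrowing in 2020.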
The government has set the goal of meeting half the country's energy demands from renewable sources by 2050. Renewable energy also plays an increasing role in the labor market: almost 700,000 people are employed in the energy sector, about 50 percent of them in renewable energies.

In 2000, the red-green coalition under Chancellor Schröder and the German nuclear power industry agreed to phase out all nuclear power plants by 2021. The conservative coalition under Chancellor Merkel reversed this decision in January 2010, electing to keep plants open. The nuclear disaster at the Japanese Fukushima plant in March 2011, however, changed the political climate fundamentally: older nuclear plants have been shut down. Germany is seeking to have wind, solar, biogas, and other renewable energy sources play a bigger role, as the country looks to completely phase out nuclear power by 2022 and coal-fired power plants by 2038. Renewable energy nevertheless still plays a comparatively modest role in energy consumption, though the German solar and wind power industries play a leading role worldwide.

In 2009, Germany's total energy consumption (not just electricity) came from the following sources: oil 34.6%, natural gas 21.7%, lignite 11.4%, bituminous coal 11.1%, nuclear power 11.0%, hydro and wind power 1.5%, others 9.0%. In the first half of 2021, coal, natural gas and nuclear energy comprised 56% of the total electricity fed into Germany's grid. Coal was the leader among the conventional energy sources, comprising over 27% of Germany's electricity; wind power contributed 22%.

There are three major entry points for oil pipelines: in the northeast (the Druzhba pipeline, coming from Gdańsk), west (coming from Rotterdam) and southeast (coming from Nelahozeves). The oil pipelines of Germany do not constitute a proper network, and sometimes only connect two different locations.
Major oil refineries are located in or near the following cities: Schwedt, Spergau, Vohburg, Burghausen, Karlsruhe, Cologne, Gelsenkirchen, Lingen, Wilhelmshaven, Hamburg and Heide.

Germany's network of natural gas pipelines, on the other hand, is dense and well-connected. Imported pipeline gas comes mostly from Russia, the Netherlands and the United Kingdom. Although gas imports from Russia have historically been reliable, even during the Cold War, recent price disputes between Gazprom and former Soviet states such as Ukraine have also affected Germany. As a result, high political importance is placed on the construction of the Nord Stream pipeline, running from Vyborg in Russia along the Baltic Sea to Greifswald in Germany; this direct connection avoids third-party transit countries. Germany imports 50% to 75% of its natural gas from Russia.

Transport

With its central position in Europe, Germany is an important transportation hub. This is reflected in its dense and modern transportation networks. The extensive motorway (Autobahn) network ranks third worldwide in total length and features a lack of blanket speed limits on the majority of routes.

Germany has established a polycentric network of high-speed trains. The InterCityExpress, or ICE, is the most advanced service category of the Deutsche Bahn and serves major German cities as well as destinations in neighbouring countries. Maximum train speeds vary between 200 km/h and 320 km/h (125-200 mph). Connections are offered at either 30-minute, hourly, or two-hourly intervals. German railways are heavily subsidised, receiving €17.0 billion in 2014.

The largest German airports are Frankfurt Airport and Munich Airport, both global hubs of Lufthansa. Other major airports are Berlin Brandenburg Airport, Düsseldorf, Hamburg, Hanover, Cologne/Bonn, and Stuttgart.
Technology

Germany's achievements in the sciences have been significant, and research and development efforts form an integral part of the economy. Germany is also one of the leading countries in developing and using green technologies. Companies specializing in green technology have an estimated turnover of €200 billion, and German expertise in engineering, science, and research is highly regarded. The lead markets of Germany's green technology industry are power generation, sustainable mobility, material efficiency, energy efficiency, waste management and recycling, and sustainable water management.

Regarding triadic patents, Germany is in third place after the US and Japan. With more than 26,500 registrations for patents submitted to the European Patent Office, Germany is the leading European nation. Siemens, Bosch and BASF, with almost 5,000 registrations for patents between them in 2008, are among the top 5 of more than 35,000 companies registering patents. Together with the US and Japan, Germany is one of the world's most active nations in patents for nano-, bio-, and other new technologies. With around one-third of triadic patents in the field, Germany leads the way worldwide in vehicle emission reduction.

According to Winfried Kretschmann, who is premier of the region where Daimler is based, "China dominates the production of solar cells, Tesla is ahead in electric cars and Germany has lost the first round of digitalization to Google, Apple, and the like. Whether Germany has a future as an industrial economy will depend on whether we can manage the ecological and digital transformation of our economy".

Challenges

Despite economic prosperity, the biggest threat to Germany's future economic development is the nation's declining birthrate, which is among the lowest in the world. This is particularly prevalent in parts of society with higher education.
As a result, the number of workers is expected to decrease, and the government spending needed to support pensioners and healthcare will increase if the trend is not reversed. Less than a quarter of German people expect living conditions to improve in the coming decades.

On August 25, 2020, the Federal Statistical Office of Germany revealed that the German economy had plunged by 9.7% in the second quarter, the worst figure on record. The latest figures show how hard the German economy was hit by the COVID-19 pandemic.

See also

Association of German Chambers of Industry and Commerce
Codetermination in Germany
Deutsche Bundesbank
German Federal Association of Young Entrepreneurs
German model
Metropolitan regions in Germany
List of German states by unemployment rate
List of German cities by GDP
Trade unions in Germany

References

External links

Federal Statistical Office (Destatis)
Deutsche Bundesbank
World Bank Germany Trade Statistics
Germany - OECD
Germany profile at the CIA World Factbook
Germany profile at The World Bank
11932
https://en.wikipedia.org/wiki/Transport%20in%20Germany
Transport in Germany
As a densely populated country in a central location in Europe and with a developed economy, Germany has a dense and modern transport infrastructure. One of the first limited-access highway systems in the world to have been built, the extensive German Autobahn network famously has no general speed limit for light vehicles (although there are speed limits in many sections today, and there is a blanket limit for trucks). The country's most important waterway is the river Rhine, and its largest port is that of Hamburg. Frankfurt Airport is a major international airport and European transport hub. Air travel is used for greater distances within Germany but faces competition from the state-owned Deutsche Bahn's rail network. High-speed trains called ICE connect cities for passenger travel with speeds up to 300 km/h. Many German cities have rapid transit systems, and public transport is available in most areas.

Buses have historically played only a marginal role in long-distance passenger service, as all routes directly competing with rail services were technically outlawed by a law dating to 1935 (during the Nazi era). Only in 2012 was this law officially amended, and a long-distance bus market has since emerged in Germany.

Since German reunification, substantial effort has been made to improve and expand transport infrastructure in what was formerly East Germany. Due to Germany's varied history, main traffic flows have changed from primarily east-west (old Prussia and the German Empire) to primarily north-south (the 1949-1990 German partition era) to a more balanced flow with both major north-south and east-west corridors, both domestically and in transit. Infrastructure, further hampered by the havoc wrought by wars, scorched-earth policies and reparations, had to be adjusted and upgraded with each of those shifts.
Road and automotive transport

Overview

The volume of traffic in Germany, especially goods transportation, is at a very high level due to its central location in Europe. In the past few decades, much of the freight traffic shifted from rail to road, which led the Federal Government to introduce a motorway toll for trucks in 2005. Individual road usage increased, resulting in a traffic density that is high relative to other nations, and a further increase of traffic is expected in the future.

High-speed vehicular traffic has a long tradition in Germany: the first freeway (Autobahn) in the world, the AVUS, and the world's first automobile were developed and built in Germany, and the country possesses one of the densest road systems in the world. German motorways have no blanket speed limit for light vehicles. However, posted limits are in place on many dangerous or congested stretches, as well as where traffic noise or pollution poses a problem (20.8% under static or temporary limits and an average 2.6% under variable traffic control limits as of 2015).

The German government has had issues with the upkeep of the country's Autobahn network, having had to revamp the eastern portion's transport system since the unification of the German Democratic Republic (East Germany) and the Federal Republic of Germany (West Germany). As a result, numerous construction projects were put on hold in the west, while a vigorous reconstruction has been going on in the east for almost 20 years. However, since the European Union formed, an overall streamlining and change of route plans has occurred, as faster and more direct links to former Soviet-bloc countries now exist or are in the works, with intense co-operation among European countries.

Intercity bus service within Germany fell out of favour as post-war prosperity increased, and became almost extinct when legislation was introduced in the 1980s to protect the national railway.
After that market was deregulated in 2012, some 150 new intercity bus lines were established, leading to a significant shift from rail to bus for long journeys. The market has since consolidated, with Flixbus controlling over 90% of it and also expanding into neighboring countries.

Roads

Germany has approximately 650,000 km of roads, of which 231,000 km are non-local roads. The road network is extensively used: nearly 2 trillion km were travelled by car in 2005, in comparison to just 70 billion km travelled by rail and 35 billion km travelled by plane.

The Autobahn is the German federal highway system. The official German term is Bundesautobahn (plural Bundesautobahnen, abbreviated 'BAB'), which translates as 'federal motorway'. Where no local speed limit is posted, the advisory limit (Richtgeschwindigkeit) is 130 km/h. The Autobahn network had a total length of about in 2016, which ranks it among the densest and longest systems in the world. Only federally built controlled-access highways meeting certain construction standards, including at least two lanes per direction, are called "Bundesautobahn". They have their own blue-coloured signs and their own numbering system.

All Autobahnen are named using the capital letter A, followed by a blank and a number (for example A 8). The main Autobahnen going all across Germany have single-digit numbers. Shorter highways of regional importance have double-digit numbers (like A 24, connecting Berlin and Hamburg). Very short stretches built for heavy local traffic (for example ring roads or the A 555 from Cologne to Bonn) usually have three digits, where the first digit depends on the region. East–west routes are usually even-numbered, north–south routes are usually odd-numbered. The numbers of the north–south Autobahnen increase from west to east; that is to say, the more easterly roads are given higher numbers. Similarly, the east–west routes use increasing numbers from north to south.
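The Autobahn numbering conventions just described (digit count indicates scope, parity indicates orientation) can be expressed as a short sketch. The function name and return shape below are illustrative assumptions, not an official scheme-checker, and the conventions themselves admit exceptions on real roads.

```python
# Illustrative sketch of the Autobahn numbering conventions described above.
# classify_autobahn is a hypothetical helper; real designations have exceptions.
def classify_autobahn(designation: str) -> dict:
    """Classify an Autobahn designation like 'A 8' by digit count and parity."""
    number = int(designation.lstrip("A").strip())
    digits = len(str(number))
    scope = {
        1: "national (crosses Germany)",
        2: "regional",
        3: "local (e.g. ring road or short connector)",
    }[digits]
    # Convention: even numbers run east-west, odd numbers run north-south.
    orientation = "east-west" if number % 2 == 0 else "north-south"
    return {"number": number, "scope": scope, "orientation": orientation}

print(classify_autobahn("A 8"))    # single digit -> national; even -> east-west
print(classify_autobahn("A 24"))   # two digits -> regional; even -> east-west
print(classify_autobahn("A 555"))  # three digits -> local; odd -> north-south
```

The examples mirror the ones in the text: the A 8 as a national east-west route, the A 24 as a regional route, and the A 555 Cologne-Bonn connector as a short local stretch.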
The autobahns are considered the safest category of German roads: for example, in 2012, while carrying 31% of all motorized road traffic, they accounted for only 11% of Germany's traffic fatalities. German autobahns are still toll-free for light vehicles, but on 1 January 2005 a blanket mandatory toll on heavy trucks was introduced.

The national roads in Germany are called Bundesstraßen (federal roads). Their numbers are usually well known to local road users, as they appear (written in black digits on a yellow rectangle with a black border) on direction traffic signs and on street maps. A Bundesstraße is often referred to as "B" followed by its number, for example "B1", one of the main east–west routes. More important routes have lower numbers. Odd numbers are usually applied to north–south oriented roads, and even numbers to east–west routes. Bypass routes are referred to with an appended "a" (alternative) or "n" (new alignment), as in "B 56n".

Other main public roads are maintained by the Bundesländer (states) and called Landesstraße or Staatsstraße (state road). The numbers of these roads are prefixed with "L", "S" or "St", but are usually not seen on direction signs or written on maps; they appear on the kilometre posts on the roadside. Numbers are unique only within one state. The Landkreise (districts) and municipalities are in charge of the minor roads and streets within villages, towns and cities. These roads have the number prefix "K", indicating a Kreisstraße.

Rail transport

Overview

Germany features a total of 43,468 km of railways, of which at least 19,973 km are electrified (2014). Deutsche Bahn (German Rail) is the major German railway infrastructure and service operator. Though Deutsche Bahn is organized as a private company, the government still holds all shares, so Deutsche Bahn can still be called a state-owned company.
Since its reformation under private law in 1994, Deutsche Bahn AG (DB AG) no longer publishes details of the tracks it owns; in addition to the DB AG system there are about 280 privately or locally owned railway companies which own approximately 3,000 km to 4,000 km of the total tracks and use DB tracks in open access.

Railway subsidies amounted to €17.0 billion in 2014, and there are significant differences between the financing of long-distance and short-distance (or local) trains in Germany. While long-distance trains can be run by any railway company, they receive no subsidies from the government. Local trains, however, are subsidised by the German states, which pay the operating companies to run these trains; in 2013, 59% of the cost of short-distance passenger rail transport was covered by subsidies. This has resulted in many private companies offering to run local train services, as they can provide cheaper service than the state-owned Deutsche Bahn. Track construction is entirely, and track maintenance partly, government-financed for both long- and short-range trains. On the other hand, all rail vehicles are charged track access charges by DB Netz, which in turn delivers (part of) its profits to the federal budget.

High-speed rail started in the early 1990s with the introduction of the InterCityExpress (ICE) into revenue service, after first plans to modernize the rail system had been drawn up under the government of Willy Brandt. While the high-speed network is not as dense as those of France or Spain, ICE trains or the slightly slower (max. speed 200 km/h) Intercity (IC) trains serve most major cities. Several extensions or upgrades to high-speed lines are under construction or planned for the near future, some of them after decades of planning. The fastest high-speed train operated by Deutsche Bahn, the InterCityExpress or ICE, connects major German and neighbouring international centres such as Zurich, Vienna, Copenhagen, Paris, Amsterdam and Brussels.
The rail network throughout Germany is extensive and provides excellent service in most areas. On regular lines, at least one train every two hours will call even in the smallest of villages during the day. Nearly all larger metropolitan areas are served by S-Bahn, U-Bahn, Straßenbahn and/or bus networks.

The German government on 13 February 2018 announced plans to make public transportation free as a means to reduce road traffic and decrease air pollution to EU-mandated levels. The new policy was to be put to the test by the end of the year in the cities of Bonn, Essen, Herrenberg, Reutlingen and Mannheim. Issues remain concerning the costs of such a move, as ticket sales for public transportation constitute a major source of income for cities.

International freight trains

While Germany and most of contiguous Europe use the same standard gauge, differences in signalling, rules and regulations, electrification voltages, etc. create obstacles for freight operations across borders. These obstacles are slowly being overcome, with international (incoming and outgoing) and transit (through) traffic being responsible for a large part of the recent uptake in rail freight volume. EU regulations have done much to harmonize standards, making cross-border operations easier. Maschen Marshalling Yard near Hamburg is the second biggest in the world and the biggest in Europe. It serves as a freight hub distributing goods from Scandinavia to southern Europe and from Central Europe to the port of Hamburg and overseas. As a densely populated, prosperous country in the center of Europe, Germany hosts many important transit routes. The Mannheim–Karlsruhe–Basel railway has undergone upgrades and refurbishments since the 1980s and will likely undergo further upgrades for decades to come, as it is the main route from the North Sea ports to northern Italy via the Gotthard Base Tunnel.

S-Bahn

Almost all major metro areas of Germany have suburban rail systems called S-Bahnen (Schnellbahnen).
These usually connect larger agglomerations to their suburbs and often other regional towns, although the Rhein-Ruhr S-Bahn connects several large cities. An S-Bahn does not skip stations and runs more frequently than other trains. In Berlin and Hamburg the S-Bahn has a U-Bahn-like service and uses a third rail, whereas all other S-Bahn services rely on regular catenary power supply.

Rapid transit (U-Bahn)

Relatively few cities have a full-fledged underground U-Bahn system; S-Bahn (suburban commuter railway) systems are far more common. In some cities the distinction between U-Bahn and S-Bahn systems is blurred: for instance, some S-Bahn systems run underground, have frequencies similar to a U-Bahn, and form part of the same integrated transport network. A larger number of cities have upgraded their tramways to light rail standards; these systems are called Stadtbahn (not to be confused with S-Bahn, which runs on main-line rails).

Cities with U-Bahn systems are:

Berlin (U-Bahn)
Hamburg (U-Bahn)
Munich (U-Bahn)
Nuremberg/Fürth (U-Bahn)

With the exception of Hamburg, all of those aforementioned cities also have a tram system, often with new lines built to light rail standards. Cities with Stadtbahn systems can be found in the article Trams in Germany.

Trams (Straßenbahn)

Germany was among the first countries to have electric street-running railways, and Berlin has one of the longest tram networks in the world. Many West German cities abandoned their previous tram systems in the 1960s and 1970s, while others upgraded them to "Stadtbahn" (~light rail) standard, often including underground sections. In the East, most cities retained or even expanded their tram systems, and since reunification a trend towards new tram construction can be observed in most of the country. Today the only major German city without a tram or light rail system is Hamburg.
Tram-train systems like the Karlsruhe model first came to prominence in Germany in the early 1990s and are implemented or discussed in several cities, providing coverage far into the rural areas surrounding cities.

Air transport

Short distances and the extensive network of motorways and railways make airplanes uncompetitive for travel within Germany. Only about 1% of all distance travelled was by plane in 2002, but due to a decline in prices with the introduction of low-fares airlines, domestic air travel is becoming more attractive. In 2013 Germany had the fifth-largest passenger air market in the world, with 105,016,346 passengers. However, the advent of new, faster rail lines often leads to cuts in service by the airlines or even the total abandonment of routes like Frankfurt-Cologne, Berlin-Hannover or Berlin-Hamburg.

Airlines

see: List of airlines of Germany

Germany's largest airline is Lufthansa, which was privatised in the 1990s. Lufthansa also operates two regional subsidiaries under the Lufthansa Regional brand and a low-cost subsidiary, Eurowings, which operates independently. Lufthansa flies a dense network of domestic, European and intercontinental routes. Germany's second-largest airline was Air Berlin, which also operated a network of domestic and European destinations with a focus on leisure routes as well as some long-haul services. Air Berlin declared bankruptcy in 2017, with the last flight under its own name in October of that year.

Charter and leisure carriers include Condor, TUIfly, MHS Aviation and Sundair. Major German cargo operators are Lufthansa Cargo, European Air Transport Leipzig (a subsidiary of DHL) and AeroLogic (jointly owned by DHL and Lufthansa Cargo).

Airports

see: List of airports in Germany

Frankfurt Airport is Germany's largest airport, a major transportation hub in Europe and the world's twelfth-busiest airport. It is one of the airports with the largest number of international destinations served worldwide.
Depending on whether total passengers, flights or cargo traffic is used as a measure, it ranks first, second or third in Europe alongside London Heathrow Airport and Paris-Charles de Gaulle Airport. Germany's second-biggest international airport is Munich Airport, followed by Düsseldorf Airport. There are several more scheduled passenger airports throughout Germany, mainly serving European metropolitan and leisure destinations. Intercontinental long-haul routes are operated to and from the airports in Frankfurt, Munich, Düsseldorf, Berlin-Tegel, Cologne/Bonn, Hamburg and Stuttgart.

Berlin Brandenburg Airport is expected to become the third-largest German airport by annual passengers, serving as the single airport for Berlin. Originally planned to be completed in 2011, the new airport was delayed several times due to poor construction management and technical difficulties. As of September 2014 it was not known when the new airport would become operational; in 2017 it was announced that the airport would not open before 2019. In the same year a non-binding referendum to keep Tegel Airport open even after the new airport opens was passed by Berlin voters. BER finally opened on October 31, 2020.

Airports — with paved runways:
total: 318
over 3,047 m: 14
2,438 to 3,047 m: 49
1,524 to 2,437 m: 60
914 to 1,523 m: 70
under 914 m: 125 (2013 est.)

Airports — with unpaved runways:
total: 221
over 3,047 m: 0
2,438 to 3,047 m: 0
1,524 to 2,437 m: 1
914 to 1,523 m: 35
under 914 m: 185 (2013 est.)

Heliports: 23 (2013 est.)

Water transport

Waterways: 7,467 km (2013); major rivers include the Rhine and Elbe. The Kiel Canal is an important connection between the Baltic Sea and the North Sea and one of the busiest waterways in the world; the Rhine-Main-Danube Canal links Rotterdam on the North Sea with the Black Sea and passes through the highest point reachable by ocean-going vessels from the sea. The canal has gained importance for leisure cruises in addition to cargo traffic.
Pipelines: oil 2,400 km (2013)

Ports and harbours: Berlin, Bonn, Brake, Bremen, Bremerhaven, Cologne, Dortmund, Dresden, Duisburg, Emden, Fürth, Hamburg, Karlsruhe, Kiel, Lübeck, Magdeburg, Mannheim, Nuremberg, Oldenburg, Rostock, Stuttgart, Wilhelmshaven

The port of Hamburg is the largest seaport in Germany and ranks 3rd in Europe (after Rotterdam and Antwerp) and 17th worldwide (2016) in total container traffic.

Merchant marine: 427 ships in total. Ships by type: barge carrier 2, bulk carrier 6, cargo ship 51, chemical tanker 15, container ship 298, liquefied gas carrier 6, passenger ship 4, petroleum tanker 10, refrigerated cargo 3, roll-on/roll-off ship 6 (2010 est.)

Ferries operate mostly between mainland Germany and its islands, serving both tourism and freight transport. Car ferries also operate across the Baltic Sea to the Nordic countries, Russia and the Baltic countries. Rail ferries operate across the Fehmarn Belt, from Rostock to Sweden (both carrying passenger trains) and from the Mukran port in Sassnitz on the island of Rügen to numerous Baltic Sea destinations (freight only).

See also

List of airports in Germany
License plates in Germany
List of motorways in Germany
List of federal highways in Germany
Tourism in Germany
CIA World Factbook - see section on transportation

References
11933
https://en.wikipedia.org/wiki/Military%20of%20Germany%20%28disambiguation%29
Military of Germany (disambiguation)
Military of Germany may refer to:
Reichsheer (1871–1919), the armed forces of the German Empire (lit. "Imperial Army")
Reichswehr (1919–1935), the armed forces of the Weimar Republic
Wehrmacht (1935–1945), the armed forces of Nazi Germany
National People's Army (1956–1990), the armed forces of the former German Democratic Republic (East Germany), prior to reunification
Bundeswehr (since 1955), the current armed forces of Germany, and of the Federal Republic of Germany (West Germany) prior to reunification
https://en.wikipedia.org/wiki/Foreign%20relations%20of%20Germany
Foreign relations of Germany
The Federal Republic of Germany (FRG) is a Central European country and a member of the European Union, the G4, the G7, the G20, the Organisation for Economic Co-operation and Development and the North Atlantic Treaty Organization (NATO). It maintains a network of 229 diplomatic missions abroad and holds relations with more than 190 countries. As one of the world's leading industrialized countries, it is recognized as a major power in European and global affairs.

History

Primary institutions and actors

Federal Cabinet
The three cabinet-level ministries responsible for guiding Germany's foreign policy are the Ministry of Defense, the Ministry of Economic Cooperation and Development and the Federal Foreign Office. In practice, most German federal departments play some role in shaping foreign policy, in the sense that few policy areas remain outside international jurisdiction. The bylaws of the Federal Cabinet (as delineated in Germany's Basic Law), however, assign the Federal Foreign Office a coordinating function. Accordingly, other ministries may only invite foreign guests or participate in treaty negotiations with the approval of the Federal Foreign Office.

Bundestag
With respect to foreign policy, the Bundestag acts in a supervisory capacity. Each of its committees – most notably the foreign relations committee – oversees the country's foreign policy. The consent of the Bundestag (and, insofar as the Länder are affected, the Bundesrat) is required to ratify foreign treaties. If treaty legislation passes its first reading, it is referred to the Committee on Foreign Affairs, which can delay ratification and prejudice the decision through its report to the Bundestag. In 1994, a full EU Committee was also created to handle the large flow of EU-related topics and legislation. The committee also has the mandate to speak on behalf of the Bundestag and to represent it when an EU policy position is decided.
A case in point was the committee's involvement in the European Union's eastern enlargement, in which the Committee on Foreign Affairs was responsible for relations with the Central and Eastern European states while the EU Committee was tasked with the negotiations.

NGOs
There is a raft of NGOs in Germany that engage with foreign policy issues. These NGOs include think-tanks (German Council on Foreign Relations), single-issue lobbying organizations (Amnesty International), as well as organizations that promote stronger bilateral ties between Germany and other countries (Atlantic Bridge). While the budgets and methods of these NGOs differ, they share the overarching goal of persuading decision-makers of the wisdom of their views. In 2004, a new German governance framework emerged, particularly in the foreign and security policy areas, in which NGOs are integrated into actual policymaking. The idea is that cooperation between the state and civil society groups improves the quality of conflict resolution, development cooperation and humanitarian aid for fragile states. The framework seeks to benefit from the expertise of the NGOs in exchange for giving these groups a chance to influence foreign policy.

Disputes
In 2001, the discovery that the terrorist cell which carried out the attacks against the United States on 11 September 2001 was based in Hamburg sent shock waves through the country. The government of Chancellor Gerhard Schröder backed the subsequent U.S. military actions, sending Bundeswehr troops to Afghanistan to lead a joint NATO program to provide security in the country after the ousting of the Taliban. Nearly all of the public was strongly against America's 2003 invasion of Iraq and any deployment of troops. This position was shared by the SPD/Green government, which led to some friction with the United States. In August 2006, the German government disclosed a botched plot to bomb two German trains.
The attack was to occur in July 2006 and involved a 21-year-old Lebanese man, identified only as Youssef Mohammed E. H. Prosecutors said Youssef and another man left suitcases stuffed with crude propane-gas bombs on the trains. As of February 2007, Germany had about 3,000 troops in the NATO-led International Security Assistance Force in Afghanistan as part of the War on Terrorism, the third largest contingent after the United States (14,000) and the United Kingdom (5,200). German forces are mostly in the more secure north of the country. However, Germany, along with some other larger European countries (with the exception of the UK and the Netherlands), has been criticised by the UK and Canada for not sharing the burden of the more intensive combat operations in southern Afghanistan.

Global initiatives

Humanitarian aid
Germany is the largest net contributor to the United Nations and has several development agencies working in Africa and the Middle East. The development policy of the Federal Republic of Germany is an independent area of German foreign policy. It is formulated by the Federal Ministry for Economic Cooperation and Development (BMZ) and carried out by the implementing organisations. The German government sees development policy as a joint responsibility of the international community. Germany is the world's third biggest aid donor after the United States and France. It spent 0.37 per cent of its gross domestic product (GDP) on development, below the government's target of increasing aid to 0.51 per cent of GDP by 2010. Even the international target of 0.7% of GNP would not have been reached.

Ecological involvement

International organizations
Germany is a member of the Council of Europe, European Union, European Space Agency, G4, G8, International Monetary Fund, NATO, OECD, Organization for Security and Co-operation in Europe, UN, World Bank Group and the World Trade Organization.
European Union
European integration has come a long way since the European Coal and Steel Community (ECSC) and the Élysée Treaty. Peaceful collaboration with its neighbors remains one of Germany's biggest political objectives, and Germany has been at the forefront of most achievements made in European integration:
Maastricht Treaty
European Constitution
European Defence Force
Introduction of the single currency, the euro (€)
Most of the social issues facing European countries in general – immigration, aging populations, straining social-welfare and pension systems – are also important in Germany. Germany seeks to maintain peace through the "deepening" of integration among the current European Union member states. Germany has been the largest net contributor to EU budgets for decades (in absolute terms – given Germany's comparatively large population – not per capita) and seeks to limit the growth of these net payments in the enlarged union.

NATO
Under the doctrine introduced by the 2003 Defense Policy Guidelines, Germany continues to give priority to the transatlantic partnership with the United States through the North Atlantic Treaty Organization. However, Germany is giving increasing attention to coordinating its policies with the European Union through the Common Foreign and Security Policy.

UN
The German Federal Government began an initiative to obtain a permanent seat in the United Nations Security Council as part of the Reform of the United Nations. This would require the approval of a two-thirds majority of the member states and the approval of all five Security Council veto powers. This aspiration could be successful due to Germany's good relations with the People's Republic of China and the Russian Federation. Germany is a stable and democratic republic and a G7 country, which are also favourable attributes. The United Kingdom and France support German accession to the supreme body. The U.S. is sending mixed signals.
NATO member states, including Germany, decided not to sign the UN treaty on the Prohibition of Nuclear Weapons, a binding agreement for negotiations for the total elimination of nuclear weapons, supported by more than 120 nations. Africa Americas Asia Europe Balkan states The German government was a strong supporter of the enlargement of NATO. Germany was one of the first nations to recognize Croatia and Slovenia as independent nations, rejecting the concept of Yugoslavia as the only legitimate political order in the Balkans (unlike other European powers, who first proposed a pro-Belgrade policy). This is why Serb authorities sometimes referred to "new German imperialism" as one of the main reasons for Yugoslavia's collapse. German troops participate in the multinational efforts to bring "peace and stability" to the Balkans. Central Europe Weimar triangle (France, Germany and Poland); Germany continues to be active economically in the states of Central Europe, and to actively support the development of democratic institutions. In the 2000s, Germany has been arguably the centerpiece of the European Union (though the importance of France cannot be overlooked in this connection). Oceania See also Anglo-German naval arms race Human rights in Germany List of diplomatic missions in Germany List of diplomatic missions of Germany Security issues in Germany Sino-German cooperation (1911–1941) Visa requirements for German citizens References Further reading German diplomacy Bark, Dennis L., and David R. Gress. A History of West Germany. Vol. 1: From Shadow to Substance, 1945–1963. Vol. 2: Democracy and Its Discontents, 1963–1991 (1993), the standard scholarly history Blumenau, Bernhard, 'German Foreign Policy and the 'German Problem' During and After the Cold War: Changes and Continuities'. in: B Blumenau, J Hanhimäki & B Zanchetta (eds), New Perspectives on the End of the Cold War: Unexpected Transformations? Ch. 5. London: Routledge, 2018. . Brandenburg, Erich. 
From Bismarck to the World War: A History of German Foreign Policy 1870-1914 (1927) online. Buse, Dieter K., and Juergen C. Doerr, eds. Modern Germany: an encyclopedia of history, people and culture, 1871-1990 (2 vol. Garland, 1998). Clark, Claudia. Dear Barack: The Extraordinary Partnership of Barack Obama and Angela Merkel (2021) Cole, Alistair. Franco-German Relations (2000) Feldman, Lily Gardner. Germany's Foreign Policy of Reconciliation: From Enmity to Amity (Rowman & Littlefield; 2012) 393 pages; on German relations with France, Israel, Poland, and Czechoslovakia/the Czech Republic. excerpt Forsberg, Tuomas. "From Ostpolitik to ‘frostpolitik’? Merkel, Putin and German foreign policy towards Russia." International Affairs 92.1 (2016): 21-42. online Gaskarth, Jamie, and Kai Oppermann. "Clashing traditions: German foreign policy in a New Era." International Studies Perspectives 22.1 (2021): 84-105. online Geiss, Imanuel. German foreign policy, 1871-1914 (1976) Haftendorn, Helga. German Foreign Policy Since 1945 (2006), 441pp Hanrieder, Wolfram F. Germany, America, Europe: Forty Years of German Foreign Policy (1991) Heuser, Beatrice. NATO, Britain, France & the FRG: Nuclear Strategies & Forces for Europe, 1949-2000 (1997) 256pp Hewitson, Mark. "Germany and France before the First World War: a reassessment of Wilhelmine foreign policy." English Historical Review 115.462 (2000): 570–606. in JSTOR Junker, Detlef, ed. The United States and Germany in the Era of the Cold War (2 vol 2004), 150 short essays by scholars covering 1945–1990 excerpt and text search vol 1; excerpt and text search vol 2 Kefferputz, Roderick and Jeremy Stern. "The United States, Germany, and World Order: New Priorities for a Changing Alliance." Atlantic Council: Issue Brief (2021) online Kimmich, Christoph. German Foreign Policy 1918-1945: A Guide to Research and Research Materials (2nd ed. Scholarly Resources, 1991) 264 pp. Leitz, Christian. 
Nazi Foreign Policy, 1933-1941: The Road to Global War (2004) Maulucci Jr., Thomas W. Adenauer's Foreign Office: West German Diplomacy in the Shadow of the Third Reich (2012) excerpt Oppermann, Kai. "National role conceptions, domestic constraints and the new 'normalcy' in German foreign policy: the Eurozone crisis, Libya and beyond." German Politics 21.4 (2012): 502-519. Paterson, William E. "Foreign Policy in the Grand Coalition." German Politics 19.3-4 (2010): 497-514. Papayoanou, Paul A. "Interdependence, institutions, and the balance of power: Britain, Germany, and World War I." International Security 20.4 (1996): 42–76. Schwarz, Hans-Peter. Konrad Adenauer: A German Politician and Statesman in a Period of War, Revolution and Reconstruction (2 vol 1995) excerpt and text search vol 2. Schmitt, Bernadotte E. "Triple Alliance and Triple Entente, 1902-1914." American Historical Review 29.3 (1924): 449–473. in JSTOR Sontag, Raymond James. Germany and England: Background of Conflict, 1848-1898 (1938) Spang, Christian W. and Rolf-Harald Wippich, eds. Japanese-German Relations, 1895-1945: War, Diplomacy and Public Opinion (2006) Weinberg, Gerhard L. The Foreign Policy of Hitler's Germany (2 vol, 1970–80). Wright, Jonathan. Germany and the Origins of the Second World War (Palgrave Macmillan, 2007) 223pp. online review Young, William. German Diplomatic Relations 1871-1945: The Wilhelmstrasse and the Formulation of Foreign Policy (2006); how the foreign ministry shaped policy World/European diplomatic context Albrecht-Carrié, René. A Diplomatic History of Europe Since the Congress of Vienna (1958), 736pp; a basic introduction that gives context to Germany's roles Kaiser, David E. Economic Diplomacy and the Origins of the Second World War: Germany, Britain, France, and Eastern Europe, 1930-1939 (Princeton UP, 2015). Kennedy, Paul.
The Rise and Fall of the Great Powers: Economic Change and Military Conflict from 1500 to 2000 (1989) excerpt and text search; very wide-ranging, with much on economic power Langer, William. An Encyclopedia of World History (5th ed. 1973), very detailed outline Langer, William. European Alliances and Alignments 1870-1890 (2nd ed. 1950); advanced coverage of Bismarckian system Langer, William L. The Diplomacy of Imperialism 1890-1902 (2 vol, 1935) Macmillan, Margaret. The War That Ended Peace: The Road to 1914 (2013) covers 1890s to 1914; see esp. ch 3–5, 8 Mowat, R. B. A History of European Diplomacy 1815-1914 (1922), basic introduction Schroeder, Paul W. The Transformation of European Politics 1763-1848 (1996) Steiner, Zara. The Lights that Failed: European International History 1919-1933 (2007) excerpt and text search Steiner, Zara. The Triumph of the Dark: European International History 1933-1939 (2011) excerpt and text search Taylor, A. J. P. The Struggle for Mastery in Europe: 1848–1918 (1957) excerpt and text search, advanced coverage of all major powers

External links
German-Bashing and the Breakup of Yugoslavia ("The Donald W. Treadgold Papers in Russian, East European and Central Asian Studies", nº 16, March 1998). University of Washington: HMJ School of International Studies
The German Economy in the New Europe
EU Enlargement and Transatlantic Relations
Bierling, Stephan. Die Außenpolitik der Bundesrepublik Deutschland: Normen, Akteure, Entscheidungen. 2. Auflage. München: Oldenbourg, 2005.
von Bredow, Wilfried. Die Außenpolitik der Bundesrepublik Deutschland: Eine Einführung. Wiesbaden: VS Verlag für Sozialwissenschaften, 2006.
Permanent Mission of Germany to the United Nations
Auswärtiges Amt
AICGS American Institute for Contemporary German Studies
SWP German Institute for International and Security Affairs
https://en.wikipedia.org/wiki/Politics%20of%20Germany
Politics of Germany
Germany is a democratic, federal parliamentary republic, where federal legislative power is vested in the Bundestag (the parliament of Germany) and the Bundesrat (the representative body of the Länder, Germany's regional states). The federal system has, since 1949, been dominated by the Christian Democratic Union (CDU) and the Social Democratic Party of Germany (SPD). The judiciary of Germany is independent of the executive and the legislature, while it is common for leading members of the executive to be members of the legislature as well. The political system is laid out in the 1949 constitution, the Grundgesetz (Basic Law), which remained in effect with minor amendments after German reunification in 1990. The constitution emphasizes the protection of individual liberty in an extensive catalogue of human and civil rights and divides powers both between the federal and state levels and between the legislative, executive and judicial branches. West Germany was a founding member of the European Community in 1958, which became the EU in 1993. Germany is part of the Schengen Area, and has been a member of the eurozone since 1999. It is a member of the United Nations, NATO, the G7, the G20 and the OECD. History Prior to 1998 Beginning with the election of Konrad Adenauer in 1949, the Federal Republic of Germany had Christian Democratic chancellors for 20 years until a coalition between the Social Democrats and the Liberals took over. From 1982, Christian Democratic leader Helmut Kohl was chancellor in a coalition with the Liberals for 16 years. In this period fell the reunification of Germany, in 1990: the German Democratic Republic joined the Federal Republic. In the former GDR's territory, five Länder (states) were established or reestablished. The two parts of Berlin united as one "Land" (state). The political system of the Federal Republic remained more or less unchanged. 
Specific provisions for the former GDR territory were enabled via the unification treaty between the Federal Republic and the GDR prior to the unification day of 3 October 1990. In the years that followed, however, Germany saw two distinct party systems: the Green party and the Liberals remained mostly West German parties, while in the East the former socialist state party, now called the PDS, flourished along with the Christian Democrats and Social Democrats.

1998–2005
After 16 years of the Christian–Liberal coalition led by Helmut Kohl, the Social Democratic Party of Germany (SPD) together with the Greens won the Bundestag elections of 1998. SPD vice chairman Gerhard Schröder positioned himself as a centrist candidate, in contrast to the leftist SPD chairman Oskar Lafontaine. The Kohl government was hurt at the polls by slower economic growth in the East in the previous two years and by constantly high unemployment. The final margin of victory was sufficiently high to permit a "red–green" coalition of the SPD with Alliance 90/The Greens (Bündnis '90/Die Grünen), bringing the Greens into a national government for the first time. Initial problems of the new government, marked by policy disputes between the moderate and traditional left wings of the SPD, resulted in some voter disaffection. Lafontaine left the government (and later his party) in early 1999. The CDU won some important state elections but was hit in 2000 by a party donation scandal from the Kohl years. As a result of this Christian Democratic Union (CDU) crisis, Angela Merkel became chair. The next election for the Bundestag was on 22 September 2002. Gerhard Schröder led the coalition of SPD and Greens to an eleven-seat victory over the Christian Democrat challengers headed by Edmund Stoiber (CSU).
Three factors are generally cited that enabled Schröder to win the elections despite poor approval ratings a few months before and a weaker economy: good handling of the 100-year flood, firm opposition to the impending US invasion of Iraq, and Stoiber's unpopularity in the east, which cost the CDU crucial seats there. In its second term, the red–green coalition lost several very important state elections, for example in Lower Saxony, where Schröder had been prime minister from 1990 to 1998. On 20 April 2003, chancellor Schröder announced massive labor market reforms, called Agenda 2010, that cut unemployment benefits. Although these reforms sparked massive protests, they are now credited with being in part responsible for the relatively strong economic performance of Germany during the euro crisis and the decrease in unemployment in Germany in the years 2006–2007.

2005–2009
On 22 May 2005 the SPD received a devastating defeat in its former heartland, North Rhine-Westphalia. Half an hour after the election results, the SPD chairman Franz Müntefering announced that the chancellor would clear the way for new federal elections. This took the republic by surprise, especially because the SPD was below 20% in polls at the time. The CDU quickly announced Angela Merkel as Christian Democrat candidate for chancellor, aspiring to be the first female chancellor in German history. New for the 2005 election was the alliance between the newly formed Electoral Alternative for Labor and Social Justice (WASG) and the PDS, which planned to fuse into a common party (see Left Party.PDS). With former SPD chairman Oskar Lafontaine for the WASG and Gregor Gysi for the PDS as prominent figures, this alliance soon found interest in the media and in the population. Polls in July saw them as high as 12%.
Whereas in May and June 2005 a victory of the Christian Democrats seemed highly likely, with some polls giving them an absolute majority, this picture changed shortly before the election on 18 September 2005. The election results of 18 September were surprising because they differed widely from the polls of the previous weeks. The Christian Democrats even lost votes compared to 2002, narrowly reaching first place with only 35.2%, and failed to get a majority for a "black–yellow" government of CDU/CSU and liberal FDP. But the red–green coalition also failed to get a majority, with the SPD losing votes but polling 34.2% and the Greens staying at 8.1%. The Left reached 8.7% and entered the Bundestag, whereas the far-right NPD got only 1.6%. The most likely outcome of coalition talks was a so-called grand coalition between the Christian Democrats (CDU/CSU) and the Social Democrats (SPD). Three-party coalitions and coalitions involving The Left had been ruled out by all interested parties (including The Left itself). On 22 November 2005, Angela Merkel was sworn in by President Horst Köhler as Bundeskanzlerin. The existence of the grand coalition on the federal level helped the smaller parties' electoral prospects in state elections. After the CSU lost its absolute majority in Bavaria in 2008 and formed a coalition with the FDP, the grand coalition had no majority in the Bundesrat and depended on FDP votes on important issues. In November 2008, the SPD re-elected Franz Müntefering, who had earlier stepped down as chair, and made Frank-Walter Steinmeier its leading candidate for the federal election in September 2009. As a result of that federal election, the grand coalition, having brought losses for both parties, came to an end. The SPD suffered the heaviest losses in its history and was unable to form a coalition government. The CDU/CSU had only small losses but also reached a new historic low with its worst result since 1949.
The three smaller parties thus had more seats in the German Bundestag than ever before, with the liberal FDP winning 14.6% of the votes.

2009–2013
The CDU/CSU and FDP together held 332 seats (of 622 total) and had been in coalition since 27 October 2009. Angela Merkel was re-elected as chancellor, and Guido Westerwelle served as foreign minister and vice chancellor of Germany. After being elected into the federal government, the FDP suffered heavy losses in the following state elections. The FDP had promised to lower taxes in the electoral campaign, but after joining the coalition it had to concede that this was not possible due to the economic crisis of 2008. Because of the losses, Guido Westerwelle resigned as chair of the FDP in favor of Philipp Rösler, federal minister of health, who was consequently appointed vice chancellor. Shortly afterwards, Philipp Rösler changed office and became federal minister of economics and technology. After their electoral fall, the Social Democrats were led by Sigmar Gabriel, a former federal minister and prime minister of Lower Saxony, with Frank-Walter Steinmeier as head of the parliamentary group. Gabriel resigned on 16 January 2017 and proposed his longtime friend Martin Schulz, then president of the European Parliament, as his successor and chancellor candidate. Germany has seen increased political activity by citizens outside the established political parties with respect to local and environmental issues such as the Stuttgart 21 railway project and the construction of Berlin Brandenburg Airport.

2013–2017
The 18th federal elections in Germany resulted in the re-election of Angela Merkel and her Christian democratic parliamentary group of the parties CDU and CSU, receiving 41.5% of all votes.
Following the historically low results of Merkel's first two campaigns, her third campaign marked the CDU/CSU's best result since 1994 and, for only the second time in German history, raised the possibility of an absolute majority. Their former coalition partner, the FDP, narrowly failed to clear the 5% threshold and did not gain seats in the Bundestag. Not having reached an absolute majority, the CDU/CSU formed a grand coalition with the social-democratic SPD after the longest coalition talks in history, making party leader Sigmar Gabriel vice-chancellor and federal minister for economic affairs and energy. Together they held 504 of a total of 631 seats (CDU/CSU 311 and SPD 193). The only two opposition parties were The Left (64 seats) and Alliance '90/The Greens (63 seats), a situation widely acknowledged as critical because the opposition parties did not even have enough seats to use the special controlling powers of the opposition.

2017–2021
The 19th federal elections in Germany took place on 24 September 2017. The two big parties, the conservative CDU/CSU parliamentary group and the social democratic SPD, were in a similar situation as in 2009, after the last grand coalition had ended, and both suffered severe losses, reaching their second-worst and worst results respectively in 2017. Many votes in the 2017 elections went to smaller parties, bringing the right-wing populist AfD (Alternative for Germany) into the Bundestag, a big shift in German politics: it was the first far-right party to win seats in parliament since the 1950s. With Merkel's candidacy for a fourth term, the CDU/CSU reached only 33.0% of the votes but won the highest number of seats, leaving no realistic coalition option without the CDU/CSU.
As all parties in the Bundestag strictly ruled out a coalition with the AfD, the only options for a majority coalition were a so-called "Jamaica" coalition (CDU/CSU, FDP, Greens; named after the party colors, which resemble those of the Jamaican flag) and a grand coalition with the SPD, which was at first opposed by the Social Democrats and their leader Martin Schulz. Coalition talks between the three parties of the "Jamaica" coalition were held, but the final proposal was rejected by the liberal FDP, leaving the government in limbo. In this unprecedented situation, various minority governments or even immediate snap elections were also heavily discussed for the first time in German history. At this point, Federal President Steinmeier invited the leaders of all parties for talks about forming a government, the first president in the history of the Federal Republic to do so. Official coalition talks between the CDU/CSU and SPD started in January 2018 and led to a renewal of the grand coalition on 12 March 2018, as well as the subsequent re-election of Angela Merkel as chancellor.

2021 onwards
Scheduled elections for the new Bundestag were held on 26 September 2021 during the COVID-19 pandemic. Angela Merkel did not stand for a fifth term and handed over her post after the second-longest tenure of any chancellor in German history. Olaf Scholz was sworn in as the new chancellor on 8 December 2021. His Social Democrats had won a plurality of the votes and formed a coalition government with The Greens and the liberal FDP.

Constitution
The "Basic Law for the Federal Republic of Germany" (Grundgesetz der Bundesrepublik Deutschland) is the constitution of Germany. It was formally approved on 8 May 1949 and, with the signature of the Allies of World War II on 12 May, came into effect on 23 May as the constitution of those states of West Germany that were initially included within the Federal Republic.
The 1949 Basic Law is a response to the perceived flaws of the 1919 Weimar Constitution, which failed to prevent the rise of the Nazi party in 1933. Since 1990, in the course of the reunification process after the fall of the Berlin Wall, the Basic Law has also applied to the eastern states of the former German Democratic Republic.

Executive

Head of state
The German head of state is the federal president. As in Germany's parliamentary system of government, the federal chancellor runs the government and day-to-day politics, while the role of the federal president is mostly ceremonial. The federal president, through their actions and public appearances, represents the state itself: its existence, its legitimacy and its unity. Their office involves an integrative role. Nearly all actions of the federal president become valid only after countersignature by a government member. The president is not obliged by the constitution to refrain from political views. He or she is expected to give direction to general political and societal debates, but not in a way that links him to party politics. Most German presidents were active politicians and party members prior to taking office, which means that they have to change their political style when becoming president. The president's official residence is Bellevue Palace. Under Article 59 (1) of the Basic Law, the federal president represents the Federal Republic of Germany in matters of international law, concludes treaties with foreign states on its behalf and accredits diplomats. All federal laws must be signed by the president before they can come into effect; the president does not have a veto, but the conditions for refusing to sign a law on the basis of unconstitutionality are the subject of debate. The office is currently held by Frank-Walter Steinmeier (since 2017). The federal president does have a role in the political system, especially at the establishment of a new government and the dissolution of the Bundestag (parliament).
This role is usually nominal but can become significant in case of political instability. Additionally, the federal president together with the Federal Council can support the government in a "legislatory emergency state" to enable laws against the will of the Bundestag (Article 81 of the Basic Law). However, so far no federal president has ever had to use these "reserve powers".

Head of government
The Bundeskanzler (federal chancellor) heads the Bundesregierung (federal government) and thus the executive branch of the federal government. The chancellor is elected by and responsible to the Bundestag, Germany's parliament. The other members of the government are the federal ministers, who are chosen by the chancellor. Germany, like the United Kingdom, can thus be classified as a parliamentary system. The office is currently held by Olaf Scholz (since 2021). The chancellor cannot be removed from office during a four-year term unless the Bundestag has agreed on a successor. This constructive vote of no confidence is intended to avoid a situation like that of the Weimar Republic, in which the executive did not have enough support in the legislature to govern effectively, but the legislature was too divided to name a successor. The current system also prevents the chancellor from calling a snap election. Except in the periods 1969–1972 and 1976–1982, when the Social Democratic Party of chancellors Brandt and Schmidt came second in the elections, the chancellor has always been the candidate of the largest party, usually supported by a coalition of two parties with a majority in parliament. The chancellor appoints one of the federal ministers as their deputy, who holds the unofficial title of Vice Chancellor (Vizekanzler). That office is currently held by Robert Habeck (since 2021).

Cabinet
The German Cabinet (Bundeskabinett or Bundesregierung) is the chief executive body of the Federal Republic of Germany. It consists of the chancellor and the cabinet ministers.
The fundamentals of the cabinet's organization are set down in articles 62–69 of the Basic Law. The current cabinet is the Scholz cabinet (since 2021). Agencies Agencies of the German government include: Federal Intelligence Service (Bundesnachrichtendienst) Federal Bureau of Aircraft Accident Investigation (Bundesstelle für Flugunfalluntersuchung) Federal Aviation Office (Luftfahrt-Bundesamt) Federal Bureau for Maritime Casualty Investigation (Bundesstelle für Seeunfalluntersuchung) Federal Maritime and Hydrographic Agency (Bundesamt für Seeschifffahrt und Hydrographie) Federal Railway Accident Investigation Board (Eisenbahn-Unfalluntersuchungsstelle des Bundes) Federal Railway Authority (Eisenbahn-Bundesamt) Legislature Federal legislative power is divided between the Bundestag and the Bundesrat. The Bundestag is directly elected by the German people, while the Bundesrat represents the governments of the regional states (Länder). The federal legislature has powers of exclusive jurisdiction and concurrent jurisdiction with the states in areas specified in the constitution. The Bundestag is more powerful than the Bundesrat and only needs the latter's consent for proposed legislation related to revenue shared by the federal and state governments, and the imposition of responsibilities on the states. In practice, however, the agreement of the Bundesrat in the legislative process is often required, since federal legislation frequently has to be executed by state or local agencies. In the event of disagreement between the Bundestag and the Bundesrat, either side can appeal to the Vermittlungsausschuss (Mediation Committee), a conference committee-like body of 16 Bundesrat and 16 Bundestag members, to find a compromise. Bundestag The Bundestag (Federal Diet) is elected for a four-year term and consists of 598 or more members elected by means of mixed-member proportional representation, which Germans call "personalised proportional representation".
299 members represent single-seat constituencies and are elected by a first-past-the-post electoral system. Parties that obtain fewer constituency seats than their national share of the vote are allotted seats from party lists to make up the difference. In contrast, parties that obtain more constituency seats than their national share of the vote are allowed to keep these so-called overhang seats. In the parliament that was elected in 2009, there were 24 overhang seats, giving the Bundestag a total of 622 members. After Bundestag elections since 2013, other parties obtain extra seats ("balance seats") that offset advantages from their rivals' overhang seats. The current Bundestag is the largest in German history with 709 members. A party must receive either five percent of the national vote or win at least three directly elected seats to be eligible for non-constituency seats in the Bundestag. This rule, often called the "five percent hurdle", was incorporated into Germany's election law to prevent political fragmentation and disproportionately influential minority parties. The first Bundestag elections were held in the Federal Republic of Germany ("West Germany") on 14 August 1949. Following reunification, elections for the first all-German Bundestag were held on 2 December 1990. The last federal election was held on 26 September 2021. Judiciary Germany follows the civil law tradition. The judicial system comprises three types of courts. Ordinary courts, dealing with criminal and most civil cases, are the most numerous by far. The Federal Court of Justice of Germany (Bundesgerichtshof) is the highest ordinary court and also the highest court of appeals. Specialized courts hear cases related to administrative, labour, social, fiscal and patent law. Constitutional courts focus on judicial review and constitutional interpretation. The Federal Constitutional Court (Bundesverfassungsgericht) is the highest court dealing with constitutional matters.
The main difference between the Federal Constitutional Court and the Federal Court of Justice is that the Federal Constitutional Court may only be called if a constitutional matter within a case is in question (e.g. a possible violation of human rights in a criminal trial), while the Federal Court of Justice may be called in any case. Foreign relations Germany maintains a network of 229 diplomatic missions abroad and holds relations with more than 190 countries. It is the largest contributor to the budget of the European Union (providing 27%) and the third-largest contributor to the United Nations (providing 8%). Germany is a member of the NATO defence alliance, the Organisation for Economic Co-operation and Development (OECD), the G8, the G20, the World Bank and the International Monetary Fund (IMF). Germany has played a leading role in the European Union since its inception and has maintained a strong alliance with France since the end of World War II. The alliance was especially close in the late 1980s and early 1990s under the leadership of Christian Democrat Helmut Kohl and Socialist François Mitterrand. Germany is at the forefront of European states seeking to advance the creation of a more unified European political, defence, and security apparatus. For a number of decades after WWII, the Federal Republic of Germany kept a notably low profile in international relations, because of both its recent history and its occupation by foreign powers. During the Cold War, Germany's partition by the Iron Curtain made it a symbol of East–West tensions and a political battleground in Europe. However, Willy Brandt's Ostpolitik was a key factor in the détente of the 1970s. In 1999, Chancellor Gerhard Schröder's government defined a new basis for German foreign policy by taking a full part in the decisions surrounding the NATO war against Yugoslavia and by sending German troops into combat for the first time since World War II.
The governments of Germany and the United States are close political allies. The 1948 Marshall Plan and strong cultural ties have crafted a strong bond between the two countries, although Schröder's very vocal opposition to the Iraq War had suggested the end of Atlanticism and a relative cooling of German–American relations. The two countries are also economically interdependent: 5.0% of German exports in goods are US-bound and 3.5% of German imported goods originate from the US, with a trade deficit for the United States of 63,678.5 million dollars (2017). Other signs of the close ties include the continuing position of German–Americans as the largest reported ethnic group in the US, and the status of Ramstein Air Base (near Kaiserslautern) as the largest US military community outside the US. The policy on foreign aid is an important area of German foreign policy. It is formulated by the Federal Ministry for Economic Cooperation and Development (BMZ) and carried out by the implementing organisations. The German government sees development policy as a joint responsibility of the international community. Germany is the world's fourth-biggest aid donor after the United States, the United Kingdom and France. Germany spent 0.37 per cent of its gross domestic product (GDP) on development, which was below the government's target of increasing aid to 0.51 per cent of GDP by 2010. Administrative divisions Germany comprises sixteen states that are collectively referred to as Länder. Due to differences in size and population, the subdivision of these states varies especially between city-states (Stadtstaaten) and states with larger territories (Flächenländer). For regional administrative purposes five states, namely Baden-Württemberg, Bavaria, Hesse, North Rhine-Westphalia and Saxony, consist of a total of 22 Government Districts (Regierungsbezirke).
As of 2009, Germany is divided into 403 districts (Kreise) at the municipal level; these consist of 301 rural districts and 102 urban districts. See also Federalism in Germany German governing coalition List of political parties in Germany List of Federal Republic of Germany governments Lobbying in Germany Party finance in Germany Political culture of Germany References External links Official Site of the Bundesregierung, in English Official source of election results Official source from the German Embassy in Washington, DC
https://en.wikipedia.org/wiki/History%20of%20geometry
History of geometry
Geometry (from Greek geo- "earth" and -metron "measurement") arose as the field of knowledge dealing with spatial relationships. Geometry was one of the two fields of pre-modern mathematics, the other being the study of numbers (arithmetic). Classic geometry was focused on compass-and-straightedge constructions. Geometry was revolutionized by Euclid, who introduced mathematical rigor and the axiomatic method still in use today. His book, The Elements, is widely considered the most influential textbook of all time, and was known to all educated people in the West until the middle of the 20th century. In modern times, geometric concepts have been generalized to a high level of abstraction and complexity, and have been subjected to the methods of calculus and abstract algebra, so that many modern branches of the field are barely recognizable as the descendants of early geometry. (See Areas of mathematics and Algebraic geometry.) Early geometry The earliest recorded beginnings of geometry can be traced to early peoples, who discovered obtuse triangles in the ancient Indus Valley (see Harappan mathematics), and ancient Babylonia (see Babylonian mathematics) from around 3000 BC. Early geometry was a collection of empirically discovered principles concerning lengths, angles, areas, and volumes, which were developed to meet some practical need in surveying, construction, astronomy, and various crafts. Among these were some surprisingly sophisticated principles, and a modern mathematician might be hard put to derive some of them without the use of calculus and algebra. For example, both the Egyptians and the Babylonians were aware of versions of the Pythagorean theorem about 1500 years before Pythagoras, and the Indian Sulba Sutras around 800 BC contained the first statements of the theorem; the Egyptians had a correct formula for the volume of a frustum of a square pyramid.
Egyptian geometry The ancient Egyptians knew that they could approximate the area of a circle as follows: Area of Circle ≈ [(8/9) × Diameter]². Problem 50 of the Ahmes papyrus uses this method to calculate the area of a circle, according to a rule that the area is equal to the square of 8/9 of the circle's diameter. This assumes that π is 4×(8/9)² (or 3.160493...), with an error of slightly over 0.63 percent. This value was slightly less accurate than the calculation of the Babylonians (25/8 = 3.125, within 0.53 percent), but was not otherwise surpassed until Archimedes' approximation of 211875/67441 = 3.14163, which had an error of just over 1 in 10,000. Ahmes knew of the modern 22/7 as an approximation for π, and used it to split a hekat: hekat × 22/7 × 7/22 = hekat; however, Ahmes continued to use the traditional 256/81 value for π when computing his hekat volume found in a cylinder. Problem 48 involved using a square with side 9 units. This square was cut into a 3x3 grid. The diagonals of the corner squares were used to make an irregular octagon with an area of 63 square units. This gave a second value for π of 3.111... The two problems together indicate a range of values for π between 3.11 and 3.16. Problem 14 in the Moscow Mathematical Papyrus gives the only ancient example of finding the volume of a frustum of a pyramid, describing the correct formula V = (h/3)(a² + ab + b²), where a and b are the base and top side lengths of the truncated pyramid and h is the height. Babylonian geometry The Babylonians may have known the general rules for measuring areas and volumes. They measured the circumference of a circle as three times the diameter and the area as one-twelfth the square of the circumference, which would be correct if π were estimated as 3. The volume of a cylinder was taken as the product of the base and the height; however, the volume of the frustum of a cone or a square pyramid was incorrectly taken as the product of the height and half the sum of the bases.
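These ancient rules are easy to check numerically. The following Python sketch is a modern illustration (not part of the ancient sources): it applies the Rhind-papyrus circle rule and the Moscow-papyrus frustum formula described above.

```python
import math

def ahmes_circle_area(diameter):
    """Rhind papyrus rule: area = (8/9 of the diameter) squared."""
    return ((8 / 9) * diameter) ** 2

def frustum_volume(h, a, b):
    """Moscow papyrus rule for a square frustum: V = (h/3)(a^2 + ab + b^2)."""
    return (h / 3) * (a * a + a * b + b * b)

# The circle rule implies pi ~ 4*(8/9)^2 = 256/81 ~ 3.1605:
# for d = 1, the radius is 1/2, so implied pi = area / (1/2)^2 = 4 * area.
implied_pi = 4 * ahmes_circle_area(1)
print(implied_pi)                            # 3.1604938...
print(abs(implied_pi - math.pi) / math.pi)   # relative error ~0.6%

# Moscow papyrus problem 14 uses h = 6, a = 4, b = 2, giving volume 56.
print(frustum_volume(6, 4, 2))               # 56.0
```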
The Pythagorean theorem was also known to the Babylonians. Also, there was a recent discovery in which a tablet used π as 3 and 1/8 (3.125). The Babylonians are also known for the Babylonian mile, which was a measure of distance equal to about seven miles today. This measurement for distances eventually was converted to a time-mile used for measuring the travel of the Sun, therefore representing time. There have been recent discoveries showing that ancient Babylonians may have discovered astronomical geometry nearly 1400 years before Europeans did. Vedic India geometry The Indian Vedic period had a tradition of geometry, mostly expressed in the construction of elaborate altars. Early Indian texts (1st millennium BC) on this topic include the Satapatha Brahmana and the Śulba Sūtras. By one scholarly account, the Śulba Sūtras contain "the earliest extant verbal expression of the Pythagorean Theorem in the world, although it had already been known to the Old Babylonians": "The diagonal rope of an oblong (rectangle) produces both [the areas] which the flank (pārśvamāni) and the horizontal [ropes] produce separately." They contain lists of Pythagorean triples, which are particular cases of Diophantine equations. They also contain statements (that with hindsight we know to be approximate) about squaring the circle and "circling the square." The Baudhayana Sulba Sutra, the best-known and oldest of the Sulba Sutras (dated to the 8th or 7th century BC), contains examples of simple Pythagorean triples, as well as a statement of the Pythagorean theorem for the sides of a square: "The rope which is stretched across the diagonal of a square produces an area double the size of the original square." It also contains the general statement of the Pythagorean theorem (for the sides of a rectangle): "The rope stretched along the length of the diagonal of a rectangle makes an area which the vertical and horizontal sides make together." According to mathematician S. G.
Dani, the Babylonian cuneiform tablet Plimpton 322 written c. 1850 BC "contains fifteen Pythagorean triples with quite large entries, including (13500, 12709, 18541) which is a primitive triple, indicating, in particular, that there was sophisticated understanding on the topic" in Mesopotamia in 1850 BC. "Since these tablets predate the Sulbasutras period by several centuries, taking into account the contextual appearance of some of the triples, it is reasonable to expect that similar understanding would have been there in India." Dani goes on to say: "As the main objective of the Sulvasutras was to describe the constructions of altars and the geometric principles involved in them, the subject of Pythagorean triples, even if it had been well understood may still not have featured in the Sulvasutras. The occurrence of the triples in the Sulvasutras is comparable to mathematics that one may encounter in an introductory book on architecture or another similar applied area, and would not correspond directly to the overall knowledge on the topic at that time. Since, unfortunately, no other contemporaneous sources have been found it may never be possible to settle this issue satisfactorily." In all, three Sulba Sutras were composed. The remaining two, the Manava Sulba Sutra composed by Manava (fl. 750-650 BC) and the Apastamba Sulba Sutra, composed by Apastamba (c. 600 BC), contained results similar to the Baudhayana Sulba Sutra. Greek geometry Classical Greek geometry For the ancient Greek mathematicians, geometry was the crown jewel of their sciences, reaching a completeness and perfection of methodology that no other branch of their knowledge had attained. 
They expanded the range of geometry to many new kinds of figures, curves, surfaces, and solids; they changed its methodology from trial-and-error to logical deduction; they recognized that geometry studies "eternal forms", or abstractions, of which physical objects are only approximations; and they developed the idea of the "axiomatic method", still in use today. Thales and Pythagoras Thales (635-543 BC) of Miletus (now in southwestern Turkey) was the first to whom deduction in mathematics is attributed. There are five geometric propositions for which he wrote deductive proofs, though his proofs have not survived. Pythagoras (582-496 BC) of Ionia, and later of Italy (then colonized by Greeks), may have been a student of Thales, and traveled to Babylon and Egypt. The theorem that bears his name may not have been his discovery, but he was probably one of the first to give a deductive proof of it. He gathered a group of students around him to study mathematics, music, and philosophy, and together they discovered most of what high school students learn today in their geometry courses. In addition, they made the profound discovery of incommensurable lengths and irrational numbers. Plato Plato (427-347 BC) was a philosopher, highly esteemed by the Greeks. There is a story that he had inscribed above the entrance to his famous school, "Let none ignorant of geometry enter here." However, the story is considered to be untrue. Though he was not a mathematician himself, his views on mathematics had great influence. Mathematicians thus accepted his belief that geometry should use no tools but compass and straightedge – never measuring instruments such as a marked ruler or a protractor, because these were a workman's tools, not worthy of a scholar.
This dictum led to a deep study of possible compass and straightedge constructions, and three classic construction problems: how to use these tools to trisect an angle, to construct a cube twice the volume of a given cube, and to construct a square equal in area to a given circle. The proofs of the impossibility of these constructions, finally achieved in the 19th century, led to important principles regarding the deep structure of the real number system. Aristotle (384-322 BC), Plato's greatest pupil, wrote a treatise on methods of reasoning used in deductive proofs (see Logic) which was not substantially improved upon until the 19th century. Hellenistic geometry Euclid Euclid (c. 325-265 BC), of Alexandria, probably a student at the Academy founded by Plato, wrote a treatise in 13 books (chapters), titled The Elements of Geometry, in which he presented geometry in an ideal axiomatic form, which came to be known as Euclidean geometry. The treatise is not a compendium of all that the Hellenistic mathematicians knew at the time about geometry; Euclid himself wrote eight more advanced books on geometry. We know from other references that Euclid's was not the first elementary geometry textbook, but it was so much superior that the others fell into disuse and were lost. He was brought to the university at Alexandria by Ptolemy I, King of Egypt. The Elements began with definitions of terms, fundamental geometric principles (called axioms or postulates), and general quantitative principles (called common notions) from which all the rest of geometry could be logically deduced. Following are his five axioms, somewhat paraphrased to make the English easier to read. Any two points can be joined by a straight line. Any finite straight line can be extended in a straight line. A circle can be drawn with any center and any radius. All right angles are equal to each other. 
If two straight lines in a plane are crossed by another straight line (called the transversal), and the interior angles between the two lines and the transversal lying on one side of the transversal add up to less than two right angles, then on that side of the transversal, the two lines extended will intersect (also called the parallel postulate). Concepts that are now understood as algebra were expressed geometrically by Euclid, a method referred to as Greek geometric algebra. Archimedes Archimedes (287-212 BC), of Syracuse, Sicily, when it was a Greek city-state, is often considered to be the greatest of the Greek mathematicians, and occasionally even named as one of the three greatest of all time (along with Isaac Newton and Carl Friedrich Gauss). Had he not been a mathematician, he would still be remembered as a great physicist, engineer, and inventor. In his mathematics, he developed methods very similar to the coordinate systems of analytic geometry, and the limiting process of integral calculus. The only element lacking for the creation of these fields was an efficient algebraic notation in which to express his concepts. After Archimedes After Archimedes, Hellenistic mathematics began to decline. There were a few minor stars yet to come, but the golden age of geometry was over. Proclus (410-485), author of Commentary on the First Book of Euclid, was one of the last important players in Hellenistic geometry. He was a competent geometer, but more importantly, he was a superb commentator on the works that preceded him. Much of that work did not survive to modern times, and is known to us only through his commentary. The Roman Republic and Empire that succeeded and absorbed the Greek city-states produced excellent engineers, but no mathematicians of note. The great Library of Alexandria was later burned.
There is a growing consensus among historians that the Library of Alexandria likely suffered from several destructive events, but that the destruction of Alexandria's pagan temples in the late 4th century was probably the most severe and final one. The evidence for that destruction is the most definitive and secure. Caesar's invasion may well have led to the loss of some 40,000-70,000 scrolls in a warehouse adjacent to the port (as Luciano Canfora argues, they were likely copies produced by the Library intended for export), but it is unlikely to have affected the Library or Museum, given that there is ample evidence that both existed later. Civil wars, decreasing investments in maintenance and acquisition of new scrolls and generally declining interest in non-religious pursuits likely contributed to a reduction in the body of material available in the Library, especially in the 4th century. The Serapeum was certainly destroyed by Theophilus in 391, and the Museum and Library may have fallen victim to the same campaign. Classical Indian geometry In the Bakhshali manuscript, there is a handful of geometric problems (including problems about volumes of irregular solids). The Bakhshali manuscript also "employs a decimal place value system with a dot for zero." Aryabhata's Aryabhatiya (499) includes the computation of areas and volumes. Brahmagupta wrote his astronomical work in 628. Chapter 12, containing 66 Sanskrit verses, was divided into two sections: "basic operations" (including cube roots, fractions, ratio and proportion, and barter) and "practical mathematics" (including mixture, mathematical series, plane figures, stacking bricks, sawing of timber, and piling of grain). 
In the latter section, he stated his famous theorem on the diagonals of a cyclic quadrilateral: Brahmagupta's theorem: If a cyclic quadrilateral has diagonals that are perpendicular to each other, then the perpendicular line drawn from the point of intersection of the diagonals to any side of the quadrilateral always bisects the opposite side. Chapter 12 also included a formula for the area of a cyclic quadrilateral (a generalization of Heron's formula), as well as a complete description of rational triangles (i.e. triangles with rational sides and rational areas). Brahmagupta's formula: The area, A, of a cyclic quadrilateral with sides of lengths a, b, c, d, respectively, is given by A = √((s − a)(s − b)(s − c)(s − d)), where s, the semiperimeter, is given by s = (a + b + c + d)/2. Brahmagupta's theorem on rational triangles: the sides of any triangle with rational sides and rational area can be expressed in terms of rational parameters. Chinese geometry The first definitive work (or at least the oldest extant) on geometry in China was the Mo Jing, the Mohist canon of the early philosopher Mozi (470-390 BC). It was compiled years after his death by his followers around the year 330 BC. Although the Mo Jing is the oldest extant book on geometry in China, there is the possibility that even older written material existed. However, due to the infamous Burning of the Books in a political maneuver by the Qin Dynasty ruler Qin Shihuang (r. 221-210 BC), multitudes of written literature created before his time were purged. In addition, the Mo Jing presents geometrical concepts in mathematics that are perhaps too advanced not to have had a previous geometrical base or mathematical background to work upon. The Mo Jing described various aspects of many fields associated with physical science, and provided a small wealth of information on mathematics as well. It provided an 'atomic' definition of the geometric point, stating that a line is separated into parts, and the part which has no remaining parts (i.e.
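Brahmagupta's area formula translates directly into modern notation. The following Python sketch (a modern restatement, not Brahmagupta's own procedure) computes the area of a cyclic quadrilateral from its four side lengths via the semiperimeter.

```python
import math

def brahmagupta_area(a, b, c, d):
    """Area of a cyclic quadrilateral with side lengths a, b, c, d:
    A = sqrt((s-a)(s-b)(s-c)(s-d)), with semiperimeter s = (a+b+c+d)/2."""
    s = (a + b + c + d) / 2
    return math.sqrt((s - a) * (s - b) * (s - c) * (s - d))

# A square of side 2 is cyclic; its area must be 4.
print(brahmagupta_area(2, 2, 2, 2))  # 4.0

# Setting d = 0 degenerates the quadrilateral into a triangle, and the
# formula reduces to Heron's formula: a 3-4-5 triangle has area 6.
print(brahmagupta_area(3, 4, 5, 0))  # 6.0
```

The d = 0 case makes the "generalization of Heron's formula" remark above concrete: dropping one side recovers the triangle formula exactly.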
cannot be divided into smaller parts) and thus forms the extreme end of a line is a point. Much like Euclid's first and third definitions and Plato's 'beginning of a line', the Mo Jing stated that "a point may stand at the end (of a line) or at its beginning like a head-presentation in childbirth. (As to its invisibility) there is nothing similar to it." Similar to the atomists of Democritus, the Mo Jing stated that a point is the smallest unit, and cannot be cut in half, since 'nothing' cannot be halved. It stated that two lines of equal length will always finish at the same place, while providing definitions for the comparison of lengths and for parallels, along with principles of space and bounded space. It also described the fact that planes without the quality of thickness cannot be piled up since they cannot mutually touch. The book provided definitions for circumference, diameter, and radius, along with the definition of volume. The Han Dynasty (202 BC-220 AD) period of China witnessed a new flourishing of mathematics. One of the oldest Chinese mathematical texts to present geometric progressions was the Suàn shù shū of 186 BC, during the Western Han era. The mathematician, inventor, and astronomer Zhang Heng (78-139 AD) used geometrical formulas to solve mathematical problems. Although rough estimates for pi (π) were given in the Zhou Li (compiled in the 2nd century BC), it was Zhang Heng who was the first to make a concerted effort at creating a more accurate formula for pi. Zhang Heng approximated pi as 730/232 (or approx 3.1466), although he used another value of pi in finding a spherical volume, using the square root of 10 (or approx 3.162) instead. Zu Chongzhi (429-500 AD) improved the accuracy of the approximation of pi to between 3.1415926 and 3.1415927, with 355⁄113 (密率, Milü, detailed approximation) and 22⁄7 (约率, Yuelü, rough approximation) being his other notable approximations.
In comparison to later works, the formula for pi given by the French mathematician Franciscus Vieta (1540-1603) fell halfway between Zu's approximations. The Nine Chapters on the Mathematical Art The Nine Chapters on the Mathematical Art, the title of which first appeared by 179 AD on a bronze inscription, was edited and commented on by the 3rd century mathematician Liu Hui from the Kingdom of Cao Wei. This book included many problems where geometry was applied, such as finding surface areas for squares and circles, the volumes of solids in various three-dimensional shapes, and included the use of the Pythagorean theorem. The book provided an illustrated proof of the Pythagorean theorem, contained a written dialogue between the earlier Duke of Zhou and Shang Gao on the properties of the right angle triangle and the Pythagorean theorem, while also referring to the astronomical gnomon, the circle and square, as well as measurements of heights and distances. The editor Liu Hui listed pi as 3.141014 by using a 192-sided polygon, and then calculated pi as 3.14159 using a 3072-sided polygon. This was more accurate than the value of Liu Hui's contemporary Wang Fan, a mathematician and astronomer from Eastern Wu, who rendered pi as 3.1555 by using 142⁄45. Liu Hui also wrote of mathematical surveying to calculate distance measurements of depth, height, width, and surface area. In terms of solid geometry, he figured out that a wedge with rectangular base and both sides sloping could be broken down into a pyramid and a tetrahedral wedge. He also figured out that a wedge with trapezoid base and both sides sloping could be made to give two tetrahedral wedges separated by a pyramid. Furthermore, Liu Hui described Cavalieri's principle on volume, as well as Gaussian elimination. The Nine Chapters listed the following geometrical formulas that were known by the time of the Former Han Dynasty (202 BCE–9 CE).
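The polygon approach behind Liu Hui's figures can be sketched in modern terms with the classical side-doubling recurrence (a modern reconstruction of the idea, not his actual procedure): start from a regular hexagon inscribed in a unit circle, whose side length is exactly 1, and repeatedly replace the side length s by sqrt(2 − sqrt(4 − s²)) while doubling the number of sides.

```python
import math

def pi_by_polygons(doublings):
    """Approximate pi by the half-perimeter of a regular polygon inscribed
    in a unit circle, starting from a hexagon (n = 6, side = 1) and
    doubling the side count `doublings` times."""
    n, s = 6, 1.0
    for _ in range(doublings):
        s = math.sqrt(2 - math.sqrt(4 - s * s))  # chord of the halved arc
        n *= 2
    return n * s / 2  # half-perimeter approximates pi from below

print(pi_by_polygons(5))  # 192-sided polygon: ~3.14145
print(pi_by_polygons(9))  # 3072-sided polygon: ~3.141592
```

Five doublings reach the 192-sided polygon and nine reach the 3072-sided polygon mentioned above, illustrating how each doubling tightens the estimate.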
Areas were given for the square, rectangle, circle, isosceles triangle, rhomboid, trapezoid, double trapezium, segment of a circle, and annulus ('ring' between two concentric circles). Volumes were given for the parallelepiped with two square surfaces, parallelepiped with no square surfaces, pyramid, frustum of a pyramid with square base, frustum of a pyramid with rectangular base of unequal sides, cube, prism, wedge with rectangular base and both sides sloping, wedge with trapezoid base and both sides sloping, tetrahedral wedge, frustum of a wedge of the second type (used for applications in engineering), cylinder, cone with circular base, frustum of a cone, and sphere. Continuing the geometrical legacy of ancient China, there were many later figures to come, including the famed astronomer and mathematician Shen Kuo (1031-1095 CE), Yang Hui (1238-1298) who discovered Pascal's Triangle, Xu Guangqi (1562-1633), and many others. Islamic Golden Age By the beginning of the 9th century, the "Islamic Golden Age" flourished, the establishment of the House of Wisdom in Baghdad marking a separate tradition of science in the medieval Islamic world, building not only on Hellenistic but also on Indian sources. Although the Islamic mathematicians are most famed for their work on algebra, number theory and number systems, they also made considerable contributions to geometry, trigonometry and mathematical astronomy, and were responsible for the development of algebraic geometry. Al-Mahani (born 820) conceived the idea of reducing geometrical problems such as duplicating the cube to problems in algebra. Al-Karaji (born 953) completely freed algebra from geometrical operations and replaced them with the arithmetical type of operations which are at the core of algebra today.
Thābit ibn Qurra (known as Thebit in Latin) (born 836) contributed to a number of areas in mathematics, where he played an important role in preparing the way for such important mathematical discoveries as the extension of the concept of number to (positive) real numbers, integral calculus, theorems in spherical trigonometry, analytic geometry, and non-Euclidean geometry. In astronomy Thabit was one of the first reformers of the Ptolemaic system, and in mechanics he was a founder of statics. An important geometrical aspect of Thabit's work was his book on the composition of ratios. In this book, Thabit deals with arithmetical operations applied to ratios of geometrical quantities. The Greeks had dealt with geometric quantities but had not thought of them in the same way as numbers to which the usual rules of arithmetic could be applied. By introducing arithmetical operations on quantities previously regarded as geometric and non-numerical, Thabit started a trend which led eventually to the generalisation of the number concept. In some respects, Thabit is critical of the ideas of Plato and Aristotle, particularly regarding motion. It would seem that here his ideas are based on an acceptance of using arguments concerning motion in his geometrical arguments. Another important contribution Thabit made to geometry was his generalization of the Pythagorean theorem, which he extended from special right triangles to all triangles in general, along with a general proof. Ibrahim ibn Sinan ibn Thabit (born 908), who introduced a method of integration more general than that of Archimedes, and al-Quhi (born 940) were leading figures in a revival and continuation of Greek higher geometry in the Islamic world. These mathematicians, and in particular Ibn al-Haytham, studied optics and investigated the optical properties of mirrors made from conic sections. Astronomy, time-keeping and geography provided other motivations for geometrical and trigonometrical research. 
For example, Ibrahim ibn Sinan and his grandfather Thabit ibn Qurra both studied curves required in the construction of sundials. Abu'l-Wafa and Abu Nasr Mansur both applied spherical geometry to astronomy. A 2007 paper in the journal Science suggested that girih tiles possessed properties consistent with self-similar fractal quasicrystalline tilings such as the Penrose tilings.

Renaissance

The transmission of the Greek Classics to medieval Europe via the Arabic literature of the 9th- to 10th-century "Islamic Golden Age" began in the 10th century and culminated in the Latin translations of the 12th century. A copy of Ptolemy's Almagest was brought back to Sicily by Henry Aristippus (d. 1162), as a gift from the Emperor to King William I (r. 1154–1166). An anonymous student at Salerno travelled to Sicily and translated the Almagest as well as several works by Euclid from Greek to Latin. Although the Sicilians generally translated directly from the Greek, when Greek texts were not available, they would translate from Arabic. Eugenius of Palermo (d. 1202) translated Ptolemy's Optics into Latin, drawing on his knowledge of all three languages in the task. The rigorous deductive methods of geometry found in Euclid's Elements of Geometry were relearned, and further development of geometry in the styles of both Euclid (Euclidean geometry) and Khayyam (algebraic geometry) continued, resulting in an abundance of new theorems and concepts, many of them very profound and elegant. Advances in the treatment of perspective were made in Renaissance art of the 14th to 15th century which went beyond what had been achieved in antiquity. In Renaissance architecture of the Quattrocento, concepts of architectural order were explored and rules were formulated. A prime example of this is the Basilica di San Lorenzo in Florence by Filippo Brunelleschi (1377–1446). In c.
1413 Filippo Brunelleschi demonstrated the geometrical method of perspective, used today by artists, by painting the outlines of various Florentine buildings onto a mirror. Soon after, nearly every artist in Florence and in Italy used geometrical perspective in their paintings, notably Masolino da Panicale and Donatello. Melozzo da Forlì first used the technique of upward foreshortening (in Rome, Loreto, Forlì and others), and was celebrated for that. Not only was perspective a way of showing depth, it was also a new method of composing a painting. Paintings began to show a single, unified scene, rather than a combination of several. As shown by the quick proliferation of accurate perspective paintings in Florence, Brunelleschi likely understood (with help from his friend the mathematician Toscanelli), but did not publish, the mathematics behind perspective. Decades later, his friend Leon Battista Alberti wrote De pictura (1435/1436), a treatise on proper methods of showing distance in painting based on Euclidean geometry. Alberti was also trained in the science of optics through the school of Padua and under the influence of Biagio Pelacani da Parma, who studied Alhazen's Optics. Piero della Francesca elaborated on Della Pittura in his De Prospectiva Pingendi in the 1470s. Alberti had limited himself to figures on the ground plane and giving an overall basis for perspective. Della Francesca fleshed it out, explicitly covering solids in any area of the picture plane. Della Francesca also started the now common practice of using illustrated figures to explain the mathematical concepts, making his treatise easier to understand than Alberti's. Della Francesca was also the first to accurately draw the Platonic solids as they would appear in perspective. Perspective remained, for a while, the domain of Florence.
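The geometrical method of perspective described above amounts to central projection onto a picture plane. A minimal sketch of the idea (illustrative only; the pinhole-style model and the `project` function below are modern assumptions, not taken from the source):

```python
# One-point perspective as central projection: a 3D point seen from an
# eye at the origin is mapped onto a picture plane at distance d.
# Illustrative model only; names and parameters are hypothetical.

def project(point, d=1.0):
    """Project a 3D point (x, y, z), z > 0 in front of the eye,
    onto the picture plane z = d."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point must lie in front of the eye (z > 0)")
    return (d * x / z, d * y / z)

# Parallel edges receding in depth converge toward a vanishing point:
# the same 2-unit-wide span projects narrower the farther away it is.
near = [project((x, -1.0, 2.0)) for x in (-1.0, 1.0)]  # span at depth 2
far = [project((x, -1.0, 8.0)) for x in (-1.0, 1.0)]   # span at depth 8

near_width = near[1][0] - near[0][0]  # 1.0 on the canvas
far_width = far[1][0] - far[0][0]     # 0.25 on the canvas
```

Dividing by depth z is what makes equally spaced tiles shrink toward the horizon, the effect Brunelleschi's mirror demonstration made visible.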
Jan van Eyck, among others, was unable to create a consistent structure for the converging lines in paintings, as in London's The Arnolfini Portrait, because he was unaware of the theoretical breakthrough just then occurring in Italy. However, he achieved very subtle effects by manipulations of scale in his interiors. Gradually, and partly through the movement of academies of the arts, the Italian techniques became part of the training of artists across Europe, and later other parts of the world. The culmination of these Renaissance traditions finds its ultimate synthesis in the research of the architect, geometer, and optician Girard Desargues on perspective, optics and projective geometry. The Vitruvian Man by Leonardo da Vinci (c. 1490) depicts a man in two superimposed positions with his arms and legs apart and inscribed in a circle and square. The drawing is based on the correlations of ideal human proportions with geometry described by the ancient Roman architect Vitruvius in Book III of his treatise De Architectura.

Modern geometry

The 17th century

In the early 17th century, there were two important developments in geometry. The first and most important was the creation of analytic geometry, or geometry with coordinates and equations, by René Descartes (1596–1650) and Pierre de Fermat (1601–1665). This was a necessary precursor to the development of calculus and a precise quantitative science of physics. The second geometric development of this period was the systematic study of projective geometry by Girard Desargues (1591–1661). Projective geometry is the study of geometry without measurement, just the study of how points align with each other. There had been some early work in this area by Hellenistic geometers, notably Pappus (c. 340). The greatest flowering of the field occurred with Jean-Victor Poncelet (1788–1867).
In the late 17th century, calculus was developed independently and almost simultaneously by Isaac Newton (1642–1727) and Gottfried Wilhelm Leibniz (1646–1716). This was the beginning of a new field of mathematics now called analysis. Though not itself a branch of geometry, it is applicable to geometry, and it solved two families of problems that had long been almost intractable: finding tangent lines to odd curves, and finding areas enclosed by those curves. The methods of calculus reduced these problems mostly to straightforward matters of computation.

The 18th and 19th centuries

Non-Euclidean geometry

The very old problem of proving Euclid's Fifth Postulate, the "Parallel Postulate", from his first four postulates had never been forgotten. Beginning not long after Euclid, many attempted demonstrations were given, but all were later found to be faulty, through allowing into the reasoning some principle which itself had not been proved from the first four postulates. Though Omar Khayyám was also unsuccessful in proving the parallel postulate, his criticisms of Euclid's theories of parallels and his proof of properties of figures in non-Euclidean geometries contributed to the eventual development of non-Euclidean geometry. By 1700 a great deal had been discovered about what can be proved from the first four, and what the pitfalls were in attempting to prove the fifth. Saccheri, Lambert, and Legendre each did excellent work on the problem in the 18th century, but still fell short of success. In the early 19th century, Gauss, János Bolyai, and Lobachevsky, each independently, took a different approach. Beginning to suspect that it was impossible to prove the Parallel Postulate, they set out to develop a self-consistent geometry in which that postulate was false. In this they were successful, thus creating the first non-Euclidean geometry.
By 1854, Bernhard Riemann, a student of Gauss, had applied methods of calculus in a ground-breaking study of the intrinsic (self-contained) geometry of all smooth surfaces, and thereby found a different non-Euclidean geometry. This work of Riemann later became fundamental for Einstein's theory of relativity. It remained to be proved mathematically that the non-Euclidean geometry was just as self-consistent as Euclidean geometry, and this was first accomplished by Beltrami in 1868. With this, non-Euclidean geometry was established on an equal mathematical footing with Euclidean geometry. While it was now known that different geometric theories were mathematically possible, the question remained, "Which one of these theories is correct for our physical space?" The mathematical work revealed that this question must be answered by physical experimentation, not mathematical reasoning, and uncovered the reason why the experimentation must involve immense (interstellar, not earth-bound) distances. With the development of relativity theory in physics, this question became vastly more complicated.

Introduction of mathematical rigor

All the work related to the Parallel Postulate revealed that it was quite difficult for a geometer to separate his logical reasoning from his intuitive understanding of physical space, and, moreover, revealed the critical importance of doing so. Careful examination had uncovered some logical inadequacies in Euclid's reasoning, and some unstated geometric principles to which Euclid sometimes appealed. This critique paralleled the crisis occurring in calculus and analysis regarding the meaning of infinite processes such as convergence and continuity. In geometry, there was a clear need for a new set of axioms, which would be complete, and which in no way relied on pictures we draw or on our intuition of space.
Such axioms, now known as Hilbert's axioms, were given by David Hilbert in 1899 in his book Grundlagen der Geometrie (Foundations of Geometry). Some other complete sets of axioms had been given a few years earlier, but did not match Hilbert's in economy, elegance, and similarity to Euclid's axioms.

Analysis situs, or topology

During the 18th and 19th centuries, it became apparent that certain progressions of mathematical reasoning recurred when similar ideas were studied on the number line, in two dimensions, and in three dimensions. Thus the general concept of a metric space was eventually created (formalized by Maurice Fréchet in 1906) so that the reasoning could be done in more generality, and then applied to special cases. This method of studying calculus- and analysis-related concepts came to be known as analysis situs, and later as topology. The important topics in this field were properties of more general figures, such as connectedness and boundaries, rather than properties like straightness, and precise equality of length and angle measurements, which had been the focus of Euclidean and non-Euclidean geometry. Topology soon became a separate field of major importance, rather than a sub-field of geometry or analysis.

The 20th century

Developments in algebraic geometry included the study of curves and surfaces over finite fields, as demonstrated by the works of, among others, André Weil, Alexander Grothendieck, and Jean-Pierre Serre, as well as over the real or complex numbers. Finite geometry itself, the study of spaces with only finitely many points, found applications in coding theory and cryptography. With the advent of the computer, new disciplines such as computational geometry or digital geometry deal with geometric algorithms, discrete representations of geometric data, and so forth.

Timeline

See also

Flatland, a book by "A.
Square" about two- and three-dimensional space, to understand the concept of four dimensions
History of mathematics
History of measurement
Important publications in geometry
Interactive geometry software
List of geometry topics
Modern triangle geometry

Notes

References

Needham, Joseph (1986), Science and Civilization in China: Volume 3, Mathematics and the Sciences of the Heavens and the Earth, Taipei: Caves Books Ltd.

External links

Islamic Geometry
Geometry in the 19th Century at the Stanford Encyclopedia of Philosophy
Arabic mathematics: forgotten brilliance?
https://en.wikipedia.org/wiki/George%20H.%20W.%20Bush
George H. W. Bush
George Herbert Walker Bush (June 12, 1924 – November 30, 2018) was an American politician, diplomat, and businessman who served as the 41st president of the United States from 1989 to 1993. A member of the Republican Party, Bush also served as the 43rd vice president from 1981 to 1989 under Ronald Reagan, in the U.S. House of Representatives, as U.S. Ambassador to the United Nations, and as Director of Central Intelligence. Bush was raised in Greenwich, Connecticut, and attended Phillips Academy before serving in the United States Navy Reserve during World War II. After the war, he graduated from Yale and moved to West Texas, where he established a successful oil company. After an unsuccessful run for the United States Senate, he won election to Texas's 7th congressional district in 1966. President Richard Nixon appointed Bush to the position of Ambassador to the United Nations in 1971 and to the position of chairman of the Republican National Committee in 1973. In 1974, President Gerald Ford appointed him as the Chief of the Liaison Office to the People's Republic of China, and in 1976 Bush became the Director of Central Intelligence. Bush ran for president in 1980, but was defeated in the Republican presidential primaries by Ronald Reagan, who then selected Bush as his vice presidential running mate. In the 1988 presidential election, Bush defeated Democrat Michael Dukakis, becoming the first incumbent vice president to be elected president since Martin Van Buren in 1836. Foreign policy drove the Bush presidency, as he navigated the final years of the Cold War and played a key role in the reunification of Germany. Bush presided over the invasion of Panama and the Gulf War, ending the Iraqi occupation of Kuwait in the latter conflict. Though the agreement was not ratified until after he left office, Bush negotiated and signed the North American Free Trade Agreement (NAFTA), which created a trade bloc consisting of the United States, Canada, and Mexico.
Domestically, Bush reneged on a 1988 campaign promise by enacting legislation to raise taxes with the justification of reducing the budget deficit. He also championed and signed three pieces of bipartisan legislation: the Americans with Disabilities Act of 1990, the Immigration Act of 1990, and the Clean Air Act Amendments of 1990. He also successfully appointed David Souter and Clarence Thomas to the Supreme Court. Bush lost the 1992 presidential election to Democrat Bill Clinton following an economic recession, his turnaround on his tax promise, and the decreased emphasis on foreign policy in a post–Cold War political climate. After leaving office in 1993, Bush was active in humanitarian activities, often working alongside Bill Clinton, his former opponent. With the victory of his son, George W. Bush, in the 2000 presidential election, the two became the second father–son pair to serve as the nation's president, following John Adams and John Quincy Adams. Another son, Jeb Bush, unsuccessfully sought the Republican presidential nomination in the 2016 Republican primaries. Historians generally rank Bush as an above-average president.

Early life and education (1924–1948)

George Herbert Walker Bush was born in Milton, Massachusetts, on June 12, 1924. He was the second son of Prescott Bush and Dorothy (Walker) Bush. His paternal grandfather, Samuel P. Bush, worked as an executive for a railroad parts company in Columbus, Ohio, while his maternal grandfather and namesake, George Herbert Walker, led Wall Street investment bank W. A. Harriman & Co. Walker was known as "Pop", and young Bush was called "Poppy" as a tribute to him. The Bush family moved to Greenwich, Connecticut in 1925, and Prescott took a position with W. A. Harriman & Co. (which later merged into Brown Brothers Harriman & Co.) the following year. Bush spent most of his childhood in Greenwich, at the family vacation home in Kennebunkport, Maine, or at his maternal grandparents' plantation in South Carolina.
Because of the family's wealth, Bush was largely unaffected by the Great Depression. He attended Greenwich Country Day School from 1929 to 1937 and Phillips Academy, an elite private academy in Massachusetts, from 1937 to 1942. While at Phillips Academy, he served as president of the senior class, secretary of the student council, president of the community fund-raising group, a member of the editorial board of the school newspaper, and captain of the varsity baseball and soccer teams.

World War II

On his 18th birthday, immediately after graduating from Phillips Academy, he enlisted in the United States Navy as a naval aviator. After a period of training, he was commissioned as an ensign in the Naval Reserve at Naval Air Station Corpus Christi on June 9, 1943, becoming one of the youngest aviators in the Navy. Beginning in 1944, Bush served in the Pacific theater, where he flew a Grumman TBF Avenger, a torpedo bomber capable of taking off from aircraft carriers. His squadron was assigned to the USS San Jacinto as a member of Air Group 51, where his lanky physique earned him the nickname "Skin". Bush flew his first combat mission in May 1944, bombing Japanese-held Wake Island, and was promoted to lieutenant (junior grade) on August 1, 1944. During an attack on a Japanese installation in Chichijima, Bush's aircraft successfully attacked several targets, but was downed by enemy fire. Though both of Bush's fellow crew members died, Bush successfully bailed out from the aircraft and was rescued by the submarine USS Finback. Several of the aviators shot down during the attack were captured and executed, and their livers were eaten by their captors. Bush's survival after such a close brush with death shaped him profoundly, leading him to ask, "Why had I been spared and what did God have for me?" He was later awarded the Distinguished Flying Cross for his role in the mission. Bush returned to San Jacinto in November 1944, participating in operations in the Philippines.
In early 1945, he was assigned to a new combat squadron, VT-153, where he trained to take part in an invasion of mainland Japan. On September 2, 1945, before any invasion took place, Japan formally surrendered following the atomic bombings of Hiroshima and Nagasaki. Bush was released from active duty that same month, but was not formally discharged from the Navy until October 1955, at which point he had reached the rank of lieutenant. By the end of his period of active service, Bush had flown 58 missions, completed 128 carrier landings, and recorded 1228 hours of flight time.

Marriage

Bush met Barbara Pierce at a Christmas dance in Greenwich in December 1941, and, after a period of courtship, they became engaged in December 1943. While Bush was on leave from the Navy, they married in Rye, New York, on January 6, 1945. The Bushes enjoyed a strong marriage, and Barbara would later be a popular First Lady, seen by many as "a kind of national grandmother". They had six children: George W. (b. 1946), Robin (1949–1953), Jeb (b. 1953), Neil (b. 1955), Marvin (b. 1956), and Doro (b. 1959). Their oldest daughter, Robin, died of leukemia in 1953.

College years

Bush enrolled at Yale College, where he took part in an accelerated program that enabled him to graduate in two and a half years rather than the usual four. He was a member of the Delta Kappa Epsilon fraternity and was elected its president. He also captained the Yale baseball team and played in the first two College World Series as a left-handed first baseman. Like his father, he was a member of the Yale cheerleading squad and was initiated into the Skull and Bones secret society. He graduated Phi Beta Kappa in 1948 with a Bachelor of Arts degree, majoring in economics and minoring in sociology.

Business career (1948–1963)

After graduating from Yale, Bush moved his young family to West Texas.
Biographer Jon Meacham writes that Bush's relocation to Texas allowed him to move out of the "daily shadow of his Wall Street father and Grandfather Walker, two dominant figures in the financial world", but would still allow Bush to "call on their connections if he needed to raise capital." His first position in Texas was as an oil field equipment salesman for Dresser Industries, which was led by family friend Neil Mallon. While working for Dresser, Bush lived in various places with his family: Odessa, Texas; Ventura, Bakersfield and Compton, California; and Midland, Texas. In 1952, he volunteered for the successful presidential campaign of Republican candidate Dwight D. Eisenhower. That same year, his father won election to represent Connecticut in the United States Senate as a member of the Republican Party. With support from Mallon and Bush's uncle, George Herbert Walker Jr., Bush and John Overbey launched the Bush-Overbey Oil Development Company in 1951. In 1953 he co-founded the Zapata Petroleum Corporation, an oil company that drilled in the Permian Basin in Texas. In 1954, he was named president of the Zapata Offshore Company, a subsidiary which specialized in offshore drilling. Shortly after the subsidiary became independent in 1959, Bush moved the company and his family from Midland to Houston. There, he befriended James Baker, a prominent attorney who later became an important political ally. Bush remained involved with Zapata until the mid-1960s, when he sold his stock in the company for approximately $1 million. In 1988, The Nation published an article alleging that Bush worked as an operative of the Central Intelligence Agency (CIA) during the 1960s; Bush denied this claim.

Early political career (1963–1971)

Entry into politics

By the early 1960s, Bush was widely regarded as an appealing political candidate, and some leading Democrats attempted to convince Bush to become a Democrat.
He declined to leave the Republican Party, later citing his belief that the national Democratic Party favored "big, centralized government". The Democratic Party had historically dominated Texas, but Republicans scored their first major victory in the state with John G. Tower's victory in a 1961 special election to the United States Senate. Motivated by Tower's victory, and hoping to prevent the far-right John Birch Society from coming to power, Bush ran for the chairmanship of the Harris County Republican Party, winning election in February 1963. Like most other Texas Republicans, Bush supported conservative Senator Barry Goldwater over the more centrist Nelson Rockefeller in the 1964 Republican Party presidential primaries. In 1964, Bush sought to unseat liberal Democrat Ralph W. Yarborough in Texas's U.S. Senate election. Bolstered by superior fundraising, Bush won the Republican primary by defeating former gubernatorial nominee Jack Cox in a run-off election. In the general election, Bush attacked Yarborough's vote for the Civil Rights Act of 1964, which banned racial and gender discrimination in public institutions and in many privately owned businesses. Bush argued that the act unconstitutionally expanded the powers of the federal government, but he was privately uncomfortable with the racial politics of opposing the act. He lost the election 56 percent to 44 percent, though he did run well ahead of Barry Goldwater, the Republican presidential nominee. Despite the loss, the New York Times reported that Bush was "rated by political friend and foe alike as the Republicans' best prospect in Texas because of his attractive personal qualities and the strong campaign he put up for the Senate".

U.S. House of Representatives

In 1966, Bush ran for the United States House of Representatives in Texas's 7th congressional district, a newly redistricted seat in the Greater Houston area.
Initial polling showed him trailing his Democratic opponent, Harris County District Attorney Frank Briscoe, but he ultimately won the race with 57 percent of the vote. In an effort to woo potential candidates in the South and Southwest, House Republicans secured Bush an appointment to the powerful United States House Committee on Ways and Means, making Bush the first freshman to serve on the committee since 1904. His voting record in the House was generally conservative. He supported the Nixon administration's Vietnam policies, but broke with Republicans on the issue of birth control, which he supported. He also voted for the Civil Rights Act of 1968, although it was generally unpopular in his district. In 1968, Bush joined several other Republicans in issuing the party's response to the State of the Union address; Bush's part of the address focused on a call for fiscal responsibility. Though most other Texas Republicans supported Ronald Reagan in the 1968 Republican Party presidential primaries, Bush endorsed Richard Nixon, who went on to win the party's nomination. Nixon considered selecting Bush as his running mate in the 1968 presidential election, but he ultimately chose Spiro Agnew instead. Bush won re-election to the House unopposed, while Nixon defeated Hubert Humphrey in the presidential election. In 1970, with President Nixon's support, Bush gave up his seat in the House to run for the Senate against Yarborough. Bush easily won the Republican primary, but Yarborough was defeated by the more conservative Lloyd Bentsen in the Democratic primary. Ultimately, Bentsen defeated Bush, taking 53.5 percent of the vote.

Nixon and Ford administrations (1971–1977)

Ambassador to the United Nations

After the 1970 Senate election, Bush accepted a position as a senior adviser to the president, but he convinced Nixon to instead appoint him as the U.S. Ambassador to the United Nations.
The position represented Bush's first foray into foreign policy, as well as his first major experiences with the Soviet Union and China, the two major U.S. rivals in the Cold War. During Bush's tenure, the Nixon administration pursued a policy of détente, seeking to ease tensions with both the Soviet Union and China. Bush's ambassadorship was marked by a defeat on the China question, as the United Nations General Assembly voted, in Resolution 2758, to expel the Republic of China and replace it with the People's Republic of China in October 1971. In the 1971 crisis in Pakistan, Bush supported an Indian motion at the UN General Assembly to condemn the Pakistani government of Yahya Khan for waging genocide in East Pakistan (modern Bangladesh), referring to the "tradition which we have supported that the human rights question transcended domestic jurisdiction and should be freely debated". Bush's support for India at the UN put him into conflict with Nixon, who was supporting Pakistan, partly because Yahya Khan was a useful intermediary in his attempts to reach out to China and partly because the president was fond of Yahya Khan.

Chairman of the Republican National Committee

After Nixon won a landslide victory in the 1972 presidential election, he appointed Bush as chair of the Republican National Committee (RNC). In that position, he was charged with fundraising, candidate recruitment, and making appearances on behalf of the party in the media. When Agnew was being investigated for corruption, Bush assisted, at the request of Nixon and Agnew, in pressuring John Glenn Beall Jr., the U.S. Senator from Maryland, to rein in his brother, George Beall, the U.S. Attorney in Maryland who was supervising the investigation into Agnew. Attorney Beall ignored the pressure.
During Bush's tenure at the RNC, the Watergate scandal emerged into public view; the scandal originated from the June 1972 break-in of the Democratic National Committee, but also involved later efforts to cover up the break-in by Nixon and other members of the White House. Bush initially defended Nixon steadfastly, but as Nixon's complicity became clear he focused more on defending the Republican Party. Following the resignation of Vice President Agnew in 1973 for a scandal unrelated to Watergate, Bush was considered for the position of vice president, but the appointment instead went to Gerald Ford. After the public release of an audio recording that confirmed that Nixon had plotted to use the CIA to cover up the Watergate break-in, Bush joined other party leaders in urging Nixon to resign. When Nixon resigned on August 9, 1974, Bush noted in his diary that "There was an aura of sadness, like somebody died... The [resignation] speech was vintage Nixon—a kick or two at the press—enormous strains. One couldn't help but look at the family and the whole thing and think of his accomplishments and then think of the shame... [President Gerald Ford's swearing-in offered] indeed a new spirit, a new lift."

Head of U.S. Liaison Office in China

Upon his accession to the presidency, Ford strongly considered Bush, Donald Rumsfeld, and Nelson Rockefeller for the vacant position of vice president. Ford ultimately chose Rockefeller, partly because of the publication of a news report claiming that Bush's 1970 campaign had benefited from a secret fund set up by Nixon; Bush was later cleared of any suspicion by a special prosecutor. Bush accepted appointment as Chief of the U.S. Liaison Office in the People's Republic of China, making him the de facto ambassador to China.
According to biographer Jon Meacham, Bush's time in China convinced him that American engagement abroad was needed to ensure global stability, and that the United States "needed to be visible but not pushy, muscular but not domineering."

Director of Central Intelligence

In January 1976, Ford brought Bush back to Washington to become the Director of Central Intelligence (DCI), placing him in charge of the CIA. In the aftermath of the Watergate scandal and the Vietnam War, the CIA's reputation had been damaged for its role in various covert operations, and Bush was tasked with restoring the agency's morale and public reputation. During Bush's year in charge of the CIA, the U.S. national security apparatus actively supported Operation Condor operations and right-wing military dictatorships in Latin America. Meanwhile, Ford decided to drop Rockefeller from the ticket for the 1976 presidential election; he considered Bush as his running mate, but ultimately chose Bob Dole. In his capacity as DCI, Bush gave national security briefings to Jimmy Carter both as a presidential candidate and as president-elect.

1980 presidential election

Bush's tenure at the CIA ended after Carter narrowly defeated Ford in the 1976 presidential election. Out of public office for the first time since the 1960s, Bush became chairman of the executive committee of the First International Bank in Houston. He also spent a year as a part-time professor of Administrative Science at Rice University's Jones School of Business, continued his membership in the Council on Foreign Relations, and joined the Trilateral Commission. Meanwhile, he began to lay the groundwork for his candidacy in the 1980 Republican Party presidential primaries. In the 1980 Republican primary campaign, Bush faced Ronald Reagan, who was widely regarded as the front-runner, as well as other contenders like Senator Bob Dole, Senator Howard Baker, Texas Governor John Connally, Congressman Phil Crane, and Congressman John B.
Anderson. Bush's campaign cast him as a youthful, "thinking man's candidate" who would emulate the pragmatic conservatism of President Eisenhower. In the midst of the Soviet–Afghan War, which brought an end to a period of détente, and the Iran hostage crisis, in which 52 Americans were taken hostage, the campaign highlighted Bush's foreign policy experience. At the outset of the race, Bush focused heavily on winning the January 21 Iowa caucuses, making 31 visits to the state. He won a close victory in Iowa with 31.5% to Reagan's 29.4%. After the win, Bush stated that his campaign was full of momentum, or "the Big Mo", and Reagan reorganized his campaign. Partly in response to the Bush campaign's frequent questioning of Reagan's age (Reagan turned 69 in 1980), the Reagan campaign stepped up attacks on Bush, painting him as an elitist who was not truly committed to conservatism. Prior to the New Hampshire primary, Bush and Reagan agreed to a two-person debate, organized by The Nashua Telegraph but paid for by the Reagan campaign. Days before the debate, Reagan announced that he would invite four other candidates to the debate; Bush, who had hoped that the one-on-one debate would allow him to emerge as the main alternative to Reagan in the primaries, refused to debate the other candidates. All six candidates took the stage, but Bush refused to speak in the presence of the other candidates. Ultimately, the other four candidates left the stage and the debate continued, but Bush's refusal to debate anyone other than Reagan badly damaged his campaign in New Hampshire. He ended up decisively losing New Hampshire's primary to Reagan, winning just 23 percent of the vote. Bush revitalized his campaign with a victory in Massachusetts, but lost the next several primaries. As Reagan built up a commanding delegate lead, Bush refused to end his campaign, but the other candidates dropped out of the race. 
Criticizing his more conservative rival's policy proposals, Bush famously labeled Reagan's supply side-influenced plans for massive tax cuts as "voodoo economics". Though he favored lower taxes, Bush feared that dramatic reductions in taxation would lead to deficits and, in turn, cause inflation. After Reagan clinched a majority of delegates in late May, Bush reluctantly dropped out of the race. At the 1980 Republican National Convention, Reagan made the last-minute decision to select Bush as his vice presidential nominee after negotiations with Ford regarding a Reagan–Ford ticket collapsed. Though Reagan had resented many of the Bush campaign's attacks during the primary campaign, and several conservative leaders had actively opposed Bush's nomination, Reagan ultimately decided that Bush's popularity with moderate Republicans made him the best and safest pick. Bush, who had believed his political career might be over following the primaries, eagerly accepted the position and threw himself into campaigning for the Reagan–Bush ticket. The 1980 general election campaign between Reagan and Carter was conducted amid a multitude of domestic concerns and the ongoing Iran hostage crisis, and Reagan sought to focus the race on Carter's handling of the economy. Though the race was widely regarded as a close contest for most of the campaign, Reagan ultimately won over the large majority of undecided voters. Reagan took 50.7 percent of the popular vote and 489 of the 538 electoral votes, while Carter won 41% of the popular vote and John Anderson, running as an independent candidate, won 6.6% of the popular vote. Vice Presidency (1981–1989) As vice president, Bush generally maintained a low profile, recognizing the constitutional limits of the office; he avoided decision-making or criticizing Reagan in any way. This approach helped him earn Reagan's trust, easing tensions left over from their earlier rivalry. 
Bush also generally enjoyed a good relationship with Reagan staffers, including his close friend Jim Baker, who served as Reagan's initial chief of staff. His understanding of the vice presidency was heavily influenced by Vice President Walter Mondale, who enjoyed a strong relationship with President Carter in part because of his ability to avoid confrontations with senior staff and Cabinet members, and by Vice President Nelson Rockefeller's difficult relationship with some members of the White House staff during the Ford administration. The Bushes attended a large number of public and ceremonial events in their positions, including many state funerals, which became a common joke for comedians. As the President of the Senate, Bush also stayed in contact with members of Congress and kept the president informed on occurrences on Capitol Hill. First term On March 30, 1981, while Bush was in Texas, Reagan was shot and seriously wounded by John Hinckley Jr. Bush immediately flew back to Washington D.C.; when his plane landed, his aides advised him to proceed directly to the White House by helicopter in order to show that the government was still functioning. Bush rejected the idea, as he feared that such a dramatic scene risked giving the impression that he sought to usurp Reagan's powers and prerogatives. During Reagan's short period of incapacity, Bush presided over Cabinet meetings, met with congressional leaders and foreign leaders, and briefed reporters, but he consistently rejected the possibility of invoking the Twenty-fifth Amendment. Bush's handling of the attempted assassination and its aftermath made a positive impression on Reagan, who recovered and returned to work within two weeks of the shooting. From then on, the two men would have regular Thursday lunches in the Oval Office. Bush was assigned by Reagan to chair two special task forces, one on deregulation and one on international drug smuggling. 
Both were popular issues with conservatives, and Bush, largely a moderate, began courting them through his work. The deregulation task force reviewed hundreds of rules, making specific recommendations on which ones to amend or revise, in order to curb the size of the federal government. The Reagan administration's deregulation push had a strong impact on broadcasting, finance, resource extraction, and other economic activities, and the administration eliminated numerous government positions. Bush also oversaw the administration's national security crisis management organization, which had traditionally been the responsibility of the National Security Advisor. In 1983, Bush toured Western Europe as part of the Reagan administration's ultimately successful efforts to convince skeptical NATO allies to support the deployment of Pershing II missiles. Reagan's approval ratings fell after his first year in office, but they bounced back when the United States began to emerge from recession in 1983. Former Vice President Walter Mondale was nominated by the Democratic Party in the 1984 presidential election. Down in the polls, Mondale selected Congresswoman Geraldine Ferraro as his running mate in hopes of galvanizing support for his campaign, thus making Ferraro the first female major party vice presidential nominee in U.S. history. She and Bush squared off in a single televised vice presidential debate. Public opinion polling consistently showed a Reagan lead in the 1984 campaign, and Mondale was unable to shake up the race. In the end, Reagan won re-election, winning 49 of 50 states and receiving 59% of the popular vote to Mondale's 41%. Second term Mikhail Gorbachev came to power in the Soviet Union in 1985. Rejecting the ideological rigidity of his three elderly, ailing predecessors, Gorbachev insisted on urgently needed economic and political reforms called "glasnost" (openness) and "perestroika" (restructuring). 
At the 1987 Washington Summit, Gorbachev and Reagan signed the Intermediate-Range Nuclear Forces Treaty, which committed both signatories to the total abolition of their respective short-range and medium-range missile stockpiles. The treaty marked the beginning of a new era of trade, openness, and cooperation between the two powers. President Reagan and Secretary of State George Shultz took the lead in these negotiations, but Bush sat in on many meetings. Bush did not agree with many of the Reagan policies, but he did tell Gorbachev that he would seek to continue improving relations if he succeeded Reagan. On July 13, 1985, Bush became the first vice president to serve as acting president when Reagan underwent surgery to remove polyps from his colon; Bush served as the acting president for approximately eight hours. In 1986, the Reagan administration was shaken by a scandal when it was revealed that administration officials had secretly arranged weapons sales to Iran during the Iran–Iraq War. The officials had used the proceeds to fund the Contra rebels in their fight against the leftist Sandinista government in Nicaragua. Congressional Democrats had passed a law stipulating that appropriated funds could not be used to help the Contras; instead, the administration used non-appropriated funds from the sales. When news of the affair broke to the media, Bush stated that he had been "out of the loop" and unaware of the diversion of funds. Biographer Jon Meacham writes that "no evidence was ever produced proving Bush was aware of the diversion to the contras," but he criticizes Bush's "out of the loop" characterization, writing that the "record is clear that Bush was aware that the United States, in contravention of its own stated policy, was trading arms for hostages". The Iran–Contra scandal, as it became known, did serious damage to the Reagan presidency, raising questions about Reagan's competency. 
Reagan established the Tower Commission to investigate the scandal, and, at his request, a panel of federal judges appointed Lawrence Walsh as a special prosecutor charged with investigating the Iran–Contra scandal. The investigations continued after Reagan left office and, though Bush was never charged with a crime, the Iran–Contra scandal would remain a political liability for him. On July 3, 1988, the guided missile cruiser USS Vincennes accidentally shot down Iran Air Flight 655, killing 290 people. Bush, then vice president, defended the U.S. at the UN by arguing that the attack had been a wartime incident and that the crew of Vincennes had acted appropriately to the situation. 1988 presidential election Bush began planning for a presidential run after the 1984 election, and he officially entered the 1988 Republican Party presidential primaries in October 1987. He put together a campaign led by Reagan staffer Lee Atwater that also included his son George W. Bush and media consultant Roger Ailes. Though he had moved to the right during his time as vice president, endorsing a Human Life Amendment and repudiating his earlier comments on "voodoo economics," Bush still faced opposition from many conservatives in the Republican Party. His major rivals for the Republican nomination were Senate Minority Leader Bob Dole of Kansas, Congressman Jack Kemp of New York, and Christian televangelist Pat Robertson. Reagan did not publicly endorse any candidate, but he privately expressed support for Bush. Though considered the early front-runner for the nomination, Bush came in third in the Iowa caucus, behind Dole and Robertson. Much as Reagan had done in 1980, Bush reorganized his staff and concentrated on the New Hampshire primary. With help from Governor John H. Sununu and an effective campaign attacking Dole for raising taxes, Bush overcame an initial polling deficit and won New Hampshire with 39 percent of the vote. 
After Bush won South Carolina and 16 of the 17 states holding a primary on Super Tuesday, his competitors dropped out of the race. Bush, occasionally criticized for his lack of eloquence when compared to Reagan, delivered a well-received speech at the Republican convention. Known as the "thousand points of light" speech, it described Bush's vision of America: he endorsed the Pledge of Allegiance, prayer in schools, capital punishment, and gun rights. Bush also pledged that he would not raise taxes, stating: "Congress will push me to raise taxes, and I'll say no, and they'll push, and I'll say no, and they'll push again. And all I can say to them is: read my lips. No new taxes." Bush selected little-known Senator Dan Quayle of Indiana as his running mate. Though Quayle had compiled an unremarkable record in Congress, he was popular among many conservatives, and the campaign hoped that Quayle's youth would appeal to younger voters. Meanwhile, the Democratic Party nominated Governor Michael Dukakis, who was known for presiding over an economic turnaround in Massachusetts. Leading in the general election polls against Bush, Dukakis ran an ineffective, low-risk campaign. The Bush campaign attacked Dukakis as an unpatriotic liberal extremist and seized on the Willie Horton case, in which a convicted felon from Massachusetts raped a woman while on a prison furlough, a program Dukakis supported as governor. The Bush campaign charged that Dukakis presided over a "revolving door" that allowed dangerous convicted felons to leave prison. Dukakis damaged his own campaign with a widely mocked ride in an M1 Abrams tank and a poor performance at the second presidential debate. Bush also attacked Dukakis for opposing a law that would require all students to recite the Pledge of Allegiance. 
The election is widely considered to have had a high level of negative campaigning, though political scientist John Geer has argued that the share of negative ads was in line with previous presidential elections. Bush defeated Dukakis by a margin of 426 to 111 in the Electoral College, and he took 53.4 percent of the national popular vote. Bush ran well in all the major regions of the country, but especially in the South. He became the fourth sitting vice president to be elected president and the first to do so since Martin Van Buren in 1836, and the first person to succeed a president from his own party via election since Herbert Hoover in 1929. In the concurrent congressional elections, Democrats retained control of both houses of Congress. Presidency (1989–1993) Bush was inaugurated on January 20, 1989, succeeding Ronald Reagan. Bush's first major appointment was that of James Baker as Secretary of State. Leadership of the Department of Defense went to Dick Cheney, who had previously served as Gerald Ford's chief of staff and would later serve as vice president under his son George W. Bush. Jack Kemp joined the administration as Secretary of Housing and Urban Development, while Elizabeth Dole, the wife of Bob Dole and a former Secretary of Transportation, became the Secretary of Labor under Bush. Bush retained several Reagan officials, including Secretary of the Treasury Nicholas F. Brady, Attorney General Dick Thornburgh, and Secretary of Education Lauro Cavazos. New Hampshire Governor John Sununu, a strong supporter of Bush during the 1988 campaign, became chief of staff. Brent Scowcroft was appointed as the National Security Advisor, a role he had also held under Ford. Foreign affairs End of the Cold War During the first year of his tenure, Bush paused Reagan's détente policy toward the USSR. 
Bush and his advisers were initially divided on Gorbachev; some administration officials saw him as a democratic reformer, but others suspected him of trying to make the minimum changes necessary to restore the Soviet Union to a competitive position with the United States. In 1989, all the Communist governments collapsed in Eastern Europe. Gorbachev declined to send in the Soviet military, effectively abandoning the Brezhnev Doctrine. The U.S. was not directly involved in these upheavals, but the Bush administration avoided gloating over the demise of the Eastern Bloc to avoid undermining further democratic reforms. Bush and Gorbachev met at the Malta Summit in December 1989. Though many on the right remained wary of Gorbachev, Bush came away with the belief that Gorbachev would negotiate in good faith. For the remainder of his term, Bush sought cooperative relations with Gorbachev, believing that he was the key to peace. The primary issue at the Malta Summit was the potential reunification of Germany. While Britain and France were wary of a re-unified Germany, Bush joined West German Chancellor Helmut Kohl in pushing for German reunification. Bush believed that a reunified Germany would serve American interests. After extensive negotiations, Gorbachev agreed to allow a reunified Germany to be a part of NATO, and Germany officially reunified in October 1990 after paying billions of marks to Moscow. Gorbachev used force to suppress nationalist movements within the Soviet Union itself. A crisis in Lithuania left Bush in a difficult position, as he needed Gorbachev's cooperation in the reunification of Germany and feared that the collapse of the Soviet Union could leave nuclear arms in dangerous hands. The Bush administration mildly protested Gorbachev's suppression of Lithuania's independence movement, but took no action to directly intervene. 
Bush warned independence movements of the disorder that could come with secession from the Soviet Union; in a 1991 address that critics labeled the "Chicken Kiev speech", he cautioned against "suicidal nationalism". In July 1991, Bush and Gorbachev signed the Strategic Arms Reduction Treaty (START I), in which both countries agreed to cut their strategic nuclear weapons by 30 percent. In August 1991, hard-line Communists launched a coup against Gorbachev; while the coup quickly fell apart, it broke the remaining power of Gorbachev and the central Soviet government. Later that month, Gorbachev resigned as general secretary of the Communist party, and Russian president Boris Yeltsin ordered the seizure of Soviet property. Gorbachev clung to power as the President of the Soviet Union until December 1991, when the Soviet Union dissolved. Fifteen states emerged from the Soviet Union, and of those states, Russia was the largest and most populous. Bush and Yeltsin met in February 1992, declaring a new era of "friendship and partnership". In January 1993, Bush and Yeltsin agreed to START II, which provided for further nuclear arms reductions on top of the original START treaty. The collapse of the Soviet Union prompted reflections on the future of the world following the end of the Cold War; one political scientist, Francis Fukuyama, speculated that humanity had reached the "end of history" in that liberal, capitalist democracy had permanently triumphed over Communism and fascism. Meanwhile, the collapse of the Soviet Union and other Communist governments led to post-Soviet conflicts in Central Europe, Eastern Europe, Central Asia, and Africa that would continue long after Bush left office. Invasion of Panama Through the late 1980s, the U.S. provided aid to Manuel Noriega, the anti-Communist leader of Panama. 
Noriega had long-standing ties to United States intelligence agencies, including during Bush's tenure as Director of Central Intelligence, and was also deeply involved in drug trafficking. In May 1989, Noriega annulled the results of a democratic presidential election in which Guillermo Endara had been elected. Bush objected to the annulment of the election and worried about the status of the Panama Canal with Noriega still in office. Bush dispatched 2,000 soldiers to the country, where they began conducting regular military exercises in violation of prior treaties. After a U.S. serviceman was shot by Panamanian forces in December 1989, Bush ordered the United States invasion of Panama, known as "Operation Just Cause". The invasion was the first large-scale American military operation in more than 40 years that was not related to the Cold War. American forces quickly took control of the Panama Canal Zone and Panama City. Noriega surrendered on January 3, 1990, and was quickly transported to a prison in the United States. Twenty-three Americans died in the operation, while another 394 were wounded. Noriega was convicted and imprisoned on racketeering and drug trafficking charges in April 1992. Historian Stewart Brewer argues that the invasion "represented a new era in American foreign policy" because Bush did not justify the invasion under the Monroe Doctrine or the threat of Communism, but rather on the grounds that it was in the best interests of the United States. Gulf War Faced with massive debts and low oil prices in the aftermath of the Iran–Iraq War, Iraqi leader Saddam Hussein decided to conquer Kuwait, a small, oil-rich country situated on Iraq's southern border. After Iraq invaded Kuwait in August 1990, Bush imposed economic sanctions on Iraq and assembled a multi-national coalition opposed to the invasion. 
The administration feared that a failure to respond to the invasion would embolden Hussein to attack Saudi Arabia or Israel, and wanted to discourage other countries from similar aggression. Bush also wanted to ensure continued access to oil, as Iraq and Kuwait collectively accounted for 20 percent of the world's oil production, and Saudi Arabia produced another 26 percent of the world's oil supply. At Bush's insistence, in November 1990, the United Nations Security Council approved a resolution authorizing the use of force if Iraq did not withdraw from Kuwait by January 15, 1991. Gorbachev's support, as well as China's abstention, helped ensure passage of the UN resolution. Bush convinced Britain, France, and other nations to commit soldiers to an operation against Iraq, and he won important financial backing from Germany, Japan, South Korea, Saudi Arabia, and the United Arab Emirates. In January 1991, Bush asked Congress to approve a joint resolution authorizing a war against Iraq. Bush believed that the UN resolution had already provided him with the necessary authorization to launch a military operation against Iraq, but he wanted to show that the nation was united behind a military action. Despite the opposition of a majority of Democrats in both the House and the Senate, Congress approved the Authorization for Use of Military Force Against Iraq Resolution of 1991. After the January 15 deadline passed without an Iraqi withdrawal from Kuwait, U.S. and coalition forces conducted a bombing campaign that devastated Iraq's power grid and communications network, and resulted in the desertion of about 100,000 Iraqi soldiers. In retaliation, Iraq launched Scud missiles at Israel and Saudi Arabia, but most of the missiles did little damage. On February 23, coalition forces began a ground invasion into Kuwait, evicting Iraqi forces by the end of February 27. 
About 300 Americans, as well as approximately 65 soldiers from other coalition nations, died during the military action. A ceasefire was arranged on March 3, and the UN passed a resolution establishing a peacekeeping force in a demilitarized zone between Kuwait and Iraq. A March 1991 Gallup poll showed that Bush had an approval rating of 89 percent, the highest presidential approval rating in the history of Gallup polling. After 1991, the UN maintained economic sanctions against Iraq, and the United Nations Special Commission was assigned to ensure that Iraq did not revive its weapons of mass destruction program. NAFTA In 1987, the U.S. and Canada had reached a free trade agreement that eliminated many tariffs between the two countries. President Reagan had intended it as the first step towards a larger trade agreement to eliminate most tariffs among the United States, Canada, and Mexico. The Bush administration, along with the Progressive Conservative Canadian Prime Minister Brian Mulroney, spearheaded the negotiations of the North American Free Trade Agreement (NAFTA) with Mexico. In addition to lowering tariffs, the proposed treaty would affect patents, copyrights, and trademarks. In 1991, Bush sought fast track authority, which grants the president the power to submit an international trade agreement to Congress without the possibility of amendment. Despite congressional opposition led by House Majority Leader Dick Gephardt, both houses of Congress voted to grant Bush fast track authority. NAFTA was signed in December 1992, after Bush lost re-election, but President Clinton won ratification of NAFTA in 1993. NAFTA remains controversial for its impact on wages, jobs, and overall economic growth. Domestic affairs Economy and fiscal issues The U.S. economy had generally performed well since emerging from recession in late 1982, but it slipped into a mild recession in 1990. The unemployment rate rose from 5.9 percent in 1989 to a high of 7.8 percent in mid-1991. 
Large federal deficits, spawned during the Reagan years, rose from $152.1 billion in 1989 to $220 billion for 1990; the $220 billion deficit represented a threefold increase since 1980. As the public became increasingly concerned about the economy and other domestic affairs, Bush's well-received handling of foreign affairs became less of an issue for most voters. Bush's top domestic priority was to bring an end to federal budget deficits, which he saw as a liability for the country's long-term economic health and standing in the world. As he was opposed to major defense spending cuts and had pledged to not raise taxes, the president had major difficulties in balancing the budget. Bush and congressional leaders agreed to avoid major changes to the budget for fiscal year 1990, which began in October 1989. However, both sides knew that spending cuts or new taxes would be necessary in the following year's budget in order to avoid the draconian automatic domestic spending cuts required by the Gramm–Rudman–Hollings Balanced Budget Act of 1987. Bush and other leaders also wanted to cut deficits because Federal Reserve Chair Alan Greenspan refused to lower interest rates, and thus stimulate economic growth, unless the federal budget deficit was reduced. In a statement released in late June 1990, Bush said that he would be open to a deficit reduction program which included spending cuts, incentives for economic growth, budget process reform, as well as tax increases. To fiscal conservatives in the Republican Party, Bush's statement represented a betrayal, and they heavily criticized him for compromising so early in the negotiations. In September 1990, Bush and Congressional Democrats announced a compromise to cut funding for mandatory and discretionary programs while also raising revenue, partly through a higher gas tax. The compromise additionally included a "pay as you go" provision that required that new programs be paid for at the time of implementation. 
House Minority Whip Newt Gingrich led the conservative opposition to the bill, strongly opposing any form of tax increase. Some liberals also criticized the budget cuts in the compromise, and in October, the House rejected the deal, resulting in a brief government shutdown. Without the strong backing of the Republican Party, Bush agreed to another compromise bill, this one more favorable to Democrats. The Omnibus Budget Reconciliation Act of 1990 (OBRA-90), enacted on October 27, 1990, dropped much of the gasoline tax increase in favor of higher income taxes on top earners. It included cuts to domestic spending, but the cuts were not as deep as those that had been proposed in the original compromise. Bush's decision to sign the bill damaged his standing with conservatives and the general public, but it also laid the groundwork for the budget surpluses of the late 1990s. Discrimination The disabled had not received legal protections under the landmark Civil Rights Act of 1964, and many faced discrimination and segregation by the time Bush took office. In 1988, Lowell P. Weicker Jr. and Tony Coelho had introduced the Americans with Disabilities Act, which barred employment discrimination against qualified individuals with disabilities. The bill had passed the Senate but not the House, and it was reintroduced in 1989. Though some conservatives opposed the bill due to its costs and potential burdens on businesses, Bush strongly supported it, partly because his son, Neil, had struggled with dyslexia. After the bill passed both houses of Congress, Bush signed the Americans with Disabilities Act of 1990 into law in July 1990. The act required employers and public accommodations to make "reasonable accommodations" for the disabled, while providing an exception when such accommodations imposed an "undue hardship". Senator Ted Kennedy later led the congressional passage of a separate civil rights bill designed to facilitate launching employment discrimination lawsuits. 
In vetoing the bill, Bush argued that it would lead to racial quotas in hiring. In November 1991, Bush signed the Civil Rights Act of 1991, which was largely similar to the bill he had vetoed in the previous year. In August 1990, Bush signed the Ryan White CARE Act, the largest federally funded program dedicated to assisting persons living with HIV/AIDS. Throughout his presidency, the AIDS epidemic grew dramatically in the U.S. and around the world, and Bush often found himself at odds with AIDS activist groups who criticized him for not placing a high priority on HIV/AIDS research and funding. Frustrated by the administration's lack of urgency on the issue, ACT UP dumped the ashes of HIV/AIDS victims on the White House lawn during a viewing of the AIDS Quilt in 1992. By that time, HIV had become the leading cause of death in the U.S. for men aged 25–44. Environment In June 1989, the Bush administration proposed a bill to amend the Clean Air Act. Working with Senate Majority Leader George J. Mitchell, the administration won passage of the amendments over the opposition of business-aligned members of Congress who feared the impact of tougher regulations. The legislation sought to curb acid rain and smog by requiring decreased emissions of chemicals such as sulfur dioxide, and was the first major update to the Clean Air Act since 1977. Bush also signed the Oil Pollution Act of 1990 in response to the Exxon Valdez oil spill. However, the League of Conservation Voters criticized some of Bush's other environmental actions, including his opposition to stricter auto-mileage standards. Points of Light President Bush devoted attention to voluntary service as a means of solving some of America's most serious social problems. He often used the "thousand points of light" theme to describe the power of citizens to solve community problems. 
In his 1989 inaugural address, President Bush said, "I have spoken of a thousand points of light, of all the community organizations that are spread like stars throughout the Nation, doing good." During his presidency, Bush honored numerous volunteers with the Daily Point of Light Award, a tradition that was continued by his presidential successors. In 1990, the Points of Light Foundation was created as a nonprofit organization in Washington to promote this spirit of volunteerism. In 2007, the Points of Light Foundation merged with the Hands On Network to create a new organization, Points of Light. Judicial appointments Bush appointed two justices to the Supreme Court of the United States. In 1990, Bush appointed a largely unknown state appellate judge, David Souter, to replace liberal icon William Brennan. Souter was easily confirmed and served until 2009, but joined the liberal bloc of the court, disappointing Bush. In 1991, Bush nominated conservative federal judge Clarence Thomas to succeed Thurgood Marshall, a long-time liberal stalwart. Thomas, the former head of the Equal Employment Opportunity Commission (EEOC), faced heavy opposition in the Senate, as well as from pro-choice groups and the NAACP. His nomination faced another difficulty when Anita Hill accused Thomas of having sexually harassed her during his time as the chair of EEOC. Thomas won confirmation in a narrow 52–48 vote; 43 Republicans and 9 Democrats voted to confirm Thomas's nomination, while 46 Democrats and 2 Republicans voted against confirmation. Thomas became one of the most conservative justices of his era. Other issues Bush's education platform consisted mainly of offering federal support for a variety of innovations, such as open enrollment, incentive pay for outstanding teachers, and rewards for schools that improve performance with underprivileged children. 
Though Bush did not pass a major educational reform package during his presidency, his ideas influenced later reform efforts, including Goals 2000 and the No Child Left Behind Act. Bush signed the Immigration Act of 1990, which led to a 40 percent increase in legal immigration to the United States. The act more than doubled the number of visas given to immigrants on the basis of job skills. In the wake of the savings and loan crisis, Bush proposed a $50 billion package to rescue the savings and loans industry, and also proposed the creation of the Office of Thrift Supervision to regulate the industry. Congress passed the Financial Institutions Reform, Recovery, and Enforcement Act of 1989, which incorporated most of Bush's proposals. Public image Bush was widely seen as a "pragmatic caretaker" president who lacked a unified and compelling long-term theme in his efforts. Indeed, Bush's sound bite where he refers to the issue of overarching purpose as "the vision thing" has become a metonym applied to other political figures accused of similar difficulties. His ability to gain broad international support for the Gulf War and the war's result were seen as both a diplomatic and military triumph, rousing bipartisan approval, though his decision to withdraw without removing Saddam Hussein left mixed feelings, and attention returned to the domestic front and a souring economy. A New York Times article mistakenly depicted Bush as being surprised to see a supermarket barcode reader; the report of his reaction exacerbated the notion that he was "out of touch". Amid the early 1990s recession, his image shifted from "conquering hero" to "politician befuddled by economic matters". At the elite level, a number of commentators and political experts deplored the state of American politics in 1991–1992, and reported the voters were angry. Many analysts blamed the poor quality of national election campaigns. 
1992 presidential campaign Bush announced his reelection bid in early 1992; with a coalition victory in the Persian Gulf War and high approval ratings, Bush's reelection initially looked likely. As a result, many leading Democrats, including Mario Cuomo, Dick Gephardt, and Al Gore, declined to seek their party's presidential nomination. However, Bush's tax increase had angered many conservatives, who believed that Bush had strayed from the conservative principles of Ronald Reagan. He faced a challenge from conservative political columnist Pat Buchanan in the 1992 Republican primaries. Bush fended off Buchanan's challenge and won his party's nomination at the 1992 Republican National Convention, but the convention adopted a socially conservative platform strongly influenced by the Christian right. Meanwhile, the Democrats nominated Governor Bill Clinton of Arkansas. A moderate who was affiliated with the Democratic Leadership Council (DLC), Clinton favored welfare reform, deficit reduction, and a tax cut for the middle class. In early 1992, the race took an unexpected twist when Texas billionaire H. Ross Perot launched a third party bid, claiming that neither Republicans nor Democrats could eliminate the deficit and make government more efficient. His message appealed to voters across the political spectrum disappointed with both parties' perceived fiscal irresponsibility. Perot also attacked NAFTA, which he claimed would lead to major job losses. National polling taken in mid-1992 showed Perot in the lead, but Clinton experienced a surge through effective campaigning and the selection of Senator Al Gore, a popular and relatively young Southerner, as his running mate. Clinton won the election, taking 43 percent of the popular vote and 370 electoral votes, while Bush won 37.5 percent of the popular vote and 168 electoral votes. Perot won 19% of the popular vote, one of the highest totals for a third-party candidate in U.S. 
history, drawing equally from both major candidates, according to exit polls. Clinton performed well in the Northeast, the Midwest, and the West Coast, while also waging the strongest Democratic campaign in the South since the 1976 election. Several factors were important in Bush's defeat. The ailing economy, which arose from the recession, may have been the main factor in Bush's loss, as 7 in 10 voters said on election day that the economy was either "not so good" or "poor". On the eve of the 1992 election, the unemployment rate stood at 7.8%, which was the highest it had been since 1984. The president was also damaged by his alienation of many conservatives in his party. Bush blamed Perot in part for his defeat, though exit polls showed that Perot drew his voters about equally from Clinton and Bush. Despite his defeat, Bush left office with a 56 percent job approval rating in January 1993. Like many of his predecessors, Bush issued a series of pardons during his last days in office. In December 1992, he granted executive clemency to six former senior government officials implicated in the Iran-Contra scandal, most prominently former Secretary of Defense Caspar Weinberger. The charges against the six were that they lied to or withheld information from Congress. The pardons effectively brought an end to the Iran-Contra scandal. According to Seymour Martin Lipset, the 1992 election had several unique characteristics. Voters felt that economic conditions were worse than they actually were, which harmed Bush. A rare event was the presence of a strong third-party candidate. Liberals launched a backlash against 12 years of a conservative White House. The chief factor was Clinton uniting his party and winning over a number of heterogeneous groups. Post-presidency (1993–2018) Appearances After leaving office, Bush and his wife built a retirement house in the community of West Oaks, Houston.
He established a presidential office within the Park Laureate Building on Memorial Drive in Houston. He also frequently spent time at his vacation home in Kennebunkport, took annual cruises in Greece, went on fishing trips in Florida, and visited the Bohemian Club in Northern California. He declined to serve on corporate boards, but delivered numerous paid speeches and served as an adviser to The Carlyle Group, a private equity firm. He never published his memoirs, but he and Brent Scowcroft co-wrote A World Transformed, a 1999 work on foreign policy. Portions of his letters and his diary were later published as The China Diary of George H. W. Bush and All The Best, George Bush. During a 1993 visit to Kuwait, Bush was targeted in an assassination plot directed by the Iraqi Intelligence Service. President Clinton retaliated by ordering the firing of 23 cruise missiles at Iraqi Intelligence Service headquarters in Baghdad. Bush did not publicly comment on the assassination attempt or the missile strike, but privately spoke with Clinton shortly before the strike took place. In the 1994 gubernatorial elections, his sons George W. and Jeb concurrently ran for Governor of Texas and Governor of Florida. Concerning their political careers, he advised them both that "[a]t some point both of you may want to say 'Well, I don't agree with my Dad on that point' or 'Frankly I think Dad was wrong on that.' Do it. Chart your own course, not just on the issues but on defining yourselves". George W. won his race against Ann Richards while Jeb lost to Lawton Chiles. After the results came in, the elder Bush told ABC, "I have very mixed emotions. Proud father, is the way I would sum it all up." Jeb would again run for governor of Florida in 1998 and win at the same time that his brother George W. won re-election in Texas. It marked the second time in United States history that a pair of brothers served simultaneously as governors.
Bush supported his son's candidacy in the 2000 presidential election, but did not actively campaign in the election and did not deliver a speech at the 2000 Republican National Convention. George W. Bush defeated Al Gore in the 2000 election and was re-elected in 2004. Bush and his son thus became the second father–son pair to each serve as President of the United States, following John Adams and John Quincy Adams. Through previous administrations, the elder Bush had been known simply as "George Bush" or "President Bush"; following his son's election, the need to distinguish between them made retronymic forms such as "George H. W. Bush" and "George Bush Sr." and colloquialisms such as "Bush 41" and "Bush the Elder" more common. Bush advised his son on some personnel choices, approving of the selection of Dick Cheney as running mate and the retention of George Tenet as CIA Director. However, he was not consulted on all appointments, including that of his old rival, Donald Rumsfeld, as Secretary of Defense. Though he avoided giving unsolicited advice to his son, Bush and his son also discussed some matters of policy, especially regarding national security issues. In his retirement, Bush used the public spotlight to support various charities. Despite earlier political differences with Bill Clinton, the two former presidents eventually became friends. They appeared together in television ads, encouraging aid for victims of Hurricane Katrina and the 2004 Indian Ocean earthquake and tsunami. However, when interviewed by Jon Meacham, Bush criticized Donald Rumsfeld, Dick Cheney, and even his own son George W. Bush for their handling of foreign policy after the September 11 attacks. Final years Bush supported Republican John McCain in the 2008 presidential election, and Republican Mitt Romney in the 2012 presidential election, but both were defeated by Democrat Barack Obama.
In 2011, Obama awarded Bush the Presidential Medal of Freedom, the highest civilian honor in the United States. Bush supported his son Jeb's bid in the 2016 Republican primaries. Jeb Bush's campaign struggled, however, and he withdrew from the race during the primaries. Neither George H. W. nor George W. Bush endorsed the eventual Republican nominee, Donald Trump; all three Bushes emerged as frequent critics of Trump's policies and speaking style, while Trump frequently criticized George W. Bush's presidency. George H. W. Bush later said that he voted for the Democratic nominee, Hillary Clinton, in the general election. After the election, Bush wrote a letter to president-elect Donald Trump in January 2017 to inform him that because of his poor health, he would not be able to attend Trump's inauguration on January 20; he gave him his best wishes. In August 2017, after the violence at the Unite the Right rally in Charlottesville, both Presidents Bush released a joint statement saying, "America must always reject racial bigotry, anti-Semitism, and hatred in all forms. ... As we pray for Charlottesville, we are all reminded of the fundamental truths recorded by that city's most prominent citizen in the Declaration of Independence: we are all created equal and endowed by our Creator with unalienable rights." On April 17, 2018, Barbara Bush died at the age of 92 at her home in Houston, Texas. Her funeral was held at St. Martin's Episcopal Church in Houston four days later. Bush, along with former Presidents Barack Obama, George W. Bush (son), Bill Clinton and First Ladies Melania Trump, Michelle Obama, Laura Bush (daughter-in-law) and Hillary Clinton attended the funeral and posed together for a photo as a sign of unity. On November 1, 2018, Bush went to the polls to vote early in the midterm elections. This would be his final public appearance.
Death and funeral After a long battle with vascular parkinsonism, Bush died at his home in Houston on November 30, 2018, at the age of 94. At the time of his death he was the longest-lived U.S. president, a distinction now held by Jimmy Carter. He was also the third-oldest vice president. Bush lay in state in the Rotunda of the U.S. Capitol from December 3 through December 5; he was the 12th U.S. president to be accorded this honor. Then, on December 5, Bush's casket was transferred from the Capitol rotunda to Washington National Cathedral, where a state funeral was held. After the funeral, Bush's body was transported to the George H. W. Bush Presidential Library in College Station, Texas, where he was buried next to his wife Barbara and daughter Robin. At the funeral, former president George W. Bush eulogized his father. Personal life In 1991, The New York Times revealed that Bush was suffering from Graves' disease, a non-contagious thyroid condition from which his wife Barbara also suffered. Bush had two separate hip replacement surgeries in 2000 and 2007. Thereafter, Bush started to experience weakness in his legs, which was attributed to vascular parkinsonism, a form of Parkinson's disease. He progressively developed problems walking, initially needing a walking stick as a mobility aid before he eventually came to rely on a wheelchair from 2011 onwards. Bush was a lifelong Episcopalian and a member of St. Martin's Episcopal Church in Houston. As President, Bush regularly attended services at St. John's Episcopal Church in Washington D.C. He cited various moments in his life as having deepened his faith, including his escape from Japanese forces in 1944, and the death of his three-year-old daughter Robin in 1953. His faith was reflected in his "thousand points of light" speech, his support for prayer in schools, and his support for the pro-life movement (following his election as vice president).
Legacy Historical reputation Polls of historians and political scientists have ranked Bush in the top half of presidents. A 2018 poll of the American Political Science Association's Presidents and Executive Politics section ranked Bush as the 17th best president out of 44. A 2017 C-SPAN poll of historians also ranked Bush as the 20th best president out of 43. Richard Rose described Bush as a "guardian" president, and many other historians and political scientists have similarly described Bush as a passive, hands-off president who was "largely content with things as they were". Professor Steven Knott writes that "[g]enerally the Bush presidency is viewed as successful in foreign affairs but a disappointment in domestic affairs." Biographer Jon Meacham writes that, after he left office, many Americans viewed Bush as "a gracious and underappreciated man who had many virtues but who had failed to project enough of a distinctive identity and vision to overcome the economic challenges of 1991–92 and to win a second term." Bush himself noted that his legacy was "lost between the glory of Reagan ... and the trials and tribulations of my sons." In the 2010s, Bush was fondly remembered for his willingness to compromise, which contrasted with the intensely partisan era that followed his presidency. In 2018, Vox highlighted Bush for his "pragmatism" as a moderate Republican president who worked across the aisle. They specifically noted Bush's accomplishments in domestic policy through bipartisan deals, including raising taxes on the wealthy with the Omnibus Budget Reconciliation Act of 1990. Bush also helped pass the Americans with Disabilities Act of 1990, which The New York Times described as "the most sweeping anti-discrimination law since the Civil Rights Act of 1964". In response to the Exxon Valdez oil spill, Bush built another bipartisan coalition to strengthen the Clean Air Act through the 1990 amendments.
Bush also championed and signed into law the Immigration Act of 1990, a sweeping bipartisan immigration reform act that made it easier for immigrants to enter the country legally, granted immigrants fleeing violence temporary protected status, lifted the pre-naturalization English testing requirement, and "eliminated the exclusion of homosexuals under what Congress now deemed the medically unsound classification of 'sexual deviant' that was included in the 1965 act." Bush stated, "Immigration is not just a link to our past but it's also a bridge to America's future". According to USA Today, the legacy of Bush's presidency was defined by his victory over Iraq after the invasion of Kuwait, and by his presiding over the dissolution of the Soviet Union and German reunification. Michael Beschloss and Strobe Talbott praise Bush's handling of the USSR, especially how he prodded Gorbachev into releasing control over the satellite states and permitting German unification, and especially a united Germany in NATO. Andrew Bacevich judges the Bush administration as "morally obtuse" in the light of its "business-as-usual" attitude towards China after the massacre in Tiananmen Square and its uncritical support of Gorbachev as the Soviet Union disintegrated. David Rothkopf argues: "In the recent history of U.S. foreign policy, there has been no president, nor any president's team, who, when confronted with profound international change and challenges, responded with such a thoughtful and well-managed foreign policy. ... [The Bush administration was] a bridge over one of the great fault lines of history [that] ushered in a 'new world order' it described with great skill and professionalism." Memorials, awards, and honors In 1990, Time magazine named him its Man of the Year. In 1997, Houston Intercontinental Airport was renamed George Bush Intercontinental Airport.
In 1999, the CIA headquarters in Langley, Virginia, was named the George Bush Center for Intelligence in his honor. In 2011, Bush, an avid golfer, was inducted into the World Golf Hall of Fame. USS George H. W. Bush (CVN-77), the tenth and last supercarrier of the United States Navy, was named for Bush. Bush is commemorated on a postage stamp that was issued by the United States Postal Service in 2019. The George H. W. Bush Presidential Library and Museum, the tenth U.S. presidential library, was completed in 1997. It contains the presidential and vice presidential papers of Bush and the vice presidential papers of Dan Quayle. The library is located on a site on the west campus of Texas A&M University in College Station, Texas. Texas A&M University also hosts the Bush School of Government and Public Service, a graduate public policy school. See also Electoral history of George H. W. Bush List of presidents of the United States List of presidents of the United States by previous experience Notes References Works cited Further reading Secondary sources Cox, Michael, and Steven Hurst. "'His finest hour?' George Bush and the diplomacy of German unification." Diplomacy and Statecraft 13.4 (2002): 123–150. Cull, Nicholas J. "Speeding the Strange Death of American Public Diplomacy: The George H. W. Bush Administration and the US Information Agency." Diplomatic History 34.1 (2010): 47–69. Engel, Jeffrey A. "A Better World...but Don't Get Carried Away: The Foreign Policy of George H. W. Bush Twenty Years On." Diplomatic History 34.1 (2010): 25–46. Engel, Jeffrey A. When the World Seemed New: George H. W. Bush and the End of the Cold War (2018). Maynard, Christopher. Out of the Shadow: George H. W. Bush and the End of the Cold War (Texas A&M University Press, 2008). Troy, Gil. "Stumping in the bookstores: A literary history of the 1992 presidential campaign." Presidential Studies Quarterly (1995): 697–710. Primary sources External links George H.W.
Bush Presidential Library Center White House biography Full audio of a number of Bush speeches Miller Center of Public Affairs 1992 election episode in CNN's Race for the White House Extensive essays on Bush and shorter essays on each member of his cabinet and First Lady from the Miller Center of Public Affairs "Life Portrait of George H. W. Bush", from C-SPAN's American Presidents: Life Portraits, December 13, 1999 George H. W. Bush, an American Experience documentary
GPS (disambiguation)
GPS is the Global Positioning System, a satellite-based navigation system. GPS may also refer to: Technology GPS navigation device, especially an automotive navigation system Generalized processor sharing, an algorithm to fairly share computer processing time General Problem Solver, a 1959 computer program GNAT Programming Studio, a software development package Satellite navigation, GPS (global positioning system) in common parlance Organizations Crossroads GPS (Grassroots Policy Strategies), a nonprofit corporation that works in conjunction with the Super PAC American Crossroads Fusion GPS, American commercial and strategic research firm Gap Inc. stock ticker Ghana Prisons Service GPS (band), a progressive rock band GPS Rugby, an Australian rugby union club Geirus Policies and Standards committee, a body of the Rabbinical Council of America Gabungan Parti Sarawak, a Malaysian political coalition based in Sarawak Education Gilbert Public Schools, a school district in Gilbert, Arizona, US Girls Preparatory School, an all-girls prep school in Chattanooga, Tennessee, US Athletic Association of the Great Public Schools of New South Wales, an association of private boys' schools, Australia Great Public Schools Association of Queensland Inc., an association of nine Australian schools Grosse Pointe South High School, a public high school in Grosse Pointe, Michigan, US Greenville Public Schools (a.k.a. 
Greenville Public School District), a school district in Greenville, Mississippi, US The School of Global Policy and Strategy, an institute of international studies at the University of California, San Diego Medicine Goodpasture syndrome, a rare autoimmune disease Gray platelet syndrome, a rare congenital autosomal recessive bleeding disorder Other uses Seymour Airport (IATA code), Galápagos Islands, Ecuador Fareed Zakaria GPS (Global Public Square), a CNN television show Geometrical Products Specification, an international standard for geometric dimensioning and tolerancing Genealogical Proof Standard (see also Cluster genealogy) "GPS" (song), a song by Maluma "Var är jag", renamed to "GPS", a song by Basshunter from his LOL <(^^,)> album See also Grams of protein per dollar (gP/$), a means of representing the cost of amino acids in a food product
George Berkeley
George Berkeley (12 March 1685 – 14 January 1753) – known as Bishop Berkeley (Bishop of Cloyne of the Anglican Church of Ireland) – was an Anglo-Irish philosopher whose primary achievement was the advancement of a theory he called "immaterialism" (later referred to as "subjective idealism" by others). This theory denies the existence of material substance and instead contends that familiar objects like tables and chairs are ideas perceived by the mind and, as a result, cannot exist without being perceived. Berkeley is also known for his critique of abstraction, an important premise in his argument for immaterialism. In 1709, Berkeley published his first major work, An Essay Towards a New Theory of Vision, in which he discussed the limitations of human vision and advanced the theory that the proper objects of sight are not material objects, but light and colour. This foreshadowed his chief philosophical work, A Treatise Concerning the Principles of Human Knowledge, in 1710, which, after its poor reception, he rewrote in dialogue form and published under the title Three Dialogues between Hylas and Philonous in 1713. In this book, Berkeley's views were represented by Philonous (Greek: "lover of mind"), while Hylas ("hyle", Greek: "matter") embodies the Irish thinker's opponents, in particular John Locke. Berkeley argued against Isaac Newton's doctrine of absolute space, time and motion in De Motu (On Motion), published in 1721. His arguments were a precursor to the views of Mach and Einstein. In 1732, he published Alciphron, a Christian apologetic against the free-thinkers, and in 1734, he published The Analyst, a critique of the foundations of calculus, which was influential in the development of mathematics.
Interest in Berkeley's work increased after World War II because he tackled many of the issues of paramount interest to philosophy in the 20th century, such as the problems of perception, the difference between primary and secondary qualities, and the importance of language. Biography Ireland Berkeley was born at his family home, Dysart Castle, near Thomastown, County Kilkenny, Ireland, the eldest son of William Berkeley, a cadet of the noble family of Berkeley whose ancestry can be traced back to the Anglo-Saxon period and who had served as feudal lords and landowners in Gloucester, England. Little is known of his mother. He was educated at Kilkenny College and attended Trinity College Dublin, where he was elected a Scholar in 1702, being awarded a BA in 1704 and an MA and a Fellowship in 1707. He remained at Trinity College after completion of his degree as a tutor and Greek lecturer. His earliest publication was on mathematics, but the first that brought him notice was his An Essay towards a New Theory of Vision, first published in 1709. In the essay, Berkeley examines visual distance, magnitude, position and problems of sight and touch. While this work raised much controversy at the time, its conclusions are now accepted as an established part of the theory of optics. The next publication to appear was the Treatise Concerning the Principles of Human Knowledge in 1710, which had great success and gave him a lasting reputation, though few accepted his theory that nothing exists outside the mind. This was followed in 1713 by Three Dialogues between Hylas and Philonous, in which he propounded his system of philosophy, the leading principle of which is that the world, as represented by our senses, depends for its existence on being perceived. For this theory, the Principles gives the exposition and the Dialogues the defence. One of his main objectives was to combat the prevailing materialism of his time.
The theory was largely received with ridicule, while even those such as Samuel Clarke and William Whiston, who did acknowledge his "extraordinary genius," were nevertheless convinced that his first principles were false. England and Europe Shortly afterwards, Berkeley visited England and was received into the circle of Addison, Pope and Steele. In the period between 1714 and 1720, he interspersed his academic endeavours with periods of extensive travel in Europe, including one of the most extensive Grand Tours of the length and breadth of Italy ever undertaken. In 1721, he took Holy Orders in the Church of Ireland, earning his doctorate in divinity, and once again chose to remain at Trinity College Dublin, lecturing this time in Divinity and in Hebrew. In 1721/2 he was made Dean of Dromore and, in 1724, Dean of Derry. In 1723, following her violent quarrel with Jonathan Swift, who had been her intimate friend for many years, Esther Vanhomrigh (for whom Swift had created the nickname "Vanessa") named Berkeley her co-heir along with the barrister Robert Marshall; her choice of legatees caused a good deal of surprise since she did not know either of them well, although Berkeley as a very young man had known her father. Swift said generously that he did not grudge Berkeley his inheritance, much of which vanished in a lawsuit in any event. A story that Berkeley and Marshall disregarded a condition of the inheritance that they must publish the correspondence between Swift and Vanessa is probably untrue. In 1725, he began the project of founding a college in Bermuda for training ministers and missionaries in the colony, in pursuit of which he gave up his deanery with its income of £1100. Marriage and America In 1728, he married Anne Forster, daughter of John Forster, Chief Justice of the Irish Common Pleas, and his first wife Rebecca Monck. He then went to America on a salary of £100 per annum. 
He landed near Newport, Rhode Island, where he bought a plantation at Middletown, the famous "Whitehall". Berkeley purchased several enslaved Africans to work on the plantation. It has been claimed that "he introduced Palladianism into America by borrowing a design from [William] Kent's Designs of Inigo Jones for the door-case of his house in Rhode Island, Whitehall." He also brought to New England John Smibert, the Scottish artist he "discovered" in Italy, who is generally regarded as the founding father of American portrait painting. Meanwhile, he drew up plans for the ideal city he planned to build on Bermuda. He lived at the plantation while he waited for funds for his college to arrive. The funds, however, were not forthcoming. "With the withdrawal from London of his own persuasive energies, opposition gathered force; and the Prime Minister, Walpole grew steadily more sceptical and lukewarm. At last it became clear that the essential Parliamentary grant would be not forthcoming" and in 1732 he left America and returned to London. He and Anne had four children who survived infancy: Henry, George, William and Julia, and at least two other children who died in infancy. William's death in 1751 was a great cause of grief to his father. Episcopate in Ireland Berkeley was nominated to be the Bishop of Cloyne in the Church of Ireland on 18 January 1734. He was consecrated as such on 19 May 1734. He was the Bishop of Cloyne until his death on 14 January 1753, although he died at Oxford (see below). Humanitarian work While living in London's Saville Street, he took part in efforts to create a home for the city's abandoned children. The Foundling Hospital was founded by Royal Charter in 1739, and Berkeley is listed as one of its original governors.
Last works His last two publications were Siris: A Chain of Philosophical Reflexions and Inquiries Concerning the Virtues of Tarwater, And divers other Subjects connected together and arising one from another (1744) and Further Thoughts on Tar-water (1752). Pine tar is an effective antiseptic and disinfectant when applied to cuts on the skin, but Berkeley argued for the use of pine tar as a broad panacea for diseases. His 1744 work on tar-water sold more copies than any of his other books during Berkeley's lifetime. He remained at Cloyne until 1752, when he retired. With his wife and daughter Julia he went to Oxford to live with his son George and supervise his education. He died soon afterward and was buried in Christ Church Cathedral, Oxford. His affectionate disposition and genial manners made him much loved and held in warm regard by many of his contemporaries. Anne outlived her husband by many years, and died in 1786. Contributions to philosophy The use of the concepts of "spirit" and "idea" is central in Berkeley's philosophy. As used by him, these concepts are difficult to translate into modern terminology. His concept of "spirit" is close to the concept of "conscious subject" or of "mind", and the concept of "idea" is close to the concept of "sensation" or "state of mind" or "conscious experience". Thus Berkeley denied the existence of matter as a metaphysical substance, but did not deny the existence of physical objects such as apples or mountains ("I do not argue against the existence of any one thing that we can apprehend, either by sense or reflection. That the things I see with mine eyes and touch with my hands do exist, really exist, I make not the least question. The only thing whose existence we deny, is that which philosophers call matter or corporeal substance. And in doing of this, there is no damage done to the rest of mankind, who, I dare say, will never miss it.", Principles #35). 
This basic claim of Berkeley's thought, his "idealism", is sometimes and somewhat derisively called "immaterialism" or, occasionally, subjective idealism. In Principles #3, he wrote, using a combination of Latin and English, esse is percipi (to be is to be perceived), most often if slightly inaccurately attributed to Berkeley as the pure Latin phrase esse est percipi. The phrase appears associated with him in authoritative philosophical sources, e.g., "Berkeley holds that there are no such mind-independent things, that, in the famous phrase, esse est percipi (aut percipere)—to be is to be perceived (or to perceive)." Hence, human knowledge is reduced to two elements: that of spirits and of ideas (Principles #86). In contrast to ideas, a spirit cannot be perceived. A person's spirit, which perceives ideas, is to be comprehended intuitively by inward feeling or reflection (Principles #89). For Berkeley, we have no direct 'idea' of spirits, albeit we have good reason to believe in the existence of other spirits, for their existence explains the purposeful regularities we find in experience ("It is plain that we cannot know the existence of other spirits otherwise than by their operations, or the ideas by them excited in us", Dialogues #145). This is the solution that Berkeley offers to the problem of other minds. Finally, the order and purposefulness of the whole of our experience of the world and especially of nature overwhelms us into believing in the existence of an extremely powerful and intelligent spirit that causes that order. According to Berkeley, reflection on the attributes of that external spirit leads us to identify it with God. Thus a material thing such as an apple consists of a collection of ideas (shape, color, taste, physical properties, etc.) which are caused in the spirits of humans by the spirit of God. Theology A convinced adherent of Christianity, Berkeley believed God to be present as an immediate cause of all our experiences. 
Berkeley offered a proof of the existence of God, which T. I. Oizerman later analysed. Berkeley believed that God is not the distant engineer of Newtonian machinery that in the fullness of time led to the growth of a tree in the university quadrangle. Rather, the perception of the tree is an idea that God's mind has produced in the mind, and the tree continues to exist in the quadrangle when "nobody" is there, simply because God is an infinite mind that perceives all. The philosophy of David Hume concerning causality and objectivity is an elaboration of another aspect of Berkeley's philosophy. A.A. Luce, the most eminent Berkeley scholar of the 20th century, constantly stressed the continuity of Berkeley's philosophy. The fact that Berkeley returned to his major works throughout his life, issuing revised editions with only minor changes, also counts against any theory that attributes to him a significant volte-face. Relativity arguments John Locke (Berkeley's intellectual predecessor) states that we define an object by its primary and secondary qualities. He takes heat as an example of a secondary quality. If you put one hand in a bucket of cold water, and the other hand in a bucket of warm water, then put both hands in a bucket of lukewarm water, one of your hands is going to tell you that the water is cold and the other that the water is hot. Locke says that since two different perceivers (your two hands) perceive the same water to be hot and cold, the heat is not a quality of the water. While Locke used this argument to distinguish primary from secondary qualities, Berkeley extends it to cover primary qualities in the same way. For example, he says that size is not a quality of an object because the size of the object depends on the distance between the observer and the object, or the size of the observer. Since an object is a different size to different observers, then size is not a quality of the object.
Berkeley rejects shape with a similar argument and then asks: if neither primary qualities nor secondary qualities are of the object, then how can we say that there is anything more than the qualities we observe? Relativity is the idea that there is no objective, universal truth; it is a state of dependence in which the existence of one object depends on that of another. According to Locke, characteristics of primary qualities are mind-independent, such as shape, size, etc., whereas secondary qualities are mind-dependent, for example, taste and color. George Berkeley refuted John Locke's belief on primary and secondary qualities because Berkeley believed that "we cannot abstract the primary qualities (e.g. shape) from secondary ones (e.g. color)". Berkeley argued that perception is dependent on the distance between the observer and the object, and "thus, we cannot conceive of mechanist material bodies which are extended but not (in themselves) colored". What is perceived may be the same type of quality yet appear completely opposite because of different positions and perceptions; what we perceive can differ even when the same type of thing presents contrary qualities. Secondary qualities aid in people's conception of primary qualities in an object, like how the color of an object leads people to recognize the object itself. More specifically, the color red can be perceived in apples, strawberries, and tomatoes, yet we would not know what these might look like without their color. We would also be unaware of what the color red looked like if red paint, or any object that has a perceived red color, failed to exist. From this, we can see that colors cannot exist on their own and can solely represent a group of perceived objects. Therefore, both primary and secondary qualities are mind-dependent: they cannot exist without our minds. George Berkeley was a philosopher who was against rationalism and "classical" empiricism.
He was a "subjective idealist" or "empirical idealist", who believed that reality is constructed entirely of immaterial, conscious minds and their ideas; everything that exists is somehow dependent on the subject perceiving it, except the subject themselves. He refuted the existence of abstract objects that many other philosophers believed to exist, notably Plato. According to Berkeley, "an abstract object does not exist in space or time and which is therefore entirely non-physical and non-mental"; however, this argument contradicts his relativity argument. If "esse est percipi" (Latin for "to be is to be perceived") is true, then the objects in the relativity argument made by Berkeley can either exist or not. Berkeley believed that only the minds' perceptions and the Spirit that perceives are what exists in reality; what people perceive every day is only the idea of an object's existence, but the objects themselves are not perceived. Berkeley also discussed how, at times, materials cannot be perceived by oneself, and the mind of oneself cannot understand the objects. However, there also exists an "omnipresent, eternal mind" that Berkeley believed to consist of God and the Spirit, both omniscient and all-perceiving. According to Berkeley, God is the entity who controls everything, yet Berkeley also argued that "abstract object[s] do not exist in space or time". In other words, as Warnock argues, Berkeley "had recognized that he could not square with his own talk of spirits, of our minds and of God; for these are perceivers and not among objects of perception. Thus he says, rather weakly and without elucidation, that in addition to our ideas we also have notions—we know what it means to speak of spirits and their operations." However, the relativity argument violates the idea of immaterialism. Berkeley's immaterialism argues that "esse est percipi (aut percipere)", which in English is "to be is to be perceived (or to perceive)".
That is to say, only what is perceived or perceives is real, and without our perception or God's nothing can be real. Yet, if the relativity argument, also by Berkeley, argues that the perception of an object depends on different positions, then what is perceived may or may not be real, because a perception does not show the whole picture, and the whole picture cannot be perceived. Berkeley also believes that "when one perceives mediately, one perceives one idea by means of perceiving another". By this, it can be elaborated that if the standards of what is perceived at first are different, what is perceived after that can be different as well. In the heat perception described above, one hand perceived the water to be hot and the other hand perceived the water to be cold due to relativity. If applying the idea "to be is to be perceived", the water should be both cold and hot because both perceptions are perceived by different hands. However, the water cannot be cold and hot at the same time, as that is a contradiction; this shows that what is perceived is not always true, because it sometimes can break the law of noncontradiction. In this case, "it would be arbitrary anthropocentrism to claim that humans have special access to the true qualities of objects". The truth for different people can be different, and humans are limited in accessing the absolute truth due to relativity. Summing up, either nothing can be absolutely true due to relativity, or the two arguments (to be is to be perceived, and the relativity argument) do not always work together. New theory of vision In his Essay Towards a New Theory of Vision, Berkeley frequently criticised the views of the Optic Writers, a title that seems to include Molyneux, Wallis, Malebranche and Descartes. In sections 1–51, Berkeley argued against the classical scholars of optics by holding that spatial depth, the distance that separates the perceiver from the perceived object, is itself invisible.
That is, we do not see space directly or deduce its form logically using the laws of optics. Space for Berkeley is no more than a contingent expectation that visual and tactile sensations will follow one another in regular sequences that we come to expect through habit. Berkeley goes on to argue that visual cues, such as the perceived extension or 'confusion' of an object, can only be used to indirectly judge distance, because the viewer learns to associate visual cues with tactile sensations. Berkeley gives the following analogy regarding indirect distance perception: one perceives distance indirectly just as one perceives a person's embarrassment indirectly. When looking at an embarrassed person, we infer indirectly that the person is embarrassed by observing the red color on the person's face. We know through experience that a red face tends to signal embarrassment, as we have learned to associate the two. The question concerning the visibility of space was central to the Renaissance perspective tradition and its reliance on classical optics in the development of pictorial representations of spatial depth. This matter had been debated by scholars since the 11th-century Arab polymath and mathematician Alhazen (al-Hasan Ibn al-Haytham) affirmed in experimental contexts the visibility of space. This issue, which was raised in Berkeley's theory of vision, was treated at length in the Phenomenology of Perception of Maurice Merleau-Ponty, in the context of confirming the visual perception of spatial depth (la profondeur), and by way of refuting Berkeley's thesis. Berkeley wrote about the perception of size in addition to that of distance. He is frequently misquoted as believing in size–distance invariance—a view held by the Optic Writers. This idea is that we scale the image size according to distance in a geometrical manner. The error may have become commonplace because the eminent historian and psychologist E. G. Boring perpetuated it.
In fact, Berkeley argued that the same cues that evoke distance also evoke size, and that we do not first see size and then calculate distance. It is worth quoting Berkeley's words on this issue (Section 53): What inclines men to this mistake (beside the humour of making one see by geometry) is, that the same perceptions or ideas which suggest distance, do also suggest magnitude ... I say they do not first suggest distance, and then leave it to the judgement to use that as a medium, whereby to collect the magnitude; but they have as close and immediate a connexion with the magnitude as with the distance; and suggest magnitude as independently of distance, as they do distance independently of magnitude. Berkeley claimed that his visual theories were "vindicated" by a 1728 report regarding the recovery of vision in a 13-year-old boy operated on for congenital cataracts by the surgeon William Cheselden. In 2021, the name of Cheselden's patient was published for the first time: Daniel Dolins. Berkeley knew the Dolins family and had numerous social links to Cheselden, including the poet Alexander Pope and Princess Caroline, to whom Cheselden's patient was presented. The report misspelled Cheselden's name, used language typical of Berkeley, and may even have been ghost-written by Berkeley. Unfortunately, Dolins was never able to see well enough to read, and there is no evidence that the surgery improved Dolins' vision at any point prior to his death at age 30. Philosophy of physics "Berkeley's works display his keen interest in natural philosophy [...] from his earliest writings (Arithmetica, 1707) to his latest (Siris, 1744). Moreover, much of his philosophy is shaped fundamentally by his engagement with the science of his time." The profundity of this interest can be judged from numerous entries in Berkeley's Philosophical Commentaries (1707–1708), e.g. "Mem. to Examine & accurately discuss the scholium of the 8th Definition of Mr Newton's Principia."
(#316) Berkeley argued that forces and gravity, as defined by Newton, constituted "occult qualities" that "expressed nothing distinctly". He held that those who posited "something unknown in a body of which they have no idea and which they call the principle of motion, are in fact simply stating that the principle of motion is unknown." Therefore, those who "affirm that active force, action, and the principle of motion are really in bodies are adopting an opinion not based on experience." Forces and gravity existed nowhere in the phenomenal world. On the other hand, if they resided in the category of "soul" or "incorporeal thing", they "do not properly belong to physics" as a matter. Berkeley thus concluded that forces lay beyond any kind of empirical observation and could not be a part of proper science. He proposed his theory of signs as a means to explain motion and matter without reference to the "occult qualities" of force and gravity. Berkeley's razor Berkeley's razor is a rule of reasoning proposed by the philosopher Karl Popper in his study of Berkeley's key scientific work De Motu. Berkeley's razor is considered by Popper to be similar to Ockham's razor but "more powerful". It represents an extreme, empiricist view of scientific observation that states that the scientific method provides us with no true insight into the nature of the world. Rather, the scientific method gives us a variety of partial explanations about regularities that hold in the world and that are gained through experiment. The nature of the world, according to Berkeley, is only approached through proper metaphysical speculation and reasoning. Popper summarises Berkeley's razor as such: A general practical result—which I propose to call "Berkeley's razor"—of [Berkeley's] analysis of physics allows us a priori to eliminate from physical science all essentialist explanations. 
If they have a mathematical and predictive content they may be admitted qua mathematical hypotheses (while their essentialist interpretation is eliminated). If not they may be ruled out altogether. This razor is sharper than Ockham's: all entities are ruled out except those which are perceived. In another essay of the same book titled "Three Views Concerning Human Knowledge", Popper argues that Berkeley is to be considered as an instrumentalist philosopher, along with Robert Bellarmine, Pierre Duhem and Ernst Mach. According to this approach, scientific theories have the status of serviceable fictions, useful inventions aimed at explaining facts, and without any pretension to be true. Popper contrasts instrumentalism with the above-mentioned essentialism and his own "critical rationalism". Philosophy of mathematics In addition to his contributions to philosophy, Berkeley was also very influential in the development of mathematics, although in a rather indirect sense. "Berkeley was concerned with mathematics and its philosophical interpretation from the earliest stages of his intellectual life." Berkeley's "Philosophical Commentaries" (1707–1708) bear witness to his interest in mathematics: Axiom. No reasoning about things whereof we have no idea. Therefore no reasoning about Infinitesimals. (#354) Take away the signs from Arithmetic & Algebra, & pray what remains? (#767) These are sciences purely Verbal, & entirely useless but for Practise in Societys of Men. No speculative knowledge, no comparison of Ideas in them. (#768) In 1707, Berkeley published two treatises on mathematics. In 1734, he published The Analyst, subtitled A DISCOURSE Addressed to an Infidel Mathematician, a critique of calculus. Florian Cajori called this treatise "the most spectacular event of the century in the history of British mathematics." However, a recent study suggests that Berkeley misunderstood Leibnizian calculus.
The mathematician in question is believed to have been either Edmond Halley, or Isaac Newton himself—though if to the latter, then the discourse was posthumously addressed, as Newton died in 1727. The Analyst represented a direct attack on the foundations and principles of calculus and, in particular, the notion of fluxion or infinitesimal change, which Newton and Leibniz used to develop the calculus. In his critique, Berkeley coined the phrase "ghosts of departed quantities", familiar to students of calculus. Ian Stewart's book From Here to Infinity captures the gist of his criticism. Berkeley regarded his criticism of calculus as part of his broader campaign against the religious implications of Newtonian mechanics, as a defence of traditional Christianity against deism, which tends to distance God from His worshipers. Specifically, he observed that both Newtonian and Leibnizian calculus employed infinitesimals sometimes as positive, nonzero quantities and other times as a number explicitly equal to zero. Berkeley's key point in "The Analyst" was that Newton's calculus (and the laws of motion based in calculus) lacked rigorous theoretical foundations. He claimed that In every other Science Men prove their Conclusions by their Principles, and not their Principles by the Conclusions. But if in yours you should allow your selves this unnatural way of proceeding, the Consequence would be that you must take up with Induction, and bid adieu to Demonstration. And if you submit to this, your Authority will no longer lead the way in Points of Reason and Science. Berkeley did not doubt that calculus produced real world truth; simple physics experiments could verify that Newton's method did what it claimed to do. "The cause of Fluxions cannot be defended by reason", but the results could be defended by empirical observation, Berkeley's preferred method of acquiring knowledge at any rate.
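Berkeley's complaint about the double treatment of infinitesimals can be made concrete with the textbook fluxion-style computation of the derivative of x², rendered here in modern notation purely as an illustration (the increment o stands in for Newton's evanescent quantity), followed by sketches of the two later resolutions of the difficulty, Weierstrass's limits and Robinson's standard-part map:

```latex
% Step 1: form the ratio of increments, which requires o \neq 0:
\frac{(x+o)^2 - x^2}{o} \;=\; \frac{2xo + o^2}{o} \;=\; 2x + o, \qquad o \neq 0.
% Step 2: the leftover o is then discarded, treating o as exactly 0:
2x + o \;\rightsquigarrow\; 2x.
% Berkeley's point: o is nonzero in Step 1 and zero in Step 2.

% Weierstrass's (\varepsilon, \delta) resolution: the derivative is a
% limit, so o is never set equal to 0, only made arbitrarily small:
f'(x) = L \iff \forall \varepsilon > 0\;\exists \delta > 0:\;
  0 < |o| < \delta \implies
  \left|\frac{f(x+o) - f(x)}{o} - L\right| < \varepsilon.

% Robinson's resolution: o is a genuine nonzero infinitesimal, and the
% standard part \operatorname{st}(\cdot) rigorously discards it at the end.
% For f(x) = x^2:
f'(x) = \operatorname{st}\!\left(\frac{(x+o)^2 - x^2}{o}\right)
      = \operatorname{st}(2x + o) = 2x.
```

The inconsistency between Steps 1 and 2 is exactly what the phrase "ghosts of departed quantities" targets: the quantity o must be nonzero to license the division, yet zero to be dropped.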
Berkeley, however, found it paradoxical that "Mathematicians should deduce true Propositions from false Principles, be right in Conclusion, and yet err in the Premises." In The Analyst he endeavoured to show "how Error may bring forth Truth, though it cannot bring forth Science". Newton's science, therefore, could not on purely scientific grounds justify its conclusions, and the mechanical, deistic model of the universe could not be rationally justified. The difficulties raised by Berkeley were still present in the work of Cauchy, whose approach to calculus was a combination of infinitesimals and a notion of limit, and were eventually sidestepped by Weierstrass by means of his (ε, δ) approach, which eliminated infinitesimals altogether. More recently, Abraham Robinson restored infinitesimal methods in his 1966 book Non-standard analysis by showing that they can be used rigorously. Moral philosophy The tract A Discourse on Passive Obedience (1712) is considered Berkeley's major contribution to moral and political philosophy. In A Discourse on Passive Obedience, Berkeley defends the thesis that people have "a moral duty to observe the negative precepts (prohibitions) of the law, including the duty not to resist the execution of punishment." However, Berkeley does make exceptions to this sweeping moral statement, stating that we need not observe precepts of "usurpers or even madmen" and that people can obey different supreme authorities if there is more than one claimant to the highest authority. Berkeley defends this thesis with a deductive proof stemming from the laws of nature. First, he establishes that because God is perfectly good, the end to which he commands humans must also be good, and that end must not benefit just one person, but the entire human race.
Because these commands—or laws—if practiced, would lead to the general fitness of humankind, it follows that they can be discovered by the right reason—for example, the law to never resist supreme power can be derived from reason because this law is "the only thing that stands between us and total disorder". Thus, these laws can be called the laws of nature, because they are derived from God—the creator of nature himself. "These laws of nature include duties never to resist the supreme power, lie under oath ... or do evil so that good may come of it." One may view Berkeley's doctrine on Passive Obedience as a kind of 'Theological Utilitarianism', insofar as it states that we have a duty to uphold a moral code which presumably is working towards the ends of promoting the good of humankind. However, the concept of 'ordinary' Utilitarianism is fundamentally different in that it "makes utility the one and only ground of obligation"—that is, Utilitarianism is concerned with whether particular actions are morally permissible in specific situations, while Berkeley's doctrine is concerned with whether or not we should follow moral rules in any and all circumstances. Whereas Act Utilitarianism might, for example, justify a morally impermissible act in light of the specific situation, Berkeley's doctrine of Passive Obedience holds that it is never morally permissible to not follow a moral rule, even when it seems like breaking that moral rule might achieve the happiest ends. Berkeley holds that even though the consequences of an action in a specific situation might sometimes be bad, the general tendency of that action benefits humanity. Other important sources for Berkeley's views on morality are Alciphron (1732), especially dialogues I–III, and the Discourse to Magistrates (1738). Passive Obedience is notable partly for containing one of the earliest statements of rule utilitarianism.
Immaterialism George Berkeley's theory that matter does not exist comes from the belief that "sensible things are those only which are immediately perceived by sense." Berkeley says in his book called The Principles of Human Knowledge that "the ideas of sense are stronger, livelier, and clearer than those of the imagination; and they are also steady, orderly and coherent." From this we can tell that the things that we are perceiving are truly real rather than merely a dream. All knowledge comes from perception; what we perceive are ideas, not things in themselves; a thing in itself must be outside experience; so the world only consists of ideas and minds that perceive those ideas; a thing only exists so far as it perceives or is perceived. Through this we can see that consciousness is considered something that exists to Berkeley due to its ability to perceive. "'To be,' said of the object, means to be perceived, 'esse est percipi'; 'to be', said of the subject, means to perceive or 'percipere'." Having established this, Berkeley then attacks the "opinion strangely prevailing amongst men, that houses, mountains, rivers, and in a word all sensible objects have an existence natural or real, distinct from being perceived". He believes this idea to be inconsistent because such an object with an existence independent of perception must have both sensible qualities, and thus be known (making it an idea), and also an insensible reality, which Berkeley believes is inconsistent. Berkeley believes that the error arises because people think that perceptions can imply or infer something about the material object. Berkeley calls this concept abstract ideas. He rebuts this concept by arguing that people cannot conceive of an object without also imagining the sensual input of the object.
He argues in Principles of Human Knowledge that, similar to how people can only sense matter with their senses through the actual sensation, they can only conceive of matter (or, rather, ideas of matter) through the idea of sensation of matter. This implies that everything that people can conceive in regard to matter is only ideas about matter. Thus, matter, should it exist, must exist as collections of ideas, which can be perceived by the senses and interpreted by the mind. But if matter is just a collection of ideas, then Berkeley concludes that matter, in the sense of a material substance, does not exist as most philosophers of Berkeley's time believed. Indeed, if a person visualizes something, then it must have some color, however dark or light; it cannot just be a shape of no color at all if a person is to visualize it. Berkeley's ideas raised controversy because his argument refuted Descartes' worldview, which was expanded upon by Locke, and resulted in the rejection of Berkeley's form of empiricism by several philosophers of the seventeenth and eighteenth centuries. In Locke's worldview, "the world causes the perceptual ideas we have of it by the way it interacts with our senses." This contradicts Berkeley's worldview: it suggests the existence of physical causes in the world, whereas for Berkeley there is no physical world beyond our ideas. The only causes that exist in Berkeley's worldview are those that are a result of the use of the will. Berkeley's theory relies heavily on his form of empiricism, which in turn relies heavily on the senses. His empiricism can be defined by five propositions: all significant words stand for ideas; all knowledge of things is about ideas; all ideas come from without or from within; if from without it must be by the senses, and they are called sensations (the real things); if from within they are the operations of the mind, and are called thoughts.
Berkeley clarifies his distinction between ideas by saying they "are imprinted on the senses," "perceived by attending to the passions and operations of the mind," or "are formed by help of memory and imagination." One refutation of his idea was: if someone leaves a room and stops perceiving that room, does that room no longer exist? Berkeley answers this by claiming that it is still being perceived and the consciousness that is doing the perceiving is God. (This makes Berkeley's argument hinge upon an omniscient, omnipresent deity.) This claim is the only thing holding up his argument, which is "depending for our knowledge of the world, and of the existence of other minds, upon a God that would never deceive us." Berkeley anticipates a second objection, which he refutes in Principles of Human Knowledge. He anticipates that the materialist may take a representational materialist standpoint: although the senses can only perceive ideas, these ideas resemble (and thus can be compared to) the actual, existing object. Thus, through the sensing of these ideas, the mind can make inferences as to matter itself, even though pure matter is non-perceivable. Berkeley's objection to that notion is that "an idea can be like nothing but an idea; a color or figure can be like nothing but another color or figure". Berkeley distinguishes between an idea, which is mind-dependent, and a material substance, which is not an idea and is mind-independent. As they are not alike, they cannot be compared, just as one cannot compare the color red to something that is invisible, or the sound of music to silence, other than that one exists and the other does not. This is called the likeness principle: the notion that an idea can only be like (and thus compared to) another idea. Berkeley attempted to show how ideas manifest themselves as different objects of knowledge; he also attempted to prove the existence of God through his immaterialism.
Influence Berkeley's Treatise Concerning the Principles of Human Knowledge was published three years before the publication of Arthur Collier's Clavis Universalis, which made assertions similar to Berkeley's. However, there seemed to have been no influence or communication between the two writers. German philosopher Arthur Schopenhauer once wrote of him: "Berkeley was, therefore, the first to treat the subjective starting-point really seriously and to demonstrate irrefutably its absolute necessity. He is the father of idealism ...". Berkeley is considered one of the originators of British empiricism. A linear development is often traced from three great "British Empiricists", leading from Locke through Berkeley to Hume. Berkeley influenced many modern philosophers, especially David Hume. Thomas Reid admitted that he put forward a drastic criticism of Berkeleianism after he had been an admirer of Berkeley's philosophical system for a long time (Reid, Inquiry into the Human Mind, Dedication). Berkeley's thought "made possible the work of Hume and thus Kant", notes Alfred North Whitehead. Some authors draw a parallel between Berkeley and Edmund Husserl. When Berkeley visited America, the American educator Samuel Johnson visited him, and the two later corresponded. Johnson convinced Berkeley to establish a scholarship program at Yale, and to donate a large number of books as well as his plantation to the college when the philosopher returned to England. It was one of Yale's largest and most important donations; it doubled its library holdings, improved the college's financial position and brought Anglican religious ideas and English culture into New England. Johnson also took Berkeley's philosophy and used parts of it as a framework for his own American Practical Idealism school of philosophy.
As Johnson's philosophy was taught to about half the graduates of American colleges between 1743 and 1776, and over half of the contributors to the Declaration of Independence were connected to it, Berkeley's ideas were indirectly a foundation of the American Mind. Outside of America, during Berkeley's lifetime his philosophical ideas were comparatively uninfluential. But interest in his doctrine grew from the 1870s, when Alexander Campbell Fraser, "the leading Berkeley scholar of the nineteenth century", published The Works of George Berkeley. A powerful impulse to serious studies in Berkeley's philosophy was given by A. A. Luce and Thomas Edmund Jessop, "two of the twentieth century's foremost Berkeley scholars", thanks to whom Berkeley scholarship was raised to the rank of a special area of historico-philosophical science. In addition, the philosopher Colin Murray Turbayne wrote extensively on Berkeley's use of language as a model for visual, physiological, natural and metaphysical relationships. The proportion of Berkeley scholarship in the literature on the history of philosophy is increasing. This can be judged from the most comprehensive bibliographies on George Berkeley. During the period of 1709–1932, about 300 writings on Berkeley were published. That amounted to 1.5 publications per annum. During the course of 1932–79, over one thousand works were brought out, i.e., 20 works per annum. Since then, the number of publications has reached 30 per annum. In 1977, a special journal on Berkeley's life and thought (Berkeley Studies) began publication in Ireland.
In 1988, the Australian philosopher Colin Murray Turbayne established the International Berkeley Essay Prize Competition at the University of Rochester in an effort to advance scholarship and research on the works of Berkeley. Beyond philosophy, Berkeley also influenced modern psychology with his work on John Locke's theory of association and how it could be used to explain how humans gain knowledge in the physical world. He also used the theory to explain perception, stating that all qualities were, as Locke would call them, "secondary qualities", and that perception therefore lay entirely in the perceiver and not in the object. Both topics are still studied in modern psychology. Appearances in literature Lord Byron's Don Juan references immaterialism in the Eleventh Canto: When Bishop Berkeley said 'there was no matter,' And proved it—'t was no matter what he said: They say his system 't is in vain to batter, Too subtle for the airiest human head; And yet who can believe it? I would shatter Gladly all matters down to stone or lead, Or adamant, to find the world a spirit, And wear my head, denying that I wear it. Herman Melville humorously references Berkeley in Chapter 20 of Mardi (1849), when outlining a character's belief of being on board a ghost ship: And here be it said, that for all his superstitious misgivings about the brigantine; his imputing to her something equivalent to a purely phantom-like nature, honest Jarl was nevertheless exceedingly downright and practical in all hints and proceedings concerning her. Wherein, he resembled my Right Reverend friend, Bishop Berkeley–truly, one of your lords spiritual—who, metaphysically speaking, holding all objects to be mere optical delusions, was, notwithstanding, extremely matter-of-fact in all matters touching matter itself. 
Besides being pervious to the points of pins, and possessing a palate capable of appreciating plum-puddings:—which sentence reads off like a pattering of hailstones. James Joyce references Berkeley's philosophy in the third episode of Ulysses (1922): Who watches me here? Who ever anywhere will read these written words? Signs on a white field. Somewhere to someone in your flutiest voice. The good bishop of Cloyne took the veil of the temple out of his shovel hat: veil of space with coloured emblems hatched on its field. Hold hard. Coloured on a flat: yes, that's right. Flat I see, then think distance, near, far, flat I see, east, back. Ah, see now! In commenting on a review of Ada or Ardor, author Vladimir Nabokov alludes to Berkeley's philosophy as informing his novel: And finally I owe no debt whatsoever (as Mr. Leonard seems to think) to the famous Argentine essayist and his rather confused compilation "A New Refutation of Time." Mr. Leonard would have lost less of it had he gone straight to Berkeley and Bergson. (Strong Opinions, pp. 289–90) James Boswell, in the part of his Life of Samuel Johnson covering the year 1763, recorded Johnson's opinion of one aspect of Berkeley's philosophy: After we came out of the church, we stood talking for some time together of Bishop Berkeley's ingenious sophistry to prove the non-existence of matter, and that every thing in the universe is merely ideal. I observed, that though we are satisfied his doctrine is untrue, it is impossible to refute it. I shall never forget the alacrity with which Johnson answered, striking his foot with mighty force against a large stone, till he rebounded from it,– "I refute it thus." Commemoration Both the University of California, Berkeley, and the city of Berkeley, California, were named after him, although the pronunciation has evolved to suit American English. The naming was suggested in 1866 by Frederick H. Billings, a trustee of the then College of California. 
Billings was inspired by Berkeley's Verses on the Prospect of Planting Arts and Learning in America, particularly the final stanza: "Westward the course of empire takes its way; The first four Acts already past, A fifth shall close the Drama with the day; Time's noblest offspring is the last." The Town of Berkley, currently the least populated town in Bristol County, Massachusetts, was founded on 18 April 1735 and named after the renowned philosopher. It is located 40 miles south of Boston and 25 miles north of Middletown, Rhode Island. A residential college and an Episcopal seminary at Yale University also bear Berkeley's name, as does the Berkeley Library at Trinity College Dublin. Berkeley Preparatory School in Tampa, Florida, a private school affiliated with the Episcopal Church, is also named for him. "Bishop Berkeley's Gold Medals" are two awards given annually at Trinity College Dublin, "provided outstanding merit is shown", to candidates answering a special examination in Greek. The awards were founded in 1752 by Berkeley. An Ulster History Circle blue plaque commemorating him is located in Bishop Street Within, city of Derry. Berkeley's farmhouse in Middletown, Rhode Island, is preserved as Whitehall Museum House, also known as Berkeley House, and was listed on the National Register of Historic Places in 1970. St. Columba's Chapel, located in the same town, was formerly named "The Berkeley Memorial Chapel," and the appellation still survives at the end of the formal name of the parish, "St. Columba's, the Berkeley Memorial Chapel". Veneration Berkeley is honoured together with Joseph Butler with a feast day on the liturgical calendar of the Episcopal Church (USA) on 16 June. 
Writings Original publications Arithmetica (1707) Miscellanea Mathematica (1707) Philosophical Commentaries or Common-Place Book (1707–08, notebooks) An Essay towards a New Theory of Vision (1709) A Treatise Concerning the Principles of Human Knowledge, Part I (1710) Passive Obedience, or the Christian doctrine of not resisting the Supreme Power (1712) Three Dialogues between Hylas and Philonous (1713) An Essay Towards Preventing the Ruin of Great Britain (1721) De Motu (1721) A Proposal for Better Supplying Churches in our Foreign Plantations, and for converting the Savage Americans to Christianity by a College to be erected in the Summer Islands (1725) A Sermon preached before the incorporated Society for the Propagation of the Gospel in Foreign Parts (1732) Alciphron, or the Minute Philosopher (1732) The Theory of Vision, or Visual Language, shewing the immediate presence and providence of a Deity, vindicated and explained (1733) The Analyst: a Discourse addressed to an Infidel Mathematician (1734) A Defence of Free-thinking in Mathematics, with Appendix concerning Mr. Walton's vindication of Sir Isaac Newton's Principle of Fluxions (1735) Reasons for not replying to Mr. Walton's Full Answer (1735) The Querist, containing several queries proposed to the consideration of the public (three parts, 1735–37). A Discourse addressed to Magistrates and Men of Authority (1736) Siris, a chain of philosophical reflections and inquiries, concerning the virtues of tar-water (1744). A Letter to the Roman Catholics of the Diocese of Cloyne (1745) A Word to the Wise, or an exhortation to the Roman Catholic clergy of Ireland (1749) Maxims concerning Patriotism (1750) Farther Thoughts on Tar-water (1752) Miscellany (1752) Collections The Works of George Berkeley, D.D. Late Bishop of Cloyne in Ireland. To which is added, an account of his life, and several of his letters to Thomas Prior, Esq. Dean Gervais, and Mr. Pope, &c. &c. Printed for George Robinson, Pater Noster Row, 1784. 
Two volumes. The Works of George Berkeley, D.D., formerly Bishop of Cloyne: Including Many of His Writings Hitherto Unpublished; With Prefaces, Annotations, His Life and Letters, and an Account of His Philosophy. Ed. by Alexander Campbell Fraser. In 4 Volumes. Oxford: Clarendon Press, 1901. Vol. 1 Vol. 2 Vol. 3 Vol. 4 The Works of George Berkeley. Ed. by A. A. Luce and T. E. Jessop. Nine volumes. Edinburgh and London, 1948–1957. Ewald, William B., ed., 1996. From Kant to Hilbert: A Source Book in the Foundations of Mathematics, 2 vols. Oxford University Press. 1707. Of Infinites, 16–19. 1709. Letter to Samuel Molyneaux, 19–21. 1721. De Motu, 37–54. 1734. The Analyst, 60–92. See also List of people on stamps of Ireland Solipsism "Tlön, Uqbar, Orbis Tertius" Yogacara and consciousness-only schools of thought References Sources Bibliographic resources Jessop, T. E., and Luce, A. A. A Bibliography of George Berkeley. 2nd edn., Springer, 1973. Turbayne, C. M. A Bibliography of George Berkeley 1963–1979, in: Berkeley: Critical and Interpretive Essays, Manchester, 1982, pp. 313–29. Berkeley Bibliography (1979–2010) – A Supplement to those of Jessop and Turbayne, by Silvia Parigi. A Bibliography on George Berkeley – About 300 works from the 19th century to the present day. Philosophical studies Daniel, Stephen H. (ed.), Re-examining Berkeley's Philosophy, Toronto: University of Toronto Press, 2007. Daniel, Stephen H. (ed.), New Interpretations of Berkeley's Thought, Amherst: Humanity Books, 2008. Dicker, Georges, Berkeley's Idealism. A Critical Examination, Cambridge: Cambridge University Press, 2011. Gaustad, Edwin. George Berkeley in America. New Haven: Yale University Press, 1959. Pappas, George S., Berkeley's Thought, Ithaca: Cornell University Press, 2000. Stoneham, Tom, Berkeley's World: An Examination of the Three Dialogues, Oxford University Press, 2002. Warnock, Geoffrey J., Berkeley, Penguin Books, 1953. 
Winkler, Kenneth P., The Cambridge Companion to Berkeley, Cambridge: Cambridge University Press, 2005. Attribution Further reading p. 349. "Shows a thorough mastery of the literature on Berkeley, along with very perceptive remarks about the strength and weaknesses of most of the central commentators. ... Exhibits a mastery of all the material, both primary and secondary ..." Charles Larmore, for the Editorial Board, Journal of Philosophy. R. Muehlmann is one of the Berkeley Prize Winners. New Interpretations of Berkeley's Thought. Ed. by S. H. Daniel. New York: Humanity Books, 2008, 319 pp. For reviews see: Reviewed by Marc A. Hight, Hampden–Sydney College; Reviewed by Thomas M. Lennon – Berkeley Studies 19 (2008):51–56. Edward Chaney (2000), 'George Berkeley's Grand Tours: The Immaterialist as Connoisseur of Art and Architecture', in E. Chaney, The Evolution of the Grand Tour: Anglo-Italian Cultural Relations since the Renaissance, 2nd ed. London, Routledge. Costica Bradatan (2006), The Other Bishop Berkeley. An Exercise in Reenchantment, Fordham University Press, New York Secondary literature available on the Internet Most sources listed below are suggested by Dr. Talia M. Bettcher in Berkeley: a Guide for the Perplexed (2008). See the textbook's description. Berman, David. George Berkeley: Idealism and the Man. Oxford: Clarendon Press, 1994. The Cambridge Companion to Berkeley. Ed. by Kenneth P. Winkler. Cambridge: Cambridge University Press, 2005. Daniel, Stephen H., ed. Reexamining Berkeley's Philosophy. Toronto: University of Toronto Press, 2007. Luce, A. A. Berkeley and Malebranche. A Study in the Origins of Berkeley's Thought. Oxford: Oxford University Press, 1934 (2nd edn, with additional Preface, 1967). Reviewed by: Désirée Park. Studi internazionali filosofici 3 (1971):228–30; G. J. Warnock. Journal of Philosophy 69, 15 (1972):460–62; Günter Gawlick "Menschheitsglück und Wille Gottes: Neues Licht auf Berkeleys Ethik." 
Philosophische Rundschau 1–2 (January 1973):24–42; H. M. Bracken. Eighteenth-Century Studies 3 (1973): 396–97; and Stanley Grean. Journal of the History of Philosophy 12, 3 (1974): 398–403. Roberts, John. A Metaphysics for the Mob: The Philosophy of George Berkeley. New York: Oxford University Press, 2007. – 172 p. Reviewed by Marc A. Hight, University of Tartu/Hampden–Sydney College. Russell, Bertrand. "Berkeley", in A History of Western Philosophy, Book 3, Part 1, Chapter 16. Tipton, I. C. Berkeley, The Philosophy of Immaterialism. London: Methuen, 1974. "Ian C. Tipton, one of the world's great Berkeley scholars and longtime president of the International Berkeley Society. ... Of the many works about Berkeley that were published in the twentieth century, few rival in importance his Berkeley: The Philosophy of Immaterialism ... The philosophical insight, combined with the mastery of Berkeley's texts, that Ian brought to this work make it one of the masterpieces of Berkeley scholarship. It is not surprising therefore that, when the Garland Publishing Company brought out, late in the 1980s, a 15-volume collection of major works on Berkeley, Ian's book was one of only two full-length studies of Berkeley published after 1935 to be included" (Charles J. McCracken. In Memoriam: Ian C. Tipton // The Berkeley Newsletter 17 (2006), p. 4). Winkler, Kenneth P. Berkeley: An Interpretation. Oxford: Clarendon Press, 1989. External links George Berkeley at the Eighteenth-Century Poetry Archive (ECPA) George Berkeley article by Daniel E. 
Flage in the Internet Encyclopedia of Philosophy International Berkeley Society A list of the published works by and about Berkeley as well as online links Berkeley's Life and Works Another perspective on how Berkeley framed his immaterialism Original texts and discussion concerning The Analyst controversy Contains more easily readable versions of New Theory of Vision, Principles of Human Knowledge, Three Dialogues, and Alciphron An extensive compendium of online resources, including a gallery of Berkeley's images. Electronic texts for the philosopher Charlie Dunbar Broad (1887–1971): Broad, C. D. Berkeley's Argument About Material Substance. New York: 1975 (repr. of the 1942 ed. publ. by the British Academy, London). Broad, C. D. Berkeley's Denial of Material Substance – published in The Philosophical Review, Vol. LXIII (1954). Rick Grush syllabus Empiricism (J. Locke, G. Berkeley, D. Hume) Berkeley's (1734) The Analyst – digital facsimile. 1685 births 1753 deaths Academics of Trinity College Dublin Alumni of Trinity College Dublin Fellows of Trinity College Dublin Anglican bishops of Cloyne Anglican philosophers Anglican saints Burials at Christ Church Cathedral, Oxford Deans of Derry Deans of Dromore Empiricists Enlightenment philosophers Epistemologists History of calculus Idealists Irish expatriates in England Irish natural philosophers Irish slave owners Irish religious writers Metaphysicians People educated at Kilkenny College People from County Kilkenny Philosophers of science Scholars of Trinity College Dublin 18th-century Irish philosophers 18th-century Irish writers 18th-century Irish male writers 17th-century Anglican theologians 18th-century Anglican theologians
https://en.wikipedia.org/wiki/G.%20E.%20Moore
G. E. Moore
George Edward Moore (4 November 1873 – 24 October 1958) was an English philosopher who, with Bertrand Russell, Ludwig Wittgenstein and earlier Gottlob Frege, was among the founders of analytic philosophy. He and Russell led the turn from idealism in British philosophy and became known for advocating common-sense concepts and contributing to ethics, epistemology and metaphysics. He was said to have an "exceptional personality and moral character". Ray Monk later dubbed him "the most revered philosopher of his era". As Professor of Philosophy at the University of Cambridge, he influenced but abstained from the Bloomsbury Group. He edited the journal Mind. A fellow of the British Academy from 1918, he was a member of the secretive Cambridge Apostles in 1894–1901 and chaired the Cambridge University Moral Sciences Club in 1912–1944. As a humanist, he presided over the British Ethical Union (now Humanists UK) in 1935–1936. Life George Edward Moore was born in Upper Norwood, in south-east London, on 4 November 1873, the middle child of seven of Daniel Moore, a medical doctor, and Henrietta Sturge. His grandfather was the author George Moore. His eldest brother was Thomas Sturge Moore, a poet, writer and engraver. He was educated at Dulwich College and in 1892 went up to Trinity College, Cambridge, to read classics and moral sciences. He became a Fellow of Trinity in 1898 and went on to hold the University of Cambridge chair of Mental Philosophy and Logic from 1925 to 1939. Moore is best known today for defending ethical non-naturalism, his emphasis on common sense in philosophical method, and the paradox that bears his name. He was admired by and influenced many other philosophers, as well as the Bloomsbury Group, but, unlike his colleague and admirer Russell, who for some years thought Moore fulfilled his "ideal of genius", he is mostly unknown today outside academic philosophy. 
Moore's essays are known for their clear, circumspect writing style and their methodical, patient approach to philosophical problems. He was critical of modern philosophy for its lack of progress, which he saw as a stark contrast to the dramatic advances in the natural sciences since the Renaissance. Among Moore's most famous works are his Principia Ethica, and his essays "The Refutation of Idealism", "A Defence of Common Sense", and "A Proof of the External World". Moore was an important and admired member of the secretive Cambridge Apostles, a discussion group drawn from the British intellectual elite. At the time another member, a 22-year-old Bertrand Russell, wrote, "I almost worship him as if he were a god. I have never felt such an extravagant admiration for anybody," and would later write that "for some years he fulfilled my ideal of genius. He was in those days beautiful and slim, with a look almost of inspiration as deeply passionate as Spinoza's". From 1918 to 1919 Moore chaired the Aristotelian Society, a group committed to systematic study of philosophy, its historical development and its methods and problems. G. E. Moore died at the Evelyn Nursing Home on 24 October 1958. He was cremated at Cambridge Crematorium on 28 October 1958 and his ashes interred at the Parish of the Ascension Burial Ground in the city. His wife, Dorothy Ely (1892–1977), was buried there. Together they had two sons, the poet Nicholas Moore and the composer Timothy Moore. Philosophy Ethics His influential work Principia Ethica is one of the main inspirations of the movement against ethical naturalism (see ethical non-naturalism) and is partly responsible for the twentieth-century concern with meta-ethics. The naturalistic fallacy Moore asserted that philosophical arguments can suffer from a confusion between the use of a term in a particular argument and the definition of that term (in all arguments). He named this confusion the naturalistic fallacy. 
For example, an ethical argument may claim that if a thing has certain properties, then that thing is 'good.' A hedonist may argue that 'pleasant' things are 'good' things. Other theorists may argue that 'complex' things are 'good' things. Moore contends that, even if such arguments are correct, they do not provide definitions for the term 'good'. The property of 'goodness' cannot be defined. It can only be shown and grasped. Any attempt to define it (X is good if it has property Y) will simply shift the problem (Why is Y-ness good in the first place?). Open-question argument Moore's argument for the indefinability of 'good' (and thus for the fallaciousness in the "naturalistic fallacy") is often called the open-question argument; it is presented in §13 of Principia Ethica. The argument hinges on the nature of statements such as "Anything that is pleasant is also good" and the possibility of asking questions such as "Is it good that x is pleasant?". According to Moore, these questions are open and these statements are significant; and they will remain so no matter what is substituted for "pleasure". Moore concludes from this that any analysis of value is bound to fail. In other words, if value could be analysed, then such questions and statements would be trivial and obvious. Since they are anything but trivial and obvious, value must be indefinable. Critics of Moore's arguments sometimes claim that he is appealing to general puzzles concerning analysis (cf. the paradox of analysis), rather than revealing anything special about value. The argument clearly depends on the assumption that if 'good' were definable, it would be an analytic truth about 'good', an assumption that many contemporary moral realists like Richard Boyd and Peter Railton reject. 
Other responses appeal to the Fregean distinction between sense and reference, allowing that value concepts are special and sui generis, but insisting that value properties are nothing but natural properties (this strategy is similar to that taken by non-reductive materialists in philosophy of mind). Good as indefinable Moore contended that goodness cannot be analysed in terms of any other property. In Principia Ethica, he writes: It may be true that all things which are good are also something else, just as it is true that all things which are yellow produce a certain kind of vibration in the light. And it is a fact, that Ethics aims at discovering what are those other properties belonging to all things which are good. But far too many philosophers have thought that when they named those other properties they were actually defining good; that these properties, in fact, were simply not "other," but absolutely and entirely the same with goodness. (Principia, § 10 ¶ 3) Therefore, we cannot define 'good' by explaining it in other words. We can only point to a thing or an action and say "That is good." Similarly, we cannot describe to a person born totally blind exactly what yellow is. We can only show a sighted person a piece of yellow paper or a yellow scrap of cloth and say "That is yellow." Good as a non-natural property In addition to categorising 'good' as indefinable, Moore also emphasized that it is a non-natural property. This means that it cannot be empirically or scientifically tested or verified; it is not within the bounds of "natural science". Moral knowledge Moore argued that, once arguments based on the naturalistic fallacy had been discarded, questions of intrinsic goodness could be settled only by appeal to what he (following Sidgwick) called "moral intuitions": self-evident propositions which recommend themselves to moral reflection, but which are not susceptible to either direct proof or disproof (Principia, § 45). 
As a result of his view, he has often been described by later writers as an advocate of ethical intuitionism. Moore, however, wished to distinguish his view from the views usually described as "Intuitionist" when Principia Ethica was written: Moore distinguished his view from the view of deontological intuitionists, who held that "intuitions" could determine questions about what actions are right or required by duty. Moore, as a consequentialist, argued that "duties" and moral rules could be determined by investigating the effects of particular actions or kinds of actions (Principia, § 89), and so were matters for empirical investigation rather than direct objects of intuition (Principia, § 90). On Moore's view, "intuitions" revealed not the rightness or wrongness of specific actions, but only what things were good in themselves, as ends to be pursued. Right action, duty and virtue Moore holds that right actions are those producing the most good. The difficulty with this is that the consequences of most actions are too vast for us to properly take into account, especially the long-term consequences. Because of this, Moore suggests that the definition of duty is limited to what generally produces better results than probable alternatives in a comparatively near future. Whether a given rule of action turns out to be a duty depends to some extent on the conditions of the corresponding society, but duties agree mostly with what common sense recommends. Virtues, like honesty, can in turn be defined as permanent dispositions to perform duties. Proof of an external world One of the most important parts of Moore's philosophical development was his break from the idealism that dominated British philosophy (as represented in the works of his former teachers F. H. Bradley and John McTaggart), and his defence of what he regarded as a "common sense" form of realism. 
In his 1925 essay "A Defence of Common Sense", he argued against idealism and scepticism toward the external world, on the grounds that they could not give reasons to accept that their metaphysical premises were more plausible than the reasons we have for accepting the common sense claims about our knowledge of the world, which sceptics and idealists must deny. He famously put the point into dramatic relief with his 1939 essay "Proof of an External World", in which he gave a common sense argument against scepticism by raising his right hand and saying "Here is one hand" and then raising his left and saying "And here is another", then concluding that there are at least two external objects in the world, and therefore that he knows (by this argument) that an external world exists. Not surprisingly, not everyone inclined to sceptical doubts found Moore's method of argument entirely convincing; Moore, however, defends his argument on the grounds that sceptical arguments seem invariably to require an appeal to "philosophical intuitions" that we have considerably less reason to accept than we have for the common sense claims that they supposedly refute. (In addition to fueling Moore's own work, the "Here is one hand" argument also deeply influenced Wittgenstein, who spent his last years working out a new approach to Moore's argument in the remarks that were published posthumously as On Certainty.) Moore's paradox Moore is also remembered for drawing attention to the peculiar inconsistency involved in uttering a sentence such as "It is raining, but I do not believe it is raining", a puzzle now commonly called "Moore's paradox". 
The puzzle arises because it seems impossible for anyone to consistently assert such a sentence; but there doesn't seem to be any logical contradiction between "It is raining" and "I don't believe that it is raining", because the former is a statement about the weather and the latter a statement about a person's belief about the weather, and it is perfectly logically possible that it may rain whilst a person does not believe that it is raining. In addition to Moore's own work on the paradox, the puzzle also inspired a great deal of work by Ludwig Wittgenstein, who described the paradox as the most impressive philosophical insight that Moore had ever introduced. It is said that when Wittgenstein first heard this paradox one evening (which Moore had earlier stated in a lecture), he rushed round to Moore's lodgings, got him out of bed and insisted that Moore repeat the entire lecture to him. Organic wholes Moore's description of the principle of the organic whole is nonetheless extremely straightforward, a variant on a pattern that began with Aristotle: the value of a whole must not be assumed to be the same as the sum of the values of its parts (Principia, § 18). According to Moore, a moral actor cannot survey the 'goodness' inherent in the various parts of a situation, assign a value to each of them, and then generate a sum in order to get an idea of its total value. A moral scenario is a complex assembly of parts, and its total value is often created by the relations between those parts, and not by their individual value. The organic metaphor is thus very appropriate: biological organisms seem to have emergent properties which cannot be found anywhere in their individual parts. For example, a human brain seems to exhibit a capacity for thought when none of its neurons exhibit any such capacity. In the same way, a moral scenario can have a value far greater than the sum of its component parts. 
To understand the application of the organic principle to questions of value, it is perhaps best to consider Moore's primary example, that of a consciousness experiencing a beautiful object. To see how the principle works, a thinker engages in "reflective isolation", the act of isolating a given concept in a kind of null-context and determining its intrinsic value. In our example, we can easily see that, of themselves, beautiful objects and consciousnesses are not particularly valuable things. They might have some value, but when we consider the total value of a consciousness experiencing a beautiful object, it seems to exceed the simple sum of these values. Hence the value of a whole must not be assumed to be the same as the sum of the values of its parts. Works G. E. Moore, "The Nature of Judgment" (1899) G. E. Moore, Principia Ethica (1903) G. E. Moore, "Review of Franz Brentano's The Origin of the Knowledge of Right and Wrong" (1903) G. E. Moore, "The Refutation of Idealism" (1903) G. E. Moore, "The Nature and Reality of the Objects of Perception" (1905–6) G. E. Moore, Ethics (1912) G. E. Moore, "Some Judgments of Perception" (1918) G. E. Moore, Philosophical Studies (1922) [papers published 1903–21] G. E. Moore, "The Conception of Intrinsic Value" G. E. Moore, "The Nature of Moral Philosophy" G. E. Moore, "Are the Characteristics of Things Universal or Particular?" (1923) G. E. Moore, "A Defence of Common Sense" (1925) G. E. Moore and F. P. Ramsey, Facts and Propositions (Symposium) (1927) G. E. Moore, Some Main Problems of Philosophy (1953) [lectures delivered 1910–11] G. E. Moore, Ch. 3, "Propositions" G. E. Moore, Philosophical Papers (1959) G. E. Moore, Ch. 7: "Proof of an External World" "Margin Notes by G. E. Moore on The Works of Thomas Reid (1849: With Notes by Sir William Hamilton)". G. E. Moore, The Early Essays, edited by Tom Regan, Temple University Press (1986). G. E. 
Moore, The Elements of Ethics, edited and with an introduction by Tom Regan, Temple University Press, (1991). G. E. Moore, On Defining "Good," in Analytic Philosophy: Classic Readings, Stamford, CT: Wadsworth, 2002, pp. 1–10. . See also The Right and the Good References Further reading Daval, René, Moore et la philosophie analytique, 1997, Presses Universitaires de France (PUF), (French) Tom Regan. Bloomsbury's prophet: G.E. Moore and the development of his moral philosophy, Temple University Press (1986). External links George Edward Moore – philosophypages.com The Stanford Encyclopedia of Philosophy George Edward Moore Moore's Moral Philosophy Trinity College Chapel G. E. Moore and the Cambridge School of Analysis, Thomas Baldwin, The Oxford Handbook of The History of Analytic Philosophy Open Access papers by Moore published in Proceedings of the Aristotelian Society and Aristotelian Society Supplementary Volume. 1873 births 1958 deaths 19th-century British philosophers 19th-century British writers 19th-century English philosophers 19th-century English writers 20th-century British philosophers 20th-century British writers 20th-century English writers Alumni of Trinity College, Cambridge Analytic philosophers Aristotelian philosophers British agnostics British ethicists British logicians Cambridge University Moral Sciences Club Consequentialists English humanists English logicians Epistemologists Fellows of the British Academy Fellows of Trinity College, Cambridge Linguistic turn Members of the Order of Merit Metaphysicians Moral philosophers Moral realists Ontologists People educated at Dulwich College Philosophers of culture Philosophers of education Philosophers of ethics and morality Philosophers of language Philosophers of logic Philosophers of mind Presidents of the Aristotelian Society Victorian writers People from the London Borough of Southwark Mind (journal) editors
https://en.wikipedia.org/wiki/Genus%E2%80%93differentia%20definition
Genus–differentia definition
A genus–differentia definition is a type of intensional definition, and it is composed of two parts:

a genus (or family): an existing definition that serves as a portion of the new definition; all definitions with the same genus are considered members of that genus.
the differentia: the portion of the definition that is not provided by the genus.

For example, consider these two definitions:

a triangle: a plane figure that has 3 straight bounding sides.
a quadrilateral: a plane figure that has 4 straight bounding sides.

Those definitions can be expressed as one genus and two differentiae:

one genus: the genus for both a triangle and a quadrilateral: "a plane figure"
two differentiae: the differentia for a triangle: "that has 3 straight bounding sides." the differentia for a quadrilateral: "that has 4 straight bounding sides."

The use of genus and differentia in constructing definitions goes back at least as far as Aristotle (384–322 BCE).

Differentiation and Abstraction

The process of producing new definitions by extending existing definitions is commonly known as differentiation (and also as derivation). The reverse process, by which just part of an existing definition is used itself as a new definition, is called abstraction; the new definition is called an abstraction and it is said to have been abstracted away from the existing definition. For instance, consider the following:

a square: a quadrilateral that has interior angles which are all right angles, and that has bounding sides which all have the same length.

A part of that definition may be singled out (using parentheses here):

a square: (a quadrilateral that has interior angles which are all right angles), and that has bounding sides which all have the same length.

and with that part, an abstraction may be formed:

a rectangle: a quadrilateral that has interior angles which are all right angles.
Then, the definition of a square may be recast with that abstraction as its genus:

a square: a rectangle that has bounding sides which all have the same length.

Similarly, the definition of a square may be rearranged and another portion singled out:

a square: (a quadrilateral that has bounding sides which all have the same length), and that has interior angles which are all right angles.

leading to the following abstraction:

a rhombus: a quadrilateral that has bounding sides which all have the same length.

Then, the definition of a square may be recast with that abstraction as its genus:

a square: a rhombus that has interior angles which are all right angles.

In fact, the definition of a square may be recast in terms of both of the abstractions, where one acts as the genus and the other acts as the differentia:

a square: a rectangle that is a rhombus.
a square: a rhombus that is a rectangle.

Hence, abstraction is crucial in simplifying definitions.

Multiplicity

When multiple definitions could serve equally well, then all such definitions apply simultaneously. Thus, a square is a member of both the genus [a] rectangle and the genus [a] rhombus. In such a case, it is notationally convenient to consolidate the definitions into one definition that is expressed with multiple genera (and possibly no differentia, as in the following):

a square: a rectangle and a rhombus.

or, completely equivalently:

a square: a rhombus and a rectangle.

More generally, a collection of equivalent definitions (each of which is expressed with one unique genus) can be recast as one definition that is expressed with n genera. Thus, the following:

a Definition: a Genus1 that is a Genus2 and that is a Genus3 and that is a … and that is a Genusn-1 and that is a Genusn, which has some non-genus Differentia.
a Definition: a Genus2 that is a Genus1 and that is a Genus3 and that is a … and that is a Genusn-1 and that is a Genusn, which has some non-genus Differentia.
a Definition: a Genus3 that is a Genus1 and that is a Genus2 and that is a … and that is a Genusn-1 and that is a Genusn, which has some non-genus Differentia.
…
a Definition: a Genusn-1 that is a Genus1 and that is a Genus2 and that is a Genus3 and that is a … and that is a Genusn, which has some non-genus Differentia.
a Definition: a Genusn that is a Genus1 and that is a Genus2 and that is a Genus3 and that is a … and that is a Genusn-1, which has some non-genus Differentia.

could be recast as:

a Definition: a Genus1 and a Genus2 and a Genus3 and a … and a Genusn-1 and a Genusn, which has some non-genus Differentia.

Structure

A genus of a definition provides a means by which to specify an is-a relationship:

A square is a rectangle, which is a quadrilateral, which is a plane figure, which is a …
A square is a rhombus, which is a quadrilateral, which is a plane figure, which is a …
A square is a quadrilateral, which is a plane figure, which is a …
A square is a plane figure, which is a …
A square is a …

The non-genus portion of the differentia of a definition provides a means by which to specify a has-a relationship:

A square has an interior angle that is a right angle.
A square has a straight bounding side.
A square has a …

When a system of definitions is constructed with genera and differentiae, the definitions can be thought of as nodes forming a hierarchy or, more generally, a directed acyclic graph; a node that has no predecessor is a most general definition; each node along a directed path is more differentiated (or more derived) than any one of its predecessors, and a node with no successor is a most differentiated (or a most derived) definition.
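The square/rectangle/rhombus relationships above map directly onto class inheritance. A minimal sketch in Python (the class names follow the article's examples, but the code itself is illustrative, not part of the article):

```python
class PlaneFigure:
    pass

class Quadrilateral(PlaneFigure):   # genus: plane figure; differentia: 4 straight sides
    pass

class Rectangle(Quadrilateral):     # abstraction: all interior angles are right angles
    pass

class Rhombus(Quadrilateral):       # abstraction: all bounding sides have equal length
    pass

class Square(Rectangle, Rhombus):   # two genera, no further differentia
    pass

s = Square()
# Every is-a chain from the text holds simultaneously:
assert isinstance(s, Rectangle) and isinstance(s, Rhombus)
assert isinstance(s, Quadrilateral) and isinstance(s, PlaneFigure)

# The method resolution order is one linearization of the DAG of genera:
print([c.__name__ for c in Square.__mro__])
# ['Square', 'Rectangle', 'Rhombus', 'Quadrilateral', 'PlaneFigure', 'object']
```

Here multiple inheritance plays the role of the consolidated multi-genus definition ("a square: a rectangle and a rhombus"), and the directed acyclic graph of definitions is exactly the class graph.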
When a definition, S, is the tail of each of its successors (that is, S has at least one successor and each direct successor of S is a most differentiated definition), then S is often called the species of each of its successors, and each direct successor of S is often called an individual (or an entity) of the species S; that is, the genus of an individual is synonymously called the species of that individual. Furthermore, the differentia of an individual is synonymously called the identity of that individual. For instance, consider the following definition:

[the] John Smith: a human that has the name 'John Smith'.

In this case:

The whole definition is an individual; that is, [the] John Smith is an individual.
The genus of [the] John Smith (which is "a human") may be called synonymously the species of [the] John Smith; that is, [the] John Smith is an individual of the species [a] human.
The differentia of [the] John Smith (which is "that has the name 'John Smith'") may be called synonymously the identity of [the] John Smith; that is, [the] John Smith is identified among other individuals of the same species by the fact that [the] John Smith is the one "that has the name 'John Smith'".

As in that example, the identity itself (or some part of it) is often used to refer to the entire individual, a phenomenon that is known in linguistics as a pars pro toto synecdoche.
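The species/individual/identity vocabulary of the John Smith example can be mirrored the same way; a small sketch, assuming illustrative names (`Human`, `name`) that are not part of the article's formal apparatus:

```python
class Human:                          # the species of the individual
    def __init__(self, name):
        self.name = name              # the differentia, i.e. the identity

john = Human("John Smith")            # an individual of the species Human

assert type(john) is Human            # species: the genus of the individual
assert john.name == "John Smith"      # identity: what singles him out among humans
```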
https://en.wikipedia.org/wiki/Firearm
Firearm
A firearm is any type of gun designed to be readily carried and used by an individual. The term is legally defined further in different countries (see Legal definitions). The first firearms originated in 10th-century China, when bamboo tubes containing gunpowder and pellet projectiles were mounted on spears to make the portable fire lance, operable by a single person, which was later used effectively as a shock weapon in the Siege of De'an in 1132. In the 13th century, fire lance barrels were replaced with metal tubes and transformed into the metal-barreled hand cannon. The technology gradually spread throughout Eurasia during the 14th century. Older firearms typically used black powder as a propellant, but modern firearms use smokeless powder or other propellants. Most modern firearms (with the notable exception of smoothbore shotguns) have rifled barrels to impart spin to the projectile for improved flight stability. Modern firearms can be described by their caliber (i.e. bore diameter). For pistols and rifles this is given in millimeters or inches (e.g. 7.62mm or .308 in.), or in the case of shotguns by their gauge (e.g. 12 ga. and 20 ga.). They are also described by the type of action employed (e.g. muzzleloader, breechloader, lever, bolt, pump, revolver, semi-automatic, fully automatic, etc.), together with the usual means of deportment (i.e. hand-held or mechanical mounting). Further classification may make reference to the type of barrel used (i.e. rifled) and to the barrel length (e.g. 24 inches), to the firing mechanism (e.g. matchlock, wheellock, flintlock, or percussion lock), to the design's primary intended use (e.g. hunting rifle), or to the commonly accepted name for a particular variation (e.g. Gatling gun). Shooters aim firearms at their targets with hand-eye coordination, using either iron sights or optical sights. 
The accurate range of pistols generally does not exceed , while most rifles are accurate to using iron sights, or to longer ranges using optical sights. (Firearm rounds may be dangerous or lethal well beyond their accurate range; the minimum distance for safety is much greater than the specified range for accuracy.) Purpose-built sniper rifles and anti-materiel rifles are accurate to ranges of more than . Types A firearm is a barreled ranged weapon that inflicts damage on targets by launching one or more projectiles driven by rapidly expanding high-pressure gas produced by exothermic combustion (deflagration) of a chemical propellant, historically black powder, now smokeless powder. In the military, firearms are categorized into "heavy" and "light" weapons according to their portability by infantry. Light firearms are those that can be readily carried by an individual foot soldier, though they might still require more than one individual (crew-served) to achieve optimal operational capacity. Heavy firearms are those that are too large and heavy to be transported on foot, or too unstable against recoil, and thus require the support of a weapons platform (e.g. a fixed mount, wheeled carriage, vehicle, aircraft or water vessel) to be tactically mobile or useful. The subset of light firearms that only use kinetic projectiles and are compact enough to be operated to full capacity by a single infantryman (individual-served) are also referred to as "small arms". Such firearms include handguns such as revolvers, pistols and derringers, and long guns such as rifles (including many subtypes such as anti-materiel rifles, sniper rifles/designated marksman rifles, battle rifles, assault rifles and carbines), shotguns, submachine guns/personal defense weapons and squad automatic weapons/light machine guns.
The world's top firearms manufacturers include Browning, Remington, Colt, Ruger, Smith & Wesson, Savage, Mossberg (USA), Heckler & Koch, SIG Sauer, Walther (Germany), ČZUB (Czech Republic), Glock, Steyr-Mannlicher (Austria), FN Herstal (Belgium), Beretta (Italy), Norinco (China), Tula Arms and Kalashnikov (Russia), while former top producers included Mauser, Springfield Armory, and Rock Island Armory under Armscor (Philippines). The Small Arms Survey reported that there were over one billion firearms distributed globally, of which 857 million (about 85 percent) were in civilian hands. U.S. civilians alone account for 393 million (about 46 percent) of the worldwide total of civilian-held firearms. This amounts to "120.5 firearms for every 100 residents." The world's armed forces control about 133 million (about 13 percent) of the global total of small arms, of which over 43 percent belong to two countries: the Russian Federation (30.3 million) and China (27.5 million). Law enforcement agencies control about 23 million (about 2 percent) of the global total of small arms. Configuration Handguns Handguns are guns that can be used with a single hand, and are the smallest of all firearms. However, the legal definition of a "handgun" varies between countries and regions. For example, in South African law, a "handgun" means a pistol or revolver which can be held in and discharged with one hand. In Australia, gun law defines a handgun as a firearm carryable or concealable about the person; or capable of being raised and fired by one hand; or not exceeding . In the United States, Title 18 and the ATF define a handgun as a firearm which has a short stock and is designed to be held and fired by the use of a single hand. There are two common types of handguns: revolvers and semi-automatic pistols.
Revolvers have a number of firing chambers or "charge holes" in a revolving cylinder; each chamber in the cylinder is loaded with a single cartridge or charge. Semi-automatic pistols have a single fixed firing-chamber machined into the rear of the barrel, and a magazine so they can be used to fire more than one round. Each press of the trigger fires a cartridge, using the energy of the cartridge to activate a mechanism so that the next cartridge may be fired immediately. This is opposed to "double-action" revolvers, which accomplish the same end using a mechanical action linked to the trigger pull. With the invention of the revolver in 1818, handguns capable of holding multiple rounds became popular. Certain designs of auto-loading pistol appeared beginning in the 1870s and had largely supplanted revolvers in military applications by the end of World War I. By the end of the 20th century, most handguns carried regularly by military, police and civilians were semi-automatic, although revolvers were still widely used. Generally speaking, military and police forces use semi-automatic pistols due to their high magazine capacities and ability to rapidly reload by simply removing the empty magazine and inserting a loaded one. Revolvers are very common among handgun hunters because revolver cartridges are usually more powerful than similar caliber semi-automatic pistol cartridges (which are designed for self-defense) and the strength, simplicity and durability of the revolver design is well-suited to outdoor use. Revolvers, especially in .22 LR and 38 Special/357 Magnum, are also common concealed weapons in jurisdictions allowing this practice because their simple mechanics make them smaller than many autoloaders while remaining reliable. Both designs are common among civilian gun owners, depending on the owner's intention (self-defense, hunting, target shooting, competitions, collecting, etc.). 
Long guns A long gun is any firearm with a notably long barrel, typically a length of (there are restrictions on minimum barrel length in many jurisdictions; maximum barrel length is usually a matter of practicality). Unlike a handgun, long guns are designed to be held and fired with both hands, while braced against either the hip or the shoulder for better stability. The receiver and trigger group is mounted into a stock made of wood, plastic, metal, or composite material, which has sections that form a foregrip, rear grip, and optionally (but typically) a shoulder mount called the butt. Early long arms, from the Renaissance up to the mid-19th century, were generally smoothbore firearms that fired one or more ball shot, called muskets or arquebus depending on caliber and firing mechanism. Rifles and shotguns Most modern long guns are either rifles or shotguns. Both are the successors of the musket, diverging from their parent weapon in distinct ways. A rifle is so named for the spiral grooves (riflings) machined into the inner (bore) surface of its barrel, which imparts a gyroscopically-stabilizing spin to the bullets that it fires. Shotguns are predominantly smoothbore firearms designed to fire a number of shot in each discharge; pellet sizes commonly ranging between 2 mm #9 birdshot and 8.4 mm #00 (double-aught) buckshot. Shotguns are also capable of firing single solid projectiles called slugs, or specialty (often "less lethal") rounds such as bean bags, tear gas or breaching rounds. Rifles produce a single point of impact with each firing but a long range and high accuracy; while shotguns produce a cluster of impact points with considerably less range and accuracy. However, the larger impact area of shotguns can compensate for reduced accuracy, since shot spreads during flight; consequently, in hunting, shotguns are generally used for fast-flying game birds. 
Rifles and shotguns are commonly used for hunting and often also for home defense, security guard and law enforcement. Usually, large game are hunted with rifles (although shotguns can be used, particularly with slugs), while birds are hunted with shotguns. Shotguns are sometimes preferred for defending a home or business due to their wide impact area, multiple wound tracks (when using buckshot), shorter range, and reduced penetration of walls (when using lighter shot), which significantly reduces the likelihood of unintended harm, although the handgun is also common. There are a variety of types of rifles and shotguns based on the method in which they are reloaded. Bolt-action and lever-action rifles are manually operated. Manipulation of the bolt or the lever causes the spent cartridge to be removed, the firing mechanism recocked, and a fresh cartridge inserted. These two types of action are almost exclusively used by rifles. Slide-action (commonly called 'pump-action') rifles and shotguns are manually cycled by shuttling the foregrip of the firearm back and forth. This type of action is typically used by shotguns, but several major manufacturers make rifles that use this action. Both rifles and shotguns also come in break-action varieties that do not have any kind of reloading mechanism at all but must be hand-loaded after each shot. Both rifles and shotguns come in single- and double-barreled varieties; however, due to the expense and difficulty of manufacturing, double-barreled rifles are rare. Double-barreled rifles are typically intended for African big-game hunts where the animals are dangerous, ranges are short, and speed is of the essence. Very large and powerful calibers are normal for these firearms. Rifles have been in nationally featured marksmanship events in Europe and the United States since at least the 18th century, when rifles were first becoming widely available. 
One of the earliest purely "American" rifle-shooting competitions took place in 1775, when Daniel Morgan was recruiting sharpshooters in Virginia for the impending American Revolutionary War. In some countries, rifle marksmanship is still a matter of national pride. Some specialized rifles in the larger calibers are claimed to have an accurate range of up to about , although most have considerably less. In the second half of the 20th century, competitive shotgun sports became perhaps even more popular than riflery, largely due to the motion and immediate feedback in activities such as skeet, trap and sporting clays. In military use, bolt-action rifles with high-power scopes are common as sniper rifles; however, by the Korean War the traditional bolt-action and semi-automatic rifles used by infantrymen had been supplemented by select-fire designs known as automatic rifles. Carbines A carbine is a firearm similar to a rifle in form and intended usage, but generally shorter or smaller than the typical "full-size" hunting or battle rifle of a similar time period, and sometimes using a smaller or less-powerful cartridge. Carbines were and are typically used by members of the military in roles that are expected to engage in combat, but where a full-size rifle would be an impediment to the primary duties of that soldier (vehicle drivers, field commanders and support staff, airborne troops, engineers, etc.). Carbines are also common in law enforcement and among civilian owners where similar size, space and/or power concerns may exist. Carbines, like rifles, can be single-shot, repeating-action, semi-automatic or select-fire/fully automatic, generally depending on the time period and intended market. Common historical examples include the Winchester Model 1892, Lee–Enfield "Jungle Carbine", SKS, M1 carbine (no relation to the larger M1 Garand) and M4 carbine (a more compact variant of the current M16 rifle). Modern U.S.
civilian carbines include compact customizations of the AR-15, Ruger Mini-14, Beretta Cx4 Storm, Kel-Tec SUB-2000, bolt-action rifles generally falling under the specifications of a scout rifle, and aftermarket conversion kits for popular pistols including the M1911 and Glock models. Machine guns A machine gun is a fully automatic firearm, most often separated from other classes of automatic weapons by the use of belt-fed ammunition (though some designs employ drum, pan or hopper magazines), generally in a rifle-inspired caliber ranging between 5.56×45mm NATO (.223 Remington) for a light machine gun to as large as .50 BMG or even larger for crewed or aircraft weapons. Although not widely fielded until World War I, early machine guns were being used by militaries in the second half of the 19th century. Notables in the U.S. arsenal during the 20th century included the M2 Browning .50 caliber heavy machine gun, M1919 Browning .30 caliber medium machine gun, and the M60 7.62×51mm NATO general-purpose machine gun which came into use around the Vietnam War. Machine guns of this type were originally defensive firearms crewed by at least two men, mainly because of the difficulties involved in moving and placing them, their ammunition, and their tripod. In contrast, modern light machine guns such as the FN Minimi are often wielded by a single infantryman. They provide a large ammunition capacity and a high rate of fire, and are typically used to give suppressing fire during infantry movement. Accuracy on machine guns varies based on a wide number of factors from design to manufacturing tolerances, most of which have been improved over time. Machine guns are often mounted on vehicles or helicopters and have been used since World War I as offensive firearms in fighter aircraft and tanks (e.g. for air combat or suppressing fire for ground troop support). The definition of a machine gun is different in U.S. law. 
The National Firearms Act and Firearm Owners Protection Act define a "machine gun" in the United States Code, Title 26, Subtitle E, Chapter 53, Subchapter B, Part 1, § 5845 as: "... any firearm which shoots ... automatically more than one shot, without manual reloading, by a single function of the trigger". "Machine gun" is therefore largely synonymous with "automatic weapon" in U.S. civilian parlance, covering all automatic firearms. Sniper rifles The definition of a sniper rifle is disputed among military, police and civilian observers alike; however, most generally define a "sniper rifle" as a high-powered, semi-automatic or bolt-action precision rifle with an accurate range greater than that of a standard rifle. These are often purpose-built for their applications. For example, a police sniper rifle may differ in specification from a military rifle. Police snipers generally do not engage targets at extreme range, but rather targets at medium range. They may also have multiple targets within the shorter range, and thus a semi-automatic model is preferred to a bolt action. They may also be more compact than mil-spec rifles, as police marksmen may need greater portability. On the other hand, a military rifle is more likely to use a higher-powered cartridge to defeat body armor or medium-light cover. Military rifles are somewhat more commonly bolt-action, as they are simpler to build and maintain; with fewer moving parts overall, they are also much more reliable under adverse conditions. They may also mount a more powerful scope to acquire targets further away. Sniper units did not become prominent until World War I, when the Germans demonstrated their usefulness on the battlefield. Since then, they have become irrevocably embedded in warfare. Examples of sniper rifles include the Accuracy International AWM, Sako TRG-42 and the CheyTac M200. Examples of specialized sniper cartridges include the .338 Lapua Magnum, .300 Winchester Magnum, and .408 CheyTac rounds.
Submachine guns A submachine gun is a magazine-fed firearm, usually smaller than other automatic firearms, that fires pistol-caliber ammunition; for this reason certain submachine guns can also be referred to as machine pistols, especially when referring to handgun-sized designs such as the Škorpion vz. 61 and Glock 18. Well-known examples are the Israeli Uzi and Heckler & Koch MP5, which use the 9×19mm Parabellum cartridge, and the American Thompson submachine gun, which fires .45 ACP. Because of their small size and limited projectile penetration compared to high-power rifle rounds, submachine guns are commonly favored by military, paramilitary and police forces for close-quarters engagements such as inside buildings, in urban areas or in trench complexes. Submachine guns were originally about the size of carbines. Because they fire pistol ammunition, they have limited long-range use, but in close combat they can be fired in fully automatic mode in a controllable manner due to the lighter recoil of the pistol ammunition. They are also extremely inexpensive and simple to build in time of war, enabling a nation to quickly arm its military. In the latter half of the 20th century, submachine guns were miniaturized to the point of being only slightly larger than some large handguns. The most widely used submachine gun at the end of the 20th century was the Heckler & Koch MP5. The MP5 is actually designated as a "machine pistol" by Heckler & Koch (MP5 stands for Maschinenpistole 5, or Machine Pistol 5), although some reserve this designation for even smaller submachine guns such as the MAC-10 and Glock 18, which are about the size and shape of pistols.
infantry weapon of this type, and was generally used for suppressive or support fire in the role now usually filled by the light machine gun. Other early automatic rifles include the Fedorov Avtomat and the Huot Automatic Rifle. Later, German forces fielded the Sturmgewehr 44 during World War II, a light automatic rifle firing a reduced power "intermediate cartridge". This design was to become the basis for the "assault rifle" subclass of automatic weapons, as contrasted with "battle rifles", which generally fire a traditional "full-power" rifle cartridge. Assault rifles In World War II, Germany introduced the StG 44, and brought to the forefront of firearm technology what eventually became the class of firearm most widely adopted by the military, the assault rifle. An assault rifle is usually slightly smaller than a battle rifle such as the American M14, but the chief differences defining an assault rifle are select-fire capability and the use of a rifle round of lesser power, known as an intermediate cartridge. Soviet engineer Mikhail Kalashnikov quickly adapted the German concept, using a less-powerful 7.62×39mm cartridge derived from the standard 7.62×54mmR Russian battle rifle round, to produce the AK-47, which has become the world's most widely used assault rifle. Soon after World War II, the Automatic Kalashnikov AK-47 assault rifle began to be fielded by the Soviet Union and its allies in the Eastern Bloc, as well as by nations such as China, North Korea, and North Vietnam. In the United States, the assault rifle design was later in coming; the replacement for the M1 Garand of WWII was another John Garand design chambered for the new 7.62×51mm NATO cartridge; the select-fire M14, which was used by the U.S. military until the 1960s. 
The significant recoil of the M14 when fired in full-automatic mode was seen as a problem as it reduced accuracy, and in the 1960s it was replaced by Eugene Stoner's AR-15, which also marked a switch from the powerful .30 caliber cartridges used by the U.S. military up until early in the Vietnam War to the much less powerful but far lighter and lighter-recoiling .223 caliber (5.56mm) intermediate cartridge. The military later designated the AR-15 as the "M16". The civilian version of the M16 continues to be known as the AR-15 and looks exactly like the military version, although to conform to ATF regulations in the U.S., it lacks the mechanism that permits fully automatic fire. Variants of both the M16 and AK-47 are still in wide international use today, though other automatic rifle designs have since been introduced. A smaller version of the M16A2, the M4 carbine, is widely used by U.S. and NATO tank and vehicle crews, airborne troops, support staff, and in other scenarios where space is limited. The IMI Galil, an Israeli-designed weapon based on the action of the AK-47, is in use by Israel, Italy, Burma, the Philippines, Peru, and Colombia. Swiss Arms of Switzerland produces the SIG SG 550 assault rifle used by France, Chile, and Spain among others, and Steyr Mannlicher produces the AUG, a bullpup rifle in use in Austria, Australia, New Zealand, Ireland, and Saudi Arabia among other nations. Modern designs call for compact weapons retaining firepower. The bullpup design, by mounting the magazine behind the trigger, unifies the accuracy and firepower of the traditional assault rifle with the compact size of the submachine gun (though submachine guns are still used); examples are the French FAMAS and the British SA80. Personal defense weapons A recently developed class of firearm is the personal defense weapon or PDW, which is in simplest terms a submachine gun designed to fire ammunition with ballistic performance similar to that of rifle cartridges.
While a submachine gun is desirable for its compact size and ammunition capacity, its pistol cartridges lack the penetrating capability of a rifle round. Conversely, rifle bullets can pierce light armor and are easier to shoot accurately, but even a carbine such as the Colt M4 is larger and/or longer than a submachine gun, making it harder to maneuver in close quarters. The solution many firearms manufacturers have presented is a weapon resembling a submachine gun in size and general configuration, but which fires a higher-powered armor-penetrating round (often specially designed for the weapon), thus combining the advantages of a carbine and submachine gun. This has also earned PDWs the infrequently used nickname of "submachine carbines". The FN P90 and Heckler & Koch MP7 are the best-known examples of PDWs. Battle rifles Battle rifles are another subtype of rifle, usually defined as selective-fire rifles that use full-power rifle cartridges, examples of which include the 7.62×51mm NATO, 7.92×57mm Mauser, and 7.62×54mmR. These serve similar purposes as assault rifles, as both are usually employed by ground infantry. However, some prefer battle rifles due to their more powerful cartridge, despite the added recoil. Some semi-automatic sniper rifles are configured from battle rifles. Function Firearms are also categorized by their functioning cycle or "action", which describes the loading, firing, and unloading cycle. Manual Manual operation is the earliest action type in the evolution of the firearm, and there are many types of manual-action firearms. These can be divided into two basic categories: single shot and repeating. A single-shot firearm can only be fired once per equipped barrel before it must be reloaded or charged via an external mechanism or series of steps. A repeating firearm can be fired multiple times, but can only be fired once with each subsequent pull of the trigger. Between trigger pulls, the firearm's action must be reloaded or charged via an internal mechanism.
Lever action A lever-action firearm has a lever that is pulled down and then back up to eject the spent cartridge and load a new round. Pump action Pump action weapons are primarily shotguns. In a pump action, the user slides a lever (usually a grip) back and forth, chambering a new round while ejecting the spent one. Semi-automatic A semi-automatic, self-loading, or "auto loader" firearm is one that performs all steps necessary to prepare it for firing again after a single discharge, until cartridges are no longer available in the weapon's feed device or magazine. Auto loaders fire one round with each pull of the trigger. Some people confuse the term with "fully automatic" firearms. (See next.) While some semi-automatic rifles may resemble military-style firearms, they are not properly classified as "assault weapons", a term that refers to firearms that continue to fire until the trigger is no longer depressed. Automatic An automatic firearm, or "fully automatic", "fully auto", or "full auto", is generally defined as one that continues to load and fire cartridges from its magazine as long as the trigger is depressed (and until the magazine is depleted of available ammunition). The first weapon generally considered in this category is the Gatling gun, originally a carriage-mounted, crank-operated firearm with multiple rotating barrels that was fielded in the American Civil War. The modern trigger-actuated machine gun began with various designs developed in the late 19th century and fielded in World War I, such as the Maxim gun, Lewis gun, and MG 08 "Spandau". Most automatic weapons are classed as long guns (as the ammunition used is of a similar type as for rifles, and the recoil of the weapon's rapid fire is better controlled with two hands), but handgun-sized automatic weapons also exist, generally in the "submachine gun" or "machine pistol" class.
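The practical difference between these firing behaviors, plus the burst mode discussed below, can be sketched as a toy model of rounds expended per trigger pull. This is purely illustrative; the function and its parameter names are invented for this sketch and do not come from any firearms reference:

```python
# Toy model: rounds expended for a single trigger pull, by firing mode.
# "semi" fires one round per pull; "burst" fires a fixed small group;
# "auto" keeps firing for as long as the trigger is held (modeled here
# as a number of action cycles) or until the magazine runs dry.

def rounds_fired(mode, trigger_held_cycles, magazine, burst_size=3):
    """Rounds expended for one trigger pull held for
    `trigger_held_cycles` action cycles, with `magazine` rounds left."""
    if mode == "semi":
        wanted = 1
    elif mode == "burst":
        wanted = burst_size
    elif mode == "auto":
        wanted = trigger_held_cycles
    else:
        raise ValueError(f"unknown mode: {mode}")
    return min(wanted, magazine)

print(rounds_fired("semi", 10, 30))   # 1
print(rounds_fired("burst", 10, 30))  # 3
print(rounds_fired("auto", 10, 30))   # 10
print(rounds_fired("auto", 50, 30))   # 30 (magazine runs dry)
```

The `min(wanted, magazine)` clamp captures the point made above that even a fully automatic weapon fires only "until the magazine is depleted of available ammunition".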
Selective fire Selective fire, or "select fire", means the capability of a weapon's fire control to be switched between semi-automatic, fully automatic, and burst (typically three-round) firing modes. The modes are chosen by means of a selector, which varies depending on the weapon's design. Some selective-fire weapons have burst fire mechanisms built in to limit the maximum number of shots fired in fully automatic mode, with the most common limits being two or three rounds per trigger pull. The presence of selective-fire modes on firearms allows more efficient use of ammunition for specific tactical needs, either precision-aimed or suppressive fire. This capability is most commonly found on military weapons of the 20th and 21st centuries, most notably assault rifles. History The first primitive firearms were invented about 1250 AD in China when the man-portable fire lance (a bamboo or metal tube that could shoot ignited gunpowder) was combined with projectiles such as scrap metal, broken porcelain, or darts/arrows. An early depiction of a firearm is a sculpture from a cave in Sichuan, China. The sculpture dates to the 12th century and represents a figure carrying a vase-shaped bombard, with flames and a cannonball coming out of it. The oldest surviving gun, a hand cannon made of bronze, has been dated to 1288 because it was discovered at a site in modern-day Acheng District, Heilongjiang, China, where the Yuan Shi records that battles were fought at that time. The firearm had a 6.9 inch barrel of 1-inch diameter, a 2.6 inch chamber for the gunpowder, and a socket for the firearm's handle. It is 13.4 inches long and weighs 7.8 pounds without the handle, which would have been made of wood. The Arabs and Mamluks had firearms in the late-13th century. Europeans obtained firearms in the 14th century. The Koreans adopted firearms from the Chinese in the 14th century.
The Iranians (first Aq Qoyunlu and Safavids) and Indians (first Mughals) all acquired them no later than the 15th century, from the Ottoman Turks. The people of the Nusantara archipelago of Southeast Asia used the long arquebus at least by the last quarter of the 15th century. Even though the knowledge of making gunpowder-based weapons in the Nusantara archipelago had been known after the failed Mongol invasion of Java (1293), and the predecessor of firearms, the pole gun (bedil tombak), was recorded as being used by Java in 1413, the knowledge of making "true" firearms came much later, after the middle of the 15th century. It was brought by the Islamic nations of West Asia, most probably the Arabs. The precise year of introduction is unknown, but it may be safely concluded to be no earlier than 1460. Before the arrival of the Portuguese in Southeast Asia, the natives already possessed firearms, the Java arquebus. The technology of firearms in Southeast Asia further improved after the Portuguese capture of Malacca (1511). Starting in 1513, the traditions of German-Bohemian gun-making merged with Turkish gun-making traditions. This resulted in the Indo-Portuguese tradition of matchlocks. Indian craftsmen modified the design by introducing a very short, almost pistol-like buttstock held against the cheek, not the shoulder, when aiming. They also reduced the caliber and made the gun lighter and more balanced. This suited the Portuguese, who did much of their fighting aboard ships and river craft and valued a more compact gun. The Malaccan gunfounders, said to be on a level with those of Germany, quickly adapted these new firearms, and thus a new type of arquebus, the istinggar, appeared. The Japanese did not acquire firearms until the 16th century, and then from the Portuguese rather than from the Chinese. Developments in firearms accelerated during the 19th and 20th centuries.
Breech-loading became more or less a universal standard for the reloading of most hand-held firearms and continues to be so, with some notable exceptions (such as mortars). Instead of loading individual rounds into weapons, magazines holding multiple munitions were adopted, which aided rapid reloading. Automatic and semi-automatic firing mechanisms meant that a single soldier could fire many more rounds in a minute than a vintage weapon could fire over the course of a battle. Polymers and alloys in firearm construction made weaponry progressively lighter and thus easier to deploy. Ammunition changed over the centuries from simple metallic ball-shaped projectiles that rattled down the barrel to bullets and cartridges manufactured to high precision. In the past century especially, particular attention has focused on accuracy and sighting, making firearms altogether far more accurate than ever before. More than any other single factor, though, firearms have proliferated due to the advent of mass production, enabling arms-manufacturers to produce large quantities of weaponry to a consistent standard. Velocities of bullets increased with the use of a "jacket" of metals such as copper or copper alloys that covered a lead core and allowed the bullet to glide down the barrel more easily than exposed lead. Such bullets are designated as "full metal jacket" (FMJ). FMJ bullets are less likely to fragment on impact and are more likely to pass through a target while imparting less energy. Hence, FMJ bullets impart less tissue damage than non-jacketed bullets that expand. This led to their adoption for military use by countries adhering to the Hague Convention of 1899. That said, the basic principle behind firearm operation remains unchanged to this day. A musket of several centuries ago is still similar in principle to a modern-day assault rifle, using the expansion of gases to propel projectiles over long distances, albeit less accurately and rapidly.
Evolution Early models Fire lances The Chinese fire lance from the 10th century was the direct predecessor to the modern concept of the firearm. It was not a gun itself, but an addition to soldiers' spears. Originally it consisted of paper or bamboo barrels which would contain incendiary gunpowder that could be lit one time and which would project flames at the enemy. Sometimes Chinese troops would place small projectiles within the barrel that would also be projected when the gunpowder was lit, but most of the explosive force would create flames. Later, the barrel was changed to be made of metal, so that a more explosive gunpowder could be used and put more force into the propulsion of projectiles. Hand cannons The original predecessors of all firearms, the Chinese fire lance and hand cannon, were loaded with gunpowder and the shot (initially lead shot, later replaced by cast iron) through the muzzle, while a fuse was placed at the rear. This fuse was lit, causing the gunpowder to ignite and propel the projectiles. In military use, the standard hand cannon was tremendously powerful, while also being somewhat erratic due to the relative inability of the gunner to aim the weapon, or to control the ballistic properties of the projectile. Recoil could be absorbed by bracing the barrel against the ground using a wooden support, the forerunner of the stock. Neither the quality nor the amount of gunpowder, nor the consistency of projectile dimensions, was controlled, with resulting inaccuracy in firing due to windage, variance in gunpowder composition, and the difference in diameter between the bore and the shot. Hand cannons were replaced by lighter carriage-mounted artillery pieces, and ultimately by the arquebus. In the 1420s gunpowder was used to propel missiles from hand-held tubes during the Hussite revolt in Bohemia. Muskets Muzzle-loading muskets (smooth-bored long guns) were among the first firearms developed.
The firearm was loaded through the muzzle with gunpowder, optionally with some wadding, and then with a bullet (usually a solid lead ball, but musketeers could shoot stones when they ran out of bullets). Greatly improved muzzleloaders (usually rifled instead of smooth-bored) are manufactured today and have many enthusiasts, many of whom hunt large and small game with their guns. Muzzleloaders have to be manually reloaded after each shot; a skilled archer could fire multiple arrows faster than most early muskets could be reloaded and fired, although by the mid-18th century, when muzzleloaders became the standard small arm of the military, a well-drilled soldier could fire six rounds in a minute using prepared cartridges in his musket. Before then, the effectiveness of muzzleloaders was hindered both by the low reloading speed and, before the firing mechanism was perfected, by the very high risk posed by the firearm to the person attempting to fire it. One interesting solution to the reloading problem was the "Roman Candle Gun" with superposed loads. This was a muzzleloader in which multiple charges and balls were loaded one on top of the other, with a small hole in each ball to allow the subsequent charge to be ignited after the one ahead of it was ignited. It was neither a very reliable nor popular firearm, but it enabled a form of "automatic" fire long before the advent of the machine gun. Loading techniques Most early firearms were muzzle-loading. This form of loading has several disadvantages, such as a slow rate of fire and having to expose oneself to enemy fire to reload, as the weapon had to be pointed upright so the powder could be poured through the muzzle into the breech, followed by ramming the projectile into the breech. As effective methods of sealing the breech developed along with sturdy, weatherproof, self-contained metallic cartridges, muzzle-loaders were replaced by single-shot breech loaders.
Eventually single-shot weapons were replaced by the following repeater-type weapons. Internal magazines Many firearms made from the late-19th century through the 1950s used internal magazines to load the cartridge into the chamber of the weapon. The most notable and revolutionary weapons of this period appeared during the U.S. Civil War of 1861-1865: the Spencer and Henry repeating rifles. Both used fixed tubular magazines, the former having the magazine in the buttstock and the latter under the barrel, which allowed a larger capacity. Later weapons used fixed box magazines that could not be removed from the weapon without disassembling the weapon itself. Fixed magazines permitted the use of larger cartridges and eliminated the hazard of having the bullet of one cartridge resting against the primer or rim of another cartridge. These magazines are loaded while they are in the weapon, often using a stripper clip. A clip is used to transfer cartridges into the magazine. Some notable weapons that use internal magazines include the Mosin–Nagant, the Mauser Kar 98k, the Springfield M1903, the M1 Garand, and the SKS. Firearms that have internal magazines are usually, but not always, rifles. Some exceptions to this include the Mauser C96 pistol, which uses an internal magazine, and the Breda 30, an Italian light machine-gun. Detachable magazines Many modern firearms use what are called detachable or box magazines as their method of chambering a cartridge. Detachable magazines can be removed from the weapon without disassembling the firearm, usually by pushing a magazine release. Belt-fed weapons A belt or ammunition belt, a device used to retain and feed cartridges into a firearm, is commonly used with machine guns. Belts were originally composed of canvas or cloth with pockets spaced evenly to allow the belt to be mechanically fed into the gun. These designs were prone to malfunctions due to the effects of oil and other contaminants altering the belt.
Later belt designs used permanently connected metal links to retain the cartridges during feeding. These belts were more tolerant of exposure to solvents and oil. Notable weapons that use belts include the M240, the M249, the M134 Minigun, and the PK Machine Gun. Firing mechanisms Matchlock Matchlocks were the first and simplest firearm firing mechanisms developed. In the matchlock mechanism, the powder in the gun barrel was ignited by a piece of burning cord called a "match". The match was wedged into one end of an S-shaped piece of steel. When the trigger (often actually a lever) was pulled, the match was brought into the open end of a "touch hole" at the base of the gun barrel, which contained a very small quantity of gunpowder, igniting the main charge of gunpowder in the gun barrel. The match usually had to be relit after each firing. The main parts of the matchlock firing mechanism are the pan, match, arm, and trigger. A benefit of moving the pan and arm swivel to the side of the gun was that it gave the shooter a clear line of fire. An advantage of the matchlock firing mechanism was that it did not misfire. However, it also came with some disadvantages. One disadvantage involved weather: in rain the match could not be kept lit to fire the weapon. Another issue with the match was that it could give away the position of soldiers because of its glow, sound, and smell. While European pistols were equipped with wheellock and flintlock mechanisms, Asian pistols used matchlock mechanisms. Wheellock The wheellock action, a successor to the matchlock, predated the flintlock. Despite its many faults, the wheellock was a significant improvement over the matchlock in terms of both convenience and safety, since it eliminated the need to keep a smoldering match in proximity to loose gunpowder.
It operated using a small wheel (much like that on a cigarette lighter) which was wound up with a key before use and which, when the trigger was pulled, spun against a flint, creating the shower of sparks that ignited the powder in the touch hole. Supposedly invented by Leonardo da Vinci (1452-1519), the Italian Renaissance man, the wheellock action was an innovation that was not widely adopted due to the high cost of the clockwork mechanism. Flintlock The flintlock action represented a major innovation in firearm design. The spark used to ignite the gunpowder in the touch hole came from a sharpened piece of flint clamped in the jaws of a "cock" which, when released by the trigger, struck a piece of steel called the "frizzen" to generate the necessary sparks. (The spring-loaded arm that holds a piece of flint or pyrite is referred to as a cock because of its resemblance to a rooster.) The cock had to be manually reset after each firing, and the flint had to be replaced periodically due to wear from striking the frizzen. (See also flintlock mechanism, snaphance, Miquelet lock.) The flintlock was widely used during the 17th, 18th, and 19th centuries in both muskets and rifles. Percussion cap Percussion caps (caplock mechanisms), coming into wide service in the early 19th century, offered a dramatic improvement over flintlocks. With the percussion-cap mechanism, the small primer charge of gunpowder used in all preceding firearms was replaced by a completely self-contained explosive charge contained in a small brass "cap". The cap was fastened to the touch hole of the gun (extended to form a "nipple") and ignited by the impact of the gun's "hammer". (The hammer is roughly the same as the cock found on flintlocks except that it does not clamp onto anything.) In the case of percussion caps the hammer was hollow on the end to fit around the cap in order to keep the cap from fragmenting and injuring the shooter. 
Once struck, the flame from the cap in turn ignited the main charge of gunpowder, as with the flintlock, but there was no longer any need to charge the touch hole with gunpowder, and even better, the touch hole was no longer exposed to the elements. As a result, the percussion-cap mechanism was considerably safer, far more weatherproof, and vastly more reliable (cloth-bound cartridges containing a pre-measured charge of gunpowder and a ball had been in regular military service for many years, but the exposed gunpowder in the entry to the touch hole had long been a source of misfires). All muzzleloaders manufactured since the second half of the 19th century use percussion caps except those built as replicas of the flintlock or earlier firearms. Cartridges Frenchman Louis-Nicolas Flobert invented the first rimfire metallic cartridge in 1845. His cartridge consisted of a percussion cap with a bullet attached to the top. Flobert then made what he called "parlor guns" for this cartridge, as these rifles and pistols were designed to be shot in indoor shooting-parlors in large homes. These 6mm Flobert cartridges do not contain any powder, the only propellant substance contained in the cartridge is the percussion cap. In English-speaking countries, the 6mm Flobert cartridge corresponds to .22 BB Cap and .22 CB Cap ammunition. These cartridges have a relatively low muzzle-velocity of around 700 ft/s (210 m/s). Cartridges represented a major innovation: firearms ammunition, previously delivered as separate bullets and powder, was combined in a single metallic (usually brass) cartridge containing a percussion cap, powder, and a bullet in one weatherproof package. The main technical advantage of the brass cartridge-case was the effective and reliable sealing of high-pressure gasses at the breech, as the gas pressure forces the cartridge case to expand outward, pressing it firmly against the inside of the gun-barrel chamber. 
This prevents the leakage of hot gas which could injure the shooter. The brass cartridge also opened the way for modern repeating arms, by uniting the bullet, gunpowder and primer into one assembly that could be fed reliably into the breech by a mechanical action in the firearm. Before this, a "cartridge" was simply a pre-measured quantity of gunpowder together with a ball in a small cloth bag (or rolled paper cylinder), which also acted as wadding for the charge and ball. This early form of cartridge had to be rammed into the muzzleloader's barrel, and either a small charge of gunpowder in the touch hole or an external percussion cap mounted on the touch hole ignited the gunpowder in the cartridge. Cartridges with built-in percussion caps (called "primers") continue to this day to be the standard in firearms. In cartridge-firing firearms, a hammer (or a firing-pin struck by the hammer) strikes the cartridge primer, which then ignites the gunpowder within. The primer charge is at the base of the cartridge, either within the rim (a "rimfire" cartridge) or in a small percussion cap embedded in the center of the base (a "centerfire" cartridge). As a rule, centerfire cartridges are more powerful than rimfire cartridges, operating at considerably higher pressures than rimfire cartridges. Centerfire cartridges are also safer, as a dropped rimfire cartridge has the potential to discharge if its rim strikes the ground with sufficient force to ignite the primer. This is practically impossible with most centerfire cartridges. Nearly all contemporary firearms load cartridges directly into their breech. Some additionally or exclusively load from a magazine that holds multiple cartridges. A magazine is defined as a part of the firearm which exists to store ammunition and to assist in its feeding by the action into the breech (such as through the rotation of a revolver's cylinder or by spring-loaded platforms in most pistol and rifle designs). 
Some magazines, such as that of most centerfire hunting-rifles and all revolvers, are internal to and inseparable from the firearm, and are loaded by using a "clip". A clip (a term often mistakenly used for a detachable magazine) is a device that holds the ammunition by the rim of the case and is designed to assist the shooter in reloading the firearm's magazine. Examples include revolver speedloaders, the stripper clip used to aid loading rifles such as the Lee–Enfield or Mauser 98, and the en-bloc clip used in loading the M1 Garand. In this sense, "magazines" and "clips", though often used synonymously, refer to different types of devices. Repeating, semi-automatic, and automatic firearms Many firearms are "single shot": i.e., each time a cartridge is fired, the operator must manually re-cock the firearm and load another cartridge. The classic single-barreled shotgun offers a good example. A firearm that can load multiple cartridges as the firearm is re-cocked is considered a "repeating firearm" or simply a "repeater". A lever-action rifle, a pump-action shotgun, and most bolt-action rifles are good examples of repeating firearms. A firearm that automatically re-cocks and reloads the next round with each trigger-pull is considered a semi-automatic or autoloading firearm. The first "rapid firing" firearms were usually similar to the 19th-century Gatling gun, which would fire cartridges from a magazine as fast as and as long as the operator turned a crank. Eventually, the "rapid" firing mechanism was perfected and miniaturized to the extent that either the recoil of the firearm or the gas pressure from firing could be used to operate it, thus the operator needed only to pull the trigger; this made the firing mechanisms truly "automatic". An automatic (or "fully automatic") firearm automatically re-cocks, reloads, and fires as long as the trigger is depressed. An automatic firearm is capable of firing multiple rounds with one pull of the trigger.
The Gatling gun may have been the first automatic weapon, though the modern trigger-actuated machine gun was not widely introduced until the First World War (1914-1918) with the German "Spandau" (adopted in 1908) and the British Lewis gun (in service from 1914). Automatic rifles such as the Browning Automatic Rifle were in common use by the military during the early part of the 20th century, and automatic rifles that fired handgun rounds, known as submachine guns, also appeared at this time. Many modern military firearms have a selective fire option, which is a mechanical switch that allows the firearm to be fired in either semi-automatic or fully automatic mode. In the current M16A2 and M16A4 variants of the U.S.-made M16, continuous fully-automatic fire is not possible, having been replaced by an automatic burst of three cartridges (this conserves ammunition and increases controllability). Automatic weapons are largely restricted to military and paramilitary organizations, though many automatic designs are infamous for their use by civilians. Health hazards Firearms pose a notable hazard, with a significant impact on the health system. In 2001, for quantification purposes, it was estimated that the cost of fatalities and injuries was US$4700 million per year in Canada (US$170 per Canadian) and US$100,000 million per year in the USA (US$300 per American). Death From 1990 to 2015, global deaths from assault by firearm rose from 128,000 to 173,000; however, this represents a drop in rate from 2.41/100,000 to 2.35/100,000, as the world population increased by more than two billion. Additionally, there were 32,000 unintentional firearm global deaths in 2015. In 2017, there were 39,773 gun-related deaths in the United States; over 60% were suicides from firearms. Firearms are the second leading mechanism of injury deaths after motor vehicle accidents.
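The seeming paradox above (more deaths, yet a lower rate) follows directly from the definition of a population-adjusted rate. A quick sketch checks the article's figures; note that the round world-population values (about 5.3 and 7.3 billion) are approximations of my own, not figures from the text, which says only that the population grew by more than two billion:

```python
def rate_per_100k(deaths, population):
    """Deaths per 100,000 people."""
    return deaths / population * 100_000

# Figures from the text: 128,000 firearm-assault deaths in 1990,
# 173,000 in 2015. Approximate world populations for those years:
rate_1990 = rate_per_100k(128_000, 5.3e9)
rate_2015 = rate_per_100k(173_000, 7.3e9)

print(round(rate_1990, 2))  # close to the quoted 2.41/100,000
print(round(rate_2015, 2))  # close to the quoted 2.35/100,000
```

Absolute deaths grew by about 35%, but the population grew faster, so the per-capita rate fell slightly.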
In the 52 high- and middle-income countries with a combined population of 1,400 million that were not engaged in civil conflict, fatalities due to firearm injuries were estimated at 115,000 people per annum in the 1990s. In those 52 countries, firearms were the leading method of homicide (two-thirds of cases) but only the second most common method of suicide (20%). To prevent unintentional injury, gun safety training includes education on proper firearm storage and firearm-handling etiquette. Injury Based on US data, it is estimated that three people are injured for every person killed. Noise A common hazard of repeated firearm use is noise-induced hearing loss (NIHL). NIHL can result from long-term exposure to noise or from high-intensity impact noises such as gunshots. Individuals who shoot guns often have a characteristic pattern of hearing loss referred to as "shooter's ear". They often have a high-frequency loss with better hearing in the low frequencies, and one ear is typically worse than the other. The ear on the side on which the shooter holds the gun is partially shielded from the sound wave by the shoulder, while the other ear remains unprotected and more susceptible to the full impact of the sound wave. The intensity of a gunshot varies, with lower-caliber guns typically quieter and higher-caliber guns often louder, but gunshots typically range from 140 dB to 175 dB. Indoor shooting also causes loud reverberations, which can be as damaging as the gunshot itself. According to the National Institute on Deafness and Other Communication Disorders, noise above 85 dB can begin to cause hearing loss. While many sounds cause damage over time, at the intensity level of a gunshot (140 dB or louder), damage to the ear can occur instantly.
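Because the decibel scale is logarithmic, the gap between the 85 dB damage threshold and a 140 dB gunshot is far larger than the numbers suggest. A short sketch of the standard acoustics conversion (10 dB per factor of ten in sound intensity; this convention is general knowledge, not taken from the article) makes the point:

```python
def intensity_ratio(db_a, db_b):
    """How many times more intense sound A is than sound B,
    using the standard 10*log10 decibel definition."""
    return 10 ** ((db_a - db_b) / 10)

# A 140 dB gunshot versus the 85 dB threshold for gradual hearing loss:
ratio = intensity_ratio(140, 85)
print(f"{ratio:,.0f}")  # roughly 300,000 times more intense
```

This is why a single gunshot can cause instant damage that ordinary loud noise inflicts only over years of exposure.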
Shooters use custom hearing protection, such as electronic hearing protection for hunters, which can amplify soft sounds like leaves crunching while reducing the intensity of a gunshot, and custom hearing protection for skeet shooting. Even with hearing protection, due to the high intensity of the noise guns produce, shooters can still develop hearing loss over time. Legal definitions Firearms include a variety of ranged weapons, and there is no single agreed-upon definition. For instance, the English-language laws of major jurisdictions such as the United States, India, the European Union, and Canada use different definitions. Other English-language definitions are provided by international treaties. United States In the United States, under 26 USCA § 861 (a), the term ‘‘firearm’’ means According to the US Bureau of Alcohol, Tobacco, Firearms and Explosives, if gas pressurization is achieved through mechanical gas compression rather than through chemical propellant combustion, then the device is technically an air gun, not a firearm. India In India, the Arms Act, 1959 provides a definition of firearms, where "firearms" means arms of any description designed or adapted to discharge a projectile or projectiles of any kind by the action of any explosive or other forms of energy, and includes: European Union In the European Union, a European directive amended by EU Directive 2017/853 set minimum standards regarding civilian firearms acquisition and possession that EU member states must implement into their national legal systems. In this context, since 2017, firearms are considered to be any portable barrelled weapon that expels, is designed to expel, or may be converted to expel a shot, bullet, or projectile by the action of a combustible propellant. For legal reasons, objects can be considered firearms if they have the appearance of a firearm or are made in a way which makes it possible to convert them to a firearm.
Member states may be allowed to exclude from their gun control laws items such as antique weapons, or special-purpose items which can only be used for that sole purpose. United Kingdom In the UK, a firearm does not have to use a combustible propellant, as explained by Crown Prosecution Service guidance on firearms. The Firearms Act 1968, Section 57(1B), defines a firearm as a "lethal barrelled weapon", that is, a "barrelled weapon of any description from which a shot, bullet or other missile, with kinetic energy of more than one joule as measured at the muzzle of the weapon, can be discharged". As such, low-energy air rifles and pistols also fall under UK firearm legislation, although the licensing requirements for low-energy weapons are more relaxed. Canada In Canada, firearms are defined by the Criminal Code: Australia Australia has a definition of firearms in its 1996 legal act: South Africa In South Africa, the Firearms Control Act [No. 60 of 2000] has defined firearms since June 2001, with a 2006 amendment of the definition: International treaties An inter-American convention defines firearms as: An international UN protocol on firearms considers that
https://en.wikipedia.org/wiki/George%20Washington
George Washington
George Washington (February 22, 1732 – December 14, 1799) was an American soldier, statesman, and Founding Father who served as the first president of the United States from 1789 to 1797. Appointed by the Continental Congress as commander of the Continental Army, Washington led the Patriot forces to victory in the American Revolutionary War, and presided at the Constitutional Convention of 1787, which established the Constitution of the United States and a federal government. Washington has been called the "Father of the Nation" for his manifold leadership in the formative days of the country. Washington's first public office was serving as official Surveyor of Culpeper County, Virginia from 1749 to 1750. Subsequently, he received his initial military training (as well as a command with the Virginia Regiment) during the French and Indian War. He was later elected to the Virginia House of Burgesses and was named a delegate to the Continental Congress. Here he was appointed Commanding General of the Continental Army. With this title, he commanded American forces (allied with France) in the defeat and surrender of the British at the Siege of Yorktown during the American Revolutionary War. He resigned his commission after the Treaty of Paris was signed in 1783. Washington played an indispensable role in adopting and ratifying the Constitution of the United States. He was then twice unanimously elected president by the Electoral College. As president, he implemented a strong, well-financed national government while remaining impartial in a fierce rivalry between cabinet members Thomas Jefferson and Alexander Hamilton. During the French Revolution, he proclaimed a policy of neutrality while sanctioning the Jay Treaty. He set enduring precedents for the office of president, including the title "Mr. President", and his Farewell Address is widely regarded as a pre-eminent statement on republicanism. Washington was a slaveowner who had a complicated relationship with slavery.
During his lifetime he controlled a total of over 577 slaves, who were forced to work on his farms and wherever he lived, including the President's House in Philadelphia. As president, he signed laws passed by Congress that both protected and curtailed slavery. His will said that one of his slaves, William Lee, should be freed upon his death, and that the other 123 slaves must work for his wife and be freed on her death. She freed them during her lifetime to remove the incentive to hasten her death. He endeavored to assimilate Native Americans into the Anglo-American culture but fought indigenous resistance during instances of violent conflict. He was a member of the Anglican Church and the Freemasons, and he urged broad religious freedom in his roles as general and president. Upon his death, he was eulogized by Henry "Light-Horse Harry" Lee as "first in war, first in peace, and first in the hearts of his countrymen". Washington has been memorialized by monuments, a federal holiday, various media, geographical locations (including the national capital and the State of Washington), stamps, and currency, and many scholars and polls rank him among the greatest U.S. presidents. In 1976, Washington was posthumously promoted to the rank of General of the Armies of the United States. Early life (1732–1752) The Washington family was a wealthy Virginia planter family that had made its fortune through land speculation and the cultivation of tobacco. Washington's great-grandfather John Washington emigrated in 1656 from Sulgrave, Northamptonshire, England, to the English colony of Virginia, where he accumulated land, including Little Hunting Creek on the Potomac River. George Washington was born on February 22, 1732, at Popes Creek in Westmoreland County, Virginia, and was the first of six children of Augustine and Mary Ball Washington. His father was a justice of the peace and a prominent public figure who had four additional children from his first marriage to Jane Butler. 
The family moved to Little Hunting Creek in 1735. In 1738, they moved to Ferry Farm near Fredericksburg, Virginia, on the Rappahannock River. When Augustine died in 1743, Washington inherited Ferry Farm and ten slaves; his older half-brother Lawrence inherited Little Hunting Creek and renamed it Mount Vernon. Washington did not have the formal education his elder brothers received at Appleby Grammar School in England, but did attend the Lower Church School in Hartfield. He learned mathematics, trigonometry, and land surveying and became a talented draftsman and map-maker. By early adulthood, he was writing with "considerable force" and "precision"; however, his writing displayed little wit or humor. In pursuit of admiration, status, and power, he tended to attribute his shortcomings and failures to someone else's ineffectuality. Washington often visited Mount Vernon and Belvoir, the plantation that belonged to Lawrence's father-in-law William Fairfax. Fairfax became Washington's patron and surrogate father, and Washington spent a month in 1748 with a team surveying Fairfax's Shenandoah Valley property. He received a surveyor's license the following year from the College of William & Mary. Even though Washington had not served the customary apprenticeship, Fairfax appointed him surveyor of Culpeper County, Virginia, and he appeared in Culpeper County to take his oath of office on July 20, 1749. He subsequently familiarized himself with the frontier region, and though he resigned from the job in 1750, he continued to do surveys west of the Blue Ridge Mountains. By 1752 he had bought substantial acreage in the Valley and held extensive landholdings of his own. In 1751, Washington made his only trip abroad when he accompanied Lawrence to Barbados, hoping the climate would cure his brother's tuberculosis. Washington contracted smallpox during that trip, which immunized him and left his face slightly scarred. 
Lawrence died in 1752, and Washington leased Mount Vernon from his widow Anne; he inherited it outright after her death in 1761. Colonial military career (1752–1758) Lawrence Washington's service as adjutant general of the Virginia militia inspired his half-brother George to seek a commission. Virginia's lieutenant governor, Robert Dinwiddie, appointed George Washington as a major and commander of one of the four militia districts. The British and French were competing for control of the Ohio Valley; while the British were constructing forts along the Ohio River, the French were constructing their own between the Ohio River and Lake Erie. In October 1753, Dinwiddie appointed Washington a special envoy, sending him to demand that French forces vacate land claimed by the British. Washington was also appointed to make peace with the Iroquois Confederacy, and to gather further intelligence about the French forces. Washington met with Half-King Tanacharison, and other Iroquois chiefs, at Logstown, and gathered information about the numbers and locations of the French forts, as well as intelligence concerning individuals taken prisoner by the French. Washington was given the nickname Conotocaurius (town destroyer or devourer of villages) by Tanacharison. The nickname had previously been given to his great-grandfather John Washington in the late seventeenth century by the Susquehannock. Washington's party reached the Ohio River in November 1753 and was intercepted by a French patrol. The party was escorted to Fort Le Boeuf, where Washington was received in a friendly manner. He delivered the British demand to vacate to the French commander Saint-Pierre, but the French refused to leave. Saint-Pierre gave Washington his official answer in a sealed envelope after a few days' delay, as well as food and extra winter clothing for his party's journey back to Virginia. 
Washington completed the precarious mission in 77 days, in difficult winter conditions, achieving a measure of distinction when his report was published in Virginia and in London. French and Indian War In February 1754, Dinwiddie promoted Washington to lieutenant colonel and second-in-command of the 300-strong Virginia Regiment, with orders to confront French forces at the Forks of the Ohio. Washington set out for the Forks with half the regiment in April and soon learned a French force of 1,000 had begun construction of Fort Duquesne there. In May, having set up a defensive position at Great Meadows, he learned that the French had made camp seven miles (11 km) away; he decided to take the offensive. The French detachment proved to be only about fifty men, so Washington advanced on May 28 with a small force of Virginians and Indian allies to ambush them. What took place, known as the Battle of Jumonville Glen or the "Jumonville affair", is disputed, but French forces were killed outright with muskets and hatchets. French commander Joseph Coulon de Jumonville, who carried a diplomatic message for the British to evacuate, was killed. French forces found Jumonville and some of his men dead and scalped and assumed Washington was responsible. Washington blamed his translator for not communicating the French intentions. Dinwiddie congratulated Washington for his victory over the French. This incident ignited the French and Indian War, which later became part of the larger Seven Years' War. The full Virginia Regiment joined Washington at Fort Necessity the following month with news that, upon the regimental commander's death, he had been promoted to colonel and command of the regiment. The regiment was reinforced by an independent company of a hundred South Carolinians led by Captain James Mackay, whose royal commission outranked that of Washington, and a conflict of command ensued. 
On July 3, a French force attacked with 900 men, and the ensuing battle ended in Washington's surrender. In the aftermath, Colonel James Innes took command of intercolonial forces, the Virginia Regiment was divided, and Washington was offered a captaincy, which he refused, resigning his commission instead. In 1755, Washington served voluntarily as an aide to General Edward Braddock, who led a British expedition to expel the French from Fort Duquesne and the Ohio Country. On Washington's recommendation, Braddock split the army into one main column and a lightly equipped "flying column". Suffering from a severe case of dysentery, Washington was left behind, and when he rejoined Braddock at Monongahela the French and their Indian allies ambushed the divided army. Two-thirds of the British force became casualties, including the mortally wounded Braddock. Under the command of Lieutenant Colonel Thomas Gage, Washington, still very ill, rallied the survivors and formed a rear guard, allowing the remnants of the force to disengage and retreat. During the engagement, he had two horses shot from under him, and his hat and coat were bullet-pierced. His conduct under fire redeemed his reputation among critics of his command in the Battle of Fort Necessity, but he was not included by the succeeding commander (Colonel Thomas Dunbar) in planning subsequent operations. The Virginia Regiment was reconstituted in August 1755, and Dinwiddie appointed Washington its commander, again with the rank of colonel. Washington clashed over seniority almost immediately, this time with John Dagworthy, another captain of superior royal rank, who commanded a detachment of Marylanders at the regiment's headquarters in Fort Cumberland. 
Washington, impatient for an offensive against Fort Duquesne, was convinced Braddock would have granted him a royal commission and pressed his case in February 1756 with Braddock's successor, William Shirley, and again in January 1757 with Shirley's successor, Lord Loudoun. Shirley ruled in Washington's favor only in the matter of Dagworthy; Loudoun humiliated Washington, refused him a royal commission and agreed only to relieve him of the responsibility of manning Fort Cumberland. In 1758, the Virginia Regiment was assigned to the British Forbes Expedition to capture Fort Duquesne. Washington disagreed with General John Forbes' tactics and chosen route. Forbes nevertheless made Washington a brevet brigadier general and gave him command of one of the three brigades that would assault the fort. The French abandoned the fort and the valley before the assault was launched; Washington saw only a friendly fire incident which left 14 dead and 26 injured. The war lasted another four years, and Washington resigned his commission and returned to Mount Vernon. Under Washington, the Virginia Regiment had defended of frontier against twenty Indian attacks in ten months. He increased the professionalism of the regiment as it increased from 300 to 1,000 men, and Virginia's frontier population suffered less than other colonies. Some historians have said this was Washington's "only unqualified success" during the war. Though he failed to realize a royal commission, he did gain self-confidence, leadership skills, and invaluable knowledge of British military tactics. The destructive competition Washington witnessed among colonial politicians fostered his later support of a strong central government. Marriage, civilian, and political life (1755–1775) On January 6, 1759, Washington, at age 26, married Martha Dandridge Custis, the 27-year-old widow of wealthy plantation owner Daniel Parke Custis. 
The marriage took place at Martha's estate; she was intelligent, gracious, and experienced in managing a planter's estate, and the couple created a happy marriage. They raised John Parke Custis (Jacky) and Martha "Patsy" Parke Custis, children from her previous marriage, and later Jacky's children Eleanor Parke Custis (Nelly) and George Washington Parke Custis (Washy). Washington's 1751 bout with smallpox is thought to have rendered him sterile, though it is equally likely that "Martha may have sustained injury during the birth of Patsy, her final child, making additional births impossible." The couple lamented not having any children together. They moved to Mount Vernon, near Alexandria, where he took up life as a planter of tobacco and wheat and emerged as a political figure. The marriage gave Washington control over Martha's one-third dower interest in the Custis estate, and he managed the remaining two-thirds for Martha's children; the estate also included 84 slaves. He became one of Virginia's wealthiest men, which increased his social standing. At Washington's urging, Governor Lord Botetourt fulfilled Dinwiddie's 1754 promise of land bounties to all-volunteer militia during the French and Indian War. In late 1770, Washington inspected the lands in the Ohio and Great Kanawha regions, and he engaged surveyor William Crawford to subdivide them. Crawford allotted a large tract to Washington; Washington told the veterans that their land was hilly and unsuitable for farming, and he agreed to purchase much of it himself, leaving some feeling they had been duped. He also doubled the size of Mount Vernon and increased its slave population to more than a hundred by 1775. Washington's political activities included supporting the candidacy of his friend George William Fairfax in his 1755 bid to represent the region in the Virginia House of Burgesses. This support led to a dispute which resulted in a physical altercation between Washington and another Virginia planter, William Payne. 
Washington defused the situation, including ordering officers from the Virginia Regiment to stand down. Washington apologized to Payne the following day at a tavern. Payne had been expecting to be challenged to a duel. As a respected military hero and large landowner, Washington held local offices and was elected to the Virginia provincial legislature, representing Frederick County in the House of Burgesses for seven years beginning in 1758. He plied the voters with beer, brandy, and other beverages, although he was absent while serving on the Forbes Expedition. He won the election with roughly 40 percent of the vote, defeating three other candidates with the help of several local supporters. He rarely spoke in his early legislative career, but he became a prominent critic of Britain's taxation policy and mercantilist policies towards the American colonies starting in the 1760s. By occupation, Washington was a planter, and he imported luxuries and other goods from England, paying for them by exporting tobacco. His profligate spending combined with low tobacco prices left him £1,800 in debt by 1764, prompting him to diversify his holdings. In 1765, because of erosion and other soil problems, he changed Mount Vernon's primary cash crop from tobacco to wheat and expanded operations to include corn flour milling and fishing. Washington also took time for leisure with fox hunting, fishing, dances, theater, cards, backgammon, and billiards. Washington soon was counted among the political and social elite in Virginia. From 1768 to 1775, he invited some 2,000 guests to his Mount Vernon estate, mostly those whom he considered people of rank, and was known to be exceptionally cordial toward his guests. He became more politically active in 1769, presenting legislation in the Virginia Assembly to establish an embargo on goods from Great Britain. Washington's step-daughter Patsy Custis suffered from epileptic attacks from age 12, and she died in his arms in 1773. 
The following day, he wrote to Burwell Bassett: "It is easier to conceive, than to describe, the distress of this Family". He canceled all business activity and remained with Martha every night for three months. Opposition to British Parliament and Crown Washington played a central role before and during the American Revolution. His disdain for the British military had begun when he was passed over for promotion into the Regular Army. Opposed to taxes imposed by the British Parliament on the Colonies without proper representation, he and other colonists were also angered by the Royal Proclamation of 1763, which banned American settlement west of the Allegheny Mountains and protected the British fur trade. Washington believed the Stamp Act of 1765 was an "Act of Oppression", and he celebrated its repeal the following year. In March 1766, Parliament passed the Declaratory Act asserting that Parliamentary law superseded colonial law. In the late 1760s, the British Crown's interference in lucrative American western land speculation helped spur the American Revolution. Washington himself was a prosperous land speculator, and in 1767, he encouraged "adventures" to acquire backcountry western lands. Washington helped lead widespread protests against the Townshend Acts passed by Parliament in 1767, and he introduced a proposal in May 1769, drafted by George Mason, calling on Virginians to boycott British goods; the Acts were mostly repealed in 1770. Parliament sought to punish Massachusetts colonists for their role in the Boston Tea Party in 1774 by passing the Coercive Acts, which Washington referred to as "an invasion of our rights and privileges". He said Americans must not submit to acts of tyranny since "custom and use shall make us as tame and abject slaves, as the blacks we rule over with such arbitrary sway". 
That July, he and George Mason drafted a list of resolutions for the Fairfax County committee which Washington chaired, and the committee adopted the Fairfax Resolves calling for a Continental Congress, and an end to the slave trade. On August 1, Washington attended the First Virginia Convention, where he was selected as a delegate to the First Continental Congress, September 5 to October 26, 1774, which he also attended. As tensions rose in 1774, he helped train county militias in Virginia and organized enforcement of the Continental Association boycott of British goods instituted by the Congress. The American Revolutionary War began on April 19, 1775, with the Battles of Lexington and Concord and the Siege of Boston. The colonists were divided over breaking away from British rule and split into two factions: Patriots who rejected British rule, and Loyalists who desired to remain subject to the King. General Thomas Gage was commander of British forces in America at the beginning of the war. Upon hearing the shocking news of the onset of war, Washington was "sobered and dismayed", and he hastily departed Mount Vernon on May 4, 1775, to join the Second Continental Congress in Philadelphia. Commander in chief (1775–1783) Congress created the Continental Army on June 14, 1775, and Samuel and John Adams nominated Washington to become its commander-in-chief. Washington was chosen over John Hancock because of his military experience and the belief that a Virginian would better unite the colonies. He was considered an incisive leader who kept his "ambition in check". He was unanimously elected commander in chief by Congress the next day. Washington appeared before Congress in uniform and gave an acceptance speech on June 16, declining a salary—though he was later reimbursed expenses. He was commissioned on June 19 and was roundly praised by Congressional delegates, including John Adams, who proclaimed that he was the man best suited to lead and unite the colonies. 
Congress appointed Washington "General & Commander in chief of the army of the United Colonies and of all the forces raised or to be raised by them", and instructed him to take charge of the siege of Boston on June 22, 1775. Congress chose his primary staff officers, including Major General Artemas Ward, Adjutant General Horatio Gates, Major General Charles Lee, Major General Philip Schuyler, Major General Nathanael Greene, Colonel Henry Knox, and Colonel Alexander Hamilton. Washington was impressed by Colonel Benedict Arnold and gave him responsibility for launching an invasion of Canada. He also engaged French and Indian War compatriot Brigadier General Daniel Morgan. Henry Knox impressed Adams with ordnance knowledge, and Washington promoted him to colonel and chief of artillery. At the start of the war, Washington opposed the recruiting of blacks, both free and enslaved, into the Continental Army. After his appointment, Washington banned their enlistment. The British saw an opportunity to divide the colonies, and the colonial governor of Virginia issued a proclamation, which promised freedom to slaves if they joined the British. Desperate for manpower by late 1777, Washington relented and overturned his ban. By the end of the war, around one-tenth of Washington's army were blacks. Following the British surrender, Washington sought to enforce terms of the preliminary Treaty of Paris (1783) by reclaiming slaves freed by the British and returning them to servitude. He arranged to make this request to Sir Guy Carleton on May 6, 1783. Instead, Carleton issued 3,000 freedom certificates and all former slaves in New York City were able to leave before the city was evacuated by the British in late November 1783. After the war Washington became the target of accusations made by General Lee involving his alleged questionable conduct as Commander in Chief during the war that were published by patriot-printer William Goddard. 
Goddard in a letter of May 30, 1785, had informed Washington of Lee's request to publish his account and assured him that he "...took the liberty to suppress such expressions as appeared to be the ebullitions of a disappointed & irritated mind ...". Washington replied, telling Goddard to print what he saw fit, and to let "... the impartial & dispassionate world," draw their own conclusions. Siege of Boston Early in 1775, in response to the growing rebellious movement, London sent British troops, commanded by General Thomas Gage, to occupy Boston. They set up fortifications about the city, making it impervious to attack. Various local militias surrounded the city and effectively trapped the British, resulting in a standoff. As Washington headed for Boston, word of his march preceded him, and he was greeted everywhere; gradually, he became a symbol of the Patriot cause. Upon arrival on July 2, 1775, two weeks after the Patriot defeat at nearby Bunker Hill, he set up his Cambridge, Massachusetts headquarters and inspected the new army there, only to find an undisciplined and badly outfitted militia. After consultation, he initiated Benjamin Franklin's suggested reforms—drilling the soldiers and imposing strict discipline, floggings, and incarceration. Washington ordered his officers to identify the skills of recruits to ensure military effectiveness, while removing incompetent officers. He petitioned Gage, his former superior, to release captured Patriot officers from prison and treat them humanely. In October 1775, King George III declared that the colonies were in open rebellion and relieved General Gage of command for incompetence, replacing him with General William Howe. The Continental Army, further diminished by expiring short-term enlistments, and by January 1776 reduced by half to 9,600 men, had to be supplemented with the militia, and was joined by Knox with heavy artillery captured from Fort Ticonderoga. 
When the Charles River froze over, Washington was eager to cross and storm Boston, but General Gates and others were opposed to untrained militia striking well-garrisoned fortifications. Washington reluctantly agreed to secure the Dorchester Heights, 100 feet above Boston, in an attempt to force the British out of the city. On March 9, under cover of darkness, Washington's troops brought up Knox's big guns and bombarded British ships in Boston harbor. On March 17, 9,000 British troops and Loyalists began a chaotic ten-day evacuation of Boston aboard 120 ships. Soon after, Washington entered the city with 500 men, with explicit orders not to plunder the city. He ordered vaccinations against smallpox to great effect, as he did later in Morristown, New Jersey. He refrained from exerting military authority in Boston, leaving civilian matters in the hands of local authorities. Invasion of Quebec (1775) The Invasion of Quebec (June 1775 – October 1776, French: Invasion du Québec) was the first major military initiative by the newly formed Continental Army during the American Revolutionary War. On June 27, 1775, Congress authorized General Philip Schuyler to investigate, and, if it seemed appropriate, begin an invasion. Benedict Arnold, passed over for its command, went to Boston and convinced General George Washington to send a supporting force to Quebec City under his command. The objective of the campaign was to seize the Province of Quebec (part of modern-day Canada) from Great Britain, and persuade French-speaking Canadiens to join the revolution on the side of the Thirteen Colonies. One expedition left Fort Ticonderoga under Richard Montgomery, besieged and captured Fort St. Johns, and very nearly captured British General Guy Carleton when taking Montreal. The other expedition, under Benedict Arnold, left Cambridge, Massachusetts and traveled with great difficulty through the wilderness of Maine to Quebec City. 
The two forces joined there, but they were defeated at the Battle of Quebec in December 1775. Battle of Long Island Washington then proceeded to New York City, arriving on April 13, 1776, and began constructing fortifications to thwart the expected British attack. He ordered his occupying forces to treat civilians and their property with respect, to avoid the abuses which Bostonian citizens suffered at the hands of British troops during their occupation. A plot to assassinate or capture him was discovered and thwarted, resulting in the arrest of 98 people involved or complicit, 56 of whom were from Long Island (Kings (Brooklyn) and Queens counties), including the Loyalist Mayor of New York, David Mathews. Washington's bodyguard, Thomas Hickey, was hanged for mutiny and sedition. General Howe transported his resupplied army, with the British fleet, from Halifax to New York, knowing the city was key to securing the continent. George Germain, who ran the British war effort in England, believed it could be won with one "decisive blow". The British forces, including more than a hundred ships and thousands of troops, began arriving on Staten Island on July 2 to lay siege to the city. After the Declaration of Independence was adopted on July 4, Washington informed his troops in his general orders of July 9 that Congress had declared the united colonies to be "free and independent states". Howe's troop strength totaled 32,000 regulars and Hessian auxiliaries, and Washington's consisted of 23,000, mostly raw recruits and militia. In August, Howe landed 20,000 troops at Gravesend, Brooklyn, and approached Washington's fortifications, as George III proclaimed the rebellious American colonists to be traitors. Washington, opposing his generals, chose to fight, based upon inaccurate information that Howe's army had only 8,000-plus troops. In the Battle of Long Island, Howe assaulted Washington's flank and inflicted 1,500 Patriot casualties, the British suffering 400. 
Washington retreated, instructing General William Heath to acquire river craft in the area. On August 30, General William Alexander held off the British and gave cover while the army crossed the East River under darkness to Manhattan Island without loss of life or materiel, although Alexander was captured. Howe, emboldened by his Long Island victory, dispatched a letter addressed to "George Washington, Esq.", in a futile attempt to negotiate peace. Washington declined, demanding to be addressed with diplomatic protocol, as general and fellow belligerent, not as a "rebel", lest his men be hanged as rebels if captured. The Royal Navy bombarded the unstable earthworks on lower Manhattan Island. Washington, with misgivings, heeded the advice of Generals Greene and Putnam to defend Fort Washington. They were unable to hold it, and Washington abandoned it despite General Lee's objections, as his army retired north to White Plains. Howe's pursuit forced Washington to retreat across the Hudson River to Fort Lee to avoid encirclement. Howe landed his troops on Manhattan in November and captured Fort Washington, inflicting high casualties on the Americans. Washington was responsible for delaying the retreat, though he blamed Congress and General Greene. Loyalists in New York considered Howe a liberator and spread a rumor that Washington had set fire to the city. Patriot morale reached its lowest when Lee was captured. Now reduced to 5,400 troops, Washington's army retreated through New Jersey, and Howe broke off pursuit, delaying his advance on Philadelphia, and set up winter quarters in New York. Crossing the Delaware, Trenton, and Princeton Washington crossed the Delaware River into Pennsylvania, where Lee's replacement John Sullivan joined him with 2,000 more troops. The future of the Continental Army was in doubt for lack of supplies, a harsh winter, expiring enlistments, and desertions. 
Washington was disappointed that many New Jersey residents were Loyalists or skeptical about the prospect of independence. Howe split up his British Army and posted a Hessian garrison at Trenton to hold western New Jersey and the east shore of the Delaware, but the army appeared complacent, and Washington and his generals devised a surprise attack on the Hessians at Trenton, which he codenamed "Victory or Death". The army was to cross the Delaware River to Trenton in three divisions: one led by Washington (2,400 troops), another by General James Ewing (700), and the third by Colonel John Cadwalader (1,500). The force was to then split, with Washington taking the Pennington Road and General Sullivan traveling south on the river's edge. Washington first ordered a 60-mile search for Durham boats to transport his army, and he ordered the destruction of vessels that could be used by the British. Washington crossed the Delaware River on Christmas night, December 25, 1776, while he personally risked capture staking out the Jersey shoreline. His men followed across the ice-obstructed river in sleet and snow from McConkey's Ferry, with 40 men per vessel. The wind churned up the waters, and they were pelted with hail, but by 3:00 a.m. on December 26, they made it across with no losses. Henry Knox was delayed, managing frightened horses and about 18 field guns on flat-bottomed ferries. Cadwalader and Ewing failed to cross due to the ice and heavy currents, and a waiting Washington doubted his planned attack on Trenton. Once Knox arrived, Washington proceeded to Trenton to take only his troops against the Hessians, rather than risk being spotted returning his army to Pennsylvania. The troops spotted Hessian positions a mile from Trenton, so Washington split his force into two columns, rallying his men: "Soldiers keep by your officers. For God's sake, keep by your officers." The two columns were separated at the Birmingham crossroads. 
General Nathanael Greene's column took the upper Ferry Road, led by Washington, and General John Sullivan's column advanced on River Road. The Americans marched in sleet and snowfall. Many were shoeless with bloodied feet, and two died of exposure. At sunrise, Washington led them in a surprise attack on the Hessians, aided by Major General Knox and artillery. The Hessians had 22 killed (including Colonel Johann Rall), 83 wounded, and 850 captured with supplies. Washington retreated across the Delaware River to Pennsylvania and returned to New Jersey on January 3, 1777, launching an attack on British regulars at Princeton, with 40 Americans killed or wounded and 273 British killed or captured. American Generals Hugh Mercer and John Cadwalader were being driven back by the British when Mercer was mortally wounded; then Washington arrived and led the men in a counterattack which advanced to close range of the British line. Some British troops retreated after a brief stand, while others took refuge in Nassau Hall, which became the target of Colonel Alexander Hamilton's cannons. Washington's troops charged, the British surrendered in less than an hour, and 194 soldiers laid down their arms. Howe retreated to New York City, where his army remained inactive until early the next year. Washington's depleted Continental Army took up winter headquarters in Morristown, New Jersey, while disrupting British supply lines and expelling them from parts of New Jersey. Washington later said the British could have successfully counterattacked his encampment before his troops were dug in. The victories at Trenton and Princeton by Washington revived Patriot morale and changed the course of the war. The British still controlled New York, and many Patriot soldiers did not re-enlist or deserted after the harsh winter campaign. Congress instituted greater rewards for re-enlisting and punishments for desertion to effect greater troop numbers. 
Strategically, Washington's victories were pivotal for the Revolution and quashed the British strategy of showing overwhelming force followed by offering generous terms. In February 1777, word reached London of the American victories at Trenton and Princeton, and the British realized the Patriots were in a position to demand unconditional independence. Brandywine, Germantown, and Saratoga In July 1777, British General John Burgoyne led the Saratoga campaign south from Quebec through Lake Champlain and recaptured Fort Ticonderoga, intending to divide New England and gain control of the Hudson River. However, General Howe in British-occupied New York blundered, taking his army south to Philadelphia rather than up the Hudson River to join Burgoyne near Albany. Meanwhile, Washington and Gilbert du Motier, Marquis de Lafayette rushed to Philadelphia to engage Howe and were shocked to learn of Burgoyne's progress in upstate New York, where the Patriots were led by General Philip Schuyler and successor Horatio Gates. Washington's army of less experienced men was defeated in the pitched battles at Philadelphia. Howe outmaneuvered Washington at the Battle of Brandywine on September 11, 1777, and marched unopposed into the nation's capital at Philadelphia. A Patriot attack failed against the British at Germantown in October. Major General Thomas Conway prompted some members of Congress (referred to as the Conway Cabal) to consider removing Washington from command because of the losses incurred at Philadelphia. Washington's supporters resisted, and the matter was finally dropped after much deliberation. Once the plot was exposed, Conway wrote an apology to Washington, resigned, and returned to France. Washington was concerned with Howe's movements during the Saratoga campaign to the north, and he was also aware that Burgoyne was moving south toward Saratoga from Quebec.
Washington took some risks to support Gates' army, sending reinforcements north with Generals Benedict Arnold, his most aggressive field commander, and Benjamin Lincoln. On October 7, 1777, Burgoyne tried to take Bemis Heights but was isolated from support by Howe. He was forced to retreat to Saratoga and ultimately surrendered after the Battles of Saratoga. As Washington suspected, Gates' victory emboldened his critics. Biographer John Alden maintains, "It was inevitable that the defeats of Washington's forces and the concurrent victory of the forces in upper New York should be compared." Admiration for Washington was waning, and he received little credit even from John Adams. British commander Howe resigned in May 1778, left America forever, and was replaced by Sir Henry Clinton. Valley Forge and Monmouth Washington's army of 11,000 went into winter quarters at Valley Forge north of Philadelphia in December 1777. They suffered between 2,000 and 3,000 deaths in the extreme cold over six months, mostly from disease and lack of food, clothing, and shelter. Meanwhile, the British were comfortably quartered in Philadelphia, paying for supplies in pounds sterling, while Washington struggled with a devalued American paper currency. The woodlands were soon exhausted of game, and by February, lowered morale and increased desertions ensued. Washington made repeated petitions to the Continental Congress for provisions. He received a congressional delegation to check the Army's conditions and expressed the urgency of the situation, proclaiming: "Something must be done. Important alterations must be made." He recommended that Congress expedite supplies, and Congress agreed to strengthen and fund the army's supply lines by reorganizing the commissary department. By late February, supplies began arriving.
Baron Friedrich Wilhelm von Steuben's incessant drilling soon transformed Washington's recruits into a disciplined fighting force, and the revitalized army emerged from Valley Forge early the following year. Washington promoted von Steuben to Major General and made him chief of staff. In early 1778, the French responded to Burgoyne's defeat and entered into a Treaty of Alliance with the Americans. The Continental Congress ratified the treaty in May, which amounted to a French declaration of war against Britain. The British evacuated Philadelphia for New York that June, and Washington summoned a war council of American and French Generals. He chose a partial attack on the retreating British at the Battle of Monmouth; the British were commanded by Howe's successor General Henry Clinton. Generals Charles Lee and Lafayette moved with 4,000 men, without Washington's knowledge, and bungled their first attack on June 28. Washington relieved Lee and achieved a draw after an expansive battle. At nightfall, the British continued their retreat to New York, and Washington moved his army outside the city. Monmouth was Washington's last battle in the North; he valued the safety of his army more than towns with little value to the British. West Point espionage Washington became "America's first spymaster" by designing an espionage system against the British. In 1778, Major Benjamin Tallmadge formed the Culper Ring at Washington's direction to covertly collect information about the British in New York. Washington had disregarded incidents of disloyalty by Benedict Arnold, who had distinguished himself in many battles. During mid-1780, Arnold began supplying British spymaster John André with sensitive information intended to compromise Washington and capture West Point, a key American defensive position on the Hudson River. Historians have noted as possible reasons for Arnold's treachery his anger at losing promotions to junior officers, or repeated slights from Congress.
He was also deeply in debt, profiteering from the war, and disappointed by Washington's lack of support during his eventual court-martial. Arnold repeatedly asked for command of West Point, and Washington finally agreed in August. Arnold met André on September 21, giving him plans to take over the garrison. Militia forces captured André and discovered the plans, but Arnold escaped to New York. Washington recalled the commanders positioned under Arnold at key points around the fort to prevent any complicity, but he did not suspect Arnold's wife Peggy. Washington assumed personal command at West Point and reorganized its defenses. André's trial for espionage ended in a death sentence, and Washington offered to return him to the British in exchange for Arnold, but Clinton refused. To deter other spies, André was hanged on October 2, 1780, despite his last request to face a firing squad. Southern theater and Yorktown In late 1778, General Clinton shipped 3,000 troops from New York to Georgia and launched a Southern invasion against Savannah, reinforced by 2,000 British and Loyalist troops. They repelled an attack by Patriots and French naval forces, which bolstered the British war effort. In mid-1779, Washington attacked Iroquois warriors of the Six Nations to force Britain's Indian allies out of New York, from which they had assaulted New England towns. In response, Indian warriors joined with Loyalist rangers led by Walter Butler and killed more than 200 frontiersmen in June, laying waste to the Wyoming Valley in Pennsylvania. Washington retaliated by ordering General John Sullivan to lead an expedition to effect "the total destruction and devastation" of Iroquois villages and take their women and children hostage. Those who managed to escape fled to Canada. Washington's troops went into quarters at Morristown, New Jersey during the winter of 1779–1780 and suffered their worst winter of the war, with temperatures well below freezing.
New York Harbor was frozen over, snow and ice covered the ground for weeks, and the troops again lacked provisions. Clinton assembled 12,500 troops and attacked Charlestown, South Carolina in January 1780, defeating General Benjamin Lincoln who had only 5,100 Continental troops. The British went on to occupy the South Carolina Piedmont in June, with no Patriot resistance. Clinton returned to New York and left 8,000 troops commanded by General Charles Cornwallis. Congress replaced Lincoln with Horatio Gates; he failed in South Carolina and was replaced by Washington's choice of Nathanael Greene, but the British already had the South in their grasp. Washington was reinvigorated, however, when Lafayette returned from France with more ships, men, and supplies, and 5,000 veteran French troops led by Marshal Rochambeau arrived at Newport, Rhode Island in July 1780. French naval forces then landed, led by Admiral de Grasse, and Washington encouraged Rochambeau to move his fleet south to launch a joint land and naval attack on Arnold's troops. Washington's army went into winter quarters at New Windsor, New York in December 1780, and Washington urged Congress and state officials to expedite provisions in hopes that the army would not "continue to struggle under the same difficulties they have hitherto endured". On March 1, 1781, Congress ratified the Articles of Confederation, but the government that took effect on March 2 did not have the power to levy taxes, and it loosely held the states together. General Clinton sent Benedict Arnold, now a British Brigadier General with 1,700 troops, to Virginia to capture Portsmouth and conduct raids on Patriot forces from there; Washington responded by sending Lafayette south to counter Arnold's efforts. Washington initially hoped to bring the fight to New York, drawing off British forces from Virginia and ending the war there, but Rochambeau advised de Grasse that Cornwallis in Virginia was the better target.
De Grasse's fleet arrived off the Virginia coast, and Washington saw the advantage. He made a feint towards Clinton in New York, then headed south to Virginia. The Siege of Yorktown was a decisive Allied victory by the combined forces of the Continental Army commanded by General Washington, the French Army commanded by General Comte de Rochambeau, and the French Navy commanded by Admiral de Grasse, in the defeat of Cornwallis' British forces. On August 19, the march to Yorktown led by Washington and Rochambeau began, now known as the "celebrated march". Washington was in command of an army of 7,800 Frenchmen, 3,100 militia, and 8,000 Continentals. Not well experienced in siege warfare, Washington often deferred to the judgment of General Rochambeau and used his advice about how to proceed; however, Rochambeau never challenged Washington's authority as the battle's commanding officer. By late September, Patriot-French forces surrounded Yorktown, trapped the British army, and prevented British reinforcements from Clinton in the North, while the French navy emerged victorious at the Battle of the Chesapeake. The final American offensive was begun with a shot fired by Washington. The siege ended with a British surrender on October 19, 1781; over 7,000 British soldiers were made prisoners of war, in the last major land battle of the American Revolutionary War. Washington negotiated the terms of surrender for two days, and the official signing ceremony took place on October 19; Cornwallis claimed illness and was absent, sending General Charles O'Hara as his proxy. As a gesture of goodwill, Washington held a dinner for the American, French, and British generals, all of whom fraternized on friendly terms and identified with one another as members of the same professional military caste. After the surrender at Yorktown, a situation developed that threatened relations between the newly independent America and Britain.
Following a series of retributive executions between Patriots and Loyalists, Washington, on May 18, 1782, wrote in a letter to General Moses Hazen that a British captain would be executed in retaliation for the execution of Joshua Huddy, a popular Patriot leader, who was hanged at the direction of the Loyalist Richard Lippincott. Washington wanted Lippincott himself to be executed but was rebuffed. Subsequently, Charles Asgill was chosen instead, by a drawing of lots from a hat. This was a violation of the 14th article of the Yorktown Articles of Capitulation, which protected prisoners of war from acts of retaliation. Later, Washington's feelings on the matter changed, and in a letter of November 13, 1782, to Asgill, he acknowledged Asgill's letter and situation, expressing his desire not to see any harm come to him. After much consideration between the Continental Congress, Alexander Hamilton, Washington, and appeals from the French Crown, Asgill was finally released, and Washington issued him a pass that allowed his passage to New York. Demobilization and resignation When peace negotiations began in April 1782, both the British and French began gradually evacuating their forces. The American treasury was empty, and unpaid, mutinous soldiers forced the adjournment of Congress; Washington dispelled the unrest by suppressing the Newburgh Conspiracy in March 1783, and Congress promised officers a five-year bonus. Washington submitted an account of $450,000 in expenses which he had advanced to the army. The account was settled, though it was allegedly vague about large sums and included expenses his wife had incurred through visits to his headquarters. The following month, a Congressional committee led by Alexander Hamilton began adapting the army for peacetime. In August 1783, Washington gave the Army's perspective to the committee in his Sentiments on a Peace Establishment.
He advised Congress to keep a standing army, create a "national militia" of separate state units, and establish a navy and a national military academy. The Treaty of Paris was signed on September 3, 1783, and Great Britain officially recognized the independence of the United States. Washington then disbanded his army, giving a farewell address to his soldiers on November 2. During this time, Washington oversaw the evacuation of British forces in New York and was greeted by parades and celebrations. There he announced that Colonel Henry Knox had been promoted to commander-in-chief. Washington and Governor George Clinton took formal possession of the city on November 25. In early December 1783, Washington bade farewell to his officers at Fraunces Tavern and resigned as commander-in-chief soon thereafter, refuting Loyalist predictions that he would not relinquish his military command. In a final appearance in uniform, he gave a statement to the Congress: "I consider it an indispensable duty to close this last solemn act of my official life, by commending the interests of our dearest country to the protection of Almighty God, and those who have the superintendence of them, to his holy keeping." Washington's resignation was acclaimed at home and abroad and showed a skeptical world that the new republic would not degenerate into chaos. The same month, Washington was appointed president-general of the Society of the Cincinnati, a newly established hereditary fraternity of Revolutionary War officers. He served in this capacity for the remainder of his life. Early republic (1783–1789) Return to Mount Vernon Washington was longing to return home after spending just ten days at Mount Vernon out of years of war. He arrived on Christmas Eve, delighted to be "free of the bustle of a camp and the busy scenes of public life".
He was a celebrity and was fêted during a visit to his mother at Fredericksburg in February 1784, and he received a constant stream of visitors wishing to pay their respects to him at Mount Vernon. Washington reactivated his interests in the Great Dismal Swamp and Potomac canal projects begun before the war, though neither paid him any dividends, and he undertook a 34-day, 680-mile (1090 km) trip to check on his land holdings in the Ohio Country. He oversaw the completion of the remodeling work at Mount Vernon, which transformed his residence into the mansion that survives to this day—although his financial situation was not strong. Creditors paid him in depreciated wartime currency, and he owed significant amounts in taxes and wages. Mount Vernon had made no profit during his absence, and he saw persistently poor crop yields due to pestilence and poor weather. His estate recorded its eleventh year running at a deficit in 1787, and there was little prospect of improvement. Washington undertook a new landscaping plan and succeeded in cultivating a range of fast-growing trees and shrubs that were native to North America. He also began breeding mules after having been gifted a Spanish jack by King Charles III of Spain in 1784. There were few mules in the United States at that time, and he believed that properly bred mules would revolutionize agriculture and transportation. Constitutional Convention of 1787 Before returning to private life in June 1783, Washington called for a strong union. Though he was concerned that he might be criticized for meddling in civil matters, he sent a circular letter to all the states, maintaining that the Articles of Confederation was no more than "a rope of sand" linking the states. He believed the nation was on the verge of "anarchy and confusion", was vulnerable to foreign intervention, and that a national constitution would unify the states under a strong central government. 
When Shays' Rebellion erupted in Massachusetts on August 29, 1786, over taxation, Washington was further convinced that a national constitution was needed. Some nationalists feared that the new republic had descended into lawlessness, and they met together on September 11, 1786, at Annapolis to ask Congress to revise the Articles of Confederation. One of their biggest efforts, however, was getting Washington to attend. Congress agreed to a Constitutional Convention to be held in Philadelphia in spring 1787, and each state was to send delegates. On December 4, 1786, Washington was chosen to lead the Virginia delegation, but he declined on December 21. He had concerns about the legality of the convention and consulted James Madison, Henry Knox, and others. They persuaded him to attend it, however, as his presence might induce reluctant states to send delegates and smooth the way for the ratification process. On March 28, Washington told Governor Edmund Randolph that he would attend the convention but made it clear that he was doing so only because he had been urged to. Washington arrived in Philadelphia on May 9, 1787, though a quorum was not attained until Friday, May 25. Benjamin Franklin nominated Washington to preside over the convention, and he was unanimously elected to serve as president general. The convention's state-mandated purpose was to revise the Articles of Confederation with "all such alterations and further provisions" required to improve them, and the new government would be established when the resulting document was "duly confirmed by the several states". Governor Edmund Randolph of Virginia introduced Madison's Virginia Plan on May 27, the third day of the convention. It called for an entirely new constitution and a sovereign national government, which Washington highly recommended.
Washington wrote Alexander Hamilton on July 10: "I almost despair of seeing a favorable issue to the proceedings of our convention and do therefore repent having had any agency in the business." Nevertheless, he lent his prestige to the goodwill and work of the other delegates. He unsuccessfully lobbied many to support ratification of the Constitution, such as anti-federalist Patrick Henry; Washington told him "the adoption of it under the present circumstances of the Union is in my opinion desirable" and declared the alternative would be anarchy. Washington and Madison then spent four days at Mount Vernon evaluating the new government's transition. Chancellor of William & Mary In 1788, the Board of Visitors of the College of William & Mary decided to re-establish the position of Chancellor, and elected Washington to the office on January 18. The College Rector Samuel Griffin wrote to Washington inviting him to the post, and in a letter dated April 30, 1788, Washington accepted the position of the 14th Chancellor of the College of William & Mary. He continued to serve in the post through his presidency until his death on December 14, 1799. First presidential election The delegates to the Convention anticipated a Washington presidency and left it to him to define the office once elected. The state electors under the Constitution voted for the president on February 4, 1789, and Washington suspected that most republicans had not voted for him. The mandated March 4 date passed without a Congressional quorum to count the votes, but a quorum was reached on April 5. The votes were tallied the next day, and Congressional Secretary Charles Thomson was sent to Mount Vernon to tell Washington he had been elected president. Washington won the majority of every state's electoral votes; John Adams received the next highest number of votes and therefore became vice president.
Washington had "anxious and painful sensations" about leaving the "domestic felicity" of Mount Vernon, but departed for New York City on April 16 to be inaugurated. Presidency (1789–1797) Washington was inaugurated on April 30, 1789, taking the oath of office at Federal Hall in New York City. His coach was led by militia and a marching band and followed by statesmen and foreign dignitaries in an inaugural parade, with a crowd of 10,000. Chancellor Robert R. Livingston administered the oath, using a Bible provided by the Masons, after which the militia fired a 13-gun salute. Washington read a speech in the Senate Chamber, asking "that Almighty Being who rules over the universe, who presides in the councils of nations—and whose providential aids can supply every human defect, consecrate the liberties and happiness of the people of the United States". Though he wished to serve without a salary, Congress insisted adamantly that he accept it, later providing Washington $25,000 per year to defray costs of the presidency. Washington wrote to James Madison: "As the first of everything in our situation will serve to establish a precedent, it is devoutly wished on my part that these precedents be fixed on true principles." To that end, he preferred the title "Mr. President" over more majestic names proposed by the Senate, including "His Excellency" and "His Highness the President". His executive precedents included the inaugural address, messages to Congress, and the cabinet form of the executive branch. Washington had planned to resign after his first term, but the political strife in the nation convinced him he should remain in office. He was an able administrator and a judge of talent and character, and he regularly talked with department heads to get their advice. He tolerated opposing views, despite fears that a democratic system would lead to political violence, and he conducted a smooth transition of power to his successor. 
He remained non-partisan throughout his presidency and opposed the divisiveness of political parties, but he favored a strong central government, was sympathetic to a Federalist form of government, and leery of the Republican opposition. Washington dealt with major problems. The old Confederation lacked the powers to handle its workload and had weak leadership, no executive, a small bureaucracy of clerks, a large debt, worthless paper money, and no power to establish taxes. He had the task of assembling an executive department and relied on Tobias Lear for advice in selecting its officers. Great Britain refused to relinquish its forts in the American West, and Barbary pirates preyed on American merchant ships in the Mediterranean at a time when the United States did not even have a navy. Cabinet and executive departments Congress created executive departments in 1789, including the State Department in July, the Department of War in August, and the Treasury Department in September. Washington appointed fellow Virginian Edmund Randolph as Attorney General, Samuel Osgood as Postmaster General, Thomas Jefferson as Secretary of State, and Henry Knox as Secretary of War. Finally, he appointed Alexander Hamilton as Secretary of the Treasury. Washington's cabinet became a consulting and advisory body, not mandated by the Constitution. Washington's cabinet members formed rival parties with sharply opposing views, most fiercely illustrated between Hamilton and Jefferson. Washington restricted cabinet discussions to topics of his choosing, without participating in the debate. He occasionally requested cabinet opinions in writing and expected department heads to agreeably carry out his decisions. Domestic issues Washington was apolitical and opposed the formation of parties, suspecting that conflict would undermine republicanism. He exercised great restraint in using his veto power, writing that "I give my Signature to many Bills with which my Judgment is at variance…."
His closest advisors formed two factions, portending the First Party System. Secretary of the Treasury Alexander Hamilton formed the Federalist Party to promote national credit and a financially powerful nation. Secretary of State Thomas Jefferson opposed Hamilton's agenda and founded the Jeffersonian Republicans. Washington favored Hamilton's agenda, however, and it ultimately went into effect—resulting in bitter controversy. Washington proclaimed November 26 as a day of Thanksgiving to encourage national unity. "It is the duty of all nations to acknowledge the providence of Almighty God, to obey His will, to be grateful for His benefits, and humbly to implore His protection and favor." He spent that day fasting and visiting debtors in prison to provide them with food and beer. African Americans In response to two antislavery petitions that were presented to Congress in 1790, slaveholders in Georgia and South Carolina objected and threatened to "blow the trumpet of civil war". Washington and Congress responded with a series of racist measures: naturalized citizenship was denied to black immigrants; blacks were barred from serving in state militias; the Southwest Territory that would soon become the state of Tennessee was permitted to maintain slavery; and two more slave states were admitted (Kentucky in 1792, and Tennessee in 1796). On February 12, 1793, Washington signed into law the Fugitive Slave Act, which overrode state laws and courts, allowing agents to cross state lines to capture and return escaped slaves. Many free blacks in the north decried the law believing it would allow bounty hunting and the kidnappings of blacks. The Fugitive Slave Act gave effect to the Constitution's Fugitive Slave Clause, and the Act was passed overwhelmingly in Congress (e.g. the vote was 48 to 7 in the House). 
On the anti-slavery side of the ledger, in 1789 Washington signed a reenactment of the Northwest Ordinance which had freed all slaves brought after 1787 into a vast expanse of federal territory north of the Ohio River, except for slaves escaping from slave states. That 1787 law lapsed when the new U.S. Constitution was ratified in 1789. The Slave Trade Act of 1794, which sharply limited American involvement in the Atlantic slave trade, was also signed by Washington. And, Congress acted on February 18, 1791, to admit the free state of Vermont into the Union as the 14th state as of March 4, 1791. National Bank Washington's first term was largely devoted to economic concerns, in which Hamilton had devised various plans to address matters. The establishment of public credit became a primary challenge for the federal government. Hamilton submitted a report to a deadlocked Congress, and he, Madison, and Jefferson reached the Compromise of 1790 in which Jefferson agreed to Hamilton's debt proposals in exchange for moving the nation's capital temporarily to Philadelphia and then south near Georgetown on the Potomac River. The terms were legislated in the Funding Act of 1790 and the Residence Act, both of which Washington signed into law. Congress authorized the assumption and payment of the nation's debts, with funding provided by customs duties and excise taxes. Hamilton created controversy among Cabinet members by advocating establishing the First Bank of the United States. Madison and Jefferson objected, but the bank easily passed Congress. Jefferson and Randolph insisted that the new bank was beyond the authority granted by the Constitution; Hamilton believed otherwise. Washington sided with Hamilton and signed the legislation on February 25, and the rift became openly hostile between Hamilton and Jefferson. The nation's first financial crisis occurred in March 1792. Hamilton's Federalists exploited large loans to gain control of U.S.
debt securities, causing a run on the national bank; the markets returned to normal by mid-April. Jefferson believed Hamilton was part of the scheme, despite Hamilton's efforts to ameliorate the situation, and Washington again found himself in the middle of a feud. Jefferson–Hamilton feud Jefferson and Hamilton adopted diametrically opposed political principles. Hamilton believed in a strong national government requiring a national bank and foreign loans to function, while Jefferson believed the states and the farm element should primarily direct the government; he also resented the idea of banks and foreign loans. To Washington's dismay, the two men persistently entered into disputes and infighting. Hamilton demanded that Jefferson resign if he could not support Washington, and Jefferson told Washington that Hamilton's fiscal system would lead to the overthrow of the Republic. Washington urged them to call a truce for the nation's sake, but they ignored him. Washington reversed his decision to retire after his first term to minimize party strife, but the feud continued after his re-election. Jefferson's political actions, his support of Freneau's National Gazette, and his attempt to undermine Hamilton nearly led Washington to dismiss him from the cabinet; Jefferson ultimately resigned his position in December 1793, and Washington forsook him from that time on. The feud led to the well-defined Federalist and Republican parties, and party affiliation became necessary for election to Congress by 1794. Washington remained aloof from congressional attacks on Hamilton, but he did not publicly protect him, either. The Hamilton–Reynolds sex scandal opened Hamilton to disgrace, but Washington continued to hold him in "very high esteem" as the dominant force in establishing federal law and government.
Whiskey Rebellion In March 1791, at Hamilton's urging, with support from Madison, Congress imposed an excise tax on distilled spirits to help curtail the national debt, which took effect in July. Grain farmers strongly protested in Pennsylvania's frontier districts; they argued that they were unrepresented and were shouldering too much of the debt, comparing their situation to excessive British taxation before the Revolutionary War. On August 2, Washington assembled his cabinet to discuss how to deal with the situation. Unlike Washington, who had reservations about using force, Hamilton had long waited for such a situation and was eager to suppress the rebellion by using federal authority and force. Not wanting to involve the federal government if possible, Washington called on Pennsylvania state officials to take the initiative, but they declined to take military action. On August 7, Washington issued his first proclamation for calling up state militias. After appealing for peace, he reminded the protestors that, unlike under the rule of the British crown, federal law was issued by their state-elected representatives. Threats and violence against tax collectors, however, escalated into defiance against federal authority in 1794 and gave rise to the Whiskey Rebellion. Washington issued a final proclamation on September 25, threatening the use of military force, to no avail. The federal army was not up to the task, so Washington invoked the Militia Act of 1792 to summon state militias. Governors sent troops, initially commanded by Washington, who gave the command to Light-Horse Harry Lee to lead them into the rebellious districts. They took 150 prisoners, and the remaining rebels dispersed without further fighting. Two of the prisoners were condemned to death, but Washington exercised his Constitutional authority for the first time and pardoned them. Washington's forceful action demonstrated that the new government could protect itself and its tax collectors.
This represented the first use of federal military force against the states and citizens, and remains the only time an incumbent president has commanded troops in the field. Washington justified his action against "certain self-created societies", which he regarded as "subversive organizations" that threatened the national union. He did not dispute their right to protest, but he insisted that their dissent must not violate federal law. Congress agreed and extended their congratulations to him; only Madison and Jefferson expressed indifference. Foreign affairs The French Revolutionary Wars began in April 1792, and Great Britain and France went to war early in 1793; Washington declared America's neutrality. The revolutionary government of France sent diplomat Citizen Genêt to America, and he was welcomed with great enthusiasm. He created a network of new Democratic-Republican Societies promoting France's interests, but Washington denounced them and demanded that the French recall Genêt. The National Assembly of France granted Washington honorary French citizenship on August 26, 1792, during the early stages of the French Revolution. Hamilton formulated the Jay Treaty to normalize trade relations with Great Britain, secure the British withdrawal from western forts, and resolve financial debts remaining from the Revolution. Chief Justice John Jay acted as Washington's negotiator and signed the treaty on November 19, 1794; critical Jeffersonians, however, supported France. Washington deliberated, then supported the treaty because it avoided war with Britain, but was disappointed that its provisions favored Britain. He mobilized public opinion and secured ratification in the Senate but faced frequent public criticism. The British agreed to abandon their forts around the Great Lakes, and the United States modified the boundary with Canada. The government liquidated numerous pre-Revolutionary debts, and the British opened the British West Indies to American trade. 
The treaty secured peace with Britain and a decade of prosperous trade. Jefferson claimed that it angered France and "invited rather than avoided" war. Relations with France deteriorated afterward, leaving his successor John Adams with prospective war. James Monroe was the American Minister to France, but Washington recalled him for his opposition to the treaty. The French refused to accept his replacement Charles Cotesworth Pinckney, and the French Directory declared the authority to seize American ships two days before Washington's term ended. Native American affairs Ron Chernow describes Washington as always trying to be even-handed in dealing with the Natives. He states that Washington hoped they would abandon their itinerant hunting life and adapt to fixed agricultural communities in the manner of white settlers. He also maintains that Washington never advocated outright confiscation of tribal land or the forcible removal of tribes and that he berated American settlers who abused natives, admitting that he held out no hope for pacific relations with the natives as long as "frontier settlers entertain the opinion that there is not the same crime (or indeed no crime at all) in killing a native as in killing a white man." By contrast, Colin G. Calloway writes that "Washington had a lifelong obsession with getting Indian land, either for himself or for his nation, and initiated policies and campaigns that had devastating effects in Indian country." "The growth of the nation," Calloway has stated, "demanded the dispossession of Indian people. Washington hoped the process could be bloodless and that Indian people would give up their lands for a 'fair' price and move away. But if Indians refused and resisted, as they often did, he felt he had no choice but to 'extirpate' them and that the expeditions he sent to destroy Indian towns were therefore entirely justified." 
During the fall of 1789, Washington had to contend with the British refusing to evacuate their forts in the Northwest frontier and their concerted efforts to incite hostile Indian tribes to attack American settlers. The Northwest tribes under Miami chief Little Turtle allied with the British Army to resist American expansion, and killed 1,500 settlers between 1783 and 1790. As documented by Harless (2018), Washington declared that "The Government of the United States are determined that their Administration of Indian Affairs shall be directed entirely by the great principles of Justice and humanity", and provided that their land interests should be negotiated by treaty. The administration regarded powerful tribes as foreign nations, and Washington even smoked a peace pipe and drank wine with them at the Philadelphia presidential house. He made numerous attempts to conciliate them; he equated killing indigenous peoples with killing whites and sought to integrate them into European-American culture. Secretary of War Henry Knox also attempted to encourage agriculture among the tribes. In the Southwest, negotiations failed between federal commissioners and raiding Indian tribes seeking retribution. Washington invited Creek Chief Alexander McGillivray and 24 leading chiefs to New York to negotiate a treaty and treated them like foreign dignitaries. Knox and McGillivray concluded the Treaty of New York on August 7, 1790, in Federal Hall, which provided the tribes with agricultural supplies and McGillivray with the rank of brigadier general and a salary of $1,500. In 1790, Washington sent Brigadier General Josiah Harmar to pacify the Northwest tribes, but Little Turtle routed him twice and forced him to withdraw. The Western Confederacy of tribes used guerrilla tactics and were an effective force against the sparsely manned American Army. Washington sent Major General Arthur St. Clair from Fort Washington on an expedition to restore peace in the territory in 1791. 
On November 4, St. Clair's forces were ambushed and soundly defeated by tribal forces with few survivors, despite Washington's warning of surprise attacks. Washington was outraged over what he viewed to be excessive Native American brutality and execution of captives, including women and children. St. Clair resigned his commission, and Washington replaced him with the Revolutionary War hero General Anthony Wayne. From 1792 to 1793, Wayne instructed his troops on Native American warfare tactics and instilled discipline which was lacking under St. Clair. In August 1794, Washington sent Wayne into tribal territory with authority to drive them out by burning their villages and crops in the Maumee Valley. On August 24, the American army under Wayne's leadership defeated the western confederacy at the Battle of Fallen Timbers, and the Treaty of Greenville in August 1795 opened up two-thirds of the Ohio Country for American settlement. Second term Originally, Washington had planned to retire after his first term, but many Americans could not imagine anyone else taking his place. After nearly four years as president, and dealing with the infighting in his own cabinet and with partisan critics, Washington showed little enthusiasm for running for a second term, and Martha also wished him not to run. James Madison urged him not to retire, arguing that his absence would only allow the dangerous political rift in his cabinet and the House to worsen. Jefferson also pleaded with him not to retire, agreed to drop his attacks on Hamilton, and promised to retire himself if Washington did. Hamilton maintained that Washington's absence would be "deplored as the greatest evil" to the country at this time. Washington's close nephew George Augustine Washington, his manager at Mount Vernon, was critically ill and had to be replaced, further increasing Washington's desire to retire and return to Mount Vernon. 
When the election of 1792 neared, Washington did not publicly announce his presidential candidacy. Still, he silently consented to run to prevent a further political-personal rift in his cabinet. The Electoral College unanimously elected him president on February 13, 1793, and John Adams as vice president by a vote of 77 to 50. Washington, with nominal fanfare, arrived alone at his inauguration in his carriage. Sworn into office by Associate Justice William Cushing on March 4, 1793, in the Senate Chamber of Congress Hall in Philadelphia, Washington gave a brief address and then immediately retired to his Philadelphia presidential house, weary of office and in poor health. On April 22, 1793, during the French Revolution, Washington issued his famous Neutrality Proclamation and was resolved to pursue "a conduct friendly and impartial toward the belligerent Powers" while he warned Americans not to intervene in the international conflict. Although Washington recognized France's revolutionary government, he would eventually ask that the French minister to America, Citizen Genêt, be recalled over the Citizen Genêt Affair. Genêt was a diplomatic troublemaker who was openly hostile toward Washington's neutrality policy. He procured four American ships as privateers to strike at Spanish forces (British allies) in Florida while organizing militias to strike at other British possessions. However, his efforts failed to draw America into the foreign campaigns during Washington's presidency. On July 31, 1793, Jefferson submitted his resignation from Washington's cabinet. Washington signed the Naval Act of 1794 and commissioned the first six federal frigates to combat Barbary pirates. In January 1795, Hamilton, who desired more income for his family, resigned office and was replaced by Washington's appointee Oliver Wolcott Jr. Washington and Hamilton remained friends. However, Washington's relationship with his Secretary of War Henry Knox deteriorated. 
Knox resigned amid rumors that he had profited from construction contracts on U.S. frigates. In the final months of his presidency, Washington was assailed by his political foes and a partisan press who accused him of being ambitious and greedy, while he argued that he had taken no salary during the war and had risked his life in battle. He regarded the press as a disuniting, "diabolical" force of falsehoods, sentiments that he expressed in his Farewell Address. At the end of his second term, Washington retired for personal and political reasons, dismayed by personal attacks, and to ensure that a truly contested presidential election could be held. He did not feel bound to a two-term limit, but his retirement set a significant precedent. Washington is often credited with setting the principle of a two-term presidency, but it was Thomas Jefferson who first refused to run for a third term on political grounds. Farewell Address In 1796, Washington declined to run for a third term of office, believing his death in office would create an image of a lifetime appointment. The precedent of a two-term limit was created by his retirement from office. In May 1792, in anticipation of his retirement, Washington instructed James Madison to prepare a "valedictory address", an initial draft of which was entitled the "Farewell Address". In May 1796, Washington sent the manuscript to his Secretary of the Treasury Alexander Hamilton, who did an extensive rewrite, while Washington provided final edits. On September 19, 1796, David Claypoole's American Daily Advertiser published the final version of the address. Washington stressed that national identity was paramount and that a united America would safeguard freedom and prosperity. 
He warned the nation of three imminent dangers: regionalism, partisanship, and foreign entanglements, and said the "name of AMERICAN, which belongs to you, in your national capacity, must always exalt the just pride of patriotism, more than any appellation derived from local discriminations." Washington called for men to move beyond partisanship for the common good, stressing that the United States must concentrate on its own interests. He warned against foreign alliances and their influence in domestic affairs, and against bitter partisanship and the dangers of political parties. He counseled friendship and commerce with all nations, but advised against involvement in European wars. He stressed the importance of religion, asserting that "religion and morality are indispensable supports" in a republic. Washington's address favored Hamilton's Federalist ideology and economic policies. Washington closed the address by reflecting on his legacy. After its initial publication, many Republicans, including Madison, criticized the Address and believed it was an anti-French campaign document. Madison believed Washington was strongly pro-British. Madison was also suspicious of who had authored the Address. In 1839, Washington biographer Jared Sparks maintained that Washington's "...Farewell Address was printed and published with the laws, by order of the legislatures, as an evidence of the value they attached to its political precepts, and of their affection for its author." In 1972, Washington scholar James Flexner referred to the Farewell Address as receiving as much acclaim as Thomas Jefferson's Declaration of Independence and Abraham Lincoln's Gettysburg Address. In 2010, historian Ron Chernow reported that the Farewell Address proved to be one of the most influential statements on republicanism. Post-presidency (1797–1799) Retirement Washington retired to Mount Vernon in March 1797 and devoted time to his plantations and other business interests, including his distillery. 
His plantation operations were only minimally profitable, and his lands in the west (Piedmont) were under Indian attack and yielded little income, as the squatters there refused to pay rent; he attempted to sell these lands, but without success. He became an even more committed Federalist. He vocally supported the Alien and Sedition Acts and convinced Federalist John Marshall to run for Congress to weaken the Jeffersonian hold on Virginia. Washington grew restless in retirement, prompted by tensions with France, and he wrote to Secretary of War James McHenry offering to organize President Adams' army. In a continuation of the French Revolutionary Wars, French privateers began seizing American ships in 1798, and relations with France deteriorated, leading to the "Quasi-War". Without consulting Washington, Adams nominated him for a lieutenant general commission on July 4, 1798, and the position of commander-in-chief of the armies. Washington chose to accept, replacing James Wilkinson, and he served as the commanding general from July 13, 1798, until his death 17 months later. He participated in planning for a provisional army, but he avoided involvement in details. In advising McHenry of potential officers for the army, he appeared to make a complete break with Jefferson's Democratic-Republicans: "you could as soon scrub the blackamoor white, as to change the principles of a profest Democrat; and that he will leave nothing unattempted to overturn the government of this country." Washington delegated the active leadership of the army to Hamilton, a major general. No army invaded the United States during this period, and Washington did not assume a field command. Washington was known to be rich because of the well-known "glorified façade of wealth and grandeur" at Mount Vernon, but nearly all his wealth was in the form of land and slaves rather than ready cash. To supplement his income, he erected a distillery for substantial whiskey production. 
Historians estimate that the estate was worth about $1 million in 1799 dollars. He bought land parcels to spur development around the new Federal City named in his honor, and he sold individual lots to middle-income investors rather than multiple lots to large investors, believing they would more likely commit to making improvements. Final days and death On December 12, 1799, Washington inspected his farms on horseback. He returned home late and had guests over for dinner. He had a sore throat the next day but was well enough to mark trees for cutting. That evening, he complained of chest congestion but was still cheerful. On Saturday, he awoke to an inflamed throat and difficulty breathing, so he ordered estate overseer George Rawlins to remove nearly a pint of his blood; bloodletting was a common practice of the time. His family summoned Doctors James Craik, Gustavus Richard Brown, and Elisha C. Dick. (Dr. William Thornton arrived some hours after Washington died.) Dr. Brown thought Washington had quinsy; Dr. Dick thought the condition was a more serious "violent inflammation of the throat". They continued the process of bloodletting to approximately five pints, and Washington's condition deteriorated further. Dr. Dick proposed a tracheotomy, but the others were not familiar with that procedure and therefore disapproved. Washington instructed Brown and Dick to leave the room, while he assured Craik, "Doctor, I die hard, but I am not afraid to go." Washington's death came more swiftly than expected. On his deathbed, he instructed his private secretary Tobias Lear to wait three days before his burial, out of fear of being entombed alive. According to Lear, he died peacefully between 10 and 11 p.m. on December 14, 1799, with Martha seated at the foot of his bed. His last words were "'Tis well", from his conversation with Lear about his burial. He was 67. 
Congress immediately adjourned for the day upon news of Washington's death, and the Speaker's chair was shrouded in black the next morning. The funeral was held four days after his death on December 18, 1799, at Mount Vernon, where his body was interred. Cavalry and foot soldiers led the procession, and six colonels served as the pallbearers. The Mount Vernon funeral service was restricted mostly to family and friends. Reverend Thomas Davis read the funeral service by the vault with a brief address, followed by a ceremony performed by various members of Washington's Masonic lodge in Alexandria, Virginia. Congress chose Light-Horse Harry Lee to deliver the eulogy. Word of his death traveled slowly; church bells rang in the cities, and many places of business closed. People worldwide admired Washington and were saddened by his death, and memorial processions were held in major cities of the United States. Martha wore a black mourning cape for one year, and she burned their correspondence to protect their privacy. Only five letters between the couple are known to have survived: two from Martha to George and three from him to her. The diagnosis of Washington's illness and the immediate cause of his death have been subjects of debate since the day he died. The published account of Drs. Craik and Brown stated that his symptoms had been consistent with cynanche trachealis (tracheal inflammation), a term of that period used to describe severe inflammation of the upper windpipe, including quinsy. Accusations have persisted since Washington's death concerning medical malpractice, with some believing he had been bled to death. Various modern medical authors have speculated that he died from a severe case of epiglottitis complicated by the given treatments, most notably the massive blood loss which almost certainly caused hypovolemic shock. 
Burial, net worth, and aftermath Washington was buried in the old Washington family vault at Mount Vernon, situated on a grassy slope overspread with willow, juniper, cypress, and chestnut trees. It contained the remains of his brother Lawrence and other family members, but the decrepit brick vault needed repair, prompting Washington to leave instructions in his will for the construction of a new vault. Washington's estate at the time of his death was worth an estimated $780,000 in 1799, approximately equivalent to $17.82 million in 2021. Washington's peak net worth was $587 million, including his 300 slaves. Washington held title to more than 65,000 acres of land in 37 different locations. In 1830, a disgruntled ex-employee of the estate attempted to steal what he thought was Washington's skull, prompting the construction of a more secure vault. The next year, the new vault was constructed at Mount Vernon to receive the remains of George and Martha and other relatives. In 1832, a joint Congressional committee debated moving his body from Mount Vernon to a crypt in the Capitol. The crypt had been built by architect Charles Bulfinch in the 1820s during the reconstruction of the burned-out Capitol, after the Burning of Washington by the British during the War of 1812. Southern opposition was intense, antagonized by an ever-growing rift between North and South; many were concerned that Washington's remains could end up on "a shore foreign to his native soil" if the country became divided, and Washington's remains stayed in Mount Vernon. On October 7, 1837, Washington's remains were placed, still in the original lead coffin, within a marble sarcophagus designed by William Strickland and constructed by John Struthers earlier that year. The sarcophagus was sealed and encased with planks, and an outer vault was constructed around it. 
The outer vault has the sarcophagi of both George and Martha Washington; the inner vault has the remains of other Washington family members and relatives. Personal life Washington was somewhat reserved in personality, but he generally had a strong presence among others. He made speeches and announcements when required, but he was not a noted orator or debater. He was taller than most of his contemporaries; accounts of his exact height and weight vary, but he was powerfully built and known for his great strength. He had grey-blue eyes and reddish-brown hair which he wore powdered in the fashion of the day. He had a rugged and dominating presence, which garnered respect from his peers. He bought the slave William Lee on May 27, 1768, and Lee served as Washington's valet for 20 years; Lee was the only slave freed immediately in Washington's will. Washington frequently suffered from severe tooth decay and ultimately lost all his teeth but one. He had several sets of false teeth, which he wore during his presidency, made using a variety of materials including both animal and human teeth, but wood was not used despite common lore. These dental problems left him in constant pain, for which he took laudanum. As a public figure, he relied upon the strict confidence of his dentist. Washington was a talented equestrian early in life. He collected thoroughbreds at Mount Vernon, and his two favorite horses were Blueskin and Nelson. Fellow Virginian Thomas Jefferson said Washington was "the best horseman of his age and the most graceful figure that could be seen on horseback"; he also hunted foxes, deer, ducks, and other game. He was an excellent dancer and attended the theater frequently. He drank in moderation but was morally opposed to excessive drinking, smoking tobacco, gambling, and profanity. 
Religion and Freemasonry Washington was descended from Anglican minister Lawrence Washington (his great-great-grandfather), whose troubles with the Church of England may have prompted his heirs to emigrate to America. Washington was baptized as an infant in April 1732 and became a devoted member of the Church of England (the Anglican Church). He served more than 20 years as a vestryman and churchwarden for Fairfax Parish and Truro Parish, Virginia. He privately prayed and read the Bible daily, and he publicly encouraged people and the nation to pray. He may have taken communion on a regular basis prior to the Revolutionary War, but he did not do so following the war, for which he was admonished by Pastor James Abercrombie. Washington believed in a "wise, inscrutable, and irresistible" Creator God who was active in the Universe, contrary to deistic thought. He referred to God by the Enlightenment terms Providence, the Creator, or the Almighty, and also as the Divine Author or the Supreme Being. He believed in a divine power who watched over battlefields, was involved in the outcome of war, was protecting his life, and was involved in American politics—and specifically in the creation of the United States. Modern historian Ron Chernow has posited that Washington avoided evangelistic Christianity or hellfire-and-brimstone speech along with communion and anything inclined to "flaunt his religiosity". Chernow has also said Washington "never used his religion as a device for partisan purposes or in official undertakings". No mention of Jesus Christ appears in his private correspondence, and such references are rare in his public writings. He frequently quoted from the Bible or paraphrased it, and often referred to the Anglican Book of Common Prayer. There is debate on whether he is best classed as a Christian or a theistic rationalist—or both. Washington emphasized religious toleration in a nation with numerous denominations and religions. 
He publicly attended services of different Christian denominations and prohibited anti-Catholic celebrations in the Army. He engaged workers at Mount Vernon without regard for religious belief or affiliation. While president, he acknowledged major religious sects and gave speeches on religious toleration. He was distinctly rooted in the ideas, values, and modes of thinking of the Enlightenment, but he harbored no contempt of organized Christianity and its clergy, "being no bigot myself to any mode of worship". In 1793, speaking to members of the New Church in Baltimore, Washington proclaimed, "We have abundant reason to rejoice that in this Land the light of truth and reason has triumphed over the power of bigotry and superstition." Freemasonry was a widely accepted institution in the late 18th century, known for advocating moral teachings. Washington was attracted to the Masons' dedication to the Enlightenment principles of rationality, reason, and brotherhood. The American Masonic lodges did not share the anti-clerical perspective of the controversial European lodges. A Masonic lodge was established in Fredericksburg in September 1752, and Washington was initiated two months later at the age of 20 as one of its first Entered Apprentices. Within a year, he progressed through its ranks to become a Master Mason. Washington had high regard for the Masonic Order, but his personal lodge attendance was sporadic. In 1777, a convention of Virginia lodges asked him to be the Grand Master of the newly established Grand Lodge of Virginia, but he declined due to his commitments leading the Continental Army. After 1782, he frequently corresponded with Masonic lodges and members, and he was listed as Master in the Virginia charter of Alexandria Lodge No. 22 in 1788. Slavery In Washington's lifetime, slavery was deeply ingrained in the economic and social fabric of Virginia. Slavery was legal in all of the Thirteen Colonies prior to the American Revolution. 
Washington's slaves Washington owned and rented enslaved African Americans, and during his lifetime over 577 slaves lived and worked at Mount Vernon. He acquired them through inheritance, gaining control of 84 dower slaves upon his marriage to Martha, and purchased at least 71 slaves between 1752 and 1773. From 1786 he also rented slaves; at his death he was renting 41. His early views on slavery were no different from those of any Virginia planter of the time. From the 1760s his attitudes underwent a slow evolution. The first doubts were prompted by his transition from tobacco to grain crops, which left him with a costly surplus of slaves, causing him to question the system's economic efficiency. His growing disillusionment with the institution was spurred by the principles of the American Revolution and revolutionary friends such as Lafayette and Hamilton. Most historians agree the Revolution was central to the evolution of Washington's attitudes on slavery; "After 1783", Kenneth Morgan writes, "...[Washington] began to express inner tensions about the problem of slavery more frequently, though always in private..." The many contemporary reports of slave treatment at Mount Vernon are varied and conflicting. Historian Kenneth Morgan (2000) maintains that Washington was frugal in spending on clothes and bedding for his slaves, provided them with only just enough food, and maintained strict control over them, instructing his overseers to keep them working hard from dawn to dusk year-round. However, historian Dorothy Twohig (2001) said: "Food, clothing, and housing seem to have been at least adequate". Washington faced growing debts involved with the costs of supporting slaves. He held an "engrained sense of racial superiority" towards African Americans but harbored no ill feelings toward them. Some enslaved families worked at different locations on the plantation but were allowed to visit one another on their days off. 
Washington's slaves received two hours off for meals during the workday and were given time off on Sundays and religious holidays. Some accounts report that Washington opposed flogging but at times sanctioned its use, generally as a last resort, on both male and female slaves. Washington used both reward and punishment to encourage discipline and productivity in his slaves. He tried appealing to an individual's sense of pride, gave better blankets and clothing to the "most deserving", and motivated his slaves with cash rewards. He believed "watchfulness and admonition" to be often better deterrents against transgressions but would punish those who "will not do their duty by fair means". Punishment ranged in severity from demotion back to fieldwork, through whipping and beatings, to permanent separation from friends and family by sale. Historian Ron Chernow maintains that overseers were required to warn slaves before resorting to the lash and required Washington's written permission before whipping, though his extended absences did not always permit this. Washington remained dependent on slave labor to work his farms and negotiated the purchase of more slaves in 1786 and 1787. Washington brought several of his slaves with him and his family to the federal capital during his presidency. When the capital moved from New York City to Philadelphia in 1791, the president began rotating his slave household staff periodically between the capital and Mount Vernon. This was done deliberately to circumvent Pennsylvania's Slavery Abolition Act, which, in part, automatically freed any slave who moved to the state and lived there for more than six months. In May 1796, Martha's personal and favorite slave Oney Judge escaped to Portsmouth. At Martha's behest, Washington attempted to capture her, using a Treasury agent, but the effort failed. In February 1797, Washington's personal slave Hercules escaped to Philadelphia and was never found. 
In February 1786, Washington took a census of Mount Vernon and recorded 224 slaves. By 1799, slaves at Mount Vernon totaled 317, including 143 children. Washington owned 124 slaves, leased 40, and held 153 for his wife's dower interest. Washington supported many slaves who were too young or too old to work, greatly increasing Mount Vernon's slave population and causing the plantation to operate at a loss. Abolition and manumission Based on his letters, diary, documents, accounts from colleagues, employees, friends, and visitors, Washington slowly developed a cautious sympathy toward abolitionism that eventually ended with his will freeing his wartime valet William (Billy) Lee immediately and freeing the rest of his personally owned slaves upon Martha's death. As president, he remained publicly silent on the topic of slavery, believing it was a nationally divisive issue which could destroy the union. During the American Revolutionary War, Washington began to change his views on slavery. In a 1778 letter to Lund Washington, he made clear his desire "to get quit of Negroes" when discussing the exchange of slaves for the land he wanted to buy. The next year, Washington stated his intention not to separate enslaved families as a result of "a change of masters". During the 1780s, Washington privately expressed his support for the gradual emancipation of slaves. Between 1783 and 1786, he gave moral support to a plan proposed by Lafayette to purchase land and free slaves to work on it, but declined to participate in the experiment. Washington privately expressed support for emancipation to prominent Methodists Thomas Coke and Francis Asbury in 1785 but declined to sign their petition. In personal correspondence the next year, he made clear his desire to see the institution of slavery ended by a gradual legislative process, a view that correlated with the mainstream antislavery literature published in the 1780s that Washington possessed. 
He significantly reduced his purchases of slaves after the war but continued to acquire them in small numbers. In 1788, Washington declined a suggestion from a leading French abolitionist, Jacques Brissot, to establish an abolitionist society in Virginia, stating that although he supported the idea, the time was not yet right to confront the issue. The historian Henry Wiencek (2003) believes, based on a remark that appears in the notebook of his biographer David Humphreys, that Washington considered making a public statement by freeing his slaves on the eve of his presidency in 1789. The historian Philip D. Morgan (2005) disagrees, believing the remark was a "private expression of remorse" at his inability to free his slaves. Other historians agree with Morgan that Washington was determined not to risk national unity over an issue as divisive as slavery. Washington never responded to any of the antislavery petitions he received, and the subject was not mentioned in either his last address to Congress or his Farewell Address. The first clear indication that Washington seriously intended to free his slaves appears in a letter written to his secretary, Tobias Lear, in 1794. Washington instructed Lear to find buyers for his land in western Virginia, explaining in a private coda that he was doing so "to liberate a certain species of property which I possess, very repugnantly to my own feelings". The plan, along with others Washington considered in 1795 and 1796, could not be realized because of his failure to find buyers for his land, his reluctance to break up slave families, and the refusal of the Custis heirs to help prevent such separations by freeing their dower slaves at the same time. On July 9, 1799, Washington finished making his last will; the longest provision concerned slavery. All his slaves were to be freed after the death of his wife, Martha. Washington said he did not free them immediately because his slaves intermarried with his wife's dower slaves.
He forbade their sale or transportation out of Virginia. His will provided that old and young freed people be taken care of indefinitely; younger ones were to be taught to read and write and placed in suitable occupations. Washington freed more than 160 slaves, including about 25 he had acquired from his wife's brother Bartholomew Dandridge in payment of a debt. He was among the few large slave-holding Virginians during the Revolutionary Era who emancipated their slaves. On January 1, 1801, one year after George Washington's death, Martha Washington signed an order to free his slaves. Many of them, having never strayed far from Mount Vernon, were naturally reluctant to try their luck elsewhere; others refused to abandon spouses or children still held as dower slaves (the Custis estate) and also stayed with or near Martha. Following George Washington's instructions in his will, funds were used to feed and clothe the young, aged, and infirm slaves until the early 1830s. Historical reputation and legacy Washington's legacy endures as one of the most influential in American history since he served as commander-in-chief of the Continental Army, a hero of the Revolution, and the first president of the United States. Various historians maintain that he also was a dominant factor in America's founding, the Revolutionary War, and the Constitutional Convention. Revolutionary War comrade Light-Horse Harry Lee eulogized him as "First in war—first in peace—and first in the hearts of his countrymen". Lee's words became the hallmark by which Washington's reputation was impressed upon the American memory, with some biographers regarding him as the great exemplar of republicanism. He set many precedents for the national government and the presidency in particular, and he was called the "Father of His Country" as early as 1778. In 1879, Congress proclaimed Washington's Birthday to be a federal holiday. 
Twentieth-century biographer Douglas Southall Freeman concluded, "The great big thing stamped across that man is character." Modern historian David Hackett Fischer has expanded upon Freeman's assessment, defining Washington's character as "integrity, self-discipline, courage, absolute honesty, resolve, and decision, but also forbearance, decency, and respect for others". Washington became an international symbol for liberation and nationalism as the leader of the first successful revolution against a colonial empire. The Federalists made him the symbol of their party, but the Jeffersonians continued to distrust his influence for many years and delayed building the Washington Monument. Washington was elected a member of the American Academy of Arts and Sciences on January 31, 1781, before he had even begun his presidency. He was posthumously appointed to the grade of General of the Armies of the United States during the United States Bicentennial to ensure he would never be outranked; this was accomplished by the congressional joint resolution Public Law 94-479 passed on January 19, 1976, with an effective appointment date of July 4, 1976. On March 13, 1978, Washington was militarily promoted to the rank of General of the Armies. Parson Weems wrote a hagiographic biography in 1809 to honor Washington. Historian Ron Chernow maintains that Weems attempted to humanize Washington, making him look less stern, and to inspire "patriotism and morality" and to foster "enduring myths", such as Washington's refusal to lie about damaging his father's cherry tree. Weems' accounts have never been proven or disproven. Historian John Ferling, however, maintains that Washington remains the only founder and president ever to be referred to as "godlike", and points out that his character has been the most scrutinized by historians, past and present. Historian Gordon S. 
Wood concludes that "the greatest act of his life, the one that gave him his greatest fame, was his resignation as commander-in-chief of the American forces." Chernow suggests that Washington was "burdened by public life" and divided by "unacknowledged ambition mingled with self-doubt". A 1993 review of presidential polls and surveys consistently ranked Washington number 4, 3, or 2 among presidents. A 2018 Siena College Research Institute survey ranked him number 1 among presidents. In the 21st century, Washington's reputation has been critically scrutinized. Along with various other Founding Fathers, he has been condemned for holding enslaved human beings. Though he expressed the desire to see the abolition of slavery come through legislation, he did not initiate or support any initiatives for bringing about its end. This has led to calls from some activists to remove his name from public buildings and his statue from public spaces. Nonetheless, Washington maintains his place among the highest-ranked U.S. Presidents, listed second (after Lincoln) in a 2021 C-SPAN poll. Memorials Jared Sparks began collecting and publishing Washington's documentary record in the 1830s in Life and Writings of George Washington (12 vols., 1834–1837). The Writings of George Washington from the Original Manuscript Sources, 1745–1799 (1931–1944) is a 39-volume set edited by John Clement Fitzpatrick, whom the George Washington Bicentennial Commission commissioned. It contains more than 17,000 letters and documents and is available online from the University of Virginia. Educational institutions Numerous secondary schools are named in honor of Washington, as are many universities, including George Washington University and Washington University in St. Louis. Places and monuments Many places and monuments have been named in honor of Washington, most notably the capital of the United States, Washington, D.C. The state of Washington is the only US state to be named after a president.
Washington appears as one of four U.S. presidents in a colossal statue by Gutzon Borglum on Mount Rushmore in South Dakota. Currency and postage George Washington appears on contemporary U.S. currency, including the one-dollar bill, the Presidential one-dollar coin and the quarter-dollar coin (the Washington quarter). Washington and Benjamin Franklin appeared on the nation's first postage stamps in 1847. Washington has since appeared on many postage issues, more than any other person. See also British Army during the American Revolutionary War List of American Revolutionary War battles List of Continental Forces in the American Revolutionary War Timeline of the American Revolution Founders Online
https://en.wikipedia.org/wiki/Gulf%20Coast%20of%20the%20United%20States
Gulf Coast of the United States
The Gulf Coast of the United States is the coastline along the Southern United States where it meets the Gulf of Mexico. The coastal states that have a shoreline on the Gulf of Mexico are Texas, Louisiana, Mississippi, Alabama, and Florida, and these are known as the Gulf States. The economy of the Gulf Coast area is dominated by industries related to energy, petrochemicals, fishing, aerospace, agriculture, and tourism. The large cities of the region are (from west to east) Brownsville, Corpus Christi, Houston, Galveston, Beaumont, Lake Charles, Lafayette, Baton Rouge, New Orleans, Gulfport, Biloxi, Mobile, Pensacola, Navarre, St. Petersburg, and Tampa. All are the centers or major cities of their respective metropolitan areas, and many contain large ports. Geography The Gulf Coast is made of many inlets, bays, and lagoons. The coast is intersected by numerous rivers, the largest of which is the Mississippi River. Much of the land along the Gulf Coast is, or was, marshland. Ringing the Gulf Coast is the Gulf Coastal Plain, which reaches from Southern Texas to the western Florida Panhandle, while the western portions of the Gulf Coast are made up of many barrier islands and peninsulas, including Padre Island along the Texas coast. These landforms protect numerous bays and inlets, serving as a barrier against oncoming waves. The central part of the Gulf Coast, from eastern Texas through Louisiana, consists primarily of marshland. The eastern part of the Gulf Coast, predominantly Florida, is dotted with many bays and inlets. Climate The Gulf Coast climate is humid subtropical, although the southwestern tip of Florida, such as Everglades City, features a tropical climate. Much of the year is warm to hot along the Gulf Coast, while the three winter months bring periods of cool (or rarely, cold) weather mixed with mild temperatures. The area is highly vulnerable to hurricanes as well as floods and severe thunderstorms.
Much of the Gulf Coast has a summer precipitation maximum, with July or August commonly the wettest month due to the combination of frequent summer thunderstorms produced by relentless heat and humidity, and tropical weather systems (tropical depressions, tropical storms and hurricanes), while winter and early spring rainfall also can be heavy. This pattern is evident in southern cities such as Houston, Texas; New Orleans, Louisiana; Mobile, Alabama; and Pensacola, Florida. However, the central and southern Florida peninsula and South Texas have a pronounced winter dry season, as at Tampa and Fort Myers, Florida. On the central and southern Texas coast, winter, early spring and mid-summer are markedly drier, and September is the wettest month on average (as at Corpus Christi and Brownsville, Texas). Tornadoes are infrequent at the coast itself but occur more frequently in inland portions of Gulf Coast states. Over most of the Gulf Coast from Houston, Texas eastward, extreme rainfall events are a significant threat, commonly from tropical weather systems, which can bring 4 to 10 or more inches of rain in a single day. In August 2017, Hurricane Harvey made landfall along the central Texas coast, then migrated to and stalled over the greater Houston area for several days, producing extreme, unprecedented rainfall totals of over 40 inches (1,000 mm) in many areas, unleashing widespread flooding. Earthquakes are extremely rare in the area, but a surprising magnitude 6.0 earthquake in the Gulf of Mexico on September 10, 2006, could be felt from New Orleans to Tampa. Economic activities The Gulf Coast is a major center of economic activity. The marshlands along the Louisiana and Texas coasts provide breeding grounds and nurseries for ocean life that drive the fishing and shrimping industries. The Port of South Louisiana (Metropolitan New Orleans in Laplace) and the Port of Houston are two of the ten busiest ports in the world by cargo volume.
As of 2004, seven of the top ten busiest ports in the U.S. are on the Gulf Coast. The discovery of oil and gas deposits along the coast and offshore, combined with easy access to shipping, have made the Gulf Coast the heart of the U.S. petrochemical industry. The coast contains nearly 4,000 oil platforms. Besides the above, the region features other important industries including aerospace and biomedical research, as well as older industries such as agriculture and — especially since the development of the Gulf Coast beginning in the 1920s and the increase in wealth throughout the United States — tourism. History Before European settlers arrived in the region, the Gulf Coast was home to several pre-Columbian kingdoms which had extensive trade networks with empires such as the Aztecs and the Mississippi Mound Builders. Shark and alligator teeth and shells from the Gulf have been found as far north as Ohio, in the mounds of the Hopewell culture. The first Europeans to settle the Gulf Coast were primarily the French and the Spanish. The Louisiana Purchase, Adams–Onís Treaty and the Texas Revolution made the Gulf Coast a part of the United States during the first half of the 19th century. As the U.S. population continued to expand its frontiers westward, the Gulf Coast was a natural magnet in the South providing access to shipping lanes and both national and international commerce. The development of sugar and cotton production (enabled by slavery) allowed the South to prosper. By the mid 19th century the city of New Orleans, being situated as a key to commerce on the Mississippi River and in the Gulf, had become the largest U.S. city not on the Atlantic seaboard and the fourth largest in the U.S. overall. Two major events were turning points in the earlier history of the Gulf Coast region. The first was the American Civil War, which caused severe damage to some economic sectors in the South, including the Gulf Coast. 
The second event was the Galveston Hurricane of 1900. At the end of the 19th century Galveston was, with New Orleans, one of the most developed cities in the region. The city had the third busiest port in the U.S. and its financial district was known as the "Wall Street of the South". The storm mostly destroyed the city, which has never regained its former glory, and set back development in the region. Since then the Gulf Coast has been hit with numerous other hurricanes. On August 29, 2005, Hurricane Katrina struck the Gulf Coast as a Category 3 hurricane. It was the most damaging storm in the history of the United States, causing upwards of $80 billion in damages, and leaving over 1,800 dead. In 2008, the Gulf Coast was again struck by a catastrophic hurricane. Due to its immense size, Hurricane Ike caused devastation from the Louisiana coastline all the way to the Kenedy County, Texas region near Corpus Christi. In addition, Ike caused flooding and significant damage along the Mississippi coastline and the Florida Panhandle. Ike killed 112 people and left upwards of 300 people missing, never to be found. Hurricane Ike was the third most damaging storm in the history of the United States, causing more than $25 billion in damage along the coast, leaving hundreds of thousands of people homeless, and sparking the largest search-and-rescue operation in U.S. history. Other than the hurricanes, the Gulf Coast has redeveloped dramatically over the course of the 20th century. The Gulf Coast is highly populated. The petrochemical industry, launched with the major discoveries of oil in Texas and spurred on by further discoveries in the Gulf waters, has been a vehicle for development in the central and western Gulf which has spawned development on a variety of fronts in these regions.
Texas in particular has benefited tremendously from this industry over the course of the 20th century and economic diversification has made the state a magnet for population and home to more Fortune 500 companies than any other U.S. state. Florida has grown as well, driven to a great extent by its long established tourism industry but also by its position as a gateway to the Caribbean and Latin America. As of 2006, these two states are the second and fourth most populous states in the nation, respectively. Other areas of the Gulf Coast have benefited less, though economic development fueled by tourism has greatly increased property values along the coast, and is now a severe danger to the valuable but fragile ecosystems of the Gulf Coast. Metropolitan areas The following table lists the 10 largest core-based statistical areas along the Gulf Coast. Transportation Road Major Interstates Major U.S. routes Other significant routes Air International service International Destinations Rail Amtrak service See also East Coast of the United States West Coast of the United States Emerald Coast Florida Panhandle Geography of the United States Gulf Coast Ecosystem Restoration Task Force Gulf States Marine Fisheries Commission Houston List of ports in the United States Megaregions of the United States Mississippi Gulf Coast New Orleans Tampa West Florida Notes Further reading Drescher, Christopher F., Stefan E. Schulenberg, and C. Veronica Smith. "The Deepwater Horizon Oil Spill and the Mississippi Gulf Coast: Mental health in the context of a technological disaster." American Journal of Orthopsychiatry 84.2 (2014): 142. Smith, F. Todd Louisiana and the Gulf South Frontier, 1500–1821 (Louisiana State University Press; 2014) 304 pages Williamson, James M., and John L. Pender. "Economic Stimulus and the Tax Code The Impact of the Gulf Opportunity Zone." Public Finance Review (2014): 1091142114557724.
https://en.wikipedia.org/wiki/Galaxy%20formation%20and%20evolution
Galaxy formation and evolution
The study of galaxy formation and evolution is concerned with the processes that formed a heterogeneous universe from a homogeneous beginning, the formation of the first galaxies, the way galaxies change over time, and the processes that have generated the variety of structures observed in nearby galaxies. Galaxy formation is hypothesized to occur from structure formation theories, as a result of tiny quantum fluctuations in the aftermath of the Big Bang. The simplest model in general agreement with observed phenomena is the Lambda-CDM model—that is, that clustering and merging allows galaxies to accumulate mass, determining both their shape and structure. Commonly observed properties of galaxies Because of the inability to conduct experiments in outer space, the only way to “test” theories and models of galaxy evolution is to compare them with observations. Explanations for how galaxies formed and evolved must be able to predict the observed properties and types of galaxies. Edwin Hubble created the first galaxy classification scheme known as the Hubble tuning-fork diagram. It partitioned galaxies into ellipticals, normal spirals, barred spirals (such as the Milky Way), and irregulars. These galaxy types exhibit the following properties which can be explained by current galaxy evolution theories: Many of the properties of galaxies (including the galaxy color–magnitude diagram) indicate that there are fundamentally two types of galaxies. These groups divide into blue star-forming galaxies that are more like spiral types, and red non-star forming galaxies that are more like elliptical galaxies. Spiral galaxies are quite thin, dense, and rotate relatively fast, while the stars in elliptical galaxies have randomly oriented orbits. The majority of giant galaxies contain a supermassive black hole in their centers, ranging in mass from millions to billions of times the mass of our Sun. The black hole mass is tied to the host galaxy bulge or spheroid mass. 
Metallicity has a positive correlation with the absolute magnitude (luminosity) of a galaxy. There is a common misconception that Hubble believed incorrectly that the tuning fork diagram described an evolutionary sequence for galaxies, from elliptical galaxies through lenticulars to spiral galaxies. This is not the case; instead, the tuning fork diagram shows an evolution from simple to complex with no temporal connotations intended. Astronomers now believe that disk galaxies likely formed first, then evolved into elliptical galaxies through galaxy mergers. Current models also predict that the majority of mass in galaxies is made up of dark matter, a substance which is not directly observable, and might not interact through any means except gravity. This observation arises because galaxies could not have formed as they have, or rotate as they are seen to, unless they contain far more mass than can be directly observed. Formation of disk galaxies The earliest stage in the evolution of galaxies is the formation. When a galaxy forms, it has a disk shape and is called a spiral galaxy due to spiral-like "arm" structures located on the disk. There are different theories on how these disk-like distributions of stars develop from a cloud of matter: however, at present, none of them exactly predicts the results of observation. Top-down theories Olin Eggen, Donald Lynden-Bell, and Allan Sandage in 1962, proposed a theory that disk galaxies form through a monolithic collapse of a large gas cloud. The distribution of matter in the early universe was in clumps that consisted mostly of dark matter. These clumps interacted gravitationally, putting tidal torques on each other that acted to give them some angular momentum. As the baryonic matter cooled, it dissipated some energy and contracted toward the center. With angular momentum conserved, the matter near the center speeds up its rotation. Then, like a spinning ball of pizza dough, the matter forms into a tight disk. 
Once the disk cools, the gas is not gravitationally stable, so it cannot remain a singular homogeneous cloud. It breaks up into smaller clouds of gas, which form stars. Since the dark matter does not dissipate as it only interacts gravitationally, it remains distributed outside the disk in what is known as the dark halo. Observations show that there are stars located outside the disk, which does not quite fit the "pizza dough" model. This monolithic-collapse picture, known as a top-down formation scenario, is quite simple yet no longer widely accepted; Leonard Searle and Robert Zinn instead first proposed that galaxies form by the coalescence of smaller progenitors. Bottom-up theories More recent theories include the clustering of dark matter halos in the bottom-up process. Instead of large gas clouds collapsing to form a galaxy in which the gas breaks up into smaller clouds, it is proposed that matter started out in these “smaller” clumps (mass on the order of globular clusters), and then many of these clumps merged to form galaxies, which then were drawn by gravitation to form galaxy clusters. This still results in disk-like distributions of baryonic matter with dark matter forming the halo for all the same reasons as in the top-down theory. Models using this sort of process predict more small galaxies than large ones, which matches observations. Astronomers do not currently know what process stops the contraction. In fact, theories of disk galaxy formation are not successful at producing the rotation speed and size of disk galaxies. It has been suggested that the radiation from bright newly formed stars, or from an active galactic nucleus can slow the contraction of a forming disk. It has also been suggested that the dark matter halo can pull the galaxy, thus stopping disk contraction. The Lambda-CDM model is a cosmological model that explains the formation of the universe after the Big Bang.
It is a relatively simple model that predicts many properties observed in the universe, including the relative frequency of different galaxy types; however, it underestimates the number of thin disk galaxies in the universe. The reason is that these galaxy formation models predict a large number of mergers. If disk galaxies merge with another galaxy of comparable mass (at least 15 percent of its mass) the merger will likely destroy, or at a minimum greatly disrupt the disk, and the resulting galaxy is not expected to be a disk galaxy (see next section). While this remains an unsolved problem for astronomers, it does not necessarily mean that the Lambda-CDM model is completely wrong, but rather that it requires further refinement to accurately reproduce the population of galaxies in the universe. Galaxy mergers and the formation of elliptical galaxies Elliptical galaxies (such as IC 1101) are among some of the largest known thus far. Their stars are on orbits that are randomly oriented within the galaxy (i.e. they are not rotating like disk galaxies). A distinguishing feature of elliptical galaxies is that the velocity of the stars does not necessarily contribute to flattening of the galaxy, such as in spiral galaxies. Elliptical galaxies have central supermassive black holes, and the masses of these black holes correlate with the galaxy's mass. Elliptical galaxies have two main stages of evolution. The first is due to the supermassive black hole growing by accreting cooling gas. The second stage is marked by the black hole stabilizing by suppressing gas cooling, thus leaving the elliptical galaxy in a stable state. The mass of the black hole is also correlated to a property called sigma which is the dispersion of the velocities of stars in their orbits. This relationship, known as the M-sigma relation, was discovered in 2000. Elliptical galaxies mostly lack disks, although some bulges of disk galaxies resemble elliptical galaxies. 
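The M-sigma relation mentioned above is an empirical power law linking a galaxy's central black hole mass to the velocity dispersion (sigma) of its stars. As a rough illustration only, the scaling can be sketched in Python; the normalization `m0`, pivot dispersion `sigma0`, and exponent `alpha` below are assumed order-of-magnitude values chosen for demonstration, not a published fit:

```python
def m_sigma_bh_mass(sigma_kms, m0=1.9e8, sigma0=200.0, alpha=4.4):
    """Illustrative M-sigma scaling: M_BH = m0 * (sigma / sigma0)**alpha.

    m0 (solar masses), sigma0 (km/s), and alpha are hypothetical,
    illustrative parameters; real fits differ in normalization and slope.
    """
    return m0 * (sigma_kms / sigma0) ** alpha

# Because the exponent is steep, doubling the velocity dispersion
# multiplies the implied black hole mass by 2**alpha.
ratio = m_sigma_bh_mass(400.0) / m_sigma_bh_mass(200.0)
```

The point of the sketch is the steepness: with an exponent of roughly 4 to 5, modest differences in stellar velocity dispersion imply very different black hole masses, which is why sigma serves as a convenient observational proxy for black hole mass.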
Elliptical galaxies are more likely found in crowded regions of the universe (such as galaxy clusters). Astronomers now see elliptical galaxies as some of the most evolved systems in the universe. It is widely accepted that the main driving force for the evolution of elliptical galaxies is mergers of smaller galaxies. Many galaxies in the universe are gravitationally bound to other galaxies, which means that they will never escape their mutual pull. If the galaxies are of similar size, the resultant galaxy will appear similar to neither of the progenitors, but will instead be elliptical. There are many types of galaxy mergers, which do not necessarily result in elliptical galaxies, but result in a structural change. For example, a minor merger event is thought to be occurring between the Milky Way and the Magellanic Clouds. Mergers between such large galaxies are regarded as violent, and the frictional interaction of the gas between the two galaxies can cause gravitational shock waves, which are capable of forming new stars in the new elliptical galaxy. By sequencing several images of different galactic collisions, one can observe the timeline of two spiral galaxies merging into a single elliptical galaxy. In the Local Group, the Milky Way and the Andromeda Galaxy are gravitationally bound, and currently approaching each other at high speed. Simulations show that the Milky Way and Andromeda are on a collision course, and are expected to collide in less than five billion years. During this collision, it is expected that the Sun and the rest of the Solar System will be ejected from its current path around the Milky Way. The remnant could be a giant elliptical galaxy. Galaxy quenching One observation (see above) that must be explained by a successful theory of galaxy evolution is the existence of two different populations of galaxies on the galaxy color-magnitude diagram. 
Most galaxies tend to fall into two separate locations on this diagram: a "red sequence" and a "blue cloud". Red sequence galaxies are generally non-star-forming elliptical galaxies with little gas and dust, while blue cloud galaxies tend to be dusty star-forming spiral galaxies. As described in previous sections, galaxies tend to evolve from spiral to elliptical structure via mergers. However, the current rate of galaxy mergers does not explain how all galaxies move from the "blue cloud" to the "red sequence". It also does not explain how star formation ceases in galaxies. Theories of galaxy evolution must therefore be able to explain how star formation turns off in galaxies. This phenomenon is called galaxy "quenching". Stars form out of cold gas (see also the Kennicutt–Schmidt law), so a galaxy is quenched when it has no more cold gas. However, it is thought that quenching occurs relatively quickly (within 1 billion years), which is much shorter than the time it would take for a galaxy to simply use up its reservoir of cold gas. Galaxy evolution models explain this by hypothesizing other physical mechanisms that remove or shut off the supply of cold gas in a galaxy. These mechanisms can be broadly classified into two categories: (1) preventive feedback mechanisms that stop cold gas from entering a galaxy or stop it from producing stars, and (2) ejective feedback mechanisms that remove gas so that it cannot form stars. One theorized preventive mechanism called “strangulation” keeps cold gas from entering the galaxy. Strangulation is likely the main mechanism for quenching star formation in nearby low-mass galaxies. The exact physical explanation for strangulation is still unknown, but it may have to do with a galaxy's interactions with other galaxies. As a galaxy falls into a galaxy cluster, gravitational interactions with other galaxies can strangle it by preventing it from accreting more gas. 
For galaxies with massive dark matter halos, another preventive mechanism called “virial shock heating” may also prevent gas from becoming cool enough to form stars. Ejective processes, which expel cold gas from galaxies, may explain how more massive galaxies are quenched. One ejective mechanism is caused by supermassive black holes found in the centers of galaxies. Simulations have shown that gas accreting onto supermassive black holes in galactic centers produces high-energy jets; the released energy can expel enough cold gas to quench star formation. Our own Milky Way and the nearby Andromeda Galaxy currently appear to be undergoing the quenching transition from star-forming blue galaxies to passive red galaxies. Gallery See also List of galaxies Red nugget, small galaxies packed with large amounts of red stars Further reading References External links NOAO gallery of galaxy images Image of Andromeda galaxy (M31) Javascript passive evolution calculator for early type (elliptical) galaxies Video on the evolution of galaxies by Canadian astrophysicist Doctor P Formation and evolution Stellar evolution Concepts in astronomy
11973
https://en.wikipedia.org/wiki/Generation%20X
Generation X
Generation X (or Gen X for short) is the demographic cohort following the baby boomers and preceding the millennials. Researchers and popular media use the mid-to-late 1960s as starting birth years and the late 1970s to early 1980s as ending birth years, with the generation being generally defined as people born from 1965 to 1980. By this definition and U.S. Census data, there are 65.2 million Gen Xers in the United States as of 2019. Most members of Generation X are the children of the Silent Generation and early boomers; Xers are also often the parents of millennials and Generation Z. As children in the 1970s and 1980s, a time of shifting societal values, Gen Xers were sometimes called the "latchkey generation", an image spawned by children returning to an empty home and needing to use the door key, a consequence of reduced adult supervision compared to previous generations. This was a result of increasing divorce rates and increased maternal participation in the workforce before childcare options outside the home were widely available. As adolescents and young adults in the 1980s and 1990s, Xers were dubbed the "MTV Generation" (a reference to the music video channel) and were sometimes characterized as slackers: cynical and disaffected. Among the many cultural influences on Gen X youth was a proliferation of musical genres with strong social-tribal identities, such as punk, post-punk, and heavy metal, in addition to later forms developed by Gen Xers themselves (e.g. grunge, grindcore, and related genres). Film was another notable influence, spanning both the birth of the franchise mega-sequel and a proliferation of independent film enabled in part by home video. Video games, both in arcades and on home devices, also became a major part of juvenile entertainment in the West for the first time. Politically, in many Eastern Bloc countries Generation X experienced the last days of communism and the transition to capitalism in its youth. 
In much of the Western world, meanwhile, a similar time period was defined by the dominance of conservatism and free-market economics. In midlife during the early 21st century, research describes them as active, happy, and achieving a work–life balance. The cohort has also been credited as entrepreneurial and productive in the workplace more broadly. Terminology and etymology The term Generation X has been used at various times to describe alienated youth. In the early 1950s, Hungarian photographer Robert Capa first used Generation X as the title for a photo-essay about young men and women growing up immediately following World War II. The term first appeared in print in a December 1952 issue of Holiday magazine announcing the upcoming publication of Capa's photo-essay. From 1976 to 1981, English musician Billy Idol used the moniker as the name for his punk rock band. Idol attributed the name of his band to Generation X, a 1964 book on British popular youth culture written by journalists Jane Deverson and Charles Hamblett — a copy of which had been owned by Idol's mother. These uses of the term appear to have no connection to Robert Capa's photo-essay. The term acquired its contemporary application after the release of Generation X: Tales for an Accelerated Culture, a 1991 novel written by Canadian author Douglas Coupland. In 1987, Coupland had written a piece in Vancouver Magazine titled "Generation X" which was "the seed of what went on to become the book". Coupland referenced Billy Idol's band Generation X in the 1987 article and again in 1989 in Vista magazine. In the book proposal for his novel, Coupland writes that Generation X is "taken from the name of Billy Idol's long-defunct punk band of the late 1970s". 
However, in 1995 Coupland denied the term's connection to the band, stating that: "The book's title came not from Billy Idol's band, as many supposed, but from the final chapter of a funny sociological book on American class structure titled Class, by Paul Fussell. In his final chapter, Fussell named an 'X' category of people who wanted to hop off the merry-go-round of status, money, and social climbing that so often frames modern existence." Author William Strauss noted that around the time Coupland's 1991 novel was published the symbol "X" was prominent in popular culture, as the film Malcolm X was released in 1992, and that the name "Generation X" ended up sticking. The "X" refers to an unknown variable or to a desire not to be defined. Strauss's coauthor Neil Howe noted the delay in naming this demographic cohort saying, "Over 30 years after their birthday, they didn't have a name. I think that's germane." Previously, the cohort had been referred to as Post-Boomers, Baby Busters (referencing the drop in the birth rates following the baby boom), New Lost Generation, latchkey kids, MTV Generation, and the 13th Generation (the 13th generation since American independence). Date and age range definitions Generation X is the demographic cohort following the post–World War II baby-boom, representing a generational change from the baby boomers. Many researchers and demographers use dates which correspond to the fertility-patterns in the population. For Generation X, in the U.S. (and broadly, in the Western world), the period begins at a time when fertility rates started to significantly decrease, following the baby boom peak of the late 1950s, until an upswing in the late 1970s and eventual recovery at the start of the 1980s. In the U.S., the Pew Research Center, a non-partisan think-tank, delineates a Generation X period of 1965–1980 which has, albeit gradually, come to gain acceptance in academic circles. 
Moreover, although fertility rates are the primary basis for start and end dates, the center remarks: "Generations are analytical constructs, it takes time for popular and expert consensus to develop as to the precise boundaries that demarcate one generation from another." Pew takes into account other factors, notably the labor market as well as attitudinal and behavioral trends of a group. Writing for Pew's Trend magazine in 2018, psychologist Jean Twenge observed that the "birth year boundaries of Gen X are debated but settle somewhere around 1965–1980". According to this definition, the oldest Gen Xers were born in 1965 and the youngest in 1980. The Brookings Institution, another U.S. think-tank, sets the Gen X period as between 1965 and 1981. The U.S. Federal Reserve Board uses 1965–1980 to define Gen X. The U.S. Social Security Administration (SSA) defines the years for Gen X as between 1964 and 1979. The US Department of Defense (DoD), conversely, uses the dates 1965 to 1977. In their 2002 book When Generations Collide, Lynne Lancaster and David Stillman use 1965 to 1980, while in 2012 authors Jain and Pant also used parameters of 1965 to 1980. U.S. news outlets such as The New York Times and The Washington Post describe Generation X as people born between 1965 and 1980. Gallup, Bloomberg, Business Insider, and Forbes use 1965–1980. Time magazine states that Generation X is "roughly defined as anyone born between 1965 and 1980". In Australia, the McCrindle Research Center uses 1965–1979. In the UK, the Resolution Foundation think-tank defines Gen X as those born between 1966 and 1980. PricewaterhouseCoopers, a multinational professional services network headquartered in London, describes Generation X employees as those born from 1965 to 1980. Other age range markers On the basis of the time it takes for a generation to mature, U.S. 
authors William Strauss and Neil Howe define Generation X as those born between 1961 and 1981 in their 1991 book titled Generations, and differentiate the cohort into an early and late wave. Jeff Gordinier, in his 2008 book X Saves the World, also has a wider definition to include those born between 1961 and 1977 but possibly as late as 1980. George Masnick of the Harvard Joint Center for Housing Studies puts this generation in the time-frame of 1965 to 1984, in order to satisfy the premise that boomers, Xers, and millennials "cover equal 20-year age spans". In 2004, journalist J. Markert also acknowledged the 20-year increments but goes one step further and subdivides the generation into two 10-year cohorts with early and later members of the generation. The first begins in 1966 and ends in 1975 and the second begins in 1976 and ends in 1985; this thinking is applied to each generation (Silent, boomers, Gen X, millennials, etc.). Based on external events of historical importance, Schewe and Noble in 2002 argue that a cohort is formed against significant milestones and can be any length of time. Against this logic, Generation X begins in 1966 and ends in 1976, with those born between 1955 and 1965 being labelled as "trailing-edge boomers". In Canada, professor David Foot describes Generation X as late boomers and includes those born between 1960 and 1966, whilst the "Bust Generation", those born between 1967 and 1979, is considered altogether a separate generation, in his 1996 book Boom Bust & Echo: How to Profit from the Coming Demographic Shift. Generational cuspers Generation Jones is identified as the group of people born in the latter half of the Baby Boomers from the early 1960s to the early years of Generation X. Individuals born in the Generation X and millennial cusp years of the late 1970s and early to mid-1980s have been identified by the media as a "microgeneration" with characteristics of both generations. 
Names given to these "cuspers" include Xennials, Generation Catalano, and the Oregon Trail Generation. Demographics United States There are differences in Gen X population numbers depending on the date-range selected. In the U.S., using Census population projections, the Pew Research Center found that the Gen X population born from 1965 to 1980 numbered 65.2 million in 2019. The cohort is likely to overtake boomers in 2028. A 2010 Census report counted approximately 84 million people living in the US who are defined by birth years ranging from the early 1960s to the early 1980s. In a 2012 article for the Joint Center for Housing Studies of Harvard University, George Masnick wrote that the "Census counted 82.1 million" Gen Xers in the U.S. Masnick concluded that immigration filled in any birth year deficits during low fertility years of the late 1960s and early 1970s. Jon Miller at the Longitudinal Study of American Youth at the University of Michigan wrote that "Generation X refers to adults born between 1961 and 1981" and it "includes 84 million people". In their 1991 book Generations, authors Howe and Strauss indicated that the total number of Gen X individuals in the U.S. was 88.5 million. Impact of family planning programs The birth control pill, introduced in 1960, was one contributing factor of declining birth rates. Initially, the pill spread rapidly amongst married women as an approved treatment for menstrual disturbance. However, it was also found to prevent pregnancy and was prescribed as a contraceptive in 1964. The pill, as it became commonly known, reached younger, unmarried college women in the late 1960s when state laws were amended and reduced the age of majority from 21 to ages 18–20. These policies are commonly referred to as the Early Legal Access (ELA) laws. Another major factor was abortion, only available in a few states until its legalisation in a 1973 US Supreme Court decision in Roe v. Wade. 
This was replicated elsewhere, with reproductive rights legislation passed, notably in the UK (1967), France (1975), West Germany (1976), New Zealand (1977), Italy (1978), and the Netherlands (1980). From 1973 to 1980, the abortion rate per 1,000 US women aged 15–44 rose from 16 to 29, with more than 9.6 million terminations of pregnancy performed. Between 1970 and 1980, on average, for every 10 American citizens born, 3 pregnancies were aborted. However, increased immigration during the same period helped to partially offset declining birth rates and contributed to making Generation X an ethnically and culturally diverse demographic cohort. Parental lineage Generally, Gen Xers are the children of the Silent Generation and older baby boomers. Characteristics In the United States As children and adolescents Rising divorce rates and women's workforce participation Strauss and Howe, who wrote several books on generations, including one specifically on Generation X titled 13th Gen: Abort, Retry, Ignore, Fail? (1993), reported that Gen Xers were children at a time when society was less focused on children and more focused on adults. Xers were children during a time of increasing divorce rates, with divorce rates doubling in the mid-1960s, before peaking in 1980. Strauss and Howe described a cultural shift where the long-held societal value of staying together for the sake of the children was replaced with a societal value of parental and individual self-actualization. Strauss wrote that society "moved from what Leslie Fiedler called a 1950s-era 'cult of the child' to what Landon Jones called a 1970s-era 'cult of the adult'". The Generation Map, a report from Australia's McCrindle Research Center, writes of Gen X children: "their Boomer parents were the most divorced generation in Australian history". 
According to Christine Henseler in the 2012 book Generation X Goes Global: Mapping a Youth Culture in Motion, "We watched the decay and demise (of the family), and grew callous to the loss." The Gen X childhood coincided with the sexual revolution of the 1960s to 1980s, which Susan Gregory Thomas described in her book In Spite of Everything as confusing and frightening for children in cases where a parent would bring new sexual partners into their home. Thomas also discussed how divorce was different during the Gen X childhood, with the child having a limited or severed relationship with one parent following divorce, often the father, due to differing societal and legal expectations. In the 1970s, only nine U.S. states allowed for joint custody of children, which has since been adopted by all 50 states following a push for joint custody during the mid-1980s. Kramer vs. Kramer, a 1979 American legal drama based on Avery Corman's best-selling novel, came to epitomize the struggle for child custody and the demise of the traditional nuclear family. The rapid influx of boomer women into the labor force that began in the 1970s was marked by the confidence of many in their ability to successfully pursue a career while meeting the needs of their children. This resulted in an increase in latchkey children, leading to the terminology of the "latchkey generation" for Generation X. These children lacked adult supervision in the hours between the end of the school day and when a parent returned home from work in the evening, and for longer periods of time during the summer. Latchkey children became common among all socioeconomic demographics, but this was particularly so among middle- and upper-class children. The higher the educational attainment of the parents, the higher the odds the children of this time would be latchkey children, due to increased maternal participation in the workforce at a time before childcare options outside the home were widely available. 
McCrindle Research Centre described the cohort as "the first to grow up without a large adult presence, with both parents working", stating this led to Gen Xers being more peer-oriented than previous generations. Conservative and neoliberal turn Some older Gen Xers started high school in the waning years of the Carter presidency, but much of the cohort became socially and politically conscious during the Reagan Era. President Ronald Reagan, voted into office principally by the boomer generation, embraced laissez-faire economics with vigor. His policies included cuts in the growth of government spending, reduction in taxes for the higher echelon of society, legalization of stock buybacks, and deregulation of key industries. These measures had drastic consequences for the social fabric of the country, even as the reforms gradually gained acceptance and were exported overseas to willing participants. The early 1980s recession saw unemployment rise to 10.8% in 1982, more often than not requiring dual parental incomes. One in five American children grew up in poverty during this time. The federal debt almost tripled during Reagan's time in office, from $998 billion in 1981 to $2.857 trillion in 1989, placing a greater burden of repayment on the incoming generation. Government expenditure shifted from domestic programs to defense. Remaining funding initiatives, moreover, tended to be diverted away from programs for children and often directed toward the elderly population, with cuts to Medicaid and programs for children and young families, and protection and expansion of Medicare and Social Security for the elderly population. These programs for the elderly were not tied to economic need. Congressman David Durenberger criticized this political situation, stating that while programs for poor children and for young families were cut, the government provided "free health care to elderly millionaires". 
The crack epidemic and AIDS Gen Xers came of age or were children during the 1980s crack epidemic, which disproportionately impacted urban areas as well as the African-American community in the U.S. Drug turf battles increased violent crime, and crack addiction impacted communities and families. Between 1984 and 1989, the homicide rate for black males aged 14 to 17 doubled in the U.S., and the homicide rate for black males aged 18 to 24 increased almost as much. The crack epidemic had a destabilizing impact on families, with an increase in the number of children in foster care. In 1986, President Reagan signed the Anti-Drug Abuse Act to enforce strict mandatory minimum sentencing for drug users. He also increased the federal budget for supply-reduction efforts. Fear of the impending AIDS epidemic of the 1980s and 1990s loomed over the formative years of Generation X. The emergence of AIDS coincided with Gen X's adolescence, with the disease first clinically observed in the U.S. in 1981. By 1985, an estimated one to two million Americans were HIV-positive. The epidemic particularly hit the LGBT community. As the virus spread, at a time before effective treatments were available, a public panic ensued. Sex education programs in schools were adapted to address the AIDS epidemic, which taught Gen X students that sex could kill them. The rise of home computing Gen Xers were the first children to have access to personal computers in their homes and at schools. In the early 1980s, the growth in the use of personal computers exploded. Manufacturers such as Commodore, Atari, and Apple responded to the demand via 8-bit and 16-bit machines. This in turn stimulated the software industries with corresponding developments for backup storage, use of the floppy disk, zip drive, and CD-ROM. At school, several computer projects were supported by the Department of Education under United States Secretary of Education Terrel Bell's "Technology Initiative". 
This was later mirrored in the UK's 1982 Computers for Schools programme and, in France, under the 1985 scheme Plan Informatique pour Tous (IPT). The post-civil rights generation In the U.S., Generation X was the first cohort to grow up post-integration after the racist Jim Crow laws. They were described in a marketing report by Specialty Retail as the kids who "lived the civil rights movement". They were among the first children to be bused to attain integration in the public school system. In the 1990s, Strauss reported Gen Xers were "by any measure the least racist of today's generations". In the U.S., Title IX, which passed in 1972, provided increased athletic opportunities to Gen X girls in the public school setting. Roots, based on the novel by Alex Haley and broadcast as a 12-hour series, was viewed as a turning point in the country's ability to relate to African-American history. As young adults Continued growth in college enrollments In the U.S., Generation X was more educated than their boomer parents. The share of young adults enrolling in college steadily increased from 1983, before peaking in 1998. In 1965, as early boomers entered college, total enrollment of new undergraduates was just over 5.7 million individuals across the public and private sectors. By 1983, the first year of Gen X college enrollments (as per Pew Research's definition), this figure had reached 12.2 million. This was an increase of 114%, effectively a doubling in student intake. As the 1990s progressed, Gen X college enrollments continued to climb, with increased loan borrowing as the cost of an education became substantially more expensive compared to their peers in the mid-1980s. By 1998, the generation's last year of college enrollment, those entering the higher education sector totaled 14.3 million. In addition, unlike boomers and previous generations, women outpaced men in college completion rates. 
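The enrollment growth over that period can be verified with a line of arithmetic; the figures below are the ones quoted in the text, and the percentage-change formula is the standard one.

```python
# Percent change in U.S. new-undergraduate enrollment between 1965 and 1983,
# using the figures given in the text (5.7 million and 12.2 million).

enrollment_1965 = 5.7e6
enrollment_1983 = 12.2e6

pct_increase = (enrollment_1983 - enrollment_1965) / enrollment_1965 * 100

print(f"Increase: {pct_increase:.0f}%")  # ~114%, i.e. enrollment more than doubled
```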
Adjusting to a new societal environment For early Gen X graduates entering the job market at the end of the 1980s, economic conditions were challenging and did not show signs of major improvement until the mid-1990s. In the U.S., restrictive monetary policy to curb rising inflation and the collapse of a large number of savings and loan associations (private banks that specialized in home mortgages) impacted the welfare of many American households. This precipitated a large government bailout, which placed further strain on the budget. Furthermore, three decades of growth came to an end. The social contract between employers and employees, which had endured during the 1960s and 1970s and was expected to last until retirement, was no longer applicable. By the late 1980s, there were large-scale layoffs of boomers, corporate downsizing, and accelerated offshoring of production. On the political front, in the U.S. the generation became ambivalent if not outright disaffected with politics. They had been reared in the shadow of the Vietnam War and the Watergate scandal. They came to maturity under the Reagan and George H. W. Bush presidencies, with first-hand experience of the impact of neoliberal policies. Few had experienced a Democratic administration, and even then only at an atmospheric level. For those on the left of the political spectrum, the disappointments with the boomer student mobilizations of the 1960s, and the collapse of those movements into a consumerist "greed is good" and "yuppie" culture during the 1980s, felt to a great extent like hypocrisy, if not outright betrayal. Hence the preoccupation with "authenticity" and with not "selling out". The Revolutions of 1989 and the collapse of the socialist utopia with the fall of the Berlin Wall, moreover, reinforced the sense that no alternative to the capitalist model was possible. 
Birth of the slacker In 1990, Time magazine published an article titled "Living: Proceeding with Caution", which described those then in their 20s as aimless and unfocused. Media pundits and advertisers further struggled to define the cohort, typically portraying them as "unfocused twentysomethings". A MetLife report noted: "media would portray them as the Friends generation: rather self-involved and perhaps aimless...but fun". Gen Xers were often portrayed as apathetic or as "slackers", lacking bearings, a stereotype which was initially tied to Richard Linklater's comedic and essentially plotless 1991 film Slacker. After the film was released, "journalists and critics thought they put a finger on what was different about these young adults in that 'they were reluctant to grow up' and 'disdainful of earnest action'". Ben Stiller's 1994 film Reality Bites also sought to capture the zeitgeist of the generation with a portrayal of the attitudes and lifestyle choices of the time. Negative stereotypes of Gen X young adults continued, including that they were "bleak, cynical, and disaffected". In 1998, such stereotypes prompted sociological research at Stanford University to study the accuracy of the characterization of Gen X young adults as cynical and disaffected. Using the national General Social Survey, the researchers compared answers to identical survey questions asked of 18–29-year-olds in three different time periods. Additionally, they compared how older adults answered the same survey questions over time. The surveys showed 18–29-year-old Gen Xers did exhibit higher levels of cynicism and disaffection than previous cohorts of 18–29-year-olds surveyed. However, they also found that cynicism and disaffection had increased among all age groups surveyed over time, not just young adults, making this a period effect, not a cohort effect. In other words, adults of all ages were more cynical and disaffected in the 1990s, not just Generation X. 
Rise of the Internet and the dot-com bubble By the mid-to-late 1990s, under Bill Clinton's presidency, economic optimism had returned to the U.S., with unemployment reduced from 7.5% in 1992 to 4% in 2000. Younger members of Gen X, straddling administrations, politically experienced a "liberal renewal". In 1997, Time magazine published an article titled "Generation X Reconsidered", which retracted the previously reported negative stereotypes and reported positive accomplishments. The article cited Gen Xers' tendency to found technology startup companies and small businesses, as well as their ambition, which research showed was higher among Gen X young adults than older generations. Yet the slacker moniker stuck. As the decade progressed, Gen X gained a reputation for entrepreneurship. In 1999, The New York Times dubbed them "Generation 1099", describing them as the "once pitied but now envied group of self-employed workers whose income is reported to the Internal Revenue Service not on a W-2 form, but on Form 1099". Consumer access to the Internet and its commercial development throughout the 1990s witnessed a frenzy of IT initiatives. Newly created companies, launched on stock exchanges globally, were formed with dubious revenue generation or cash flow. When the dot-com bubble eventually burst in 2000, early Gen Xers who had embarked as entrepreneurs in the IT industry while riding the Internet wave, as well as newly qualified programmers at the tail-end of the generation (who had grown up with AOL and the first Web browsers), were both caught in the crash. This had major cross-generational repercussions; five years after the bubble burst, new matriculation of millennial IT undergraduates fell by 40%, and by as much as 70% in some information systems programs. However, following the crisis, sociologist Mike Males reported continued confidence and optimism among the cohort. 
He reported "surveys consistently find 80% to 90% of Gen Xers self-confident and optimistic". Males wrote "these young Americans should finally get the recognition they deserve", praising the cohort and stating that "the permissively raised, universally deplored Generation X is the true 'great generation', for it has braved a hostile social climate to reverse abysmal trends". He described them as the hardest-working group since the World War II generation. He reported Gen Xers' entrepreneurial tendencies helped create the high-tech industry that fueled the 1990s economic recovery. In 2002, Time magazine published an article titled Gen Xers Aren't Slackers After All, reporting that four out of five new businesses were the work of Gen Xers. Response to 9/11 In the U.S., Gen Xers were described as the major heroes of the September 11 terrorist attacks by author William Strauss. The firefighters and police responding to the attacks were predominantly from Generation X. Additionally, the leaders of the passenger revolt on United Airlines Flight 93 were also, by majority, Gen Xers. Author Neil Howe reported survey data which showed that Gen Xers were cohabiting and getting married in increasing numbers following the terrorist attacks. Gen X survey respondents reported that they no longer wanted to live alone. In October 2001, the Seattle Post-Intelligencer wrote of Gen Xers: "Now they could be facing the most formative events of their lives and their generation." The Greensboro News & Record reported members of the cohort "felt a surge of patriotism since terrorists struck" by giving blood, working for charities, donating to charities, and by joining the military to fight the War on Terror. The Jury Expert, a publication of The American Society of Trial Consultants, reported: "Gen X members responded to the terrorist attacks with bursts of patriotism and national fervor that surprised even themselves." 
In midlife Achieving a work-life balance In 2011, survey analysis from the Longitudinal Study of American Youth found Gen Xers (defined as those who were then between the ages of 30 and 50) to be "balanced, active, and happy" in midlife and as achieving a work-life balance. The Longitudinal Study of American Youth is an NIH–NIA-funded study at the University of Michigan which has been studying Generation X since 1987. The study asked questions such as "Thinking about all aspects of your life, how happy are you? If zero means that you are very unhappy and 10 means that you are very happy, please rate your happiness." LSAY reported that "mean level of happiness was 7.5 and the median (middle score) was 8. Only four percent of Generation X adults indicated a great deal of unhappiness (a score of three or lower). Twenty-nine percent of Generation X adults were very happy with a score of 9 or 10 on the scale." In 2014, Pew Research provided further insight, describing the cohort as "savvy, skeptical and self-reliant; they're not into preening or pampering, and they just might not give much of a hoot what others think of them. Or whether others think of them at all." Furthermore, guides regarding managing multiple generations in the workforce describe Gen Xers as: independent, resilient, resourceful, self-managing, adaptable, cynical, pragmatic, skeptical of authority, and as seeking a work-life balance. Entrepreneurship as an individual trait Individualism is one of the defining traits of Generation X, and is reflected in their entrepreneurial spirit. In the 2008 book X Saves the World: How Generation X Got the Shaft but Can Still Keep Everything from Sucking, author Jeff Gordinier describes Generation X as a "dark horse demographic" which "doesn't seek the limelight". Gordinier cites examples of Gen Xers' contributions to society such as: Google, Wikipedia, Amazon.com, and YouTube, arguing that if boomers had created them, "we'd never hear the end of it". 
In the book, Gordinier contrasts Gen Xers to baby boomers, saying boomers tend to trumpet their accomplishments more than Gen Xers do, creating what he describes as "elaborate mythologies" around their achievements. Gordinier cites Steve Jobs as an example, while Gen Xers, he argues, are more likely to "just quietly do their thing". In a 2007 article published in the Harvard Business Review, authors Strauss and Howe wrote of Generation X: "They are already the greatest entrepreneurial generation in U.S. history; their high-tech savvy and marketplace resilience have helped America prosper in the era of globalization." According to authors Michael Hais and Morley Winograd: Small businesses and the entrepreneurial spirit that Gen Xers embody have become one of the most popular institutions in America. There's been a recent shift in consumer behavior and Gen Xers will join the "idealist generation" in encouraging the celebration of individual effort and business risk-taking. As a result, Xers will spark a renaissance of entrepreneurship in economic life, even as overall confidence in economic institutions declines. Customers, and their needs and wants (including Millennials) will become the North Star for an entire new generation of entrepreneurs. A 2015 study by Sage Group reports Gen Xers "dominate the playing field" with respect to founding startups in the United States and Canada, with Xers launching the majority (55%) of all new businesses in 2015. Income benefits of a college education Unlike millennials, Generation X was the last generation in the U.S. for whom higher education was broadly financially remunerative. In 2019, the Federal Reserve Bank of St. 
Louis published research (using data from the 2016 Survey of Consumer Finances) demonstrating that after controlling for race and age, cohort families with heads of household with post-secondary education and born before 1980 have seen wealth and income premiums, while, for those after 1980, the wealth premium has weakened to a point of statistical insignificance (in part because of the rising cost of college). The income premium, while remaining positive, has declined to historic lows, with more pronounced downward trajectories among heads of household with postgraduate degrees. Parenting and volunteering In terms of advocating for their children in the educational setting, author Neil Howe describes Gen X parents as distinct from baby boomer parents. Howe argues that Gen Xers are not helicopter parents, which Howe describes as a parenting style of boomer parents of millennials. Howe described Gen Xers instead as "stealth fighter parents", due to the tendency of Gen X parents to let minor issues go and to not hover over their children in the educational setting, but to intervene forcefully and swiftly in the event of more serious issues. In 2012, the Corporation for National and Community Service ranked Gen X volunteer rates in the U.S. at "29.4% per year", the highest compared with other generations. The rankings were based on a three-year moving average between 2009 and 2011. Income differential with previous generations A report titled Economic Mobility: Is the American Dream Alive and Well? focused on the income of males 30–39 in 2004 (those born April 1964–March 1974). The study was released on 25 May 2007 and emphasized that this generation's men made less (by 12%) than their fathers had at the same age in 1974, thus reversing a historical trend. It concluded that year-over-year increases in household income generated by fathers/sons slowed from an average of 0.9% to 0.3%, barely keeping pace with inflation. 
"Family incomes have risen though (over the period 1947 to 2005) because more women have gone to work", "supporting the incomes of men, by adding a second earner to the family. And as with male income, the trend is downward." Elsewhere Although, globally, the children and adolescents of Generation X were heavily influenced by U.S. cultural industries and shared global currents (e.g. rising divorce rates, the AIDS epidemic, advancements in ICT), Generation X is not a single U.S.-born concept but a set of multiple perspectives and geographical outgrowths. Even within national communities, shared experiences differed according to one's birth date. The generation, Christine Henseler also remarks, was shaped as much by real-world events within national borders, determined by specific political, cultural, and historical incidents. She adds "In other words, it is in between both real, clearly bordered spaces and more fluid global currents that we can spot the spirit of Generation X." In 2016, a global consumer insights project from Viacom International Media Networks and Viacom, based on over 12,000 respondents across 21 countries, reported on Gen X's unconventional approach to sex, friendship, and family, their desire for flexibility and fulfillment at work and the absence of midlife crisis for Gen Xers. The project also included a 20-minute documentary titled Gen X Today. Russia In Russia, Generation Xers are referred to as "the last Soviet children", as the last children to come of age prior to the downfall of communism in their nation and prior to the Dissolution of the Soviet Union. Those who reached adulthood in the 1980s and grew up educated in the doctrines of Marxism and Leninism found themselves against a background of economic and social change, with the advent of Mikhail Gorbachev to power and Perestroika. 
However, even before the collapse of the Soviet Union and the disbanding of the Communist Party of the Soviet Union, surveys demonstrated that Russian young people repudiated the key features of the Communist worldview that their party leaders, schoolteachers, and even parents had tried to instill in them. This generation, caught in the transition between Marxism–Leninism and an unknown future, and wooed by the new domestic political classes, remained largely apathetic. France In France, "Generation X" is not as widely known or used to define its members. Demographically, this denotes those born from the beginning of the 1960s to the early 1980s. There is general agreement that, domestically, the event accepted in France as the separating point between the baby boomer generation and Generation X is the French strikes and violent riots of May 1968, which members of Generation X were too young to participate in. Those at the start of the cohort are sometimes referred to as 'Génération Bof' because of their tendency to use the word 'bof', which, translated into English, means "whatever". The generation is closely associated with socialist François Mitterrand, who served as President of France for two consecutive terms between 1981 and 1995, as most transitioned into adulthood during that period. Economically, Xers entered working life just as a new labour market was emerging and were the first to fully experience the advent of the post-industrial society. For those at the tail-end of the generation, educational and defense reforms, a new-style baccalauréat général with three distinct streams in 1995 (the preceding programme having been introduced in 1968) and the cessation of military conscription in 1997 (for those born after January 1979) are considered transition points to the next generation. 
Republic of Ireland The term "Generation X" is used to describe Irish people born between 1965 and 1985; they grew up during The Troubles and the 1980s economic recession, coming of age during the Celtic Tiger period of prosperity in the 1990s onward. The appropriateness of the term to Ireland has been questioned, with Darach Ó Séaghdha noting that "Generation X is usually contrasted with the one before by growing up in smaller and different family units on account of their parents having greater access to contraception and divorce – again, things that were not widely available in Ireland. [Contraception was only available under prescription in 1978 and without prescription in 1985; divorce was illegal until 1996.] However, this generation was in prime position to benefit from the Celtic Tiger, the Peace Process and liberalisations introduced on foot of EU membership and was less likely to emigrate than those that came before and after. You could say that in many ways, these are Ireland’s real boomers." Culturally, Britpop, Celtic rock, the trad revival, Father Ted, the 1990 FIFA World Cup and rave culture were significant. The Divine Comedy song "Generation Sex" (1998) painted a picture of hedonism in the late 20th century, as well as its effect on the media. David McWilliams' 2005 book The Pope's Children: Ireland's New Elite profiled Irish people born in the 1970s (just prior to the papal visit to Ireland), which was a baby boom that saw Ireland's population increase for the first time since the 1840s Great Famine. The Pope's Children were in position to benefit from the Celtic Tiger and the newly liberal culture, where the Catholic Church had significantly less social power. 
United Kingdom As children, adolescents and young adults Political environment The United Kingdom's Economic and Social Research Council described Generation X as "Thatcher's children" because the cohort grew up while Margaret Thatcher was Prime Minister from 1979 to 1990, "a time of social flux and transformation". Those born in the late 1960s and early 1970s grew up in a period of social unrest. While unemployment was low in the early 1970s, industrial and social unrest escalated. Strike action culminated in the "Winter of Discontent" in 1978–79, and the Troubles began to unfold in Northern Ireland. The turn to neoliberal policies introduced and maintained by consecutive Conservative governments from 1979 to 1997 marked the end of the post-war consensus. Education The almost universal dismantling of the grammar school system in Great Britain during the 1960s and the 1970s meant that the vast majority of the cohort attended secondary modern schools, relabelled comprehensive schools. Compulsory education ended at the age of 16. As older members of the cohort reached the end of their mandatory schooling, levels of educational enrollment among older adolescents remained below those of much of the Western world. By the early 1980s, some 80% to 90% of school leavers in France and West Germany received vocational training, compared with 40% in the United Kingdom. By the mid-1980s, over 80% of pupils in the United States and West Germany and over 90% in Japan stayed in education until the age of eighteen, compared with 33% of British pupils. There was, however, broadly a rise in education levels among this age range as Generation X passed through it. In 1990, 25% of young people in England stayed in some kind of full-time education after the age of 18, an increase from 15% a decade earlier. 
Later, the Further and Higher Education Act 1992 and the liberalisation of higher education in the UK saw greater numbers of those born towards the tail-end of the generation gaining university places. Employment The 1980s, when much of Generation X reached working age, was an era defined by high unemployment rates. This was particularly true of the youngest members of the working-age population. In 1984, 26% of 16 to 24 year olds were neither in full-time education nor participating in the workforce. However, this figure did decrease as the economic situation improved, reaching 17% by 1993. In midlife Generation X were far more likely to have children out of wedlock than their parents. The number of babies being born to unmarried parents in England and Wales rose from 11% in 1979 to a quarter in 1998, 40% by 2002, and almost half by 2012. They were also significantly more likely to have children later in life than their predecessors. The average age of a mother giving birth rose from 27 in 1982 to 30 in 2012. That year saw 29,994 children born to mothers over the age of 40, an increase of 360% from 2002. A 2016 study of over 2,500 British office workers conducted by Workfront found that survey respondents of all ages selected those from Generation X as the hardest-working employees and members of the workforce (chosen by 60%). Gen X was also ranked highest among fellow workers for having the strongest work ethic (chosen by 59.5%), being the most helpful (55.4%), the most skilled (54.5%), and the best troubleshooters/problem-solvers (41.6%). Political evolution Ipsos MORI reports that at the 1987 and 1992 general elections, the first United Kingdom general elections where significant numbers of Generation X members could vote, a plurality of 18 to 24 year olds opted for the Labour Party by a small margin. The polling organisation's figures suggest that in 1987 39% of that age group voted Labour, 37% for the Conservatives and 22% for the SDP–Liberal Alliance. 
Five years later, these numbers were fairly similar at 38% Labour, 35% Conservative and 19% Liberal Democrats, a party by then formed from the previously mentioned alliance. Both these elections saw a fairly significant lead for the Conservatives in the popular vote among the general population. At the 1997 general election, where Labour won a large majority of seats and a comfortable lead in the popular vote, research suggests that voters under the age of 35 were more likely to vote Labour if they turned out than the wider electorate, but significantly less likely to vote than in 1992. Analysts suggested this may have been due to fewer differences in policies between the major parties and young people having less of a sense of affiliation with particular political parties than older generations. A similar trend continued at the 2001 and 2005 general elections as turnout dropped further among both the relatively young and the wider public. Voter turnout across the electorate began to recover from a 2001 low until the 2017 general election. Generation X also became more likely to vote as they entered the midlife age demographics. Polling suggests a plurality of their age group backed the Conservatives in 2010 and 2015, though less overwhelmingly than older generations. At the 2016 EU membership referendum and 2017 general election, Generation X was split, with younger members appearing to back Remain and Labour and older members tending towards Leave and the Conservatives, in a British electorate more polarised by age than ever before. At the 2019 general election, voting trends continued to be heavily divided by age, but a plurality of younger as well as older Generation X members (then 39 to 55 year olds) voted Conservative. Germany In Germany, "Generation X" is not widely used or applied. Instead, reference is made to "Generation Golf" in the previous West German republic, based on a novel by Florian Illies. 
In the east, Gen Xers are known as children of the "Mauerfall", the coming down of the Berlin Wall. For former East Germans, there was adaptation, but also a sense of loss of accustomed values and structures. These effects turned into romantic narratives of their childhood. For those in the West, there was a period of discovery and exploration of what had been a forbidden land. South Africa In South Africa, Gen Xers spent their formative years of the 1980s during the "hyper-politicized environment of the final years of apartheid". Arts and culture Music Gen Xers were the first cohort to come of age with MTV. They were the first generation to experience the emergence of music videos as teenagers and are sometimes called the MTV Generation. Gen Xers were responsible for the alternative rock movement of the 1990s and 2000s, including the grunge subgenre. Hip hop has also been described as defining music of the generation, particularly artists such as Tupac Shakur, N.W.A., and The Notorious B.I.G. Punk rock From 1974 to 1976, a new generation of rock bands arose, such as the Ramones, Johnny Thunders and the Heartbreakers, The Dictators in New York City, the Sex Pistols, the Clash, the Damned, and Buzzcocks in the United Kingdom, and the Saints in Brisbane. By late 1976, these acts were generally recognized as forming the vanguard of "punk rock", and as 1977 approached, punk rock became a major and highly controversial cultural phenomenon in the UK. It spawned a punk subculture which expressed a youthful rebellion, characterized by distinctive styles of clothing and adornment (ranging from deliberately offensive T-shirts, leather jackets, studded or spiked bands and jewelry, as well as bondage and S&M clothes) and a variety of anti-authoritarian ideologies that have since been associated with the form. By 1977 the influence of punk rock music and its subculture became more pervasive, spreading throughout various countries worldwide. 
It generally took root in local scenes that tended to reject affiliation with the mainstream. In the late 1970s, punk experienced its second wave. Acts that were not active during its formative years adopted the style. While at first punk musicians were not Gen Xers themselves (many of them were late boomers, or Generation Jones), the fanbase for punk became increasingly Gen X-oriented as the earliest Xers entered their adolescence, and it therefore made a significant imprint on the cohort. By the 1980s, faster and more aggressive subgenres such as hardcore punk (e.g. Minor Threat), street punk (e.g. the Exploited, NOFX) and anarcho-punk (e.g. Subhumans) became the predominant modes of punk rock. Musicians identifying with or inspired by punk often later pursued other musical directions, resulting in a broad range of spinoffs. This development gave rise to genres such as post-punk, new wave and later indie pop, alternative rock, and noise rock. Gen Xers were no longer simply the consumers of punk; they became the creators as well. By the 1990s, punk rock re-emerged into the mainstream. Punk rock and pop punk bands with Gen X members such as Green Day, Rancid, The Offspring, and Blink-182 brought widespread popularity to the genre. Hard rock Arguably in a similar way to punk, a sense of disillusionment, angst and anger catalysed the growth of hard rock and heavy metal from the earlier influence of rock. Post-punk The energy generated by the punk movement launched a subsequent proliferation of weird and eclectic post-punk subcultures, spanning new wave, goth and more, and influencing the New Romantics. Grunge A notable example of alternative rock is grunge music and the associated subculture that developed in the Pacific Northwest of the U.S. Grunge song lyrics have been called the "...product of Generation X malaise". Vulture commented: "the best bands arose from the boredom of latchkey kids". 
"People made records entirely to please themselves because there was nobody else to please" commented producer Jack Endino. Grunge lyrics are typically dark, nihilistic, angst-filled, anguished, and often addressing themes such as social alienation, despair and apathy. The Guardian wrote that grunge "didn't recycle banal cliches but tackled weighty subjects". Topics of grunge lyrics included homelessness, suicide, rape, broken homes, drug addiction, self-loathing, misogyny, domestic abuse and finding "meaning in an indifferent universe". Grunge lyrics tended to be introspective and aimed to enable the listener to see into hidden personal issues and examine depravity in the world. Notable grunge bands include: Nirvana, Pearl Jam, Alice in Chains, Stone Temple Pilots and Soundgarden. Hip hop The golden age of hip hop refers to hip hop music made from the mid-1980s to mid-1990s, typically by artists originating from the New York metropolitan area. The music style was characterized by its diversity, quality, innovation and influence after the genre's emergence and establishment in the previous decade. There were various types of subject matter, while the music was experimental and the sampling eclectic. The artists most often associated with the period are LL Cool J, Run–D.M.C., Public Enemy, the Beastie Boys, KRS-One, Eric B. & Rakim, De La Soul, Big Daddy Kane, EPMD, A Tribe Called Quest, Wu-Tang Clan, Slick Rick, Ultramagnetic MC's, and the Jungle Brothers. Releases by these acts co-existed in this period with, and were as commercially viable as, those of early gangsta rap artists such as Ice-T, Geto Boys and N.W.A, the sex raps of 2 Live Crew and Too Short, and party-oriented music by acts such as Kid 'n Play, The Fat Boys, DJ Jazzy Jeff & The Fresh Prince and MC Hammer. In addition to lyrical self-glorification, hip hop was also used as a form of social protest. 
Lyrical content from the era often drew attention to a variety of social issues, including afrocentric living, drug use, crime and violence, religion, culture, the state of the American economy, and the modern man's struggle. Conscious and political hip hop tracks of the time were a response to the effects of American capitalism and former President Reagan's conservative political economy. According to Tricia Rose, "In rap, relationships between black cultural practice, social and economic conditions, technology, sexual and racial politics, and the institutional policing of the popular terrain are complex and in constant motion". Even though hip hop was used as a mechanism to address different social issues, the movement itself was complex and contained tensions of its own. There was also often an emphasis on black nationalism. Hip hop artists often talked about urban poverty and the problems of alcohol, drugs, and gangs in their communities. Public Enemy's most influential song, "Fight the Power", came out at this time; the song speaks out against the government, proclaiming that people in the ghetto have freedom of speech and rights like every other American. Film Indie films Gen Xers were largely responsible for the "indie film" movement of the 1990s, both as young directors and in large part as the film audiences which were fueling demand for such films. In cinema, directors Kevin Smith, Quentin Tarantino, Sofia Coppola, John Singleton, Spike Jonze, David Fincher, Steven Soderbergh, and Richard Linklater have been called Generation X filmmakers. Smith is most known for his View Askewniverse films, the flagship film being Clerks, which is set in New Jersey circa 1994, and focuses on two convenience-store clerks in their twenties. Linklater's Slacker similarly explores young adult characters who were interested in philosophizing. 
While not a member of Gen X himself, director John Hughes has been recognized as having created classic 1980s teen films with early Gen X characters which "an entire generation took ownership of", including The Breakfast Club, Sixteen Candles, Weird Science, and Ferris Bueller's Day Off. In France, a new movement emerged, the Cinéma du look, spearheaded by filmmakers Luc Besson, Jean-Jacques Beineix and Leos Carax. Although these directors were not Gen Xers themselves, films such as Subway (1985), 37°2 le matin (English: Betty Blue; 1986), and Mauvais Sang (1986) sought to capture on screen the generation's malaise, sense of entrapment, and desire to escape. Franchise mega sequels The birth of franchise mega-sequels in the science fiction, fantasy, and horror fiction genres, such as the epic space opera Star Wars and the Halloween franchise, had a profound and notable cultural influence. Literature The literature of early Gen Xers is often dark and introspective. In the U.S., authors such as Elizabeth Wurtzel, David Foster Wallace, Bret Easton Ellis, and Douglas Coupland captured the zeitgeist of this generation. In France, Michel Houellebecq and Frédéric Beigbeder rank among major novelists whose work also reflects the dissatisfaction and melancholies of the cohort. In the UK, Alex Garland, author of The Beach (1996), further added to the genre. Health problems While previous research has indicated that the likelihood of heart attacks was declining among Americans aged 35 to 74, a 2018 study published in the American Heart Association's journal Circulation revealed that this did not apply to the younger half of that cohort (controlling for age, Generation X have not seen a reduction in heart attack risk, versus previous generations). Data from 28,000 patients from across the United States who were hospitalized for heart attacks between 1995 and 2014 showed that a growing proportion were between the ages of 35 and 54. 
The proportion of heart-attack patients in this age group at the end of the study was 32%, up from 27% at the start of the study. This increase is most pronounced among women, for whom the number jumped from 21% to 31%. A common theme among those who suffered from heart attacks is that they also had high blood pressure, diabetes, and chronic kidney disease. These changes have been faster for women than for men. Experts suggest a number of reasons for this. Conditions such as coronary artery disease are traditionally viewed as a man's problem, and as such female patients are not considered high-risk. More often than in previous generations, Generation X women are both the primary caretakers of their families and full-time employees, reducing time for self-care. Offspring Generation X are usually the parents of Generation Z, and sometimes millennials. Jason Dorsey, who works for the Center for Generational Kinetics, observed that like their parents from Generation X, members of Generation Z tend to be autonomous and pessimistic. They need validation less than the millennials and typically become financially literate at an earlier age, as many of their parents bore the full brunt of the Great Recession. See also Generation gap Generation Jones List of generations References External links Generation X Goes Global: Mapping a Youth Culture in Motion, Christine Henseler, Ed.; 2012 "Generation X's journey from jaded to sated" – Salon, 1 October 2013 Gen X Today—2016 documentary by Viacom International Media Networks
Guam
Guam (; ) is an organized, unincorporated territory of the United States in the Micronesia subregion of the western Pacific Ocean. It is the westernmost point and territory of the United States (reckoned from the geographic center of the U.S.); in Oceania, it is the largest and southernmost of the Mariana Islands and the largest island in Micronesia. Guam's capital is Hagåtña, and the most populous village is Dededo. People born on Guam are American citizens but have no vote in United States presidential elections while residing on Guam, and Guam's delegate to the United States House of Representatives has no vote on the floor. Indigenous Guamanians are the Chamoru, historically known as the Chamorro, who are related to the Austronesian peoples of Indonesia, the Philippines, Taiwan, Micronesia, and Polynesia. As of 2021, Guam's population is 168,801. Chamorros are the largest ethnic group, but a minority on the multi-ethnic island. The territory spans and has a population density of . The Chamoru people settled the island approximately 3,500 years ago. Portuguese explorer Ferdinand Magellan, while in the service of Spain, was the first European to visit the island on March 6, 1521. Guam was colonized by Spain in 1668. Between the 16th and 18th centuries, Guam was an important stopover for the Spanish Manila Galleons. During the Spanish–American War, the United States captured Guam on June 21, 1898. Under the Treaty of Paris, signed December 10, 1898, Spain ceded Guam to the U.S. effective April 11, 1899. Before World War II, Guam was one of five American jurisdictions in the Pacific Ocean, along with Wake Island in Micronesia, American Samoa and Hawaii in Polynesia, and the Philippines. On December 8, 1941, hours after the attack on Pearl Harbor, Guam was captured by the Japanese, who occupied the island for two and a half years. During the occupation, Guamanians were subjected to forced labor, incarceration, torture and execution. 
American forces recaptured the island on July 21, 1944, which is commemorated as Liberation Day. Since the 1960s, Guam's economy has been supported primarily by tourism and the U.S. military, for which Guam is a major strategic asset. An unofficial but frequently used territorial motto is "Where America's Day Begins", which refers to the island's proximity to the International Date Line. Guam is among the 17 non-self-governing territories listed by the United Nations, and has been a member of the Pacific Community since 1983. History Pre-Contact era Guam, along with the Mariana Islands, were the first islands settled by humans in Remote Oceania. Incidentally it is also the first and the longest of the ocean-crossing voyages of the Austronesian peoples, and is separate from the later Polynesian settlement of the rest of Remote Oceania. They were first settled around 1500 to 1400 BC by migrants departing from the Philippines. This was followed by a second migration from the Caroline Islands by the first millennium AD, and a third migration from Island Southeast Asia (likely the Philippines or eastern Indonesia) by 900 AD. These original settlers of Guam and the Northern Mariana Islands evolved into the Chamoru people, historically known as Chamorros after first contact with the Spaniards. The ancient Chamoru society had four classes: (chiefs), (upper class), (middle class), and (lower class). The were located in the coastal villages, which meant they had the best access to fishing grounds, whereas the were located in the island's interior. and rarely communicated with each other, and often used as intermediaries. There were also "" or "", shamans with magical powers and "'" or "", healers who used different kinds of plants and natural materials to make medicine. Belief in spirits of ancient Chamorus called "" still persists as a remnant of pre-European culture. 
It is believed that "" or "" are the only ones who can safely harvest plants and other natural materials from their homes or "" without incurring the wrath of the "." Their society was organized along matrilineal clans. The Chamoru people raised colonnades of megalithic capped pillars called upon which they built their homes. Latte stones are stone pillars that are found only in the Mariana Islands; they are a recent development in Pre-Contact Chamoru society. The latte-stone was used as a foundation on which thatched huts were built. Latte stones consist of a base shaped from limestone called the and with a capstone, or , made either from a large brain coral or limestone, placed on top. A possible source for these stones, the Rota Latte Stone Quarry, was discovered in 1925 on Rota. Spanish era The first European to travel to Guam was Portuguese navigator Ferdinand Magellan, sailing for the King of Spain, when he sighted the island on March 6, 1521, during his fleet's circumnavigation of the globe. Despite Magellan's visit, Guam was not officially claimed by Spain until January 26, 1565, by Miguel López de Legazpi. From 1565 to 1815, Guam and the Northern Mariana Islands, the only Spanish outposts in the Pacific Ocean east of the Philippines, were reprovisioning stops for the Manila galleons, a fleet that covered the Pacific trade route between Acapulco and Manila. Spanish colonization commenced on June 15, 1668, with the arrival of a mission led by Diego Luis de San Vitores, who established the first Catholic church. The islands were part of the Spanish East Indies, and in turn part of the Viceroyalty of New Spain, based in Mexico City. The Spanish-Chamorro Wars on Guam began in 1670 over growing tensions with the Jesuit mission, with the last large-scale uprising in 1683. Intermittent warfare, plus the typhoons of 1671 and 1693, and in particular the smallpox epidemic of 1688, reduced the Chamoru population from 50,000 to 10,000, finally to less than 5,000. 
The island became a rest stop for whalers starting in 1823. A devastating typhoon struck the island on August 10, 1848, followed by a severe earthquake on January 25, 1849, which brought many refugees from the Caroline Islands, victims of the resulting tsunami. After a smallpox epidemic killed 3,644 Guamanians in 1856, Carolinians and Japanese were permitted to settle in the Marianas.

American era

After almost four centuries as part of the Kingdom of Spain, the United States occupied the island following Spain's defeat in the 1898 Spanish–American War, as part of the Treaty of Paris of 1898. Guam was transferred to United States Navy control on December 23, 1898, by Executive Order 108-A from 25th President William McKinley. Guam was a station for American merchants and warships traveling to and from the Philippines (another American acquisition from Spain), while the Northern Mariana Islands were sold by Spain to Germany as part of its rapidly expanding German Empire. A U.S. Navy yard was established at Piti in 1899, and a United States Marine Corps barracks at Sumay in 1901. A Marine seaplane unit, the first in the Pacific, was stationed at Sumay from 1921 to 1930. The Commercial Pacific Cable Company built a telegraph/telephone station in 1903 for the first trans-Pacific communications cable, and Pan American World Airways later established a seaplane base at Sumay for its trans-Pacific China Clipper route.

World War II

During World War II, Guam was attacked and invaded by Japan on Monday, December 8, 1941, at the same time as the attack on Pearl Harbor, across the International Date Line. The Japanese renamed Guam Ōmiya-jima (Great Shrine Island). The Japanese occupation of Guam lasted for approximately 31 months. During this period, the indigenous people of Guam were subjected to forced labor, family separation, incarceration, execution, concentration camps, and forced prostitution.
Approximately 1,000 people died during the occupation, according to later Congressional committee testimony in 2004. Some historians estimate that war violence killed 10% of Guam's then 20,000 population. The United States returned and fought the Battle of Guam from July 21 to August 10, 1944, to recapture the island from Japanese military occupation. July 21 is now celebrated as Liberation Day, a territorial holiday.

Post-war

After World War II, the Guam Organic Act of 1950 established Guam as an unincorporated organized territory of the United States, provided for the structure of the island's civilian government, and granted the people U.S. citizenship. The Governor of Guam was federally appointed until 1968, when the Guam Elective Governor Act provided for the office's popular election. Since Guam is not a U.S. state, U.S. citizens residing on Guam are not allowed to vote for president, and their congressional representative is a non-voting member. They do, however, vote for party delegates in presidential primaries. In 1969, a referendum on unification with the Northern Mariana Islands was held and rejected. During the 1970s, Dr. Maryly Van Leer Peck started an engineering program, expanded the University of Guam, and founded Guam Community College. The removal of Guam's security clearance requirement by President John F. Kennedy in 1963 allowed for the development of a tourism industry. When the United States closed U.S. Naval Base Subic Bay and Clark Air Base in the Philippines after the expiration of their leases in the early 1990s, many of the forces stationed there were relocated to Guam. The 1997 Asian financial crisis, which hit Japan particularly hard, severely affected Guam's tourism industry. Military cutbacks in the 1990s also disrupted the island's economy. Economic recovery was further hampered by devastation from Supertyphoons Paka in 1997 and Pongsona in 2002, as well as the effects of the September 11 terrorist attacks on tourism.
Geography and environment

Guam is long and wide, giving it an area of (three-fourths the size of Singapore) and making it the 32nd largest island of the United States. It is the southernmost and largest island in the Mariana archipelago, as well as the largest in Micronesia. Guam's Point Udall is the westernmost point of the U.S., as measured from the geographic center of the United States. The Mariana chain of which Guam is a part was created by the collision of the Pacific and Philippine Sea tectonic plates, with Guam located on the Mariana Plate, a microplate between the two. Guam is the closest land mass to the Mariana Trench, the deep subduction zone that runs east of the Marianas. Volcanic eruptions established the base of the island in the Eocene, roughly 56 to 33.9 million years ago. The north of Guam is the result of this base being covered with layers of coral reef, turning into limestone, and then being thrust upward by tectonic activity to create a plateau. The rugged south of the island is the result of more recent volcanic activity. Cocos Island, off the southern tip of Guam, is the largest of the many small islets along the coastline. Guam's highest point is Mount Lamlam, at above sea level. If its base is considered to be the nearby Challenger Deep, the deepest surveyed point in the oceans, Mount Lamlam can be regarded as the world's highest mountain. Politically, Guam is divided into 19 villages. The majority of the population lives on the coralline limestone plateaus of the north, with political and economic activity centered in the central and northern regions. The rugged geography of the south largely limits settlement to rural coastal areas. The western coast is leeward of the trade winds and is the location of Apra Harbor, the capital Hagåtña, and the tourist center of Tumon. The U.S. Defense Department owns about 29% of the island, under the management of Joint Region Marianas.
Climate

Guam has a tropical rainforest climate (Köppen Af), though its driest month of March almost averages dry enough to qualify as a tropical monsoon climate (Köppen Am). The weather is generally hot and humid throughout the year with little seasonal temperature variation; Guam is known for equable temperatures year-round. Trade winds are fairly constant throughout the year, but there is often a weak westerly monsoon influence in summer. Guam has two distinct seasons, wet and dry. The dry season runs from January through May, with June being the transitional period. The wet season runs from July through November, with an average annual rainfall between 1981 and 2010 of around . The wettest month on record at Guam Airport was August 1997 with and the driest was February 2015 with . The wettest calendar year was 1976 with and the driest was 1998 with . The most rainfall in a single day occurred on October 15, 1953, when fell. The mean high temperature is and the mean low is . Temperatures rarely exceed or fall below . The relative humidity commonly exceeds 84 percent at night throughout the year, but the average monthly humidity hovers near 66 percent. The highest temperature ever recorded in Guam was on April 18, 1971, and April 1, 1990. A record low of was set on February 1, 2021, while the lowest recorded temperature was 65 °F (18.3 °C), set on February 8, 1973. Guam lies in the path of typhoons, and it is common for the island to be threatened by tropical storms and possible typhoons during the wet season. The highest risk of typhoons is from August through November, when typhoons and tropical storms are most probable in the western Pacific. They can, however, occur year-round. Typhoons that have caused major damage on Guam in the American period include the Typhoon of 1900, Karen (1962), Pamela (1976), Paka (1997), and Pongsona (2002). Since Typhoon Pamela in 1976, wooden structures have been largely replaced by concrete structures.
During the 1980s, wooden utility poles began to be replaced by typhoon-resistant concrete and steel poles. After the local government enforced stricter construction codes, many home and business owners built their structures out of reinforced concrete with installed typhoon shutters.

Ecology

Guam has experienced severe impacts from invasive species upon the natural biodiversity of the island. These include the local extinction of endemic bird species after the introduction of the brown tree snake, an infestation of the Asiatic rhinoceros beetle destroying coconut palms, and the effects of introduced feral mammals and amphibians. Despite the island's humid climate, wildfires plague the forested areas of Guam every dry season. Most fires are caused by humans, with 80% resulting from arson. Poachers often start fires to attract deer to the new growth. Invasive grass species that rely on fire as part of their natural life cycle grow in many regularly burned areas. Grasslands and "barrens" have replaced previously forested areas, leading to greater soil erosion. During the rainy season, sediment is carried by the heavy rains into the Fena Lake Reservoir and Ugum River, leading to water quality problems for southern Guam. Eroded silt also destroys the marine life in reefs around the island. Soil stabilization efforts by volunteers and forestry workers (planting trees) have had little success in preserving natural habitats. Efforts have been made to protect Guam's coral reef habitats from pollution, eroded silt and overfishing, problems that have led to decreased fish populations. This has both ecological and economic value, as Guam is a significant vacation spot for scuba divers, and one study found that Guam's reefs are worth $127 million per year. In recent years, the Department of Agriculture, Division of Aquatic and Wildlife Resources has established several new marine preserves where fish populations are monitored by biologists.
These are located at Pati Point, Piti Bomb Holes, Sasa Bay, Achang Reef Flat, and Tumon Bay. Before the adoption of U.S. Environmental Protection Agency standards, portions of Tumon Bay were dredged by hotel chains to provide a better experience for hotel guests. Tumon Bay has since been made into a preserve. The federal Guam National Wildlife Refuge in northern Guam protects the decimated sea turtle population, in addition to a small colony of Mariana fruit bats. Harvest of sea turtle eggs was a common occurrence on Guam before World War II. The green sea turtle (Chelonia mydas) was harvested legally on Guam until August 1978, when it was listed as threatened under the Endangered Species Act. The hawksbill sea turtle (Eretmochelys imbricata) has been on the endangered list since 1970. In an effort to ensure the protection of sea turtles on Guam, routine sightings are counted during aerial surveys, and nest sites are recorded and monitored for hatchlings.

Demographics

According to the 2010 United States Census, the largest ethnic group is the native Chamorus, accounting for 37.3% of the total population. Asians (including Filipinos, Koreans, Chinese, and Japanese) account for 33% of the total population. Other Micronesian ethnic groups (including Chuukese, Palauans, and Pohnpeians) account for 10% of the total population. 9.4% of the population is multiracial (two or more races), and White Americans account for 7.1% of the total population. The estimated interracial marriage rate is over 40%. The official languages of the island are English and Chamoru. Filipino is also a common language across the island. Other Pacific island languages and many Asian languages are spoken in Guam as well. Spanish, the language of administration for 300 years, is no longer commonly spoken on the island, although vestiges of the language remain in proper names, loanwords, and place names, and it is studied at universities and high schools. The most common religion is Catholicism.
According to the Pew Research Center, religious affiliation in 2010 was as follows:

Roman Catholicism: 75%
Protestantism: 17.7%
Other religions: 1.6%
Folk religions: 1.5%
Other Christianity: 1.4%
Buddhism: 1.1%
Eastern Orthodoxy: <1%
Hinduism: <1%
Islam: <1%
Judaism: <1%

Culture

The culture of Guam reflects traditional Chamoru customs in combination with American, Spanish and Mexican traditions. Post-European-contact Chamoru Guamanian culture is a combination of American, Spanish, Filipino, other Micronesian Islander and Mexican traditions. Few indigenous pre-Hispanic customs remained following Spanish contact. Hispanic influences are manifested in the local language, music, dance, sea navigation, cuisine, fishing, games, songs, and fashion. The island's original community consisted of Chamorro natives, who have inhabited Guam for almost 4,000 years and had their own language related to the languages of Indonesia and Southeast Asia. The Spanish later called them Chamorros, a derivative of the word Chamorri, meaning "noble race". They began to grow rice on the island. Historically, the native people of Guam venerated the bones of their ancestors, keeping the skulls in their houses in small baskets and practicing incantations before them when it was desired to attain certain objects. During Spanish rule (1668–1898) the majority of the population was converted to Catholicism, and religious festivities such as Easter and Christmas became widespread. Many Chamorus have Spanish surnames, although few of the inhabitants are themselves descended from Spaniards. Instead, Spanish names and surnames became commonplace after conversion to Catholicism and the imposition of the Catálogo alfabético de apellidos in Guam. Historically, the diet of the native inhabitants of Guam consisted of fish, fowl, rice, breadfruit, taro, yams, bananas, and coconuts, used in a variety of dishes.
Post-contact Chamoru cuisine is largely based on corn, and includes tortillas, tamales, atole, and chilaquiles, a clear influence from Mesoamerica, principally Mexico, via Spanish trade with Asia. Due to foreign cultural influence from Spain, most aspects of the early indigenous culture have been lost, though there has been a resurgence in preserving any remaining pre-Hispanic culture in the last few decades. Some scholars have traveled throughout the Pacific Islands conducting research to study what the original Chamoru cultural practices, such as dance, language, and canoe building, may have been like.

Sports

Guam's most popular sport is American football, followed by basketball and baseball. Soccer and other sports are also somewhat popular. Guam hosted the Pacific Games in 1975 and 1999. At the 2007 Games, Guam finished 7th of 22 countries in the medal count, and 14th at the 2011 Games. The Guam men's and women's national basketball teams are traditional powerhouses in the Oceania region, behind the Australian and New Zealand national teams. The men's team is the reigning champion of the Pacific Games basketball tournament. Guam is home to various basketball organizations, including the Guam Basketball Association. The Guam national football team was founded in 1975 and joined FIFA in 1996. It was once considered one of FIFA's weakest teams, and recorded its first victory over a FIFA-registered side in 2009. Guam hosted qualifying games on the island for the first time in 2015 and, in 2018, clinched its first FIFA World Cup qualifying win. The Guam national rugby union team played its first match in 2005 and has never qualified for a Rugby World Cup.

Economy

Guam's economy depends primarily on tourism, Department of Defense installations and locally owned businesses. Under the provisions of a special law by Congress, it is Guam's treasury rather than the U.S.
treasury that receives the federal income taxes paid by local taxpayers (including military and civilian federal employees assigned to Guam).

Tourism

Lying in the western Pacific, Guam is a popular destination for Japanese tourists. Its tourist hub, Tumon, features over 20 large hotels, a Duty Free Shoppers Galleria, the Pleasure Island district, an indoor aquarium, Sandcastle Las Vegas–style shows and other shopping and entertainment venues. It is a relatively short flight from Asia or Australia compared to Hawaii, with hotels and seven public golf courses accommodating over a million tourists per year. Although 75% of the tourists are Japanese, Guam also receives a sizable number of tourists from South Korea, the U.S., the Philippines, and Taiwan. Significant sources of revenue include duty-free designer shopping outlets and the American-style malls: Micronesia Mall, Guam Premier Outlets, the Agana Shopping Center, and the world's largest Kmart. The economy had been stable since 2000 due to increased tourism. It was expected to stabilize further with the transfer of the U.S. Marine Corps' 3rd Marine Expeditionary Force, currently in Okinawa, Japan (approximately 8,000 Marines, along with their 10,000 dependents), to Guam between 2010 and 2015. However, the move was delayed until late 2020, the number of Marines was decreased to 5,000, and the transfer is expected to be complete in 2025. In 2003, Guam had a 14% unemployment rate, and the government suffered a $314 million shortfall. As of 2019 the unemployment rate had dropped to 6.1%. By September 2020, however, the unemployment rate had risen again to 17.9%. The Compacts of Free Association between the United States, the Federated States of Micronesia, the Republic of the Marshall Islands, and the Republic of Palau accorded the former entities of the Trust Territory of the Pacific Islands a political status of "free association" with the United States.
The Compacts generally allow citizens of these island nations to reside in the United States and its territories without restriction, and many were attracted to Guam due to its proximity and its environmental and cultural familiarity. Over the years, some in Guam have claimed that the territory has had to bear the brunt of this agreement in the form of public assistance programs and public education for migrants from the regions involved, and that the federal government should compensate the states and territories affected by this type of migration. Over the years, Congress had appropriated "Compact Impact" aid to Guam, the Northern Mariana Islands, and Hawaii, and eventually this appropriation was written into each renewed Compact. Some, however, continue to claim that the compensation is not enough or that the distribution of the compensation actually received is significantly disproportionate. Guam's largest single private-sector employer, with about 1,400 jobs, was Continental Micronesia, a subsidiary of Continental Airlines; it is now part of United Airlines, a subsidiary of Chicago-based United Airlines Holdings, Inc. Continental Micronesia's annual payroll in Guam was $90 million.

Military bases

Currently, Joint Region Marianas maintains jurisdiction over installations which cover approximately , or 29% of the island's total land area. These include:

U.S. Naval Base Guam, U.S. Navy (Santa Rita), comprising the Orote Peninsula and additional lands, with jurisdiction over the majority of Apra Harbor
Andersen Air Force Base, U.S. Air Force (Yigo), including Northwest Field
Marine Corps Base Camp Blaz, U.S. Marine Corps (Dededo)
Ordnance Annex, U.S. Navy – South Central Highlands (formerly known as Naval Magazine)
Naval Computer and Telecommunications Station Guam, U.S.
Navy (Dededo), sometimes referred to as "NCTS Finegayan"
Naval Radio Station Barrigada (Barrigada), often referred to as "Radio Barrigada"
Joint Region Marianas Headquarters (Asan), at Nimitz Hill Annex
Naval Hospital Guam (Agana Heights)
South Finegayan (Dededo), a military housing complex
Andersen South (Yigo), formerly Marine Barracks Guam until its closure in 1992
Fort Juan Muña, Guam National Guard (Tamuning)

The U.S. military has proposed building a new aircraft carrier berth on Guam and moving 8,600 Marines, and 9,000 of their dependents, to Guam from Okinawa, Japan. Including the required construction workers, this buildup would increase Guam's population by a total of 79,000, a 49% increase over its 2010 population of 160,000. In a February 2010 letter, the United States Environmental Protection Agency sharply criticized these plans because of a water shortfall, sewage problems and the impact on coral reefs. By 2012, the plans had been scaled back to station a maximum of 4,800 Marines on the island, two-thirds of whom would be there on a rotational basis without their dependents.

Government and politics

Guam is governed by a popularly elected governor and a unicameral 15-member legislature, whose members are known as senators. Its judiciary is overseen by the Supreme Court of Guam. The District Court of Guam is the court of United States federal jurisdiction in the territory. Guam elects one delegate to the United States House of Representatives, currently Democrat Michael San Nicolas. The delegate does not have a vote on the final passage of legislation, but is accorded a vote in committee and the privilege to speak to the House. U.S. citizens in Guam vote in a presidential straw poll for their choice in the U.S. presidential general election, but since Guam has no votes in the Electoral College, the poll has no real effect.
However, in sending delegates to the Republican and Democratic national conventions, Guam does have influence in the national presidential race. These delegates are elected by local party conventions.

Political status

In the 1980s and early 1990s, there was a significant movement in favor of this U.S. territory becoming a commonwealth, which would give it a level of self-government similar to Puerto Rico and the Northern Mariana Islands. In a 1982 plebiscite, voters indicated interest in seeking commonwealth status. However, the federal government rejected the version of a commonwealth that the government of Guam proposed, because its clauses were incompatible with the Territorial Clause (Art. IV, Sec. 3, cl. 2) of the U.S. Constitution. Other movements advocate U.S. statehood for Guam, union with the state of Hawaii, union with the Northern Mariana Islands as a single territory, or independence. A Commission on Decolonization was established in 1997 to educate the people of Guam about the various political status options in its relationship with the U.S.: statehood, free association and independence. The island has been considering another non-binding plebiscite on decolonization since 1998; however, the commission was dormant for some years. In 2013, the commission began seeking funding to start a public education campaign. There were few subsequent developments until late 2016. In early December 2016, the commission scheduled a series of education sessions in various villages about the current status of Guam's relationship with the U.S. and the self-determination options that might be considered. The commission's current Executive Director is Edward Alvarez, and there are ten members. The group is also expected to release position papers on independence and statehood, but the contents have not yet been completed. The United Nations is in favor of greater self-determination for Guam and other such territories.
The UN's Special Committee on Decolonization has agreed to endorse the Governor's education plan. The commission's May 2016 report states: "With academics from the University of Guam, [the Commission] was working to create and approve educational materials. The Office of the Governor was collaborating closely with the Commission" in developing educational materials for the public. The United States Department of the Interior had approved a $300,000 grant for decolonization education, Edward Alvarez told the United Nations Pacific Regional Seminar in May 2016. "We are hopeful that this might indicate a shift in [United States] policy to its Non-Self-Governing Territories such as Guam, where they will be more willing to engage in discussions about our future and offer true support to help push us towards true self-governances and self-determination." On July 31, 2020, the Government of Guam joined the Unrepresented Nations and Peoples Organization (UNPO).

Villages

Guam is divided into 19 municipal villages:

Agana Heights
Asan‑Maina
Barrigada
Chalan Pago‑Ordot
Dededo
Hågat
Hagåtña
Humåtak
Inalåhan
Malesso
Mangilao
Mongmong‑Toto‑Maite
Piti
Sånta Rita-Sumai
Sinajana
Talo'fo'fo
Tamuning
Yigo
Yona

Transportation and communications

Most of the island has state-of-the-art mobile phone services and high-speed internet widely available through either cable or DSL. Guam was added to the North American Numbering Plan (NANP) in 1997 (country code 671 became NANP area code 671), removing the barrier of high-cost international long-distance calls to the U.S. mainland. Guam is also a major hub for submarine cables between the Western U.S., Hawaii, Australia and Asia. Guam currently serves twelve submarine cables, most of which continue on to China. In 2012, Slate stated that the island has "tremendous bandwidth" and internet prices comparable to those of the U.S. mainland due to being at the junction of undersea cables.
In 1899, the local postage stamps were overprinted "Guam", as was done for the other former Spanish colonies, but this was discontinued shortly thereafter and regular U.S. postage stamps have been used ever since. Because Guam is also part of the U.S. Postal System (postal abbreviation: GU, ZIP code range: 96910–96932), mail to Guam from the U.S. mainland is considered domestic and no additional charges are required. Private shipping companies, such as FedEx, UPS, and DHL, however, have no obligation to treat it as such, and do not regard Guam as domestic. The speed of mail traveling between Guam and the states varies depending on size and time of year. Light, first-class items generally take less than a week to or from the mainland, but larger first-class or Priority items can take a week or two. Fourth-class mail, such as magazines, is transported by sea after reaching Hawaii. Most residents use post office boxes or private mail boxes, although residential delivery is becoming increasingly available. Incoming mail not from the Americas should be addressed to "Guam" instead of "USA" to avoid being routed the long way through the U.S. mainland and possibly being charged a higher rate (especially from Asia). The Port of Guam is the island's lifeline, because most products must be shipped into Guam for consumers. It receives weekly calls from the Hawaii-based shipping line Matson, Inc., whose container ships connect Guam with Honolulu, Hawaii; Los Angeles, California; Oakland, California; and Seattle, Washington. The port is also the regional transshipment hub for over 500,000 customers throughout the Micronesian region. The port is the shipping and receiving point for containers designated for the island's U.S. Department of Defense installations, Andersen Air Force Base and Commander, Naval Forces Marianas, and eventually the Third Marine Expeditionary Force. Guam is served by the Antonio B. Won Pat International Airport.
The island is outside the United States customs zone, so Guam is responsible for establishing and operating its own customs and quarantine agency and jurisdiction. Therefore, U.S. Customs and Border Protection carries out only immigration (not customs) functions. Since Guam is under federal immigration jurisdiction, passengers arriving directly from the United States skip immigration and proceed directly to Guam Customs and Quarantine. However, due to the Guam and CNMI visa waiver program for certain countries, an eligibility pre-clearance check is carried out on Guam for flights to the states. For travel from the Northern Mariana Islands to Guam, a pre-flight passport and visa check is performed before boarding the flight to Guam. On flights from Guam to the Northern Mariana Islands, no immigration check is performed. Traveling between Guam and the states through a foreign point, however, does require a passport. Most residents travel within Guam using personally owned vehicles. The Guam Regional Transit Authority provides fixed-route bus and paratransit services, and some commercial companies operate buses between tourist-frequented locations.

Education

The Guam Public Library System operates the Nieves M. Flores Memorial Library in Hagåtña and five branch libraries. The Guam Department of Education serves the entire island of Guam. In 2000, 32,000 students attended Guam's public schools, including 26 elementary schools, eight middle schools, and six high schools and alternative schools. Guam's public schools have struggled with problems such as high dropout rates and poor test scores. Guam's educational system has always faced unique challenges as a small community located far from the U.S. mainland, with a very diverse student body including many students from backgrounds without traditional American education. An economic downturn in Guam since the mid-1990s has compounded the problems in schools. Before September 1997, the U.S.
Department of Defense partnered with the Guam Board of Education. In September 1997, the Department of Defense Education Activity (DoDEA) opened its own schools for children of military personnel. DoDEA schools, which also serve children of some federal civilian employees, had an attendance of 2,500 in 2000. DoDEA Guam operates three elementary/middle schools and one high school. The University of Guam (UOG) and Guam Community College, both fully accredited by the Western Association of Schools and Colleges, offer courses in higher education. UOG is one of only 106 land-grant institutions in the United States. Pacific Islands University is a small Christian liberal arts institution nationally accredited by the Transnational Association of Christian Colleges and Schools.

Health care

The Government of Guam maintains the island's main health care facility, Guam Memorial Hospital, in Tamuning. U.S. board-certified doctors and dentists practice in all specialties. In addition, the U.S. Naval Hospital in Agana Heights serves active-duty members and dependents of the military community. There is one subscriber-based air ambulance located on the island, CareJet, which provides emergency patient transportation across Guam and surrounding islands. A private hospital, the Guam Regional Medical City, opened its doors in early 2016. Medicaid is accepted in Guam.

See also

51st state
Index of Guam-related articles
Lists of hospitals in the United States#Insular areas
List of people from Guam
Outline of Guam
Voting in Guam

References

Further reading

Maga, Timothy P. Defending Paradise: The United States and Guam, 1898–1950 (Garland, 1988).
Rogers, Robert F. Destiny's Landfall: A History of Guam (U of Hawaii Press, 1995).
Spear, Jane E. "Guamanian Americans." Gale Encyclopedia of Multicultural America, edited by Thomas Riggs (3rd ed., vol. 2, Gale, 2014), pp. 263–273.
External links

Guampedia, Guam's online encyclopedia
Guam Society of America, which fosters the CHamoru language, culture, and traditions
The Insular Empire: America in the Mariana Islands, PBS documentary film website
Guam, The World Factbook, Central Intelligence Agency
U.S. Census Bureau: Island Areas Census 2000
Geology and Hydrology of Guam
Portals to the World: Guam, from the U.S. Library of Congress
[ 0.2434324324131012, -0.18974027037620544, -0.14729782938957214, 0.15193767845630646, 0.059519778937101364, 0.11515047401189804, 0.017632249742746353, 0.2421507090330124, -0.35142946243286133, 0.05483844131231308, -0.07829655706882477, 0.11766526103019714, -0.41575556993484497, -0.093869730...
https://en.wikipedia.org/wiki/Game%20Boy%20family
Game Boy family
The Game Boy family is a line of cartridge-based handheld video game consoles developed, manufactured, released and marketed by Nintendo. It comprises three subfamilies: the Classic Game Boy, the Game Boy Color and the Game Boy Advance. Excluding Classic Game Boy systems and the Game Boy Micro, all devices in the Game Boy family are, with only a few exceptions, backwards compatible with every game produced for a previous console in the family. Classic Game Boy systems are forwards compatible with all black-cartridge Game Boy Color games, but will not display them in color. This was accomplished by retaining similar hardware in later consoles in the family, allowing them to run earlier cartridges. The Game Boy line was succeeded by the Nintendo DS line. A number of Game Boy, Game Boy Color, and Game Boy Advance games have been rereleased digitally through the Virtual Console service for the Nintendo 3DS and Wii U. The original Game Boy and the Game Boy Color combined sold 118.69 million units worldwide. All versions of the Game Boy Advance family combined have sold 81.51 million units. All Game Boy systems combined have sold 200.20 million units worldwide. History Nintendo's Game Boy handheld was first released in 1989. The device was the brainchild of long-time Nintendo employee Gunpei Yokoi, who was also behind the Ultra Hand, an expanding arm toy created and produced by Nintendo in 1970, long before Nintendo entered the video game market. Yokoi was also responsible for the Game & Watch series of handhelds when Nintendo made the move from toys to video games. When Yokoi designed the original Game Boy, he knew that to be successful, the system needed to be small, light, inexpensive, and durable, as well as have a varied, recognizable library of games upon its release. By following this simple mantra, the Game Boy line managed to gain a vast following despite technically superior competitors that offered color graphics.
This is also apparent in the name (conceived by Shigesato Itoi), which connotes a smaller "sidekick" companion to Nintendo's consoles. The Game Boy's success continues to this day, and many at Nintendo have dedicated the handheld to Yokoi's memory. Game Boy celebrated its 15th anniversary in 2004, which nearly coincided with the 20-year anniversary of the original Nintendo Entertainment System (NES). To celebrate, Nintendo released the Classic NES Series and an NES controller-themed color scheme for the Game Boy Advance SP. In 2006, Nintendo president Satoru Iwata said on the rumored demise of the Game Boy brand: "No, it's not true after all. What we are repeatedly saying is that for whichever platform, we are always conducting research and development for the new system, be it the Game Boy, or new console or whatever. And what we just told the reporter was that in thinking about the current situation where we are enjoying great sales with the DS and that we are now trying to launch the Wii, it's unthinkable for us to launch any new platform for the handheld system, including the new version of the GBA... Perhaps they misunderstood a part of this story, but as far as the handheld market is concerned [right now] we really want to focus on more sales of the DS; that's all". Nintendo ultimately ceased production of Game Boy Advance games and hardware in North America on May 15, 2010. Classic Game Boy family Game Boy The original gray Game Boy was first released in Japan on April 21, 1989. Based on a Z80-like processor, it has a black-and-green reflective LCD screen, an eight-way directional pad, two action buttons (A and B), and Start and Select buttons, with controls identical to those of the NES controller. It plays games from ROM-based media contained in cartridges (sometimes called carts or Game Paks). Its graphics are 8-bit (similar to the NES). The game that pushed the Game Boy into the upper reaches of success was Tetris.
Tetris was widely popular, and on the handheld format could be played anywhere. It came packaged with the Game Boy and broadened its reach; adults and children alike were buying Game Boys in order to play Tetris. Releasing Tetris on the Game Boy was selected as #4 on GameSpy's "25 Smartest Moments in Gaming". The original Game Boy was one of the first cartridge-based systems that supported networking: two devices with a Game Link Cable, or up to four with the Four Player Adapter. In 1995, the "Play it Loud" version of the original Game Boy was released in seven different colors: black, red, yellow, green, blue, white and clear, as well as additional sports-themed editions. Game Boy Pocket The Game Boy Pocket is a redesigned version of the original Game Boy with the same features. It was released in 1996. Notably, this variation is smaller and lighter. It comes in eight different colors: red, yellow, green, black, clear, silver, blue, and pink. Another notable improvement over the original Game Boy is a black-and-white display screen, rather than the green-tinted display of the original, which also featured improved response time for less blurring during motion. The Game Boy Pocket takes two AAA batteries, as opposed to the original's four AA batteries, for roughly ten hours of gameplay. The first model of the Game Boy Pocket did not have an LED to show battery levels, but the feature was added due to public demand. Game Boy Light In April 1998, a variant of the Game Boy Pocket named the Game Boy Light was released exclusively in Japan. The differences between the Game Boy Pocket and the Game Boy Light are that the Game Boy Light takes two AA batteries for approximately 20 hours of gameplay (when playing without using the light), rather than two AAA batteries, and that it has an electroluminescent screen that can be turned on or off. This electroluminescent screen gave games a blue-green tint and allowed the use of the unit in darkened areas.
Playing with the light on would allow about 12 hours of play. The Game Boy Light comes in six different colors: silver, gold, yellow for the Pokémon edition, translucent yellow, clear, and translucent red for the Astro Boy edition. The Game Boy Light was superseded by the Game Boy Color six months later and was the only Game Boy to have a backlit screen until the release of the Game Boy Advance SP AGS-101 model in 2005. Game Boy Color family Game Boy Color First released in Japan on October 21, 1998, the Game Boy Color (abbreviated as GBC) added a (slightly smaller) color screen to a form factor similar in size to the Game Boy Pocket. It also has double the processor speed, three times as much memory, and an infrared communications port. Technologically, it was likened to the 8-bit NES video game console of the 1980s, although the Game Boy Color has a much larger color palette (56 simultaneous colors out of 32,768 possible); its library included some classic NES ports as well as newer titles. It comes in six different colors: atomic purple, indigo, berry (red), kiwi (green), dandelion (yellow) and teal. The Game Boy Color also has several special edition variants, such as the yellow and silver Pokémon special editions or the Tommy Hilfiger yellow special edition. Like the Game Boy Light, the Game Boy Color takes only two AA batteries. It was Nintendo's final handheld to have 8-bit graphics and a vertical form factor. A major component of the Game Boy Color is its near-universal backward compatibility; that is, a Game Boy Color is able to read older Game Boy cartridges and even play them in a selectable color palette (similar to the Super Game Boy). The only black-and-white Game Boy games known to be incompatible are Road Rash and Joshua & the Battle of Jericho. Backwards compatibility became a major feature of the Game Boy line, since it allowed each new launch to begin with a significantly larger library than any of its competitors.
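The "32,768 possible" figure above follows from the Game Boy Color's 15-bit color format (five bits per channel). A minimal decoding sketch, assuming the commonly documented CGB palette layout with red in the low bits:

```python
# Decode a 15-bit Game Boy Color palette entry into 24-bit RGB.
# Assumed layout (per common CGB documentation): bits 0-4 red,
# bits 5-9 green, bits 10-14 blue.
def gbc_color_to_rgb888(raw: int) -> tuple:
    def expand(c5: int) -> int:
        return c5 * 255 // 31  # scale 0..31 up to 0..255
    return (expand(raw & 0x1F),          # red
            expand((raw >> 5) & 0x1F),   # green
            expand((raw >> 10) & 0x1F))  # blue

# 2**15 = 32,768 distinct raw values, matching the figure in the text.
```

The 56-simultaneous-color limit is a separate constraint: it comes from how many palette entries the hardware can hold at once, not from the per-channel bit depth.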
Some games written specifically for the Game Boy Color can be played on older model Game Boys, whereas others cannot (see the Game Paks section for more information). Game Boy Advance family Game Boy Advance In Japan, on March 21, 2001, Nintendo released a significant upgrade to the Game Boy line. The Game Boy Advance (also referred to as the GBA) featured a 32-bit, 16.8 MHz ARM processor. For backward compatibility it also included a Z80-like processor and a switch activated by inserting a Game Boy or Game Boy Color game into the slot, and it had a larger, higher-resolution screen. Controls were slightly modified with the addition of "L" and "R" shoulder buttons. Like the Game Boy Light and Game Boy Color, the Game Boy Advance takes two AA batteries. The system was technically likened to the SNES and showed its power with successful ports of SNES titles such as Super Mario World, Super Mario World 2: Yoshi's Island, The Legend of Zelda: A Link to the Past and Donkey Kong Country. There were also new titles that could be found only on the GBA, such as Mario Kart: Super Circuit, F-Zero: Maximum Velocity, Wario Land 4, Mario & Luigi: Superstar Saga and more. A widely criticized drawback of the Game Boy Advance is that its screen is not backlit, making viewing difficult in some conditions. The Game Paks for the GBA are roughly half the length of original Game Boy and Game Boy Color cartridges, so older Game Paks stick out of the top of the unit. When playing older games, the GBA provides the option to display the game at the original screen's resolution or to stretch it over the wider GBA screen. The selectable color palettes for original Game Boy games are identical to those on the Game Boy Color. The only Game Boy Color games known to be incompatible are Pocket Music and Chee-Chai Alien. It was the final handheld to require regular batteries and to lack a backlit screen.
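The unstretched-versus-stretched option described above is, in effect, a coordinate mapping between the original Game Boy's 160×144 framebuffer and the GBA's 240×160 screen. A nearest-neighbour sketch, illustrative only (the actual hardware scaling method is not specified here):

```python
GB_W, GB_H = 160, 144    # original Game Boy screen resolution
GBA_W, GBA_H = 240, 160  # Game Boy Advance screen resolution

def stretched_source_pixel(x: int, y: int) -> tuple:
    """For a GBA screen pixel (x, y), return the Game Boy framebuffer
    pixel that fills it when the image is stretched to the full screen."""
    return (x * GB_W // GBA_W, y * GB_H // GBA_H)
```

In the unstretched mode, the image is simply shown centered at its native resolution, so no resampling is needed.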
Game Boy Advance SP First released in Japan on February 14, 2003, the Game Boy Advance SP—Nintendo model AGS-001—resolved several problems with the original Game Boy Advance model. It featured a new, smaller clamshell design with a flip-up screen, a switchable internal frontlight, and, for the first time, a rechargeable battery. The only notable drawback is the omission of the headphone jack, which requires a special adapter, purchased separately. In September 2005, Nintendo released the Game Boy Advance SP model AGS-101, which featured a high-quality backlit screen instead of a frontlit one, similar to the Game Boy Micro screen but larger. It was the last handheld to be backwards compatible with Game Boy and Game Boy Color games. Game Boy Micro The third form of the Game Boy Advance system, the Game Boy Micro, is four and a half inches wide (10 cm), two inches tall (5 cm), and weighs 2.8 ounces (80 g). By far the smallest Game Boy created, it has approximately the same dimensions as an original NES controller pad. Its screen is approximately two-thirds the size of the SP and GBA screens while maintaining the same resolution (240×160 pixels), but boasts a higher-quality backlit display with adjustable brightness. Included with the system are two additional faceplates which can be swapped to give the system a new look; Nintendo of America sold additional faceplates on its online store. In Europe, the Game Boy Micro comes with a single faceplate. In Japan, a special Mother 3 limited edition Game Boy Micro was released with the game in the Mother 3 Deluxe Box. Unlike the Game Boy Advance and Game Boy Advance SP, the Game Boy Micro is unable to play original Game Boy or Game Boy Color games, only Game Boy Advance titles (with the exception of the Nintendo e-Reader, discontinued in America but still available in Japan). Comparison Game Paks Each video game is stored on a plastic cartridge, officially called a "Game Pak" by Nintendo.
All cartridges, excluding those for the Game Boy Advance, measure 5.8 by 6.5 cm. The cartridge provides the code and game data to the console's CPU. Some cartridges include a small battery with SRAM, a flash memory chip, or EEPROM, which allows game data to be saved when the console is turned off. If a cartridge's battery runs out, the save data is lost; however, the battery can be replaced. To do this, the cartridge must be unscrewed and opened, and the old battery removed and replaced. This may require desoldering the dead battery and soldering the replacement in place. Before 2003, Nintendo used round, flat watch batteries for saving information on the cartridges. These were phased out in newer cartridges because they last only a limited time. The cartridge is inserted into the console's cartridge slot. If the cartridge is removed while the power is on, and the Game Boy does not automatically reset, the game freezes and the Game Boy may exhibit unexpected behavior, such as rows of zeros appearing on the screen or the sound remaining at the pitch emitted the instant the game was pulled out; saved data may be corrupted, and hardware may be damaged. This applies to most video game consoles that use cartridges. The original Game Boy's power switch was designed to prevent the player from removing the cartridge while the power is on. Cartridges intended only for the Game Boy Color (and not the original Game Boy) lack the "notch" for the locking mechanism present in the top of the original cartridges, preventing operation on an original Game Boy (the cartridge can be inserted, but the power switch cannot be moved to the "on" position).
Even if this was bypassed by using a Game Boy Pocket, Game Boy Light, or Super Game Boy (and its Japanese-only follow-up), the game would not run, and an image on the screen would inform the user that the game is only compatible with Game Boy Color systems. One exception is Kirby Tilt 'n' Tumble: despite the game cartridge featuring a notch, enabling it to be inserted into the original Game Boy, the game displays an error message indicating that it only plays on the Game Boy Color. Chee-Chai Alien and Pocket Music are incompatible with Game Boy Advance models, displaying an error message indicating that they only play on the Game Boy Color. Game Boy Advance cartridges used a similar physical lock-out feature. Notches were located at the base of the cartridge's two back corners. One of these notches was positioned so as to avoid pressing a switch inside the cartridge slot that also helps stabilize the cartridge. When an older Game Boy or Game Boy Color game was inserted into the cartridge slot, the switch would be pressed down and the Game Boy Advance would start in Game Boy Color mode, while a Game Boy Advance cartridge would not touch the switch and the system would start in Game Boy Advance mode. The Nintendo DS replaced the switch with a solid piece of plastic that would allow Game Boy Advance cartridges to be inserted into Slot 2, but would prevent an older Game Boy or Game Boy Color cartridge from being inserted fully into the slot. Excluding game-specific variations, there are four types of cartridges compatible with Game Boy systems: Grey cartridges Grey cartridges (also known as class A) are compatible with all Game Boy systems, excluding the Game Boy Micro. All original Game Boy games are of this type. Some of these cartridges come in alternative colors, such as red or blue for Pokémon Red and Blue, and yellow for the Donkey Kong Land series.
The games on these cartridges are programmed in black and white; the Game Boy Color and later systems provide selectable color palettes for them. Some grey cartridges released between 1994 and 1998 have Super Game Boy enhancements. A few grey cartridges were released with built-in features that made them protrude from the slot but still included the notch for compatibility with the original Game Boy (notably the Game Boy Camera). Black cartridges Black cartridges (also known as class B or Dual Mode) are compatible with all Game Boy systems, excluding the Game Boy Micro. Although the games on these cartridges are programmed in color, they can still be played in monochrome on the Game Boy, Game Boy Pocket, Game Boy Light and Super Game Boy (and its Japanese follow-up). Examples of black-cartridge games are Pokémon Yellow: Special Pikachu Edition and Pokémon Gold and Silver (however, the actual colors of these three cartridges are yellow, gold, and silver, respectively). Games such as Wario Land II and The Legend of Zelda: Link's Awakening DX were full-color re-releases of grey-cartridge games, but with additional content only available on the Game Boy Color. Some black cartridges have Super Game Boy enhancements. Some black-cartridge games also had built-in hardware similar to the later clear cartridges, such as rumble features (Pokémon Pinball) and an infrared receiver (Robopon Sun, Star, and Moon Versions). Clear cartridges Clear cartridges (also known as class C) are compatible with the Game Boy Color and the Game Boy Advance systems, excluding the Game Boy Micro. Some games (such as Pokémon Crystal) were released in specially colored cartridges, as had been done before, but the new colors remained translucent. Some clear cartridges have built-in features, including rumble features (Perfect Dark) and tilt sensors (Kirby Tilt 'n' Tumble). These cartridges are a slightly different shape from the earlier varieties, and would obstruct the latch if inserted into the original Game Boy.
Unlike grey and black cartridges, clear cartridges cannot be played on a Game Boy Pocket, a Game Boy Light, or a Super Game Boy (or even its Japanese follow-up). Some class C cartridges (such as the European version of V-Rally: Championship Edition) used a solid cartridge design, like class B cartridges. Advance cartridges Advance cartridges (also known as class D) are half the size of all earlier cartridges and are compatible with the Game Boy Advance and later systems, including the Nintendo DS. Some cartridges are colored to resemble the game (usually for the Pokémon series; Pokémon Emerald, for example, being a clear emerald green). They are also compatible with the Nintendo DS and DS Lite (but see the Reception section for limitations). Some Advance cartridges have built-in features, including rumble features (Drill Dozer), tilt sensors (WarioWare: Twisted!, Yoshi's Universal Gravitation) and solar sensors (Boktai). Accessories Stand-alone devices The Game Boy, as with many other consoles, has had a number of accessories released by both first-party and unlicensed third-party manufacturers. The most notable were the Game Boy Camera and the Game Boy Printer, both released in 1998. Television adapters Special hardware has been released for various handhelds in the Game Boy line so they can be played on a television set. Super Game Boy In 1994, Nintendo released a special adapter cartridge for the Super Nintendo Entertainment System (SNES) called the Super Game Boy. The Super Game Boy allows game cartridges designed for the Game Boy to be played on a TV display using the SNES/Super Famicom controllers. At release, the Super Game Boy sold for about $60 in the United States. In the United Kingdom, it retailed for £49.99. The Super Game Boy's technical architecture is similar to that of a regular Game Boy; thus Game Boy games ran on native hardware rather than being emulated by the SNES.
It was the precursor to the Game Boy Player on the Nintendo GameCube, which functioned in a similar manner. Super Game Boy 2 A follow-up to the Super Game Boy, the Super Game Boy 2 was released only in Japan in 1998. The border is similar to that of actual Game Boy Pocket hardware, but it includes an actual link cable port, and the clock speed is slowed down to match that of the Game Boy. Game Boy Player The Game Boy Player is a device released in 2003 by Nintendo for the GameCube which enables Game Boy (although Super Game Boy enhancements are ignored), Game Boy Color, or Game Boy Advance cartridges to be played on a television. It connects via the high-speed parallel port at the bottom of the GameCube and requires a boot disc to access the hardware. Unlike devices such as Datel's Advance Game Port, the Game Boy Player does not use software emulation, but instead uses physical hardware nearly identical to that of a Game Boy Advance. Reception Approximately two thousand games are available for the Game Boy, which can be attributed in part to its sales in the millions, a well-documented design, and a typically short development cycle. The Nintendo DS and Nintendo DS Lite are able to play the large library of Game Boy Advance games (though the Nintendo DSi, Nintendo DSi XL, Nintendo 3DS, and Nintendo 2DS lack a GBA game cartridge slot). However, the DS consoles do not have a GBA game link connector, and so cannot play multiplayer GBA games (except for the few that are multiplayer on a single GBA) or link to the GameCube. The DS is not backward compatible with Game Paks for the original Game Boy or the Game Boy Color. With homebrew development on the Nintendo DS, full-speed Game Boy and Game Boy Color emulation has been achieved, as well as the ability to scale the smaller Game Boy screen image to the full DS screen.
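The compatibility rules spelled out in the Game Paks and Reception sections can be condensed into a small lookup table. This is a sketch with informal labels, and it ignores the handful of per-game exceptions the article notes (such as Pocket Music on the Game Boy Advance):

```python
# Which Game Boy-family systems accept which cartridge class,
# per the Game Paks section (per-game exceptions ignored).
COMPATIBILITY = {
    "grey (class A)":    {"Game Boy", "Game Boy Pocket", "Game Boy Light",
                          "Game Boy Color", "Game Boy Advance",
                          "Game Boy Advance SP"},
    "black (class B)":   {"Game Boy", "Game Boy Pocket", "Game Boy Light",
                          "Game Boy Color", "Game Boy Advance",
                          "Game Boy Advance SP"},
    "clear (class C)":   {"Game Boy Color", "Game Boy Advance",
                          "Game Boy Advance SP"},
    "advance (class D)": {"Game Boy Advance", "Game Boy Advance SP",
                          "Game Boy Micro", "Nintendo DS",
                          "Nintendo DS Lite"},
}

def can_play(cartridge_class: str, system: str) -> bool:
    return system in COMPATIBILITY.get(cartridge_class, set())
```

Note that the table reflects slot shape and lock-out notches as much as electronics: grey and black cartridges differ only in how later systems colorize them, while the Micro and the DS line physically reject the older cartridge shapes.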
Legacy Numerous musical acts have appropriated the Game Boy as a musical instrument (Game Boy music), using software such as nanoloop or Little Sound DJ. Certain games released for the Game Boy and Game Boy Color handheld consoles are available via the Virtual Console service on the Nintendo 3DS. Game Boy Advance games were at one point thought to be coming to the 3DS Virtual Console as well, but this turned out to be a mistranslation. However, ten Game Boy Advance games were released for Nintendo 3DS Ambassadors, that is, Nintendo 3DS owners who logged into the 3DS eShop before the major August 2011 price drop. These Ambassador releases have limited Virtual Console features, and there are no plans to release them to the general public. However, starting in April 2014, Nintendo has been releasing Game Boy Advance games as Virtual Console titles via the Nintendo eShop for the Wii U. See also Pokémon Mini References External links Official website Game Boy Game Boy consoles Handheld game consoles 1980s toys 1990s toys 2000s toys
https://en.wikipedia.org/wiki/Gemini%2010
Gemini 10
Gemini 10 (officially Gemini X) was a 1966 crewed spaceflight in NASA's Gemini program. It was the 8th crewed Gemini flight, the 16th crewed American flight, and the 24th spaceflight of all time (including X-15 flights over ). During the mission, flown by John Young and future Apollo 11 Command Module Pilot Michael Collins, Collins became the first person to perform two extravehicular activities. Crew Backup crew Support crew Edwin E. "Buzz" Aldrin (Houston CAPCOM) L. Gordon Cooper Jr. (Cape and Houston CAPCOM) Jim Lovell and Buzz Aldrin had originally been named the backup crew, but after Charles Bassett and Elliot See died in a T-38 crash, they were moved to the backup crew for Gemini 9, and Alan Bean and Clifton Williams were moved to the Gemini 10 flight. Mission parameters Mass: Perigee: Apogee: Inclination: 28.87° Period: 88.79 min Docking Docked: July 19, 1966 - 04:15:00 UTC Undocked: July 20, 1966 - 19:00:00 UTC Space walk Collins - EVA 1 (stand up) Start: July 19, 1966, 21:44:00 UTC End: July 19, 1966, 22:33:00 UTC Duration: 0 hours, 49 minutes Collins - EVA 2 Start: July 20, 1966, 23:01:00 UTC End: July 20, 1966, 23:40:00 UTC Duration: 0 hours, 39 minutes Objectives Gemini 10 was designed to achieve rendezvous and docking with an Agena Target Vehicle (ATV), and to perform EVA. It was also planned to dock with the ATV from the Gemini 8 mission. This Agena's battery power had failed months earlier, and an approach and docking would demonstrate the ability to rendezvous with a passive object. It would also be the first mission to fire the Agena's own rocket engine, allowing the crew to reach higher orbits. Gemini 10 established that radiation at high altitude was not a problem. After docking with their Agena booster in low orbit, Young and Collins used it to climb temporarily to . After leaving the first Agena, they then rendezvoused with the derelict Agena left over from the aborted Gemini 8 flight—thus executing the program's first double rendezvous.
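The altitude figures in the mission parameters above are elided, but the listed 88.79-minute period alone pins down the mean orbital altitude through Kepler's third law. A back-of-the-envelope check (the gravitational parameter and Earth radius are standard values, not taken from this article):

```python
import math

MU_EARTH = 398_600.4418  # km^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6_378.1        # km, equatorial radius

def semi_major_axis(period_s: float) -> float:
    # Kepler's third law: T^2 = 4*pi^2*a^3/mu  =>  a = (mu*T^2/(4*pi^2))^(1/3)
    return (MU_EARTH * period_s ** 2 / (4 * math.pi ** 2)) ** (1 / 3)

a = semi_major_axis(88.79 * 60)   # listed period of 88.79 minutes
mean_altitude = a - R_EARTH       # comes out near 215 km: a low Earth orbit
```

This only recovers the mean altitude of the initial parking orbit; the later Agena-boosted apogee was far higher, as the Flight section describes.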
With no electricity on board the second Agena, the rendezvous was accomplished with eyes only—no radar. After the rendezvous, Collins spacewalked over to the dormant Agena at the end of a tether, making him the first person to meet another spacecraft in orbit. Collins then retrieved a cosmic dust-collecting panel from the side of the Agena. As he was concentrating on keeping his tether clear of the Gemini and Agena, Collins' Hasselblad camera worked itself free and drifted away, so he was unable to take photographs during the spacewalk. Flight The Agena launched perfectly for the second time, after problems had occurred with the targets for Gemini 6 and 9. Gemini 10 followed 100 minutes later and entered a orbit. They were behind the Agena. Two anomalous events occurred during the launch. At liftoff, a propellant fill umbilical became snared with its release lanyard. It ripped out of the LC-19 service tower and remained attached to the second stage during ascent. Tracking camera footage also showed that the first stage oxidizer tank dome ruptured after staging and released a cloud of nitrogen tetroxide. The telemetry package on the first stage had been disabled at staging, so visual evidence was the only data available. Film review of Titan II ICBM launches found at least seven other instances of post-staging tank ruptures, most likely caused by flying debris, second stage engine exhaust, or structural bending. NASA finally decided that this phenomenon did not pose any safety risk to the astronauts and took no corrective action. First rendezvous Collins was unable to use the sextant for navigation, as it did not seem to work as expected. At first he mistook airglow for the real horizon when trying to take fixes on stars. When the image did not seem right, he tried another instrument, but it was not practical to use as it had a very small field of view. They had a backup in the form of the computers on the ground.
They made their first burn to put them into a orbit. However, Young did not realize that during the next burn the spacecraft was turned slightly, which introduced an out-of-plane error. This meant two extra burns were necessary, and by the time they had docked with the Agena, 60% of their fuel had been consumed. It was decided to keep the Gemini docked to the Agena as long as possible, as this meant they could use the fuel on board the Agena for attitude control. The first burn of the Agena engine lasted 80 seconds and put them in a orbit. This was the highest a person had ever been, although the record was soon surpassed by Gemini 11, which went to over . This burn was quite a ride for the crew. Because the Gemini and Agena docked nose-to-nose, the forces experienced were "eyeballs out", as opposed to "eyeballs in" for a launch from Earth. The crew took a couple of pictures when they reached apogee but were more interested in what was going on in the spacecraft — checking the systems and watching the radiation dosage meter. After this they had their sleep period, which lasted for eight hours, and then they were ready for another busy day. The crew's first order of business was to make a second burn with the Agena engine to put them into the same orbit as the Gemini 8 Agena. This burn came at 20:58 UTC on July 19, lasted 78 seconds, and took off their speed, putting them into a orbit. They made one more burn of the Agena to circularize their orbit to . EVA 1 The first of two EVAs on Gemini 10 was a standup EVA, in which Collins would stand in the open hatch and take photographs of stars as part of experiment S-13. They used a 70 mm general-purpose camera to image the southern Milky Way in ultraviolet. After orbital sunrise, Collins photographed a color plate on the side of the spacecraft (MSC-8) to see whether film reproduced colors accurately in space.
He reentered the spacecraft six minutes early when both astronauts found that their eyes were irritated, caused by a minor leak of lithium hydroxide into the astronauts' oxygen supply. After repressurizing the cabin, they ran the oxygen at high rates and flushed the environment system. After the exertion of the EVA, Young and Collins slept through their second 'night' in space. The next 'morning' they started preparing for the second rendezvous and another EVA. Second rendezvous After undocking from their Agena, the crew thought they sighted the Gemini 8 Agena. However, it turned out to be their own Agena away, while their target was away. It was not until just over away that they saw it as a faint star. After a few more correction burns, they were station-keeping away from the Gemini 8 Agena. They found the Agena to be very stable and in good condition. EVA 2 At 48 hours and 41 minutes into the mission, the second EVA began. Collins' first task was to retrieve a Micrometeorite Collector (S-12) from the side of the spacecraft. This he accomplished with some difficulty (similar to that encountered by Eugene Cernan on Gemini 9A). The collector floated out of the cabin at some time during the EVA and was lost. Collins next traveled over to the Agena and tried to grab onto the docking cone, but found this impossible as it was smooth and offered no grip. He used a nitrogen-propelled Hand-Held Maneuvering Unit (HHMU) to move himself towards the Gemini and then back to the Agena. This time he was able to grab hold of some wire bundles and retrieved the Micrometeorite Collector (S-10) from the Agena. He decided against replacing it, as a piece of shroud had come loose on the Agena and could have snared the umbilical, and returning to the Gemini was deemed the safest course of action.
The last tasks remaining on this EVA were to test out the HHMU, to test orbital mechanics using a tether between the Gemini and Agena, and for Young, in the spacecraft, to translate over to a passive Collins. However, due to the low propellant quantity remaining, combined with intermittent telemetry to monitor it, these fuel-costly maneuvers were abandoned and the EVA was finished after only 39 minutes. It then took the crew eight minutes to close the hatch, as they had some difficulty with the umbilical. The umbilical was jettisoned, along with the chestpack used by Collins, an hour later, when they opened the hatch for the third and final time. Experiments There were ten other experiments that the crew performed during the mission. Three concerned radiation: MSC-3 was the Tri-Axis Magnetometer, which measured levels in the South Atlantic Anomaly. There was also MSC-6, a beta spectrometer, which measured potential radiation doses for Apollo missions, and MSC-7, a bremsstrahlung spectrometer, which detected radiation flux as a function of energy when the spacecraft passed through the South Atlantic Anomaly. S-26 investigated the ion and electron wake of the spacecraft. It provided limited results due to the lack of fuel for attitude control, but found that electron and ion temperatures were higher than expected, and it registered shock effects during docking and undocking. The S-5 and S-6 experiments, previously carried on Gemini 9A, were also performed; these were Synoptic Terrain and Synoptic Weather photography, respectively. There was also S-1, which was intended to image the zodiacal light. All of these photographic experiments were of little use, as the film used was only half as sensitive as that on Gemini 9A and the dirty windows lowered the transmission of light by a factor of six. The crew also tried to perform D-5, a navigation experiment. They were only able to track five stars, with six needed for accurate measurements.
The last experiment, D-10, was to investigate an ion-sensing attitude control system. This experiment measured the attitude of the spacecraft from the flow of ions and electrons around the spacecraft in orbit. The results from this experiment showed the system to be accurate and responsive. Re-entry The last day of the mission was short and retrofire came at 70 hours and 10 minutes into the mission. They landed only away from the intended landing site and were recovered by . The Gemini 10 mission was supported by the following U.S. Department of Defense resources: 9,067 personnel, 78 aircraft and 13 ships. Insignia The patch is simple in design but highly symbolic. The main feature is a large X with a Gemini and Agena orbiting around it. The two stars have a variety of meanings: the two rendezvous attempts, Castor and Pollux in Gemini or the two crew members. This is one of the few crew patches without the crew's name. It is able to be displayed "upside down" but is correctly shown with the spacecraft to the right. It was designed by Young's first wife, Barbara. Spacecraft location For many years the spacecraft was the centerpiece of a space exhibition at Norsk Teknisk Museum, Oslo, Norway. It was returned on request in 2002. The spacecraft is currently on display at the Cosmosphere in Hutchinson, Kansas. See also Agena Target Vehicle Extra-vehicular activity List of spacewalks Splashdown Space exploration U.S. space exploration history on U.S. stamps Space suit Space capsule References External links NASA Gemini 10 press kit - July 15, 1966 Gemini 10 Mission Report (PDF) August 1966 U.S. Space Objects Registry https://web.archive.org/web/20090521121750/http://usspaceobjectsregistry.state.gov/search/index.cfm Extravehicular activity Human spaceflights Project Gemini missions Spacecraft launched in 1966 Spacecraft launched by Titan rockets Spacecraft which reentered in 1966 July 1966 events John Young (astronaut) Michael Collins (astronaut)
11984
https://en.wikipedia.org/wiki/Gardening
Gardening
Gardening is the practice of growing and cultivating plants as part of horticulture. In gardens, ornamental plants are often grown for their flowers, foliage, or overall appearance; useful plants, such as root vegetables, leaf vegetables, fruits, and herbs, are grown for consumption, for use as dyes, or for medicinal or cosmetic use. Gardening ranges in scale from fruit orchards, to long boulevard plantings with one or more different types of shrubs, trees, and herbaceous plants, to residential back gardens including lawns and foundation plantings, and to container gardens grown inside or outside. Gardening may be very specialized, with only one type of plant grown, or involve a variety of plants in mixed plantings. It involves an active participation in the growing of plants, and tends to be labor-intensive, which differentiates it from farming or forestry. History Ancient times Forest gardening, a forest-based food production system, is the world's oldest form of gardening. Forest gardens originated in prehistoric times along jungle-clad river banks and in the wet foothills of monsoon regions. In the gradual process of families improving their immediate environment, useful tree and vine species were identified, protected and improved while undesirable species were eliminated. Eventually foreign species were also selected and incorporated into the gardens. After the emergence of the first civilizations, wealthy individuals began to create gardens for aesthetic purposes. Ancient Egyptian tomb paintings from the New Kingdom (around 1500 BC) provide some of the earliest physical evidence of ornamental horticulture and landscape design; they depict lotus ponds surrounded by symmetrical rows of acacias and palms. A notable example of ancient ornamental gardens were the Hanging Gardens of Babylon—one of the Seven Wonders of the Ancient World —while ancient Rome had dozens of gardens. Wealthy ancient Egyptians used gardens for providing shade. 
Egyptians associated trees and gardens with gods, believing that their deities were pleased by gardens. Gardens in ancient Egypt were often surrounded by walls with trees planted in rows. Among the most popular species planted were date palms, sycamores, fig trees, nut trees, and willows. These gardens were a sign of higher socioeconomic status. In addition, wealthy ancient Egyptians grew vineyards, as wine was a sign of the higher social classes. Roses, poppies, daisies and irises could all also be found in the gardens of the Egyptians. Assyria was also renowned for its beautiful gardens. These tended to be wide and large, some of them used for hunting game—rather like a game reserve today—and others as leisure gardens. Cypresses and palms were some of the most frequently planted types of trees. Gardens were also available in Kush. In Musawwarat es-Sufra, the Great Enclosure dated to the 3rd century BC included splendid gardens. Ancient Roman gardens were laid out with hedges and vines and contained a wide variety of flowers—acanthus, cornflowers, crocus, cyclamen, hyacinth, iris, ivy, lavender, lilies, myrtle, narcissus, poppy, rosemary and violets—as well as statues and sculptures. Flower beds were popular in the courtyards of rich Romans. The Middle Ages The Middle Ages represent a period of decline in gardens for aesthetic purposes. After the fall of Rome, gardening was done for the purpose of growing medicinal herbs and/or decorating church altars. Monasteries carried on a tradition of garden design and intense horticultural techniques during the medieval period in Europe. Generally, monastic garden types consisted of kitchen gardens, infirmary gardens, cemetery orchards, cloister garths and vineyards. Individual monasteries might also have had a "green court", a plot of grass and trees where horses could graze, as well as a cellarer's garden or private gardens for obedientiaries, monks who held specific posts within the monastery. 
Islamic gardens were built after the model of Persian gardens and they were usually enclosed by walls and divided in four by watercourses. Commonly, the centre of the garden would have a reflecting pool or pavilion. Specific to the Islamic gardens are the mosaics and glazed tiles used to decorate the rills and fountains that were built in these gardens. By the late 13th century, rich Europeans began to grow gardens for leisure and for medicinal herbs and vegetables. They surrounded the gardens by walls to protect them from animals and to provide seclusion. During the next two centuries, Europeans started planting lawns and raising flowerbeds and trellises of roses. Fruit trees were common in these gardens and also in some, there were turf seats. At the same time, the gardens in the monasteries were a place to grow flowers and medicinal herbs but they were also a space where the monks could enjoy nature and relax. The gardens in the 16th and 17th century were symmetric, proportioned and balanced with a more classical appearance. Most of these gardens were built around a central axis and they were divided into different parts by hedges. Commonly, gardens had flowerbeds laid out in squares and separated by gravel paths. Gardens in Renaissance were adorned with sculptures, topiary and fountains. In the 17th century, knot gardens became popular along with the hedge mazes. By this time, Europeans started planting new flowers such as tulips, marigolds and sunflowers. Cottage gardens Cottage gardens, which emerged in Elizabethan times, appear to have originated as a local source for herbs and fruits. One theory is that they arose out of the Black Death of the 1340s, when the death of so many laborers made land available for small cottages with personal gardens. 
According to the late 19th-century legend of origin, these gardens were originally created by the workers that lived in the cottages of the villages, to provide them with food and herbs, with flowers planted among them for decoration. Farm workers were provided with cottages that had architectural quality set in a small garden—about —where they could grow food and keep pigs and chickens. Authentic gardens of the yeoman cottager would have included a beehive and livestock, and frequently a pig and sty, along with a well. The peasant cottager of medieval times was more interested in meat than flowers, with herbs grown for medicinal use rather than for their beauty. By Elizabethan times there was more prosperity, and thus more room to grow flowers. Even the early cottage garden flowers typically had their practical use—violets were spread on the floor (for their pleasant scent and keeping out vermin); calendulas and primroses were both attractive and used in cooking. Others, such as sweet William and hollyhocks, were grown entirely for their beauty. 18th century In the 18th century gardens were laid out more naturally, without any walls. This style of smooth undulating grass, which would run straight to the house, clumps, belts and scattering of trees and his serpentine lakes formed by invisibly damming small rivers, were a new style within the English landscape, a "gardenless" form of landscape gardening, which swept away almost all the remnants of previous formally patterned styles. The English landscape garden usually included a lake, lawns set against groves of trees, and often contained shrubberies, grottoes, pavilions, bridges and follies such as mock temples, Gothic ruins, bridges, and other picturesque architecture, designed to recreate an idyllic pastoral landscape. 
This new style emerged in England in the early 18th century, and spread across Europe, replacing the more formal, symmetrical garden à la française of the 17th century as the principal gardening style of Europe. The English garden presented an idealized view of nature. These gardens were often inspired by paintings of landscapes by Claude Lorrain and Nicolas Poussin, and some were influenced by the classic Chinese gardens of the East, which had recently been described by European travelers. The work of Lancelot 'Capability' Brown was particularly influential. Also, in 1804 the Horticultural Society was formed. Gardens of the 19th century contained plants such as the monkey puzzle or Chile pine. This is also the time when the so-called "gardenesque" style of gardens evolved. These gardens displayed a wide variety of flowers in a rather small space. Rock gardens increased in popularity in the 19th century. India: In India, in ancient times, patterns from sacred geometry and mandalas were used to design their gardens. Distinct mandala patterns denoted specific deities, planets, or even constellations. Such a garden was also referred to as a 'Mandala Vaatika'. The word 'Vaatika' can mean garden, plantation or parterre. Types Residential gardening takes place near the home, in a space referred to as the garden. Although a garden typically is located on the land near a residence, it may also be located on a roof, in an atrium, on a balcony, in a windowbox, on a patio or vivarium. Gardening also takes place in non-residential green areas, such as parks, public or semi-public gardens (botanical gardens or zoological gardens), amusement parks, along transportation corridors, and around tourist attractions and garden hotels. In these situations, a staff of gardeners or groundskeepers maintains the gardens. Indoor gardening is concerned with the growing of houseplants within a residence or building, in a conservatory, or in a greenhouse. 
Indoor gardens are sometimes incorporated as part of air conditioning or heating systems. Indoor gardening extends the growing season in the fall and spring and can be used for winter gardening. Native plant gardening is concerned with the use of native plants with or without the intent of creating wildlife habitat. The goal is to create a garden in harmony with, and adapted to a given area. This type of gardening typically reduces water usage, maintenance, and fertilization costs, while increasing native faunal interest. Water gardening is concerned with growing plants adapted to pools and ponds. Bog gardens are also considered a type of water garden. These all require special conditions and considerations. A simple water garden may consist solely of a tub containing the water and plant(s). In aquascaping, a garden is created within an aquarium tank. Container gardening is concerned with growing plants in any type of container either indoors or outdoors. Common containers are pots, hanging baskets, and planters. Container gardening is usually used in atriums and on balconies, patios, and roof tops. Hügelkultur is concerned with growing plants on piles of rotting wood, as a form of raised bed gardening and composting in situ. An English loanword from German, it means "mound garden." Toby Hemenway, noted permaculture author and teacher, considers wood buried in trenches to also be a form of hugelkultur referred to as a dead wood swale. Hugelkultur is practiced by Sepp Holzer as a method of forest gardening and agroforestry, and by Geoff Lawton as a method of dryland farming and desert greening. When used as a method of disposing of large volumes of waste wood and woody debris, hugelkultur accomplishes carbon sequestration. It is also a form of xeriscaping. 
Community gardening is a social activity in which an area of land is gardened by a group of people, providing access to fresh produce, herbs, flowers and plants as well as access to satisfying labor, neighborhood improvement, sense of community and connection to the environment. Community gardens are typically owned in trust by local governments or nonprofits. Garden sharing partners landowners with gardeners in need of land. These shared gardens, typically front or back yards, are usually used to produce food that is divided between the two parties. Organic gardening uses natural, sustainable methods, fertilizers and pesticides to grow non-genetically modified crops. Biodynamic gardening or biodynamic agriculture is similar to organic gardening, but it includes various esoteric concepts drawn from the ideas of Rudolf Steiner, such as astrological sowing and planting calendar and particular field and compost preparations. Commercial gardening is a more intensive type of gardening that involves the production of vegetables, nontropical fruits, and flowers by local farmers. Commercial gardening began because farmers would sell produce locally, avoiding the spoilage caused by transporting goods over long distances. Mediterranean agriculture is also a common practice that commercial gardeners use. Mediterranean agriculture is the practice of cultivating animals such as sheep to help weed and provide manure for vine crops, grains, or citrus. Gardeners can easily train these animals not to eat the actual plant. Social aspects People can express their political or social views in gardens, intentionally or not. The lawn vs. garden issue is played out in urban planning as the debate over the "land ethic" that is to determine urban land use and whether hyper hygienist bylaws (e.g. weed control) should apply, or whether land should generally be allowed to exist in its natural wild state. In a famous Canadian Charter of Rights case, "Sandra Bell vs. 
City of Toronto", 1997, the right to cultivate all native species, even most varieties deemed noxious or allergenic, was upheld as part of the right of free expression. Community gardening comprises a wide variety of approaches to sharing land and gardens. People often surround their house and garden with a hedge. Common hedge plants are privet, hawthorn, beech, yew, leyland cypress, hemlock, arborvitae, barberry, box, holly, oleander, forsythia and lavender. The idea of open gardens without hedges may be distasteful to those who enjoy privacy. The Slow Food movement has sought in some countries to add an edible school yard and garden classrooms to schools, e.g. in Fergus, Ontario, where these were added to a public school to augment the kitchen classroom. Garden sharing, where urban landowners allow gardeners to grow on their property in exchange for a share of the harvest, is associated with the desire to control the quality of one's food, and reconnect with soil and community. In US and British usage, the production of ornamental plantings around buildings is called landscaping, landscape maintenance or grounds keeping, while international usage uses the term gardening for these same activities. Also gaining popularity is the concept of "Green Gardening" which involves growing plants using organic fertilizers and pesticides so that the gardening process – or the flowers and fruits produced thereby – doesn't adversely affect the environment or people's health in any manner. Benefits Gardening is considered by many people to be a relaxing activity. There are also many studies about the positive effects on mental and physical health in relation to gardening. Specifically, gardening is thought to increase self-esteem and reduce stress. As writer and former teacher Sarah Biddle notes, one's garden may become a "tiny oasis to relax and recharge [one's] batteries." 
Comparison with farming Gardening for beauty is likely nearly as old as farming for food; however, for most of history, for the majority of people, there was no real distinction, since the need for food and other useful products trumped other concerns. Small-scale, subsistence agriculture (called hoe-farming) is largely indistinguishable from gardening. A patch of potatoes grown by a Peruvian peasant or an Irish smallholder for personal use could be described as either a garden or a farm. Gardening for average people evolved as a separate discipline, more concerned with aesthetics, recreation and leisure, under the influence of the pleasure gardens of the wealthy. Meanwhile, farming has evolved (in developed countries) in the direction of commercialization, economics of scale, and monocropping. In respect to its food-producing purpose, gardening is distinguished from farming chiefly by scale and intent. Farming occurs on a larger scale, and with the production of salable goods as a major motivation. Gardening happens on a smaller scale, primarily for pleasure and to produce goods for the gardener's own family or community. There is some overlap between the terms, particularly in that some moderate-sized vegetable growing concerns, often called market gardening, can fit in either category. The key distinction between gardening and farming is essentially one of scale; gardening can be a hobby or an income supplement, but farming is generally understood as a full-time or commercial activity, usually involving more land and quite different practices. One distinction is that gardening is labor-intensive and employs very little infrastructural capital, sometimes no more than a few tools, e.g. a spade, hoe, basket and watering can. By contrast, larger-scale farming often involves irrigation systems, chemical fertilizers and harvesters or at least ladders, e.g. to reach up into fruit trees. 
However, this distinction is becoming blurred with the increasing use of power tools in even small gardens. Monty Don has speculated on an atavistic connection between present-day gardeners and pre-modern peasantry. The term precision agriculture is sometimes used to describe gardening using intermediate technology (more than tools, less than harvesters), especially of organic varieties. Gardening is effectively scaled up to feed entire villages of over 100 people from specialized plots. A variant is the community garden which offers plots to urban dwellers; see further in allotment (gardening). Garden ornaments and accessories There is a wide range of garden ornaments and accessories available in the market for both the professional gardener and the amateur to exercise their creativity. These are used to add decoration or functionality, and may be made from a wide range of materials such as copper, stone, wood, bamboo, stainless steel, clay, stained glass, concrete, or iron. Examples include trellis, garden furniture, statues, outdoor fireplaces, fountains, rain chains, urns, bird baths and feeders, wind chimes, and garden lighting such as candle lanterns and oil lamps. The use of these items can be part of the expression of a gardener's gardening personality. Gardens as art Garden design is considered to be an art in most cultures, distinguished from gardening, which generally means garden maintenance. Garden design can include different themes such as perennial, butterfly, wildlife, Japanese, water, tropical, or shade gardens. In Japan, Samurai and Zen monks were often required to build decorative gardens or practice related skills like flower arrangement known as ikebana. In 18th-century Europe, country estates were refashioned by landscape gardeners into formal gardens or landscaped park lands, such as at Versailles, France, or Stowe, England. 
Today, landscape architects and garden designers continue to produce artistically creative designs for private garden spaces. In the US, professional landscape designers are certified by the Association of Professional Landscape Designers. Garden pests Garden pests are generally plants, fungi, or animals (frequently insects) that engage in activity that the gardener considers undesirable. A pest may crowd out desirable plants, disturb soil, stunt the growth of young seedlings, steal or damage fruit, or otherwise kill plants, hamper their growth, damage their appearance, or reduce the quality of the edible or ornamental portions of the plant. Aphids, spider mites, slugs, snails, ants, birds, and even cats are commonly considered to be garden pests. Because gardeners may have different goals, organisms considered "garden pests" vary from gardener to gardener. Tropaeolum speciosum, for example, may be considered a desirable and ornamental garden plant, or it may be considered a pest if it seeds and starts to grow where it is not wanted. As another example, in lawns, moss can become dominant and be impossible to eradicate. In some lawns, lichens, especially very damp lawn lichens such as Peltigera lactucfolia and P. membranacea, can become difficult to control and are considered pests. Garden pest control There are many ways by which unwanted pests are removed from a garden. The techniques vary depending on the pest, the gardener's goals, and the gardener's philosophy. For example, snails may be dealt with through the use of a chemical pesticide, an organic pesticide, hand-picking, barriers, or simply growing snail-resistant plants. Pest control is often done through the use of pesticides, which may be either organic or artificially synthesized. Pesticides may affect the ecology of a garden due to their effects on the populations of both target and non-target species. 
For example, unintended exposure to some neonicotinoid pesticides has been proposed as a factor in the recent decline in honey bee populations. A mole vibrator can deter mole activity in a garden. Other means of control include the removal of infected plants, using fertilizers and biostimulants to improve the health and vigour of plants so they better resist attack, practising crop rotation to prevent pest build-up, using companion planting, and practising good garden hygiene, such as disinfecting tools and clearing debris and weeds which may harbour pests. Garden guns Garden guns are smooth bore shotguns specifically made to fire .22 caliber snake shot, and are commonly used by gardeners and farmers for pest control. Garden guns are short-range weapons that can do little harm past to , and they are relatively quiet when fired with snake shot compared to standard ammunition. These guns are especially effective inside barns and sheds, as the snake shot will not shoot holes in the roof or walls, or more importantly injure livestock with a ricochet. They are also used for pest control at airports, warehouses, stockyards, etc. See also Arboretum Bonsai Compost Cultigen Eyecatchers Garden writing Growbag Introduced species Impact gardening List of garden types List of gardening topics List of horticulture and gardening books List of professional gardeners Master gardener program No-dig gardening Orchard References External links National Gardening Association (USA)
11985
https://en.wikipedia.org/wiki/Graffiti
Graffiti
Graffiti (both singular and plural; the singular graffito is rarely used except in archeology) is writing or drawings made on a wall or other surface, usually without permission and within public view. Graffiti ranges from simple written words to elaborate wall paintings, and has existed since ancient times, with examples dating back to ancient Egypt, ancient Greece, and the Roman Empire. Graffiti is a controversial subject. In most countries, marking or painting property without permission is considered by property owners and civic authorities as defacement and vandalism, which is a punishable crime, citing the use of graffiti by street gangs to mark territory or to serve as an indicator of gang-related activities. Graffiti has become visualized as a growing urban "problem" for many cities in industrialized nations, spreading from the New York City subway system in the early 1970s to the rest of the United States and Europe and other world regions. Etymology "Graffiti" (usually both singular and plural) and the rare singular form "graffito" are from the Italian word graffiato ("scratched"). The term "graffiti" is used in art history for works of art produced by scratching a design into a surface. A related term is "sgraffito", which involves scratching through one layer of pigment to reveal another beneath it. This technique was primarily used by potters who would glaze their wares and then scratch a design into it. In ancient times graffiti were carved on walls with a sharp object, although sometimes chalk or coal were used. The word originates from Greek —graphein—meaning "to write". History The term graffiti originally referred to the inscriptions, figure drawings, and such, found on the walls of ancient sepulchres or ruins, as in the Catacombs of Rome or at Pompeii. Use of the word has evolved to include any graphics applied to surfaces in a manner that constitutes vandalism. 
The only known source of the Safaitic language, an ancient form of Arabic, is from graffiti: inscriptions scratched on to the surface of rocks and boulders in the predominantly basalt desert of southern Syria, eastern Jordan and northern Saudi Arabia. Safaitic dates from the first century BC to the fourth century AD. Modern-style graffiti The first known example of "modern style" graffiti survives in the ancient Greek city of Ephesus (in modern-day Turkey). Local guides say it is an advertisement for prostitution. Located near a mosaic and stone walkway, the graffiti shows a handprint that vaguely resembles a heart, along with a footprint, a number, and a carved image of a woman's head. The ancient Romans carved graffiti on walls and monuments, examples of which also survive in Egypt. Graffiti in the classical world had different connotations than they carry in today's society concerning content. Ancient graffiti displayed phrases of love declarations, political rhetoric, and simple words of thought, compared to today's popular messages of social and political ideals. The eruption of Vesuvius preserved graffiti in Pompeii, which includes Latin curses, magic spells, declarations of love, insults, alphabets, political slogans, and famous literary quotes, providing insight into ancient Roman street life. One inscription gives the address of a woman named Novellia Primigenia of Nuceria, a prostitute, apparently of great beauty, whose services were much in demand. Another shows a phallus accompanied by the text, mansueta tene ("handle with care"). Disappointed love also found its way onto walls in antiquity: Ancient tourists visiting the 5th-century citadel at Sigiriya in Sri Lanka scribbled over 1800 individual graffiti there between the 6th and 18th centuries. Etched on the surface of the Mirror Wall, they contain pieces of prose, poetry, and commentary. 
The majority of these visitors appear to have been from the elite of society: royalty, officials, professions, and clergy. There were also soldiers, archers, and even some metalworkers. The topics range from love to satire, curses, wit, and lament. Many demonstrate a very high level of literacy and a deep appreciation of art and poetry. Most of the graffiti refer to the frescoes of semi-nude females found there. One reads: Among the ancient political graffiti examples were Arab satirist poems. Yazid al-Himyari, an Umayyad Arab and Persian poet, was most known for writing his political poetry on the walls between Sajistan and Basra, manifesting a strong hatred towards the Umayyad regime and its walis, and people used to read and circulate them very widely. Level of literacy often evident in graffiti Historic forms of graffiti have helped gain understanding into the lifestyles and languages of past cultures. Errors in spelling and grammar in these graffiti offer insight into the degree of literacy in Roman times and provide clues on the pronunciation of spoken Latin. Examples are CIL IV, 7838: Vettium Firmum / aed[ilem] quactiliar[ii] rog[ant]. Here, "qu" is pronounced "co". The 83 pieces of graffiti found at CIL IV, 4706-85 are evidence of the ability to read and write at levels of society where literacy might not be expected. The graffiti appear on a peristyle which was being remodeled at the time of the eruption of Vesuvius by the architect Crescens. The graffiti were left by both the foreman and his workers. The brothel at CIL VII, 12, 18–20 contains more than 120 pieces of graffiti, some of which were the work of the prostitutes and their clients. 
The gladiatorial academy at CIL IV, 4397 was scrawled with graffiti left by the gladiator Celadus Crescens (Suspirium puellarum Celadus thraex: "Celadus the Thracian makes the girls sigh.") Another piece from Pompeii, written on a tavern wall about the owner of the establishment and his questionable wine: It was not only the Greeks and Romans who produced graffiti: the Maya site of Tikal in Guatemala contains examples of ancient Maya graffiti. Viking graffiti survive in Rome and at Newgrange Mound in Ireland, and a Varangian scratched his name (Halvdan) in runes on a banister in the Hagia Sophia at Constantinople. These early forms of graffiti have contributed to the understanding of lifestyles and languages of past cultures. Graffiti, known as Tacherons, were frequently scratched on Romanesque Scandinavian church walls. When Renaissance artists such as Pinturicchio, Raphael, Michelangelo, Ghirlandaio, or Filippino Lippi descended into the ruins of Nero's Domus Aurea, they carved or painted their names and returned to initiate the grottesche style of decoration. There are also examples of graffiti occurring in American history, such as Independence Rock, a national landmark along the Oregon Trail. Later, French soldiers carved their names on monuments during the Napoleonic campaign of Egypt in the 1790s. Lord Byron's survives on one of the columns of the Temple of Poseidon at Cape Sounion in Attica, Greece. Contemporary graffiti Contemporary graffiti style has been heavily influenced by hip hop culture and the myriad international styles derived from Philadelphia and New York City Subway graffiti; however, there are many other traditions of notable graffiti in the twentieth century. Graffiti have long appeared on building walls, in latrines, railroad boxcars, subways, and bridges. The oldest known examples of modern graffiti are the "monikers" found on traincars created by hobos and railworkers since the late 1800s. 
The Bozo Texino monikers were documented by filmmaker Bill Daniel in his 2005 film, Who is Bozo Texino?. Some graffiti have their own poignancy. In World War II, an inscription on a wall at the fortress of Verdun was seen as an illustration of the US response twice in a generation to the wrongs of the Old World: During World War II and for decades after, the phrase "Kilroy was here" with an accompanying illustration was widespread throughout the world, owing to its use by American troops, and it ultimately filtered into American popular culture. Shortly after the death of Charlie Parker (nicknamed "Yardbird" or "Bird"), graffiti began appearing around New York with the words "Bird Lives". The student protests and general strike of May 1968 saw Paris bedecked in revolutionary, anarchistic, and situationist slogans such as L'ennui est contre-révolutionnaire ("Boredom is counterrevolutionary") expressed in painted graffiti, poster art, and stencil art. At the time in the US, other political phrases (such as "Free Huey" about Black Panther Huey Newton) became briefly popular as graffiti in limited areas, only to be forgotten. A popular graffito of the early 1970s was "Dick Nixon Before He Dicks You", reflecting the hostility of the youth culture to that US president. Advent of aerosol paint Rock and roll graffiti is a significant subgenre. A famous graffito of the twentieth century was the inscription in the London tube reading "Clapton is God", in reference to the guitarist Eric Clapton. The phrase was spray-painted by an admirer on a wall in an Islington station on the Underground in the autumn of 1967. The graffito was captured in a photograph, in which a dog is urinating on the wall. Graffiti also became associated with the anti-establishment punk rock movement beginning in the 1970s. Bands such as Black Flag and Crass (and their followers) widely stenciled their names and logos, while many punk night clubs, squats, and hangouts are famous for their graffiti.
In the late 1980s the upside-down Martini glass that was the tag for punk band Missing Foundation was the most ubiquitous graffito in lower Manhattan. Spread of hip hop culture The 1983 documentary Style Wars depicted not only famous graffitists such as Skeme, Dondi, MinOne, and ZEPHYR, but also reinforced graffiti's role within New York's emerging hip-hop culture by incorporating famous early break-dancing groups such as Rock Steady Crew into the film and featuring rap in the soundtrack. Although many officers of the New York City Police Department found this film to be controversial, Style Wars is still recognized as the most prolific film representation of what was going on within the young hip hop culture of the early 1980s. Fab 5 Freddy and Futura 2000 took hip hop graffiti to Paris and London as part of the New York City Rap Tour in 1983. Stencil graffiti emerges This period also saw the emergence of the new stencil graffiti genre. Some of the first examples were created in 1981 by graffitist Blek le Rat in Paris and in 1982 by Jef Aerosol in Tours, France; by 1985 stencils had appeared in other cities including New York City, Sydney, and Melbourne, where they were documented by American photographer Charles Gatewood and Australian photographer Rennie Ellis. Commercialization and entrance into mainstream pop culture With the popularity and legitimization of graffiti has come a level of commercialization. In 2001, computer giant IBM launched an advertising campaign in Chicago and San Francisco which involved people spray painting on sidewalks a peace symbol, a heart, and a penguin (Linux mascot), to represent "Peace, Love, and Linux." IBM paid Chicago and San Francisco collectively US$120,000 for punitive damages and clean-up costs. In 2005, a similar ad campaign was launched by Sony and executed by its advertising agency in New York, Chicago, Atlanta, Philadelphia, Los Angeles, and Miami, to market its handheld PSP gaming system.
In this campaign, taking notice of the legal problems of the IBM campaign, Sony paid building owners for the rights to paint on their buildings "a collection of dizzy-eyed urban kids playing with the PSP as if it were a skateboard, a paddle, or a rocking horse". Advocates Marc Ecko, an urban clothing designer, has been an advocate of graffiti as an art form during this period, stating that "Graffiti is without question the most powerful art movement in recent history and has been a driving inspiration throughout my career." Graffiti have become a common stepping stone for many members of both the art and design communities in North America and abroad. Within the United States graffitists such as Mike Giant, Pursue, Rime, Noah, and countless others have made careers in skateboard, apparel, and shoe design for companies such as DC Shoes, Adidas, Rebel8, Osiris, or Circa. Meanwhile, there are many others such as DZINE, Daze, Blade, and The Mac who have made the switch to being gallery artists, often not even using their initial medium, spray paint. Global developments South America Tristan Manco wrote that Brazil "boasts a unique and particularly rich graffiti scene ... [earning] it an international reputation as the place to go for artistic inspiration." Graffiti "flourishes in every conceivable space in Brazil's cities." Artistic parallels "are often drawn between the energy of São Paulo today and 1970s New York." The "sprawling metropolis" of São Paulo has "become the new shrine to graffiti"; Manco alludes to "poverty and unemployment ... [and] the epic struggles and conditions of the country's marginalised peoples," and to "Brazil's chronic poverty," as the main engines that "have fuelled a vibrant graffiti culture." In world terms, Brazil has "one of the most uneven distributions of income. Laws and taxes change frequently."
Such factors, Manco argues, contribute to a very fluid society, riven with those economic divisions and social tensions that underpin and feed the "folkloric vandalism and an urban sport for the disenfranchised," that is South American graffiti art. Prominent Brazilian graffitists include Os Gêmeos, Boleta, Nunca, Nina, Speto, Tikka, and T.Freak. Their artistic success and involvement in commercial design ventures has highlighted divisions within the Brazilian graffiti community between adherents of the cruder transgressive form of pichação and the more conventionally artistic values of the practitioners of grafite. Middle East Graffiti in the Middle East has emerged slowly, with taggers operating in Egypt, Lebanon, Gulf countries such as Bahrain and the United Arab Emirates, Israel, and Iran. The major Iranian newspaper Hamshahri has published two articles on illegal writers in Tehran with photographic coverage of Iranian artist A1one's works on Tehran walls. Tokyo-based design magazine, PingMag, has interviewed A1one and featured photographs of his work. The Israeli West Bank barrier has become a site for graffiti, reminiscent in this sense of the Berlin Wall. Many graffitists in Israel come from other places around the globe, such as JUIF from Los Angeles and DEVIONE from London. The religious reference "נ נח נחמ נחמן מאומן" ("Na Nach Nachma Nachman Meuman") is commonly seen in graffiti around Israel. Graffiti has played an important role within the street art scene in the Middle East and North Africa (MENA), especially following the events of the Arab Spring of 2011 and the Sudanese Revolution of 2018/19. Graffiti is a tool of expression in the context of conflict in the region, allowing people to raise their voices politically and socially. Famous street artist Banksy has had an important effect on the street art scene in the MENA area, especially in Palestine, where some of his works are located on the West Bank barrier and in Bethlehem.
Southeast Asia There are also many graffiti influences in Southeast Asian countries, mostly drawn from modern Western culture, as in Malaysia, where graffiti have long been a common sight in the capital city, Kuala Lumpur. Since 2010, the country has begun hosting a street festival to encourage all generations and people from all walks of life to enjoy and encourage Malaysian street culture. Characteristics of common graffiti Methods and production Modern-day graffitists draw on an arsenal of materials to produce a piece. This includes such techniques as scribing. However, spray paint in aerosol cans is the number one medium for graffiti. From this medium come different styles, techniques, and abilities to form master works of graffiti. Spray paint can be found at hardware and art stores and comes in virtually every color. Stencil graffiti is created by cutting out shapes and designs in a stiff material (such as cardboard or subject folders) to form an overall design or image. The stencil is then placed on the "canvas" gently, and with quick, easy strokes of the aerosol can, the image begins to appear on the intended surface. Modern experimentation Modern graffiti art often incorporates additional arts and technologies. For example, Graffiti Research Lab has encouraged the use of projected images and magnetic light-emitting diodes (throwies) as new media for graffitists. Yarnbombing is another recent form of graffiti. Yarnbombers occasionally target previous graffiti for modification, a practice that most graffitists had previously avoided. Tagging Tagging is the practice of someone spray-painting "their name, initial or logo onto a public surface". A number of recent examples of graffiti make use of hashtags.
Uses Theories on the use of graffiti by avant-garde artists have a history dating back at least to Asger Jorn, who in a 1962 painting declared in a graffiti-like gesture "the avant-garde won't give up". Many contemporary analysts and even art critics have begun to see artistic value in some graffiti and to recognize it as a form of public art. According to many art researchers, particularly in the Netherlands and in Los Angeles, that type of public art is, in fact, an effective tool of social emancipation or of achieving a political goal. In times of conflict, such murals have offered a means of communication and self-expression for members of these socially, ethnically, or racially divided communities, and have proven themselves effective tools in establishing dialog and thus addressing cleavages in the long run. The Berlin Wall was also extensively covered by graffiti reflecting social pressures relating to the oppressive Soviet rule over the GDR. Many artists involved with graffiti are also concerned with the similar activity of stenciling. Essentially, this entails stenciling a print of one or more colors using spray-paint. First recognized for exhibiting and publishing several of her coloured stencils and paintings portraying the Sri Lankan Civil War and urban Britain in the early 2000s, graffitist Mathangi Arulpragasam, aka M.I.A., has also become known for integrating her imagery of political violence into her music videos for singles "Galang" and "Bucky Done Gun", and her cover art. Stickers of her artwork also often appear in places such as Brick Lane in London, stuck to lamp posts and street signs; she has become a muse for other graffitists and painters worldwide in cities including Seville. Personal expression Many graffitists choose to protect their identities and remain anonymous, in part to hinder prosecution.
With the commercialization of graffiti (and hip hop in general), in most cases, even with legally painted "graffiti" art, graffitists tend to choose anonymity. This may be attributed to various reasons or a combination of reasons. Graffiti remains the only one of the four hip hop elements that is not considered "performance art" despite the image of the "singing and dancing star" that sells hip hop culture to the mainstream. Being a graphic form of art, it might also be said that many graffitists still fall into the category of the introverted archetypal artist. Banksy is one of the world's most notorious and popular street artists who continues to remain faceless in today's society. He is known for his political, anti-war stencil art mainly in Bristol, England, but his work may be seen anywhere from Los Angeles to Palestine. In the UK, Banksy is the most recognizable icon for this cultural artistic movement and keeps his identity a secret to avoid arrest. Much of Banksy's artwork may be seen around the streets of London and surrounding suburbs, although he has painted pictures throughout the world, including the Middle East, where he has painted on Israel's controversial West Bank barrier with satirical images of life on the other side. One depicted a hole in the wall with an idyllic beach, while another shows a mountain landscape on the other side. A number of exhibitions also have taken place since 2000, and recent works of art have fetched vast sums of money. Banksy's art is a prime example of the classic controversy: vandalism vs. art. Art supporters endorse his work distributed in urban areas as pieces of art and some councils, such as Bristol and Islington, have officially protected them, while officials of other areas have deemed his work to be vandalism and have removed it. Pixnit is another artist who chooses to keep her identity from the general public.
Her work focuses on beauty and design aspects of graffiti as opposed to Banksy's anti-government shock value. Her paintings are often of flower designs above shops and stores in her local urban area of Cambridge, Massachusetts. Some store owners endorse her work and encourage others to do similar work as well. "One of the pieces was left up above Steve's Kitchen, because it looks pretty awesome," said Erin Scott, the manager of New England Comics in Allston, Massachusetts. Graffiti artists may become offended if photographs of their art are published in a commercial context without their permission. In March 2020, the Finnish graffiti artist Psyke expressed his displeasure at the newspaper Ilta-Sanomat publishing a photograph of a Peugeot 208 in an article about new cars, with his graffiti prominently shown in the background. The artist claims he does not want his art being used in a commercial context, even if he were to receive compensation. Radical and political Graffiti often has a reputation as part of a subculture that rebels against authority, although the considerations of the practitioners often diverge and can relate to a wide range of attitudes. It can express a political practice and can form just one tool in an array of resistance techniques. One early example includes the anarcho-punk band Crass, who conducted a campaign of stenciling anti-war, anarchist, feminist, and anti-consumerist messages throughout the London Underground system during the late 1970s and early 1980s. In Amsterdam graffiti was a major part of the punk scene. The city was covered with names such as "De Zoot", "Vendex", and "Dr Rat". To document the graffiti a punk magazine was started that was called Gallery Anus. So when hip hop came to Europe in the early 1980s there was already a vibrant graffiti culture.
The student protests and general strike of May 1968 saw Paris bedecked in revolutionary, anarchistic, and situationist slogans such as L'ennui est contre-révolutionnaire ("Boredom is counterrevolutionary") and Lisez moins, vivez plus ("Read less, live more"). While not exhaustive, the graffiti gave a sense of the 'millenarian' and rebellious spirit, tempered with a good deal of verbal wit, of the strikers. The developments of graffiti art which took place in art galleries and colleges as well as "on the street" or "underground", contributed to the resurfacing in the 1990s of a far more overtly politicized art form in the subvertising, culture jamming, or tactical media movements. These movements or styles tend to classify the artists by their relationship to their social and economic contexts, since, in most countries, graffiti art remains illegal in many forms except when using non-permanent paint. Since the 1990s with the rise of Street Art, a growing number of artists are switching to non-permanent paints and non-traditional forms of painting. Contemporary practitioners, accordingly, have varied and often conflicting practices. Some individuals, such as Alexander Brener, have used the medium to politicize other art forms, and have used the prison sentences enforced on them as a means of further protest. The practices of anonymous groups and individuals also vary widely, and practitioners by no means always agree with each other's practices. For example, the anti-capitalist art group the Space Hijackers did a piece in 2004 about the contradiction between the capitalistic elements of Banksy and his use of political imagery. Territorial graffiti marks urban neighborhoods with tags and logos to differentiate certain groups from others. These images are meant to show outsiders a stern look at whose turf is whose. The subject matter of gang-related graffiti consists of cryptic symbols and initials strictly fashioned with unique calligraphies. 
Gang members use graffiti to designate membership throughout the gang, to distinguish rivals and associates and, most commonly, to mark borders which are both territorial and ideological. Berlin human rights activist Irmela Mensah-Schramm has received global media attention and numerous awards for her 35-year campaign of effacing neo-Nazi and other right-wing extremist graffiti throughout Germany, often by altering hate speech in humorous ways. As advertising Graffiti has been used as a means of advertising both legally and illegally. Bronx-based TATS CRU has made a name for themselves doing legal advertising campaigns for companies such as Coca-Cola, McDonald's, Toyota, and MTV. In the UK, Covent Garden's Boxfresh used stencil images of a Zapatista revolutionary in the hopes that cross referencing would promote their store. Smirnoff hired artists to use reverse graffiti (the use of high pressure hoses to clean dirty surfaces to leave a clean image in the surrounding dirt) to increase awareness of their product. Offensive graffiti Graffiti may also be used as an offensive expression. This form of graffiti may be difficult to identify, as it is mostly removed by the local authority (as councils which have adopted strategies of criminalization also strive to remove graffiti quickly). Therefore, existing racist graffiti is mostly more subtle and, at first sight, not easily recognized as "racist". It can then be understood only if one knows the relevant "local code" (social, historical, political, temporal, and spatial), which is seen as heteroglot and thus a 'unique set of conditions' in a cultural context. A spatial code, for example, could be that there is a certain youth group in an area that is engaging heavily in racist activities. So, for residents (knowing the local code), a graffito containing only the name or abbreviation of this gang is already a racist expression, reminding the offended people of the gang's activities.
Also, graffiti is in most cases the herald of more serious criminal activity to come. A person who does not know these gang activities would not be able to recognize the meaning of this graffiti. Also, if a tag of this youth group or gang is placed on a building occupied by asylum seekers, for example, its racist character is even stronger. By making the graffiti less explicit (as adapted to social and legal constraints), these drawings are less likely to be removed, but do not lose their threatening and offensive character. Elsewhere, activists in Russia have used painted caricatures of local officials with their mouths as potholes, to show their anger about the poor state of the roads. In Manchester, England, a graffitist painted obscene images around potholes, which often resulted in their being repaired within 48 hours. Decorative and high art In the early 1980s, the first art galleries to show graffitists to the public were Fashion Moda in the Bronx and the Now Gallery and Fun Gallery, both in the East Village, Manhattan. A 2006 exhibition at the Brooklyn Museum displayed graffiti as an art form that began in New York's outer boroughs and reached great heights in the early 1980s with the work of Crash, Lee, Daze, Keith Haring, and Jean-Michel Basquiat. It displayed 22 works by New York graffitists, including Crash, Daze, and Lady Pink. In an article about the exhibition in the magazine Time Out, curator Charlotta Kotik said that she hoped the exhibition would cause viewers to rethink their assumptions about graffiti. From the 1970s onwards, Burhan Dogancay photographed urban walls all over the world; these he then archived for use as sources of inspiration for his painterly works. The project today known as "Walls of the World" grew beyond even his own expectations and comprises about 30,000 individual images. It spans a period of 40 years across five continents and 114 countries.
In 1982, photographs from this project comprised a one-man exhibition titled "Les murs murmurent, ils crient, ils chantent..." (The walls whisper, shout and sing...) at the Centre Georges Pompidou in Paris. In Australia, art historians have judged some local graffiti of sufficient creative merit to rank them firmly within the arts. Oxford University Press's art history text Australian Painting 1788–2000 concludes with a long discussion of graffiti's key place within contemporary visual culture, including the work of several Australian practitioners. Between March and April 2009, 150 artists exhibited 300 pieces of graffiti at the Grand Palais in Paris. Environmental effects Spray paint has many negative environmental effects. The paint contains toxic chemicals, and the can uses volatile hydrocarbon gases to spray the paint onto a surface. Volatile organic compounds (VOCs) lead to ground-level ozone formation, and most graffiti-related emissions are VOCs. A 2010 paper estimates 4,862 tons of VOCs were released in the United States in activities related to graffiti. Government responses Asia In China, Mao Zedong in the 1920s used revolutionary slogans and paintings in public places to galvanise the country's communist revolution. Many people assume that China's attitude towards graffiti is harsh, but according to Lance Crayon in his film Spray Paint Beijing: Graffiti in the Capital of China, graffiti is generally accepted in Beijing, with artists not seeing much police interference. Political and religiously sensitive graffiti, however, is not allowed. In Hong Kong, Tsang Tsou Choi was known as the King of Kowloon for his calligraphy graffiti over many years, in which he claimed ownership of the area. Now some of his work is preserved officially. In Taiwan, the government has made some concessions to graffitists.
Since 2005 they have been allowed to freely display their work along some sections of riverside retaining walls in designated "Graffiti Zones". From 2007, Taipei's department of cultural affairs also began permitting graffiti on fences around major public construction sites. Department head Yong-ping Lee (李永萍) stated, "We will promote graffiti starting with the public sector, and then later in the private sector too. It's our goal to beautify the city with graffiti". The government later helped organize a graffiti contest in Ximending, a popular shopping district. Graffitists caught working outside of these designated areas still face fines of up to NT$6,000 under a department of environmental protection regulation. However, Taiwanese authorities can be relatively lenient, with one veteran police officer stating anonymously, "Unless someone complains about vandalism, we won't get involved. We don't go after it proactively." In 1993, after several expensive cars in Singapore were spray-painted, the police arrested a student from the Singapore American School, Michael P. Fay, questioned him, and subsequently charged him with vandalism. Fay pleaded guilty to vandalizing a car in addition to stealing road signs. Under the 1966 Vandalism Act of Singapore, originally passed to curb the spread of communist graffiti in Singapore, the court sentenced him to four months in jail, a fine of S$3,500 (US$2,233), and a caning. The New York Times ran several editorials and op-eds that condemned the punishment and called on the American public to flood the Singaporean embassy with protests. Although the Singapore government received many calls for clemency, Fay's caning took place in Singapore on 5 May 1994. Fay had originally received a sentence of six strokes of the cane, but the president of Singapore, Ong Teng Cheong, agreed to reduce his caning sentence to four lashes.
In South Korea, Park Jung-soo was fined two million South Korean won by the Seoul Central District Court for spray-painting a rat on posters of the G-20 Summit a few days before the event in November 2011. Park alleged that the initial in "G-20" sounds like the Korean word for "rat", but Korean government prosecutors alleged that Park was making a derogatory statement about the president of South Korea, Lee Myung-bak, the host of the summit. This case led to public outcry and debate on the lack of government tolerance and in support of freedom of expression. The court ruled that the painting, "an ominous creature like a rat" amounts to "an organized criminal activity" and upheld the fine while denying the prosecution's request for imprisonment for Park. Europe In Europe, community cleaning squads have responded to graffiti, in some cases with reckless abandon, as when in 1992 in France a local Scout group, attempting to remove modern graffiti, damaged two prehistoric paintings of bison in the Cave of Mayrière supérieure near the French village of Bruniquel in Tarn-et-Garonne, earning them the 1992 Ig Nobel Prize in archeology. In September 2006, the European Parliament directed the European Commission to create urban environment policies to prevent and eliminate dirt, litter, graffiti, animal excrement, and excessive noise from domestic and vehicular music systems in European cities, along with other concerns over urban life. In Budapest, Hungary, both a city-backed movement called I Love Budapest and a special police division tackle the problem, including the provision of approved areas. United Kingdom The Anti-Social Behaviour Act 2003 became Britain's latest anti-graffiti legislation. In August 2004, the Keep Britain Tidy campaign issued a press release calling for zero tolerance of graffiti and supporting proposals such as issuing "on the spot" fines to graffiti offenders and banning the sale of aerosol paint to anyone under the age of 16. 
The press release also condemned the use of graffiti images in advertising and in music videos, arguing that real-world experience of graffiti stood far removed from its often-portrayed "cool" or "edgy" image. To back the campaign, 123 Members of Parliament (MPs) (including then Prime Minister Tony Blair) signed a charter which stated: "Graffiti is not art, it's crime. On behalf of my constituents, I will do all I can to rid our community of this problem." In the UK, city councils have the power to take action against the owner of any property that has been defaced under the Anti-social Behaviour Act 2003 (as amended by the Clean Neighbourhoods and Environment Act 2005) or, in certain cases, the Highways Act. This is often used against owners of property that are complacent in allowing protective boards to be defaced so long as the property is not damaged. In July 2008, a conspiracy charge was used to convict graffitists for the first time. After a three-month police surveillance operation, nine members of the DPM crew were convicted of conspiracy to commit criminal damage costing at least £1 million. Five of them received prison sentences, ranging from eighteen months to two years. The unprecedented scale of the investigation and the severity of the sentences rekindled public debate over whether graffiti should be considered art or crime. Some councils, like those of Stroud and Loerrach, provide approved areas in the town where graffitists can showcase their talents, including underpasses, car parks, and walls that might otherwise prove a target for the "spray and run". Australia In an effort to reduce vandalism, many cities in Australia have designated walls or areas exclusively for use by graffitists. One early example is the "Graffiti Tunnel" located at the Camperdown Campus of the University of Sydney, which is available for use by any student at the university to tag, advertise, poster, and create "art".
Advocates of this idea suggest that this discourages petty vandalism yet encourages artists to take their time and produce great art, without worry of being caught or arrested for vandalism or trespassing. Others disagree with this approach, arguing that the presence of legal graffiti walls does not demonstrably reduce illegal graffiti elsewhere. Some local government areas throughout Australia have introduced "anti-graffiti squads", who clean graffiti in the area, and such crews as BCW (Buffers Can't Win) have taken steps to keep one step ahead of local graffiti cleaners. Many state governments have banned the sale of spray paint to, and its possession by, those under the age of 18 (the age of majority). However, a number of local governments in Victoria have taken steps to recognize the cultural heritage value of some examples of graffiti, such as prominent political graffiti. Tough new graffiti laws have been introduced in Australia with fines of up to A$26,000 and two years in prison. Melbourne is a prominent graffiti city of Australia, with many of its lanes being tourist attractions; Hosier Lane in particular is a popular destination for photographers, wedding photography, and backdrops for corporate print advertising. The Lonely Planet travel guide cites Melbourne's street art as a major attraction. All forms of graffiti, including sticker art, poster, stencil art, and wheatpasting, can be found in many places throughout the city. Prominent street art precincts include: Fitzroy, Collingwood, Northcote, Brunswick, St. Kilda, and the CBD, where stencil and sticker art is prominent. As one moves farther away from the city, mostly along suburban train lines, graffiti tags become more prominent.
Many international artists such as Banksy have left their work in Melbourne, and in early 2008 a perspex screen was installed to prevent a Banksy stencil art piece from being destroyed. The piece had survived since 2003 through the respect of local street artists, who avoided posting over it, although it has recently had paint tipped over it. New Zealand In February 2008 Helen Clark, the New Zealand prime minister at that time, announced a government crackdown on tagging and other forms of graffiti vandalism, describing it as a destructive crime representing an invasion of public and private property. New legislation subsequently adopted included a ban on the sale of paint spray cans to persons under 18 and increases in maximum fines for the offence from NZ$200 to NZ$2,000 or extended community service. The issue of tagging became a widely debated one following an incident in Auckland during January 2008 in which a middle-aged property owner stabbed one of two teenage taggers to death and was subsequently convicted of manslaughter. United States Tracker databases Graffiti databases have increased in the past decade because they allow vandalism incidents to be fully documented against an offender and help the police and prosecution charge and prosecute offenders for multiple counts of vandalism. They also provide law enforcement the ability to rapidly search for an offender's moniker or tag in a simple, effective, and comprehensive way. These systems can also help track costs of damage to a city to help allocate an anti-graffiti budget. The theory is that when an offender is caught putting up graffiti, they are not just charged with one count of vandalism; they can be held accountable for all the other damage for which they are responsible. This has two main benefits for law enforcement. One, it sends a signal to the offenders that their vandalism is being tracked.
Two, a city can seek restitution from offenders for all the damage that they have committed, not merely a single incident. These systems give law enforcement personnel real-time, street-level intelligence that allows them not only to focus on the worst graffiti offenders and their damage, but also to monitor potential gang violence that is associated with the graffiti. Gang injunctions Many restrictions of civil gang injunctions are designed to help address and protect the physical environment and limit graffiti. Provisions of gang injunctions include restricting the possession of marker pens, spray paint cans, or other sharp objects capable of defacing private or public property; and spray painting, marking with marker pens, scratching, applying stickers, or otherwise applying graffiti on any public or private property, including, but not limited to, streets, alleys, residences, block walls, fences, vehicles or any other real or personal property. Some injunctions contain wording that restricts damaging or vandalizing both public and private property, including but not limited to any vehicle, light fixture, door, fence, wall, gate, window, building, street sign, utility box, telephone box, tree, or power pole. Hotlines and reward programs To help address many of these issues, many local jurisdictions have set up graffiti abatement hotlines, where citizens can call in and report vandalism and have it removed. San Diego's hotline receives more than 5,000 calls per year; in addition to reporting graffiti, callers can learn more about prevention. One of the complaints about these hotlines is the response time; there is often a lag time between a property owner calling about the graffiti and its removal. The length of delay should be a consideration for any jurisdiction planning on operating a hotline. Local jurisdictions must convince the callers that their complaint of vandalism will be a priority and cleaned off right away.
If the jurisdiction does not have the resources to respond to complaints in a timely manner, the value of the hotline diminishes. Crews must be able to respond to individual service calls made to the graffiti hotline as well as focus on cleanup near schools, parks, and major intersections and transit routes to have the biggest impact. Some cities offer a reward for information leading to the arrest and prosecution of suspects for tagging or graffiti-related vandalism. The amount of the reward is based on the information provided and the action taken. Search warrants When police obtain search warrants in connection with a vandalism investigation, they are often seeking judicial approval to look for items such as cans of spray paint and nozzles from other kinds of aerosol sprays; etching tools, or other sharp or pointed objects, which could be used to etch or scratch glass and other hard surfaces; permanent marking pens, markers, or paint sticks; evidence of membership or affiliation with any gang or tagging crew; paraphernalia including any reference to "(tagger's name)"; any drawings, writing, objects, or graffiti depicting taggers' names, initials, logos, monikers, slogans, or any mention of tagging crew membership; and any newspaper clippings relating to graffiti crime. Documentaries 80 Blocks from Tiffany's (1979): A rare glimpse into late 1970s New York toward the end of the infamous South Bronx gangs; the documentary shows many sides of the mainly Puerto Rican community of the South Bronx, including reformed gang members, current gang members, the police, and the community leaders who try to reach out to them.
Stations of the Elevated (1980), the earliest documentary about subway graffiti in New York City, with music by Charles Mingus Style Wars (1983), an early documentary on hip hop culture, made in New York City Piece by Piece (2005), a feature-length documentary on the history of San Francisco graffiti from the early 1980s Infamy (2005), a feature-length documentary about graffiti culture as told through the experiences of six well-known graffiti writers and a graffiti buffer NEXT: A Primer on Urban Painting (2005), a documentary about global graffiti culture RASH (2005), a feature documentary about Melbourne, Australia and the artists who make it a living host for street art Jisoe (2007): A glimpse into the life of a Melbourne, Australia, graffiti writer, showing the audience an example of graffiti in struggling Melbourne areas. Roadsworth: Crossing the Line (2009), about Montréal artist Peter Gibson and his controversial stencil art on public roads Exit Through The Gift Shop (2010) was produced by the notorious artist Banksy. It tells the story of Thierry Guetta, a French immigrant in Los Angeles, and his obsession with street art; Shepard Fairey and Invader, whom Guetta discovers is his cousin, are also in the film. Still on and non the wiser (2011) is a ninety-minute documentary that accompanies the exhibition of the same name in the Kunsthalle Barmen of the Von der Heydt-Museum in Wuppertal (Germany). It draws vivid portrayals of the artists by means of very personal interviews and also captures the creation process of the works before the exhibition opened.
Graffiti Wars (2011), a documentary detailing King Robbo's feud with Banksy as well as the authorities' differing attitude towards graffiti and street art Dramas Wild Style (1983), about hip hop and graffiti culture in New York City Turk 182 (1985), about graffiti as political activism Bomb the System (2002), about a crew of graffitists in modern-day New York City Quality of Life (2004) was shot in the Mission District of San Francisco, co-written by and starring a retired graffiti writer. Wholetrain (2006), a German film See also Anti-graffiti coating BUGA UP Calligraffiti The Faith of Graffiti Grafedia Graffiti abatement Graffiti in Miami Graffiti in the United Kingdom Graffiti post-2011 Egyptian Revolution Graffiti terminology Hobo sign Kilroy was here Kotwica Latrinalia List of graffiti and street art injuries and deaths Monsters of Art Philadelphia Mural Arts Program Spray paint art Stencil Graffiti Street art Vandalism Visual pollution Yarn bombing References Further reading Baird, J. A. and C. Taylor, eds. 2011, Ancient Graffiti in Context. New York: Routledge. External links Visual arts genres Painting techniques Street culture Writing Organized crime activity
https://en.wikipedia.org/wiki/Godzilla
Godzilla
is a fictional monster, or kaiju, originating from a series of Japanese films. The character first appeared in the 1954 film Godzilla and became a worldwide pop culture icon, appearing in various media, including 32 films produced by Toho, four Hollywood films and numerous video games, novels, comic books and television shows. Godzilla has been dubbed the "King of the Monsters", a phrase first used in Godzilla, King of the Monsters! (1956), the Americanized version of the original film. Godzilla is an enormous, destructive, prehistoric sea monster awakened and empowered by nuclear radiation. With the nuclear bombings of Hiroshima and Nagasaki and the Lucky Dragon 5 incident still fresh in the Japanese consciousness, Godzilla was conceived as a metaphor for nuclear weapons. Others have suggested that Godzilla is a metaphor for the United States, a giant beast woken from its slumber which then takes terrible vengeance on Japan. As the film series expanded, some stories took on less serious undertones, portraying Godzilla as an antihero, or a lesser threat who defends humanity. Later films address themes including Japan's forgetfulness over its imperial past, natural disasters, and the human condition. Godzilla has featured alongside many supporting characters. It has faced human opponents such as the JSDF, or other monsters, including King Ghidorah, Mechagodzilla and Gigan. Godzilla sometimes has allies, such as Rodan, Mothra and Anguirus, and offspring, such as Minilla and Godzilla Junior. Godzilla has also fought characters from other franchises in crossover media, such as the RKO Pictures/Universal Studios movie monster King Kong, as well as various Marvel Comics characters, including S.H.I.E.L.D., the Fantastic Four and the Avengers. Overview The name Gojira is a portmanteau of the Japanese words gorira ("gorilla") and kujira ("whale"), owing to the fact that in one planning stage, Godzilla was described as "a cross between a gorilla and a whale", due to its size, power and aquatic origin.
One popular story is that "Gojira" was actually the nickname of a corpulent stagehand at Toho Studio. Kimi Honda, the widow of the director, dismissed this in a 1998 BBC documentary devoted to Godzilla: "The backstage boys at Toho loved to joke around with tall stories." Godzilla's name was written in ateji as 呉爾羅, where the kanji are used for phonetic value and not meaning. The Japanese pronunciation of the name is [ɡoꜜdʑiɾa]; the Anglicized form is /ɡɒdˈzɪlə/, with the first syllable pronounced like the word "god" and the rest rhyming with "gorilla". In the Hepburn romanization system, Godzilla's name is rendered as "Gojira", whereas in the Kunrei romanization system it is rendered as "Gozira". During the development of the American version of Godzilla Raids Again (1955), Godzilla's name was changed to "Gigantis" by producer Paul Schreibman, who wanted to create a character distinct from Godzilla. Characteristics Within the context of the Japanese films, Godzilla's exact origins vary, but it is generally depicted as an enormous, violent, prehistoric sea monster awakened and empowered by nuclear radiation. Although the specific details of Godzilla's appearance have varied slightly over the years, the overall impression has remained consistent. Inspired by the fictional Rhedosaurus created by animator Ray Harryhausen for the film The Beast from 20,000 Fathoms, Godzilla's character design was conceived as that of an amphibious reptilian monster based around the loose concept of a dinosaur with an erect standing posture, scaly skin, an anthropomorphic torso with muscular arms, lobed bony plates along its back and tail, and a furrowed brow. Art director Akira Watanabe combined attributes of a Tyrannosaurus, an Iguanodon, a Stegosaurus and an alligator to form a sort of blended chimera, inspired by illustrations from an issue of Life magazine. To emphasise the monster's relationship with the atomic bomb, its skin texture was inspired by the keloid scars seen on the survivors of Hiroshima.
The basic design has a reptilian visage, a robust build, an upright posture, a long tail and three rows of serrated plates along the back. In the original film, the plates were added for purely aesthetic purposes, in order to further differentiate Godzilla from any other living or extinct creature. Godzilla is sometimes depicted as green in comics, cartoons and movie posters, but the costumes used in the movies were usually painted charcoal grey with bone-white dorsal plates up until the film Godzilla 2000: Millennium. In the original Japanese films, Godzilla and all the other monsters are referred to with gender-neutral pronouns equivalent to "it", while in the English dubbed versions, Godzilla is explicitly described as a male. In his book, Godzilla co-creator Tomoyuki Tanaka suggested that the monster was probably male. In the 1998 film Godzilla, the monster is referred to as a male and is depicted laying eggs through parthenogenesis. In the Legendary Godzilla films, Godzilla is referred to as a male. Godzilla's allegiance and motivations have changed from film to film to suit the needs of the story. Although Godzilla does not like humans, it will fight alongside humanity against common threats. However, it makes no special effort to protect human life or property and will turn against its human allies on a whim. It is not motivated to attack by predatory instinct: it does not eat people and instead sustains itself on nuclear radiation and an omnivorous diet. When asked whether Godzilla was "good or bad", producer Shōgo Tomiyama likened it to a Shinto "God of Destruction" which lacks moral agency and cannot be held to human standards of good and evil. "He totally destroys everything and then there is a rebirth. Something new and fresh can begin."
Abilities Godzilla's signature weapon is its "atomic heat beam" (also known as "atomic breath"), nuclear energy that it generates inside of its body, uses electromagnetic force to concentrate into a laser-like high-velocity projectile and unleashes from its jaws in the form of a blue or red radioactive beam. Toho's special effects department has used various techniques to render the beam, from physical gas-powered flames to hand-drawn or computer-generated fire. Godzilla is shown to possess immense physical strength and muscularity. Haruo Nakajima, the actor who played Godzilla in the original films, was a black belt in judo and used his expertise to choreograph the battle sequences. Godzilla is amphibious: it has a preference for traversing Earth's hydrosphere when in hibernation or migration, can breathe underwater and is described in the original film by the character Dr. Yamane as a transitional form between a marine and a terrestrial reptile. Godzilla is shown to have great vitality: it is immune to conventional weaponry thanks to its rugged hide and ability to regenerate, and as a result of surviving a nuclear explosion, it cannot be destroyed by anything less powerful. One incarnation possesses an electromagnetic pulse-producing organ in its body which generates an asymmetrical permeable shield, making it impervious to all damage except for a short period when the organ recycles. Various films, non-canonical television shows, comics and games have depicted Godzilla with additional powers, such as an atomic pulse, magnetism, precognition, fireballs, the ability to convert electromagnetic energy into intense body heat, the ability to convert shed blood into temporary tentacle limbs, an electric bite, superhuman speed, laser beams emitted from its eyes and even flight.
Roar Godzilla has a distinctive disyllabic roar (transcribed in several comics as Skreeeonk!), which was created by composer Akira Ifukube, who produced the sound by rubbing a pine tar-resin-coated glove along the string of a contrabass and then slowing down the playback. In the American version of Godzilla Raids Again (1955) titled Gigantis the Fire Monster (1959), Godzilla's roar was mostly substituted with that of the monster Anguirus. From The Return of Godzilla (1984) to Godzilla vs. King Ghidorah (1991), Godzilla was given a deeper and more threatening-sounding roar than in previous films, though this change was reverted from Godzilla vs. Mothra (1992) onward. For the 2014 American film, sound editors Ethan Van der Ryn and Erik Aadahl refused to disclose the source of the sounds used for their Godzilla's roar. Aadahl described the two syllables of the roar as representing two different emotional reactions, with the first expressing fury and the second conveying the character's soul. Size Godzilla's size is inconsistent, changing from film to film and even from scene to scene for the sake of artistic license. The miniature sets and costumes were typically built at a – scale and filmed at 240 frames per second to create the illusion of great size. In the original 1954 film, Godzilla was scaled to be 50 m (164 ft) tall. This was done so Godzilla could just peer over the largest buildings in Tokyo at the time. In the 1956 American version, Godzilla is estimated to be 400 ft (122 m) tall, because producer Joseph E. Levine felt that 50 m did not sound "powerful enough". As the series progressed, Toho would rescale the character, eventually making Godzilla as tall as 100 m (328 ft). This was done so that it would not be dwarfed by the newer, bigger buildings in Tokyo's skyline, such as the Tokyo Metropolitan Government Building which Godzilla destroyed in the film Godzilla vs. King Ghidorah (1991). Supplementary information, such as character profiles, would also depict Godzilla as weighing between 20,000 and 60,000 metric tons.
In the American film Godzilla (2014) from Legendary Pictures, Godzilla was scaled to be 108.2 m (355 ft) tall and weighing 90,000 metric tons, making it the largest film version at that time. Director Gareth Edwards wanted Godzilla "to be so big as to be seen from anywhere in the city, but not too big that he couldn't be obscured". For Shin Godzilla (2016), Godzilla was made even taller than the Legendary version, at 118.5 m (389 ft). In Godzilla: Planet of the Monsters (2017), Godzilla's height was increased further still to 300 m (984 ft). In Godzilla: King of the Monsters (2019) and Godzilla vs. Kong (2020), Godzilla's height was increased to 119.8 m (393 ft) from the 2014 incarnation. Special effects details Godzilla's appearance has traditionally been portrayed in the films by an actor wearing a latex costume, though the character has also been rendered in animatronic, stop-motion and computer-generated form. Taking inspiration from King Kong, special effects artist Eiji Tsuburaya had initially wanted Godzilla to be portrayed via stop-motion, but prohibitive deadlines and a lack of experienced animators in Japan at the time made suitmation more practical. The first suit consisted of a body cavity made of thin wires and bamboo wrapped in chicken wire for support and covered in fabric and cushions, which were then coated in latex. The first suit was held together by small hooks on the back, though subsequent Godzilla suits incorporated a zipper. Its weight was in excess of 100 kg. Prior to 1984, most Godzilla suits were made from scratch, thus resulting in slight design changes in each film appearance. The most notable changes from 1962 to 1975 were the reduction in Godzilla's number of toes and the removal of the character's external ears and prominent fangs, features which would all later be reincorporated in the Godzilla designs from The Return of Godzilla (1984) onward. The most consistent Godzilla design was maintained from Godzilla vs. Biollante (1989) to Godzilla vs.
Destoroyah (1995), when the suit was given a cat-like face and double rows of teeth. Several suit actors had difficulties in performing as Godzilla due to the suits' weight, lack of ventilation and diminished visibility. Haruo Nakajima, who portrayed Godzilla from 1954 to 1972, said the materials used to make the 1954 suit (rubber, plastic, cotton, and latex) were hard to find after World War II. The suit weighed 100 kilograms after its completion and required two men to help Nakajima put it on. When he first put it on, he sweated so heavily that his shirt was soaked within seconds. Kenpachiro Satsuma in particular, who portrayed Godzilla from 1984 to 1995, described how the Godzilla suits he wore were even heavier and hotter than their predecessors because of the incorporation of animatronics. Satsuma himself suffered numerous medical issues during his tenure, including oxygen deprivation, near-drowning, concussions, electric shocks and lacerations to the legs from the suits' steel wire reinforcements wearing through the rubber padding. The ventilation problem was partially solved in the suit used in 1994's Godzilla vs. SpaceGodzilla, which was the first to include an air duct that allowed suit actors to last longer during performances. In The Return of Godzilla (1984), some scenes made use of a 16-foot high robotic Godzilla (dubbed the "Cybot Godzilla") for use in close-up shots of the creature's head. The Cybot Godzilla consisted of a hydraulically-powered mechanical endoskeleton covered in urethane skin containing 3,000 computer operated parts which permitted it to tilt its head and move its lips and arms. In Godzilla (1998), special effects artist Patrick Tatopoulos was instructed to redesign Godzilla as an incredibly fast runner. At one point, it was planned to use motion capture from a human to create the movements of the computer-generated Godzilla, but it was said to have ended up looking too much like a man in a suit. 
Tatopoulos subsequently reimagined the creature as a lean, digitigrade bipedal, iguana-like creature that stood with its back and tail parallel to the ground, rendered via CGI. Several scenes had the monster portrayed by stuntmen in suits. The suits were similar to those used in the Toho films, with the actors' heads being located in the monster's neck region and the facial movements controlled via animatronics. However, because of the creature's horizontal posture, the stuntmen had to wear metal leg extenders, which allowed them to stand off the ground with their feet bent forward. The film's special effects crew also built a scale animatronic Godzilla for close-up scenes, whose size outmatched that of Stan Winston's T. rex in Jurassic Park. Kurt Carley performed the suitmation sequences for the adult Godzilla. In Godzilla (2014), the character was portrayed entirely via CGI. Godzilla's design in the reboot was intended to stay true to that of the original series, though the film's special effects team strove to make the monster "more dynamic than a guy in a big rubber suit." To create a CG version of Godzilla, the Moving Picture Company (MPC) studied various animals such as bears, Komodo dragons, lizards, lions and wolves, which helped the visual effects artists visualize Godzilla's body structure, like that of its underlying bone, fat and muscle structure, as well as the thickness and texture of its scales. Motion capture was also used for some of Godzilla's movements. T. J. Storm provided the performance capture for Godzilla by wearing sensors in front of a green screen. Storm reprised the role of Godzilla in Godzilla: King of the Monsters, portraying the character through performance capture. In Shin Godzilla, a majority of the character was portrayed via CGI, with Mansai Nomura portraying Godzilla through motion capture. 
Cultural impact Godzilla is one of the most recognizable symbols of Japanese popular culture worldwide and remains an important facet of Japanese films, embodying the kaiju subset of the tokusatsu genre. Godzilla's vaguely humanoid appearance and strained, lumbering movements endeared it to Japanese audiences, who could relate to Godzilla as a sympathetic character, despite its wrathful nature. Audiences respond positively to the character because it acts out of rage and self-preservation and shows where science and technology can go wrong. In 1967, the Keukdong Entertainment Company of South Korea, with production assistance from Toei Company, produced Yongary, Monster from the Deep, featuring a reptilian monster that invades South Korea to consume oil. The film and character have often been branded as an imitation of Godzilla. Godzilla has been considered a filmographic metaphor for the United States, as well as an allegory of nuclear weapons in general. The earlier Godzilla films, especially the original, portrayed Godzilla as a frightening nuclear-spawned monster. Godzilla represented the fears that many Japanese held about the atomic bombings of Hiroshima and Nagasaki and the possibility of recurrence. As the series progressed, so did Godzilla, changing into a less destructive and more heroic character. Ghidorah (1964) was the turning point in Godzilla's transformation from villain to hero, by pitting him against a greater threat to humanity, King Ghidorah. Godzilla has since been viewed as an anti-hero. Roger Ebert cites Godzilla as a notable example of a villain-turned-hero, along with King Kong, Jaws (James Bond), the Terminator and John Rambo. Godzilla is considered "the original radioactive superhero" due to his accidental radioactive origin story predating Spider-Man (1962 debut), though Godzilla did not become a hero until Ghidorah in 1964.
By the 1970s, Godzilla came to be viewed as a superhero, with the magazine King of the Monsters in 1977 describing Godzilla as "Superhero of the '70s." Godzilla had surpassed Superman and Batman to become "the most universally popular superhero of 1977" according to Donald F. Glut. Godzilla was also voted the most popular movie monster in The Monster Times poll in 1973, beating Count Dracula, King Kong, the Wolf Man, the Mummy, the Creature from the Black Lagoon and the Frankenstein Monster. In 1996, Godzilla received the MTV Lifetime Achievement Award, as well as being given a star on the Hollywood Walk of Fame in 2004 to celebrate the premiere of the character's 50th anniversary film, Godzilla: Final Wars. Godzilla's pop-cultural impact has led to the creation of numerous parodies and tributes, as seen in media such as Bambi Meets Godzilla, which was ranked as one of the "50 greatest cartoons", two episodes of Mystery Science Theater 3000 and the song "Godzilla" by Blue Öyster Cult. Godzilla has also been used in advertisements, such as in a commercial for Nike, where Godzilla lost an oversized one-on-one game of basketball to a giant version of NBA player Charles Barkley. The commercial was subsequently adapted into a comic book illustrated by Jeff Butler. Godzilla has also appeared in a commercial for Snickers candy bars, which served as an indirect promo for the 2014 film. Godzilla's success inspired the creation of numerous other monster characters, such as Gamera, Reptilicus of Denmark, Yonggary of South Korea, Pulgasari of North Korea, Gorgo of the United Kingdom and the Cloverfield monster of the United States. Dakosaurus is an extinct sea crocodile of the Jurassic Period, which researchers informally nicknamed "Godzilla". 
Paleontologists have written tongue-in-cheek speculative articles about Godzilla's biology, with Ken Carpenter tentatively classifying it as a ceratosaur based on its skull shape, four-fingered hands and dorsal scutes and paleontologist Darren Naish expressing skepticism, while commenting on Godzilla's unusual morphology. Godzilla's ubiquity in pop-culture has led to the mistaken assumption that the character is in the public domain, resulting in litigation by Toho to protect their corporate asset from becoming a generic trademark. In April 2008, Subway depicted a giant monster in a commercial for their Five Dollar Footlongs sandwich promotion. Toho filed a lawsuit against Subway for using the character without permission, demanding $150,000 in compensation. In February 2011, Toho sued Honda for depicting a fire-breathing monster in a commercial for the Honda Odyssey. The monster was never mentioned by name, being seen briefly on a video screen inside the minivan. The Sea Shepherd Conservation Society christened a vessel the MV Gojira. Its purpose is to target and harass Japanese whalers in defense of whales in the Southern Ocean Whale Sanctuary. The MV Gojira was renamed the in May 2011, due to legal pressure from Toho. Gojira is the name of a French death metal band, formerly known as Godzilla; legal problems forced the band to change their name. In May 2015, Toho launched a lawsuit against Voltage Pictures over a planned picture starring Anne Hathaway. Promotional material released at the Cannes Film Festival used images of Godzilla. Steven Spielberg cited Godzilla as an inspiration for Jurassic Park (1993), specifically Godzilla, King of the Monsters! (1956), which he grew up watching. Spielberg described Godzilla as "the most masterful of all the dinosaur movies because it made you believe it was really happening." Godzilla also influenced the Spielberg film Jaws (1975). 
Godzilla has also been cited as an inspiration by filmmakers Martin Scorsese and Tim Burton. The main-belt asteroid 101781 Gojira, discovered by American astronomer Roy Tucker at the Goodricke-Pigott Observatory in 1999, was named in honor of the creature. The official naming citation was published by the Minor Planet Center on 11 July 2018. Cultural ambassador In April 2015, the Shinjuku ward of Tokyo named Godzilla a special resident and official tourism ambassador to encourage tourism. During an unveiling of a giant Godzilla bust at Toho headquarters, Shinjuku mayor Kenichi Yoshizumi stated "Godzilla is a character that is the pride of Japan." The mayor extended a residency certificate to an actor in a rubber suit representing Godzilla, but as the suit's hands were not designed for grasping, it was accepted on Godzilla's behalf by a Toho executive. Reporters noted that Shinjuku ward has been flattened by Godzilla in three Toho movies. References Sources External links Official Godzilla website by Toho Official website of Toho (Japanese) Godzilla on IMDb CGI characters Fictional dragonslayers Film characters introduced in 1954 Fictional aquatic creatures Fictional characters with accelerated healing Fictional characters with electric or magnetic abilities Fictional characters with fire or heat abilities Fictional characters with nuclear or radiation abilities Fictional characters with superhuman strength Fictional prehistoric characters Fictional reptiles Fictional giants Fictional mass murderers Fictional monsters Fictional mutants Fictional telepaths Godzilla characters Horror film villains Kaiju King Kong (franchise) characters MonsterVerse characters Science fiction film characters Toho Monsters
https://en.wikipedia.org/wiki/King%20Kong%20vs.%20Godzilla
King Kong vs. Godzilla
is a 1962 Japanese kaiju film directed by Ishirō Honda, with special effects by Eiji Tsuburaya. Produced and distributed by Toho Co., Ltd, it is the third film in both the Godzilla and King Kong franchises, and the first of two Toho-produced films featuring King Kong. It is also the first time that each character appeared on film in color and widescreen. The film stars Tadao Takashima, Kenji Sahara, Yū Fujiki, Ichirō Arishima, and Mie Hama, with Shoichi Hirose as King Kong and Haruo Nakajima as Godzilla. In the film, Godzilla is reawakened by an American submarine while a pharmaceutical company captures King Kong for promotional use, culminating in a battle on Mount Fuji. The project began with a story outline devised by King Kong stop motion animator Willis H. O'Brien around 1960, in which Kong battles a giant Frankenstein Monster; O'Brien gave the outline to producer John Beck for development. Behind O'Brien's back and without his knowledge, Beck gave the project to Toho to produce the film, replacing the giant Frankenstein Monster with Godzilla and scrapping O'Brien's original story. King Kong vs. Godzilla was released theatrically in Japan on August 11, 1962. The film remains the most attended Godzilla film in Japan to date, and is credited with encouraging Toho to prioritize the continuation of the Godzilla series after seven years of dormancy. A heavily edited version was released by Universal International Inc. theatrically in the United States on June 26, 1963. The film was followed by Mothra vs. Godzilla, released April 29, 1964. Plot Mr. Tako, head of Pacific Pharmaceuticals, is frustrated with the television shows his company is sponsoring and wants something to boost his ratings. When a doctor tells Tako about a giant monster he discovered on the small Faro Island, Tako believes that it would be a brilliant idea to use the monster to gain publicity.
Tako immediately sends two men, Osamu Sakurai and Kinsaburo Furue, to find and bring back the monster. Meanwhile, the American nuclear submarine Seahawk gets caught in an iceberg. The iceberg collapses, unleashing Godzilla, who had been trapped within it since 1955. Godzilla then destroys the submarine and makes his way towards Japan, attacking a military base as he journeys southward. On Faro Island, a gigantic octopus crawls ashore and attacks the native village in search of Farolacton juice, taken from a species of red berry native to the island. The mysterious Faro monster, revealed to be King Kong, arrives and defeats the octopus. Kong then drinks several vases full of the juice while the islanders perform a ceremony, which both cause him to fall asleep. Sakurai and Furue place Kong on a large raft and begin to transport him back to Japan. Mr. Tako arrives on the ship transporting Kong, but a JSDF ship stops them and orders them to return Kong to Faro Island. Meanwhile, Godzilla arrives in Japan and begins terrorizing the countryside. Kong wakes up and breaks free from the raft. Reaching the mainland, Kong confronts Godzilla and proceeds to throw giant rocks at Godzilla. Godzilla is not fazed by King Kong's rock attack and uses his atomic heat ray to burn him. Kong retreats after realizing that he is not yet ready to take on Godzilla and his atomic heat ray. The JSDF digs a large pit laden with explosives and poison gas and lures Godzilla into it, but Godzilla is unharmed. They next string up a barrier of power lines around the city filled with 1,000,000 volts of electricity, which proves effective against Godzilla. Kong then approaches Tokyo and tears through the power lines, feeding off the electricity, which seems to make him stronger. Kong then enters Tokyo and captures Fumiko, Sakurai's sister, taking her to the National Diet Building which he then scales. 
The JSDF launches capsules full of vaporized Farolacton juice, which put Kong to sleep and allow Fumiko to be rescued. The JSDF then decides to transport Kong via balloons to Godzilla, in hopes that they will kill each other. The next morning, Kong is deployed by helicopter next to Godzilla at the summit of Mount Fuji and the two engage in a final battle. Godzilla initially has the advantage, dazing Kong with a devastating dropkick and repeated tail blows to his head. Godzilla then attempts to burn Kong to death by using his atomic breath to set fire to the foliage around Kong's body. Suddenly, a bolt of lightning from thunder clouds strikes Kong, reviving him and charging him up, and the battle resumes. Godzilla and King Kong fight their way down the mountain and into Atami, where the two monsters destroy Atami Castle while trading blows, before falling off a cliff together into Sagami Bay. After a brief underwater battle, only Kong resurfaces from the water, victorious, and he begins to swim back toward his home island. There is no sign of Godzilla, but the JSDF speculates that it is possible he survived.

Cast

Cast (Japanese and American versions) taken from Japan's Favorite Mon-star.

Production

Crew
Ishirō Honda – director
Eiji Tsuburaya – special effects director
Kōji Kajita – assistant director
Toshio Takashima – lighting
Takeo Kita – art director
Teruaki Abe – art director
Akira Watanabe – special effects art director
Kuichirō Kishida – special effects lighting
Masao Fujiyoshi – sound recording
Thomas Montgomery – director (American footage)
John Beck – producer (American version)
Paul Mason – writer (American version)
Bruce Howard – writer (American version)
Peter Zinner – editorial and music supervision (American version)
Personnel taken from Japan's Favorite Mon-star.

Conception

The film had its roots in an earlier concept for a new King Kong feature developed by Willis O'Brien, animator of the original stop-motion Kong.
Around 1958, O'Brien came up with a proposed treatment, King Kong Meets Frankenstein, in which Kong would fight a giant Frankenstein Monster in San Francisco. O'Brien took the project (which consisted of some concept art and a screenplay treatment) to RKO to secure permission to use the King Kong character. During this time, the story was renamed King Kong vs. the Ginko when it was believed that Universal had the rights to the Frankenstein name (it actually only had the rights to the monster's makeup design by Jack Pierce). O'Brien was introduced to producer John Beck, who promised to find a studio to make the film (at this point in time, RKO was no longer a production company). Beck took the story treatment and had George Worthing Yates flesh it out into a screenplay. The story was slightly altered and the title changed to King Kong vs. Prometheus, returning the name to the original Frankenstein concept (The Modern Prometheus was the alternate title of the original novel). The November 2, 1960 issue of Variety reported that Beck had even asked Jerry Juran to direct the film. However, the cost of stop-motion animation discouraged potential studios from putting the film into production. After shopping the script around overseas, Beck eventually attracted the interest of the Japanese studio Toho, which had long wanted to make a King Kong film. After purchasing the script, the studio decided to replace the giant Frankenstein Monster with Godzilla as King Kong's opponent and had Shinichi Sekizawa rewrite Yates' script. Toho thought that it would be the perfect way to celebrate its 30th year in production; it was one of five big banner releases for the company to celebrate the anniversary, alongside Sanjuro, 47 Samurai, Lonely Lane, and Born in Sin. John Beck's dealings with Willis O'Brien's project were done behind O'Brien's back, and O'Brien was never credited for his idea. Merian C.
Cooper was bitterly opposed to the project, stating in a letter addressed to his friend Douglas Burden, "I was indignant when some Japanese company made a belittling thing, to a creative mind, called King Kong vs. Godzilla. I believe they even stooped so low as to use a man in a gorilla suit, which I have spoken out against so often in the early days of King Kong". In 1963, he filed a lawsuit against John Beck, as well as Toho and Universal (the film's U.S. copyright holder), to enjoin distribution of the movie, claiming that he outright owned the King Kong character, but the lawsuit never went through, as it turned out he was not Kong's sole legal owner as he had previously believed.

Themes

Ishiro Honda wanted the theme of the movie to be a satire of the television industry in Japan. In April 1962, TV networks and their various sponsors started producing outrageous programming and publicity stunts to grab audiences' attention after two elderly viewers reportedly died at home while watching a violent wrestling match on TV. The ratings wars between the networks and the banal programming that followed this event caused widespread debate over how TV would affect Japanese culture, with Soichi Oya stating that TV was creating "a nation of 100 million idiots". Honda stated "People were making a big deal out of ratings, but my own view of TV shows was that they did not take the viewer seriously, that they took the audience for granted...so I decided to show that through my movie" and "the reason I showed the monster battle through the prism of a ratings war was to depict the reality of the times". Honda addressed this by having a pharmaceutical company sponsor a TV show and go to extremes in a publicity stunt for ratings by capturing a giant monster, stating "All a medicine company would have to do is just produce good medicines you know? But the company doesn't think that way.
They think they will get ahead of their competitors if they use a monster to promote their product." Honda worked with screenwriter Shinichi Sekizawa on developing the story, stating that "Back then Sekizawa was working on pop songs and TV shows so he really had a clear insight into television".

Filming

Special effects director Eiji Tsuburaya was planning on working on other projects at this point in time, such as a new version of a fairy tale film script called Kaguyahime (Princess Kaguya), but he postponed those to work on this project with Toho instead, since he was such a huge fan of King Kong. He stated in an early 1960s interview with the Mainichi Newspaper, "But my movie company has produced a very interesting script that combined King Kong and Godzilla, so I couldn't help working on this instead of my other fantasy films. The script is special to me; it makes me emotional because it was King Kong that got me interested in the world of special photographic techniques when I saw it in 1933." Early drafts of the script were sent back with notes from the studio asking that the monster antics be made as "funny as possible". This comical approach was embraced by Tsuburaya, who wanted to appeal to children's sensibilities and broaden the genre's audience. Much of the monster battle was filmed to contain a great deal of humor, but the approach was not favored by most of the effects crew, who "couldn't believe" some of the things Tsuburaya asked them to do, such as Kong and Godzilla volleying a giant boulder back and forth. With the exception of the next film, Mothra vs. Godzilla, this film began the trend of portraying Godzilla and the monsters with more and more anthropomorphism as the series progressed, to appeal more to younger children. Ishirō Honda was not a fan of this dumbing down of the monsters. Years later, Honda stated in an interview: "I don't think a monster should ever be a comical character."
"The public is more entertained when the great King Kong strikes fear into the hearts of the little characters." The decision was also taken to shoot the film in a (2.35:1) scope ratio (Tohoscope) and to film in color (Eastman Color), marking both monsters' first widescreen and color portrayals. Toho had planned to shoot this film on location in Sri Lanka, but had to forgo that (and scale back on production costs) because it ended up paying RKO roughly ¥80 million ($220,000) for the rights to the King Kong character. The bulk of the film was shot on the Japanese island of Izu Ōshima instead. The movie's production budget came out to (). Suit actors Shoichi Hirose (King Kong) and Haruo Nakajima (Godzilla) were given mostly free rein by Eiji Tsuburaya to choreograph their own moves. The men would rehearse for hours and would base their moves on those of professional wrestling (a sport that was growing in popularity in Japan), in particular the moves of Toyonobori. During pre-production, Eiji Tsuburaya had toyed with the idea of using Willis O'Brien's stop-motion technique instead of the suitmation process used in the first two Godzilla films, but budgetary concerns prevented him from using the process, and the more cost-efficient suitmation was used instead. However, some brief stop motion was used in a couple of quick sequences. One of these sequences was animated by Koichi Takano, who was a member of Eiji Tsuburaya's crew. A brand new Godzilla suit was designed for this film, and some slight alterations were made to its overall appearance. These alterations included the removal of its tiny ears, three toes on each foot rather than four, enlarged central dorsal fins, and a bulkier body. These new features gave Godzilla a more reptilian/dinosaurian appearance. Outside of the suit, a meter-high model and a small puppet were also built.
Another puppet (from the waist up) was also designed that had a nozzle in the mouth to spray out liquid mist, simulating Godzilla's atomic breath. However, the shots in the film where this prop was employed (far away shots of Godzilla breathing its atomic breath during its attack on the Arctic military base) were ultimately cut from the film. These cut scenes can be seen in the Japanese theatrical trailer. Finally, a separate prop of Godzilla's tail was also built for close-up practical shots when its tail would be used (such as the scene where Godzilla trips Kong with its tail). The tail prop would be swung offscreen by a stage hand. Sadamasa Arikawa (who worked with Eiji Tsuburaya) said that the sculptors had a hard time coming up with a King Kong suit that appeased Tsuburaya. The first suit was rejected for being too fat with long legs, giving Kong what the crew considered an almost cute look. A few other designs were done before Tsuburaya approved the final look that was ultimately used in the film. The suit's body design was a team effort by brothers Koei Yagi and Kanji Yagi and was covered with expensive yak hair, which Eizo Kaimai hand-dyed brown. Because RKO instructed that the face must be different from the original's design, sculptor Teizo Toshimitsu based Kong's face on the Japanese macaque rather than a gorilla, and designed two separate masks. In addition, two separate pairs of arms were created. One pair were extended arms, operated by poles inside the suit, to better give Kong a gorilla-like illusion, while the other pair were at normal arm's length and featured gloves that were used for scenes that required Kong to grab items and wrestle with Godzilla. Suit actor Hirose had to be sewn into the suit in order to hide the zipper. This forced him to be trapped inside the suit for long stretches and caused him much physical discomfort.
In the scene where Kong drinks the berry juice and falls asleep, he was trapped in the suit for three hours. Hirose stated in an interview, "Sweat came pouring out like a flood and it got into my eyes too. When I came out, I was pale all over". Besides the suit with the two separate arm attachments, a meter-high model and a puppet of Kong (used for closeups) were also built. As well, a huge prop of Kong's hand was built for the scene where he grabs Mie Hama (Fumiko) and carries her off. For the attack of the giant octopus, four live octopuses were used. They were forced to move among the miniature huts by having hot air blown onto them. After the filming of that scene was finished, three of the four octopuses were released. The fourth became special effects director Eiji Tsuburaya's dinner. These sequences were filmed on a miniature set outdoors on the Miura Coast. Along with the live animals, two rubber octopus props were built, with the larger one being covered with plastic wrap to simulate mucus. Some stop-motion tentacles were also created for the scene where the octopus grabs a native and tosses him. These sequences were shot indoors at Toho Studios. Since King Kong was seen as the bigger draw, and since Godzilla was still a villain at this point in the series, the decision was made to not only give King Kong top billing but also to present him as the winner of the climactic fight. While the ending of the film does look somewhat ambiguous, Toho confirmed that King Kong was indeed the winner in their 1962–63 English-language film program Toho Films Vol. 8, which states in the film's plot synopsis, "A spectacular duel is arranged on the summit of Mt. Fuji and King Kong is victorious. But after he has won..."

American version

When John Beck sold the King Kong vs. Prometheus script to Toho (which became King Kong vs. Godzilla), he was given exclusive rights to produce a version of the film for release in non-Asian territories.
He was able to line up a couple of potential distributors in Warner Bros. and Universal-International even before the film began production. Beck, accompanied by two Warner Bros. representatives, attended at least two private screenings of the film on the Toho Studios lot before it was released in Japan. John Beck enlisted the help of two Hollywood writers, Paul Mason and Bruce Howard, to write a new screenplay. After discussions with Beck, the two wrote the American version and worked with editor Peter Zinner to remove scenes, recut others, and change the sequence of several events. To give the film more of an American feel, Mason and Howard decided to insert new footage that would convey the impression that the film was actually a newscast. The television actor Michael Keith played newscaster Eric Carter, a United Nations reporter who spends much of the time commenting on the action from the U.N. Headquarters via an International Communications Satellite (ICS) broadcast. Harry Holcombe was cast as Dr. Arnold Johnson, the head of the Museum of Natural History in New York City, who tries to explain Godzilla's origin and his and Kong's motivations. The new footage, directed by Thomas Montgomery, was shot in three days. Beck and his crew were able to obtain library music from a host of older films (music tracks that had been composed by Henry Mancini, Hans J. Salter, and even a track from Heinz Roemheld). These films include Creature from the Black Lagoon, Bend of the River, Untamed Frontier, The Golden Horde, Frankenstein Meets the Wolf Man, Man Made Monster, Thunder on the Hill, While the City Sleeps, Against All Flags, The Monster That Challenged the World, The Deerslayer and music from the TV series Wichita Town. Cues from these scores were used to almost completely replace the original Japanese score by Akira Ifukube and give the film a more Western sound. They also obtained stock footage from the film The Mysterians from RKO (the film's U.S. 
copyright holder at the time), which was used not only to represent the ICS but also during the film's climax. Stock footage of a massive earthquake from The Mysterians was employed to make the earthquake caused by Kong and Godzilla's plummet into the ocean much more violent than the tame tremor seen in the Japanese version. This added footage features massive tidal waves, flooded valleys, and the ground splitting open and swallowing up various huts. Beck spent roughly $15,500 making his English version and sold the film to Universal-International for roughly $200,000 on April 29, 1963. The film opened in New York on June 26 of that year. Starting in 1963, Toho's international sales booklets began advertising an English dub of King Kong vs. Godzilla alongside Toho-commissioned, unedited international dubs of movies such as Giant Monster Varan and The Last War. By association, it is thought that this King Kong vs. Godzilla dub is an unedited English-language international version not known to have been released on home video.

Release

Theatrical

In Japan, the film was released on August 11, 1962, where it played alongside Myself and I for a two-week period; afterward, it was extended by one more week and screened alongside the anime film Touring the World. The film was re-released twice as part of the Toho Champion Festival, a film festival that ran from 1969 through 1978 and featured numerous films packaged together and aimed at children, first in 1970, and then again in 1977, to coincide with the Japanese release of the 1976 version of King Kong. After its theatrical re-releases, the film was screened two more times at specialty festivals. In 1979, to celebrate Godzilla's 25th anniversary, the film was reissued as part of a triple bill festival known as The Godzilla Movie Collection (Gojira Eiga Zenshu). It played alongside Invasion of Astro-Monster and Godzilla vs. Mechagodzilla.
This release is known among fans for its exciting and dynamic movie poster featuring all the main kaiju from these three films engaged in battle. Then in 1983, the film was screened as part of The Godzilla Resurrection Festival (Gojira no Fukkatsu). This large festival featured 10 Godzilla/kaiju films in all (Godzilla, King Kong vs. Godzilla, Mothra vs. Godzilla, Ghidorah, the Three-Headed Monster, Invasion of Astro-Monster, Godzilla vs. Mechagodzilla, Rodan, Mothra, Atragon, and King Kong Escapes). In North America, King Kong vs. Godzilla premiered in New York City on June 26, 1963. The film was also released in many international markets. In Germany, it was known as Die Rückkehr des King Kong ("The Return of King Kong") and in Italy as Il trionfo di King Kong ("The Triumph of King Kong"). In France, it was released in 1976.

Home media

The Japanese version of this film was released numerous times through the years by Toho on different home video formats. The film was first released on VHS in 1985 and again in 1991. It was released on LaserDisc in 1986 and 1991, and then again in 1992 in its truncated 74-minute form as part of a LaserDisc box set called the Godzilla Toho Champion Matsuri. Toho then released the film on DVD in 2001. They released it again in 2005 as part of the Godzilla Final Box DVD set, and again in 2010 as part of the Toho Tokusatsu DVD Collection. This release was volume #8 of the series and came packaged with a collectible magazine that featured stills, behind-the-scenes photos, interviews, and more. In the summer of 2014, the film was released for the first time on Blu-ray as part of the company's release of the entire series on the Blu-ray format for Godzilla's 60th anniversary. The 4K Ultra High Definition remaster of the film was released on Blu-ray in both a two-disc deluxe box set and a standard single-disc edition in May 2021.
The American version was released on VHS by GoodTimes Entertainment (which acquired the license to some of Universal's film catalogue) in 1987, and then on DVD to commemorate the 35th anniversary of the film's U.S. release in 1998. Both of these releases were full frame. Universal Pictures released the English-language version of the film on DVD in widescreen as part of a two-pack bundle with King Kong Escapes in 2005, and then on its own as an individual release on September 15, 2009. They then re-released the film on Blu-ray on April 1, 2014, along with King Kong Escapes. This release sold $749,747 worth of Blu-rays. FYE released an exclusive Limited Edition Steelbook version of this Blu-ray on September 10, 2019. In 2019, the Japanese and American versions were included in a Blu-ray box set released by The Criterion Collection, which included all 15 films from the franchise's Shōwa era.

Reception

Box office

In Japan, this film has the highest box office attendance figures of all of the Godzilla films to date. It sold 11.2 million tickets during its initial theatrical run, accumulating in distribution rental earnings. The film was the fourth highest-grossing film in Japan that year, behind The Great Wall (Shin no shikōtei), Sanjuro, and 47 Samurai, and was Toho's second biggest moneymaker. At an average 1962 Japanese ticket price, ticket sales were equivalent to estimated gross receipts of approximately (). Including re-releases, the film accumulated a lifetime figure of 12.6 million tickets sold in Japan, with distribution rental earnings of . The 1970 re-release sold 870,000 tickets, equivalent to estimated gross receipts of approximately (). The 1977 re-release sold 480,000 tickets, equivalent to estimated gross receipts of approximately (). This adds up to total estimated Japanese gross receipts of approximately (). In the United States, the film grossed $2.7 million, accumulating a profit (via rentals) of $1.25 million.
In France, where it was released in 1976, the film sold 554,695 tickets, equivalent to estimated gross receipts of approximately ($1,667,650). This adds up to total estimated gross receipts of approximately worldwide.

Preservation

The original Japanese version of King Kong vs. Godzilla is infamous for being one of the most poorly preserved tokusatsu films. In 1970, director Ishiro Honda prepared an edited version of the film for the Toho Champion Festival, a children's matinee program that showcased edited re-releases of older kaiju films along with cartoons and then-new kaiju films. Honda cut 24 minutes from the film's original negative and, as a result, the highest-quality source for the cut footage was lost. For years, all that was thought to remain of the uncut 1962 version was a faded, heavily damaged 16mm element from which rental prints had been made. 1980s restorations for home video integrated the 16mm deleted scenes into the 35mm Champion cut, resulting in wildly inconsistent picture quality. In 1991, Toho issued a restored LaserDisc incorporating the rediscovery of a reel of 35mm trims of the deleted footage from the original negative. The resulting quality was far superior to previous reconstructions, but not perfect; an abrupt cut caused by missing frames at the beginning or end of a trim is evident whenever the master switches between the Champion cut and a 35mm trim within the same shot. This LaserDisc master was utilized for Toho's 2001 DVD release with few changes. In 2014, Toho released a new restoration of the film on Blu-ray, which utilized the 35mm trims once again, but only those available for reels 2–7 of the film were able to be located. The remainder of video for the deleted portions was sourced from the earlier Blu-ray of the U.S. version, in addition to the previous 480i 1991 LaserDisc master.
On July 14, 2016, a 4K restoration of a completely 35mm-sourced version of the film aired on The Godzilla First Impact, a series of 4K broadcasts of Godzilla films on the Nihon Eiga Senmon Channel.

Legacy

Due to the great box office success of this film, Toho wanted to produce a sequel immediately. Shinichi Sekizawa was brought back to write the screenplay, tentatively called Continuation King Kong vs. Godzilla. Sekizawa revealed that Kong had killed Godzilla during their underwater battle in Sagami Bay with a line of dialogue stating "Godzilla, who sank and died in the waters off Atami". As the story progressed, Godzilla's body is salvaged from the ocean by a group of entrepreneurs who hope to display the remains at a planned resort. Meanwhile, King Kong is found in Africa, where he had been protecting a baby (the sole survivor of a plane crash). After the baby is rescued by investigators and taken back to Japan, Kong follows the group and rampages through the country looking for the infant. Godzilla is then revived in hopes of driving off Kong. The story ends with both monsters plummeting into a volcano. The project was ultimately cancelled. A couple of years later, Toho conceived the idea to pit Godzilla against a giant Frankenstein Monster and assigned Takeshi Kimura in 1964 to write a screenplay titled Frankenstein vs. Godzilla. However, Toho cancelled this project as well and instead decided to match Mothra against Godzilla in Mothra vs. Godzilla. This began a formula in which kaiju from past Toho films would be added to the Godzilla franchise. Toho was interested in producing a series around their version of King Kong, but was refused by RKO. However, Toho would handle the character once more in 1967, helping Rankin/Bass co-produce their film King Kong Escapes, which was loosely based on a cartoon series Rankin/Bass had produced. Henry G.
Saperstein was impressed with the giant octopus scene and requested a giant octopus to appear in Frankenstein Conquers the World and The War of the Gargantuas. The giant octopus appeared in an alternate ending for Frankenstein Conquers the World that was intended for overseas markets, but went unused. As a result, the octopus instead appeared in the opening of The War of the Gargantuas. The film's Godzilla suit was reused for certain scenes in Mothra vs. Godzilla. The film's Godzilla design also formed the basis for some early merchandise in the U.S. in the 1960s, such as a model kit by Aurora Plastics Corporation and a board game by Ideal Toys. This game was released alongside a King Kong game in 1963 to coincide with the U.S. theatrical release of the film. The film's King Kong suit was recycled and altered for the second episode of Ultra Q and the water scenes for King Kong Escapes. Scenes of the film's giant octopus attack were recycled for the 23rd episode of Ultra Q. In 1992, to coincide with the company's 60th anniversary, Toho expressed interest in remaking the film as Godzilla vs. King Kong. However, producer Tomoyuki Tanaka stated that obtaining the rights to King Kong proved difficult. Toho then considered producing Godzilla vs. Mechani-Kong, but effects director Koichi Kawakita confirmed that obtaining the likeness of King Kong also proved difficult. Mechani-Kong was replaced by Mechagodzilla, and the project was developed into Godzilla vs. Mechagodzilla II in 1993. During the production of Pirates of the Caribbean: Dead Man's Chest, animation director Hal Hickel instructed his team to watch King Kong vs. Godzilla, specifically the giant octopus scene, to use as a reference when animating the Kraken's tentacles. The film has been referenced in pop culture through various media. It was referenced in Da Lench Mob's 1992 single "Guerillas in tha Mist".
It was spoofed in advertising for Peru's Bembos burger chain, for Ridsect Lizard Repellant, and for the board game Connect 4. It was paid homage to in comic books by DC Comics, Bongo Comics, and Disney Comics. It was spoofed in The Simpsons episode "Wedding for Disaster". In 2015, Legendary Entertainment announced plans for a King Kong vs. Godzilla film of their own (unrelated to Toho's version), which was released on March 26, 2021.

Dual ending myth

For many years, a popular myth has persisted that in the Japanese version of this film, Godzilla emerges as the winner. The myth originated in the pages of Spacemen magazine, a 1960s sister magazine to the influential publication Famous Monsters of Filmland. In an article about the film, it is incorrectly stated that there were two endings and "If you see King Kong vs Godzilla in Japan, Hong Kong or some Oriental sector of the world, Godzilla wins!" The article was reprinted in various issues of Famous Monsters of Filmland in the years following, such as in issues #51 and #114. This misinformation would be accepted as fact and persist for decades. For example, a question in the "Genus III" edition of the popular board game Trivial Pursuit asked, "Who wins in the Japanese version of King Kong vs. Godzilla?" and stated that the correct answer was "Godzilla". Various media have repeated this falsehood, including the Los Angeles Times. With the rise of home video, Westerners have increasingly been able to view the original version and the myth has been dispelled. The only differences between the two endings of the film are minor: In the Japanese version, as Kong and Godzilla are fighting underwater, a very small earthquake occurs. In the American version, producer John Beck used stock footage of a violent earthquake from the 1957 Toho film The Mysterians to make the climactic earthquake seem far more violent and destructive. The dialogue is slightly different.
In the Japanese version, onlookers wonder whether Godzilla might be dead as they watch Kong swim home, and speculate that it is possible he survived. In the American version, onlookers simply say, "Godzilla has disappeared without a trace" and newly shot scenes of reporter Eric Carter have him watching Kong swim home on a view screen and wishing him luck on his long journey home. As the film ends and the screen fades to black, "Owari" ("The End") appears onscreen. Godzilla's roar, followed by Kong's, is on the Japanese soundtrack. This was akin to the monsters taking a bow or saying goodbye to the audience, as at this point the film is over. In the American version, only Kong's roar is present on the soundtrack. In 1993, comic book artist Arthur Adams wrote and drew a one-page story that appeared in the anthology Urban Legends #1, published by Dark Horse Comics, which dispels the popular misconception about the two versions of King Kong vs. Godzilla.

See also
List of American films of 1963
Godzilla vs. Kong

External links
Official Godzilla website by Toho
キングコング対ゴジラ (https://web.archive.org/web/20201104190952/https://www.toho.website/movies/3/index.html) – Toho's official page for the film
King Kong vs. Godzilla at Movie-Censorship – detailed comparison between the Japanese and American versions of the film
Archer, Eugene. "King Kong vs. Godzilla" (film review). The New York Times, June 27, 1963.
Ebirah, Horror of the Deep
Ebirah, Horror of the Deep is a 1966 Japanese kaiju film directed by Jun Fukuda and produced and distributed by Toho Co., Ltd. The film stars Akira Takarada, Kumi Mizuno, Akihiko Hirata and Eisei Amamoto, and features the fictional monster characters Godzilla, Mothra, and Ebirah. It is the seventh film in the Godzilla franchise, and features special effects by Sadamasa Arikawa, under the supervision of Eiji Tsuburaya. In the film, Godzilla and Ebirah are portrayed by Haruo Nakajima and Hiroshi Sekita, respectively. During its development, Ebirah, Horror of the Deep was intended to feature King Kong, but the character was replaced by Godzilla. The film was released to theaters in Japan on December 17, 1966, and was released directly to television in the United States in 1968 under the title Godzilla versus the Sea Monster.

Plot

After Yata is lost at sea, his brother Ryota steals a yacht with his two friends and a bank robber. However, the crew runs afoul of Ebirah, a giant lobster-like creature, and washes ashore on Letchi Island. There, the Red Bamboo, a terrorist organization, manufactures heavy water for its weapons of mass destruction, along with a yellow liquid that keeps Ebirah at bay, presumably controlling him. The Red Bamboo has enslaved natives from nearby Infant Island to create the yellow liquid, while the natives hope that Mothra will awaken in her winged, adult form and rescue them. In their efforts to avoid capture, Ryota and his friends, aided by Daiyo, a native girl, come across Godzilla, who previously fought Ghidorah and is now sleeping within a cliffside cavern. The group devises a plan to defeat the Red Bamboo and escape the island. In the process, they awaken Godzilla using a makeshift lightning rod. Godzilla fights Ebirah, but the huge crustacean escapes. Godzilla is then attacked by a giant condor and a squadron of Red Bamboo fighter jets. Using his atomic ray, Godzilla destroys the jets and kills the giant bird.
The humans retrieve the missing Yata and free the enslaved natives as Godzilla begins to destroy the Red Bamboo's base of operations, smashing a tower and triggering a countdown to a nuclear explosion that will destroy the island. Godzilla fights Ebirah and defeats him, ripping off his claws and forcing him to retreat into the sea. The natives wait for Mothra to carry them off in a large net. However, when she reaches the island, Mothra is challenged by Godzilla due to a previous confrontation. Mothra manages to repel Godzilla and save her people and the human heroes. Godzilla also escapes just before the bomb detonates and destroys the island. Cast Production Development The film was originally written to feature King Kong rather than Godzilla. The film's working title was Operation Robinson Crusoe: King Kong vs. Ebirah, and the project was rejected by Rankin/Bass Productions before being accepted by Toho, after which King Kong's role in the film was replaced by Godzilla. Even though Eiji Tsuburaya was given directorial credit for the special effects, Sadamasa Arikawa actually directed the special effects under the supervision of Tsuburaya, who had his own company, Tsuburaya Productions, at the time. Toho had decided to set the film on an island to cut back on special effects costs. Arikawa has cited the film as a frustrating experience, stating, "There were major limitations on the budget from the studio. Toho couldn't have made too many demands about the budget if Mr. Tsuburaya had been in charge. The studio knew I was also doing TV work then, so they must have figured I could produce the movie cheaply." Special effects The underwater sequences were filmed on an indoor soundstage where the Godzilla and Ebirah suits were filmed through the glass of a water-filled aquarium, with some scenes of the Godzilla suit shot separately underwater as well.
Haruo Nakajima (the suit performer for Godzilla) wore a wet suit under the Godzilla suit for every scene that required him to be in the water; the water scenes took a week to complete. Nakajima stated, "I worked overtime until about eight o'clock every day. Even though I wore a wet suit under the costume, I got cold. But I never got sick, because I was so tense during the filming." Filming This is the first of two Godzilla films in which a Pacific island is the primary setting, rather than a location inside Japan. The second is Son of Godzilla (1967). Release Ebirah, Horror of the Deep was released theatrically in Japan on December 17, 1966, where it was distributed by Toho. The American version of the film was released directly to television by Continental Distributing in 1968 under the title Godzilla versus the Sea Monster. The film may have received theatrical distribution in the United States as a Walter Reade, Jr. Presentation, but this has not been confirmed. Home media The film was released on DVD on February 8, 2005 by Sony Pictures Home Entertainment. The film was released on Blu-ray on May 6, 2014 by Kraken Releasing. In 2019, the Japanese version was included in a Blu-ray box set released by the Criterion Collection, which included all 15 films from the franchise's Shōwa era. References Sources External links Godzilla on the web (Japan) 1960s children's fantasy films 1960s fantasy films 1960s monster movies 1960s science fiction films 1966 films Films about bank robbery Films about terrorism in Asia Films directed by Jun Fukuda Films dubbed by Frontier Enterprises Films produced by Tomoyuki Tanaka Films scored by Masaru Sato Films set in Kanagawa Prefecture Films set in Tokyo Films set on fictional islands Films with screenplays by Shinichi Sekizawa Godzilla films Giant monster films Japanese films Japanese sequel films Japanese-language films Kaiju films Mad scientist films Mothra Toho films
https://en.wikipedia.org/wiki/Son%20of%20Godzilla
Son of Godzilla
is a 1967 Japanese kaiju film directed by Jun Fukuda, with special effects by Sadamasa Arikawa, under the supervision of Eiji Tsuburaya. Produced and distributed by Toho Co., Ltd, it is the eighth film in the Godzilla franchise. It stars Tadao Takashima, Akira Kubo, Akihiko Hirata, and Beverly Maeda, with Hiroshi Sekita, Seiji Onaka, and Haruo Nakajima as Godzilla, and Marchan the Dwarf as Minilla. Son of Godzilla received a theatrical release in Japan on December 16, 1967, and was released directly to television in the United States in 1969 through the Walter Reade Organization. Plot A team of scientists are trying to perfect a weather-controlling system. Their efforts are hampered by the arrival of a nosy reporter and by the sudden presence of giant praying mantises. The first test of the weather control system goes awry when the remote control for a radioactive balloon is jammed by an unexplained signal coming from the center of the island. The balloon detonates prematurely, creating a radioactive storm that causes the giant mantises to grow to enormous sizes. Investigating the mantises, which are named Kamacuras (Gimantis in the English-dubbed version), the scientists find the monstrous insects digging an egg out from under a pile of earth. The egg hatches, revealing a baby Godzilla. The scientists realize that the baby's telepathic cries for help were the cause of the interference that ruined their experiment. Shortly afterwards, Godzilla arrives on the island in response to the infant's cries, demolishing the scientists' base while rushing to defend the baby. Godzilla kills two of the Kamacuras during the battle, while one manages to fly away to safety. Godzilla then adopts the baby. The baby Godzilla, named Minilla, quickly grows to about half the size of the adult Godzilla, and Godzilla instructs it on the important monster skills of roaring and using its atomic ray.
At first, Minilla has difficulty producing anything more than atomic smoke rings, but Godzilla discovers that stressful conditions (e.g., stomping on his tail) or motivation produces a true radioactive blast. Minilla comes to the aid of Saeko when she is attacked by a Kamacuras, but inadvertently awakens Kumonga (Spiga in the English-dubbed version), a giant spider that was sleeping in a valley. Kumonga attacks the caves where the scientists are hiding and Minilla stumbles into the fray. Kumonga traps Minilla and the final Kamacuras with its webbing, but as Kumonga begins to feed on the deceased Kamacuras, Godzilla arrives. Godzilla saves Minilla and they work together to defeat Kumonga by using their atomic rays on the giant spider. Hoping to keep the monsters from interfering in their attempt to escape the island, the scientists finally use their perfected weather-altering device on the island, and the once-tropical island becomes buried in snow and ice. As the scientists are saved by an American submarine, Godzilla and Minilla begin to hibernate as they wait for the island to become tropical again. Cast Production For the second Godzilla film in a row, Toho produced an island-themed adventure with a smaller budget than most of their monster films from this time period. While the A-list crew of talent (Ishirō Honda, Eiji Tsuburaya, and Akira Ifukube) was hired to work on that year's King Kong Escapes, the second-string crew of cheaper talent was once again tapped to work on this project, as they had done with Ebirah, Horror of the Deep. This included Jun Fukuda (director), Sadamasa Arikawa (special effects), and Masaru Sato (composer). This was the first film where Arikawa was officially listed as the director of special effects, although he did receive some supervision from Tsuburaya when he was available.
Toho wanted to create a baby Godzilla to appeal to the "date crowd" (films aimed at young couples, a genre very popular during this time period), with the idea that girls would like a "cute" baby monster. For the idea behind Minilla, Fukuda stated, "We wanted to take a new approach, so we gave Godzilla a child. We thought it would be a little strange if we gave Godzilla a daughter, so instead we gave him a son". Fukuda also wanted to portray the monsters almost as people in regards to the father-son relationship between Godzilla and Minilla, as Fukuda stated, "We focused on the relationship between Godzilla and his son throughout the course of Son of Godzilla." Minilla was designed to incorporate features of not only a baby Godzilla but a human baby as well. "Marchan the Dwarf" was hired to play the character due to his ability to play-act and to give the character a childlike ambiance. He was also hired because of his ability to perform athletic rolls and flips inside the thick rubber suit. The Godzilla suit built for this film was the biggest yet in size and girth. This was done in order to give Godzilla a "maternal" appearance and a parent-like stature in contrast to Minilla. Because of the size of the suit, seasoned Godzilla suit actor Haruo Nakajima was only hired to play Godzilla in two scenes, because the suit was much too big for him to wear. The smaller suit he had worn for the films Ebirah, Horror of the Deep and Invasion of Astro-Monster was used for these sequences. The much larger Seiji Onaka instead played Godzilla in the film, although he was replaced midway through filming by Hiroshi Sekita after he broke his fingers. Outside of the two monster suits, various marionettes and puppets were used to portray the island's gigantic inhabitants: the giant praying mantises known as Kamacuras and the huge spider Kumonga. Arikawa would usually have 20 puppeteers at a time working on the various marionettes.
The massive Kumonga puppet needed two to three people at a time to operate each leg. Filming took place in Guam and areas in Japan including Gotemba, Lake Yamanaka, the Fuji Five Lakes region, and Oshima. A sequence that shows Godzilla leaving Minilla behind on the freezing Sollgel Island and making it to shore before turning back was cut from the final film's ending. A portion of this sequence has been preserved in both the trailer and an outtake reel included with the Godzilla Final Box DVD collection as supplemental material. Release Theatrical Son of Godzilla was distributed theatrically in Japan by Toho on December 16, 1967. The film was released theatrically in the United Kingdom in August 1969, as a double feature with Ebirah, Horror of the Deep. Son of Godzilla was never released theatrically in the United States, instead being released directly to television by Walter Reade Sterling as well as American International Pictures (AIP-TV) in some markets in 1969. The American television version was cut to 84 minutes. Home media In 2005, the film was released on DVD by Sony Pictures in its original uncut length with the original Japanese audio and Toho's international English dub. In 2019, the Japanese version and export English version were included in a Blu-ray box set released by the Criterion Collection, which included all 15 films from the franchise's Shōwa era. Reception In a contemporary review, the Monthly Film Bulletin declared the film to be "out of the top drawer of the Toho Company's monster file, with the special effects department achieving their best results in monster locomotion" and that the film "has the advantage of a more soundly constructed story than most of its predecessors and a delightful vein of humor that allows for a gentle parody of the genre." According to the Polish writer Aleksandra Ziółkowska-Boehm, the film appealed to Polish journalist Melchior Wańkowicz: "On August 9, Tomuś's birthday, we all went to see Son of Godzilla.
I was afraid [Melchior] would be irritated by this film's type. I was again surprised, I watched with what interest he looked at the picture. Later he said that he had never seen this genre, but he was delighted with the technique of realization." See also List of Japanese films of 1967 List of science fiction films of the 1960s References Footnotes Bibliography External links Godzilla on the web (Japan) 1967 films 1960s monster movies 1960s science fiction films 1960s children's fantasy films Films scored by Masaru Sato Films about insects Films about spiders Films directed by Jun Fukuda Films dubbed by Frontier Enterprises Films produced by Tomoyuki Tanaka Films set on fictional islands Films shot in Guam Films shot in Japan Giant monster films Godzilla films Japanese films Japanese-language films Japanese children's films Japanese fantasy films Japanese sequel films Kaiju films Toho films Films with screenplays by Shinichi Sekizawa Father and son films
https://en.wikipedia.org/wiki/Destroy%20All%20Monsters
Destroy All Monsters
is a 1968 Japanese kaiju film directed by Ishirō Honda, with special effects by Eiji Tsuburaya. The film, which was produced and distributed by Toho Co., Ltd, is the ninth film in the Godzilla franchise, and features eleven monster characters, including Godzilla, Mothra, Rodan, King Ghidorah, Anguirus, and Minilla. The film stars Akira Kubo, Jun Tazaki, Yukiko Kobayashi and Yoshio Tsuchiya. In the film, humans have achieved world peace by the year 1999, and various giant monsters are confined to an area known as Monsterland. The monsters are freed from the area and are mind-controlled by aliens known as Kilaaks, who send them to attack major cities. When the monsters are freed from the Kilaaks' influence, the aliens send King Ghidorah to challenge the other monsters. Destroy All Monsters was released theatrically in Japan on August 1, 1968. The film was released by American International Pictures with an English-language dub in the United States on May 23, 1969. Contemporary American reviews were mixed, with praise mainly held for the climactic monster battle. Retrospectively, the film has received more praise, and is considered a favorite among Godzilla fans for its "audacious and simple story", "innovative action sequences", and a "memorably booming" score by Akira Ifukube. Plot At the close of the 20th century (1999 in the dub), all of the Earth's kaiju have been collected by the United Nations Science Committee and confined in an area known as Monsterland, located in the Ogasawara island chain. A special control center is constructed underneath the island to ensure that the monsters stay secure and to serve as a research facility to study them. When communications with Monsterland are suddenly and mysteriously severed, and all of the monsters begin attacking world capitals, Dr. Yoshida of the UNSC orders Captain Yamabe and the crew of his spaceship, Moonlight SY-3, to investigate Ogasawara. There, they discover that the scientists, led by Dr.
Otani, have become mind-controlled slaves of a feminine alien race identifying themselves as the Kilaaks, who reveal that they are in control of the monsters. Their leader demands that the human race surrender, or face total annihilation. Godzilla attacks New York City, Rodan invades Moscow, Mothra lays waste to Beijing, Gorosaurus destroys Paris (although Baragon was credited for its destruction), and Manda attacks London. These attacks were set in motion to draw attention away from Japan, so that the aliens could establish an underground stronghold near Mount Fuji in Japan. The Kilaaks then turn their next major attack to Tokyo and, facing no serious opposition, grow arrogant in their aims until the UNSC discovers, after recovering the Kilaaks' monster mind-control devices from around the world, that they have switched to broadcasting the control signals from their base under the Moon's surface. In a desperate battle, the crew of the SY-3 destroys the Kilaaks' lunar outpost and returns the alien control system to Earth. With all of the monsters under the control of the UNSC, the Kilaaks call King Ghidorah. The three-headed space dragon is dispatched to protect the alien stronghold at Mount Fuji, and battles Godzilla, Minilla, Mothra, Rodan, Gorosaurus, Anguirus, and Kumonga. While seemingly invincible, King Ghidorah is eventually overpowered by the combined strength of the Earth monsters and is killed. Refusing to admit defeat, the Kilaaks produce their ace, a burning monster they call the Fire Dragon, which begins to torch Tokyo and destroys the control center on Ogasawara. Suddenly, Godzilla attacks and destroys the Kilaaks' underground base, revealing that the Earth's monsters instinctively know who their enemies are. Captain Yamabe then pursues the Fire Dragon in the SY-3 and narrowly achieves victory for the human race. The Fire Dragon is revealed to be a flaming Kilaak saucer and is destroyed.
With the Kilaaks defeated, Godzilla and the other monsters eventually return to Monsterland to live in peace. Cast Production Amid the waning popularity of the Godzilla series, special effects director Sadamasa Arikawa noted that Toho was considering ending the series, as "Producer Tanaka figured that all the ideas had just run out." The film was written by Takeshi Kimura and Ishirō Honda, making it the first Godzilla film since Godzilla Raids Again not written by Shinichi Sekizawa. Takeshi Kimura is credited under the pen name Kaoru Mabuchi in the film's credits. Kimura and Honda's script developed the concept of Monsterland (referred to as Monster Island in future films). The earliest screenplay was titled Kaiju Chushingura (the word chushingura refers to a famous historical story in Japan about the rebellion of 47 samurai who took revenge after their master was unjustly forced to commit suicide). Written in 1967 by Kimura, this version of the film was to include “all of the monsters”, according to Ishiro Honda in an interview. The story called for Godzilla, Minilla, Anguirus, Rodan, Mothra, Gorosaurus, Manda, Baragon, Kumonga, Varan, Magma, Kamacuras, Gaira, Sanda, and King Kong to appear in the film. When it was decided to adapt Two Godzillas!: Japan SOS (an earlier version of Son of Godzilla) instead, the script was shelved for the next year; by the time it was revived, the rights to Kong had expired. Ishiro Honda also wanted to show lunar colonies and brand-new hybrid monsters, the results of interbreeding and genetic splicing, and to delve more deeply into undersea farming to feed the monsters, but budget constraints prevented him from showing all of this. In later scripts, the number of monsters was cut as well. Because several of the monsters continuously return throughout the series, the location was developed to be a faraway island where the monsters are pacified.
This tied films not otherwise related to the Godzilla series into its universe, as creatures such as Manda (from Atragon) and Varan (from Varan the Unbelievable) exist. The film features footage from Ghidorah, the Three-Headed Monster (1964), specifically King Ghidorah's fiery birth scene. New monster suits for Godzilla and Anguirus were constructed for the film, while the Rodan, Kumonga, Minilla, Gorosaurus, Manda, Baragon, Mothra, and King Ghidorah suits were modified from previous films, with King Ghidorah having less detail than he had in previous films. Release Destroy All Monsters was released in Japan on 1 August 1968, where it was distributed by Toho. It was released on a double bill with a reissue of the film Atragon. The film was reissued theatrically in Japan in 1972, where it was re-edited by Honda to a 74-minute running time and released with the title Gojira: Dengeki Taisakusen (Godzilla: Lightning Fast Strategy). Destroy All Monsters continued the decline in ticket sales in Japan for the Godzilla series, selling 2.6 million tickets. In comparison, Invasion of Astro-Monster brought in 3.8 million and Son of Godzilla collected 2.5 million. The film was released in the United States by American International Pictures with an English-language dub on 23 May 1969. The film premiered in the United States in Cincinnati. American International Pictures hired Titra Studios to dub the film into English. The American version of the film remains relatively close to the Japanese original. The more notable removed elements include Akira Ifukube's title theme and a brief shot of Minilla shielding his eyes and ducking when King Ghidorah drops Anguirus from the sky. Destroy All Monsters was shown on American television until the early 1980s. It resurfaced on cable broadcast on the Sci-Fi Channel in 1996. Home media Destroy All Monsters was released on VHS by ADV Films in 1998, in a release that featured English-dubbed dialogue from Toho's own international version of the film.
In 2011, Tokyo Shock released the film on DVD and Blu-ray, and in 2014 the company re-released it on DVD and Blu-ray. In 2019, the Japanese version and export English version were included in a Blu-ray box set released by the Criterion Collection, which included all 15 films from the franchise's Shōwa era. Critical reception From contemporary reviews, both Variety and the Monthly Film Bulletin noted the film's best scenes involved the monsters together, while criticising the filmmaking. Variety reviewed the English-dubbed version of the film, stating that it may appeal to "Sci-fi addicts and monster fans" while stating that the "plot is on comic strip level, special effects depend on obvious miniatures and acting (human) is from school of Flash Gordon" and that the film's strength relied on its "monster rally". The Monthly Film Bulletin opined that "the model work is poor, and as usual the script is junior comic-strip". Both reviews mentioned the monsters' final scene, with Variety commenting that it was "clever" and the Monthly Film Bulletin stating that "apart from [the monsters] statutory devastation of world capitals [...] the monsters have disappointingly little to do until they get together in the last reel for a splendid battle". The Monthly Film Bulletin commented that the film was "almost worth sitting through the banalities for the final confrontation on Mount Fuji", noting the son of Godzilla "endearingly applauding from a safe distance" and "the victorious monsters performing a celebratory jig". From retrospective reviews, Steve Biodrowski of Cinefantastique commented that the film "is too slim in its storyline, too thin in its characterizations, to be considered a truly great film [...] But for the ten-year-old living inside us all, it is entertainment of the most awesome sort."
Matt Paprocki of Blogcritics said the film is "far from perfect" and "can be downright boring at times" but felt that "the destruction scenes make up for everything else" and "the final battle is an epic that simply can't be matched". The film is considered a cult favorite among fans of the Godzilla franchise. In Steve Ryfle and Ed Godziszewski's 2017 book covering Ishiro Honda's filmography, they expressed that Destroy All Monsters is now seen as the "last truly spirited entry" in Toho's initial series of kaiju films, due to "its audacious and simple story, a bounty of monsters and destruction, and a memorably booming soundtrack from Akira Ifukube". Godzilla director Gareth Edwards previously expressed interest in making a sequel to his 2014 movie inspired by Destroy All Monsters. See also List of Japanese films of 1968 List of science fiction films of the 1960s References Notes Bibliography External links 1968 films 1960s science fiction films 1960s fantasy films 1960s monster movies ADV Films Alien invasions in films American International Pictures films Crossover tokusatsu films Films scored by Akira Ifukube Films about dragons Films about extraterrestrial life Films about spiders Films about the United Nations Films directed by Ishirō Honda Films dubbed by Frontier Enterprises Films produced by Tomoyuki Tanaka Films set in 1999 Films set in Moscow Films set in New York City Films set in Paris Films set in the future Films set in Tokyo Films set in Yamanashi Prefecture Godzilla films Japanese films Japanese-language films Japanese science fiction films Japanese sequel films Kaiju films Moon in film Mothra Pterosaurs in fiction Toho films
https://en.wikipedia.org/wiki/Godzilla%20vs.%20Megalon
Godzilla vs. Megalon
is a 1973 Japanese kaiju film directed by Jun Fukuda, written by Fukuda and Shinichi Sekizawa, and produced by Tomoyuki Tanaka, with special effects by Teruyoshi Nakano. Distributed by Toho and produced under their effects subsidiary Toho–Eizo, it is the 13th film in the Godzilla franchise, and features the fictional monster characters Godzilla, Megalon, and Gigan, along with the mecha character Jet Jaguar. The film stars Katsuhiko Sasaki, Hiroyuki Kawase, Yutaka Hayashi, and Robert Dunham, alongside Shinji Takagi as Godzilla, Hideto Date as Megalon, Kenpachiro Satsuma as Gigan, and Tsugutoshi Komada as Jet Jaguar. Godzilla vs. Megalon was released theatrically in Japan on March 17, 1973. It received a theatrical release in the United States in the summer of 1976 by Cinema Shares. Due to this release and subsequent home media, the film has become one of the most well-known kaiju films in the United States. The film's popularity might also be a major contributor to western perceptions of kaiju films as comedic or campy. The film received revived recognition after an appearance on Mystery Science Theater 3000 in 1991. The film was followed by Godzilla vs. Mechagodzilla, released on March 21, 1974. Plot In the first part of 1971 (197X in the Japanese version), the most recent underground nuclear test, set off near the Aleutians, sends shockwaves as far across the globe as Monster Island in the South Pacific, severely damaging the island paradise and sending Rodan and Anguirus plummeting into the depths of the Earth, with Godzilla narrowly escaping the fissure which his friends tumbled into. For years, Seatopia, an opulent undersea civilisation that resides in vast cities reminiscent of those of Ancient Greece and Rome, has existed in relative peace, ruled by Emperor Antonio, but the earthquakes produced by nuclear tests in recent years have severely damaged its cities.
With the Seatopian capital badly affected by the most recent test, the Seatopians plan to unleash their civilization's beetle-styled god, Megalon, to destroy the surface world out of vengeance. On the surface, an inventor named Goro Ibuki, his little brother Rokuro, and Goro's friend Hiroshi Jinkawa are off on an outing near a lake when Seatopia makes itself known to the Earth by draining the lake the trio is relaxing beside and using it as a base of operations. As they return home they are ambushed by agents of Seatopia who are trying to steal Jet Jaguar, a humanoid robot under construction by the trio of inventors. However, the agents' first attempt is botched and they are forced to flee to safety. Some time later, Jet Jaguar is completed, but the trio of inventors are knocked unconscious by the returning Seatopian agents. The agents' plan is to use Jet Jaguar to guide and direct Megalon to destroy whatever city Seatopia commands. Goro and Rokuro are sent to be killed, while Hiroshi is taken hostage. Megalon is finally released to the surface while Jet Jaguar is put under the control of the Seatopians and is used to guide Megalon to attack Tokyo, with the Japan Self-Defense Forces failing to defeat the monster. Eventually, the trio of heroes manage to escape from the Seatopians and reunite to devise a plan to send Jet Jaguar to get Godzilla's help using Jet Jaguar's secondary control system. After uniting with Japan's Defense Force, Goro manages to regain control of Jet Jaguar and sends the robot to Monster Island to bring Godzilla to fight Megalon. Without a guide to control its actions, Megalon flails about aimlessly, fighting with the Defense Force and destroying the outskirts of Tokyo. The Seatopians learn of Jet Jaguar's turn and thus send out a distress call to their allies, the Space Hunter Nebula M aliens (from the previous film), to send the alien monster Gigan to assist them.
As Godzilla journeys to fight Megalon, Jet Jaguar starts acting on its own and ignoring commands, to the surprise of its inventors, and grows to gigantic proportions to face Megalon itself until Godzilla arrives. The battle between robot and monster is roughly at a standstill until Gigan arrives and Megalon and Gigan double-team Jet Jaguar. Godzilla finally arrives to assist Jet Jaguar and the odds become even. After a long and brutal fight, Gigan and Megalon both retreat, and Godzilla and Jet Jaguar shake hands on a job well done. Jet Jaguar bids Godzilla farewell and Godzilla returns to his home on Monster Island. Jet Jaguar turns back to its human size and returns home with Goro and Rokuro. Cast Production Development The origins of Megalon can be traced back to 1969's All Monsters Attack, as the film's antagonist Gabara was initially envisioned as a giant mole cricket called Gebara. The character was later reworked into Kaoru Mabuchi's 1971 treatment for Godzilla vs. the Space Monsters: Earth Defense Directive, a precursor to 1972's Godzilla vs. Gigan. The proposal called for Megalon to be paired with Gigan and King Ghidorah under the command of the hostile alien invader Miko, only to be defeated and driven off by the combined might of Godzilla, Anguirus, and a brand-new monster called Majin Tuol. The next draft of the script, titled The Return of King Ghidorah!, retained the core villain cast of Gigan, King Ghidorah, and Megalon, but replaced Anguirus and Majin Tuol with Varan and Rodan. However, most of the proposed monsters were cut, leading to the final version of Godzilla vs. Gigan. Contrary to popular belief, there is no evidence Godzilla vs. Megalon was originally planned as a Jet Jaguar solo film, and no Japanese sources have surfaced which claim otherwise. Rather, the creation of Jet Jaguar was the result of a contest Toho held for children in mid-to-late 1972.
The winner of the contest was an elementary school student, who submitted the drawing of a robot called Red Arone. Red Arone was turned into a monster suit, but when the child was shown the suit, he became upset because the suit did not resemble his original design. The boy's original design was white, but the costume was colored red, blue and yellow. Red Arone was used for publicity, but Toho renamed the character Jet Jaguar and had special effects director Teruyoshi Nakano redesign the character, keeping only the colors from the Red Arone suit. The Red Arone suit had a different head and wings. According to Teruyoshi Nakano, Godzilla vs. Megalon was a replacement project for another film that was cancelled at the last minute, and evidence suggests this cancelled film was Godzilla vs. Red Moon, slated for 1973. As a result, the project was postponed during pre-production. Screenwriter Shinichi Sekizawa had no time to write out a full script, and instead thought out a general story. Director Jun Fukuda ultimately ended up writing the screenplay. To make up for lost production time, the film was shot in a hasty three weeks. The production time totaled nearly six months from planning to finish. The film had three early treatments, each written by Shinichi Sekizawa. The first, titled Godzilla vs. The Megalon Brothers: The Undersea Kingdom's Annihilation Strategy, was completed in September 1972. The second, titled Insect Monster Megalon vs. Godzilla: Undersea Kingdom's Annihilation Strategy, was turned in on September 5, 1972, and the third draft was submitted on September 7, 1972. Creature design According to Teruyoshi Nakano, the Godzilla suit used in this film (nicknamed "MegaroGoji", メガロゴジ) was made in a week, making it the fastest-built Godzilla suit to date.
They did not have time to make the eyes work correctly, something they had more time to fix for Godzilla's five appearances on Toho's superhero TV series Zone Fighter (1973), which was produced around the same time. The Megalon suit was one of the heaviest suits produced since the 1954 Godzilla suit, which made it difficult to raise via wires in certain scenes, to the point where Nakano almost decided to scrap those scenes altogether. Since the film was shot in the winter, Katsuhiko Sasaki stated that director Jun Fukuda gave him and Yutaka Hayashi a shot of whiskey to warm them up. The Gigan suit is similar to the previous design, but thinner and less bulky; the horn on the head was less pointed, and the buzzsaw did not move, since it was made of static pieces. This suit also has different-sized back fins, a more circular visor, scales running up the back and sides of the neck, and longer legs compared to the original version. Teruyoshi Nakano recalled how rushed the film was and that it took three weeks to shoot, stating, "It went into production without enough preparation. There was no time to ask Mr. Sekizawa to write the script, so Mr. Sekizawa kind of thought up the general story and director Fukuda wrote the screenplay. The screenplay was completed right before crank-in."

Filming

Like previous Godzilla films, Godzilla vs. Megalon heavily employs stock footage from previous films such as Mothra vs. Godzilla (1964), The War of the Gargantuas (1966), Ebirah, Horror of the Deep (1966), Destroy All Monsters (1968), Godzilla vs. Hedorah (1971), and Godzilla vs. Gigan (1972).

English versions

In 1976, Cinema Shares gave Godzilla vs. Megalon a wide theatrical release in the United States and launched a massive marketing campaign for the film; along with the poster, buttons bearing the faces of the four monsters were released.
Given away at theatrical showings was a comic that told a simplified version of the film, which incorrectly named Jet Jaguar as "Robotman" and Gigan as "Borodan". These incorrect names were also featured in the U.S. trailer. Initially, Cinema Shares screened Toho's international English version, but to ensure a G rating, several cuts were made, leaving the film running three minutes shorter than the original version. Godzilla vs. Megalon is the first Godzilla film to receive an American prime-time network television premiere, broadcast nationwide at 9:00 PM on NBC on March 15, 1977. However, to accommodate commercials in a one-hour time slot, the film was cut down to 48 minutes. John Belushi hosted the broadcast and performed skits, all in a Godzilla suit. Mel Maron, president of Cinema Shares at the time, chose to release Godzilla vs. Megalon because he saw Godzilla as a heroic figure by that point and felt the timing was right to show children a hero who was a friendly monster and not Superman. The U.S. rights to the film eventually fell into the public domain in the late 1980s, which resulted in companies releasing poorly-cropped, fullscreen VHS tapes mastered from pan-and-scan sources. This also led to the film being featured on Mystery Science Theater 3000. In 1988, New World Video intended to release the original uncut version of the English dub, but declined the project due to the budget a full release would have required. The film was nonetheless released uncut and in widescreen in 1992 by the UK company PolyGram Ltd as a double feature with Godzilla vs. Gigan, and again in 1998 by another UK company, 4 Front Video. Those appear to be the only two unedited, high-quality VHS releases of the film. It was also released on DVD by Power Multimedia in 1999 in Taiwan.
The Sci-Fi Channel (now Syfy) originally showed the cut version until 2002, when Toho regained ownership of the title alongside Godzilla vs. Gigan and Godzilla vs. Mechagodzilla (both of which had also been released by Cinema Shares) and the film was broadcast fully uncut for the first time in the U.S.

Release

Box office

In Japan, Godzilla vs. Megalon sold approximately 980,000 tickets. It was the first Godzilla film to sell less than one million admissions. It earned ¥220 million in Japan distribution income (rentals). The film was a success in American theaters, earning $383,744 in its first three days in Texas and Louisiana alone. The film grossed about worldwide.

Critical reception

Godzilla vs. Megalon was released theatrically in America on May 9, 1976, though the San Francisco Chronicle indicates that it opened there in June, and The New York Times indicates that it opened in New York City on July 11. The New York Times film critic Vincent Canby, who a decade before had given a negative review to Ghidorah, the Three-Headed Monster, gave Godzilla vs. Megalon a generally positive review. In his review on July 12, 1976, Canby said, "Godzilla vs. Megalon completes the canonization of Godzilla...It's been a remarkable transformation of character - the dragon has become St. George...It's wildly preposterous, imaginative and funny (often intentionally). It demonstrates the rewards of friendship, between humans as well as monsters, and it is gentle." Godzilla vs. Megalon has attracted the ire of many Godzilla fans in the decades since its original release. The film contributed to the reputation of Godzilla films in the United States as cheap children's entertainment that should not be taken seriously. It has been described as "incredibly, undeniably, mind-numbingly bad" and one of the "poorer moments" in the history of kaiju films. Author Stephen Mark Rainey published a strongly negative critique of the film in Japanese Giants, issue four (1977).
The magazine was edited and published by Bradford G. Boyle. In particular, the special effects of the film have been heavily criticized. One review described the Godzilla costume as appearing to be "crossed with Kermit the Frog", and another sneeringly compared the film to Godzilla vs. Gigan, stating that it did "everything wrong that Gigan did, and then some." However, most of the criticism is of the lack of actual special effects work, as most of it consists of stock footage from previous films, including Godzilla vs. Gigan and Ghidorah, the Three-Headed Monster; a few pieces of effects work have garnered praise, specifically a scene where Megalon breaks through a dam and drains the lake. The other aspects of the film have been similarly skewered. The acting is usually described as flat and generally poor, and as not improving, or sometimes worsening, the already weak script. One part of the film, on the other hand, has garnered almost universal praise: Godzilla's final attack on Megalon, a flying kick. It has been called the saving grace of the film, and was made famous by the mock exclamations of shock and awe displayed on Godzilla vs. Megalon's appearance on Mystery Science Theater 3000. From the end of season three to the middle of season five, that clip was shown at the opening of each show. Despite all this, the film is also one of the most widely seen Godzilla films in the United States: it was popular in its initial theatrical release, largely due to an aggressive marketing campaign, including elaborate posters of the two title monsters battling atop New York City's World Trade Center towers, presumably to capitalize on the hype surrounding the Dino De Laurentiis remake of King Kong, which used a similar image for its own poster.
Home media

The film was released numerous times on VHS, mostly by bargain-basement studios that featured the edited TV version (which was wrongly assumed to be in the public domain for many years), while PolyGram and 4 Front released the unedited version of the film in 1992 and 1998, respectively. Rumors have circulated that the film's original VHS releases in the States were uncut, but there is no evidence confirming or denying this. Media Blasters acquired the DVD rights to both Godzilla vs. Megalon and Destroy All Monsters. Both films were released under one of the company's divisions, Tokyo Shock. Media Blasters originally planned to release Godzilla vs. Megalon on DVD and Blu-ray on December 20, 2011; however, due to technical difficulties with the dubbing and Toho having yet to give its approval, the release was delayed. Media Blasters finally released the film on August 14, 2012, but only on a bare-bones DVD and Blu-ray. Despite this, a manufacturing error led to several copies of the originally planned version, featuring bonus content, being released by accident. These special-features versions are incredibly rare and are not labelled differently from the standard version, making them nearly impossible to find. This was the first commercial release to present the film remastered at its original full length. In 2019, the Japanese version and export English dub were included in a Blu-ray box set released by the Criterion Collection, which included all 15 films from the franchise's Shōwa era.

References

Bibliography

Canby, Vincent. (July 22, 1976). "Another 'Godzilla' Movie; Monster Is Now a Good Guy" (film review at The New York Times). Stanley, John. "Godzilla - The Asian Beast Who Refuses to Die". San Francisco Chronicle (Sunday Datebook), June 20, 1976 (review of Godzilla vs.
Megalon - actually a history of the Godzilla films to date; mentions in passing that Megalon was then playing at three theaters and a drive-in).

External links

Godzilla on the web

1973 films 1970s science fiction films 1970s children's fantasy films Films about insects Films directed by Jun Fukuda Films produced by Tomoyuki Tanaka Films set in Tokyo Giant monster films Godzilla films Japanese films Japanese-language films Japanese science fiction films Japanese sequel films Kaiju films 1970s monster movies Robot films Toho films Films with screenplays by Shinichi Sekizawa
https://en.wikipedia.org/wiki/Godzilla%20vs.%20Biollante
Godzilla vs. Biollante
is a 1989 Japanese kaiju film written and directed by Kazuki Ōmori, with special effects by Koichi Kawakita. Distributed by Toho and produced under their subsidiary Toho Pictures, it is the 17th film in the Godzilla franchise and the second film in the franchise's Heisei period. The film stars Kunihiko Mitamura, Yoshiko Tanaka, Masanobu Takashima, Megumi Odaka, Toru Minegishi, Yasuko Sawaguchi, Toshiyuki Nagashima, Yoshiko Kuga, Ryunosuke Kaneda and Kōji Takahashi. In the film, corporations struggle for control over samples of Godzilla's cells, while the monster itself battles a creature born from a combination of Godzilla's cells, the cells of a plant, and the cells of a woman. The idea originated from a public story-writing contest, and set a trend common to all Heisei-era movies, in which Godzilla faces off against opponents capable of metamorphosing into new, progressively more powerful forms. Godzilla vs. Biollante was released theatrically in Japan on December 16, 1989. It received a direct-to-video release in the United States on November 25, 1992 through HBO Video. Although it received generally positive reviews, the film was a disappointment at the Japanese box office. In Japan, it was followed by Godzilla vs. King Ghidorah in 1991.

Plot

In the aftermath of Godzilla's attack on Tokyo and its subsequent imprisonment at Mount Mihara, the monster's cells are secretly delivered to the Saradia Institute of Technology and Science, where they are to be merged with genetically modified plants in the hope of transforming Saradia's deserts into fertile land and ending the country's economic dependence on oil wells. Dr. Genshiro Shiragami and his daughter, Erika, are enlisted to aid with the project. However, a terrorist bombing destroys the institute's laboratory, ruining the cells and killing Erika. By 1990, Shiragami has returned to Japan and merged some of Erika's cells with those of a rose in an attempt to preserve her soul.
Scientist Kazuhito Kirishima and Lieutenant Goro Gondo of the Japan Self-Defense Forces (JSDF) are using the Godzilla cells they collected to create "Anti-Nuclear Energy Bacteria" (ANEB), hoping it can serve as a weapon against Godzilla should it return. They attempt to recruit Shiragami to aid them, but are rebuffed. Meanwhile, international tensions increase over the Godzilla cells, as they are coveted by both the Saradia Institute of Technology and Science and the American Bio-Major organization. An explosion from Mount Mihara causes tremors across the area, including Shiragami's home, badly damaging the roses. Shiragami agrees to join the JSDF's effort and is given access to the Godzilla cells, which he secretly merges with one of the roses. The next night, rival Bio-Major and Saradian agents break into Shiragami's lab, but are attacked by a large plant-like creature which later escapes to Lake Ashi and is named "Biollante" by Shiragami. Bio-Major agents plant explosives around Mount Mihara and blackmail the Diet of Japan, warning that the explosives will be detonated, freeing Godzilla, if the cells are not handed over. Kirishima and Gondo attempt to make the trade, but Saradian agent SSS9 thwarts the attempt and escapes with the cells. The explosives are detonated, and Godzilla is released. Godzilla attempts to reach the nearest power plant to replenish its supply of nuclear energy, but Biollante calls out to it. Godzilla arrives at the lake to engage Biollante in a vicious battle, and emerges as the victor. Godzilla then proceeds toward the power plant at Tsuruga, but psychic Miki Saegusa uses her powers to divert it toward Osaka instead. The city is quickly evacuated before Godzilla makes landfall. A team led by Gondo meets Godzilla at the central district and fires rockets infused with the ANEB into its body. Gondo is killed in the process, and an unharmed Godzilla leaves. Kirishima recovers the cells and returns them to the JSDF.
Shiragami theorizes that if Godzilla's body temperature is increased, the ANEB should work against him. The JSDF erects microwave-emitting plates during an artificial thunderstorm, hitting Godzilla with lightning and raising his body temperature during a battle near the shores of Wakasa Bay. Godzilla is only moderately affected, but Biollante, having obtained a more powerful form, arrives to engage Godzilla in battle once again. After a long battle, the fight ends when Godzilla fires an atomic heat ray inside Biollante's mouth, severely injuring her. An exhausted Godzilla collapses on the beach as the bacterial infection finally takes hold, and Biollante splits apart into glowing spores which rise into the sky, forming an image of Erika among the stars. Shiragami, watching the scene, is killed by SSS9. Kirishima chases the assassin and, after a brief scuffle, SSS9 is killed by a microwave-emitting plate activated by Sho Kuroki. Godzilla reawakens and leaves for the ocean.

Cast

Production

Pre-production

Tomoyuki Tanaka announced a sequel to The Return of Godzilla in 1985, but was skeptical of its prospects, as the film had been of little financial benefit to Toho, and the failure of King Kong Lives the following year convinced him that audiences were not ready for a continuation of the Godzilla series. He relented after the success of Little Shop of Horrors, and proceeded to hold a public story-writing contest for a possible script. In consideration of The Return of Godzilla's marginal success in Japan, Tanaka insisted that the story focus on a classic monster-vs.-monster theme. Tanaka handed the five finalist entries to director Kazuki Ōmori, despite their initially hostile relationship; Ōmori had previously held Tanaka responsible for the decline in the Godzilla series' quality during the 1970s. Ōmori chose the entry of dentist Shinichiro Kobayashi, who wrote his story with the hypothetical death of his daughter in mind.
Kobayashi's submission was notable for its emphasis on dilemmas concerning biotechnology rather than nuclear energy, and revolved around a scientist grieving for his deceased daughter and attempting to keep her soul alive by merging her genes with those of a plant. The scientist's initial experiments would have resulted in the creation of a giant rat-like amphibian called Deutalios, which would have landed in Tokyo Bay and been killed by Godzilla. A female reporter investigating the scientist's activities would have suffered from psychic visions of plants with humanoid faces compelling her to infiltrate the scientist's laboratory. The scientist would have later confessed his intentions, and the finale would have had Godzilla battling a human-faced Biollante who defeats him by searing his flesh with acid. Ōmori proceeded to modify the story into a workable script over a period of three years, using his background as a biologist to create a plausible plot involving genetic engineering and botany. In order to preserve the series' anti-nuclear message, he linked the creation of Biollante to the use of Godzilla cells, and replaced Kobayashi's journalist character with Miki Saegusa. He openly admitted that directing a Godzilla film was secondary to his desire to make a James Bond movie, and thus added elements of the spy film genre to the plot. Unlike the case with later, more committee-driven Godzilla films, Ōmori was given considerable leeway in writing and directing the film, which Toho staff later judged to have been an error resulting in a movie with a very narrow audience.

Special effects

Koichi Kawakita, who had previously worked for Tsuburaya Productions, replaced Teruyoshi Nakano as head of the series' special effects unit after Toho became impressed with his work on Gunhed.
Kawakita made use of Gunhed's special effects team Studio OX, and initially wanted to make Godzilla more animal-like, using crocodiles as references, but was berated by Tanaka, who declared Godzilla to be "a monster" rather than an animal. Kenpachiro Satsuma returned to portray Godzilla, hoping to improve his performance by making it less anthropomorphic than in previous films. Suitmaker Noboyuki Yasamaru created a Godzilla suit made specifically with Satsuma's measurements in mind, unlike the previous one, which had initially been built for another performer and caused Satsuma discomfort. The resulting 242 lb suit proved more comfortable than the last, having a lower center of gravity and more mobile legs. A second 176 lb suit was built for outdoor underwater scenes. The head's size was reduced, and the whites around the eyes removed. On the advice of story finalist Shinichiro Kobayashi, a double row of teeth was incorporated in the jaws. As with the previous film, animatronic models were used for close-up shots. These models were an improvement over the last, as they were made from the same molds used for the main costume, and included an articulated tongue and intricate eye motion. The suit's dorsal plates were filled with light bulbs for scenes in which Godzilla uses his atomic ray, thus lessening reliance on optical animation, though they electrocuted Satsuma the first time they were activated. Satsuma was also obliged to wear protective goggles in the suit during scenes in which Godzilla battles the JSDF, as real explosives were used on set. The film was mainly shot at the Toho lot, although some filming occurred on location at the East Fuji Maneuver Area. Designing and building the Biollante props proved problematic, as traditional suitmation techniques made realizing the requested design of the creature's first form difficult, and the resulting cumbersome model for Biollante's final form was met with disbelief from the special effects team.
Biollante's first form was performed by Masao Takegami, who sat within the model's trunk area on a platform just above water level. While the creature's head movements were simple to operate, its vines were controlled by an intricate array of overhead wires, which proved difficult for Satsuma to react to during combat scenes: they offered no tension, so Satsuma had to feign receiving blows from vines he could not actually perceive. Biollante's final form was even more difficult to operate, as its vine network took hours to rig up on set. Visibility in both the Godzilla and final-form Biollante suits was poor, causing difficulties for Takegami in aiming the creature's head when firing sap, which permanently stained anything it landed on. While it was initially decided to incorporate stop-motion animation into the film, the resulting sequences were scrapped, as Kawakita felt they failed to blend effectively with the live-action footage. The film nonetheless became the first of its kind to use CGI, though its usage was limited to scenes involving computer-generated schematics. The original cut of the movie had the first battle culminating in Biollante's spores falling around the hills surrounding Lake Ashino and blooming into fields of flowers, though this was removed because the flowers were out of scale.

Music

Unlike the previous film, Godzilla vs. Biollante incorporates themes from Akira Ifukube's original Godzilla theme, though the majority of the soundtrack consists of original themes by Koichi Sugiyama. The score was orchestrated by conductor David Howell with the Kansai Philharmonic, though Howell himself had never viewed the movie, and thus was left to interpret what the scenes would consist of when conducting the orchestra.

English version

After the film was released in Japan, Toho commissioned a Hong Kong company named Omni Productions to dub the film into English.
In early 1990, Toho entered discussions with Miramax to distribute the film. When talks broke off, Toho filed a lawsuit in Los Angeles federal court, accusing Miramax of entering an oral agreement in June to pay Toho $500,000 to distribute the film. This lawsuit delayed the film's release for two years. An out-of-court settlement was reached, with Miramax buying the rights to the film for an unreported figure. Miramax entertained thoughts of releasing the film in theaters, but in the end decided to release it straight to home video instead. HBO released the film on VHS in 1992 and Laserdisc in 1993. Miramax utilized the uncut English international version of the film for this release.

Release

Home media

Godzilla vs. Biollante was released on VHS by HBO Home Video on November 25, 1992. It was later relicensed by Miramax and released on Blu-ray and DVD by Echo Bridge on December 4, 2012. It was released as a double feature and 8-disc movie pack on both Blu-ray and DVD with Mega Shark Versus Giant Octopus (2009) by Echo Bridge Home Entertainment in 2013. It was last released by Lionsgate on Blu-ray and DVD on October 7, 2014. Miramax's rights have quite likely since reverted to Toho, as this release has gone out of print.

Reception

Box office

In Japan, the film sold approximately 2 million tickets, grossing .

Critical reaction

Godzilla vs. Biollante has received positive reviews, with praise for the story, music and visuals. Ed Godziszewski of Monster Zero said the film is "by no means a classic" but felt that "for the first time in well over 20 years, a [Godzilla] script is presented with some fresh, original ideas and themes." In an August 2, 2004 review, Joseph Savitski of Beyond Hollywood said the film's music is "a major detraction", but added that it's "not only one of the most imaginative films in the series, but also the most enjoyable to watch." Japan Hero said, "[T]his is definitely a Godzilla movie not to be missed." In their scholarly book Japan's Green Monsters on kaiju cinema, Rhoads and McCorkle offer an ecocritical assessment of Godzilla vs. Biollante. The scholars focus on the film's critique of genetic engineering and biotechnology years before the subject appeared in more popular Hollywood films like Steven Spielberg's 1993 blockbuster Jurassic Park. Rhoads and McCorkle counter prior reviews of the film and argue that Godzilla vs. Biollante possesses far deeper environmental messages than the obvious ones present on the film's surface. In July 2014, in a poll reported by the , Godzilla vs. Biollante was selected as the best Godzilla film by a group of fans and judges. Composer Akira Ifukube, who had refused to compose the film's score, stated in an interview that he disliked the way Koichi Sugiyama had modernized his Godzilla theme, and called the Saradia theme "ridiculous", on account of it sounding more European than Middle Eastern.

See also

List of Japanese films of 1989
List of science fiction films of the 1980s
List of monster movies
Biollante

References

Bibliography

Anon (2015), ゴジラvsビオランテ コンプリーション [Godzilla vs. Biollante Completion], Hobby Japan.

External links

Gojira tai Biorante (Japanese) at Japanese Movie Database

1989 films 1980s monster movies 1980s science fiction films Eco-terrorism in fiction English-language films Films about plants Films about telepathy Films about volcanoes Films directed by Kazuki Ōmori Films produced by Tomoyuki Tanaka Films set in 1984 Films set in 1989 Films set in Osaka Films set in Kanagawa Prefecture Films set in Fukui Prefecture Films set in Ibaraki Prefecture Films set in Tokyo Films set in Asia Films shot in Japan Films shot in Tokyo Giant monster films Godzilla films Japanese films Japanese-language films Japanese science fiction films Japanese sequel films Kaiju films Mad scientist films Toho films
https://en.wikipedia.org/wiki/Terror%20of%20Mechagodzilla
Terror of Mechagodzilla
is a 1975 Japanese kaiju film directed by Ishirō Honda, written by Yukiko Takayama, and produced by Tomoyuki Tanaka and Henry G. Saperstein, with special effects by Teruyoshi Nakano. Distributed by Toho and produced under their effects subsidiary Toho–Eizo, it is the 15th film in the Godzilla franchise, serving as a direct sequel to the 1974 film Godzilla vs. Mechagodzilla. Terror of Mechagodzilla stars Katsuhiko Sasaki, Tomoko Ai, Akihiko Hirata, and Gorō Mutsumi, and features Toru Kawai, Kazunari Mori, and Tatsumi Nikamoto as the fictional monster characters Godzilla, Mechagodzilla 2, and Titanosaurus, respectively. The film was released theatrically in Japan on March 15, 1975. It received a limited release in the United States in 1978 by Bob Conn Enterprises under the title The Terror of Godzilla. The film remains the least financially successful entry in the Godzilla franchise to this day.

Plot

Following the events of Godzilla vs. Mechagodzilla, Interpol agents search for Mechagodzilla's remains at the bottom of the Okinawan Sea in the hopes of gathering information on the robot's builders, the alien Simeons. However, their submarine is attacked by a giant aquatic dinosaur called Titanosaurus and the crew vanishes. Interpol launches an investigation into the incident. With the help of marine biologist Akira Ichinose, they trace Titanosaurus to a reclusive mad scientist named Shinzô Mafune, who wants to destroy mankind. While the group is visiting the scientist's old home, they meet Mafune's daughter, Katsura, who claims her father is dead and that she burned his notes about Titanosaurus at his request. Unbeknownst to Interpol, the living Mafune is visited by Tsuda, aide to the Simeon leader Mugal, who is leading a project to rebuild Mechagodzilla. Mugal offers the Simeons' services to Mafune so that their respective monsters can wipe out mankind and allow them to rebuild the world for themselves.
Complicating matters, Ichinose falls in love with Katsura and unwittingly gives her Interpol's information on the Simeons, Mechagodzilla, and Titanosaurus. She is also revealed to be a cyborg, having undergone cybernetic surgery after she was nearly killed during one of her father's experiments as a child, and implanted with Mechagodzilla's control device. Additionally, an impatient Mafune releases Titanosaurus on Yokosuka without the aliens' permission. While Interpol discovers the dinosaur is vulnerable to supersonic waves, Katsura destroys their supersonic wave oscillator. However, Godzilla arrives and easily defeats Titanosaurus, causing the latter to retreat. When Ichinose visits Katsura, the Simeons capture him and force him to watch as they unleash Mechagodzilla 2 and Titanosaurus on Tokyo while Interpol struggles to repair their wave oscillator and the Japanese armed forces struggle to fend off the monsters. Godzilla arrives, but is initially outmatched until Interpol distracts Titanosaurus with the repaired wave oscillator, allowing Godzilla to focus on Mechagodzilla 2. Interpol agents infiltrate the aliens' hideout, rescue Ichinose, and kill Mafune and many of the aliens. The remaining Simeons attempt to escape, but Godzilla shoots down their ships with its atomic breath. The wounded Katsura shoots herself to destroy Mechagodzilla 2's control device and dies in Ichinose's arms. With the robot non-functional, Godzilla tosses it into a chasm before blasting it with its atomic breath, causing it to explode and become buried. With help from Interpol, Godzilla then defeats Titanosaurus, who returns to the sea.

Cast

Production

Development

The original screenplay that Yukiko Takayama created after winning Toho's story contest for the next installment in the Godzilla series was picked by assistant producer Kenji Tokoro and was submitted for approval on July 1, 1974, less than four months after Godzilla vs. Mechagodzilla was released.
The original concept is similar to the finished version of Terror of Mechagodzilla, with many of the changes being budgetary in nature. The most obvious alteration is the removal of the two dinosaurs called the Titans, which merged to become Titanosaurus in the first draft. The concept was interesting, though under-explained considering the magnitude of such a merging. Another noticeable change to the script is the final battle, which in the draft does not move to the countryside but instead would have reduced Tokyo to rubble during the ensuing conflict between the three monsters. After her initial draft, Takayama submitted a revised version on October 14, 1974. This went through a third revision on December 4, and then yet another on December 28 of that same year before it was met with approval and filming began.

Filming

This film is one of two Godzilla films with brief nudity (the other being 1994's Godzilla vs. SpaceGodzilla). The scene occurs when Katsura undergoes an operation to have Mechagodzilla 2's control device placed inside her body, at which point her breasts are exposed. Although she was portrayed by a mannequin in the scene, it was cut when the film was released in the U.S., from both the theatrical and TV versions. Director Ishiro Honda lamented not being able to work with the story's writer, Yukiko Takayama, on other films, noting that a "woman's perspective was especially fresh" for the genre. Kensho Yamashita was the chief assistant director on the project. He noted, though, that Honda never actually assigned any of the shooting to him, possibly because Honda was happy to be directing again after a long gap in his career and wanted to do the work himself.

English version

Toho titled its English version of the film Terror of Mechagodzilla and had it dubbed into English in Hong Kong.
This “international version” has never seen wide release in the United States, but has been issued on VHS in the United Kingdom by PolyGram Video Ltd. and on DVD in Taiwan by Power Multimedia. The film was given a North American theatrical release in March 1978 by independent distributor Bob Conn Enterprises under the title The Terror of Godzilla. Just as Cinema Shares had done with the previous three Godzilla movies, Bob Conn Enterprises chose to utilize the Toho-commissioned English dub instead of hiring a new crew to re-dub the film. The Terror of Godzilla was heavily edited to obtain a "G" rating from the MPAA. Several scenes with violent content were entirely removed, disrupting the flow of the narrative. Henry G. Saperstein, who sold the theatrical rights to Bob Conn Enterprises, also released the film to television in late 1978, this time under Toho's international title, Terror of Mechagodzilla. Unlike The Terror of Godzilla, the television version remained mostly uncut, with only the shot of Katsura's naked breasts excised. Saperstein's editors also added a 10-minute prologue that served as a brief history of Godzilla, with footage from Saperstein's English versions of Invasion of Astro-Monster and All Monsters Attack (the latter of which utilized stock footage from both Ebirah, Horror of the Deep and Son of Godzilla). In the mid-1980s, the U.S. television version, Terror of Mechagodzilla, was replaced by the theatrical edit, The Terror of Godzilla, on television and home video. For some reason, the title was also changed to Terror of Mechagodzilla. The 1994 Paramount release of Terror of Mechagodzilla listed a running time of 89 minutes on the slipcase, implying that this release would be the longer version first shown on American TV. The actual video cassette featured the edited theatrical version. In a 1995 interview with G-Fan magazine, Saperstein was surprised to hear about this mistake. 
In 1997 on Channel 4 in the U.K., three Godzilla movies were shown back to back late at night, starting with Godzilla vs. Megalon, Godzilla vs. Gigan and then Terror of Mechagodzilla; all were dubbed versions. This showing was uncut, including the Katsura nudity scene, but it did not have the Western-made prologue. In the mid-2000s, the television version showed up again on Monsters HD, and in 2007, it made its home video debut as the U.S. version on the Classic Media DVD. Although the added prologue was originally framed for fullscreen television, it was cropped and shown in widescreen on the disc. The rest of the movie featured the audio from Saperstein's television version synced to video from the Japanese version. The first article about the movie's storyline was published in Japanese Giants #4 in 1977, edited and published by Bradford G. Boyle, and was written by Richard H. Campbell, creator of The Godzilla Fan News Letter (a.k.a. "The Gang"). Box office In Japan, the film sold 980,000 tickets. Despite earning positive reviews, it would be the least-attended Godzilla film in Japan and also one of only two Godzilla films to sell fewer than 1 million tickets. This was part of a decline in attendance for monster movies as a whole, and Toho put the production of monster movies on hold. Toho had no intention of permanently ending the Godzilla series. Throughout the remainder of the 1970s, several new Godzilla stories were submitted by various writers and producers. None of these films, however, were ultimately made. It was not until 1984 and Godzilla's 30th anniversary that Toho would start production on a new Godzilla movie. Home media The film has been released several times on DVD in the United States. The first release, by Simitar Entertainment, was on May 6, 1998 in a fullscreen version under the title The Terror of Godzilla. The second release, by Classic Media and distributed by Sony Music Entertainment, was on September 17, 2002.
It was released both individually and as part of the Ultimate Godzilla DVD Collection box set, the latter being released on the same day. It was then re-released by Classic Media, this time distributed by Genius Entertainment, on November 20, 2007, both individually and as part of the Godzilla Collection box set on April 29, 2008. In 2019, both the Japanese version and the export English version were included in a Blu-ray box set released by the Criterion Collection, which included all 15 films from the franchise's Shōwa era. References Notes Bibliography External links Godzilla on the web (Japan) 1975 films 1970s science fiction films Alien invasions in films Films scored by Akira Ifukube Films directed by Ishirō Honda Films set in Okinawa Prefecture Films set in Shizuoka Prefecture Films set in Yokosuka Giant monster films Godzilla films Japanese films Japanese-language films Japanese science fiction films Japanese sequel films Kaiju films Mad scientist films 1970s monster movies UPA films Toho films Mecha films Apes in popular culture Father and daughter films
[ -0.7522956132888794, -0.15548132359981537, -0.25577014684677124, -0.12181364744901657, -0.024889890104532242, 0.15889953076839447, 0.4982140064239502, -0.5254154205322266, -0.24100565910339355, 0.11232728511095047, -0.10598025470972061, 0.5883356928825378, -0.7032155990600586, 0.1522977799...
12002
https://en.wikipedia.org/wiki/Godzilla%20vs.%20King%20Ghidorah
Godzilla vs. King Ghidorah
is a 1991 Japanese kaiju film written and directed by Kazuki Ōmori and produced by Shōgo Tomiyama. The film, produced and distributed by Toho Studios, is the 18th film in the Godzilla franchise, and is the third film in the franchise's Heisei period. The film features the fictional monster characters Godzilla and King Ghidorah, and stars Kōsuke Toyohara, Anna Nakagawa, Megumi Odaka, Katsuhiko Sasaki, Akiji Kobayashi, Yoshio Tsuchiya, and Robert Scott Field. In the film, time-travelers from the future warn Japan to prevent Godzilla's mutation, only to reveal their true motives by unleashing a three-headed dragon that terrorizes the country. The production crew of Godzilla vs. King Ghidorah remained largely unchanged from that of the previous film in the series, Godzilla vs. Biollante. Because the previous installment was a box office disappointment, due to a lack of child viewership and alleged competition with the Back to the Future franchise, the producers of Godzilla vs. King Ghidorah were compelled to create a film with more fantasy elements, along with time travel. Godzilla vs. King Ghidorah was the first Godzilla film since 1975's Terror of Mechagodzilla to feature a newly orchestrated score by Akira Ifukube. The film was released theatrically in Japan on December 14, 1991, and was followed by Godzilla vs. Mothra the following year. It was released direct-to-video in North America in 1998 by Columbia TriStar Home Entertainment. Though Godzilla vs. King Ghidorah was more financially successful than Godzilla vs. Biollante, the film attracted controversy outside Japan due to its perceived Japanese nationalist themes. Plot In 1992, Godzilla is still weakened after being infected by the ANEB (Anti-Nuclear Energy Bacteria). Meanwhile, science fiction writer Kenichiro Terasawa is writing a book about the monster and learns of a group of Japanese soldiers stationed on Lagos Island during the Gilbert and Marshall Islands campaign.
In February 1944, while threatened by American G.I.s, the Japanese soldiers were saved by a mysterious dinosaur, which Terasawa theorizes was subsequently mutated into Godzilla in 1954 after a hydrogen bomb test on the island. Yasuaki Shindo, a wealthy businessman who commanded the Japanese soldiers on Lagos Island, confirms that the dinosaur did indeed exist. Meanwhile, a UFO lands on Mount Fuji. When the Japanese army investigates, they are greeted by Wilson, Grenchiko, Emmy Kano, and the android M-11. The visitors, known as the "Futurians", explain that they are humans from the year 2303, where Godzilla has completely destroyed Japan. The Futurians plan to travel back in time to 1944 and remove the dinosaur from Lagos Island before the island is irradiated in 1954, thus preventing the mutation of the creature into Godzilla. As proof of their story, Emmy presents a copy of Terasawa's book, which has not yet been completed in the present. The Futurians, Terasawa, Miki Saegusa, and Professor Mazaki, board a time shuttle and travel back to 1944 to Lagos Island. There, as American forces land and engage the Japanese forces commanded by Shindo, the dinosaur attacks and kills the American soldiers. The American navy then bombs the dinosaur from the sea and gravely wounds it. After Shindo and his men leave the island, M-11 teleports the dinosaur from Lagos Island to the Bering Strait. Before returning to 1992, the Futurians secretly leave three small creatures called Dorats on Lagos Island, which are exposed to radiation from the hydrogen bomb test in 1954 and merge to become King Ghidorah. After returning to 1992, the Futurians use King Ghidorah to subjugate Japan and issue an ultimatum, but Japan refuses to surrender. Feeling sympathy for the Japanese people, Emmy reveals to Terasawa the truth behind the Futurians' mission: in the future, Japan is an economic superpower that has surpassed the United States, Russia, and China. 
The Futurians traveled back in time in order to change history and prevent Japan's future economic dominance by creating King Ghidorah and using it to destroy present-day Japan. At the same time, they also planned to erase Godzilla from history so that it would not pose a threat to their plans. After M-11 brings Emmy back to the UFO, she reprograms the android so it will help her. Shindo plans to send his nuclear submarine to the Bering Strait and irradiate the dinosaur in order to recreate Godzilla. However, Terasawa discovers too late that a Russian nuclear submarine sank there in the 1970s and released enough radiation to mutate the dinosaur into Godzilla. En route to the Bering Strait, Shindo's submarine is destroyed by Godzilla, who absorbs its radiation, recovers from the ANEB and becomes larger. Godzilla arrives in Japan and is met by King Ghidorah. They fight at equal strength, each immune to the other's attacks. With M-11 and Terasawa's aid, Emmy sabotages the UFO's control over King Ghidorah, causing the three-headed monster to lose focus during the battle. Godzilla eventually ends the battle by blasting off Ghidorah's middle head. Before sending King Ghidorah crashing into the ocean, Godzilla destroys the UFO, killing Wilson and Grenchiko. It then turns its attention to Tokyo, destroying the city and killing Shindo. Emmy travels to the future with M-11 and returns to the present day with Mecha-King Ghidorah, a cybernetic version of King Ghidorah. The cybernetic Ghidorah blasts Godzilla with beams, which proves useless. Godzilla then counters by relentlessly blasting Ghidorah with its atomic breath before Ghidorah launches clamps to restrain Godzilla. Ghidorah carries Godzilla out of Japan, but Godzilla breaks from its restraints and causes Ghidorah to send both crashing into the ocean. Emmy then returns to the future, but not before informing Terasawa that she is his descendant. 
At the bottom of the ocean, Godzilla awakens and roars over Mecha-King Ghidorah's remains before swimming away. Cast Production Conception Although the previously filmed Godzilla vs. Biollante had been the most expensive Godzilla film produced at the time, its low audience attendance and loss of revenue convinced executive producer and Godzilla series creator Tomoyuki Tanaka to revitalize the series by bringing back iconic monsters from pre-1984 Godzilla movies, specifically Godzilla's archenemy King Ghidorah. Godzilla vs. Biollante director and writer Kazuki Ōmori had initially hoped to start a standalone series centered on Mothra, and was in the process of rewriting a 1990 script for the unrealized film Mothra vs. Bagan. The film was ultimately scrapped by Toho, under the assumption that, unlike Godzilla, Mothra would have been a difficult character to market overseas. The planning stages for a sequel to Godzilla vs. Biollante were initially hampered by Tanaka's deteriorating health, prompting Shōgo Tomiyama to take over as producer. The new producer felt that the financial failure of Godzilla vs. Biollante was due to the plot being too sophisticated for child audiences, and thus intended to return some of the fantasy elements of the pre-1984 Godzilla films to the series. Ōmori himself blamed the lackluster performance of Godzilla vs. Biollante on competition with Back to the Future Part II, and thus concluded that audiences wanted plots involving time travel. His approach to the film also differed from Godzilla vs. Biollante in his greater emphasis on developing the personalities of the monsters rather than the human characters. Akira Ifukube agreed to compose the film's score at the insistence of his daughter, as he was dissatisfied with the way his compositions had been treated in Godzilla vs. Biollante. Special effects The Godzilla suits used in Godzilla vs. Biollante were reused in Godzilla vs. King Ghidorah, though with slight modifications.
The original suit used for land-based and full body shots had its head replaced with a wider and flatter one, and the body cut in half. The upper half was used in scenes where Godzilla emerges from the sea and during close-ups during the character's first fight with King Ghidorah. The suit used previously for scenes set at sea was modified with rounder shoulders, a more prominent chest, and an enhanced face, and was used throughout the majority of the film's Godzilla scenes. The redesigned King Ghidorah featured much more advanced wirework puppetry than its predecessors, and effects team leader Koichi Kawakita designed the "Godzillasaurus" as a more paleontologically accurate-looking dinosaur than Godzilla itself as a nod to American filmmakers aspiring to direct their own Godzilla films with the intention of making the monster more realistic. Ōmori's original draft specified that the dinosaur that would become Godzilla was a Tyrannosaurus, though this was rejected by creature designer Shinji Nishikawa, who stated that he "couldn't accept that a tyrannosaur could become Godzilla". The final suit combined features of Tyrannosaurus with Godzilla, and real octopus blood was used during the bombardment scene. Because the Godzillasaurus' arms were much smaller than Godzilla's, suit performer Wataru Fukuda had to operate them with levers within the costume. The creature's distress calls were recycled Gamera cries. Home media The Columbia/TriStar Home Video DVD version was released in 1998 as a single disc double feature with Godzilla vs. Mothra. The picture was full frame (1.33:1) [NTSC] and the audio in English (2.0). There were no subtitles. Extras included the trailer for Godzilla vs. King Ghidorah and Godzilla vs. Mothra. The Sony Blu-ray version was released on May 6, 2014 as a two-disc double feature with Godzilla vs. Mothra. The picture was MPEG-4 AVC (1.85:1) [1080p] and the audio was in Japanese and English (DTS-HD Master Audio 2.0). 
Subtitles were added in English, English SDH and French. Extras included the theatrical trailer and three teasers in HD with English subtitles. Reception Joseph Savitski of Beyond Hollywood said "This entry in the popular monster series is a disappointing and flawed effort unworthy of the “Godzilla” name." Film historian and critic David Kalat wrote "Despite its shortcomings, illogic, and overpopulated cast, Godzilla vs. King Ghidorah is crammed full of ideas, richly visualized innovations, a genuine spirit of fun, and some of the most complex emotional manipulation ever to grace the series." Controversy The film was considered controversial at the time of its release, being contemporary to a period of economic tension between America and Japan, but mainly due to its fictional World War II depictions. Gerald Glaubitz of the Pearl Harbor Survivors Association appeared alongside director Kazuki Ōmori on Entertainment Tonight and condemned the film as being in "very poor taste" and detrimental to American-Japanese relations. Ishirō Honda also criticized Ōmori, stating that the scene in which Godzilla attacks and crushes American G.I.s went "too far". Conversely, Godzilla historian Steve Ryfle said American media reports of supposed anti-Americanism "weren't really thought-provoking or insightful." Ōmori has denied all such allegations, stating that the American extras in the film had been "happy about being crushed and squished by Godzilla." 
Commenting on the controversy in 2006, Ōmori stated: References External links 1991 films 1990s monster movies 1991 science fiction films Android (robot) films Films about dinosaurs English-language films Films about dragons Films about nuclear war and weapons Films about time travel Films directed by Kazuki Ōmori Films produced by Tomoyuki Tanaka Films scored by Akira Ifukube Films set in 1944 Films set in 1992 Films set in the 23rd century Films set in Fukuoka Films set in Hiroshima Films set in Sapporo Films set in Tokyo Films set in Yokkaichi Films set in the Marshall Islands Films set in the Pacific Ocean Films set on fictional islands Giant monster films Godzilla films Japanese films Japanese-language films Japanese sequel films Kaiju films Robot films Submarine films TriStar Pictures films Toho films Japanese World War II films Pacific War films
[ 0.12956592440605164, 0.25680574774742126, -0.17393778264522552, -0.12076256424188614, -0.3709094524383545, -0.06076904386281967, -0.0859399139881134, -0.2163141667842865, 0.03665260225534439, 0.466464102268219, 0.053273558616638184, 0.41333362460136414, -0.7640959620475769, 0.4514247477054...
12003
https://en.wikipedia.org/wiki/Godzilla%20vs.%20Mothra
Godzilla vs. Mothra
is a 1992 Japanese kaiju film directed by Takao Okawara, written by Kazuki Ōmori, and produced by Shogo Tomiyama. Produced and distributed by Toho Studios, it is the 19th film in the Godzilla franchise, and is the fourth film in the franchise's Heisei era. The film features the fictional monster characters Godzilla, Mothra, and Battra, and stars Tetsuya Bessho, Satomi Kobayashi, Takehiro Murata, Megumi Odaka, Shiori Yonezawa, Makoto Otake, Akiji Kobayashi, Koichi Ueda, Shinya Owada, Keiko Imamura, Sayaka Osawa, Saburo Shinoda and Akira Takarada, with Kenpachiro Satsuma as Godzilla. The plot follows Battra and Mothra's attempts to stop Godzilla from attacking Yokohama. Originally conceived as a standalone Mothra film entitled Mothra vs. Bagan, the film is notable for its return to a more fantasy-based, family-oriented atmosphere, evocative of older Godzilla films. Although he did not return as director, Ōmori continued his trend of incorporating Hollywood elements into his screenplay, in this case nods to the Indiana Jones franchise. Godzilla vs. Mothra was released theatrically in Japan on December 12, 1992, and was followed by Godzilla vs. Mechagodzilla II the following year. Godzilla vs. Mothra was released direct-to-video in the United States in 1998 by Columbia Tristar Home Video under the title Godzilla and Mothra: The Battle for Earth. The film was the second highest-grossing film in Japan in 1993, with Jurassic Park being the highest-grossing. Plot In mid-1992, following the events of Godzilla vs. King Ghidorah, a meteoroid crashes in the Ogasawara Trench and awakens Godzilla. Six months later, explorer Takuya Fujito is detained after stealing an ancient artifact. Later, a representative of the Japanese Prime Minister offers to have Takuya's charges dropped if he explores Infant Island with his ex-wife, Masako Tezuka and Kenji Ando, the secretary of the rapacious Marutomo company. 
After the trio arrives on the island, they find a cave containing a depiction of two giant insects in battle. Further exploration leads them to a giant egg and a pair of diminutive humanoids called the Cosmos, who identify the egg as belonging to Mothra. The Cosmos tell of an ancient civilization that tried to control the Earth's climate 12,000 years ago, thus provoking the Earth into creating Battra. Battra, a male divine moth similar to Mothra, but much more fearsome in appearance, destroyed the civilisation and their weather-controlling device but then became uncontrollable, and started to harm the very planet that created him. Mothra was then sent by the Earth to fight Battra, who eventually lost. The Cosmos explain how the meteoroid uncovered Mothra's egg, and may have awoken Battra, who is still embittered over humanity's interference in the Earth's natural order. The Marutomo company sends a freighter to Infant Island to pick up the egg, ostensibly to protect it. As they are sailing, Godzilla surfaces and heads toward the newly hatched Mothra larva. Battra, also as a larva, soon appears and joins the fight, allowing Mothra to retreat. The battle between Godzilla and Battra is eventually taken underwater, where the force of the battle causes a giant crack on the Philippine Sea Plate that swallows the two. Masako and Takuya later discover Ando's true intentions when he kidnaps the Cosmos and takes them to Marutomo headquarters, where the CEO intends to use them for publicity purposes. Mothra enters Tokyo in an attempt to rescue the Cosmos, but is attacked by the JSDF. The wounded Mothra heads for the National Diet Building and starts constructing a cocoon around herself. Meanwhile, Godzilla surfaces from Mount Fuji, while Battra frees himself from the Earth's crust and continues towards Japan. Both Mothra and Battra attain their imago forms and converge at Yokohama Cosmo World where they begin to fight once more. 
Godzilla interrupts the battle and attacks Mothra, but Battra comes to her aid and briefly incapacitates Godzilla. Regrouping, the two moths decide to join forces against Godzilla, determining him to be the greater threat to the planet. Eventually, Mothra and Battra overwhelm Godzilla and carry it over the ocean. Godzilla bites Battra's neck and fires its atomic breath into the wound, killing him. A tired Mothra drops Godzilla and the lifeless Battra into the water below, sealing Godzilla below the surface by creating a mystical glyph with scales from her wings. The next morning, the Cosmos explain that Battra had been waiting many years to destroy an even larger asteroid that would threaten the Earth in 1999. Mothra had promised she would stop the future collision if Battra were to die, and she and the Cosmos leave Earth as the humans bid farewell. Cast Production The idea of shooting a movie featuring a revamped Mothra dated back to a screenplay written in 1980 by Akira Murao entitled Mothra vs. Bagan, which revolved around a vengeful dragon called Bagan who sought to destroy humanity for its abuse of the Earth's resources, only to be defeated by Mothra, the goddess of peace. The screenplay was revised by Kazuki Ōmori after the release of Godzilla vs. Biollante, though the project was ultimately scrapped by Toho, under the assumption that Mothra was a character born purely out of Japanese culture, and thus would have been difficult to market overseas unlike the more internationally recognized Godzilla. After the success of Godzilla vs. King Ghidorah, producer Shōgo Tomiyama and Godzilla series creator Tomoyuki Tanaka proposed resurrecting King Ghidorah in a film entitled Ghidorah's Counterattack, but relented when polls demonstrated that Mothra was more popular with women, who comprised the majority of Japan's population. Tomiyama replaced Ōmori with Takao Okawara as director, but maintained Ōmori as screenwriter. Hoping to maintain as much of Mothra vs. 
Bagan as possible, Ōmori reconceptualized Bagan as Badora, a dark twin to Mothra. The character was later renamed Battra (a portmanteau of "battle" and "Mothra"), as the first name was disharmonious in Japanese. Tomiyama had intended to feature Mothra star Frankie Sakai, but was unable to because of scheduling conflicts. The final battle between Godzilla, Mothra and Battra was originally meant to have a more elaborate conclusion; as in the final product, Godzilla would have been transported to sea, only to kill Battra and plunge into the ocean. However, the site of their fall would have been the submerged, Stonehenge-like ruins of the Cosmos civilization, which would have engulfed and trapped Godzilla with a forcefield activated by Mothra. Ishirō Honda, who directed the first Godzilla film and many others, visited the set shortly before dying. Special effects Koichi Kawakita continued his theme of giving Godzilla's opponents the ability to metamorphose, and had initially intended to have Mothra killed off, only to be reborn as the cybernetic moth MechaMothra, though this was scrapped early in production, thus making Godzilla vs. Mothra the first post-1984 Godzilla movie to not feature a mecha contraption. The underwater scenes were filmed through an aquarium filled with fish set between the performers and the camera. Kawakita's team constructed a new Godzilla suit from previously used molds, though it was made slimmer than previous suits, the neck given more prominent ribbing, and the arrangement of the character's dorsal plates was changed so that the largest plate was placed on the middle of the back. The arms were more flexible at the biceps, and the face was given numerous cosmetic changes; the forehead was reduced and flattened, the teeth scaled down, and the eyes given a golden tint. The head was also electronically modified to allow more vertical mobility. Filming the Godzilla scenes was hampered when the suit previously used for Godzilla vs. 
Biollante and Godzilla vs. King Ghidorah, which was needed for some stunt-work, was stolen from Toho studios, only to be recovered at Lake Okutama in bad condition. The remains of the suit were recycled for the first battle sequence. Godzilla's roar was reverted to the high-pitched shriek from pre-1984 Godzilla films, while Battra's sound effects were recycled from those of Rodan. In designing Battra, which the script described as a "black Mothra", artist Shinji Nishikawa sought to distance its design from Mothra's by making its adult form more similar to its larval one than is the case with Mothra, and combining Mothra's two eyes into one. Release Godzilla vs. Mothra was released in Japan on December 12, 1992, where it was distributed by Toho. The film sold approximately 4,200,000 tickets in Japan, becoming the number one Japanese film on the domestic market in 1993. It earned ¥2.22 billion in distribution income, and grossed in total. The film was released in the United States as Godzilla and Mothra: The Battle for Earth on April 28, 1998 on home video by Columbia TriStar Home Video. Critical reaction Review aggregation website Rotten Tomatoes has a 75% approval rating from critics, based on 8 reviews with an average score of 6.3/10. Ed Godziszewski of Monster Zero said, "Rushed into production but a few months after Godzilla vs. King Ghidorah, this film is unable to hide its hurried nature [but] effects-wise, the film makes up for the story's shortcomings and then some." Japan Hero said, "While this movie is not the best of the Heisei series, it is still a really interesting movie. The battles are cool, and Battra was an interesting idea. If you have never seen this movie, I highly recommend it." Stomp Tokyo said the film is "one of the better Godzilla movies in that the scenes in which monsters do not appear actually make some sort of sense.
And for once, they are acted with some gusto, so that we as viewers can actually come to like the characters on screen, or at least be entertained by them." Mike Bogue of American Kaiju said the film "[does] not live up to its potential", but added that "[its] colorful and elaborate spectacle eventually won [him] over" and "the main story thread dealing with the eventual reconciliation of the divorced couple adequately holds the human plot together." Home media The film was released by Sony on Blu-ray in The Toho Godzilla Collection on May 6, 2014. Awards References Bibliography External links Godzilla vs. Mothra on Wikizilla 1992 films 1990s science fiction adventure films 1990s fantasy adventure films 1990s monster movies Japanese fantasy adventure films Japanese crossover films Environmental films Films about telepathy Films directed by Takao Okawara Films produced by Tomoyuki Tanaka Films scored by Akira Ifukube Films set in 1992 Films set in 1993 Films set in Tokyo Films set in Nagoya Films set in Yokohama Films set in amusement parks Films set in the Pacific Ocean Films set on fictional islands Giant monster films Godzilla films Japanese films Japanese-language films Japanese science fiction films Japanese sequel films Kaiju films Mothra TriStar Pictures films Toho films
[ -0.13323190808296204, 0.1458447426557541, -0.37731513381004333, -0.37684404850006104, -0.6542409062385559, 0.18895916640758514, -0.07609349489212036, -0.10689792782068253, -0.21166446805000305, 0.3990154266357422, -0.07332009822130203, 0.24405178427696228, -0.5506246089935303, 0.3059648871...
12004
https://en.wikipedia.org/wiki/Godzilla%20%281954%20film%29
Godzilla (1954 film)
is a 1954 Japanese kaiju film directed by Ishirō Honda, with special effects by Eiji Tsuburaya. Produced and distributed by Toho Co., Ltd., it is the first film in the Godzilla franchise and the Shōwa era. The film stars Akira Takarada, Momoko Kōchi, Akihiko Hirata, and Takashi Shimura, with Haruo Nakajima and Katsumi Tezuka as Godzilla. In the film, Japan's authorities deal with the sudden appearance of a giant monster, whose attacks trigger fears of nuclear holocaust in post-war Japan. Godzilla entered production after a Japanese-Indonesian co-production collapsed. Tsuburaya originally proposed a giant octopus before the filmmakers decided on a dinosaur-inspired creature. Godzilla pioneered a form of special effects called suitmation, in which a stunt performer wearing a suit interacts with miniature sets. Principal photography ran 51 days, and special effects photography ran 71 days. Godzilla was theatrically released in Japan on November 3, 1954, and grossed during its original theatrical run. In 1956, a heavily re-edited "Americanized" version, titled Godzilla, King of the Monsters! was released in the United States. The film spawned a multimedia franchise, being recognized by Guinness World Records as the longest-running film franchise in history. The character Godzilla has since become an international pop culture icon. The film and Tsuburaya have been largely credited for establishing the template for tokusatsu media. Since its release, the film has been regarded as a cinematic achievement and one of the best monster films ever made. The film was followed by Godzilla Raids Again, released on April 24, 1955. Plot When the Japanese freighter Eiko-maru is destroyed near Odo Island, another ship—the Bingo-maru—is sent to investigate, only to meet the same fate with few survivors. A fishing boat from Odo is also destroyed, with one survivor. Fishing catches mysteriously drop to zero, blamed by an elder on the ancient sea creature known as "Godzilla".
Reporters arrive on Odo Island to further investigate. A villager tells one of the reporters that something in the sea is ruining the fishing. That evening, a storm strikes the island, destroying the reporters' helicopter, and Godzilla, briefly seen, destroys 17 homes and kills nine people and 20 of the villagers' livestock. Odo residents travel to Tokyo to demand disaster relief. The villagers' and reporters' evidence describes damage consistent with something large crushing the village. The government sends paleontologist Kyohei Yamane to lead an investigation on the island, where giant radioactive footprints and a trilobite are discovered. The village alarm bell is rung and Yamane and the villagers rush to see the monster, retreating after seeing that it is a giant dinosaur. Yamane presents his findings in Tokyo, estimating that Godzilla is 50 m tall and evolved from an ancient sea creature that became a terrestrial animal. He concludes that Godzilla has been disturbed by underwater hydrogen bomb testing. Debate ensues about notifying the public about the danger of the monster. Meanwhile, 17 ships are lost at sea. Ten frigates are dispatched to attempt to kill the monster using depth charges. The mission disappoints Yamane, who wants Godzilla to be studied. When Godzilla survives the attack, officials appeal to Yamane for ideas to kill the monster, but Yamane tells them that Godzilla is unkillable, having survived H-bomb testing, and must be studied. Yamane's daughter, Emiko, decides to break off her arranged engagement to Yamane's colleague, Daisuke Serizawa, because of her love for Hideto Ogata, a salvage ship captain. When a reporter arrives and asks to interview Serizawa, Emiko escorts the reporter to Serizawa's home. After Serizawa refuses to divulge his current work to the reporter, he gives Emiko a demonstration of his recent project on the condition that she must keep it a secret.
The demonstration horrifies her and she leaves without mentioning the engagement. Shortly after she returns home, Godzilla surfaces from Tokyo Bay and attacks Shinagawa. After attacking a passing train, Godzilla returns to the ocean. After consulting international experts, the Japanese Self-Defense Forces construct a 30 m tall, 50,000-volt electrified fence along the coast and deploy forces to stop and kill Godzilla. Dismayed that there is no plan to study Godzilla for its resistance to radiation, Yamane returns home, where Emiko and Ogata await, hoping to get his consent for them to wed. When Ogata disagrees with Yamane, arguing that the threat Godzilla poses outweighs any potential benefits from studying the monster, Yamane tells him to leave. Godzilla resurfaces, breaks through the fence with its atomic breath, and enters Tokyo, unleashing more destruction across the city. Further attempts to kill the monster with tanks and fighter jets fail and Godzilla returns to the ocean. The day after, hospitals and shelters are crowded with the maimed and the dead, with some survivors suffering from radiation sickness. Distraught by the devastation, Emiko tells Ogata about Serizawa's research, a weapon called the "Oxygen Destroyer", which disintegrates oxygen atoms and causes organisms to die of rotting asphyxiation. Emiko and Ogata go to Serizawa to convince him to use the Oxygen Destroyer, but he initially refuses, explaining that if he uses the device, the superpowers of the world will surely force him to construct more Oxygen Destroyers for use as a superweapon. After watching a television program displaying the nation's tragedy, Serizawa finally accepts their pleas. As Serizawa burns his notes, Emiko breaks down crying. A navy ship takes Ogata and Serizawa to plant the device in Tokyo Bay. After finding Godzilla, Serizawa unloads the device and cuts off his own air supply, taking the secret of the Oxygen Destroyer to his grave.
Godzilla is destroyed, but many mourn Serizawa's death. Yamane believes that if nuclear weapons testing continues, another Godzilla may rise in the future.

Cast
Akira Takarada as Hideto Ogata
Momoko Kōchi as Emiko Yamane
Akihiko Hirata as Dr. Daisuke Serizawa
Takashi Shimura as Dr. Kyohei Yamane
Fuyuki Murakami as Dr. Tanabe
Sachio Sakai as Hagiwara
Ren Yamamoto as Masaji Yamada
Toyoaki Suzuki as Shinkichi Yamada
Toranosuke Ogawa as the President of the Nankai Shipping Company
Hiroshi Hayashi as the Chairman of the Diet Committee
Seijiro Onda as Oyama, Diet Committee member
Kin Sugai as Ozawa, Diet Committee member
Kokuten Kōdō as the old fisherman
Tadashi Okabe as the assistant of Dr. Tanabe
Jiro Mitsuaki as an employee of the Nankai Salvage Company
Ren Imaizumi as a radio officer of the Nankai Salvage Company
Sokichi Maki as the chief at the Maritime Safety Agency
Kenji Sahara as a partygoer
Haruo Nakajima as Godzilla and a reporter
Katsumi Tezuka as Godzilla and a newspaper deskman
Cast taken from Japan's Favorite Mon-Star.

Themes
In the film, Godzilla symbolizes nuclear holocaust from Japan's perspective and has since been culturally identified as a strong metaphor for nuclear weapons. Producer Tomoyuki Tanaka stated that, "The theme of the film, from the beginning, was the terror of the bomb. Mankind had created the bomb, and now nature was going to take revenge on mankind." Director Ishirō Honda filmed Godzilla's Tokyo rampage to mirror the atomic bombings of Hiroshima and Nagasaki, stating, "If Godzilla had been a dinosaur or some other animal, he would have been killed by just one cannonball. But if he were equal to an atomic bomb, we wouldn't know what to do. So, I took the characteristics of an atomic bomb and applied them to Godzilla." On March 1, 1954, just a few months before the film was made, the Japanese fishing vessel Daigo Fukuryū Maru ("Lucky Dragon No. 5") had been showered with radioactive fallout from the U.S.
military's 15-megaton "Castle Bravo" hydrogen bomb test at nearby Bikini Atoll. The boat's catch was contaminated, spurring a panic in Japan about the safety of eating fish, and the crew was sickened, with one crew member eventually dying of radiation sickness. The incident led to the emergence of a large and enduring anti-nuclear movement that gathered 30 million signatures on an anti-nuclear petition by August 1955 and eventually became institutionalized as the Japan Council against Atomic and Hydrogen Bombs. The film's opening scene of Godzilla destroying a Japanese vessel is a direct reference to these events and had a strong impact on Japanese viewers, for whom the incident was still fresh in memory. Academics Anne Allison, Thomas Schnellbächer, and Steve Ryfle have said that Godzilla contains political and cultural undertones that can be attributed to what the Japanese had experienced in World War II and that Japanese audiences were able to connect emotionally to the monster. They theorized that these viewers saw Godzilla as a victim and felt that the creature's backstory reminded them of their experiences in World War II. These academics have also claimed that, as the atomic bomb testing that woke Godzilla was carried out by the United States, the film can in a way be seen to blame the United States for the problems and struggles that Japan experienced after World War II had ended. They also felt that the film could have served as a cultural coping method to help the people of Japan move on from the events of the war. Brian Merchant from Motherboard called the film "a bleak, powerful metaphor for nuclear power that still endures today" and on its themes, he stated: "It's an unflinchingly bleak, deceptively powerful film about coping with and taking responsibility for incomprehensible, manmade tragedy. Specifically, nuclear tragedies.
It's arguably the best window into post-war attitudes towards nuclear power we've got—as seen from the perspective of its greatest victims." Terrence Rafferty from The New York Times said Godzilla was "an obvious gigantic, unsubtle, grimly purposeful metaphor for the atomic bomb" and felt the film was "extraordinarily solemn, full of earnest discussions". Mark Jacobson from the website of New York magazine said that Godzilla "transcends humanist prattle. Very few constructs have so perfectly embodied the overriding fears of a particular era. He is the symbol of a world gone wrong, a work of man that once created cannot be taken back or deleted. He rears up out of the sea as a creature of no particular belief system, apart from even the most elastic version of evolution and taxonomy, a reptilian id that lives inside the deepest recesses of the collective unconscious that cannot be reasoned with, a merciless undertaker who broaches no deals." Regarding the film, Jacobson stated, "Honda's first Godzilla... is in line with these inwardly turned post-war films and perhaps the most brutally unforgiving of them. Shame-ridden self-flagellation was in order, and who better to supply the rubber-suited psychic punishment than the Rorschach-shaped big fella himself?" Tim Martin from The Daily Telegraph (London) said that the original 1954 film was "a far cry from its B-movie successors. It was a sober allegory of a film with ambitions as large as its thrice-normal budget, designed to shock and horrify an adult audience. Its roster of frightening images—cities in flames, overstuffed hospitals, irradiated children—would have been all too familiar to cinemagoers for whom memories of Hiroshima and Nagasaki were still less than a decade old, while its script posed deliberately inflammatory questions about the balance of postwar power and the development of nuclear energy." 
Martin also commented on how the film's themes were omitted from the American version, stating, "Its thematic preoccupation with nuclear energy proved even less acceptable to the American distributors who, after buying the film, began an extensive reshoot and recut for Western markets."

Production
Crew
Ishirō Honda – director, co-writer
Eiji Tsuburaya – special effects director
Kōji Kajita – assistant director
Teruo Maki – production manager
Choshiro Ishii – lighting
Takeo Kita – chief art director
Satoshi Chuko – art director
Akira Watanabe – special effects art director
Kuichirō Kishida – special effects lighting
Teizō Toshimitsu – monster builder
Hisashi Shimonaga – sound recording
Ichiro Minawa – sound and musical effects
Personnel taken from Japan's Favorite Mon-Star.

Development
In 1954, Toho originally planned to produce In the Shadow of Glory, a Japanese-Indonesian co-production about the aftermath of the Japanese occupation of Indonesia. However, anti-Japanese sentiment in Indonesia put political pressure on the government to deny visas to the Japanese filmmakers. The film was to be co-produced with Perfini, filmed on location in Jakarta in color (a first for a major Toho production), and was intended to open markets for Japanese films in Southeast Asia. Producer Tomoyuki Tanaka flew to Jakarta to renegotiate with the Indonesian government but was unsuccessful, and on the flight back to Japan he conceived the idea for a giant monster film inspired by the 1953 film The Beast from 20,000 Fathoms and the Daigo Fukuryū Maru incident of March 1954. The film's opening sequence is a direct reference to the incident. Tanaka felt the film had potential because nuclear fears were generating news and monster films were becoming popular, owing to the financial success of The Beast from 20,000 Fathoms and the 1952 re-release of King Kong, the latter of which earned more money than its previous releases.
During his flight, Tanaka wrote an outline with a working title and pitched it to executive producer Iwao Mori. Mori approved the project in mid-April 1954 after special effects director Eiji Tsuburaya agreed to do the film's effects and confirmed that the film was financially feasible. Mori also felt the project was a perfect vehicle for Tsuburaya and a way to test the storyboarding system he had instituted at the time. Mori approved Tanaka's choice of Ishirō Honda as director, shortened the production's title to Project G (G for Giant), gave the production classified status, and ordered Tanaka to minimize his attention to other films and focus mainly on Project G. Toho originally intended for Senkichi Taniguchi to direct the film, as he had been attached to direct In the Shadow of Glory; however, Taniguchi declined the assignment. Honda was not Toho's first choice for the film's director, but his wartime experience made him an ideal candidate for the film's anti-nuclear themes. Several other directors passed on the project, feeling the idea was "stupid"; Honda, however, accepted the assignment because of his interest in science and "unusual things", stating, "I had no problem taking it seriously." It was during the production of Godzilla that Honda worked with assistant director Kōji Kajita for the first time. Afterwards, Kajita would go on to collaborate with Honda as his chief assistant director on 17 films over the course of 10 years. Because sci-fi films lacked respect from film critics, Honda, Tanaka, and Tsuburaya agreed to depict a monster attack as if it were a real event, with the serious tone of a documentary.

Writing
Tsuburaya submitted an outline of his own, written three years prior to Godzilla; it featured a giant octopus attacking ships in the Indian Ocean. In May 1954, Tanaka hired sci-fi writer Shigeru Kayama to write the story. Only 50 pages long and written in 11 days, Kayama's treatment depicted Dr.
Yamane wearing dark shades, a cape, and living in a European-style house from which he only emerged at night. Godzilla was portrayed as more animal-like, coming ashore to feed on animals, with an ostensibly gorilla-like interest in females. Kayama's story also featured less destruction and borrowed a scene from The Beast from 20,000 Fathoms by having Godzilla attack a lighthouse. Takeo Murata and Honda co-wrote the screenplay in three weeks, confining themselves to a Japanese inn in Tokyo's Shibuya ward. On writing the script, Murata stated, "Director Honda and I... racked our brains to make Mr. Kayama's original treatment into a full, working vision." Murata said that Tsuburaya and Tanaka pitched their ideas as well. Tanaka requested that they not spend too much money, while Tsuburaya encouraged them to "do whatever it takes to make it work". Murata and Honda redeveloped key characters and elements by adding Emiko's love triangle. In Kayama's story, Serizawa was depicted as merely a colleague of Dr. Yamane's. Godzilla's full appearance was to be revealed during the Odo Island hurricane, but Honda and Murata opted to show parts of the creature as the film built up to its full reveal. Honda and Murata also introduced the characters Hagiwara and Dr. Tanabe in their draft, while the role of Shinkichi, who had a substantial part in Kayama's story, was cut down. A novelization, written by Kayama, was published by Iwatani Bookstore on October 25, 1954.

Creature design
Godzilla was designed by Teizō Toshimitsu and Akira Watanabe under Eiji Tsuburaya's supervision. Early on, Tanaka contemplated having the monster be gorilla-like or whale-like in design, due to the name "Gojira" (a combination of gorira, the Japanese word for "gorilla", and kujira, the word for "whale"), but eventually settled on a dinosaur-like design.
Kazuyoshi Abe was hired earlier to design Godzilla, but his ideas were rejected because his Godzilla looked too humanoid and mammalian, with a head shaped like a mushroom cloud; Abe was nevertheless retained to help draw the film's storyboards. Toshimitsu and Watanabe decided to base Godzilla's design on dinosaurs and, using dinosaur books and magazines as reference, combined elements of a Tyrannosaurus, an Iguanodon, and the dorsal fins of a Stegosaurus. Despite wanting to use stop-motion animation, Tsuburaya reluctantly settled on suitmation. Toshimitsu sculpted three clay models on which the suit would be based. The first two were rejected, but the third was approved by Tsuburaya, Tanaka, and Honda. The Godzilla suit was constructed by Kanji Yagi, Koei Yagi, and Eizo Kaimai, who used thin bamboo sticks and wire to build a frame for the interior of the suit, added metal mesh and cushioning over it to bolster its structure, and finally applied coats of latex. Coats of molten rubber were additionally applied, followed by carved indentations and strips of latex glued onto the surface of the suit to create Godzilla's scaly hide. This first version of the suit weighed 100 kilograms (220 pounds). For close-ups, Toshimitsu created a smaller-scale, mechanical, hand-operated puppet that sprayed streams of mist from its mouth to act as Godzilla's atomic breath. Haruo Nakajima and Katsumi Tezuka were chosen to perform in the Godzilla suit due to their strength and endurance. At the first costume fitting, Nakajima fell down while inside the suit, due to the heavy latex and inflexible materials used to create it. This first version of the suit was cut in half and used for scenes requiring only partial shots of Godzilla or close-ups, with the lower half fitted with rope suspenders for Nakajima to wear.
For full-body shots, a second identical suit was created, which was made lighter than the first, but Nakajima could still remain inside it for only three minutes before passing out. Nakajima lost 20 pounds during the production of the film. He would go on to portray Godzilla and other monsters until his retirement in 1972. Tezuka also filmed scenes in the Godzilla suit but, due to his older body, was unable to fully commit to the physical demands of the role; as a result, few of his scenes were considered usable enough to make the final cut. Tezuka filled in for Nakajima when he was unavailable or needed relief from the physically demanding role. Godzilla's name was also a source of consternation for the filmmakers. Because the monster had no name, the first draft of the film was not called Gojira but rather titled G, also known as Kaihatsu keikaku G ("Development Plan G"); the "G" of the title stood for "Giant". Nakajima confirmed that Toho held a contest to name the monster, which was eventually named Gojira. One explanation, generally chalked up to legend, is that a hulking Toho Studios employee's physical attributes led him to be nicknamed Gojira. In a 1998 BBC documentary on Godzilla, Kimi Honda, the director's widow, dismissed the employee-name story as a tall tale, believing that Honda, Tanaka, and Tsuburaya gave "considerable thought" to the name of the monster, stating, "the backstage boys at Toho loved to joke around with tall stories, but I don't believe that one". In 2003, a Japanese television special claimed to have identified the anonymous hulking Toho employee as Shiro Amikura, a Toho contract actor from the 1950s.

Special effects
The film's special effects were directed by Eiji Tsuburaya. In order for the effects footage to sync with the live-action footage, Honda and Tsuburaya would develop plans early in development and meet briefly before each day's shoot.
Kajita would shuttle Tsuburaya to Honda's set to observe how a scene was being shot and where the actors were being positioned. Kajita also ushered Honda to the effects stage to observe how Tsuburaya was shooting certain effects. While Honda edited the live-action footage, he left blank leaders for Tsuburaya to insert the effects footage. At times, Honda had to cut out certain effects footage. Tsuburaya disapproved of these decisions because Honda's cuts did not match the effects; however, Honda had final say in these matters. Tsuburaya originally wanted to use stop motion for the film's special effects but realized it would have taken seven years to complete with the then-current staff and infrastructure at Toho. Settling on suitmation and miniature effects, Tsuburaya and his crew scouted the locations Godzilla was to destroy and were nearly arrested after a security guard overheard their plans for destruction, but they were released after showing police their Toho business cards. Kintaro Makino, the chief of miniature construction, was given blueprints for the miniatures by Akira Watanabe and assigned 30 to 40 workers from the carpentry department to build them; it took a month to build the scaled-down version of Ginza. A majority of the miniatures were built at 1:25 scale, but the Diet Building was scaled down to 1:33 to look smaller than Godzilla. While it proved too expensive to use stop motion extensively throughout the picture, the final film did include a stop-motion scene of Godzilla's tail destroying the Nichigeki Theater building. The buildings' frameworks were made of thin wooden boards reinforced with a mixture of plaster and white chalk. Explosives were installed inside miniatures that were to be destroyed by Godzilla's atomic breath, while some were sprayed with gasoline to make them burn more easily; others included small cracks so they could crumble easily.
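The mismatched Diet Building scale follows from simple proportion: at the common 1:25 scale, the 50 m Godzilla corresponds to a roughly actor-sized 2 m suit, so any real structure taller than 50 m would tower over it on screen unless built at a smaller scale. A minimal sketch of the arithmetic, assuming a real-world Diet Building height of about 65 m (that figure is an assumption for illustration; only the 50 m height, the 1:25 scale, and the 1:33 scale come from the accounts above):

```python
# Scale arithmetic behind the mixed miniature scales.
GODZILLA_HEIGHT_M = 50.0        # Godzilla's in-film height, per Yamane's estimate
DIET_BUILDING_HEIGHT_M = 65.45  # assumed real-world height, not from the text

def model_height(real_height_m, scale_denominator):
    """Height in metres of a miniature built at 1:scale_denominator."""
    return real_height_m / scale_denominator

godzilla_suit = model_height(GODZILLA_HEIGHT_M, 25)       # 2.0 m -- roughly actor-sized
diet_at_1_25 = model_height(DIET_BUILDING_HEIGHT_M, 25)   # ~2.62 m -- taller than the suit
diet_at_1_33 = model_height(DIET_BUILDING_HEIGHT_M, 33)   # ~1.98 m -- just under Godzilla

print(f"suit={godzilla_suit:.2f}m, diet@1:25={diet_at_1_25:.2f}m, diet@1:33={diet_at_1_33:.2f}m")
```

Built at the standard 1:25 scale, the Diet Building miniature would have stood taller than the suit; shrinking it to 1:33 keeps Godzilla visibly the tallest thing in the frame.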
Optical animation techniques were used for Godzilla's glowing dorsal fins, with hundreds of cels drawn frame by frame. Haruo Nakajima perspired inside the suit so much that the Yagi brothers had to dry out the cotton lining every morning and sometimes re-line the interior of the suit and repair damage. The typhoon waves were created by stagehands who overturned barrels of water into a water tank where the miniature Odo Island shoreline was built. Multiple composition shots were used for the Odo Island scenes, most of which were filmed near rice fields. Toho hired part-time employees en masse to work on the film's optical effects; half of the 400 hired staff were part-timers with little to no experience. An early version of Godzilla's full reveal was filmed that featured Godzilla, via a hand-operated puppet, devouring a cow. Sadamasa Arikawa thought the scene was too gruesome and convinced Tsuburaya to re-film it. Optical effects were used for Godzilla's footprints on the beach by painting them onto glass that was then inserted into an area of the live-action footage. Special effects photography lasted 71 days.

Filming
On the first day of filming, Honda addressed a crew of 30, telling them to read the script and leave the project if they were not convinced, as he wanted to work only with those who had confidence in him and the film. Most of the film was shot on the Toho lot. Honda's team also filmed on location on the Shima Peninsula in Mie Prefecture for the Odo Island scenes, using 50 Toho extras and establishing their base in the town of Toba. Local villagers were also used as extras for the Odo Island scenes. The dance ritual scene was filmed on location in Mie Prefecture, with local villagers performing as the dancers. The cast and crew commuted every morning by boat to Toba, Mie, working in harsh weather conditions.
Honda worked shirtless and, as a result, suffered blistering sunburn on his back that left permanent scars. Toho had negotiated with the Japan Self-Defense Forces (JSDF) to film scenes requiring the military and filmed target practices and drills for the film; Honda's team followed a convoy of JSDF vehicles for the convoy dispatch scene. Two thousand students from an all-girls high school were used for the prayer-for-peace scene. Beyond this, the filmmakers had little cooperation from the JSDF and had to rely on World War II stock footage, provided by the Japanese military, for certain scenes. The stock footage was sourced from 16mm prints. Honda's team spent 51 days shooting the film.

Music and sound effects
The film's score was composed by Akira Ifukube. After meeting with Tanaka, Tsuburaya, and Honda, Ifukube enthusiastically accepted the job upon learning that the main character was a monster, saying, "I couldn't sit still when I heard that in this movie the main character was a reptile that would be rampaging through the city." Ifukube was not shown the final film and had only a week to compose his music. Within that time, he was shown only a model of Godzilla and the screenplay. Tsuburaya briefly showed Ifukube some footage, albeit with the effects missing, while attempting to describe how the scenes would unfold. Ifukube recalled, "I was very confused. So I tried to make music that would remind you of something enormous." Ifukube used low-pitched brass and string instruments. It was Honda's idea to make Godzilla roar, despite the fact that reptiles do not have vocal cords. Shimonaga and Minawa were originally tasked with creating the roar; however, Ifukube became involved after taking an interest in creating sound effects. Ifukube discussed with Honda what types of sounds would be used in certain scenes, along with other details concerning sound. Minawa went to the zoo and recorded various animal roars, which were played back at various speeds.
However, these sounds proved unsatisfactory and went unused. Ifukube borrowed a contrabass from the Japan Art University's music department and created Godzilla's roar by loosening the strings and rubbing them with a leather glove. The sound was recorded and played back at a reduced speed, which achieved the effect of the roar used in the film. This technique was adopted by Toho as a standard method for creating monster roars in the following years. There are conflicting reports as to how Godzilla's footsteps were created. One claim states that they were created with a knotted rope hitting a kettle drum, recorded and processed through an echo box. However, Ifukube told Cult Movies that the footsteps were created using a primitive amplifier that made a loud clap when struck. Some Japanese texts claim that the footsteps were sourced from an explosion with the ending clipped off, processed through an electronic reverb unit. The optical recording equipment contained four audio tracks: one for principal dialogue; one for background chatter and ambient noises; one for tanks and planes; and one for the roars and footsteps. Keeping the tracks independent prevented sounds from bleeding into one another. The music and sound effects of Godzilla's rampage were recorded live simultaneously: while Ifukube conducted the NHK Philharmonic orchestra, a foley artist watched Godzilla's rampage projected on a screen and used tin, concrete debris, wood, and other equipment to simulate sounds in sync with the footage. A new take was needed whenever the foley artist missed a cue. Many of Ifukube's themes and motifs associated with Godzilla were introduced in the film, such as the March, the Horror theme, and the Requiem. The "Self Defense Force March" became so synonymous with Godzilla that Ifukube later referred to it as "Godzilla's theme." Ifukube considered his music for the film his finest film score.
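The slowed-playback trick behind the roar relies on a basic property of recorded audio: replaying the same samples at a reduced rate lowers every frequency and stretches the duration by the same factor. A minimal sketch using Python's standard wave module, where a sine tone stands in for the recorded contrabass scrape (the 8 kHz rate, 220 Hz tone, and half-speed factor are illustrative assumptions, not production details):

```python
import io
import math
import struct
import wave

RATE = 8000       # assumed original recording rate (Hz)
SLOWDOWN = 0.5    # play back at half speed

# One second of a 220 Hz tone, standing in for the recorded contrabass sound.
samples = [math.sin(2 * math.pi * 220 * n / RATE) for n in range(RATE)]

def write_wav(frames, framerate):
    """Pack float samples into a 16-bit mono WAV at the given playback rate."""
    buf = io.BytesIO()
    with wave.open(buf, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(framerate)
        w.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in frames))
    return buf.getvalue()

original = write_wav(samples, RATE)
slowed = write_wav(samples, int(RATE * SLOWDOWN))  # identical frames, lower playback rate

# The slowed file tells the player to read the same data at half the rate:
# duration doubles and every frequency drops by an octave.
duration_s = len(samples) / (RATE * SLOWDOWN)  # 2.0 seconds instead of 1.0
pitch_hz = 220 * SLOWDOWN                      # 110 Hz, one octave down
```

Half-speed playback is the simplest case: the pitch drops exactly one octave, which is why a bowed contrabass, already the lowest string instrument, could be pushed down into a register no acoustic instrument produces.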
Release
Marketing
During production, Mori devised promotional strategies to generate public interest. Among these was a radio play: 11 episodes were produced based on the screenplay and aired on Saturdays on the NHK radio network from July 17, 1954 to September 25, 1954. In an attempt to build mystery, Mori banned reporters from the set and kept the special effects techniques and other behind-the-scenes crafts secret; Nakajima's suit performance as Godzilla would not be revealed until the 1960s. Godzilla's image, however, was widely publicized: it was added to the company stationery, cut-out pictures and posters were displayed in theaters and stores, large advertisement balloons were flown to major Japanese cities, and a Godzilla doll was mounted onto a truck and driven around Tokyo. The film's theatrical trailer debuted in theaters on October 20, 1954.

Theatrical
Godzilla was first released in Nagoya on October 27, 1954, and released nationwide on November 3, 1954. The film set a new opening-day record for any Toho film, selling 33,000 tickets at Toho's cinemas in Tokyo and selling out at the Nichigeki Theater. As a result, Toho's CEO personally called Honda to congratulate him; Honda's wife, Kimi, noted "that sort of thing didn't usually happen." From 1955 into the 1960s, Godzilla played in theaters catering to Japanese-Americans in predominantly Japanese neighborhoods in the United States. An English-subtitled version was shown at film festivals in New York, Chicago, and other cities in 1982. An 84-minute cut of the Japanese version was theatrically released in West Germany on April 10, 1956, as Godzilla. This version removes the Japanese Parliament argument, the acknowledgement of Godzilla as a "child of the H-bomb", and references to Hiroshima and Nagasaki, and alters the translation of the mother holding her children.
The film was re-released theatrically in Japan on November 21, 1982, to commemorate Toho's 50th anniversary. The 1954 film remained officially unavailable in the United States until 2004. To coincide with Godzilla's 50th anniversary, art-house distributor Rialto Pictures gave the film a traveling, coast-to-coast limited release across the United States, beginning May 7, 2004. It ran uncut with English subtitles until December 19, 2004, never playing on more than six screens at any given point. The film played in roughly sixty theaters and cities across the United States during its limited run. On April 18, 2014, Rialto re-released the film in the United States, coast-to-coast, using another limited traveling tour. This coincided not only with Godzilla's 60th anniversary but also with the American Godzilla film released that same year. To avoid confusion with the Hollywood feature, the Rialto release was subtitled The Japanese Original. It was screened in 66 theaters in 64 cities from April 18 to October 31, 2014. For its 67th anniversary, a 4K remaster of the film, along with other Godzilla films, was screened in Alamo Drafthouse Cinema locations on November 3, 2021.

American version
Following the film's success in Japan, Toho sold the American rights to Joseph E. Levine for $25,000. A heavily altered version of the film was released in the United States and worldwide as Godzilla, King of the Monsters! on April 27, 1956. This version trimmed the original down to 80 minutes and featured new footage of Canadian actor Raymond Burr interacting with body doubles, mixed with Honda's footage to make it seem as if he were part of the original Japanese production. Many of the film's political themes were trimmed or removed completely.
It was this version of the original Godzilla film that introduced audiences worldwide to the character and franchise, and it was the only version that critics and scholars had access to until 2004, when the 1954 film was released in select theaters in North America. Godzilla, King of the Monsters! grossed $2 million during its theatrical run, more than the 1954 film grossed in Japan. Honda was unaware that Godzilla had been re-edited until Toho released Godzilla, King of the Monsters! in Japan in May 1957 as Monster King Godzilla. Toho converted the entire film from its original aspect ratio to a 2.35:1 widescreen format, which resulted in an awkward crop throughout. Japanese subtitles were provided for the Japanese actors, since their scenes had been dubbed into English with dialogue that differed greatly from the original script. Since the film's release, Toho has adopted the moniker "King of the Monsters" for Godzilla, which has since appeared in official marketing, advertisement, and promotional materials.

Home media
The 1956 Godzilla, King of the Monsters! version of the film was released on DVD by Simitar in 1998 and Classic Media in 2002. In 2005, the BFI released the original Japanese version in the UK theatrically, and later the same year on DVD. The DVD includes the original mono track and several extra features, such as documentaries and commentary tracks by film historians Steve Ryfle, Ed Godziszewski, and Keith Aiken. The DVD also includes a documentary about the Daigo Fukuryū Maru, a Japanese fishing boat that was caught in an American nuclear blast and partially inspired the creation of the film. In 2006, Classic Media released a two-disc DVD set titled Gojira: The Original Japanese Masterpiece. This release features both the 1954 Japanese version and the 1956 American version, making the Japanese version of the film available on DVD in North America for the first time.
This release features trailers and audio commentaries for both films by Ryfle and Godziszewski (separate from the BFI commentaries), two 13-minute documentaries titled "Godzilla Story Development" and "Making of the Godzilla Suit", and a 12-page essay booklet by Ryfle. This release also restores the original ending credits of the American film, which, until recently, were thought to have been lost. In 2009, Classic Media released Godzilla on Blu-ray. This release includes the same special features as the 2006 Classic Media DVD release, but does not feature the 1956 American version. In 2012, the Criterion Collection released a "new high-definition digital restoration" of Godzilla on Blu-ray and DVD. This release includes a remaster of the 1956 American version, Godzilla, King of the Monsters!, as well as other special features such as interviews with Akira Ifukube, Japanese film critic Tadao Sato, actor Akira Takarada, Godzilla performer Haruo Nakajima, and effects technicians Yoshio Irie and Eizo Kaimai, plus audio commentaries on both films by film historian David Kalat. In 2014, Classic Media reissued Godzilla and Godzilla, King of the Monsters! on DVD to commemorate the release of Legendary's Godzilla film. This release retained the same specs and features as the 2006 DVD release. In 2017, Janus Films and the Criterion Collection acquired the film, as well as other Godzilla titles, to stream on Starz and FilmStruck. In 2019, the film and the American version were included in the Godzilla: The Showa Era Films Blu-ray box set released by the Criterion Collection, which included all 15 films from the franchise's Shōwa era. In May 2020, the film became available on HBO Max upon its launch.

Reception
Box office
During its initial Japanese theatrical run, the film was the eighth best-attended film in Japan that year.
During its 2004 limited theatrical release in North America, the film grossed $38,030 on its opening weekend and grossed $412,520 by the end of its limited run. For the 2014 limited re-release in North America, it grossed $10,903 after playing in one theater in New York and grossed $150,191 at the end of its run. In the United Kingdom, the film sold 3,643 tickets from limited releases during 2005–2006 and 2016–2017. Critical response in Japan Prior to the release of the film, skeptics predicted the film would flop. At the time of the film's release, Japanese reviews were mixed. Japanese critics accused the film of exploiting the widespread devastation that the country had suffered in World War II, as well as the Daigo Fukuryū Maru incident that occurred a few months before filming began. Ishiro Honda lamented years later in the Tokyo Journal, "They called it grotesque junk, and said it looked like something you'd spit up. I felt sorry for my crew because they had worked so hard!" Others said that depicting a fire-breathing organism was strange. Honda also believed Japanese critics began to change their minds after the good reviews the film received in the United States. He stated, "The first film critics to appreciate Godzilla were those in the U.S. When Godzilla was released there as Godzilla, King of the Monsters! in 1956, the critics said such things as, 'For the start, this film frankly depicts the horrors of the Atomic Bomb', and by these evaluations, the assessment began to impact critics in Japan and has changed their opinions over the years." As time went on, the film gained more respect in its home country. In 1984, Kinema Junpo magazine listed Godzilla as one of the top 20 Japanese films of all time, while a survey of 370 Japanese film critics published in Nihon Eiga Besuto 150 (Best 150 Japanese Films) had Godzilla ranked as the 27th best Japanese film ever made. The film was nominated for two Japanese Movie Association awards. 
One was for best special effects and the other for best film. It won best special effects but lost best picture to Akira Kurosawa's Seven Samurai. Critical response internationally Godzilla received generally positive reviews from critics. On review aggregator Rotten Tomatoes, the film has an approval rating of 93% based on 74 reviews, with an average score of 7.60/10. The site's consensus states, "More than straight monster-movie fare, Gojira offers potent, sobering postwar commentary". On Metacritic, the film has a score of 78/100, based on 20 critics, indicating "generally favorable reviews". Owen Gleiberman from Entertainment Weekly noted the film is more "serious" than the 1956 American cut, yet "its tone just veers closer to that of solemn American B-horror cheese like Them! The real difference is that the film's famous metaphor for the bombing of Hiroshima and Nagasaki looks more nuttily masochistic than ever." Luke Y. Thompson from Dallas Observer defended the film's effects as products of their time and felt that viewers would be "surprised by what they see", stating, "This ain't your standard goofy monster rampage." Peter Bradshaw from The Guardian awarded the film four stars out of five, praising the storytelling as "muscular" and the nuclear themes as "passionate and fascinatingly ambiguous", stating, "the sheer fervency of this film takes it beyond the crash-bang entertainment of most blockbusters, ancient and modern." David Nusair from Reel Film Reviews awarded the film one and a half stars out of four, saying the film turns into a "terminally erratic narrative that's more dull than engrossing." Nusair criticized Honda for his "inability to offer up even a single compelling human character" and found the film's ending "anticlimactic and pointless", concluding, "the film is entirely lacking in elements designed to capture and hold the viewer's ongoing attention." 
Roger Ebert from the Chicago Sun-Times awarded the film one and a half stars out of four, stating, "regaled for 50 years by the stupendous idiocy of the American version of Godzilla, audiences can now see the original Japanese version, which is equally idiotic, but, properly decoded, was the Fahrenheit 9/11 of its time." Ebert criticized the effects as looking "crude", feeling the effects of the 1933 film King Kong were "more convincing" and concluded that "This is a bad movie, but it has earned its place in history." Keith Uhlich from Time Out awarded the film four stars out of five, calling the film "Pop Art as purge", and praising the film's characters, themes, and Godzilla as a "potent and provocative metaphor, a lumbering embodiment of atomic-age anxieties birthed from mankind's own desire to destroy." Desson Thomson from the Washington Post called the film's effects "pretty extraordinary" and "amazingly credible" for their time. Thomson felt some of the acting was "ham-handed" but said "there's a surprisingly powerful thrust to this film." Mick LaSalle from the San Francisco Chronicle called the film a "classic", stating, "Such moments go beyond spectacle. Godzilla is a collective metaphor and a collective nightmare, a message film that says more than its message, that captures, with a horrified poetry, the terrors that stomped through the minds of people 50 years ago." Since its release, Godzilla has been regarded as one of the best giant monster films ever made: critic Allen Perkins called the film "not just a classic monster movie, but also an important cinematic achievement." In 2010, the film was ranked No. 31 in Empire magazine's "100 Best Films Of World Cinema". In 2013, Rolling Stone ranked the film No. 1 on their "Best Monster Movies of All Time" list. In 2015, Variety listed the film amongst their "10 Best Monster Movies of All-Time" list. In 2019, Time Out Film ranked the film No. 9 on their "50 best monster movies" list. 
Accolades In 1954, Eiji Tsuburaya won the Japanese Film Technique award for the film's special effects. In 2007, Classic Media's DVD release of the film won "Best DVD of 2006" by the Rondo Hatton Classic Horror Awards and Best DVD Classic Film Release by the Saturn Awards. Legacy The film spawned a multimedia franchise consisting of 33 films in total, video games, books, comics, toys and other media. The Godzilla franchise has been recognized by Guinness World Records as being the longest-running film franchise in history. Since his debut, Godzilla has become an international pop culture icon, inspiring countless rip-offs, imitations, parodies and tributes. The 1954 film and its special effects director Eiji Tsuburaya have been largely credited for establishing the template for tokusatsu, a technique of practical special effects filmmaking that became essential to Japan's film industry after the release of Godzilla. Critic and scholar Ryusuke Hikawa said: "Disney created the template for American animation. In the same way, (special-effects studio) Tsuburaya created the template for the Japanese movie business. It was their use of cheap but craftsman-like approaches to movie-making that made tokusatsu unique." Steven Spielberg cited Godzilla as an inspiration for Jurassic Park (1993), specifically Godzilla, King of the Monsters! (1956), which he grew up watching. American films In 1998, TriStar Pictures released a reimagining, titled Godzilla, directed by Roland Emmerich. Emmerich wanted his Godzilla to have nothing to do with Toho's Godzilla but chose to retain key elements from the 1954 film, stating, "We took part of [the original movie's] basic storyline, in that the creature becomes created by radiation and it becomes a big challenge. But that's all we took." In 2014, Warner Bros. and Legendary Pictures released a reboot, also titled Godzilla, directed by Gareth Edwards. 
Edwards stated that his film was inspired by the 1954 film, and attempted to retain some of its themes, stating, "Godzilla is a metaphor for Hiroshima in the original movie. We tried to keep that, and there are a lot of themes from the '54 movie that we've kept." Notes References Sources External links Official Godzilla website by Toho Godzilla: Poetry After the A-Bomb an essay by J. Hoberman at the Criterion Collection 1954 films 1954 horror films 1950s monster movies 1950s political films 1950s science fiction horror films Japanese films Japanese-language films Japanese disaster films Japanese black-and-white films Japanese horror films Japanese political films Japanese science fiction films Apocalyptic films Anti-war films Natural horror films Films about nuclear war and weapons Films directed by Ishirō Honda Films produced by Tomoyuki Tanaka Films scored by Akira Ifukube Films set in Tokyo Films set on fictional islands Films shot in Japan Films shot in Tokyo Films using stop-motion animation Godzilla films Japanese science fiction horror films Toho films
https://en.wikipedia.org/wiki/The%20Return%20of%20Godzilla
The Return of Godzilla
The Return of Godzilla is a 1984 Japanese kaiju film directed by Koji Hashimoto, with special effects by Teruyoshi Nakano. The film features the fictional monster character Godzilla. Distributed by Toho and produced under their subsidiary Toho Pictures, it is the 16th film in the Godzilla franchise, and is the last film to be produced in the Shōwa era. In Japan, the film was followed by Godzilla vs. Biollante in 1989. The Return of Godzilla stars Ken Tanaka, Yasuko Sawaguchi, Yosuke Natsuki, and Keiju Kobayashi, with Kenpachiro Satsuma as Godzilla. The film serves as both a sequel to the original 1954 film and a reboot of the franchise that ignores the events of every Shōwa era film aside from the original Godzilla, placing itself in line with the darker tone and themes of the original film and returning Godzilla to his destructive, antagonistic roots. The film was released theatrically in Japan on December 15, 1984. The following year, in the United States, New World Pictures released Godzilla 1985, a heavily re-edited American adaptation of the film which includes additional footage and features Raymond Burr reprising his role from the 1956 film Godzilla, King of the Monsters!. Plot The Japanese fishing vessel Yahata-Maru is caught in strong currents off the shores of Daikoku Island. As the boat drifts into shore, the island begins to erupt, and a giant monster lifts itself out of the volcano. A few days later, reporter Goro Maki is sailing in the area and finds the vessel intact but deserted. As he explores the vessel, he finds all the crew dead except for Hiroshi Okumura, who has been badly wounded. Suddenly, a giant Shockirus sea louse attacks him, but he is saved by Okumura. In Tokyo, Okumura realizes by looking at pictures that the monster he saw was a new Godzilla. Maki writes an article about the account, but the news of Godzilla's return is kept secret and his article is withheld. Maki visits Professor Hayashida, whose parents were lost in the 1954 Godzilla attack. 
Hayashida describes Godzilla as a living, invincible nuclear weapon able to cause mass destruction. At Hayashida's laboratory, Maki meets Okumura's sister, Naoko, and informs her that her brother is alive and at the police hospital. A Soviet submarine is destroyed in the Pacific. The Soviets believe the attack was perpetrated by the Americans, and a diplomatic crisis ensues, which threatens to escalate into nuclear war. The Japanese intervene and reveal that Godzilla was behind the attacks. The Japanese cabinet meets to discuss Japan's defense. A new weapon is revealed, the Super X, a specially-armored flying fortress that will defend the capital. The Japanese military is put on alert. Godzilla attacks the Ihama nuclear power plant in Shizuoka Prefecture. While feeding off the reactor, he is distracted by a flock of birds and leaves the facility. Hayashida believes that Godzilla was distracted instinctively by a homing signal from the birds. Hayashida, together with geologist Minami, proposes to the Japanese Cabinet that Godzilla could be lured back to Mount Mihara on Ōshima Island by a similar signal, and a volcanic eruption could be started, capturing Godzilla. Prime Minister Mitamura meets with Soviet and American envoys and declares that nuclear weapons will not be used on Godzilla, even if he were to attack the Japanese mainland. Meanwhile, the Soviets have their own plans to counter the threat posed by Godzilla, and a Soviet control ship disguised as a freighter in Tokyo Harbor prepares to launch a nuclear missile from one of their orbiting satellites should Godzilla attack. Godzilla is sighted at dawn in Tokyo Bay heading towards Tokyo, causing mass evacuations. The JASDF attacks Godzilla but fails to stop his advance on the city. Godzilla soon emerges and makes short work of the JSDF stationed there. The battle causes damage to the Soviet ship and starts a missile launch countdown. The captain dies as he attempts to stop the missile from launching. 
Godzilla proceeds towards Shinjuku, wreaking havoc along the way. Godzilla is confronted by four laser-armed trucks and the Super X. Because Godzilla's heart is similar to a nuclear reactor, the cadmium shells that are fired into his mouth by the Super X seal and slow down his heart, knocking Godzilla unconscious. The countdown ends and the Soviet missile is launched, but it is destroyed by an American counter-missile. Hayashida and Okumura are extracted from Tokyo via helicopter and taken to Mt. Mihara to set up the homing device before the two missiles collide above Tokyo. The destruction of the nuclear missile produces an electrical storm and an EMP, which revives Godzilla once more and temporarily disables the Super X. An enraged Godzilla bears down on the Super X just as it manages to get airborne again. The Super X's weapons prove ineffective against the kaiju, resulting in even more destruction in the city as Godzilla chases it through several skyscrapers. Godzilla finally destroys the Super X by dropping a skyscraper on top of it and continues his rampage, until Hayashida uses the homing device to distract him. Godzilla leaves Tokyo and swims across Tokyo Bay, following the homing device to Mount Mihara. There, Godzilla follows the device and falls into the mouth of the volcano. Okumura activates detonators at the volcano, creating a controlled eruption that traps Godzilla inside. Cast Production Development After the box office failure of Terror of Mechagodzilla, Toho attempted to reinvigorate the franchise several times during the late 1970s and early 1980s. The first attempt was the announcement of a color remake of the original 1954 film entitled The Rebirth of Godzilla in 1977, but the project was shelved. A year later, it was announced that Toho would develop a film jointly with UPA studios entitled Godzilla vs. the Devil, though this, along with UPA producer Henry G. Saperstein's proposed Godzilla vs. Gargantua, also never materialized. 
Godzilla series creator Tomoyuki Tanaka took charge of reviving the franchise in 1979, Godzilla's 25th anniversary, intending to return the series to its dark, anti-nuclear roots in the wake of the Three Mile Island accident. Hoping to win back adult audiences alienated by the fantastical approach to Godzilla films taken during the 1970s, Tanaka was further encouraged in his vision by the contemporary success of adult-oriented horror and science fiction movies like King Kong, Invasion of the Body Snatchers, Alien and The Thing. A draft story entitled The Resurrection of Godzilla was submitted by Akira Murao in 1980, and had Godzilla pitted against a shape-shifting monster called Bakan against the backdrop of an illegal nuclear waste disposal site, though the project was cancelled due to budgetary concerns. In 1983, American director Steve Miner proposed directing a Godzilla film at his own expense. Toho approved of the project, and Miner hired Fred Dekker to write the screenplay and paleosculptor Steve Czerkas to redesign the monster. The project was, however, hampered by Miner's insistence on using prohibitively costly stop-motion animation and shooting the film in 3D, and was thus rejected by major American movie studios. Under pressure from a 10,000-member group of Japanese Godzilla fans calling themselves the "Godzilla Resurrection Committee", Tanaka decided to helm a Japanese film for "strictly domestic consumption" to be released alongside Miner's movie. In an effort to disavow Godzilla's increasingly heroic and anthropomorphic depiction in previous films, Tanaka insisted on making a direct sequel to the original 1954 movie. He hired screenwriter Shuichi Nagahara, who wrote a screenplay combining elements of the previously cancelled The Resurrection of Godzilla and Miner's still unproduced film, including an intensification of hostilities during the Cold War and a flying fortress which fires missiles into Godzilla's mouth. 
Koji Hashimoto was hired as director after Ishirō Honda declined the offer, as he was assisting Akira Kurosawa with Kagemusha and Ran, and felt that the franchise should have been discontinued after the death of Eiji Tsuburaya. Composer Akira Ifukube was invited to score the film but respectfully declined. At the time, it was rumored that Ifukube refused to participate in the film due to the changes made to Godzilla, stating, "I do not write music for 80-meter monsters". However, this quote was later clarified, by Ifukube's biographer Erik Homenick and Japanese Giants editor Ed Godziszewski, as a joke spread by fans which was later misinterpreted as fact. Ifukube declined to score the film because his priority at the time was teaching composition at the Tokyo College of Music. Special effects The special effects were directed by Teruyoshi Nakano, who had directed the special effects of several previous Godzilla films. The decision was made by Tanaka to increase the apparent height of Godzilla from to so that Godzilla would not be dwarfed by the contemporary skyline of Tokyo. This meant that the miniatures had to be built to a th scale, and this contributed to an increase in the budget of the film to $6.25 million. Tanaka and Nakano supervised suit-maker Nobuyuki Yasumaru in constructing a new Godzilla design, incorporating ears and four toes, features not seen since Godzilla Raids Again. Nakano insisted on infusing elements into the design that suggested sadness, such as downward-slanting eyes and sloping shoulders. Suit construction took two months, and consisted of separately casting body-part molds with urethane on a pre-built, life-size statue of the final design. Yasumaru personally took charge of all phases of suit-building, unlike in previous productions wherein the different stages of suit-production were handled by different craftsmen. 
The final suit was constructed to accommodate stuntman Hiroshi Yamawaki, but he declined suddenly, and was replaced by veteran suit actor Kenpachiro Satsuma, who had portrayed Hedorah and Gigan in the Shōwa era. Because the suit wasn't built to his measurements, Satsuma had difficulty performing, being able to last only ten minutes within it, and losing 12 pounds during filming. Hoping to avoid having Godzilla move in an overly human fashion, Nakano instructed Satsuma to base his actions on Noh, a traditional Japanese dance. Taking inspiration from the publicity surrounding the 40-foot-tall King Kong model from Dino De Laurentiis's 1976 film of the same name, Toho spent a reported ¥52,146 (approximately $475.00) on a 16-foot-high robotic Godzilla (dubbed "Cybot") for use in close-up shots of the creature's head. The Cybot consisted of a hydraulically-powered mechanical endoskeleton covered in urethane skin containing 3,000 computer operated parts which permitted it to tilt its head, and move its lips and arms. Unlike previous Godzilla suits, whose lower jaws consisted of wire-operated flaps, the Cybot's jaws were hinged like those of an actual animal, and slid back as they opened. A life-size, crane-operated foot was also built for close-up shots of city destruction scenes. Part of the film was shot on location on Izu Ōshima, where the climax of the story takes place. Release Theatrical The Return of Godzilla was released on December 15, 1984, in Japan, where it was distributed by Toho. The film sold 3.2 million tickets, and grossed at the Japanese box office. Reception Despite its American re-edit receiving negative reviews, the original Japanese cut of the film has been much more well-received, with critics and fans praising the film's score, practical effects, and its darker tone. In 1985, the film won the Japan Academy Award for Special Effects. 
Home video In May 2016, Kraken Releasing revealed plans to release the original Japanese version of The Return of Godzilla and its international English dub on DVD and Blu-ray in North America on September 13, 2016. However, it was also revealed that the Americanized version of the film, Godzilla 1985, would not be featured in the release due to ongoing copyright issues concerning music cues that New World Pictures borrowed from Def-Con 4 for use in Godzilla 1985. Alternate English versions Exported English dub Shortly after the film's completion, Toho's foreign sales division, Toho International Co., Ltd, had the film dubbed into English by an unidentified firm in Hong Kong. No cuts were made, though credits and other titles were accordingly rendered in English. The international English dub features the voice of news anchor and radio announcer John Culkin in the role of Goro Maki, and actor Barry Haigh as Prime Minister Mitamura. The English version fully dubs all dialogue into English, including that of the Soviet and American characters. The international English dub was released on VHS in the U.K. by Carlton Home Entertainment on July 24, 1998. In 2016, the international English dub was included on the U.S. DVD and Blu-ray releases from Kraken, though the audio mix was not the original monaural track that was originally heard on Toho's English language prints. The English dialogue was originally mixed with an alternate music and effects track that contained different music edits and sound effects from the Japanese theatrical version, most notably a distinct "cry" produced by Godzilla during the film's ending. The U.S. home video version instead uses the conventional music and effects track used for the regular Japanese version mixed in DTS 5.1 surround sound instead of mono. 
Godzilla 1985 After the film's lackluster performance at the Japanese box office and the ultimate shelving of Steve Miner's Godzilla 3D project, Toho decided to distribute the film overseas in order to regain lost profits. New World Pictures acquired The Return of Godzilla for distribution in North America, and changed the title to Godzilla 1985, bringing back Raymond Burr in order to commemorate the 30th anniversary of Godzilla, King of the Monsters!. Originally, New World reportedly planned to re-write the dialogue in order to turn the film into a tongue-in-cheek comedy starring Leslie Nielsen (à la What's Up, Tiger Lily?), but this plan was reportedly scrapped because Raymond Burr expressed displeasure at the idea, taking Godzilla's role as a nuclear metaphor seriously. The only dialogue left over from that script was "That's quite an urban renewal program they've got going on over there," said by Major McDonahue. All of Burr's scenes were filmed in one day to suit his schedule. He was paid US$50,000. The reverse shots, of the actors he was speaking to, were filmed the next day, and the American filming was completed in three days. One of the most controversial changes made to the film was having Soviet Colonel Kashirin deliberately launch the nuclear missile rather than die in attempting to prevent its launch. Director R. J. Kizer later attributed this to New World's management's conservative leanings. The newly edited film also contained numerous product placements for Dr Pepper, which had twice used Godzilla in its commercials. Dr Pepper's marketing director at one point insisted that Raymond Burr drink Dr Pepper during a scene, and the suggestion was put to the actor by Kizer. Kizer recalled that Burr responded by "[fixing] me with one of those withering glares and just said nothing." Roger Ebert and Vincent Canby gave the film negative reviews. 
See also Kaiju List of Japanese films of 1984 List of science-fiction films of the 1980s List of monster movies Notes References Bibliography External links The Return of Godzilla at the Movie Review Query Engine 1984 films 1980s monster movies 1980s political films 1980s science fiction films Alternative sequel films Anti-war films English-language films Japanese-language films Russian-language films Cold War submarine films Films about nuclear war and weapons Films about volcanoes Films directed by Koji Hashimoto Films produced by Tomoyuki Tanaka Films set in Tokyo Films set in Shizuoka Prefecture Films set on islands Films shot in Japan Films shot in Tokyo Giant monster films Godzilla films Japanese films Japanese political films Japanese science fiction films Japanese sequel films Kaiju films Reboot films Toho films
https://en.wikipedia.org/wiki/Johann%20Gottlieb%20Fichte
Johann Gottlieb Fichte
Johann Gottlieb Fichte (; ; 19 May 1762 – 29 January 1814) was a German philosopher who became a founding figure of the philosophical movement known as German idealism, which developed from the theoretical and ethical writings of Immanuel Kant. Recently, philosophers and scholars have begun to appreciate Fichte as an important philosopher in his own right due to his original insights into the nature of self-consciousness or self-awareness. Fichte was also the originator of thesis–antithesis–synthesis, an idea that is often erroneously attributed to Hegel. Like Descartes and Kant before him, Fichte was motivated by the problem of subjectivity and consciousness. Fichte also wrote works of political philosophy; he has a reputation as one of the fathers of German nationalism. Biography Origins Fichte was born in Rammenau, Upper Lusatia. The son of a ribbon weaver, he came of peasant stock which had lived in the region for many generations. The family was noted in the neighborhood for its probity and piety. Christian Fichte, Johann Gottlieb's father, married somewhat above his station. It has been suggested that a certain impatience which Fichte himself displayed throughout his life was an inheritance from his mother. He received a rudimentary education from his father. He showed remarkable ability from an early age, and it was owing to his reputation among the villagers that he gained the opportunity for a better education than he otherwise would have received. The story runs that the Freiherr von Militz, a country landowner, arrived too late to hear the local pastor preach. He was, however, informed that a lad in the neighborhood would be able to repeat the sermon almost verbatim. As a result, the baron took Fichte into his protection and paid for his tuition. Early schooling Fichte was placed in the family of Pastor Krebel at Niederau near Meissen, and there received a thorough grounding in the classics. From this time onward, Fichte saw little of his parents. 
In October 1774, he attended the celebrated foundation-school at Pforta near Naumburg. This school is associated with the names of Novalis, August Wilhelm Schlegel, Friedrich Schlegel and Nietzsche. The spirit of the institution was semi-monastic and, while the education was excellent, it is doubtful whether there was enough social life and contact with the world for Fichte's temperament and antecedents. Perhaps his education strengthened a tendency toward introspection and independence, characteristics which appear strongly in his doctrines and writings. Theological studies and private tutoring In 1780, Fichte began study at the University of Jena's theology seminary. He was transferred a year later to study at Leipzig University. Fichte seems to have supported himself during this period of bitter poverty and hard struggle. Freiherr von Militz continued to support him, but when he died in 1784, Fichte had to end his studies without completing his degree. From 1784 to 1788, Fichte precariously supported himself as a tutor for various Saxon families. In early 1788, he returned to Leipzig in the hope of finding better employment, but eventually he had to settle for a less promising position with the family of an innkeeper in Zurich. He lived in Zurich for the next two years (1788–1790), which was a time of great contentment for him. There he met his future wife, Johanna Rahn, and Johann Heinrich Pestalozzi. He also became, in 1793, a member of the Freemasonry lodge "Modestia cum Libertate" with which Johann Wolfgang Goethe was also connected. In the spring of 1790, he became engaged to Johanna. Fichte began to study the works of Kant in the summer of 1790. This occurred initially because one of Fichte's students wanted to know about Kant's writings. They had a lasting effect on his life and thought. However, while Fichte was studying Kantian philosophy, the Rahn family suffered financial reverses. His impending marriage had to be postponed. 
Kant From Zurich, Fichte returned to Leipzig in May 1790. In early 1791, he obtained a tutorship in Warsaw in the house of a Polish nobleman. The situation, however, quickly proved disagreeable and he was released. He then got a chance to see Kant at Königsberg. After a disappointing interview on 4 July of the same year, he shut himself in his lodgings and threw all his energies into the composition of an essay which would attract Kant's attention and interest. This essay, completed in five weeks, was the Versuch einer Critik aller Offenbarung (Attempt at a Critique of All Revelation, 1792). In this book, according to Henrich, Fichte investigated the connections between divine revelation and Kant's critical philosophy. The first edition was published without Kant's or Fichte's knowledge and without Fichte's name or signed preface. It was thus believed by the public to be a new work by Kant. When Kant cleared up the confusion and openly praised the work and author, Fichte's reputation skyrocketed. In a letter to Karl Reinhold, Jens Baggesen wrote that it was "...the most shocking and astonishing news... [since] nobody but Kant could have written this book. This amazing news of a third sun in the philosophical heavens has set me into such confusion". Kant waited seven years to make a public statement about the incident; after considerable external pressure he dissociated himself from Fichte. In his statement, Kant wrote, "May God protect us from our friends. From our enemies, we can try to protect ourselves." Jena In October 1793, Fichte was married in Zurich, where he remained the rest of the year. Stirred by the events and principles of the French Revolution, he wrote and anonymously published two pamphlets which led him to be seen as a devoted defender of liberty of thought and action and an advocate of political changes. In December of the same year, he received an invitation to fill the position of extraordinary professor of philosophy at the University of Jena. 
He accepted and began his lectures in May 1794. With extraordinary zeal, he expounded his system of "transcendental idealism". His success was immediate. He excelled as a lecturer due to the earnestness and force of his personality. These lectures were later published under the title The Vocation of the Scholar (Einige Vorlesungen über die Bestimmung des Gelehrten). He gave himself up to intense production, and a succession of works soon appeared. Atheism dispute After weathering several academic storms, Fichte was finally dismissed from the University of Jena in 1799 for atheism. He had been accused of this in 1798 after publishing the essay "Ueber den Grund unsers Glaubens an eine göttliche Weltregierung" ("On the Ground of Our Belief in a Divine World-Governance"), written in response to Friedrich Karl Forberg's essay "Development of the Concept of Religion", in his Philosophical Journal. For Fichte, God should be conceived primarily in moral terms: "The living and efficaciously acting moral order is itself God. We require no other God, nor can we grasp any other" ("On the Ground of Our Belief in a Divine World-Governance"). Fichte's intemperate "Appeal to the Public" ("Appellation an das Publikum", 1799) provoked F. H. Jacobi to publish an open letter in which he equated philosophy in general and Fichte's transcendental philosophy in particular with nihilism. Berlin Since all the German states except Prussia had joined in the cry against Fichte, he was forced to go to Berlin. There he associated himself with the Schlegels, Schleiermacher, Schelling and Tieck. In April 1800, through the introduction of Hungarian writer Ignaz Aurelius Fessler, he was initiated into Freemasonry in the Lodge Pythagoras of the Blazing Star where he was elected minor warden. At first Fichte was a warm admirer of Fessler, and was disposed to aid him in his proposed Masonic reform. But later he became Fessler's bitter opponent. 
Their controversy attracted much attention among Freemasons. Fichte presented two lectures on the philosophy of Masonry during the same period, as part of his work on the development of various higher degrees for the lodge in Berlin. Johann Karl Christian Fischer, a high official of the Grand Orient, published these lectures in 1802/03 in two volumes under the title Philosophy of Freemasonry: Letters to Konstant (Philosophie der Maurerei. Briefe an Konstant), where "Konstant" referred to a fictitious non-Mason. In November 1800, Fichte published The Closed Commercial State: A Philosophical Sketch as an Appendix to the Doctrine of Right and an Example of a Future Politics (Der geschlossene Handelsstaat. Ein philosophischer Entwurf als Anhang zur Rechtslehre und Probe einer künftig zu liefernden Politik), a philosophical statement of his property theory, a historical analysis of European economic relations, and a political proposal for reforming them. In 1805, he was appointed to a professorship at the University of Erlangen. The Battle of Jena-Auerstedt in 1806, in which Napoleon completely crushed the Prussian army, drove him to Königsberg for a time, but he returned to Berlin in 1807 and continued his literary activity. After the collapse of the Holy Roman Empire, when the southern German principalities seceded and became part of a French protectorate, Fichte delivered the famous Addresses to the German Nation (Reden an die deutsche Nation, 1807–1808), which attempted to define the German nation and guided the uprising against Napoleon. He became a professor at the new University of Berlin, founded in 1810. By the votes of his colleagues, Fichte was unanimously elected its rector in the succeeding year. But, once more, his impetuosity and reforming zeal led to friction, and he resigned in 1812. The campaign against Napoleon began, and the hospitals of Berlin were soon full of patients.
Fichte's wife devoted herself to nursing and caught a virulent fever. Just as she was recovering, he himself fell sick with typhus and died in 1814 at the age of 51. His son, Immanuel Hermann Fichte (18 July 1796 – 8 August 1879), also made contributions to philosophy.

Philosophical work

Fichte's critics argued that his mimicry of Kant's difficult style produced works that were barely intelligible. "He made no hesitation in pluming himself on his great skill in the shadowy and obscure, by often remarking to his pupils, that 'there was only one man in the world who could fully understand his writings; and even he was often at a loss to seize upon his real meaning.'" Fichte, on the other hand, acknowledged the difficulty, but argued that his works were clear and transparent to those who made the effort to think without preconceptions and prejudices. Fichte did not endorse Kant's argument for the existence of noumena, of "things in themselves", the supra-sensible reality beyond direct human perception. He saw the rigorous and systematic separation of "things in themselves" (noumena) and things "as they appear to us" (phenomena) as an invitation to skepticism. Rather than invite such skepticism, Fichte made the radical suggestion that we should throw out the notion of a noumenal world and accept that consciousness does not have a grounding in a so-called "real world". In fact, Fichte achieved fame for originating the argument that consciousness is not grounded in anything outside of itself: the phenomenal world as such arises from self-consciousness, the activity of the I, and moral awareness. His student (and critic) Arthur Schopenhauer wrote critically of Fichte's philosophy, and Søren Kierkegaard was also a student of Fichte's writings.

Central theory

In Foundations of Natural Right (1797), Fichte argued that self-consciousness was a social phenomenon, an important step and perhaps the first clear step taken in this direction by modern philosophy.
For Fichte, a necessary condition of every subject's self-awareness is the existence of other rational subjects. These others call or summon (fordern auf) the subject or self out of its unconsciousness and into an awareness of itself as a free individual. Fichte proceeds from the general principle that the I (das Ich) must posit itself as an individual in order to posit (setzen) itself at all, and that in order to posit itself as an individual, it must respond to a calling or summons (Aufforderung) by other free individuals: it is called upon to limit its own freedom out of respect for the freedom of the others. The same condition applies to the others in their development. Mutual recognition (gegenseitig anerkennen) of rational individuals is thus a condition necessary for the individual I. This argument for intersubjectivity is central to the conception of selfhood developed in the Foundations of the Science of Knowledge (Grundlage der gesamten Wissenschaftslehre, 1794/1795). For Fichte, consciousness of the self depends upon resistance or a check by something that is understood as not part of the self, yet is not immediately ascribable to a particular sensory perception. In his later lectures of 1796–99 (his nova methodo lectures), Fichte incorporated this into his revised presentation of the foundations of his system, where the summons takes its place alongside original feeling, which takes the place of the earlier Anstoss (see below) as a limit on the I's absolute freedom and a condition for the positing of the I. The I posits this situation for itself. To posit does not mean to 'create' the objects of consciousness. The principle in question simply states that the essence of an I lies in the assertion of self-identity, i.e., that consciousness presupposes self-consciousness. Such immediate self-identity, however, cannot be understood as a psychological fact, or an act or accident of some previously existing substance or being.
It is an action of the I, but one that is identical with the very existence of this same I. In Fichte's technical terminology, the original unity of self-consciousness is both an action and the product of the same I, a "fact and/or act" (Thathandlung; Modern German: Tathandlung), a unity that is presupposed by and contained within every fact and every act of empirical consciousness, although it never appears as such. The I can posit itself only as limited. Moreover, it cannot even posit its own limitations, in the sense of producing or creating these limits. The finite I cannot be the ground of its own passivity. Instead, for Fichte, if the I is to posit itself, it must simply discover itself to be limited, a discovery that Fichte characterizes as an "impulse," "repulse," or "resistance" (Anstoss; Modern German: Anstoß) to the free practical activity of the I. Such an original limitation of the I is, however, a limit for the I only insofar as the I posits it as a limit. The I does this, according to Fichte's analysis, by positing its own limitation first as a mere feeling, then as a sensation, then as an intuition of a thing, and finally as a summons from another person. The Anstoss thus provides the essential impetus that first sets in motion the entire complex train of activities that finally results in our conscious experience both of ourselves and others as empirical individuals and of the world around us. Although the Anstoss plays a role similar to that of the thing in itself in Kantian philosophy, unlike Kant's noumenon, Fichte's Anstoss is not something foreign to the I. Instead, it denotes the I's original encounter with its own finitude. Rather than claim that the not-I (das Nicht-Ich) is the cause or ground of the Anstoss, Fichte argues that the not-I is posited by the I precisely in order to explain the Anstoss to itself, and thereby to become conscious of the Anstoss.
The Wissenschaftslehre demonstrates that the Anstoss must occur if self-consciousness is to come about, but it cannot explain the actual occurrence of the Anstoss. There are limits to what can be expected from an a priori deduction of experience, and this, for Fichte, applies equally to Kant's transcendental philosophy. According to Fichte, transcendental philosophy can explain that the world must have space, time, and causality, but it can never explain why objects have the particular sensible properties they happen to have, or why I am this determinate individual rather than another. This is something that the I simply has to discover at the same time that it discovers its own freedom, and indeed as a condition for the latter. Dieter Henrich (1966) proposed that Fichte was able to move beyond a "reflective theory of consciousness". According to Fichte, the self must already have some prior acquaintance with itself, independent of the act of reflection ("no object comes to consciousness except under the condition that I am aware of myself, the conscious subject [jedes Object kommt zum Bewusstseyn lediglich unter der Bedingung, dass ich auch meiner selbst, des bewusstseyenden Subjects mir bewusst sey]"). This idea is what Henrich called Fichte's "original insight".

Nationalism

Between December 1807 and March 1808, Fichte gave a series of lectures concerning the "German nation" and its culture and language, projecting the kind of national education he hoped would raise it from the humiliation of its defeat at the hands of the French. Having been a supporter of Revolutionary France, Fichte became disenchanted by 1804 as Napoleon's armies advanced through Europe, occupying German territories, stripping them of their raw materials, and subjugating them to foreign rule. He came to believe that Germany would be responsible for carrying the virtues of the French Revolution into the future.
Furthermore, his nationalism was not aroused by Prussian military defeat and humiliation, for these had not yet occurred, but resulted from his own humanitarian philosophy. Disappointed in the French, he turned to the German nation as the instrument for fulfilling that philosophy. These lectures, entitled the Addresses to the German Nation, coincided with a period of reform in the Prussian government under the chancellorship of Baron vom Stein. The Addresses display Fichte's interest during that period in language and culture as vehicles of human spiritual development. Fichte built upon earlier ideas of Johann Gottfried Herder and attempted to unite them with his own approach. The aim of the German nation, according to Fichte, was to "found an empire of spirit and reason, and to annihilate completely the crude physical force that rules the world." Like Herder's German nationalism, Fichte's was cultural, and grounded in the aesthetic, literary, and moral. However, Fichte's belief in a "closed commercial state", a state-dominated economy and society, should be noted, as should its kinship with certain 20th-century governments in Germany and elsewhere. The nationalism propounded by Fichte in the Addresses would be used over a century later by the Nazi Party in Germany, which saw in Fichte a forerunner to its own nationalist ideology. As with Nietzsche, this association with the Nazi regime came to colour readings of Fichte's German nationalism in the post-war period. Such readings were often bolstered through reference to a passage from his 1793 Contributions to the Correction of the Public's Judgment concerning the French Revolution, wherein Fichte expressed anti-semitic sentiments, such as arguing against extending civil rights to Jews and calling them a "state within a state" that could "undermine" the German nation.
Attached to this passage, however, is a footnote in which Fichte provides an impassioned plea for permitting Jews to practice their religion without hindrance. Furthermore, the final act of Fichte's academic career was to resign as rector of the University of Berlin in protest when his colleagues refused to punish the harassment of Jewish students. While recent scholarship has sought to dissociate Fichte's writings on nationalism from their adoption by the Nazi Party, the association continues to blight his legacy, although Fichte, as if to exclude all ground of doubt, clearly and distinctly prohibits genocide and other crimes against humanity in his reworked version of The Science of Ethics as Based on the Science of Knowledge (see § Final period in Berlin): "If you say that it is your conscience's command to exterminate peoples for their sins, [...] we can confidently tell you that you are wrong; for such things can never be commanded against the free and moral force."

Economics

Fichte's 1800 economic treatise The Closed Commercial State had a profound influence on the economic theories of German Romanticism. In it, Fichte argues the need for the strictest, purely guild-like regulation of industry. The "exemplary rational state" (Vernunftstaat), Fichte argues, should not allow any of its "subjects" to engage in this or that branch of production without first passing a preliminary test and being certified by government agents in their professional skills and agility. According to Vladimir Mikhailovich Shulyatikov, "this kind of demand was typical of the Mittelstand, the German petty middle class, the class of artisans, hoping by creating artificial barriers to stop the victorious march of big capital and thus save themselves from inevitable death. The same demand was imposed on the state, as is evident from Fichte's treatise, by the German 'factory' (Fabrike), more precisely, the manufacture of the early 19th century".
Fichte opposed free trade and unrestrained capitalist industrial growth, stating: "There is an endless war of all against all ... And this war is becoming more fierce, unjust, more dangerous in its consequences, the more the world's population grows, the more acquisitions the trading state makes, the more production and art (industry) develops and, together with this, the number of circulating goods increases, and with them the needs become more and more diversified. What, with the simple way of life of nations, was done before without great injustices and oppression, turns, thanks to increased needs, into flagrant injustice, into a source of great evils. The buyer tries to take the goods away from the seller; therefore he demands freedom of trade, i.e. freedom for the seller to wander around the markets, freedom not to find a sale for goods and to sell them significantly below their value. Therefore, he requires strong competition between manufacturers (Fabrikanten) and merchants." The only means that could save the modern world, and destroy this evil at the root, is, according to Fichte, to split the "world state" (the global market) into separate self-sufficient bodies. Each such body, each "closed trading state", will be able to regulate its internal economic relations. It will be able both to extract and to process everything needed to meet the needs of its citizens, and it will carry out the ideal organization of production. Fichte argued for government regulation of industrial growth, writing: "Only by limitation does a certain industry become the property of the class that deals with it". Vladimir Mikhailovich Shulyatikov considered the economics of the German idealists and Romantics as representing a compromise between the German bourgeoisie of the early 19th century and the monarchical state: the French physiocrats proclaimed the principle "Laissez faire!"
On the other hand, the German capitalists of the 1800s, whose ideologists were the objective idealists, professed a belief in the saving effect of government tutelage.

Women

Fichte believed that "active citizenship, civic freedom and even property rights should be withheld from women, whose calling was to subject themselves utterly to the authority of their fathers and husbands."

Final period in Berlin

Fichte gave a wide range of public and private lectures in Berlin during the last decade of his life. These form some of his best-known work and are the basis of a revived German-speaking scholarly interest in his thought. The lectures include two works from 1806. In The Characteristics of the Present Age (Die Grundzüge des gegenwärtigen Zeitalters), Fichte outlines his theory of different historical and cultural epochs. His mystical work The Way Towards the Blessed Life (Die Anweisung zum seligen Leben oder auch die Religionslehre) gave his fullest thoughts on religion. In 1807–1808 he gave a series of speeches in French-occupied Berlin, the Addresses to the German Nation. In 1810, the new University of Berlin was established, designed along ideas put forward by Wilhelm von Humboldt. Fichte was made its rector and also the first Chair of Philosophy, in part because of the educational themes in the Addresses, and in part because of his earlier work at the University of Jena. Fichte lectured on further versions of his Wissenschaftslehre. Of these, he published only a brief work from 1810, The Science of Knowledge in its General Outline (Die Wissenschaftslehre, in ihrem allgemeinen Umrisse dargestellt; also translated as Outline of the Doctrine of Knowledge). His son published some of these lectures thirty years after his death; most became public only in the last decades of the twentieth century, in his collected works.
This included reworked versions of the Doctrine of Science (Wissenschaftslehre, 1810–1813), The Science of Rights (Das System der Rechtslehre, 1812), and The Science of Ethics as Based on the Science of Knowledge (Das System der Sittenlehre nach den Principien der Wissenschaftslehre, 1812; 1st ed. 1798).

Bibliography

Selected works in German

Wissenschaftslehre

Ueber den Begriff der Wissenschaftslehre oder der sogenannten Philosophie (1794)
Grundlage der gesamten Wissenschaftslehre (1794/1795)
Wissenschaftslehre nova methodo (1796–1799: "Halle Nachschrift," 1796/1797 and "Krause Nachschrift," 1798/1799)
Versuch einer neuen Darstellung der Wissenschaftslehre (1797/1798)
Darstellung der Wissenschaftslehre (1801)
Die Wissenschaftslehre (1804, 1812, 1813)
Die Wissenschaftslehre, in ihrem allgemeinen Umrisse dargestellt (1810)

Other works in German

Versuch einer Critik aller Offenbarung (1792)
Beitrag zur Berichtigung der Urteile des Publikums über die französische Revolution (1793)
Einige Vorlesungen über die Bestimmung des Gelehrten (1794)
Grundlage des Naturrechts (1796)
Das System der Sittenlehre nach den Principien der Wissenschaftslehre (1798)
"Ueber den Grund unsers Glaubens an eine göttliche Weltregierung" (1798)
"Appellation an das Publikum über die durch Churf. Sächs. Confiscationsrescript ihm beigemessenen atheistischen Aeußerungen. Eine Schrift, die man zu lesen bittet, ehe man sie confiscirt" (1799)
Der geschlossene Handelsstaat. Ein philosophischer Entwurf als Anhang zur Rechtslehre und Probe einer künftig zu liefernden Politik (1800)
Die Bestimmung des Menschen (1800)
Friedrich Nicolais Leben und sonderbare Meinungen (1801)
Philosophie der Maurerei. Briefe an Konstant (1802/03)
Die Grundzüge des gegenwärtigen Zeitalters (1806)
Die Anweisung zum seligen Leben oder auch die Religionslehre (1806)
Reden an die deutsche Nation (1807/1808)
Das System der Rechtslehre (1812)

Correspondence

Jacobi an Fichte, German Text (1799/1816), with Introduction and Critical Apparatus by Marco Ivaldo and Ariberto Acerbi (Introduction, German Text, Italian Translation, 3 Appendices with Jacobi's and Fichte's complementary Texts, Philological Notes, Commentary, Bibliography, Index). Naples: Istituto Italiano per gli Studi Filosofici Press, 2011.

Collected works in German

The new standard edition of Fichte's works in German, which supersedes all previous editions, is the Gesamtausgabe ("Collected Works" or "Complete Edition", commonly abbreviated as GA), prepared by the Bavarian Academy of Sciences: Gesamtausgabe der Bayerischen Akademie der Wissenschaften, 42 volumes, edited by , Hans Gliwitzky, Erich Fuchs and Peter Schneider, Stuttgart-Bad Cannstatt: Frommann-Holzboog, 1962–2012. It is organized into four parts:
Part I: Published Works
Part II: Unpublished Writings
Part III: Correspondence
Part IV: Lecture Transcripts
Fichte's works are quoted and cited from GA, followed by a combination of Roman and Arabic numbers indicating the series and volume, respectively, and the page number(s). Another edition is Johann Gottlieb Fichtes sämmtliche Werke (abbrev. SW), ed. I. H. Fichte. Berlin: de Gruyter, 1971.

Selected works in English

Concerning the Conception of the Science of Knowledge Generally (Ueber den Begriff der Wissenschaftslehre oder der sogenannten Philosophie, 1794), translated by Adolph Ernst Kroeger. In The Science of Knowledge, pp. 331–336. Philadelphia: J.B. Lippincott & Co., 1868. Rpt., London: Trübner & Co., 1889.
Attempt at a Critique of All Revelation. Trans. Garrett Green. New York: Cambridge University Press, 1978. (Translation of Versuch einer Critik aller Offenbarung, 1st ed. 1792, 2nd ed. 1793.)
Early Philosophical Writings. Trans. and ed. Daniel Breazeale. Ithaca: Cornell University Press, 1988. (Contains selections from Fichte's writings and correspondence from the Jena period, 1794–1799.)
Foundations of the Entire Science of Knowledge. Translation of Grundlage der gesammten Wissenschaftslehre (1794/95, 2nd ed. 1802), Fichte's first major exposition of the Wissenschaftslehre.
Foundations of Natural Right. Trans. Michael Baur. Ed. Frederick Neuhouser. Cambridge: Cambridge University Press, 2000. (Translation of Grundlage des Naturrechts, 1796/97.)
Foundations of Transcendental Philosophy (Wissenschaftslehre) Nova Methodo [FTP]. Trans. and ed. Daniel Breazeale. Ithaca, NY: Cornell University Press, 1992. (Translation of Wissenschaftslehre nova methodo, 1796–1799.)
The System of Ethics according to the Principles of the Wissenschaftslehre. Ed. and trans. Daniel Breazeale and Günter Zöller. Cambridge University Press, 2005. (Translation of Das System der Sittenlehre nach den Principien der Wissenschaftslehre, 1798.)
Introductions to the Wissenschaftslehre and Other Writings. Trans. and ed. Daniel Breazeale. Indianapolis and Cambridge: Hackett, 1994. (Contains mostly writings from the late Jena period, 1797–1799.)
The Vocation of Man, 1848. Trans. Peter Preuss. Indianapolis. (Translation of Die Bestimmung des Menschen, 1800.)
The Vocation of the Scholar, 1847. (Translation of Einige Vorlesungen über die Bestimmung des Gelehrten, 1794.)
A Crystal Clear Report to the General Public Concerning the Actual Essence of the Newest Philosophy: An Attempt to Force the Reader to Understand. Trans. John Botterman and William Rash. In Philosophy of German Idealism, pp. 39–115. (Translation of Sonnenklarer Bericht an das grössere Publikum über das Wesen der neuesten Philosophie, 1801.)
Outline of the Doctrine of Knowledge, 1810. (Translation of Die Wissenschaftslehre, in ihrem allgemeinen Umrisse dargestellt, published in The Popular Works of Johann Gottlieb Fichte, Trubner and Co., 1889; trans. William Smith.)
On the Nature of the Scholar, 1845. (Translation of Ueber das Wesen des Gelehrten, 1806.)
Characteristics of the Present Age (Die Grundzüge des gegenwärtigen Zeitalters, 1806). In The Popular Works of Johann Gottlieb Fichte, 2 vols., trans. and ed. William Smith. London: Chapman, 1848/49. Reprint, London: Thoemmes Press, 1999.
Addresses to the German Nation (Reden an die deutsche Nation, 1808), ed. and trans. Gregory Moore. Cambridge University Press, 2008.
The Philosophical Rupture Between Fichte and Schelling: Selected Texts and Correspondence (1800–1802). Trans. and eds. Michael G. Vater and David W. Wood. Albany, NY: State University of New York Press, 2012. (Includes the following texts by Johann Gottlieb Fichte: correspondence with F.W.J. Schelling, 1800–1802; "Announcement", 1800; extract from "New Version of the Wissenschaftslehre", 1800; "Commentaries on Schelling's System of Transcendental Idealism and Presentation of My System of Philosophy", 1800–1801.)

Works online in English

J.G. Fichte. "The Wissenschaftslehre is Mathematics" ("Announcement", 1800/1801).
Addresses to the German Nation (1922). (Trs. R. F. Jones and G. H. Turnbull.) IA (UToronto)
The Destination of Man (1846). Alternative translation of The Vocation of Man. (Tr. Mrs. Percy Sinnett.) IA (UToronto)
Doctrine de la science (Paris, 1843). French translation of Foundations of the Entire Science of Knowledge. Google (Harvard) Google (Oxford) Google (UMich)
Johann Gottlieb Fichte's Popular Works (1873). (Tr. William Smith.) IA (UToronto)
New Exposition of the Science of Knowledge (1869). Translation of Versuch einer neuen Darstellung der Wissenschaftslehre. (Tr. A. E. Kroeger.) Google (Harvard) Google (NYPL) IA (UToronto)
On the Nature of the Scholar (1845). Alternative translation of The Vocation of the Scholar. (Tr. William Smith.) IA (UToronto)
The Popular Works of Johann Gottlieb Fichte (1848–49). (Tr. William Smith.) Volume 1, 1848. Google (Oxford) IA (UToronto); 4th ed., 1889. IA (UIllinois) IA (UToronto). Volume 2, 1849. IA (UToronto); 4th ed., 1889. Google (Stanford) IA (UIllinois) IA (UToronto)
The Science of Ethics as Based on the Science of Knowledge (1897). (Tr. A. E. Kroeger.) Google (UMich) IA (UToronto)
The Science of Knowledge (1889). Alternative translation of Foundations of the Entire Science of Knowledge. (Tr. A. E. Kroeger.) IA (UToronto)
The Science of Rights (1889). (Tr. A. E. Kroeger.) IA (UCal)
(German) Versuch einer Critik aller Offenbarung (Königsberg, 1792; 2nd ed. 1793). Gallica Google (Oxford) Google (Oxford)
The Vocation of Man (1848). (Tr. William Smith.) Google (Oxford); 1910. Google (UCal)
The Vocation of the Scholar (1847). (Tr. William Smith.) IA (UCal)
The Way Towards the Blessed Life (1849). (Tr. William Smith.) Google (Oxford)
"On the Foundation of Our Belief in a Divine Government of the Universe"; alternative translation of "On the Ground of Our Belief in a Divine World-Governance" (trans. anon., n.d.)

See also

Butterfly effect

Notes

References

Daniel Breazeale. "Fichte's Aenesidemus Review and the Transformation of German Idealism", The Review of Metaphysics, 34 (1980–81): 545–68.
Daniel Breazeale and Tom Rockmore (eds.). Fichte: Historical Contexts/Contemporary Controversies. Atlantic Highlands: Humanities Press, 1994.
Daniel Breazeale and Tom Rockmore (eds.). Fichte, German Idealism, and Early Romanticism. Rodopi, 2010.
Daniel Breazeale. Thinking Through the Wissenschaftslehre: Themes from Fichte's Early Philosophy. Oxford: Oxford University Press, 2013.
Ezequiel L. Posesorski. Between Reinhold and Fichte: August Ludwig Hülsen's Contribution to the Emergence of German Idealism. Karlsruhe: Karlsruher Institut für Technologie, 2012.
Sally Sedgwick. The Reception of Kant's Critical Philosophy: Fichte, Schelling, and Hegel. Cambridge: Cambridge University Press, 2007.
Violetta L. Waibel, Daniel Breazeale, Tom Rockmore (eds.). Fichte and the Phenomenological Tradition. Berlin: Walter de Gruyter, 2010.
Günter Zöller. Fichte's Transcendental Philosophy: The Original Duplicity of Intelligence and Will. Cambridge: Cambridge University Press, 1998.

Further reading

Karl Ameriks, Dieter Sturma (eds.). The Modern Subject: Conceptions of the Self in Classical German Philosophy. Albany: State University of New York Press, 1995.
Arash Abizadeh. "Was Fichte an Ethnic Nationalist?", History of Political Thought 26.2 (2005): 334–359.
Gunnar Beck. Fichte and Kant on Freedom, Rights and Law. Lexington Books (Rowman and Littlefield), 2008.
Paul Franks. All or Nothing: Systematicity, Transcendental Arguments, and Skepticism in German Idealism. Cambridge: Harvard University Press, 2005.
Andrea Gentile. Bewusstsein, Anschauung und das Unendliche bei Fichte, Schelling und Hegel. Über den unbedingten Grundsatz der Erkenntnis. Freiburg/München: Verlag Karl Alber, 2018.
T. P. Hohler. Imagination and Reflection: Intersubjectivity. Fichte's 'Grundlage' of 1794. The Hague: Nijhoff, 1982.
Wayne Martin. Idealism and Objectivity: Understanding Fichte's Jena Project. Stanford: Stanford University Press, 1997.
Harald Muenster. Fichte trifft Darwin, Luhmann und Derrida. "Die Bestimmung des Menschen" in differenztheoretischer Rekonstruktion und im Kontext der "Wissenschaftslehre nova methodo" [Fichte Meets Darwin, Luhmann and Derrida. "The Vocation of Man" As Reconstructed by Theories of Difference and in the Context of the "Wissenschaftslehre nova methodo"]. Amsterdam/New York: Rodopi, 2011 (Fichte-Studien-Supplementa, volume 28).
Isaac Nakhimovsky. The Closed Commercial State: Perpetual Peace and Commercial Society from Rousseau to Fichte. Princeton, NJ: Princeton University Press, 2011.
Frederick Neuhouser. Fichte's Theory of Subjectivity. Cambridge: Cambridge University Press, 1990.
Tom Rockmore. Fichte, Marx, and the German Philosophical Tradition. Carbondale: Southern Illinois University Press, 1980.
Rainer Schäfer. Johann Gottlieb Fichtes Grundlage der gesamten Wissenschaftslehre von 1794. Darmstadt: Wissenschaftliche Buchgesellschaft, 2006.
Ulrich Schwabe. Individuelles und Transindividuelles Ich. Die Selbstindividuation reiner Subjektivität und Fichtes "Wissenschaftslehre nova methodo". Paderborn, 2007.
Peter Suber. "A Case Study in Ad Hominem Arguments: Fichte's Science of Knowledge", Philosophy and Rhetoric, 23, 1 (1990): 12–42.
Xavier Tilliette. Fichte. La science de la liberté, pref. by , Paris: Vrin, 2003.
Robert R. Williams. Recognition: Fichte and Hegel on the Other. Albany: State University of New York Press, 1992.
David W. Wood. "Mathesis of the Mind": A Study of Fichte's Wissenschaftslehre and Geometry. Amsterdam/New York: Rodopi, 2012 (Fichte-Studien-Supplementa, volume 29).
Tommaso Valentini. I fondamenti della libertà in J.G. Fichte. Studi sul primato del pratico. Presentazione di Armando Rigobello. Roma: Editori Riuniti University Press, 2012.
External links

Outlines of the Doctrine of Knowledge
The North American Fichte Society: "Fichte's Works in English Translation"
Works by Fichte, original German texts
Internationale Johann-Gottlieb-Fichte-Gesellschaft
KULTUR & KONGRESSWERK-fichte – Event location in Magdeburg, named after Johann Gottlieb Fichte
A Case Study in Ad Hominem Arguments: Fichte's Science of Knowledge
Great Lakes
The Great Lakes, also called the Great Lakes of North America or the Laurentian Great Lakes, is a series of large interconnected freshwater lakes in the mid-east region of North America that connect to the Atlantic Ocean via the Saint Lawrence River. They are Lakes Superior, Michigan, Huron, Erie, and Ontario and are in general on or near the Canada–United States border. Hydrologically, there are four lakes, because Lakes Michigan and Huron join at the Straits of Mackinac. The Great Lakes Waterway enables travel by water among the lakes.

The Great Lakes are the largest group of freshwater lakes on Earth by total area and the second-largest by total volume, containing 21% of the world's surface fresh water by volume. The total surface is , and the total volume (measured at the low water datum) is , slightly less than the volume of Lake Baikal (, 22–23% of the world's surface fresh water). Because of their sea-like characteristics, such as rolling waves, sustained winds, strong currents, great depths, and distant horizons, the five Great Lakes have long been called inland seas. By surface area, Lake Superior is the second-largest lake in the world and the largest freshwater lake. Lake Michigan is the largest lake that is entirely within one country.

The Great Lakes began to form at the end of the Last Glacial Period around 14,000 years ago, as retreating ice sheets exposed the basins they had carved into the land, which then filled with meltwater. The lakes have been a major source for transportation, migration, trade, and fishing, serving as a habitat to many aquatic species in a region with much biodiversity. The surrounding region is called the Great Lakes region, which includes the Great Lakes Megalopolis.

Geography

Though the five lakes lie in separate basins, they form a single, naturally interconnected body of fresh water within the Great Lakes Basin.
As a chain of lakes and rivers, they connect the east-central interior of North America to the Atlantic Ocean. From the interior to the outlet at the Saint Lawrence River, water flows from Superior to Huron and Michigan, southward to Erie, and finally northward to Lake Ontario. The lakes drain a large watershed via many rivers and contain approximately 35,000 islands. There are also several thousand smaller lakes, often called "inland lakes", within the basin. The surface area of the five primary lakes combined is roughly equal to the size of the United Kingdom, while the surface area of the entire basin (the lakes and the land they drain) is about the size of the UK and France combined. Lake Michigan is the only one of the Great Lakes that is entirely within the United States; the others form a water boundary between the United States and Canada. The lakes are divided among the jurisdictions of the Canadian province of Ontario and the U.S. states of Michigan, Wisconsin, Minnesota, Illinois, Indiana, Ohio, Pennsylvania, and New York. Both the province of Ontario and the state of Michigan include in their boundaries portions of four of the lakes: the province of Ontario does not border Lake Michigan, and the state of Michigan does not border Lake Ontario. New York's and Wisconsin's jurisdictions extend into two lakes, and each of the remaining states extends into one of the lakes.

Bathymetry

As the surfaces of Lakes Superior, Huron, Michigan, and Erie are all at approximately the same elevation above sea level, while Lake Ontario is significantly lower, and because the Niagara Escarpment precludes all natural navigation, the four upper lakes are commonly called the "upper great lakes". This designation is not universal. Those living on the shore of Lake Superior often refer to all the other lakes as "the lower lakes", because they are farther south.
Sailors of bulk freighters transferring cargoes from Lake Superior and northern Lake Michigan and Lake Huron to ports on Lake Erie or Ontario commonly refer to the latter as the lower lakes and Lakes Michigan, Huron, and Superior as the upper lakes. This corresponds to thinking of lakes Erie and Ontario as "down south" and the others as "up north". Vessels sailing north on Lake Michigan are considered "upbound" even though they are sailing toward its effluent current.

Primary connecting waterways

The Chicago River and Calumet River systems connect the Great Lakes Basin to the Mississippi River System through man-made alterations and canals. The St. Marys River, including the Soo Locks, connects Lake Superior to Lake Huron, via the North Channel. The Straits of Mackinac connect Lake Michigan to Lake Huron (which are hydrologically one). The St. Clair River connects Lake Huron to Lake St. Clair. The Detroit River connects Lake St. Clair to Lake Erie. The Niagara River, including Niagara Falls, connects Lake Erie to Lake Ontario. The Welland Canal, bypassing the Niagara River, connects Lake Erie to Lake Ontario. The Saint Lawrence River and the Saint Lawrence Seaway connect Lake Ontario to the Gulf of Saint Lawrence, which connects to the Atlantic Ocean.

Lake Michigan–Huron

Lakes Huron and Michigan are sometimes considered a single lake, called Lake Michigan–Huron, because they are one hydrological body of water connected by the Straits of Mackinac. The straits are wide and deep; the water levels rise and fall together, and the flow between Michigan and Huron frequently reverses direction.

Large bays and related significant bodies of water

Lake Nipigon, connected to Lake Superior by the Nipigon River, is surrounded by sill-like formations of mafic and ultramafic igneous rock hundreds of meters high.
The lake lies in the Nipigon Embayment, a failed arm of the triple junction (centered beneath Lake Superior) in the Midcontinent Rift System event, estimated at 1.1 billion years ago. Green Bay is an arm of Lake Michigan along the south coast of the Upper Peninsula of Michigan and the east coast of Wisconsin. It is separated from the rest of the lake by the Door Peninsula in Wisconsin, the Garden Peninsula in Michigan, and the chain of islands between them, all of which were formed by the Niagara Escarpment. Lake Winnebago, connected to Green Bay by the Fox River, serves as part of the Fox–Wisconsin Waterway and is part of a larger system of lakes in Wisconsin known as the Winnebago Pool. Grand Traverse Bay is an arm of Lake Michigan on Michigan's west coast and is one of the largest natural harbors in the Great Lakes. The bay has one large peninsula and one major island known as Power Island. Its name is derived from Jacques Marquette's crossing of the bay from Norwood to Northport which he called La Grande Traversee. Georgian Bay is an arm of Lake Huron, extending northeast from the lake entirely within Ontario. The bay, along with its narrow westerly extensions of the North Channel and Mississagi Strait, is separated from the rest of the lake by the Bruce Peninsula, Manitoulin Island, and Cockburn Island, all of which were formed by the Niagara Escarpment. Lake Nipissing, connected to Georgian Bay by the French River, contains two volcanic pipes, which are the Manitou Islands and Callander Bay. These pipes were formed by a violent, supersonic eruption of deep origin. The lake lies in the Ottawa-Bonnechere Graben, a Mesozoic rift valley that formed 175 million years ago. Lake Simcoe, connected to Georgian Bay by the Severn River, serves as part of the Trent–Severn Waterway, a canal route traversing Southern Ontario between Lakes Ontario and Huron. Lake St. Clair, connected with Lake Huron to its north by the St. 
Clair River and with Lake Erie to its south by the Detroit River. Although it is 17 times smaller in area than Lake Ontario and only rarely included in the listings of the Great Lakes, proposals for its official recognition as a Great Lake are occasionally made, which would affect its inclusion in scientific research projects designated as related to "The Great Lakes". Saginaw Bay, an extension of Lake Huron into the Lower Peninsula of Michigan, fed by the Saginaw and other rivers, has the largest contiguous freshwater wetland in the United States.

Islands

Dispersed throughout the Great Lakes are approximately 35,000 islands. The largest among them is Manitoulin Island in Lake Huron, the largest island in any inland body of water in the world. The second-largest island is Isle Royale in Lake Superior. Both of these islands are large enough to contain multiple lakes themselves—for instance, Manitoulin Island's Lake Manitou is the world's largest lake on a freshwater island. Some of these lakes even have their own islands, like Treasure Island in Lake Mindemoya on Manitoulin Island.

Peninsulas

The Great Lakes also have several peninsulas between them, including the Door Peninsula, the Peninsulas of Michigan, and the Ontario Peninsula. Some of these peninsulas even contain smaller peninsulas, such as the Keweenaw Peninsula, the Thumb Peninsula, the Bruce Peninsula, and the Niagara Peninsula. Population centers on the peninsulas include Grand Rapids, Flint, and Detroit in Michigan, along with London, Hamilton, Brantford, and Toronto in Ontario.

Shipping connection to the ocean

Although the Saint Lawrence Seaway and Great Lakes Waterway make the Great Lakes accessible to ocean-going vessels, shifts in shipping to wider ocean-going container ships—which do not fit through the locks on these routes—have limited container shipping on the lakes.
Most Great Lakes trade is of bulk material, and bulk freighters of Seawaymax-size or less can move throughout the entire lakes and out to the Atlantic. Larger ships are confined to working within the lakes. Only barges can access the Illinois Waterway system, providing access to the Gulf of Mexico via the Mississippi River. Despite their vast size, large sections of the Great Lakes freeze over in winter, interrupting most shipping from January to March. Some icebreakers ply the lakes, keeping the shipping lanes open through other periods of ice on the lakes. The Great Lakes are connected by the Chicago Sanitary and Ship Canal to the Gulf of Mexico via the Illinois River (from the Chicago River) and the Mississippi River. An alternate route is via the Illinois River (from Chicago), to the Mississippi, up the Ohio, and then through the Tennessee–Tombigbee Waterway (a combination of a series of rivers and lakes and canals), to Mobile Bay and the Gulf of Mexico. Commercial tug-and-barge traffic on these waterways is heavy. Pleasure boats can enter or exit the Great Lakes by way of the Erie Canal and Hudson River in New York. The Erie Canal connects to the Great Lakes at the east end of Lake Erie (at Buffalo, New York) and at the south side of Lake Ontario (at Oswego, New York).

Water levels

The lakes were originally fed by both precipitation and meltwater from glaciers which are no longer present. In modern times, only about 1% of volume per year is "new" water, originating from rivers, precipitation, and groundwater springs. In the post-glacial period, evaporation and drainage have generally been balanced, making the levels of the lakes relatively constant. Intensive human population growth began in the region in the 20th century and continues today.
At least two human water use activities have been identified as having the potential to affect the lakes' levels: diversion (the transfer of water to other watersheds) and consumption (substantially done today by the use of lake water to power and cool electric generation plants, resulting in evaporation). Outflows through the Chicago Sanitary and Ship Canal are more than balanced by artificial inflows via the Ogoki River and Long Lake/Kenogami River diversions. Fluctuation of the water levels in the lakes has been observed since records began in 1918. The water level of Lake Michigan–Huron remained fairly constant over the 20th century. Recent lake levels include record low levels in 2013 in Lakes Superior, Erie, and Michigan–Huron, followed by record high levels in 2020 in the same lakes. The water level in Lake Ontario has remained relatively constant in the same time period, hovering around the historical average level. The lake levels are affected primarily by changes in regional meteorology and climatology. The outflows from lakes Superior and Ontario are regulated, while the outflows of Michigan–Huron and Erie are not regulated at all. Ontario is the most tightly regulated, with its outflow controlled by the Moses-Saunders Power Dam, which explains its consistent historical levels.

Etymology

Lake Erie: from the Erie tribe, a shortened form of the Iroquoian word 'long tail'.
Lake Huron: named for the inhabitants of the area, the Wyandot (or "Hurons"), by the first French explorers. The Wyandot originally referred to the lake by the name , a word which has been variously translated as "Freshwater Sea", "Lake of the Hurons", or simply "lake".
Lake Michigan: from the Ojibwe word "great water" or "large lake".
Lake Ontario: from the Wyandot word "lake of shining waters".
Lake Superior: English translation of the French term "upper lake", referring to its position north of Lake Huron.
The indigenous Ojibwe call it (from Ojibwe "big, large, great"; "water, lake, sea"). Popularized in French-influenced transliteration as Gitchigumi, as in Gordon Lightfoot's 1976 story song "The Wreck of the Edmund Fitzgerald", or Gitchee Gumee, as in Henry Wadsworth Longfellow's 1855 epic poem The Song of Hiawatha.

Statistics

The Great Lakes contain 21% of the world's surface fresh water: , or 6.0×10¹⁵ U.S. gallons, that is 6 quadrillion U.S. gallons (2.3×10¹⁶ liters). The lakes contain about 84% of the surface freshwater of North America; if the water were evenly distributed over the entire continent's land area, it would reach a depth of 5 feet (1.5 meters). This is enough water to cover the 48 contiguous U.S. states to a uniform depth of . Although the lakes contain a large percentage of the world's fresh water, the Great Lakes supply only a small portion of U.S. drinking water on a national basis. The total surface area of the lakes is approximately —nearly the same size as the United Kingdom, and larger than the U.S. states of New York, New Jersey, Connecticut, Rhode Island, Massachusetts, Vermont, and New Hampshire combined. The Great Lakes coast measures approximately , but the length of a coastline is impossible to measure exactly and is not a well-defined measure. Canada borders approximately of coastline, while the remaining are bordered by the United States. Michigan has the longest Great Lakes shoreline of the United States, bordering roughly of lakes, followed by Wisconsin (), New York (), and Ohio (). Traversing the shoreline of all the lakes would cover a distance roughly equivalent to travelling half-way around the world at the equator. A notable modern phenomenon is the formation of ice volcanoes over the lakes during wintertime. Storm-generated waves carve the lakes' ice sheet and create conical mounds through the eruption of water and slush.
The process is only well-documented in the Great Lakes, and has been credited with sparing the southern shorelines from worse rocky erosion.

Geology

It has been estimated that the foundational geology that created the conditions shaping the present-day upper Great Lakes was laid from 1.1 to 1.2 billion years ago, when two previously fused tectonic plates split apart and created the Midcontinent Rift, which crossed the Great Lakes Tectonic Zone. A valley was formed, providing a basin that eventually became modern-day Lake Superior. When a second fault line, the Saint Lawrence rift, formed approximately 570 million years ago, the basis for Lakes Ontario and Erie was created, along with what would become the Saint Lawrence River. The Great Lakes are estimated to have been formed at the end of the Last Glacial Period (the Wisconsin glaciation ended 10,000 to 12,000 years ago), when the Laurentide Ice Sheet receded. The retreat of the ice sheet left behind a large amount of meltwater (Lake Algonquin, Lake Chicago, Glacial Lake Iroquois, and Champlain Sea) that filled up the basins that the glaciers had carved, thus creating the Great Lakes as we know them today. Because of the uneven nature of glacier erosion, some higher hills became Great Lakes islands. The Niagara Escarpment follows the contour of the Great Lakes between New York and Wisconsin. Land below the glaciers "rebounded" as it was uncovered. Since the glaciers covered some areas longer than others, this glacial rebound occurred at different rates.

Climate

The Great Lakes have a humid continental climate, Köppen climate classification Dfa (in southern areas) and Dfb (in northern parts), with varying influences from air masses from other regions, including dry, cold Arctic systems, mild Pacific air masses from the west, and warm, wet tropical systems from the south and the Gulf of Mexico.
The lakes have a moderating effect on the climate; they can also increase precipitation totals and produce lake-effect snowfall.

Lake effect

The Great Lakes can have an effect on regional weather called lake-effect snow, which is sometimes very localized. Even late in winter, the lakes often have no icepack in the middle. The prevailing winds from the west pick up the air and moisture from the lake surface, which is slightly warmer in relation to the cold surface winds above. As the slightly warmer, moist air passes over the colder land surface, the moisture often produces concentrated, heavy snowfall that sets up in bands or "streamers". This is similar to the effect of warmer air dropping snow as it passes over mountain ranges. During freezing weather with high winds, the "snowbelts" receive regular snowfall from this localized weather pattern, especially along the eastern shores of the lakes. Snowbelts are found in Wisconsin, Michigan, Ohio, Pennsylvania, New York, and Ontario. Related to the lake effect is the regular occurrence of fog, particularly along the shorelines of the lakes. This is most noticeable along Lake Superior's shores. The lakes tend to moderate seasonal temperatures to some degree, but not with as large an influence as do large oceans; they absorb heat and cool the air in summer, then slowly radiate that heat in autumn. They protect against frost during transitional weather and keep the summertime temperatures cooler than further inland. This effect can be very localized and overridden by offshore wind patterns. This temperature buffering produces areas known as "fruit belts", where fruit can be produced that is typically grown much farther south. For instance, western Michigan has apple orchards, and cherry orchards are cultivated adjacent to the lake shore as far north as the Grand Traverse Bay. Near Collingwood, Ontario, commercial fruit orchards, including a few wineries, exist near the shoreline of southern Nottawasaga Bay.
The eastern shore of Lake Michigan and the southern shore of Lake Erie have many successful wineries because of the lakes' moderating effects, as do the large commercial fruit and wine growing areas of the Niagara Peninsula located between Lake Erie and Lake Ontario. A similar phenomenon allows wineries to flourish in the Finger Lakes region of New York, as well as in Prince Edward County, Ontario, on Lake Ontario's northeast shore. The Great Lakes have been observed to help intensify storms, such as Hurricane Hazel in 1954, and the 2011 Goderich, Ontario tornado, which moved onshore as a tornadic waterspout. In 1996, a rare tropical or subtropical storm was observed forming in Lake Huron, dubbed the 1996 Lake Huron cyclone. Rather large severe thunderstorms covering wide areas are well known in the Great Lakes during mid-summer; these mesoscale convective complexes (MCCs) can cause damage to wide swaths of forest and shatter glass in city buildings. These storms mainly occur during the night, and the systems sometimes have small embedded tornadoes, but more often straight-line winds accompanied by intense lightning.

Ecology

Historically, the Great Lakes, in addition to their lake ecology, were surrounded by various forest ecoregions (except in a relatively small area of southeast Lake Michigan where savanna or prairie occasionally intruded). Logging, urbanization, and agricultural uses have changed that relationship. In the early 21st century, Lake Superior's shores are 91% forested, Lake Huron 68%, Lake Ontario 49%, Lake Michigan 41%, and Lake Erie, where logging and urbanization have been most extensive, 21%. Some of these forests are second or third growth (i.e. they have been logged before, changing their composition). At least 13 wildlife species are documented as having become extinct since the arrival of Europeans, and many more are threatened or endangered. Meanwhile, exotic and invasive species have also been introduced.
Fauna

While the organisms living on the bottom of shallow waters are similar to those found in smaller lakes, the deep waters contain organisms found only in deep, cold lakes of the northern latitudes. These include the delicate opossum shrimp (order Mysida), the deepwater scud (a crustacean of the order Amphipoda), two types of copepods, and the deepwater sculpin (a spiny, large-headed fish). The Great Lakes are an important source of fishing. Early European settlers were astounded by both the variety and quantity of fish; there were 150 different species in the Great Lakes. Throughout history, fish populations were the early indicator of the condition of the Lakes and have remained one of the key indicators even in the current era of sophisticated analyses and measuring instruments. According to the bi-national (U.S. and Canadian) resource book, The Great Lakes: An Environmental Atlas and Resource Book: "The largest Great Lakes fish harvests were recorded in 1889 and 1899 at some [147 million pounds]." By 1801, the New York Legislature found it necessary to pass regulations curtailing obstructions to the natural migrations of Atlantic salmon from Lake Erie into their spawning channels. In the early 19th century, the government of Upper Canada found it necessary to introduce similar legislation prohibiting the use of weirs and nets at the mouths of Lake Ontario's tributaries. Other protective legislation was passed, but enforcement remained difficult. On both sides of the Canada–United States border, dams and impoundments have multiplied, necessitating more regulatory efforts. Concerns by the mid-19th century included obstructions in the rivers which prevented salmon and lake sturgeon from reaching their spawning grounds. The Wisconsin Fisheries Commission noted a reduction of roughly 25% in general fish harvests by 1875. The states have removed dams from rivers where necessary.
Overfishing has been cited as a possible reason for a decrease in the population of various whitefish, important because of their culinary desirability and, hence, economic consequence. Moreover, between 1879 and 1899, reported whitefish harvests declined from some 24.3 million pounds (11 million kg) to just over 9 million pounds (4 million kg). By 1900, commercial fishermen on Lake Michigan were hauling in an average of 41 million pounds of fish annually. By 1938, Wisconsin's commercial fishing operations were motorized and mechanized, generating jobs for more than 2,000 workers, and hauling 14 million pounds per year. The population of giant freshwater mussels was eliminated as the mussels were harvested for use as buttons by early Great Lakes entrepreneurs. The Great Lakes: An Environmental Atlas and Resource Book (1972) notes: "Only pockets remain of the once large commercial fishery." Water quality improvements realized during the 1970s and 1980s, combined with successful salmonid stocking programs, have enabled the growth of a large recreational fishery. The last commercial fisherman left Milwaukee in 2011 because of overfishing and anthropogenic changes to the biosphere.

Invasive species

Since the 19th century, an estimated 160 new species have found their way into the Great Lakes ecosystem, and many have become invasive; introduced via overseas ship ballast water and ship hull fouling, they are causing severe economic and ecological impacts. According to the Inland Seas Education Association, on average a new species enters the Great Lakes every eight months. Introductions into the Great Lakes include the zebra mussel, which was first discovered in 1988, and the quagga mussel in 1989. Since 2000, the invasive quagga mussel has smothered the bottom of Lake Michigan almost from shore to shore, and their numbers are estimated at 900 trillion. The mollusks are efficient filter feeders, competing with native mussels and reducing available food and spawning grounds for fish.
In addition, the mussels may be a nuisance to industries by clogging pipes. The U.S. Fish and Wildlife Service estimated in 2007 that the economic impact of the zebra mussel could be about $5 billion over the next decade. The alewife first entered the system west of Lake Ontario via 19th-century canals. By the 1960s, the small silver fish had become a familiar nuisance to beachgoers across Lakes Michigan, Huron, and Erie. Periodic mass die-offs result in vast numbers of the fish washing up on shore; estimates by various governments have placed the percentage of Lake Michigan's biomass made up of alewives in the early 1960s as high as 90%. In the late 1960s, the various state and federal governments began stocking several species of salmonids, including the native lake trout as well as non-native chinook and coho salmon; by the 1980s, alewife populations had dropped drastically. The ruffe, a small percid fish from Eurasia, became the most abundant fish species in Lake Superior's Saint Louis River within five years of its detection in 1986. Its range, which has expanded to Lake Huron, poses a significant threat to the lower lake fishery. Five years after first being observed in the St. Clair River, the round goby can now be found in all of the Great Lakes. The goby is considered undesirable for several reasons: it preys upon bottom-feeding fish, overruns optimal habitat, spawns multiple times a season, and can survive poor water quality conditions. The influx of parasitic lamprey populations after the development of the Erie Canal and the much later Welland Canal led the two federal governments of the U.S. and Canada to work on joint proposals to control it. By the mid-1950s, the lake trout populations of Lakes Michigan and Huron were reduced, with the lamprey deemed largely to blame. This led to the launch of the bi-national Great Lakes Fishery Commission.
Several species of exotic water fleas have accidentally been introduced into the Great Lakes, such as the spiny waterflea, Bythotrephes longimanus, and the fishhook waterflea, Cercopagis pengoi, potentially having an effect on the zooplankton population. Several species of crayfish have also been introduced that may compete with native crayfish populations. More recently, an electric fence has been set up across the Chicago Sanitary and Ship Canal in order to keep several species of invasive Asian carp out of the lakes. These fast-growing planktivorous fish have heavily colonized the Mississippi and Illinois river systems. Invasive species, particularly zebra and quagga mussels, may be at least partially responsible for the collapse of the deepwater demersal fish community in Lake Huron, as well as drastic unprecedented changes in the zooplankton community of the lake.

Microbiology

Scientists understand that the micro-aquatic life of the lakes is abundant, but know very little about some of the most plentiful microbes and their environmental effects in the Great Lakes. Although a drop of lake water may contain 1 million bacteria cells and 10 million viruses, only since 2012 has there been a long-term study of the lakes' micro-organisms. Between 2012 and 2019, more than 160 new species were discovered.
Flora

Native habitats and ecoregions in the Great Lakes region include:
Alvar
Boreal rich fen (such as in Door County)
Eastern forest-boreal transition
Eastern Great Lakes lowland forests
Southern Great Lakes forests
Central forest-grasslands transition
Upper Midwest forest-savanna transition
Western Great Lakes forests
Central Canadian Shield forests
Laurentian Mixed Forest Province
Beech-maple forest
Habitats of the Indiana Dunes

Plant lists include:
List of Michigan flowers
List of Minnesota wild flowers
List of Minnesota trees

Logging

Logging of the extensive forests in the Great Lakes region removed riparian and adjacent tree cover over rivers and streams, which provide shade, moderating water temperatures in fish spawning grounds. Removal of trees also destabilized the soil, with greater volumes washed into stream beds causing siltation of gravel beds, and more frequent flooding. Running cut logs down the tributary rivers into the Great Lakes also dislocated sediments. In 1884, the New York Fish Commission determined that the dumping of sawmill waste (chips and sawdust) had impacted fish populations.

Pollution

The first U.S. Clean Water Act, passed by a Congressional override after being vetoed by U.S. President Richard Nixon in 1972, was a key piece of legislation, along with the bi-national Great Lakes Water Quality Agreement signed by Canada and the U.S. A variety of steps taken to process industrial and municipal pollution discharges into the system greatly improved water quality by the 1980s, and Lake Erie in particular is significantly cleaner. Discharge of toxic substances has been sharply reduced. Federal and state regulations control substances like PCBs. The first of 43 "Great Lakes Areas of Concern" to be formally "de-listed" through successful cleanup was Ontario's Collingwood Harbour in 1994; Ontario's Severn Sound followed in 2003. Presque Isle Bay in Pennsylvania is formally listed as in recovery, as is Ontario's Spanish Harbour.
Dozens of other Areas of Concern have received partial cleanups, such as the Rouge River (Michigan) and Waukegan Harbor (Illinois). Phosphate detergents were historically a major source of nutrients for Great Lakes algae blooms, in particular in the warmer and shallower portions of the system, such as Lake Erie, Saginaw Bay, Green Bay, and the southernmost portion of Lake Michigan. By the mid-1980s, most jurisdictions bordering the Great Lakes had controlled phosphate detergents. Blue-green algae, or cyanobacteria, blooms have been problematic on Lake Erie since 2011. "Not enough is being done to stop fertilizer and phosphorus from getting into the lake and causing blooms," said Michael McKay, executive director of the Great Lakes Institute for Environmental Research (GLIER) at the University of Windsor. The largest Lake Erie bloom to date occurred in 2015, rating 10.5 on the severity index; the 2011 bloom rated 10. In early August 2019, satellite images depicted a bloom stretching up to 1,300 square kilometres on Lake Erie, with the heaviest concentration near Toledo, Ohio. "A large bloom does not necessarily mean the cyanobacteria ... will produce toxins", said Michael McKay, of the University of Windsor. Water quality testing was underway in August 2019.

Mercury

Until 1970, mercury was not listed as a harmful chemical, according to the United States Federal Water Quality Administration. In the 21st century, mercury has become more apparent in water tests. Mercury compounds have been used in paper mills to prevent slime from forming during their production, and chemical companies have used mercury to separate chlorine from brine solutions. Studies conducted by the Environmental Protection Agency have shown that when the mercury comes in contact with many of the bacteria and compounds in the fresh water, it forms the compound methyl mercury, which has a much greater impact on human health than elemental mercury due to a higher propensity for absorption.
This form of mercury is not detrimental to a majority of fish types, but is very detrimental to people and other wildlife that consume the fish. Mercury has been linked to health problems such as birth defects in humans and animals, and the near extinction of eagles in the Great Lakes region.

Sewage

The amount of raw sewage dumped into the waters was the primary focus of both the first Great Lakes Water Quality Agreement and federal laws passed in both countries during the 1970s. Implementation of secondary treatment of municipal sewage by major cities greatly reduced the routine discharge of untreated sewage during the 1970s and 1980s. The International Joint Commission in 2009 summarized the change: "Since the early 1970s, the level of treatment to reduce pollution from waste water discharges to the Great Lakes has improved considerably. This is a result of significant expenditures to date on both infrastructure and technology, and robust regulatory systems that have proven to be, on the whole, quite effective." The commission reported that all urban sewage treatment systems on the U.S. side of the lakes had implemented secondary treatment, as had all on the Canadian side except for five small systems. Though contrary to federal laws in both countries, those treatment system upgrades have not yet eliminated combined sewer overflow events. This describes when older sewerage systems, which combine storm water with sewage into single sewers heading to the treatment plant, are temporarily overwhelmed by heavy rainstorms. Local sewage treatment authorities then must release untreated effluent, a mix of rainwater and sewage, into local water bodies. While enormous public investments such as the Deep Tunnel projects in Chicago and Milwaukee have greatly reduced the frequency and volume of these events, they have not been eliminated. The number of such overflow events in Ontario, for example, is flat, according to the International Joint Commission.
Reports about this issue on the U.S. side highlight five large municipal systems (those of Detroit, Cleveland, Buffalo, Milwaukee and Gary) as being the largest current periodic sources of untreated discharges into the Great Lakes. Impacts of climate change on algae Algae such as diatoms, along with other phytoplankton, are photosynthetic primary producers supporting the food web of the Great Lakes, and have been affected by global warming. Changes in the size or in the function of the primary producers may have a direct or an indirect impact on the food web. Photosynthesis carried out by diatoms constitutes about one fifth of the total photosynthesis. By taking CO2 out of the water to photosynthesize, diatoms help to stabilize the pH of the water, as CO2 would otherwise react with water to produce carbonic acid. Diatoms acquire inorganic carbon through passive diffusion of CO2 and HCO3−, and use carbonic anhydrase-mediated active transport to speed up this process. Large diatoms require more carbon uptake than smaller diatoms. There is a positive correlation between the surface area and the chlorophyll concentration of diatom cells. History Several Native American populations (Paleo-Indians) inhabited the region around 10,000 BC, after the end of the Wisconsin glaciation. The peoples of the Great Lakes traded with the Hopewell culture from around 1000 AD, as copper nuggets have been extracted from the region and fashioned into ornaments and weapons in the mounds of Southern Ohio. The Rush–Bagot Treaty signed in 1818, after the War of 1812, and the later Treaty of Washington eventually led to a complete disarmament of naval vessels in the Great Lakes. Nonetheless, both nations maintained coast guard vessels in the Great Lakes. The brigantine Le Griffon, which was commissioned by René-Robert Cavelier, Sieur de La Salle, was built at Cayuga Creek, near the southern end of the Niagara River, and became the first known sailing ship to travel the upper Great Lakes on August 7, 1679.
During settlement, the Great Lakes and their rivers were the only practical means of moving people and freight. Barges from middle North America were able to reach the Atlantic Ocean from the Great Lakes when the Welland Canal opened in 1824 and the later Erie Canal opened in 1825. By 1848, with the opening of the Illinois and Michigan Canal at Chicago, direct access to the Mississippi River was possible from the lakes. With these two canals an all-inland water route was provided between New York City and New Orleans. The main business of many of the passenger lines in the 19th century was transporting immigrants. Many of the larger cities owe their existence to their position on the lakes as a freight destination as well as for being a magnet for immigrants. After railroads and surface roads developed, the freight and passenger businesses dwindled and, except for ferries and a few foreign cruise ships, have now vanished. The immigration routes still have an effect today. Immigrants often formed their own communities, and some areas have a pronounced ethnicity, such as Dutch, German, Polish, Finnish, and many others. Since many immigrants settled for a time in New England before moving westward, many areas on the U.S. side of the Great Lakes also have a New England feel, especially in home styles and accent. Since general freight these days is transported by railroads and trucks, domestic ships mostly move bulk cargoes, such as iron ore, coal and limestone for the steel industry. The domestic bulk freight developed because of the nearby mines. It was more economical to transport the ingredients for steel to centralized plants rather than to make steel on the spot. Grain exports are also a major cargo on the lakes. In the 19th and early 20th centuries, iron and other ores such as copper were shipped south (downbound), and supplies, food, and coal were shipped north (upbound).
Because of the location of the coal fields in Pennsylvania and West Virginia, and the general northeast track of the Appalachian Mountains, railroads naturally developed shipping routes that went due north to ports such as Erie, Pennsylvania and Ashtabula, Ohio. Because the lake maritime community largely developed independently, it has some distinctive vocabulary. Ships, no matter the size, are called boats. When the sailing ships gave way to steamships, they were called steamboats—the same term used on the Mississippi. The ships also have a distinctive design; ships that primarily trade on the lakes are known as lakers. Foreign boats are known as salties. Since about 1950, one of the more common sights on the lakes has been the 1,000-by-105-foot (305-by-32-meter) self-unloader. This is a laker with a conveyor belt system that can unload itself by swinging a crane over the side. Today, the Great Lakes fleet is much smaller in numbers than it once was because of the increased use of overland freight, and a few larger ships replacing many small ones. During World War II, the risk of submarine attacks against coastal training facilities motivated the United States Navy to operate two aircraft carriers on the Great Lakes. Both served as training ships to qualify naval aviators in carrier landing and takeoff. Lake Champlain briefly became the sixth Great Lake of the United States on March 6, 1998, when President Clinton signed Senate Bill 927. This bill, which reauthorized the National Sea Grant Program, contained a line declaring Lake Champlain to be a Great Lake. Not coincidentally, this status allows neighboring states to apply for additional federal research and education funds allocated to these national resources. Following a small uproar, the Senate voted to revoke the designation on March 24 (although New York and Vermont universities would continue to receive funds to monitor and study the lake). Alan B.
McCullough has written that the fishing industry of the Great Lakes got its start "on the American side of Lake Ontario in Chaumont Bay, near the Maumee River on Lake Erie, and on the Detroit River at about the time of the War of 1812". Although the region was sparsely populated until the 1830s, meaning there was little local demand and transporting fish was prohibitively costly, there were economic and infrastructure developments promising for the future of the fishing industry going into the 1830s, particularly the 1825 opening of the Erie Canal and of the Welland Canal a few years later. The fishing industry expanded particularly in the waters associated with the fur trade that connect Lake Erie and Lake Huron. In fact, two major suppliers of fish in the 1830s were the fur trading companies Hudson's Bay Company and the American Fur Company. The catch from these waters was sent to the growing market for salted fish in Detroit, where merchants involved in the fur trade had already gained some experience handling salted fish. One such merchant was John P. Clark, a shipbuilder and merchant who began selling fish in the area of Manitowoc, Wisconsin, where whitefish was abundant. Another operation cropped up in Georgian Bay, Canadian waters plentiful with trout as well as whitefish. In 1831, Alexander MacGregor from Goderich, Ontario, found whitefish and herring in abundant supply around the Fishing Islands. A contemporary account by Methodist missionary John Evans describes the fish as resembling a "bright cloud moving rapidly through the water". From 1844 through 1857, palace steamers carried passengers and cargo around the Great Lakes. In the first half of the 20th century, large luxurious passenger steamers sailed the lakes in opulence. The Detroit and Cleveland Navigation Company had several vessels at the time and hired workers from all walks of life to help operate these vessels.
Several ferries currently operate on the Great Lakes to carry passengers to various islands. As of 2007, four car ferry services cross the Great Lakes: two on Lake Michigan, a steamer from Ludington, Michigan, to Manitowoc, Wisconsin, and a high-speed catamaran from Milwaukee to Muskegon, Michigan; one on Lake Erie, a boat from Kingsville, Ontario, or Leamington, Ontario, to Pelee Island, Ontario, then on to Sandusky, Ohio; and one on Lake Huron, the M.S. Chi-Cheemaun, which runs between Tobermory and South Baymouth, Manitoulin Island, operated by the Owen Sound Transportation Company. An international ferry across Lake Ontario from Rochester, New York, to Toronto ran during 2004 and 2005 but is no longer in operation. Shipwrecks The large size of the Great Lakes increases the risk of water travel; storms and reefs are common threats. The lakes are prone to sudden and severe storms, in particular in the autumn, from late October until early December. Hundreds of ships have met their end on the lakes. The greatest concentration of shipwrecks lies near Thunder Bay (Michigan), beneath Lake Huron, near the point where eastbound and westbound shipping lanes converge. The Lake Superior shipwreck coast from Grand Marais, Michigan, to Whitefish Point became known as the "Graveyard of the Great Lakes". More vessels have been lost in the Whitefish Point area than any other part of Lake Superior. The Whitefish Point Underwater Preserve serves as an underwater museum to protect the many shipwrecks in this area. The first ship to sink in Lake Michigan was Le Griffon, also the first ship to sail the Great Lakes. Caught in a 1679 storm while trading furs between Green Bay and Michilimackinac, she was lost with all hands aboard. Her wreck may have been found in 2004, but a wreck subsequently discovered in a different location was also claimed in 2014 to be Le Griffon.
The largest and last major freighter wrecked on the lakes was the , which sank on November 10, 1975, just over offshore from Whitefish Point on Lake Superior. The largest loss of life in a shipwreck out on the lakes may have been that of , wrecked in 1860 with the loss of around 400 lives on Lake Michigan. In an incident at a Chicago dock in 1915, the rolled over while loading passengers, killing 841. In 2007, the Great Lakes Shipwreck Historical Society announced that it had found the wreckage of Cyprus, a long, century-old ore carrier. Cyprus sank during a Lake Superior storm on October 11, 1907, during its second voyage while hauling iron ore from Superior, Wisconsin, to Buffalo, New York. The entire crew of 23 drowned, except one, Charles Pitz, who floated on a life raft for almost seven hours. In 2008, deep-sea divers in Lake Ontario found the wreck of the 1780 Royal Navy warship in what has been described as an "archaeological miracle". There are no plans to raise her as the site is being treated as a war grave. In 2010, L.R. Doty was found in Lake Michigan by an exploration diving team led by dive boat Captain Jitka Hanakova from her boat Molly V. The ship sank in October 1898, probably attempting to rescue a small schooner, Olive Jeanette, during a terrible storm. Still missing are the two last warships to sink in the Great Lakes, the French minesweepers Inkerman and Cerisoles, which vanished in Lake Superior during a blizzard in 1918. 78 lives were lost, making it the largest loss of life in Lake Superior and the greatest unexplained loss of life in the Great Lakes. Economy Shipping Except when the water is frozen during winter, more than 100 lake freighters operate continuously on the Great Lakes, which remain a major water transport corridor for bulk goods. The Great Lakes Waterway connects all the lakes; the smaller Saint Lawrence Seaway connects the lakes to the Atlantic Ocean.
Some lake freighters are too large to use the Seaway and operate only on the Waterway and lakes. In 2002, 162 million net tons of dry bulk cargo were moved on the Lakes. This was, in order of volume: iron ore, grain and potash. The iron ore and much of the stone and coal are used in the steel industry. There is also some shipping of liquid and containerized cargo. Only four bridges are on the Great Lakes other than Lake Ontario because of the cost of building structures high enough for ships to pass under. The Blue Water Bridge is, for example, more than 150 feet high and more than a mile long. Major ports on the Great Lakes include Duluth-Superior, Chicago, Detroit, Cleveland, Twin Harbors, Hamilton and Thunder Bay. Recreation Tourism and recreation are major industries on the Great Lakes. A few small cruise ships operate on the Great Lakes, including some sailing ships. Sport fishing, commercial fishing, and Native American fishing represent a U.S.$4 billion a year industry, with salmon, whitefish, smelt, lake trout, bass and walleye being major catches. Many other water sports are practiced on the lakes, such as yachting, sea kayaking, diving, kitesurfing, powerboating, and lake surfing. The Great Lakes Circle Tour is a designated scenic road system connecting all of the Great Lakes and the Saint Lawrence River. Legislation In 1872, a treaty gave access to the St. Lawrence River to the United States and access to Lake Michigan to the Dominion of Canada. The International Joint Commission was established in 1909 to help prevent and resolve disputes relating to the use and quality of boundary waters, and to advise Canada and the United States on questions related to water resources. Diversion of lake water is a concern to both Americans and Canadians. Some water is diverted through the Chicago River to operate the Illinois Waterway, but the flow is limited by treaty.
Possible schemes for bottled water plants and diversion to dry regions of the continent raise concerns. Under the U.S. "Water Resources Development Act", diversion of water from the Great Lakes Basin requires the approval of all eight Great Lakes governors through the Great Lakes Commission, which rarely occurs. International treaties regulate large diversions. In 1998, the Canadian company Nova Group won approval from the Province of Ontario to withdraw Lake Superior water annually for shipment by tanker to Asian countries. Public outcry forced the company to abandon the plan before it began. Since that time, the eight Great Lakes Governors and the Premiers of Ontario and Quebec have negotiated the Great Lakes-Saint Lawrence River Basin Sustainable Water Resources Agreement and the Great Lakes-St. Lawrence River Basin Water Resources Compact that would prevent most future diversion proposals and all long-distance ones. The agreements strengthen protection against abusive water withdrawal practices within the Great Lakes basin. On December 13, 2005, the Governors and Premiers signed these two agreements, the first of which is between all ten jurisdictions. It is somewhat more detailed and protective, though its legal strength has not yet been tested in court. The second, the Great Lakes Compact, has been approved by the state legislatures of all eight states that border the Great Lakes as well as the U.S. Congress, and was signed into law by President George W. Bush on October 3, 2008. The Great Lakes Restoration Initiative, described as "the largest investment in the Great Lakes in two decades", was funded at $475 million in the U.S. federal government's Fiscal Year 2011 budget, and $300 million in the Fiscal Year 2012 budget. Through the program a coalition of federal agencies is making grants to local and state entities for toxics cleanups, wetlands and coastline restoration projects, and invasive species-related projects.
The Great Lakes Restoration Initiative Act of 2019 passed as Public Law 116-294 on January 5, 2021. See also Alliance for the Great Lakes Boundary Waters Treaty of 1909 Eastern Continental Divide Great Lakes census statistical areas Great Lakes Protection Fund Great Lakes WATER Institute Great Recycling and Northern Development Canal List of municipalities on the Great Lakes Michigan Islands National Wildlife Refuge Populated islands of the Great Lakes Sixty Years' War for control of the Great Lakes References Further reading Beltran, R. et al. The Great Lakes: An Environmental Atlas and Resource Book (United States Environmental Protection Agency and Government of Canada, 1995). Coon, W.F. and R.A. Sheets. Estimate of Ground Water in Storage in the Great Lakes Basin [Scientific Investigations Report 2006-5180]. Department of the Interior, U.S. Geological Survey, 2006. Riley, John L. (2013) The Once and Future Great Lakes Country: An Ecological History (McGill-Queen's University Press, 516 pages); traces environmental change in the region since the last ice age. Holling, Holling Clancy. Paddle to the Sea, an illustrated children's book about the Great Lakes and their environment. Beautiful and educational. External links Great Lakes website of the Canadian Department of the Environment Great Lakes website of the United States Environmental Protection Agency Binational website of USEPA and Environment Canada for Great Lakes Water Quality Great Lakes Environmental Research Laboratory website (an arm of the American National Oceanic and Atmospheric Administration) Great Lakes Information Network, sponsored by the Great Lakes Commission, an official American interstate compact agency. Great Lakes Echo, a publication covering Great Lakes environmental issues Maritime History of the Great Lakes, digital library covering Great Lakes history.
Dynamically updated data Surface temperatures Water levels Currents Ship locations Water levels since 1918 Great Lakes Eastern Canada Great Lakes region (U.S.) Lake groups
https://en.wikipedia.org/wiki/German
German
German(s) may refer to: Germany (of or related to) Germania (historical use) Germans, citizens of Germany, people of German ancestry, or native speakers of the German language For citizens of Germany, see also German nationality law Germanic peoples (Roman times) German language any of the Germanic languages German cuisine, traditional foods of Germany People German (given name) German (surname) Germán, a Spanish name Places German (parish), Isle of Man German, Albania, or Gërmej German, Bulgaria German, Iran German, North Macedonia German, New York, U.S. Agios Germanos, Greece Other uses German (mythology), a South Slavic mythological being Germans (band), a Canadian rock band "German" (song), a 2019 song by No Money Enterprise The German, a 2008 short film "The Germans", an episode of Fawlty Towers The German, a nickname for Congolese rebel André Kisase Ngandu See also Germanic (disambiguation) Germany (disambiguation) Germanus (disambiguation) Germen (disambiguation) Germain (disambiguation) Germaine (disambiguation) Germantown (disambiguation) Germane, a simple chemical compound of germanium and hydrogen Language and nationality disambiguation pages
https://en.wikipedia.org/wiki/Girth%20%28graph%20theory%29
Girth (graph theory)
In graph theory, the girth of an undirected graph is the length of a shortest cycle contained in the graph. If the graph does not contain any cycles (that is, it is a forest), its girth is defined to be infinity. For example, a 4-cycle (square) has girth 4. A grid has girth 4 as well, and a triangular mesh has girth 3. A graph with girth four or more is triangle-free. Cages A cubic graph (all vertices have degree three) of girth g that is as small as possible is known as a g-cage (or as a (3,g)-cage). The Petersen graph is the unique 5-cage (it is the smallest cubic graph of girth 5), the Heawood graph is the unique 6-cage, the McGee graph is the unique 7-cage and the Tutte–Coxeter graph (the Tutte eight-cage) is the unique 8-cage. There may exist multiple cages for a given girth. For instance there are three nonisomorphic 10-cages, each with 70 vertices: the Balaban 10-cage, the Harries graph and the Harries–Wong graph. Girth and graph coloring For any positive integers g and k, there exists a graph with girth at least g and chromatic number at least k; for instance, the Grötzsch graph is triangle-free and has chromatic number 4, and repeating the Mycielskian construction used to form the Grötzsch graph produces triangle-free graphs of arbitrarily large chromatic number. Paul Erdős was the first to prove the general result, using the probabilistic method. More precisely, he showed that a random graph on n vertices, formed by choosing independently whether to include each edge with probability n^((1−g)/g), has, with probability tending to 1 as n goes to infinity, at most n/2 cycles of length g or less, but has no independent set of size n/2k. Therefore, removing one vertex from each short cycle leaves a smaller graph with girth greater than g, in which each color class of a coloring must be small and which therefore requires at least k colors in any coloring. Explicit, though large, graphs with high girth and chromatic number can be constructed as certain Cayley graphs of linear groups over finite fields.
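The girth definition above translates directly into a short algorithm: run a breadth-first search from every vertex, and whenever an edge closes back onto an already-visited vertex, record the length of the resulting closed walk; the minimum over all starting vertices is the girth. A minimal Python sketch (the adjacency-list format and helper names here are illustrative, not from any particular library), checked against the Petersen graph's girth of 5:

```python
from collections import deque

def girth(adj):
    """Shortest cycle length in an undirected graph given as an
    adjacency list {vertex: list of neighbours}; inf for forests."""
    best = float("inf")
    for root in adj:
        dist = {root: 0}
        parent = {root: None}
        q = deque([root])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:          # tree edge: extend the BFS
                    dist[v] = dist[u] + 1
                    parent[v] = u
                    q.append(v)
                elif v != parent[u]:       # non-tree edge: closes a walk
                    # root->u, the edge (u, v), then v->root is a closed
                    # walk; its length bounds the girth from above, and
                    # the minimum over all roots is exact.
                    best = min(best, dist[u] + dist[v] + 1)
    return best

# The Petersen graph: outer 5-cycle, inner pentagram, five spokes.
petersen = {i: [] for i in range(10)}
def link(a, b):
    petersen[a].append(b)
    petersen[b].append(a)
for i in range(5):
    link(i, (i + 1) % 5)          # outer cycle
    link(5 + i, 5 + (i + 2) % 5)  # inner pentagram
    link(i, 5 + i)                # spoke

print(girth(petersen))  # 5, as stated above for the unique 5-cage
```

For a forest the loop never finds a non-tree edge and the function returns infinity, matching the convention above. Running a BFS from every vertex costs O(n(n+m)) time on an n-vertex, m-edge graph, which is fine at this scale; graph libraries offer faster variants.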
These remarkable Ramanujan graphs also have large expansion coefficient. Related concepts The odd girth and even girth of a graph are the lengths of a shortest odd cycle and shortest even cycle, respectively. The circumference of a graph is the length of the longest (simple) cycle, rather than the shortest. Thought of as the least length of a non-trivial cycle, the girth admits natural generalisations as the 1-systole or higher systoles in systolic geometry. Girth is the dual concept to edge connectivity, in the sense that the girth of a planar graph is the edge connectivity of its dual graph, and vice versa. These concepts are unified in matroid theory by the girth of a matroid, the size of the smallest dependent set in the matroid. For a graphic matroid, the matroid girth equals the girth of the underlying graph, while for a co-graphic matroid it equals the edge connectivity. References Graph invariants
https://en.wikipedia.org/wiki/Gun%20safety
Gun safety
Gun safety is the study and practice of using, transporting, storing and disposing of firearms and ammunition, including the training of gun users, the design of weapons, and formal and informal regulation of gun production, distribution, and usage, for the purpose of avoiding unintentional injury, illness, or death. This includes mishaps like accidental discharge, negligent discharge, and firearm malfunctions, as well as secondary risks like hearing loss, lead poisoning from bullets, and pollution from other hazardous materials in propellants and cartridges. There were 47,000 unintentional firearm deaths worldwide in 2013. History Accidental explosions of stored gunpowder date to the 13th century in Yangzhou, China. Early handheld muskets using matchlock or wheel lock mechanisms were limited by poor reliability and the risk of accidental discharge, which was improved somewhat by the introduction of the flintlock, though unintentional firing continued to be a serious drawback. Percussion caps, introduced in the 1820s, were more reliable, and by 1830 inventors added security pins to their designs to prevent accidental discharges. Trigger guards and grip safeties were further steps leading to the various safeties built into modern firearms. Malfunctions Storage Proper storage prevents unauthorized use or theft of firearms and ammunition, or damage to them. A gun safe or gun cabinet is commonly used to physically prevent access to a firearm. Local laws may require particular standards for the lock, for the strength and burglar resistance of the cabinet, and may even require weapons and ammunition to be stored separately. Rifle or shotgun safes, lighter versions of true safes, are generally the norm for hunters or multiple-firearm owners. Various safety standards, such as the RSC standard and the CDOJ safety standard in the US, exist to set the minimum requirements for qualifying a container as a firearm safety storage device.
Similarly, small handgun safes of various sizes and capacities are preferred for storing small numbers of handguns, although most of them have been found to be unreliable by independent researchers and professional hackers. The locking mechanism plays an important role in the overall safety of a small safe; generally, simplex mechanical locks are found to be the most secure and reliable. For ammunition, some experts recommend storage in secure locations away from firearms. Ammunition should be kept in cool, dry conditions free from contaminating vapors to prevent deterioration of the propellant and cartridge. Handloaders must take special precautions for storing primers and loose gunpowder. Training, habits and mindset Gun safety training teaches a safety mindset, habits, and rules. The mindset is that firearms are inherently dangerous and must always be stored carefully and handled with care. Handlers are taught to treat firearms with respect for their destructive capabilities, and strongly discouraged from playing or toying with firearms, a common cause of accidents. The rules of gun safety follow from this mindset. In 1902, the English politician and game shooting enthusiast Mark Hanbury Beaufoy wrote some much-quoted verses on gun safety, meant to instill the safety mindset. Various similar sayings have since been popularized. Jeff Cooper, an influential figure in modern firearms training, formalized and popularized "Four Rules" of safe firearm handling. Prior lists of gun safety rules included as few as three basic safety rules or as many as ten rules including gun safety and sporting etiquette rules. In addition to Cooper, other influential teachers of gun safety include Massad Ayoob, Clint Smith, Chuck Taylor, Jim Crews, Bob Munden and Ignatius Piazza. The National Rifle Association and other public safety websites provide a similar set of rules. Disassembly Locks There are several types of locks that serve to make it difficult to discharge a firearm.
Locks are considered less effective than keeping firearms stored in a lockable safe, since locks are more easily defeated than approved safes: an unauthorized handler can work on defeating the locked firearm at their leisure. Some manufacturers, such as Taurus, build locks into the firearm itself. California enacted regulations in 2000 requiring locks to be approved by a firearm safety device laboratory, under California Penal Code Section 12088. All locks under this code must pass extensive tests, including saw, pick, pull, and many other tests, in order to be approved for the state of California. If a lock passes the requirements, it is said to be California Department of Justice (CADOJ) approved. Trigger lock There is controversy surrounding manufacturing standards, usage, and legislation of trigger locks. While supporters of trigger locks argue that they will save children by preventing accidents, critics point to demonstrations that some models can be removed by children with very little force and common household tools. Many firearms can discharge when dropped. Firearms that fully disengage the hammer when the safety is on pose less of a risk. A former senior product manager at Master Lock, a trigger lock manufacturer, was quoted as saying "If it is a loaded gun, there isn't a lock out there that will keep it from being fired... If you put a trigger lock on any loaded gun, you are making the gun more dangerous." Critics also point out that a trigger lock will increase the time it takes an owner to respond to a self-defense emergency. In 2008, the U.S. Supreme Court overturned a Washington, D.C. law that required handguns to be locked or otherwise kept inoperative within the home, saying that this "makes it impossible for citizens to use them for the core lawful purpose of self-defense."
Although there are no universal standards for the design or testing of trigger locks, some jurisdictions, such as the state of California, maintain a list of approved trigger lock devices. In Canada, a trigger lock is one of the methods prescribed by law to secure a firearm during transport or storage. Chamber locks Chamber locks aim to block ammunition from being chambered, since most firearms typically cannot be discharged unless the ammunition is in the correct position. They are used to prevent live ammunition from being loaded into a firearm by blocking the chamber with a dummy cartridge or a chamber plug, which is sometimes wedged into place with the use of a tool, in essence jamming the firearm. Another type uses a steel rod that is locked into the safety cartridge with a key. As long as the rod and safety cartridge are engaged, the dummy round cannot eject, nor can live ammunition be loaded into the firearm. Chamber locks work with most firearm types including revolvers, pistols, rifles and shotguns. They are available in any caliber and length, and may include such features as unique keying, rapid removal, and rigorous testing and certification by major state departments such as the California Department of Justice. Some shooting ranges require the handler to insert a temporary chamber plug, which often has a brightly colored external tag, to signal that the chamber is empty of ammunition and blocked, whenever the firearm is not in use. These are called empty chamber indicators, or chamber flags. Cable locks Cable locks are a popular type of lock that usually threads into the receiver through the ejection port of repeating firearms. These locks physically obstruct the movements of the bolt, thereby preventing the cycling of the action, and deny the return to "battery" and the closure of the breech. In many designs of pistol and rifle, they also thread through the magazine well of the firearm to prevent the proper insertion of a magazine.
Smart gun Personalized firearms, or smart guns, are intended to prevent unauthorized use with built-in locks that are released by RFID chips or other proximity devices, fingerprint recognition, magnetic rings, or a microchip implant. Secondary dangers While a firearm's primary danger lies in the discharge of ammunition, there are other ways a firearm may be detrimental to the health of the handler and bystanders. Noise When a firearm is discharged it emits a very loud noise, typically close to the handler's ears. This can cause temporary or permanent hearing damage such as tinnitus. Hearing protection such as earplugs, or earmuffs, or both, can reduce the risk of hearing damage. Some earmuffs or headphones made for shooting and similar loud situations use active noise control. Firearms may also have silencers which reduce the sound intensity from the barrel. Hot gases and debris A firearm emits hot gases, powder, and other debris when discharged. Some firearms, such as semi-automatic and fully automatic firearms, typically eject spent cartridge casings at high speed. Casings are also dangerously hot when ejected. Revolvers store spent casings in the chamber, but may emit a stream of hot gases and possible fine particulate debris laterally from the interface between the revolving chamber and the barrel. Any of these may hurt the handler or bystanders through burning or impact damage. Because eyes are particularly vulnerable to this type of damage, eye protection should be worn to reduce the risk of injury. Prescription lenses and various tints to suit different light conditions are available. Some eye protection products are rated to withstand impact from birdshot loads, which offers protection against irresponsible firearms use by other game bird shooters. Toxins and pollutants In recent years the toxic effects of ammunition and firearm cleaning agents have been highlighted. Lead ammunition left in nature may become mobilized by acid rain. 
Older ammunition may have mercury-based primers. Lead accumulates in shooting range backstops. Indoor ranges require good ventilation to remove pollutants such as powder, smoke, and lead dust from the air around the shooters. Indoor and outdoor ranges typically require extensive decontamination when they are decommissioned to remove all traces of lead, copper, and powder residues from the area. Lead, copper and other metals will also be released when a firearm is cleaned. Highly aggressive solvents and other agents used to remove lead and powder fouling may also present a hazard to health. Installing good ventilation, washing hands after handling firearms, and cleaning the space where the firearm was handled lessen the risk of unnecessary exposure. Unsafe users Impaired users Firearms should never be handled by persons who are under the influence of alcohol or any drugs which may affect their judgment. Gun safety teachers advocate zero tolerance of their use. In the United States, this recommendation is codified in many states' penal codes as a crime of "carrying under the influence", with penalties similar to DWI/DUI. Other sources of temporary impairment include exhaustion, dehydration, and emotional stress. These can affect reaction time, cognitive processing, sensory perception, and judgment. Many jurisdictions prohibit the possession of firearms by people deemed generally incapable of using them safely, such as the mentally ill or convicted felons. Children The National Rifle Association's Eddie Eagle program for preschoolers through 6th graders is intended to teach children to avoid firearm accidents when they encounter guns that have not been securely stored out of their reach. Whether programs like Eddie Eagle are effective has not been conclusively determined. Some studies published in peer-reviewed journals have shown that it is very difficult for young children to control their curiosity even when they have been taught not to touch firearms.
Gun access is also a major risk factor for youth suicide. The American Academy of Pediatrics (AAP) advises that keeping a gun in the home, especially a handgun, increases the risk of injury and death for children and youth in the home. See also Safety area References External links A Review of Gun Safety Technologies—National Institute of Justice Firearm training Safety practices
12021
https://en.wikipedia.org/wiki/Go%20Down%20Moses
Go Down Moses
"Go Down Moses" is a spiritual phrase that describes events in the Old Testament of the Bible, specifically Exodus 5:1: "And the LORD spake unto Moses, Go unto Pharaoh, and say unto him, Thus saith the LORD, Let my people go, that they may serve me", in which God commands Moses to demand the release of the Israelites from bondage in Egypt. This phrase is the title of one of the best-known African American spirituals of all time. The song discusses themes of freedom, a very common occurrence in spirituals. In fact, the song carried multiple messages, discussing not only the metaphorical freedom of Moses but also the physical freedom of runaway slaves, and many slave holders outlawed this song because of those very messages. The opening verse as published by the Jubilee Singers in 1872: The lyrics of the song represent liberation of the ancient Jewish people from Egyptian slavery, a story recounted in the Old Testament. For enslaved African Americans, the story was very powerful because they could relate to the experiences of Moses and the Israelites who were enslaved by the pharaoh, representing the slave holders, and it holds the hopeful message that God will help those who are persecuted. The song also makes references to the Jordan River, which was often referred to in spirituals to describe finally reaching freedom, because such an act of running away often involved crossing one or more rivers. Going "down" to Egypt is derived from the Bible; the Old Testament recognizes the Nile Valley as lower than Jerusalem and the Promised Land; thus, going to Egypt means going "down" while going away from Egypt is "up". In the context of American slavery, this ancient sense of "down" converged with the concept of "down the river" (the Mississippi), where slaves' conditions were notoriously worse, a situation which led to the idiom "sell [someone] down the river" in present-day English. "Oh! 
Let My People Go" Although usually thought of as a spiritual, the earliest written record of the song was as a rallying anthem for the Contrabands at Fort Monroe sometime before July 1862. White people who reported on the song presumed it was composed by them. This became the first spiritual known to have been recorded in sheet music, transcribed by Reverend Lewis Lockwood. While visiting Fortress Monroe in 1861, he heard runaway slaves singing this song, transcribed what he heard, and eventually published it in the National Anti-Slavery Standard. Sheet music was soon after published, titled "Oh! Let My People Go: The Song of the Contrabands", and arranged by Horace Waters. L.C. Lockwood, chaplain of the Contrabands, stated in the sheet music that the song was from Virginia, dating from about 1853. However, the song was not included in Slave Songs of the United States, despite it being a very prominent spiritual among slaves. Furthermore, the original version of the song sung by slaves almost certainly sounded very different from what Lockwood transcribed by ear, especially after arrangement by a person who had never heard the song as it was originally sung. The opening verse, as recorded by Lockwood, is: Sarah Bradford's authorized biography of Harriet Tubman, Scenes in the Life of Harriet Tubman (1869), quotes Tubman as saying she used "Go Down Moses" as one of two code songs fugitive slaves used to communicate when fleeing Maryland. Tubman began her underground railroad work in 1850 and continued until the beginning of the Civil War, so it is possible that Tubman's use of the song predates the origin claimed by Lockwood. Some even hypothesize that she herself wrote the spiritual, while others claim that Nat Turner, who led one of the best-known slave revolts in history, either wrote or inspired the song. In popular culture Films Al Jolson sings it in Alan Crosland's film Big Boy (1930). 
Used briefly in Kid Millions (1934). Jess Lee Brooks sings it in Preston Sturges' film Sullivan's Travels (1941). Gregory Miller (played by Sidney Poitier) sang the song in the film Blackboard Jungle (1955). A reference is made to the song in the film Ferris Bueller's Day Off (1986), when a bedridden Cameron Frye sings, "When Cameron was in Egypt's land, let my Cameron go". Sergei Bodrov Jr. and Oleg Menshikov, who play the two main characters in Sergei Bodrov's film Кавказский пленник (1996; Prisoner of the Mountains), dance to the Louis Armstrong version. The teen comedy film Easy A (2010) remixed this song with a fast guitar and beats. The song was originally published as Original Soundtrack and is listed in IMDb. Literature William Faulkner titled his 1942 short-story collection Go Down, Moses after the song. Djuna Barnes, in her 1936 novel Nightwood, titled a chapter "Go Down, Matthew" as an allusion to the song's title. In Margaret Mitchell's 1936 novel Gone with the Wind, slaves from the Georgia plantation Tara are in Atlanta to dig breastworks for the soldiers, and they sing "Go Down, Moses" as they march down a street. Music The song was made famous by Paul Robeson, whose deep voice was said by Robert O'Meally to have assumed "the might and authority of God." On February 7, 1958, the song was recorded in New York City and sung by Louis Armstrong with Sy Oliver's Orchestra. It was recorded by Doris Akers and the Sky Pilot Choir. The song has since become a jazz standard, having been recorded by Grant Green, Fats Waller, Archie Shepp, Hampton Hawes and many others. It is one of the five spirituals included in the oratorio A Child of Our Time, first performed in 1944 by the English classical composer Michael Tippett (1905–98). It is included in some seders in the United States, and is printed in Meyer Levin's An Israel Haggadah for Passover. The song was recorded by Deep River Boys in Oslo on September 26, 1960. 
It was released on the extended play Negro Spirituals No. 3 (HMV 7EGN 39). The song, or a modified version of it, has been used in the Roger Jones musical From Pharaoh to Freedom. The French singer Claude Nougaro used its melody for his tribute to Louis Armstrong in French, under the name Armstrong (1965). "Go Down Moses" has sometimes been called "Let My People Go" and performed by a variety of musical artists, including RebbeSoul. The song heavily influences "Get Down Moses", by Joe Strummer & the Mescaleros on their album Streetcore (2003). The song has been performed by the Russian Interior Ministry (MVD) Choir. Jazz singer Tony Vittia released a swing version under the name "Own The Night" (2013). The phrase "Go Down Moses" is featured in the chorus of the John Craigie song "Will Not Fight" (2009). The phrase "Go Down Moses" is sung by Pops Staples with the Staple Singers in the song "The Weight" in The Last Waltz film by The Band (1976). The usual lyric is actually "Go down Miss Moses". Avant-garde singer-songwriter and composer Diamanda Galás recorded a version for her fifth album, You Must Be Certain of the Devil (1988), the final part of a trilogy about the AIDS epidemic that features songs influenced by American gospel music and biblical themes, and later in Plague Mass (1991) and The Singer (1992). Composer Nathaniel Dett used the text and melody of "Go Down Moses" throughout his oratorio "The Ordering of Moses" (1937). In the first section, Dett sets the melody with added-note harmonies, quartal chords, modal harmonies, and chromaticism (especially French augmented sixth chords). Later in the oratorio, "Go Down Moses" is set as a fugue. Television The NBC television comedy The Fresh Prince of Bel-Air twice used the song for comedic effect. 
In the first instance, Will Smith's character sings the song after he and his cousin Carlton Banks are thrown into prison (Smith sings the first two lines, Banks sullenly provides the refrain, then a prisoner sings the final four lines in an operatic voice). In the second instance, Banks is preparing for an Easter service and attempts to show off his prowess by singing the last two lines of the chorus; Smith replies with his own version, in which he makes a joke about Carlton's height ("...Let my cousin grow!"). In Dr. Katz, Professional Therapist, it is sung by Katz and Ben during the end credits of the episode "Thanksgiving" (Season 5, Episode 18). Della Reese sings it in Episode 424, "Elijah", of Touched by an Angel, in which Bruce Davison sings "Eliyahu". In series 2, episode 3 of Life on Mars, the lawyer sings it for his client's release. Recordings The Tuskegee Institute Singers recorded the song for Victor in 1914. The Kelly Family recorded the song twice: a live version is included on their album Live (1988) and a studio version on New World (1990). The latter also features on their compilation album The Very Best - Over 10 Years (1993). The Golden Gate Quartet (Duration: 3:05; recorded in 1957 for their album Spirituals). "Go Down Moses" was recorded by the Robert Shaw Chorale on RCA Victor 33 record LM/LSC 2580, copyright 1964, first side, second band, lasting 4 minutes and 22 seconds. Liner notes by noted African-American author Langston Hughes. See also Christian child's prayer § Spirituals Let My People Go (disambiguation) References Bibliography The Continental Monthly. Vol. II (July–December 1862). New York. Lockwood, L.C. "Oh! Let My People Go: The Song of the Contrabands". New York: Horace Waters (1862). 
External links Sweet Chariot: The Story of the Spirituals, particularly their section on "Freedom" (Web site maintained by The Spirituals Project at the University of Denver) American folk songs Gospel songs Paul Robeson songs African-American spiritual songs Songs about celebrities Cultural depictions of Moses Year of song unknown Songwriter unknown Songs about religious leaders Songs about Egypt
12024
https://en.wikipedia.org/wiki/General%20relativity
General relativity
General relativity, also known as the general theory of relativity and Einstein's theory of gravity, is the geometric theory of gravitation published by Albert Einstein in 1915 and is the current description of gravitation in modern physics. General relativity generalizes special relativity and refines Newton's law of universal gravitation, providing a unified description of gravity as a geometric property of space and time, or four-dimensional spacetime. In particular, the curvature of spacetime is directly related to the energy and momentum of whatever matter and radiation are present. The relation is specified by the Einstein field equations, a system of second-order partial differential equations. Newton's law of universal gravitation, which describes classical gravity, can be seen as a prediction of general relativity for the almost flat spacetime geometry around stationary mass distributions. Some predictions of general relativity, however, are beyond Newton's law of universal gravitation in classical physics. These predictions concern the passage of time, the geometry of space, the motion of bodies in free fall, and the propagation of light, and include gravitational time dilation, gravitational lensing, the gravitational redshift of light, the Shapiro time delay and singularities/black holes. So far, all tests of general relativity have been in agreement with the theory. The time-dependent solutions of general relativity enable us to talk about the history of the universe and have provided the modern framework for cosmology, thus leading to the discovery of the Big Bang and cosmic microwave background radiation. Despite the introduction of a number of alternative theories, general relativity continues to be the simplest theory consistent with experimental data. 
Reconciliation of general relativity with the laws of quantum physics remains a problem, however, as there is no self-consistent theory of quantum gravity, nor an account of how gravity can be unified with the three non-gravitational forces—strong, weak, and electromagnetic. Einstein's theory has astrophysical implications, including the prediction of black holes—regions of space in which space and time are distorted in such a way that nothing, not even light, can escape from them. Black holes are the end-state for massive stars. Microquasars and active galactic nuclei are believed to be stellar black holes and supermassive black holes. It also predicts gravitational lensing, where the bending of light results in multiple images of the same distant astronomical phenomenon. Other predictions include the existence of gravitational waves, which have been observed directly by the physics collaboration LIGO and other observatories. In addition, general relativity has provided the basis of cosmological models of an expanding universe. Widely acknowledged as a theory of extraordinary beauty, general relativity has often been described as the most beautiful of all existing physical theories. History Soon after publishing the special theory of relativity in 1905, Einstein started thinking about how to incorporate gravity into his new relativistic framework. In 1907, beginning with a simple thought experiment involving an observer in free fall, he embarked on what would be an eight-year search for a relativistic theory of gravity. After numerous detours and false starts, his work culminated in the presentation to the Prussian Academy of Science in November 1915 of what are now known as the Einstein field equations, which form the core of Einstein's general theory of relativity. These equations specify how the geometry of space and time is influenced by whatever matter and radiation are present. 
A version of non-Euclidean geometry, called Riemannian Geometry, enabled Einstein to develop general relativity by providing the key mathematical framework on which he fit his physical ideas of gravity. This idea was pointed out by mathematician Marcel Grossmann and published by Grossmann and Einstein in 1913. The Einstein field equations are nonlinear and considered difficult to solve. Einstein used approximation methods in working out initial predictions of the theory. But in 1916, the astrophysicist Karl Schwarzschild found the first non-trivial exact solution to the Einstein field equations, the Schwarzschild metric. This solution laid the groundwork for the description of the final stages of gravitational collapse, and the objects known today as black holes. In the same year, the first steps towards generalizing Schwarzschild's solution to electrically charged objects were taken, eventually resulting in the Reissner–Nordström solution, which is now associated with electrically charged black holes. In 1917, Einstein applied his theory to the universe as a whole, initiating the field of relativistic cosmology. In line with contemporary thinking, he assumed a static universe, adding a new parameter to his original field equations—the cosmological constant—to match that observational presumption. By 1929, however, the work of Hubble and others had shown that our universe is expanding. This is readily described by the expanding cosmological solutions found by Friedmann in 1922, which do not require a cosmological constant. Lemaître used these solutions to formulate the earliest version of the Big Bang models, in which our universe has evolved from an extremely hot and dense earlier state. Einstein later declared the cosmological constant the biggest blunder of his life. During that period, general relativity remained something of a curiosity among physical theories. 
It was clearly superior to Newtonian gravity, being consistent with special relativity and accounting for several effects unexplained by the Newtonian theory. Einstein showed in 1915 how his theory explained the anomalous perihelion advance of the planet Mercury without any arbitrary parameters ("fudge factors"), and in 1919 an expedition led by Eddington confirmed general relativity's prediction for the deflection of starlight by the Sun during the total solar eclipse of May 29, 1919, instantly making Einstein famous. Yet the theory remained outside the mainstream of theoretical physics and astrophysics until developments between approximately 1960 and 1975, now known as the golden age of general relativity. Physicists began to understand the concept of a black hole, and to identify quasars as one of these objects' astrophysical manifestations. Ever more precise solar system tests confirmed the theory's predictive power, and relativistic cosmology also became amenable to direct observational tests. General relativity has acquired a reputation as a theory of extraordinary beauty. Subrahmanyan Chandrasekhar has noted that at multiple levels, general relativity exhibits what Francis Bacon has termed a "strangeness in the proportion" (i.e. elements that excite wonderment and surprise). It juxtaposes fundamental concepts (space and time versus matter and motion) which had previously been considered as entirely independent. Chandrasekhar also noted that Einstein's only guides in his search for an exact theory were the principle of equivalence and his sense that a proper description of gravity should be geometrical at its basis, so that there was an "element of revelation" in the manner in which Einstein arrived at his theory. Other elements of beauty associated with the general theory of relativity are its simplicity and symmetry, the manner in which it incorporates invariance and unification, and its perfect logical consistency. 
From classical mechanics to general relativity General relativity can be understood by examining its similarities with and departures from classical physics. The first step is the realization that classical mechanics and Newton's law of gravity admit a geometric description. The combination of this description with the laws of special relativity results in a heuristic derivation of general relativity. Geometry of Newtonian gravity At the base of classical mechanics is the notion that a body's motion can be described as a combination of free (or inertial) motion, and deviations from this free motion. Such deviations are caused by external forces acting on a body in accordance with Newton's second law of motion, which states that the net force acting on a body is equal to that body's (inertial) mass multiplied by its acceleration. The preferred inertial motions are related to the geometry of space and time: in the standard reference frames of classical mechanics, objects in free motion move along straight lines at constant speed. In modern parlance, their paths are geodesics, straight world lines in curved spacetime. Conversely, one might expect that inertial motions, once identified by observing the actual motions of bodies and making allowances for the external forces (such as electromagnetism or friction), can be used to define the geometry of space, as well as a time coordinate. However, there is an ambiguity once gravity comes into play. According to Newton's law of gravity, and independently verified by experiments such as that of Eötvös and its successors (see Eötvös experiment), there is a universality of free fall (also known as the weak equivalence principle, or the universal equality of inertial and passive-gravitational mass): the trajectory of a test body in free fall depends only on its position and initial speed, but not on any of its material properties. 
A simplified version of this is embodied in Einstein's elevator experiment, illustrated in the figure on the right: for an observer in an enclosed room, it is impossible to decide, by mapping the trajectory of bodies such as a dropped ball, whether the room is stationary in a gravitational field and the ball accelerating, or in free space aboard a rocket that is accelerating at a rate equal to that of the gravitational field versus the ball which upon release has nil acceleration. Given the universality of free fall, there is no observable distinction between inertial motion and motion under the influence of the gravitational force. This suggests the definition of a new class of inertial motion, namely that of objects in free fall under the influence of gravity. This new class of preferred motions, too, defines a geometry of space and time—in mathematical terms, it is the geodesic motion associated with a specific connection which depends on the gradient of the gravitational potential. Space, in this construction, still has the ordinary Euclidean geometry. However, spacetime as a whole is more complicated. As can be shown using simple thought experiments following the free-fall trajectories of different test particles, the result of transporting spacetime vectors that can denote a particle's velocity (time-like vectors) will vary with the particle's trajectory; mathematically speaking, the Newtonian connection is not integrable. From this, one can deduce that spacetime is curved. The resulting Newton–Cartan theory is a geometric formulation of Newtonian gravity using only covariant concepts, i.e. a description which is valid in any desired coordinate system. In this geometric description, tidal effects—the relative acceleration of bodies in free fall—are related to the derivative of the connection, showing how the modified geometry is caused by the presence of mass. 
Relativistic generalization As intriguing as geometric Newtonian gravity may be, its basis, classical mechanics, is merely a limiting case of (special) relativistic mechanics. In the language of symmetry: where gravity can be neglected, physics is Lorentz invariant as in special relativity rather than Galilei invariant as in classical mechanics. (The defining symmetry of special relativity is the Poincaré group, which includes translations, rotations and boosts.) The differences between the two become significant when dealing with speeds approaching the speed of light, and with high-energy phenomena. With Lorentz symmetry, additional structures come into play. They are defined by the set of light cones. The light cones define a causal structure: for each event, there is a set of events that can, in principle, either influence or be influenced by it via signals or interactions that do not need to travel faster than light, and a set of events for which such an influence is impossible. These sets are observer-independent. In conjunction with the world-lines of freely falling particles, the light cones can be used to reconstruct the spacetime's semi-Riemannian metric, at least up to a positive scalar factor. In mathematical terms, this defines a conformal structure or conformal geometry. Special relativity is defined in the absence of gravity. For practical applications, it is a suitable model whenever gravity can be neglected. Bringing gravity into play, and assuming the universality of free fall motion, an analogous reasoning as in the previous section applies: there are no global inertial frames. Instead there are approximate inertial frames moving alongside freely falling particles. 
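The causal structure defined by the light cones can be sketched in a few lines. The following toy check (an illustration added here, with units chosen so that c = 1) classifies event separations in flat spacetime as causally connectable or not:

```python
# Toy classification of event separations in flat (Minkowski) spacetime,
# units chosen so that c = 1 (an assumption for simplicity).
def interval2(dt, dx):
    """Squared spacetime interval s^2 = -dt^2 + dx^2 (signature -,+)."""
    return -dt**2 + dx**2

def causally_connectable(dt, dx):
    """Events can influence each other only if their separation is timelike
    or null (s^2 <= 0), i.e. a signal need not travel faster than light."""
    return interval2(dt, dx) <= 0

print(causally_connectable(2.0, 1.0))   # inside the light cone -> True
print(causally_connectable(1.0, 3.0))   # outside the light cone -> False
```

Because the interval is invariant under Lorentz transformations, this classification is the same for every observer, which is the observer-independence noted above.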
Translated into the language of spacetime: the straight time-like lines that define a gravity-free inertial frame are deformed to lines that are curved relative to each other, suggesting that the inclusion of gravity necessitates a change in spacetime geometry. A priori, it is not clear whether the new local frames in free fall coincide with the reference frames in which the laws of special relativity hold—that theory is based on the propagation of light, and thus on electromagnetism, which could have a different set of preferred frames. But using different assumptions about the special-relativistic frames (such as their being earth-fixed, or in free fall), one can derive different predictions for the gravitational redshift, that is, the way in which the frequency of light shifts as the light propagates through a gravitational field (cf. below). The actual measurements show that free-falling frames are the ones in which light propagates as it does in special relativity. The generalization of this statement, namely that the laws of special relativity hold to good approximation in freely falling (and non-rotating) reference frames, is known as the Einstein equivalence principle, a crucial guiding principle for generalizing special-relativistic physics to include gravity. The same experimental data shows that time as measured by clocks in a gravitational field—proper time, to give the technical term—does not follow the rules of special relativity. In the language of spacetime geometry, it is not measured by the Minkowski metric. As in the Newtonian case, this is suggestive of a more general geometry. At small scales, all reference frames that are in free fall are equivalent, and approximately Minkowskian. Consequently, we are now dealing with a curved generalization of Minkowski space. 
The metric tensor that defines the geometry—in particular, how lengths and angles are measured—is not the Minkowski metric of special relativity, it is a generalization known as a semi- or pseudo-Riemannian metric. Furthermore, each Riemannian metric is naturally associated with one particular kind of connection, the Levi-Civita connection, and this is, in fact, the connection that satisfies the equivalence principle and makes space locally Minkowskian (that is, in suitable locally inertial coordinates, the metric is Minkowskian, and its first partial derivatives and the connection coefficients vanish). Einstein's equations Having formulated the relativistic, geometric version of the effects of gravity, the question of gravity's source remains. In Newtonian gravity, the source is mass. In special relativity, mass turns out to be part of a more general quantity called the energy–momentum tensor, which includes both energy and momentum densities as well as stress: pressure and shear. Using the equivalence principle, this tensor is readily generalized to curved spacetime. Drawing further upon the analogy with geometric Newtonian gravity, it is natural to assume that the field equation for gravity relates this tensor and the Ricci tensor, which describes a particular class of tidal effects: the change in volume for a small cloud of test particles that are initially at rest, and then fall freely. In special relativity, conservation of energy–momentum corresponds to the statement that the energy–momentum tensor is divergence-free. This formula, too, is readily generalized to curved spacetime by replacing partial derivatives with their curved-manifold counterparts, covariant derivatives studied in differential geometry. 
With this additional condition—the covariant divergence of the energy–momentum tensor, and hence of whatever is on the other side of the equation, is zero—the simplest set of equations are what are called Einstein's (field) equations:

$$G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} = \kappa\,T_{\mu\nu}$$

On the left-hand side is the Einstein tensor, $G_{\mu\nu}$, which is symmetric and a specific divergence-free combination of the Ricci tensor $R_{\mu\nu}$ and the metric. In particular, $R = g^{\mu\nu}R_{\mu\nu}$ is the curvature scalar. The Ricci tensor itself is related to the more general Riemann curvature tensor as $R_{\mu\nu} = {R^\alpha}_{\mu\alpha\nu}$. On the right-hand side, $T_{\mu\nu}$ is the energy–momentum tensor. All tensors are written in abstract index notation. Matching the theory's prediction to observational results for planetary orbits or, equivalently, assuring that the weak-gravity, low-speed limit is Newtonian mechanics, the proportionality constant is found to be $\kappa = 8\pi G/c^4$, where $G$ is the gravitational constant and $c$ the speed of light in vacuum. When there is no matter present, so that the energy–momentum tensor vanishes, the results are the vacuum Einstein equations,

$$R_{\mu\nu} = 0.$$

In general relativity, the world line of a particle free from all external, non-gravitational force is a particular type of geodesic in curved spacetime. In other words, a freely moving or falling particle always moves along a geodesic. The geodesic equation is:

$$\frac{d^2 x^\mu}{ds^2} + \Gamma^\mu_{\alpha\beta}\,\frac{dx^\alpha}{ds}\,\frac{dx^\beta}{ds} = 0,$$

where $s$ is a scalar parameter of motion (e.g. the proper time), and $\Gamma^\mu_{\alpha\beta}$ are Christoffel symbols (sometimes called the affine connection coefficients or Levi-Civita connection coefficients), which are symmetric in the two lower indices. Greek indices may take the values 0, 1, 2, 3, and the summation convention is used for repeated indices $\alpha$ and $\beta$. The quantity on the left-hand side of this equation is the acceleration of a particle, so this equation is analogous to Newton's laws of motion, which likewise provide formulae for the acceleration of a particle. This equation of motion employs the Einstein notation, meaning that repeated indices are summed (i.e. from zero to three). 
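The vacuum equations can be checked directly on a concrete solution. The sketch below, added as an illustration, uses sympy to compute the Christoffel symbols and Ricci tensor of the Schwarzschild metric mentioned in the History section (written in geometric units, with a single symbol r_s standing in for the Schwarzschild radius 2GM/c², an assumption for brevity), and verifies that every Ricci component vanishes:

```python
import sympy as sp

t, r, th, ph = sp.symbols('t r theta phi')
rs = sp.symbols('r_s', positive=True)  # Schwarzschild radius 2GM/c^2 (geometric units assumed)
x = [t, r, th, ph]
n = 4

# Schwarzschild metric g_{mu nu} in coordinates (t, r, theta, phi)
g = sp.diag(-(1 - rs/r), 1/(1 - rs/r), r**2, r**2*sp.sin(th)**2)
ginv = g.inv()

# Christoffel symbols: Gamma^a_{bc} = 1/2 g^{ad} (d_b g_{dc} + d_c g_{db} - d_d g_{bc})
Gamma = [[[sp.simplify(sum(ginv[a, d]*(sp.diff(g[d, c_], x[b]) + sp.diff(g[d, b], x[c_])
                                       - sp.diff(g[b, c_], x[d])) for d in range(n))/2)
           for c_ in range(n)] for b in range(n)] for a in range(n)]

# Ricci tensor: R_{bc} = d_a Gamma^a_{bc} - d_c Gamma^a_{ba}
#                        + Gamma^a_{ad} Gamma^d_{bc} - Gamma^a_{cd} Gamma^d_{ba}
def ricci(b, c_):
    expr = sum(sp.diff(Gamma[a][b][c_], x[a]) - sp.diff(Gamma[a][b][a], x[c_])
               + sum(Gamma[a][a][d]*Gamma[d][b][c_] - Gamma[a][c_][d]*Gamma[d][b][a]
                     for d in range(n))
               for a in range(n))
    return sp.simplify(expr)

assert all(ricci(b, c_) == 0 for b in range(n) for c_ in range(n))
print("R_{mu nu} = 0: the Schwarzschild metric solves the vacuum Einstein equations")
```

The same machinery, fed a different metric, reproduces the Christoffel symbols that appear in the geodesic equation above.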
The Christoffel symbols are functions of the four spacetime coordinates, and so are independent of the velocity or acceleration or other characteristics of a test particle whose motion is described by the geodesic equation. Total force in general relativity In general relativity, the effective gravitational potential energy of an object of mass m rotating around a massive central body M is given by

$$U_f(r) = -\frac{GMm}{r} + \frac{L^2}{2mr^2} - \frac{GML^2}{mc^2 r^3}$$

A conservative total force can then be obtained as its negative radial derivative,

$$F_f(r) = -\frac{GMm}{r^2} + \frac{L^2}{mr^3} - \frac{3GML^2}{mc^2 r^4}$$

where $L$ is the angular momentum. The first term represents Newton's force of gravity, which is described by the inverse-square law. The second term represents the centrifugal force in the circular motion. The third term represents the relativistic effect. Alternatives to general relativity There are alternatives to general relativity built upon the same premises, which include additional rules and/or constraints, leading to different field equations. Examples are Whitehead's theory, Brans–Dicke theory, teleparallelism, f(R) gravity and Einstein–Cartan theory. Definition and basic applications The derivation outlined in the previous section contains all the information needed to define general relativity, describe its key properties, and address a question of crucial importance in physics, namely how the theory can be used for model-building. Definition and basic properties General relativity is a metric theory of gravitation. At its core are Einstein's equations, which describe the relation between the geometry of a four-dimensional pseudo-Riemannian manifold representing spacetime, and the energy–momentum contained in that spacetime. Phenomena that in classical mechanics are ascribed to the action of the force of gravity (such as free-fall, orbital motion, and spacecraft trajectories), correspond to inertial motion within a curved geometry of spacetime in general relativity; there is no gravitational force deflecting objects from their natural, straight paths. 
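To get a feel for the size of the relativistic third term, one can evaluate the force terms with rough solar-system numbers. The constants and Mercury orbital data below are standard approximate values added for illustration, not taken from the text:

```python
# Illustrative values (standard approximate constants, assumed for this sketch):
G = 6.674e-11   # m^3 kg^-1 s^-2, gravitational constant
c = 2.998e8     # m/s, speed of light
M = 1.989e30    # kg, mass of the Sun
m = 3.301e23    # kg, mass of Mercury
r = 5.79e10     # m, Mercury's mean orbital radius
v = 4.79e4      # m/s, Mercury's mean orbital speed
L = m * v * r   # angular momentum for a (nearly) circular orbit

F_newton       = -G*M*m/r**2                 # inverse-square gravity
F_centrifugal  = L**2/(m*r**3)               # centrifugal term
F_relativistic = -3*G*M*L**2/(m*c**2*r**4)   # general-relativistic correction

# For circular orbits the ratio reduces to 3 v^2 / c^2:
ratio = abs(F_relativistic / F_newton)
print(f"relativistic/Newtonian force ratio: {ratio:.2e}")
```

The ratio comes out near 10^-7, which is why the relativistic correction shows up only as a slow cumulative effect such as the anomalous perihelion advance of Mercury noted earlier.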
Instead, gravity corresponds to changes in the properties of space and time, which in turn changes the straightest-possible paths that objects will naturally follow. The curvature is, in turn, caused by the energy–momentum of matter. Paraphrasing the relativist John Archibald Wheeler, spacetime tells matter how to move; matter tells spacetime how to curve. While general relativity replaces the scalar gravitational potential of classical physics by a symmetric rank-two tensor, the latter reduces to the former in certain limiting cases. For weak gravitational fields and slow speed relative to the speed of light, the theory's predictions converge on those of Newton's law of universal gravitation. As it is constructed using tensors, general relativity exhibits general covariance: its laws—and further laws formulated within the general relativistic framework—take on the same form in all coordinate systems. Furthermore, the theory does not contain any invariant geometric background structures, i.e. it is background independent. It thus satisfies a more stringent general principle of relativity, namely that the laws of physics are the same for all observers. Locally, as expressed in the equivalence principle, spacetime is Minkowskian, and the laws of physics exhibit local Lorentz invariance. Model-building The core concept of general-relativistic model-building is that of a solution of Einstein's equations. Given both Einstein's equations and suitable equations for the properties of matter, such a solution consists of a specific semi-Riemannian manifold (usually defined by giving the metric in specific coordinates), and specific matter fields defined on that manifold. Matter and geometry must satisfy Einstein's equations, so in particular, the matter's energy–momentum tensor must be divergence-free. The matter must, of course, also satisfy whatever additional equations were imposed on its properties. 
In short, such a solution is a model universe that satisfies the laws of general relativity, and possibly additional laws governing whatever matter might be present. Einstein's equations are nonlinear partial differential equations and, as such, difficult to solve exactly. Nevertheless, a number of exact solutions are known, although only a few have direct physical applications. The best-known exact solutions, and also those most interesting from a physics point of view, are the Schwarzschild solution, the Reissner–Nordström solution and the Kerr metric, each corresponding to a certain type of black hole in an otherwise empty universe, and the Friedmann–Lemaître–Robertson–Walker and de Sitter universes, each describing an expanding cosmos. Exact solutions of great theoretical interest include the Gödel universe (which opens up the intriguing possibility of time travel in curved spacetimes), the Taub-NUT solution (a model universe that is homogeneous, but anisotropic), and anti-de Sitter space (which has recently come to prominence in the context of what is called the Maldacena conjecture). Given the difficulty of finding exact solutions, Einstein's field equations are also solved frequently by numerical integration on a computer, or by considering small perturbations of exact solutions. In the field of numerical relativity, powerful computers are employed to simulate the geometry of spacetime and to solve Einstein's equations for interesting situations such as two colliding black holes. In principle, such methods may be applied to any system, given sufficient computer resources, and may address fundamental questions such as naked singularities. Approximate solutions may also be found by perturbation theories such as linearized gravity and its generalization, the post-Newtonian expansion, both of which were developed by Einstein. 
The latter provides a systematic approach to solving for the geometry of a spacetime that contains a distribution of matter that moves slowly compared with the speed of light. The expansion involves a series of terms; the first terms represent Newtonian gravity, whereas the later terms represent ever smaller corrections to Newton's theory due to general relativity. An extension of this expansion is the parametrized post-Newtonian (PPN) formalism, which allows quantitative comparisons between the predictions of general relativity and alternative theories. Consequences of Einstein's theory General relativity has a number of physical consequences. Some follow directly from the theory's axioms, whereas others have become clear only in the course of many years of research that followed Einstein's initial publication. Gravitational time dilation and frequency shift Assuming that the equivalence principle holds, gravity influences the passage of time. Light sent down into a gravity well is blueshifted, whereas light sent in the opposite direction (i.e., climbing out of the gravity well) is redshifted; collectively, these two effects are known as the gravitational frequency shift. More generally, processes close to a massive body run more slowly when compared with processes taking place farther away; this effect is known as gravitational time dilation. Gravitational redshift has been measured in the laboratory and using astronomical observations. Gravitational time dilation in the Earth's gravitational field has been measured numerous times using atomic clocks, while ongoing validation is provided as a side effect of the operation of the Global Positioning System (GPS). Tests in stronger gravitational fields are provided by the observation of binary pulsars. All results are in agreement with general relativity. 
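The GPS validation mentioned above can be estimated from first principles. The sketch below (assumed approximate values for Earth's mass and the GPS orbital radius) combines the gravitational blueshift of an orbiting clock with its special-relativistic time dilation:

```python
import math

G  = 6.674e-11       # gravitational constant, SI units
c  = 2.99792458e8    # speed of light, m/s
M  = 5.972e24        # Earth mass, kg (assumed approximate value)
Re = 6.371e6         # Earth radius, m
r  = 2.6571e7        # GPS orbital radius (~20,200 km altitude), m

# Gravitational blueshift of the satellite clock relative to the ground
grav = G * M * (1/Re - 1/r) / c**2
# Special-relativistic slowing from orbital speed (circular orbit: v^2 = GM/r)
kinematic = (G * M / r) / (2 * c**2)

net = grav - kinematic                   # net fractional rate offset
microseconds_per_day = net * 86400 * 1e6
print(microseconds_per_day)              # ~38 microseconds per day
```

Without the relativistic correction of roughly 38 μs/day, GPS position fixes would drift by kilometres within a day.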
However, at the current level of accuracy, these observations cannot distinguish between general relativity and other theories in which the equivalence principle is valid. Light deflection and gravitational time delay General relativity predicts that the path of light will follow the curvature of spacetime as it passes near a star. This effect was initially confirmed by observing the light of stars or distant quasars being deflected as it passes the Sun. This and related predictions follow from the fact that light follows what is called a light-like or null geodesic—a generalization of the straight lines along which light travels in classical physics. Such geodesics are the generalization of the invariance of lightspeed in special relativity. As one examines suitable model spacetimes (either the exterior Schwarzschild solution or, for more than a single mass, the post-Newtonian expansion), several effects of gravity on light propagation emerge. Although the bending of light can also be derived by extending the universality of free fall to light, the angle of deflection resulting from such calculations is only half the value given by general relativity. Closely related to light deflection is the gravitational time delay (or Shapiro delay), the phenomenon that light signals take longer to move through a gravitational field than they would in the absence of that field. There have been numerous successful tests of this prediction. In the parameterized post-Newtonian formalism (PPN), measurements of both the deflection of light and the gravitational time delay determine a parameter called γ, which encodes the influence of gravity on the geometry of space. Gravitational waves In 1916, Albert Einstein predicted gravitational waves: ripples in the metric of spacetime that propagate at the speed of light. This is one of several analogies between weak-field gravity and electromagnetism: gravitational waves are analogous to electromagnetic waves.
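For light grazing the Sun, the weak-field deflection angle is α = 4GM/(c²b), twice the value obtained from the universality-of-free-fall argument mentioned above. A quick numerical check (approximate solar values assumed):

```python
import math

G = 6.674e-11        # gravitational constant, SI units
c = 2.99792458e8     # speed of light, m/s
M = 1.989e30         # solar mass, kg (assumed approximate value)
b = 6.957e8          # impact parameter = solar radius, m (grazing incidence)

alpha = 4 * G * M / (c**2 * b)            # GR deflection angle, radians
arcsec = alpha * (180 / math.pi) * 3600   # convert radians to arcseconds
print(arcsec)   # ~1.75 arcseconds; the Newtonian calculation gives half this
```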
On February 11, 2016, the Advanced LIGO team announced that they had directly detected gravitational waves from a pair of black holes merging. The simplest type of such a wave can be visualized by its action on a ring of freely floating particles. A sine wave propagating through such a ring towards the reader distorts the ring in a characteristic, rhythmic fashion (animated image to the right). Since Einstein's equations are non-linear, arbitrarily strong gravitational waves do not obey linear superposition, making their description difficult. However, linear approximations of gravitational waves are sufficiently accurate to describe the exceedingly weak waves that are expected to arrive here on Earth from far-off cosmic events, which typically result in relative distances increasing and decreasing by $10^{-21}$ or less. Data analysis methods routinely make use of the fact that these linearized waves can be Fourier decomposed. Some exact solutions describe gravitational waves without any approximation, e.g., a wave train traveling through empty space or Gowdy universes, varieties of an expanding cosmos filled with gravitational waves. But for gravitational waves produced in astrophysically relevant situations, such as the merger of two black holes, numerical methods are presently the only way to construct appropriate models. Orbital effects and the relativity of direction General relativity differs from classical mechanics in a number of predictions concerning orbiting bodies. It predicts an overall rotation (precession) of planetary orbits, as well as orbital decay caused by the emission of gravitational waves and effects related to the relativity of direction. Precession of apsides In general relativity, the apsides of any orbit (the point of the orbiting body's closest approach to the system's center of mass) will precess; the orbit is not an ellipse, but akin to an ellipse that rotates on its focus, resulting in a rose curve-like shape (see image).
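To see how small a strain of order 10⁻²¹ is in practice, one can apply the linearized tidal relation δL = hL/2 to an interferometer arm of LIGO's scale (values assumed for illustration only):

```python
# Length change of a LIGO-scale interferometer arm for a gravitational-wave
# strain h ~ 1e-21 (the order of magnitude of waves reaching Earth).
h = 1e-21        # dimensionless strain amplitude (assumed)
L = 4000.0       # arm length, m (LIGO's arms are 4 km)

delta_L = 0.5 * h * L   # linearized displacement: delta_L = h * L / 2
print(delta_L)          # 2e-18 m -- about a thousandth of a proton radius
```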
Einstein first derived this result by using an approximate metric representing the Newtonian limit and treating the orbiting body as a test particle. For him, the fact that his theory gave a straightforward explanation of Mercury's anomalous perihelion shift, discovered earlier by Urbain Le Verrier in 1859, was important evidence that he had at last identified the correct form of the gravitational field equations. The effect can also be derived by using either the exact Schwarzschild metric (describing spacetime around a spherical mass) or the much more general post-Newtonian formalism. It is due to the influence of gravity on the geometry of space and to the contribution of self-energy to a body's gravity (encoded in the nonlinearity of Einstein's equations). Relativistic precession has been observed for all planets that allow for accurate precession measurements (Mercury, Venus, and Earth), as well as in binary pulsar systems, where it is larger by five orders of magnitude. In general relativity the perihelion shift $\sigma$, expressed in radians per revolution, is approximately given by

$$\sigma = \frac{24\pi^3 a^2}{T^2 c^2 (1-e^2)}$$

where:
$a$ is the semi-major axis
$T$ is the orbital period
$c$ is the speed of light in vacuum
$e$ is the orbital eccentricity

Orbital decay According to general relativity, a binary system will emit gravitational waves, thereby losing energy. Due to this loss, the distance between the two orbiting bodies decreases, and so does their orbital period. Within the Solar System or for ordinary double stars, the effect is too small to be observable. This is not the case for a close binary pulsar, a system of two orbiting neutron stars, one of which is a pulsar: from the pulsar, observers on Earth receive a regular series of radio pulses that can serve as a highly accurate clock, which allows precise measurements of the orbital period. Because neutron stars are immensely compact, significant amounts of energy are emitted in the form of gravitational radiation.
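Plugging Mercury's orbital elements (approximate values assumed) into the standard perihelion-shift formula σ = 24π³a²/(T²c²(1−e²)) reproduces the famous anomalous precession:

```python
import math

c = 2.99792458e8     # speed of light, m/s
a = 5.7909e10        # Mercury's semi-major axis, m (assumed approximate value)
T = 7.6005e6         # Mercury's orbital period, s (~87.97 days)
e = 0.2056           # Mercury's orbital eccentricity

# Perihelion shift per revolution, in radians
sigma = 24 * math.pi**3 * a**2 / (T**2 * c**2 * (1 - e**2))

orbits_per_century = 100 * 365.25 * 86400 / T
arcsec = sigma * orbits_per_century * (180 / math.pi) * 3600
print(arcsec)        # ~43 arcseconds per century
```

The result, about 43 arcseconds per century, matches the anomalous part of Mercury's observed precession that Newtonian perturbation theory could not account for.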
The first observation of a decrease in orbital period due to the emission of gravitational waves was made by Hulse and Taylor, using the binary pulsar PSR B1913+16 they had discovered in 1974. This was the first detection of gravitational waves, albeit indirect, for which they were awarded the 1993 Nobel Prize in Physics. Since then, several other binary pulsars have been found, in particular the double pulsar PSR J0737-3039, where both stars are pulsars and which was last reported to also be in agreement with general relativity in 2021 after 16 years of observations. Geodetic precession and frame-dragging Several relativistic effects are directly related to the relativity of direction. One is geodetic precession: the axis direction of a gyroscope in free fall in curved spacetime will change when compared, for instance, with the direction of light received from distant stars—even though such a gyroscope represents the way of keeping a direction as stable as possible ("parallel transport"). For the Moon–Earth system, this effect has been measured with the help of lunar laser ranging. More recently, it has been measured for test masses aboard the satellite Gravity Probe B to a precision of better than 0.3%. Near a rotating mass, there are gravitomagnetic or frame-dragging effects. A distant observer will determine that objects close to the mass get "dragged around". This is most extreme for rotating black holes where, for any object entering a zone known as the ergosphere, rotation is inevitable. Such effects can again be tested through their influence on the orientation of gyroscopes in free fall. Somewhat controversial tests have been performed using the LAGEOS satellites, confirming the relativistic prediction. The Mars Global Surveyor probe around Mars has also been used. Interpretations Neo-Lorentzian Interpretation Examples of prominent physicists who support neo-Lorentzian explanations of general relativity are Franco Selleri and Antony Valentini.
Astrophysical applications Gravitational lensing The deflection of light by gravity is responsible for a new class of astronomical phenomena. If a massive object is situated between the astronomer and a distant target object with appropriate mass and relative distances, the astronomer will see multiple distorted images of the target. Such effects are known as gravitational lensing. Depending on the configuration, scale, and mass distribution, there can be two or more images, a bright ring known as an Einstein ring, or partial rings called arcs. The earliest example was discovered in 1979; since then, more than a hundred gravitational lenses have been observed. Even if the multiple images are too close to each other to be resolved, the effect can still be measured, e.g., as an overall brightening of the target object; a number of such "microlensing events" have been observed. Gravitational lensing has developed into a tool of observational astronomy. It is used to detect the presence and distribution of dark matter, provide a "natural telescope" for observing distant galaxies, and to obtain an independent estimate of the Hubble constant. Statistical evaluations of lensing data provide valuable insight into the structural evolution of galaxies. Gravitational-wave astronomy Observations of binary pulsars provide strong indirect evidence for the existence of gravitational waves (see Orbital decay, above). Detection of these waves is a major goal of current relativity-related research. Several land-based gravitational wave detectors are currently in operation, most notably the interferometric detectors GEO 600, LIGO (two detectors), TAMA 300 and VIRGO. Various pulsar timing arrays are using millisecond pulsars to detect gravitational waves in the $10^{-9}$ to $10^{-6}$ Hz frequency range, which originate from binary supermassive black holes.
A European space-based detector, eLISA / NGO, is currently under development, with a precursor mission (LISA Pathfinder) having launched in December 2015. Observations of gravitational waves promise to complement observations in the electromagnetic spectrum. They are expected to yield information about black holes and other dense objects such as neutron stars and white dwarfs, about certain kinds of supernova implosions, and about processes in the very early universe, including the signature of certain types of hypothetical cosmic string. In February 2016, the Advanced LIGO team announced that they had detected gravitational waves from a black hole merger. Black holes and other compact objects Whenever the ratio of an object's mass to its radius becomes sufficiently large, general relativity predicts the formation of a black hole, a region of space from which nothing, not even light, can escape. In the currently accepted models of stellar evolution, neutron stars of around 1.4 solar masses, and stellar black holes with a few to a few dozen solar masses, are thought to be the final state for the evolution of massive stars. Usually a galaxy has one supermassive black hole with a few million to a few billion solar masses in its center, and its presence is thought to have played an important role in the formation of the galaxy and larger cosmic structures. Astronomically, the most important property of compact objects is that they provide a supremely efficient mechanism for converting gravitational energy into electromagnetic radiation. Accretion, the falling of dust or gaseous matter onto stellar or supermassive black holes, is thought to be responsible for some spectacularly luminous astronomical objects, notably diverse kinds of active galactic nuclei on galactic scales and stellar-size objects such as microquasars. 
In particular, accretion can lead to relativistic jets, focused beams of highly energetic particles that are being flung into space at almost light speed. General relativity plays a central role in modelling all these phenomena, and observations provide strong evidence for the existence of black holes with the properties predicted by the theory. Black holes are also sought-after targets in the search for gravitational waves (cf. Gravitational waves, above). Merging black hole binaries should lead to some of the strongest gravitational wave signals reaching detectors here on Earth, and the phase directly before the merger ("chirp") could be used as a "standard candle" to deduce the distance to the merger events–and hence serve as a probe of cosmic expansion at large distances. The gravitational waves produced as a stellar black hole plunges into a supermassive one should provide direct information about the supermassive black hole's geometry. Cosmology The current models of cosmology are based on Einstein's field equations, which include the cosmological constant $\Lambda$ since it has important influence on the large-scale dynamics of the cosmos,

$$R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}$$

where $g_{\mu\nu}$ is the spacetime metric. Isotropic and homogeneous solutions of these enhanced equations, the Friedmann–Lemaître–Robertson–Walker solutions, allow physicists to model a universe that has evolved over the past 14 billion years from a hot, early Big Bang phase. Once a small number of parameters (for example the universe's mean matter density) have been fixed by astronomical observation, further observational data can be used to put the models to the test. Predictions, all successful, include the initial abundance of chemical elements formed in a period of primordial nucleosynthesis, the large-scale structure of the universe, and the existence and properties of a "thermal echo" from the early cosmos, the cosmic background radiation.
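One such parameter is the mean matter density, which in the Friedmann models is compared against the critical density ρ_c = 3H₀²/(8πG) that separates open from closed geometries. A sketch, assuming a Hubble constant of 70 km/s/Mpc:

```python
import math

G = 6.674e-11                  # gravitational constant, SI units
H0 = 70 * 1000 / 3.0857e22     # Hubble constant in s^-1 (assumed 70 km/s/Mpc)

# Critical density from the Friedmann equation
rho_crit = 3 * H0**2 / (8 * math.pi * G)
print(rho_crit)   # ~9.2e-27 kg/m^3, a few hydrogen atoms per cubic metre
```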
Astronomical observations of the cosmological expansion rate allow the total amount of matter in the universe to be estimated, although the nature of that matter remains mysterious in part. About 90% of all matter appears to be dark matter, which has mass (or, equivalently, gravitational influence), but does not interact electromagnetically and, hence, cannot be observed directly. There is no generally accepted description of this new kind of matter, within the framework of known particle physics or otherwise. Observational evidence from redshift surveys of distant supernovae and measurements of the cosmic background radiation also show that the evolution of our universe is significantly influenced by a cosmological constant resulting in an acceleration of cosmic expansion or, equivalently, by a form of energy with an unusual equation of state, known as dark energy, the nature of which remains unclear. An inflationary phase, an additional phase of strongly accelerated expansion at cosmic times of around $10^{-33}$ seconds, was hypothesized in 1980 to account for several puzzling observations that were unexplained by classical cosmological models, such as the nearly perfect homogeneity of the cosmic background radiation. Recent measurements of the cosmic background radiation have resulted in the first evidence for this scenario. However, there is a bewildering variety of possible inflationary scenarios, which cannot be restricted by current observations. An even larger question is the physics of the earliest universe, prior to the inflationary phase and close to where the classical models predict the big bang singularity. An authoritative answer would require a complete theory of quantum gravity, which has not yet been developed (cf. the section on quantum gravity, below). Exotic solutions: Time travel, Warp drives Kurt Gödel showed that solutions to Einstein's equations exist that contain closed timelike curves (CTCs), which allow for loops in time.
The solutions require extreme physical conditions unlikely ever to occur in practice, and it remains an open question whether further laws of physics will eliminate them completely. Since then, other—similarly impractical—GR solutions containing CTCs have been found, such as the Tipler cylinder and traversable wormholes. Stephen Hawking introduced the chronology protection conjecture, an assumption beyond those of standard general relativity, to prevent time travel. Some exact solutions in general relativity, such as the Alcubierre drive, present examples of warp drive, but these solutions require an exotic matter distribution and generally suffer from semiclassical instability. Advanced concepts Asymptotic symmetries The spacetime symmetry group for special relativity is the Poincaré group, which is a ten-dimensional group of three Lorentz boosts, three rotations, and four spacetime translations. It is logical to ask what symmetries, if any, might apply in General Relativity. A tractable case might be to consider the symmetries of spacetime as seen by observers located far away from all sources of the gravitational field. The naive expectation for asymptotically flat spacetime symmetries might be simply to extend and reproduce the symmetries of flat spacetime of special relativity, viz., the Poincaré group. In 1962 Hermann Bondi, M. G. van der Burg, A. W. Metzner and Rainer K. Sachs addressed this asymptotic symmetry problem in order to investigate the flow of energy at infinity due to propagating gravitational waves. Their first step was to decide on some physically sensible boundary conditions to place on the gravitational field at light-like infinity to characterize what it means to say a metric is asymptotically flat, making no a priori assumptions about the nature of the asymptotic symmetry group — not even the assumption that such a group exists.
Then after designing what they considered to be the most sensible boundary conditions, they investigated the nature of the resulting asymptotic symmetry transformations that leave invariant the form of the boundary conditions appropriate for asymptotically flat gravitational fields. What they found was that the asymptotic symmetry transformations actually do form a group and the structure of this group does not depend on the particular gravitational field that happens to be present. This means that, as expected, one can separate the kinematics of spacetime from the dynamics of the gravitational field at least at spatial infinity. The puzzling surprise in 1962 was their discovery of a rich infinite-dimensional group (the so-called BMS group) as the asymptotic symmetry group, instead of the finite-dimensional Poincaré group, which is a subgroup of the BMS group. Not only are the Lorentz transformations asymptotic symmetry transformations, there are also additional transformations that are not Lorentz transformations but are asymptotic symmetry transformations. In fact, they found an additional infinity of transformation generators known as supertranslations. This implies the conclusion that General Relativity (GR) does not reduce to special relativity in the case of weak fields at long distances. It turns out that the BMS symmetry, suitably modified, could be seen as a restatement of the universal soft graviton theorem in quantum field theory (QFT), which relates universal infrared (soft) QFT with GR asymptotic spacetime symmetries. Causal structure and global geometry In general relativity, no material body can catch up with or overtake a light pulse. No influence from an event A can reach any other location X before light sent out at A to X. In consequence, an exploration of all light worldlines (null geodesics) yields key information about the spacetime's causal structure. 
This structure can be displayed using Penrose–Carter diagrams in which infinitely large regions of space and infinite time intervals are shrunk ("compactified") so as to fit onto a finite map, while light still travels along diagonals as in standard spacetime diagrams. Aware of the importance of causal structure, Roger Penrose and others developed what is known as global geometry. In global geometry, the object of study is not one particular solution (or family of solutions) to Einstein's equations. Rather, relations that hold true for all geodesics, such as the Raychaudhuri equation, and additional non-specific assumptions about the nature of matter (usually in the form of energy conditions) are used to derive general results. Horizons Using global geometry, some spacetimes can be shown to contain boundaries called horizons, which demarcate one region from the rest of spacetime. The best-known examples are black holes: if mass is compressed into a sufficiently compact region of space (as specified in the hoop conjecture, the relevant length scale is the Schwarzschild radius), no light from inside can escape to the outside. Since no object can overtake a light pulse, all interior matter is imprisoned as well. Passage from the exterior to the interior is still possible, showing that the boundary, the black hole's horizon, is not a physical barrier. Early studies of black holes relied on explicit solutions of Einstein's equations, notably the spherically symmetric Schwarzschild solution (used to describe a static black hole) and the axisymmetric Kerr solution (used to describe a rotating, stationary black hole, and introducing interesting features such as the ergosphere). Using global geometry, later studies have revealed more general properties of black holes. With time they become rather simple objects characterized by eleven parameters specifying: electric charge, mass-energy, linear momentum, angular momentum, and location at a specified time. 
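The relevant length scale mentioned in connection with the hoop conjecture, the Schwarzschild radius r_s = 2GM/c², is easy to evaluate. For one solar mass (approximate value assumed):

```python
G = 6.674e-11        # gravitational constant, SI units
c = 2.99792458e8     # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg (assumed approximate value)

# Schwarzschild radius: the size below which a mass forms a black hole
r_s = 2 * G * M_sun / c**2
print(r_s)   # ~2954 m: the Sun compressed below ~3 km would be a black hole
```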
This is stated by the black hole uniqueness theorem: "black holes have no hair", that is, no distinguishing marks like the hairstyles of humans. Irrespective of the complexity of a gravitating object collapsing to form a black hole, the object that results (having emitted gravitational waves) is very simple. Even more remarkably, there is a general set of laws known as black hole mechanics, which is analogous to the laws of thermodynamics. For instance, by the second law of black hole mechanics, the area of the event horizon of a general black hole will never decrease with time, analogous to the entropy of a thermodynamic system. This limits the energy that can be extracted by classical means from a rotating black hole (e.g. by the Penrose process). There is strong evidence that the laws of black hole mechanics are, in fact, a subset of the laws of thermodynamics, and that the black hole area is proportional to its entropy. This leads to a modification of the original laws of black hole mechanics: for instance, as the second law of black hole mechanics becomes part of the second law of thermodynamics, it is possible for black hole area to decrease—as long as other processes ensure that, overall, entropy increases. As thermodynamical objects with non-zero temperature, black holes should emit thermal radiation. Semi-classical calculations indicate that indeed they do, with the surface gravity playing the role of temperature in Planck's law. This radiation is known as Hawking radiation (cf. the quantum theory section, below). There are other types of horizons. In an expanding universe, an observer may find that some regions of the past cannot be observed ("particle horizon"), and some regions of the future cannot be influenced (event horizon). Even in flat Minkowski space, when described by an accelerated observer (Rindler space), there will be horizons associated with a semi-classical radiation known as Unruh radiation. 
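The temperature of this Hawking radiation, set by the surface gravity as described above, is T_H = ħc³/(8πGMk_B) for a Schwarzschild black hole. A numerical sketch for one solar mass (CODATA-style constants, approximate solar mass assumed):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
G = 6.674e-11            # gravitational constant, SI units
kB = 1.380649e-23        # Boltzmann constant, J/K
M = 1.989e30             # one solar mass, kg (assumed approximate value)

# Hawking temperature of a Schwarzschild black hole
T_H = hbar * c**3 / (8 * math.pi * G * M * kB)
print(T_H)   # ~6.2e-8 K -- utterly negligible next to the 2.7 K cosmic background
```

Because T_H scales as 1/M, only very small black holes would be hot enough for their evaporation to be observable.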
Singularities Another general feature of general relativity is the appearance of spacetime boundaries known as singularities. Spacetime can be explored by following up on timelike and lightlike geodesics—all possible ways that light and particles in free fall can travel. But some solutions of Einstein's equations have "ragged edges"—regions known as spacetime singularities, where the paths of light and falling particles come to an abrupt end, and geometry becomes ill-defined. In the more interesting cases, these are "curvature singularities", where geometrical quantities characterizing spacetime curvature, such as the Ricci scalar, take on infinite values. Well-known examples of spacetimes with future singularities—where worldlines end—are the Schwarzschild solution, which describes a singularity inside an eternal static black hole, or the Kerr solution with its ring-shaped singularity inside an eternal rotating black hole. The Friedmann–Lemaître–Robertson–Walker solutions and other spacetimes describing universes have past singularities on which worldlines begin, namely Big Bang singularities, and some have future singularities (Big Crunch) as well. Given that these examples are all highly symmetric—and thus simplified—it is tempting to conclude that the occurrence of singularities is an artifact of idealization. The famous singularity theorems, proved using the methods of global geometry, say otherwise: singularities are a generic feature of general relativity, and unavoidable once the collapse of an object with realistic matter properties has proceeded beyond a certain stage and also at the beginning of a wide class of expanding universes. However, the theorems say little about the properties of singularities, and much of current research is devoted to characterizing these entities' generic structure (hypothesized e.g. by the BKL conjecture). 
The cosmic censorship hypothesis states that all realistic future singularities (no perfect symmetries, matter with realistic properties) are safely hidden away behind a horizon, and thus invisible to all distant observers. While no formal proof yet exists, numerical simulations offer supporting evidence of its validity. Evolution equations Each solution of Einstein's equation encompasses the whole history of a universe — it is not just some snapshot of how things are, but a whole, possibly matter-filled, spacetime. It describes the state of matter and geometry everywhere and at every moment in that particular universe. Due to its general covariance, Einstein's theory is not sufficient by itself to determine the time evolution of the metric tensor. It must be combined with a coordinate condition, which is analogous to gauge fixing in other field theories. To understand Einstein's equations as partial differential equations, it is helpful to formulate them in a way that describes the evolution of the universe over time. This is done in "3+1" formulations, where spacetime is split into three space dimensions and one time dimension. The best-known example is the ADM formalism. These decompositions show that the spacetime evolution equations of general relativity are well-behaved: solutions always exist, and are uniquely defined, once suitable initial conditions have been specified. Such formulations of Einstein's field equations are the basis of numerical relativity. Global and quasi-local quantities The notion of evolution equations is intimately tied in with another aspect of general relativistic physics. In Einstein's theory, it turns out to be impossible to find a general definition for a seemingly simple property such as a system's total mass (or energy). The main reason is that the gravitational field—like any physical field—must be ascribed a certain energy, but that it proves to be fundamentally impossible to localize that energy. 
Nevertheless, there are possibilities to define a system's total mass, either using a hypothetical "infinitely distant observer" (ADM mass) or suitable symmetries (Komar mass). If one excludes from the system's total mass the energy being carried away to infinity by gravitational waves, the result is the Bondi mass at null infinity. Just as in classical physics, it can be shown that these masses are positive. Corresponding global definitions exist for momentum and angular momentum. There have also been a number of attempts to define quasi-local quantities, such as the mass of an isolated system formulated using only quantities defined within a finite region of space containing that system. The hope is to obtain a quantity useful for general statements about isolated systems, such as a more precise formulation of the hoop conjecture. Relationship with quantum theory If general relativity were considered to be one of the two pillars of modern physics, then quantum theory, the basis of understanding matter from elementary particles to solid-state physics, would be the other. However, how to reconcile quantum theory with general relativity is still an open question. Quantum field theory in curved spacetime Ordinary quantum field theories, which form the basis of modern elementary particle physics, are defined in flat Minkowski space, which is an excellent approximation when it comes to describing the behavior of microscopic particles in weak gravitational fields like those found on Earth. In order to describe situations in which gravity is strong enough to influence (quantum) matter, yet not strong enough to require quantization itself, physicists have formulated quantum field theories in curved spacetime. These theories rely on general relativity to describe a curved background spacetime, and define a generalized quantum field theory to describe the behavior of quantum matter within that spacetime. 
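For asymptotically flat spacetimes, the ADM mass mentioned above is read off from the falloff of the spatial metric h_ij far from the source (standard form, units G = c = 1):

```latex
M_{\mathrm{ADM}} = \frac{1}{16\pi} \lim_{r \to \infty} \oint_{S_{r}} \bigl( \partial_{j} h_{ij} - \partial_{i} h_{jj} \bigr)\, dS^{i},
```

a surface integral evaluated on spheres S_r of ever larger radius, which is why it corresponds to the total mass seen by an "infinitely distant observer."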
Using this formalism, it can be shown that black holes emit a blackbody spectrum of particles known as Hawking radiation leading to the possibility that they evaporate over time. As briefly mentioned above, this radiation plays an important role for the thermodynamics of black holes. Quantum gravity The demand for consistency between a quantum description of matter and a geometric description of spacetime, as well as the appearance of singularities (where curvature length scales become microscopic), indicate the need for a full theory of quantum gravity: for an adequate description of the interior of black holes, and of the very early universe, a theory is required in which gravity and the associated geometry of spacetime are described in the language of quantum physics. Despite major efforts, no complete and consistent theory of quantum gravity is currently known, even though a number of promising candidates exist. Attempts to generalize ordinary quantum field theories, used in elementary particle physics to describe fundamental interactions, so as to include gravity have led to serious problems. Some have argued that at low energies, this approach proves successful, in that it results in an acceptable effective (quantum) field theory of gravity. At very high energies, however, the perturbative results are badly divergent and lead to models devoid of predictive power ("perturbative non-renormalizability"). One attempt to overcome these limitations is string theory, a quantum theory not of point particles, but of minute one-dimensional extended objects. The theory promises to be a unified description of all particles and interactions, including gravity; the price to pay is unusual features such as six extra dimensions of space in addition to the usual three. 
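The Hawking temperature of a Schwarzschild black hole, T_H = ħc³/(8πGMk_B), follows from this formalism and is easy to evaluate numerically. A minimal sketch (SI constants; the formula and values are standard, the code is illustrative):

```python
import math

# Physical constants (SI units)
HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.380649e-23       # Boltzmann constant, J/K
M_SUN = 1.98892e30       # solar mass, kg

def hawking_temperature(mass_kg: float) -> float:
    """Hawking temperature (kelvin) of a Schwarzschild black hole."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

# A solar-mass black hole is far colder than the cosmic microwave
# background, so evaporation is utterly negligible for astrophysical
# black holes today.
print(f"{hawking_temperature(M_SUN):.2e} K")  # roughly 6e-8 K
```

Note that the temperature is inversely proportional to the mass, so only very small black holes would evaporate on observable timescales.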
In what is called the second superstring revolution, it was conjectured that both string theory and a unification of general relativity and supersymmetry known as supergravity form part of a hypothesized eleven-dimensional model known as M-theory, which would constitute a uniquely defined and consistent theory of quantum gravity. Another approach starts with the canonical quantization procedures of quantum theory. Using the initial value formulation of general relativity (cf. evolution equations above), the result is the Wheeler–DeWitt equation (an analogue of the Schrödinger equation) which, regrettably, turns out to be ill-defined without a proper ultraviolet (lattice) cutoff. However, with the introduction of what are now known as Ashtekar variables, this leads to a promising model known as loop quantum gravity. Space is represented by a web-like structure called a spin network, evolving over time in discrete steps. Depending on which features of general relativity and quantum theory are accepted unchanged, and on what level changes are introduced, there are numerous other attempts to arrive at a viable theory of quantum gravity, some examples being the lattice theory of gravity based on the Feynman path-integral approach and Regge calculus, dynamical triangulations, causal sets, twistor models, or the path-integral-based models of quantum cosmology. All candidate theories still have major formal and conceptual problems to overcome. They also face the common problem that, as yet, there is no way to put quantum gravity predictions to experimental tests (and thus to decide between the candidates where their predictions vary), although there is hope for this to change as future data from cosmological observations and particle physics experiments becomes available. Current status General relativity has emerged as a highly successful model of gravitation and cosmology, which has so far passed many unambiguous observational and experimental tests. 
However, there are strong indications that the theory is incomplete. The problem of quantum gravity and the question of the reality of spacetime singularities remain open. Observational data that is taken as evidence for dark energy and dark matter could indicate the need for new physics. Even taken as is, general relativity is rich with possibilities for further exploration. Mathematical relativists seek to understand the nature of singularities and the fundamental properties of Einstein's equations, while numerical relativists run increasingly powerful computer simulations (such as those describing merging black holes). In February 2016, it was announced that the existence of gravitational waves was directly detected by the Advanced LIGO team on September 14, 2015. A century after its introduction, general relativity remains a highly active area of research. See also Alcubierre drive (warp drive) Alternatives to general relativity Center of mass (relativistic) Contributors to general relativity Derivations of the Lorentz transformations Ehrenfest paradox Einstein–Hilbert action Einstein's thought experiments General relativity priority dispute Introduction to the mathematics of general relativity Nordström's theory of gravitation Ricci calculus Tests of general relativity Timeline of gravitational physics and relativity Two-body problem in general relativity Weak Gravity Conjecture External links Einstein Online – Articles on a variety of aspects of relativistic physics for a general audience; hosted by the Max Planck Institute for Gravitational Physics GEO600 home page, the official website of the GEO600 project LIGO Laboratory NCSA Spacetime Wrinkles – produced by the numerical relativity group at the NCSA, with an elementary introduction to general relativity Lecture by Leonard Susskind on general relativity, recorded September 22, 2008 at Stanford University Series of lectures on General Relativity given in 2006 at the Institut Henri Poincaré (introductory/advanced) General Relativity Tutorials by John Baez Concepts in astronomy Theories by Albert Einstein 1915 in science Articles containing video clips
https://en.wikipedia.org/wiki/Genealogy
Genealogy
Genealogy is the study of families, family history, and the tracing of their lineages. Genealogists use oral interviews, historical records, genetic analysis, and other records to obtain information about a family and to demonstrate kinship and pedigrees of its members. The results are often displayed in charts or written as narratives. The field of family history is broader than genealogy, and covers not just lineage but also family and community history and biography. The record of genealogical work may be presented as a "genealogy," a "family history," or a "family tree." In the narrow sense, a "genealogy" or a "family tree" traces the descendants of one person, whereas a "family history" traces the ancestors of one person, but the terms are often used interchangeably. A family history may include additional biographical information, family traditions, and the like. The pursuit of family history and origins tends to be shaped by several motives, including the desire to carve out a place for one's family in the larger historical picture, a sense of responsibility to preserve the past for future generations, and self-satisfaction in accurate storytelling. Genealogy research is also performed for scholarly or forensic purposes. Overview Amateur genealogists typically pursue their own ancestry and that of their spouses. Professional genealogists may also conduct research for others, publish books on genealogical methods, teach, or produce their own databases. They may work for companies that provide software or produce materials of use to other professionals and to amateurs. Both try to understand not just where and when people lived but also their lifestyles, biographies, and motivations. This often requires—or leads to—knowledge of antiquated laws, old political boundaries, migration trends, and historical socioeconomic or religious conditions. 
Genealogists sometimes specialize in a particular group, e.g., a Scottish clan; a particular surname, such as in a one-name study; a small community, e.g., a single village or parish, such as in a one-place study; or a particular, often famous, person. Bloodlines of Salem is an example of a specialized family-history group. It welcomes members who can prove descent from a participant of the Salem Witch Trials or who simply choose to support the group. Genealogists and family historians often join family history societies, where novices can learn from more experienced researchers. Such societies generally serve a specific geographical area. Their members may also index records to make them more accessible or engage in advocacy and other efforts to preserve public records and cemeteries. Some schools engage students in such projects as a means to reinforce lessons regarding immigration and history. Other benefits include family medical histories for families with serious medical conditions that are hereditary. The terms "genealogy" and "family history" are often used synonymously, but some entities offer a slight difference in definition. The Society of Genealogists, while also using the terms interchangeably, describes genealogy as the "establishment of a pedigree by extracting evidence, from valid sources, of how one generation is connected to the next" and family history as "a biographical study of a genealogically proven family and of the community and country in which they lived". Motivation Individuals conduct genealogical research for a number of reasons. Personal or medical interest Private individuals research genealogy out of curiosity about their heritage. This curiosity can be particularly strong among those whose family histories were lost or unknown due to, for example, adoption or separation from family through divorce, death, or other situations. 
In addition to simply wanting to know more about who they are and where they came from, individuals may research their genealogy to learn about any hereditary diseases in their family history. There is a growing interest in family history in the media as a result of advertising and television shows sponsored by large genealogy companies, such as Ancestry.com. This, coupled with easier access to online records and the affordability of DNA tests, has both inspired curiosity and allowed those who are curious to easily start investigating their ancestry. Community or religious obligation In communitarian societies, one's identity is defined as much by one's kin network as by individual achievement, and the question "Who are you?" would be answered by a description of father, mother, and tribe. New Zealand Māori, for example, learn whakapapa (genealogies) to discover who they are. Family history plays a part in the practice of some religious belief systems. For example, The Church of Jesus Christ of Latter-day Saints (LDS Church) has a doctrine of baptism for the dead, which necessitates that members of that faith engage in family history research. In East Asian countries that were historically shaped by Confucianism, many people follow a practice of ancestor worship as well as genealogical record-keeping. Ancestors' names are inscribed on tablets and placed in shrines, where rituals are performed. Genealogies are also recorded in genealogy books. This practice is rooted in the belief that respect for one's family is a foundation for a healthy society. Establishing identity Royal families, both historically and in modern times, keep records of their genealogies in order to establish their right to rule and determine who will be the next sovereign. For centuries in various cultures, one's genealogy has been a source of political and social status. Some countries and indigenous tribes allow individuals to obtain citizenship based on their genealogy. 
In Ireland and in Greece, for example, an individual can become a citizen if one of their grandparents was born in that country, regardless of their own or their parents' birthplace. In societies such as Australia or the United States, by the 20th century, there was growing pride in the pioneers and nation-builders. Establishing descent from these was, and is, important to lineage societies, such as the Daughters of the American Revolution and The General Society of Mayflower Descendants. Modern family history explores new sources of status, such as celebrating the resilience of families that survived generations of poverty or slavery, or the success of families in integrating across racial or national boundaries. Some family histories even emphasize links to celebrity criminals, such as the bushranger Ned Kelly in Australia. Legal and forensic research Lawyers involved in probate cases do genealogy to locate heirs of property. Detectives may perform genealogical research using DNA evidence to identify victims of homicides or perpetrators of crimes. Scholarly research Historians and geneticists may carry out genealogical research to gain a greater understanding of specific topics in their respective fields, and some may employ professional genealogists in connection with specific aspects of their research. They also publish their research in peer-reviewed journals. The introduction of postgraduate courses in genealogy in recent years has given genealogy more of an academic focus, with the emergence of peer-reviewed journals in this area. Scholarly genealogy is beginning to emerge as a discipline in its own right, with an increasing number of individuals who have obtained genealogical qualifications carrying out research on a diverse range of topics related to genealogy, both within academic institutions and independently. 
History Historically, in Western societies, the focus of genealogy was on the kinship and descent of rulers and nobles, often arguing or demonstrating the legitimacy of claims to wealth and power. The term often overlapped with heraldry, in which the ancestry of royalty was reflected in their coats of arms. Modern scholars consider many claimed noble ancestries to be fabrications, such as the Anglo-Saxon Chronicle that traced the ancestry of several English kings to the god Woden. Some family trees have been maintained for considerable periods. The family tree of Confucius has been maintained for over 2,500 years and is listed in the Guinness Book of World Records as the largest extant family tree. The fifth edition of the Confucius Genealogy was printed in 2009 by the Confucius Genealogy Compilation Committee (CGCC). Modern times In modern times, genealogy has become more widespread, with commoners as well as nobility researching and maintaining their family trees. Genealogy received a boost in the late 1970s with the television broadcast of Roots: The Saga of an American Family by Alex Haley. His account of his family's descent from the African tribesman Kunta Kinte inspired many others to study their own lines. With the advent of the Internet, the number of resources readily accessible to genealogists has vastly increased, resulting in an explosion of interest in the topic. Genealogy is one of the most popular topics on the Internet. The Internet has become a major source not only of data for genealogists but also of education and communication. India Some notable places where traditional genealogy records are kept include Hindu genealogy registers at Haridwar (Uttarakhand), Varanasi and Allahabad (Uttar Pradesh), Kurukshetra (Haryana), Trimbakeshwar (Maharashtra), and Chintpurni (Himachal Pradesh). United States Genealogical research in the United States was first systematized in the early 19th century, especially by John Farmer (1789–1838). 
Before Farmer's efforts, tracing one's genealogy was seen as an attempt by the American colonists to secure a measure of social standing, an aim that was counter to the new republic's egalitarian, future-oriented ideals (as outlined in the Constitution). As Fourth of July celebrations commemorating the Founding Fathers and the heroes of the Revolutionary War became increasingly popular, however, the pursuit of "antiquarianism," which focused on local history, became acceptable as a way to honor the achievements of early Americans. Farmer capitalized on the acceptability of antiquarianism to frame genealogy within the early republic's ideological framework of pride in one's American ancestors. He corresponded with other antiquarians in New England, where antiquarianism and genealogy were well established, and became a coordinator, booster, and contributor to the growing movement. In the 1820s, he and fellow antiquarians began to produce genealogical and antiquarian tracts in earnest, slowly gaining a devoted audience among the American people. Though Farmer died in 1838, his efforts led to the creation of the New England Historic Genealogical Society (NEHGS), one of New England's oldest and most prominent organizations dedicated to the preservation of public records. NEHGS publishes the New England Historical and Genealogical Register. The Genealogical Society of Utah, founded in 1894, later became the Family History Department of The Church of Jesus Christ of Latter-day Saints. The department's research facility, the Family History Library, which Utah.com states is "the largest genealogical library in the world," was established to assist in tracing family lineages for special religious ceremonies which Latter-day Saints believe will seal family units together for eternity. 
Latter-day Saints believe that this fulfilled a biblical prophecy stating that the prophet Elijah would return to "turn the heart of the fathers to the children, and the heart of the children to their fathers." There is a network of church-operated Family History Centers all over the country and around the world, where volunteers assist the public with tracing their ancestors ("Family History Centers," The Church of Jesus Christ of Latter-day Saints: Newsroom, accessed 2 Jul 2019). Brigham Young University offers bachelor's degree, minor, and concentration programs in Family History and is the only school in North America to offer this. The American Society of Genealogists is the scholarly honorary society of the U.S. genealogical field. Founded by John Insley Coddington, Arthur Adams, and Meredith B. Colket, Jr., in December 1940, its membership is limited to 50 living fellows. ASG has semi-annually published The Genealogist, a scholarly journal of genealogical research, since 1980. Fellows of the American Society of Genealogists, who bear the post-nominal acronym FASG, have written some of the most notable genealogical materials of the last half-century. Some of the most notable scholarly American genealogical journals are The American Genealogist, National Genealogical Society Quarterly, The New England Historical and Genealogical Register, The New York Genealogical and Biographical Record, and The Genealogist (David L. Greene, "Scholarly Genealogical Journals in America," The American Genealogist 61 (1985–86): 116–20). Research process Genealogical research is a complex process that uses historical records and sometimes genetic analysis to demonstrate kinship. Reliable conclusions are based on the quality of sources (ideally, original records), the information within those sources (ideally, primary or firsthand information), and the evidence that can be drawn (directly or indirectly) from that information. 
In many instances, genealogists must skillfully assemble indirect or circumstantial evidence to build a case for identity and kinship. All evidence and conclusions, together with the documentation that supports them, are then assembled to create a cohesive genealogy or family history. Genealogists begin their research by collecting family documents and stories. This creates a foundation for documentary research, which involves examining and evaluating historical records for evidence about ancestors and other relatives, their kinship ties, and the events that occurred in their lives. As a rule, genealogists begin with the present and work backwards in time. Historical, social, and family context is essential to achieving correct identification of individuals and relationships. Source citation is also important when conducting genealogical research. To keep track of collected material, family group sheets and pedigree charts are used. Formerly handwritten, these can now be generated by genealogical software. Genetic analysis Because a person's DNA contains information that has been passed down relatively unchanged from early ancestors, analysis of DNA is sometimes used for genealogical research. Three DNA types are of particular interest. Mitochondrial DNA (mtDNA) is contained in the mitochondria of the egg cell and is passed down from a mother to all of her children, both male and female; however, only females pass it on to their children. Y-DNA is present only in males and is passed down from a father to his sons (direct male line) with only minor mutations occurring over time. Autosomal DNA (atDNA) is found in the 22 non-sex chromosomes (autosomes) and is inherited from both parents; thus, it can uncover relatives from any branch of the family. A genealogical DNA test allows two individuals to find the probability that they are, or are not, related within an estimated number of generations. 
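Because autosomal DNA is, on average, halved at each generation, the expected fraction shared with a direct ancestor falls off geometrically. A minimal illustrative sketch (expected values only; real shares vary because recombination is random):

```python
def expected_shared_fraction(generations_back: int) -> float:
    """Expected fraction of autosomal DNA shared with a direct ancestor
    n generations back (1 = parent, 2 = grandparent, ...).
    Expected value only; actual inheritance varies with recombination."""
    return 0.5 ** generations_back

for n, label in [(1, "parent"), (2, "grandparent"), (3, "great-grandparent")]:
    print(f"{label}: {expected_shared_fraction(n):.1%}")
# parent: 50.0%, grandparent: 25.0%, great-grandparent: 12.5%
```

This falloff is why autosomal tests lose resolving power after a handful of generations, while Y-DNA and mtDNA, which pass down a single line nearly unchanged, can reach much deeper in time.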
Individual genetic test results are collected in databases to match people descended from a relatively recent common ancestor. See, for example, the Molecular Genealogy Research Project. Some tests are limited to either the patrilineal or the matrilineal line. Collaboration Most genealogy software programs can export information about persons and their relationships in a standardized format called a GEDCOM. In that format, it can be shared with other genealogists, added to databases, or converted into family web sites. Social networking service (SNS) websites allow genealogists to share data and build their family trees online. Members can upload their family trees and contact other family historians to fill in gaps in their research. In addition to SNS websites, there are other resources that encourage genealogists to connect and share information, such as rootsweb.ancestry.com and rsl.rootsweb.ancestry.com. Volunteerism Volunteer efforts figure prominently in genealogy. These range from the extremely informal to the highly organized. On the informal side are the many popular and useful message boards such as Rootschat and mailing lists on particular surnames, regions, and other topics. These forums can be used to try to find relatives, request record lookups, obtain research advice, and much more. Many genealogists participate in loosely organized projects, both online and off. These collaborations take numerous forms. Some projects prepare name indexes for records, such as probate cases, and publish the indexes, either online or off. These indexes can be used as finding aids to locate original records. Other projects transcribe or abstract records. Offering record lookups for particular geographic areas is another common service. Volunteers do record lookups or take photos in their home areas for researchers who are unable to travel. Those looking for a structured volunteer environment can join one of thousands of genealogical societies worldwide. 
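The GEDCOM interchange format mentioned above is plain text organized by numbered levels, tags, and values. The sketch below is illustrative only, not a full GEDCOM parser; the sample record is invented for the example (surnames are conventionally delimited by slashes in the NAME tag):

```python
# Invented sample data in GEDCOM-style plain text: level, optional
# cross-reference id (@I1@), tag, and value.
sample = """0 @I1@ INDI
1 NAME Marga /Olafsdottir/
1 BIRT
2 DATE 12 MAR 1901
0 @I2@ INDI
1 NAME Olaf /Thorsson/"""

def individual_names(gedcom_text: str) -> dict:
    """Map each individual's cross-reference id to their NAME value."""
    names = {}
    current = None
    for line in gedcom_text.splitlines():
        parts = line.split(" ", 2)
        level = int(parts[0])
        if level == 0 and len(parts) == 3 and parts[2] == "INDI":
            current = parts[1]  # e.g. "@I1@"
        elif level == 1 and parts[1] == "NAME" and current:
            # Strip the conventional surname slashes for display.
            names[current] = parts[2].replace("/", "").strip()
    return names

print(individual_names(sample))
```

Because the format is line-oriented and hierarchical, even a simple reader like this can extract useful data, which is part of why GEDCOM became the de facto interchange standard between genealogy programs.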
Most societies have a unique area of focus, such as a particular surname, ethnicity, geographic area, or descendancy from participants in a given historical event. Genealogical societies are almost exclusively staffed by volunteers and may offer a broad range of services, including maintaining libraries for members' use, publishing newsletters, providing research assistance to the public, offering classes or seminars, and organizing record preservation or transcription projects. Software Genealogy software is used to collect, store, sort, and display genealogical data. At a minimum, genealogy software accommodates basic information about individuals, including births, marriages, and deaths. Many programs allow for additional biographical information, including occupation, residence, and notes, and most also offer a method for keeping track of the sources for each piece of evidence. Most programs can generate basic kinship charts and reports, allow for the import of digital photographs and the export of data in the GEDCOM format (short for GEnealogical Data COMmunication) so that data can be shared with those using other genealogy software. More advanced features include the ability to restrict the information that is shared, usually by removing information about living people out of privacy concerns; the import of sound files; the generation of family history books, web pages and other publications; the ability to handle same-sex marriages and children born out of wedlock; searching the Internet for data; and the provision of research guidance. Programs may be geared toward a specific religion, with fields relevant to that religion, or to specific nationalities or ethnic groups, with source types relevant for those groups. Online resources involve complex programming and large data bases, such as censuses. Records and documentation Genealogists use a wide variety of records in their research. 
To effectively conduct genealogical research, it is important to understand how the records were created, what information is included in them, and how and where to access them (David Hey, The Oxford Companion to Family and Local History, 2nd ed. 2008). List of record types Records that are used in genealogy research include: Vital records Birth records Death records Marriage and divorce records Adoption records Biographies and biographical profiles (e.g. Who's Who) Cemetery lists Census records Church and Religious records Baptism or christening Brit milah or Baby naming certificates Confirmation Bar or bat mitzvah Marriage Funeral or death Membership City directories and telephone directories Coroner's reports Court records Criminal records Civil records Diaries, personal letters and family Bibles DNA tests Emigration, immigration and naturalization records Hereditary & lineage organization records, e.g. Daughters of the American Revolution records Land and property records, deeds Medical records Military and conscription records Newspaper articles Obituaries Occupational records Oral histories Passports Photographs Poorhouse, workhouse, almshouse, and asylum records School and alumni association records Ship passenger lists Social Security (within the US) and pension records Tax records Tombstones, cemetery records, and funeral home records Voter registration records Wills and probate records To keep track of their citizens, governments began keeping records of persons who were neither royalty nor nobility. In England and Germany, for example, such record keeping started with parish registers in the 16th century. As more of the population was recorded, there were sufficient records to follow a family. Major life events, such as births, marriages, and deaths, were often documented with a license, permit, or report. 
Genealogists locate these records in local, regional or national offices or archives and extract information about family relationships and recreate timelines of persons' lives. In China, India and other Asian countries, genealogy books are used to record the names, occupations, and other information about family members, with some books dating back hundreds or even thousands of years. In the eastern Indian state of Bihar, there is a written tradition of genealogical records among Maithil Brahmins and Karna Kayasthas called "Panjis", dating to the 12th century CE. Even today these records are consulted prior to marriages. In Ireland, genealogical records were recorded by professional families of senchaidh (historians) until as late as the mid-17th century. Perhaps the most outstanding example of this genre is Leabhar na nGenealach/The Great Book of Irish Genealogies, by Dubhaltach MacFhirbhisigh (d. 1671), published in 2004. FamilySearch collections The LDS Church has engaged in large-scale microfilming of records of genealogical value. Its Family History Library in Salt Lake City, Utah, houses over 2 million microfiche and microfilms of genealogically relevant material, which are also available for on-site research at over 4,500 Family History Centers worldwide. FamilySearch's website includes many resources for genealogists: a FamilyTree database, historical records, digitized family history books, resources and indexing for African American genealogy such as slave and bank records, and a Family History Research Wiki containing research guidance articles. Indexing ancestral information Indexing is the process of transcribing parish records, city vital records, and other reports to a digital database for searching. Volunteers and professionals participate in the indexing process. Since 2006, the microfilm in the FamilySearch Granite Mountain vault has been in the process of being digitally scanned, made available online, and eventually indexed. 
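Conceptually, indexing turns transcribed records into a searchable lookup structure. A toy sketch of a surname index (the record fields and sample data are invented for illustration):

```python
from collections import defaultdict

# Invented sample of transcribed records; real indexing projects work
# from parish registers, censuses, and similar sources.
records = [
    {"id": 1, "name": "Elizabeth Farmer", "event": "baptism", "year": 1790},
    {"id": 2, "name": "John Farmer", "event": "marriage", "year": 1810},
    {"id": 3, "name": "Olaf Thorsson", "event": "burial", "year": 1856},
]

def build_surname_index(recs: list) -> dict:
    """Map each (lowercased) surname to the ids of matching records,
    so the index can serve as a finding aid back to the originals."""
    index = defaultdict(list)
    for rec in recs:
        surname = rec["name"].split()[-1].lower()
        index[surname].append(rec["id"])
    return dict(index)

print(build_surname_index(records))  # {'farmer': [1, 2], 'thorsson': [3]}
```

An index like this points researchers to the original records rather than replacing them, which is why indexing projects publish finding aids alongside, not instead of, the source images.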
For example, after the 72-year legal limit for releasing personal information for the United States Census was reached in 2012, genealogical groups cooperated to index the 132 million residents registered in the 1940 United States Census. Between 2006 and 2012, the FamilySearch indexing effort produced more than 1 billion searchable records. Record loss and preservation Sometimes genealogical records are destroyed, whether accidentally or on purpose. In order to do thorough research, genealogists keep track of which records have been destroyed so they know when information they need may be missing. Of particular note for North American genealogy is the 1890 United States Census, which was destroyed in a fire in 1921. Although fragments survive, most of the 1890 census no longer exists. Those looking for genealogical information for families that lived in the United States in 1890 must rely on other information to fill that gap. War is another cause of record destruction. During World War II, many European records were destroyed. Communists in China during the Cultural Revolution and in Korea during the Korean War destroyed genealogy books kept by families. Often records are destroyed due to accident or neglect. Since genealogical records are often kept on paper and stacked in high-density storage, they are prone to fire, mold, insect damage, and eventual disintegration. Sometimes records of genealogical value are deliberately destroyed by governments or organizations because the records are considered to be unimportant or a privacy risk. Because of this, genealogists often organize efforts to preserve records that are at risk of destruction. FamilySearch has an ongoing program that assesses what useful genealogical records have the most risk of being destroyed, and sends volunteers to digitize such records. In 2017, the government of Sierra Leone asked FamilySearch for help preserving their rapidly deteriorating vital records. 
FamilySearch has begun digitizing the records and making them available online. The Federation of Genealogical Societies also organized an effort to preserve and digitize United States War of 1812 pension records. In 2010, they began raising funds, which were contributed by genealogists around the United States and matched by Ancestry.com. Their goal was achieved, and digitization began. The digitized records are available for free online. Types of information Genealogists who seek to reconstruct the lives of each ancestor consider all historical information to be "genealogical" information. Traditionally, the basic information needed to ensure correct identification of each person consists of place names, occupations, family names, first names, and dates. However, modern genealogists greatly expand this list, recognizing the need to place this information in its historical context in order to properly evaluate genealogical evidence and distinguish between same-name individuals. A great deal of information is available for British ancestry, with growing resources for other ethnic groups. Family names Family names are simultaneously one of the most important pieces of genealogical information and a source of significant confusion for researchers. In many cultures, the name of a person refers to the family to which he or she belongs. This is called the family name, surname, or last name. Patronymics are names that identify an individual based on the father's name. For example, Marga Olafsdottir is Marga, daughter of Olaf, and Olaf Thorsson is Olaf, son of Thor. Many cultures used patronymics before surnames were adopted or came into use. The Dutch in New York, for example, used the patronymic system of names until 1687, when the advent of English rule mandated surname usage. In Iceland, patronymics are used by a majority of the population. 
In Denmark and Norway patronymics and farm names were generally in use through the 19th century and beyond, though surnames began to come into fashion toward the end of the 19th century in some parts of the country. Not until 1856 in Denmark and 1923 in Norway were there laws requiring surnames. The transmission of names across generations, marriages and other relationships, and immigration may cause difficulty in genealogical research. For instance, women in many cultures have routinely used their spouse's surnames. When a woman remarried, she may have changed her name and the names of her children; only her name; or changed no names. Her birth name (maiden name) may be reflected in her children's middle names; her own middle name; or dropped entirely. Children may sometimes assume stepparent, foster parent, or adoptive parent names. Because official records may reflect many kinds of surname change, without explaining the underlying reason for the change, correctly identifying a person recorded under more than one name is challenging. Immigrants to America often Americanized their names. Surname data may be found in trade directories, census returns, birth, death, and marriage records. Given names Genealogical data regarding given names (first names) is subject to many of the same problems as are family names and place names. Additionally, the use of nicknames is very common. For example, Beth, Lizzie or Betty are all common for Elizabeth, and Jack, John and Jonathan may be interchanged. Middle names provide additional information. Middle names may be inherited, follow naming customs, or be treated as part of the family name. For instance, in some Latin cultures, both the mother's family name and the father's family name are used by the children. Historically, naming traditions existed in some places and cultures. Even in areas that tended to use naming conventions, however, they were by no means universal. 
Families may have used them some of the time, among some of their children, or not at all. A pattern might also be broken to name a newborn after a recently deceased sibling, aunt or uncle. An example of a naming tradition from England, Scotland and Ireland: the first son was often named after the father's father, the second son after the mother's father, and the third son after the father; the first daughter after the mother's mother, the second daughter after the father's mother, and the third daughter after the mother. Another example is in some areas of Germany, where siblings were given the same first name, often of a favourite saint or local nobility, but different second names by which they were known (Rufname). If a child died, the next child of the same gender that was born may have been given the same name. It is not uncommon that a list of a particular couple's children will show one or two names repeated. Personal names have periods of popularity, so it is not uncommon to find many similarly named people in a generation, and even similarly named families; e.g., "William and Mary and their children David, Mary, and John". Many names may be identified strongly with a particular gender; e.g., William for boys, and Mary for girls. Others may be ambiguous, e.g., Lee, or have only slightly variant spellings based on gender, e.g., Frances (usually female) and Francis (usually male). Place names While the locations of ancestors' residences and life events are core elements of the genealogist's quest, they can often be confusing. Place names may be subject to variant spellings by partially literate scribes. Locations may have identical or very similar names. For example, the village name Brockton occurs six times in the border area between the English counties of Shropshire and Staffordshire. Shifts in political borders must also be understood. Parish, county, and national borders have frequently been modified. Old records may contain references to farms and villages that have ceased to exist. 
When working with older records from Poland, where borders and place names have changed frequently in past centuries, a source with maps and sample records such as A Translation Guide to 19th-Century Polish-Language Civil-Registration Documents can be invaluable. Available sources may include vital records (civil or church registration), censuses, and tax assessments. Oral tradition is also an important source, although it must be used with caution. When no source information is available for a location, circumstantial evidence may provide a probable answer based on a person's or a family's place of residence at the time of the event. Maps and gazetteers are important sources for understanding the places researched. They show the relationship of an area to neighboring communities and may be of help in understanding migration patterns. Family tree mapping using online mapping tools such as Google Earth (particularly when used with historical map overlays such as those from the David Rumsey Historical Map Collection) assists in the process of understanding the significance of geographical locations. Dates It is wise to exercise extreme caution with dates. Dates are more difficult to recall years after an event, and are more easily mistranscribed than other types of genealogical data. Therefore, one should determine whether the date was recorded at the time of the event or at a later date. Dates of birth in vital records or civil registrations and in church records at baptism are generally accurate because they were usually recorded near the time of the event. Family Bibles are often a source for dates, but can be written from memory long after the event. When the same ink and handwriting is used for all entries, the dates were probably written at the same time and therefore will be less reliable, since the earlier dates were probably recorded well after the event. 
The publication date of the Bible also provides a clue about when the dates were recorded since they could not have been recorded at any earlier date. People sometimes reduce their age on marriage, and those under "full age" may increase their age in order to marry or to join the armed forces. Census returns are notoriously unreliable for ages or for assuming an approximate death date. Ages over 15 in the 1841 census in the UK are rounded down to the next lower multiple of five years. Although baptismal dates are often used to approximate birth dates, some families waited years before baptizing children, and adult baptisms are the norm in some religions. Both birth and marriage dates may have been adjusted to cover for pre-wedding pregnancies. Calendar changes must also be considered. In 1752, England and her American colonies changed from the Julian to the Gregorian calendar. In the same year, the date the new year began was changed. Prior to 1752 it was 25 March; this was changed to 1 January. Many other European countries had already made the calendar changes before England had, sometimes centuries earlier. By 1751 there was an 11-day discrepancy between the date in England and the date in other European countries. For further detail on the changes involved in moving from the Julian to the Gregorian calendar, see: Gregorian calendar. The French Republican Calendar or French Revolutionary Calendar was a calendar proposed during the French Revolution, and used by the French government for about 12 years from late 1793 to 1805, and for 18 days in 1871 in Paris. Dates in official records at this time use the revolutionary calendar and need "translating" into the Gregorian calendar for calculating ages etc. There are various websites which do this. Occupations Occupational information may be important to understanding an ancestor's life and for distinguishing two people with the same name. 
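The Old Style/New Style "translation" described in the Dates section above can also be scripted. A minimal, illustrative sketch (the helper name is hypothetical; it assumes the 11-day offset that applied between 1700 and the English changeover of 1752, and ignores the 25 March year-start complication):

```python
from datetime import date, timedelta

def julian_to_gregorian(year, month, day):
    """Shift a Julian ('Old Style') date to the Gregorian ('New Style')
    calendar for the years 1700-1752, when the two calendars differed
    by 11 days. Earlier centuries need a smaller offset, and dual-dated
    years (1 January - 24 March) also need the year adjusted."""
    return date(year, month, day) + timedelta(days=11)

# George Washington was born 11 February 1731/32 Old Style;
# the New Style equivalent is 22 February 1732.
print(julian_to_gregorian(1732, 2, 11))  # -> 1732-02-22
```

Dates in the French Republican calendar need a different table-driven conversion; dedicated genealogy software and several websites handle both cases.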
A person's occupation may have been related to his or her social status, political interest, and migration pattern. Since skilled trades are often passed from father to son, occupation may also be indirect evidence of a family relationship. It is important to remember that a person may change occupations, and that titles change over time as well. Some workers no longer fit for their primary trade often took less prestigious jobs later in life, while others moved upwards in prestige. Many unskilled ancestors had a variety of jobs depending on the season and local trade requirements. Census returns may contain some embellishment; e.g., from labourer to mason, or from journeyman to master craftsman. Names for old or unfamiliar local occupations may cause confusion if poorly legible. For example, an ostler (a keeper of horses) and a hostler (an innkeeper) could easily be confused for one another. Likewise, descriptions of such occupations may also be problematic. The perplexing description "ironer of rabbit burrows" may turn out to describe an ironer (profession) in the Bristol district named Rabbit Burrows. Several trades have regionally preferred terms. For example, "shoemaker" and "cordwainer" have the same meaning. Finally, many apparently obscure jobs are part of a larger trade community, such as watchmaking, framework knitting or gunmaking. Occupational data may be reported in occupational licences, tax assessments, membership records of professional organizations, trade directories, census returns, and vital records (civil registration). Occupational dictionaries are available to explain many obscure and archaic trades. Reliability of sources Information found in historical or genealogical sources can be unreliable and it is good practice to evaluate all sources with a critical eye. 
Factors influencing the reliability of genealogical information include: the knowledge of the informant (or writer); the bias and mental state of the informant (or writer); the passage of time; and the potential for copying and compiling errors. The quality of census data has been of special interest to historians, who have investigated its reliability (Richard H. Steckel, "The Quality of Census Data for Historical Inquiry: A Research Agenda," Social Science History, vol. 15, no. 4 (Winter 1991), pp. 579–599). Knowledge of the informant The informant is the individual who provided the recorded information. Genealogists must carefully consider who provided the information and what he or she knew. In many cases the informant is identified in the record itself. For example, a death certificate usually has two informants: a physician who provides information about the time and cause of death and a family member who provides the birth date, names of parents, etc. When the informant is not identified, one can sometimes deduce information about the identity of the person by careful examination of the source. One should first consider who was alive (and nearby) when the record was created. When the informant is also the person recording the information, the handwriting can be compared to other handwriting samples. When a source does not provide clues about the informant, genealogists should treat the source with caution. These sources can be useful if they can be compared with independent sources. For example, a census record by itself cannot be given much weight because the informant is unknown. However, when censuses for several years concur on a piece of information that would not likely be guessed by a neighbor, it is likely that the information in these censuses was provided by a family member or other informed person. 
On the other hand, information in a single census cannot be confirmed by information in an undocumented compiled genealogy, since the genealogy may have used the census record as its source and might therefore be dependent on the same misinformed individual. Motivation of the informant Even individuals who had knowledge of the facts sometimes provided false or misleading information, whether intentionally or unintentionally. A person may have lied in order to obtain a government benefit (such as a military pension), avoid taxation, or cover up an embarrassing situation (such as the existence of a non-marital child). A person with a distressed state of mind may not be able to accurately recall information. Many genealogical records were recorded at the time of a loved one's death, and so genealogists should consider the effect that grief may have had on the informant of these records. The effect of time The passage of time often affects a person's ability to recall information. Therefore, as a general rule, data recorded soon after the event are usually more reliable than data recorded many years later. However, some types of data are more difficult to recall after many years than others. One type especially prone to recollection errors is dates. The ability to recall is also affected by the significance that the event had to the individual, which may in turn have been shaped by cultural or individual preferences. Copying and compiling errors Genealogists must consider the effects that copying and compiling errors may have had on the information in a source. For this reason, sources are generally divided into two categories: original and derivative. An original source is one that is not based on another source. A derivative source is information taken from another source. This distinction is important because each time a source is copied, information about the record may be lost and errors may result from the copyist misreading, mistyping, or miswriting the information. 
Genealogists should consider the number of times information has been copied and the types of derivation a piece of information has undergone. The types of derivatives include: photocopies, transcriptions, abstracts, translations, extractions, and compilations. In addition to copying errors, compiled sources (such as published genealogies and online pedigree databases) are susceptible to misidentification errors and incorrect conclusions based on circumstantial evidence. Identity errors usually occur when two or more individuals are assumed to be the same person. Circumstantial or indirect evidence does not explicitly answer a genealogical question, but either may be used with other sources to answer the question, suggest a probable answer, or eliminate certain possibilities. Compilers sometimes draw hasty conclusions from circumstantial evidence without sufficiently examining all available sources, without properly understanding the evidence, and without appropriately indicating the level of uncertainty. Primary and secondary sources In genealogical research, information can be obtained from primary or secondary sources. Primary sources are records that were made at the time of the event, for example a death certificate would be a primary source for a person's death date and place. Secondary sources are records that are made days, weeks, months, or even years after an event. Standards and ethics Organizations that educate and certify genealogists have established standards and ethical guidelines they instruct genealogists to follow. Research standards Genealogy research requires analyzing documents and drawing conclusions based on the evidence provided in the available documents. Genealogists need standards to determine whether or not their evaluation of the evidence is accurate. In the past, genealogists in the United States borrowed terms from judicial law to examine evidence found in documents and how they relate to the researcher's conclusions. 
However, the differences between the two disciplines created a need for genealogists to develop their own standards. In 2000, the Board for Certification of Genealogists published their first manual of standards. The Genealogical Proof Standard created by the Board for Certification of Genealogists is widely distributed in seminars, workshops, and educational materials for genealogists in the United States. Other genealogical organizations around the world have created similar standards they invite genealogists to follow. Such standards provide guidelines for genealogists to evaluate their own research as well as the research of others. Standards for genealogical research include:
- Clearly document and organize findings.
- Cite all sources in a specific manner so that others can locate them and properly evaluate them.
- Locate all available sources that may contain information relevant to the research question.
- Analyze findings thoroughly, without ignoring conflicts in records or negative evidence.
- Rely on original, rather than derivative, sources wherever possible.
- Use logical reasoning based on reliable sources to reach conclusions.
- Acknowledge when a specific conclusion is only "possible" or "probable" rather than "proven."
- Acknowledge that other records that have not yet been discovered may overturn a conclusion.
Ethical guidelines Genealogists often handle sensitive information and share and publish such information. Because of this, there is a need for ethical standards and boundaries for when information is too sensitive to be published. Historically, some genealogists have fabricated information or have otherwise been untrustworthy. Genealogical organizations around the world have outlined ethical standards as an attempt to eliminate such problems. Ethical standards adopted by various genealogical organizations include:
- Respect copyright laws.
- Acknowledge where one consulted another's work and do not plagiarize the work of other researchers.
- Treat original records with respect and avoid causing damage to them or removing them from repositories.
- Treat archives and archive staff with respect.
- Protect the privacy of living individuals by not publishing or otherwise disclosing information about them without their permission.
- Disclose any conflicts of interest to clients.
- When doing paid research, be clear with the client about the scope of the research and the fees involved.
- Do not fabricate information or publish false or unproven information as proven.
- Be sensitive about information found through genealogical research that may make the client or family members uncomfortable.
In 2015, a committee presented standards for genetic genealogy at the Salt Lake Institute of Genealogy. The standards emphasize that genealogists and testing companies should respect the privacy of clients and recognize the limits of DNA tests; they also discuss how genealogists should thoroughly document conclusions made using DNA evidence. In 2019, the Board for the Certification of Genealogists officially updated their standards and code of ethics to include standards for genetic genealogy. See also References Further reading General Hopwood, Nick, Rebecca Flemming, and Lauren Kassell, eds. Reproduction: Antiquity to the Present Day (Cambridge UP, 2018), xxxv + 730 pp.; 44 scholarly essays by historians. British Isles Kriesberg, Adam. "The future of access to public records? Public–private partnerships in US state and territorial archives." Archival Science 17.1 (2017): 5–25. Continental Europe Volkmar Weiss, German Genealogy in Its Social and Political Context (KDP, 2020). External links Kinship and descent
https://en.wikipedia.org/wiki/Gabon
Gabon
Gabon, officially the Gabonese Republic, is a country on the west coast of Central Africa. Located on the equator, Gabon is bordered by Equatorial Guinea to the northwest, Cameroon to the north, the Republic of the Congo on the east and south, and the Gulf of Guinea to the west. It has an area of nearly and its population is estimated at million people. There are three distinct regions: the coastal plains, the mountains (the Cristal Mountains and the Chaillu Massif in the centre), and the savanna in the east. Gabon's capital and largest city is Libreville. The official language is French. The area was originally settled by Pygmy peoples, who were largely replaced and absorbed by Bantu tribes as the latter migrated into the region. By the 18th century, a Myeni-speaking kingdom known as the Kingdom of Orungu formed in Gabon. It was able to become a powerful trading center mainly due to its ability to purchase and sell slaves. The kingdom fell with the demise of the slave trade in the 1870s. Since its independence from France in 1960, the sovereign state of Gabon has had three presidents. In the early 1990s, Gabon introduced a multi-party system and a new democratic constitution that allowed for a more transparent electoral process and reformed many governmental institutions. Abundant petroleum and foreign private investment have helped make Gabon one of the most prosperous countries in Sub-Saharan Africa, with the fifth highest HDI in the region (after Mauritius, Seychelles, Botswana and South Africa) and the fifth highest GDP per capita (PPP) in all of Africa (after Seychelles, Mauritius, Equatorial Guinea and Botswana). Its GDP grew by more than 6% per year from 2010 to 2012. However, because of inequality in income distribution, a significant proportion of the population remains poor. Gabon is rich in folklore and mythology. "Raconteurs" keep traditions alive, such as the mvett among the Fangs and the ingwala among the Nzebis. 
Gabon is also known for its masks, such as the n'goltang (Fang) and the reliquary figures of the Kota. Etymology Gabon's name originates from gabão, Portuguese for "cloak", which is roughly the shape of the estuary of the Komo River by Libreville. History Pre-Colonial Era (pre-1885) The earliest inhabitants of the area were Pygmy peoples. They were largely replaced and absorbed by Bantu tribes as they migrated. In the 15th century, the first Europeans arrived. By the 18th century, a Myeni-speaking kingdom known as Orungu formed in Gabon. Through its control of the slave trade in the 18th and 19th centuries, it was able to become the most powerful of the trading centers that developed in Gabon during that period. On 10 February 1722, Bartholomew Roberts, Barti Ddu, a Welsh pirate known in English as Black Bart, died at sea off Cape Lopez. He raided ships off the Americas and West Africa from 1719 to 1722. Colonial Era (1885–1960) French explorer Pierre Savorgnan de Brazza led his first mission to the Gabon-Congo area in 1875. He founded the town of Franceville, and was later colonial governor. Several Bantu groups lived in the area that is now Gabon when France officially occupied it in 1885. In 1910, Gabon became one of the four territories of French Equatorial Africa, a federation that survived until 1958. In World War II, the Allies invaded Gabon in order to overthrow the pro-Vichy France colonial administration. On 28 November 1958, Gabon became an autonomous republic within the French Community, and on 17 August 1960, it became fully independent. Post-Independence (1960–present) The first president of Gabon, elected in 1961, was Léon M'ba, with Omar Bongo Ondimba as his vice president. 
After M'ba's accession to power, the press was suppressed, political demonstrations banned, freedom of expression curtailed, other political parties gradually excluded from power, and the Constitution changed along French lines to vest power in the Presidency, a post that M'ba assumed himself. However, when M'ba dissolved the National Assembly in January 1964 to institute one-party rule, an army coup sought to oust him from power and restore parliamentary democracy. French paratroopers flew in within 24 hours to restore M'ba to power. After a few days of fighting, the coup ended and the opposition was imprisoned, despite widespread protests and riots. French soldiers still remain in the Camp de Gaulle on the outskirts of Gabon's capital to this day. When M'Ba died in 1967, Bongo replaced him as president. In March 1968, Bongo declared Gabon a one-party state by dissolving the BDG and establishing a new party—the Parti Democratique Gabonais (PDG). He invited all Gabonese, regardless of previous political affiliation, to participate. Bongo sought to forge a single national movement in support of the government's development policies, using the PDG as a tool to submerge the regional and tribal rivalries that had divided Gabonese politics in the past. Bongo was elected president in February 1975; in April 1975, the position of vice president was abolished and replaced by the position of prime minister, who had no right to automatic succession. Bongo was re-elected President in both December 1979 and November 1986 to 7-year terms. In early 1990 economic discontent and a desire for political liberalization provoked violent demonstrations and strikes by students and workers. In response to grievances by workers, Bongo negotiated with them on a sector-by-sector basis, making significant wage concessions. In addition, he promised to open up the PDG and to organize a national political conference in March–April 1990 to discuss Gabon's future political system. 
The PDG and 74 political organizations attended the conference. Participants essentially divided into two loose coalitions, the ruling PDG and its allies, and the United Front of Opposition Associations and Parties, consisting of the breakaway Morena Fundamental and the Gabonese Progress Party. The April 1990 conference approved sweeping political reforms, including creation of a national Senate, decentralization of the budgetary process, freedom of assembly and press, and cancellation of an exit visa requirement. In an attempt to guide the political system's transformation to multiparty democracy, Bongo resigned as PDG chairman and created a transitional government headed by a new Prime Minister, Casimir Oye-Mba. The Gabonese Social Democratic Grouping (RSDG), as the resulting government was called, was smaller than the previous government and included representatives from several opposition parties in its cabinet. The RSDG drafted a provisional constitution in May 1990 that provided a basic bill of rights and an independent judiciary but retained strong executive powers for the president. After further review by a constitutional committee and the National Assembly, this document came into force in March 1991. Opposition to the PDG continued after the April 1990 conference, however, and in September 1990, two coup d'état attempts were uncovered and aborted. Despite anti-government demonstrations after the untimely death of an opposition leader, the first multiparty National Assembly elections in almost 30 years took place in September–October 1990, with the PDG garnering a large majority. Following President Omar Bongo's re-election in December 1993 with 51% of the vote, opposition candidates refused to validate the election results. Serious civil disturbances and violent repression led to an agreement between the government and opposition factions to work toward a political settlement. 
These talks led to the Paris Accords in November 1994, under which several opposition figures were included in a government of national unity. This arrangement soon broke down, however, and the 1996 and 1997 legislative and municipal elections provided the background for renewed partisan politics. The PDG won a landslide victory in the legislative election, but several major cities, including Libreville, elected opposition mayors during the 1997 local election. Facing a divided opposition, President Omar Bongo coasted to easy re-election in December 1998, with large majorities of the vote. While Bongo's major opponents rejected the outcome as fraudulent, some international observers characterized the results as representative despite many perceived irregularities, and there were none of the civil disturbances that followed the 1993 election. Peaceful though flawed legislative elections held in 2001–2002, which were boycotted by a number of smaller opposition parties and were widely criticized for their administrative weaknesses, produced a National Assembly almost completely dominated by the PDG and allied independents. In November 2005 President Omar Bongo was elected for his sixth term. He won re-election easily, but opponents claim that the balloting process was marred by irregularities. There were some instances of violence following the announcement of his win, but Gabon generally remained peaceful. National Assembly elections were held again in December 2006. Several seats contested because of voting irregularities were overturned by the Constitutional Court, but the subsequent run-off elections in early 2007 again yielded a PDG-controlled National Assembly. On 8 June 2009, President Omar Bongo died of cardiac arrest at a Spanish hospital in Barcelona, ushering in a new era in Gabonese politics. In accordance with the amended constitution, Rose Francine Rogombé, the President of the Senate, became Interim President on 10 June 2009. 
The first contested elections in Gabon's history that did not include Omar Bongo as a candidate were held on 30 August 2009, with 18 candidates for president. The lead-up to the elections saw some isolated protests, but no significant disturbances. Omar Bongo's son, ruling party leader Ali Bongo Ondimba, was formally declared the winner after a 3-week review by the Constitutional Court; his inauguration took place on 16 October 2009. The court's review had been prompted by claims of fraud by the many opposition candidates, with the initial announcement of election results sparking unprecedented violent protests in Port-Gentil, the country's second-largest city and a long-time bastion of opposition to PDG rule. The citizens of Port-Gentil took to the streets, and numerous shops and residences were burned, including the French Consulate and a local prison. Officially, only four deaths occurred during the riots, but opposition and local leaders claim many more. Gendarmes and the military were deployed to Port-Gentil to support the beleaguered police, and a curfew was in effect for more than three months. A partial legislative by-election was held in June 2010. A newly created coalition of parties, the Union Nationale (UN), participated for the first time. The UN is composed largely of PDG defectors who left the party after Omar Bongo's death. Of the five hotly contested seats, the PDG won three and the UN won two; both sides claimed victory. In January 2019, there was an attempted coup d'état led by soldiers against the President Ali Bongo; the coup ultimately failed. Government and politics Gabon is a republic with a presidential form of government under the 1961 constitution (revised in 1975, rewritten in 1991, and revised in 2003). The president is elected by universal suffrage for a seven-year term; a 2003 constitutional amendment removed presidential term limits and facilitated a presidency for life. 
The president can appoint and dismiss the prime minister, the cabinet, and judges of the independent Supreme Court. The president also has other strong powers, such as authority to dissolve the National Assembly, declare a state of siege, delay legislation, and conduct referenda. Gabon has a bicameral legislature with a National Assembly and Senate. The National Assembly has 120 deputies who are popularly elected for a 5-year term. The Senate is composed of 102 members who are elected by municipal councils and regional assemblies and serve for 6 years. The Senate was created in the 1990–1991 constitutional revision, although it was not brought into being until after the 1997 local elections. The President of the Senate is next in succession to the President. Despite the democratic system of government, the Freedom in the World report lists Gabon as "not free", and the 2016 elections were disputed. Political culture In 1990, the government made major changes to Gabon's political system. A transitional constitution was drafted in May 1990 as an outgrowth of the national political conference in March–April and later revised by a constitutional committee. Among its provisions were a Western-style bill of rights, creation of a National Council of Democracy to oversee the guarantee of those rights, a governmental advisory board on economic and social issues, and an independent judiciary. After approval by the National Assembly, the PDG Central Committee, and the President, the Assembly unanimously adopted the constitution in March 1991. Multiparty legislative elections were held in 1990–91, even though opposition parties had not yet been formally legalized. Nevertheless, the elections produced the first representative, multiparty National Assembly. In January 1991, the Assembly passed by unanimous vote a law governing the legalization of opposition parties. 
After President Omar Bongo was re-elected in 1993 in a disputed election, winning only 51% of the votes cast, social and political disturbances led to the 1994 Paris Conference and Accords. These provided a framework for the next elections. Local and legislative elections were delayed until 1996–97. In 1997, constitutional amendments put forward years earlier were adopted to create the Senate and the position of vice president, as well as to extend the president's term to seven years. In October 2009, newly elected President Ali Bongo Ondimba began efforts to streamline the government. In an effort to reduce corruption and government bloat, he eliminated 17 minister-level positions, abolished the vice presidency and reorganized the portfolios of numerous ministries, bureaus and directorates. In November 2009, President Bongo Ondimba announced a new vision for the modernization of Gabon, called "Gabon Emergent". This program contains three pillars: Green Gabon, Service Gabon, and Industrial Gabon. The goals of Gabon Emergent are to diversify the economy so that Gabon becomes less reliant on petroleum, to eliminate corruption, and to modernize the workforce. Under this program, exports of raw timber have been banned, a government-wide census was held, the work day has been changed to eliminate a long midday break, and a national oil company was created. In provisional results, the ruling Gabonese Democratic Party (PDG) won 84 out of 120 parliamentary seats. On 25 January 2011, opposition leader André Mba Obame claimed the presidency, saying the country should be run by someone the people really wanted. He also selected 19 ministers for his government, and the entire group, along with hundreds of others, spent the night at UN headquarters. On 26 January, the government dissolved Mba Obame's party. AU chairman Jean Ping said that Mba Obame's action "hurts the integrity of legitimate institutions and also endangers the peace, the security and the stability of Gabon." 
Interior Minister Jean-François Ndongou accused Mba Obame and his supporters of treason. The UN Secretary-General, Ban Ki-moon, said that he recognized Ondimba as the only official Gabonese president. The 2016 presidential election was disputed, with very close official results reported. Protests broke out in the capital and were met with brutal repression, which culminated in the alleged bombing of opposition party headquarters by the presidential guard. Between 50 and 100 citizens were killed by security forces and 1,000 arrested. International observers criticized irregularities, including unnaturally high turnout reported for some districts. The country's supreme court threw out some suspect precincts, but a full recount was not possible because ballots had been destroyed. The election was declared in favor of the incumbent Ondimba. The European Parliament issued two resolutions denouncing the unclear results of the election and calling for an independent investigation into the human rights violations. Foreign relations Since independence, Gabon has followed a nonaligned policy, advocating dialogue in international affairs and recognizing each side of divided countries. In intra-African affairs, Gabon espouses development by evolution rather than revolution and favors regulated private enterprise as the system most likely to promote rapid economic growth. Gabon played an important leadership role in the stability of Central Africa through involvement in mediation efforts in Chad, the Central African Republic, Angola, the Republic of the Congo, the Democratic Republic of the Congo (D.R.C.), and Burundi. In December 1999, through the mediation efforts of President Bongo, a peace accord was signed in the Republic of the Congo (Brazzaville) between the government and most leaders of an armed rebellion. President Bongo was also involved in the continuing D.R.C. peace process, and played a role in mediating the crisis in Ivory Coast. 
Gabonese armed forces were also an integral part of the Central African Economic and Monetary Community (CEMAC) mission to the Central African Republic. Gabon is a member of the United Nations (UN) and some of its specialized and related agencies, as well as of the World Bank; the IMF; the African Union (AU); the Central African Customs Union/Central African Economic and Monetary Community (UDEAC/CEMAC); EU/ACP association under the Lomé Convention; the Communaute Financiere Africaine (CFA); the Organization of the Islamic Conference (OIC); the Nonaligned Movement; and the Economic Community of Central African States (ECCAS/CEEAC), among others. In 1995, Gabon withdrew from the Organization of the Petroleum Exporting Countries (OPEC), rejoining in 2016. Gabon was elected to a non-permanent seat on the United Nations Security Council for January 2010 through December 2011 and held the rotating presidency in March 2010. Military Gabon has a small, professional military of about 5,000 personnel, divided into army, navy, air force, gendarmerie, and police force. A 1,800-member guard provides security for the president. Administrative divisions Gabon is divided into nine provinces, which are further subdivided into 50 departments. The president appoints the provincial governors, the prefects, and the subprefects. The provinces are (capitals in parentheses): Estuaire (Libreville) Haut-Ogooué (Franceville) Moyen-Ogooué (Lambaréné) Ngounié (Mouila) Nyanga (Tchibanga) Ogooué-Ivindo (Makokou) Ogooué-Lolo (Koulamoutou) Ogooué-Maritime (Port-Gentil) Woleu-Ntem (Oyem) Geography Gabon is located on the Atlantic coast of central Africa on the equator, between latitudes 3°N and 4°S, and longitudes 8° and 15°E. Gabon generally has an equatorial climate with an extensive system of rainforests, with 89.3% of its land area forested. 
There are three distinct regions: the coastal plains extending inland from the ocean's shore, the mountains (the Cristal Mountains to the northeast of Libreville, the Chaillu Massif in the centre), and the savanna in the east. The coastal plains form a large section of the World Wildlife Fund's Atlantic Equatorial coastal forests ecoregion and contain patches of Central African mangroves especially on the Muni River estuary on the border with Equatorial Guinea. Geologically, Gabon is primarily ancient Archean and Paleoproterozoic igneous and metamorphic basement rock, belonging to the stable continental crust of the Congo Craton, a remnant section of extremely old continental crust. Some formations are more than two billion years old. Ancient rock units are overlain by marine carbonate, lacustrine and continental sedimentary rocks as well as unconsolidated sediments and soils that formed in the last 2.5 million years of the Quaternary. The rifting apart of the supercontinent Pangaea created rift basins that filled with sediments and formed the hydrocarbons which are now a keystone of the Gabonese economy. Gabon is notable for the Oklo reactor zones, the only known natural nuclear fission reactor on Earth, which was active two billion years ago. The site was discovered during uranium mining in the 1970s to supply the French nuclear power industry. Gabon's largest river is the Ogooué. Gabon has three karst areas where there are hundreds of caves located in the dolomite and limestone rocks. Some of the caves include Grotte du Lastoursville, Grotte du Lebamba, Grotte du Bongolo, and Grotte du Kessipougou. Many caves have not been explored yet. A National Geographic Expedition visited the caves in the summer of 2008 to document them. Gabon is also noted for efforts to preserve the natural environment. 
In 2002, President Omar Bongo Ondimba designated roughly 10% of the nation's territory to be part of its national park system (with 13 parks in total), one of the largest proportions of nature parkland in the world. The National Agency for National Parks manages Gabon's national park system. Gabon had a 2018 Forest Landscape Integrity Index mean score of 9.07/10, ranking it 9th globally out of 172 countries. Natural resources include petroleum, manganese, iron, gold, uranium, and forests. Wildlife Economy Gabon's economy is dominated by oil. Oil revenues constitute roughly 46% of the government's budget, 43% of the gross domestic product (GDP), and 81% of exports. Oil production is currently declining rapidly from its high point of 370,000 barrels per day in 1997. Some estimates suggest that Gabonese oil will be depleted by 2025. In spite of the decreasing oil revenues, planning is only now beginning for an after-oil scenario. The Grondin Oil Field was discovered offshore in 1971 and produces from the Batanga sandstones of Maastrichtian age, which form an anticline salt structural trap. Gabonese public expenditures from the years of significant oil revenues were not spent efficiently. Overspending on the Trans-Gabon Railway, the CFA franc devaluation of 1994, and periods of low oil prices caused serious debt problems that still plague the country. Gabon earned a poor reputation with the Paris Club and the International Monetary Fund (IMF) over the management of its debt and revenues. Successive IMF missions have criticized the government for overspending on off-budget items (in good years and bad), over-borrowing from the Central Bank, and slipping on the schedule for privatization and administrative reform. However, in September 2005 Gabon successfully concluded a 15-month Stand-By Arrangement with the IMF. Another 3-year Stand-By Arrangement with the IMF was approved in May 2007. 
Because of the financial crisis and social developments surrounding the death of President Omar Bongo and the elections, Gabon was unable to meet its economic goals under the Stand-By Arrangement in 2009. Negotiations with the IMF were ongoing. Gabon's oil revenues have given it a per capita GDP of $8,600, unusually high for the region. However, a skewed income distribution and poor social indicators are evident. The richest 20% of the population earn over 90% of the income while about a third of the Gabonese population lives in poverty. The economy is highly dependent on extraction, but primary materials are abundant. Before the discovery of oil, logging was the pillar of the Gabonese economy. Today, logging and manganese mining are the next-most-important income generators. Recent explorations suggest the presence of the world's largest unexploited iron ore deposit. For many who live in rural areas without access to employment opportunity in extractive industries, remittances from family members in urban areas or subsistence activities provide income. Foreign and local observers have lamented the lack of diversity in the Gabonese economy. Various factors have so far limited the development of new industries:
the market is small, about a million people
the country is dependent on imports from France
it has been unable to capitalize on regional markets
entrepreneurial zeal is not always present among the Gabonese
there is a fairly regular stream of oil "rent", even if it is diminishing
Further investment in the agricultural or tourism sectors is complicated by poor infrastructure. The small processing and service sectors that do exist are largely dominated by a few prominent local investors. At World Bank and IMF insistence, the government embarked in the 1990s on a program of privatization of its state-owned companies and administrative reform, including reducing public sector employment and salary growth, but progress has been slow. 
The new government has voiced a commitment to work toward an economic transformation of the country but faces significant challenges to realize this goal. Demographics Gabon has a population of approximately two million. Historical and environmental factors caused Gabon's population to decline between 1900 and 1940. Gabon has one of the lowest population densities of any country in Africa, and the fourth highest Human Development Index in Sub-Saharan Africa. Ethnic groups Almost all Gabonese are of Bantu origin. Gabon has at least forty ethnic groups with differing languages and cultures, including the Fang, Myènè, Punu-Échira, Nzebi-Adouma, Teke-Mbete, Mèmbè, Kota, and Akélé. There are also various indigenous Pygmy peoples: the Bongo and the Baka. The latter speak the only non-Bantu language in Gabon. More than 10,000 French nationals live in Gabon, including an estimated 2,000 dual nationals. Ethnic boundaries are less sharply drawn in Gabon than elsewhere in Africa. Most ethnicities are spread throughout Gabon, leading to constant contact and interaction among the groups, and there is no ethnic tension. One important reason for this is that intermarriage is extremely common and every Gabonese person is connected by blood to many different tribes. Indeed, intermarriage is often required because many tribes prohibit marriage within the same tribe, regarding it as incest. This is because those tribes consist of the descendants of a specific ancestor, and therefore all members of the tribe are regarded as close kin to each other. French, the language of its former colonial ruler, is a unifying force. The Democratic Party of Gabon (PDG)'s historical dominance also has served to unite various ethnicities and local interests into a larger whole. Population centres Languages French is the country's sole official language. It is estimated that 80% of Gabon's population can speak French, and that 30% of Libreville residents are native speakers of the language. 
Nationally, Gabonese people speak their various mother tongues according to their ethnic group. The 2013 census found that only 63.7% of Gabon's population could speak a Gabonese language, with 86.3% in rural areas and 60.5% in urban areas speaking at least one national language. In October 2012, just before the 14th summit of the Organisation internationale de la Francophonie, the country declared an intention to add English as a second official language, reportedly in response to an investigation by France into corruption in the African country, though a government spokesman insisted it was for practical reasons only. It was later clarified that the country intended to introduce English as a first foreign language in schools, while keeping French as the general medium of instruction and the sole official language. Religion Major religions practiced in Gabon include Christianity (Roman Catholicism and Protestantism), Bwiti, Islam, and indigenous animistic religion. Many persons practice elements of both Christianity and traditional indigenous religious beliefs. Approximately 73 percent of residents practice at least some elements of Christianity, including the syncretistic Bwiti; 12 percent practice Islam; 10 percent practice traditional indigenous religious beliefs exclusively; and 5 percent practice no religion or are atheists. A vivid description of taboos and magic is provided by Schweitzer. Health Most of the health services of Gabon are public, but there are some private institutions, of which the best known is the hospital established in 1913 in Lambaréné by Albert Schweitzer. Gabon's medical infrastructure is considered one of the best in West Africa. By 1985 there were 28 hospitals, 87 medical centers, and 312 infirmaries and dispensaries. There were an estimated 29 physicians per 100,000 people, and approximately 90% of the population had access to health care services. 
In 2000, 70% of the population had access to safe drinking water and 21% had adequate sanitation. A comprehensive government health program treats such diseases as leprosy, sleeping sickness, malaria, filariasis, intestinal worms, and tuberculosis. Rates for immunization of children under the age of one were 97% for tuberculosis and 65% for polio. Immunization rates for DPT and measles were 37% and 56% respectively. Gabon has a domestic supply of pharmaceuticals from a factory in Libreville. The total fertility rate has decreased from 5.8 in 1960 to 4.2 children per mother during childbearing years in 2000. Ten percent of all births were low birth weight. The maternal mortality rate was 520 per 100,000 live births as of 1998. In 2005, the infant mortality rate was 55.35 per 1,000 live births and life expectancy was 55.02 years. As of 2002, the overall mortality rate was estimated at 17.6 per 1,000 inhabitants. The HIV/AIDS prevalence is estimated to be 5.2% of the adult population (ages 15–49). Approximately 46,000 people were living with HIV/AIDS. There were an estimated 2,400 deaths from AIDS in 2009 – down from 3,000 deaths in 2003. Education Gabon's education system is regulated by two ministries: the Ministry of Education, in charge of pre-kindergarten through the last high school grade, and the Ministry of Higher Education and Innovative Technologies, in charge of universities, higher education, and professional schools. Education is compulsory for children ages 6 to 16 under the Education Act. Most children in Gabon start their school lives by attending nurseries or "Crèche", then kindergarten known as "Jardins d'Enfants". At age six, they are enrolled in primary school, "École Primaire" which is made up of six grades. The next level is "École Secondaire", which is made up of seven grades. The planned graduation age is 19 years old. 
Those who graduate can apply for admission at institutions of higher learning, including engineering schools or business schools. In Gabon as of 2012, the literacy rate of its population ages 15 and above was 82%. The government has used oil revenue for school construction, paying teachers' salaries, and promoting education, including in rural areas. However, maintenance of school structures, as well as teachers' salaries, has been declining. In 2002 the gross primary enrollment rate was 132 percent, and in 2000 the net primary enrollment rate was 78 percent. Gross and net enrollment ratios are based on the number of students formally registered in primary school and therefore do not necessarily reflect actual school attendance. As of 2001, 69 percent of children who started primary school were likely to reach grade 5. Problems in the education system include poor management and planning, lack of oversight, poorly qualified teachers, and overcrowded classrooms. Culture A country with a primarily oral tradition up until the spread of literacy in the 21st century, Gabon is rich in folklore and mythology. "Raconteurs" are currently working to keep traditions alive such as the mvett among the Fangs and the ingwala among the Nzebis. Gabon also features internationally celebrated masks, such as the n'goltang (Fang) and the reliquary figures of the Kota. Each group has its own set of masks used for various reasons. They are mostly used in traditional ceremonies such as marriage, birth and funerals. Traditionalists mainly work with rare local woods and other precious materials. Music Gabonese music is lesser-known in comparison with regional giants like the Democratic Republic of the Congo and Cameroon. The country boasts an array of folk styles, as well as pop stars like Patience Dabany and Annie-Flore Batchiellilys, a Gabonese singer and renowned live performer. 
Also known are guitarists like Georges Oyendze, La Rose Mbadou and Sylvain Avara, and the singer Oliver N'Goma. Imported rock and hip hop from the US and UK are popular in Gabon, as are rumba, makossa and soukous. Gabonese folk instruments include the obala, the ngombi, the balafon and traditional drums. Media Radio-Diffusion Télévision Gabonaise (RTG), which is owned and operated by the government, broadcasts in French and indigenous languages. Color television broadcasts have been introduced in major cities. In 1981, a commercial radio station, Africa No. 1, began operations. The most powerful radio station on the continent, it has participation from the French and Gabonese governments and private European media. In 2004, the government operated two radio stations and another seven were privately owned. There were also two government television stations and four privately owned. In 2003, there were an estimated 488 radios and 308 television sets for every 1,000 people. About 11.5 of every 1,000 people were cable subscribers. Also in 2003, there were 22.4 personal computers for every 1,000 people and 26 of every 1,000 people had access to the Internet. The national press service is the Gabonese Press Agency, which publishes a daily paper, Gabon-Matin (circulation 18,000 as of 2002). L'Union in Libreville, the government-controlled daily newspaper, had an average daily circulation of 40,000 in 2002. The weekly Gabon d'Aujourdhui is published by the Ministry of Communications. There are about nine privately owned periodicals which are either independent or affiliated with political parties. These publish in small numbers and are often delayed by financial constraints. The constitution of Gabon provides for free speech and a free press, and the government supports these rights. Several periodicals actively criticize the government and foreign publications are widely available. 
Cuisine Gabonese cuisine is influenced by French cuisine, but staple foods are also available. Sports The Gabon national football team has represented the nation since 1962. The Under-23 football team won the 2011 CAF U-23 Championship and qualified for the 2012 London Olympics. Gabon were joint hosts, along with Equatorial Guinea, of the 2012 Africa Cup of Nations, and the sole hosts of the competition's 2017 tournament. The Barcelona striker Pierre-Emerick Aubameyang plays for the Gabon national team. The Gabon national basketball team, nicknamed Les Panthères, finished 8th at the AfroBasket 2015, its best performance ever. Gabon has competed at most Summer Olympics since 1972. The country's sole Olympic medalist is Anthony Obame, who won a silver medal in taekwondo at the 2012 Olympics, held in London. Gabon has excellent recreational fishing and is considered one of the best places in the world to catch Atlantic tarpon. See also Outline of Gabon Index of Gabon-related articles ISO 3166-2:GA References Bibliography External links Gabon. The World Factbook. Central Intelligence Agency. Gabon from the BBC News Key Development Forecasts for Gabon from International Futures 2009 report (PDF) from Direction générale de la statistique et des études économiques Central African countries Former French colonies French-speaking countries and territories Member states of the Organisation internationale de la Francophonie Member states of the African Union Member states of the Organisation of Islamic Cooperation Member states of OPEC Current member states of the United Nations Republics States and territories established in 1960 1960 establishments in Africa Countries in Africa