Johnny Weissmuller
Johnny Weissmuller (June 2, 1904 – January 20, 1984) was an Austro-Hungarian-born American competition swimmer, water polo player and actor. He was known for playing Edgar Rice Burroughs' ape man Tarzan in films of the 1930s and 1940s and for having one of the best competitive swimming records of the 20th century.
Weissmuller was one of the world's fastest swimmers in the 1920s, winning five Olympic gold medals for swimming and one bronze medal for water polo. He was the first to break the one-minute barrier in the 100-meter freestyle, and the first to swim the 440-yard freestyle in under five minutes. He won fifty-two U.S. national championships, set more than 50 world records (spread across both freestyle and backstroke), and was purportedly undefeated in official competition for the entirety of his competitive career. After retiring from competition, he became the sixth actor to portray Tarzan, a role he played in twelve feature films. Dozens of other actors have also played Tarzan, but Weissmuller is by far the best known. His distinctive Tarzan yell is still often used in films as part of his legacy.
Johann Weißmüller was an ethnic German on his father's side, the elder son of Peter Weißmüller and his wife Elisabeth (née Kersch), both Banat Swabians, an ethnic German population in the southeastern part of the Kingdom of Hungary. Johann had one sibling, a younger brother, Peter. His generally accepted birthplace was in Szabadfalva (Freidorf), Austro-Hungarian Empire, today part of Timișoara (Temeschwar), Romania. The records of St. Rochus Church in Freidorf show that Johann, son of Peter and Elisabeth (née Kersch) Weissmüller, was baptized there on June 5, 1904, three days after his birth. According to the contemporary laws, his name was recorded as János Weissmüller. However, the ship's roster from his family's arrival at Ellis Island lists his birthplace as Párdány, Kingdom of Hungary (present-day Međa, Žitište, Serbia, near the Romanian border).
The passenger manifest of SS "Rotterdam", which left its namesake city on January 14 and arrived at Ellis Island in New York on January 26, 1905, lists Peter Weissmüller, a 29-year-old laborer, his 24-year-old wife Elisabeth, and seven-month-old Johann as steerage passengers. The family is listed as Germans, last residence Timișoara. After a brief stay in Chicago visiting relatives, they moved to the coal-mining town of Windber, Pennsylvania, where they intended to join their brother-in-law, Johann Ott. On November 5, 1905, Peter Johann Weissmüller was baptized at St. John Cantius Catholic Church in Windber. Peter Weissmuller worked as a miner, and his younger son, Peter Weissmüller Jr., was born in Windber on September 3, 1905. In the 1910 census, Peter and Elizabeth Weisenmüller, as well as John and Eva Ott, were living at 1521 Cleveland Ave in the 22nd Ward of Chicago, with sons John, age six, born in Timișoara and Peter Jr., age five, erroneously entered as born in Illinois. Peter Weissmüller and John Ott were both brewers, Ott emigrating in 1902, Weissmüller in 1904.
At age nine, young John Weissmüller contracted polio. At the suggestion of his doctor, he took up swimming to help battle the disease. After the family moved from western Pennsylvania to Chicago, he continued swimming and eventually earned a spot on the YMCA swim team.
According to military draft registration records for World War I, Peter and Elizabeth were apparently still together as late as 1917. On his paperwork, Peter was listed as a brewer, working for the Elston and Fullerton Brewery, and he and his family were living at 226 West North Avenue in Chicago. In his book "Tarzan, My Father", Johnny Weissmuller Jr. stated that although rumors of Peter Weissmüller living to "a ripe old age, remarrying along the way and spawning a large brood of little Weissmüllers" were reported, no one in the family was aware of his ultimate fate. Peter signed his consent for the passport application of 19-year-old John "Weissmuller" in 1924, preceding Johnny's Olympic competition in France. In the 1930 federal census, Elizabeth Weissmüller, age 49, is listed with her sons John P. and Peter J., and Peter's wife Dorothy; Elizabeth is listed as a widow.
As a teen, Weissmuller attended Lane Technical College Prep High School before dropping out to work various jobs, including a stint as a lifeguard at Oak Street Beach on Lake Michigan. While working as an elevator operator and bellboy at the Illinois Athletic Club, Weissmuller caught the eye of swim coach William Bachrach, who trained him; in August 1921, Weissmuller won the national championships in the 50-yard and 220-yard distances. Although foreign-born, Weissmuller gave his birthplace as Tanneryville, Cambria County, Pennsylvania, and his birth date as that of his younger brother, Peter. This ensured his eligibility to compete as part of the United States Olympic team and was critical to his being issued a United States passport.
On July 9, 1922, Weissmuller broke Duke Kahanamoku's world record in the 100-meter freestyle, swimming it in 58.6 seconds. He won the title for that distance at the 1924 Summer Olympics, beating Kahanamoku for the gold medal. He also won the 400-meter freestyle and was a member of the winning U.S. team in the 4×200-meter relay.
Four years later, at the 1928 Summer Olympics in Amsterdam, he won another two gold medals. It was during this period that Weissmuller became an enthusiast for John Harvey Kellogg's holistic lifestyle views on nutrition, enemas and exercise. He came to Kellogg's Battle Creek, Michigan, sanitarium to dedicate its new 120-foot swimming pool and to break one of his own previous swimming records after adopting the vegetarian diet prescribed by Kellogg.
In 1927, Weissmuller set a new world record of 51.0 seconds in the 100-yard freestyle, a mark that stood for 17 years. He improved it to 48.5 seconds at Billy Rose's World's Fair Aquacade in 1940, aged 36, but this result was discounted because he was competing as a professional.
As a member of the U.S. men's national water polo team, he won a bronze medal at the 1924 Summer Olympics. He also competed in the 1928 Olympics where the U.S. team finished in seventh place.
In all, Weissmuller won five Olympic gold medals and one bronze medal, 52 United States national championships, and set 67 world records. He was the first man to swim the 100-meter freestyle under one minute and the 440-yard freestyle under five minutes. He never lost a race and retired with an unbeaten amateur record. In 1950, he was selected by the Associated Press as the greatest swimmer of the first half of the 20th century.
Upon moving to the prosperous Bel Air section of Los Angeles (specifically to an area known today as East Gate Bel Air), Weissmuller later famously commissioned architect Paul Williams to design a large home with a 300-foot serpentine swimming pool that curled around the house, and which still exists.
In 1929, Weissmuller signed a contract with BVD to be a model and representative. He traveled throughout the country doing swim shows, handing out leaflets promoting that brand of swimwear, signing autographs and going on radio. In that same year, he made his first motion picture appearance as an Adonis, wearing only a fig leaf, in a movie entitled "Glorifying the American Girl". He appeared as himself in the first of several "Crystal Champions" movie shorts featuring Weissmuller and other Olympic champions at Silver Springs, Florida. He co-starred with Esther Williams in "Billy Rose's Aquacade" during the New York World's Fair 1939–41, pursuing her for two years.
His acting career began when he signed a seven-year contract with Metro-Goldwyn-Mayer and played the role of Tarzan in "Tarzan the Ape Man" (1932). The movie was a huge success and Weissmuller became an overnight international sensation. The author of "Tarzan", Edgar Rice Burroughs, was pleased with Weissmuller, although he so hated the studio's depiction of a Tarzan who barely spoke English that he created his own concurrent Tarzan series filmed on location in Central American jungles and starring Herman Brix as a suitably articulate version of the character.
Weissmuller starred in six Tarzan movies for MGM with actress Maureen O'Sullivan as Jane and Cheeta the Chimpanzee. The last three also included Johnny Sheffield as Boy. Then, in 1942, Weissmuller went to RKO and starred in six more Tarzan movies with markedly reduced production values. Sheffield also appeared as Boy in the first five features for RKO. Brenda Joyce took over the role of Jane in Weissmuller's last four Tarzan movies (the first two RKO films had not featured Jane). Unlike MGM, RKO allowed Weissmuller to play other roles, though a three-picture contract with Pine-Thomas Productions led to only one film, "Swamp Fire", being made, co-starring Buster Crabbe. In a total of 12 Tarzan films, Weissmuller earned an estimated $2,000,000 and established himself as what many movie historians consider the definitive Tarzan. Although not the first Tarzan in movies (that was Elmo Lincoln), he was the first to be associated with the now traditional ululating, yodeling Tarzan yell. During an appearance on television's "The Mike Douglas Show" in the 1970s, Weissmuller explained how the famous yell was created. Recordings of three vocalists were spliced together to get the effect: a soprano, an alto, and a hog caller.
Edgar Rice Burroughs himself paid oblique tribute to Weissmuller's powerful screen persona in the last Tarzan novel that he completed, albeit with a misspelling of the actor's name.
But what seemed a long time to them was a matter of seconds only. The tiger's great frame went limp and sank to the ground. And the man rose and put a foot upon it and, raising his face to the heavens, voiced a horrid cry--the victory cry of the bull ape. Corrie was suddenly terrified of this man who had always seemed so civilized and cultured. Even the men were shocked.
Suddenly recognition lighted the eyes of Jerry Lucas. "John Clayton," he said, "Lord Greystoke--Tarzan of the Apes!"
Shrimp's jaw dropped. "Is dat Johnny Weismuller? ["sic"]" he demanded.
Tarzan shook his head as though to clear his brain of an obsession. His thin veneer of civilization had been consumed by the fires of battle. ...
When Weissmuller finally left the role of Tarzan, he immediately traded his loincloth costume for a slouch hat and safari suit to play "Jungle Jim" (1948) for Columbia. He made 13 "Jungle Jim" films between 1948 and 1954. According to actor Michael Fox, Weissmuller shot two "Jungle Jim" films consecutively, with nine days of filming for each and a break of two days between, then returned to his home in Mexico. Within the next year, he appeared in three more jungle movies, playing himself because Screen Gems held the rights to the name "Jungle Jim". In 1955, he began production of the "Jungle Jim" television adventure series for Screen Gems, a film subsidiary of Columbia Pictures. His costars were Martin Huston and Dean Fredericks. The show produced only 26 episodes, which were subsequently played repeatedly on network and syndicated television. Aside from his first screen appearance as Adonis and the role of Johnny Duval in the 1946 film "Swamp Fire", Weissmuller played only three roles in films during the heyday of his Hollywood career: Tarzan, Jungle Jim, and himself.
According to the book by Weissmuller's son, "Tarzan, My Father", while playing golf in Cuba in 1958 during the Cuban Revolution, Weissmuller's golf cart was suddenly surrounded by rebel soldiers. Weissmuller was unable to communicate who he was until he got out of the cart and attempted the trademark Tarzan yell. The soldiers then recognized him and shouted "¡Es Tarzán! ¡Es Tarzán de la Jungla!" Johnny and his companions were not only not kidnapped; the guerrillas gave him an escort to his hotel.
Weissmuller was an accomplished amateur golfer and played in two official PGA Tour tournaments, at the 1937 Western Open at Canterbury Golf Club outside Cleveland (87–85=172, missed the cut) and the 1948 Hawaiian Open (79–75–79–76=309) to finish in 37th place.
In the late 1950s, Weissmuller moved back to Chicago and started a swimming pool company. He lent his name to other business ventures, but did not have a great deal of success.
He retired in 1965, moving to Fort Lauderdale, Florida, where he was founding chairman of the International Swimming Hall of Fame (ISHOF) and was inducted into the ISHOF that same year.
In September 1966, Weissmuller joined former screen Tarzans James Pierce and Jock Mahoney to appear with Ron Ely as part of the publicity for the upcoming premiere of the "Tarzan" television series.
In the late 1960s and early 1970s, Weissmuller was involved with a tourist attraction called Tropical/Florida Wonderland, a.k.a. Tarzan's Jungleland, on U.S. Route 1 in Titusville, Florida.
Weissmuller's face is included in the collage on the iconic front cover of The Beatles' 1967 record album "Sgt. Pepper's Lonely Hearts Club Band".
Based on his interest in natural lifestyles, Weissmuller opened a small chain of health food stores called Johnny Weissmuller's American Natural Foods in California in 1969.
In 1970, he attended the British Commonwealth Games in Edinburgh, where he was presented to Queen Elizabeth II. That same year, he appeared with former co-star Maureen O'Sullivan in "The Phynx" (1970).
Weissmuller lived in Florida until the end of 1973, then moved to Las Vegas, Nevada, where he worked as a greeter at Caesars Palace along with boxer Joe Louis for a time.
In 1976, he appeared for the last time in a motion picture, playing a movie crewman who is fired by a movie mogul (played by Art Carney) in "Won Ton Ton, the Dog Who Saved Hollywood", and he also made his final public appearance in that year when he was inducted into the Body Building Guild Hall of Fame.
Weissmuller was married five times: band and club singer Bobbe Arnst (married 1931, divorced 1933); actress Lupe Vélez (married 1933, divorced 1939); Beryl Scott (married 1939, divorced 1948); Allene Gates (married 1948, divorced 1962); and Maria Baumann (from 1963 until his death in 1984).
With his third wife, Beryl, he had three children, Johnny Weissmuller, Jr. (1940–2006), Wendy Anne Weissmuller (born 1942), and Heidi Elizabeth Weissmuller (1944–1962), who was killed in a car crash. He also had a stepdaughter with Baumann, Lisa Weissmuller-Gallagher.
In 1974, Weissmuller broke both his hip and leg, marking the beginning of years of declining health. While hospitalized he learned that in spite of his strength and lifelong daily regimen of swimming and exercise, he had a serious heart condition. In 1977, Weissmuller suffered a series of strokes. In 1979, he entered the Motion Picture & Television Country House and Hospital in Woodland Hills, California for several weeks before moving with his last wife, Maria, to Acapulco, Mexico, the location of his last Tarzan movie.
On January 20, 1984, Weissmuller died from pulmonary edema at the age of 79. He was buried just outside Acapulco at the Valle de la Luz (Valley of the Light) cemetery. As his coffin was lowered into the ground, a recording of the Tarzan yell he invented was played three times, at his request. He was honored with a 21-gun salute befitting a head of state, arranged by Senator Ted Kennedy and President Ronald Reagan.
His former co-star and movie son Johnny Sheffield wrote of him, "I can only say that working with Big John was one of the highlights of my life. He was a Star (with a capital "S") and he gave off a special light and some of that light got into me. Knowing and being with Johnny Weissmuller during my formative years had a lasting influence on my life."
For his contribution to the motion picture industry, Johnny Weissmuller has a star on the Hollywood Walk of Fame at 6541 Hollywood Boulevard in Hollywood, adjacent to the star of Maureen O'Sullivan.
In 1973, Weissmuller was awarded the George Eastman Award, given by George Eastman House for distinguished contribution to the art of film.
The Piscine Molitor in Paris was built as a tribute to Weissmuller and his swimming prowess.
Jean Grey
Jean Elaine Grey is a fictional superhero appearing in American comic books published by Marvel Comics. The character has been known under the aliases Marvel Girl, Phoenix, and Dark Phoenix. Created by writer Stan Lee and artist Jack Kirby, the character first appeared in "The X-Men" #1 (September 1963).
Jean is a member of a subspecies of humans known as mutants, who are born with superhuman abilities. She was born with telepathic and telekinetic powers, which first manifested when she saw her childhood friend being hit by a car. She is a caring, nurturing figure, but she also has to deal with being an Omega-level mutant and the physical manifestation of the cosmic Phoenix Force. Jean experienced a transformation into the Phoenix in the "X-Men" storyline "The Dark Phoenix Saga". She has faced death numerous times in the series' history: her first death came in her guise as Marvel Girl, when she died and was "reborn" as Phoenix, and her transformation into Dark Phoenix led to her second death, a suicide, which was not her last.
She is an important figure in the lives of other Marvel Universe characters, mostly the X-Men, including her husband Cyclops, her mentor and father figure Charles Xavier, her unrequited love interest Wolverine, her best friend and sister-like figure Storm, and her genetic children Rachel Summers, Cable, Stryfe and X-Man.
The character was present for much of the X-Men's history, and she was featured in all three "X-Men" animated series and several video games. She is a playable character in "X-Men Legends" (2004), "" (2005), "Marvel Ultimate Alliance 2" (2009), "" (2011), "Marvel Heroes" (2013), and "Lego Marvel Super Heroes" (2013), and appeared as a non-playable character in the first "".
Famke Janssen portrayed the character as an adult in the "X-Men" films while Sophie Turner portrayed her as a teenager.
In 2006, IGN rated Jean Grey 6th on their list of top 25 X-Men from the past forty years, and in 2011, IGN ranked her 13th in the "Top 100 Comic Book Heroes". Her Dark Phoenix persona was ranked 9th in IGN's "Top 100 Comic Book Villains of All Time" list, the highest rank for a female character.
Created by writer Stan Lee and artist/co-writer Jack Kirby, Jean Grey first appeared as Marvel Girl in "The X-Men" #1 (September 1963). The original team's sole female member, Marvel Girl was a regular part of the team through the series' publication. Initially possessing the ability of telekinesis, the character was later granted the power of telepathy, which would be retconned years later as a suppressed mutant ability.
Under the authorship of Chris Claremont and the artwork of first Dave Cockrum and then John Byrne in the late 1970s, Jean Grey underwent a significant transformation from the X-Men's weakest member to its most powerful.
The first comic Claremont saw at Marvel after arriving there in 1969 was the first X-Men issue penciled by Neal Adams (issue 56), after which he became enamored of Jean Grey. By the time he started writing X-Men with issue 94, the first issue after the creation of the new team in Giant-Size X-Men 1, Len Wein had already established that she was leaving the team; the artwork was already done, and it was too late to change. Claremont promised himself he would bring her back as soon as possible, which he did in issue 97 when he became the sole writer of the title. He also decided to upgrade her powers significantly.
The storyline in which Jean Grey died as Marvel Girl and was reborn as Phoenix ("Uncanny X-Men" #101–108, 1976–1977) has been retroactively dubbed by fans "The Phoenix Saga", and the storyline of her eventual corruption and death as Dark Phoenix ("Uncanny X-Men" #129–138, 1980) has been termed "The Dark Phoenix Saga". This storyline is one of the most well-known and heavily referenced in mainstream American superhero comics, and is widely considered a classic, including Jean Grey's suicidal sacrifice.
When the first trade paperback of "The Dark Phoenix Saga" was published in 1984, Marvel also published a 48-page special issue titled "Phoenix: The Untold Story". It contained the original version of "Uncanny X-Men" #137, the original splash page for "Uncanny X-Men" #138, and transcripts of a roundtable discussion between Shooter, Claremont, Byrne, editors Jim Salicrup and Louise Jones, and inker Terry Austin about the creation of the new Phoenix persona, the development of the story, and what led to its eventual change, and Claremont and Byrne's plans for Jean Grey had she survived.
Claremont revealed that his and Cockrum's motivation for Jean Grey's transformation into Phoenix was to create "the first female cosmic hero". The two hoped that, like Thor had been integrated into "The Avengers" lineup, Phoenix would also become an effective and immensely powerful member of the X-Men. However, both Salicrup and Byrne had strong feelings against how powerful Phoenix had become, feeling that she drew too much focus in the book. Byrne worked with Claremont to effectively remove Phoenix from the storyline, initially by removing her powers. However, Byrne's decision to have Dark Phoenix destroy an inhabited planetary system in "Uncanny X-Men" #135, coupled with the planned ending to the story arc, worried then-Editor-in-Chief Jim Shooter, who felt that allowing Jean to live at the conclusion of the story was both morally unacceptable (given that she was now a "mass murderer") and also an unsatisfying ending from a storytelling point of view. Shooter publicly laid out his reasoning in the 1984 roundtable:
I personally think, and I've said this many times, that having a character destroy an inhabited world with billions of people, wipe out a starship and then—well, you know, having the powers removed and being let go on Earth. It seems to me that that's the same as capturing Hitler alive and letting him go live on Long Island. Now, I don't think the story would "end" there. I think a lot of people would come to his door with machine guns...
One of the creative team's questions that affected the story's conclusion was whether the Phoenix's personality and later descent into madness and evil were inherent to Jean Grey or if the Phoenix was itself an entity merely possessing her. The relationship between Jean Grey and the Phoenix would continue to be subject to different interpretations and explanations by writers and editors at Marvel Comics following the story's retcon in 1986. At the time of the Dark Phoenix's creation, Byrne felt that, "If someone could be seen to corrupt Jean, rather than her just turning bad, this could make for an interesting story." Salicrup and Byrne stated later that they viewed Phoenix as an entity that entirely possessed Jean Grey, therefore absolving her of its crimes once it was driven out. However, the creative and editorial team ultimately agreed that Phoenix had been depicted as an inherent and inseparable aspect of Jean Grey, meaning that the character was fully responsible for her actions as Phoenix. As a result, Shooter ordered that Claremont and Byrne rewrite issue #137 to explicitly place in the story both a consequence and an ending commensurate with the enormity of Phoenix's actions. In a 2012 public signing, Claremont spoke about the context of the late 1970s and the end of the Vietnam War during the story's writing, stating that the history of these events also made Jean Grey's genocidal actions difficult to redeem.
In the original ending, Jean does not revert to Dark Phoenix, and the Shi'ar subject her to a "psychic lobotomy", permanently removing all her telepathic and telekinetic powers. Claremont and Byrne planned to later have Magneto offer Jean the chance to restore her abilities, with Jean choosing to remain depowered and so eliminate the threat of Dark Phoenix returning to power.
After several years, Marvel decided to revive the character, but only after an editorial decree that the character be absolved of her actions during The Dark Phoenix Saga. Writer Kurt Busiek is credited with devising the plot to revive Jean Grey. Busiek, a fan of the original five X-Men, was displeased with the character's death and formulated various storylines that would have met Shooter's rule and allowed the character to return to the X-Men franchise. He eventually shared his storyline idea with fellow writer Roger Stern, who mentioned it to Byrne, who was both writing and illustrating the "Fantastic Four" at the time. Series writer Bob Layton and artist Jackson Guice, who were developing "X-Factor"—a series about a team of former X-Men—had yet to settle on their fifth team member, initially considering Dazzler. Layton opted to fill the open spot with Jean instead, and both he and Byrne submitted the idea to Shooter, who approved it. Jean Grey's revival became a crossover plotline between the "Avengers" under Stern, "Fantastic Four" under Byrne, and "X-Factor" under Layton.
Busiek later found out that his idea had been used thanks to Layton, and he was credited in "Fantastic Four" #286 and paid for his contributions. The decision to revive Jean Grey was controversial among fans, with some appreciating the return of the character and others feeling it weakened the impact of the Dark Phoenix Saga's ending. Busiek maintained that the idea that led to Jean Grey's official return to Marvel Comics was merely a case of sharing his ideas with friends as a fan, and that he neither formally pitched the idea to anyone nor gave it the final go ahead. Claremont expressed dissatisfaction with the retcon, stating in 2012: "We'd just gone to all the effort of saying, 'Jean is dead, get over it,' and they said, 'Haha, we fibbed.' So why should anyone trust us again? But that's the difference between being the writer and being the boss." In a 2008 interview Byrne said he still felt Busiek's method of reviving Jean Grey was "brilliant", but agreed that in retrospect the character should have remained dead.
In the comics, having been fully established as separate from the "Jean Grey" copy created and taken over by the Phoenix Force, Jean is "absolved" of involvement in the atrocities of "The Dark Phoenix" storyline, and she returned in the first issue of "X-Factor" (1st Series).
Claremont later commented on how Jean's revival affected his original plans for Madelyne Pryor, stating that the relationship between the two women was intended to be entirely coincidental. He intended Madelyne only to look like Jean by complete coincidence and exist as a means for Cyclops to move on with his life and be written out of the "X-Men" franchise, part of what he believed to be a natural progression for any member of the team. Claremont expressed dismay that Jean's resurrection ultimately resulted in Cyclops abandoning his wife and child, tarnishing his written persona as a hero and "decent human being", and the "untenable situation" with Madelyne was dealt with by transforming her into a prolicidal demonic villain and killing her off.
Soon after the beginning publication of "X-Factor", Marvel also reprinted and released the original "X-Men" series under the title "Classic X-Men". These reissues paired the original stories with new vignettes, elaborating on plot points. One such issue, "Classic X-Men" #8 (April 1987), paired the original "X-Men" #100 (August 1976) story of Jean Grey's disastrous return flight from space immediately preceding her transformation into Phoenix ("Love Hath No X-Man...") with the new story "Phoenix". The story further supported the retcon establishing Jean Grey and the Phoenix Force as two separate entities.
Following the conclusion of "Inferno", Jean continued to be a mainstay character throughout the rest of "X-Factor".
"X-Factor" (1st Series) ended its run featuring the original X-Men with "X-Factor" #70 (September 1991), with the characters transitioning over to "Uncanny X-Men", explained in continuity as the two teams deciding to merge. The fourteen X-Men divided into two teams—"Blue" and "Gold"—led by Cyclops and Storm, respectively. Jean was added to the Gold Team beginning in "Uncanny X-Men" #281 (October 1991).
Following Cyclops's possession by the mutant villain Apocalypse and his disappearance at the conclusion of the crossover storyline "Apocalypse: The Twelve", Jean lost her telekinetic abilities and was left with increased psychic powers, the result of the "six month gap" in plot across the "X-Men" franchise created by the "Revolution" revamp. During the "Revolution" event, all "X-Men" titles began six months after the events of "Apocalypse: The Twelve", allowing writers to create fresh situations and stories and gradually fill in the missing events of the previous six months of continuity. Due to editing decisions following the success of the 2000 "X-Men" film, which depicted the character of Jean Grey with both telepathy and telekinesis, an explanation for Jean's altered powers in the comics was never explicitly made, though writer Chris Claremont revealed in interviews that it was intended to be an accidental power switch with fellow X-Man Psylocke, which would also have explained Psylocke's new telekinetic powers.
Jean was next featured in the six-issue miniseries "X-Men Forever" written by Fabian Nicieza, which was designed to tie up remaining plot lines. During the series, Jean revisited many of the events involving the Phoenix Force and the series introduced the concept of "Omega level mutants", a category for mutants with unlimited potential, which included Jean herself.
In June 2001, "X-Men" was retitled as "New X-Men" under writer Grant Morrison. The title consisted of a smaller team featuring Jean, Cyclops, Beast, Wolverine, Emma Frost, and Charles Xavier. The overarching plot focused on the team assuming the roles of teachers to a new generation of mutants at the Xavier Institute while navigating their personal relationships and dealing with newly emerging pro- and anti-mutant political sentiments. Jean also made minor appearances in other titles during the "New X-Men" run, such as Chris Claremont's "X-Treme X-Men", occasionally lending support to the characters.
Jean and her connection with the Phoenix Force was examined again one year after the conclusion of Morrison's run on "New X-Men" in "" written by Greg Pak in 2005. At the 2010 San Diego Comic-Con X-Men panel, when asked whether or not Jean would return, editor Nick Lowe responded by saying, "She's dead."
Regarding Jean's actual return to the "X-Men" franchise, Marvel indicated that Jean's eventual return is being discussed but stated that the return of Jean Grey was "a story Marvel does not want to rush". Marvel loosely tied questions regarding Jean Grey's eventual return to the events in 2007's "" in which a mutant girl named Hope—who has red hair, green eyes, and immense mutant powers—is born, and 2010's "" which sees both Hope's return as a teenager and the return of the Phoenix Force.
Following the conclusion of "Avengers vs. X-Men", as part of the Marvel NOW! event, a teenage Jean Grey and the four other founding members of the X-Men are transported across time to the present day by Beast in the series "All-New X-Men" by Brian Michael Bendis.
The original adult Jean Grey returned to the Marvel Universe in a new series titled "Phoenix Resurrection: The Return of Jean Grey", released on December 27, 2017. The series was written by Matthew Rosenberg with art by Leinil Francis Yu.
Following the events of the "Extermination" story, the time-displaced Jean Grey and the other original X-Men were returned to their original time, as part of Jonathan Hickman's plan to reboot the entire X-Men franchise.
Jean Elaine Grey was born the second daughter of John and Elaine Grey. She had an older sister, Sara Grey-Bailey. John Grey was a professor at Bard College in upstate New York. Depictions of Jean's childhood and her relations with her family have shown a stable, loving family life growing up.
Jean's mutant powers of telepathy and telekinesis first manifest when her best friend is hit by a car and killed. Jean mentally links with her friend and nearly dies as well. The event leaves her comatose, and she is brought back to consciousness when her parents seek the help of powerful mutant telepath, Charles Xavier. Xavier blocks her telepathy until she is old enough to be able to control it, leaving her with access only to her telekinetic powers. Xavier later recruits her as a teenager to be part of his X-Men team as "Marvel Girl", the team's sole female member. After several missions with the X-Men, Xavier removes Jean's mental blocks and she is able to use and control her telepathic powers. She begins a relationship with teammate Cyclops, which persists as her main romantic relationship, though she also develops a mutual secret attraction to a later addition to the team, Wolverine.
During an emergency mission in space, the X-Men find their shuttle damaged. Jean pilots the shuttle back to Earth, but is exposed to fatal levels of radiation. Dying, but determined to save Cyclops and her friends, Jean calls out for help and is answered by the cosmic entity, the Phoenix Force. The Phoenix Force, the sum of all life in the universe, is moved by Jean's dedication and love and takes the form of a duplicate body to house Jean's psyche. In that instant, the Phoenix Force is overwhelmed and believes itself to be Jean Grey and places Jean's dying body in a healing cocoon. This cocoon is later described as a Phoenix Egg. The ship crashes in Jamaica Bay, with the other X-Men unharmed.
The Phoenix Force, as Jean Grey, emerges in a new costume and adopts the new codename "Phoenix", with immense cosmic powers. Meanwhile, the cocoon containing the real Jean Grey sinks to the bottom of the bay, unnoticed. Phoenix continues her life as Jean Grey with the other X-Men, joining them on missions and saving the universe. During "The Dark Phoenix Saga", Phoenix becomes overwhelmed and corrupted by her first taste of evil and transforms into a force of total destruction, called "Dark Phoenix", inadvertently killing the inhabitants of a planetary system, after consuming its star, and jeopardizing the entire universe. However, Jean's personality manages to take control and Phoenix commits suicide to ensure the universe's safety.
Upon its suicide by disintegration ray, the Phoenix Force disperses into its original form, and a fragment locates the still-healing Jean at the bottom of Jamaica Bay. When it tries to bond with her, Jean senses its memories of death and destruction as Dark Phoenix and rejects it, causing it to bond with and animate a lifeless clone of Jean Grey created by the villain Mister Sinister, who made the clone to mate with Cyclops and produce genetically superior mutants. Named "Madelyne Pryor", the unaware clone meets Cyclops in a situation engineered by Sinister, and the two fall in love, marry, and have a child, Nathan Christopher Summers. Meanwhile, the cocoon is discovered and retrieved by the Avengers and the Fantastic Four. Jean emerges with no memory of the actions of the Phoenix/Dark Phoenix. The Avengers and Fantastic Four tell her what happened and that she was believed dead until now. She is reunited with the original X-Men and convinces them to form the new superhero team X-Factor, reusing her "Marvel Girl" codename. Jean learns that Cyclops has romantically moved on with Madelyne, who is angered by his decision to lead X-Factor and neglect his family. Though Jean encourages Cyclops to return to Madelyne, he finds their house abandoned and assumes that Madelyne has left him and taken their infant son; Cyclops returns to X-Factor, and he and Jean resume their relationship. The team's adventures continue throughout the series, culminating in the line-wide "Inferno" crossover, in which Madelyne resurfaces, now nearly insane and with powers awakened by a demonic pact, calling herself the Goblyn Queen.
Learning of her true identity and purpose as a clone created by Mister Sinister drives Madelyne completely insane, and she plans to sacrifice Nathan Christopher to achieve greater power and unleash literal Hell on Earth. While attempting to stop her, Jean is reunited with the other X-Men, who are happy to learn that she is alive, particularly Wolverine, whose presence reminds Jean of her unaddressed feelings for him. Jean and Madelyne confront each other, and Madelyne attempts to kill them both. Jean survives only by absorbing the remnant of the Phoenix Force housed within Madelyne, which leaves her with both Madelyne's memories and the Phoenix's memories from "The Dark Phoenix Saga".
While continuing on X-Factor, Cyclops proposes to Jean and she meets her alternate future daughter Rachel Summers (who goes by the codename "Phoenix" as well and is also able to tap into the Phoenix Force), but she rejects them both out of the feeling that they indicate that her life is predetermined. When X-Factor unites with the X-Men, Jean joins the Gold Team, led by Storm. During this time, she no longer uses a codename, instead being referred to by her civilian name. After some time, she makes up with Rachel, welcoming her into her life, and proposes to Cyclops and the two marry. On their honeymoon, the couple is immediately psychically transported 2000 years into the future to raise Cyclops's son Nathan, who had been transported to the future as an infant in hopes of curing him of a deadly virus. Jean adopts the identity of "Redd" along with Cyclops ("Slym") and they raise Nathan Christopher for twelve years before they are sent back into their bodies on their wedding honeymoon. Jean learns that a time-displaced Rachel had used her powers to transport them to the future to protect Nathan, and per Rachel's request, Jean adopts the codename "Phoenix" once again to establish it as a symbol of good after all the bad it had caused. Meanwhile, her psychic and telekinetic abilities begin to grow and she begins using the iconic green and gold Phoenix costume again. Jean also met another alternate future child of hers and Scott's: the immensely powerful Nathan Grey, who accidentally revived the psionic ghost of Madelyne Pryor, leading to another confrontation between the two women.
In Bishop's original timeline, before he ends up in the present, he finds the X-Men's war room and discovers a garbled distress signal from Jean about a traitor destroying the X-Men from within. Meanwhile, in the present, the X-Men begin to hear increasing news about a malevolent entity called Onslaught. Jean first sees Onslaught as a psionic image with the rest of the X-Men after Onslaught coerces Gateway into kidnapping Cyclops, Wolverine, Storm, and Iceman. He later appears to her again in a similar way after rescuing her and Gambit from Bastion and offers her a chance to join him. Onslaught makes his first full appearance to Jean on the astral plane, showing her how humanity is closing in on mutants and revealing that Xavier was in love with her while she was a student, all in an attempt to convince her to join him. When she refuses and asks his name, he telepathically brands it into her mind. When Juggernaut comes to the mansion with information about Onslaught's true identity but has a mental block preventing him from divulging it, Jean enters his mind and helps him remember who Onslaught really is; to her horror, she discovers that Onslaught is really Professor X, driven insane ever since wiping Magneto's mind.
Professor Xavier calls the X-Men together for a meeting, and Jean tries unsuccessfully to rally them against him before he manifests Onslaught. While Onslaught easily overtakes the rest of the X-Men, Jean escapes to the war room and sends out the distress signal that Bishop found in the future. After a massive battle against Jean and the rest of the X-Men, Onslaught escapes to carry out his plans. After Onslaught nearly kills the X-Men, they team up with the Avengers to make a plan to stop him, knowing full well that it may come down to killing Xavier if the world is to survive. Jean accompanies Cyclops, Archangel, and Psylocke to Muir Island, where they and Moira MacTaggert discover the "Xavier Protocols", secret contingency plans Xavier made to kill any individual X-Man who became a threat to the world. Meanwhile, Jean's earlier distress signal makes it to X-Factor, Excalibur, and X-Force. After returning to New York, Jean works closely with Reed Richards to help build defenses against Onslaught and to create the psionic armor described in the "Xavier Protocols" that can block Xavier's telepathic powers. When Jean senses that Xavier has been freed from Onslaught and is going to confront him on his own, she and Cyclops bring together the rest of the X-Men to back him up. The rest of the Avengers and the Fantastic Four join them in a final stand against Onslaught before he can completely destroy the world. In a final act of desperation, Jean finds the Hulk and locks away Bruce Banner's mind, leaving only the Hulk in control so he can fight Onslaught unencumbered. With the vast majority of Earth's heroes missing and presumed dead after Onslaught is finally defeated, Jean and Cyclops open their home to Quicksilver and his daughter and try to help the X-Men get their lives back together.
Following Cyclops's possession by the mutant villain Apocalypse and apparent death, Jean continues with the X-Men, but is distraught by the loss of her husband. She later learns that she is an "Omega-level" mutant with unlimited potential. Jean begins to suspect that Cyclops may still be alive and with the help of Nathan Summers (now the aged superhero "Cable"), is able to locate and free Cyclops of his possession by Apocalypse. The couple return to the X-Men as part of the Xavier Institute's teaching staff to a new generation of mutants. While Jean finds she is slowly able to tap into the powers of the Phoenix Force once again, her marriage to Scott begins to fail. Jean and Wolverine address their long-unspoken mutual attraction, deciding it is best not to act on their feelings; Cyclops grows further alienated from Jean due to her growing powers and institute responsibilities and seeks consolation from the telepathic Emma Frost to address his disillusionment and his experiences while possessed by Apocalypse. These therapy sessions lead to a "psychic affair" between Scott and Emma. Jean's discovery of the psychic affair results in a confrontation between her and Emma, though ultimately Jean realizes that Emma truly loves him.
In a final confrontation with a traitor at the institute (the X-Men's teammate Xorn, posing as Magneto) Jean fully realizes and assumes complete control of the powers of the Phoenix Force, but is killed in a last-ditch lethal attack by Xorn. Jean dies, telling Scott "to live". However, after her funeral, Scott rejects Emma and her offer to run the school together. This creates a dystopian future where all life and natural evolution is under assault by the infectious, villainous, sentient bacteria "Sublime". Jean is resurrected in this future timeline and becomes the fully realized White Phoenix of the Crown, using the abilities of the Phoenix Force to defeat Sublime and eliminate the dystopic future by reaching back in time and telling Cyclops to move on. This leads him to accept Emma's love and her offer to run the school together. Jean then reconciles with Cyclops and fully bonds with the Phoenix Force and ascends to a higher plane of existence called the "White Hot Room".
A weakened Phoenix Force returns to reanimate Jean. Jean tries to convince the Phoenix Force to let her go so they can return to the White Hot Room together, but once again the Phoenix Force takes over. Jean lets Wolverine find her and tries to convince him to kill her again before the Phoenix does more damage. The Shi'ar track the Phoenix Force and make an alliance with Storm to find her and defeat her. Jean takes Wolverine to the North Pole before the Shi'ar can kill her and convinces him to kill her. He stabs her numerous times but Phoenix keeps reanimating her, prompting Jean to dive deep into the ice and freeze herself. The Phoenix Force leaves her body and once again assumes Jean's form to tempt Cyclops to attack her so she can absorb his optic blasts and become strong again. When the Phoenix Force merges with and overwhelms Emma Frost, Cyclops frees Jean from the ice. Once freed Jean ejects the Phoenix from Emma and accepts that she is one with the Phoenix Force. After feeling the love from the X-Men, the Phoenix relents and returns with Jean back to the White Hot Room. Before she departs, Jean and Cyclops share a telepathic emotional farewell.
Though Jean had yet to fully return, she and the Phoenix Force continued to manifest themselves: the Phoenix notably through Hope Summers, a red-haired, green-eyed "mutant messiah" who slightly resembles Jean, while Jean herself briefly appears in a vision to Emma Frost from the White Hot Room, warning the X-Men to "prepare". She again appears in a vision to Cyclops when he is overwhelmed by the power of the Dark Phoenix, helping him abandon the power so that it can pass on to its true host. After Nightcrawler is fatally wounded by the Crimson Pirates, Jean appears to him along with Amanda Sefton and the recently deceased Wolverine to help coax him back to life. Jean's spirit then begins to manifest in a more direct and aggressive manner to the time-displaced Jean from an alternate timeline, seemingly training her for the arrival of the Phoenix. However, after the younger Jean begins to ignore her, she possesses the time-displaced Jean and uses her as a means to ambush Emma Frost.
Strange psychic occurrences around the world, which include a large bird flaring out from the sun and an explosion on the moon, raise red flags for the X-Men, who quickly launch an investigation of these events. After a string of bizarre encounters with familiar enemies, many of them considered deceased, the X-Men come to one conclusion: the Phoenix Force is back on Earth. The X-Men also discover that psychics are going missing or falling ill, which prompts the team to investigate the grave of Jean Grey. Finding the coffin of their long-dead teammate empty, they race to locate the Phoenix before it can find a suitable host. As it turns out, with the time-displaced teen Jean Grey out of the Phoenix Force's way, the cosmic entity has already resurrected the present-day adult Jean Grey. However, she does not recall her life as a mutant and an X-Man, and terrible visions from her previous life have left her unsure of the difference between reality and fiction. As she lies inside what appears to be a Phoenix Egg, the X-Men theorize that the strange psychic occurrences are subconscious cries for help from Jean Grey and that they must try to stop the Phoenix from merging with their old friend. Old Man Logan is able to make Jean remember her true life, and she learns the fate of her family and several of her friends, among them Cyclops. Facing the Phoenix Force, Jean finally convinces the cosmic entity to stop bringing her back and let her go. Alive once again, Jean is reunited with her friends as the Phoenix Force journeys back to space.
Restored to life, Jean gathers some of the greatest minds on Earth together so that she can read their minds to plan her next move. Recognizing that there has been a sudden surge in anti-mutant sentiment, to the point where there are plans to abort pregnancies if the mutant gene is detected, Jean announces her plans to establish a more official mutant nation, making it clear that she will not establish a geographic location for said nation as past examples make it clear that doing so just makes mutants a target. To support her in this goal, she assembles a team including Nightcrawler, X-23 and Namor, but is unaware that her actions are being observed by Cassandra Nova.
The adult Jean returns to using her original Marvel Girl codename and wears her second green-and-yellow Marvel Girl costume. She is part of a strike team sent to outer space to stop a satellite near the sun from being used as a Sentinel factory. Sentinels crush Jean's escape pod and she dies, only for her mind to be transferred into a cloned body created by Professor X. She also becomes a member of the Quiet Council of Krakoa, holding one of its Summer seats.
In "All-New X-Men", the present-day Beast goes to the past and brings a younger version of Jean to the present day along with the other original X-Men, in hopes of helping the present-day Cyclops see how far he has fallen. This version experiences a surge in her abilities due to the trauma of being brought to the future; the time travel also causes her suppressed telepathic powers to awaken much earlier in her life than they were supposed to. She also has a habit of reading people's minds without their permission, to the great frustration of her team. During the "Battle of the Atom" crossover, a future version of this Jean Grey, who never returned to the past and whose powers had grown beyond her control, returns to the present as Xorn, a member of the future Brotherhood of Mutants. Xorn perishes during the battle, but in the process the X-Men also learn that something is preventing the All-New X-Men from returning to the past. During this period, Jean reads the mind of the present-day Beast, who regrets never admitting his feelings for her; she then confronts the younger Beast and gives him a kiss, which creates problems with the younger Cyclops. She and her team later leave the school and relocate, and she forms a reluctant friendship with Emma Frost as Frost trains her psychic abilities.
Jean is later kidnapped by the Shi'ar and placed on trial for the destruction done by the Phoenix Force years earlier in the 2014 crossover story line "The Trial of Jean Grey". The All-New X-Men team up with the Guardians of the Galaxy to rescue Jean from the Shi'ar homeworld, but in the process Jean awakens a new power she never previously had: the ability to absorb massive amounts of psionic energy from others and combine her telepathy and telekinesis, which she uses to defeat the powerful Gladiator, leader of the Shi'ar.
While searching for new mutants, Jean and the All-New X-Men get teleported into the Ultimate Marvel universe. She teams up with Spider-Man (Miles Morales) to rescue Beast, who's been trapped by the local Dr. Doom. Before she is teleported back she gives Miles Morales a kiss. Upon their return to Earth 616, she and the All-New X-Men team up with the Guardians of the Galaxy a second time in search of The Black Vortex.
Following the reconstruction of reality after the Battleworld crisis, Jean parts ways with the rest of the time-displaced X-Men as she attempts to find her own life in the present, living as a normal civilian college student, until Storm recruits her to join her new team of X-Men to help protect mutants from the Terrigen. She mentions having broken up with Hank McCoy, considering him to be more of a brother. After the X-Men go to war against the Inhumans to destroy the Terrigen, Jean leaves Storm's team and attempts to return to her original timeline along with the rest of the time-displaced X-Men, only to realize that they are not from the Earth-616 timeline at all, leaving them stranded on Earth-616 with no idea which timeline they are originally from. With this new knowledge that they come from an unknown alternate timeline, Jean becomes the time-displaced X-Men's new leader, and they quit the X-Men in hopes of finding their place in the current world.
Jean is then approached by Magneto, who invites her and her team to join him in preserving Xavier's dream by defeating those who oppose it. Jean accepts and her team joins him, but in secret they train themselves so they can stop him should he ever revert to his villainous ways.
As part of Marvel's "ResurrXion" event, Jean Grey received her first-ever solo series. While on a solo mission against the Wrecking Crew, Jean receives a vision that the Phoenix Force is coming back to Earth. She goes to the rest of the X-Men to warn them, but as there have not been any Phoenix sightings since the X-Men went to war against the Avengers to decide the fate of the Phoenix, she has a hard time getting Beast, Captain Marvel, and Kitty Pryde to accept that her vision was real, even though they assure her that if the Phoenix ever does return, the X-Men and Avengers will come together and do all they can to stop it. Jean feels even less taken seriously when Beast begins examining her for signs of delusional hallucinations. Jean then meets with other former Phoenix hosts Colossus, Magik, Rachel Summers, Hope Summers, and Quentin Quire, the last of whom uses his powers to show her how the aftereffects of bonding with the Phoenix Force have individually affected each of them. A meeting with Namor helps Jean conclude that she can refuse the Phoenix and possibly even defeat it. After meeting with Thor and training with Psylocke, Jean learns how to create telekinetic weapons for her impending battle against the Phoenix.
Jean ends up sent back in time for unknown reasons and ends up meeting that timeline's Jean Grey shortly after she first becomes Phoenix. Time-displaced Jean attempts to ask Phoenix questions about the Phoenix Force but she dodges Jean's questions. Instead Phoenix takes Jean for a night out and shows off her powers. After witnessing Phoenix use her cosmic powers to fight off Galactus from consuming a defenseless planet, Jean contemplates warning Phoenix of her fate until an encounter with The Watcher stops her from doing so. The Watcher commends Jean and tells her that choosing to not change her future means that her ultimate fate is in her own hands whether or not she ends up hosting the Phoenix Force back in her present. As Jean returns to her present, Phoenix cryptically states that they will meet again.
Backed by a host of former Phoenix Force wielders, Emma Frost, Quentin Quire, Hope Summers, the Stepford Cuckoos, and even the spirit of the adult Jean Grey, the teen Jean tries to defy destiny and stop the Phoenix before it can take her over and bend her to its will. With the Phoenix Force now on Earth, the team realizes it will take far more power than they have to stop it. Although the young Jean is able to wound the Phoenix with the aid of Cable's Psi-mitar, the Phoenix seems too strong for anyone to overcome. Teen Jean eventually manages to push the cosmic force far away from her friends and allies so that a final battle can take place. However, both Jean Greys learn how wrong they were: the Phoenix was never coming for teen Jean, at least not as they believed. The Phoenix actually wants the adult Jean, but to claim her it needs the young Jean out of the way. Thus the force floods her body with flaming psychic energy, incinerating her from the inside out and leaving only a skeleton, in order to resurrect the adult Jean Grey, whom the Phoenix considers its one true host. After dying, however, the younger Jean finds herself in the White Hot Room despite never having been a Phoenix host. Angered, the Phoenix attempts to destroy her using mental manifestations of its past hosts, created from pieces of their life forces left in the Room. Jean realizes that she can control the White Hot Room against the Phoenix's wishes and commands the cosmic entity to resurrect her, which it does in order to be rid of her. After returning to Madripoor, she is approached by her resurrected older Earth-616 counterpart, much to her surprise.
Jean Grey is an Omega-level mutant, and at her highest and strongest potential was fully merged with the Phoenix Force and with it was able to defeat even Galactus.
Jean is a powerful empath, as she can feel and manipulate emotions of other people, as shown when her power first emerged as she felt her friend Annie Richardson slowly dying. Jean can also connect people's minds to the feelings of others and make them feel the pain they inflicted.
When her powers first manifested, Jean was unable to cope with her telepathic abilities, forcing Professor Charles Xavier to suppress her access to it altogether. Instead, he chose to train her in the use of her psychokinetic abilities while allowing her telepathy to grow at its natural rate before reintroducing it. When the Professor hid to prepare for the Z'Nox, he reopened Jean's telepathic abilities, which was initially explained by writers as Xavier 'sharing' some of his telepathy with her.
Jean is also one of the few telepaths skilled enough to communicate with animals of high intelligence, such as dolphins, dogs, and ravens. As a side effect of her telepathy, she has an eidetic memory. Through telepathic therapy with the comatose Jessica Jones, Jean was able to grant Jessica immunity to the Purple Man's mind-control abilities, despite his powers being chemical in nature rather than psychic. When Jean absorbed Psylocke's specialized telepathic powers, her own telepathy increased to the point that she could physically manifest it as a psionic firebird whose claws could inflict both physical and mental damage. She briefly developed a psychic shadow form like Psylocke's, with a gold Phoenix emblem over her eye instead of the Crimson Dawn mark possessed by Psylocke. Jean briefly lost her telekinesis to Psylocke during this exchange, but her telekinetic abilities later returned in full and at a far stronger level than before. It was later stated that Jean is an Omega-level telepath.
Jean possesses a high-level of telekinetic ability that enables her to psionically levitate and rapidly move about all manner of animate and inanimate matter. She can use her telekinetic abilities on herself or others to simulate the power of flight or levitation, stimulate molecules to increase friction, create protective force fields out of psychokinetic energy, or project her telekinetic energy as purely concussive force. The outer limits of her telekinetic power have never been clearly established, though she was capable of lifting approximately fifty tons of rubble with some strain.
Jean's younger self who had been brought from the past into the present by an older Hank McCoy eventually found an entirely new usage of her powers separate from the Phoenix Force. The teenage Marvel Girl learned she has the ability to harness ambient psychic energy and channel it into powerful blasts of force, which are a combination of both her telepathy and telekinesis. Its potency is such that she can match and overpower the likes of Gladiator, magistrate of the Shi'ar, with relative ease. When using this ability Jean's whole body glows with pink psychic energy, obscuring her human form.
Under the tutelage of Psylocke, the teenage Marvel Girl has learned to create psionic weapons that damage a target physically, mentally, or both. She has shown skill in constructing multiple types of psionic weapons that differ in size, length, and power, which she uses in combat.
The relationship between Jean Grey and the Phoenix Force (and the nature of the powers she has) is portrayed in a variety of ways throughout the character's history. In the initial plotline of the Phoenix being a manifestation of Jean's true potential, these powers are considered her own, as part of Claremont and Byrne's desire to create "the first cosmic superheroine". However, since the retcon of the Phoenix as a separate entity from Jean Grey, depictions of these powers vary; these include Jean being one of many hosts to the Phoenix and "borrowing" its "Phoenix powers" during this time, being a unique host to the Phoenix, and being one with the Phoenix. She is later described as the only one currently to be able to hold the title of "White Phoenix of the Crown" among the many past, present, and future hosts of the Phoenix. Jean — both young and adult versions — is also the only character ever to force the Phoenix against its own cosmic will to do anything while not presently a host to its powers. In one instance Jean forcibly ripped the Phoenix out of Emma Frost and imposed its status upon herself. Young Jean was able to keep her psyche anchored in the Phoenix's mind postmortem despite the Phoenix's own efforts to forcibly remove her after it murdered her. Jean then subsequently forced the Phoenix to resurrect her after manipulating the Phoenix's mental landscape against it.
The Phoenix Force also seems to render its host unaging and, at least in some adaptations, enhances the physical strength of its avatar to superhuman levels; in certain incarnations, Jean, namely while acting as Dark Phoenix, seemed to possess some level of superhuman strength.
For one reason or another, Jean Grey (both young and old) has, on more than one occasion, been repeatedly resurrected by either the Phoenix or apparently her sheer force of will "without" Phoenix. In some depictions, these resurrections are immediately after she or whoever she is reviving is killed, while other depictions indicate that a resurrection must occur at a "correct" time, sometimes taking a century. During the height of the Psych Wars, Young Jean was able to forcibly make the Phoenix Force restore her to life, despite the Phoenix's adamant resolve not to do so, completely recreating her body after it had been vaporized. Some time later, after her body was taken over and completely devoured by a Poison, a small part of Jean's mind survived and, despite itself, was able to infect the whole Poison Hive and destroy it from the inside out, subsequently using nothing but her mind to reconstruct her body. This leaves Jean believing that she may not even be human anymore. This is not the first time Jean was resurrected without the Phoenix; in one instance, she was even able to fully resurrect herself after being clinically dead completely independent of the Phoenix Force.
Jean Grey is a trained pilot and proficient unarmed combatant. She also has some degree of teaching ability, experience as a fashion model, and training in psychology.
As a fictional character in the Marvel Universe, Jean Grey appears in various alternate plot lines and fictional universes.
She was ranked third in "Comics Buyer's Guide's" 100 Sexiest Women in Comics list.
Jean Grey appears in various media, such as animated programs, video games, and films, and is sometimes referenced in pop culture.
Jack Brabham
Sir John Arthur Brabham (2 April 1926 – 19 May 2014) was an Australian racing driver who was Formula One World Champion in 1959, 1960, and 1966. He was a founder of the Brabham racing team and race car constructor that bore his name.
Brabham was a Royal Australian Air Force flight mechanic and ran a small engineering workshop before he started racing midget cars in 1948. His successes with midgets in Australian and New Zealand road racing events led to him moving to Britain to further his racing career. There he became part of the Cooper Car Company's racing team, building as well as racing cars. He contributed to the design of the mid-engined cars that Cooper introduced to Formula One and the Indianapolis 500, and won the Formula One World Championship in 1959 and 1960. In 1962 he established his own Brabham marque with fellow Australian Ron Tauranac, which in the 1960s became the largest manufacturer of customer racing cars in the world. In the 1966 Formula One season Brabham became the first, and still the only, man to win the Formula One World Championship driving one of his own cars. He was the last surviving World Champion of the 1950s.
Brabham retired to Australia after the 1970 Formula One season, where he bought a farm and maintained business interests, which included the Engine Developments racing engine manufacturer and several garages.
John Arthur 'Jack' Brabham was born on 2 April 1926 in Hurstville, New South Wales, then a commuter town outside Sydney. Brabham was involved with cars and mechanics from an early age. At the age of 12, he learned to drive the family car and the trucks of his father's grocery business. Brabham attended technical college, studying metalwork, carpentry, and technical drawing.
Brabham's early career continued the engineering theme. At the age of 15 he left school to work, combining a job at a local garage with an evening course in mechanical engineering. Brabham soon branched out into his own business selling motorbikes, which he bought and repaired for sale, using his parents' back veranda as his workshop.
One month after his 18th birthday on 19 May 1944 Brabham enlisted into the Royal Australian Air Force (RAAF). Although he was keen on becoming a pilot, there was already a surplus of trained aircrew and the Air Force instead put his mechanical skills to use as a flight mechanic, of which there was a wartime shortage. He was based at RAAF Station Williamtown, where he maintained Bristol Beaufighters at No. 5 Operational Training Unit. On his 20th birthday, 2 April 1946, Brabham was discharged from the RAAF with the rank of leading aircraftman. He then started a small service, repair, and machining business in a workshop built by his uncle on a plot of land behind his grandfather's house.
Brabham started racing after an American friend, Johnny Schonberg, persuaded him to watch a midget car race. Midget racing was a category for small open-wheel cars racing on dirt ovals. It was popular in Australia, attracting crowds of up to 40,000. Brabham records that he was not taken with the idea of driving, being convinced that the drivers "were all lunatics" but he agreed to build a car with Schonberg.
At first Schonberg drove the homemade device, powered by a modified JAP motorcycle engine built by Brabham in his workshop. In 1948, Schonberg's wife persuaded him to stop racing and on his suggestion Brabham took over. He almost immediately found that he had a knack for the sport, winning on his third night's racing. From there he was a regular competitor and winner in Midgets (known as Speedcars in Australia) at tracks such as Sydney's Cumberland Speedway, the Sydney Showground, and the Sydney Sports Ground, as well as interstate tracks such as Adelaide's Kilburn and Rowley Park speedways and the Ekka in Brisbane. Brabham has since said that it was "terrific driver training. You had to have quick reflexes: in effect you lived—or possibly died—on them." Due to the time required to prepare the car, the sport also became his living. Brabham won the 1948 Australian Speedcar Championship, the 1949 Australian and South Australian Speedcar championships, and the 1950–1951 Australian championship with the car.
After successfully running the midget at some hillclimbing events in 1951, Brabham became interested in road racing. He bought and modified a series of racing cars from the Cooper Car Company, a British constructor, and from 1953 concentrated on this form of racing, in which drivers compete on closed tarmac circuits. He was supported by his father and by the Redex fuel additive company, although his commercially aware approach—including the title "RedeX Special" painted on the side of his Cooper-Bristol—did not go down well with the Confederation of Australian Motor Sport (CAMS), which banned the advertisement. Brabham competed in Australia and New Zealand until early 1955, taking "a long succession of victories", including the 1953 Queensland Road Racing championship. During this time, he picked up the nickname "Black Jack", which has been variously attributed to his dark hair and stubble, to his "ruthless" approach on the track, and to his "propensity for maintaining a shadowy silence". After the 1954 New Zealand Grand Prix, Brabham was persuaded by Dean Delamont, competitions manager of the Royal Automobile Club in the United Kingdom, to try a season of racing in Europe, then the international centre of road racing.
Upon arriving in Europe on his own in early 1955, Brabham based himself in the UK, where he bought another Cooper to race in national events. His crowd-pleasing driving style initially betrayed his dirt track origins: as he put it, he took corners "by using full [steering] lock and lots of throttle". Visits to the Cooper factory for parts led to a friendship with Charlie and John Cooper, who told the story that after many requests for a drive with the factory team, Brabham was given the keys to the transporter taking the cars to a race. Brabham soon "seemed to "merge" into Cooper Cars": he was not an employee, but he started working at Cooper daily from the midpoint of the 1955 season building a Bobtail mid-engined sports car, intended for Formula One, the top category of single seater racing. He made his Grand Prix debut at the age of 29 driving the car at the 1955 British Grand Prix. It had a 2-litre engine, half a litre less than permitted, and ran slowly with a broken clutch before retiring. Later in the year Brabham, again driving the Bobtail, tussled with Stirling Moss for third place in a non-championship Formula One race at Snetterton. Although Moss finished ahead, Brabham saw the race as a turning point, proving that he could compete at this level. He shipped the Bobtail back to Australia, where he used it to win the 1955 Australian Grand Prix before selling it to help fund a permanent move to the UK the following year with his wife Betty and their son Geoff.
Brabham briefly and unsuccessfully campaigned his own second hand Formula One Maserati 250F during 1956, but his season was saved by drives for Cooper in sports cars and Formula Two, the junior category to Formula One. At that time, almost all racing cars had their engines mounted at the front but Coopers were different, having the engine placed behind the driver, which improved their handling. In 1957, Brabham drove another mid-engined Cooper, again only fitted with a 2-litre engine, at the Monaco Grand Prix. He avoided a large crash at the first corner and was running third towards the end of the race when the fuel pump mount failed. After more than three hours of racing, the exhausted Brabham, who "hated to be beaten", pushed the car to the line to finish sixth. The following year, he was Autocar Formula Two champion in a Cooper, while continuing to score minor points-scoring positions with the small-engined Coopers in the World Drivers' Championship and driving for Aston Martin in Sportscars. His schedule necessitated a considerable amount of travel on the roads of Europe. Brabham's driving on public roads was described as "safe as houses", unlike many of his contemporaries—on the way back from the 1957 Pescara Grand Prix, passenger Tony Brooks took over driving after Brabham refused to overtake a long line of lorries. In late 1958, Brabham rekindled his interest in flying and began taking lessons. He bought his own plane and on gaining his licence began to make heavy use of it piloting himself, his family, and members of his team around Europe to races.
In 1959, Cooper obtained 2.5-litre engines for the first time and Brabham put the extra power to good use by winning his first world championship race at the season-opening Monaco Grand Prix after Jean Behra's Ferrari and Stirling Moss's Cooper failed. More podium places were followed by a win in the British Grand Prix at Aintree after Brabham preserved his tyres to the end of the race, enabling him to finish ahead of Moss who had to pit to replace worn tyres. This gave him a 13-point championship lead with four races to go. At the Portuguese Grand Prix at Monsanto Park, Brabham was chasing race leader Moss when a backmarker moved over on him and launched the Cooper into the air. The airborne car hit a telegraph pole, throwing Brabham onto the track, where he narrowly avoided being hit by one of his teammates but escaped with no serious injury. With two wins each, Brabham, Moss, and Ferrari's Tony Brooks were all capable of winning the championship at the final event of the season, the United States Grand Prix at Sebring. Brabham was among those up until 1 am the morning before the race working on the Cooper team cars. The next day, after pacing himself behind Moss, who soon retired with a broken gearbox, he led almost to the end of the race before running out of fuel on the last lap. He again pushed the car to the finish line to place fourth, although in the event this was unnecessary as his other title rival, Brooks, finished only third. His championship-winning margin over Brooks was four points. According to Gerald Donaldson, "some thought [his title] owed more to stealth than skill, an opinion at least partly based on Brabham's low-key presence."
Despite his success with Cooper, Brabham was sure he could do better. He considered buying Cooper in partnership with Roy Salvadori and then in late 1959 he asked his friend Ron Tauranac to come to the UK and work with him, producing upgrade kits for Sunbeam Rapier and Triumph Herald road cars at his car dealership, Jack Brabham Motors, but with the long-term aim of designing racing cars. Brabham continued to drive for Cooper, but on the long flight back from the 1960 season-opening Argentine Grand Prix, he had a heart-to-heart with John Cooper. John's father Charlie and the designer Owen Maddock had been reluctant to update their car, but although a Cooper had won in Argentina, other cars had been faster before they broke down. Brabham helped design the more advanced Cooper T53, including advice from Tauranac. Brabham spun the new car out of the next championship race, the Monaco Grand Prix, but then embarked on a series of five straight victories. He won from the front at the Dutch, French, and Belgian Grands Prix, where title rival Moss was badly injured in a practice accident that put him out for two months. Two other drivers were killed during the race. At the British Grand Prix, Brabham was closing on Graham Hill's BRM before Hill spun off, leaving Brabham the victory. He then came back from eighth place to second at the Portuguese Grand Prix after sliding off on tramlines and won after race leader John Surtees crashed. Brabham's points total was put out of reach when the British teams withdrew from the Italian GP on safety grounds. Mike Lawrence writes that Brabham's expertise in setting up the cars was a significant factor in Cooper's 1960 drivers' and constructors' titles.
Coventry Climax were late in producing the smaller 1.5-litre engine required for the 1961 season and the Cooper-Climaxes were outclassed by new mid-engined cars from Porsche, Lotus, and championship-winners Ferrari. Brabham scored only three points and finished 11th in the championship. He had a little more success in the non-championship Formula One races, where he ran his own private Coopers and took three victories at Snetterton (26 March), Brussels (9 April), and Aintree (22 April).
The same year, Brabham entered the famous Indianapolis 500 oval race for the first time in a modified version of the Formula One Cooper. It had a 2.7-litre Climax engine that gave away considerable power to the 4.4-litre Offenhauser engines used by the front-engined roadsters driven by all the other entrants. Jack qualified a respectable 17th at 145.144 mph (pole winner Eddie Sachs qualified at 147.481 mph), and while the front-engined roadsters were much faster on the long front and back straights, the rear-engined Cooper's superior handling through the turns and the shorter north and south sections kept the reigning World Champion competitive. Brabham ran as high as third before finishing ninth, completing all 200 laps. Although most of the doubters in the American Indycar scene claimed that rear-engined cars were for drivers who liked to be pushed around, as Brabham put it, it "triggered the rear-engined revolution at Indy", and within five years most of the cars that raced at Indianapolis would be rear-engined.
Brabham and Tauranac set up a company called Motor Racing Developments (MRD), which produced customer racing cars, while Brabham himself continued to race for Cooper. MRD produced cars for Formula Junior, with the first one appearing in mid-1961. Brabham left Cooper in 1962 to drive for his own team: the Brabham Racing Organisation, using cars built by Motor Racing Developments. A newly introduced engine limit in Formula One of 1500 cc did not suit Brabham and he did not win a single race with a 1500 cc car. His team suffered poor reliability during this period and motorsport authors Mike Lawrence and David Hodges have said that Brabham's reluctance to spend money may have cost the team results, a view echoed by Tauranac. During the 1965 season, Brabham started to consider retirement to manage his team. Dan Gurney took the lead driver role, and the team's first world championship win, while Brabham gave up his car to several other drivers towards the end of the season. At the end of the season, Gurney announced his intention to leave and set up his own team and Brabham decided to carry on.
In 1966, a new 3-litre formula was created for Formula One. The new engines under development by other suppliers all had at least 12 cylinders and proved difficult to develop, being heavy and unreliable. Brabham took a different approach to the problem of obtaining a suitable engine: he persuaded Australian engineering company Repco to develop a new 3-litre eight-cylinder engine for him. Repco had no experience in designing complete engines. Brabham had identified a supply of suitable engine blocks obtained from Oldsmobile's aluminium alloy 215 engine and persuaded the company that an engine could be designed around the block, largely using existing components. Brabham and Repco were aware that the engine would not compete in terms of outright power, but felt that a lightweight, reliable engine could achieve good championship results while other teams were still making their new designs reliable.
The combination of the Repco engine and the Brabham BT19 chassis designed by Tauranac worked. At the French Grand Prix at Reims-Gueux, Jack Brabham took his first Formula One world championship win since 1960 and became the first man to win such a race in a car of his own construction. Only his two former teammates, Bruce McLaren and Dan Gurney, have since matched this achievement. It was the first in a run of four straight wins for the Australian veteran. The 40-year-old Brabham was annoyed by press stories about his age and, in a highly uncharacteristic stunt, at the Dutch Grand Prix he hobbled to his car on the starting grid before the race wearing a long false beard and leaning on a cane before going on to win the race. Brabham confirmed his third championship at the Italian Grand Prix and became the only driver to win the Formula One World Championship in a car that carried his own name.
The season also saw the fruition of Brabham's relationship with Japanese engine manufacturer Honda in Formula Two. After a generally unsuccessful season in 1965, Honda revised their 1-litre engine completely. Brabham won ten of the year's 16 European Formula Two races in his Brabham-Honda. There was no European Formula Two championship that year, but Brabham won the "Trophées de France", a championship consisting of six of the French Formula Two races.
In 1967, the Formula One title went to Brabham's teammate Denny Hulme. Hulme had better reliability through the year, possibly due to Brabham's desire to try new parts first.
Brabham took pole position in the first two rounds, but mechanical problems halted his chances of victory. He spun numerous times in South Africa, and at Monaco his engine blew up at the start, the win going to his teammate Denny Hulme. At the Dutch Grand Prix, he scored his first podium of the season with second place, behind Scotsman Jim Clark. He retired from the Belgian Grand Prix with another blown engine, then rebounded by winning the French Grand Prix at the Bugatti Circuit in Le Mans. He came fourth at the British Grand Prix, behind Chris Amon, his teammate Hulme, and Clark. At the German Grand Prix, he had a huge battle with Amon and eventually finished ahead of the New Zealander by only half a second; Hulme was the winner. At the first ever Canadian Grand Prix at Mosport Park, he took a commanding win ahead of Hulme in cold and rainy conditions. At the Italian Grand Prix at Monza, Brabham finished second, only a few car lengths behind John Surtees, who took his last Grand Prix win. Hulme retired from the race, cutting the gap between the two to 3 points as the circus headed for the United States Grand Prix at Watkins Glen. Brabham outqualified his teammate and finished fifth in the race, but with Hulme on the podium the championship chances were looking slim for Black Jack as the circus went to Mexico for the championship-deciding final race of the season. Once again he outqualified his teammate, and he needed to win with Hulme fifth or lower. But Jim Clark was simply too fast all weekend and dominated the race from pole to win, with Brabham finishing over 1 minute and 25 seconds behind. Hulme finished third, and so the New Zealander won the championship while Brabham settled for second place. The team secured the Constructors' Championship with 67 points, 23 ahead of Lotus on 44.
Brabham raced alongside his teammate Jochen Rindt during the 1968 season. It wasn't a good season for him. He retired from the first seven races, before scoring two points for fifth place at the German Grand Prix. He retired from the remaining four races. At the end of the year, he fulfilled a desire to fly from Britain to Australia in a small twin-engined Beechcraft Queen Air. Partway through the 1969 season, Brabham suffered serious injuries to his foot in a testing accident. He returned to racing before the end of the year, but promised his wife that he would retire after the season finished and sold his share of the team to Tauranac.
Finding no top drivers available despite coming close to bringing Rindt back to the team, Brabham decided to race for one more year. He began auspiciously, winning the first race of the season, the South African Grand Prix, and then led the third race, the Monaco Grand Prix, until the very last turn of the last lap. Brabham was about to hold off the onrushing Rindt (the eventual 1970 F1 champion) when his front wheels locked in a skid on the sharp right turn only yards from the finish and he ended up second. While leading at the British Grand Prix at Brands Hatch, he ran out of fuel at Clearways and Rindt passed him to take the win while Brabham coasted to the finish in second place. After the 13th and final race of the season, the Mexican Grand Prix, Brabham did retire. He tied Jackie Stewart for fifth in the points standings in a season he drove at the age of 44. Brabham also drove for the works Matra team during the 1970 World Sportscar Championship season and won the final race of the season, and his final top-level race, at the Paris 1000 km in October that year. He then made a complete break from racing and returned to Australia, to the relief of his wife, who had been "scared stiff" each time he drove.
Following his retirement, Brabham and his family moved to a farm between Sydney and Melbourne. Brabham says that he "never really wanted" the move, but his wife hoped their sons could grow up away from motorsport. As well as running the new venture, he continued his interest in businesses in the UK and Australia, including a small aviation company and garages and car dealerships. He also set up Engine Developments Ltd. in 1971 with John Judd, who had worked for Brabham on the Repco engine project in the mid 1960s. The company builds engines for racing applications. Brabham was also a shareholder in Jack Brabham Engines Pty Ltd., an Australian company marketing Jack Brabham memorabilia.
The Brabham team continued in Formula One, winning two further Drivers' Championships in the early 1980s under Bernie Ecclestone's ownership. Although the original organisation went into administration in 1992, the name was attached to a German company selling cars and accessories in 2008, and an unsuccessful attempt to set up a new Formula One team the following year. On both occasions the Brabham family, which was unconnected to the ventures, announced its intention to take legal advice. In September 2014, Brabham's youngest son David announced Project Brabham, a new team planning to use a crowdsourcing business model to enter the 2015 FIA World Endurance Championship in the LMP2 category.
Despite his three titles, and although John Cooper considered him "the greatest", Formula One journalist Adam Cooper wrote in 1999 that Brabham is never listed among the Top 10 of all time, noting that "Stirling Moss and Jim Clark dominated the headlines when Jack was racing, and they still do". Brabham was the first post-war racing driver to be knighted when he received the honour in 1978 for services to motorsport. He has received several other honours and in 2011, the suburb of Brabham in Perth, Western Australia, was named after him. A race circuit and an automotive training school were also named after him in the early 2010s.
In retirement, Brabham continued to be involved in motorsport events, appearing at contemporary and historic motorsport events around the world where he often drove his former Cooper and Brabham cars until the early 2000s. In 1999, after competing at the Goodwood Revival at the age of 73 he commented that driving stopped him getting old. Despite a large accident at the 2000 Revival, the first racing accident to put him in hospital overnight, he continued to drive until at least 2004. By the late 2000s, ill-health was preventing him from driving in competition. In addition to the deafness caused by years of motor racing without adequate ear protection, his eyesight was reduced due to macular degeneration and he had kidney disease for which by 2009 he was receiving dialysis three times a week. Nonetheless, that year he attended a celebration of the 50th anniversary of his first world championship at the Phillip Island Classic festival of motorsport, and in 2010 flew to Bahrain with most of the other Formula One world Drivers' Champions for a celebration of 60 years of the Formula One world championship. Brabham was the oldest surviving F1 champion.
Brabham and Betty had three sons together: Geoff, Gary, and David. All three became involved in motorsport, with support from Brabham in their early years. Between them, they have won sportscar and single-seater races and championships. Geoff was an Indycar and sportscar racer who won five North American sportscar championships as well as the 24 Hours of Le Mans, while David competed in Formula One for the Brabham team and has also won the Le Mans race as well as three Japanese and North American sportscar titles. Gary also drove briefly in Formula One, although his F1 career consisted of two DNPQs for the Life team. Brabham and Betty divorced in 1994 after 43 years. Brabham married his second wife, Margaret, in 1995, and they lived on the Gold Coast, Queensland. Brabham's grandson Matthew (son of Geoff) graduated from karts in 2010 and won two ladders of the Mazda Road to Indy, eventually racing in the 2016 Indianapolis 500 and winning two Stadium Super Trucks championships. Another grandson, Sam, the son of David and Lisa, whose brother Mike was also an F1 driver, stepped up to car racing from karts in 2013 when he made his debut in the British Formula Ford Championship. The Brabham family have been involved in world-class motorsport for over 60 years.
Brabham made his last public appearance on 18 May 2014, appearing with one of the cars he built. He died at his home on the Gold Coast on 19 May 2014, aged 88, following a lengthy battle with liver disease. He was eating breakfast with his wife, Margaret, when he died. In a statement on the family's website, Brabham's son David confirmed his father's death.
"It's a very sad day for all of us", David Brabham stated. "My father passed away peacefully at home at the age of 88 this morning. He lived an incredible life, achieving more than anyone would ever dream of and he will continue to live on through the astounding legacy he leaves behind."
At the time of his death, Brabham was the last surviving world champion from the 1950s era.
At his request, his ashes were scattered at the Tamborine Mountain Skywalk in Queensland, Australia, by his wife, Lady Margaret Brabham, on 4 September 2014. Brabham was a frequent visitor to the Skywalk.
Jones calculus
In optics, polarized light can be described using the Jones calculus, invented by R. C. Jones in 1941. Polarized light is represented by a Jones vector, and linear optical elements are represented by "Jones matrices". When light crosses an optical element, the resulting polarization of the emerging light is found by taking the product of the Jones matrix of the optical element and the Jones vector of the incident light.
Note that Jones calculus is only applicable to light that is already fully polarized. Light which is randomly polarized, partially polarized, or incoherent must be treated using Mueller calculus.
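This matrix-times-vector operation can be sketched in plain Python (a minimal illustration; the helper names `apply_jones` and `intensity` are made up for this example, not from any library):

```python
import math

def apply_jones(matrix, vector):
    """Multiply a 2x2 Jones matrix by a Jones vector (entries may be complex)."""
    (a, b), (c, d) = matrix
    ex, ey = vector
    return (a * ex + b * ey, c * ex + d * ey)

def intensity(vector):
    """Light intensity is proportional to |E_x|^2 + |E_y|^2."""
    ex, ey = vector
    return abs(ex) ** 2 + abs(ey) ** 2

# Incident light: linear polarization at +45 degrees, unit intensity.
incident = (1 / math.sqrt(2), 1 / math.sqrt(2))

# Jones matrix of an ideal horizontal linear polarizer.
h_polarizer = ((1, 0), (0, 0))

emerging = apply_jones(h_polarizer, incident)
print(round(intensity(emerging), 12))  # 0.5: half the intensity passes
```

The halved intensity is just Malus's law for a polarizer at 45 degrees to the incident polarization.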
The Jones vector describes the polarization of light in free space or another homogeneous isotropic non-attenuating medium, where the light can be properly described as transverse waves. Suppose that a monochromatic plane wave of light is travelling in the positive "z"-direction, with angular frequency "ω" and wavevector k = (0,0,"k"), where the wavenumber "k" = "ω"/"c". Then the electric and magnetic fields E and H are orthogonal to k at each point; they both lie in the plane "transverse" to the direction of motion. Furthermore, H is determined from E by 90-degree rotation and a fixed multiplier depending on the wave impedance of the medium. So the polarization of the light can be determined by studying E. The complex amplitude of E is written

  E(z, t) = ( E_0x e^{i(kz − ωt + φ_x)},  E_0y e^{i(kz − ωt + φ_y)},  0 )
Note that the physical E field is the real part of this vector; the complex multiplier serves up the phase information. Here i is the imaginary unit with i² = −1.
The Jones vector is then

  ( E_0x e^{iφ_x},  E_0y e^{iφ_y} )

written as a column vector, with the common propagation factor e^{i(kz − ωt)} dropped.
Thus, the Jones vector represents the amplitude and phase of the electric field in the "x" and "y" directions.
The sum of the squares of the absolute values of the two components of Jones vectors is proportional to the intensity of light. It is common to normalize it to 1 at the starting point of calculation for simplification. It is also common to constrain the first component of the Jones vectors to be a real number. This discards the overall phase information that would be needed for calculation of interference with other beams.
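The normalization conventions just described can be sketched as follows (a minimal example; the helper name `normalize` is illustrative, not a library function):

```python
import cmath

def normalize(ex, ey):
    """Rescale a Jones vector to unit intensity, then rotate away the
    overall phase so the first nonzero component is real (this discards
    the global phase information, as noted in the text)."""
    norm = (abs(ex) ** 2 + abs(ey) ** 2) ** 0.5
    ex, ey = ex / norm, ey / norm
    phase = cmath.phase(ex if ex != 0 else ey)
    factor = cmath.exp(-1j * phase)
    return ex * factor, ey * factor

ex, ey = normalize(3j, 4j)   # arbitrary un-normalized vector
# Unit intensity, and the first component is now (essentially) real.
assert abs(abs(ex) ** 2 + abs(ey) ** 2 - 1) < 1e-12
assert abs(ex.imag) < 1e-12 and ex.real > 0
```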
Note that all Jones vectors and matrices in this article employ the convention that the phase of the light wave is given by φ = kz − ωt, a convention used by Hecht. Under this convention, an increase in φ_x (or φ_y) indicates retardation (delay) in phase, while a decrease indicates advance in phase. For example, a Jones vector component of i (= e^{iπ/2}) indicates retardation by π/2 (or 90 degrees) compared to 1 (= e^{i0}). Circular polarisation described under Jones' convention is called "from the point of view of the receiver". Collett uses the opposite definition for the phase (φ = ωt − kz); circular polarisation described under Collett's convention is called "from the point of view of the source". The reader should be wary of the choice of convention when consulting references on the Jones calculus.
The following table gives the 6 common examples of normalized Jones vectors.
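These six standard vectors can be written out in Python (a sketch; the signs of the circular entries assume the receiver-point-of-view convention discussed above and swap under Collett's convention):

```python
import math

s = 1 / math.sqrt(2)
jones_vectors = {
    "linear horizontal": (1, 0),
    "linear vertical": (0, 1),
    "linear +45 degrees": (s, s),
    "linear -45 degrees": (s, -s),
    # Circular signs follow the receiver-point-of-view (Hecht) convention;
    # they are reversed under Collett's source-point-of-view convention.
    "right circular": (s, -1j * s),
    "left circular": (s, 1j * s),
}

# Every entry is normalized to unit intensity.
for name, (ex, ey) in jones_vectors.items():
    assert abs(abs(ex) ** 2 + abs(ey) ** 2 - 1) < 1e-12, name
```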
A general vector that points to any place on the surface is written as a ket |ψ⟩. When employing the Poincaré sphere (also known as the Bloch sphere), the basis kets (|0⟩ and |1⟩) must be assigned to opposing (antipodal) pairs of the kets listed above. For example, one might assign |0⟩ = |H⟩ and |1⟩ = |V⟩. These assignments are arbitrary. Opposing pairs are
The polarization of any point not equal to |R⟩ or |L⟩ and not on the circle that passes through |H⟩, |D⟩, |V⟩ and |A⟩ is known as elliptical polarization.
The Jones matrices are operators that act on the Jones vectors defined above. These matrices are implemented by various optical elements such as lenses, beam splitters, mirrors, etc. Each matrix represents projection onto a one-dimensional complex subspace of the Jones vectors. The following table gives examples of Jones matrices for polarizers:
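A common entry in such tables is the ideal linear polarizer with its transmission axis at angle θ from horizontal; a sketch (the function names are illustrative), together with a check that the matrix really is a projection:

```python
import math

def linear_polarizer(theta):
    """Jones matrix for an ideal linear polarizer whose transmission
    axis makes angle theta with the horizontal."""
    c, s = math.cos(theta), math.sin(theta)
    return ((c * c, c * s), (c * s, s * s))

def matmul2(m, n):
    """Product of two 2x2 matrices stored as nested tuples."""
    return tuple(
        tuple(sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2))
        for i in range(2)
    )

p = linear_polarizer(math.radians(30))
# An ideal polarizer is a projection, so applying it twice changes nothing.
pp = matmul2(p, p)
assert all(abs(pp[i][j] - p[i][j]) < 1e-12 for i in range(2) for j in range(2))
```

At θ = 0 this reduces to ((1, 0), (0, 0)), the horizontal polarizer.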
Phase retarders introduce a phase shift between the vertical and horizontal component of the field and thus change the polarization of the beam. Phase retarders are usually made out of birefringent uniaxial crystals such as calcite, MgF2 or quartz. Uniaxial crystals have one crystal axis that is different from the other two crystal axes (i.e., "ni" ≠ "nj" = "nk"). This unique axis is called the extraordinary axis and is also referred to as the optic axis. An optic axis can be the fast or the slow axis for the crystal depending on the crystal at hand. Light travels with a higher phase velocity along an axis that has the smallest refractive index and this axis is called the fast axis. Similarly, an axis which has the largest refractive index is called a slow axis since the phase velocity of light is the lowest along this axis. "Negative" uniaxial crystals (e.g., calcite CaCO3, sapphire Al2O3) have "ne" < "no" so for these crystals, the extraordinary axis (optic axis) is the fast axis, whereas for "positive" uniaxial crystals (e.g., quartz SiO2, magnesium fluoride MgF2, rutile TiO2), "ne" > "no" and thus the extraordinary axis (optic axis) is the slow axis.
Any phase retarder with fast axis equal to the x- or y-axis has zero off-diagonal terms and thus can be conveniently expressed as

  ( e^{iφ_x}    0        )
  ( 0           e^{iφ_y} )
where φ_x and φ_y are the phase offsets of the electric fields in the x and y directions respectively. In the phase convention φ = kz − ωt, define the relative phase between the two waves as ε = φ_y − φ_x. Then a positive ε (i.e. φ_y > φ_x) means that E_y doesn't attain the same value as E_x until a later time, i.e. E_x leads E_y. Similarly, if ε < 0, then E_y leads E_x.
For example, if the fast axis of a quarter wave plate is horizontal, then the phase velocity along the horizontal direction is ahead of the vertical direction, i.e., E_x leads E_y. Thus, φ_x < φ_y, which for a quarter wave plate yields ε = π/2.
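The quarter-wave-plate behaviour just described can be checked numerically (a sketch; `retarder` is an illustrative helper, and the phases follow the e^{i(kz − ωt)} convention used in the text):

```python
import cmath
import math

def retarder(phi_x, phi_y):
    """Phase retarder with fast axis along x or y: a diagonal Jones
    matrix of pure phase factors."""
    return ((cmath.exp(1j * phi_x), 0), (0, cmath.exp(1j * phi_y)))

# Quarter-wave plate, horizontal fast axis: relative phase pi/2.
qwp = retarder(0, math.pi / 2)

s = 1 / math.sqrt(2)
ex_in, ey_in = s, s                      # linear polarization at +45 degrees
(a, b), (c, d) = qwp
ex_out, ey_out = a * ex_in + b * ey_in, c * ex_in + d * ey_in

# Equal magnitudes with a 90-degree relative phase: circular polarization.
assert abs(abs(ex_out) - abs(ey_out)) < 1e-12
assert abs(cmath.phase(ey_out / ex_out) - math.pi / 2) < 1e-12
```

This is the textbook recipe for producing circularly polarized light: 45-degree linear light through a quarter-wave plate.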
In the opposite convention φ = ωt − kz, define the relative phase as ε = φ_x − φ_y. Then ε > 0 means that E_y doesn't attain the same value as E_x until a later time, i.e. E_x leads E_y.
The special expressions for the phase retarders can be obtained by taking suitable parameter values in the general expression for a birefringent material. In the general expression:
Note that for linear retarders, φ = 0 and for circular retarders, φ = ±π/2, θ = π/4. In general, for elliptical retarders, φ takes on values between −π/2 and π/2.
Assume an optical element has its optic axis perpendicular to the surface vector for the plane of incidence and is rotated about this surface vector by angle "θ/2" (i.e., the principal plane, through which the optic axis passes, makes angle "θ/2" with respect to the plane of polarization of the electric field of the incident TE wave). Recall that a half-wave plate rotates polarization as "twice" the angle between incident polarization and optic axis (principal plane). Therefore, the Jones matrix for the rotated polarization state, M("θ"), is

  M(θ) = ( cos θ    sin θ )
         ( sin θ   −cos θ )
This agrees with the expression for a half-wave plate in the table above. These rotations are identical to beam unitary splitter transformation in optical physics given by
where the primed and unprimed coefficients represent beams incident from opposite sides of the beam splitter. The reflected and transmitted components acquire phases "θr" and "θt", respectively. The requirements for a valid representation of the element are
\[
\theta_t - \theta_r + \theta_{t'} - \theta_{r'} = \pm\pi
\]
and
\[
r^* t' + t^* r' = 0.
\]
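These requirements can be verified numerically. The sketch below (assuming NumPy) uses the standard symmetric 50:50 beam splitter, r = r' = i/√2 and t = t' = 1/√2; these specific values are an assumption for illustration, not taken from the text.

```python
import numpy as np

# Symmetric lossless 50:50 beam splitter: the reflected amplitudes pick
# up a 90-degree phase relative to the transmitted ones.
r = rp = 1j / np.sqrt(2)   # r and r' (reflection from either side)
t = tp = 1.0 / np.sqrt(2)  # t and t' (transmission from either side)

# Energy conservation: |r|^2 + |t|^2 = 1.
assert np.isclose(abs(r)**2 + abs(t)**2, 1.0)

# Orthogonality condition r* t' + t* r' = 0.
assert np.isclose(np.conj(r) * tp + np.conj(t) * rp, 0.0)

# Phase condition: theta_t - theta_r + theta_t' - theta_r' = +/- pi.
phase_sum = (np.angle(t) - np.angle(r)) + (np.angle(tp) - np.angle(rp))
assert np.isclose(abs(phase_sum), np.pi)

# Equivalently, the full 2x2 matrix of coefficients is unitary.
B = np.array([[r, tp], [t, rp]])
assert np.allclose(B.conj().T @ B, np.eye(2))
print("beam splitter conditions satisfied")
```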
This would involve a three-dimensional rotation matrix. See Russell A. Chipman and Garam Yun for work done on this.
The angle of the polarization ellipse of a Jones vector can be calculated as below, where \(\theta\) is the angle of either the major or the minor axis and \(R\) is a reflection matrix. | https://en.wikipedia.org/wiki?curid=16565 |
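A short sketch of the ellipse-angle computation (assuming NumPy), using the standard closed form \(\tan 2\theta = 2\,\mathrm{Re}(a^* b)/(|a|^2 - |b|^2)\) for a Jones vector \((a, b)^T\) rather than the reflection-matrix form referred to above:

```python
import numpy as np

def ellipse_angle(v):
    """Orientation angle of the polarization ellipse of Jones vector
    v = (a, b), via tan(2*theta) = 2*Re(conj(a)*b) / (|a|^2 - |b|^2)."""
    a, b = v
    return 0.5 * np.arctan2(2.0 * (np.conj(a) * b).real,
                            abs(a)**2 - abs(b)**2)

# Linear polarization at 30 degrees -> ellipse angle of 30 degrees.
lin30 = np.array([np.cos(np.radians(30)), np.sin(np.radians(30))])
print(np.degrees(ellipse_angle(lin30)))   # ~30.0

# Elliptical light, x:y amplitudes 2:1 with a 90-degree phase
# difference: the major axis lies along x, so the angle is 0.
ell = np.array([2.0, 1.0j]) / np.sqrt(5)
print(np.degrees(ellipse_angle(ell)))     # 0.0
```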
Josip Broz Tito
Josip Broz (, ; 7 May 1892 – 4 May 1980), commonly known as Tito (; , ), was a Yugoslav communist revolutionary and statesman, serving in various roles from 1943 until his death in 1980. During World War II, he was the leader of the Partisans, often regarded as the most effective resistance movement in occupied Europe. He also served as the President of the Socialist Federal Republic of Yugoslavia from 14 January 1953 to 4 May 1980.
Broz was born to a Croat father and a Slovene mother in the village of Kumrovec, Austria-Hungary (now in Croatia). Drafted into military service, he distinguished himself, becoming the youngest sergeant major in the Austro-Hungarian Army of that time. After being seriously wounded and captured by the Imperial Russians during World War I, he was sent to a work camp in the Ural Mountains. He participated in some events of the Russian Revolution in 1917 and the subsequent Civil War. Upon his return to the Balkans in 1918, Broz entered the newly established Kingdom of Serbs, Croats and Slovenes (later the Kingdom of Yugoslavia), where he joined the Communist Party of Yugoslavia (KPJ). He was later elected General Secretary (and subsequently Chairman of the Presidium) of the League of Communists of Yugoslavia, a position he held from 1939 to 1980. During World War II, after the Axis invasion of Yugoslavia, he led the Yugoslav guerrilla movement, the Partisans (1941–1945).
After the war, he was the chief architect of the Socialist Federal Republic of Yugoslavia (SFRY), serving as Prime Minister (1944–1963), President (later President for Life) (1953–1980) and Marshal of Yugoslavia, the highest rank of the Yugoslav People's Army (JNA). Despite being one of the founders of the Cominform, he became the first Cominform member to defy Soviet hegemony in 1948, and the only leader in Joseph Stalin's time to leave the Cominform and pursue his country's own socialist program, which contained elements of market socialism. Economists active in the former Yugoslavia, including Czech-born Jaroslav Vanek and Yugoslav-born Branko Horvat, promoted a model of market socialism that was dubbed the Illyrian model. Firms were socially owned by their employees and structured on workers' self-management; they competed in open and free markets. Tito managed to keep ethnic tensions under control by delegating as much power as possible to each republic. The 1974 Yugoslav Constitution defined SFR Yugoslavia as a "federal republic of equal nations and nationalities, freely united on the principle of brotherhood and unity in achieving specific and common interest." Each republic was also given the right to self-determination and secession if done through legal channels. Lastly, Tito gave Kosovo and Vojvodina, the two constituent provinces of Serbia, substantially increased autonomy, including de facto veto power in the Serbian parliament. Tito built a very powerful cult of personality around himself, which was maintained by the League of Communists of Yugoslavia after his death. Ten years after his death, communism collapsed in Eastern Europe, and Yugoslavia descended into civil war.
While some criticise his presidency as authoritarian, comparing the brutality of his rule to Stalin's, many see Tito as a benevolent dictator. He was a popular public figure both in Yugoslavia and abroad. He was viewed as a unifying symbol, and his internal policies maintained the peaceful coexistence of the nations of the Yugoslav federation. He gained further international attention as the chief leader of the Non-Aligned Movement, alongside Jawaharlal Nehru of India, Gamal Abdel Nasser of Egypt, and Kwame Nkrumah of Ghana. With a highly favourable reputation abroad in both Cold War blocs, he received some 98 foreign decorations, including the Legion of Honour and the Order of the Bath.
Josip Broz was born on 7 May 1892 in Kumrovec, a village in the northern Croatian region of Hrvatsko Zagorje. At the time it was part of the Kingdom of Croatia-Slavonia within the Austro-Hungarian Empire. He was the seventh or eighth child of Franjo Broz (1860–1936) and Marija née Javeršek (1864–1918). His parents had already had a number of children die in early infancy. Broz was christened and raised as a Roman Catholic. His father, Franjo, was a Croat whose family had lived in the village for three centuries, while his mother, Marija, was a Slovene from the village of Podsreda. The two villages were some distance apart, and his parents had married on 21 January 1881. Franjo Broz had inherited an estate and a good house, but he was unable to make a success of farming. Josip spent a significant proportion of his pre-school years living with his maternal grandparents at Podsreda, where he became a favourite of his grandfather Martin Javeršek. By the time he returned to Kumrovec to begin school, he spoke Slovene better than Croatian, and had learned to play the piano. Despite his "mixed parentage", Broz identified as a Croat like his father and neighbours.
In July 1900, at the age of eight, Broz entered primary school at Kumrovec. He completed four years of school, failing the 2nd grade and graduating in 1905. As a result of his limited schooling, throughout his life Tito was poor at spelling. After leaving school, he initially worked for a maternal uncle, and then on his parents' family farm. In 1907, his father wanted him to emigrate to the United States, but could not raise the money for the voyage.
Instead, aged 15, Broz left Kumrovec and travelled south to Sisak, where his cousin Jurica Broz was doing army service. Jurica helped him get a job in a restaurant, but Broz soon tired of that work. He approached a Czech locksmith, Nikola Karas, for a three-year apprenticeship, which included training, food, and room and board. As his father could not afford to pay for his work clothing, Broz paid for it himself. Soon after, his younger brother Stjepan also became apprenticed to Karas.
During his apprenticeship, Broz was encouraged to mark May Day in 1909, and he read and sold "Slobodna Reč" ("Free Word"), a socialist newspaper. After completing his apprenticeship in September 1910, Broz used his contacts to gain employment in Zagreb. At the age of 18, he joined the Metal Workers' Union and participated in his first labour protest. He also joined the Social Democratic Party of Croatia and Slavonia.
He returned home in December 1910. In early 1911 he began a series of moves in search of work, first seeking work in Ljubljana, then Trieste, Kumrovec and Zagreb, where he worked repairing bicycles. He joined his first strike action on May Day 1911. After a brief period of work in Ljubljana, between May 1911 and May 1912, he worked in a factory in Kamnik in the Kamnik–Savinja Alps. After it closed, he was offered redeployment to Čenkov in Bohemia.
On arriving at his new workplace, he discovered that the employer was trying to bring in cheaper labour to replace the local Czech workers, and he and others joined successful strike action to force the employer to back down.
Driven by curiosity, Broz moved to Plzeň, where he was briefly employed at the Škoda Works. He next travelled to Munich in Bavaria. He also worked at the Benz car factory in Mannheim, and visited the Ruhr industrial region. By October 1912 he had reached Vienna. He stayed with his older brother Martin and his family, and worked at the Griedl Works before getting a job at Wiener Neustadt. There he worked for Austro-Daimler, and was often asked to drive and test the cars. During this time he spent considerable time fencing and dancing, and during his training and early work life, he also learned German and passable Czech.
In May 1913, Broz was conscripted into the Austro-Hungarian Army, for his compulsory two years of service. He successfully requested to serve with the 25th Croatian Home Guard () Regiment garrisoned in Zagreb. After learning to ski during the winter of 1913 and 1914, Broz was sent to a school for non-commissioned officers (NCO) in Budapest, after which he was promoted to sergeant major. At 22 years of age, he was the youngest of that rank in his regiment. At least one source states that he was the youngest sergeant major in the Austro-Hungarian Army. After winning the regimental fencing competition, Broz came in second in the army fencing championships in Budapest in May 1914.
Soon after the outbreak of World War I in 1914, the 25th Croatian Home Guard Regiment marched toward the Serbian border. Broz was arrested for sedition and imprisoned in the Petrovaradin fortress in present-day Novi Sad. Broz later gave conflicting accounts of this arrest, telling one biographer that he had threatened to desert to the Russians, but also claiming that the whole matter arose from a clerical error. A third version was that he had been overheard saying that he hoped the Austro-Hungarian Empire would be defeated. After his acquittal and release, his regiment served briefly on the Serbian Front before being deployed to the Eastern Front in Galicia in early 1915 to fight against Russia. Tito in his own account of his military service did not mention that he participated in the failed Austrian invasion of Serbia, instead giving the misleading impression that he fought only in Galicia, as it would have offended Serbian opinion to know that he fought in 1914 for the Habsburgs against them. On one occasion, the scout platoon he commanded went behind the enemy lines and captured 80 Russian soldiers, bringing them back to their own lines alive. In 1980 it was discovered that he had been recommended for an award for gallantry and initiative in reconnaissance and capturing prisoners. Tito's biographer, Richard West, wrote that Tito actually downplayed his military record as the Austrian Army records showed that he was a brave soldier, which contradicted his later claim to have been opposed to the Habsburg monarchy and his self-portrait of himself as an unwilling conscript fighting in a war he was opposed to. Broz was regarded by his fellow soldiers as "kaisertreu" ("true to the Emperor").
On 25 March 1915, he was wounded in the back by a Circassian cavalryman's lance, and captured during a Russian attack near Bukovina. Broz in his account of his capture described it melodramatically as: "...but suddenly the right flank yielded and through the gap poured cavalry of the Circassians, from Asiatic Russia. Before we knew it they were thundering through our positions, leaping from their horses and throwing themselves into our trenches with lances lowered. One of them rammed his two-yard, iron-tipped, double-pronged lance into my back just below the left arm. I fainted. Then, as I learned, the Circassians began to butcher the wounded, even slashing them with their knives. Fortunately, Russian infantry reached the positions and put an end to the orgy". Now a prisoner of war (POW), Broz was transported east to a hospital established in an old monastery in the town of Sviyazhsk on the Volga river near Kazan. During his 13 months in hospital he had bouts of pneumonia and typhus, and learned Russian with the help of two schoolgirls who brought him Russian classics by such authors as Tolstoy and Turgenev to read.
After recuperating, in mid-1916 he was transferred to the Ardatov POW camp in the Samara Governorate, where he used his skills to maintain the nearby village grain mill. At the end of the year, he was again transferred, this time to the Kungur POW camp near Perm where the POWs were used as labour to maintain the newly completed Trans-Siberian Railway. Broz was appointed to be in charge of all the POWs in the camp. During this time he became aware that the Red Cross parcels sent to the POWs were being stolen by camp staff. When he complained, he was beaten and put in prison. During the February Revolution, a crowd broke into the prison and returned Broz to the POW camp. A Bolshevik he had met while working on the railway told Broz that his son was working in an engineering works in Petrograd, so, in June 1917, Broz walked out of the unguarded POW camp and hid aboard a goods train bound for that city, where he stayed with his friend's son. The journalist Richard West has suggested that because Broz chose to remain in an unguarded POW camp rather than volunteer to serve with the Yugoslav legions of the Serbian Army, this indicates that he remained loyal to the Austro-Hungarian Empire, and undermines his later claim that he and other Croat POWs were excited by the prospect of revolution and looked forward to the overthrow of the empire that ruled them.
Less than a month after Broz arrived in Petrograd, the July Days demonstrations broke out, and Broz joined in, coming under fire from government troops. In the aftermath, he tried to flee to Finland in order to make his way to the United States, but was stopped at the border. He was arrested along with other suspected Bolsheviks during the subsequent crackdown by the Russian Provisional Government led by Alexander Kerensky. He was imprisoned in the Peter and Paul Fortress for three weeks, during which he claimed to be an innocent citizen of Perm. When he finally admitted to being an escaped POW, he was to be returned by train to Kungur, but escaped at Yekaterinburg, then caught another train that reached Omsk in Siberia on 8 November after a journey. At one point, police searched the train looking for an escaped POW, but were deceived by Broz's fluent Russian.
In Omsk the train was stopped by local Bolsheviks who told Broz that Vladimir Lenin had seized control of Petrograd. They recruited him into an International Red Guard that guarded the Trans-Siberian Railway during the winter of 1917 and 1918. In May 1918, the anti-Bolshevik Czechoslovak Legion wrested control of parts of Siberia from Bolshevik forces, and the Provisional Siberian Government established itself in Omsk, and Broz and his comrades went into hiding. At this time Broz met a beautiful 14-year-old local girl, Pelagija "Polka" Belousova, who hid him then helped him escape to a Kyrgyz village from Omsk. Broz again worked maintaining the local mill until November 1919 when the Red Army recaptured Omsk from White forces loyal to the Provisional All-Russian Government of Alexander Kolchak. He moved back to Omsk and married Belousova in January 1920. At the time of their marriage, Broz was 27 years old and Belousova was 15. Broz later wrote that during his time in Russia he heard much talk of Lenin, a little of Trotsky and "...as for Stalin, during the time I stayed in Russia, I never once heard his name". In the autumn of 1920 he and his pregnant wife returned to his homeland, first by train to Narva, by ship to Stettin, then by train to Vienna, where they arrived on 20 September. In early October Broz returned home to Kumrovec in what was then the Kingdom of Serbs, Croats and Slovenes to find that his mother had died and his father had moved to Jastrebarsko near Zagreb. Sources differ over whether Broz joined the Communist Party while in Russia, but he stated that the first time he joined the party was in Zagreb after he returned to his homeland.
Upon his return home, Broz was unable to gain employment as a metalworker in Kumrovec, so he and his wife moved briefly to Zagreb, where he worked as a waiter, and took part in a waiter's strike. He also joined the Communist Party of Yugoslavia (CPY). The CPY's influence on the political life of Yugoslavia was growing rapidly. In the 1920 elections it won 59 seats and became the third strongest party. After the assassination of Milorad Drašković, the Yugoslav Minister of the Interior, by a young communist named Alija Alijagić on 2 August 1921, the CPY was declared illegal under the Yugoslav State Security Act of 1921.
Due to his overt communist links, Broz was fired from his employment. He and his wife then moved to the village of Veliko Trojstvo where he worked as a mill mechanic. After the arrest of the CPY leadership in January 1922, Stevo Sabić took over control of its operations. Sabić contacted Broz, who agreed to work illegally for the party, distributing leaflets and agitating among factory workers. In the contest of ideas between those who wanted to pursue moderate policies and those who advocated violent revolution, Broz sided with the latter. In 1924, Broz was elected to the CPY district committee, but after he gave a speech at a comrade's Catholic funeral he was arrested when the priest complained. Paraded through the streets in chains, he was held for eight days and was eventually charged with creating a public disturbance. With the help of a Serbian Orthodox prosecutor who hated Catholics, Broz and his co-accused were acquitted. His brush with the law had marked him as a communist agitator, and his home was searched on an almost weekly basis. Since their arrival in Yugoslavia, Pelagija had lost three babies soon after their births, and one daughter, Zlatina, at the age of two. Broz felt the loss of Zlatina deeply. In 1924, Pelagija gave birth to a boy, Žarko, who survived. In mid-1925, Broz's employer died and the new mill owner gave him an ultimatum: give up his communist activities or lose his job. So, at the age of 33, Broz became a professional revolutionary.
The CPY concentrated its revolutionary efforts on factory workers in the more industrialised areas of Croatia and Slovenia, encouraging strikes and similar action. In 1925, the now unemployed Broz moved to Kraljevica on the Adriatic coast, where he started working at a shipyard to further the aims of the CPY. During his time in Kraljevica, Tito acquired a love of the warm, sunny Adriatic coastline that was to last for the rest of his life, and throughout his later time as leader, he spent as much time as possible living on his yacht while cruising the Adriatic.
While at Kraljevica he worked on Yugoslav torpedo boats and a pleasure yacht for the People's Radical Party politician, Milan Stojadinović. Broz built up the trade union organisation in the shipyards and was elected as a union representative. A year later he led a shipyard strike, and soon after was fired. In October 1926 he obtained work in a railway works in Smederevska Palanka near Belgrade. In March 1927, he wrote an article complaining about the exploitation of workers in the factory, and after speaking up for a worker he was promptly sacked. Identified by the CPY as worthy of promotion, he was appointed secretary of the Zagreb branch of the Metal Workers' Union, and soon after of the whole Croatian branch of the union. In July 1927 Broz was arrested, along with six other workers, and imprisoned at nearby Ogulin. After being held without trial for some time, Broz went on a hunger strike until a date was set. The trial was held in secret and he was found guilty of being a member of the CPY. Sentenced to four months' imprisonment, he was released from prison pending an appeal. On the orders of the CPY, Broz did not report to the court for the hearing of the appeal, instead going into hiding in Zagreb. Wearing dark spectacles and carrying forged papers, Broz posed as a middle-class technician in the engineering industry, working undercover to contact other CPY members and co-ordinate their infiltration of trade unions.
In February 1928, Broz was one of 32 delegates to the conference of the Croatian branch of the CPY. During the conference, Broz condemned factions within the party. These included those that advocated a Greater Serbia agenda within Yugoslavia, such as the long-time CPY leader, the Serb Sima Marković. Broz proposed that the executive committee of the Communist International purge the branch of factionalism, and was supported by a delegate sent from Moscow. After it was proposed that the entire central committee of the Croatian branch be dismissed, a new central committee was elected with Broz as its secretary. Marković was subsequently expelled from the CPY at the party's Fourth Congress, and the CPY adopted a policy of working for the break-up of Yugoslavia. Broz arranged to disrupt a meeting of the Social-Democratic Party on May Day that year, and in a melee outside the venue, Broz was arrested by the police. They failed to identify him, charging him under his false name for a breach of the peace. He was imprisoned for 14 days and then released, returning to his previous activities. The police eventually tracked him down with the help of a police informer. He was ill-treated and held for three months before being tried in court in November 1928 for his illegal communist activities; he claimed that the bombs that had been found at his address had been planted by the police. He was convicted and sentenced to five years' imprisonment.
After his sentencing, his wife and son returned to Kumrovec, where they were looked after by sympathetic locals, but then one day they suddenly left without explanation and returned to the Soviet Union. She fell in love with another man and Žarko grew up in institutions. After arriving at Lepoglava prison, Broz was employed in maintaining the electrical system, and chose as his assistant a middle-class Belgrade Jew, Moša Pijade, who had been given a 20-year sentence for his communist activities. Their work allowed Broz and Pijade to move around the prison, contacting and organising other communist prisoners. During their time together in Lepoglava, Pijade became Broz's ideological mentor. After two and a half years at Lepoglava, Broz was accused of attempting to escape and was transferred to Maribor prison where he was held in solitary confinement for several months. After completing the full term of his sentence, he was released, only to be arrested outside the prison gates and taken to Ogulin to serve the four-month sentence he had avoided in 1927. He was finally released from prison on 16 March 1934, but even then he was subject to orders that required him to live in Kumrovec and report to the police daily. During his imprisonment, the political situation in Europe had changed significantly, with the rise of Adolf Hitler in Germany and the emergence of right-wing parties in France and neighbouring Austria. He returned to a warm welcome in Kumrovec, but did not stay for long. In early May, he received word from the CPY to return to his revolutionary activities, and left his home town for Zagreb, where he rejoined the Central Committee of the Communist Party of Croatia.
The Croatian branch of the CPY was in disarray, a situation exacerbated by the escape of the executive committee of the CPY to Vienna in Austria, from which they were directing activities. Over the next six months, Broz travelled several times between Zagreb, Ljubljana and Vienna, using false passports. In July 1934, he was blackmailed by a smuggler, but pressed on across the border, and was detained by the local "Heimwehr", a paramilitary Home Guard. He used the Austrian accent he had developed during his war service to convince them that he was a wayward Austrian mountaineer, and they allowed him to proceed to Vienna. Once there, he contacted the General Secretary of the CPY, Milan Gorkić, who sent him to Ljubljana to arrange a secret conference of the CPY in Slovenia. The conference was held at the summer palace of the Roman Catholic bishop of Ljubljana, whose brother was a communist sympathiser. It was at this conference that Broz first met Edvard Kardelj, a young Slovene communist who had recently been released from prison. Broz and Kardelj subsequently became good friends, with Tito later regarding him as his most reliable deputy. As he was wanted by the police for failing to report to them in Kumrovec, Broz adopted various pseudonyms, including "Rudi" and "Tito". He used the latter as a pen name when he wrote articles for party journals in 1934, and it stuck. He gave no reason for choosing the name "Tito" except that it was a common nickname for men from the district where he grew up. Within the Comintern network, his nickname was "Walter".
During this time Tito wrote articles on the duties of imprisoned communists and on trade unions. He was in Ljubljana when King Alexander was assassinated by the Croatian nationalist "Ustaše" organisation in Marseilles on 9 October 1934. In the crackdown on dissidents that followed his death, it was decided that Tito should leave Yugoslavia. He travelled to Vienna on a forged Czech passport where he joined Gorkić and the rest of the Politburo of the CPY. It was decided that the Austrian government was too hostile to communism, so the Politburo travelled to Brno in Czechoslovakia, and Tito accompanied them. On Christmas Day 1934, a secret meeting of the Central Committee of the CPY was held in Ljubljana, and Tito was elected as a member of the Politburo for the first time. The Politburo decided to send him to Moscow to report on the situation in Yugoslavia, and in early February 1935 he arrived there as full-time official of the Comintern. He lodged at the main Comintern residence, the Hotel Lux on Tverskaya Street, and was quickly in contact with Vladimir Ćopić, one of the leading Yugoslavs with the Comintern. He was soon introduced to the main personalities in the organisation. Tito was appointed to the secretariat of the Balkan section, responsible for Yugoslavia, Bulgaria, Romania and Greece. Kardelj was also in Moscow, as was the Bulgarian communist leader Georgi Dimitrov. Tito lectured on trade unions to foreign communists, and attended a course on military tactics run by the Red Army, and occasionally attended the Bolshoi Theatre. He attended as one of 510 delegates to the Seventh World Congress of the Comintern in July and August 1935, where he briefly saw Joseph Stalin for the first time. After the congress, he toured the Soviet Union, then returned to Moscow to continue his work. He contacted Polka and Žarko, but soon fell in love with an Austrian woman who worked at the Hotel Lux, Johanna Koenig, known within communist ranks as Lucia Bauer. 
When she became aware of this liaison, Polka divorced Tito in April 1936. Tito married Bauer on 13 October of that year.
After the World Congress, Tito worked to promote the new Comintern line on Yugoslavia, which was that it would no longer work to break up the country, and would instead defend the integrity of Yugoslavia against Nazism and Fascism. From a distance, Tito also worked to organise strikes at the shipyards at Kraljevica and the coal mines at Trbovlje near Ljubljana. He tried to convince the Comintern that it would be better if the party leadership was located inside Yugoslavia. A compromise was arrived at, where Tito and others would work inside the country and Gorkić and the Politburo would continue to work from abroad. Gorkić and the Politburo relocated to Paris, while Tito began to travel between Moscow, Paris and Zagreb in 1936 and 1937, using false passports. In 1936, his father died.
Tito returned to Moscow in August 1936, soon after the outbreak of the Spanish Civil War. At the time, the Great Purge was underway, and foreign communists like Tito and his Yugoslav compatriots were particularly vulnerable. Despite a laudatory report written by Tito about the veteran Yugoslav communist Filip Filipović, Filipović was arrested and shot by the Soviet secret police, the NKVD. However, before the Purge really began to erode the ranks of the Yugoslav communists in Moscow, Tito was sent back to Yugoslavia with a new mission, to recruit volunteers for the International Brigades being raised to fight on the Republican side in the Spanish Civil War. Travelling via Vienna, he reached the coastal port city of Split in December 1936. According to the Croatian historian Ivo Banac, the reason Tito was sent back to Yugoslavia by the Comintern was in order to purge the CPY. An initial attempt to send 500 volunteers to Spain by ship failed utterly, with nearly all the communist volunteers being arrested and imprisoned. Tito then travelled to Paris, where he arranged the travel of volunteers to France under the cover of attending the Paris Exhibition. Once in France, the volunteers simply crossed the Pyrenees to Spain. In all, he sent 1,192 men to fight in the war, but only 330 came from Yugoslavia, the rest being expatriates in France, Belgium, the U.S. and Canada. Less than half were communists, and the rest were social-democrats and anti-fascists of various hues. Of the total, 671 were killed in the fighting and another 300 were wounded. Tito himself never went to Spain, despite later claims that he had. Between May and August 1937, Tito travelled several times between Paris and Zagreb organising the movement of volunteers and creating a separate Communist Party of Croatia. The new party was inaugurated at a conference at Samobor on the outskirts of Zagreb on 1–2 August 1937.
In June 1937, Gorkić was summoned to Moscow, where he was arrested, and after months of NKVD interrogation, he was shot. According to Banac, Gorkić was killed on Stalin's orders. West concludes that despite being in competition with men like Gorkić for the leadership of the CPY, it was not in Tito's character to have innocent people sent to their deaths. Tito then received a message from the Politburo of the CPY to join them in Paris. In August 1937 he became acting General Secretary of the CPY. He later explained that he survived the Purge by staying out of Spain where the NKVD was active, and also by avoiding visiting the Soviet Union as much as possible. When first appointed as general secretary, he avoided travelling to Moscow by insisting that he needed to deal with some indiscipline in the CPY in Paris. He also promoted the idea that the upper echelons of the CPY should be sharing the dangers of underground resistance within the country. He developed a new, younger leadership team that was loyal to him, including the Slovene Kardelj, the Serb, Aleksandar Ranković, and the Montenegrin, Milovan Đilas. In December 1937, Tito arranged for a demonstration to greet the French foreign minister when he visited Belgrade, expressing solidarity with the French against Nazi Germany. The protest march numbered 30,000 and turned into a protest against the neutrality policy of the Stojadinović government. It was eventually broken up by the police. In March 1938 Tito returned to Yugoslavia from Paris. Hearing a rumour that his opponents within the CPY had tipped off the police, he travelled to Belgrade rather than Zagreb and used a different passport. While in Belgrade he stayed with a young intellectual, Vladimir Dedijer, who was a friend of Đilas. Arriving in Yugoslavia a few days ahead of the "Anschluss" between Nazi Germany and Austria, he made an appeal condemning it, in which the CPY was joined by the Social Democrats and trade unions. 
In June, Tito wrote to the Comintern suggesting that he should visit Moscow. He waited in Paris for two months for his Soviet visa before travelling to Moscow via Copenhagen. He arrived in Moscow on 24 August.
On arrival in Moscow, he found that all Yugoslav communists were under suspicion. Nearly all the most prominent leaders of the CPY were arrested by the NKVD and executed, including over twenty members of the Central Committee. Both his ex-wife Polka and his wife Koenig/Bauer were arrested as "imperialist spies", although they were both eventually released, Polka after 27 months in prison. Tito therefore needed to make arrangements for the care of Žarko, who was fourteen. He placed him in a boarding school outside Kharkov, then in a school at Penza, but he ran away twice and was eventually taken in by a friend's mother. In 1941, Žarko joined the Red Army to fight the invading Germans. Some of Tito's critics argue that his survival indicates he must have denounced his comrades as Trotskyists. He was asked for information on a number of his fellow Yugoslav communists, but according to his own statements and published documents, he never denounced anyone, usually saying he did not know them. In one case he was asked about the Croatian communist leader Horvatin, but wrote ambiguously, saying that he did not know whether he was a Trotskyist. Nevertheless, Horvatin was not heard of again. While in Moscow, he was given the task of assisting Ćopić to translate the "History of the Communist Party of the Soviet Union (Bolsheviks)" into Serbo-Croatian, but they had only got to the second chapter when Ćopić too was arrested and executed. He continued the work with a fellow surviving Yugoslav communist, but a Yugoslav communist of German ethnicity reported an inaccurate translation of a passage and claimed it showed Tito was a Trotskyist. Other influential communists vouched for him, and he was exonerated. He was denounced by a second Yugoslav communist, but the action backfired and his accuser was arrested.
Several factors were at play in his survival: his working-class origins, his lack of interest in intellectual arguments about socialism, his attractive personality, and his capacity for making influential friends.
While Tito was avoiding arrest in Moscow, Germany was placing pressure on Czechoslovakia to cede the Sudetenland. In response to this threat, Tito organised a call for Yugoslav volunteers to fight for Czechoslovakia, and thousands of volunteers came to the Czechoslovak embassy in Belgrade to offer their services. Despite the eventual Munich Agreement, Czechoslovak acceptance of the annexation, and the fact that the volunteers were turned away, Tito claimed credit for the Yugoslav response, which worked in his favour. By this stage, Tito was well aware of the realities in the Soviet Union, later stating that he "witnessed a great many injustices", but he was too heavily invested in communism and too loyal to the Soviet Union to step back at this point. Tito's appointment as General Secretary of the CPY was formally ratified by the Comintern on 5 January 1939.
He was appointed to the Committee and began appointing allies to it, among them Edvard Kardelj, Milovan Đilas, Aleksandar Ranković and Boris Kidrič.
On 6 April 1941, German forces, with Hungarian and Italian assistance, launched an invasion of Yugoslavia. On 10 April 1941, Slavko Kvaternik proclaimed the Independent State of Croatia, and Tito responded by forming a Military Committee within the Central Committee of the Yugoslav Communist Party. Attacked from all sides, the armed forces of the Kingdom of Yugoslavia quickly crumbled. On 17 April 1941, after King Peter II and other members of the government fled the country, the remaining representatives of the government and military met with German officials in Belgrade. They quickly agreed to end military resistance. On 1 May 1941, Tito issued a pamphlet calling on the people to unite in a battle against the occupation. On 27 June 1941, the Central Committee of the Communist Party of Yugoslavia (CPY) appointed Tito Commander in Chief of all national liberation military forces. On 1 July 1941, the Comintern sent precise instructions calling for immediate action.
Tito stayed in Belgrade until 16 September 1941 when he, together with all members of the CPY, left the city to travel to rebel-controlled territory. To leave Belgrade, Tito used documents given to him by Dragoljub Milutinović, who was a "voivode" with the collaborationist Pećanac Chetniks. Since Pećanac was already fully co-operating with the Germans by that time, this fact caused some to speculate that Tito left Belgrade with the blessing of the Germans because his task was to divide the rebel forces, similar to Lenin's arrival in Russia. Broz travelled by train through Stalać and Čačak and arrived in the village of Robije on 18 September 1941.
Despite conflicts with the rival monarchic Chetnik movement, Tito's Partisans succeeded in liberating territory, notably the "Republic of Užice". During this period, Tito held talks with Chetnik leader Draža Mihailović on 19 September and 27 October 1941. It is said that Tito ordered his forces to assist escaping Jews, and that more than 2,000 Jews fought directly for Tito.
On 21 December 1941, the Partisans created the First Proletarian Brigade (commanded by Koča Popović) and on 1 March 1942, Tito created the Second Proletarian Brigade. In liberated territories, the Partisans organised People's Committees to act as civilian government. The Anti-Fascist Council of National Liberation of Yugoslavia (AVNOJ) convened in Bihać on 26–27 November 1942 and in Jajce on 29 November 1943. In the two sessions, the resistance representatives established the basis for post-war organisation of the country, deciding on a federation of the Yugoslav nations. In Jajce, a 67-member "presidency" was elected and established a nine-member National Committee of Liberation (five communist members) as a de facto provisional government. Tito was named President of the National Committee of Liberation.
With the growing possibility of an Allied invasion in the Balkans, the Axis began to divert more resources to the destruction of the Partisans' main force and its high command. This meant, among other things, a concerted German effort to capture Josip Broz Tito personally. On 25 May 1944, he managed to evade the Germans after the Raid on Drvar ("Operation Rösselsprung"), an airborne assault outside his Drvar headquarters in Bosnia.
After the Partisans managed to endure and avoid these intense Axis attacks between January and June 1943, and the extent of Chetnik collaboration became evident, Allied leaders switched their support from Draža Mihailović to Tito. King Peter II, American President Franklin Roosevelt and British Prime Minister Winston Churchill joined Soviet Premier Joseph Stalin in officially recognising Tito and the Partisans at the Tehran Conference. This resulted in Allied aid being parachuted behind Axis lines to assist the Partisans. On 17 June 1944 on the Dalmatian island of Vis, the Treaty of Vis ("Viški sporazum") was signed in an attempt to merge Tito's government (the AVNOJ) with the government in exile of King Peter II. The Balkan Air Force was formed in June 1944 to control operations that were mainly aimed at aiding his forces.
On 12 September 1944, King Peter II called on all Yugoslavs to come together under Tito's leadership and stated that those who did not were "traitors", by which time Tito was recognised by all Allied authorities (including the government-in-exile) as the Prime Minister of Yugoslavia, in addition to commander-in-chief of the Yugoslav forces. On 28 September 1944, the Telegraph Agency of the Soviet Union (TASS) reported that Tito signed an agreement with the Soviet Union allowing "temporary entry" of Soviet troops into Yugoslav territory, which allowed the Red Army to assist in operations in the northeastern areas of Yugoslavia. With their strategic right flank secured by the Allied advance, the Partisans prepared and executed a massive general offensive that succeeded in breaking through German lines and forcing a retreat beyond Yugoslav borders. After the Partisan victory and the end of hostilities in Europe, all external forces were ordered off Yugoslav territory.
In the final days of World War II in Yugoslavia, units of the Partisans were responsible for atrocities after the Bleiburg repatriations, and accusations of culpability were later levelled at the Yugoslav leadership under Tito. At the time, according to some authors, Josip Broz Tito repeatedly issued calls for surrender to the retreating column, offering amnesty and attempting to avoid a disorderly surrender. On 14 May he dispatched a telegram to the supreme headquarters of the Slovene Partisan Army prohibiting the execution of prisoners of war and ordering the transfer of possible suspects to a military court.
On 7 March 1945, the provisional government of the Democratic Federal Yugoslavia ("Demokratska Federativna Jugoslavija", DFY) was assembled in Belgrade by Josip Broz Tito; the provisional name allowed for either a republic or a monarchy. This government was headed by Tito as provisional Yugoslav Prime Minister and included representatives from the royalist government-in-exile, among others Ivan Šubašić. In accordance with the agreement between resistance leaders and the government-in-exile, post-war elections were held to determine the form of government. In November 1945, Tito's pro-republican People's Front, led by the Communist Party of Yugoslavia, won the elections with an overwhelming majority, the vote having been boycotted by monarchists. During this period, Tito evidently enjoyed massive popular support, being generally viewed by the populace as the liberator of Yugoslavia. The Yugoslav administration in the immediate post-war period managed to unite a country that had been severely affected by ultra-nationalist upheavals and war devastation, while successfully suppressing the nationalist sentiments of the various nations in favour of tolerance and the common Yugoslav goal. After the overwhelming electoral victory, Tito was confirmed as the Prime Minister and the Minister of Foreign Affairs of the DFY. The country was soon renamed the Federal People's Republic of Yugoslavia (FPRY) (later renamed the Socialist Federal Republic of Yugoslavia, SFRY). On 29 November 1945, King Peter II was formally deposed by the Yugoslav Constituent Assembly. The Assembly drafted a new republican constitution soon afterwards.
Yugoslavia organised the Yugoslav People's Army ("Jugoslavenska narodna armija", or JNA) from the Partisan movement and became the fourth strongest army in Europe at the time. The State Security Administration ("Uprava državne bezbednosti"/"sigurnosti"/"varnosti", UDBA) was also formed as the new secret police, along with a security agency, the Department of People's Security ("Organ Zaštite Naroda (Armije)", OZNA). Yugoslav intelligence was charged with imprisoning and bringing to trial large numbers of Nazi collaborators; controversially, this included Catholic clergymen due to the widespread involvement of Croatian Catholic clergy with the Ustaša regime. Draža Mihailović was found guilty of collaboration, high treason and war crimes and was subsequently executed by firing squad in July 1946.
Prime Minister Josip Broz Tito met with the president of the Bishops' Conference of Yugoslavia, Aloysius Stepinac, on 4 June 1945, two days after Stepinac's release from imprisonment. The two could not reach an agreement on the state of the Catholic Church. Under Stepinac's leadership, the bishops' conference released a letter condemning alleged Partisan war crimes in September 1945. The following year Stepinac was arrested and put on trial, which was perceived by some as a show trial. In October 1946, in its first special session for 75 years, the Vatican excommunicated Tito and the Yugoslav government for sentencing Stepinac to 16 years in prison on charges of assisting Ustaše terror and of supporting forced conversions of Serbs to Catholicism. Stepinac received preferential treatment in recognition of his status, and his sentence was soon reduced to house arrest, with the option of emigration left open to the archbishop. At the conclusion of the "Informbiro period", reforms rendered Yugoslavia considerably more religiously liberal than the Eastern Bloc states.
In the first post-war years, Tito was widely considered a communist leader very loyal to Moscow; indeed, he was often viewed as second only to Stalin in the Eastern Bloc. In fact, Stalin and Tito had an uneasy alliance from the start, with Stalin considering Tito too independent.
During the immediate post-war period, Tito's Yugoslavia had a strong commitment to orthodox Marxist ideas. Harsh repressive measures against dissidents were common, including "arrests, show trials, forced collectivisation, suppression of churches and religion". As the leader of Yugoslavia, Tito displayed a fondness for luxury, taking over the royal palaces that had belonged to the House of Karađorđević together with the former palaces used by the House of Habsburg that were located in Yugoslavia. Tito's governing style was very monarchical, as his tours across Yugoslavia in the former royal train closely resembled the royal tours of the Karađorđević kings and Habsburg emperors, and in Serbia he adopted the traditional royal custom of being a godfather to every 9th son. Tito modified the custom by becoming a godfather to every 9th daughter as well, after criticism that the practice was sexist. Just like a Serbian king, Tito would appear wherever a 9th child was born to a family to congratulate the parents and give them a gift of cash. Tito always spoke very harshly of the Karađorđević kings in both public and private (though in private, he sometimes had a kind word for the Habsburgs), but in many ways he appeared to his people as a sort of king.
Unlike other states in east-central Europe liberated by allied forces, Yugoslavia liberated itself from Axis domination with limited direct support from the Red Army. Tito's leading role in liberating Yugoslavia not only greatly strengthened his position in his party and among the Yugoslav people, but also caused him to be more insistent that Yugoslavia had more room to follow its own interests than other Bloc leaders who had more reasons to recognise Soviet efforts in helping them liberate their own countries from Axis control. Although Tito was formally an ally of Stalin after World War II, the Soviets had set up a spy ring in the Yugoslav party as early as 1945, giving way to an uneasy alliance.
In the immediate aftermath of World War II, several armed incidents occurred between Yugoslavia and the Western Allies. Following the war, Yugoslavia acquired the Italian territory of Istria as well as the cities of Zadar and Rijeka. The Yugoslav leadership was looking to incorporate Trieste into the country as well, which was opposed by the Western Allies. This led to several armed incidents, notably attacks by Yugoslav fighter planes on U.S. transport aircraft, causing bitter criticism from the West. In 1946 alone, the Yugoslav air force shot down two U.S. transport aircraft. The passengers and crew of the first plane were secretly interned by the Yugoslav government. The second plane and its crew were a total loss. The U.S. was outraged and sent an ultimatum to the Yugoslav government, demanding the release of the Americans in custody, U.S. access to the downed planes, and a full investigation of the incidents. Stalin was opposed to these provocations, as he felt the USSR was unready to face the West in open war so soon after the losses of World War II, at a time when the U.S. had operational nuclear weapons whereas the USSR had yet to conduct its first test. In addition, Tito was openly supportive of the Communist side in the Greek Civil War, while Stalin kept his distance, having agreed with Churchill not to pursue Soviet interests there, although he did support the Greek communist struggle politically, as demonstrated in several assemblies of the UN Security Council. In 1948, motivated by the desire to create a strong independent economy, Tito modelled his economic development plan independently from Moscow, which resulted in a diplomatic escalation followed by a bitter exchange of letters in which Tito wrote that "We study and take as an example the Soviet system, but develop in a different form".
The Soviet answer on 4 May admonished Tito and the Communist Party of Yugoslavia (CPY) for failing to admit and correct its mistakes, and went on to accuse them of being too proud of their successes against the Germans, maintaining that the Red Army had saved them from destruction. Tito's response on 17 May suggested that the matter be settled at the meeting of the Cominform to be held that June. However, Tito did not attend the second meeting of the Cominform, fearing that Yugoslavia was to be openly attacked. On 28 June 1948, the other member countries of the Cominform expelled Yugoslavia, citing "nationalist elements" that had "managed in the course of the past five or six months to reach a dominant position in the leadership" of the CPY. In 1949 the crisis nearly escalated into an armed conflict, as Hungarian and Soviet forces were massing on the northern Yugoslav frontier. An invasion of Yugoslavia was planned to be carried out in 1949 by the combined forces of the neighbouring Soviet satellite states of Hungary, Romania, Bulgaria and Albania, followed by the removal of Tito's government. The Hungarian and Romanian armies were expanded in size and, together with Soviet forces, massed on the Yugoslav border. The assumption in Moscow was that once it was known that he had lost Soviet approval, Tito would collapse; "I will shake my little finger and there will be no more Tito," Stalin remarked. The expulsion effectively banished Yugoslavia from the international association of socialist states, while other socialist states of Eastern Europe subsequently underwent purges of alleged "Titoists". Stalin took the matter personally and arranged several assassination attempts on Tito, none of which succeeded. In a correspondence between the two leaders, Tito openly wrote:
One significant consequence of the tension arising between Yugoslavia and the Soviet Union was Tito's decision to begin a large-scale repression of any real or alleged opponent of his own view of Yugoslavia. This repression was not limited to known and alleged Stalinists, but also included members of the Communist Party or anyone exhibiting sympathy towards the Soviet Union. Prominent partisans, such as Vlado Dapčević and Dragoljub Mićunović, were victims of this period of strong repression, which lasted until 1956 and was marked by significant violations of human rights. Tens of thousands of political opponents served in forced labour camps, such as Goli Otok (meaning Barren Island), and hundreds died. An often disputed but relatively credible figure put forth by the Yugoslav government itself in 1964 places the number of Goli Otok inmates incarcerated between 1948 and 1956 at 16,554, with fewer than 600 having died during detention. The facilities at Goli Otok were abandoned in 1956, and jurisdiction of the now-defunct political prison was handed over to the government of the Socialist Republic of Croatia.
Tito's estrangement from the USSR enabled Yugoslavia to obtain U.S. aid via the Economic Cooperation Administration (ECA), the same U.S. aid institution that administered the Marshall Plan. Still, he did not agree to align with the West, which was a common consequence of accepting American aid at the time. After Stalin's death in 1953, relations with the USSR were relaxed, and Tito began to receive aid as well from the COMECON. In this way, Tito played East–West antagonism to his advantage. Instead of choosing sides, he was instrumental in kick-starting the Non-Aligned Movement, which would function as a "third way" for countries interested in staying outside of the East–West divide.
The event was significant not only for Yugoslavia and Tito, but also for the global development of socialism, since it was the first major split between Communist states, casting doubt on the Cominform's claims that socialism was a unified force that would eventually control the whole world, as Tito became the first (and the only successful) socialist leader to defy Stalin's leadership in the Cominform. This rift with the Soviet Union brought Tito much international recognition, but also triggered a period of instability often referred to as the Informbiro period. Tito's form of communism was labelled "Titoism" by Moscow, which encouraged purges against suspected "Titoists" throughout the Eastern Bloc.
On 26 June 1950, the National Assembly supported a crucial bill written by Milovan Đilas and Tito regarding "self-management" ("samoupravljanje"), a type of cooperative independent socialist experiment that introduced profit sharing and workplace democracy in previously state-run enterprises, which then became the direct social ownership of the employees. On 13 January 1953, the law on self-management was established as the basis of the entire social order in Yugoslavia. Tito also succeeded Ivan Ribar as the President of Yugoslavia on 14 January 1953. After Stalin's death, Tito rejected the USSR's invitation to visit Moscow to discuss normalisation of relations between the two nations. Nikita Khrushchev and Nikolai Bulganin visited Tito in Belgrade in 1955 and apologised for the wrongdoings of Stalin's administration. Tito visited the USSR in 1956, which signalled to the world that animosity between Yugoslavia and the USSR was easing. Relations between Yugoslavia and the Soviet Union worsened in the late 1960s because of the Yugoslav economic reform and Yugoslav support for the Prague Spring.
The Tito-Stalin split had large ramifications for countries outside the USSR and Yugoslavia. It has, for example, been given as one of the reasons for the Slánský trial in Czechoslovakia, in which 14 high-level Communist officials were purged, with 11 of them being executed. Stalin put pressure on Czechoslovakia to conduct purges in order to discourage the spread of the idea of a "national path to socialism," which Tito espoused.
Under Tito's leadership, Yugoslavia became a founding member of the Non-Aligned Movement. In 1961, Tito co-founded the movement with Egypt's Gamal Abdel Nasser, India's Jawaharlal Nehru, Indonesia's Sukarno and Ghana's Kwame Nkrumah, in an action called The Initiative of Five (Tito, Nehru, Nasser, Sukarno, Nkrumah), thus establishing strong ties with third world countries. This move did much to improve Yugoslavia's diplomatic position. Tito saw the Non-Aligned Movement as a way of presenting himself as a world leader of an important bloc of nations that would improve his bargaining power with both the eastern and western blocs. On 1 September 1961, Josip Broz Tito became the first Secretary-General of the Non-Aligned Movement.
Tito's foreign policy led to relationships with a variety of governments, such as exchanging visits (1954 and 1956) with Emperor Haile Selassie of Ethiopia, where a street was named in his honour. In 1953, Tito visited Ethiopia and in 1954, the Emperor visited Yugoslavia. Tito's motives in befriending Ethiopia were somewhat self-interested, as he wanted to send recent graduates of Yugoslav universities (whose standards were not up to those of Western universities, thus making them unemployable in the West) to work in Ethiopia, which was one of the few countries willing to accept them. As Ethiopia did not have much of a health care system or a university system, Haile Selassie from 1953 onward encouraged the graduates of Yugoslav universities, especially those with medical degrees, to come work in his empire. Reflecting his tendency to pursue closer ties with Third World nations, from 1950 onward Tito permitted Mexican films to be shown in Yugoslavia, where they became very popular, especially the 1950 film "Un día de vida", which became a huge hit when it premiered in Yugoslavia in 1952. The success of Mexican films led to the "Yu-Mex" craze of the 1950s–1960s, as Mexican music became popular and it was fashionable for many Yugoslav musicians to don sombreros and sing Mexican songs in Serbo-Croatian.
Tito was notable for pursuing a foreign policy of neutrality during the Cold War and for establishing close ties with developing countries. Tito's strong belief in self-determination caused the 1948 rift with Stalin and consequently, the Eastern Bloc. His public speeches often reiterated that policy of neutrality and co-operation with all countries would be natural as long as these countries did not use their influence to pressure Yugoslavia to take sides. Relations with the United States and Western European nations were generally cordial.
In the early 1950s, Yugoslav-Hungarian relations were strained as Tito made little secret of his distaste for the Stalinist Mátyás Rákosi and his preference for the "national communist" Imre Nagy instead. Tito's decision to create a "Balkan bloc" by signing a treaty of alliance with NATO members Turkey and Greece in 1954 was regarded as tantamount to joining NATO in Soviet eyes, and his vague talk of a neutralist Communist federation of Eastern European states was seen as a major threat in Moscow. The Yugoslav embassy in Budapest was seen by the Soviets as a centre of subversion in Hungary, as they accused Yugoslav diplomats and journalists, sometimes with justification, of supporting Nagy. However, when the revolt broke out in Hungary in October 1956, Tito accused Nagy of losing control of the situation, as he wanted a Communist Hungary independent of the Soviet Union, not the overthrow of Hungarian Communism. On 31 October 1956, Tito ordered the Yugoslav media to stop praising Nagy, and he quietly supported the Soviet intervention on 4 November to end the revolt in Hungary, as he believed that a Hungary ruled by anti-communists would pursue irredentist claims against Yugoslavia, just as had been the case during the interwar period. To escape from the Soviets, Nagy fled to the Yugoslav embassy, where Tito granted him asylum. On 5 November 1956, Soviet tanks shelled the Yugoslav embassy in Budapest, killing the Yugoslav cultural attaché and several other diplomats. Tito's refusal to turn over Nagy, despite increasingly shrill Soviet demands that he do so, served his purposes well in relations with the Western states, as he was presented in the Western media as the "good communist" who stood up to Moscow by sheltering Nagy and the other Hungarian leaders. On 22 November, Nagy and his cabinet left the embassy on a bus that was to take them into exile in Yugoslavia, after the new Hungarian leader, János Kádár, had promised Tito in writing that they would not be harmed.
Much to Tito's fury, when the bus left the Yugoslav embassy, it was promptly boarded by KGB agents who arrested the Hungarian leaders and roughly handled the Yugoslav diplomats who tried to protect them. The kidnapping of Nagy, followed by his subsequent execution, almost led to Yugoslavia breaking off diplomatic relations with the Soviet Union and in 1957 Tito boycotted the ceremonials in Moscow for the 40th anniversary of the October Revolution, being the only communist leader who did not attend the occasion.
Yugoslavia had a liberal travel policy permitting foreigners to travel freely through the country and its citizens to travel worldwide, whereas such travel was restricted by most Communist countries. A number of Yugoslav citizens worked throughout Western Europe. Tito met many world leaders during his rule, such as Soviet rulers Joseph Stalin, Nikita Khrushchev and Leonid Brezhnev; Egypt's Gamal Abdel Nasser; Indian politicians Jawaharlal Nehru and Indira Gandhi; British Prime Ministers Winston Churchill, James Callaghan and Margaret Thatcher; and U.S. Presidents Dwight D. Eisenhower, John F. Kennedy, Richard Nixon, Gerald Ford and Jimmy Carter. Other political leaders, dignitaries and heads of state that Tito met at least once in his lifetime included Che Guevara, Fidel Castro, Yasser Arafat, Willy Brandt, Helmut Schmidt, Georges Pompidou, Kwame Nkrumah, Queen Elizabeth II, Hua Guofeng, Kim Il Sung, Sukarno, Sheikh Mujibur Rahman, Suharto, Idi Amin, Haile Selassie, Kenneth Kaunda, Gaddafi, Erich Honecker, Nicolae Ceaușescu, János Kádár and Urho Kekkonen. He also met numerous celebrities.
Yugoslavia provided major assistance to anti-colonialist movements in the Third World. The Yugoslav delegation was the first to bring the demands of the Algerian National Liberation Front to the United Nations. In January 1958, the French navy boarded the cargo ship Slovenija off Oran; its holds were filled with weapons for the insurgents. Diplomat Danilo Milic explained that "Tito and the leading nucleus of the League of Communists of Yugoslavia really saw in the Third World's liberation struggles a replica of their own struggle against the fascist occupants. They vibrated to the rhythm of the advances or setbacks of the FLN or Vietcong."
Thousands of Yugoslav cooperants travelled to Guinea after its decolonisation, as the French government tried to destabilise the country. Tito also supported the liberation movements of the Portuguese colonies in Africa. He saw the murder of Patrice Lumumba in 1961 as the "greatest crime in contemporary history". The country's military schools hosted activists from SWAPO (Namibia) and the Pan Africanist Congress of Azania (South Africa). In 1980, the secret services of South Africa and Argentina planned to bring 1,500 anti-communist guerrillas to Yugoslavia. The operation was aimed at overthrowing Tito and was planned for the Olympic Games period, so that the Soviets would be too busy to react. The operation was finally abandoned due to Tito's death and because the Yugoslav armed forces had raised their alert level.
In 1953, Tito traveled to Britain for a state visit and met with Winston Churchill. He also toured Cambridge and visited the University Library.
Tito visited India from 22 December 1954 through 8 January 1955. After his return, he removed many restrictions on churches and spiritual institutions in Yugoslavia.
Tito also developed warm relations with Burma under U Nu, travelling to the country in 1955 and again in 1959, though he did not receive the same treatment in 1959 from the new leader, Ne Win. Tito had an especially close friendship with Prince Norodom Sihanouk of Cambodia, who preached an eccentric mixture of monarchism, Buddhism and socialism, and who like Tito wanted his country to be neutral in the Cold War. Tito saw Sihanouk as something of a kindred soul who, like him, had to struggle to maintain his backward country's neutrality in the face of rival power blocs. By contrast, Tito had a strong dislike of President Idi Amin of Uganda, whom he saw as a thuggish and possibly insane leader.
Because of its neutrality, Yugoslavia was often one of the few Communist countries to maintain diplomatic relations with right-wing, anti-Communist governments. For example, Yugoslavia was the only communist country allowed to have an embassy in Alfredo Stroessner's Paraguay. One notable exception to Yugoslavia's neutral stance toward anti-communist countries was Chile under Pinochet; Yugoslavia was one of many countries that severed diplomatic relations with Chile after Salvador Allende was overthrown. Yugoslavia also provided military aid and arms supplies to staunchly anti-Communist regimes such as that of Guatemala under Kjell Eugenio Laugerud García.
Starting in the 1950s, Tito permitted Yugoslav workers to go to western Europe, especially West Germany as "gastarbeiter" ("guest workers"). The exposure of many Yugoslavs to the West and its culture led many people in Yugoslavia to view themselves as culturally closer to Western Europe than Eastern Europe. On 7 April 1963, the country changed its official name to the Socialist Federal Republic of Yugoslavia. Reforms encouraged private enterprise and greatly relaxed restrictions on religious expression. Tito subsequently went on a tour of the Americas. In Chile, two government ministers resigned over his visit to that country. In the autumn of 1960 Tito met President Dwight D. Eisenhower at the United Nations General Assembly meeting. Tito and Eisenhower discussed a range of issues from arms control to economic development. When Eisenhower remarked that Yugoslavia's neutralism was "neutral on his side", Tito replied that neutralism did not imply passivity but meant "not taking sides".
In 1966 an agreement with the Vatican, fostered in part by the death in 1960 of the anti-communist archbishop of Zagreb, Aloysius Stepinac, and by shifts in the church's approach to resisting communism originating in the Second Vatican Council, accorded new freedom to the Yugoslav Roman Catholic Church, particularly to catechise and open seminaries. The agreement also eased tensions, which had prevented the naming of new bishops in Yugoslavia since 1945. Tito's new socialism met opposition from traditional communists, culminating in a conspiracy headed by Aleksandar Ranković. There exists a strong argument that Ranković was framed. Allegedly, the charge on which he was removed from power and expelled from the LCY was that he had bugged the working and sleeping quarters of Josip Broz Tito as well as those of many other high government officials. Ranković was, for almost twenty years, at the head of the State Security Administration, as well as Federal Secretary of Internal Affairs. His position as a party whip and Tito's means of controlling and monitoring the government and, to a certain extent, the people bothered many, especially the younger, newer generation of government officials who were working towards a more liberal Yugoslav society. In the same year Tito declared that Communists must henceforth chart Yugoslavia's course by the force of their arguments (implying an abandonment of Leninist orthodoxy and the development of liberal Communism). The State Security Administration (UDBA) saw its power scaled back and its staff reduced to 5,000 after the removal of Ranković. Some historians argue that this shift from Communist orthodoxy and strong centralised government control to Communist liberalism and a more open, decentralised society played a role in the eventual break-up of the country.
On 1 January 1967, Yugoslavia became the first communist country to open its borders to all foreign visitors and abolish visa requirements. In the same year Tito became active in promoting a peaceful resolution of the Arab–Israeli conflict. His plan called for the Arabs to recognise the state of Israel in exchange for the return of territories Israel had gained.
In 1968, Tito offered to fly to Prague on three hours' notice if Czechoslovak leader Alexander Dubček needed help in facing down the Soviets. In April 1969, Tito removed generals Ivan Gošnjak and Rade Hamović in the aftermath of the invasion of Czechoslovakia, owing to the Yugoslav army's unpreparedness to respond to a similar invasion of Yugoslavia.
In 1971, Tito was re-elected President of Yugoslavia by the Federal Assembly for the sixth time. In his speech before the Federal Assembly he introduced 20 sweeping constitutional amendments that would provide an updated framework on which the country would be based. The amendments provided for a collective presidency, a 22-member body consisting of elected representatives from the six republics and two autonomous provinces. The body would have a single chairman, and the chairmanship would rotate among the six republics. If the Federal Assembly failed to agree on legislation, the collective presidency would have the power to rule by decree. The amendments also provided for a stronger cabinet with considerable power to initiate and pursue legislation independently of the Communist Party. Džemal Bijedić was chosen as Premier. The new amendments aimed to decentralise the country by granting greater autonomy to the republics and provinces. The federal government would retain authority only over foreign affairs, defence, internal security, monetary affairs, free trade within Yugoslavia, and development loans to poorer regions. Control of education, healthcare, and housing would be exercised entirely by the governments of the republics and the autonomous provinces.
Tito's greatest strength, in the eyes of western communists, had been in suppressing nationalist insurrections and maintaining unity throughout the country. It was Tito's call for unity, and his related methods, that held together the peoples of Yugoslavia. This ability was put to the test several times during his reign, notably during the Croatian Spring (also referred to as the "Masovni pokret" or "maspok", meaning "Mass Movement"), when the government suppressed both public demonstrations and dissenting opinions within the Communist Party. Despite this suppression, many of maspok's demands were later realised with the new constitution, heavily backed by Tito himself against opposition from the Serbian branch of the party. On 16 May 1974, the new Constitution was passed, and the 82-year-old Tito was named president for life.
Tito's visits to the United States avoided most of the Northeast, home to large communities of Yugoslav emigrants bitter about communism in Yugoslavia. Security for the state visits was usually high to keep him away from protesters, who would frequently burn the Yugoslav flag. During a visit to the United Nations in the late 1970s, emigrants shouted "Tito murderer" outside his New York hotel, which he protested to United States authorities.
Dominic McGoldrick writes that as the head of a "highly centralised and oppressive" regime, Tito wielded tremendous power in Yugoslavia, with his authoritarian rule administered through an elaborate bureaucracy that routinely suppressed human rights. In the first years the main victims of this repression were known and alleged Stalinists, such as Dragoslav Mihailović and Dragoljub Mićunović, but in the following years even some of the most prominent of Tito's collaborators were arrested. On 19 November 1956 Milovan Đilas, perhaps the closest of Tito's collaborators and widely regarded as his possible successor, was arrested for his criticism of Tito's regime. Victor Sebestyen writes that Tito "was as brutal as" Stalin. The repression did not spare intellectuals and writers, such as Venko Markovski, who was arrested in January 1956 and sent to jail for writing poems considered anti-Titoist.
Although Tito's presidency became comparatively more liberal than other communist regimes after the reforms of 1961, the Communist Party continued to alternate between liberalism and repression. Yugoslavia managed to remain independent from the Soviet Union, and its brand of socialism was in many ways the envy of Eastern Europe, but Tito's Yugoslavia remained a tightly controlled police state. According to David Mates, outside the Soviet Union, Yugoslavia had more political prisoners than all of the rest of Eastern Europe combined.
Tito's secret police was modelled on the Soviet KGB. Its members were ever-present and often acted extrajudicially, with victims including middle-class intellectuals, liberals and democrats. Yugoslavia was a signatory to the International Covenant on Civil and Political Rights, but scant regard was paid to some of its provisions.
Tito's Yugoslavia was based on respect for nationality, although Tito ruthlessly purged any flowering of nationalism that threatened the Yugoslav federation. However, the contrast between the deference shown to some ethnic groups and the severe repression of others was sharp. Yugoslav law guaranteed nationalities the right to use their languages, but for ethnic Albanians the assertion of ethnic identity was severely limited: almost half of the political prisoners in Yugoslavia were ethnic Albanians imprisoned for asserting their ethnic identity.
Yugoslavia's post-war development was impressive, but the country ran into economic snags around 1970 and experienced significant unemployment and inflation. Between 1961 and 1980, the external debt of Yugoslavia increased exponentially at the unsustainable pace of over 17% per year. By 1970 debt was no longer contracted to finance investment, but to cover current expenses. The structure of the economy had reached a point that it required indefinite debt growth to survive.
Declassified CIA documents state that by 1967 it was already clear that although Tito's economic model had achieved growth of the gross national product of around 7%, it had also created frequently unwise industrial investment and a chronic deficit in the nation's balance of payments. In the 1970s, uncontrolled growth often produced chronic inflation, which Tito and the party were unable to fully stabilise or moderate. Yugoslavia also paid high interest on loans compared to the LIBOR rate, but Tito's presence eased investors' fears, since he had proven willing and able to implement unpopular reforms. By 1979, with Tito's passing on the horizon, a global economic downturn, consistently increasing unemployment, and growth having slowed to 5.9% over the 1970s, it had become likely that "the rapid economic growth to which the Yugoslavs [had] become accustomed" would sharply decline.
After the constitutional changes of 1974, Tito began reducing his role in the day-to-day running of the state. He continued to travel abroad and receive foreign visitors, going to Beijing in 1977 and reconciling with a Chinese leadership that had once branded him a revisionist. In turn, Chairman Hua Guofeng visited Yugoslavia in 1979. In 1978, Tito travelled to the U.S. During the visit strict security was imposed in Washington, D.C. owing to protests by anti-communist Croat, Serb and Albanian groups.
Tito became increasingly ill over the course of 1979. During this time "Vila Srna" was built for his use near Morović in the event of his recovery. On 7 January and again on 11 January 1980, Tito was admitted to the Medical Centre in Ljubljana, the capital of SR Slovenia, with circulation problems in his legs. Tito's own stubbornness and refusal to allow doctors to carry out the necessary amputation of his left leg played a part in his eventual death from a gangrene-induced infection. His adjutant later testified that Tito threatened to take his own life if his leg were ever amputated, and that he actually had to hide Tito's pistol for fear that Tito would follow through on the threat. After a private conversation with his two sons, Žarko and Mišo Broz, he finally agreed, and his left leg was amputated due to arterial blockages. The amputation proved to be too late, and Tito died at the Medical Centre of Ljubljana on 4 May 1980, three days short of his 88th birthday. His funeral attracted government leaders from 129 states; the most notable absentee was the American President, Jimmy Carter.
The funeral for Tito drew many world statesmen. Based on the number of attending politicians and state delegations, at the time it was the largest state funeral in history; this concentration of dignitaries would be unmatched until the funeral of Pope John Paul II in 2005 and the memorial service of Nelson Mandela in 2013. Those who attended included four kings, 31 presidents, six princes, 22 prime ministers and 47 ministers of foreign affairs. They came from both sides of the Cold War, from 128 different countries out of 154 UN members at the time.
Reporting on his death, "The New York Times" commented:
Tito was interred in a mausoleum in Belgrade, which forms part of a memorial complex in the grounds of the Museum of Yugoslav History (formerly called the "Museum 25 May" and the "Museum of the Revolution"). The mausoleum itself is called the House of Flowers ("Kuća Cveća"), and numerous people visit it as a shrine to "better times". The museum keeps the gifts Tito received during his presidency; the collection includes original prints of "Los Caprichos" by Francisco Goya, among many other works. The Government of Serbia planned to merge it into the Museum of the History of Serbia.
During his life and especially in the first year after his death, several places were named after Tito. Several of these places have since returned to their original names.
Examples include Podgorica, formerly Titograd (though Podgorica's international airport is still identified by the code TGD), and Užice, formerly known as Titovo Užice, which reverted to its original name in 1992. Streets in Belgrade, the capital, have all reverted to their original pre–World War II and pre-communist names as well. In 2004, Antun Augustinčić's statue of Broz in his birthplace of Kumrovec was decapitated in an explosion; it was subsequently repaired. Twice in 2008, protests took place in what was then Zagreb's Marshal Tito Square (today the Republic of Croatia Square), organised by a group called Circle for the Square ("Krug za Trg") with the aim of forcing the city government to restore the square's previous name, while a counter-protest by the Citizens' Initiative Against Ustašism ("Građanska inicijativa protiv ustaštva") accused the Circle for the Square of historical revisionism and neo-fascism. Croatian president Stjepan Mesić criticised the demonstrations calling for the name change.
In the Croatian coastal city of Opatija the main street (also its longest) still bears the name of Marshal Tito, and Rijeka, the third largest city in Croatia, has likewise refused to rename a square in the city centre named after Tito. Streets named after Tito also remain in numerous towns in Serbia, mostly in the country's north. One of the main streets in downtown Sarajevo is called Marshal Tito Street, and Tito's statue in a park in front of the university campus (the former JNA barracks "Maršal Tito") in Marijin Dvor is a place where Bosnians and Sarajevans still commemorate and pay tribute to Tito. The largest Tito monument in the world, about 10 meters high, is located at Tito Square, the central square of Velenje, Slovenia. One of the main bridges in Slovenia's second largest city of Maribor is Tito Bridge. The central square in Koper, the largest Slovenian port city, is also named Tito Square. The main-belt asteroid 1550 Tito, discovered by Serbian astronomer Milorad B. Protić at Belgrade Observatory in 1937, was named in his honour.
The Croat historian Marijana Belaj wrote that for some people in Croatia and other parts of the former Yugoslavia, Tito is remembered as a sort of secular saint, mentioning how some Croats keep portraits of Catholic saints together with a portrait of Tito on their walls as a way to bring hope. The practice of writing letters to Tito has continued after his death, with several websites in the former Yugoslavia devoted entirely to forums where people send posthumous letters to him, often discussing various personal problems. Every year on 25 May, about 10,000 people from the former Yugoslavia gather in Tito's hometown of Kumrovec to pay tribute to his memory in a quasi-religious ritual. Belaj wrote that much of the posthumous appeal of the Tito cult centers on Tito's everyman persona and how he was presented as a "friend" to ordinary people, in contrast to the way in which Stalin was depicted in his cult of personality as a cold, aloof, god-like figure whose extraordinary qualities set him apart from ordinary people. The majority of those who come to Kumrovec on 25 May to kiss Tito's statue are women. Belaj wrote that the appeal of the Tito cult today centers less on communism, observing that most of the people who come to Kumrovec do not believe in communism, and more on nostalgia for their youth in Tito's Yugoslavia and affection for an "ordinary man" who became great. Tito was not a Croat nationalist, but the fact that he became the world's most famous Croat, serving as the leader of the Non-Aligned Movement and being seen as an important world leader, inspires pride in certain quarters of Croatia.
Every year a "Brotherhood and Unity" relay race is organised in Montenegro, Macedonia, and Serbia that ends at the "House of Flowers" in Belgrade on 25 May – the final resting place of Tito. At the same time, runners in Slovenia, Croatia, and Bosnia and Herzegovina set off for Kumrovec, Tito's birthplace in northern Croatia. The relay is a left-over from the Relay of Youth from Yugoslav times, when young people made a similar yearly trek on foot through Yugoslavia that ended in Belgrade with a massive celebration.
In 1992, the Yugoslav comedy film "Tito and Me" (Serbian: Тито и ја, "Tito i ja") by Serbian director Goran Marković was released.
In the years following the dissolution of Yugoslavia, some historians stated that human rights were suppressed in Yugoslavia under Tito, particularly in the first decade up until the Tito–Stalin Split. On 4 October 2011, the Slovenian Constitutional Court found a 2009 naming of a street in Ljubljana after Tito to be unconstitutional. While several public areas in Slovenia (named during the Yugoslav period) do already bear Tito's name, on the issue of renaming an additional street the court ruled that:
The court, however, explicitly made it clear that the purpose of the review was "not a verdict on Tito as a figure or on his concrete actions, as well as not a historical weighing of facts and circumstances". Slovenia has several streets and squares named after Tito, notably "Tito Square" in Velenje, incorporating a 10-meter statue.
Tito has also been named as responsible for systematic eradication of the ethnic German (Danube Swabian) population in Vojvodina by expulsions and mass executions following the collapse of the German occupation of Yugoslavia at the end of World War II, in contrast to his inclusive attitude towards other Yugoslav nationalities. Ten years after his death, Yugoslavia collapsed into multiple devastating civil wars.
Tito carried on numerous affairs and was married several times. In 1918 he was brought to Omsk, Russia, as a prisoner of war. There he met Pelagija Belousova, who was then fourteen; he married her a year later, and she moved with him to Yugoslavia. They had five children, but only their son Žarko Leon (born 4 February 1924) survived. When Tito was jailed in 1928, she returned to Russia. After their divorce in 1936, she remarried.
In 1936, when Tito stayed at the Hotel Lux in Moscow, he met the Austrian Lucia Bauer. They married in October 1936, but the records of this marriage were later erased.
His next relationship was with Herta Haas, whom he married in 1940. Broz left for Belgrade after the April War, leaving Haas pregnant. In May 1941, she gave birth to their son, Aleksandar "Mišo" Broz. Throughout his relationship with Haas, Tito maintained a promiscuous life and had a parallel relationship with Davorjanka Paunović, who, under the codename "Zdenka", served as a courier in the resistance and subsequently became his personal secretary. Haas and Tito suddenly parted company in 1943 in Jajce during the second meeting of AVNOJ, after she reportedly walked in on him and Davorjanka. The last time Haas saw Broz was in 1946. Davorjanka died of tuberculosis in 1946, and Tito insisted that she be buried in the backyard of the Beli Dvor, his Belgrade residence.
His best-known wife was Jovanka Broz. Tito was just shy of his 60th birthday and she was 27 when they finally married in April 1952, with state security chief Aleksandar Ranković as best man. Their eventual marriage came about somewhat unexpectedly, since Tito had actually rejected her some years earlier, when his confidant Ivan Krajačić first brought her in. At that time she was in her early 20s, and Tito objected to her energetic personality. Not one to be discouraged easily, Jovanka continued working at Beli Dvor, where she managed the staff, and eventually got another chance. Their relationship was not a happy one, however. It went through many, often public, ups and downs, with episodes of infidelity and even allegations that she was involved in preparations for a "coup d'état". Certain unofficial reports suggest that Tito and Jovanka formally divorced in the late 1970s, shortly before his death. However, during Tito's funeral she was officially present as his wife, and she later claimed inheritance rights. The couple did not have any children.
Tito's grandchildren include Saša Broz, a theatre director in Croatia; Svetlana Broz, a cardiologist and writer in Bosnia-Herzegovina; and Josip Broz – Joška, Edvard Broz and Natali Klasevski, an artisan of Bosnia-Herzegovina.
As the President, Tito had access to extensive (state-owned) property associated with the office, and maintained a lavish lifestyle. In Belgrade he resided in the official residence, the Beli dvor, and maintained a separate private home. The Brijuni islands were the site of the State Summer Residence from 1949 on. The pavilion was designed by Jože Plečnik and included a zoo. Close to 100 foreign heads of state visited Tito at the island residence, along with film stars such as Elizabeth Taylor, Richard Burton, Sophia Loren, Carlo Ponti, and Gina Lollobrigida.
Another residence was maintained at Lake Bled, while the grounds at Karađorđevo were the site of "diplomatic hunts". By 1974 the Yugoslav President had at his disposal 32 official residences, large and small, the yacht "Galeb" ("seagull"), a Boeing 727 as the presidential aeroplane, and the Blue Train. After Tito's death the presidential Boeing 727 was sold to Aviogenex, the "Galeb" remained docked in Montenegro, and the Blue Train was stored in a Serbian train shed for over two decades. While Tito held the office of president for by far the longest period, the associated property was not private, and much of it continues to be in use by Yugoslav successor states as public property, or is maintained at the disposal of high-ranking officials.
As regards his knowledge of languages, Tito said that he spoke Serbo-Croatian, German, Russian, and some English. Broz's official biographer and then fellow Central Committee member Vladimir Dedijer stated in 1953 that he spoke "Serbo-Croatian ... Russian, Czech, Slovenian ... German (with a Viennese accent) ... understands and reads French and Italian ... [and] also speaks Kazakh."
In his youth Tito attended Catholic Sunday school and was later an altar boy. After an incident in which a priest slapped and shouted at him when he had difficulty helping the priest remove his vestments, Tito never entered a church again. As an adult, he identified as an atheist.
Every federal unit had a town or city with historic significance from the World War II period renamed to include Tito's name. The largest of these was Titograd, now Podgorica, the capital city of Montenegro. With the exception of Titograd, the cities were renamed simply by the addition of the adjective "Tito's" ("Titov"). The cities were:
In the years after Tito's death up to the present, there has been some debate as to his identity. Tito's personal doctor, Aleksandar Matunović, wrote a book about Tito in which he questioned his true origin, noting that Tito's habits and lifestyle could only mean that he was from an aristocratic family. Serbian journalist Vladan Dinić, in "Tito is not Tito", included several possible alternate identities of Tito, arguing that three separate people had identified as Tito.
In 2013, considerable media coverage was given to a declassified NSA study in "Cryptologic Spectrum" that concluded Tito did not speak Serbo-Croatian as a native. The report noted that his speech had features of other Slavic languages (Russian and Polish). The hypothesis that "a non-Yugoslav, perhaps a Russian or a Pole" had assumed Tito's identity was included, with a note that this would have happened during or before the Second World War. The report also notes Draža Mihailović's impression of Tito's Russian origins after he had spoken with Tito personally.
However, the NSA's report was dismissed as invalid by Croatian experts. The report failed to recognise that Tito was a native speaker of the very distinctive local Kajkavian dialect of Zagorje. The acute accent, present only in Croatian dialects, which Tito was able to pronounce perfectly, is the strongest evidence for his Zagorje origins.
As the Communist Party was outlawed in Yugoslavia starting on 30 December 1920, Josip Broz took on many assumed names during his activity within the Party, including "Rudi", "Walter", and "Tito". Broz himself explains:
Josip Broz Tito received a total of 119 awards and decorations from 60 countries around the world (59 countries and Yugoslavia). 21 decorations were from Yugoslavia itself, 18 having been awarded once, and the Order of the National Hero on three occasions. Of the 98 international awards and decorations, 92 were received once, and three on two occasions (Order of the White Lion, Polonia Restituta, and Karl Marx). The most notable awards included the French Legion of Honour and National Order of Merit, the British Order of the Bath, the Soviet Order of Lenin, the Japanese Order of the Chrysanthemum, the West German Federal Cross of Merit, and the Order of Merit of Italy.
The decorations were seldom displayed, however. After the Tito–Stalin split of 1948 and his inauguration as president in 1953, Tito rarely wore his uniform except at military functions, and then (with rare exceptions) wore only his Yugoslav ribbons, for obvious practical reasons. The awards were displayed in full only at his funeral in 1980. Tito's reputation as one of the Allied leaders of World War II, along with his diplomatic position as the founder of the Non-Aligned Movement, was the primary cause of this favourable international recognition.
Here follows a short list including some of the more notable foreign awards and decorations of Tito.
Some of the other foreign awards and decorations of Josip Broz Tito include Order of Merit, Order of Prince Henry, Order of Independence, Order of Merit, Order of the Nile, Order of the Condor of the Andes, Order of the Star of Romania, Order of the Gold Lion of the House of Nassau, Croix de Guerre, Order of the Cross of Grunwald, Czechoslovak War Cross, Decoration of Honour for Services to the Republic of Austria, Military Order of the White Lion, Nishan-e-Pakistan, Order of Al Rafidain, Order of Carol I, Order of Georgi Dimitrov, Order of Karl Marx, Order of Manuel Amador Guerrero, Order of Michael the Brave, Order of Pahlavi, Order of Sukhbaatar, Order of Suvorov, Order of the Liberator, Order of the October Revolution, Order of the Queen of Sheba, Order of the White Rose of Finland, Partisan Cross, Royal Order of Cambodia, Star of People's Friendship, and Thiri Thudhamma Thingaha.
John Stauber
John Stauber is an American progressive writer. Stauber has co-authored five books about government propaganda, private interests and the public relations industry. His work includes one book about how industry manipulates science ("Trust Us, We're Experts"), one about the history and current scope of the public relations industry ("Toxic Sludge is Good for You"), and one about mad cow disease ("Mad Cow USA"), which predicted the surfacing of the disease within the United States.
In July 2003, Stauber and Sheldon Rampton wrote "Weapons of Mass Deception: The Uses of Propaganda in Bush's War on Iraq", which argued that the Bush administration deceived the American public into supporting the war. In 2004, the two co-authored "Banana Republicans," which argued that the Republican Party is turning the U.S. into a one-party state. The book argues that the far-right and its functionaries in the media, lobbying establishment and electoral system are undermining dissent and squelching pluralistic politics in the United States. In 2006 the two wrote "The Best War Ever: Lies, Damned Lies, and the Mess in Iraq," which builds upon the arguments they posited in "Weapons of Mass Deception".
Stauber is the founder and former executive director of the Center for Media and Democracy, which sponsors PR Watch and SourceWatch. Since the 1960s, he has worked with public interest, consumer, family farm, environmental and community organizations at the local, state and national level. He edits and writes for the Center's quarterly newsmagazine, "PR Watch". He is also a member of the Liberty Tree Board of Advisers.
Stauber grew up in a conservative Republican household in Marshfield, Wisconsin, but the war in Vietnam turned him into an anti-war and environmental activist while still in high school.
James P. Hogan (writer)
James Patrick Hogan (27 June 1941 – 12 July 2010) was a British science fiction author.
Hogan was born in London, England. He was raised in the Portobello Road area on the west side of London. After leaving school at the age of sixteen, he worked various odd jobs until, after receiving a scholarship, he began a five-year program at the Royal Aircraft Establishment at Farnborough studying the practice and theory of electrical, electronic, and mechanical engineering. He first married at the age of twenty. He married three more times and fathered six children.
Hogan worked as a design engineer for several companies and eventually moved into sales during the 1960s, traveling around Europe as a sales engineer for Honeywell. During the 1970s he joined the Digital Equipment Corporation's Laboratory Data Processing Group, and in 1977 he relocated to Boston, Massachusetts, to manage its sales training program. That same year he published his first novel, "Inherit The Stars", which he had written to win an office bet.
He quit DEC in 1979 and began writing full-time, relocating to Orlando, Florida, for a year, where he met his third wife, Jackie. They then relocated to Sonora, California. Hogan died of heart failure at his home in Ireland on Monday, 12 July 2010, aged 69.
During his later years, Hogan had contrarian and anti-authoritarian opinions. He was a proponent of Immanuel Velikovsky's version of catastrophism, and of the Peter Duesberg hypothesis that AIDS is caused by pharmaceutical use rather than HIV (see AIDS denialism). He criticized the idea of the gradualism of evolution, though he did not propose theistic creationism as an alternative. Hogan was skeptical of the theories of climate change and ozone depletion.
Hogan believed that the Holocaust did not happen in the manner described by mainstream historians, writing that he found the work of Arthur Butz and Mark Weber to be "more scholarly, scientific, and convincing than what the history written by the victors says". In March 2010, in an essay defending Holocaust denier Ernst Zündel, Hogan stated that the mainstream history of the Holocaust includes "claims that are wildly fantastic, mutually contradictory, and defy common sense and often physical possibility".
Compilations of novels in the "Giants" series.
Jeff Lynne
Jeffrey Lynne (born 30 December 1947) is an English singer, songwriter, record producer, and multi-instrumentalist who co-founded the rock band Electric Light Orchestra (ELO). The group formed in 1970 as an offshoot of the Move, of which Lynne was also a member. Following the departure of Roy Wood in 1972, Lynne assumed sole leadership of the band and wrote, arranged and produced virtually all of its subsequent records. Earlier, Lynne had been a founding member and the principal songwriter of the Idle Race.
After ELO's original disbandment in 1986, Lynne released two solo albums: "Armchair Theatre" (1990) and "Long Wave" (2012). He also began producing various artists. In 1988, under the pseudonyms Otis Wilbury and Clayton Wilbury, he co-founded the supergroup Traveling Wilburys with George Harrison, Bob Dylan, Roy Orbison, and Tom Petty. Lynne's songwriting and production collaborations with former Beatles led him to co-produce their "Anthology" reunion singles, "Free as a Bird" (1995) and "Real Love" (1996), built from John Lennon demos. In 2014, Lynne reformed ELO and resumed concert touring under the name "Jeff Lynne's ELO".
Lynne produced all fifteen ELO singles that reached the Top 10 of the UK record charts. His producing credits also include the UK or US Top 10 albums "Cloud Nine" (Harrison, 1987), "Mystery Girl" (Orbison, 1989), "Full Moon Fever" (Petty, 1989), "Into the Great Wide Open" (Petty, 1991), "Flaming Pie" (Paul McCartney, 1997) and "Get Up!" (Bryan Adams, 2015). In 2014, Lynne received a star on the Birmingham Walk of Stars, and he was awarded a star on the Hollywood Walk of Fame the following year. He has received three Ivor Novello Awards, including the award for Outstanding Services to British Music. In 2017, Lynne was inducted into the Rock and Roll Hall of Fame as a member of ELO.
Lynne was born to Nancy and Philip Lynne and grew up in Shard End, Birmingham, England, where he attended Alderlea Boys' Secondary School. As a native of Birmingham he has a flat Brummie accent. His first guitar, an acoustic instrument, was bought for him by his father for £2; he was still playing it in 2012. In 1963 he formed a group with Robert Reader and David Walsh using little more than Spanish guitars and cheap electric instruments. They were originally named the Rockin' Hellcats, then the Handicaps and finally the Andicaps. They practised at Shard End Community Centre and performed weekly. In 1964, however, Robert Reader and David Walsh left the band and Lynne brought in replacements. At the end of 1964, Lynne decided to leave the band to replace Mick Adkins in the local band the Chads.
Some time in or after 1965, he acquired his first item of studio recording equipment, a Bang & Olufsen 'Beocord 2000 De Luxe' stereo reel-to-reel tape recorder, which allowed multi-tracking between the left and right channels. He says it "taught me how to be a producer". In 1966, Lynne joined the line-up of the Nightriders as guitarist, having responded to an advertisement in the Birmingham Evening Mail. The band soon changed their name to the Idle Race. Although he recorded two critically acclaimed albums with the band and produced the second, success eluded him. In 1970, Lynne accepted an offer from his friend Roy Wood to join the line-up of the more successful band the Move.
Lynne contributed many songs to the Move's last two albums while formulating, with Roy Wood and Bev Bevan, a band built around a fusion of rock and classical music – a project which would eventually become the highly successful Electric Light Orchestra (ELO). The original idea was that both bands would exist in tandem. Bevan has, however, since suggested that Lynne had little interest in the Move, stating:
"The only reason Jeff Lynne ever joined the Move was to form a new band. He was never interested in being a part of the Move. It was a good, money-earning band. It really subsidised the beginning of ELO for getting musicians in and recording and rehearsals and stuff. Jeff never wanted to be in the Move. He wanted to form a new band."
Problems led to Wood's departure from ELO in 1972, after the band's eponymous first album, leaving Lynne as the band's dominant creative force. There followed a succession of personnel changes and increasingly popular albums: 1973's "ELO 2" and "On the Third Day", 1974's "Eldorado" and 1975's "Face the Music". By 1976's "A New World Record", Lynne had developed the group's roots into a more complex and distinctive pop-rock sound, mixing studio strings, layered vocals, and tight, catchy pop singles. Lynne's near-complete creative dominance as producer, songwriter, arranger, lead singer and guitarist could make ELO appear an almost solo effort. However, the ELO sound and the focus of Lynne's writing were also shaped by Louis Clark's and Richard Tandy's co-arranging under Lynne's direction (notably the large string sections) and by Bev Bevan's drumming, while Tandy's integration of the Moog, harmonium, and Mellotron with newer keyboard technology gave Lynne's songs a more symphonic sound. Kelly Groucutt's distinctive voice blended with Lynne's to produce the classic ELO harmonic vocal sound.
The pinnacle of ELO's chart success and worldwide popularity was the expansive 1977 double album "Out of the Blue", which was largely conceived in a Swiss chalet during a two-week writing marathon. The band's 1978 world tour featured an elaborate "space ship" set and laser light show. To recreate the complex instrumental textures of their albums, the band used pre-recorded supplemental backing tracks in live performances. Although that practice has since become commonplace, it drew considerable derision in the press of the time. Lynne has often stated that he prefers the creative environment of the studio to the rigours and tedium of touring. In 1979, Lynne followed up the success of "Out of the Blue" with "Discovery", which held the No. 1 spot in the UK for five weeks. The album is primarily associated with its two disco-flavoured singles ("Shine a Little Love" and "Last Train to London") and with the title's word play on "disco" and "very". However, the remaining seven non-disco tracks on the album reflected Lynne's range as a pop-rock songwriter, including a heavy, mid-tempo rock anthem ("Don't Bring Me Down") that, despite its use of a drum loop, could be considered the antithesis of disco. Lynne later recalled his forays into dance music: "I love the force of disco. I love the freedom it gave me to make different rhythms across it. I enjoyed that really steady driving beat. Just steady as a rock. I've always liked that simplicity in the bass drum."
In 1979, Lynne rejected an offer for ELO to headline the Knebworth Concert in the UK, allowing Led Zeppelin to headline instead. In the absence of any touring to support "Discovery", Lynne had time to contribute five tracks to the soundtrack for the 1980 film musical "Xanadu". The score yielded three Top 40 singles: "I'm Alive" (UK No. 20), "All Over the World" (UK No. 11), and the title track "Xanadu", which reached number one in the UK. Nevertheless, Lynne was not closely involved with the development of the film, and his material consequently had only a superficial attachment to the plot. "Xanadu" performed weakly at the box office, although it has since gained popularity as a cult favourite. Lynne subsequently disavowed his limited contribution to the project.
In 1981, Lynne took the band in a somewhat different direction with the science-fiction themed album "Time", which reached number one in the UK for two weeks and produced the band's second Top 3 single in less than two years. The strings were still featured, but with heavily synthesised textures. Following a marginally successful tour, Lynne kept this general approach with 1983's "Secret Messages" and a final contractually obligated ELO album, "Balance of Power", in 1986. Lynne discusses the contractual obligations behind the final albums in the short interview included with the "Zoom" DVD. ELO now had only three remaining official members (Lynne, Bevan and Tandy), and Lynne began devoting more time to producing. During his time in the Electric Light Orchestra, Lynne did manage to release a few recordings under his own name. In 1976, he covered the Beatles songs "With a Little Help from My Friends" and "Nowhere Man" for "All This and World War II". In 1977, he released his first solo single, the disco-flavoured "Doin' That Crazy Thing"/"Goin' Down to Rio". Despite ELO's high profile at that time, it received little airplay and failed to chart.
In 1984, Lynne and Tandy contributed two original songs, "Video!" and "Let It Run", to the film "Electric Dreams" (they also provided a third song, "Sooner or Later", which was released as the B-side of "Video!"). Lynne also wrote the song "The Story of Me", which was recorded by the Everly Brothers on their comeback album "EB84". Even before the official end of ELO, Lynne began his move toward focusing almost exclusively on studio production work. He produced and wrote the 1983 Top 40 hit "Slipping Away" for Dave Edmunds and played on sessions (with Tandy) for Edmunds' album "Information". Lynne also produced six tracks on Edmunds' 1984 follow-up album, "Riff Raff". In contrast to the dense, boomy, baroque sound of ELO, Lynne's post-ELO studio work has tended toward more minimal, acoustic instrumentation and a sparse, "organic" quality that generally favours light room ambience and colouration over artificial reverb, especially on vocals. Lynne's recordings also often feature the jangling compressed acoustic guitar sound pioneered by Roger McGuinn and a heavily gated snare drum sound.
The Beatles' influence on Lynne was clearly evident in his ELO work, and the connection was strengthened when Lynne produced George Harrison's "Cloud Nine". The latter was a successful comeback album for Harrison, released in 1987, featuring the popular singles "Got My Mind Set on You", "When We Was Fab" (in whose video Lynne played the violin) and "This Is Love", the last two of which were co-written by Lynne. Lynne's association with Harrison led to the 1988 formation of the Traveling Wilburys, a studio "supergroup" that also included Tom Petty, Bob Dylan and Roy Orbison and resulted in two albums ("Vol. 1" and "Vol. 3"), both produced by Harrison and Lynne. In 1988, Lynne also worked on Orbison's album "Mystery Girl", co-writing and producing Orbison's last major hit, "You Got It", plus two other tracks on that album. For "Rock On!", the final Del Shannon album, Lynne co-wrote "Walk Away" and finished off several tracks after Shannon's death.
In 1989, Lynne co-produced "Full Moon Fever" by Tom Petty, which included the hit singles "Free Fallin'", "I Won't Back Down" and "Runnin' Down a Dream", all co-written by Lynne. This album and "Traveling Wilburys Vol. 1" both received nominations for the Grammy Award for Album of the Year, and the Traveling Wilburys won the Grammy for Best Rock Performance by a Duo or Group with Vocal that year. Lynne's song "One Way Love" was released as a single by Agnetha Fältskog and appeared on her second post-ABBA album, "Eyes of a Woman". Lynne co-wrote and produced the track "Let It Shine" for Beach Boys founder Brian Wilson's first solo album in 1988. Lynne also contributed three tracks to an album by Duane Eddy and "Falling in Love" on "Land of Dreams" for Randy Newman.
In 1990, Lynne collaborated on the Wilburys' follow-up, "Traveling Wilburys Vol. 3", and released his first solo album, "Armchair Theatre". The album featured Harrison and Tandy and included the singles "Every Little Thing" and "Lift Me Up". It received some positive critical attention but little commercial success. Lynne also provided the song "Wild Times" to the motion picture soundtrack "" in 1991. That year, Lynne returned to the studio with Petty, co-writing and producing the album "Into the Great Wide Open" for Tom Petty and the Heartbreakers, which featured the singles "Learning to Fly" and "Into the Great Wide Open". The following year he produced two songs on Roy Orbison's posthumous album "King of Hearts", including the single "I Drove All Night".
In February 1994, Lynne worked with the three surviving Beatles on the "Anthology" album series. At Harrison's request, Lynne was brought in to assist in reevaluating John Lennon's original studio material. The songs "Free as a Bird" and "Real Love" were created by digitally processing Lennon's demos for the songs and overdubbing the three surviving band members to form a virtual Beatles reunion that the band had mutually eschewed during Lennon's lifetime. Lynne has also produced records for Ringo Starr and worked on Paul McCartney's Grammy nominated album "Flaming Pie".
Lynne's work in the 1990s also includes production of a 1993 album for singer-songwriter Julianna Raye entitled "Something Peculiar" and production or songwriting contributions to albums by Roger McGuinn ("Back from Rio") and Joe Cocker ("Night Calls"), songs by Aerosmith ("Lizard Love"), Tom Jones ("Lift Me Up"), Bonnie Tyler ("Time Mends a Broken Heart"), the film "Still Crazy", Hank Marvin ("Wonderful Land" and "Nivram"), Et Moi ("Drole De Vie") and the Tandy Morgan Band ("Action"). In 1996, Lynne was officially recognised by his peers when he was awarded the Ivor Novello Award for "Outstanding Contributions to British Music" for a second time.
In 2000, Lynne reactivated ELO and released the retrospective box set "Flashback", containing many newly finished, previously unreleased tracks. The following year Lynne released "Zoom", the first new ELO album in fifteen years. The album featured guest appearances by Ringo Starr, George Harrison and Richard Tandy, with Lynne multi-tracking the majority of the instruments and vocals. The album received positive reviews but had no hit singles. It was marketed as a "return to the classic ELO sound" in an attempt to connect with a loyal body of fans and to jump-start a planned concert tour (with Lynne and Tandy as the only returning original ELO members). While a live performance was taped at CBS Television City over two consecutive nights and shown on PBS (with a subsequent DVD release), the tour itself was cancelled.
Earlier in 2001, Lynne had begun working with George Harrison on what would turn out to be Harrison's final album, "Brainwashed". After Harrison's death from cancer on 29 November 2001, Lynne returned to the studio in 2002 to help finish the uncompleted album. Lynne was heavily involved in the memorial "Concert for George", held at London's Royal Albert Hall in November 2002, which also featured Traveling Wilburys member Petty. Lynne sang the lead vocal on "The Inner Light", "I Want to Tell You" and "Give Me Love (Give Me Peace on Earth)", and subsequently produced the surround sound audio mix for the "Concert for George" DVD, released in November 2003, which later received a Grammy. In 2004, Lynne and Petty inducted Harrison into the Rock and Roll Hall of Fame, then played "Handle with Care" with Dhani Harrison and "While My Guitar Gently Weeps" joined by Prince, Steve Winwood, and others. In 2006, Lynne reunited with Petty to produce the latter's third solo release, "Highway Companion".
In a Reuters article on 23 April 2009, Lynne said that he had been working on the follow-up to his 1990 solo debut album "Armchair Theatre" with a possible tentative release date of "later this year". He also produced four tracks on Regina Spektor's fifth album "Far", released 23 June 2009.
In a March 2010 interview with the "Daily Express" newspaper, Lynne confirmed he was working on a new album with Joe Walsh and simultaneously "writing a couple of albums under his own name, though he won't tell us in which musical direction he's heading." Lynne contributed a cover of Buddy Holly's "Words of Love" for the tribute album "", which was released on 6 September 2011. On 31 December 2011, Brian Williams reported on NBC New Year's Eve with Carson Daly that "2012 releases will include rare new work from Jeff Lynne."
In 2012, Walsh released his "Analog Man" album, which was produced by Lynne. Lynne's second solo album, a covers album entitled "Long Wave", was released on 8 October 2012. A greatest hits collection of re-recorded ELO songs by Lynne titled "" was also released under the ELO moniker on the same day. Lynne suggested that a new album of original material might be released during 2013. In 2012, Lynne and Tandy teamed up at Lynne's Bungalow Palace home studio to record a live set of ELO's songs, which was broadcast on TV as part of the "Mr. Blue Sky" documentary. Lynne and Tandy reunited again on 12 November 2013 to perform, under the name Jeff Lynne and Friends, "Livin' Thing" and "Mr. Blue Sky" at the Children in Need Rocks concert at Hammersmith Eventim Apollo, London.
On 9 February 2014, Lynne performed George Harrison's "Something" with Joe Walsh and Dhani Harrison on "", as well as "Hey Bulldog" from the "Yellow Submarine" soundtrack, accompanying Dave Grohl, to commemorate the 50th anniversary of the Beatles' performance on "The Ed Sullivan Show". On 5 March 2014, Lynne received an honorary doctorate from Birmingham City University. He also mentioned he was working with Bryan Adams on new material. On 14 September 2014, Jeff Lynne and his touring band, under the name Jeff Lynne's ELO, played a public concert for the first time in over 25 years, headlining the Radio 2 festival in Hyde Park, London. Never particularly enthusiastic about live performance even in his younger days, Lynne called the event "easily the best concert I've ever been involved with".
On 8 February 2015, Lynne appeared at the Grammy Awards, playing "Evil Woman" and "Mr. Blue Sky" with Ed Sheeran.
On 10 September 2015, Lynne's website announced he had signed a contract with Columbia Records to deliver an album of new ELO music, the first in 14 years. On 24 September 2015, "When I Was a Boy", the first single from "Alone in the Universe", was released online, with a music video following shortly after. The album was released on 13 November 2015 and was followed by promotional shows, including the first ELO shows in the United States in 30 years. A 2016 European tour was scheduled, with dates in Dublin, Amsterdam and Zurich among the locations; the Dublin concert was delayed by a week on medical advice given to Lynne.
On 24 June 2017, Lynne performed at Wembley Stadium to a crowd of 60,000, playing a 24-song setlist including "Xanadu", "Do Ya" and "Twilight". The concert was released on DVD and CD under the title "Wembley or Bust".
On 2 August 2018, Lynne and his band Jeff Lynne's ELO began a 10-city tour of North America, with stops in the US cities of Oakland, Los Angeles, Denver, Houston, Dallas, Rosemont, Detroit, New York City and Philadelphia, plus Toronto in Canada.
On 12 September 2018, Jeff Lynne's ELO began a tour throughout Europe including dates in Stockholm, Oslo, Copenhagen, Hamburg, Berlin, Munich, Mannheim, Vienna, Amsterdam, Nottingham, Glasgow, Manchester, Newcastle upon Tyne, Birmingham, Leeds, London, Liverpool, Dublin, and Belfast.
On 20 June 2019, Jeff Lynne's ELO began a North American tour with Dhani Harrison.
On 26 September 2019, Jeff Lynne's ELO announced a new album, called "From Out of Nowhere", which was subsequently released on 1 November of the same year. The album was accompanied by the release of an eponymous single which premiered on BBC Radio 2 that same day.
Lynne has been married twice: first to Rosemary (née Adams, b. 1952) from 1972 to 1977, and then to Sandi Kapelson since 1979, with whom he has two daughters. He is a fan of Birmingham City F.C.
See also: The Idle Race discography, the Move discography, Electric Light Orchestra discography, Traveling Wilburys discography
Judge Dredd
Judge Dredd is a comic book franchise based on the longest-running comic strip in "2000 AD" (1977), a British weekly anthology comic.
The franchise is centered around Judge Dredd, a law enforcement and judicial officer in the dystopian future city of Mega-City One, which covers most of the east coast of North America. He is a "street judge", empowered to summarily arrest, convict, sentence, and execute criminals.
Over the years "Judge Dredd" has been hailed as one of the best satires of American and British culture, with an uncanny tendency to anticipate events such as rampant mass surveillance, the rise of populist leaders, and the COVID-19 pandemic.
When comics editor Pat Mills was developing "2000 AD" in 1976, he brought in his former writing partner, John Wagner, to develop characters. Wagner had written a "Dirty Harry"-style "tough cop" story, "One-Eyed Jack", for "Valiant", and suggested a character who took that concept to its logical extreme. Mills had developed a horror strip called "Judge Dread" (after the stage name of British ska and reggae artist Alexander Minto Hughes), before abandoning the idea as unsuitable for the new comic; but the name, with the spelling modified to "Dredd" at the suggestion of sub-editor Kelvin Gosnell, was adopted by Wagner.
The task of visualising the character was given to Carlos Ezquerra, a Spanish artist who had worked for Mills before on "Battle Picture Weekly". Wagner gave Ezquerra an advertisement for the film "Death Race 2000", showing the character Frankenstein (played by David Carradine) clad in black leather on a motorbike, as a suggestion of Dredd's appearance. Ezquerra added body-armour, zips, and chains, which Wagner initially objected to, commenting that the character looked like a "Spanish pirate." Wagner's initial script was rewritten by Mills and drawn up by Ezquerra. The hardware and cityscapes Ezquerra had drawn were far more futuristic than the near-future setting originally intended; in response, Mills set the story further into the future, on the advice of his art assistant Doug Church. The original launch story written by Wagner and drawn by Ezquerra was vetoed by the board of directors for being too violent. A new script was needed for the first episode.
Mills initially based the characterisation of Judge Dredd on Brother James, one of his teachers at St Joseph's College, Ipswich. Brother James was considered to be an excellent teacher, but also an excessively strict disciplinarian to the extent that he was considered abusive. In his blog, Mills detailed the moments of rage for which Brother James had a reputation and his own experience witnessing them. The De La Salle monks at the school were a major influence in the 2000 AD design of the 'judge, jury and executioner' attitude of the judges. The name Joseph refers to the school.
By this stage, Wagner had quit, disillusioned that a proposed buy-out of the new comic by another company, which would have given him and Mills a greater financial stake in the comic, had fallen through. Mills was reluctant to lose "Judge Dredd" and farmed the strip out to a variety of freelance writers, hoping to develop it further. Their scripts were given to a variety of artists as Mills tried to find a strip which would provide a good introduction to the character. This "Judge Dredd" would not be ready for the first issue of "2000 AD", launched in February 1977.
The story chosen to introduce the character was submitted by freelance writer Peter Harris, and was extensively re-written by Mills, who added a new ending suggested by Kelvin Gosnell. It was drawn by newcomer Mike McMahon. The strip debuted in "prog" (issue) no. 2. Around this time Ezquerra quit and returned to work for "Battle"; accounts of why conflict. Ezquerra says it was because he was angry that another artist had drawn the first published "Judge Dredd" strip, while Mills says he chose McMahon because Ezquerra had already left, having been offered a better deal by the editor of "Battle".
Wagner soon returned to the character, starting in prog 9. His storyline, "The Robot Wars", was drawn by a rotating team of artists (including Ezquerra), and marked the point where Dredd became the most popular character in the comic, a position he has rarely relinquished. Judge Dredd has appeared in almost every issue since, most of the stories written by Wagner (in collaboration with Alan Grant between 1980 and 1988).
In 1983, Judge Dredd made his American debut with his own series from publisher Eagle Comics, titled "Judge Dredd". It consisted of stories reprinted from the British comic. Since 1990, Dredd has also had his own title in Britain, the "Judge Dredd Megazine". With Wagner concentrating his energies on that, the "Dredd" strip in "2000 AD" was left to younger writers, including Garth Ennis, Mark Millar, Grant Morrison and John Smith. Their stories were less popular with fans, and sales fell. Wagner returned to writing the character full-time in 1994.
"Judge Dredd" has also been published in a long-running comic strip (1981–1998) in the "Daily Star", and briefly in "Metro" from January to April 2004. These were usually created by the same teams writing and drawing the main strip, and the "Daily Star" strips have been collected into a number of volumes.
In 2012, Dredd was one of 10 British comic characters commemorated in a series of stamps issued by the Royal Mail.
The setting of "Judge Dredd" is a dystopian future Earth damaged by a series of international conflicts; much of the planet has become radioactive wasteland, and so populations have aggregated in enormous conurbations known as 'mega-cities'. The story is centred on the megalopolis of Mega-City One, on the east coast of North America. Within Mega-City One, extensive automation (including intelligent robots) has rendered the majority of the population unemployed. As a consequence, the general population is prone to embracing any fashion or craze they encounter. Mega-City One is surrounded by the inhospitable "Cursed Earth". Much of the remaining world's geography is somewhat vague, although other mega-cities are visited in the strip.
Mega-City One's population lives in gigantic towers known as City Blocks, each holding some 50,000 people. Each is named after a historical person or TV character, usually for comic effect. For example, Joe Dredd used to live in Rowdy Yates Block; Rowdy Yates was a character in the American TV cowboy drama "Rawhide", played by a young Clint Eastwood, who would later play the lead in "Dirty Harry", one of the thematic influences on Judge Dredd. A number of stories feature rivalries between different blocks, on many occasions breaking into full-scale gun battles (as in the story "Block Mania"). The story "Origins" revealed that Mega-City One was formed by urban sprawl rather than deliberate design, and by 2051 it was recognised as the world's first mega-city. The Judges' powers reflect the difficulty of maintaining order. Mega-City One extends from Boston to Charlotte, but extended into Florida before the Apocalypse War laid waste to the southern sectors. At its height, the city contained a population of about 800 million; after the Apocalypse War, this was halved to 400 million. Following Chaos Day in 2134, the city was reduced to 50 million, though immigration quickly increased the population to 72 million by 2137.
There are four other major population centres in Dredd's North America: the first is Texas City, encompassing several of the former southern United States and based on Wild West manners. South of the city is Mex-City. Far north is Uranium City. Canada, now called Canadia, remains a nation with scattered communities. Mega-City Two once existed on the West Coast, but was destroyed in 2114 during the world war known as Judgement Day. Nuclear deserts and destruction elsewhere in the world are also extensive: much of the North Atlantic is severely polluted, and is now known as the "Black Atlantic". An underwater settlement known as Atlantis exists in the Atlantic, half-way along a tunnel from Mega-City One to Brit-Cit (England).
Nuclear desert also stretches across western Europe. The British Isles are Brit-Cit, Cal-Hab (Scotland), and Murphyville in Ireland (a country-sized theme park depicting a stereotypical view of traditional Irish life). The continent has Euro-City (eastern France and part of Germany), Ciudad España (eastern Spain), the Ruhr and Berlin Conurbs in Germany, Vatican City, and a scattering of other city-states. Russia's East-Meg One was destroyed by Dredd at the climax of the Apocalypse War in 2104. Further east is East-Meg Two, which has other territories under the "Sov Block" banner. Mongolia, lacking a Mega-City or Judge system, has called itself the Mongolian Free State and criminals have flocked there for a safe haven; East-Meg Two performed vicious clearances there in 2125.
Compared to North America and Europe, South America is shown to be in much better condition. Large fertile farmlands still exist and feed many cities worldwide, as do jungles and a variety of wildlife. The main population centres are the highly corrupt cities of Ciudad Barranquilla in Argentina and the Pan-Andes Conurb in the Andes on the Bolivian and Peruvian borders. Two other cities formerly existed, South-Am City and Brasilia, both of which were annihilated on Judgement Day.
In Asia, separated from East-Meg Two by an extensive nuclear desert, are Sino-City One (destroyed during Judgement Day) and Sino-City Two in eastern China, with Hong Tong built in the remains of Hong Kong and partitioned between Sino-Cit and Brit-Cit control. Hondo City lies on the remains of the islands of Japan. Nu-Delhi (previously Indo-Cit and Delhi-City) is in southern India. Surrounding Sino-City 2 is the Radlands of Ji, a nuclear desert containing outlaw gangs and martial arts schools. In the Pacific, cities survive in southeastern Australia or "Oz" (the Sydney-Melbourne Conurbation), the Solomon Islands (Solomon City), Tonga (Friendly City), and the New Pacific City; New Zealand is said to exist as well. All of Indonesia's islands are now linked by a network of mutant coral called "The Web", described as a lawless hotbed of crime, although a city called Djakarta did exist there at one point but was lost on Judgement Day.
The Middle East is without many major cities, being either nuclear or natural desert, and only the mega-city of Luxor, Egypt has survived; the Mediterranean coast is heavily damaged by mutagens. In Africa much of the south is nuclear desert and a 'Great African Dustbowl' has formed in the northwest; but a large number of nation states have survived, of which Simba City (Gabon), New Jerusalem (Ethiopia), Zambian Metropolitan, and Dar es Salaam are the largest cities. Nuclear fallout and pollution appear to have missed Antarctica and the Arctic, allowing one mega-city (Antarctic City) to be constructed there.
The high levels of pollution have created instances of mutation in humans and animals. The mega-cities largely operate on a system of genetic apartheid, making expulsion from the cities the worst punishment possible. Mega-City One ended apartheid in the 2130s, but encourages mutants to move to Cursed Earth townships instead of remaining in the city.
Earth's moon has been colonised, with a series of large domes forming Luna City; another colony, Puerto Luminae, exists but is lawless. In addition, many deep space colonies have been established. Some are loyal to various mega-cities, while many are independent states, and others still face violent insurgencies to gain independence. The multi-national Space Corps battles both insurgencies and external alien threats. The newly discovered planet 'Hestia' (which orbits the Sun at 90 degrees to Earth's orbit) has a colony; there are some references to colonies on Mars; Saturn's moon Titan has a judicial penal colony; and Mega-City One is known to have deep space missile silos on Pluto.
The paranormal is both common and often openly visible, and so is accepted by both civilians and Judges. Ghosts, demons, ancient gods and two different creatures both claiming to be Satan have appeared in Mega-City One, with the Grand Hall itself known to be haunted by a disgraced former Chief Judge. Magic is real and has been practised by some criminals. Psi-Divisions worldwide tend to be the main defence against such threats.
Street Judges act as police, judge, jury, and executioner. Capital punishment in Mega-City One is rarely used, though deaths while resisting arrest are commonplace. Numerous writers have used the Judge System to satirize contemporary politics.
Judges, once appointed, can be broadly characterised as "Street Judges" (who patrol the city) and administrative, office-based Judges. Dredd was once offered the job of Chief Judge, but refused it. The incorruptibility of the Judges is supposedly maintained by the Special Judicial Squad (SJS), although SJS Judges have themselves broken the law on occasion, most notably SJS head Judge Cal, who killed the Chief Judge and usurped his office. The Judge System has spread worldwide, with various super-cities possessing similar methods of law enforcement; this political model has thus become the most common form of government on Earth, with only a few small areas practising civilian rule. There is an international "Judicial Charter" which countries and city states join upon instituting a Judge System.
Almost all of the stories from both comics are currently being reprinted in their original order of publication in a series of trade paperbacks. Stories from the regular issues of "2000 AD" and the "Megazine" are collected in a series entitled "Judge Dredd: The Complete Case Files." This series began in 2005. Stories from special holiday issues and annuals appeared in "Judge Dredd: The Restricted Files". This four-volume series began in 2010 and concluded in 2012.
There have been a number of Judge Dredd storylines that have either significantly developed the Dredd character or the fictional world, or which depict a story on a grand scale. These are listed below.
Shortly before the release of the 1995 movie, three new comic book titles were released, followed by a one-off comic version of the film story.
DC Comics published an alternative version of Judge Dredd between 1994 and 1996, lasting 18 issues. Continuity and history were different from both the original "2000 AD" version and the 1995 film. A major difference was that Chief Judge Fargo, portrayed as incorruptible in the original version, was depicted as evil in the DC version. Most issues were written by Andrew Helfer, but the last issue was written by Gordon Rennie, who has since written "Judge Dredd" for "2000 AD" (Note: the DC crossover story "" featured the original Dredd, not the version depicted in this title).
Another DC Comics title, lasting 13 issues between 1994 and 1995. Although these were intended to feature the same version of Judge Dredd as in the other DC title, the first four issues were written by John Wagner and Alan Grant and were consistent with their original "2000 AD" version.
From the same publishers as "2000 AD", this was nevertheless a completely different version of Dredd aimed at younger readers. Editor David Bishop prohibited writers from showing Dredd killing anyone, a reluctance which would be completely unfamiliar to readers acquainted with the original version. As one reviewer put it years later: "this was Judge Dredd with two vital ingredients missing: his balls." It ran fortnightly for 23 issues from 1995 to 1996, plus one "Action Special".
Written by Andrew Helfer and illustrated by Carlos Ezquerra and Michael Danza. Published by DC Comics in 1995, but a different version of Dredd from that in the DC comic books described above.
From the same publishers as "2000 AD", this was a series of ultra-violent one-off stories from "a separate and aggressive Dredd world". The first eight episodes were originally published in "Rock Power" magazine, and were all co-written by John Wagner and Alan Grant and illustrated by Simon Bisley. These were reprinted, together with 11 new stories (some by other creators), in "Judge Dredd Megazine". The original eight stories were collected in a trade paperback by Hamlyn in 1993. The complete series was collected by Rebellion Developments in 2009.
In the week that the 2012 film "Dredd" was released in the UK, a 10-page prologue was published in issue #328 of "Judge Dredd Megazine", written by its editor, Matt Smith, and illustrated by Henry Flint. "Top of the World, Ma-Ma" told the backstory of the film's main antagonist, Ma-Ma. Five more stories featuring this version of the character were published in "Judge Dredd Megazine": "Underbelly" in #340–342 (2013), "Uprise" in #350–354 (2014), "Dust" in #367–371 (2015–'16), "Furies" in #386–387 (2017), and "The Dead World" in #392–396 (2018) (there were also two "Judge Anderson" stories featuring the film version of that character in #377–379).
An American film loosely based on the comic strip was released in 1995, starring Sylvester Stallone as Dredd. (Arnold Schwarzenegger was reportedly approached for the role first, but declined because in the original script Dredd kept his helmet on for major parts of the film.) The film received negative reviews upon its release, and currently holds a 15% rating on the review aggregator website Rotten Tomatoes, with the critical consensus stating that "Director [Danny] Cannon fails to find the necessary balance to make it work". In deference to its expensive Hollywood star, Dredd's face was shown; in the comic, he very rarely removes his helmet, and even then his real face is never revealed. The writers also largely omitted the ironic humour of the comic strip and ignored important aspects of the "Dredd mythology": in the film, for example, a "love interest" develops between Dredd and Judge Hershey, something strictly forbidden between Judges (or between Judges and anyone else) in the comic strip. In the United States, the film won several "worst film of the year" awards.
The co-creator and main writer of the comic character, John Wagner, said:
However, the film has since been praised for its depiction of Dredd's city, costumes, humour and larger-than-life characters.
Reliance Entertainment produced "Dredd", which was released in September 2012. It was positively received by critics, holding a 78% rating on Rotten Tomatoes. It was directed by Pete Travis and written by Alex Garland. Michael S. Murphey was co-producer with Travis. Karl Urban was cast as Judge Dredd and Olivia Thirlby portrayed Judge Anderson. Dredd's costume was radically redesigned for the film, adding armor plates and reducing the size and prominence of the shoulder insignia.
The main "Judge Dredd" writer John Wagner said:
The film was shot in 3-D and filmed in Cape Town and Johannesburg. Funding was secured from Reliance Big Entertainment.
On May 10, 2017, "Entertainment Weekly" announced that independent entertainment studio IM Global and Rebellion have partnered to develop a live-action TV show called "Judge Dredd: Mega-City One". The show is planned to be an ensemble drama about a team of Judges as they deal with the challenges of the future-shocked 22nd century.
Jason Kingsley, owner of Rebellion, told the "Guardian" in May 2017 that the TV show will be far more satirical than the movie adaptations and could become "one of the most expensive TV shows the UK has ever seen".
According to Karl Urban, the studio's concept is to "build the show around more rookie judges and young, new judges", where Dredd himself "would come in and out". Urban stated that he would be interested in reprising the role for this, on the condition that Dredd's part of the story be implemented in a "meaningful way".
In November 2018, Rebellion began setting up a new studio in Didcot, valued at $100 million, for film and TV series based on "2000 AD" characters, including "Judge Dredd: Mega City One."
There have been multiple Judge Dredd games released for various video game consoles and home computers, such as the Sony PlayStation, Sinclair ZX Spectrum and Commodore 64. The first game, titled "Judge Dredd", was released in 1986. Another game, also titled "Judge Dredd", was released in 1990. At one time, an arcade game was being developed by Midway Games but it was never released. It can however be found online and has three playable levels.
A game loosely based on the first live action film, called "Judge Dredd" was developed by Probe Software and released by Acclaim for the Sega Genesis, Super Nintendo Entertainment System, Game Boy, and Game Gear. Bally produced a "Judge Dredd" pinball machine based on the comics. In 1997, Acclaim released a "Judge Dredd" arcade game, a rail shooter featuring 3D graphics and full motion video footage shot specifically for the game.
"" was produced by Rebellion Developments and released in early 2003 by Sierra Entertainment for the PC, PlayStation 2, Xbox and Nintendo GameCube. The game sees the return of the Dark Judges when Mega-City One becomes overrun with vampires and the undead. The player takes control of Judge Dredd, with the optional addition of another human player in co-operative play. The game is a first-person shooter – with key differences such as the requirement to arrest lawbreakers, and an SJS death squad which will hunt down Dredd should the player kill too many civilians. The player can also go up against three friends in the various multiplayer modes, which include "Deathmatch", "Team Deathmatch", "Elimination", "Team Elimination", "Informant", "Judges Vs. Perps", "Runner" and more. A novel was based on the game.
A costume set for the PlayStation 3 video game "LittleBigPlanet" was released in May 2009, which contained outfits to dress the game's main character Sackboy as five "2000 AD" characters, one of which is Judge Dredd. Dredd's uniform is also used to create the Judge Anderson costume for the Sackpeople.
In 2012, Rebellion released "Judge Dredd Vs. Zombies", a game application for iPhone, Android phones, Windows 8 and Windows Phone.
Games Workshop released a "" role-playing game in 1985. Mongoose Publishing released "The Judge Dredd Roleplaying Game" in 2002 and another "Judge Dredd" game using the "Traveller" system in 2009. Their licence ended in 2016. In February 2017, EN Publishing announced the new "Judge Dredd & The Worlds of 2000 AD Tabletop Adventure Game" using the WOIN ("What's OLD is NEW") role-playing game system.
On July 17, 2012, Tin Man Games released a "Judge Dredd"-themed digital role-playing gamebook titled "Judge Dredd: Countdown Sector 106", available for the iOS operating system.
Games Workshop produced a "Judge Dredd" boardgame based on the comic strip in 1982. In the game players, who represent judges, attempt to arrest perps that have committed crimes in different locations in Mega-City One. A key feature of the game is the different action cards that are collected during play; generally these cards are used when trying to arrest perps, although some cards can also be played against other players to hinder their progress or sabotage their arrest attempts. The winner of the game is the judge who has collected the most points arresting perps. Additionally, there were many amusing card combinations, such as arresting Judge Death for selling old comics, as the "Old Comic Selling" crime card featured a "2000 AD" cover with Judge Death on it. The game used characters, locations and artwork from the comic, but is now out of print.
In 1987, Games Workshop published a second Dredd-inspired boardgame, "Block Mania". In this game for two players, players take on the role of rival neighboring blocks at war. This was a heavier game than the earlier Dredd boardgame, focused on tactical combat, in which players control these residents as they use whatever means they can to vandalize and destroy their opponent's block. Later the same year, Games Workshop released the Mega Mania expansion for the game, allowing the game to be played by up to four players.
Mongoose Publishing have released a miniatures skirmish game of gang warfare based in Mega-City One called "Gangs of Mega-City One", often referred to as GOMC1. The game features judges being called in when a gang challenges another gang that is too tough to fight. A wide range of miniatures has been released including box sets for an Ape Gang and an Undercity Gang. A Robot Gang was also produced but was released as two blister packs instead of a box set. Only one rules expansion has been released, called "Death on the Streets". The expansion introduced many new rules including usage of the new gangs and the ability to bring Judge Dredd himself into a fight. This game went out of print shortly thereafter, but was replaced by the "Judge Dredd Miniatures Game", which was published free in many stages as the company sought feedback from fans and players. In 2012, an expansion was released called "Block War!". Miniatures continue to be manufactured at a slow pace.
In November 2017, Osprey Games announced their development of a new graphic adventure card game, entitled "Judge Dredd: The Cursed Earth". The game is based on "The Lost Expedition", an earlier game from designer Peer Sylvester. In the game, one to five players "[lead] a team of judges against dinosaurs, mutants, and the Cursed Earth itself". It was released on 21 February 2019.
There was a short-lived collectible card game called simply "Dredd". In the game, players would control a squad of judges and arrest perps. The rules system was innovative and the game was well-received by fans and collectors alike, but various issues unrelated to the game's quality caused its early demise.
There was a four-player pinball game released in 1993, produced by Bally Manufacturing.
From 1993 to 1995, Virgin Books published nine Judge Dredd novels. They had hoped the series would be a success in the wake of the feature film, but the series was cancelled due to insufficient sales. In August 2015, these novels were re-released as e-books. The books are:
Also in 1995, St. Martins Press published two novelizations of the film:
In 1997, Virgin published a "Doctor Who" novel by Dave Stone which had originally been intended to feature Judge Dredd, called "Burning Heart". However this idea was abandoned after the film was released, and Dredd was replaced by another character called Adjudicator Joseph Craator.
From 2003 to 2007, Black Flame published official "2000 AD" novels, including a new run of Judge Dredd novels. After Black Flame closed in 2007, Rebellion picked up the rights to their "2000 AD" titles in 2011, and began republishing them as e-books. Their nine Judge Dredd books are:
In July 2012, three of these novels – Gordon Rennie's "Dredd Vs Death", David Bishop's "Kingdom of the Blind", and Matt Smith's "The Final Cut" – were republished in a single paperback volume titled "Dredd", as a tie-in with the 2012 film of the same title.
In August 2012, Rebellion announced a new series of e-books under the series title "Judge Dredd: Year One", about Dredd's first year as a judge (the stories in the comic strip having begun in his 20th year when he was already a veteran). All three stories were published by Abaddon Books in a paperback book called "Judge Dredd Year One Omnibus" in October 2014.
In 2016, more e-books were published under the series title "Judge Dredd: Year Two":
In May 2018, a series of three books, collectively called "Judges", was announced. These are not about Dredd but about the first generation of judges, and are set six decades before Dredd's first stories to appear in the comic. The announced books are:
"The Day the Law Died" and "The Apocalypse War" stories were produced by Dirk Maggs and broadcast in three-minute segments (40 for each story) on Mark Goodier's afternoon show on BBC Radio One in 1995. The cast included Lorelei King and Gary Martin. They were issued separately on dual cassette and double CD. Both titles have since been deleted. "The Apocalypse War" also contains plot elements from "Block Mania", as that story set the scene for the main story.
In recent years, Big Finish Productions has produced 18 audio plays featuring "2000 AD" characters. These have mostly featured Judge Dredd, although three are Strontium Dog stories. In these, Judge Dredd is played by Toby Longworth, and Johnny Alpha, the Strontium Dog, is played by Simon Pegg. In July 2009, four further Judge Dredd titles were released under the banner "Crime Chronicles", once more featuring Toby Longworth.
The list of "2000 AD" audio plays featuring Dredd includes:
"Note: 3 and 10 are Strontium Dog stories that do not feature Dredd."
James Flynn (academic)
James Robert Flynn FRSNZ (born 1934) is a New Zealand intelligence researcher. An Emeritus Professor of Political Studies at the University of Otago in Dunedin, New Zealand, he is famous for his publications about the continued year-after-year increase of IQ scores throughout the world, which is now referred to as the Flynn effect. The Flynn effect is the subject of a multiple-author monograph published by the American Psychological Association in 1998. Originally from Washington, D.C. and educated at the University of Chicago, Flynn emigrated to New Zealand in 1963.
Flynn's son Victor is a mathematics professor at New College, Oxford.
Flynn has written a variety of books. His research interests include humane ideals and ideological debate, classics of political philosophy, and race, class and IQ (see race and intelligence). His books combine political and moral philosophy with psychology to examine problems such as justifying humane ideals and whether it makes sense to rank races and classes by merit. He is currently a member of the editorial board of "Intelligence" and on the Honorary International Advisory Editorial Board of the Mens Sana Monographs.
Flynn defines intelligence to be independent of culture, emphasising that the style of thought required to deal with problems of survival in a desert (mapping, tracking, etc.) is different from that required to do well in the modern West (academic achievement, etc.), but that both undoubtedly require intelligence.
In 1987, Arthur Jensen praised Flynn's criticism of Jensen's own work in a chapter summarizing an academic book about Jensen's research on human intelligence.
Now and then I am asked by colleagues, students, and journalists: who, in my opinion, are the most respectable critics of my position on the race-IQ issue? The name James R. Flynn is by far the first that comes to mind. His book, "Race, IQ and Jensen" (1980), is a distinguished contribution to the literature on this topic, and, among the critiques I have seen of my position, is virtually in a class by itself for objectivity, thoroughness, and scholarly integrity.
A 1999 article published in "American Psychologist" summarises much of his research. On the alleged genetic inferiority of Blacks on IQ tests, he lays out the argument and evidence for such a belief and then contests each point. He interprets the direct evidence—cases in which Blacks are raised in less disadvantageous settings—as suggesting that the IQ gap is environmental in origin. And yet, he argues that the environmental explanation gained force after the discovery that IQ scores were rising over time. Inter-generational IQ differences among Whites and across nations were larger than the Black-White IQ gap and could not be accounted for by genetic factors, which, if anything, should have reduced IQ, according to scholars he references. He posits that the Black-White IQ score gap can be entirely explained by environmental factors if "the average environment for Blacks in 1995 matches the quality of the average environment for Whites in 1945".
Flynn's 2007 book "What Is Intelligence?" impressed Charles Murray, a co-author of the book "The Bell Curve", who wrote in a statement published on the book's back cover, "This book is a gold mine of pointers to interesting work, much of which was new to me. All of us who wrestle with the extraordinarily difficult questions about intelligence that Flynn discusses are in his debt".
Flynn believes in racial equality. He advocates for open scientific debate about controversial social science claims, and is critical of the suppression of research into race and intelligence. He urges those who believe in racial equality to use solid evidence to advance those beliefs.
Flynn's 2010 book "The Torchlight List" proposes that a person can learn more from reading great works of literature than they can from going to university. In 2019, Emerald Insight reversed its decision to publish Flynn's then upcoming book "In Defence of Free Speech". Their letter to Flynn cited concerns that the book would potentially violate hate speech laws in the United Kingdom.
Flynn has described himself as an "atheist, a scientific realist, a social democrat".
The "Flynn effect" is the substantial and long-sustained increase in intelligence test scores measured in many parts of the world. When intelligence quotient (IQ) tests are initially standardised using a sample of test-takers, by convention the average of the test results is set to 100 and their standard deviation is set to 15 IQ points. When IQ tests are revised they are again standardised using a new sample of test-takers, usually born more recently than the first. Again, the average result is set to 100. However, when the new test subjects take the older tests, in almost every case their average scores are significantly above 100.
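The renorming arithmetic described above can be sketched numerically. This is an illustrative toy example only: the raw scores below are invented, not drawn from Flynn's data, and a real IQ test is normed on a much larger, representative sample.

```python
# Toy illustration of IQ standardisation and the Flynn effect.
# The cohort raw scores are hypothetical, chosen only to show the mechanics.
import statistics

def standardise(raw_scores):
    """Return a function mapping a raw test score to an IQ score
    (mean 100, SD 15) relative to this norming sample."""
    mean = statistics.mean(raw_scores)
    sd = statistics.stdev(raw_scores)
    return lambda raw: 100 + 15 * (raw - mean) / sd

# A test is normed on an older cohort of test-takers...
older_cohort = [38, 42, 45, 47, 50, 52, 55, 58, 61, 52]
to_iq = standardise(older_cohort)

# ...and later taken by a newer cohort whose raw performance has drifted up
# (a uniform +6 raw-score gain, purely hypothetical).
newer_cohort = [score + 6 for score in older_cohort]
newer_mean_iq = statistics.mean(to_iq(s) for s in newer_cohort)

print(round(statistics.mean(to_iq(s) for s in older_cohort)))  # 100, by construction
print(round(newer_mean_iq))  # above 100: the Flynn effect
```

On the old norms the newer cohort averages well above 100, which is exactly the pattern observed when new test subjects take older tests; re-standardising on the newer cohort would reset the average to 100 and hide the gain.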
Test score increases have been continuous and approximately linear from the earliest years of testing to the present. For the Raven's Progressive Matrices test, subjects born over a 100-year period were compared in Des Moines, Iowa, and separately in Dumfries, Scotland. Improvements were remarkably consistent across the whole period, in both countries. This effect of an apparent increase in IQ has also been observed in various other parts of the world, though the rates of increase vary.
There are numerous proposed explanations of the Flynn effect, as well as some scepticism about its implications. Similar improvements have been reported for other cognitions such as semantic and episodic memory. Recent research suggests that the Flynn effect may have ended in at least a few developed nations, possibly allowing national differences in IQ scores to diminish if the Flynn effect continues in nations with lower average national IQs.
Flynn himself, with co-worker William Dickens, has suggested an explanatory model which points to two-way causality between IQ and environment: a cognitively challenging environment raises an individual's IQ, while in addition, a higher individual IQ makes it more likely that an individual will self-select or be sorted into more cognitively challenging environments.
In 1967, Flynn served as a chairperson for the Congress of Racial Equality (CORE), a civil rights organisation in the US South. (In "Where Have All the Liberals Gone?" (p. 278), Flynn states that in the early 1960s in America, he was consistently fired for his Social Democratic politics, prompting his emigration.)
Flynn campaigns passionately for left-wing causes, and became an initiating member of both the NewLabour Party and of the Alliance. He also advised Labour Prime Minister Norman Kirk on foreign policy. He has stood as a parliamentary candidate in general elections on several occasions as an Alliance list candidate. In 2008 he acted as the Alliance spokesperson for finance and taxation.
During 2007, new research from the 2006 New Zealand census showed that women without a tertiary (college) education had produced 2.57 babies each, compared to 1.85 babies for those women with a higher education. During July 2007, "The Sunday Star-Times" quoted Flynn as saying that New Zealand risked having a less intelligent population and that a "persistent genetic trend which lowered the genetic quality for brain physiology would have some effect eventually". He referred to hypothetical eugenicists' suggestions for reversing the trend, including putting some sort of oral contraceptive "in the water supply and … an antidote" for those wishing to conceive.
Flynn later articulated his own views on the "Close Up" television program in an interview with Paul Henry, suggesting that the "Sunday Star-Times" had grossly misrepresented his opinions. In the article, Flynn argued that he never intended for his suggestion to be taken seriously, as he only said this to illustrate a particular point.
In July 2012, several media outlets reported Flynn as claiming that women had, for the first time in a century, surpassed men on IQ tests based on a study he conducted in 2010. However, Flynn announced that the media had seriously distorted his results and went beyond his claims, revealing that he had instead discovered that the differences between men and women on one particular test, the Raven's Progressive Matrices, had become minimal in five modernised nations (whereas before 1982 women had scored significantly lower). Women, he argued, caught up to men in these nations as a result of exposure to modernity by entering the professions and being allowed greater educational access. Therefore, he claimed, when a total account of the Flynn effect is considered, women's closing the gap had moved them up in IQ slightly faster than men as a result. Flynn had previously documented this same trend among ethnic minorities and other disadvantaged groups. According to Flynn, the sexes are "dead equal on cognitive factors ... in their ability to deal with using logic on the abstract problems of Raven's," but that temperamental differences in the way boys and girls take the tests likely account for the tiny variations in mean scores, rather than any difference in intellectual ability.
In 2019 Flynn was told that his new book "In Defence of Free Speech: The University as Censor" would not be published by an English publisher that had previously accepted it and scheduled it for publication. The book was deemed "too controversial" under the United Kingdom's hate speech laws, under which intent is irrelevant if it is thought likely that "racial hatred could be stirred up as a result of the work." Academica Press later agreed to publish the book.
John Hay
John Milton Hay (October 8, 1838July 1, 1905) was an American statesman and official whose career in government stretched over almost half a century. Beginning as a private secretary and assistant to Abraham Lincoln, Hay's highest office was United States Secretary of State under Presidents William McKinley and Theodore Roosevelt. Hay was also an author and biographer, and wrote poetry and other literature throughout much of his life.
Born in Indiana to an anti-slavery family that moved to Illinois when he was young, Hay showed great potential, and his family sent him to Brown University. After graduation in 1858, Hay read law in his uncle's office in Springfield, Illinois, adjacent to that of Lincoln. Hay worked for Lincoln's successful presidential campaign and became one of his private secretaries at the White House. Throughout the American Civil War, Hay was close to Lincoln and stood by his deathbed after the President was shot at Ford's Theatre. In addition to his other literary works, Hay co-authored with John George Nicolay a multi-volume biography of Lincoln that helped shape the assassinated president's historical image.
After Lincoln's death, Hay spent several years at diplomatic posts in Europe, then worked for the "New-York Tribune" under Horace Greeley and Whitelaw Reid. Hay remained active in politics, and from 1879 to 1881 served as Assistant Secretary of State. Afterward, he remained in the private sector, until President McKinley, for whom he had been a major backer, made him Ambassador to the United Kingdom in 1897. Hay became Secretary of State the following year.
Hay served for almost seven years as Secretary of State, under President McKinley, and after McKinley's assassination, under Theodore Roosevelt. Hay was responsible for negotiating the Open Door Policy, which kept China open to trade with all countries on an equal basis with international powers. By negotiating the Hay–Pauncefote Treaty with the United Kingdom, the (ultimately unratified) Hay–Herrán Treaty with Colombia, and finally the Hay–Bunau-Varilla Treaty with the newly independent Republic of Panama, Hay also cleared the way for the building of the Panama Canal.
John Milton Hay was born in Salem, Indiana, on October 8, 1838. He was the third son of Dr. Charles Hay and the former Helen Leonard. Charles Hay, born in Lexington, Kentucky, hated slavery and moved to the North in the early 1830s. A doctor, he practiced in Salem. Helen's father, David Leonard, had moved his family west from Assonet, Massachusetts, in 1818, but died en route to Vincennes, Indiana, and Helen relocated to Salem in 1830 to teach school. They married there in 1831. Charles was not successful in Salem, and moved, with his wife and children, to Warsaw, Illinois, in 1841.
John attended the local schools, and in 1849 his uncle Milton Hay invited John to live at his home in Pittsfield, Pike County, and attend a well-regarded local school, the John D. Thomson Academy. Milton was a friend of Springfield attorney Abraham Lincoln and had read law in the firm Stuart and Lincoln. In Pittsfield, John first met John Nicolay, who was at the time a 20-year-old newspaperman. Once John Hay completed his studies there, the 13-year-old was sent to live with his grandfather in Springfield and attend school there. His parents and uncle Milton (who financed the boy's education) sent him to Brown University in Providence, Rhode Island, "alma mater" of his late maternal grandfather.
Hay enrolled at Brown in 1855. Although he enjoyed college life, he did not find it easy: his Western clothing and accent made him stand out; he was not well prepared academically and was often sick. Nevertheless, Hay gained a reputation as a star student and became a part of Providence's literary circle that included Sarah Helen Whitman and Nora Perry. He wrote poetry and experimented with hashish. Hay received his Master of Arts degree in 1858, and was, like his grandfather before him, Class Poet. He returned to Illinois. Milton Hay had moved his practice to Springfield, and John became a clerk in his firm, where he could study law.
Milton Hay's firm was one of the most prestigious in Illinois. Lincoln maintained offices next door and was a rising star in the new Republican Party. Hay recalled an early encounter with Lincoln:
Hay was not a supporter of Lincoln for president until after his nomination in 1860. Hay then made speeches and wrote newspaper articles boosting Lincoln's candidacy. When Nicolay, who had been made Lincoln's private secretary for the campaign, found he needed help with the huge amounts of correspondence, Hay worked full-time for Lincoln for six months.
After Lincoln was elected, Nicolay, who continued as Lincoln's private secretary, recommended that Hay be hired to assist him at the White House. Lincoln is reported to have said, "We can't take all Illinois with us down to Washington" but then "Well, let Hay come". Kushner and Sherrill were dubious about "the story of Lincoln's offhand appointment of Hay" as fitting well into Hay's self-image of never having been an office-seeker, but "poorly into the realities of Springfield politics of the 1860s"—Hay must have expected some reward for handling Lincoln's correspondence for months. Hay biographer John Taliaferro suggests that Lincoln engaged Nicolay and Hay to assist him, rather than more seasoned men, both "out of loyalty and surely because of the competence and compatibility that his two young aides had demonstrated". Historian Joshua Zeitz argues that Lincoln was moved to hire Hay when Milton agreed to pay his nephew's salary for six months.
Milton Hay desired that his nephew go to Washington as a qualified attorney, and John Hay was admitted to the bar in Illinois on February 4, 1861. On February 11, he embarked with President-elect Lincoln on a circuitous journey to Washington. By this time, several Southern states had seceded to form the Confederate States of America in reaction to the election of Lincoln, seen as an opponent of slavery. When Lincoln was sworn in on March 4, Hay and Nicolay moved into the White House, sharing a shabby bedroom. As there was only authority for payment of one presidential secretary (Nicolay), Hay was appointed to a post in the Interior Department at $1,600 per year, seconded to service at the White House. They were available to Lincoln 24 hours a day. As Lincoln took no vacations as president and worked seven days a week, often until 11 pm (or later, during crucial battles), the burden on his secretaries was heavy.
Hay and Nicolay divided their responsibilities: Nicolay tended to assist Lincoln in his office and in meetings, while Hay dealt with the correspondence, which was voluminous. Both men tried to shield Lincoln from office-seekers and others who wanted to meet with the President. Unlike the dour Nicolay, Hay, with his charm, escaped much of the hard feelings from those denied Lincoln's presence. Abolitionist Thomas Wentworth Higginson described Hay as "a nice young fellow, who unfortunately looks about seventeen and is oppressed with the necessity of behaving like seventy". Hay continued to write, anonymously, for newspapers, sending in columns calculated to make Lincoln appear a sorrowful man, religious and competent, giving of his life and health to preserve the Union. Similarly, Hay served as what Taliaferro deemed a "White House propagandist", in his columns explaining away losses such as that at First Manassas in July 1861.
Despite the heavy workload—Hay wrote that he was busy 20 hours a day—he tried to make as normal a life as possible, eating his meals with Nicolay at Willard's Hotel, going to the theatre with Abraham and Mary Todd Lincoln, and reading "Les Misérables" in French. Hay, still in his early 20s, spent time both in barrooms and at cultured get-togethers in the homes of Washington's elite. The two secretaries often clashed with Mary Lincoln, who resorted to various stratagems to get the dilapidated White House restored without depleting Lincoln's salary, which had to cover entertainment and other expenses. Despite the secretaries' objections, Mrs. Lincoln was generally the victor and managed to save almost 70% of her husband's salary in his four years in office.
After the death of Lincoln's 11-year-old son Willie in February 1862 (an event not mentioned in Hay's diary or correspondence), "it was Hay who became, if not a surrogate son, then a young man who stirred a higher form of parental nurturing that Lincoln, despite his best intentions, did not successfully bestow on either of his surviving children". According to Hay biographer Robert Gale, "Hay came to adore Lincoln for his goodness, patience, understanding, sense of humor, humility, magnanimity, sense of justice, healthy skepticism, resilience and power, love of the common man, and mystical patriotism". Speaker of the House Galusha Grow stated, "Lincoln was very much attached to him"; writer Charles G. Halpine, who knew Hay then, later recorded that "Lincoln loved him as a son".
Hay and Nicolay accompanied Lincoln to Gettysburg, Pennsylvania, for the dedication of the cemetery there, where were interred many of those who fell at the Battle of Gettysburg. Although they made much of Lincoln's brief Gettysburg Address in their 1890 multi-volume biography of Lincoln, Hay's diary states "the President, in a firm, free way, with more grace than is his wont, said his half-dozen lines of consecration."
Lincoln sent Hay away from the White House on various missions. In August 1861, Hay escorted Mary Lincoln and her children to Long Branch, New Jersey, a resort on the Jersey Shore, both as their caretaker and as a means of giving Hay a much-needed break. The following month, Lincoln sent him to Missouri to deliver a letter to Union General John C. Frémont, who had irritated the President with military blunders and by freeing local slaves without authorization, endangering Lincoln's attempts to keep the border states in the Union.
In April 1863, Lincoln sent Hay to the Union-occupied South Carolina coast to report back on the ironclad vessels being used in an attempt to recapture Charleston Harbor. Hay then went on to the Florida coast. He returned to Florida in January 1864, after Lincoln had announced his Ten Percent Plan, under which, if ten percent of a state's 1860 electorate took oaths of loyalty and of support for emancipation, the state could form a government with federal protection. Lincoln considered Florida, with its small population, a good test case, and made Hay a major, sending him to see if he could get sufficient men to take the oath. Hay spent a month in the state during February and March 1864, but Union defeats there reduced the area under federal control. Believing his mission impractical, he sailed back to Washington.
In July 1864, New York publisher Horace Greeley sent word to Lincoln that there were Southern peace emissaries in Canada. Lincoln doubted that they actually spoke for Confederate President Jefferson Davis, but had Hay journey to New York to persuade the publisher to go to Niagara Falls, Ontario, to meet with them and bring them to Washington. Greeley reported to Lincoln that the emissaries lacked accreditation by Davis, but were confident they could bring both sides together. Lincoln sent Hay to Ontario with what became known as the Niagara Manifesto: that if the South laid down its arms, freed the slaves, and reentered the Union, it could expect liberal terms on other points. The Southerners refused to come to Washington to negotiate.
By the end of 1864, with Lincoln reelected and the victorious war winding down, both Hay and Nicolay let it be known that they desired different jobs. Soon after Lincoln's second inauguration in March 1865, the two secretaries were appointed to the US delegation in Paris, Nicolay as consul and Hay as secretary of legation. Hay wrote to his brother Charles that the appointment was "entirely unsolicited and unexpected", a statement that Kushner and Sherrill found unconvincing given that Hay had spent hundreds of hours during the war with Secretary of State William H. Seward, who had often discussed personal and political matters with him, and the close relationship between the two men was so well known that office-seekers cultivated Hay as a means of getting to Seward. The two men were also motivated to find new jobs by their deteriorating relationship with Mary Lincoln, who sought their ouster, and by Nicolay's desire to wed his intended—he could not bring a bride to his shared room at the White House. They remained at the White House pending the arrival and training of replacements.
Hay did not accompany the Lincolns to Ford's Theatre on the night of April 14, 1865, but remained at the White House, drinking whiskey with Robert Lincoln. When the two were informed that the President had been shot, they hastened to the Petersen House, a boarding house where the stricken Lincoln had been taken. Hay remained by Lincoln's deathbed through the night and was present when he died. At the moment of Lincoln's death, Hay observed "a look of unspeakable peace came upon his worn features". He heard War Secretary Edwin Stanton's declaration, "Now he belongs to the ages."
According to Kushner and Sherrill, "Lincoln's death was for Hay a personal loss, like the loss of a father ... Lincoln's assassination erased any remaining doubts Hay had about Lincoln's greatness." In 1866, in a personal letter, Hay deemed Lincoln, "the greatest character since Christ". Taliaferro noted that "Hay would spend the rest of his life mourning Lincoln ... wherever Hay went and whatever he did, Lincoln would 'always' be watching".
Hay sailed for Paris at the end of June 1865. There, he served under U.S. Minister to France John Bigelow. The workload was not heavy, and Hay found time to enjoy the pleasures of Paris. When Bigelow resigned in mid-1866, Hay, as was customary, submitted his resignation, though he was asked to remain until Bigelow's successor was in place, and stayed until January 1867. He consulted with Secretary Seward, asking him for "anything worth having". Seward suggested the post of Minister to Sweden, but reckoned without the new president, Andrew Johnson, who had his own candidate. Seward offered Hay a job as his private secretary, but Hay declined, and returned home to Warsaw.
Initially happy to be home, Hay quickly grew restive, and he was glad to hear, in early June 1867, that he had been appointed secretary of legation to act as chargé d'affaires at Vienna. He sailed for Europe the same month, and while in England visited the House of Commons, where he was greatly impressed by the Chancellor of the Exchequer, Benjamin Disraeli. The Vienna post was only temporary, until Johnson could appoint a chargé d'affaires and have him confirmed by the Senate, and the workload was light, allowing Hay, who was fluent in German, to spend much of his time traveling. It was not until July 1868 that Henry Watts became Hay's replacement. Hay resigned, spent the remainder of the summer in Europe, then went home to Warsaw.
Unemployed again, in December 1868 Hay journeyed to the capital, writing to Nicolay that he "came to Washington in the peaceful pursuit of a fat office. But there is nothing just now available". Seward promised to "wrestle with Andy for anything that turns up", but nothing did prior to the departure of both Seward and Johnson from office on March 4, 1869. In May, Hay went back to Washington from Warsaw to press his case with the new Grant administration. The next month, due to the influence of his friends, he obtained the post of secretary of legation in Spain.
Although the salary was low, Hay was interested in serving in Madrid both because of the political situation there—Queen Isabella II had recently been deposed—and because the U.S. Minister was the swashbuckling former congressman, General Daniel Sickles. Hay hoped to assist Sickles in gaining U.S. control over Cuba, then a Spanish colony. Sickles was unsuccessful, and Hay resigned in May 1870, citing the low salary, but remained in his post until September. Two legacies of Hay's time in Madrid were magazine articles he wrote that became the basis of his first book, "Castilian Days", and his lifelong friendship with Sickles's personal secretary, Alvey A. Adee, who would be a close aide to Hay at the State Department.
While still in Spain, Hay had been offered the position of assistant editor at the "New-York Tribune"—both the editor, Horace Greeley, and his managing editor, Whitelaw Reid, were anxious to hire Hay. He joined the staff in October 1870. The "Tribune" was the leading reform newspaper in New York, and through mail subscriptions, the largest-circulating newspaper in the nation. Hay wrote editorials for the "Tribune", and Greeley soon proclaimed him the most brilliant writer of "breviers" (as they were called) that he had ever had.
With his success as an editorial writer, Hay's duties expanded. In October 1871, he journeyed to Chicago after the great fire there, interviewing Mrs. O'Leary, whose cow was said to have started the blaze, describing her as "a woman with a lamp [who went] to the barn behind the house, to milk the cow with the crumpled temper, that kicked the lamp, that spilled the kerosene, that fired the straw that burned Chicago". His work at the "Tribune" came as his fame as a poet was reaching its peak, and one colleague described it as "a liberal education in the delights of intellectual life to sit in intimate companionship with John Hay and watch the play of that well-stored and brilliant mind". In addition to writing, Hay was signed by the prestigious Boston Lyceum Bureau, whose clients included Mark Twain and Susan B. Anthony, to give lectures on the prospects for democracy in Europe, and on his years in the Lincoln White House.
By the time President Grant ran for reelection in 1872, Grant's administration had been rocked by scandal, and some disaffected members of his party formed the Liberal Republicans, naming Greeley as their candidate for president, a nomination soon joined in by the Democrats. Hay was unenthusiastic about the editor-turned-candidate, and in his editorials mostly took aim at Grant, who, despite the scandals, remained untarred, and who won a landslide victory in the election. Greeley died only weeks later, a broken man. Hay's stance endangered his hitherto sterling credentials in the Republican Party.
By 1873, Hay was wooing Clara Stone, daughter of Cleveland multimillionaire railroad and banking mogul Amasa Stone. The success of his suit (they married in 1874) made the salary attached to office a small consideration for the rest of his life. Amasa Stone needed someone to watch over his investments, and wanted Hay to move to Cleveland to fill the post. Although the Hays initially lived in John's New York apartment and later in a townhouse there, they moved in June 1875 to Stone's ornate home on Cleveland's Euclid Avenue, "Millionaire's Row", and a mansion was quickly under construction for the Hays next door. The Hays had four children, Helen Hay Whitney, Adelbert Barnes Hay, Alice Evelyn Hay Wadsworth Boyd (who married James Wolcott Wadsworth Jr.), and Clarence Leonard Hay. Their father proved successful as a money manager, though he devoted much of his time to literary and political activities, writing to Adee that "I do nothing but read and yawn".
On December 29, 1876, a bridge over Ohio's Ashtabula River collapsed. The bridge had been built from metal cast at one of Stone's mills, and was carrying a train owned and operated by Stone's Lake Shore and Michigan Southern Railway. Ninety-two people died; it was the worst rail disaster in American history up to that point. Blame fell heavily on Stone, who departed for Europe to recuperate and left Hay in charge of his businesses. The summer of 1877 was marked by labor disputes; a strike over wage cuts on the Baltimore & Ohio Railroad soon spread to the Lake Shore, much to Hay's outrage. He blamed foreign agitators for the dispute, and vented his anger over the strike in his only novel, "The Bread-Winners" (1883).
Hay remained disaffected from the Republican Party in the mid-1870s. Seeking a candidate of either party he could support as a reformer, he watched as his favored Democrat, Samuel Tilden, gained his party's nomination, but his favored Republican, James G. Blaine, did not, falling to Ohio Governor Rutherford B. Hayes, whom Hay did not support during the campaign. Hayes's victory in the election left Hay an outsider as he sought a return to politics, and he was initially offered no place in the new administration. Nevertheless, Hay attempted to ingratiate himself with the new President by sending him a gold ring with a strand of George Washington's hair, a gesture that Hayes deeply appreciated. Hay spent time working with Nicolay on their Lincoln biography, and traveling in Europe. When Reid, who had succeeded Greeley as editor of the "Tribune", was offered the post of Minister to Germany in December 1878, he turned it down and recommended Hay. Secretary of State William M. Evarts indicated that Hay "had not been active enough in political efforts", to the regret of Hay, who told Reid that he "would like a second-class mission uncommonly well".
From May to October 1879, Hay set out to reconfirm his credentials as a loyal Republican, giving speeches in support of candidates and attacking the Democrats. In October, President and Mrs. Hayes came to a reception at Hay's Cleveland home. When Assistant Secretary of State Frederick W. Seward resigned later that month, Hay was offered his place and accepted, after some hesitancy because he was considering running for Congress.
In Washington, Hay oversaw a staff of eighty employees, renewed his acquaintance with his friend Henry Adams, and substituted for Evarts at Cabinet meetings when the Secretary was out of town. In 1880, he campaigned for the Republican nominee for president, his fellow Ohioan, Congressman James A. Garfield. Hay felt that Garfield did not have enough backbone, and hoped that Reid and others would "inoculate him with the gall which I fear he lacks". Garfield consulted Hay before and after his election as president on appointments and other matters, but offered Hay only the post of private secretary (though he promised to increase its pay and power), and Hay declined. Hay resigned as assistant secretary effective March 31, 1881, and spent the next seven months as acting editor of the "Tribune" during Reid's extended absence in Europe. Garfield's death in September and Reid's return the following month left Hay again on the outside of political power, looking in. He would spend the next fifteen years in that position.
After 1881, Hay did not again hold public office until 1897. Amasa Stone committed suicide in 1883; his death left the Hays very wealthy. They spent several months in most years traveling in Europe. The Lincoln biography absorbed some of Hay's time, the hardest work being done with Nicolay in 1884 and 1885; beginning in 1886, portions began appearing serially, and the complete work was published in 1890.
In 1884, Hay and Adams commissioned architect Henry Hobson Richardson to construct houses for them on Washington's Lafayette Square; these were completed by 1886. Hay's house, facing the White House and fronting on Sixteenth Street, was described even before completion as "the finest house in Washington". The price for the combined tract, purchased from William Wilson Corcoran, was $73,800, of which Adams paid a third for his lot. Hay budgeted the construction cost at $50,000; his ornate mansion eventually cost over twice that. Despite their possession of two lavish houses, the Hays spent less than half the year in Washington and only a few weeks a year in Cleveland. They also spent time at The Fells, their summer residence in Newbury, New Hampshire. According to Gale, "for a full decade before his appointment in 1897 as ambassador to England, Hay was lazy and uncertain."
Hay continued to devote much of his energy to Republican politics. In 1884, he supported Blaine for president, donating considerable sums to the senator's unsuccessful campaign against New York Governor Grover Cleveland. Many of Hay's friends were unenthusiastic about Blaine's candidacy, to Hay's anger, and he wrote to editor Richard Watson Gilder, "I have never been able to appreciate the logic that induces some excellent people every four years because they cannot nominate the candidate they prefer to vote for the party they don't prefer." In 1888, Hay had to follow his own advice as his favored candidate, Ohio Senator John Sherman, was unsuccessful at the Republican convention. After some reluctance, Hay supported the nominee, former Indiana senator Benjamin Harrison, who was elected. Though Harrison appointed men whom Hay supported, including Blaine, Reid, and Robert Lincoln, Hay was not asked to serve in the Harrison administration. In 1890, Hay spoke for Republican congressional candidates, addressing a rally of 10,000 people in New York City, but the party was defeated, losing control of Congress. Hay contributed funds to Harrison's unsuccessful re-election effort, in part because Reid had been made Harrison's 1892 running mate.
Hay was an early supporter of Ohio's William McKinley and worked closely with McKinley's political manager, Cleveland industrialist Mark Hanna. In 1889, Hay supported McKinley in his unsuccessful effort to become Speaker of the House. Four years later, McKinley—by then Governor of Ohio—faced a crisis when a friend whose notes he had imprudently co-signed went bankrupt during the Panic of 1893. The debts were beyond the governor's means to pay, and the possibility of insolvency threatened McKinley's promising political career. Hay was among those Hanna called upon to contribute, buying up $3,000 of the debt of over $100,000. Although others paid more, "Hay's checks were two of the first, and his touch was more personal, a kindness McKinley never forgot". The governor wrote, "How can I ever repay you & other dear friends?"
The same panic that nearly ruined McKinley convinced Hay that men like himself must take office to save the country from disaster. By the end of 1894, he was deeply involved in efforts to lay the groundwork for the governor's 1896 presidential bid. It was Hay's job to persuade potential supporters that McKinley was worth backing. Nevertheless, Hay found time for a lengthy stay in New Hampshire—one visitor at The Fells in mid-1895 was Rudyard Kipling—and later in the year wrote, "The summer wanes and I have done nothing for McKinley." He atoned with a $500 check to Hanna, the first of many. During the winter of 1895–96, Hay passed along what he heard from other Republicans influential in Washington, such as Massachusetts Senator Henry Cabot Lodge.
Hay spent part of the spring and early summer of 1896 in the United Kingdom, and elsewhere in Europe. There was a border dispute between Venezuela and British Guiana, and Cleveland's Secretary of State, Richard Olney, supported the Venezuelan position, announcing the Olney interpretation of the Monroe Doctrine. Hay told British politicians that McKinley, if elected, would be unlikely to change course. McKinley was nominated in June 1896; still, many Britons were minded to support whoever became the Democratic candidate. This changed when the 1896 Democratic National Convention nominated former Nebraska congressman William Jennings Bryan on a "free silver" platform; he had electrified the delegates with his Cross of Gold speech. Hay reported to McKinley when he returned to Britain after a brief stay on the Continent during which Bryan was nominated in Chicago: "they were all scared out of their wits for fear Bryan would be elected, and very polite in their references to you."
Once Hay returned to the United States in early August, he went to The Fells and watched from afar as Bryan barnstormed the nation in his campaign while McKinley gave speeches from his front porch. Despite an invitation from the candidate, Hay was reluctant to visit McKinley at his home in Canton. "He has asked me to come, but I thought I would not struggle with the millions on his trampled lawn". In October, after basing himself at his Cleveland home and giving a speech for McKinley, Hay went to Canton at last, writing to Adams,
Hay was disgusted by Bryan's speeches, writing in language that Taliaferro compares to "The Bread-Winners" that the Democrat "simply reiterates the unquestioned truths that every man with a clean shirt is a thief and ought to be hanged: that there is no goodness and wisdom except among the illiterate & criminal classes". Despite Bryan's strenuous efforts, McKinley won the election easily, with a campaign run by himself and Hanna, and well-financed by supporters like Hay. Henry Adams later wondered, "I would give sixpence to know how much Hay paid for McKinley. His politics must have cost."
In the post-election speculation as to who would be given office under McKinley, Hay's name figured prominently, as did that of Whitelaw Reid; both men sought high office in the State Department, either as secretary or one of the major ambassadorial posts. Reid, in addition to his vice-presidential run, had been Minister to France under Harrison. Reid, an asthmatic, handicapped himself by departing for Arizona Territory for the winter, leading to speculation about his health.
Hay realized before Reid did that the race for these posts would be shaped by Hanna's desire to be senator from Ohio: with one of the state's seats about to be occupied by the newly elected Joseph B. Foraker, the only one available to Hanna was that held by Senator Sherman. As the septuagenarian senator had served as Treasury Secretary under Hayes, only the secretaryship of state was likely to attract him and cause a vacancy that Hanna could fill. Hay knew that with only eight cabinet positions, only one could go to an Ohioan, and so he had no chance for a cabinet post. Accordingly, Hay encouraged Reid to seek the State position, while firmly ruling himself out as a possible candidate for that post, and quietly seeking the inside track to be ambassador in London. Zeitz states that Hay "aggressively lobbied" for the position.
According to Taliaferro, "only after the deed was accomplished and Hay was installed as the ambassador to the Court of St. James's would it be possible to detect just how subtly and completely he had finessed his ally and friend, Whitelaw Reid". A telegram from Hay to McKinley in the latter's papers, dated December 26 (most likely 1896) reveals the former's suggestion that McKinley tell Reid that the editor's friends had insisted that Reid not endanger his health through office, especially in London's smoggy climes. The following month, in a letter, Hay set forth his own case for the ambassadorship, and urged McKinley to act quickly, as suitable accommodations in London would be difficult to secure. Hay gained his object (as did Hanna), and shifted his focus to appeasing Reid. Taliaferro states that Reid never blamed Hay, but Kushner and Sherrill recorded, "Reid was certain that he had been wronged" by Hay, and the announcement of Hay's appointment nearly ended their 26-year friendship.
Reaction in Britain to Hay's appointment was generally positive, with George Smalley of "The Times" writing to him, "we want a man who is a true American yet not anti-English". Hay secured a Georgian house on Carlton House Terrace, overlooking Horse Guards Parade, with 11 servants. He brought with him Clara, their own silver, two carriages, and five horses. Hay's salary of $17,000 "did not even begin to cover the cost of their extravagant lifestyle".
During his service as ambassador, Hay attempted to advance the relationship between the U.S. and Britain. The latter country had long been seen negatively by many Americans, a legacy of its colonial role that was refreshed by its Civil War neutrality, when British-built raiders such as the "Alabama" preyed on US-flagged ships. In spite of these past differences, according to Taliaferro, "rapprochement made more sense than at any time in their respective histories". In his Thanksgiving Day address to the American Society in London in 1897, Hay echoed these points, "The great body of people in the United States and England are friends ... [sharing] that intense respect and reverence for order, liberty, and law which is so profound a sentiment in both countries". Although Hay was not successful in resolving specific controversies in his year and a third as ambassador, both he and British policymakers regarded his tenure as a success, because of the advancement of good feelings and cooperation between the two nations.
An ongoing dispute between the U.S. and Britain was over the practice of pelagic sealing, that is, the capture of seals offshore of Alaska. The U.S. considered them American resources; the Canadians (Britain was still responsible for that dominion's foreign policy) contended that the mammals were being taken on the high seas, free to all. Soon after Hay's arrival, McKinley sent former Secretary of State John W. Foster to London to negotiate the issue. Foster quickly issued an accusatory note to the British that was printed in the newspapers. Although Hay was successful in getting Lord Salisbury, then both Prime Minister and Foreign Secretary, to agree to a conference to decide the matter, the British withdrew when the U.S. also invited Russia and Japan, rendering the conference ineffective. Another issue on which no agreement was reached was that of bimetallism: McKinley had promised silver-leaning Republicans to seek an international agreement varying the price ratio between silver and gold to allow for free coinage of silver, and Hay was instructed to seek British participation. The British would only join if the Indian colonial government (on a silver standard until 1893) was willing; this did not occur, and coupled with an improving economic situation that decreased support for bimetallism in the United States, no agreement was reached.
Hay had little involvement in the crisis over Cuba that culminated in the Spanish–American War. He met with Lord Salisbury in October 1897 and gained assurances Britain would not intervene if the U.S. found it necessary to go to war against Spain. Hay's role was "to make friends and to pass along the English point of view to Washington". Hay spent much of early 1898 on an extended trip to the Middle East, and did not return to London until the last week of March, by which time the USS "Maine" had exploded in Havana harbor. During the war, he worked to ensure U.S.-British amity, and British acceptance of the U.S. occupation of the Philippines—Salisbury and his government preferred that the U.S. have the islands rather than see them fall into the hands of the Germans.
In its early days, Hay described the war "as necessary as it is righteous". In July, writing to former Assistant Secretary of the Navy Theodore Roosevelt, who had gained wartime glory by leading the Rough Riders volunteer regiment, Hay made a description of the war for which, according to Zeitz, he "is best remembered by many students of American history":
Secretary Sherman had resigned on the eve of war, and been replaced by his first assistant, William R. Day. One of McKinley's Canton cronies, with little experience of statecraft, Day was never intended as more than a temporary wartime replacement. With America about to splash her flag across the Pacific, McKinley needed a secretary with stronger credentials. On August 14, 1898, Hay received a telegram from McKinley that Day would head the American delegation to the peace talks with Spain, and that Hay would be the new Secretary of State. After some indecision, Hay, who did not think he could decline and still remain as ambassador, accepted. British response to Hay's promotion was generally positive, and Queen Victoria, after he took formal leave of her at Osborne House, invited him again the following day, and subsequently pronounced him, "the most interesting of all the Ambassadors I have known."
John Hay was sworn in as Secretary of State on September 30, 1898. He needed little introduction to Cabinet meetings, and sat at the President's right hand. Meetings were held in the Cabinet Room of the White House, where he found his old office and bedroom each occupied by several clerks. Now responsible for 1,300 federal employees, he leaned heavily for administrative help on his old friend Alvey Adee, the second assistant.
By the time Hay took office, the war was effectively over and it had been decided to strip Spain of her overseas empire and transfer at least part of it to the United States. At the time of Hay's swearing-in, McKinley was still undecided whether to take the Philippines, but in October finally decided to do so, and Hay sent instructions to Day and the other peace commissioners to insist on it. Spain yielded, and the result was the Treaty of Paris, narrowly ratified by the Senate in February 1899 over the objections of anti-imperialists.
By the 1890s, China had become a major trading partner for Western nations, and for Japan. China lacked military muscle to resist these countries, and several, including Russia, Britain, and Germany, had carved off bits of China—some known as treaty ports—for use as trading or military bases. Within those jurisdictions, the nation in possession often gave preference to its own citizens in trade or in developing infrastructure such as railroads. Although the United States did not claim any parts of China, a third of the China trade was carried in American ships, and having an outpost near there was a major factor in deciding to retain the former Spanish colony of the Philippines in the Treaty of Paris.
Hay had been concerned about the Far East since the 1870s. As Ambassador, he had attempted to forge a common policy with the British, but the United Kingdom was willing to undertake territorial acquisition in China to guard its interests there whereas McKinley was not. In March 1898, Hay warned that Russia, Germany, and France were seeking to exclude Britain and America from the China trade, but he was disregarded by Sherman, who accepted assurances from Russia and Germany.
McKinley was of the view that equality of opportunity for American trade in China was key to success there, rather than colonial acquisitions; that Hay shared these views was one reason for his appointment as Secretary of State. Many influential Americans, seeing coastal China being divided into spheres of influence, urged McKinley to join in; still, in his annual message to Congress in December 1898, he stated that as long as Americans were not discriminated against, he saw no need for the United States to become "an actor in the scene".
As Secretary of State, it was Hay's responsibility to put together a workable China policy. He was advised by William Rockhill, an old China hand. Also influential was Charles Beresford, a British Member of Parliament who gave a number of speeches to American businessmen, met with McKinley and Hay, and in a letter to the secretary stated that "it is imperative for American interests as well as our own that the policy of the 'open door' should be maintained". Ensuring that all powers competed on a level playing field in China would give them little incentive to dismember the Chinese Empire through territorial acquisition.
In mid-1899, the British inspector of Chinese maritime customs, Alfred Hippisley, visited the United States. In a letter to Rockhill, a friend, he urged that the United States and other powers agree to uniform Chinese tariffs, including in the enclaves. Rockhill passed the letter on to Hay, and subsequently summarized the thinking of Hippisley and others, that there should be "an open market through China for our trade on terms of equality with all other foreigners". Hay was in agreement, but feared Senate and popular opposition, and wanted to avoid Senate ratification of a treaty. Rockhill drafted the first Open Door note, calling for equality of commercial opportunity for foreigners in China.
Hay formally issued his Open Door note on September 6, 1899. This was not a treaty, and did not require the approval of the Senate. Most of the powers had at least some caveats, and negotiations continued through the remainder of the year. On March 20, 1900, Hay announced that all powers had agreed, and he was not contradicted. Former secretary Day wrote to Hay, congratulating him, "moving at the right time and in the right manner, you have secured a diplomatic triumph in the 'open door' in China of the first importance to your country".
Little thought was given to the Chinese reaction to the Open Door note; the Chinese minister in Washington, Wu Ting-fang, did not learn of it until he read of it in the newspapers. Among those in China who opposed Western influence there was a movement in Shantung Province, in the north, that became known as the Fists of Righteous Harmony, or Boxers, after the martial arts they practiced. The Boxers were especially angered by missionaries and their converts. As late as June 1900, Rockhill dismissed the Boxers, contending that they would soon disband. By the middle of that month, the Boxers, joined by imperial troops, had cut the railroad between Peking and the coast, killed many missionaries and converts, and besieged the foreign legations. Hay faced a precarious situation: how to rescue the Americans trapped in Peking, and how to avoid giving the other powers an excuse to partition China, in an election year when there was already Democratic opposition to what they deemed American imperialism.
As American troops were sent to China to relieve the nation's legation, Hay sent a letter to foreign powers (often called the Second Open Door note), stating that while the United States wanted to see lives preserved and the guilty punished, it intended that China not be dismembered. Hay issued this on July 3, 1900, suspecting that the powers were quietly making private arrangements to divide up China. Communication between the foreign legations and the outside world had been cut off, and the personnel there were falsely presumed slaughtered, but Hay realized that Minister Wu could get a message in, and Hay was able to establish communication. Hay suggested to the Chinese government that it now cooperate for its own good. When the foreign relief force, principally Japanese but including 2,000 Americans, relieved the legations and sacked Peking, China was made to pay a huge indemnity but there was no cession of land.
McKinley's vice president, Garret Hobart, had died in November 1899. Under the laws then in force, this made Hay next in line to the presidency should anything happen to McKinley. There was a presidential election in 1900, and McKinley was unanimously renominated at the Republican National Convention that year. He allowed the convention to make its own choice of running mate, and it selected Roosevelt, by then Governor of New York. Senator Hanna bitterly opposed that choice, but nevertheless raised millions for the McKinley/Roosevelt ticket, which was elected.
Hay accompanied McKinley on his nationwide train tour in mid-1901, during which both men visited California and saw the Pacific Ocean for the only times in their lives. The summer of 1901 was tragic for Hay; his older son Adelbert, who had been consul in Pretoria during the Boer War and was about to become McKinley's personal secretary, died in a fall from a New Haven hotel window.
Secretary Hay was at The Fells when McKinley was shot by Leon Czolgosz, an anarchist, on September 6 in Buffalo. With Vice President Roosevelt and much of the cabinet hastening to the bedside of McKinley, who had been operated on (it was thought successfully) soon after the shooting, Hay planned to go to Washington to manage communication with foreign governments, but presidential secretary George Cortelyou urged him to come to Buffalo. He traveled to Buffalo on September 10; hearing on his arrival an account of the President's recovery, Hay responded that McKinley would die. After visiting McKinley he was more cheerful, gave a statement to the press, and went to Washington as Roosevelt and other officials also dispersed. Hay was about to return to New Hampshire on the 13th when word came that McKinley was dying. He remained at his office, and the next morning Roosevelt, on his way to Buffalo, received from Hay his first communication as head of state, officially informing him of McKinley's death.
Hay, again next in line to the presidency, remained in Washington as McKinley's body was transported to the capital by funeral train, and stayed there as the late president was taken to Canton for interment. He had admired McKinley, describing him as "awfully like Lincoln in many respects" and wrote to a friend, "what a strange and tragic fate it has been of mine—to stand by the bier of three of my dearest friends, Lincoln, Garfield, and McKinley, three of the gentlest of men, all risen to be head of the State, and all done to death by assassins".
By letter, Hay offered his resignation to Roosevelt while the new president was still in Buffalo, amid newspaper speculation that Hay would be replaced—Garfield's Secretary of State, Blaine, had not remained long under the Arthur administration. When Hay met the funeral train in Washington, Roosevelt greeted him at the station and immediately told him he must stay on as Secretary. According to Zeitz, "Roosevelt's accidental ascendance to the presidency made John Hay an essential anachronism ... the wise elder statesman and senior member of the cabinet, he was indispensable to TR, who even today remains the youngest president ever".
The deaths of his son and of McKinley were not the only griefs Hay suffered in 1901—on September 26, John Nicolay died after a long illness, as did Hay's close friend Clarence King on Christmas Eve.
Hay's involvement in the efforts to have a canal joining the oceans in Central America went back to his time as Assistant Secretary of State under Hayes, when he served as translator for Ferdinand de Lesseps in his efforts to interest the American government in investing in his canal company. President Hayes was interested only in a canal under American control, which de Lesseps's project would not be. By the time Hay became Secretary of State, de Lesseps's project in Panama (then a Colombian province) had collapsed, as had an American-run project in Nicaragua. The 1850 Clayton–Bulwer Treaty (between the United States and Britain) forbade the United States from building a Central American canal that it exclusively controlled, and Hay, from early in his tenure, sought the removal of this restriction. But the Canadians, for whose foreign policy Britain was still responsible, saw the canal matter as their greatest leverage to get other disputes resolved in their favor, and persuaded Salisbury not to resolve it independently. Shortly before Hay took office, Britain and the U.S. agreed to establish a Joint High Commission to adjudicate unsettled matters, which met in late 1898 but made slow progress, especially on the Canada-Alaska boundary.
The Alaska issue became less contentious in August 1899 when the Canadians accepted a provisional boundary pending final settlement. With Congress anxious to begin work on a canal bill, and increasingly likely to ignore the Clayton-Bulwer restriction, Hay and British Ambassador Julian Pauncefote began work on a new treaty in January 1900. The first Hay–Pauncefote Treaty was sent to the Senate the following month, where it met a cold reception, as the terms forbade the United States from blockading or fortifying the canal, which was to be open to all nations in wartime as in peace. The Senate Foreign Relations Committee added an amendment allowing the U.S. to fortify the canal, then in March postponed further consideration until after the 1900 election. Hay submitted his resignation, which McKinley refused. The treaty, as amended, was ratified by the Senate in December, but the British would not agree to the changes.
Despite the lack of agreement, Congress was enthusiastic about a canal, and was inclined to move forward, with or without a treaty. Authorizing legislation was slowed by discussion on whether to take the Nicaraguan or Panamanian route. Much of the negotiation of a revised treaty, allowing the U.S. to fortify the canal, took place between Hay's replacement in London, Joseph H. Choate, and the British Foreign Secretary, Lord Lansdowne, and the second Hay–Pauncefote Treaty was ratified by the Senate by a large margin on December 6, 1901.
Seeing that the Americans were likely to build a Nicaragua Canal, the owners of the defunct French company, including Philippe Bunau-Varilla, who still had exclusive rights to the Panama route, lowered their price. Beginning in early 1902, President Roosevelt became a backer of the latter route, and Congress passed legislation for it, if it could be secured within a reasonable time. In June, Roosevelt told Hay to take personal charge of the negotiations with Colombia. Later that year, Hay began talks with Colombia's acting minister in Washington, Tomás Herrán. The Hay–Herrán Treaty, granting $10 million to Colombia for the right to build a canal, plus $250,000 annually, was signed on January 22, 1903, and ratified by the United States Senate two months later. In August, however, the treaty was rejected by the Colombian Senate.
Roosevelt was minded to build the canal anyway, using an earlier treaty with Colombia that gave the U.S. transit rights in regard to the Panama Railroad. Hay predicted "an insurrection on the Isthmus [of Panama] against that regime of folly and graft ... at Bogotá". Bunau-Varilla gained meetings with both men, and assured them that a revolution, and a Panamanian government more friendly to a canal, was coming. In October, Roosevelt ordered Navy ships to be stationed near Panama. The Panamanians duly revolted in early November 1903, with Colombian interference deterred by the presence of U.S. forces. By prearrangement, Bunau-Varilla was appointed representative of the nascent nation in Washington, and quickly negotiated the Hay–Bunau-Varilla Treaty, signed on November 18, giving the United States the right to build the canal in a zone over which the U.S. would exercise full jurisdiction. This was less than satisfactory to the Panamanian diplomats who arrived in Washington shortly after the signing, but they did not dare renounce it. The treaty was approved by the two nations, and work on the Panama Canal began in 1904. Hay wrote to Secretary of War Elihu Root, praising "the perfectly regular course which the President did follow" as much preferable to armed occupation of the isthmus.
Hay had met the President's father, Theodore Roosevelt, Sr., during the Civil War, and during his time at the "Tribune" came to know the adolescent "Teddy", twenty years younger than himself. Although before becoming president Roosevelt often wrote fulsome letters of praise to Secretary Hay, his letters to others then and later were less complimentary. Hay felt Roosevelt too impulsive, and privately opposed his inclusion on the ticket in 1900, though he quickly wrote a congratulatory note after the convention.
As President and Secretary of State, the two men took pains to cultivate a cordial relationship. Roosevelt read all ten volumes of the Lincoln biography and in mid-1903, wrote to Hay that by then "I have had a chance to know far more fully what a really great Secretary of State you are". Hay for his part publicly praised Roosevelt as "young, gallant, able, [and] brilliant", words that Roosevelt wrote that he hoped would be engraved on his tombstone.
Privately, and in correspondence with others, they were less generous: Hay grumbled that while McKinley would give him his full attention, Roosevelt was always busy with others, and it would be "an hour's wait for a minute's talk". Roosevelt, after Hay's death in 1905, wrote to Senator Lodge that Hay had not been "a great Secretary of State ... under me he accomplished little ... his usefulness to me was almost exclusively the usefulness of a fine figurehead". Nevertheless, when Roosevelt successfully sought election in his own right in 1904, he persuaded the aging and infirm Hay to campaign for him, and Hay gave a speech linking the administration's policies with those of Lincoln: "there is not a principle avowed by the Republican party to-day which is out of harmony with his [Lincoln's] teaching or inconsistent with his character." Kushner and Sherrill suggested that the differences between Hay and Roosevelt were more style than ideological substance.
In December 1902, the German government asked Roosevelt to arbitrate its dispute with Venezuela over unpaid debts. Hay did not think this appropriate, as Venezuela also owed the U.S. money, and quickly arranged for the International Court of Arbitration in The Hague to step in. Hay supposedly said, as final details were being worked out, "I have it all arranged. If Teddy will keep his mouth shut until tomorrow noon!" Hay and Roosevelt also differed over the composition of the Joint High Commission that was to settle the Alaska boundary dispute. The commission was to be composed of "impartial jurists" and the British and Canadians duly appointed notable judges. Roosevelt appointed politicians, including Secretary Root and Senator Lodge. Although Hay was supportive of the President's choices in public, in private he protested loudly to Roosevelt, complained by letter to his friends, and offered his resignation. Roosevelt declined it, but the incident confirmed him in his belief that Hay was too much of an Anglophile to be trusted where Britain was concerned. The American position on the boundary dispute was imposed on Canada by a 4–2 vote, with the one English judge joining the three Americans.
One incident involving Hay that benefitted Roosevelt politically was the kidnapping of Greek-American playboy Ion Perdicaris in Morocco by chieftain Mulai Ahmed er Raisuli, an opponent of Sultan Abdelaziz. Raisuli demanded a ransom, but also wanted political prisoners to be released and control of Tangier in place of the military governor. Raisuli supposed Perdicaris to be a wealthy American, and hoped United States pressure would secure his demands. In fact, Perdicaris, though born in New Jersey, had renounced his citizenship during the Civil War to avoid Confederate confiscation of property in South Carolina, and had accepted Greek naturalization, a fact not generally known until years later, but that decreased Roosevelt's appetite for military action. The sultan was ineffective in dealing with the incident, and Roosevelt considered seizing the Tangier waterfront, source of much of Abdelaziz's income, as a means of motivating him. With Raisuli's demands escalating, Hay, with Roosevelt's approval, finally cabled the consul-general in Tangier, Samuel Gummeré:
The 1904 Republican National Convention was in session, and the Speaker of the House, Joseph Cannon, its chair, read the first sentence of the cable—and only the first sentence—to the convention, electrifying what had been a humdrum coronation of Roosevelt. "The results were perfect. This was the fighting Teddy that America loved, and his frenzied supporters—and American chauvinists everywhere—roared in delight." In fact, by then the sultan had already agreed to the demands, and Perdicaris was released. What was seen as tough talk boosted Roosevelt's election chances.
Hay never fully recovered from the death of his son Adelbert, writing in 1904 to his close friend Lizzie Cameron that "the death of our boy made my wife and me old, at once and for the rest of our lives". Gale described Hay in his final years as a "saddened, slowly dying old man".
Although Hay gave speeches in support of Roosevelt, he spent much of the fall of 1904 at his New Hampshire house or with his younger brother Charles, who was ill in Boston. After the election, Roosevelt asked Hay to remain another four years. Hay asked for time to consider, but the President did not allow it, announcing to the press two days later that Hay would stay at his post. Early 1905 saw futility for Hay, as a number of treaties he had negotiated were defeated or amended by the Senate—one involving the British dominion of Newfoundland due to Senator Lodge's fears that it would harm his fishermen constituents. Others, promoting arbitration, were voted down or amended because the Senate did not want to be bypassed in the settlement of international disputes.
By Roosevelt's inauguration on March 4, 1905, Hay's health was so bad that both his wife and his friend Henry Adams insisted on his going to Europe, where he could rest and get medical treatment. Presidential doctor Presley Rixey issued a statement that Hay was suffering from overwork, but in letters the secretary hinted his conviction that he did not have long to live. An eminent physician in Italy prescribed medicinal baths for Hay's heart condition, and he duly journeyed to Bad Nauheim, near Frankfurt, Germany. Kaiser Wilhelm II was among the monarchs who wrote to Hay asking him to visit, though he declined; Belgian King Leopold II succeeded in seeing him by showing up at his hotel, unannounced. Adams suggested that Hay retire while there was still enough life left in him to do so, and that Roosevelt would be delighted to act as his own Secretary of State. Hay jokingly wrote to sculptor Augustus Saint-Gaudens that "there is nothing the matter with me except old age, the Senate, and one or two other mortal maladies".
After the course of treatment, Hay went to Paris and began to take on his workload again by meeting with the French foreign minister, Théophile Delcassé. In London, King Edward VII broke protocol by meeting with Hay in a small drawing room, and Hay lunched with Whitelaw Reid, ambassador in London at last. There was not time to see all who wished to see Hay on what he knew was his final visit.
On his return to the United States, despite his family's desire to take him to New Hampshire, the secretary went to Washington to deal with departmental business and, as Hay put it, to "say 'Ave Caesar!' to the President". He was pleased to learn that Roosevelt was well on his way to settling the Russo-Japanese War, an action for which the President would win the Nobel Peace Prize. Hay left Washington for the last time on June 23, 1905, arriving in New Hampshire the following day. He died there on July 1 of his heart ailment and complications. Hay was interred in Lake View Cemetery in Cleveland, near the grave of Garfield, in the presence of Roosevelt and many dignitaries, including Robert Lincoln.
Hay wrote some poetry while at Brown University, and more during the Civil War. In 1865, early in his Paris stay, Hay penned "Sunrise in the Place de la Concorde", a poem attacking Napoleon III for his reinstitution of the monarchy, depicting the Emperor as having been entrusted with the child Democracy by Liberty, and strangling it with his own hands. In "A Triumph of Order", set in the breakup of the Paris Commune, a boy promises soldiers that he will return from an errand to be executed with his fellow rebels. Much to their surprise, he keeps his word and shouts to them to "blaze away" as "The Chassepots tore the stout young heart,/And saved Society."
In poetry, he sought the revolutionary outcome for other nations that he believed had come to a successful conclusion in the United States. His 1871 poem, "The Prayer of the Romans", recites Italian history up to that time, with the "Risorgimento" in progress: liberty cannot be truly present until "crosier and crown pass away", when there will be "One freedom, one faith without fetters,/One republic in Italy free!" His stay in Vienna yielded "The Curse of Hungary", in which Hay foresees the end of the Austro-Hungarian Empire. After Hay's death in 1905, William Dean Howells suggested that the Europe-themed poems expressed "(now, perhaps, old-fashioned) American sympathy for all the oppressed." "Castilian Days", a souvenir of Hay's time in Madrid, is a collection of seventeen essays about Spanish history and customs, first published in 1871, though several of the individual chapters appeared in "The Atlantic" in 1870. It went through eight editions in Hay's lifetime. The Spanish are depicted as afflicted by the "triple curse of crown, crozier, and sabre"—most kings and ecclesiastics are presented as useless—and Hay pins his hopes on the republican movement in Spain. Gale deems "Castilian Days" "a remarkable, if biased, book of essays about Spanish civilization".
"Pike County Ballads", a grouping of six poems published (with other Hay poetry) as a book in 1871, brought him great success. Written in the dialect of Pike County, Illinois, where Hay went to school as a child, they are approximately contemporaneous with pioneering poems in similar dialect by Bret Harte and there has been debate as to which came first. The poem that brought the greatest immediate reaction was "Jim Bludso", about a boatman who is "no saint" with one wife in Mississippi and another in Illinois. Yet, when his steamboat catches fire, "He saw his duty, a dead-sure thing,—/And went for it, ther and then." Jim holds the burning steamboat against the riverbank until the last passenger gets ashore, at the cost of his life. Hay's narrator states that, "And Christ ain't a-going to be too hard/On a man that died for men." Hay's poem offended some clergymen, but was widely reprinted and even included in anthologies of verse.
"The Bread-Winners", one of the first novels to take an anti-labor perspective, was published anonymously in 1883 (published editions did not bear Hay's name until 1916) and he may have tried to disguise his writing style. The book examines two conflicts: between capital and labor, and between the "nouveau riche" and old money. In writing it, Hay was influenced by the labor unrest of the 1870s, that affected him personally, as corporations belonging to Stone, his father-in-law, were among those struck, at a time when Hay had been left in charge in Stone's absence. According to historian Scott Dalrymple, "in response, Hay proceeded to write an indictment of organized labor so scathing, so vehement, that he dared not attach his name to it."
The major character is Arthur Farnham, a wealthy Civil War veteran, likely based on Hay. Farnham, who inherited money, is without much influence in municipal politics, as his ticket is defeated in elections, symbolic of the decreasing influence of America's old-money patricians. The villain is Andrew Jackson Offitt (true name Ananias Offitt), who leads the Bread-winners, a labor organization that begins a violent general strike. Peace is restored by a group of veterans led by Farnham, and, at the end, he appears likely to marry Alice Belding, a woman of his own class.
Although unusual among the many books inspired by the labor unrest of the late 1870s in taking the perspective of the wealthy, it was the most successful of them, and was a sensation, gaining many favorable reviews. It was also attacked as an anti-labor polemic with an upper-class bias. There were many guesses as to authorship, with the supposed authors ranging from Hay's friend Henry Adams to New York Governor Grover Cleveland, and the speculation fueled sales.
Early in Lincoln's presidency, Hay and Nicolay requested and received his permission to write his biography. By 1872, Hay was "convinced that we ought to be at work on our 'Lincoln.' I don't think the time for publication has come, but the time for preparation is slipping away." Robert Lincoln in 1874 formally agreed to let Hay and Nicolay use his father's papers; by 1875, they were engaged in research. Hay and Nicolay enjoyed exclusive access to Lincoln's papers, which were not opened to other researchers until 1947. They gathered documents written by others, as well as many of the Civil War books already being published. They relied on memory only rarely, as with Nicolay's recollection of the moment at the 1860 Republican convention when Lincoln was nominated; for much of the rest they relied on research.
Hay began his part of the writing in 1876; the work was interrupted by illnesses of Hay, Nicolay, or family members, or by Hay's writing of "The Bread-Winners". By 1885, Hay had completed the chapters on Lincoln's early life, and they were submitted to Robert Lincoln for approval. Sale of the serialization rights to "The Century" magazine, edited by Hay's friend Richard Gilder, helped give the pair the impetus to bring what had become a massive project to an end.
The published work, "Abraham Lincoln: A History", alternates parts in which Lincoln is at center with discussions of contextual matters, such as legislative events or battles. The first serial installment, published in November 1886, received positive reviews. When the ten-volume set emerged in 1890, it was not sold in bookstores, but instead door-to-door, then a common practice. Despite a price of $50, and the fact that a good part of the work had been serialized, five thousand copies were quickly sold. The books helped forge the modern view of Lincoln as great war leader, against competing narratives that gave more credit to subordinates such as Seward. According to historian Joshua Zeitz, "it is easy to forget how widely underrated Lincoln the president and Lincoln the man were at the time of his death and how successful Hay and Nicolay were in elevating his place in the nation's collective historical memory."
In 1902, Hay wrote that when he died, "I shall not be much missed except by my wife." Nevertheless, because of his relatively early death at age 66, he was survived by most of his friends. These included Adams, who blamed Hay's death on the pressures of his office, where he was badgered by Roosevelt and many senators, yet admitted that Hay had remained in the position because he feared being bored. He memorialized his friend in the final pages of his autobiographical "The Education of Henry Adams": with Hay's death, his own education had ended.
Gale pointed out that Hay "accomplished a great deal in the realm of international statesmanship, and the world may be a better place because of his efforts as secretary of state ... the man was a scintillating ambassador". Yet, Gale felt, any assessment of Hay must include negatives as well: after his marriage to the wealthy Clara Stone, Hay "allowed his deep-seated love of ease [to] triumph over his Middle Western devotion to work and a fair shake for all." Despite his literary accomplishments, Hay "was often lazy. His first poetry was his best."
Taliaferro suggests that "if Hay put any ... indelible stamp on history, perhaps it was that he demonstrated how the United States ought to comport itself. He, not Roosevelt, was the adult in charge when the nation and the State Department attained global maturity." He quotes John St. Loe Strachey, "All that the world saw was a great gentleman and a great statesman doing his work for the State and for the President with perfect taste, perfect good sense, and perfect good humour".
Hay's efforts to shape Lincoln's image also increased his own prominence and reputation, making his association (and that of Nicolay) with the assassinated president ever more remarkable and noteworthy. According to Zeitz, "the greater Lincoln grew in death, the greater they grew for having known him so well, and so intimately, in life. Everyone wanted to know them if only to ask what it had been like—what "he" had been like." Their answer to that, expressed in ten volumes of biography, Gale wrote, "has been incredibly influential". In 1974, Lincoln scholar Roy P. Basler stated that later biographers such as Carl Sandburg did not "ma[k]e revisions of the essential story told by N.[icolay] & H.[ay]". Zeitz concurs: "Americans today understand Abraham Lincoln much as Nicolay and Hay hoped that they would."
Hay brought about more than 50 treaties, including the Canal-related treaties, and settlement of the Samoan dispute, as a result of which the United States secured what became known as American Samoa. In 1900, Hay negotiated a treaty with Denmark for the cession of the Danish West Indies. That treaty failed in the Danish parliament on a tied vote.
In 1923, Mount Hay, also known as "Boundary Peak 167" on the Canada–United States border, was named after John Hay in recognition of his role in negotiating the US-Canada treaty that resulted in the Alaska Boundary Tribunal. Brown University's John Hay Library is named for that prominent alumnus. Hay's New Hampshire estate has been conserved by various organizations. Although he and his family never lived there (Hay died while it was under construction), the Hay-McKinney House, home to the Cleveland History Center and thousands of artifacts, serves to remind Clevelanders of John Hay's lengthy service. During World War II, a Liberty ship built in Panama City, Florida, was named in his honor. Camp John Hay, a United States military base established in 1903 in Baguio City, Philippines, was named for John Hay, and the base name was maintained by the Philippine government even after its 1991 turnover to Philippine authorities.
Joy Division
Joy Division were an English rock band formed in Salford in 1976. The group consisted of vocalist Ian Curtis, guitarist/keyboardist Bernard Sumner, bassist Peter Hook and drummer Stephen Morris.
Sumner and Hook formed the band after attending a Sex Pistols concert. While Joy Division's first recordings were heavily influenced by early punk, they soon developed a sound and style that made them one of the pioneers of the post-punk movement. Their self-released 1978 debut EP "An Ideal for Living" drew the attention of the Manchester television personality Tony Wilson, who signed them to his independent label Factory Records. Their debut album "Unknown Pleasures", recorded with producer Martin Hannett, was released in 1979.
Curtis suffered from personal problems including a failing marriage, depression, and epilepsy. As the band's popularity grew, Curtis's condition made it increasingly difficult for him to perform; he occasionally experienced seizures on stage. He killed himself on the eve of the band's first US/Canada tour in May 1980, aged 23. Joy Division's second and final album, "Closer", was released two months later; it and the single "Love Will Tear Us Apart" became their highest-charting releases.
The remaining members regrouped under the name New Order. They were successful throughout the next decade, blending post-punk with electronic and dance music influences.
On 4 June 1976, childhood friends Bernard Sumner and Peter Hook separately attended a Sex Pistols show at the Manchester Lesser Free Trade Hall. Both were inspired by the Pistols' performance. Sumner said that he felt the Pistols "destroyed the myth of being a pop star, of a musician being some kind of god that you had to worship". The following day Hook borrowed £35 from his mother to buy a bass guitar. They formed a band with Terry Mason, who had also attended the gig; Sumner bought a guitar, and Mason a drum kit. After their schoolfriend Martin Gresty, who had taken a job at a factory, declined an invitation to join as vocalist, the band placed an advertisement for a vocalist in the Manchester Virgin Records shop. Ian Curtis, who knew them from earlier gigs, responded and was hired without audition. Sumner said that he "knew he was all right to get on with and that's what we based the whole group on. If we liked someone, they were in."
Buzzcocks manager Richard Boon and frontman Pete Shelley have both been credited with suggesting the band name "Stiff Kittens", but the band settled on "Warsaw" shortly before their first gig, a reference to David Bowie's song "Warszawa". Warsaw debuted on 29 May 1977 at the Electric Circus, supporting the Buzzcocks, Penetration and John Cooper Clarke. Tony Tabac played drums that night after joining the band two days earlier. Reviews in the "NME" by Paul Morley and in "Sounds" by Ian Wood brought them immediate national exposure. Mason became the band's manager and Tabac was replaced on drums in June 1977 by Steve Brotherdale, who also played in the punk band Panik. Brotherdale tried to get Curtis to leave the band and join Panik, and even had Curtis audition. In July 1977, Warsaw recorded five demo tracks at Pennine Sound Studios, Oldham. Uneasy with Brotherdale's aggressive personality, the band fired him soon after the sessions: driving home from the studio, they pulled over and asked Brotherdale to check on a flat tyre; when he got out of the car, they drove off.
In August 1977, Warsaw placed an advertisement in a music shop window seeking a replacement drummer. Stephen Morris, who had attended the same school as Curtis, was the sole respondent. Deborah Curtis, Ian's wife, stated that Morris "fitted perfectly" with the band, and that with his addition Warsaw became a "complete 'family'". To avoid confusion with the London punk band Warsaw Pakt, the band renamed themselves Joy Division in early 1978, borrowing the name from the sexual slavery wing of a Nazi concentration camp mentioned in the 1955 novel "House of Dolls". In December, the group recorded their debut EP, "An Ideal for Living", at Pennine Sound Studio and played their final gig as Warsaw on New Year's Eve at the Swinging Apple in Liverpool. Billed as Warsaw to ensure an audience, the band played their first gig as Joy Division on 25 January 1978 at Pip's Disco in Manchester.
Joy Division were approached by RCA Records to record a cover of Nolan "N.F." Porter's "Keep on Keepin' On" at a Manchester recording studio. The band spent late March and April 1978 writing and rehearsing material. During the Stiff/Chiswick Challenge concert at Manchester's Rafters club on 14 April, they caught the attention of music producer Tony Wilson and manager Rob Gretton. Curtis berated Wilson for not putting the group on his Granada Television show "So It Goes"; Wilson responded that Joy Division would be the next band he would showcase on TV. Gretton, the venue's resident DJ, was so impressed by the band's performance that he convinced them to take him on as their manager. Gretton, whose "dogged determination" was later credited for much of the band's public success, contributed the business skills to provide Joy Division with a better foundation for creativity. Joy Division spent the first week of May 1978 recording at Manchester's Arrow Studios. The band were unhappy with the Grapevine Records head John Anderson's insistence on adding synthesiser into the mix to soften the sound, and asked to be dropped from the contract with RCA.
Joy Division made their recorded debut in June 1978 when the band self-released "An Ideal for Living", and two weeks later their track "At a Later Date" was featured on the compilation album "" (which had been recorded live in October 1977). In the "Melody Maker" review, Chris Brazier said that it "has the familiar rough-hewn nature of home-produced records, but they're no mere drone-vendors—there are a lot of good ideas here, and they could be a very interesting band by now, seven months on". The packaging of "An Ideal for Living"—which featured a drawing of a Hitler Youth member on the cover—coupled with the nature of the band's name fuelled speculation about their political affiliations. While Hook and Sumner later said they were intrigued by fascism at the time, Morris believed that the group's dalliance with Nazi imagery came from a desire to keep memories of the sacrifices of their parents and grandparents during World War II alive. He argued that accusations of neo-Nazi sympathies merely provoked the band "to keep on doing it, because that's the kind of people we are".
In September 1978, Joy Division made their television debut performing "Shadowplay" on "So It Goes", with an introduction by Wilson. In October, Joy Division contributed two tracks recorded with producer Martin Hannett to the compilation double-7" EP "A Factory Sample", the first release by Tony Wilson's record label, Factory Records. In the "NME" review of the EP, Paul Morley praised the band as "the missing link" between Elvis Presley and Siouxsie and the Banshees. Joy Division joined Factory's roster after buying themselves out of the RCA deal. Gretton was made a label partner to represent the interests of the band. On 27 December, during the drive home from a gig at the Hope and Anchor in London, Curtis suffered his first recognised severe epileptic seizure and was hospitalised. Meanwhile, Joy Division's career progressed, and Curtis appeared on the 13 January 1979 cover of "NME". That month the band recorded their session for BBC Radio 1 DJ John Peel. According to Deborah Curtis, "Sandwiched in between these two important landmarks was the realisation that Ian's illness was something we would have to learn to accommodate".
Joy Division's debut album, "Unknown Pleasures", was recorded at Strawberry Studios, Stockport, in April 1979. Producer Martin Hannett significantly altered their live sound, a fact that greatly displeased the band at the time; however, in 2006, Hook said that in retrospect Hannett had done a good job and "created the Joy Division sound". The album cover was designed by Peter Saville, who went on to provide artwork for future Joy Division / New Order releases.
"Unknown Pleasures" was released in June and sold through its initial pressing of 10,000 copies. Wilson said the success turned the indie label into a true business and a "revolutionary force" that operated outside of the major record label system. Reviewing the album for "Melody Maker", writer Jon Savage described the album as an "opaque manifesto" and declared it "one of the best, white, English, debut LPs of the year".
Joy Division performed on Granada TV again in July 1979, and made their only nationwide TV appearance in September on BBC2's "Something Else". They supported the Buzzcocks in a 24-venue UK tour that began that October, which allowed the band to quit their regular jobs. The non-album single "Transmission" was released in November. Joy Division's burgeoning success drew a devoted following who were stereotyped as "intense young men dressed in grey overcoats".
Joy Division toured Europe in January 1980. Although the schedule was demanding, Curtis experienced only two grand mal seizures, both towards the end of the tour. That March, the band recorded their second album, "Closer", with Hannett at London's Britannia Row Studios. That month they also released the "Licht und Blindheit" single, with "Atmosphere" as the A-side and "Dead Souls" as the B-side, on the French independent label Sordide Sentimental.
A lack of sleep and long hours destabilised Curtis's epilepsy, and his seizures became almost uncontrollable. He often had seizures during performances, which some audience members believed were part of the performance. The seizures left him feeling ashamed and depressed, and the band became increasingly worried about Curtis's condition. On 7 April 1980, Curtis attempted suicide by overdosing on his anti-seizure medication, phenobarbitone. The following evening, Joy Division were scheduled to play a gig at the Derby Hall in Bury. Curtis was too ill to perform, so at Gretton's insistence the band played a combined set with Alan Hempsall of Crispy Ambulance and Simon Topping of A Certain Ratio singing on the first few songs. When Topping came back towards the end of the set, some audience members threw bottles at the stage. Curtis's ill health led to the cancellation of several other gigs that April. Joy Division's final live performance was held at the University of Birmingham's High Hall on 2 May, and included their only performance of "Ceremony", one of the last songs written by Curtis.
Hannett's production has been widely praised. However, as with "Unknown Pleasures", both Hook and Sumner were unhappy with the production. Hook said that when he heard the final mix of "Atrocity Exhibition" he was disappointed that the abrasiveness had been toned down. He wrote: "I was like, head in hands, 'Oh fucking hell, it's happening again ...' Martin had fucking melted the guitar with his Marshall Time Waster. Made it sound like someone strangling a cat and, to my mind, absolutely killed the song. I was so annoyed with him and went in and gave him a piece of my mind but he just turned round and told me to fuck off."
Joy Division were scheduled to commence their first US/Canada tour in May 1980. Curtis had expressed enthusiasm about the tour, but his relationship with his wife, Deborah, was under strain; Deborah was excluded from the band's inner circle, and Curtis was having an affair with Belgian journalist and music promoter Annik Honoré, whom he met on tour in Europe in 1979. He was also anxious about how American audiences would react to his epilepsy.
The evening before the band were due to depart for America, Curtis returned to his Macclesfield home to talk to Deborah. He asked her to drop an impending divorce suit, and asked her to leave him alone in the house until he caught a train to Manchester the following morning. Early on 18 May 1980, having spent the night watching the Werner Herzog film "Stroszek", Curtis hanged himself in his kitchen. Deborah discovered his body later that day when she returned.
The suicide shocked the band and their management. In 2005, Wilson said: "I think all of us made the mistake of not thinking his suicide was going to happen ... We all completely underestimated the danger. We didn't take it seriously. That's how stupid we were." Music critic Simon Reynolds said Curtis's suicide "made for instant myth". Jon Savage's obituary said that "now no one will remember what his work with Joy Division was like when he was alive; it will be perceived as tragic rather than courageous". In June 1980, Joy Division's single "Love Will Tear Us Apart" was released, which hit number thirteen on the UK Singles Chart. In July 1980, "Closer" was released, and peaked at number six on the UK Albums Chart. "NME" reviewer Charles Shaar Murray wrote, ""Closer" is as magnificent a memorial (for 'Joy Division' as much as for Ian Curtis) as any post-Presley popular musician could have."
Morris said that even without Curtis's death, it is unlikely that Joy Division would have endured. The members had made a pact long before Curtis's death that, should any member leave, the remaining members would change the band name. The band re-formed as New Order, with Sumner on vocals; they later recruited Morris's girlfriend Gillian Gilbert as keyboardist and second guitarist. Gilbert had befriended the band and played guitar at a Joy Division performance when Curtis had been unable to play.
New Order's debut single, "Ceremony" (1981), was formed from the last two songs written with Curtis. New Order struggled in their early years to escape the shadow of Joy Division, but went on to achieve far greater commercial success with a different, more upbeat and dance-orientated sound.
Various Joy Division outtakes and live material have been released. "Still", featuring live tracks and rare recordings, was issued in 1981. Factory issued the "Substance" compilation in 1988, including several out-of-print singles. "Permanent" was released in 1995 by London Records, which had acquired the Joy Division catalogue after Factory's 1992 bankruptcy. A comprehensive box set, "Heart and Soul", appeared in 1997.
Joy Division's style quickly evolved from their punk roots. Their early sound as Warsaw was described as generic and "undistinguished punk-inflected hard-rock". Critic Simon Reynolds observed how the band's originality only "really became apparent as the songs got slower", and their music took on a "sparse" quality. According to Reynolds, "Hook's bass carried the melody, Bernard Sumner's guitar left gaps rather than filling up the group's sound with dense riffage and Steve Morris' drums seemed to circle the rim of a crater." According to music critic Jon Savage, "Joy Division were not punk but they were directly inspired by its energy". In 1994 Sumner said the band's characteristic sound "came out naturally: I'm more rhythm and chords, and Hooky was melody. He used to play high lead bass because I liked my guitar to sound distorted, and the amplifier I had would only work when it was at full volume. When Hooky played low, he couldn't hear himself. Steve has his own style which is different to other drummers. To me, a drummer in the band is the clock, but Steve wouldn't be the clock, because he's passive: he would follow the rhythm of the band, which gave us our own edge." By "Closer", Curtis had adopted a low baritone voice, drawing comparisons to Jim Morrison of the Doors (one of Curtis's favourite bands).
Sumner largely acted as the band's director, a role he continued in New Order. While Sumner was the group's primary guitarist, Curtis played the instrument on a few recorded songs and during a few shows. Curtis hated playing guitar, but the band insisted he do so. Sumner said, "He played in quite a bizarre way and that to us was interesting, because no one else would play like Ian". During the recording sessions for "Closer", Sumner began using self-built synthesisers and Hook used a six-string bass for more melody.
Producer Martin Hannett "dedicated himself to capturing and intensifying Joy Division's eerie spatiality". Hannett believed punk rock was sonically conservative because of its refusal to use studio technology to create sonic space. The producer instead aimed to create a more expansive sound on the group's records. Hannett said, "[Joy Division] were a gift to a producer, because they didn't have a clue. They didn't argue". Hannett demanded clean and clear "sound separation" not only for individual instruments, but even for individual pieces of Morris's drumkit. Morris recalled, "Typically on tracks he considered to be potential singles, he'd get me to play each drum on its own to avoid any bleed-through of sound". Music journalist Richard Cook noted that Hannett's role was "crucial". There are "devices of distance" in his production and "the sound is an illusion of physicality".
Curtis was the band's sole lyricist, and he typically composed his lyrics in a notebook, independently of the music that would eventually accompany them. The music itself was largely written by Sumner and Hook as the group jammed during rehearsals. Curtis's imagery and word choice often referenced "coldness, pressure, darkness, crisis, failure, collapse, loss of control". In 1979, "NME" journalist Paul Rambali wrote, "The themes of Joy Division's music are sorrowful, painful and sometimes deeply sad." Music journalist Jon Savage wrote that "Curtis's great lyrical achievement was to capture the underlying reality of a society in turmoil, and to make it both universal and personal," while noting that "the lyrics reflected, in mood and approach, his interest in romantic and science-fiction literature." Critic Robert Palmer wrote that William S. Burroughs and J. G. Ballard were "obvious influences" on Curtis, and Morris also remembered the singer reading T. S. Eliot. Deborah Curtis also remembered Curtis reading works by writers such as Fyodor Dostoevsky, Friedrich Nietzsche, Jean-Paul Sartre, Franz Kafka, and Hermann Hesse.
Curtis was unwilling to explain the meaning behind his lyrics and Joy Division releases were absent of any lyric sheets. He told the fanzine "Printed Noise", "We haven't got a message really; the lyrics are open to interpretation. They're multidimensional. You can read into them what you like." The other Joy Division members have said that at the time, they paid little attention to the contents of Curtis' lyrics. In a 1987 interview with "Option", Morris said that they "just thought the songs were sort of sympathetic and more uplifting than depressing. But everyone's got their own opinion." Deborah Curtis recalled that only with the release of "Closer" did many who were close to the singer realise "[h]is intentions and feelings were all there within the lyrics". The surviving members regret not seeing the warning signs in Curtis's lyrics. Morris said that "it was only after Ian died that we sat down and listened to the lyrics...you'd find yourself thinking, 'Oh my God, I missed this one'. Because I'd look at Ian's lyrics and think how clever he was putting himself in the position of someone else. I never believed he was writing about himself. Looking back, how could I have been so bleedin' stupid? Of course he was writing about himself. But I didn't go in and grab him and ask, 'What's up?' I have to live with that".
In contrast to the sound of their studio recordings, Joy Division's live sound was typically loud and aggressive. The band were especially unhappy with Hannett's mix of "Unknown Pleasures", which traded abrasiveness for a more cerebral and ghostly sound. According to Sumner, "the music was loud and heavy, and we felt that Martin had toned it down, especially with the guitars".
During their live performances, the group did not interact with the audience; according to Paul Morley, "During a Joy Division set, outside of the songs, you'll be lucky to hear more than two or three words. Hello and goodbye. No introductions, no promotion." Curtis would often perform what became known as his "'dead fly' dance", as if imitating a seizure; his arms would "start flying in [a] semicircular, hypnotic curve". Simon Reynolds noted that Curtis's dancing style was reminiscent of an epileptic fit, and that he was dancing in the manner for some months before he was diagnosed with epilepsy. Live performances became problematic for Joy Division, due to Curtis's condition. Sumner later said, "We didn't have flashing lights, but sometimes a particular drum beat would do something to him. He'd go off in a trance for a bit, then he'd lose it and have an epileptic fit. We'd have to stop the show and carry him off to the dressing room where he'd cry his eyes out because this appalling thing had just happened to him".
Sumner wrote that Curtis was inspired by artists such as the Doors, Iggy Pop, David Bowie, Kraftwerk, the Velvet Underground and Neu!. Hook has also related that Curtis was particularly influenced by Iggy Pop's stage persona. The group were inspired by Kraftwerk's "marriage between humans and machines" and the inventiveness of their electronic music; Joy Division played "Trans-Europe Express" through the PA before they went on stage "to get a momentum". Bowie's "Berlin Trilogy", made in collaboration with Brian Eno, also influenced them: the "cold austerity" of the synthesisers on the B-sides of the "Heroes" and "Low" albums was "music looking at the future". Morris cited the "unique style" of the Velvet Underground's Maureen Tucker and the motorik drum beats of Neu! and Can. He also credited Siouxsie and the Banshees, whose "first drummer Kenny Morris played mostly toms" and for whom "the sound of cymbals was forbidden". Hook said that "Siouxsie and the Banshees were one of our big influences ... The way the guitarist and the drummer played was a really unusual way of playing". Hook drew inspiration from the style of bassist Jean-Jacques Burnel on his early material with the Stranglers; he also credited Carol Kaye and her melodic basslines on the Temptations' early-1970s records. Sumner mentioned "the raw, nasty, unpolished edge" in the guitars of the Rolling Stones, the simple riff of "Vicious" on Lou Reed's "Transformer", and Neil Young. His musical horizons widened with Jimi Hendrix, when he realised "it wasn't about little catchy tunes ... it was what you could do sonically with a guitar."
Despite their short career, Joy Division have exerted a wide-reaching influence. John Bush of AllMusic argues that Joy Division "became the first band in the post-punk movement by ... emphasizing not anger and energy but mood and expression, pointing ahead to the rise of melancholy alternative music in the '80s."
Joy Division have influenced bands ranging from their contemporaries U2 and the Cure to later acts such as Interpol, Bloc Party and Editors. Rapper Danny Brown named his album "Atrocity Exhibition" after the Joy Division song, whose title was partially inspired by the 1970 J. G. Ballard collection of condensed novels of the same name. In 2005, both New Order and Joy Division were inducted into the UK Music Hall of Fame.
The band's dark and gloomy sound, which Martin Hannett described in 1979 as "dancing music with Gothic overtones", presaged the gothic rock genre. While the term "gothic" originally described a "doomy atmosphere" in music of the late 1970s, the term was soon applied to specific bands like Bauhaus that followed in the wake of Joy Division and Siouxsie and the Banshees. Standard musical fixtures of early gothic rock bands included "high-pitched post-Joy Division basslines usurp[ing] the melodic role" and "vocals that were either near operatic and Teutonic or deep, droning alloys of Jim Morrison and Ian Curtis."
Joy Division have been dramatised in two biopics. "24 Hour Party People" (2002) is a fictionalised account of Factory Records in which members of the band appear as supporting characters. Tony Wilson said of the film, "It's all true, it's all not true. It's not a fucking documentary," and that he favoured the "myth" over the truth. The 2007 film "Control", directed by Anton Corbijn, is a biography of Ian Curtis (portrayed by Sam Riley) that uses Deborah Curtis's biography of her late husband, "Touching from a Distance" (1995), as its basis. "Control" had its international premiere on the opening night of Director's Fortnight at the 2007 Cannes Film Festival, where it was critically well received. That year Grant Gee directed the band documentary "Joy Division".
Johnny Bench
Johnny Lee Bench (born December 7, 1947) is an American former professional baseball player. He played his entire career in Major League Baseball as a catcher for the Cincinnati Reds from 1967 through 1983. Bench was the leader of the Reds team known as the Big Red Machine that dominated the National League in the mid-1970s, winning six division titles, four National League pennants and two World Series championships.
A fourteen-time All-Star and a two-time National League Most Valuable Player, Bench excelled on offense as well as on defense, twice leading the National League in home runs and three times in runs batted in. At the time of his retirement in 1983, he held the major league record for most home runs hit by a catcher. He is also the only catcher in history to lead the league in home runs.
On defense, Bench was a ten-time Gold Glove Award winner who skillfully handled pitching staffs and possessed a strong, accurate throwing arm. He caught 100 or more games for 13 consecutive seasons. In 1986, Bench was inducted into the Cincinnati Reds Hall of Fame. He was inducted into the Baseball Hall of Fame in 1989. ESPN has called him the greatest catcher in baseball history.
Born and raised in Oklahoma, Bench is one-eighth Choctaw; he played baseball and basketball and was class valedictorian at Binger-Oney High School. His father told him that the fastest route to becoming a major leaguer was as a catcher. As a 17-year-old, Bench was selected 36th overall by the Cincinnati Reds in the second round of the 1965 amateur draft, playing in the minor leagues during the 1966 and 1967 seasons before being called up to the Reds in August 1967. He hit only .163, but impressed many people with his defense and strong throwing arm, among them Hall of Famer Ted Williams. Williams signed a baseball for him and predicted that the young catcher would be "a Hall of Famer for sure!" Williams' prophecy became fact 22 years later in 1989 when Bench was elected to Cooperstown.
During a 1968 spring training game, Bench was catching right-hander Jim Maloney, an eight-year veteran. Maloney was once a hard thrower, but injuries had dramatically reduced the speed of his fastball. Maloney nevertheless insisted on repeatedly "shaking off" his younger catcher by throwing fastballs instead of the breaking balls that Bench had called for. When an exasperated Bench bluntly told Maloney, "Your fastball's not popping," Maloney replied with an epithet. To prove to Maloney that his fastball was no longer effective, Bench called for a fastball, and after Maloney released the ball, Bench dropped his catcher's mitt and caught the fastball barehanded. Bench was the Reds' catcher on April 30, 1969, when Maloney pitched a no-hitter against the Houston Astros.
In 1968, the 20-year-old Bench impressed many in his first full season, winning the National League Rookie of the Year Award after batting .275 with 15 home runs and 82 RBIs. This marked the first time that the award had been won by a catcher. He also won the 1968 National League Gold Glove Award for catchers, which was the first time that the award had been won by a rookie. He made 102 assists in 1968, which marked the first time in 23 years that a catcher had more than 100 assists in a season.
During the 1960s, Bench also served in the United States Army Reserve as a member of the 478th Engineer Battalion, which was based across the Ohio River from Cincinnati at Fort Thomas, Kentucky. This unit included several of his teammates, among them Pete Rose. In the winter of 1970–1971 he was part of Bob Hope's USO Tour of Vietnam.
In 1970, Bench had his finest statistical season. At age 22, he became the youngest player to win the National League Most Valuable Player Award. He hit .293, led the National League with 45 home runs and a franchise-record 148 runs batted in as the Reds won the NL West Division. The Reds swept the Pittsburgh Pirates in the National League Championship Series, but lost to the Baltimore Orioles in five games in the World Series.
Bench had another strong year in 1972, winning the MVP Award for a second time. He led the National League in home runs (40) and RBI (125), propelling the Reds to another National League West Division title; the Reds then won the NL pennant in the deciding fifth game over the Pittsburgh Pirates. Perhaps his most dramatic home run was his ninth-inning, leadoff, opposite-field home run in that fifth NLCS game. The solo shot tied the game at three; the Reds won later in the inning on a wild pitch, 4–3. It was hailed after the game as "one of the great clutch home runs of all time." However, the Reds lost the World Series to a strong Oakland Athletics team in seven games.
After the 1972 season, Bench had a growth removed from his lung; he remained productive, but never again hit 40 home runs in a season. In 1973, Bench hit 25 home runs with 104 RBI and helped the Reds rally from a 10½-game deficit to the Los Angeles Dodgers in early July to lead the majors with 99 wins and claim another NL West Division crown. In the NLCS, Cincinnati met a New York Mets team that had won the NL East with an unimpressive record, 16½ games behind the Reds' pace. But the Mets boasted three of the better starting pitchers in the NL: future Hall of Famer Tom Seaver, Jerry Koosman, and Jon Matlack. Bench's bottom-of-the-ninth home run off Seaver in the first game propelled the Reds to victory, but Seaver got the best of the Reds and Bench in the deciding Game 5, which the Mets won to advance to the World Series against the Oakland A's.
In 1974, Bench led the league with 129 RBI and scored 108 runs, becoming only the fourth catcher in major league history with 100 or more runs and RBI in the same season. The Reds won the second-most games in the majors (98) but lost the West Division to the Los Angeles Dodgers. In 1975, the Reds finally broke through in the post season. Bench contributed 28 home runs and 110 RBI. Cincinnati swept the Pirates in three games to win the NLCS, and defeated the Boston Red Sox in a memorable seven-game World Series.
Bench struggled with ailing shoulders in 1976 and had one of his least productive years, with only 16 home runs and 74 RBIs. He nevertheless finished with an excellent postseason, starting with a 4-for-12 (.333) performance in the NLCS sweep over the Philadelphia Phillies. The World Series provided a head-to-head match-up with the Yankees' all-star catcher, Thurman Munson. Bench rose to the occasion, hitting .533 with two home runs, while Munson also hit well, with a .529 average. The Reds won in a four-game sweep and Bench was named the Series' MVP. At the post-World Series press conference, Reds manager Sparky Anderson was asked by a journalist to compare Munson with his catcher. Anderson replied, "I don't want to embarrass any other catcher by comparing him to Johnny Bench."
Bench bounced back in 1977 with 31 home runs and 109 RBIs, but the Dodgers won two straight NL pennants. The Reds reached the postseason just once more in his career, in 1979, but were swept in three straight games in the NLCS by the Pittsburgh Pirates.
For the last three seasons of his career, Bench moved out from behind the plate, catching only 13 games, while primarily becoming a corner infielder (first or third base). The Cincinnati Reds proclaimed Saturday, September 17, 1983, "Johnny Bench Night" at Riverfront Stadium, in which he hit his 389th and final home run, a line drive to left in the third inning before a record crowd. He retired at the end of the season at age 35.
Bench had 2,048 hits for a .267 career batting average with 389 home runs and 1,376 RBI during his 17-year Major League career, all spent with the Reds. He retired as the career home run leader for catchers, a record which stood until surpassed by Carlton Fisk and the current record holder, Mike Piazza. Bench still holds the Major League record for the most grand slam home runs by a catcher with 10. In his career, Bench earned 10 Gold Gloves, was named to the National League All-Star team 14 times, and won two Most Valuable Player Awards. He led the National League three times in caught stealing percentage and ended his career with a .990 fielding percentage at catcher and an overall .987 fielding percentage. He caught 118 shutouts during his career, ranking him 12th all-time among major league catchers. Bench also won such awards as the Lou Gehrig Award (1975), the Babe Ruth Award (1976), and the Hutch Award (1981).
Bench popularized the hinged catcher's mitt, first introduced by Randy Hundley of the Chicago Cubs. He began using the mitt after a stint on the disabled list in 1966 for a thumb injury on his throwing hand. The mitt allowed Bench to tuck his throwing arm safely to the side when receiving the pitch. By the turn of the decade, the hinged mitt became standard catchers' equipment. Having huge hands (a famous photograph features him holding seven baseballs in his right hand), Bench also tended to block breaking balls in the dirt by scooping them with one hand instead of the more common and fundamentally proper way: dropping to both knees and blocking the ball using the chest protector to keep the ball in front.
Bench has been married four times. Once hailed as "baseball's most-eligible bachelor," he shed that distinction before the 1975 season when he married Vickie Chesser, a toothpaste model who had previously dated Joe Namath. Four days after they met, Bench proposed, and they were married on February 21, 1975. Quickly, the pair realized they were incompatible, especially after Bench suggested that his wife accept "Hustler" magazine's offer for her to pose nude for $25,000. They broke up at the end of the season (Bench reportedly said to her, "Now I'm done with two things I hate: baseball and you"), divorcing after just 13 months. "I tried. I even hand-squeezed orange juice," Chesser told Phil Donahue in December 1975. "I don't think either of us had any idea what marriage was really like." After returning to Manhattan, Chesser said, "Johnny Bench is a great athlete, a mediocre everything else, and a true tragedy as a person."
Before Christmas 1987, Bench married Laura Cwikowski, an Oklahoma City model and aerobics instructor. They had a son, Bobby Binger Bench (named for Bob Hope and Bobby Knight, and Bench's hometown), before divorcing in 1995. They shared custody of their son. "He was, and is, a great dad," according to Bobby, who works in Cincinnati as a production operator on Reds broadcasts. Bench's third marriage, to Elizabeth Benton, took place in 1997. Johnny filed for divorce in 2000, citing his wife's infidelity. His fourth marriage took place in 2004, to 31-year-old Lauren Baiocchi, the daughter of pro golfer Hugh Baiocchi. After living in Palm Springs with their two sons, Justin (born 2006) and Josh (born 2010), Johnny had the urge to return to South Florida, where he lived from 2014 to 2017. The family scouted homes in Palm Beach Gardens. In the end, Lauren decided she wasn't going to move to Florida, so she and Johnny divorced. As of 2018, Bench has primary custody of the boys.
Bench was elected to the National Baseball Hall of Fame in Cooperstown, New York, in 1989 alongside Carl Yastrzemski. He was elected in his first year of eligibility, and appeared on 96% of the ballots, the third-highest percentage at that time. Three years earlier, Bench had been inducted into the Cincinnati Reds Hall of Fame in 1986 and his uniform No. 5 was retired by the team. He is currently on the board of directors for the Cincinnati Reds Hall of Fame. In 1989, he became the first individual baseball player to appear on a Wheaties box, a cereal he ate as a child.
For a time in the 1980s Bench was a commercial spokesman for Krylon paint, featuring a memorable catchphrase: "I'm Johnny Bench, and this is Johnny Bench's bench."
In 1985, Bench starred as Joe Boyd/Joe Hardy in a Cincinnati stage production of the musical "Damn Yankees", which also included Gwen Verdon and Gary Sandy. He also hosted the television series "The Baseball Bunch" from 1982 to 1985. A cast of boys and girls from the Tucson, Arizona, area would learn the game of baseball from Bench and other current and retired greats. The Chicken provided comic relief and former Los Angeles Dodgers manager Tommy Lasorda appeared as "The Dugout Wizard."
In 1986, Bench and Don Drysdale did the backup games for ABC's Sunday afternoon baseball telecasts (Al Michaels and Jim Palmer were the primary commentating crew). Keith Jackson, usually working with Tim McCarver, did the No. 2 Monday night games. Bench took a week off in June (with Steve Busby filling in), and also worked one game with Michaels as the network switched the announcer pairings. Drysdale worked the All-Star Game in Houston as an interviewer but did not resurface until the playoffs, while Bench simply disappeared, ultimately going to CBS Radio to help Brent Musburger call that year's National League Championship Series. Bench later served as color commentator on CBS Radio's World Series coverage alongside Jack Buck and later Vin Scully from 1989 to 1993. In 1994, Bench served as a field reporter for NBC/The Baseball Network's coverage of the All-Star Game from Pittsburgh.
After turning 50, Bench was a part-time professional golfer and played in several events on the Senior PGA Tour. He has a home at the Mission Hills-Gary Player Course in Rancho Mirage, California.
In 1999, Bench ranked Number 16 on "The Sporting News" list of the 100 Greatest Baseball Players. He was the highest-ranking catcher. Bench was also elected to the Major League Baseball All-Century Team as the top vote-receiving catcher. As part of the Golden Anniversary of the Rawlings Gold Glove Award, Bench was selected to the All-Time Rawlings Gold Glove Team.
Starting with the 2000 college baseball season, the best collegiate catcher annually receives the Johnny Bench Award. Notable winners include Buster Posey of Florida State University, Kelly Shoppach of Baylor University, Ryan Garko of Stanford University, and Kurt Suzuki of Cal State Fullerton.
In 2003, he guest starred on an episode of "Yes, Dear" as himself, along with Ernie Banks and Frank Robinson.
In 2008, Bench co-wrote the book "Catch Every Ball: How to Handle Life's Pitches" with Paul Daugherty, published by Orange Frazer Press. An autobiography published in 1979 called "Catch You Later" was co-authored with William Brashler. Bench has also broadcast games on television and radio, and is an avid golfer, having played in several Champions Tour tournaments.
In a September 2008 interview with Heidi Watney of the New England Sports Network, Bench, who was watching a Cleveland Indians/Boston Red Sox game at Fenway Park, did an impression of the late Chicago Cubs announcer Harry Caray after Red Sox third baseman Kevin Youkilis, a native of Cincinnati, made a tough play. While knuckleballer Tim Wakefield was on the mound for the Red Sox, Bench related a story in which then-Reds manager Sparky Anderson told him he was thinking of trading for knuckleballer Phil Niekro; Bench replied that Anderson had better trade for Niekro's catcher, too.
On September 17, 2011, the Cincinnati Reds unveiled a statue of Bench at the entrance to the Reds Hall of Fame at Great American Ball Park. The larger-than-life bronze statue, by Tom Tsuchiya, shows Bench in the act of throwing out a base runner. Bench called the unveiling of his statue his "greatest moment." | https://en.wikipedia.org/wiki?curid=16581 |
Jell-O
Jell-O is a variety of gelatin desserts (fruit-flavored gels), puddings, and no-bake cream pies. The original Jell-O gelatin dessert (genericized as jello) is the signature product of the brand. Jell-O is a registered trademark of Kraft Heinz, which is based in Chicago, Illinois.
The original gelatin dessert began in Le Roy, New York, in 1897, after Pearle Bixby Wait and his wife May trademarked the name for a product made from strawberry, raspberry, orange, or lemon flavoring added to sugar and granulated gelatin, which had been patented in powdered form in 1845. The dessert was especially popular in the 1930s and 1950s.
Jell-O is sold prepared (ready-to-eat), or in powder form, and is available in various colors and flavors. The powder contains powdered gelatin and flavorings, including sugar or artificial sweeteners. It is dissolved in hot water, then chilled and allowed to set. Fruit, vegetables, and whipped cream can be added to make elaborate snacks that can be molded into shapes. Jell-O is typically put in a refrigerator to set (become firm), after which it is eaten.
Some non-gelatin pudding and pie-filling products are sold under the Jell-O brand. Ordinary pudding is cooked on the stove top (with milk) then eaten warm or chilled until firmly set, whereas instant pudding is mixed with cold milk and chilled. To make pie fillings, the same products are prepared with less liquid.
Gelatin, a protein produced from collagen extracted from boiled bones, connective tissues, and other animal products, has been a component of food, particularly desserts, since the 15th century.
Gelatin was popularized in New York in the Victorian era with spectacular and complex jelly moulds. Gelatin was sold in sheets and had to be purified, which was time-consuming. Gelatin desserts were the province of royalty and the relatively well-to-do. In 1845, a patent for powdered gelatin was obtained by industrialist Peter Cooper, who built the first American steam-powered locomotive, the "Tom Thumb". This powdered gelatin was easy to manufacture and easier to use in cooking.
In 1897, in LeRoy, New York, carpenter and cough syrup manufacturer Pearle Bixby Wait trademarked a gelatin dessert called Jell-O. He and his wife May added strawberry, raspberry, orange, and lemon flavoring to granulated gelatin and sugar. In 1899, Jell-O was sold to Orator Francis Woodward (1856–1906), whose Genesee Pure Food Company produced the successful Grain-O health drink. Part of the legal agreement between Woodward and Wait dealt with the similarity of the Jell-O name to Grain-O.
Various elements were key to Jell-O becoming a mainstream product: new technologies (such as refrigeration, powdered gelatin, and machine packaging), home economics classes, and the company's marketing.
Initially Woodward struggled to sell the powdered product. Beginning in 1902, to raise awareness, Woodward's Genesee Pure Food Company placed advertisements in the "Ladies' Home Journal" proclaiming Jell-O to be "America's Most Famous Dessert." Jell-O was a minor success until 1904, when Genesee Pure Food Company sent armies of salesmen into the field to distribute free Jell-O cookbooks, a pioneering marketing tactic. Within a decade, three new flavors, chocolate (discontinued in 1927), cherry, and peach, were added, and the brand was launched in Canada. Celebrity testimonials and recipes appeared in advertisements featuring actress Ethel Barrymore and opera singer Ernestine Schumann-Heink. Some Jell-O illustrated advertisements were painted by Maxfield Parrish.
In 1923, the newly rechristened Jell-O Company launched D-Zerta, an artificially sweetened version of Jell-O. Two years later, Postum and Genesee merged, and in 1927 Postum acquired Clarence Birdseye's frozen foods company to form the General Foods Corporation.
By 1930, there appeared a vogue in American cuisine for congealed salads, and the company introduced lime-flavored Jell-O to complement the add-ins that cooks across the country were combining in these aspics and salads. Popular Jell-O recipes often included ingredients like cabbage, celery, green peppers, and even cooked pasta.
In 1934, sponsorship from Jell-O made comedian Jack Benny the dessert's spokesperson. At this time, Post introduced a jingle (handled by the agency Young & Rubicam) that remained familiar over several decades, in which the spelling "J-E-L-L-O" was sung over a rising five-note musical theme. The jingle was written by Don Bestor, the bandleader on Jack Benny's radio program.
In 1936, chocolate returned to the Jell-O lineup, as an instant pudding made with milk. It proved enormously popular, and over time other pudding flavors were added such as vanilla, tapioca, coconut, pistachio, butterscotch, egg custard, flan, and rice pudding.
By the 1950s, salads became so popular that Jell-O responded with savory and vegetable flavors such as celery, Italian, mixed vegetable, and seasoned tomato. These flavors have since been discontinued.
Though much of the elaborate and dainty tea time fare served between the 1920s and 1950s was luxurious and decorative, using fancy ingredients like caviar or lobster, Jell-O became an affordable ornamental ingredient that women were able to use to create feminine, light, delicate dishes that were the standard of refined tea time fare during that period. By the Jazz Age nearly 1/3 of salad recipes in an average cookbook were gelatin-based recipes including varied fillings of fruit, vegetables or even cream cheese.
Typical recipes from the early 20th century included exotic fruits like figs, dates and bananas, or lemon-flavored Jell-O paired with maraschino cherries and other ingredients like marshmallows and almonds. One sweet gelatin-based fruit dessert, called simply "Good Salad", includes vanilla pudding, tapioca pudding, pineapple, mandarin oranges and orange gelatin. The pudding mixes are made with the reserved juice from the canned fruit and the flavored gelatin, the fruits are added, and the dessert salad is allowed to set in the refrigerator and served cool.
One savory recipe collected by the "Des Moines Register", published in Iowa, is for a tomato soup gelatin salad. The salad, served chilled, is made from lemon gelatin, tomato soup, cream cheese, stuffed olives combined with various other ingredients and seasonings.
The baby boom saw a significant increase in sales for Jell-O. Young mothers did not have the supporting community structures of earlier generations, so marketers were quick to promote easy-to-prepare prepackaged foods. By this time, creating a Jell-O dessert required simply boiling water, combining the water with Jell-O, pouring the mixture into Tupperware molds, and refrigerating it for a short time.
New flavors were continually added and unsuccessful ones removed: the 1950s and 1960s brought apple, black cherry, black raspberry, grape, lemon-lime, mixed fruit, orange-banana, pineapple-grapefruit, blackberry, strawberry-banana, tropical fruit, and more intense "wild" versions of the venerable strawberry, raspberry, and cherry. In 1966, the Jell-O "No-Bake" dessert line was launched, which allowed a cheesecake to be made in 15 minutes. In 1969, Jell-O 1∗2∗3 (later Jell-O 1•2•3), a gelatin dessert that separated into three layers as it cooled, was unveiled. Until 1987, Jell-O 1•2•3 was readily found in grocery stores throughout most of the United States, but the dessert is now rare. In 1971, packaged prepared puddings called Jell-O Pudding Treats were introduced. Jell-O Whip 'n Chill, a mousse-style dessert, was introduced and widely promoted; it remains available in limited areas today. A similar dessert called Jell-O Soft Swirl was introduced in 1972; flavors included Chocolate Creme, Strawberry Creme, Vanilla Creme, and Peach Creme. Florence Henderson appeared in TV ads for this product.
In 1964, the slogan "There's always room for Jell-O" was introduced, promoting the product as a "light dessert" that could easily be consumed even after a heavy meal.
From the 1960s through the 1980s, Jell-O's sales steadily decreased. Many Jell-O dishes, such as desserts and Jell-O salads, became special-occasion foods rather than everyday items. Marketers blamed this decline on decreasing family sizes, a "fast-paced" lifestyle and women's increasing employment. By 1986, a market study concluded that mothers with young children rarely purchased Jell-O.
To turn things around, Jell-O hired Dana Gioia to stop the decline. The marketing team revisited the Jell-O recipes published in past cookbooks and rediscovered Jigglers, although the original recipe did not use that name. Jigglers are Jell-O snacks molded into fun shapes and eaten as finger food. Jell-O launched a massive marketing campaign, notably featuring Bill Cosby as spokesman. The campaign was a huge success, causing a significant market gain.
Cosby became the company's pudding spokesperson in 1974, and continued as the voice of Jell-O for almost thirty years. Over his tenure as the mouthpiece for the company, he helped introduce new products such as frozen Jell-O Pops (in gelatin and pudding varieties); the new Sugar-Free Jell-O, which replaced D-Zerta in 1984 and was sweetened with NutraSweet; Jell-O Jigglers concentrated gummi snacks; and Sparkling Jell-O, a carbonated version of the dessert touted as the "Champagne of Jell-O." In 2010, Cosby returned as Jell-O spokesperson in an on-line web series called "OBKB".
In the 1980s, a Jell-O advertising campaign slogan reminded consumers, "Don't forget—you have to remember to make it."
In 1990, General Foods was merged into Kraft Foods by parent company Philip Morris (now the Altria Group). New flavors were introduced: watermelon, blueberry, cranberry, margarita, and piña colada, among others. In 2001, the state Senate of Utah recognized Jell-O as a favorite snack food of Utah, recognizing the fundamental basis of Jell-O in Mormon cuisine such as Jell-O salad, and Governor Michael O. Leavitt declared an annual "Jell-O Week." During the 2002 Winter Olympics in Salt Lake City, the souvenir pins included one depicting green Jell-O.
In the late 1980s and early 1990s, Jell-O's family-friendly reputation was slightly tarnished by Jell-O shots and Jell-O wrestling.
At last report, over 420 million boxes of Jell-O gelatin and over 1 billion Jell-O cups were sold in the United States each year, and more than 110 products were sold under the Jell-O brand name.
Jell-O is the main ingredient in a well-known dessert, the "Jell-O mold", whose preparation requires a mold designed to hold gelatin and the addition of small quantities of chopped fruit, nuts, and other ingredients before the gelatin sets. Fresh pineapple, papaya, kiwifruit, and ginger root cannot be used because they contain enzymes that prevent gelatin from setting. In the case of pineapple juice, however, the bromelain enzyme it contains can be inactivated, without the excessive heating that would alter the flavor, by the addition of a small measured amount of capsaicin sourced from hot chilies.
An alternative recipe calls for the addition of an alcoholic beverage to the mix, contributing approximately one third to one half of the liquid added after the gelatin has dissolved in boiling water. A serving of the resulting mixture is called a "Jell-O shot" at parties. The quantity and timing of the addition of the liquor are vital: it is not possible to make Jell-O shots with liquor alone, as the colloidal proteins in dry gelatin consist of chains which require a hot liquid to denature them before they can reform as a semisolid colloidal suspension. Pure alcohol cannot be heated sufficiently to break down these proteins, as it evaporates.
Vodka or rum is commonly used in Jell-O shots, but the shots can be made with almost any liquor. It is important to adjust the proportions of alcohol and cold water to ensure that the mixture sets when experimenting with various liquors. The Jell-O shots can be served in shot glasses and/or small paper or plastic cups; the paper or plastic cups are easier to eat from, but shot glasses are more attractive. The alcohol in Jell-O shots is contained within the Jell-O, so the body absorbs it more slowly, causing people to underestimate how much alcohol they have consumed. Drinkers must monitor their intake because of this.
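As a rough illustration of the one-third-to-one-half guideline described above, the following Python sketch splits the post-boil liquid between liquor and cold water. The function name and the strict bounds check are illustrative assumptions for this example, not a tested recipe or an official formula:

```python
def jello_shot_liquids(total_liquid_ml: float, liquor_fraction: float = 1 / 3):
    """Split the liquid added after the boil between liquor and cold water.

    Per the guideline above, liquor should make up roughly one third to
    one half of the liquid added once the gelatin has dissolved; outside
    that range the mixture may fail to set.
    """
    if not (1 / 3 <= liquor_fraction <= 1 / 2):
        raise ValueError("liquor should be roughly 1/3 to 1/2 of the added liquid")
    liquor_ml = total_liquid_ml * liquor_fraction
    cold_water_ml = total_liquid_ml - liquor_ml
    return liquor_ml, cold_water_ml

# For 480 ml of liquid added after the boil, with liquor at one third:
liquor, water = jello_shot_liquids(480, 1 / 3)
```

With these numbers the split is roughly 160 ml liquor to 320 ml cold water; pushing the fraction past one half risks a mixture that never sets, as the text notes.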
American singer-songwriter Tom Lehrer claims to have invented the Jell-O shot in the 1950s to circumvent restrictions on alcoholic beverages at the army base where he was stationed. An early published recipe for an alcoholic gelatin drink dates from 1862, found in "How to Mix Drinks, or The Bon Vivant's Companion" by Jerry Thomas: his recipe for "Punch Jelly" calls for the addition of isinglass or other gelatin to a punch made from cognac, rum, and lemon juice. Thomas warns that strength of the punch is "artfully concealed" by the gelatin.
LeRoy, New York, is known as the home of Jell-O and has the only Jell-O Museum in the world, located on the main road through the small town. Jell-O was manufactured there until General Foods closed the plant in 1964 and relocated manufacturing to Dover, Delaware. The Jell-O Gallery museum is operated by the Le Roy Historical Society at the Le Roy House and Union Free School, listed on the National Register of Historic Places in 1997.
At the museum, visitors can learn about the history of the dessert from its inception. Visitors starting on East Main Street, follow Jell-O Brick Road, whose stones are inscribed with the names of former factory employees. The museum offers looks at starting materials for Jell-O, such as sturgeon bladder and calves' hooves, and various molds.
The Jell-O plant in Mason City, Iowa, produces America's supply of ready-to-eat Jell-O gelatin dessert and pudding cups.
Jack Benny's top-rated radio show did not break for commercials. Instead, announcer Don Wilson incorporated speeches about Jell-O into the program at appropriate places, to Jack's feigned comic annoyance. Lucille Ball's "My Favorite Husband", the radio predecessor to TV's "I Love Lucy", was another popular program sponsored by Jell-O for much of its 124-episode run. Ball's character Liz Cooper often opened the program with the lively greeting "Jell-O, everybody!"
Comedian Bill Cosby is associated with Jell-O and, more famously, Jell-O pudding, and he appeared in many commercials promoting both. Shows like "Mad TV", "The Simpsons" and "Saturday Night Live" parody Cosby, using Jell-O references like "pudding pop". In the 1960s, the cast of the sitcom "Hogan's Heroes" did a commercial with Carol Channing featuring Colonel Hogan, his men, Kommandant Klink and Sergeant Shultz having Jell-O and Dream Whip for dessert. Also, in the first few seasons of the first of Lucille Ball's two 1960s television series, "The Lucy Show", cast members including Vivian Vance often did commercials for Jell-O.
In 1995, Jell-O advertising carried the tagline "It's alive!" and featured the phrase "J-E-L-L-OOOOOOO!".
In August 2018, Jell-O released an animated series on YouTube and Amazon Prime Video titled "JELL-O Wobz" in partnership with DreamWorksTV.
Jell-O is mentioned in the 1936 popular song "A Fine Romance" by Dorothy Fields (with music by Jerome Kern), where it is humorously referred to as a mundane alternative to the excitement of romantic love. In 1980, the American composer William Bolcom wrote a popular humorous song about Jell-O, "Lime Jello Marshmallow Cottage Cheese Surprise", satirising its use in combined sweet and savory dishes such as Jello salad.
In 1992, Ivette Bassa won the second ever Ig Nobel Prize in chemistry for inventing blue Jell-O.
Jell-O is especially popular among members of The Church of Jesus Christ of Latter-day Saints (LDS Church), often referred to as "Mormons". The Mormon Corridor region, which has the highest Mormon populations, was nicknamed the "Jell-O Belt", referring to the 20th-century Mormon cultural stereotype that Mormons have an affinity for Jell-O. In support of this image, Jell-O was designated as Utah's official state snack food in 2001. When drafting the resolution, the Utah Legislature gave many reasons to recognize Jell-O, including that Utah had had the highest per-capita consumption of Jell-O for many years, and how citizens of Utah had rallied to "Take Back the Title" after Des Moines, Iowa, exceeded Utah in Jell-O consumption in 1999. The culture of Utah, petitions by Utahns, and campaigning by students of Brigham Young University were also mentioned as reasons for recognizing Jell-O. Bill Cosby, longtime spokesperson for the Jell-O brand, appeared before the Utah Legislature in support of the bill. "He told the assembly that he believes the reason people in Utah love Jell-O is that the snack is perfect for families -- and the people of Utah are all about family."
The 2002 LDS Cinema romantic comedy "The Singles Ward", which is filled with inside Mormon jokes and stereotypes, has a scene where someone slips and falls in Jell-O at a church social for young, single Mormons.
The stereotype of Mormons loving Jell-O does not appear to have a long history. Media reports in 1969 and 1988 on foods popular among Mormons or in Utah make no mention of Jell-O, and a 1988 article mentions Jell-O as a Lutheran tradition. In 1997, Kraft released sales figures revealing Salt Lake City to have the highest per-capita Jell-O consumption.
In 2002, the Winter Olympics in Salt Lake City issued a commemorative pin featuring green Jell-O.
Jell-O presents an issue for Jews who keep kosher. The gelatin in Jell-O typically comes from pig and cow collagen, and since pigs are not kosher animals, the major kosher certification agencies such as OK have declared it to be a non-kosher product. Kosher gelatin can be purchased, which is either made from cattle that have been slaughtered and prepared in compliance with kosher regulations, or is made from non-animal sources.
As in Judaism, pork-derived gelatin presents a problem for Muslims who keep the halal dietary rules, and those who keep halal cannot eat any kind of Jell-O.
Jell-O products are currently produced in a range of flavors; some are also available in sugar-free/low-calorie versions, some only seasonally, and some only as prepared products. | https://en.wikipedia.org/wiki?curid=16585 |
Javier Saviola
Javier Pedro Saviola Fernández (born 11 December 1981) is an Argentine former professional footballer who played as a forward.
He represented both Barcelona and Real Madrid, and also had notable spells with Benfica and Olympiacos; in 2004 he was named the youngest player on Pelé's FIFA 100 list of the 125 greatest living footballers. Due to his ancestry, he has also held Spanish nationality since 2004. He amassed La Liga totals of 196 games and 70 goals over the course of eight seasons, and he started and finished his career at River Plate.
Saviola won league titles in Argentina, Spain, Portugal and Greece during his playing career, as well as a UEFA Cup. An Argentine international for seven years, he represented his country at the 2006 World Cup and the 2004 Copa América, where Argentina reached the final. He also won a gold medal at the 2004 Olympics in Athens.
Nicknamed "El Conejo" ("The Rabbit"), Buenos Aires-born Saviola made his debut for Club Atlético River Plate at the age of 16, and went on to be a prolific goalscorer for the club.
He helped River to the 1999 "Apertura" and 2000 "Clausura" championships, and earned the 1999 South American Footballer of the Year award. Still only 18, he gained a reputation as a phenomenal prospect, and was even regarded as a potential heir to Diego Maradona, in particular after he broke the latter's 1978 record by becoming the youngest player to win the Golden Boot award.
In 2001, aged 19, Saviola moved abroad to play for FC Barcelona in a £15 million transfer. He obtained Spanish citizenship shortly after, thereby not being restricted by the Spanish league maximum on the number of non-European Union citizens allowed in each team; under coach Carles Rexach, he scored 17 goals in his first season, finishing joint-fourth top scorer in La Liga.
Saviola's second year at the Camp Nou did not start well, as he only scored two goals in the first half of the season. Radomir Antić became the new coach after Louis van Gaal was fired, and Saviola went on to net 11 goals in the latter half of the campaign. Frank Rijkaard was subsequently appointed as new manager for 2003–04, and the player scored 14 times in the league alone, but was deemed surplus to requirements at the club, as was longtime attacking partner Patrick Kluivert.
Saviola was sent on loan in the summer of 2004, moving to AS Monaco FC in Ligue 1. As he did not fit into Rijkaard's plans he was again loaned out the following year, this time to Sevilla FC, who were seeking to replace Real Madrid-bound Júlio Baptista; with the Andalusians he won his first title in Europe, the UEFA Cup, and also scored nine league goals, good enough for fifth place.
Saviola returned to Barcelona for 2006–07, playing in 18 league games, six as a starter, and netting five goals. He benefited greatly from injuries to teammates, most notably to Samuel Eto'o, and added five in as many matches in that season's Copa del Rey, notably a hat-trick against Deportivo Alavés (3–2 win at home, 5–2 aggregate).
On 10 July 2007, Real Madrid signed Saviola after his Barcelona contract expired, on a three-year deal. Although on a financially lucrative contract, he endured a difficult time at Real, being mainly restricted to cup matches and sporadic appearances (mainly as a substitute) in the league and the UEFA Champions League.
The arrival of Klaas-Jan Huntelaar limited Saviola's opportunities even more, and he finished his Real Madrid spell with five goals in 28 overall appearances.
On 26 June 2009, S.L. Benfica and Real Madrid agreed on a €5 million deal that would see Saviola play in Portugal for the next three years, with an option for one more; a €30 million release clause was included. On 16 July, he scored two goals to send his team into the Guadiana Trophy final after defeating Athletic Bilbao.
Saviola netted twice on 22 October 2009, guiding his side to a 5–0 victory over Everton in the UEFA Europa League (he would also score in the 2–0 win in Liverpool in the return match), and added another brace four days later in a 6–1 rout of C.D. Nacional in the Primeira Liga.
On 6 December 2009, Saviola scored with a chip shot against Académica de Coimbra in a 4–0 home win. On 20 December he netted the game's only goal as Benfica defeated rivals FC Porto at home; during the victorious campaign, he formed a deadly attacking partnership with Paraguayan Óscar Cardozo, the pair combining for more than 50 goals overall.
On 3 January 2010, shortly before receiving the SJPF Player of the Month award, Saviola scored another winning goal against Nacional, this time in the Taça da Liga. He was again the game's only scorer in an away defeat of Rio Ave FC, netting in the 48th minute. He scored his 19th goal overall in a 3–1 home triumph against F.C. Paços de Ferreira on 7 March, and the Lisbon club was eventually crowned league champions after a five-year wait.
In the last hours of the 2012–13 summer transfer window, Saviola agreed on a move to Málaga CF. He played 45 minutes in his first appearance, a 1–0 win at Real Zaragoza on 1 September.
On 15 September 2012, Saviola scored once and provided one assist in a 3−1 home win against Levante UD. He continued with his streak the following game, Málaga's first-ever in the Champions League group stage, netting in a 3–0 home win over FC Zenit Saint Petersburg.
On 25 July 2013, Saviola signed a two-year contract with Greek champions Olympiacos FC. He scored his first goal in the Superleague on 25 August, coming on at half-time and helping his team come from behind to win 2–1 at home to Atromitos F.C. On 10 December he netted a brace – and also missed a penalty – in a 3–1 success over R.S.C. Anderlecht also at the Karaiskakis Stadium in the group stage's last round, which helped the Piraeus team finish second and qualify at the expense of former side Benfica.
On 2 September 2014, Saviola joined Serie A club Hellas Verona FC. He made his official debut on 22 September, starting in a 2–2 home draw against Genoa CFC, and scored his first goal on 2 December, netting the only one in a home win over Perugia Calcio in the Coppa Italia. His sole league goal of the season came on 25 January 2015, the only one in a home victory over Atalanta BC.
On 30 June 2015, River Plate announced that Saviola had returned to the club. He left in January of the following year, after failing to find the net in his second spell, and subsequently retired from professional football at the age of 34.
Immediately after retiring, Saviola settled in Andorra with his family and was appointed assistant manager at FC Ordino in the Primera Divisió. In February 2018, he joined local futsal team Encamp. In April of that year, he won the principality's futsal league with the side.
Saviola starred in the 2001 edition of the FIFA U-20 World Cup, held in Argentina. He was top scorer and was voted player of the tournament, as the national team won the competition; with 11 goals in seven games, he became the record goal-scorer in the tournament's history.
Three years later, Saviola played in the 2004 Olympic Games and won the gold medal. Under coach Marcelo Bielsa he was given few playing opportunities in the senior team but, after Bielsa's resignation in 2004, new manager José Pekerman, who had also worked with him at youth level, turned the tide in the player's favour; he was also a member of the squads that reached the final of the 2004 Copa América and the 2005 FIFA Confederations Cup, netting three times in the former tournament and once in the latter.
Saviola was called up to represent Argentina at the 2006 World Cup – Luciano Figueroa and Luciano Galletti were also in contention for a place on the roster, but his excellent form for Sevilla secured his place in the squad. He scored against Ivory Coast in the country's opening game, and made two assists in the 6–0 victory over Serbia and Montenegro also in the group phase.
Saviola retired from international football on 5 December 2009, although not yet 28. He stated that he felt his career as an Argentina player had come to an end, and that he wanted to concentrate on club football.
Saviola was known for his speed, agility, dribbling and ability to score from almost any attacking position on the field. A diminutive, talented, and prolific forward, with a slender build, he was capable of playing as a striker, in a more creative role as a second striker, or even in a playmaking role as an attacking midfielder. Throughout his career, Saviola was nicknamed "El Conejo" ("The Rabbit", in Spanish), due to his appearance, and also "El Pibito" ("The Little Kid", in Spanish), a reference to compatriot Diego Maradona, who was nicknamed "El Pibe de Oro" ("The Golden Kid", in Spanish), and to whom Saviola was often compared in his youth.
Saviola was sponsored by sportswear company Nike, and appeared in commercials for the brand. In a global advertising campaign in the run-up to the 2002 World Cup in Korea and Japan, he starred in a "Secret Tournament" commercial (branded "Scorpion KO") directed by Terry Gilliam, appearing alongside footballers such as Luís Figo, Thierry Henry, Hidetoshi Nakata, Roberto Carlos, Ronaldinho, Ronaldo and Francesco Totti, with former player Eric Cantona as the tournament "referee".
Saviola won club honours with River Plate, Sevilla, Real Madrid, Benfica and Olympiacos, as well as international honours with Argentina. | https://en.wikipedia.org/wiki?curid=16588 |
Junkers Ju 87
The Junkers Ju 87 or Stuka (from "Sturzkampfflugzeug", "dive bomber") was a German dive bomber and ground-attack aircraft. Designed by Hermann Pohlmann, it first flew in 1935. The Ju 87 made its combat debut in 1937 with the Luftwaffe's Condor Legion during the Spanish Civil War and served the Axis forces in World War II.
The aircraft is easily recognisable by its inverted gull wings and fixed spatted undercarriage. Mounted on the leading edges of its faired main gear legs were the "Jericho-Trompete" (Jericho trumpet) wailing sirens, which became a propaganda symbol of German air power and of the so-called "Blitzkrieg" victories of 1939–1942. The Stuka's design included several innovations, including automatic pull-up dive brakes under both wings to ensure that the aircraft recovered from its attack dive even if the pilot blacked out from the high g-forces.
The Ju 87 operated with considerable success in close air support and anti-shipping at the outbreak of World War II. It led air assaults in the invasion of Poland in September 1939. Stukas were critical to the rapid conquest of Norway, the Netherlands, Belgium and France in 1940. Sturdy, accurate, and very effective against ground targets, the Stuka was, like many other dive bombers of the period, vulnerable to fighter aircraft. During the Battle of Britain, its lack of manoeuvrability, speed and defensive armament meant that it required a heavy fighter escort to operate effectively.
After the Battle of Britain, the Stuka was used in the Balkans Campaign, the African and Mediterranean theatres and the early stages of the Eastern Front, where it was used for general ground support, as an effective specialised anti-tank aircraft and in an anti-shipping role. Once the Luftwaffe lost air superiority, the Stuka became an easy target for enemy fighter aircraft. It was produced until 1944 for lack of a better replacement. By 1945 ground-attack versions of the Focke-Wulf Fw 190 had largely replaced the Ju 87, but it remained in service until the end of the war.
An estimated 6,500 Ju 87s of all versions were built between 1936 and August 1944.
"Oberst" Hans-Ulrich Rudel was the most successful Stuka pilot and the most highly decorated German serviceman of the Second World War.
The Ju 87's principal designer, Hermann Pohlmann, held the opinion that any dive-bomber design needed to be simple and robust. This led to many technical innovations, such as discarding the retractable undercarriage in favour of one of the Stuka's distinctive features, its fixed and "spatted" undercarriage. Pohlmann continued to develop and add to his own ideas and those of Dipl Ing Karl Plauth (who was killed in a flying accident in November 1927), and produced the Ju A 48, which underwent testing on 29 September 1928. The military version of the Ju A 48 was designated the Ju K 47.
After the Nazis came to power, the design was given priority. Despite initial competition from the Henschel Hs 123, the "Reichsluftfahrtministerium" (RLM, the German aviation ministry) turned to the designs of Hermann Pohlmann of Junkers and the co-designer of the K 47, Karl Plauth. During the trials with the K 47 in 1932, the double vertical stabilisers were introduced to give the rear gunner a better field of fire. The main, and what was to be the most distinctive, feature of the Ju 87 was its double-spar inverted gull wings.
After Plauth's death, Pohlmann continued the development of the Junkers dive bomber. The Ju A 48 registration D-ITOR, was originally fitted with a BMW 132 engine, producing 450 kW (600 hp). The machine was also fitted with dive brakes for dive testing. The aircraft was given a good evaluation and "exhibited very good flying characteristics".
Ernst Udet took an immediate liking to the concept of dive-bombing after flying the Curtiss F11C Goshawk. When Walther Wever and Robert Ritter von Greim were invited to watch Udet perform a trial flight in May 1934 at the Jüterbog artillery range, the demonstration raised doubts about the capability of the dive bomber: Udet began his dive at and released his bombs at , barely pulling out of the dive. The chief of the "Luftwaffe" Command Office, Walther Wever, and the Secretary of State for Aviation, Erhard Milch, feared that such nerves and skill could not be expected of "average pilots" in the "Luftwaffe". Nevertheless, development continued at Junkers, and Udet's "growing love affair" with the dive bomber pushed it to the forefront of German aviation development. Udet went so far as to advocate that all medium bombers should have dive-bombing capability. This demand initially doomed the only dedicated strategic heavy bomber design to enter German front-line service during the war years, the 30-metre-wingspan He 177A, to an airframe design (shaped after Udet examined its design details in November 1937) that had to be capable of "medium angle" dive-bombing missions; only in September 1942 did "Reichsmarschall" Hermann Göring exempt the He 177A, Germany's only operational heavy bomber, from a mission profile so mismatched to its large airframe.
The design of the Ju 87 had begun in 1933 as part of the "Sturzbomber-Programm". The Ju 87 was to be powered by the British Rolls-Royce Kestrel engine. Ten engines were ordered by Junkers on 19 April 1934 for £20,514, two shillings and sixpence.
The first Ju 87 prototype was built in Sweden and secretly brought to Germany in late 1934. It was to have been completed in April 1935, but, due to the inadequate strength of the airframe, construction took until October 1935. The mostly complete Ju 87 V1, W.Nr. 4921 (less non-essential parts), took off for its maiden flight on 17 September 1935. The aircraft was later given the registration D-UBYR. The flight report, by "Hauptmann" Willy Neuenhofen, stated the only problem was with the small radiator, which caused the engine to overheat.
The Ju 87 V1, powered by a Rolls-Royce Kestrel V12 cylinder liquid-cooled engine, and with a twin tail, crashed on 24 January 1936 at Kleutsch near Dresden, killing Junkers' chief test pilot, Willy Neuenhofen, and his engineer, Heinrich Kreft. The square twin fins and rudders proved too weak; they collapsed and the aircraft crashed after it entered an inverted spin during the testing of the terminal dynamic pressure in a dive. The crash prompted a change to a single vertical stabiliser tail design. To withstand the strong forces of a dive, heavy plating, with brackets riveted to the frame and longeron, was fitted to the fuselage. Other early additions included hydraulic dive brakes fitted under the leading edge, able to rotate 90°.
The RLM was still not interested in the Ju 87 and was not impressed that it relied on a British engine. In late 1935, Junkers suggested fitting a DB 600 inverted V-12 engine, with the final variant to be equipped with the Jumo 210. This was accepted by the RLM as an interim solution. The reworking of the design began on 1 January 1936. The test flight could not be carried out for over two months due to a lack of adequate aircraft. The 24 January crash had already destroyed one machine.
The second prototype was also beset by design problems. It had its twin stabilisers removed and a single tail fin installed due to fears over stability. Due to a shortage of engines, instead of a DB 600, a BMW "Hornet" engine was fitted. All these delays set back testing until 25 February 1936. By March 1936, the second prototype, the V2, was finally fitted with the Jumo 210Aa engine, which a year later was replaced by a Jumo 210 G (W.Nr. 19310). Although the testing went well, and the pilot, Flight Captain Hesselbach, praised its performance, Wolfram von Richthofen told the Junkers representative and Construction Office chief engineer that the Ju 87 stood little chance of becoming the Luftwaffe's main dive bomber, as it was underpowered in his opinion. On 9 June 1936, the RLM ordered cessation of development in favour of the Heinkel He 118, a rival design. Udet cancelled the order the next day, and development continued.
On 27 July 1936, Udet crashed the He 118 prototype, He 118 V1 D-UKYM. Charles Lindbergh was visiting Ernst Heinkel that same day, so Heinkel could only communicate with Udet by telephone. According to this account, Heinkel warned Udet about the propeller's fragility. Udet failed to heed the warning, so in a dive, the engine oversped and the propeller broke away. Immediately after this incident, Udet declared the Stuka the winner of the development contest.
Despite being chosen, the design was still lacking and drew frequent criticism from Wolfram von Richthofen. Testing of the V4 prototype (A Ju 87 A-0) in early 1937 revealed several problems. The Ju 87 could take off in and climb to in eight minutes with a bomb load, and its cruising speed was . Richthofen pushed for a more powerful engine. According to the test pilots, the Heinkel He 50 had a better acceleration rate, and could climb away from the target area much more quickly, avoiding enemy ground and air defences. Richthofen stated that any maximum speed below was unacceptable for those reasons. Pilots also complained that navigation and powerplant instruments were mixed together, and were not easy to read, especially in combat. Despite this, pilots praised the aircraft's handling qualities and strong airframe.
These problems were to be resolved by installing the DB 600 engine, but delays in development forced the installation of the Jumo 210 D inverted V-12 engine. Flight testing began on 14 August 1936. Subsequent testing and progress fell short of Richthofen's hopes, although the machine's speed was increased to at ground level and at , while maintaining its good handling ability.
The Ju 87 was a single-engined all-metal cantilever monoplane. It had a fixed undercarriage and could carry a two-person crew. The main construction material was duralumin, and the external coverings were made of duralumin sheeting. Parts that were required to be of strong construction, such as the wing flaps, were made of Pantal (a German aluminium alloy containing titanium as a hardening element) and its components made of Elektron. Bolts and parts that were required to take heavy stress were made of steel.
The Ju 87 was fitted with detachable hatches and removable coverings to aid and ease maintenance and overhaul. The designers avoided welding parts wherever possible, preferring moulded and cast parts instead. Large airframe segments were interchangeable as a complete unit, which increased speed of repair.
The airframe was also subdivided into sections to allow transport by road or rail. The wings were of standard Junkers double-wing construction. This gave the Ju 87 considerable advantage on take-off; even at a shallow angle, large lift forces were created through the aerofoil, reducing take-off and landing runs.
In accordance with the Aircraft Certification Centre for "Stress Group 5", the Ju 87 had reached the acceptable structural strength requirements for a dive bomber. It was able to withstand diving speeds of and a maximum level speed of near ground level, and a flying weight of . Performance in the diving attack was enhanced by the introduction of dive brakes under each wing, which allowed the Ju 87 to maintain a constant speed and allow the pilot to steady his aim. It also prevented the crew from suffering extreme g forces and high acceleration during "pull-out" from the dive.
The fuselage had an oval cross-section and housed, in most examples, a Junkers Jumo 211 water-cooled inverted V-12 engine. The cockpit was protected from the engine by a firewall ahead of the wing centre section where the fuel tanks were located. At the rear of the cockpit, the bulkhead was covered by a canvas cover which could be breached by the crew in an emergency, enabling them to escape into the main fuselage. The canopy was split into two sections and joined by a strong welded steel frame. The canopy itself was made of Plexiglas and each compartment had its own "sliding hood" for the two crew members.
The engine was mounted on two main support frames that were supported by two tubular struts. The frame structure was triangulated and emanated from the fuselage. The main frames were bolted onto the engine's top quarter. In turn, the frames were attached to the firewall by universal joints. The firewall itself was constructed from asbestos mesh with dural sheets on both sides. All conduits passing through had to be arranged so that no harmful gases could penetrate the cockpit.
The fuel system comprised two fuel tanks between the main (forward) and rear spars of the (inner) anhedral wing section of the port and starboard wings, each with capacity. The tanks also had a predetermined limit which, if passed, would warn the pilot via a red warning light in the cockpit. The fuel was injected via a pump from the tanks to the engine. Should this shut down, it could be pumped manually using a hand-pump on the fuel cock armature. The powerplant was cooled by a , ring-shaped aluminium water container situated between the propeller and engine. A further container of was positioned under the engine.
The control surfaces operated in much the same way as other aircraft, with the exception of the innovative automatic pull-out system. Releasing the bomb initiated the pull-out, or automatic recovery and climb, upon the deflection of the dive brakes. The pilot could override the system by exerting significant force on the control column and taking manual control.
The wing was the most unusual feature. It consisted of a single centre section and two outer sections installed using four universal joints. The centre section had a large negative dihedral (anhedral) and the outer surfaces a positive dihedral. This created the inverted gull, or "cranked", wing pattern along the leading edge. The shape of the wing improved the pilot's ground visibility and also allowed a shorter undercarriage height. The centre section protruded by only on either side.
The offensive armament was two 7.92 mm (.312 in) MG 17 machine guns, one fitted in each wing outboard of the undercarriage, fired by a mechanical pneumatic system operated from the pilot's control column. The rear gunner/radio operator operated one 7.92 mm (.312 in) MG 15 machine gun for defensive purposes.
The engine and propeller had automatic controls, and an auto-trimmer made the aircraft tail-heavy as the pilot rolled over into his dive, lining up red lines at 60°, 75° or 80° on the cockpit side window with the horizon and aiming at the target with the sight of the fixed gun. The heavy bomb was swung down clear of the propeller on crutches prior to release.
Flying at , the pilot located his target through a bombsight window in the cockpit floor. The pilot moved the dive lever to the rear, limiting the "throw" of the control column. The dive brakes were activated automatically, the pilot set the trim tabs, reduced his throttle and closed the coolant flaps. The aircraft then rolled 180°, automatically nosing over into a dive. Red tabs protruded from the upper surfaces of the wing as a visual indicator to the pilot that, in case of a g-induced black-out, the automatic dive recovery system would be activated. The Stuka dived at a 60–90° angle, holding a constant speed of 500–600 km/h (310–370 mph) due to dive-brake deployment, which increased the accuracy of the Ju 87's aim.
When the aircraft was reasonably close to the target, a light on the contact altimeter (an altimeter equipped with an electrical contact which triggers at a preset altitude) came on to indicate the bomb-release point, usually at a minimum height of . The pilot released the bomb and initiated the automatic pull-out mechanism by depressing a knob on the control column. An elongated U-shaped crutch located under the fuselage swung the bomb out of the way of the propeller, and the aircraft automatically began a 6g pullout. Once the nose was above the horizon, dive brakes were retracted, the throttle was opened, and the propeller was set to climb. The pilot regained control and resumed normal flight. The coolant flaps had to be reopened quickly to prevent overheating. The automatic pull-out was not liked by all pilots. Helmut Mahlke later said that he and his unit disconnected the system because it allowed the enemy to predict the Ju 87's recovery pattern and height, making it easier for ground defences to hit an aircraft.
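The attack profile described in the two paragraphs above is essentially a fixed sequence of steps, several of them automated. As an illustrative aid only (a toy sketch, not a flight model; the function name and the 100 m descent step are invented for the example, while the dive angle, speed and 6g pull-out come from the text):

```python
# Toy sketch of the Ju 87 diving-attack sequence described above.
# Quoted figures (60-90 degree dive, 500-600 km/h, 6g recovery) are from
# the text; the altitudes and step size below are purely illustrative.

def attack_sequence(start_altitude_m, release_altitude_m):
    """Return the ordered list of actions for one diving attack."""
    assert start_altitude_m > release_altitude_m > 0
    actions = [
        "sight target through bombsight window in cockpit floor",
        "set dive lever aft, limiting control-column throw",
        "dive brakes deploy automatically; set trim, reduce throttle, close coolant flaps",
        "roll 180 degrees and hold a 60-90 degree dive at 500-600 km/h",
    ]
    # Descend until the contact altimeter signals the preset release height.
    altitude = start_altitude_m
    while altitude - 100 >= release_altitude_m:
        altitude -= 100  # coarse descent steps, for illustration only
    actions.append("contact-altimeter light on: release bomb, crutch swings it clear of the propeller")
    actions.append("press pull-out knob: automatic 6g recovery, retract brakes, open throttle, reopen coolant flaps")
    return actions

steps = attack_sequence(4600, 450)
for step in steps:
    print(step)
```

The hard-coded action strings simply restate the procedure in order; the point of the sketch is that everything after bomb release was automatic unless the pilot overrode it at the control column.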
Physical stress on the crew was severe. Human beings subjected to more than 5g in a seated position will suffer vision impairment in the form of a grey veil known to Stuka pilots as "seeing stars". They lose vision while remaining conscious; after five seconds, they black out. The Ju 87 pilots experienced the visual impairments most during "pull-up" from a dive.
Eric "Winkle" Brown RN, a British test pilot and Commanding Officer of No. 1426 Flight RAF (the captured enemy aircraft Flight), tested the Ju 87 at RAE Farnborough. He said of the Stuka, "I had flown a lot of dive-bombers and it's the only one that you can dive truly vertically. Sometimes with the dive-bombers ... maximum dive is usually in the order of 60 degrees ... When flying the Stuka, because it's all automatic, you are really flying vertically ... The Stuka was in a class of its own."
Extensive tests were carried out by the Junkers works at their Dessau plant. It was discovered that the highest load a pilot could endure was 8.5 g for three seconds, when the aircraft was pushed to its limit by the centrifugal forces. At less than 4 g, no visual problems or loss of consciousness were experienced. Above 6 g, 50% of pilots suffered visual problems, or "greyout". From 7.5 g upwards, vision vanished altogether in 40% of cases, and black-out sometimes occurred. Despite this blindness, the pilot could maintain consciousness and was capable of "bodily reactions". After more than three seconds, half the subjects passed out. The pilot would regain consciousness two or three seconds after the centrifugal forces had dropped below 3 g, provided they had lasted no longer than three seconds. In a crouched position, pilots could withstand 7.5 g and were able to remain functional for a short duration. In this position, Junkers concluded that a proportion of pilots could withstand 8 g and perhaps 9 g for three to five seconds without vision defects which, under war conditions, was acceptable. During tests with the Ju 87 A-2, new technologies were tried out to reduce the effects of g. The pressurised cabin was of great importance during this research. Testing revealed that at high altitude, even 2 g could cause death in an unpressurised cabin and without appropriate clothing. This new technology, along with special clothing and oxygen masks, was researched and tested. When the United States Army occupied the Junkers factory at Dessau on 21 April 1945, they were impressed by and interested in the medical flight tests with the Ju 87.
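The Dessau findings above amount to a small table of g-load thresholds. Restated as an illustrative lookup (a summary aid only, not a physiological model; the function and band wordings are invented for the example, the thresholds are those quoted in the text):

```python
# g-load tolerance bands reported in the Junkers tests at Dessau,
# as quoted in the text. Illustrative summary only.

G_EFFECTS = [
    (0.0, "below 4 g: no visual problems or loss of consciousness"),
    (6.0, "about 50% of pilots suffered greyout"),
    (7.5, "vision vanished entirely in about 40% of cases; black-out possible"),
    (8.5, "limit of endurance, roughly three seconds"),
]

def effect_at(g_load):
    """Return the highest reported effect band reached at g_load."""
    result = "not measured"
    for threshold, effect in G_EFFECTS:
        if g_load >= threshold:
            result = effect
    return result

print(effect_at(7.0))  # lands in the 6 g band: greyout for ~50% of pilots
```

Note the text's caveat that a crouched position raised tolerance to 7.5 g and beyond, which a flat table like this cannot capture.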
The concept of dive bombing became so popular among the leadership of the Luftwaffe that it became almost obligatory in new aircraft designs. Later bomber models like the Junkers Ju 88 and the Dornier Do 217 were equipped for dive bombing. The Heinkel He 177 strategic bomber was initially required to have dive-bombing capability, a requirement that contributed to the failure of the design; Göring did not rescind it until September 1942.
Once the Stuka became too vulnerable to fighter opposition on all fronts, work was done to develop a replacement. None of the dedicated close-support designs on the drawing board progressed far due to the impact of the war and technological difficulties. So the Luftwaffe settled on the Focke-Wulf Fw 190 fighter aircraft, with the Fw 190F becoming the ground-attack version. The Fw 190F started to replace the Ju 87 for day missions in 1943, but the Ju 87 continued to be used as a night nuisance-raider until the end of the war.
The second prototype had a redesigned single vertical stabiliser and a Jumo 210 A engine installed, later replaced by the Jumo 210Da. The first A-series variant, the A-0, was of all-metal construction, with an enclosed cockpit under a well-framed "greenhouse" canopy. It bore twin radio masts on its aft sections, diagonally mounted to either side of the airframe's planform centreline and unique to the A version. To ease mass production, the leading edge of the wing was straightened out and the ailerons' two aerofoil sections were given smooth leading and trailing edges. The pilot could adjust the elevator and rudder trim tabs in flight, and the tail was connected to the landing flaps, which were positioned in two parts between the ailerons and fuselage. The A-0 also had a flatter engine cowling, which gave the pilot a much better field of vision. To allow the cowling to be flattened, the engine was set down nearly . The fuselage was also lowered along with the gunner's position, allowing the gunner a better field of fire.
The RLM ordered seven A-0s initially, but then increased the order to 11. Early in 1937, the A-0 was tested with varied bomb loads. The underpowered Jumo 210A, as pointed out by von Richthofen, was insufficient, and was quickly replaced with the Jumo 210D engine.
The A-1 differed from the A-0 only slightly. As well as the installation of the Jumo 210D, the A-1 had two fuel tanks built into the inner wing, but it was not armoured or protected. The A-1 was also intended to be fitted with four MG 17 machine guns in its wings, but two of these—one per side—were omitted due to weight concerns; the pair that remained were fed a total of 500 rounds of ammunition, stored in the design's characteristic transverse strut-braced, large-planform undercarriage "trousers", not used on the Ju 87B versions and onward. The pilot relied on the Revi C 21C gun sight for the two MG 17s. The gunner had a single MG 15, with 14 drums of ammunition, each containing 75 rounds. This represented a 150-round increase in this area over the Ju 87 A-0. The A-1 was also fitted with a larger propeller.
The Ju 87 was capable of carrying a bomb, but only if not carrying the rear gunner/radio operator as, even with the Jumo 210D, the Ju 87 was still underpowered for operations with more than a bomb load. All Ju 87 As were restricted to weapons (although during the Spanish Civil War missions were conducted without the gunner).
The Ju 87 A-2 was retrofitted with the Jumo 210Da fitted with a two-stage supercharger. The only further significant difference between the A-1 and A-2 was the H-PA-III controllable-pitch propeller. By mid-1938, 262 Ju 87 As had been produced, 192 from the Junkers factory in Dessau and a further 70 from Weser Flugzeugbau ("Weserflug" – WFG) in Lemwerder near Bremen. The new, more powerful, Ju 87B model started to replace the Ju 87A at this time.
Prototypes
Production variants
The Ju 87 B series was the first mass-produced variant. A total of six pre-production Ju 87 B-0s were produced, built from Ju 87 A airframes. The first production version was the Ju 87 B-1, with a considerably larger engine, its Jumo 211D generating 1,200 PS, and a completely redesigned fuselage and landing gear. It replaced the twin radio masts of the A version with a single mast mounted further forward on the "greenhouse" canopy, and used much simpler, lighter-weight wheel "spats", discarding the transverse strut bracing of the A version's main gear design. This new design was again tested in Spain, and after proving its abilities there, production was ramped up to 60 per month. As a result, by the outbreak of World War II, the Luftwaffe had 336 Ju 87 B-1s on hand.
The B-1 was also fitted with "Jericho trumpets", essentially propeller-driven sirens with a diameter of mounted on the wing's leading edge directly forward of the landing gear, or on the front edge of the fixed main gear fairing. These were used to weaken enemy morale and enhance the intimidation of dive-bombing. The devices cost 20–25 km/h (10–20 mph) of speed through drag, and over time they were omitted from many aircraft, although some remained in use. As an alternative, some bombs were fitted with whistles on the fin to produce the noise after release. The trumpets were a suggestion from Udet (though some authors say the idea originated with Adolf Hitler).
The Ju 87 B-2s that followed had some improvements and were built in several variants, including ski-equipped versions (the B-1 also had this modification) and, at the other end, a version with a tropical operation kit, the Ju 87 B-2 trop. Italy's Regia Aeronautica received B-2s and named them the "Picchiatello", while others went to the other members of the Axis, including Hungary, Bulgaria and Romania. The B-2 also had an oil hydraulic system for closing the cowling flaps. This continued in all the later designs.
Production of the Ju 87 B started in 1937. 89 B-1s were to be built at Junkers' factory in Dessau and another 40 at the Weserflug plant in Lemwerder by July 1937. Production was to be carried out by the Weserflug company after April 1938, but Junkers continued producing the Ju 87 until March 1940.
A long-range version of the Ju 87B was also built, known as the Ju 87R, the letter being an abbreviation for "Reichweite" ("(operational) range"). They were primarily intended for anti-shipping missions. The Ju 87R had a B-series airframe with an additional oil tank and fuel lines to the outer wing stations to permit the use of two standardised capacity under-wing drop tanks, used by a wide variety of Luftwaffe aircraft through most of the war. This increased total fuel capacity (500 litres in the main fuel tank, of which 480 litres were usable, plus 600 litres from the drop tanks). To prevent overload conditions, bomb-carrying ability was often restricted to a single bomb if the aircraft was fully loaded with fuel.
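The fuel arithmetic above is simple but worth making explicit. A minimal sketch, using the figures given in the text (the 300 L per-tank size is taken from the D-series passage later in the article, which cites the same standardised 300 L drop tanks; variable names are invented for the example):

```python
# Usable fuel of the long-range Ju 87R, from the figures in the text:
# a 500 L main tank of which 480 L were usable, plus two standardised
# under-wing drop tanks of 300 L each (600 L total from drop tanks).

MAIN_TANK_L = 500        # main tank capacity
MAIN_USABLE_L = 480      # usable portion of the main tank
DROP_TANK_L = 300        # per under-wing drop tank
NUM_DROP_TANKS = 2

usable_total = MAIN_USABLE_L + DROP_TANK_L * NUM_DROP_TANKS
print(usable_total)  # 1080 litres of usable fuel
```

Compared with the roughly 480 usable litres of the B-series airframe it was based on, the R-series more than doubled usable fuel, which is why bomb load had to be restricted when fully fuelled.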
The Ju 87 R-1 had a B-1 airframe with the exception of a modification in the fuselage which enabled an additional oil tank. This was installed to feed the engine due to the increase in range with the extra fuel tanks.
The Ju 87 R-2 had the same airframe as the B-2, strengthened to ensure it could withstand dives of . The Jumo 211D in-line engine was installed, replacing the R-1's Jumo 211A. Due to an increase in overall weight by , the Ju 87 R-2 was slower than the Ju 87 B-1 and had a lower service ceiling, but it had an increased range advantage of . The R-3 and R-4 were the last R variants developed, and only a few were built. The R-3 was an experimental tug for gliders, with an expanded radio system so the crew could communicate with the glider crew by way of the tow rope. The R-4 differed from the R-2 in its Jumo 211J powerplant.
Known prototypes
On 18 August 1937, the RLM decided to introduce the Ju 87 Tr(C). The Ju 87 C was intended to be a dive and torpedo bomber for the Kriegsmarine. The type was ordered into prototype production and available for testing in January 1938. Testing was given two months and was to begin in February and end in April 1938. The prototype V10 was to be a fixed wing test aircraft, while the following V11 would be modified with folding wings. The prototypes were Ju 87 B-0 airframes powered by Jumo 211 A engines. Owing to delays, the V10 was not completed until March 1938. It first flew on 17 March and was designated Ju 87 C-1. On 12 May, the V11 also flew for the first time. By 15 December 1939, 915 arrested landings on dry land had been made. It was found that the arresting gear winch was too weak and had to be replaced. Tests showed the average braking distance was . The Ju 87 V11 was designated C-0 on 8 October 1938. It was fitted out with standard Ju 87 C-0 equipment and better wing-folding mechanisms. The "carrier Stuka" was to be built at the Weserflug Company's Lemwerder plant between April and July 1940.
Among the "special" equipment of the Ju 87 C was a two-seat rubber dinghy with a flare gun, signal ammunition and other emergency supplies. A quick fuel dump mechanism and two inflatable 750 L (200 US gal) bags in each wing and a further two 500 L (130 US gal) bags in the fuselage enabled the Ju 87 C to remain afloat for up to three days in calm seas. On 6 October 1939, with the war already underway, 120 of the planned Ju 87 Tr(C)s on order at that point were cancelled. Despite the cancellation, the tests continued using catapults. The Ju 87 C had a takeoff weight of and a speed of on departure. The Ju 87 could be launched with a SC bomb and four SC bombs under the fuselage. The C-1 was to have two MG 17s mounted in the wing with a MG 15 operated by the rear gunner. On 18 May 1940, production of the C-1 was switched to the R-1.
Known prototypes
Despite the Stuka's vulnerability to enemy fighters having been exposed during the Battle of Britain, the Luftwaffe had no choice but to continue its development, as there was no replacement aircraft in sight. The result was the D-series. In June 1941, the RLM ordered five prototypes, the Ju 87 V21–25. A Daimler-Benz DB 603 powerplant was to be installed in the Ju 87 D-1, but it did not have the power of the Jumo 211 and performed "poorly" during tests and was dropped. The Ju 87 D-series featured two coolant radiators underneath the inboard sections of the wings, while the oil cooler was relocated to the position formerly occupied by the single, undernose "chin" coolant radiator. The D-series also introduced an aerodynamically refined cockpit with better visibility and space. Armour protection was increased and a new dual-barrel 7.92 mm (.312 in) MG 81Z machine gun with an extremely high rate of fire was installed in the rear defensive position. Engine power was increased again, the Jumo 211J now delivering 1,420 PS ().
Bomb-carrying ability was nearly quadrupled from in the B-version to in the D-version (maximum load for short ranges, overload condition); a typical bomb load ranged from .
The internal fuel capacity of the Ju 87D was raised to 800 L (of which 780 L were usable) by adding wing tanks while retaining the option to carry two 300 L drop tanks. Tests at Rechlin-Lärz Airfield revealed it made possible a flight duration of 2 hours and 15 minutes. With an extra two 300 L (80 US gal) fuel tanks, it could achieve four hours flight time.
The D-2 was a variant used as a glider tug, converted from older D-series airframes. It was intended as the tropical version of the D-1 and had heavier armour to protect the crew from ground fire. The armour reduced its performance and caused the Oberkommando der Luftwaffe to "place no particular value on the production of the D-2". The D-3 was an improved D-1 with more armour for its ground-attack role. Some Ju 87 D-3s were designated D-3N or D-3 trop and fitted with night or tropical equipment. The D-4 designation applied to a prototype torpedo-bomber version, which could carry an aerial torpedo on a PVC 1006 B rack; this setup would have had the capacity to carry the "Luftorpedo" LT 850, the German version of the well-proven Japanese Type 91 aerial torpedo of 848 kg (1,870 lb). The D-4 was to be converted from D-3 airframes and, in place of the carrier-specific Ju 87 C-series designs, operated from the aircraft carrier. Other modifications included a flame eliminator and, unlike earlier D variants, two 20 mm MG 151/20 cannon, while the radio operator/rear gunner's ammunition supply was increased by 1,000 to 2,000 rounds.
The Ju 87 D-5 was based on the D-3 design and was unique in the Ju 87 series as it had wings 0.6 metres (2-feet) longer than previous variants. The two 7.92 mm MG 17 wing guns were exchanged for more powerful 20 mm MG 151/20s to better suit the aircraft's ground-attack role. The window in the floor of the cockpit was reinforced and four, rather than the previous three, aileron hinges were installed. Higher diving speeds were obtained of up to . The range was recorded as at ground level and at .
The D-6, according to "Operating instructions, works document 2097", was built in limited numbers to train pilots on "rationalised versions". Due to shortages in raw materials, it did not go into mass production. The D-7 was another ground-attack aircraft based on D-1 airframes upgraded to D-5 standard (armour, wing cannons, extended wing panels), while the D-8 was similar to the D-7 but based on D-3 airframes. The D-7 and D-8 were both fitted with flame dampers, and could conduct night operations.
Production of the D-1 variant started in 1941 with 495 ordered. These aircraft were delivered between May 1941 and March 1942. The RLM wanted 832 machines produced from February 1941, and the Weserflug company was tasked with their production. From June to September 1941, 40 Ju 87 Ds were expected to be built, increasing to 90 thereafter. Various production problems were encountered: only one of the planned 48 was produced in July, and of the 25 the RLM hoped for in August 1941, none were delivered. In September, the first two of the planned 102 Ju 87s came off the production lines. The shortfalls continued to the end of 1941. During this time, the WFG plant in Lemwerder moved production to Berlin. Over 165 Ju 87s had not been delivered and production was only 23 Ju 87 Ds per month out of the 40 expected. From the spring of 1942 to the end of production in 1944, 3,300 Ju 87s, mostly D-1s, D-2s and D-5s, had been manufactured.
In January 1943, a variety of Ju 87 Ds became "test beds" for the Ju 87 G variants. At the start of 1943, the coastal Luftwaffe "Erprobungsstelle" test centre at Tarnewitz tested this combination from a static position. "Oberst" G. Wolfgang Vorwald noted the experiments were not successful, and suggested the cannon be installed on the Messerschmitt Me 410. Testing continued, and on 31 January 1943, Ju 87 D-1 W.Nr 2552 was tested by "Hauptmann" Hans-Karl Stepp near the Briansk training area. Stepp noted the increase in drag, which reduced the aircraft's speed, and that the aircraft was less agile than the existing D variants. D-1 and D-3 variants operated in combat with the BK 37 cannon in 1943.
With the G variant, the ageing airframe of the Ju 87 found new life as an anti-tank aircraft. This was the final operational version of the Stuka, and was deployed on the Eastern Front. The reverse in German military fortunes after 1943 and the appearance of huge numbers of well-armoured Soviet tanks caused Junkers to adapt the existing design to combat this new threat. The Henschel Hs 129B had proved a potent ground attack weapon, but its large fuel tanks made it vulnerable to enemy fire, prompting the RLM to say "that in the shortest possible time a replacement of the Hs 129 type must take place." With Soviet tanks the priority targets, the development of a further variant as a successor to the Ju 87D began in November 1942. On 3 November, Milch raised the question of replacing the Ju 87, or redesigning it altogether. It was decided to keep the design as it was, but the powerplant was upgraded to a Junkers Jumo 211J, and two cannons were added. The variant was also designed to carry a free-fall bomb load. Furthermore, the armoured protection of the Ilyushin Il-2 "Sturmovik"—a feature pioneered in 1916–17 by the all-metal Junkers J.I sesquiplane of Imperial Germany's Luftstreitkräfte—was copied to protect the crew from ground fire, now that the Ju 87 would be required to conduct low-level attacks.
Hans-Ulrich Rudel, a Stuka ace, had suggested using two 37 mm (1.46 in) Flak 18 guns, each one in a self-contained under-wing gun pod, as the "Bordkanone BK 3,7", after achieving success against Soviet tanks with the 20 mm MG 151/20 cannon. These gun pods were fitted to a Ju 87 D-1, W.Nr 2552. The first flight of the machine took place on 31 January 1943, piloted by Stepp. The continuing problems with the roughly two dozen Ju 88 P-1s and the slow development of the Henschel Hs 129B-3, each of them equipped with a large, PaK 40-based, autoloading Bordkanone 7,5 7.5 cm (2.95 in) cannon in a conformal gun pod beneath the fuselage, meant the Ju 87G was put into production. In April 1943, the first production Ju 87 G-1s were delivered to front line units. The two 37 mm (1.46 in) "Bordkanone" BK 3,7 cannons were mounted in under-wing gun pods, each loaded with two six-round magazines of armour-piercing tungsten carbide-cored ammunition. With these weapons, the "Kanonenvogel" ("cannon-bird"), as it was nicknamed, proved very successful in the hands of Stuka aces such as Rudel. The G-1 was converted from older D-series airframes, retaining the smaller wing, but without the dive brakes. The G-2 was similar to the G-1 except for its use of the extended wing of the D-5. 208 G-2s were built and at least a further 22 were converted from D-3 airframes. Only a handful of production Gs were committed in the Battle of Kursk. On the opening day of the offensive, Hans-Ulrich Rudel flew the only "official" Ju 87 G, although a significant number of Ju 87 D variants were fitted with the 37 mm (1.46 in) cannon and operated as unofficial Ju 87 Gs before the battle. In June 1943, the RLM ordered 20 Ju 87Gs as production variants. The G-1 later influenced the design of the Fairchild Republic A-10 Thunderbolt II, with Hans Rudel's book "Stuka Pilot" being required reading for all members of the A-X project.
The Ju 87 had been used in the night intruder role in 1940 and 1941 during The Blitz, but the Soviet Air Force practice of harassing German ground forces at night with antiquated Polikarpov Po-2 and R-5 biplanes, which dropped flares and fragmentation bombs, inspired the Luftwaffe to form its own "Störkampfstaffeln" (harassment squadrons). On 23 July 1942, Junkers offered the Ju 87 B-2, R-2 and R-4s with "Flammenvernichter" ("flame eliminators"). On 10 November 1943, the RLM GL/C-E2 Division finally authorised the design in directive No. 1117.
The need to equip night units and the phasing out of Ju 87s from ground attack groups in favour of the Fw 190 enabled the use of D-5 airframes awaiting repair and of D-7s and D-8s already in conversion units. The latter variants were either conversions or modified D-1 and D-3 airframes. Adding the necessary equipment, radios and dampers was a requirement regardless of whether the aircraft was a production D-5 or a D-1 or D-3 that had undergone wing changes. The change in designations due to conversions was not readily apparent: the serial number and designation were applied to the fuselage by the manufacturer, and these remained unaltered by subsequent wing changes. Some sub-contractors added an "N" designation (for "Nacht") to D-3s and D-5s for clarity. Others added the Roman numeral VII to the D-7s, perhaps to reflect that the aircraft was fitted with the FuG 7 radio. A great deal of confusion exists concerning the D-7. Its existence has been questioned, but the type is listed in Junkers company records and in those of the "Der Reichsminister der Luftfahrt" and "Oberbefehlshaber der Luftwaffe Technisches Amt". There was no production "Nachtstuka", and modifications could vary according to the sub-contractor and depending on what parts were available.
A Stuka repair centre was set up at Wels-Lichtenegg. From May 1940 to November 1944, 746 were repaired and flight-tested there. In the winter of 1943/44, the "Metal Works Lower Saxony Brinckmann und Mergell" company (Menibum) converted approximately 300 Ju 87 D-3s and D-5s to night versions. Dive-brakes were removed there, while dampers were fitted to the gun muzzles and exhausts to eliminate muzzle and exhaust flash. The Jumo 211P engine was installed in some cases. It took 2,170 technicians and workers to carry out the conversions. Total figures for conversions to night flying operations are unknown. The company's equipment was seized by the Soviet Union at the end of the war, and the records were lost or destroyed. A main piece of equipment, hitherto not installed in the Ju 87, was the FuG 101 electronic radio altimeter, used to measure height above ground. Some Ju 87s also used the FuG 16Z transmitter/receiver set to augment the FuG 25 IFF (Identification Friend or Foe).
Pilots were also asked to complete the new "Blind Flying Certificate 3", which was especially introduced for this new type of operation. Pilots were trained at night, over unfamiliar terrain, and forced to rely on their instruments for direction. The Ju 87's standard Revi C12D gunsight was replaced with the new Nachtrevi ("Nightrevi") C12N. On some Ju 87s, the Revi 16D was exchanged for the Nachtrevi 16D. To help the pilot see his instrument panel, a violet light was installed.
On 15 November 1942, the Auxiliary "Staffel" was created. By mid-1943, "Luftflotte" 1 was given four "Staffeln" while "Luftflotte" 4 and "Luftwaffe Kommando Ost" (Luftwaffe Command East) were given six and two respectively. In the first half of 1943, 12 "Nachtschlachtgruppen" ("night battle groups"—NSGr) had been formed, flying a multitude of different types of aircraft, including the Ju 87, which proved itself ideally suited to the low-level slow flying needed. NSGr 1 and 2 fought with some success on the Western Front during the Battle of Normandy and the Battle of the Bulge. NSGr 7 operated in an "anti-partisan" role from bases in Albania from July 1944, replacing the German trainers previously used. The 3rd and 4th groups served on the Eastern Front, the 8th in the Arctic and the 9th in Italy. NSGr 20 fought against the Western Allied invasion of Germany in 1945. Photographic evidence exists of 16 NSGr 20 Ju 87s lining up to take off in the woods surrounding the Lippe airfield, Germany, while under attack from P-47 Thunderbolts of the IX Tactical Air Command. The unit operated against the Ludendorff Bridge during the Battle of Remagen.
Despite teething problems with the Ju 87, the RLM ordered 216 Ju 87 A-1s into production and wanted to receive delivery of all machines between January 1936 and 1938. The Junkers production capacity was fully occupied and licensing to other production facilities became necessary. The first 35 Ju 87 A-1s were therefore produced by the Weser Flugzeugbau (WFG). By 1 September 1939, 360 Ju 87 As and Bs had been built by the Junkers factories at Dessau and Weserflug factory in Lemwerder near Bremen. By 30 September 1939, Junkers had received 2,365,196 Reichsmark (RM) for Ju 87 construction orders. The RLM paid another 243,646 RM for development orders. According to audit records in Berlin, by the end of the financial year on 30 September 1941, 3,059,000 RM had been spent on Ju 87 airframes. By 30 June 1940, 697 Ju 87 B-1s and 129 B-2s alone had been produced. Another 105 R-1s and seven R-2s had been built.
The range of the B-2 was not sufficient, and it was dropped in favour of the Ju 87 R long-range versions in the second half of 1940. The 105 R-1s were converted to R-2 status and a further 616 production R-2s were ordered. In May 1941, the development of the D-1 was planned and was ordered into production by March 1942. The expansion of the Ju 88 production lines to compensate for the withdrawal of Dornier Do 17 production delayed production of the Ju 87 D. The Weserflug plant in Lemwerder experienced production shortfalls. This prompted Milch to visit and threaten the company into meeting the RLM's Ju 87 D-1 requirements on 23 February 1942. To meet these demands, 700 skilled workers were needed. Skilled workers had been called up for military service in the Wehrmacht. Junkers were able to supply 300 German workers to the Weserflug factory, and as an interim solution, Soviet prisoners of war and Soviet civilians deported to Germany. Working around the clock, the shortfall was made good. WFG received an official commendation. By May 1942, demand increased further. Chief of Procurement General Walter Herthel found that each unit needed 100 Ju 87s as standard strength and an average of 20 per month to cover attrition. Not until June–December 1942 did production capacity increase, and 80 Ju 87s were produced per month.
By 17 August 1942, production had climbed rapidly after Blohm & Voss BV 138 production was scaled down and licence work had shut down at WFG. Production now reached 150 Ju 87 D airframes per month, but spare parts were failing to reach the same production levels. Undercarriage parts were in particularly short supply. Milch ordered production increased to 350 Ju 87s per month in September 1942. This was not achievable owing to insufficient production capacity in the Reich.
The RLM considered setting up production facilities in Slovakia, but this would have delayed production until the buildings and factories could be furnished with machine tools. These tools were also in short supply, and the RLM hoped to purchase them from Switzerland and Italy. The Slovaks could provide 3,500–4,000 workers, but no technical personnel. The move would only have produced another 25 machines per month at a time when demand was increasing. In October, production plans were dealt another blow when one of WFG's plants burned down, leaving a chronic shortage of tailwheels and undercarriage parts. Junkers director and member of the Luftwaffe industry council Carl Frytag reported that by January 1943 only 120 Ju 87s could be produced at Bremen and 230 at Berlin-Tempelhof.
After evaluating Ju 87 operations on the Eastern Front, Göring ordered production limited to 200 per month in total. "General der Schlachtflieger" (General of Close-Support Aviation) Ernst Kupfer decided that continued development would "hardly bring any further tactical value". Adolf Galland, a fighter pilot with operational and combat experience in strike aircraft, said that abandoning development would be premature, but that 150 machines per month would be sufficient.
On 28 July 1943, strike and bomber production was to be scaled down, and fighter and bomber destroyer production given priority. On 3 August 1943, Milch contradicted this and declared that the increase in fighter production would not affect production of the Ju 87, Ju 188, Ju 288 and Ju 290. This was an important consideration, as the life expectancy of a Ju 87 had fallen since 1941 from 9.5 months to 5.5 months, or just 100 operational flying hours. On 26 October, "General der Schlachtflieger" Ernst Kupfer reported that the Ju 87 could no longer survive in operations and that the Focke-Wulf Fw 190F should take its place. Milch finally agreed and ordered the minimal continuance of Ju 87 D-3 and D-5 production for a smooth transition period.
In May 1944, production wound down: 78 Ju 87s were built in May and 69 rebuilt from damaged machines. In the next six months, 438 Ju 87 Ds and Gs were added to the Ju 87 force as new or repaired aircraft. It is unknown whether any Ju 87s were built from parts unofficially after production ended in December 1944.
Overall, 550 Ju 87 As and B-2s were completed at the Junkers factory in Dessau. Production of the Ju 87 R and D variants was transferred to the Weserflug company, which produced 5,930 of the 6,500 Ju 87s built in total. During the course of the war, little damage was done to the WFG plant at Lemwerder: attacks throughout 1940–45 caused little lasting damage and succeeded only in damaging some Ju 87 airframes, in contrast to the Focke-Wulf plant in Bremen. At Berlin-Tempelhof, little delay or damage was caused to Ju 87 production, despite the heavy bombings and large-scale destruction inflicted on other targets; the WFG again went unscathed. The Junkers factory at Dessau was heavily attacked, but not until Ju 87 production had ceased. The Ju 87 repair facility at the Wels aircraft works was destroyed on 30 May 1944, and the site abandoned.
Among the many German aircraft designs that participated in the Condor Legion, and as part of other German involvement in the Spanish Civil War, a single Ju 87 A-0 (the V4 prototype) was allocated serial number 29-1 and was assigned to the VJ/88, the experimental "Staffel" of the Legion's fighter wing. The aircraft was secretly loaded onto the ship "Usaramo" and departed Hamburg harbor on the night of 1 August 1936, arriving in Cádiz five days later. The only known information pertaining to its combat career in Spain is that it was piloted by "Unteroffizier" Herman Beuer, and took part in the Nationalist offensive against Bilbao in 1937. Presumably the aircraft was then secretly returned to Germany.
In January 1938, three Ju 87 As from the Condor Legion arrived. Several problems became evident—the spatted undercarriage sank into muddy airfield surfaces, and the spats were temporarily removed. The maximum bomb load could only be carried if the gunner vacated his seat, so the bomb load was otherwise restricted. These aircraft supported the Nationalist forces and carried out anti-shipping missions until they returned to Germany in October 1938. During the Catalonia Offensive in January 1939, the Junkers Ju 87 returned to Spain. On the morning of 21 January 1939, 34 Heinkel He 111s, along with escorts and three Ju 87 Bs, attacked the Port of Barcelona, five days before the city was captured by the Nationalists. 29 Republican fighters were defending the city. There were more than 100 aircraft operating over the city and, while a Ju 87 was dive-bombing a ship, a Republican Polikarpov I-15 pilot, Francisco Alférez Jiménez, claimed it shot down near El Vendrell, in Coma-ruga, but the Stuka managed to land on the beach without crashing. That was the only time a Stuka attacked the capital of Catalonia. On 24 January 1939, a group of Stukas prevented the destruction of a bridge near Barcelona by strafing the demolition engineers at Molins de Rei. During the attack, the Republican ground defenders, equipped with a quadruple PM M1910 mounting, hit one pilot, Heinz Bohne, in both legs; the Stuka crashed, seriously injuring Bohne and his machine gunner, Albert Conrad. These were the only Stuka casualties of the war.
As with the Ju 87 A-0, the B-1s were returned discreetly to the Reich. The experience of the Spanish Civil War proved invaluable: air and ground crews perfected their skills, and equipment was evaluated under combat conditions. The Ju 87 had, however, not been tested against numerous and well-coordinated fighter opposition; this lesson was learned later, at great cost to the Stuka crews.
All Stuka units were moved to Germany's eastern border in preparation for the invasion of Poland. On the morning of 15 August 1939, during a mass-formation dive-bombing demonstration for high-ranking commanders of the Luftwaffe at the Neuhammer training grounds near Sagan, 13 Ju 87s and 26 crew members were lost when they crashed into the ground almost simultaneously. The aircraft dived through cloud, expecting to release their practice bombs and pull out of the dive once below the cloud ceiling, unaware that the ceiling was too low and that unexpected ground mist had formed, leaving them no time to pull out.
On 1 September 1939, the Wehrmacht invaded Poland, beginning World War II. "Generalquartiermeister der Luftwaffe" records indicate a total force of 366 Ju 87 As and Bs were available for operations on 31 August 1939. The first Ju 87 operation was to destroy Polish demolition charges fixed to the rail bridges over the Vistula that linked Eastern Germany to the Danzig corridor and East Prussia as well as Polish Pomerania. To do this, Ju 87s were ordered to perform a low-level attack on the Polish Army garrison headquarters. II. and III./StG 1 targeted the cables along the embankment, the electricity plant and signal boxes at Dirschau (now Tczew, Poland). At exactly 04:26 CET, a "Kette" ("chain", or flight of three) of Ju 87s of 3./StG 1 led by "Staffelkapitän" "Oberleutnant" Bruno Dilly carried out the first bombing attack of the war. The Stukas attacked 11 minutes before the official German declaration of hostilities and hit their targets. Despite this success, the mission failed, as the German Army delayed its advance, allowing the Poles to carry out repairs and destroy all but one of the bridges before the Germans could reach them.
A Ju 87 achieved the first air victory of World War II on the morning of 1 September 1939, when "Leutnant" Frank Neubert of I./StG 2 "Immelmann" shot down a Polish PZL P.11c fighter while it was taking off from Balice airfield; its pilot, Captain Mieczysław Medwecki, was killed. In air-to-air combat, Ju 87 formations were well protected by German fighter aircraft, and losses were light against the tenacious but short-lived opposition.
The Ju 87s reverted to ground attack missions for the campaign after the opening air attacks. Ju 87s were involved in the controversial but effective attacks at Wieluń. The lack of anti-aircraft artillery in the Polish Army magnified the impact of the Ju 87. At Piotrków Trybunalski, I./StG 76 and I./StG 2 destroyed a Polish infantry division de-training there. Troop trains were also easy targets; StG 77 destroyed one such target at Radomsko. During the Battle of Radom, six Polish divisions trapped by encircling German forces were forced to surrender after a relentless four-day bombardment by StG 51, 76 and 77. Employed in this assault were fragmentation bombs, which caused appalling casualties among the Polish ground troops. Demoralised, the Poles surrendered. The Stukas also participated in the Battle of Bzura, which resulted in the breaking of Polish resistance. The dive bomber wings ("Sturzkampfgeschwader") alone dropped 388 tonnes (428 short tons) of bombs during this battle. During the Siege of Warsaw and the Battle of Modlin, the Ju 87 wings contributed to the defeat of well-entrenched and resolute Polish forces. IV(Stuka)./LG 1 was particularly effective in destroying the fortified positions at Modlin.
The "Luftwaffe" had a few anti-shipping naval units such as 4.(St)/TrGr 186 to deal with Polish naval forces. This unit performed effectively, sinking the 1,540-ton destroyer "Wicher" and the minelayer "Gryf" of the Polish Navy (both moored in a harbour). The torpedo boat "Mazur" (412 tons) was sunk at Oksywie; the gunboat "General Haller" (441 tons) was sunk in Hel Harbour on 6 September—during the Battle of Hel—along with the minesweeper "Mewa" (183 tons) and its sister ships "Czapla" and "Jaskolka", with several auxiliaries. The Polish naval units trapped in the Baltic were destroyed by Ju 87 operations. Once again, enemy air opposition was light, and the "Stukawaffe" (Stuka force) lost 31 aircraft during the campaign.
Operation Weserübung began on 9 April 1940 with the invasions of Norway and Denmark. Denmark capitulated within the day; Norway continued to resist with British and French help. The campaign was not a "Blitzkrieg" of fast-moving armoured divisions supported by air power as the mountainous terrain ruled out close Panzer/Stuka cooperation. Instead, the Germans relied on paratroops transported by Junkers Ju 52s and specialised ski troops. The Ju 87s were given the role of ground attack and anti-shipping missions; they proved to be the most effective weapon in the Luftwaffe's armoury carrying out the latter task.
On 9 April, the first Stukas took off at 10:59 from occupied airfields to destroy Oscarsborg Fortress, after the loss of the German cruiser "Blücher" had disrupted the amphibious landings in Oslo through the Oslofjord. The 22 Ju 87s helped suppress the Norwegian defenders during the ensuing Battle of Drøbak Sound, but the defenders did not surrender until after Oslo had been captured; the German naval operation had failed. StG 1 caught the 735-ton Norwegian destroyer "Æger" off Stavanger and hit her in the engine room. "Æger" was run aground and scuttled. The Stuka wings were now equipped with the new Ju 87 R, which differed from the Ju 87 B in having increased internal fuel capacity and two 300-litre underwing drop tanks for greater range.
The Stukas had numerous successes against Allied naval vessels, in particular the Royal Navy, which posed a formidable threat to German naval and coastal operations. The heavy cruiser "Suffolk" was attacked on 17 April. Her stern was virtually destroyed, but she limped back to Scapa Flow with 33 dead and 38 wounded crewmen. The light cruiser squadron consisting of the sister ships "Curacoa" and "Curlew" was subjected to lengthy attacks which badly damaged the former, for one Ju 87 lost. A witness later said, "they threatened to take our masthead with them in every screaming nerve-racking dive". The same fate nearly befell the sloop "Black Swan". On 27 April, a bomb passed through the quarterdeck, a wardroom, a water tank and the 4-inch (10.2 cm) magazine and out through the hull, exploding in the fjord; the muffled explosion limited the damage to her hull. "Black Swan" fired 1,000 rounds, but failed to shoot down any of her attackers. Another warship was sunk on 30 April. The French large destroyer "Bison" was sunk along with "Afridi" by "Sturzkampfgeschwader" 1 on 3 May 1940 during the evacuation from Namsos. "Bison"s forward magazine was hit, killing 108 of the crew. "Afridi", which attempted to rescue "Bison"s survivors, was sunk with the loss of 63 sailors; 49 officers and men, 13 soldiers and 33 survivors from "Bison" were lost aboard "Afridi". All ships were targeted; armed trawlers were used under the German air umbrella in an attempt to present smaller targets, but such craft were neither armoured nor armed. The Ju 87s demonstrated this on 30 April when they sank the "Jardine" (452 tons) and "Warwickshire" (466 tons). On 15 May, the Polish troopship "Chrobry" (11,442 tons) was sunk.
The "Stukas" also had an operational effect, even when little damage was done. On 1 May 1940, Vice Admiral Lionel Wells commanded a Home Fleet expedition of seven destroyers, the heavy cruiser "Berwick", the aircraft carriers "Glorious" and "Ark Royal", and the battleship "Valiant". The carriers mounted fighter patrols over the ships evacuating troops from Åndalsnes. The "Stuka" waves (accompanied by He 111s) achieved several near misses, but were unable to obtain a hit. Nevertheless, Wells ordered that no ship was to operate within range of the Ju 87s' Norwegian airfields. The Ju 87s had, in effect, driven British sea power from the Norwegian coast. Moreover, Wells reported to the Commander-in-Chief of the Home Fleet, Admiral Charles Forbes, that carrier operations were no longer practical under the current conditions.
In the following weeks, StG 1 continued their sea operations. Off Namsos on 5 May 1940, they caught and sank the Royal Norwegian Navy transports "Aafjord" (335 tons) and "Blaafjeld" (1,146 tons). The Ju 87s then took to bombing the town and the airstrip to support the German forces under the command of Eduard Dietl. The town fell in the first week of May. In the remaining four weeks of the campaign in Norway, the Ju 87s supported German forces in containing the Allied land forces in Narvik until they withdrew in early June.
The Ju 87 units had learned lessons from the Polish and Norwegian campaigns. The failures in Poland, and of the "Stukas" of I./StG 1 to silence the Oscarsborg fort, ensured even more attention was paid to pin-point bombing during the Phoney War period. This was to pay off in the Western campaign.
When "Fall Gelb" (Case Yellow) began on 10 May 1940, the "Stuka" helped swiftly neutralise the fortress of Eben Emael, Belgium. The headquarters of the commander responsible for ordering the destruction of the Belgian Army-held bridges along the Albert Canal was stationed in the village of Lanaken, 14 km (8.7 mi) to the north. The "Stuka" demonstrated its accuracy when the small building was destroyed by four direct hits. As a result, only one of the three bridges was destroyed, allowing the German Army to rapidly advance in the opening days of the Battle of Belgium. The Ju 87 proved to be a useful asset to Army Group B in the Low Countries. In pitched battles against French armoured forces at Hannut and Gembloux, Ju 87s effectively neutralised artillery and armour.
The Ju 87s also assisted German forces in the Battle of the Netherlands. The Dutch Navy, in concert with the British, was evacuating the Dutch Royal Family and Dutch gold reserves through the country's ports. Ju 87s sank the Dutch ships "Jan Van Galen" (1,316 tons) and "Johan Maurits Van Nassau" (1,520 tons) as they provided close-shore artillery support at Waalhaven and the Afsluitdijk. The British "Valentine" was crippled, beached and scuttled, while "Winchester", "Whitley" and "Westminster" were damaged. "Whitley" was later beached and scuttled after an air attack on 19 May.
The Ju 87 units were also instrumental in the Battle of France. It was here that most of the Ju 87-equipped units were concentrated. They assisted in the breakthrough at Sedan, the critical and first major land battle of the war on French territory. The "Stukawaffe" flew 300 sorties against French positions, with StG 77 alone flying 201 individual missions. The Ju 87s benefited from heavy fighter protection from Messerschmitt Bf 109 units. When resistance was organised, the Ju 87s could be vulnerable. For example, on 12 May, near Sedan, six French Curtiss H-75s from the fighter group Groupe de Chasse I/5 attacked a formation of Ju 87s, claiming 11 out of 12 unescorted Ju 87s without loss (the Germans recorded six losses over Sedan in total). For the most part, Allied opposition was disorganised. During the battles of Montcornet, Arras, Boulogne and Calais, Ju 87 operations broke up counterattacks and offered pin-point aerial artillery support for German infantry.
The Luftwaffe benefited from excellent ground-to-air communications throughout the campaign. Radio equipped forward liaison officers could call upon the Stukas and direct them to attack enemy positions along the axis of advance. In some cases the Stukas responded in 10–20 minutes. "Oberstleutnant" Hans Seidemann (Richthofen's Chief of Staff) said that "never again was such a smoothly functioning system for discussing and planning joint operations achieved."
During the Battle of Dunkirk, many Allied ships were lost to Ju 87 attacks while evacuating British and French troops. A French destroyer was sunk on 21 May 1940, followed by the paddle steamer "Crested Eagle" on 28 May. The French Channel steamer "Côte d'Azur" (3,047 tons) followed. The Ju 87s operated to maximum effectiveness when the weather allowed. RAF fighter units were held back, and Allied air cover was patchy at best. On 29 May, the Royal Navy destroyer HMS "Grenade" was severely damaged by a Ju 87 attack within Dunkirk's harbour, and subsequently sank. The French destroyer "Mistral" was crippled by bomb damage the same day. "Jaguar" and "Verity" were badly damaged, while the trawlers "Calvi" and "Polly Johnson" (363 and 290 tons) disintegrated under bombardment. The merchant ship "Fenella" (2,376 tons) was sunk after having taken on 600 soldiers. The attacks brought the evacuation to a halt for a time. The ferries "Lorina" and "Normannia" (1,564 and 1,567 tons) were also sunk. By 29 May, the Allies had lost 31 vessels sunk and 11 damaged. On 1 June, the Ju 87s sank the "Halcyon"-class minesweeper "Skipjack", while the destroyer "Keith" was sunk and "Basilisk" was crippled before being scuttled by "Whitehall". "Whitehall" was later badly damaged and, along with "Ivanhoe", staggered back to Dover. "Havant", commissioned for just three weeks, was sunk, and in the evening the French destroyer "Foudroyant" sank after a mass attack. Further victories against shipping were claimed before nightfall on 1 June. The steamer "Pavon" was lost while carrying 1,500 Dutch soldiers, most of whom were killed. The oil tanker "Niger" was also destroyed. A flotilla of French minesweepers was also lost—"Denis Papin" (264 tons), "Le Moussaillon" (380 tons) and "Venus" (264 tons).
In total, 89 merchantmen (of 126,518 grt) were lost, and of the 40 Royal Navy destroyers used in the battle, eight were sunk (one to an E-boat and one to a submarine) and a further 23 damaged and put out of service.
The campaign ended after the French surrender on 25 June 1940. Allied air power had been ineffective and disorganised, and as a result, "Stuka" losses were mainly due to ground fire. 120 machines, one-third of the Stuka force, were destroyed or damaged by all causes from 10 May to 25 June 1940.
For the Battle of Britain, the Luftwaffe's order of battle included bomber wings equipped with the Ju 87: Lehrgeschwader 2's IV.(St), Sturzkampfgeschwader 1's III. Gruppe, Sturzkampfgeschwader 2's III. Gruppe, Sturzkampfgeschwader 51 and Sturzkampfgeschwader 3's I. Gruppe were committed to the battle. As an anti-shipping weapon, the Ju 87 proved potent in the early stages of the battle. On 4 July 1940, StG 2 made a successful attack on a convoy in the English Channel, sinking four freighters: "Britsum", "Dallas City", "Deucalion" and "Kolga". Six more were damaged. That afternoon, 33 Ju 87s of III./StG 51, avoiding Royal Air Force (RAF) interception, delivered the single most deadly air assault on British territory in history, sinking the 5,500-ton anti-aircraft ship "Foylebank" in Portland Harbour and killing 176 of its 298 crew. One of "Foylebank's" gunners, Leading Seaman John F. Mantle, continued to fire on the Stukas as the ship sank. He was awarded a posthumous Victoria Cross for remaining at his post despite being mortally wounded. Mantle may have been responsible for the single Ju 87 lost during the raid.
During August, the Ju 87s also had some success. On 13 August the main German attacks on airfields opened; the day was known to the Luftwaffe as "Adlertag" ("Eagle Day"). Bf 109s of Jagdgeschwader 26 (JG 26) were sent out in advance of the main strike and drew off RAF fighters, allowing 86 Ju 87s of StG 1 to attack RAF Detling in Kent unhindered. The attack killed the station commander, destroyed 20 RAF aircraft on the ground and wrecked a great many of the airfield's buildings. Detling was not an RAF Fighter Command station.
The Battle of Britain proved for the first time that the Junkers Ju 87 was vulnerable in hostile skies against well-organised and determined fighter opposition. The Ju 87, like other dive bombers, was slow and possessed inadequate defences. Furthermore, it could not be effectively protected by fighters because of its low speed, and the very low altitudes at which it ended its dive bomb attacks. The Stuka depended on air superiority, the very thing being contested over Britain. It was withdrawn from attacks on Britain in August after prohibitive losses, leaving the Luftwaffe without precision ground-attack aircraft.
Steady losses had occurred throughout their participation in the battle. On 18 August, known as the Hardest Day because both sides suffered heavy losses, the Stuka was withdrawn after 16 were destroyed and many others damaged. According to the Generalquartiermeister der Luftwaffe, 59 Stukas had been destroyed and 33 damaged to varying degrees in six weeks of operations. Over 20% of the total Stuka strength had been lost between 8 and 18 August, and the myth of the Stuka was shattered. The Ju 87s did succeed in sinking six warships and 14 merchant ships, badly damaging seven airfields and three Chain Home radar stations, and destroying 49 British aircraft, mainly on the ground.
On 19 August, the units of VIII. Fliegerkorps moved up from their bases around Cherbourg-Octeville and concentrated in the Pas de Calais under Luftflotte 2, closer to the area of the proposed invasion of Britain. On 13 September, the Luftwaffe targeted airfields again, with a small number of Ju 87s crossing the coast at Selsey and heading for Tangmere. After a lull, anti-shipping attacks were resumed by some Ju 87 units from 1 November 1940, as part of the new winter tactic of enforcing a blockade. Over the next 10 days, seven merchant ships were sunk or damaged, mainly in the Thames Estuary, for the loss of four Ju 87s. On 14 November, 19 Stukas from III./St.G 1, with an escort drawn from JG 26 and JG 51, went out against another convoy; as no targets were found over the estuary, the Stukas attacked Dover, their alternative target.
Bad weather resulted in a decline in anti-shipping operations, and before long the Ju 87 groups began re-deploying to Poland as part of the concealed build-up for Operation Barbarossa. By spring 1941, only St.G 1, with 30 Ju 87s, remained facing the United Kingdom. Operations on a small scale continued throughout the winter months into March. Targets included ships at sea, the Thames Estuary, the Chatham naval dockyard and Dover, and night-bomber sorties were made over the Channel. These attacks were resumed the following winter.
After the Italian defeats in the Italo-Greek War and Operation Compass in North Africa, the Oberkommando der Wehrmacht ordered the deployment of German forces to these theatres. Amongst the Luftwaffe contingent deployed was the command unit StG 3, which touched down in Sicily in December 1940. In the next few days, two groups - 80 Stukas - were deployed under X. Fliegerkorps.
The first task of the "Korps" was to attack British shipping passing between Sicily and Africa, in particular the convoys aimed at re-supplying Malta. The Ju 87s first made their presence felt by subjecting the British aircraft carrier to heavy attack. The crews were confident that they could sink it, as the flight deck presented a large target. On 10 January 1941, the Stuka crews were told that four direct hits with bombs would be enough to sink the carrier. The Ju 87s delivered six hits and three damaging near-misses, but the ship's engines were untouched and she reached the besieged harbour of Malta.
The "Regia Aeronautica" was equipped for a while with the Stukas. In 1939, the Italian government asked the RLM to supply 100 Ju 87s. Italian pilots were sent to Graz in Austria to be trained for dive-bombing aircraft. In the spring of 1940, between 72 and 108 Ju 87 B-1s, some of them ex-Luftwaffe aircraft, were delivered to 96° "Gruppo Bombardamento a Tuffo". The Italian Stuka, renamed "Picchiatello", was in turn assigned to "Gruppi" 97°, 101° and 102°. The "Picchiatelli" were used against Malta, Allied convoys in Mediterranean and in North Africa (where they took part in conquering Tobruk). They were used by the Regia Aeronautica up to 1942.
Some of the "Picchiatelli" saw action in the opening phase of the Italian invasion of Greece in October 1940. Their numbers were low and ineffective in comparison to German operations. The Italian forces were quickly pushed back. By early 1941, the Greeks had pushed into Italian-occupied Albania. Once again, Hitler decided to send military aid to his ally. Before the Luftwaffe could intervene, the Italian Ju 87s achieved some successes. 97 "Gruppo" (Group) and its 239 "Squadriglia" (Squadron) sinking the Hellenic Navy freighter "Susanah" off Corfu on 4 April 1941 while the torpedo boat "Proussa" was sunk later in the day. On 21 April the Greek freighter "Ioanna" was sunk and they accounted for the British tanker "Hekla" off Tobruk on 25 May and then the Royal Australian Navy destroyer "Waterhen" on 20 June. The British gunboat "Cricket" and supply submarine "Cachalot" became victims. The former was crippled and later sunk by Italian warships.
In March, the pro-German Yugoslav government was toppled. A furious Hitler ordered the attack to be expanded to include Yugoslavia. Operation Marita commenced on 7 April. The Luftwaffe committed StG 1, 2 and 77 to the campaign. The Stuka once again spearheaded the air assault, with a front line strength of 300 machines, against minimal Yugoslav resistance in the air, allowing the Stukas to develop a fearsome reputation in this region. Operating unmolested, they took a heavy toll of ground forces, suffering only light losses to ground fire. The effectiveness of the dive bombers helped bring about Yugoslav capitulation in ten days. The Stukas also took a peripheral part in Operation Punishment, Hitler's retribution bombing of Belgrade. The dive bombers were to attack airfields and anti-aircraft gun positions as the level bombers struck civilian targets. Belgrade was badly damaged, with 2,271 people killed and 12,000 injured.
In Greece, despite British aid, little air opposition was encountered. As the Allies withdrew and resistance collapsed, the Allies began evacuating to Crete. The Stukas inflicted severe damage on Allied shipping. On 22 April, the 1,389 ton destroyers "Psara" and "Ydra" were sunk. In the next two days, the Greek naval base at Piraeus lost 23 vessels to Stuka attack.
During the Battle of Crete, the Ju 87s also played a significant role. On 21–22 May 1941, the Germans attempted to send in reinforcements to Crete by sea but lost 10 vessels to "Force D" under the command of Rear Admiral Irvine Glennie. The cruiser force compelled the remaining German ships to retreat. The Stukas were called upon to deal with the British naval threat. On 21 May, the destroyer was sunk and the next day the battleship was damaged and the cruiser was sunk, with the loss of 45 officers and 648 ratings. The Ju 87s also crippled the cruiser that morning (she was later finished off by Bf 109 fighter-bombers) while sinking the destroyer with one hit. As the Battle of Crete drew to a close, the Allies began yet another withdrawal. On 23 May, the Royal Navy lost the destroyers and , followed by on 26 May; "Orion" and "Dido" were also severely damaged. "Orion" had been evacuating 1,100 soldiers to North Africa; 260 of them were killed and another 280 wounded.
The dive bomber wing supported "Generalfeldmarschall" Erwin Rommel's Afrika Korps in its two-year campaign in North Africa; its other main task was attacking Allied shipping. In 1941, Ju 87 operations in North Africa were dominated by the Siege of Tobruk, which lasted for over seven months. It served during the Battle of Gazala and the First Battle of El Alamein, as well as the decisive Second Battle of El Alamein, which drove Rommel back to Tunisia. As the tide turned and Allied air power grew in the autumn of 1942, the Ju 87 became very vulnerable and losses were heavy. The entry of the Americans into North Africa during Operation Torch made the situation far worse; the Stuka was obsolete in what was now a fighter-bomber's war. The Bf 109 and Fw 190 could at least fight enemy fighters on equal terms after dropping their ordnance but the Stuka could not. The Ju 87's vulnerability was demonstrated on 11 November 1942, when 15 Ju 87 Ds were shot down by United States Army Air Forces (USAAF) Curtiss P-40Fs in minutes.
By 1943, the Allies enjoyed air supremacy in North Africa. The Ju 87s ventured out in "Rotte" strength only, often jettisoning their bombs at the first sight of enemy aircraft. Adding to this trouble, the German fighters had only enough fuel to cover the Ju 87s on takeoff, their most vulnerable point. After that, the Stukas were on their own.
The dive bombers continued operations in southern Europe; after the Italian surrender in September 1943, the Ju 87 participated in the last campaign-sized victory over the Western Allies, the Dodecanese Campaign. The Dodecanese Islands had been occupied by the British; the Luftwaffe committed 75 Stukas of StG 3, based at Megara (I./StG 3) and Argos (II./StG 3; from 17 October on Rhodes), to recover the islands. With the RAF bases far away, the Ju 87 helped the German landing forces rapidly conquer the islands. On 5 October the minelayer "Lagnano" was sunk along with a patrol vessel, a steam ship and the light tank carrier "Porto Di Roma". On 24 October Ju 87s sank the landing craft "LCT115" and cargo ship "Taganrog" at Samos. On 31 October the light cruiser "Aurora" was put out of action for a year. The light cruisers "Penelope" and "Carlisle" were badly damaged by StG 3 and the destroyer "Panther" was also sunk by Ju 87s before the capitulation of the Allied force. It proved to be the Stuka's final victory against the British.
On 22 June 1941, the Wehrmacht commenced Operation Barbarossa, the invasion of the Soviet Union. The Luftwaffe order of battle of 22 June 1941 contained four dive bomber wings. "VIII. Fliegerkorps" was equipped with units "Stab", II. and III./StG 1. Also included were "Stab", I., II. and III. of "Sturzkampfgeschwader 2" Immelmann. Attached to "II. Fliegerkorps", under the command of "General der Flieger" Bruno Loerzer, were "Stab", I., II. and III. of StG 77. "Luftflotte 5", under the command of "Generaloberst" Hans-Jürgen Stumpff, operating from Norway's Arctic Circle, was allotted IV. "Gruppe" (St)/"Lehrgeschwader 1" ("LG 1").
The first Stuka loss on the Soviet-German front occurred early on the morning of 22 June, at 03:40–03:47. While being escorted by Bf 109s from JG 51 to attack Brest Fortress, "Oberleutnant" Karl Führing of StG 77 was shot down by an I-153. The dive bomber wing suffered only two losses on the opening day of Barbarossa. As a result of the Luftwaffe's attention, the Soviet Air Force in the western Soviet Union was nearly destroyed. The official report claimed 1,489 Soviet aircraft destroyed. Göring ordered this checked. After picking their way through the wreckage across the front, Luftwaffe officers found that the tally exceeded 2,000. In the next two days, the Soviets reported the loss of another 1,922 aircraft. Soviet aerial resistance continued but ceased to be effective, and the Luftwaffe maintained air superiority until the end of the year.
The Ju 87 took a huge toll on Soviet ground forces, helping to break up counterattacks of Soviet armour, eliminating strongpoints and disrupting enemy supply lines. A demonstration of the Stukas' effectiveness occurred on 5 July, when StG 77 knocked out 18 trains and 500 vehicles. As the 1st and 2nd Panzer Groups forced bridgeheads across the Dnieper river and closed in on Kiev, the Ju 87s again rendered invaluable support. On 13 September, Stukas from StG 1 destroyed the rail network in the vicinity as well as inflicting heavy casualties on escaping Red Army columns, for the loss of one Ju 87. On 23 September, Rudel (who was to become the most decorated serviceman in the Wehrmacht) of StG 2 sank the Soviet battleship "Marat" during an air attack on Kronstadt harbour near Leningrad, with a single bomb hit to the bow. During this action, "Leutnant" Egbert Jaeckel sank the destroyer "Minsk", while the destroyer "Steregushchiy" and submarine "M-74" were also sunk. The Stukas also crippled the battleship "Oktyabrskaya Revolutsiya" and the destroyers "Silnyy" and "Grozyashchiy" in exchange for two Ju 87s shot down.
Elsewhere on the Eastern front, the Junkers assisted Army Group Centre in its drive toward Moscow. From 13–22 December, 420 vehicles and 23 tanks were destroyed by StG 77, greatly improving the morale of the German infantry, who were by now on the defensive. StG 77 finished the campaign as the most effective dive bomber wing. It had destroyed 2,401 vehicles, 234 tanks, 92 artillery batteries and 21 trains for the loss of 25 Ju 87s to hostile action. At the end of Barbarossa, StG 1 had lost 60 Stukas in aerial combat and one on the ground. StG 2 lost 39 Ju 87s in the air and two on the ground, StG 77 lost 29 of their dive-bombers in the air and three on the ground (25 to enemy action). IV.(St)/LG1, operating from Norway, lost 24 Ju 87s, all in aerial combat.
In early 1942, the Ju 87s gave the Heer yet more valuable support. On 29 December 1941, the Soviet 44th Army landed on the Kerch Peninsula. The Luftwaffe was only able to dispatch meager reinforcements of four bomber groups ("Kampfgruppen") and two dive bomber groups belonging to StG 77. With air superiority, the Ju 87s operated with impunity. In the first 10 days of the Battle of the Kerch Peninsula, half the landing force was destroyed, while sea lanes were blocked by the Stukas inflicting heavy losses on Soviet shipping. The Ju 87 was not yet potent against Soviet armour: later versions of the T-34 tank could generally withstand Stuka attack unless a direct hit was scored, but the Soviet 44th Army had only obsolescent types with thin armour, and these were nearly all destroyed. During the Battle of Sevastopol, the Stukas repeatedly bombed the trapped Soviet forces. Some Ju 87 pilots flew up to 300 sorties against the Soviet defenders. StG 77 (Luftflotte 4) flew 7,708 combat sorties, dropping 3,537 tonnes of bombs on the city. Their efforts helped secure the capitulation of Soviet forces on 4 July.
For the German summer offensive, "Fall Blau", the Luftwaffe had concentrated 1,800 aircraft into "Luftflotte 4" making it the largest and most powerful air command in the world. The "Stukawaffe" strength stood at 151. During the Battle of Stalingrad, Stukas flew thousands of sorties against Soviet positions in the city. StG 1, 2 and 77 flew 320 sorties on 14 October 1942. As the German Sixth Army pushed the Soviets into a 1,000 metre enclave on the west bank of the Volga River, 1,208 Stuka sorties were flown against this small strip of land. The intense air attack, though causing horrific losses on Soviet units, failed to destroy them. The Luftwaffe's Stuka force made a maximum effort during this phase of the war. They flew an average of 500 sorties per day and caused heavy losses among Soviet forces, losing an average of only one Stuka per day. The Battle of Stalingrad marked the high point in the fortunes of the Junkers Ju 87 Stuka. As the strength of the Soviet Air Forces grew, they gradually wrested control of the skies from the Luftwaffe. From this point onward, Stuka losses increased.
The Stuka was also heavily involved in Operation Citadel, the Kursk offensive. The Luftwaffe committed I., II. and III./St.G 1 and III./StG 3 under the command of Luftflotte 6. I., II. and III. of StGs 2 and 3 were committed under the command of "Fliegerkorps" VIII. Rudel's cannon-equipped Ju 87 Gs had a devastating effect on Soviet armour at Orel and Belgorod. The Ju 87s participated in a huge aerial counter-offensive lasting from 16 to 31 July against a Soviet offensive at Khotynets and saved two German armies from encirclement, reducing the attacking Soviet 11th Guards Army to 33 tanks by 20 July. The Soviet offensive had been completely halted from the air, although losses were considerable. Fliegerkorps VIII lost eight Ju 87s on 8 July, six on 9 July, six on 10 July and another eight on 11 July. The Stuka arm also lost eight of its Knight's Cross of the Iron Cross holders. StG 77 lost 24 Ju 87s in the period 5–31 July (StG 77 had lost 23 in July–December 1942), while StG 2 lost another 30 aircraft in the same period. In September 1943, three of the Stuka units were re-equipped with the Fw 190F and G (ground attack versions) and began to be renamed "Schlachtgeschwader" (attack wings). In the face of overwhelming air opposition, the dive bombers required heavy protection from German fighters against Soviet fighters. Some units like SG 2 "Immelmann" continued to operate with great success throughout 1943–45, operating the Ju 87 G variants equipped with 37 mm cannons, which became tank killers, although in increasingly small numbers.
In the wake of the defeat at Kursk, Ju 87s played a vital defensive role on the southern wing of the Eastern Front. To combat the Luftwaffe, the Soviets could deploy 3,000 fighter aircraft. As a result, the Stukas suffered heavily. SG 77 lost 30 Ju 87s in August 1943, as did SG 2 "Immelmann", which also reported the loss of 30 aircraft in combat operations. Despite these losses, Ju 87s helped the XXIX Army Corps break out of an encirclement near the Sea of Azov. The Battle of Kiev also saw substantial use of the Ju 87 units, although they were again unsuccessful in stemming the Soviet advance. With the loss of air superiority, Stuka units were becoming vulnerable on the ground as well, and some Stuka aces were lost this way. In the aftermath of Kursk, Stuka strength fell to 184 aircraft in total, well below 50 percent of the required strength. On 18 October 1943, StG 1, 2, 3, 5 and 77 were renamed "Schlachtgeschwader" (SG) wings, reflecting their ground-attack role, as these combat wings were now also using ground-attack aircraft such as the Fw 190 F-series. The Luftwaffe's dive-bomber units had ceased to exist.
A few Ju 87s were also retained for anti-shipping operations in the Black Sea, a role in which the type had proved successful in the Mediterranean. In October 1943, this became evident again when StG 3 carried out several attacks against the Soviet Black Sea Fleet. On 6 October 1943 the most powerful flotilla in the fleet, comprising the "Leningrad"-class destroyers "Kharkov", "Besposhchadny" and "Sposobny", was caught and sunk by dive-bombing. After the disaster, Josef Stalin decreed that no more ships were to pass within range of German aircraft without his personal permission.
Towards the end of the war, as the Allies gained air supremacy, the Stuka was being replaced by ground-attack versions of the Fw 190. By early 1944, the number of Ju 87 units and operational aircraft terminally declined. For the Soviet summer offensive, Operation Bagration, 12 Ju 87 groups and five mixed groups (including Fw 190s) were on the Luftwaffe's order of battle on 26 June 1944. Gefechtsverband Kuhlmey, a mixed aircraft unit, which included large numbers of Stuka dive bombers, was rushed to the Finnish front in the summer of 1944 and was instrumental in halting the Soviet fourth strategic offensive. The unit claimed 200 Soviet tanks and 150 Soviet aircraft destroyed for 41 losses. By 31 January 1945, only 104 Ju 87s remained operational with their units. The other mixed "Schlacht" units contained a further 70 Ju 87s and Fw 190s between them. Chronic fuel shortages kept the Stukas grounded and sorties decreased until the end of the war in May 1945.
In the final months of the war the ground attack groups were still able to impose operational constraints upon the enemy. Most notably, the aircraft participated in the defence of Berlin. On 12 January 1945 the 1st Belorussian Front initiated the Vistula–Oder Offensive. The offensive made rapid progress. The Soviets eventually outran their air support, which was unable to use the forward, quagmire-filled airfields. The Germans, who had fallen back on air bases with good facilities and concrete runways, were able to mount uninterrupted attacks against Soviet army columns. Reminiscent of the early years, the "Luftwaffe" was able to inflict high losses largely unopposed. Over 800 vehicles were destroyed within two weeks. In the first three days of February 1945, 2,000 vehicles and 51 tanks were lost to German air attacks. The Belorussian Front was forced to abandon its attempt to capture Berlin by mid-February 1945. The Ju 87 participated in these intense battles in small numbers. It was the largest concentration of German air power since 1940, and even in February 1945 the Germans were able to challenge for air superiority on the Eastern Front. The air offensive was instrumental in saving Berlin, albeit only for three months, and the effort exhausted German fuel reserves. The contribution of the Ju 87 was exemplified by Rudel, who claimed 13 enemy tanks on 8 February 1945.
Two intact Ju 87s survive, with a third being restored:
A later, ground-attack variant, this is displayed at the Royal Air Force Museum in London; it was captured by British forces at Eggebek, Schleswig-Holstein in May 1945. It is thought to have been built in 1943–1944 as a D-5 before being rebuilt as a G-2 variant, possibly by fitting G-2 outer wings to a D-5 airframe. The wings have the hard-points for "Bordkanone BK 3,7" gun-pods, but these are not fitted. It was one of 12 captured German aircraft selected by the British for museum preservation and assigned to the Air Historical Branch. The aircraft was stored and displayed at various RAF sites until 1978, when it was moved to the RAF Museum. In 1967, permission was given to use the aircraft in the film "Battle of Britain" and it was repainted and modified to resemble a 1940 variant of the Ju 87. The engine was found to be in excellent condition and there was little difficulty in starting it, but returning the aircraft to airworthiness was considered too costly for the filmmakers, and ultimately, models were used in the film to represent Stukas. In 1998, the film modifications were removed, and the aircraft returned to the original G-2 configuration.
This aircraft is displayed in the Chicago Museum of Science and Industry. It was abandoned in North Africa and found by British forces in 1941. The Ju 87 was donated by the British government and sent to the US during the war. It was fully restored in 1974 by the EAA of Wisconsin.
One Ju 87 is under restoration:
One aircraft is being restored to airworthy condition from two wrecks, owned by Paul Allen's Flying Heritage & Combat Armor Museum. The project takes its identification from Ju 87 R-4 Werk Nr. "6234", which was built in 1941 and served with Stukageschwader 5. Shot down in April 1942 on a mission to bomb Murmansk, it was recovered in 1992. The wreck was purchased by New Zealand collector Tim Wallis, who originally planned for a rebuild to airworthy status, and later went to the Deutsches Technikmuseum in Berlin. Parts from a second airframe, a Ju 87 R-2 "Werknummer" 857509 which served bearing the "Stammkennzeichen" of code LI+KU from 1./St.G.5, and was recovered to the United Kingdom in 1998, have also been incorporated. The project was displayed in November 2018 and the restoration was stated to take between 18 months and two years to complete. Work will be conducted in a display hangar to allow the public to observe the work underway.
Other aircraft survive as wreckage recovered from crash sites:
Keanu Reeves
Keanu Charles Reeves (born September 2, 1964) is a Canadian actor, musician, film producer and director. Born in Beirut, Lebanon, Reeves grew up in Toronto, Canada. He began acting in theatre productions and in television films before making his mainstream film debut in "Youngblood" (1986). Reeves gained recognition in his breakthrough role as Ted "Theodore" Logan in the science fiction comedy "Bill & Ted's Excellent Adventure" (1989). This was followed by a supporting role in Ron Howard's comedy "Parenthood", 1991's "Point Break", a sequel, "Bill & Ted's Bogus Journey", and the independent drama "My Own Private Idaho", in which his portrayal of a street hustler received critical praise. He had a supporting role in "Bram Stoker's Dracula" (1992), which was nominated for four Academy Awards.
The action thriller "Speed" (1994), in which Reeves was cast as a police officer, garnered critical and commercial success and helped Reeves gain further recognition. He followed this with a series of box office failures, including "Johnny Mnemonic" (1995), "Chain Reaction" (1996), and "The Last Time I Committed Suicide" (1997). However, his performance in the supernatural horror "The Devil's Advocate" (1997) was well received. Global stardom followed soon after with his lead role as computer hacker Neo in the science fiction thriller "The Matrix" (1999). The film was a commercial success and won four Academy Awards, although its sequels, "Reloaded" and "Revolutions" (both in 2003), were met with a mixed reception. In 2005, Reeves played exorcist John Constantine in "Constantine" and a dentist in the comedy-drama "Thumbsucker". He also starred in the animated film "A Scanner Darkly" (2006), the romantic drama "The Lake House" (2006), the science fiction thriller "The Day the Earth Stood Still" (2008), and the crime thriller "Street Kings" (2008).
In 2013, he made his directorial debut with "Man of Tai Chi". Although he was praised for his direction, the film was a box-office bomb. In 2014, Reeves played the titular assassin in the neo-noir action thriller "John Wick", which was a commercial success and generally well received. A year later, he starred in "Knock Knock" and narrated two documentaries, including "Deep Web". 2016 saw Reeves star in five films, including "The Neon Demon", "The Bad Batch" and "The Whole Truth". He also starred in the web television series "Swedish Dicks". He returned to the "John Wick" franchise in its commercially successful sequels of 2017 and 2019, and voiced Duke Caboom in "Toy Story 4" (2019). Reeves has spent most of his career typecast as the action hero; "saving the world" is a recurring character arc in many roles he has portrayed, such as the characters of Ted "Theodore" Logan, Gautama Buddha, Neo, Johnny Mnemonic, John Constantine, and Klaatu. Besides acting, he played bass guitar for the band Dogstar and pursued other endeavors such as writing and philanthropy.
Keanu Charles Reeves was born in Beirut, Lebanon, on September 2, 1964, the son of Patricia (née Taylor), a costume designer and performer, and Samuel Nowlin Reeves Jr. His mother is English, originating from Essex. His father is from Hawaii, and is of Chinese, English, Irish, Native Hawaiian, and Portuguese descent. His father earned a GED while serving time in prison for selling heroin at Hilo International Airport. His mother was working in Beirut when she met his father, who abandoned his wife and family when Reeves was three years old. Reeves last met his father on the island of Kauai when he was 13.
After his parents divorced in 1966, his mother moved the family to Sydney, Australia, and then to New York City, where she married Paul Aaron, a Broadway and Hollywood director, in 1970. The couple moved to Toronto, Canada, and divorced in 1971. When Reeves was nine, he took part in a theatre production of "Damn Yankees". At 15, he worked as a production assistant on Aaron's films. Reeves' mother married Robert Miller, a rock music promoter, in 1976; the couple divorced in 1980. She subsequently married her fourth husband, a hairdresser named Jack Bond; the marriage lasted until 1994. Reeves and his sisters grew up primarily in the Yorkville neighborhood of Toronto, with grandparents and nannies caring for them. Because of his grandmother's ethnicity, he grew up around Chinese art, furniture, and cuisine. Reeves watched British comedy shows such as "The Two Ronnies", and his mother imparted English manners that he has maintained into adulthood.
Describing himself as a "private kid", Reeves attended four different high schools, including the Etobicoke School of the Arts, from which he was expelled. Reeves said he was expelled because he was "just a little too rambunctious and shot [his] mouth off once too often ... [he] was not generally the most well-oiled machine in the school". At De La Salle College, he was a successful ice hockey goalkeeper. Reeves had aspirations to become a professional ice hockey player for the Canadian Olympic team but decided to become an actor when he was 15. After leaving the college, he attended Avondale Secondary Alternative School, which allowed him to get an education while working as an actor. He dropped out of high school when he was 17. He obtained a green card through his American stepfather and moved to Los Angeles three years later. Reeves holds Canadian citizenship by naturalization.
In 1984, Reeves was a correspondent for the Canadian Broadcasting Corporation (CBC) youth television program "Going Great". That same year, he made his acting debut in an episode of the television series "Hangin' In". In 1985, he played Mercutio in a stage production of "Romeo and Juliet" at the Leah Posluns Theatre in North York, Ontario. He made further appearances on stage, including Brad Fraser's cult hit "Wolfboy" in Toronto. He also appeared in a Coca-Cola commercial and in the 1985 National Film Board of Canada (NFB) coming-of-age short film "One Step Away".
Reeves made a foray into television films in 1986, including NBC's "Babes in Toyland", "Act of Vengeance" and "Brotherhood of Justice". He made his first motion picture appearances in Peter Markle's "Youngblood" (1986), in which he played a goalkeeper, and in the low-budget romantic drama "Flying". He was cast as Matt in "River's Edge", a crime drama about a group of high school friends dealing with a murder case, loosely based on the 1981 murder of Marcy Renee Conrad. The film premiered in 1986 at the Toronto International Film Festival to a largely positive response. Janet Maslin of "The New York Times" described the performances of the young cast as "natural and credible", with Reeves described as "affecting and sympathetic".
Towards the end of the 1980s, Reeves starred in several dramas aimed at teen audiences, including the comedy "The Night Before" (1988), in which he starred opposite Lori Loughlin, "The Prince of Pennsylvania" (1988) and "Permanent Record" (1988). Although the latter received mixed reviews, "Variety" praised Reeves' performance, "which opens up nicely as the drama progresses". His other acting efforts included a supporting role in "Dangerous Liaisons" (1988), which earned seven nominations at the 61st Academy Awards, winning three: Best Adapted Screenplay, Best Costume Design, and Best Production Design. This was followed by "Bill & Ted's Excellent Adventure" (1989), in which he portrays a slacker who travels through time with a friend (portrayed by Alex Winter) to assemble historical figures for a school presentation. The film was generally well received by critics and grossed $40.5 million at the worldwide box office. Film review aggregator Rotten Tomatoes gave the film a 79% approval rating, with the critical consensus: "Keanu Reeves and Alex Winter are just charming, goofy, and silly enough to make this fluffy time-travel adventure work".
In 1989, Reeves starred in the comedy-drama "Parenthood", directed by Ron Howard. Nick Hilditch of the BBC gave the film three out of five stars, calling it a "feelgood movie" with an "extensive and entertaining ensemble cast". In 1990, Reeves appeared in two films: he portrayed an incompetent hitman in the black comedy "I Love You to Death" and played Martin, a radio station employee, in the comedy "Tune in Tomorrow". He also appeared in Paula Abdul's music video for "Rush Rush", which featured a "Rebel Without a Cause" motif, with him in the James Dean role.
In 1991, Reeves starred in "Bill & Ted's Bogus Journey", a sequel to "Bill & Ted's Excellent Adventure", with his co-star Alex Winter. Michael Wilmington of the "Los Angeles Times" felt that the sequel was "more imaginative, more opulent, wilder and freer, more excitingly visualized", praising the actors for their "fuller" performances. Film critic Roger Ebert thought it was "a riot of visual invention and weird humor that works on its chosen sub-moronic level ... It's the kind of movie where you start out snickering in spite of yourself, and end up actually admiring the originality that went into creating this hallucinatory slapstick". The rest of 1991 marked a significant transition in Reeves' career as he took on adult roles. He co-starred with River Phoenix in the adventure "My Own Private Idaho", playing street hustlers who embark on a journey of personal discovery. The film was written and directed by Gus Van Sant and is loosely based on Shakespeare's "Henry IV, Part 1", "Henry IV, Part 2", and "Henry V". It premiered at the 48th Venice International Film Festival, followed by a theatrical release in the United States on September 29, 1991, and earned $6.4 million at the box office. "My Own Private Idaho" was positively received, with Owen Gleiberman of "Entertainment Weekly" describing it as "a postmodern road movie with a mood of free-floating, trance-like despair ... a rich, audacious experience". "The New York Times" complimented Reeves and Phoenix for their insightful performances.
Reeves starred alongside Patrick Swayze, Lori Petty and Gary Busey in the action thriller "Point Break" (1991), directed by Kathryn Bigelow. He plays an undercover FBI agent tasked with investigating the identities of a group of bank robbers. To prepare for the film, Reeves and his co-stars took surfing lessons with professional surfer Dennis Jarvis in Hawaii. Reeves had never surfed before. Upon its release, "Point Break" was generally well-received, and a commercial success, earning $83.5 million at the box office. Reeves' performance was praised by "The New York Times" for "considerable discipline and range", adding, "He moves easily between the buttoned-down demeanor that suits a police procedural story and the loose-jointed manner of his comic roles". Writing for "The Washington Post", Hal Hinson called Reeves the "perfect choice" and praised the surfing scenes, but opined "the filmmakers have their characters make the most ludicrously illogical choices imaginable". At the 1992 MTV Movie Awards, Reeves won the Most Desirable Male award.
In 1992, he played Jonathan Harker in Francis Ford Coppola's Gothic horror "Bram Stoker's Dracula", based on Stoker's 1897 novel "Dracula". Starring alongside Gary Oldman, Winona Ryder and Anthony Hopkins, the film was critically and commercially successful, grossing $215.8 million worldwide. For his role, Reeves was required to speak with an English accent, which drew some criticism; "Overly posh and entirely ridiculous, Reeves's performance is as painful as it is hilarious", wrote Limara Salt of Virgin Media. Recalling the experience in 2015, director Coppola said, "[Reeves] tried so hard ... He wanted to do it perfectly and in trying to do it perfectly it came off as stilted". "Bram Stoker's Dracula" was nominated for four Academy Awards, winning three: Best Costume Design, Best Sound Editing and Best Makeup. The film also received four nominations at the British Academy Film Awards.
In 1991, Reeves developed an interest in a music career; he formed an alternative rock band called Dogstar, consisting of members Robert Mailhouse, Gregg Miller and Bret Domrose. Reeves played the bass guitar. In 1993, he had a role in "Much Ado About Nothing", based on Shakespeare's play of the same name. The film received positive reviews, although Reeves was nominated for a Golden Raspberry Award for Worst Supporting Actor. "The New Republic" magazine thought his casting was "unfortunate" because of his amateur performance. That same year, he starred in two more drama films, "Even Cowgirls Get the Blues" and "Little Buddha", both of which garnered a mixed-to-negative reception. "The Independent" gave "Little Buddha" a mixed review but opined Reeves was "credible" and fitting for the part.
He starred in the action thriller "Speed" (1994) alongside Sandra Bullock and Dennis Hopper, playing police officer Jack Traven, who must prevent a bus from exploding by keeping its speed above 50 mph. "Speed" was the directorial debut of Dutch director Jan de Bont. Several actors were considered for the lead role, but Reeves was chosen because de Bont was impressed with his "Point Break" performance. To look the part, Reeves shaved his head and spent two months in the gym to gain muscle mass. During production, Reeves' friend River Phoenix (his co-star in "My Own Private Idaho") died, and the filming schedule was adjusted to allow him to mourn. "Speed" was released on June 10 to critical acclaim. Gene Siskel of the "Chicago Tribune" lauded Reeves, calling him "absolutely charismatic ... giving a performance juiced with joy as he jumps through elevator shafts ... and atop a subway train". David Ansen, writing for "Newsweek", summarized "Speed" as, "Relentless without being overbearing, this is one likely blockbuster that doesn't feel too big for its britches. It's a friendly juggernaut". The film grossed $350 million from a $30 million budget and won two Academy Awards in 1995: Best Sound Editing and Best Sound.
Following "Speed", Reeves' next leading role came in 1995, in the cyberpunk action thriller "Johnny Mnemonic", based on the William Gibson story of the same name about a man with a cybernetic brain implant. As part of the film studio's marketing efforts, a CD-ROM video game was also released. The film, however, received mainly negative reviews, and critics felt Reeves was "woefully miscast". He next appeared in the romantic drama "A Walk in the Clouds" (1995), which also garnered mixed-to-negative reviews. Reeves plays a young soldier returning home from World War II, trying to settle down with a woman he married impulsively just before he enlisted. Film critic Mick LaSalle opined, ""A Walk in the Clouds" is for the most part a beautiful, well-acted and emotionally rich picture", whereas Hal Hinson from "The Washington Post" said, "The film has the syrupy, Kodak magic-moment look of a Bo Derek movie, and pretty much the same level of substance". Besides film work, Reeves returned briefly to the theatre, playing Prince Hamlet in a 1995 Manitoba Theatre Centre production of "Hamlet" in Winnipeg, Manitoba. "Sunday Times" critic Roger Lewis praised his performance, writing that he "quite embodied the innocence, the splendid fury, the animal grace of the leaps and bounds, the emotional violence, that form the Prince of Denmark ... He is one of the top three Hamlets I have seen, for a simple reason: he "is" Hamlet".
Reeves was soon drawn to science fiction roles, appearing in "Chain Reaction" (1996) with co-stars Morgan Freeman, Rachel Weisz, Fred Ward, Kevin Dunn and Brian Cox. He plays a researcher on a green energy project who has to go on the run when he is framed for murder. "Chain Reaction" was not a critical success and received mostly negative reviews; film review aggregator Rotten Tomatoes gave it a rating of 16% and described it as "a man-on-the-run thriller that mostly sticks to generic formula". Reeves' film choices after "Chain Reaction" were also critical disappointments. He starred in the independent crime comedy "Feeling Minnesota" (1996), with Vincent D'Onofrio and Cameron Diaz, which was described as "shoddily assembled, and fundamentally miscast" by Rotten Tomatoes. That same year, he turned down an offer to star in "Speed 2: Cruise Control", despite being offered a salary of $12 million. According to Reeves, this decision caused 20th Century Fox to sever ties with him for a decade. Instead, Reeves toured with his band Dogstar and appeared in the drama film "The Last Time I Committed Suicide" (1997), based on a 1950 letter written by Neal Cassady to Jack Kerouac. In a scathing review of Reeves' performance, Paul Tatara of CNN called him "void of talent ... here he is again, reciting his lines as if they're non-related words strung together as a memory exercise".
In 1997, he starred in the supernatural horror "The Devil's Advocate" alongside Al Pacino and Charlize Theron. Reeves agreed to a pay cut of several million dollars so that the film studio could afford to hire Pacino. Based on Andrew Neiderman's novel of the same name, the feature is about a successful young lawyer invited to New York City to work for a major firm, who discovers the owner of the firm is the Devil. "The Devil's Advocate" attracted positive reviews from critics. Film critic James Berardinelli called the film "highly enjoyable" and noted, "There are times when Reeves lacks the subtlety that would have made this a more multi-layered portrayal, but it's nevertheless a solid job".
In 1999, Reeves starred in the critically acclaimed science fiction film "The Matrix", the first installment in what would become "The Matrix" franchise. Reeves portrays computer programmer Thomas Anderson, a hacker using the alias "Neo", who discovers humanity is trapped inside a simulated reality created by intelligent machines. To prepare for the role in the film, written and directed by the Wachowskis, Reeves read Kevin Kelly's "Out of Control" and Dylan Evans's ideas on evolutionary psychology. The principal cast underwent months of intense training with martial arts choreographer Yuen Woo-ping to prepare for the fight scenes. "The Matrix" proved to be a box office success, and several critics considered it to be one of the best science fiction films of all time. Kenneth Turan of the "Los Angeles Times" felt it was a "wildly cinematic futuristic thriller that is determined to overpower the imagination", despite perceiving weaknesses in the film's dialogue. Janet Maslin of "The New York Times" credited Reeves for being a "strikingly chic Prada model of an action hero", and thought the martial arts stunts were its "single strongest selling point". "The Matrix" received Academy Awards for Best Film Editing, Best Sound Editing, Best Visual Effects, and Best Sound.
After the success of "The Matrix", Reeves avoided another blockbuster in favor of a lighthearted sports comedy, "The Replacements" (2000). He agreed to a pay cut to enable Gene Hackman to co-star in the film. Against his wishes, Reeves starred in the thriller "The Watcher" (2000), playing a serial killer who stalks a retired FBI agent. He said that a friend forged his signature on a contract, which he could not prove; he appeared in the film to avoid legal action. Upon its release, the film was critically panned. That year, he had a supporting role in another thriller, Sam Raimi's "The Gift", a story about a woman (played by Cate Blanchett) with extrasensory perception who is asked to help find a young woman who has disappeared. The film grossed $44 million worldwide. Film critic Paul Clinton of CNN thought the film was fairly compelling, saying of Reeves' acting: "[Raimi] managed to get a performance out of Reeves that only occasionally sounds like he's reading his lines from the back of a cereal box."
In 2001, Reeves continued to explore roles in a diverse range of genres. The first was the romantic comedy "Sweet November", a remake of the 1968 film of the same name. This was his second collaboration with Charlize Theron; the film was met with a generally negative reception. Desson Thompson of "The Washington Post" criticized it for its "syrupy cliches, greeting-card wisdom and over-the-top tragicomedy", but commended Reeves for his likability. "Hardball" (2001) marked Reeves' attempt at another sports film. Directed by Brian Robbins, it is based on the book "Hardball: A Season in the Projects" by Daniel Coyle. Reeves plays Conor O'Neill, a troubled young man who agrees to coach a Little League team from the Cabrini-Green housing project in Chicago as a condition of obtaining a loan. Film critic Roger Ebert took note of the film's desire to tackle difficult subjects and baseball coaching, but felt it "drifts above the surface", and that Reeves' performance was "glum and distant".
By 2002, his professional music career had come to an end when Dogstar disbanded. The band had released two albums during their decade together: "Our Little Visionary" in 1996 and "Happy Ending" in 2000. Sometime afterwards, Reeves performed for a year in the band Becky, founded by Dogstar band-mate Rob Mailhouse, but quit in 2005, citing a lack of interest in a serious music career. After being absent from the screen in 2002, Reeves returned to "The Matrix" sequels in 2003 with "The Matrix Reloaded" and "The Matrix Revolutions", released in May and November, respectively. Principal photography for both films was completed back-to-back, primarily at Fox Studios in Australia. "The Matrix Reloaded" garnered mostly favorable reviews; John Powers of "LA Weekly" praised the "dazzling pyrotechnics" but was critical of certain "mechanical" action scenes. Of Reeves' acting, Powers thought it was somewhat "wooden" but felt he had the ability to "exude a charmed aura". Andrew Walker, writing for the "London Evening Standard", praised the cinematography ("visually it gives full value as a virtuoso workout for your senses") but was less taken by the film's "dime-store philosophy". The film grossed $739 million worldwide.
"The Matrix Revolutions", the third installment, was met with a mixed reception. According to review aggregator Rotten Tomatoes, the consensus was that "characters and ideas take a back seat to the special effects". Paul Clinton, writing for CNN, praised the special effects but felt Reeves' character was "dazed and confused". In contrast, the "San Francisco Chronicle"'s Carla Meyer was highly critical of the special effects, writing, "[The Wachowskis'] computer-generated imagery goes from dazzling to deadening in action scenes that favor heavy, clanking weaponry over the martial-arts moves that thrilled viewers of "The Matrix" and "The Matrix Reloaded"." Nevertheless, the film grossed a healthy $427 million worldwide, although less than either of the two previous films. "Something's Gotta Give", a romantic comedy, was Reeves' last release of 2003. He co-starred with Jack Nicholson and Diane Keaton, playing Dr. Julian Mercer. "Something's Gotta Give" received generally favorable reviews.
In 2005, Reeves played the title role in "Constantine", an occult detective film about a man who has the ability to perceive and communicate with half-angels and half-demons in their true form. The film was a respectable box office hit, grossing $230 million worldwide from a $100 million budget, but attracted mixed-to-positive reviews. "The Sydney Morning Herald"'s critic noted that ""Constantine" isn't bad, but it doesn't deserve any imposing adjectives. It's occasionally cheesy, sometimes enjoyable, intermittently scary, and constantly spiked with celestial blatherskite". He next appeared in "Thumbsucker", which premiered at the Sundance Film Festival in 2005. A comedy adapted from the 1999 Walter Kirn novel of the same name, the story follows a boy with a thumb-sucking problem. Reeves and the rest of the cast garnered positive critical reviews, with "The Washington Post" describing it as "a gently stirring symphony about emotional transition filled with lovely musical passages and softly nuanced performances".
Reeves appeared in Richard Linklater's animated science fiction thriller "A Scanner Darkly", which premiered at the 2006 Cannes Film Festival. He played Bob Arctor/Fred, an undercover agent in a futuristic dystopia under high-tech police surveillance. Based on the novel of the same name by Philip K. Dick, the film was a box office failure. However, it attracted generally favorable reviews; Paul Arendt of the BBC thought the film was "beautiful to watch", but felt Reeves was outshone by his co-star Robert Downey Jr. His next role was Alex Wyler in "The Lake House" (2006), a romantic drama adaptation of the South Korean film "Il Mare" (2000), which reunited him with Sandra Bullock. Despite its box office success, Mark Kermode of "The Guardian" was highly critical, writing "this syrup-drenched supernatural whimsy achieves stupidity at a genuinely international level ... The last time Bullock and Reeves were together on screen the result was "Speed". This should have been entitled Stop". Towards the end of 2006, he co-narrated "The Great Warming" with Alanis Morissette, a documentary about climate change mitigation.
In 2008, Reeves collaborated with director David Ayer on the crime thriller "Street Kings", playing a Los Angeles police detective who must clear his name after the death of another officer. Released on April 11, the film grossed a moderate $66 million worldwide. The film's plot and Reeves' performance, however, were met with mostly unenthusiastic reviews. Paul Byrnes of "The Sydney Morning Herald" stated, "It's full of twists and turns, a dead body in every reel, but it's not difficult to work out who's betraying whom, and that's just not good enough". "The Guardian" opined, "Reeves is fundamentally blank and uninteresting". Reeves then starred in another science fiction film, "The Day the Earth Stood Still", a loose adaptation of the 1951 film of the same name. He portrayed Klaatu, an alien sent to Earth to try to change human behavior or eradicate humans because of their environmental impact. At the 2009 Razzie Awards, the film was nominated for Worst Prequel, Remake, Rip-off or Sequel. Many critics were unimpressed with the heavy use of special effects; "The Telegraph" credited Reeves' ability to engage the audience, but thought the cinematography was abysmal and the "sub-Al-Gore environment lecture leaves you light-headed with tedium".
Rebecca Miller's "The Private Lives of Pippa Lee" was Reeves' sole release of 2009; it premiered at the 59th Berlin International Film Festival. The romantic comedy and its ensemble received a favorable review from "The Telegraph"'s David Gritten: "Miller's film is a triumph. Uniformly well acted, it boasts a psychologically knowing script, clearly written by a smart, assertive human". In 2010, he starred in another romantic comedy, "Henry's Crime", about a man who, after being released from prison for a crime he did not commit, targets the same bank with his former cellmate. The film was not a box office hit. Reeves' only work in 2011 was an adult picture book titled "Ode to Happiness", which he wrote, complemented by Alexandra Grant's illustrations. Reeves co-produced and appeared in a 2012 documentary, "Side by Side", in which he interviewed filmmakers including James Cameron, Martin Scorsese, and Christopher Nolan; the feature examined digital and photochemical filmmaking. Next, Reeves starred in "Generation Um..." (2012), an independent drama which was critically panned.
In 2013, Reeves made his directorial debut with the martial arts film "Man of Tai Chi", in which he also starred. The film has multilingual dialogue and follows a young man drawn into an underground fight club; it was partially inspired by the life of Reeves' friend Tiger Chen. Principal photography took place in China and Hong Kong. Reeves was assisted by Yuen Woo-ping, the fight choreographer of the "Matrix" films. "Man of Tai Chi" premiered at the Beijing Film Festival and the Cannes Film Festival, and received praise from director John Woo. A warm wider response followed; Bilge Ebiri of "Vulture" thought the fight sequences were "beautifully assembled", and that Reeves showed restraint with the film editing to present "the fighters in full motion". The "Los Angeles Times" wrote, "The brutally efficient shooting style Reeves employs to film master choreographer Yuen Woo-ping's breathtaking fights ... is refreshingly grounded and old-school kinetic", while Dave McGinn of "The Globe and Mail" called the film "ambitious but generic". At the box office, "Man of Tai Chi" was a commercial disappointment, grossing only $5.5 million worldwide from a budget of $25 million. Also in 2013, Reeves played Kai in the 3D fantasy "47 Ronin", a Japanese fable about a group of rogue samurai. The film premiered in Japan but failed to gain traction with audiences. Initial reviews were not positive, causing Universal Pictures to reduce advertising spending for the film elsewhere. "47 Ronin" was a box office flop and was mostly poorly received.
After this series of commercial failures, Reeves' career rebounded in 2014. He played the titular role in the action thriller "John Wick", directed by Chad Stahelski. In the first installment of the "John Wick" franchise, Reeves plays a retired hitman seeking vengeance. He worked closely with the screenwriter to develop the story; "We all agreed on the potential of the project. I love the role, but you want the whole story, the whole ensemble to come to life", Reeves said. Filmed on location in the New York City area, the film was released on October 24 in the United States. "The Hollywood Reporter" was impressed by the director's "confident, muscular action debut" and Reeves' "effortless" performance, which marked his return to the action genre. Jeannette Catsoulis of "The New York Times" praised Reeves' fight scenes and noted he is "always more comfortable in roles that demand cool over hot, attitude over emotion". "John Wick" proved to be a box office success, grossing $86 million worldwide. Next, Reeves starred in a smaller-scale horror feature, "Knock Knock" (2015), a remake of the 1977 film "Death Game". Described as "over-the-top destruction" by the "Toronto Star", it features Reeves as a father, home alone, when two young women show up and start a game of cat and mouse. His other releases in 2015 were the documentaries "Deep Web", about crime on the dark web, and "Mifune: The Last Samurai", about the life of Japanese actor Toshiro Mifune, famous for playing samurai characters. He narrated both films.
Reeves appeared in five film releases in 2016. The first was "Exposed", a crime thriller about a detective who investigates his partner's death and discovers police corruption along the way. The film received negative reviews for its confused plot, and Reeves was criticized for displaying limited facial expressions. His next release, the comedy "Keanu", was better received; in it he voiced the eponymous kitten. Reeves then had a minor role in "The Neon Demon", a psychological horror directed by Nicolas Winding Refn, playing Hank, a lustful motel owner who appears in the nightmare of Jesse (played by Elle Fanning). In his fourth release, "The Bad Batch", he played the charismatic leader of a settlement. His final release of the year was "The Whole Truth", featuring Gabriel Basso, Gugu Mbatha-Raw, Renée Zellweger, and Jim Belushi, in which he played Richard, a defense attorney. Noel Murray of "The A.V. Club" described it as a "moderately clever, reasonably entertaining courtroom drama" with a skilled cast, but overall a "mundane" film. Reeves also appeared in "Swedish Dicks", a two-season web television series.
In 2017, Reeves reprised his role in the "John Wick" franchise sequel "John Wick: Chapter 2". The story carries on from the first film and follows John Wick as he goes on the run when a bounty is placed on him. The film was a critical and commercial success, grossing $171.5 million worldwide, more than its predecessor. Chris Hewitt of "Empire" magazine praised Reeves' performance, which he felt complemented his previous action roles ("Point Break" and "Speed"). Justin Chang of the "Los Angeles Times" described the picture as "a down-and-dirty B-picture with a lustrous A-picture soul". In addition to this large-scale feature, Reeves starred in the drama "To the Bone", in which he plays a doctor helping a young woman with anorexia. It premiered at the Sundance Film Festival, followed by distribution on Netflix in July. Early reviews were positive, with critics praising its non-glamorized portrayal of anorexia, although the "New Statesman" reviewer did not think this was enough. 2017 also saw Reeves make cameo appearances in the films "A Happening of Monumental Proportions" and "SPF-18".
Reeves reunited with Winona Ryder in the 2018 comedy "Destination Wedding", about wedding guests who develop a mutual affection for each other. They had worked together previously in "Bram Stoker's Dracula", "A Scanner Darkly" and "The Private Lives of Pippa Lee". Reeves also co-produced and starred in two thrillers: "Siberia", in which he plays a diamond trader who travels to Siberia to search for his Russian partner, and "Replicas", which tells the story of a neuroscientist who violates laws and bioethics to bring his family back to life after they die in an automobile crash. "Siberia" and "Replicas" were critically panned, gaining only thirteen and nine percent approval ratings, respectively, from critics on Rotten Tomatoes.
Returning to the "John Wick" franchise, Reeves starred in "John Wick: Chapter 3 – Parabellum" (2019), the third feature in the series, directed by Chad Stahelski. The film takes place immediately after the events of "John Wick: Chapter 2" and features new cast members including Halle Berry. It was another box office hit, grossing $171 million in the United States and more than $155 million internationally. "The Globe and Mail"'s reviewer gave the film three out of four stars, praising the action choreography for never becoming boring, but felt there was "aesthetic overindulgence" in the cinematography. "The Guardian"'s Cath Clarke questioned Reeves' acting, writing that "he keeps his face statue-still", and added, "three movies in, franchise bloat is beginning to set in". Reeves was nominated for Favorite Male Movie Star of 2019 at the People's Choice Awards, and the film itself was nominated for Best Contemporary Film at the Art Directors Guild Awards. Reeves had a cameo role in the Netflix original film "Always Be My Maybe" (2019), which he took on to highlight his Asian ancestry. He then voiced Duke Caboom in "Toy Story 4" (2019), the fourth installment of Pixar's "Toy Story" franchise. That same year, on April 27 and 28, a film festival called KeanuCon was held in his honour in Glasgow, Scotland; over two days, nine of his films were screened for guests.
Reeves will next appear in "Bill & Ted Face the Music", the third film in the "Bill & Ted" franchise, with his co-star Alex Winter. Reeves will portray the character Johnny Silverhand in the video game "Cyberpunk 2077"; he has completed motion capture and voice recordings for the role. He is also due to appear in the animated film "The SpongeBob Movie: Sponge on the Run" (2020) as a tumbleweed named Sage. Reeves will reprise his role in the fourth "John Wick" film, scheduled for release in 2021. "The Matrix 4", the fourth installment in "The Matrix" franchise, is currently being filmed, with Reeves and Carrie-Anne Moss reprising their roles. In 2019, Reeves travelled to São Paulo to produce a Netflix series, "Conquest", directed by Carl Rinsch. Details are being kept secret.
On December 24, 1999, Reeves' girlfriend, Jennifer Syme, gave birth eight months into her pregnancy to Ava Archer Syme-Reeves, who was stillborn. The couple broke up several weeks later. On April 2, 2001, Syme was killed when her vehicle collided with three parked cars on Cahuenga Boulevard in Los Angeles. She was being treated for depression at the time, and police found prescription medication in her car. Reeves, who was scheduled to film "The Matrix" sequels the following spring, sought "peace and time", according to friend Bret Domrose of Dogstar.
In 2008, he was linked to model-actress China Chow. In 2009, Reeves met Alexandra Grant at a dinner party; they went on to collaborate on two books together. They went public with their relationship in November 2019.
Reeves is discreet about his spiritual beliefs, saying that it is something "personal and private". In 1997, he expressed a belief in God and the Devil, but said "they don't have to have pitchforks and a long white beard". He has said that he has a lot of interest and respect for Buddhism, but has not "taken refuge in the dharma". When asked if he was a spiritual person, he said that he believes "in God, faith, inner faith, the self, passion, and things" and that he is "very spiritual" and "supremely bountiful". When asked by Stephen Colbert about his views on what happens after death, Reeves replied, "I know that the ones who love us will miss us".
In 2008, a lawsuit was filed against Reeves by photographer Alison Silva in the Los Angeles Superior Court. Silva alleged that Reeves' Porsche hit and injured him as Reeves was leaving a Los Angeles medical facility. The lawsuit sought $711,974 in damages. Following the trial, the jury cleared Reeves of any wrongdoing. On September 12, 2014, a female trespasser entered his Hollywood Hills home. Reeves spoke to her and notified the police, and she was admitted to hospital for a psychological evaluation. Three days later, another person entered his property through an unlocked gate. The intruder used Reeves' bathroom and went swimming in his pool. The house cleaners notified Reeves, who was not home, and police were called to detain her.
Reeves supports several charities and causes. In response to his sister's battle with leukemia, he founded a private cancer foundation, which aids children's hospitals and funds cancer research. He has also volunteered for a Stand Up to Cancer telethon and worked closely with the animal rights group PETA. Reeves has said, "Money is the last thing I think about. I could live on what I have already made for the next few centuries". Reeves reportedly gave approximately $80 million of his $114 million earnings from "The Matrix" to the special effects and makeup staff, although this has been disputed. Reeves did negotiate a back-end deal, relinquishing his contractual right to a percentage of ticket sales to allow a more extensive special effects budget for the rest of the franchise.
Reeves co-founded a production company, Company Films, with friend Stephen Hamel. An avid motorcyclist, Reeves also co-founded Arch Motorcycle Company, which builds and sells custom motorcycles. In 2017, Reeves and Alexandra Grant founded the book publisher X Artists' Books (also known as XAB). He has written two books, "Ode to Happiness" and "Shadows", both collaborations with Grant in which he provided the text to accompany her photographs and art.
In a 2005 article for "Time" magazine, Lev Grossman called Reeves "Hollywood's ultimate introvert". He has been described as a workaholic, charming and "excruciatingly shy". During the production of "Constantine", director Francis Lawrence commented on his personality, calling him "hardworking" and "generous". His co-star Shia LaBeouf said, "I've worked with him for a year and a couple of months, but I don't really know him that much". Erwin Stoff of 3 Arts Entertainment has served as Reeves' agent and manager since Reeves was 16, and has produced many of his films. Stoff said Reeves "is a really private person" who keeps his distance from other people.
In 2010, an image of Reeves became an internet meme after photographs of him were published, sitting on a park bench with a sad facial expression. The images were posted on the 4chan discussion board and were soon distributed via several blogs and media outlets, leading to the "Sad Keanu" meme being spread on the internet. An unofficial holiday was created when a Facebook fan page declared June 15 as "Cheer-up Keanu Day".
Reeves' casual manner and the easy rapport he establishes with fans have been widely observed, leading him to be dubbed the "Internet's boyfriend". In March 2019, Reeves was flying into Los Angeles when the flight was diverted to Bakersfield, California. Instead of waiting for the plane's repair, he arranged for a van to take himself and other passengers into the city. While filming "Bill & Ted Face the Music" in July 2019, Reeves and other cast members came across a house with a banner reading "You're Breathtaking", a meme that had come out of Reeves' appearance at the Electronic Entertainment Expo 2019 for the game "Cyberpunk 2077". Reeves took time to sign the banner and talk to the family. When asked where he would go if he could time travel, Reeves said he would like to travel back to the 1600s to see whether the works of William Shakespeare were written by Edward de Vere, 17th Earl of Oxford.
In 2016, "The Hollywood Reporter" calculated that Reeves had earned $250 million from "The Matrix" franchise, making him one of the highest paid actors. In 2005, Reeves received a star on the Hollywood Walk of Fame in recognition for his work in film.
Prolific in film since 1984, Reeves' most acclaimed and highest-grossing films, according to the review aggregation site Rotten Tomatoes, include "River's Edge" (1987), "Bill & Ted's Excellent Adventure" (1989), "My Own Private Idaho" (1991), "Much Ado About Nothing" (1993), "Speed" (1994), "The Matrix" (1999), "John Wick" (2014), "John Wick: Chapter 2" (2017), "John Wick: Chapter 3 – Parabellum" (2019), and "Toy Story 4" (2019). Reeves has won three MTV Movie Awards, and received two Best Actor nominations at the Saturn Awards. He was nominated twice for a People's Choice Award: Favorite Male Movie Star and Favorite Action Movie Star, for his performance in "John Wick: Chapter 3 – Parabellum" (2019). | https://en.wikipedia.org/wiki?curid=16603 |
Koto (instrument)
The koto (箏) is a Japanese stringed musical instrument and the national instrument of Japan. It is derived from the Chinese zheng and se, and similar to the Mongolian yatga, the Korean gayageum and ajaeng, the Vietnamese đàn tranh, the Sundanese "kacapi" and the Kazakh jetigen. Koto are about 180 centimeters in length, and made from "kiri" wood ("Paulownia tomentosa"). They have 13 strings that are usually strung over 13 movable bridges along the width of the instrument. There is also a 17-string variant. Players can adjust the string pitches by moving the white bridges before playing. To play the instrument, the strings are plucked using three fingerpicks, worn on the thumb, index finger, and middle finger.
The character for "koto" is 箏, although 琴 is often used. However, 琴 usually refers to another instrument, the kin (琴の琴; "kin no koto"). 箏, in certain contexts, is also read as "sō" (箏の琴; "sō no koto"). That term is still used today, but usually only when differentiating the koto from other zithers; "sō" is the general word for an Asian zither with adjustable bridges. Variations of the instrument were created, and eventually a few of them became the standard variations of the modern-day koto. The four types of koto (Gakuso, Chikuso, Zokuso, Tagenso) were all created by different subcultures, and were also adapted to change the playing style.
The ancestor of the koto was the Chinese guzheng. It was first introduced to Japan from China in the 7th and 8th centuries. The first known version had five strings, which eventually increased to seven. (It had twelve strings when it was introduced to Japan in the early Nara period (710–784), and later increased to thirteen.) The Japanese koto belongs to the Asian zither family that also comprises the Chinese "zheng" (ancestral to the other zithers in the family), the Korean "gayageum", and the Vietnamese "dan tranh". This variety of instrument came in two basic forms: zithers with bridges and zithers without.
When the koto was first imported to Japan, the native word "koto" was a generic term for any and all Japanese stringed instruments. Over time the definition of "koto" could not describe the wide variety of these stringed instruments and so the meanings changed. The azumagoto or yamatogoto was called the wagon, the kin no koto was called the kin, and the sau no koto (sau being an older pronunciation of 箏) was called the sō or koto.
The modern koto originates from the gakusō used in Japanese court music. It was a popular instrument among the wealthy, and the koto was considered a romantic instrument. Some literary and historical records indicate that solo pieces for koto existed centuries before sōkyoku, the music of the solo koto genre, was established. In Japanese literature, the koto served as imagery and carried extramusical significance. In one part of "The Tale of Genji" ("Genji monogatari"), Genji falls deeply in love with a mysterious woman, whom he has never seen, after he hears her playing the koto from a distance.
The chikuso koto was made for the Tsukushi-goto tradition and reserved for blind men; women could neither play the instrument professionally nor teach it. When this rule was relaxed, women started playing the koto, but not the chikuso, which had been designed for the blind; this led to a decline in its use, as other koto proved more useful. The two main koto varieties still used today are the gakuso and zokuso. These two have stayed relatively the same, with the exception of material innovations like plastic and new types of strings. The tagenso is the newest addition to the koto family. Surfacing in the 19th century, it was purposefully created to access a wider range of sound and a more advanced style of play; these were made with 17, 21, and 31 strings.
Perhaps the most important influence on the development of koto was Yatsuhashi Kengyo (1614–1685). He was a gifted blind musician from Kyoto who changed the limited selection of six songs to a brand new style of koto music which he called kumi uta. Yatsuhashi changed the tsukushi goto tunings, which were based on gagaku ways of tuning; and with this change, a new style of koto was born. Yatsuhashi is now known as the "Father of Modern Koto".
A smaller influence on the evolution of the koto came from Keiko Nosaka, a musician who won the Grand Prize in Music from the Japanese Ministry of Culture in 2002. Feeling confined by playing a koto with just 13 strings, she created new versions of the instrument with 20 or more strings.
Japanese developments in bridgeless zithers include the one-stringed koto ("ichigenkin") and two-stringed koto (nigenkin or yakumo goto). Around the 1920s, Goro Morita created a new version of the two-stringed koto; on this koto, one would push down buttons above the metal strings like the western autoharp. It was named the taishōgoto after the Taishō period.
At the beginning of the Meiji Period (1868–1912), western music was introduced to Japan. Michio Miyagi (1894–1956), a blind composer, innovator, and performer, is considered to have been the first Japanese composer to combine western music and traditional koto music. Miyagi is largely regarded as being responsible for keeping the koto alive when traditional Japanese arts were being forgotten and replaced by Westernization. He wrote over 300 new works for the instrument before his death in a train accident at the age of 62. He also invented the popular 17-string bass koto, created new playing techniques, advanced traditional forms, and, most importantly, increased the koto's popularity. He performed abroad, and by 1928 his piece for koto and shakuhachi, "Haru no Umi" (Spring Sea), had been transcribed for numerous instruments. "Haru no Umi" is even played to welcome each New Year in Japan.
Since Miyagi's time, many composers such as Kimio Eto (1924–2012), Tadao Sawai (1937–1997) have written and performed works that continue to advance the instrument. Sawai's widow Kazue Sawai, who as a child was Miyagi's favored disciple, has been the largest driving force behind the internationalization and modernization of the koto. Her arrangement of composer John Cage's prepared piano duet "Three Dances" for four prepared bass koto was a landmark in the modern era of koto music.
For about one hundred and fifty years after the Meiji Restoration, the Japanese shed their isolationist ideals and began to openly embrace American and European influences, which is most likely why the koto has taken on so many different variations.
A koto is typically made of paulownia wood. The treatment of the wood before making the koto varies tremendously: one koto maker seasons the wood for perhaps a year on the roof of his house, while some wood receives very little treatment. Koto may or may not be adorned; adornments include inlays of ivory and ebony, tortoiseshell, metal figures, and so on. The wood is also cut into one of two patterns: "itame" (also called "mokume"), which has a swirling grain, or the straight-grained "masame". The straight-grained pattern is easier to manufacture, so the swirl pattern raises the cost of production and is therefore reserved for decorative and elegant models.
The body of a traditional koto is made of a wood called "kiri". Every piece of the instrument comes with cultural significance, especially since the koto is the national instrument. Kiri is also important to Japan because it is the Imperial family crest for the Empress. Kiri is dried and cut into precise measurements. The size of the soundboard on a standard modern koto has remained approximately 182 centimeters. In the past the measurement ranged from 152 to 194 centimeters.
The bridges ("ji") used to be made of ivory, but nowadays are typically made of plastic and occasionally of wood. One can alter the pitch of a string by moving its bridge. For some very low notes, small bridges are made, as well as specialty bridges with three different heights, depending on the needs of the tuning. When a small bridge is unavailable for a very low note, some players may, as an emergency measure, use a bridge upside down; such an arrangement is unstable, and the bridge has a tendency to fall. Bridges have been known to break during playing, and on older instruments whose surface is worn where the bridges rest, the bridges may fall during playing, especially when strings are pressed. Various patch materials are sold to fill the holes that cause the legs of a bridge to rest on an unstable area. About six feet long and one foot wide, the koto is traditionally placed on the floor in front of the player, who kneels.
The strings are made from a variety of materials. Various types of plastic strings are popular. Silk strings are still made, and are usually yellow in color. They cost more and are not as durable, but claimed to be more musical. The strings are tied with a half hitch to a roll of paper or cardboard, about the size of a cigarette butt, strung through the holes at the head of the koto, threaded through the holes at the back, tightened, and tied with a special knot. Strings can be tightened by a special machine, but often are tightened by hand, and then tied. One can tighten by pulling the string from behind, or sitting at the side of the koto, although the latter is much harder and requires much arm strength. Some instruments may have tuning pins (like a piano) installed, to make tuning easier.
The makura ito, the silk thread used in the instrument, is a pivotal part of its construction. This feature was not seen on the instruments thought to belong to the nobility, who used a different string tension and valued the relic-like nature of their instruments. It was commoners who made the innovations that turned the koto into not only a sturdy instrument but a more sonically adept one; fine silk was in abundance in Japan. By the beginning of the 19th century, an ivory fitting called the makura zuno had become standard on the koto.
For every part of the koto there is a traditional name, reflecting the view that the body of a koto resembles that of a dragon. Thus the top part is called the "dragon's shell" (竜甲 "ryūkō"), while the bottom part is called the "dragon's stomach" (竜腹 "ryūfuku"). One end of the koto, noticeable because of the removable colorful fabric shell, is known as the "dragon's head" (竜頭 "ryūzu"), consisting of parts such as the "dragon's horns" (竜角 "ryūkaku", the saddle of the bridge or "makurazuno" 枕角), "dragon's tongue" (竜舌 "ryūzetsu"), "dragon's eyes" (竜眼 "ryūgan", the holes for the strings) and "dragon's forehead" (竜額 "ryūgaku", the space above the "makurazuno").
The other end of the koto is called the "dragon's tail" (竜尾, "ryūbi"); the string nut is called the "cloud horn" (雲角, "unkaku").
The influence of Western pop music has made the koto less prominent in Japan, although it is still developing as an instrument. The 17-string bass koto ("jūshichi-gen") has become more prominent over the years since its development by Michio Miyagi. There are also 20-, 21-, and 25-string koto. Works are being written for 20- and 25-string koto and 17-string bass koto. Reiko Obata has also made the koto accessible to Western music readers with the publication of two books for solo koto using Western notation. The current generation of koto players, such as American performers Reiko Obata and Miya Masaoka, Japanese master Kazue Sawai, and Michiyo Yagi, are finding places for the koto in today's jazz, experimental music and even pop music. The members of the band Rin' are popular "jūshichi-gen" players in the modern music scene.
June Kuramoto of the jazz fusion group Hiroshima was one of the first koto performers to popularize the koto in a non-traditional fusion style. Reiko Obata, founder of East West Jazz, was the first to perform and record an album of jazz standards featuring the koto. Obata also produced the first-ever English language koto instructional DVD, titled "You Can Play Koto." Obata is one of the few koto performers to perform koto concertos with United States orchestras, having done so on multiple occasions, including with Orchestra Nova for San Diego's KPBS in 2010.
Other solo performers outside Japan include award-winning recording artist Elizabeth Falconer, who studied for a decade at the Sawai Koto School in Tokyo, and Linda Kako Caplan, Canadian "daishihan" (grandmaster) and member of Fukuoka's Chikushi Koto School for over two decades. Another Sawai disciple, Masayo Ishigure, runs a school in New York City. Yukiko Matsuyama leads her KotoYuki band in Los Angeles. Her compositions blend the timbres of world music with her native Japanese culture. She performed on the Grammy-winning album "" (2010) by the Paul Winter Consort, garnering additional exposure to Western audiences for the instrument. In November 2011, worldwide audiences were further exposed to the koto when she performed with Shakira at the Latin Grammy Awards.
In March 2010 the koto received widespread international attention when a video linked by the Grammy Award-winning hard rock band Tool on its website became a viral hit. The video showed Tokyo-based ensemble Soemon playing member Brett Larner's arrangement of the Tool song "Lateralus" for six koto and two bass koto. Larner had previously played koto with John Fahey, Jim O'Rourke, and members of indie rock groups including Camper Van Beethoven, Deerhoof, Jackie O Motherfucker, and Mr. Bungle.
In older pop and rock music, David Bowie used a koto in the instrumental piece "Moss Garden" on his album ""Heroes"" (1977). Brian Jones, multi-instrumentalist and founding member of the Rolling Stones, played the koto in the song "Take It or Leave It" on the album "Aftermath" (1966).
Paul Gilbert, a popular guitar virtuoso, recorded his wife Emi playing the koto on his song "Koto Girl" from the album "Alligator Farm" (2000). Rock band Kagrra, are well known for using traditional Japanese musical instruments in many of their songs, an example being "Utakata" (うたかた), a song in which the koto has a prominent place. Winston Tong, the singer of Tuxedomoon, uses it on his 15-minute song "The Hunger" from his debut solo album "Theoretically Chinese" (1985).
The rock band Queen used a (toy) koto in "The Prophet's Song" on their 1975 album "A Night at the Opera". Ex-Genesis guitarist Steve Hackett used a koto on the instrumental "The Red Flower of Tachai Blooms Everywhere" from the album "Spectral Mornings" (1979), and Genesis keyboardist Tony Banks sampled a koto using an Emulator keyboard for the band's song "Mama". A koto played by Hazel Payne is featured in A Taste of Honey's 1981 English cover of the Japanese song "Sukiyaki". The band Asia used a koto in the middle-eight section of "Heat of the Moment" on their eponymous 1982 album. A synthesized koto also appears in their cover of the song "I'll Try Something New".
Dr. Dre's 1999 album "2001" prominently features a synthesized koto on two of its tracks, "Still D.R.E." and "The Message". | https://en.wikipedia.org/wiki?curid=16610 |
Kludge
A kludge or kluge is a workaround or quick-and-dirty solution that is clumsy, inelegant, inefficient, difficult to extend and hard to maintain. The term is used in diverse fields such as computer science, aerospace engineering, Internet slang, evolutionary neuroscience, and government. It is similar in meaning to the naval term "jury rig".
The word has alternate spellings ("kludge" and "kluge"), pronunciations (/klʌdʒ/ and /kluːdʒ/, rhyming with "judge" and "stooge" respectively), and several proposed etymologies.
The "Oxford English Dictionary" (2nd ed., 1989), cites Jackson W. Granholm's 1962 "How to Design a Kludge" article in the American computer magazine "Datamation".
kludge Also kluge. [J. W. Granholm's jocular invention: see first quot.; cf. also "bodge" v., "fudge" v.]'An ill-assorted collection of poorly-matching parts, forming a distressing whole' (Granholm); esp. in "Computing", a machine, system, or program that has been improvised or 'bodged' together; a hastily improvised and poorly thought-out solution to a fault or 'bug'. ...
The "OED" entry also includes the verb "kludge" ("to improvise with a kludge or kludges") and "kludgemanship" ("skill in designing or applying kludges").
Granholm humorously imagined a fictitious source for the term:Phineas Burling is the Chief calligrapher with the Fink and Wiggles Publishing Company, Inc. . . . According to Burling, the word "kludge" first appeared in the English language in the early fifteen-hundreds. . . .
The word "kludge" is, according to Burling, derived from the same root as the German "klug" (Dutch "kloog", Swedish "Klag", Danish "Klog", Gothic "Klaugen", Lettish "Kladnis" and Sanskrit "Veklaunn"), originally meaning "smart" or "witty". In the typical machinations of language in evolutionary growth, the word "Kludge" eventually came to mean "not so smart" or "pretty ridiculous". . . . Today "kludge" forms one of the most beloved words in design terminology, and it stands ready for handy application to the work of anyone who gins up 110-volt circuitry to plug into the 220 VAC source. The building of a Kludge, however, is not work for amateurs.Although "OED" accepts Granholm's coinage of the term, there are examples of its use before the 1960s.
American Yiddish speakers use "klug" to mean "too smart by half", the reflected meaning of German "klug" ("clever"). This may explain the idea of clever but clumsy and temporary, as well as the pronunciation variation from German.
Cf. German ("clod", diminutive ), Low Saxon "klut", "klute", Dutch , perhaps related to Low German diminutive "klütje" ("dumpling, clod"), Danish Jutland dialect "klyt" ("piece of bad workmanship, kludge"), and Standard Danish ("mess, disorder").
Arguments against the derivation from German "klug":
Another hypothesis dates to 1907, "when John Brandtjen convinced two young machinists from Oslo, Norway named Abel and Eneval Kluge to service and install presses for his fledgling printing equipment firm". In 1919, the brothers invented an automatic feeder for printing presses which was a success, though "temperamental, subject to frequent breakdowns, and devilishly difficult to repair — but oh, so clever!" The Kluge brothers continued to innovate, and the company remained active as of 2017. Given that the feeder bore the Kluge name, it seems reasonable that it became a byword for over-complex mechanical contraptions.
Other suggested folk etymologies or backronyms for "kludge" or "kluge" are: klumsy, lame, ugly, dumb, but good enough; or klutzy lashup, under-going engineering.
The Jargon File (a.k.a. "The New Hacker's Dictionary"), a glossary of computer programmer slang maintained by Eric S. Raymond, differentiates "kludge" from "kluge" and cites usage examples predating 1962. "Kluge" seems to have the sense of overcomplicated, while "kludge" has only the sense of poorly done.
kludge /kluhj/
This Jargon File entry notes "kludge" apparently derives via British military slang from Scots "cludge" or "cludgie" meaning "a common toilet", and became confused with U.S. "kluge" during or after World War II.
kluge: /klooj/ [from the German 'klug', clever; poss. related to Polish & Russian 'klucz' (a key, a hint, a main point)]
This entry notes "kluge", which is now often spelled "kludge", "was the original spelling, reported around computers as far back as the mid-1950s and, at that time, used exclusively of hardware kluges".
"Kluge" "was common Navy slang in the World War II era for any piece of electronics that worked well on shore but consistently failed at sea". A summary of a 1947 article in the "New York Folklore Quarterly" states:
On being drafted into the navy, Murgatroyd gave his profession as "kluge maker". . . . Whenever Murgatroyd was asked what he was doing, he said he was making a kluge, and actually he was one of the world's best kluge makers. Not wanting to seem ignorant, his superiors kept giving him commendations and promotions. . . . One day . . . the admiral asked him what a kluge was – the first person ever to do so. Murgatroyd said it was hard to explain, but he would make one so the admiral could see what it was. After a couple of days, he returned with a complex object.
"Interesting," said the admiral, "but what does it do?" In reply, Murgatroyd dropped it over the side of the ship. As the thing sank, it went "kluge".
The Jargon File further includes "kluge around" ("to avoid a bug or difficult condition by inserting a kluge") and "kluge up" ("to lash together a quick hack to perform a task").
After Granholm's 1962 "How to Design a Kludge" article popularized the "kluge" variant "kludge", both were interchangeably used and confused. The Jargon File concludes:
The result of this history is a tangle. Many younger U.S. hackers pronounce the word as /klooj/ but spell it, incorrectly for its meaning and pronunciation, as 'kludge'. ... British hackers mostly learned /kluhj/ orally, use it in a restricted negative sense and are at least consistent. European hackers have mostly learned the word from written American sources and tend to pronounce it /kluhj/ but use the wider American meaning! Some observers consider this mess appropriate in view of the word's meaning.
In aerospace, a kludge was a temporary design using separate, commonly available components that were not flightworthy, intended to prove the design and enable concurrent software development while the integrated components were developed and manufactured. The term was in common enough use to appear in a fictional movie about the US space program.
Perhaps the ultimate kludge was the first US space station, Skylab. Its two major components, the Saturn Workshop and the Apollo Telescope Mount, began development as separate projects (the SWS was kludged from the S-IVB stage of the Saturn 1B and Saturn V launch vehicles, the ATM was kludged from an early design for the descent stage of the Apollo Lunar Module). Later the SWS and ATM were folded into the Apollo Applications Program, but the components were to have been launched separately, then docked in orbit. In the final design, the SWS and ATM were launched together, but for the single-launch concept to work, the ATM had to pivot 90 degrees on a truss structure from its launch position to its on-orbit orientation, clearing the way for the crew to dock its Apollo Command/Service Module at the axial docking port of the Multiple Docking Adapter.
The Airlock Module's manufacturer, McDonnell Douglas, even recycled the hatch design from its Gemini spacecraft and kludged what was originally designed for the conical Gemini Command Module onto the cylindrical Skylab Airlock Module. The Skylab project, managed by the National Aeronautics and Space Administration's Marshall Space Flight Center, was seen by the Manned Spacecraft Center (later Johnson Space Center) as an invasion of its historical role as the NASA center for manned spaceflight. Thus, MSC personnel missed no opportunity to disparage the Skylab project, calling it "the kludge".
In modern computing terminology, a "kludge" (or often a "hack") is a solution to a problem, the performance of a task, or a system fix which is inefficient, inelegant ("hacky"), or even incomprehensible, but which somehow works. It is similar to a workaround, but quick and ugly. To "kludge around something" is to avoid a bug or difficulty by building a kludge, perhaps exploiting properties of the bug itself. A kludge is often used to modify a working system while avoiding fundamental changes, or to ensure backwards compatibility. "Hack" can also be used with a positive connotation, for a quick solution to a frustrating problem.
A kludge is often used to fix an unanticipated problem in an earlier kludge; this is essentially a kind of cruft.
A solution might be a kludge if it fails in corner cases. An intimate knowledge of the problem domain and execution environment is typically required to build a corner-case kludge. More commonly, a kludge is a heuristic which was expected to work almost always, but ends up failing often.
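As a hypothetical sketch (the function name and scenario are invented for illustration), a corner-case kludge might look like this in Python: a "round to nearest integer" helper that works for the positive inputs its author tried, but fails on the negative corner case because `int()` truncates toward zero.

```python
# Hypothetical corner-case kludge: "round to nearest" via truncation.
def round_kludge(x):
    return int(x + 0.5)

print(round_kludge(2.3))   # 2  (works)
print(round_kludge(2.7))   # 3  (works)
print(round_kludge(-2.7))  # -2 (wrong: the nearest integer is -3)
```

The helper is "good enough" until a negative number arrives, at which point the heuristic silently gives the wrong answer.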
A 1960s Soviet anecdote tells of a computer part that needed a slightly delayed signal to work. Rather than setting up a timing system, the kludge was to connect long coils of internal wire to slow the electrical signal.
Another type of kludge is the evasion of an unknown problem or bug in a computer program. Rather than continue to struggle to diagnose and fix the bug, the programmer may write additional code to compensate. For example, if a variable keeps ending up doubled, add later code which divides by two, rather than search for the original incorrect computation.
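A minimal sketch of the compensating kludge described above (the function names and the doubling bug are invented for this example):

```python
# Somewhere in legacy code, a latent bug accumulates each item twice:
def compute_total(items):
    total = 0
    for price in items:
        total += price
        total += price  # bug: each item is added twice
    return total

# The kludge: rather than diagnosing the doubling, later code
# simply divides the result by two to compensate.
def compute_total_kludged(items):
    return compute_total(items) // 2

print(compute_total_kludged([3, 5, 7]))  # 15
```

The program now produces correct totals, but the original bug remains in place, waiting to interact badly with any future fix.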
In computer networking, use of NAT (Network Address Translation) (RFC 1918) or PAT (Port Address Translation) to cope with the shortage of IPv4 addresses is an example of a kludge.
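The private address blocks that RFC 1918 reserves, and that NAT devices translate to a shared public address, can be checked with Python's standard `ipaddress` module; the helper name below is illustrative:

```python
import ipaddress

# The three IPv4 blocks reserved for private use by RFC 1918:
RFC1918_BLOCKS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(addr: str) -> bool:
    """Return True if addr falls inside an RFC 1918 private range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in block for block in RFC1918_BLOCKS)

print(is_rfc1918("192.168.1.10"))  # True
print(is_rfc1918("8.8.8.8"))       # False
```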
In FidoNet terminology, "kludge" refers to a piece of control data embedded inside a message.
The "kludge" or "kluge" metaphor has been adapted in fields such as evolutionary neuroscience, particularly in reference to the human brain.
The neuroscientist David Linden discusses how intelligent design proponents have misconstrued brain anatomy.
The transcendent aspects of our human experience, the things that touch our emotional and cognitive core, were not given to us by a Great Engineer. These are not the latest design features of an impeccably crafted brain. Rather, at every turn, brain design has been a kludge, a workaround, a jumble, a pastiche. The things we hold highest in our human experience (love, memory, dreams, and a predisposition for religious thought) result from a particular agglomeration of ad hoc solutions that have been piled on through millions of years of evolution history. It's not that we have fundamentally human thoughts and feelings "despite" the kludgy design of the brain as molded by the twists and turns of evolutionary history. Rather, we have them precisely "because" of that history.
The research psychologist Gary Marcus's book "Kluge: The Haphazard Construction of the Human Mind" compares evolutionary kluges with engineering ones like manifold vacuum-powered windshield wipers – when you accelerated or drove uphill, "Your wipers slowed to a crawl, or even stopped working altogether."
For instance, the vertebrate retina is installed backward, facing the back of the head rather than the front. As a result, all kinds of stuff gets in its way, including a bunch of wiring that passes through the eye and leaves us with a pair of blind spots, one in each eye.
In John Varley's 1985 short story "Press Enter_", the antagonist, a reclusive hacker, adopts the identity Charles Kluge.
In the science fiction television series "Andromeda", genetically engineered human beings called Nietzscheans use the term disparagingly to refer to genetically unmodified humans.
In a 2012 article, political scientist Steven Teles used the term "kludgeocracy" to criticize the complexity of social welfare policy in the United States. Teles argues that institutional and political obstacles to passing legislation often drive policy makers to accept expedient fixes rather than carefully thought out reforms. | https://en.wikipedia.org/wiki?curid=16615 |
King Crimson
King Crimson are an English progressive rock band formed in London in 1968. King Crimson have been influential both on the early 1970s progressive rock movement and on many contemporary artists. Although the band has undergone numerous lineup changes throughout its history, Robert Fripp is the only constant member of the group and is considered the band's leader and driving force. The band has earned a large cult following. They were ranked No. 87 on VH1's "100 Greatest Artists of Hard Rock". Although considered a seminal progressive rock band (a genre characterised by extended instrumental sections and complex song structures), they have often distanced themselves from the genre: as well as influencing several generations of progressive and psychedelic rock bands, they have also been an influence on subsequent alternative metal, hardcore and experimental/noise musicians.
Developed from the unsuccessful psychedelic pop trio Giles, Giles and Fripp, the initial King Crimson were key to the formation of early progressive rock, strongly influencing and altering the music of contemporaries such as Yes and Genesis. Their debut album, "In the Court of the Crimson King" (1969), remains their most successful and influential release, with its elements of jazz, classical and experimental music. Their success increased following an opening-act performance for the Rolling Stones at Hyde Park, London, in 1969. Following "In the Wake of Poseidon" (1970) and the less successful chamber-jazz-inspired "Lizard" (1970) and "Islands" (1971), the group reshaped its lineup and changed its instrumentation (swapping out saxophone in favour of violin and unusual percussion) in order to develop its own take on European rock improvisation, reaching a new creative peak on "Larks' Tongues in Aspic" (1973), "Starless and Bible Black" (1974) and "Red" (1974). Fripp disbanded the group in 1974.
In 1981, King Crimson reformed with another change in musical direction and instrumentation (incorporating, for the first time, a mixture of British and American personnel plus doubled guitar and influences taken from gamelan, post-punk and New York minimalism). This lasted for three years, resulting in the trio of albums "Discipline" (1981), "Beat" (1982) and "Three of a Perfect Pair" (1984). Following a decade-long hiatus, Fripp revived the group as an expanded "Double Trio" sextet in 1994, mingling its mid-1970s and 1980s approaches with new creative options available via MIDI technology. This resulted in another three-year cycle of activity including the release of "Thrak" (1995). King Crimson reunited again in 2000 as a more alternative metal-oriented quartet (or "Double Duo"), releasing "The Construkction of Light" in 2000 and "The Power to Believe" in 2003; after further personnel shuffles, the band expanded to a double-drummer quintet for a 2008 tour celebrating their 40th anniversary.
Following another hiatus between 2009 and 2012, King Crimson reformed once again in 2013; this time as a septet (and, later, octet) with an unusual three-drumkit frontline and the return of saxophone/flute to the lineup for the first time since 1972. This current version of King Crimson has continued to tour and to release live albums, significantly rearranging and reinterpreting music from across the band's career.
Since 1997, several musicians have pursued aspects of the band's work and approaches through a series of related bands collectively referred to as ProjeKcts.
In August 1967, brothers Michael Giles (drums) and Peter Giles (bass), who had been professional musicians in various jobbing bands since their mid-teens in Dorset, England, advertised for a singing organist to join their new group. Fellow Dorset musician Robert Fripp – a guitarist who did not sing – responded and the trio formed the band Giles, Giles and Fripp. Based on a format of eccentric pop songs and complex instrumentals, the band recorded several unsuccessful singles and one album, "The Cheerful Insanity of Giles, Giles and Fripp". The band hovered on the edge of success, with several radio sessions and a television appearance, but never scored the hit that would have been crucial for a commercial breakthrough.
Attempting to expand their sound, the three recruited Ian McDonald on keyboards, reeds and woodwinds. McDonald brought along his then-girlfriend, former Fairport Convention singer Judy Dyble, whose brief tenure with the group ended when the two split. McDonald brought in lyricist, roadie, and art strategist Peter Sinfield, with whom he had been writing songs – a partnership initiated when McDonald had said to Sinfield, regarding his 1968 band Creation, "Peter, I have to tell you that your band is hopeless, but you write some great words. Would you like to get together on a couple of songs?" Fripp, meanwhile, saw Clouds perform at the Marquee Club in London, which inspired him to incorporate classical melodies and jazz-like improvisation into his songwriting. No longer interested in pursuing Peter Giles' more whimsical pop style, Fripp recommended that his friend, singer and guitarist Greg Lake, join and replace either Peter Giles or Fripp himself. Peter Giles later called it one of Fripp's "cute political moves", but he had already become disillusioned with the band's lack of success and departed, leaving Lake to become bassist and singer.
The first incarnation of King Crimson was formed in London on 30 November 1968 and first rehearsed on 13 January 1969. The band's name was coined by Sinfield, though it is not meant to be a synonym for Beelzebub, prince of demons. (According to Fripp, Beelzebub would be an anglicised form of the Arabic phrase "B'il Sabab", meaning "the man with an aim".) Historically and etymologically, a "crimson king" was any monarch during whose reign there was civil unrest and copious bloodshed; the band's debut album appeared at the height of worldwide opposition to the military involvement of the United States in Southeast Asia. At this point, McDonald was the group's main composer, albeit with contributions from Lake and Fripp, while Sinfield wrote the lyrics, and designed and operated the band's stage lighting, being credited with "sounds and visions". McDonald suggested the band purchase a Mellotron, and they began using it to create an orchestral rock sound, inspired by the Moody Blues. Sinfield described Crimson thus: "If it sounded at all popular, it was out. So it had to be complicated, it had to be more expansive chords, it had to have strange influences. If it sounded, like, too simple, we'd make it more complicated, we'd play it in 7/8 or 5/8, just to show off".
King Crimson made their live breakthrough on 5 July 1969, playing the Rolling Stones free concert at Hyde Park, London, before an estimated 500,000 people. The debut album, "In the Court of the Crimson King", was released in October 1969 on Island Records. Fripp would later describe it as having been "an instant smash" and "New York's acid album of 1970" (notwithstanding Fripp and Giles' assertion that the band never used psychedelic drugs). The album received public compliments from Pete Townshend, the Who's guitarist, who called the album "an uncanny masterpiece." The album's sound, including its opening track "21st Century Schizoid Man", was described as setting the precedent for alternative rock and grunge, whilst the softer tracks are described as having an "ethereal" and "almost sacred" feel. In contrast to the blues-based hard rock of the contemporary British and American scenes, King Crimson presented a more Europeanised approach that blended antiquity and modernity. The band's music drew on a wide range of influences provided by all five group members. These elements included romantic- and modernist-era classical music, the psychedelic rock spearheaded by Jimi Hendrix, folk, jazz, military music (partially inspired by McDonald's stint as an army musician), ambient improvisation, Victoriana and British pop.
After playing shows across England, the band toured the US with various pop and rock acts. Their first show was at Goddard College in Plainfield, Vermont. While their original sound astounded contemporary audiences and critics, creative tensions were already developing within the band. Giles and McDonald, still striving to cope with King Crimson's rapid success and the realities of touring life, became uneasy with the band's direction. Although he was neither the dominant composer in the band nor the frontman, Fripp was very much the band's driving force and spokesman, leading King Crimson into progressively darker and more intense musical areas. McDonald and Giles, now favouring a lighter and more romantic style of music, became increasingly uncomfortable with their position and resigned from the band during the US tour. To salvage what he saw as the most important elements of King Crimson, Fripp offered to resign himself, but McDonald and Giles declared that the band was "more (him) than them" and that they should therefore be the ones to leave. The line-up played their last show at the Fillmore West in San Francisco on 16 December 1969. Live recordings of the tour were released in 1997 on "Epitaph".
After their first US tour, King Crimson was in a state of flux with various line-up changes, thwarted tour plans, and difficulties in finding a satisfactory musical direction. This period has subsequently been referred to as the "interregnum" – a nickname implying that the "King" (King Crimson) was not properly in place during this time. Fripp became the only remaining musician in the band, with Sinfield expanding his creative role to playing synthesizers.
Fripp and Sinfield recorded the second King Crimson album, "In the Wake of Poseidon", in 1970 with the Giles brothers hired back as the session rhythm section, and with jazz pianist Keith Tippett and Circus saxophonist Mel Collins as guest musicians. The group considered hiring Elton John to be the singer, but decided against the idea. Lake, who was leaving to form what would become Emerson, Lake and Palmer, then agreed, in exchange for receiving King Crimson's PA equipment, to sing on all songs except "Cadence and Cascade", which was ultimately sung by Fripp's friend Gordon Haskell. Though Tippett was offered band membership, he preferred to remain as a studio collaborator, performing with the band for a single gig. Upon its release in May 1970, "In the Wake of Poseidon" reached No. 4 in the UK and No. 31 in the US. It received some criticism from those who thought it sounded too similar to their first album. With no musicians to perform material from their new album, Fripp and Sinfield persuaded Haskell to join as singer and bassist and recruited Andy McCulloch as drummer, retaining Collins as saxophonist, flautist and occasional keyboard player.
During the writing sessions for the third album, "Lizard", Haskell and McCulloch had no say in the direction of the material, since Fripp and Sinfield wrote the album themselves, bringing in Tippett, Mark Charig on cornet, Nick Evans on trombone, and Robin Miller on oboe and cor anglais as additional musicians. Haskell sang and played bass. Jon Anderson of Yes was also brought in to sing the first part of the album's title track, "Prince Rupert Awakes", which Fripp and Sinfield considered to be outside Haskell's range and style. "Lizard" featured stronger avant-garde jazz and chamber-classical influences than previous albums, as well as Sinfield's upfront experiments with processing and distorting sound through the EMS VCS 3 synthesiser. It also featured complex lyrics from Sinfield, including a coded song about the break-up of the Beatles, with almost the entire second side taken up by a predominantly instrumental chamber suite describing a medieval battle and its outcome. Released in December 1970, "Lizard" reached No. 29 in the UK and No. 113 in the US. Described retrospectively as an "acquired taste", "Lizard" was certainly not to the taste of the more rhythm-and-blues-oriented Haskell and McCulloch, both of whom found the music difficult to relate to. As a result, Haskell quit the band acrimoniously after refusing to sing live with distortion and electronic effects. McCulloch also departed, leaving Fripp and Sinfield to recruit new members once more.
After a search for new musicians, Fripp and Sinfield secured a returning Collins and Ian Wallace on drums. Auditions for a singer included Bryan Ferry and John Gaydon, the band's manager. The position went to Raymond "Boz" Burrell. Bassist John Wetton was invited to join, but declined (at the time) in order to play with Family. Rick Kemp also declined an offer to join, so rather than continue auditions, Fripp and Wallace taught Burrell to play bass. Though he had not played bass before, Burrell had played enough rhythm guitar to assist him in learning the instrument. With the line-up complete, King Crimson toured in 1971 for the first time since 1969. The concerts were well received, but the musical and lifestyle differences of Collins, Wallace, and Burrell began to alienate the drug-free Fripp, who began to withdraw socially from his bandmates, creating further tension.
In 1971, the new King Crimson formation recorded "Islands". Loosely influenced by Miles Davis's orchestral collaborations with Gil Evans and Homer's "Odyssey", the album also showed signs of a split in styles between Sinfield (who favoured the softer and more textural jazz-folk approach and wanted the band to move in a Miles Davis direction) and Fripp (who was drawn more towards the harsher instrumental style exemplified by "Sailor's Tale", with its dramatic Mellotron and banjo-inspired guitar technique). "Islands" also featured the band's only experiment with a string ensemble on "Prelude: Song of the Gulls" and the raunchy rhythm-and-blues-inspired "Ladies of the Road". A hint of trouble to come arrived when one member of the band allegedly described the more delicate and meditative parts of "Islands" as "airy-fairy shit". Released in December 1971, "Islands" charted at No. 30 in the UK and No. 76 in the US. Following a period of touring "Islands", Fripp asked Sinfield to leave the band, citing musical differences and a loss of faith in his partner's ideas. The remaining band broke up acrimoniously in rehearsals shortly afterward, owing to Fripp's refusal to incorporate other members' compositions into the band's repertoire. He later cited this as "quality control", with the idea that King Crimson would perform the "right kind" of music.
King Crimson reformed to fulfil touring commitments in 1972, with the intention of disbanding afterwards. Recordings from various North American dates between January and February 1972 were released as "Earthbound" in June 1972, noted and criticised for its sub-par sound quality and playing style that occasionally veered towards funk, with scat singing on the improvised pieces. By this time, a definite musical rift between Fripp and the rest of the band existed, since Wallace, Burrell and Collins favoured a more rhythm-and-blues style. Though personal relations improved during the 1972 tour (to the point where most of the band wished to continue), Fripp opted to part company with the existing band and to restructure King Crimson with new members, since he felt the current members wouldn't be able to play the new material he had in mind.
The third major line-up of King Crimson was radically different from the previous two. Fripp's four new recruits were free-improvising percussionist Jamie Muir; drummer Bill Bruford, who left Yes at a new commercial peak in their career in favour of the "darker" King Crimson; bassist and singer John Wetton; and violinist and keyboardist David Cross, whom Fripp had encountered through work with music colleagues. Most of the musical compositions were collaborations between Fripp and Wetton, who each composed segments independently and fitted together those which they found compatible. With Sinfield gone, the band recruited Wetton's friend Richard Palmer-James as their new lyricist. Unlike Sinfield, Palmer-James played no part in artistic, visual, or sonic direction; his sole contributions were his lyrics, sent to Wetton by post from his home in Germany. Following a period of rehearsals, King Crimson resumed touring on 13 October 1972 at the Zoom Club in Frankfurt, with the band's penchant for improvisation and Muir's startling stage presence gaining them renewed press attention.
In January and February 1973, King Crimson recorded "Larks' Tongues in Aspic" in London, which was released that March. The band's new sound was exemplified by the album's two-part title track – a significant change from what King Crimson had done before, emphasising instrumentals and drawing influences from classical, free, and heavy metal music. The record displayed Muir's free approach to percussion, which included using a drum kit, bicycle parts, toys, a bullroarer, hitting a gong with chains, and a joke laughing bag. He also used fake blood capsules applied to his head, the sole example of such theatrical stage activity in the band's history. The album reached No. 20 in the UK and No. 61 in the US. After a period of further touring, Muir departed in 1973, quitting the music industry altogether. Though this was initially thought to have been motivated by an onstage injury caused by a gong landing on his foot, it was later revealed that Muir had gone through a personal spiritual crisis, and had withdrawn to become a monk.
With Muir gone, the remaining members reconvened in January 1974 to produce "Starless and Bible Black", which was released in March 1974 and earned them a positive "Rolling Stone" review. Though most of the album is formed of live performances from the band's late 1973 tour, the recordings were painstakingly edited to sound like a studio record, with "The Great Deceiver" and "Lament" the only tracks recorded entirely in the studio. The album reached No. 28 in the UK and No. 64 in the US. Following the album's release, the band began to divide once more, this time over performance. Musically, Fripp found himself positioned between Bruford and Wetton, who played with such force and increasing volume that Fripp once compared them to "a flying brick wall", and Cross, whose amplified acoustic violin was increasingly being drowned out by the rhythm section, leading him to concentrate more on keyboards. An increasingly frustrated Cross began to withdraw musically and personally, with the result that he was voted out of the group following the band's 1974 tour of Europe and America.
In July 1974 Fripp, Bruford, and Wetton began recording "Red". Before recording began, Fripp, now increasingly disillusioned with the music business, turned his attention to the works of Russian mystic George Gurdjieff and experienced a spiritual crisis-cum-awakening; he later described it as if "the top of my head blew off". Though most of the album was already written, Fripp retreated into himself in the studio and "withdrew his opinion", leaving Bruford and Wetton to direct most of the recording sessions. The album contains studio recorded material with one live track, "Providence", recorded on 30 June 1974 with Cross in the group. Several musicians, including some from past King Crimson line-ups, contribute to the album. Released in October 1974, "Red" went to No. 45 in the UK and No. 66 in the US. AllMusic called it "an impressive achievement" for a group about to disband, with "intensely dynamic" musical chemistry between the band members.
Two months before the release of "Red", King Crimson's future looked bright (with talks regarding founder member Ian McDonald rejoining the group). However, Fripp did not wish to tour, as he felt increasingly disenchanted with the group and the music industry. He also felt the world was going to end in 1981 and that he had to prepare for it. Despite a band meeting while touring the US in which Fripp expressed a desire to end the band, the group formally disbanded on 25 September 1974 when Fripp announced that King Crimson had "ceased to exist" and was "completely over for ever and ever". It was later revealed that Fripp had attempted to interest his managers in a King Crimson without him, but the idea was turned down. Following the band's disbanding, the live album "USA" was released in May 1975, formed of recordings from their 1974 North American tour. It received some positive reviews, including "a must" for fans of the band and "insanity you're better off having". Issues with some of the tapes rendered some of Cross' violin inaudible, so Eddie Jobson was hired to perform overdubs of violin and keyboards in a studio; further edits were also made to allow the music to fit on a single LP. Between 1975 and 1980, King Crimson were inactive.
In 1981, having spent seven years in spiritual pursuits and smaller projects (from playing guitar for David Bowie, Peter Gabriel and Daryl Hall to pursuing an experimental solo career to leading the instrumental band The League of Gentlemen, which included members of various post-punk groups), Fripp decided to form a new "first division" rock group, though he had no intention of reforming King Crimson. Having recruited Bill Bruford as drummer, Fripp asked singer and guitarist Adrian Belew to join; this would be the first time Fripp was in a band with another guitarist, indicative of his desire to create a sound unlike any of his previous work. After touring with Talking Heads, Belew agreed to join and also become the band's lyricist. Bruford's suggestion of Jeff Berlin as bassist was rejected as his playing was "too busy", so auditions were held in New York: on the third day, Fripp left after roughly three auditions, only to return several hours later with Tony Levin (who got the job after playing a single chorus of "Red"). Fripp later confessed that, had he initially known that Levin was available and interested, he would have selected him as first-choice bass player without holding auditions. Fripp named the new quartet Discipline, and the band went to England to rehearse and write new material. They made their live debut at Moles Club in Bath, Somerset on 30 April 1981, and completed a UK tour supported by the Lounge Lizards. By October 1981, the band had opted to change their name to King Crimson.
In 1981, King Crimson recorded "Discipline" with producer Rhett Davies. The album displayed a very different version of the band, with newer influences including post-punk, new wave, latterday funk and go-go and African-styled polyrhythms. With a sound described in "The New Rolling Stone Album Guide" as having a "jaw-dropping technique" of "knottily rhythmic, harmonically demanding workouts", Fripp intended to create the sound of a "rock gamelan", with an interlocking rhythmic quality to the paired guitars that he found similar to Indonesian gamelan ensembles. Fripp concentrated on playing complex picked arpeggios, while Belew provided an arsenal of guitar sounds including animal noises, industrial textures, and guitar screams with a range of electronic effects pedals. In addition to bass guitar, Levin used the Chapman Stick, a ten-string polyphonic two-handed tapping guitar instrument that has a bass and treble range and which he played in an "utterly original style". Bruford experimented with cymbal-less acoustic kits and a Simmons SDS-V electronic drum kit. The band's songs were shorter in comparison to previous King Crimson albums, and very much shaped by Belew's pop sensibilities and quirky approach to writing lyrics. Though the band's previous taste for improvisation was now tightly reined in, one instrumental ("The Sheltering Sky") emerged from group rehearsals; while the noisy, half-spoken/half-shouted "Indiscipline" was a partially-written, part-improvised piece created in order to give Bruford a chance to escape from the strict rhythmic demands of the rest of the album and to play against the beat in any way that he could. Released in September 1981, "Discipline" reached No. 41 in the UK and No. 45 in the US.
In June 1982, King Crimson followed "Discipline" with "Beat" (the first King Crimson album recorded with the same band line-up as the album preceding it). None of the members of the group produced the record; Davies undertook production duties himself. The album had a loosely linked theme of the Beat Generation and its writings, reflected in song titles such as "Neal and Jack and Me" (inspired by Neal Cassady and Jack Kerouac), "The Howler" (inspired by Allen Ginsberg's "Howl") and "Sartori in Tangier" (inspired by Paul Bowles). Fripp asked Belew to read Kerouac's novel "On the Road" for inspiration, and the album contained themes of travel, disorientation and loneliness. While the album was noticeably poppier than "Discipline", it featured the harsh, atonal and improvised "Requiem".
The recording of "Beat" was marked by tension, with Belew suffering high stress levels over his duties as front man, lead singer, and principal songwriter. On one occasion, he clashed with Fripp and ordered him out of the studio. After differences were resolved, and while "Beat" reached No. 39 in the UK and No. 52 in the US, King Crimson resumed touring. "Heartbeat" was released as a single, peaking at No. 57 on the "Billboard" Mainstream Rock chart. Around this time the band released the VHS-only "The Noise: Live in Frejus" (DGMVC2), a record of a show played at the Arena, Frejus, France on 27 August 1982. (This video is now on DVD as part of the compilation "Neal and Jack and Me".)
King Crimson's next album, "Three of a Perfect Pair", was recorded in 1983 and released in March 1984. Having encountered difficulty in both writing and determining a direction for the album, the band chose to record and sequence it as a "left side" — four of the band's poppier songs plus an instrumental — and a "right side" of experimental work, including extended and atonal improvisations in the tradition of the mid-1970s band, plus the third part of "Larks' Tongues in Aspic". "Three of a Perfect Pair" peaked at No. 30 in the UK and No. 58 in the US, with "Three of a Perfect Pair" and "Sleepless" being released as singles. The 2001 remaster of the album included "the other side", a collection of remixes and improvisation out-takes plus Levin's tongue-in-cheek vocal piece, "The King Crimson Barbershop". The last concert of the "Three of a Perfect Pair" tour, at the Spectrum in Montreal, Canada on 11 July 1984, was recorded and released in 1998 as "".
Following the 1984 tour, Fripp dissolved King Crimson for the second time, having become dissatisfied with its working methods. Bruford and Belew expressed some frustration over this; Belew recalled the first he had heard of the split was when he read about it in a report in "Musician" magazine. Despite these circumstances, the musicians remained on fairly amicable terms. Belew would later refer to the band "taking a break" that ultimately lasted for ten years.
In the early 1990s, Belew met with Fripp in England with an interest in a reformed King Crimson. Two years later, in 1992, Fripp established the Discipline Global Mobile (DGM) record label with producer David Singleton: this would subsequently be the main home for Fripp's work, with main album releases distributed to larger record companies, affording Fripp and his associates greater freedom and more control over their work.
After a tour with David Sylvian in 1993 (who declined an offer to join Crimson), Fripp began to assemble a new version of the band, a union between the band's previous incarnation and the Sylvian & Fripp group: he was joined by Belew, Levin, Bruford, Chapman Stick player Trey Gunn (a Guitar Craft alumnus), and drummer Pat Mastelotto (who replaced the first choice, Jerry Marotta). Fripp explained that the six-member formation was to be a "Double Trio" with two guitarists, two bassists, and two drummers, to explore a different style of music. Bruford later said he lobbied his own way into the band, believing that King Crimson was very much "his gig", and that Fripp had come up with the philosophical explanation later. One of the conditions Fripp had imposed on Bruford regarding his return was to give up all creative control to Fripp.
Following rehearsals in Woodstock, New York, the group released the extended play "Vrooom" in October 1994. This revealed the new King Crimson sound, which featured elements of the interlocking guitars on "Discipline" and the heavy rock feel of "Red", but also involved a greater use of ambient electronic sound and ideas from industrial music. In contrast, many of the actual songs – mostly written or finalised by Belew – displayed stronger elements of 1960s pop than before – in particular, a Beatles influence (although Bruford would also refer to the band as sounding like "a dissonant Shadows on steroids"). As with previous line-ups, new technology was used including MIDI and the Warr tap guitar with which Gunn replaced the Stick. King Crimson toured the album from 28 September 1994 in Buenos Aires, Argentina; following concerts were released on the double live "" in 1995.
In October and December 1994, King Crimson recorded their eleventh studio album, "Thrak". Formed of revised versions of most of the tracks on "Vrooom", plus new tracks, the album was described by "Q" magazine as having "jazz-scented rock structures, characterised by noisy, angular, exquisite guitar interplay" and an "athletic, ever-inventive rhythm section," while being in tune with the sound of alternative rock of the mid-1990s. Examples of the band's efforts to integrate their multiple elements could be heard on the complex post-prog songs "Dinosaur" and "Sex Sleep Eat Drink Dream" as well as the more straightforward "One Time" and the funk-pop inspired "People".
King Crimson resumed touring in 1995 and 1996; dates from October and November 1995 were recorded and released on the live album "Thrakattak" in May 1996, consisting of improvisations from performances of "THRAK" edited by Fripp's DGM partner David Singleton into an hour-long extended improvisation. A more conventional live recording from the period was later made available on the 2001 double CD release "Vrooom Vrooom", as was a 1995 concert on the 2003 "Déjà Vrooom" DVD.
When fresh writing rehearsals began in mid-1997 in Nashville, Tennessee, Fripp was dissatisfied with the quality of the new music being developed by the band; developing friction and disagreements between himself and Bruford led to the latter deciding to leave King Crimson. The resulting atmosphere and the lack of workable band material almost broke the band up altogether. Instead, the six members (including Bruford) opted to work in four smaller groups (or "fraKctalisations", according to Fripp) known as the ProjeKcts. This enabled the group to continue developing musical ideas and searching for Crimson's next direction without the practical difficulty and expense of convening all six members at once. In 1998 and 1999, the first four ProjeKcts played live in the US, Japan, and the UK and released recordings that showed a high degree of free improvisation. These have been collectively described by music critic J. D. Considine as "frequently astonishing" but lacking in melody, and perhaps too difficult for a casual listener.
At the end of the four ProjeKct runs, Bruford left King Crimson altogether to resume his work in jazz. At the same time, Levin's commitments as a session and touring musician forced him to take an indefinite break from the band. The remaining members (Fripp, Belew, Gunn and Mastelotto) reconvened as a "Double Duo" to write and record "The Construkction of Light" in Belew's basement and garage near Nashville. Released in May 2000, the album reached No. 129 in the UK. All of the pieces were metallic and harsh in sound, similar to the work of contemporary alternative metal. They featured a distinct electronic texture, a heavy processed drum sound from Mastelotto, Gunn continuing on Warr Guitar but now taking over the bass role, and a different take on the interlocked guitar sound that the band had used since the 1980s. With the exception of a parodic industrial blues (sung by Belew through a voice changer under the pseudonym of "Hooter J. Johnson"), the songs were unrelentingly complex and challenging to the listener, with plenty of rhythmic displacement to add to the harsh textures. The album contains the fourth instalment of "Larks' Tongues in Aspic". It received a negative reception for lacking new ideas. The band recorded an album at the same time, under the name of ProjeKct X, called "Heaven and Earth". Conceived and led by Mastelotto and Gunn, with Fripp and Belew playing subsidiary roles, it was a further development of the polyrhythmic/dance music approach adopted in the ProjeKcts.
King Crimson toured to support both albums, including double bill shows with Tool. The tour was documented in the triple live album "Heavy Construkction", released in December 2000. This showed the band constantly switching between the structured album pieces and ferocious ProjeKct-style Soundscape-and-percussion improvisations. Bassist John Paul Jones supported the band on some live shows.
On 9 November 2001, King Crimson released a limited edition live extended play called "Level Five", featuring three previously unrecorded pieces – "Dangerous Curves", the title track "Level Five" and "Virtuous Circle" – plus versions of "The Construkction of Light" and ProjeKct Two's 1998 piece "Deception of the Thrush", followed by the unlisted track "ProjeKct 12th and X" after one minute of silence. A second EP followed in October 2002, "Happy with What You Have to Be Happy With". This featured eleven tracks including a live version of "Larks' Tongues in Aspic, Part IV". Half of the tracks were brief processed vocal snippets sung by Belew, and the songs themselves varied between gamelan pop, Soundscapes, and slightly parodic takes on heavy metal and blues.
King Crimson released their thirteenth album, "The Power to Believe", in October 2003. Fripp described it as "the culmination of three years of Crimsonising". The album incorporated reworked and/or retitled versions of "Deception of the Thrush", tracks from their previous two EPs, and a 1997 track with added instrumentation and vocals. "The Power to Believe" reached No. 162 in the UK and No. 150 in the US. King Crimson toured in 2003 to support the album; recordings from it were used for the live album "". 2003 also saw the release of the DVD "Eyes Wide Open", a compilation of the band's shows Live at the Shepherds Bush Empire (London, 3 July 2000) and Live in Japan (Tokyo, 16 April 2003).
In November 2003, Gunn left the group to pursue solo projects and was replaced by the returning Tony Levin. The band reconvened in early 2004 for rehearsals, but nothing developed from the sessions. At this point, Fripp was publicly reassessing his desire to work with King Crimson and within the music industry, often citing the unsympathetic aspects of the life of a touring musician.
Despite this, a new King Crimson formation was announced in 2007: Fripp, Belew, Levin, Mastelotto, and a new second drummer, Gavin Harrison, the first new member from the UK since 1972. In August 2008, after a period of rehearsals, the five completed the band's 40th Anniversary Tour. The setlists featured no new material, drawing instead from the existing "Discipline"-era/Double Trio/Double Duo repertoire, although several pieces received striking new percussion-heavy arrangements. Additional shows were planned for 2009, but were cancelled due to scheduling clashes.
King Crimson began another hiatus after the 40th Anniversary Tour. Belew continued to lobby for reviving the band, and discussed it with Fripp several times in 2009 and 2010. Among Belew's suggestions was a temporary reunion of the 1980s line-up for a thirtieth anniversary tour: an idea declined by both Fripp and Bruford, the latter commenting "I would be highly unlikely to try to recreate the same thing, a mission I fear destined to failure." In December 2010, Fripp wrote that the King Crimson "switch" had been set to "off", citing several reasons for this decision.
In 2011, a band called Jakszyk Fripp Collins (and subtitled "A King Crimson ProjeKct") released an album called "A Scarcity of Miracles". The band featured Jakko Jakszyk, Robert Fripp and Mel Collins as main players and composers, with Tony Levin and Gavin Harrison covering bass guitar/Chapman Stick and drums respectively. At one point, Fripp referred to the band as "P7". Unusually for a ProjeKct, it was based around fully finished and carefully crafted original songs (initially derived from improvisations). For a while, King Crimson fans debated whether this was a new line-up of the main band under another name, but the project did not tour or release another album. In August 2012, Fripp announced his retirement from the music industry, leaving the future of King Crimson uncertain.
In September 2013, Fripp suddenly and unexpectedly announced King Crimson's return to activity with a "very different reformation to what has gone before: seven players, four English and three American, with three drummers". He cited several reasons to make a comeback, varying from the practical to the whimsical: "I was becoming too happy. Time for a pointed stick." The new line-up drew from both the previous lineup (retaining Fripp, Levin, Harrison and Mastelotto) and the "Scarcity of Miracles" project (adding Jakszyk and Collins), with Guitar Craft alumnus and former R.E.M./Ministry drummer Bill Rieflin as the seventh member. Adrian Belew was not asked to take part, thus ending his 32-year tenure in King Crimson: Jakszyk took his place as singer and second guitarist. This version of the group took on the nickname of "the Seven-Headed Beast".
In early 2014, King Crimson had no plans to record in the studio, instead playing "reconfigured" versions of past material. For the first time since 1974, the band's repertoire included songs from the run of albums between "In the Court of the Crimson King" and "Larks' Tongues in Aspic", as well as instrumentals from "THRAK" and "The Power to Believe" (although Adrian Belew's songs were conspicuously absent). After rehearsing in England, they toured North America across 20 dates from 9 September 2014. Recordings from the Los Angeles dates were released as "Live at the Orpheum".
Tours across Europe, Canada, and Japan followed in the latter half of 2015. A live recording from the Canadian leg of the tour was released as "Live in Toronto". A European tour was planned for 2016. After Rieflin decided to take a break from music following the three Salisbury dates in March, April and June, drummer Jeremy Stacey of Noel Gallagher's High Flying Birds was called in for the dates from September onwards, forming what became known as the 2016-SOND line-up.
On 7 December 2016, founding King Crimson member Greg Lake died of cancer.
On 3 January 2017, Robert Fripp announced Bill Rieflin's return to King Crimson. Since the band liked and wished to retain Jeremy Stacey, Rieflin shifted his group role and became King Crimson's full-time keyboard player. Consequently, King Crimson became an octet. Initially referred to by Fripp as the "Double Quartet Formation", referencing four drummers and four "back line" musicians, the line-up was re-christened the "Three Over Five" (or "Five Over Three") Formation after Rieflin's decision to play only keyboards.
On 31 January 2017, another former King Crimson member, John Wetton, died of colon cancer.
On 27 April 2017, King Crimson announced a new live EP named "Heroes" after the David Bowie song, as a tribute to both the artist and the album featuring the song in question (both of which featured distinctive Robert Fripp guitar contributions throughout). The video to the song won "Video of the Year" at the 2017 Progressive Music Awards. Shortly afterwards, King Crimson embarked on an extensive tour of North America beginning on 11 June 2017 in Seattle, Washington and ending on 26 November 2017 in Milwaukee, Wisconsin.
On 3 September 2017, Robert Fripp announced that his differences with Adrian Belew had been resolved and that Belew was now King Crimson's "Ninth Man Inactive", meaning that there were "no current plans for (him) to come out with the current formation; but (he) has rejoined the larger family – hooray! – and doors to the future are open." Belew confirmed this, adding: "it means I may be back in the band in the future at some point. It leaves the door open for Crimson to evolve as necessary."
On 13 October 2017, it was announced that Bill Rieflin would be unable to join the Three Over Five Formation on the 2017 Autumn tour in the U.S. He was temporarily replaced by Seattle-based Crafty Guitarist Chris Gibson.
During 2018, King Crimson performed the extensive 33-date Uncertain Times tour through the UK and Europe between 13 June and 16 November, visiting Poland, Germany, Austria, Czech Republic, Sweden, Norway, the Netherlands, Italy, the UK and France.
On 6 April 2019, it was announced at a press conference that Rieflin would take another break from King Crimson to attend to family matters, and that his place on keyboards for the 2019 50th anniversary tour would be taken by Theo Travis, better known as a jazz saxophonist, Soft Machine member and occasional duo collaborator with Robert Fripp. Although Travis joined the band for rehearsals, Fripp announced on 2 May that the band had decided that it was no longer possible to have other musicians deputising for Rieflin and for this reason were "proceed(ing) as a Seven-Headed Beast" without Travis. Rieflin's parts were divided among other band members, with Jakszyk and Collins adding keyboards to their on-stage rigs, and Levin once again using the synthesizer he used during the 1980s tours.
On 11 June 2019, King Crimson's entire discography was made available to stream online on all the major streaming platforms, as part of the band’s 50th anniversary celebration.
On 24 March 2020, it was announced that Bill Rieflin had died of cancer, reducing King Crimson to a septet.
Since the early 2000s, several bands containing former, recent or current King Crimson members have toured and recorded, performing King Crimson music.
Active between 2002 and 2004, the 21st Century Schizoid Band reunited several former King Crimson members who had played on the band's first four albums. The band featured Ian McDonald, Mel Collins, Peter Giles and Michael Giles (the latter subsequently replaced by Ian Wallace), and was fronted by guitarist/singer Jakko Jakszyk (a decade prior to his own recruitment into King Crimson proper). The band engaged in several tours, played material from the band's 1960s and 1970s catalogue, and recorded several live albums.
Since 2007, Tony Levin has led the trio Stick Men, which also features Pat Mastelotto (the band was initially completed by Chapman Stick player Michael Bernier, replaced in 2010 by touch guitarist and former Fripp student Markus Reuter). This band includes and interprets King Crimson compositions from the band's entire career in their live sets. Reuter and Mastelotto also play together as a duo (originally called Tuner), for which they have been known to rework the mid-1980s King Crimson instrumental "Industry" live.
Between 2011 and 2014, Stick Men and Adrian Belew's Power Trio band (Belew plus drummer Tobias Ralph and bass player Julie Slick) joined forces to play and tour as The Crimson ProjeKCt, covering the music made during Belew's tenure as King Crimson frontman and principal songwriter. The two groups still (in 2019) perform together from time to time, usually under names like "Belew, Levin, Mastelotto and friends" or "Tony Levin and friends".
During his solo career (including performances with the Power Trio), Adrian Belew has performed versions of certain King Crimson songs written predominantly by himself, such as "Dinosaur", as well as ensemble pieces like "Frame by Frame" and "Neurotica". Post-Crimson, he has also performed live versions of King Crimson songs which he neither wrote nor originally performed on (in particular when playing with Eddie Jobson), such as "Red" or "Larks' Tongues in Aspic, Pt. II".
King Crimson has been described musically as progressive rock, art rock, and post-progressive, with their earlier works being described as proto-prog. Their music was initially grounded in the rock of the 1960s, especially the acid rock and psychedelic rock movements. The band played Donovan's "Get Thy Bearings" in concert, and were known to play the Beatles' "Lucy in the Sky with Diamonds" in their rehearsals. However, for their own compositions, King Crimson (unlike the rock bands that had come before them) largely stripped away the blues-based foundations of rock music and replaced them with influences derived from classical composers. The first incarnation of King Crimson played the "Mars" section of Gustav Holst's suite "The Planets" as a regular part of their live set and Fripp has frequently cited the influence of Béla Bartók. As a result of this influence, "In the Court of the Crimson King" is frequently viewed as the nominal starting point of the progressive rock movement. King Crimson also initially displayed strong jazz influences, most obviously on its signature track "21st Century Schizoid Man". The band also drew on English folk music for compositions such as "Moonchild" and "I Talk to the Wind."
The 1981 reunion of the band brought in even more elements, displaying the influence of gamelan music and of late 20th century classical composers such as Philip Glass, Steve Reich, and Terry Riley. For its 1994 reunion, King Crimson reassessed both the mid-1970s and 1980s approaches in the light of new technology, intervening music forms such as grunge, and further developments in industrial music, as well as expanding the band's ambient textural content via Fripp's Soundscapes looping approach.
Several King Crimson compositional approaches have remained constant from the earliest versions of the band to the present. These include:
King Crimson have incorporated improvisation into their performances and studio recordings from the beginning, some of which has been embedded into loosely composed pieces such as "Moonchild" or "THRaK". Most of the band's performances over the years have included at least one stand-alone improvisation where the band simply started playing and took the music wherever it went, sometimes including passages of restrained silence, as with Bill Bruford's contribution to the improvised "Trio". The earliest example of King Crimson unambiguously improvising is the spacious, oft-criticised extended coda of "Moonchild" from "In the Court of the Crimson King".
Rather than using the standard jazz or blues "jamming" format for improvisation (in which one soloist at a time takes centre stage while the rest of the band lies back and plays along with established rhythm and chord changes), King Crimson improvisation is a group affair in which each member of the band is able to make creative decisions and contributions as the music is being played. Individual soloing is largely eschewed; each musician listens to the others and to the group sound in order to react creatively within the group dynamic. A slightly similar method of continuous improvisation ("everybody solos and nobody solos") was initially used by King Crimson's jazz-fusion contemporaries Weather Report. Fripp has used the metaphor of "white magic" to describe this process, in particular when the method works particularly well.
Similarly, King Crimson's improvised music is rarely jazz or blues-based, and varies so much in sound that the band has been able to release several albums consisting entirely of improvised music, such as the "Thrakattak" album. Occasionally, particular improvised pieces will be recalled and reworked in different forms at different shows, becoming more and more refined and eventually appearing on official studio releases (the most recent example being "Power to Believe III", which originally existed as the stage improvisation "Deception of the Thrush", a piece played on stage for a long time before appearing on record).
King Crimson have been influential both on the early 1970s progressive rock movement and numerous contemporary artists. Genesis and Yes were directly influenced by the band's initial style of symphonic Mellotron rock, and many King Crimson band members were involved in other notable bands: Lake in Emerson, Lake & Palmer (some of whose songs can be regarded stylistically as Lake's attempt to continue the early work of King Crimson); McDonald in Foreigner; Burrell in Bad Company, and Wetton in U.K. and Asia. Canadian rock band Rush cites King Crimson as a strong early influence on their sound; drummer Neil Peart cited the adventurous and innovative style of Michael Giles as an influence on his own approach to percussion.
King Crimson's influence extends to many bands from diverse genres, especially of the 1990s and 2000s. Tool are known to be heavily influenced by King Crimson, with vocalist Maynard James Keenan joking on a tour with them: "Now you know who we ripped off. Just don't tell anyone, especially the members of King Crimson." Modern progressive, experimental, psychedelic and indie rock bands have cited them as an influence as well, including the Mars Volta, Porcupine Tree, Primus, Mystery Jets, Fanfarlo, Phish, and Anekdoten, who first practised together playing King Crimson songs. Steven Wilson, the leader of Porcupine Tree, was responsible for remixing King Crimson's back catalogue in surround sound and said that the process had an enormous influence on his solo albums. In November 2012 the Flaming Lips in collaboration with Stardeath and White Dwarfs released a track-by-track reinterpretation of "In the Court of the Crimson King" entitled "Playing Hide and Seek with the Ghosts of Dawn". Colin Newman, of Wire, said he saw King Crimson perform many times, and that they influenced him deeply. The seminal hardcore punk group Black Flag acknowledge Wetton-era King Crimson as an influence on their experimental period in the mid-1980s. Melvin Gibbs said that the Rollins Band was influenced most by King Crimson, using similar chords. Bad Religion quote the lyrics of "21st Century Schizoid Man" in their single "21st Century (Digital Boy)", and the name of their record label, Epitaph (founded by their guitarist Brett Gurewitz), comes from the song of the same name on King Crimson's debut album.
King Crimson have frequently been cited as pioneers of progressive metal and as an influence on bands of this genre, including Opeth, Mastodon, Between the Buried and Me, Leprous, Haken, the Ocean, Caligula's Horse, Last Chance to Reason, and Indukti. Members of metal bands Mudvayne, Voivod, Enslaved, Yob, Pyrrhon, and Pallbearer have cited King Crimson as an influence. Heavy experimental and avant-garde acts like the Dillinger Escape Plan, Neurosis, Zeni Geva, Ancestors, and Oranssi Pazuzu all cite King Crimson's influence.
Other artists affected by King Crimson include noise music artist Masami Akita of Merzbow, jazz guitarist Dennis Rea of Land, folktronica exponent Juana Molina, hip hop producer RJD2, hip hop and soul composer Adrian Younge, film director Hal Hartley, and folk-pop singer Ian Kelly.
Current members
Studio albums
Kilogram
The kilogram (also kilogramme) is the base unit of mass in the metric system, formally the International System of Units (SI), having the unit symbol kg. It is a widely used measure in science, engineering, and commerce worldwide, and is often simply called a kilo in everyday speech.
The kilogram was originally defined in 1795 as the mass of one litre of water. This was a simple definition, but difficult to use in practice. Even under the latest definitions of the unit, however, this relationship still holds to within 30 ppm. In 1799, the platinum "Kilogramme des Archives" replaced it as the standard of mass. In 1889, a cylinder of platinum-iridium, the International Prototype of the Kilogram (IPK), became the standard of the unit of mass for the metric system, and remained so until 2019. The kilogram was the last of the SI units to be defined by a physical artefact.
The kilogram is now defined in terms of the second and the metre, based on fixed fundamental constants of nature. This allows a properly-equipped metrology laboratory to calibrate a mass measurement instrument such as a Kibble balance as the primary standard to determine an exact kilogram mass, although the IPK and other precision kilogram masses remain in use as secondary standards for all ordinary purposes.
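The Kibble balance mentioned above equates the electrical power measured in a coil with the mechanical power of the weight it balances, so a mass follows from a voltage, a current, a velocity and the local gravitational acceleration. The sketch below illustrates only the arithmetic of that relation; all the numbers are hypothetical, chosen so the result comes out near one kilogram:

```python
# Kibble-balance principle (illustrative only): in the moving phase a coil
# travels at velocity v through a magnetic field, inducing a voltage U; in
# the weighing phase a current I through the same coil balances the weight
# m*g. Equating electrical and mechanical power, U*I = m*g*v, gives the mass.

def kibble_mass(U, I, g, v):
    """Mass inferred from Kibble-balance measurements (all SI units)."""
    return U * I / (g * v)

U = 1.0      # induced voltage in volts (hypothetical)
I = 0.0098   # balancing current in amperes (hypothetical)
g = 9.8      # local gravitational acceleration in m/s^2 (hypothetical)
v = 0.001    # coil velocity in m/s (hypothetical)

m = kibble_mass(U, I, g, v)
print(round(m, 6))  # close to 1.0 kg with these numbers
```

Because the volt and the ampere are themselves realised via the Planck constant (through the Josephson and quantum Hall effects), this ties the measured mass back to the fixed value of "h".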
The kilogram is defined in terms of three fundamental physical constants: the speed of light "c", a specific atomic transition frequency (the caesium hyperfine frequency Δν), and the Planck constant "h". The formal definition is:
This definition makes the kilogram consistent with the older definitions: the mass remains within 30 ppm of the mass of one litre of water.
The kilogram is the only base SI unit with an SI prefix ("kilo") as part of its name. The word "kilogramme" or "kilogram" is derived from the French "kilogramme", which itself was a learned coinage, prefixing the Greek stem of χίλιοι "a thousand" to "gramma", a Late Latin term for "a small weight", itself from Greek γράμμα.
The word was written into French law in 1795, in the "Decree of 18 Germinal",
which revised the provisional system of units introduced by the French National Convention two years earlier, where the "gravet" had been defined as the weight ("poids") of a cubic centimetre of water, equal to 1/1000 of a "grave". In the decree of 1795, the term "gramme" thus replaced "gravet", and "kilogramme" replaced "grave".
The French spelling was adopted in Great Britain when the word was used for the first time in English in 1795, with the spelling "kilogram" being adopted in the United States. In the United Kingdom both spellings are used, with "kilogram" having become by far the more common. UK law regulating the units to be used when trading by weight or measure does not prevent the use of either spelling.
In the 19th century the French word "kilo", a shortening of "kilogramme", was imported into the English language, where it has been used to mean both kilogram and kilometre. While "kilo" as an alternative is acceptable, to "The Economist" for example, the Canadian government's Termium Plus system states that "SI (International System of Units) usage, followed in scientific and technical writing" does not allow its usage and it is described as "a common informal name" on Russ Rowlett's Dictionary of Units of Measurement. When the United States Congress gave the metric system legal status in 1866, it permitted the use of the word "kilo" as an alternative to the word "kilogram", but in 1990 revoked the status of the word "kilo".
The SI system was introduced in 1960, and in 1970 the BIPM started publishing the "SI Brochure", which contains all relevant decisions and recommendations by the CGPM concerning units. The "SI Brochure" states that "It is not permissible to use abbreviations for unit symbols or unit names ...".
As it happens, it is mostly because of units for electromagnetism that the kilogram rather than the gram was eventually adopted as the base unit of mass in the SI system. The relevant series of discussions and decisions started roughly in the 1850s and effectively concluded in 1946. In brief, by the end of the 19th century, the 'practical units' for electric and magnetic quantities such as the ampere and the volt were well established in practical use (e.g. for telegraphy). Unfortunately, they were not coherent with the then-prevailing base units for length and mass, the centimeter and the gram. However, the 'practical units' also included some purely mechanical units; in particular, the product of the ampere and the volt gives a purely mechanical unit of power, the watt. It was noticed that the purely mechanical practical units such as the watt would be coherent in a system in which the base unit of length was the meter and the base unit of mass was the kilogram. In fact, given that no one wanted to replace the second as the base unit of time, the meter and the kilogram are the "only" pair of base units of length and mass such that 1. the watt is a coherent unit of power, 2. the base units of length and mass are decimal multiples or submultiples of the meter and the gram (so that the system remains 'metric'), and 3. the sizes of the base units of length and mass are convenient for practical use. This would still leave out the purely electrical and magnetic units: while the purely mechanical practical units such as the watt are coherent in the meter-kilogram-second system, the explicitly electrical and magnetic units such as the volt, the ampere, etc. are not. The only way to also make "those" units coherent with the meter-kilogram-second system is to modify that system in a different way: one has to increase the number of fundamental dimensions from three (length, mass, and time) to four (the previous three, plus one purely electrical one).
During the second half of the 19th century, the centimetre–gram–second (CGS) system of units was becoming widely accepted for scientific work, treating the gram as the fundamental unit of mass and the "kilogram" as a decimal multiple of the base unit formed by using a metric prefix. However, as the century drew to a close, there was widespread dissatisfaction with the state of units for electricity and magnetism in the CGS system. To begin with, there were two obvious choices for absolute units of electromagnetism: the 'electrostatic' (CGS-ESU) system and the 'electromagnetic' (CGS-EMU) system. But the main problem was that the sizes of coherent electric and magnetic units were not convenient in "either" of these systems; for example, the ESU unit of electrical resistance, which was later named the statohm, corresponds to about 9 × 10^11 ohms, while the EMU unit, which was later named the abohm, corresponds to 10^-9 ohms.
To circumvent this difficulty, a "third" set of units was introduced: the so-called practical units. The practical units were obtained as decimal multiples of coherent CGS-EMU units, chosen so that the resulting magnitudes were convenient for practical use and so that the practical units were, as far as possible, coherent with each other. The practical units included such units as the volt, the ampere, the ohm, etc., which were later incorporated in the SI system and which we use to this day. Indeed, the main reason why the meter and the kilogram were later chosen to be the base units of length and mass was that they are the only combination of reasonably sized decimal multiples or submultiples of the meter and the gram that can in any way be made coherent with the volt, the ampere, etc.
The reason is that electrical quantities cannot be isolated from mechanical and thermal ones: they are connected by relations such as current × electric potential difference = power. For this reason, the practical system also included coherent units for certain mechanical quantities. For example, the previous equation implies that ampere × volt is a coherent derived practical unit of power; this unit was named the watt. The coherent unit of energy is then the watt times the second, which was named the joule. The joule and the watt also have convenient magnitudes and are decimal multiples of CGS coherent units for energy (the erg) and power (the erg per second). The watt is not coherent in the centimeter-gram-second system, but it "is" coherent in the meter-kilogram-second system—and in no other system whose base units of length and mass are reasonably sized decimal multiples or submultiples of the meter and the gram.
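The decimal relations described here are easy to verify numerically. The following sketch expresses the coherent CGS units of energy (the erg) and power (the erg per second) in SI and confirms that the joule and the watt are exactly 10^7 of each:

```python
import math

# CGS base units of mass and length, expressed in SI units (kg and m)
gram = 1e-3
centimetre = 1e-2

# Coherent CGS unit of energy: 1 erg = 1 g*cm^2/s^2, expressed in joules
erg = gram * centimetre**2
# Coherent CGS unit of power: 1 erg/s = 1 g*cm^2/s^3, expressed in watts
erg_per_second = gram * centimetre**2  # divided by a time unit of 1 s

# As stated above: 1 joule = 10^7 erg and 1 watt = 10^7 erg/s
assert math.isclose(1.0 / erg, 1e7)
assert math.isclose(1.0 / erg_per_second, 1e7)
```

This is the sense in which the watt and the joule are "decimal multiples" of the CGS coherent units while remaining coherent in the metre-kilogram-second system.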
However, unlike the watt and the joule, the explicitly electrical and magnetic units (the volt, the ampere…) are not coherent even in the (absolute three-dimensional) meter-kilogram-second system. Indeed, one can work out what the base units of length and mass have to be in order for "all" the practical units to be coherent (the watt and the joule as well as the volt, the ampere, etc.). The values are 10^7 meters (one half of a meridian of the Earth, called a "quadrant") and 10^-11 grams (called an "eleventh-gram").
Therefore, the full absolute system of units in which the practical electrical units are coherent is the quadrant–eleventh-gram–second (QES) system. However, the extremely inconvenient magnitudes of the base units for length and mass made it so that no one seriously considered adopting the QES system. Thus, people working on practical applications of electricity had to use units for electrical quantities and for energy and power that were not coherent with the units they were using for e.g. length, mass, and force.
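The "quadrant" and "eleventh-gram" figures can be checked directly: with those base units, the coherent unit of power (mass × length² / time³) works out to exactly one watt, which is why the practical units fit the QES system. A minimal numerical check:

```python
import math

# QES base units, expressed in SI units
quadrant = 1e7         # unit of length: 10^7 m (half a meridian of the Earth)
eleventh_gram = 1e-14  # unit of mass: 10^-11 g = 10^-14 kg
# unit of time: the second, unchanged

# Coherent unit of power in QES: mass * length^2 / time^3
qes_power_unit = eleventh_gram * quadrant**2  # in kg*m^2/s^3, i.e. watts

# It equals one watt, so the watt (and with it the volt, ampere, etc.)
# is coherent in the quadrant-eleventh-gram-second system
assert math.isclose(qes_power_unit, 1.0)
```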
Meanwhile, scientists developed yet another fully coherent absolute system, which came to be called the Gaussian system, in which the units for purely electrical quantities are taken from CGS-ESU, while the units for magnetic quantities are taken from CGS-EMU. This system proved very convenient for scientific work and is still widely used. However, the sizes of its units remained either too large or too small—by many orders of magnitude—for practical applications.
Finally, on top of all this, in both CGS-ESU and CGS-EMU as well as in the Gaussian system, Maxwell's equations are 'unrationalized', meaning that they contain various factors of 4π that many workers found awkward. So yet another system was developed to rectify that: the 'rationalized' Gaussian system, usually called the Lorentz–Heaviside system. This system is still used in some subfields of physics. However, the units in that system are related to Gaussian units by factors of √4π, which means that their magnitudes remained, like those of the Gaussian units, either far too large or far too small for practical applications.
In 1901, Giovanni Giorgi proposed a new system of units that would remedy this state of affairs.
Michael Schumacher
Michael Schumacher (born 3 January 1969) is a German retired racing driver who competed in Formula One for Jordan Grand Prix, Benetton, Ferrari (where he spent most of his career), and Mercedes upon his return to the sport. Schumacher is widely regarded as one of the greatest Formula One drivers ever, and is regarded by some, including six-time world champion Lewis Hamilton, as the greatest of all time. Schumacher is the only driver in history to win seven Formula One World Championships, five of which he won consecutively. The most successful driver in the history of the sport, Schumacher holds the records for the most World Championship titles (7), the most Grand Prix wins (91), the most fastest laps (77) and the most races won in a single season (13), and according to the official Formula One website, Schumacher was "statistically the greatest driver the sport has ever seen" at the time of his retirement from the sport. He was also noted throughout his career for pushing his car to the very limit for sustained periods and for his pioneering fitness regimen.
After success in karting as a child, Schumacher won titles in Formula König and Formula Three before joining Mercedes in the World Sportscar Championship. In 1991 his Mercedes-funded race debut for the Jordan Formula One team resulted in Schumacher being signed by Benetton for the rest of that season. He finished third in 1992 and fourth in 1993, before becoming the first German World Drivers' Champion in 1994 by one point over Damon Hill, albeit in controversial circumstances. In 1995 he repeated the success, this time with a greater margin. In 1996 Schumacher moved to Ferrari, who had last won the Drivers' Championship in 1979, and helped them transform into the most successful team in Formula One history, as he came close to winning the 1997 and 1998 titles, before breaking his leg at the 1999 British Grand Prix, ending another title run.
Schumacher won five consecutive drivers' titles from 2000 to 2004, including an unprecedented sixth and seventh title. In 2002 Schumacher won the title with a record six races remaining and finished on the podium in every race. In 2004, Schumacher won 12 out of the first 13 races, and went on to win a record 13 times as he won his final title. Schumacher retired from Formula One in 2006, after finishing runner-up to Renault's Fernando Alonso. Schumacher returned to Formula One in 2010 with Mercedes. He produced the fastest qualifying time at the 2012 Monaco Grand Prix, and achieved his only podium on his return at the 2012 European Grand Prix, where he finished third. In October 2012 Schumacher announced he would retire for a second time at the end of the season.
His career was at times controversial, as he was twice involved in collisions in the final race of a season that determined the outcome of the World Championship, with Damon Hill in 1994 in Adelaide, and with Jacques Villeneuve in 1997 in Jerez. Schumacher is an ambassador for UNESCO and has been involved in numerous humanitarian efforts throughout his life, donating tens of millions of dollars to charity. Schumacher and his younger brother, Ralf, are the only siblings to win races in Formula One, and they were the first brothers to finish first and second in the same race, a feat they repeated in four subsequent races.
In December 2013 Schumacher suffered severe brain injury in a skiing accident. He was placed in a medically induced coma until June 2014. He left hospital in Grenoble for further rehabilitation at the University Hospital of Lausanne, before being relocated to his home to receive medical treatment and rehabilitation privately in September 2014.
Schumacher was born in Hürth, North Rhine-Westphalia, to Rolf Schumacher, a bricklayer, and his wife Elisabeth. When Schumacher was four, his father modified his pedal kart by adding a small motorcycle engine. When Schumacher crashed it into a lamp post in Kerpen, his parents took him to the karting track at Kerpen-Horrem, where he became the youngest member of the karting club. His father soon built him a kart from discarded parts and at the age of six Schumacher won his first club championship. To support his son's racing, Rolf Schumacher took on a second job renting and repairing karts, while his wife worked at the track's canteen. Nevertheless, when Michael needed a new engine costing 800 DM, his parents were unable to afford it; he was able to continue racing with support from local businessmen.
Regulations in Germany require a driver to be at least 14 years old to obtain a kart license. To get around this, Schumacher obtained a license in Luxembourg at the age of 12.
In 1983, he obtained his German license, a year after he won the German Junior Kart Championship. From 1984 on, Schumacher won many German and European kart championships. He joined Eurokart dealer Adolf Neubert in 1985 and by 1987 he was the German and European kart champion, then he quit school and began working as a mechanic. In 1988 he made his first step into single-seat car racing by participating in the German Formula Ford and Formula König series, winning the latter.
In 1989, Schumacher signed with Willi Weber's WTS Formula Three team. Funded by Weber, he competed in the German Formula 3 series, winning the title in 1990. He also won the Macau Grand Prix in 1990 under controversial circumstances. He finished the first heat three seconds behind Mika Häkkinen. At the start of the second heat, he overtook Häkkinen, who only had to finish within three seconds of Schumacher to clinch the overall win. In the closing laps, Schumacher made a mistake, allowing Häkkinen to attempt to overtake. Schumacher blocked his competitor, who crashed into the back of his car. While Häkkinen spun and lost time, Schumacher cruised to victory without a rear wing.
At the end of 1990, along with his Formula 3 rivals Heinz-Harald Frentzen and Karl Wendlinger, he joined the Mercedes junior racing programme in the World Sports-Prototype Championship. This was unusual for a young driver: most of Schumacher's contemporaries would compete in Formula 3000 on the way to Formula One. However, Weber advised Schumacher that being exposed to professional press conferences and driving powerful cars in long-distance races would help his career. In the 1990 World Sportscar Championship season, Schumacher won the season finale at the Autódromo Hermanos Rodríguez in a Sauber–Mercedes C11, and finished fifth in the drivers' championship despite only driving in three of the nine races. He continued with the team in the 1991 World Sportscar Championship season, winning again at the final race of the season at Autopolis in Japan with a Sauber–Mercedes-Benz C291, leading to a ninth-place finish in the drivers' championship. He also competed at Le Mans during that season, finishing fifth in a car shared with Karl Wendlinger and Fritz Kreutzpointner. In 1991, he competed in one race in the Japanese Formula 3000 Championship, finishing second.
Schumacher was noted throughout his career for his ability to produce fast laps at crucial moments in a race and to push his car to the very limit for sustained periods. He was also noted for his pioneering fitness regime and ability to galvanise teams around him. Motorsport author Christopher Hilton observed in 2003 that a "measure of a driver's capabilities is his performance in wet races, because the most delicate car control and sensitivity are needed", and noted that like other great drivers, Schumacher's record in wet conditions shows very few mistakes: up to the end of the 2003 season, Schumacher had won 17 of the 30 races he contested in wet conditions. Some of Schumacher's best performances occurred in such conditions, earning him the nicknames "Regenkönig" (rain king) and "Regenmeister" (rain master), even in non-German-language media. He is also known as "the Red Baron", because of his red Ferrari and in reference to Manfred von Richthofen, the famous German flying ace of World War I. Schumacher's other nicknames include "Schumi", "Schuey" and "Schu".
Schumacher is often credited with popularising Formula One in Germany, where it was formerly considered a fringe sport. When Schumacher retired in 2006, three of the top ten drivers were German, more than any other nationality and more than at any previous time in Formula One history. Younger German drivers, such as Sebastian Vettel, felt Schumacher was key in their becoming Formula One drivers. In the latter part of his Formula One career, and as one of the senior drivers, Schumacher was the president of the Grand Prix Drivers' Association. In a 2006 FIA survey, Michael Schumacher was voted the most popular driver of the season among Formula One fans.
Schumacher made his Formula One debut with the Jordan-Ford team at the 1991 Belgian Grand Prix, driving car number 32 as a replacement for the imprisoned Bertrand Gachot. Schumacher, still a contracted Mercedes driver, was signed by Eddie Jordan after Mercedes paid Jordan $150,000 for his debut.
The week before the race, Schumacher impressed Jordan designer Gary Anderson and team manager Trevor Foster during a test drive at Silverstone. His manager Willi Weber assured Jordan that Schumacher knew the challenging Spa track well, although in fact he had only seen it as a spectator. During the race weekend, teammate Andrea de Cesaris was meant to show Schumacher the circuit, but was held up by contract negotiations. Schumacher then learned the track on his own, cycling around it on a fold-up bike he had brought with him. He impressed the paddock by qualifying seventh, matching the team's season-best grid position and out-qualifying 11-year veteran de Cesaris. Motorsport journalist Joe Saward reported that after qualifying "clumps of German journalists were talking about 'the best talent since Stefan Bellof'". Schumacher retired on the first lap of the race with clutch problems.
Following his Belgian Grand Prix debut, and despite an agreement in principle between Jordan and Schumacher's Mercedes management that would see the German race for the Irish team for the remainder of the season, Schumacher was engaged by Benetton-Ford for the following race. Jordan applied for an injunction in the UK courts to prevent Schumacher driving for Benetton, but lost the case as they had not yet signed a final contract.
Schumacher finished the season with four points from six races. His best finish was fifth, in his second race, the Italian Grand Prix, in which he finished ahead of his teammate and three-time World Champion Nelson Piquet.
At the start of the 1992 season the Sauber team, planning their Formula One debut with Mercedes backing for the following year, invoked a clause in Schumacher's contract that stated that if Mercedes entered Formula One, Schumacher would drive for them. It was eventually agreed that Schumacher would stay with Benetton; Peter Sauber said that "[Schumacher] didn't want to drive for us. Why would I have forced him?". The year was dominated by the Williams cars of Nigel Mansell and Riccardo Patrese, featuring powerful Renault engines, semi-automatic gearboxes and active suspension to control the car's ride height. In the "conventional" Benetton B192 Schumacher took his place on the podium for the first time, finishing third in the Mexican Grand Prix. He went on to take his first victory at the Belgian Grand Prix, in a wet race at the Spa-Francorchamps circuit, which by 2003 he would call "far and away my favourite track". He finished third in the Drivers' Championship in 1992 with 53 points, three points behind runner-up Patrese.
The Williams cars of Damon Hill and Alain Prost also dominated the 1993 season. Benetton introduced their own active suspension and traction control early in the season, the last of the front-running teams to do so. Schumacher won one race, the Portuguese Grand Prix, where he beat Prost, and had nine podium finishes, but retired in seven of the other 15 races. He finished the season fourth, with 52 points.
The 1994 season brought Schumacher's first Drivers' Championship. The season, however, was marred by the deaths of Ayrton Senna (witnessed by Schumacher, who was running directly behind in second position) and Roland Ratzenberger during the San Marino Grand Prix weekend, and by allegations that several teams, most particularly Schumacher's Benetton team, had broken the sport's technical regulations.
Schumacher won six of the first seven races and was leading the Spanish Grand Prix before a gearbox failure left him stuck in fifth gear; he nevertheless finished the race in second place. Following the San Marino Grand Prix, the Benetton, Ferrari and McLaren teams were investigated on suspicion of breaking the FIA-imposed ban on electronic aids. Benetton and McLaren initially refused to hand over their source code for investigation. When they did so, the FIA discovered hidden functionality in both teams' software, but no evidence that it had been used in a race. Both teams were fined $100,000 for their initial refusal to cooperate. However, the McLaren software, which was a gearbox program that allowed automatic shifts, was deemed legal. By contrast, the Benetton software was deemed to be a form of "launch control" that would have allowed Schumacher to make perfect starts, which was explicitly outlawed by the regulations. However, there was no evidence to suggest that this software was actually used.
At the British Grand Prix, Schumacher was penalised for overtaking on the formation lap. He then ignored the penalty and the subsequent black flag, which indicates that the driver must immediately return to the pits; for this he was disqualified and later given a two-race ban. Benetton blamed the incident on a communication error between the stewards and the team. Schumacher was also disqualified after winning the Belgian Grand Prix when his car was found to have illegal wear on its skidblock, a measure introduced after the accidents at Imola to limit downforce and hence cornering speed. Benetton protested that the skidblock had been damaged when Schumacher spun over a kerb, but the FIA rejected the appeal because of the pattern of wear and damage visible on the block.
These incidents helped Damon Hill close the points gap, and Schumacher led by a single point going into the final race in Australia. On lap 36, while leading, Schumacher hit the guardrail on the outside of the track. Hill attempted to pass but, as Schumacher's car returned to the track, the two collided at the corner and both retired. As a result, Schumacher won a highly controversial championship, the first German driver to do so (Jochen Rindt had raced under the Austrian flag). At the FIA conference after the race, the new World Champion dedicated his title to Ayrton Senna.
In 1995 Schumacher successfully defended his title with Benetton, which now used the same Renault engine as Williams. He accumulated 33 more points than second-placed Damon Hill. With teammate Johnny Herbert, he took Benetton to its first Constructors' Championship and became the youngest two-time World Champion in Formula One history.
The season was marred by several collisions with Hill; in particular, an overtaking manoeuvre by Hill took them both out of the British Grand Prix on lap 45, and the two collided again on lap 23 of the Italian Grand Prix. Schumacher won nine of the 17 races and finished on the podium 11 times. Only once did he qualify worse than fourth: at the Belgian Grand Prix he qualified 16th, but nevertheless went on to win the race.
In 1996, Schumacher joined Ferrari, a team that had last won the Drivers' Championship in 1979 and the Constructors' Championship in 1983, for a salary of $60 million over two years. He left Benetton a year before his contract with them expired; he later cited the team's damaging actions in 1994 as his reason for opting out of his deal. A year later, Benetton employees Rory Byrne (designer) and Ross Brawn (technical director) joined Ferrari.
Ferrari had previously come close to the championship in 1982 and 1990. The team had suffered a disastrous downturn in the early 1990s, partly because its famous V12 engine was no longer competitive against the smaller, lighter and more fuel-efficient V10s of its competitors. Various drivers, notably Alain Prost, had given the cars labels such as "truck", "pig" and "accident waiting to happen", and the poor performance of the Ferrari pit crews was considered a running joke. By the end of 1995 the team had improved into a solid competitor, though it was still considered inferior to front-running teams such as Benetton and Williams. Schumacher declared the Ferrari 412T good enough to win the Championship.
Schumacher, Ross Brawn, Rory Byrne and Jean Todt (hired in 1993) have been credited with turning the once-struggling team into the most successful in Formula One history. Three-time World Champion Jackie Stewart believed the transformation of the Ferrari team was Schumacher's greatest feat. Eddie Irvine also joined the team, moving from Jordan. During winter testing, Schumacher drove a Ferrari for the first time, the 1995 Ferrari 412 T2, and was two seconds faster than former regulars Jean Alesi and Gerhard Berger had been.
Schumacher finished third in the Drivers' Championship in 1996 and helped Ferrari to second place in the Constructors' Championship, ahead of his old team Benetton. He won three races, more than the team's total tally for the period from 1991 to 1995. Early in the 1996 season the car had reliability trouble and Schumacher did not finish six of the 16 races. He took his first win for Ferrari at the Spanish Grand Prix, where he lapped the entire field up to third place in the wet. Having taken the lead on lap 19, he consistently lapped five seconds faster than the rest of the field in the difficult conditions. In the French Grand Prix Schumacher qualified in pole position but suffered an engine failure on the race's formation lap. At Spa-Francorchamps, however, he used well-timed pit stops to fend off Williams's Jacques Villeneuve, and at Monza he won in front of the tifosi.
Michael Schumacher and Jacques Villeneuve vied for the title in 1997. Villeneuve, driving the superior Williams FW19, led the championship in the early part of the season. However, by mid-season Schumacher had taken the championship lead, winning five races, and he entered the season's final Grand Prix with a one-point advantage. Towards the end of the race, held at Jerez, Schumacher's Ferrari developed a coolant leak and lost performance, suggesting he might not finish. As Villeneuve approached to pass his rival, Schumacher turned in on him in an attempt to provoke an accident, but came off worse and retired from the race. Villeneuve went on to score four points and take the championship. Schumacher was found guilty of unsportsmanlike conduct for the collision and was disqualified from the Drivers' Championship.
In 1998, Finnish driver Mika Häkkinen became Schumacher's main title competition. Häkkinen won the first two races of the season, gaining a 16-point advantage over Schumacher. Schumacher then won in Argentina and, with the Ferrari improving significantly in the second half of the season, took six victories and had five other podium finishes. Ferrari took a 1–2 finish at the French Grand Prix, its first since 1990, and another at the Italian Grand Prix, which tied Schumacher with Häkkinen for the lead of the Drivers' Championship on 80 points; Häkkinen, however, won the Championship by winning the final two races. There were two controversies. At the British Grand Prix, Schumacher was leading on the last lap when he turned into the pit lane, crossed the start–finish line and stopped for a ten-second stop-go penalty. There was some doubt whether this counted as serving the penalty but, because he had crossed the finish line as he came into the pit lane, the win stood. At Spa, Schumacher was leading the race by 40 seconds in heavy spray, but collided with David Coulthard's McLaren when the Scot, a lap down, slowed in very poor visibility to let Schumacher past. After both cars returned to the pits, a furious Schumacher leapt out of his car, headed to the McLaren garage and accused Coulthard of trying to kill him. Coulthard admitted five years later that the accident had been his mistake.
Schumacher's efforts helped Ferrari win the Constructors' title in 1999. He lost his chance to win the Drivers' Championship at the British Grand Prix, where his car's rear brake failed at the high-speed Stowe corner, sending him off the track and resulting in a broken leg. During his 98-day absence he was replaced by Finnish driver Mika Salo. After missing six races, he made his return at the inaugural Malaysian Grand Prix, qualifying in pole position by almost a second. He then assumed the role of second driver, assisting teammate Eddie Irvine's bid to win the Drivers' Championship for Ferrari. In the last race of the season, the Japanese Grand Prix, Häkkinen won his second consecutive title. Schumacher would later say that Häkkinen was the opponent he respected the most.
During this period Schumacher won more races and championships than any other driver in the history of the sport. Schumacher won his third World Championship in 2000 after a year-long battle with Häkkinen. Schumacher won the first three races of the season and five of the first eight. Midway through the year, his chances suffered with three consecutive non-finishes, allowing Häkkinen to close the gap in the standings. Häkkinen then took another two victories before Schumacher won at the Italian Grand Prix. At the post-race press conference, after equalling his idol Ayrton Senna's tally of 41 wins, Schumacher broke into tears. The championship fight came down to the penultimate race of the season, the Japanese Grand Prix. Starting from pole position, Schumacher lost the lead to Häkkinen at the start. After his second pit stop, however, Schumacher came out ahead of Häkkinen and went on to win the race and the championship.
In 2001, Schumacher took his fourth drivers' title. Four other drivers won races, but none sustained a season-long challenge for the championship. Schumacher scored a record-tying nine wins and clinched the World Championship with four races still to run. He finished the championship with 123 points, 58 ahead of runner-up Coulthard. Season highlights included the Canadian Grand Prix, where Schumacher finished second to his brother Ralf, the first ever 1–2 finish by brothers in Formula One, and the Belgian Grand Prix, in which Schumacher scored his 52nd career win, breaking Alain Prost's record for most career wins.
In 2002, Schumacher used the Ferrari F2002 to retain his Drivers' Championship.
There was again some controversy, however, at the Austrian Grand Prix, where his teammate Rubens Barrichello was leading but, in the final metres of the race and under team orders, slowed to allow Schumacher to win. The crowd booed the result, and Schumacher tried to make amends by allowing Barrichello to stand on the top step of the podium. At the United States Grand Prix later that year, Schumacher dominated the race and was set for a close finish with Barrichello. At the end he slowed down to create a formation finish, but slowed too much, allowing Barrichello to take the victory. In winning the Drivers' Championship he equalled Juan Manuel Fangio's record of five World Championships. Ferrari won 15 of the 17 races, and Schumacher won the title with six races remaining in the season, still the earliest point in a season at which a driver has been crowned World Champion. Schumacher broke his own record, shared with Nigel Mansell, of nine race wins in a season by winning 11 times, and he finished every race on the podium. He finished with 144 points, a record-breaking 67 points ahead of the runner-up, his teammate Barrichello; the pair finished nine of the 17 races in the first two places.
Schumacher broke Juan Manuel Fangio's record of five World Drivers' Championships by winning the drivers' title for the sixth time in 2003, a closely contested season. The biggest competition came once again from the McLaren-Mercedes and Williams-BMW teams. In the first race Schumacher ran off track, and in the following two he was involved in collisions, leaving him 16 points behind Kimi Räikkönen. Schumacher then won the San Marino Grand Prix and the next two races, closing to within two points of Räikkönen. Aside from Schumacher's victory in Canada and Barrichello's in Britain, the mid-season was dominated by Williams drivers Ralf Schumacher and Juan Pablo Montoya, who each claimed two victories. After the Hungarian Grand Prix, Michael Schumacher led Montoya and Räikkönen by only one and two points respectively. Ahead of the next race, the FIA announced changes to the way tyre widths were to be measured, forcing Michelin, supplier to Williams and McLaren among others, to rapidly redesign its tyres before the Italian Grand Prix. Schumacher, running on Bridgestone tyres, won the next two races. After Montoya was penalised in the United States Grand Prix, only Schumacher and Räikkönen remained in contention for the title. At the final round, the Japanese Grand Prix, Schumacher needed only one point while Räikkönen needed to win. By finishing the race in eighth place, Schumacher took the point he needed and secured his sixth World Drivers' title, ending the season two points ahead of Räikkönen.
In 2004, Schumacher won a record 12 of the first 13 races of the season, failing to finish only in Monaco after an accident with Juan Pablo Montoya during a safety-car period in which he briefly locked his car's brakes. He clinched a record seventh drivers' title at the Belgian Grand Prix. He finished the season with a record 148 points, 34 points ahead of the runner-up, teammate Rubens Barrichello, and set a new record of 13 race wins out of a possible 18, surpassing his previous best of 11 wins from the 2002 season.
Rule changes for the 2005 season required tyres to last an entire race, tipping the overall advantage to teams using Michelin tyres over teams such as Ferrari that relied on Bridgestones. The rule changes were partly an effort to dent Ferrari's dominance and make the series more interesting. The most notable moment of the early season for Schumacher was his battle with Fernando Alonso at San Marino, where he started 13th and finished only 0.2 seconds behind the Spaniard. Less than halfway through the season, Schumacher said, "I don't think I can count myself in this battle any more. It was like trying to fight with a blunted weapon... If your weapons are weak you don't have a chance." Schumacher's sole win in 2005 came at the United States Grand Prix. Before that race, the Michelin tyres were found to have significant safety issues; when no compromise between the teams and the FIA could be reached, all but the six drivers using Bridgestone tyres withdrew after the formation lap. Schumacher retired in six of the 19 races. He finished the season third with 62 points, fewer than half the total of World Champion Alonso.
Schumacher was stripped of pole position at the 2006 Monaco Grand Prix and started the race at the back of the grid, after stopping his car and blocking part of the circuit while Alonso was on his qualifying lap; he nevertheless worked his way up to fifth place on the notoriously cramped Monaco circuit. By the Canadian Grand Prix, the ninth race of the season, Schumacher was 25 points behind Alonso, but he then won the following three races to reduce his disadvantage to 11 points. After his victories in Italy (where Alonso had an engine failure) and China (where Alonso had tyre problems), Schumacher led the championship standings for the first time that season. Although he and Alonso had the same points total, Schumacher was in front because he had won more races.
Schumacher was leading the Japanese Grand Prix with only 16 laps to go when, for the first time since the 2000 French Grand Prix, his car suffered an engine failure. Alonso won the race, giving himself a ten-point championship lead. With only one race left in the season, Schumacher could win the championship only if he won the season finale and Alonso scored no points.
Before the Brazilian Grand Prix, Schumacher conceded the title to Alonso. In pre-race ceremonies, football legend Pelé presented Schumacher with a trophy for his years of dedication to Formula One. During qualifying, Schumacher set one of the quickest times in the first session and was fastest in the second, but a fuel-pressure problem prevented him from completing a single lap in the third, forcing him to start the race in tenth position. Early in the race Schumacher moved up to sixth place, but in overtaking Alonso's teammate Giancarlo Fisichella he suffered a tyre puncture caused by the front wing of Fisichella's car. Schumacher pitted and consequently fell to 19th place, 70 seconds behind teammate and race leader Felipe Massa. He recovered to overtake both Fisichella and Räikkönen and secure fourth place. His performance was described in the press as "heroic", an "utterly breath-taking drive", and a "performance that ... sums up his career".
While Schumacher was on the podium after winning the 2006 Italian Grand Prix, Ferrari issued a press release stating that he would retire from racing at the end of the 2006 season. Schumacher confirmed his retirement. The press release stated that Schumacher would continue working for Ferrari. It was revealed on 29 October 2006 that Ferrari wanted Schumacher to act as assistant to the newly appointed CEO Jean Todt. This would involve selecting the team's future drivers. After Schumacher's announcement, leading Formula One figures such as Niki Lauda and David Coulthard hailed Schumacher as the greatest all-round racing driver in the history of Formula One. The tifosi and the Italian press, who did not always take to Schumacher's relatively cold public persona, displayed an affectionate response after he announced his retirement.
Schumacher attended several Grands Prix during the 2007 season. He drove the Ferrari F2007 for the first time on 24 October at Ferrari's home track in Fiorano, Italy, running no more than five laps, with no lap times recorded. A Ferrari spokesman said the short drive was put on for the Fiat board of directors, who were holding their meeting in Maranello.
During the 2007 season, Schumacher acted as Ferrari's adviser and Jean Todt's 'super assistant'. On 13 November 2007, Schumacher, who had not driven a Formula One car since his retirement a year earlier, undertook a formal test session aboard the F2007 for the first time. He returned in December 2007 to continue helping Ferrari with their development programme at the Jerez circuit, focusing on testing electronics and tyres for the 2008 Formula One season.
In 2007, former Ferrari top manager Ross Brawn said that Schumacher was very likely and also happy to continue testing in 2008; Schumacher later explained his role further saying that he would "deal with the development of the car inside Gestione Sportiva" and as part of that "I'd like to drive, but not too often".
During 2008 Schumacher also competed in motorcycle racing in the IDM Superbike series, but stated that he had no intention of pursuing a second competitive career in the sport. He was quoted as saying that riding a Ducati was the most exhilarating thing he had done in his life, the second most being skydiving.
In his capacity as racing advisor to Ferrari, Schumacher was present in Budapest for the Hungarian Grand Prix when Ferrari driver Felipe Massa was seriously injured after being struck by a suspension spring during qualifying. As it became clear that Massa would be unable to compete in the next race at Valencia, Schumacher was chosen as his replacement, and on 29 July 2009 Ferrari announced that they planned to draft in Schumacher for the European Grand Prix and subsequent Grands Prix until Massa was able to race again. Schumacher tested in a modified F2007 to prepare himself, as testing restrictions prevented him from driving the 2009 car. Ferrari applied for special permission for Schumacher to test in a 2009-specification car, but Williams, Red Bull and Toro Rosso opposed the test. In the end, Schumacher was forced to call off his return because of the severity of a neck injury he had sustained in a motorcycle accident earlier in the year. Massa's place at Ferrari was instead filled by Luca Badoer and then Giancarlo Fisichella.
The Ferrari Museum in Maranello, Italy, stated it was planning an exhibition that would open on his birthday and run for several months, "both as a celebration and a mark of gratitude to the most successful Prancing Horse driver ever".
In December 2009 it was announced that Schumacher would return to Formula One for the 2010 season alongside fellow German driver Nico Rosberg in the new Mercedes GP team, the manufacturer's first majority involvement in a Formula One team since 1955. Schumacher stated that his preparations to replace the injured Massa at Ferrari had rekindled his interest in Formula One which, combined with the opportunity to fulfil a long-held ambition to drive for Mercedes and to work again with team principal Ross Brawn, led him to accept the offer once he was passed fit. After a period of intensive training and medical tests, it was confirmed that the neck injury that had prevented him from driving for Ferrari the year before had fully healed. Schumacher signed a three-year contract, reportedly worth £20 million.
Schumacher's surprise return to F1 was compared to Niki Lauda's in 1982 at age 33 and Nigel Mansell's return in 1994 at age 41. Schumacher turned 41 in January 2010 and his prospects with Mercedes were compared with the record set by the oldest F1 champion Juan Manuel Fangio who was 46 when he won his fifth championship.
Schumacher's first drive of the 2010 Mercedes car, the Mercedes MGP W01, came at an official test in February 2010 in Valencia. He finished sixth in the first race of the season, the Bahrain Grand Prix. After the Malaysian race, former driver Stirling Moss suggested that Schumacher, who had finished behind his teammate in each of the first four qualifying sessions and races, might be "past it". Many other respected former Formula One drivers thought otherwise, including former rival Damon Hill, who warned that "you should never write Schumacher off". GrandPrix.com identified the inherent understeer of the Mercedes car, exacerbated by the narrower front tyres introduced for the 2010 season, as contributing to Schumacher's difficulties. Jenson Button later claimed that Mercedes's 2010 car had been designed for him, and that the two drivers' differing styles may have contributed to Schumacher's difficulties.
Mercedes upgraded the car for the Spanish Grand Prix, where Schumacher finished fourth. At the Monaco Grand Prix Schumacher finished sixth after passing Ferrari's Fernando Alonso on the final corner as the safety car returned to the pits. However, the race stewards penalised him 20 seconds after the race, dropping him to 12th, having judged the pass to be in breach of the FIA's sporting code. Mercedes's differing interpretation of the regulation later led to it being clarified by the FIA.
In Turkey, Schumacher qualified fifth and finished fourth, both his best results since his return. At the European Grand Prix in Valencia he finished 15th, the lowest recorded finish of his career. In Hungary, Schumacher finished outside the points in 11th and was found guilty of dangerous driving while unsuccessfully defending tenth position against Rubens Barrichello. As a result, he was demoted ten places on the grid for the following race, the Belgian Grand Prix, where he finished seventh despite starting 21st after his grid penalty.
At the season finale in Abu Dhabi, Schumacher was involved in a major accident on the first lap, which occurred after a spin. In recovering from the incident Vitantonio Liuzzi's car collided with Schumacher, barely missing his head. Nobody was hurt in the crash, but Schumacher said the crash had been "frightening".
He finished the season ninth with 72 points. The only previous season in which he had finished without a win, pole position, podium or fastest lap was his début season in 1991.
Schumacher scored his first points of 2011 in Malaysia; he later came sixth in Spain and had a strong race at the Canadian Grand Prix, finishing fourth after running as high as second in wet conditions before being passed late in the race by eventual winner Jenson Button.
Schumacher clashed with Vitaly Petrov in Valencia and with Kamui Kobayashi in Britain, and marked the 20th anniversary of his Formula One début at the Belgian Grand Prix. Despite starting last in Belgium, Schumacher raced well and finished fifth. He again raced well in Italy, duelling with Lewis Hamilton for fourth place. Later in the season Schumacher led three laps of a race, the first time he had led a race since 2006; in doing so, he became the oldest driver to lead a race since Jack Brabham in 1970.
Schumacher started well in one of the season's closing races and finished fifth after overtaking Rosberg at the end; he diced with Rosberg again at another round, battling over sixth position on the first lap. Schumacher finished the season eighth in the Drivers' Championship, with 76 points.
Schumacher was again partnered by Rosberg at Mercedes for the 2012 season. He retired from the season-opening Australian Grand Prix and scored a point in the second round in Malaysia. In China, Schumacher started on the front row alongside pole-sitter Rosberg, but retired with a loose wheel after a mechanic's error during a pit stop.
After causing a collision with Bruno Senna in Spain, Schumacher received a five-place grid penalty for the Monaco Grand Prix. Schumacher was fastest in qualifying in Monaco, but started sixth owing to his penalty. He later retired from seventh place in the race.
Schumacher finished third in one race that season, his only podium finish since his return to F1 with Mercedes. At the age of 43 years and 173 days, he became the oldest driver to achieve a podium since Jack Brabham's second-place finish at the 1970 British Grand Prix. Further records were set by Schumacher in Germany, where he set the fastest lap in a Grand Prix for the 77th time in his career, and in Belgium, where he became the second driver in history to race in 300 Grands Prix.
Schumacher's indecision over his future plans in F1 led to him being replaced by Lewis Hamilton at Mercedes for the 2013 season. In October 2012, Schumacher announced he would retire for a second time at the conclusion of the season. The following week he was quoted as saying: "There were times in the past few months in which I didn't want to deal with Formula One or prepare for the next Grand Prix." The season and his 21-year F1 career concluded with the 2012 Brazilian Grand Prix, in which Schumacher finished seventh. He placed 13th in the 2012 Drivers' Championship.
Schumacher, in conjunction with Schuberth, helped develop the first lightweight carbon helmet. In 2004, a prototype was publicly tested by being driven over by a tank; it survived intact. The helmet keeps the driver cool by funneling directed airflow through fifty holes. Schumacher's original helmet sported the colours of the German flag and his sponsor's decals. On the top was a blue circle with white asteroids. From the 2000 Monaco Grand Prix, in order to differentiate his colours from those of his new teammate Rubens Barrichello (whose helmet was predominantly white with a blue circle on top and a red ellipse surrounding the visor), Schumacher changed the upper blue colour and some of the white areas to red. For the 2006 Brazilian Grand Prix (at the time intended to be his final Grand Prix), he wore an all-red helmet that included the names of his ninety-one Grand Prix victories. For the 2011 Belgian Grand Prix, Schumacher's 20th anniversary in Formula One, he wore a commemorative gold-leafed helmet. The helmet, very similar to his regular helmet, included the year of his début alongside the present year, and the years of his seven World titles. For the 2012 Belgian Grand Prix, Schumacher's 300th Grand Prix appearance, he wore a special platinum-leafed helmet with a message marking his achievement.
Schumacher was honoured many times during his career. In April 2002, for his contributions to sport and to raising awareness of child education, he was named one of the UNESCO Champions for Sport, joining the other eight, who include Pelé, Sergey Bubka and Justine Henin. He won the Laureus World Sportsman of the Year award twice, in 2002 and 2004, for his performances in the preceding seasons. He also received nominations for the 2001, 2003, 2005 and 2007 awards. He shares the record for the second-most nominations for the award with Roger Federer, at six, eclipsed only by Tiger Woods, who has been nominated seven times. He holds the distinction of having the most nominations for a motorsport athlete (Fernando Alonso has been nominated only twice, Sebastian Vettel three times, and Valentino Rossi five times) and of being the only motorsport athlete to have won the award more than once.
In honour of Schumacher's racing career and his efforts to improve safety and the sport, he was awarded an FIA Gold Medal for Motor Sport in 2006. In 2007, in recognition of his contribution to Formula One racing, the Nürburgring racing track renamed turns 8 and 9 (the Audi and Shell Kurves) as the "Schumacher S", and a month later he presented A1 Team Germany with the A1 World Cup at the A1GP World Cup of Motorsport 2007 awards ceremony. He was nominated for the Prince of Asturias Award for Sport for 2007, which he won both for sporting prowess and for his humanitarian record.
In 2008, the Swiss Football Association appointed long-time Swiss resident Schumacher as the country's ambassador for UEFA Euro 2008, hosted by Switzerland and Austria.
On 30 April 2010, Schumacher was honoured with the title of Officier of the Légion d'honneur by French prime minister François Fillon.
On 13 November 2014, Schumacher was awarded the Millennium Trophy at the Bambi Awards.
Going into the 1994 Australian Grand Prix, the final race of the 1994 season, Schumacher led Damon Hill by a single point in the Drivers' Championship. Schumacher led the race from the beginning, but on lap 35 he went off track and hit the wall with his right side wheels, returning to the track at reduced speed, and with car damage, but still leading the race. At the next corner Hill attempted to pass on the inside, but Schumacher turned in sharply and they collided. Both cars were eliminated from the race and, as neither driver scored, Schumacher took the title. The race stewards judged it a racing accident and took no action against either driver, but public opinion is divided over the incident, and Schumacher was vilified in the British media.
At the 1997 European Grand Prix at Jerez, the last race of the season, Schumacher led Williams's Jacques Villeneuve by one point in the Drivers' Championship. As Villeneuve attempted to pass Schumacher at the Dry Sack corner on lap 48, Schumacher turned in and the right-front wheel of Schumacher's Ferrari hit the left sidepod of Villeneuve's car. Schumacher retired from the race as a result, but Villeneuve finished in third place, taking four points and so becoming the World Champion. The race stewards did not initially award any penalty, but two weeks after the race Schumacher was disqualified from the entire 1997 Drivers' Championship after an FIA disciplinary hearing found that his "manoeuvre was an instinctive reaction and although deliberate not made with malice or premeditation. It was a serious error." Schumacher accepted the decision and admitted having made a mistake. Schumacher's actions were widely condemned in British, German, and Italian newspapers. This made Schumacher the only driver in the history of the sport, as of 2019, to be disqualified from a Drivers' World Championship.
Historically, team orders have always been an accepted part of Formula One. However, in the final metres of the 2002 Austrian Grand Prix, Schumacher's teammate, Rubens Barrichello, slowed his car under orders from Ferrari to allow Schumacher to pass and win the race. Although the switching of positions did not break any actual sporting or technical regulation, it angered fans and it was claimed that the team's actions showed a lack of sportsmanship and respect to the spectators. Many argued that Schumacher did not need to be "given" wins in only the sixth race of the season, particularly given that he had already won four of the previous five Grands Prix, and that Barrichello had dominated the race weekend up to that point. At the podium ceremony, Schumacher pushed Barrichello onto the top step, and for this disturbance, the Ferrari team incurred a US$1 million fine. Later in the season at the end of the 2002 United States Grand Prix, Schumacher slowed down within sight of the finishing line, allowing Barrichello to win by 0.011 seconds, the second-closest margin in F1 history. Schumacher's explanation varied between it being him "returning the favour" for Austria (now that Schumacher's title was secure), or trying to engineer a dead-heat (a feat derided as near-impossible in a sport where timings are taken to within a thousandth of a second). The FIA subsequently banned "team orders which interfere with the race result", but the ban was lifted for the 2011 season because the ruling was difficult to enforce.
During his spell in Sauber, in the 1991 Sportscar World Championship, Schumacher was involved in a serious incident with Derek Warwick in that year's 430 km of Nürburgring. While trying to set his flying lap in qualifying, Schumacher encountered Warwick's Jaguar on a slow lap resulting in lost time for Schumacher. As retaliation for being in his way, Schumacher swerved the Sauber into Warwick's car, hitting the Jaguar's nose and front wheel. Enraged by the German's attitude, Warwick drove to the pits and chased a fleeing Schumacher on foot through the Sauber pits. He eventually caught up with Schumacher, and it took intervention from Jochen Mass to prevent Warwick physically assaulting Schumacher.
Toward the end of the 2010 Hungarian Grand Prix, Rubens Barrichello attempted to pass Schumacher down the inside on the main straight. Schumacher closed the inside line to force Barrichello onto the outside, but Barrichello persisted on the inside, despite the close proximity of a concrete wall and Schumacher leaving him only inches to spare. Barrichello said "It is the most dangerous thing that I have been through", and "There is not a rule for that, but between ourselves we should take a line, stick to it and that's it." Schumacher said that "Obviously there was space enough to go through. We didn't touch, so I guess I just left enough space for him to come through." Ross Brawn said "at the end of the day he gave him enough space. You can argue that it was marginal, but it was just tough – tough racing." A range of ex-drivers and commentators were highly critical of Schumacher. Although there was no accident, the race steward, the same Derek Warwick of the 1991 Nürburgring incident, wanted to black flag Schumacher since that "would have shown a better example to our young drivers". The Hungaroring incident was ruled to be dangerous and Schumacher received a 10-place grid penalty for the next race. Schumacher accepted the decision, and apologised.
In 1994, suspicion of foul play by the Benetton team (who were eventually found to have been responsible for some technical violations over the course of the season) was said to have troubled Ayrton Senna that season. For example, in the words of his then teammate, Damon Hill, Senna had chosen to stay at the first corner of the Aida circuit following his retirement from the Pacific Grand Prix. After listening to Schumacher's Benetton B194 as it went past, Senna "concluded that there was, what he regarded, as unusual noises from the engine". The FIA subsequently issued a press release setting out action that it required teams to take before the German Grand Prix, given that various cars were found to have advanced engine management systems emulating launch and traction control.
In 1995, Schumacher and Williams driver David Coulthard were disqualified for fuel irregularities, after a switch to Renault engines and Elf oils. On appeal, both drivers had their results and points reinstated, but both teams lost the points the results would normally have earned in the Constructors' Championship.
The 1998 Canadian Grand Prix saw Schumacher accused of dangerous driving when his exit from the pit-lane forced Heinz-Harald Frentzen off the track and into retirement. Despite receiving a 10-second penalty, Schumacher recovered and won the race.
Two laps from the finish of the 1998 British Grand Prix, Schumacher was leading the race when he was issued a stop-and-go penalty for overtaking a lapped car (Alexander Wurz) during the early moments of a Safety Car period. This penalty involves going into the pit lane and stopping for 10 seconds, and the rules state that a driver must serve his penalty within three laps of the penalty being issued. On the third lap after receiving the penalty, Schumacher turned into the pit lane to serve his penalty, but as this was the last lap of the race, and as Ferrari's pit box was located after the start/finish line, Schumacher technically finished the race before serving the penalty. The stewards initially resolved that problem by adding 10 seconds to Schumacher's race time, then later rescinded the penalty completely due to the irregularities in how the penalty had been issued.
During qualifying for the 2006 Monaco Grand Prix, Schumacher set the fastest time, but his car stopped in the Rascasse corner on the racing line, leaving the corner partially blocked, while his main contender for the season title, Fernando Alonso, was on his final qualifying lap. Schumacher stated that he simply locked up the wheels going into the corner and that the car then stalled while he attempted to reverse out. Alonso believed he would have been on pole if the incident had not happened, and Schumacher was stripped of pole position by the race stewards and started the race at the back of the grid. In the same qualifying session, Giancarlo Fisichella was similarly found to have blocked David Coulthard from improving his time, but Fisichella was only demoted five places on the grid.
At the 2010 Monaco Grand Prix, the safety car was deployed after an accident involving Karun Chandhok and Jarno Trulli, and pulled into the pits on the last lap. Schumacher passed Alonso before the finish line. Mercedes held that "the combination of the race control messages 'Safety Car in this lap' and 'Track Clear' and the green flags and lights shown by the marshals after safety car line one indicated that the race was not finishing under the safety car and all drivers were free to race." However, an FIA investigation found Schumacher guilty of breaching Safety Car regulations and awarded him a 20-second penalty, which cost him six places.
Schumacher's younger brother Ralf is also a racing driver. He competed in Formula One for ten years, starting from 1997 until the end of 2007. Their step-brother Sebastian Stahl has also been a racing driver.
In August 1995, Michael married Corinna Betsch. They have two children, a daughter Gina-Marie, born 20 February 1997, and a son Mick, born 22 March 1999. He has always been very protective of his private life and is known to dislike the celebrity spotlight. The family moved to a newly built mansion near Gland, Switzerland in 2007, with a private beach on Lake Geneva and featuring an underground garage and petrol station with a vintage Shell fuel pump. On 19 January 2019, Michael's son Mick Schumacher was announced as a driver for the Ferrari Driver Academy.
Schumacher and his wife own horse ranches in Texas and Switzerland.
The family has two dogs – one stray that Corinna fell in love with in Brazil, and an Australian Shepherd named "Ed" whose arrival in the family made headlines. In fact, in 2007, Schumacher personally drove a taxi through the Bavarian town of Coburg to collect the dog and enable the family to make their return flight to Switzerland. Both Schumacher and the taxi driver were reprimanded by local police.
One of his main hobbies was horse riding, and he played football for his local team FC Echichens. He has appeared in several charity football games and organised games between Formula One drivers. He is a supporter of 1. FC Köln, his local football club where he grew up, citing Pierre Littbarski and Harald Schumacher as his idols.
On 23 June 2003, Schumacher was appointed as an Ambassador-at-Large for the Most Serene Republic of San Marino.
Schumacher is a special ambassador to UNESCO and has donated 1.5 million euros to the organisation. Additionally, he paid for the construction of a school for poor children and for area improvements in Dakar, Senegal. He supports a hospital for child victims of war in Sarajevo, which specialises in caring for amputees. In Lima, Peru he funded the "Palace for the Poor", a centre for helping homeless street children obtain an education, clothing, food, medical attention, and shelter. He stated his interest in these various efforts was piqued both by his love for children and the fact that these causes had received little attention. While an exact figure for the amount of money he has donated throughout his life is unknown, it is known that in his last four years as a driver, he donated at least $50 million. In 2008, it was revealed that he had donated between $5M and $10M to the Clinton Foundation.
Since his participation in an FIA European road safety campaign, as part of his punishment after the collision at the 1997 European Grand Prix, Schumacher has continued to support other campaigns, such as Make Roads Safe, which is led by the FIA Foundation and calls on G8 countries and the UN to recognise global road deaths as a major global health issue. In 2008, Schumacher was the figurehead of an advertising campaign by Bacardi to raise awareness about responsible drinking, with a focus on communicating an international message that 'drinking and driving don't mix'. He featured in an advertising campaign for television, cinema and online media, supported by consumer engagements, public relations and digital media across the world.
On the eve of the 2002 British Grand Prix, on behalf of Fiat, Schumacher presented a Ferrari 360 Modena to the Indian cricketer Sachin Tendulkar at Silverstone.
On 21 June 2009, Schumacher appeared on the BBC's motoring programme "Top Gear" as The Stig. Presenter Jeremy Clarkson hinted later in the programme that Schumacher was not the regular Stig, which the BBC subsequently confirmed. Schumacher was there on that occasion because Ferrari would not allow anyone else to drive the unique black Ferrari FXX that was featured in the show.
During his interview with Clarkson, Schumacher stated that his road cars are a Fiat 500 Abarth and a Fiat Croma, which is his family car.
In 2004 "Forbes" magazine listed him as the second highest paid athlete in the world. In 2005, "Eurobusiness" magazine identified Schumacher as the world's first billionaire athlete. His 2004 salary was reported to be around US$80 million. "Forbes" magazine ranked him 17th in its "The World's Most Powerful Celebrities" list. A significant share of his income came from advertising. For example, Deutsche Vermögensberatung paid him $8 million over three years from 1999 for wearing a 10 by 8 centimetre advertisement on his post-race cap. The deal was extended until 2010. He donated $10 million for aid after the 2004 Indian Ocean earthquake. His donation surpassed that of any other sports person, most sports leagues, many worldwide corporations and even some countries.
In 2010, his personal fortune was estimated at £515 million.
On 29 December 2013, Schumacher was skiing with his 14-year-old son Mick, descending the Combe de Saulire below the Dent de Burgin above Méribel in the French Alps. While crossing an unsecured off-piste area between Piste Chamois and Piste Mauduit, he fell and hit his head on a rock, sustaining a serious head injury, despite wearing a ski helmet. According to his physicians, he would most likely have died if he had not been wearing a helmet. He was airlifted to Grenoble Hospital where he underwent two surgical interventions.
Schumacher was put into a medically induced coma because of traumatic brain injury; his doctors reported on 7 March 2014 that his condition was stable. On 4 April 2014, Schumacher's agent reported that he was showing "moments of consciousness" as he was gradually withdrawn from the medically induced coma, adding to reports by relatives of "small encouraging signs" over the preceding month.
In mid-June 2014, he was moved from intensive care into a rehabilitation ward. By 16 June 2014, Schumacher had regained consciousness and left Grenoble Hospital for further rehabilitation at the University Hospital (CHUV) in Lausanne, Switzerland. On 9 September 2014, Schumacher left CHUV and was brought back to his home for further rehabilitation. In November 2014, it was reported that Schumacher was "paralysed and in a wheelchair"; he "cannot speak and has memory problems". In a video interview released in May 2015, Schumacher's manager Sabine Kehm said that his condition is slowly improving "considering the severeness of the injury he had".
In September 2016, Felix Damm, lawyer for Schumacher, told a German court that his client "cannot walk", in response to false reports from December 2015 in German publication "Die Bunte" that he could "walk a couple of steps". In December 2016 Schumacher's manager stated that "Michael's health is not a public issue, and so we will continue to make no comment in that regard".
In July 2019, former Ferrari manager Jean Todt gave an interview to Radio Monte Carlo giving a brief update on Schumacher's health, saying that Schumacher was making "good progress" but also "struggles to communicate". Todt also said that Schumacher is able to watch Formula One races on television at his home in Switzerland.
In September 2019, "Le Parisien" reported that Schumacher had been admitted to the Hôpital Européen Georges-Pompidou in Paris for treatment by cardiovascular surgeon Philippe Menasché, described as a "pioneer in cell surgery". Following the treatment, which involved him receiving an anti-inflammatory stem cell perfusion, medical staff stated that Michael Schumacher was "conscious".
Schumacher was disqualified from the 1997 World Drivers' Championship due to dangerous driving in the European Grand Prix, where he caused an avoidable accident with Jacques Villeneuve. His points tally would have placed him in second place in that year's standings.
Schumacher holds the following records in Formula One:
Footnotes
Schumacher had a voice role in the Disney/Pixar film "Cars". His character is himself as a Ferrari F430 who visits Radiator Springs to get new tires from Luigi and Guido at the recommendation of Lightning McQueen. During arrival, Luigi and Guido both faint in excitement when they see him. The French film "Asterix and Obelix at the Olympic Games" features Schumacher in a cameo role as a chariot driver called Schumix.
All race and championship results (1991–2006) are taken from the official Formula 1 website (Formula1.com).
Muonium
Muonium is an exotic atom made up of an antimuon and an electron, which was discovered in 1960 by Vernon W. Hughes
and is given the chemical symbol Mu. During the muon's lifetime, muonium can enter into compounds such as muonium chloride (MuCl) or sodium muonide (NaMu). Due to the mass difference between the antimuon and the electron, muonium () is more similar to atomic hydrogen () than positronium (). Its Bohr radius and ionization energy are within 0.5% of hydrogen, deuterium, and tritium, and thus it can usefully be considered as an exotic light isotope of hydrogen.
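The "within 0.5%" figure follows from the reduced-mass correction to the Bohr model: the ionization energy scales linearly with the reduced mass of the electron–nucleus system, and the Bohr radius scales inversely with it. A minimal sketch in Python, using rounded CODATA mass ratios, makes the comparison concrete:

```python
# Reduced-mass comparison of muonium and hydrogen.
# Masses are in units of the electron mass (rounded CODATA values).
M_MU = 206.768   # antimuon / electron mass ratio
M_P = 1836.153   # proton / electron mass ratio

def reduced_mass(nucleus_mass):
    """Reduced mass (in electron masses) of an electron bound to a
    nucleus of the given mass (also in electron masses)."""
    return nucleus_mass / (nucleus_mass + 1.0)

# Ionization energy scales with the reduced mass; the Bohr radius
# scales inversely with it.
ratio = reduced_mass(M_MU) / reduced_mass(M_P)
print(f"E_ion(Mu) / E_ion(H) = {ratio:.5f}")  # about 0.9957
```

Muonium's ionization energy comes out roughly 0.4% below hydrogen's, and its Bohr radius roughly 0.4% larger, which is why it can be treated chemically as an ultralight hydrogen isotope.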
Although muonium is short-lived, physical chemists study it using muon spin spectroscopy (μSR), a magnetic resonance technique analogous to nuclear magnetic resonance (NMR) or electron spin resonance (ESR) spectroscopy. Like ESR, μSR is useful for the analysis of chemical transformations and the structure of compounds with novel or potentially valuable electronic properties. Muonium is usually studied by muon spin rotation, in which the Mu atom's spin precesses in a magnetic field applied transverse to the muon spin direction (since muons are typically produced in a spin-polarized state from the decay of pions), and by avoided level crossing (ALC), which is also called level crossing resonance (LCR). The latter employs a magnetic field applied longitudinally to the polarization direction, and monitors the relaxation of muon spins caused by "flip/flop" transitions with other magnetic nuclei.
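The two signals seen in transverse-field μSR can be illustrated with a back-of-the-envelope frequency estimate. A free (diamagnetic) muon precesses at γ<sub>μ</sub>B, while low-field triplet muonium precesses roughly 100 times faster, at approximately (γ<sub>e</sub> − γ<sub>μ</sub>)B/2, because the muon spin is hyperfine-coupled to the much larger electron moment. A sketch, assuming standard rounded gyromagnetic ratios and the low-field approximation:

```python
# Transverse-field precession frequencies in muon spin rotation (rough estimate).
GAMMA_MU = 135.54    # free-muon gyromagnetic ratio, MHz per tesla
GAMMA_E = 28024.95   # free-electron gyromagnetic ratio, MHz per tesla

B = 0.001  # applied transverse field in tesla (1 mT = 10 G)

f_free_muon = GAMMA_MU * B                 # diamagnetic muon signal
f_muonium = (GAMMA_E - GAMMA_MU) / 2 * B   # low-field triplet-muonium signal

print(f"free muon: {f_free_muon:.3f} MHz, triplet muonium: {f_muonium:.2f} MHz")
```

The roughly hundredfold difference in frequency is what lets experimenters distinguish muonium formation from muons stopping in diamagnetic environments.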
Because the muon is a lepton, the atomic energy levels of muonium can be calculated with great precision from quantum electrodynamics (QED), unlike in the case of hydrogen, where the precision is limited by uncertainties related to the internal structure of the proton. For this reason, muonium is an ideal system for studying bound-state QED and also for searching for physics beyond the standard model.
Normally in the nomenclature of particle physics, an atom composed of a positively charged particle bound to an electron is named after the positive particle with "-ium" appended, in this case "muium". The suffix "-onium" is mostly used for bound states of a particle with its own antiparticle. The exotic atom consisting of a muon and an antimuon is known as "true muonium". It is yet to be observed, but it may have been generated in the collision of electron and positron beams.
Medicine man
A medicine man or medicine woman is a traditional healer and spiritual leader who serves a community of indigenous people of the Americas. Individual cultures have their own names, in their respective Indigenous languages, for the spiritual healers and ceremonial leaders in their particular cultures.
In the ceremonial context of Indigenous North American communities, "medicine" usually refers to "spiritual" healing. Medicine men/women should not be confused with those who employ Native American ethnobotany, a practice that is very common in a large number of Native American and First Nations households.
The terms "medicine people" or "ceremonial people" are sometimes used in Native American and First Nations communities, for example, when Arwen Nuttall (Cherokee) of the National Museum of the American Indian writes, "The knowledge possessed by medicine people is privileged, and it often remains in particular families."
Native Americans tend to be quite reluctant to discuss issues about medicine or medicine people with non-Indians. In some cultures, the people will not even discuss these matters with Indians from other tribes. In most tribes, medicine elders are prohibited from advertising or introducing themselves as such. As Nuttall writes, "An inquiry to a Native person about religious beliefs or ceremonies is often viewed with suspicion." One example of this is the Apache medicine cord or "Izze-kloth" whose purpose and use by Apache medicine elders was a mystery to nineteenth century ethnologists because "the Apache look upon these cords as so sacred that strangers are not allowed to see them, much less handle them or talk about them."
The 1954 version of "Webster's New World Dictionary of the American Language" reflects the poorly-grounded perceptions of the people whose use of the term effectively defined it for the people of that time: "a man supposed to have supernatural powers of curing disease and controlling spirits." In effect, such definitions were not explanations of what these "medicine people" are to their own communities but instead reported on the consensus of socially and psychologically remote observers when they tried to categorize the individuals. The term "medicine man/woman," like the term "shaman," has been criticized by Native Americans, as well as other specialists in the fields of religion and anthropology.
While non-Native anthropologists sometimes use the term "shaman" for Indigenous healers worldwide, including the Americas, "shaman" is the specific name for a spiritual mediator from the Tungusic peoples of Siberia and is not used in Native American or First Nations communities.
The term "medicine man/woman" has also frequently been used by Europeans to refer to African traditional healers, along with the offensive term "witch doctors".
Cherokee spiritual, ceremonial and healing knowledge has been passed down for thousands of years. The Cherokee people were among the first Native Americans to formalize a written language. Some of the information in the Cherokee ledgers is written in code to prevent other people from trying to misuse or exploit their medicine ways. As in all Native American cultures, Cherokee medicine people had to practice in secret from the post-contact era until 1978, when the American Indian Religious Freedom Act was passed.
Training a Cherokee medicine person takes many years due to the vast amount of knowledge needed to practice. Modern-day Cherokee medicine people must be born and raised in the Cherokee community and culture, and raised with the language. The skills of gifted and well-trained medicine people are still very important to the Cherokee people, though genocide and oppression have resulted in there being fewer now than pre-contact.
There are many fraudulent healers and scam artists who pose as Cherokee "shamans", and the Cherokee Nation has had to speak out against these people, even forming a task force to handle the issue. In order to seek help from a Cherokee medicine person a person needs to know someone in the community who can vouch for them and provide a referral. Usually one makes contact through a relative who knows the healer.
Miles Davis
Miles Dewey Davis III (May 26, 1926September 28, 1991) was an American jazz trumpeter, bandleader, and composer. He is among the most influential and acclaimed figures in the history of jazz and 20th-century music. Davis adopted a variety of musical directions in a five-decade career that kept him at the forefront of many major stylistic developments in jazz.
Born in Alton, Illinois, and raised in East St. Louis, Davis left to study at the Juilliard School in New York City, before dropping out and making his professional debut as a member of saxophonist Charlie Parker's bebop quintet from 1944 to 1948. Shortly after, he recorded the "Birth of the Cool" sessions for Capitol Records, which were instrumental to the development of cool jazz. In the early 1950s, Miles Davis recorded some of the earliest hard bop music while on Prestige Records but did so haphazardly due to a heroin addiction. After a widely acclaimed comeback performance at the Newport Jazz Festival in 1955, he signed a long-term contract with Columbia Records and recorded the 1957 album "'Round About Midnight". It was his first work with saxophonist John Coltrane and bassist Paul Chambers, key members of the sextet he led into the early 1960s. During this period, he alternated between orchestral jazz collaborations with arranger Gil Evans, such as the Spanish-influenced "Sketches of Spain" (1960), and band recordings, such as "Milestones" (1958) and "Kind of Blue" (1959). The latter recording remains one of the most popular jazz albums of all time, having sold over five million copies in the U.S.
Davis made several lineup changes while recording "Someday My Prince Will Come" (1961), his 1961 Blackhawk concerts, and "Seven Steps to Heaven" (1963), another mainstream success that introduced bassist Ron Carter, pianist Herbie Hancock, and drummer Tony Williams. After adding saxophonist Wayne Shorter to his new quintet in 1964, Davis led them on a series of more abstract recordings often composed by the band members, helping pioneer the post-bop genre with albums such as "E.S.P." (1965) and "Miles Smiles" (1967), before transitioning into his electric period. During the 1970s, he experimented with rock, funk, African rhythms, emerging electronic music technology, and an ever-changing line-up of musicians, including keyboardist Joe Zawinul, drummer Al Foster, and guitarist John McLaughlin. This period, beginning with Davis' 1969 studio album "In a Silent Way" and concluding with the 1975 concert recording "Agharta", was the most controversial in his career, alienating and challenging many in jazz. His million-selling 1970 record "Bitches Brew" helped spark a resurgence in the genre's commercial popularity with jazz fusion as the decade progressed.
After a five-year retirement due to poor health, Davis resumed his career in the 1980s, employing younger musicians and pop sounds on albums such as "The Man with the Horn" (1981) and "Tutu" (1986). Critics were generally unreceptive but the decade garnered the trumpeter his highest level of commercial recognition. He performed sold-out concerts worldwide, while branching out into visual arts, film, and television work, before his death in 1991 from the combined effects of a stroke, pneumonia and respiratory failure. In 2006, Davis was inducted into the Rock and Roll Hall of Fame, which recognized him as "one of the key figures in the history of jazz". "Rolling Stone" described him as "the most revered jazz trumpeter of all time, not to mention one of the most important musicians of the 20th century," while Gerald Early called him inarguably one of the most influential and innovative musicians of that period.
Miles Dewey Davis III was born on May 26, 1926, to an affluent African-American family in Alton, Illinois, north of St. Louis. He had an older sister, Dorothy Mae (born 1925), and a younger brother, Vernon (born 1929). His mother, Cleota Mae Henry of Arkansas, was a music teacher and violinist, and his father, Miles Dewey Davis Jr., also of Arkansas, was a dentist. They owned an estate near Pine Bluff, Arkansas, with a profitable pig farm. In Pine Bluff, he and his siblings fished, hunted, and rode horses. Davis' grandparents were the owners of an Arkansas farm where he would spend many summers.
In 1927, the family moved to East St. Louis, Illinois. They lived on the second floor of a commercial building behind a dental office in a predominantly white neighborhood. Davis' father soon became distant to his children as the Great Depression caused him to become increasingly consumed by his job, typically working six days a week. From 1932 to 1934, Davis attended John Robinson Elementary School, an all-black school, then Crispus Attucks, where he performed well in mathematics, music, and sports. Davis had previously attended Catholic school, in keeping with his religious upbringing. At an early age he liked music, especially blues, big bands, and gospel.
In 1935, Davis received his first trumpet as a gift from John Eubanks, a friend of his father. He took lessons from "the biggest influence on my life," Elwood Buchanan, a teacher and musician who was a patient of his father. His mother wanted him to play the violin instead. Against the fashion of the time, Buchanan stressed the importance of playing without vibrato and encouraged him to use a clear, mid-range tone. Davis said that whenever he started playing with heavy vibrato, Buchanan slapped his knuckles. In later years Davis said, "I prefer a round sound with no attitude in it, like a round voice with not too much tremolo and not too much bass. Just right in the middle. If I can't get that sound I can't play anything." The family soon moved to 1701 Kansas Avenue in East St. Louis.
According to Davis, "By the age of 12, music had become the most important thing in my life." On his thirteenth birthday his father bought him a new trumpet, and Davis began to play in local bands. He took additional trumpet lessons from Joseph Gustat, principal trumpeter of the St. Louis Symphony Orchestra. Davis also played the trumpet in talent shows that he and his siblings put on.
In 1941, the 15-year-old attended East St. Louis Lincoln High School, where he joined the marching band directed by Buchanan and entered music competitions. Years later, Davis said that he was discriminated against in these competitions due to his race, but he added that these experiences made him a better musician. When a drummer asked him to play a certain passage of music and he couldn't do it, he began to learn music theory. "I went and got everything, every book I could get to learn about theory." At Lincoln, Davis met his first girlfriend, Irene Birth (later Cawthon). He had a band that performed at the Elks Club. Part of his earnings paid for his sister's education at Fisk University. Davis befriended trumpeter Clark Terry, who suggested he play without vibrato, and performed with him for several years.
With encouragement from his teacher and girlfriend, Davis filled a vacant spot in the Rhumboogie Orchestra, also known as the Blue Devils, led by Eddie Randle. He became the band's musical director, which involved hiring musicians and scheduling rehearsals. Years later, Davis considered this job one of the most important of his career. Sonny Stitt tried to persuade him to join the Tiny Bradshaw band, which was passing through town, but his mother insisted he finish high school before going on tour. He said later, "I didn't talk to her for two weeks. And I didn't go with the band either." In January 1944, Davis finished high school and graduated in absentia in June. During the next month, his girlfriend gave birth to a daughter, Cheryl.
In July 1944, Billy Eckstine visited St. Louis with a band that included Art Blakey, Dizzy Gillespie, and Charlie Parker. Trumpeter Buddy Anderson was too sick to perform, so Davis was invited to join. He played with the band for two weeks at Club Riviera. After playing with these musicians, he was certain he should move to New York City, "where the action was". His mother wanted him to go to Fisk University, like his sister, and study piano or violin. Davis had other interests.
In September 1944, Davis accepted his father's idea of studying at the Institute of Musical Arts, later known as the Juilliard School, in New York City. After passing the audition, he attended classes in music theory, piano, and dictation, though he frequently skipped them.
Much of Davis' time was spent in clubs looking for his idol, Charlie Parker. According to Davis, Coleman Hawkins told him "finish your studies at Juilliard and forget Bird". After finding Parker, he became one of a cadre of regulars at Minton's and Monroe's in Harlem who held jam sessions every night. The other regulars included J. J. Johnson, Kenny Clarke, Thelonious Monk, Fats Navarro, and Freddie Webster. Davis reunited with Cawthon and their daughter when they moved to New York City. Parker became a roommate. Around this time Davis was paid an allowance of $40 ($582 by 2020).
In mid-1945, Davis failed to register for the year's autumn term at Juilliard and dropped out after three semesters because he wanted to perform full-time. Years later he criticized Juilliard for concentrating too much on classical European and "white" repertoire, but he praised the school for teaching him music theory and improving his trumpet technique.
He began performing at clubs on 52nd Street with Coleman Hawkins and Eddie "Lockjaw" Davis. He recorded for the first time on April 24, 1945, when he entered the studio as a sideman for Herbie Fields's band. During the next year, he recorded as a leader for the first time with the Miles Davis Sextet plus Earl Coleman and Ann Hathaway, one of the few times he accompanied a singer.
In 1945, he replaced Dizzy Gillespie in Charlie Parker's quintet. On November 26, Davis participated in several recording sessions as part of Parker's group Reboppers that also involved Gillespie and Max Roach, displaying hints of the style he would become known for. In Parker's tune "Now's the Time", Davis played a solo that anticipated cool jazz. He then joined a big band led by Benny Carter, performing in St. Louis and remaining with the band in California. He again played with Parker and Gillespie. In Los Angeles, Parker had a nervous breakdown that put him in the hospital for several months. In March 1946, Davis played in studio sessions with Parker and began a collaboration with bassist Charles Mingus that summer. Cawthon gave birth to Davis's second child, Gregory, in East St. Louis before reuniting with Davis in New York City the following year. Davis noted that by this time, "I was still so much into the music that I was even ignoring Irene." He had also turned to alcohol and cocaine.
He was a member of Billy Eckstine's big band in 1946 and Gillespie's in 1947. He joined a quintet led by Parker that also included Max Roach. Together they performed live with Duke Jordan and Tommy Potter for much of the year, including several studio sessions. In one session that May, Davis wrote the tune "Cheryl", named after his daughter. Davis's first session as a leader followed in August 1947, playing as the Miles Davis All Stars that included Parker, pianist John Lewis, and bassist Nelson Boyd; they recorded "Milestones", "Half Nelson", and "Sippin' at Bells". After touring Chicago and Detroit with Parker's quintet, Davis returned to New York City in March 1948 and joined the Jazz at the Philharmonic tour, which included a stop in St. Louis on April 30.
In August 1948, Davis declined an offer to join Duke Ellington's orchestra as he had entered rehearsals with a nine-piece band with pianist and arranger Gil Evans and baritone saxophonist Gerry Mulligan, taking an active role on what soon became his own project. Evans' Manhattan apartment had become the meeting place for several young musicians and composers such as Davis, Roach, Lewis, and Mulligan who were unhappy with the increasingly virtuoso instrumental techniques that dominated bebop. These gatherings led to the formation of the Miles Davis Nonet, which included the unusual additions of French horn and tuba, producing a thickly textured orchestral sound. The intent was to imitate the human voice through carefully arranged compositions and a relaxed, melodic approach to improvisation. In September, the band completed their sole engagement as the opening act for Count Basie at the Royal Roost for two weeks. Davis had to persuade the venue's manager to post a sign reading "Miles Davis Nonet. Arrangements by Gil Evans, John Lewis and Gerry Mulligan". He prevailed only with the help of Monte Kay, the club's artistic director. Davis returned to Parker's quintet, but relationships within the quintet were growing tense, mainly due to Parker's erratic behavior caused by his drug addiction. Early in his time with Parker, Davis abstained from drugs, ate a vegetarian diet, and spoke of the benefits of water and juice. Davis and Roach objected to the addition of pianist Duke Jordan, preferring Bud Powell.
In December 1948 Davis quit, claiming he was not being paid. His departure began a period when he worked mainly as a freelancer and sideman. His nonet remained active until the end of 1949. After signing a contract with Capitol Records, they recorded sessions in January and April 1949, which sold little but influenced the "cool" or "west coast" style of jazz. The line-up changed throughout the year and included the additions of tuba player Bill Barber, alto saxophonist Lee Konitz, who was preferred over Sonny Stitt because Stitt's style was considered too bop-oriented, pianist Al Haig, trombone players Mike Zwerin and Kai Winding, French horn players Junior Collins, Sandy Siegelstein, and Gunther Schuller, and bassists Al McKibbon and Joe Shulman. One track featured singer Kenny Hagood. The presence of white musicians in the group angered some black players, many of whom were unemployed at the time, yet Davis rebuffed their criticisms. Recording sessions with the nonet for Capitol continued until April 1950. The nonet recorded a dozen tracks in total, which were released as singles and subsequently compiled on "Birth of the Cool" (1957).
In May 1949, Davis performed with the Tadd Dameron Quintet with Kenny Clarke and James Moody at the Paris International Jazz Festival. On his first trip abroad, Davis took a strong liking to Paris and its cultural environment, where he felt black jazz musicians and people of color in general were better respected than in America. The trip, he said, "changed the way I looked at things forever". He began an affair with singer and actress Juliette Gréco.
After returning from Paris in mid-1949, he became depressed and found little work, which included a short engagement with Powell in October and guest spots in New York City, Chicago, and Detroit until January 1950. He was falling behind in hotel rent and attempts were made to repossess his car. His heroin use became an expensive addiction, and Davis, yet to reach 24 years old, "lost my sense of discipline, lost my sense of control over my life, and started to drift". In August 1950, during a family trip to East St. Louis and Chicago in an attempt to improve their fortunes, Cawthon gave birth to Davis's second son, Miles IV. Davis befriended boxer Johnny Bratton and began his interest in the sport. Davis left Cawthon and his three children in New York City in the hands of a friend, jazz singer Betty Carter. He remained grateful to her for the rest of his life. He toured with Eckstine and Billie Holiday and was arrested for heroin possession in Los Angeles. The story was reported in "DownBeat" magazine, which caused a further reduction in work, though he was acquitted weeks later. By the 1950s, Davis had become a more skilled player, experimenting with the middle register of the trumpet alongside harmonies and rhythms.
In January 1951, Davis's fortunes improved when he signed a one-year contract with Prestige after owner Bob Weinstock became a fan of the nonet. Davis chose Lewis, trombonist Bennie Green, bassist Percy Heath, saxophonist Sonny Rollins, and drummer Roy Haynes; they recorded what became part of "Miles Davis and Horns" (1956). Davis was hired for other studio dates in March, June, and September 1951 and started transcribing scores for record labels to fund his heroin addiction. During the next month, he recorded his second session for Prestige as band leader. The material was released on "The New Sounds" (1951), "Dig" (1956), and "Conception" (1956).
Davis supported his heroin habit by playing music and by living the life of a hustler, exploiting prostitutes, and receiving money from friends. By 1953, his addiction began to impair his playing. His drug habit became public in a "Down Beat" interview with Cab Calloway, whom he never forgave as it brought him "all pain and suffering". He returned to St. Louis and stayed with his father for several months. Though he continued to use heroin, he met Roach and Mingus in September 1953 on their way to Los Angeles and joined their band, but the trip caused problems. He returned to his father's home, "determined to kick my habit ... that was the only thing on my mind." He locked himself inside the guest house "for about seven or eight days" until he had gone through withdrawal. After the ordeal, he "sat down and started thinking about how I was going to get my life back together".
Davis lived in Detroit for about six months, avoiding New York City where it was easy to get drugs. Though he used heroin, he was still able to perform locally with Elvin Jones and Tommy Flanagan as part of Billy Mitchell's house band at the Blue Bird club. He was also "pimping a little". A widely related story, attributed to Richard "Prophet" Jennings, was that Davis stumbled into Baker's Keyboard Lounge out of the rain, carrying his trumpet in a paper bag under his coat. He walked to the bandstand, interrupted Roach and Clifford Brown in the middle of performing "Sweet Georgia Brown", and played "My Funny Valentine" before leaving. Davis was supposedly embarrassed into getting clean by this incident. He disputed this account, stating that Roach had invited him to play and that his decision to quit heroin was unrelated to the incident. He said he was inspired to quit by his idol, boxer Sugar Ray Robinson.
In February 1954 Davis returned to New York City, feeling good "for the first time in a long time," mentally and physically stronger, and joined a gym. He informed Weinstock and the management at Blue Note that he was ready to record with a quintet, and they agreed. He considered the resulting albums "Miles Davis Quartet" (1954) and "Miles Davis Volume 2" (1956) "very important" because he felt his performances were particularly strong. He was paid roughly $750 for each album and refused to give away his publishing rights.
Davis abandoned the bebop style and turned to the music of pianist Ahmad Jamal, whose approach and use of space influenced him. When he returned to the studio in June 1955 to record "Miles Davis Quartet", he wanted a pianist like Jamal and picked Red Garland. "Blue Haze" (1956), "Bags' Groove" (1957), "Walkin'" (1957), and "Miles Davis and the Modern Jazz Giants" (1959) were recorded after his recovery from heroin addiction. They documented the evolution of his sound with the Harmon mute, also known as a wah-wah mute, placed close to the microphone, and the use of more spacious and relaxed phrasing. He assumed a central role in hard bop, which was slower than bebop, less radical in harmony and melody, and often used popular songs and American standards as starting points for improvisation. Hard bop distanced itself from cool jazz with a harder beat and music inspired by the blues. A few critics consider "Walkin'" (April 1954) the album that created the hard bop genre.
Davis gained a reputation for being cold, distant, and easily angered. He wrote that in 1954 Sugar Ray Robinson "was the most important thing in my life besides music" and adopted Robinson's "arrogant attitude". He showed contempt for critics and the press. There were well-publicized confrontations with the public and with other musicians. An argument with Thelonious Monk during the recording of "Bags' Groove" was reported. In mid-1954, Davis reunited with Gréco for the first time since 1949 after she arrived in New York City for film prospects. The two had been in occasional contact since he left Paris. Because Davis was too busy to move to Spain with her, they agreed to get together the next time he came to France.
Davis had an operation to remove polyps from his larynx in October 1955. The doctors told him to remain silent after the operation, but he got into an argument that permanently damaged his vocal cords and gave him a raspy voice for the rest of his life. He was called the "prince of darkness", adding a patina of mystery to his public persona.
In July 1955, Davis's fortunes improved considerably when he was invited to the second annual Newport Jazz Festival on July 17, with a line-up of Monk, Heath, drummer Connie Kay, and horn players Zoot Sims and Gerry Mulligan. He convinced organizer George Wein, a fan of Davis' work, that he should be on the bill, and Wein agreed. The performance was praised by critics and audiences alike, who considered it a highlight of the festival; it also helped Davis, the least well-known musician in the group, increase his popularity among affluent white audiences. He tied with Dizzy Gillespie for best trumpeter in the 1955 "Down Beat" magazine Readers' Poll.
George Avakian of Columbia Records saw Davis perform at Newport and wanted to sign him to the label. Davis had one year left on his contract with Prestige, which required him to release four more albums. He signed a contract with Columbia that included a $4,000 advance and a condition that his recordings for Columbia would remain unreleased until his agreement with Prestige expired.
At the request of Avakian, he formed the Miles Davis Quintet for a performance at Café Bohemia. The quintet consisted of Davis on trumpet, Sonny Rollins on tenor saxophone, Red Garland on piano, Paul Chambers on double bass, and Philly Joe Jones on drums. Rollins was replaced by John Coltrane, completing the membership of the first quintet. This group appeared on his final albums for Prestige: "Cookin' with the Miles Davis Quintet" (1957), "Relaxin' with the Miles Davis Quintet" (1958), "Workin' with the Miles Davis Quintet" (1960), and "Steamin' with the Miles Davis Quintet" (1961). The music on all four albums was recorded in two one-day sessions in 1956 with Rudy Van Gelder. Each album helped establish Davis's quintet as one of the best.
The style of the group was an extension of their experience playing with Davis. He played long, legato, melodic lines, while Coltrane contrasted with energetic solos. Their live repertoire was a mix of bebop, standards from the Great American Songbook and pre-bop eras, and traditional tunes. They appeared on "'Round About Midnight", Davis's first album for Columbia.
In 1956, he left his quintet temporarily to tour Europe as part of the Birdland All-Stars, which included the Modern Jazz Quartet and French and German musicians. In Paris, he reunited with Gréco and they "remained lovers for many years". He then returned home, reunited his quintet and toured the US for two months. Conflict arose on tour as he grew impatient with the drug habits of Jones and Coltrane. Davis was trying to live a healthier life by exercising and reducing his alcohol intake, though he continued to use cocaine. At the end of the tour, he fired Jones and Coltrane and replaced them with Sonny Rollins and Art Taylor.
In November 1957, Davis went to Paris and recorded the soundtrack to "Ascenseur pour l'échafaud" directed by Louis Malle (1958). Consisting of French session musicians Barney Wilen, Pierre Michelot, and René Urtreger, and American drummer Kenny Clarke, the group avoided a written score and instead improvised while they watched the film in a recording studio.
After returning to New York City, Davis revived his quintet with Adderley and Coltrane, who was clean from his drug habit. Now a sextet, the group recorded material in early 1958 that was released on "Milestones" (1958), an album that demonstrated Davis's interest in modal jazz. A performance by Les Ballets Africains drew him to slower, deliberate music that allowed the creation of solos from harmony rather than chords. In this form of ballet music, the kalimba was played for long periods on a single chord, weaving in and out of consonance and dissonance.
By May 1958, he had replaced Jones with drummer Jimmy Cobb, and Red Garland left the group, leaving Davis to play piano on "Sid's Ahead" for the album "Milestones". He wanted someone who could play modal jazz, so he hired Bill Evans, a young, white pianist with a background in classical music. Evans's impressionistic approach to the piano greatly influenced Davis, but after eight months of touring, a tired Evans left. Wynton Kelly, his replacement, brought to the group a swinging style that contrasted with Evans's delicacy. The sextet made their recording debut on "Jazz Track" (1958).
By early 1957, Davis was exhausted from recording and touring with his quintet and wished to pursue new projects. During a two-week residency in Chicago in March, the 30-year-old Davis told journalists of his intention to retire at its conclusion and revealed offers he had received to become a teacher at Harvard University and a musical director at a record label. Avakian agreed that it was time for Davis to explore something different, but Davis rejected his suggestion of returning to his nonet as he considered that a step backward. Avakian then suggested that he work with a bigger ensemble, similar to "Music for Brass" (1957), an album of orchestral and brass-arranged music led by Gunther Schuller featuring Davis as a guest soloist.
Davis accepted and worked with Gil Evans in what became a five-album collaboration from 1957 to 1962. "Miles Ahead" (1957) showcased Davis playing a flugelhorn and a rendition of "The Maids of Cadiz" by Léo Delibes, the first piece of classical music that Davis recorded. Evans devised orchestral passages as transitions, thus turning the album into one long piece of music. "Porgy and Bess" (1959) includes arrangements of pieces from George Gershwin's opera. "Sketches of Spain" (1960) contained music by composers Joaquín Rodrigo and Manuel de Falla and originals by Evans. The classical musicians had trouble improvising, while the jazz musicians couldn't handle the difficult arrangements, but the album was a critical success, selling over 120,000 copies in the US. Davis performed with an orchestra conducted by Evans at Carnegie Hall in May 1961 to raise money for charity. The pair's final album was "Quiet Nights" (1962), a collection of bossa nova songs released against their wishes. Evans stated it was only half an album and blamed the record company; Davis blamed producer Teo Macero and refused to speak to him for more than two years. Davis noted later that "my best friend is Gil Evans"; their work was included in the boxed set "" (1996), which won Grammy Awards for Best Historical Album and Best Album Notes in 1997.
In March and April 1959, Davis recorded what many critics consider his greatest album, "Kind of Blue" (1959). He named the album for its mood. He called back Bill Evans, as the music had been planned around Evans's piano style. Both Davis and Evans were familiar with George Russell's ideas about modal jazz. But Davis neglected to tell pianist Wynton Kelly that Evans was returning, so Kelly appeared on only one song, "Freddie Freeloader". The sextet had played "So What" and "All Blues" at performances, but they saw the remaining three compositions for the first time in the studio.
Released in August 1959, "Kind of Blue" was an instant success, with widespread radio airplay and rave reviews from critics. It remains the best selling jazz album of all time. In October 2008, the album reached 4× platinum certification from the Recording Industry Association of America for selling over four million copies in the US alone. In 2009, the US House of Representatives voted 409–0 to pass a resolution that honored it as a national treasure.
During the success of "Kind of Blue", Davis had a run-in with the law. On August 25, 1959, during a recording session at the Birdland nightclub in New York City for the US Armed Services, he took a break outside the club. As he was escorting a blonde-haired woman to a taxi, policeman Gerald Kilduff told him to "move on". Davis said that he was working at the club, and he refused to move. Kilduff arrested him and grabbed him as he tried to protect himself. Witnesses said the policeman punched Davis in the stomach with a nightstick without provocation. Two detectives held the crowd back, while a third approached Davis from behind and beat him in the head. Davis was taken to jail, charged with assaulting an officer, then taken to the hospital where he received five stitches. He was released on $525 bail. By January 1960, he was acquitted of disorderly conduct and third-degree assault. He later stated the incident "changed my whole life and whole attitude again, made me feel bitter and cynical again when I was starting to feel good about the things that had changed in this country".
Davis and his sextet toured to support "Kind of Blue". He persuaded Coltrane to play with the group on one final European tour in the spring of 1960. Coltrane then departed to form his quartet, though he returned for some tracks on Davis's album "Someday My Prince Will Come" (1961). Its front cover shows a photograph of his wife, Frances Taylor, after Davis demanded that Columbia depict black women on his album covers. In 1957, Davis began a relationship with Frances Taylor, a dancer he had met in 1953 at Ciro's in Los Angeles. They married on December 21, 1959 in Toledo, Ohio. The relationship involved numerous incidents of Davis' domestic violence towards Taylor. He later wrote, "Every time I hit her, I felt bad because a lot of it really wasn't her fault but had to do with me being temperamental and jealous." One reason for his behavior was that in 1963 he had increased his use of alcohol and cocaine to reduce joint pain caused by sickle cell anemia. He hallucinated, "looking for this imaginary person" in his house while wielding a kitchen knife. Soon after the photograph for the album "E.S.P." (1965) was taken, Taylor left him for the final time. She filed for divorce in 1966; it was finalized in February 1968. Davis later recalled that "Frances was the best wife I ever had and I made a mistake when I broke up with her."
In December 1962, Davis, Kelly, Chambers, Cobb, and Rollins played together for the last time as the first three wanted to leave and play as a trio. Rollins left to join them soon after, leaving Davis to pay over $25,000 to cancel upcoming gigs and quickly assemble a new group. Following auditions, he found his new band in tenor saxophonist George Coleman, bassist Ron Carter, pianist Victor Feldman, and drummer Frank Butler. By May 1963, Feldman and Butler were replaced by pianist Herbie Hancock and 17-year-old drummer Tony Williams, who made Davis "excited all over again". With this group, Davis completed the rest of what became "Seven Steps to Heaven" (1963) and recorded the live albums "Miles Davis in Europe" (1964), "My Funny Valentine" (1965), and "Four & More" (1966). The quintet played essentially the same bebop tunes and standards that Davis's previous bands had played, but they approached them with structural and rhythmic freedom and occasionally breakneck speed.
In 1964, Coleman was replaced by saxophonist Sam Rivers until Davis persuaded Wayne Shorter to leave Art Blakey. This quintet lasted through 1968. Shorter became the group's principal composer, and the album "E.S.P." (1965) was named after his composition. While touring Europe, the group made its first album, "Miles in Berlin" (1965).
Davis needed medical attention for hip pain, which had worsened since his Japanese tour during the previous year. He underwent hip replacement surgery in April 1965, with bone taken from his shin, but it failed. After his third month in the hospital, he discharged himself due to boredom and went home. He returned to the hospital in August after a fall required the insertion of a plastic hip joint. In November 1965, he had recovered enough to return to performing with his quintet, which included gigs at the Plugged Nickel in Chicago. Teo Macero returned as his engineer and record producer after their rift over "Quiet Nights" had healed.
In January 1966, Davis spent three months in the hospital due to a liver infection. When he resumed touring, he performed more at colleges because he had grown tired of the typical jazz venues. Columbia president Clive Davis noted that in 1966 his sales had declined to around 40,000–50,000 per album, compared to as many as 100,000 per release a few years before. Matters were not helped by the press reporting his apparent financial troubles and imminent demise. After his appearance at the 1966 Newport Jazz Festival, he returned to the studio with his quintet for a series of productive sessions. He started a relationship with actress Cicely Tyson, who helped him reduce his alcohol consumption.
Material from the 1966–1968 sessions was released on "Miles Smiles" (1966), "Sorcerer" (1967), "Nefertiti" (1967), "Miles in the Sky" (1968), and "Filles de Kilimanjaro" (1968). The quintet's approach to the new music became known as "time no changes"—which referred to Davis's decision to depart from chordal sequences and adopt a more open approach, with the rhythm section responding to the soloists' melodies. Through "Nefertiti" the studio recordings consisted primarily of originals composed by Shorter, with occasional compositions by the other sidemen. In 1967, the group began to play their concerts in continuous sets, each tune flowing into the next, with only the melody indicating any sort of change. His bands performed this way until his hiatus in 1975.
"Miles in the Sky" and "Filles de Kilimanjaro"—which tentatively introduced electric bass, electric piano, and electric guitar on some tracks—pointed the way to the fusion phase of Davis's career. He also began experimenting with more rock-oriented rhythms on these records. By the time the second half of "Filles de Kilimanjaro" was recorded, bassist Dave Holland and pianist Chick Corea had replaced Carter and Hancock. Davis soon took over the compositional duties of his sidemen.
In September 1968, Davis married 23-year-old model and songwriter Betty Mabry. In his autobiography, Davis described her as a "high-class groupie, who was very talented but who didn't believe in her own talent". Mabry, a familiar face in the New York City counterculture, helped introduce Davis to popular rock, soul, and funk musicians. Jazz critic Leonard Feather visited Davis's apartment and was shocked to find him listening to albums by The Byrds, Aretha Franklin, and Dionne Warwick. He also liked James Brown, Sly and the Family Stone, and Jimi Hendrix, whose group Band of Gypsys particularly made an impression on Davis. Davis filed for divorce from Mabry in 1969, after accusing her of having an affair with Jimi Hendrix.
"In a Silent Way" (1969) was recorded in a single studio session on February 18, 1969, with Shorter, Hancock, Holland, and Williams alongside keyboardists Chick Corea and Josef Zawinul and guitarist John McLaughlin. The album contains two side-long tracks that Macero pieced together from different takes recorded at the session.
When the album was released in July 1969, some critics accused him of "selling out" to the rock and roll audience. Nevertheless, it reached number 134 on the US "Billboard" Top LPs chart, his first album since "My Funny Valentine" to reach the chart. "In a Silent Way" was his entry into jazz fusion. The touring band of 1969–1970—with Shorter, Corea, Holland, and DeJohnette—never completed a studio recording together, and became known as Davis's "lost quintet".
In October 1969, Davis was shot at five times while in his car with one of his two lovers, Marguerite Eskridge. The incident left him with a graze and Eskridge unharmed. In 1970, Marguerite gave birth to their son Erin.
For the double album "Bitches Brew" (1970), he hired Jack DeJohnette, Airto Moreira, and Bennie Maupin. The album contained long compositions, some over twenty minutes, that were never played in the studio but were constructed from several takes by Macero and Davis. Other studio techniques included splicing, multitrack recording, and tape loops. "Bitches Brew" peaked at No. 35 on the "Billboard" Album chart. In 1976 it was certified gold for selling over 500,000 records. By 2003, it had sold one million copies.
In March 1970, Davis began to perform as the opening act for various rock acts, allowing Columbia to market "Bitches Brew" to a larger audience. He was so offended by Clive Davis's suggestion to perform at the Fillmore East that he threatened to switch record labels, but he reconsidered and shared a bill with the Steve Miller Band and Neil Young with Crazy Horse on March 6 and 7. Biographer Paul Tingen wrote, "Miles' newcomer status in this environment" led to "mixed audience reactions, often having to play for dramatically reduced fees, and enduring the 'sell-out' accusations from the jazz world", as well as being "...attacked by sections of the black press for supposedly genuflecting to white culture". The 1970 tours included the 1970 Isle of Wight Festival on August 29, when he performed to an estimated 600,000 people, the largest audience of his career. Plans to record with Hendrix ended after the guitarist's death; his funeral was the last that Davis attended. Several live albums with a transitional sextet/septet including Corea, DeJohnette, Holland, Moreira, saxophonist Steve Grossman, and keyboardist Keith Jarrett were recorded during this period, including "Miles Davis at Fillmore" (1970) and "Black Beauty: Miles Davis at Fillmore West" (1973).
By 1971, Davis had signed a contract with Columbia that paid him $100,000 a year for three years in addition to royalties. He recorded a soundtrack album (1971's "Jack Johnson") for the 1970 documentary film about heavyweight boxer Jack Johnson, containing two long pieces of 25 and 26 minutes in length with Hancock, McLaughlin, Sonny Sharrock, and Billy Cobham. He was committed to making music for African-Americans who liked more commercial, pop, groove-oriented music. By November 1971, DeJohnette and Moreira had been replaced in the touring ensemble by drummer Leon "Ndugu" Chancler and percussionists James Mtume and Don Alias. "Live-Evil" (1971) was released in the same month. Showcasing former Stevie Wonder touring bassist Michael Henderson, who replaced Holland in the autumn of 1970, the album demonstrated that Davis's ensemble had transformed into a funk-oriented group while retaining the exploratory imperative of "Bitches Brew".
In 1972, composer-arranger Paul Buckmaster introduced Davis to the music of German avant-garde composer Karlheinz Stockhausen, leading to a period of creative exploration. Biographer J. K. Chambers wrote, "The effect of Davis' study of Stockhausen could not be repressed for long ... Davis' own 'space music' shows Stockhausen's influence compositionally." His recordings and performances during this period were described as "space music" by fans, Feather, and Buckmaster, who described it as "a lot of mood changes—heavy, dark, intense—definitely space music". The studio album "On the Corner" (1972) blended the influence of Stockhausen and Buckmaster with funk elements. Davis invited Buckmaster to New York City to oversee the writing and recording of the album with Macero. The album reached No. 1 on the "Billboard" jazz chart but peaked at No. 156 on the more heterogeneous Top 200 Albums chart. "On the Corner" elicited a favorable review from Ralph J. Gleason of "Rolling Stone", but Davis felt that Columbia marketed it to the wrong audience. "The music was meant to be heard by young black people, but they just treated it like any other jazz album and advertised it that way, pushed it on the jazz radio stations. Young black kids don't listen to those stations; they listen to R&B stations and some rock stations." In October 1972, he broke his ankles in a car crash. He took painkillers and cocaine to cope with the pain. Looking back at his career after the incident, he wrote, "Everything started to blur."
After recording "On the Corner", he assembled a group with Henderson, Mtume, Carlos Garnett, guitarist Reggie Lucas, organist Lonnie Liston Smith, tabla player Badal Roy, sitarist Khalil Balakrishna, and drummer Al Foster. Only Smith was a jazz instrumentalist; consequently, the music emphasized rhythmic density and shifting textures instead of solos. This group was recorded live for "In Concert" (1973), but Davis found it unsatisfactory, leading him to drop the tabla and sitar and play keyboards. He also added guitarist Pete Cosey. The compilation studio album "Big Fun" (1974) contains four long improvisations recorded between 1969 and 1972.
Studio activity in the 1970s culminated in sessions throughout 1973 and 1974 for "Get Up with It" (1974), a compilation that included four long pieces (comprising over ninety minutes of new music) alongside four shorter recordings from 1970 and 1972. The album contained "He Loved Him Madly", a thirty-minute tribute to the recently deceased Duke Ellington that presaged later developments in ambient music. In the United States, it performed comparably to "On the Corner", reaching number 8 on the jazz chart and number 141 on the pop chart. He then concentrated on live performance with a series of concerts that Columbia released on the double live albums "Agharta" (1975), "Pangaea" (1976), and "Dark Magus" (1977). The first two are recordings of two sets from February 1, 1975 in Osaka, by which time Davis was troubled by pneumonia, osteoarthritis, sickle-cell anemia, depression, bursitis, and stomach ulcers; he relied on alcohol, codeine, and morphine to get through the engagements. His shows were routinely panned by critics who mentioned his habit of performing with his back to the audience. Cosey later asserted that "the band really advanced after the Japanese tour", but Davis was again hospitalized, for his ulcers and a hernia, during a tour of the US while opening for Herbie Hancock. Hancock had eclipsed his former employer from a commercial standpoint with "Head Hunters" (1973) and "Thrust" (1974), two albums that were marketed to pop audiences in the aftermath of the "On the Corner" farrago and peaked at number 13 on the "Billboard" pop chart.
After appearances at the 1975 Newport Jazz Festival in July and the Schaefer Music Festival in New York City on September 5, Davis dropped out of music.
In his autobiography, Davis wrote frankly about his life during his hiatus from music. He called his Upper West Side brownstone a wreck and chronicled his heavy use of alcohol and cocaine, in addition to his sexual encounters with many women. He also stated that "Sex and drugs took the place music had occupied in my life." Drummer Tony Williams recalled that by noon (on average) Davis would be sick from the previous night's intake.
In December 1975, he had regained enough strength to undergo a much-needed hip replacement operation. In March 1976, "Rolling Stone" reported rumors of his imminent demise, citing his health problems during the previous tour. In December 1976, Columbia was reluctant to renew his contract and pay his usual large advances. But after his lawyer started negotiating with United Artists, Columbia matched their offer, establishing the Miles Davis Fund to pay him regularly. Pianist Vladimir Horowitz was the only other musician with Columbia who had a similar status.
In 1978, Julie Coryell interviewed Davis. Concerned about his health, she had him stay with a friend in Norwalk, Connecticut. Davis asked Coryell's husband, fusion guitarist Larry Coryell, to participate in sessions with keyboardists Masabumi Kikuchi and George Pavlis, bassist T. M. Stevens, and drummer Al Foster. Davis played the arranged piece uptempo, abandoned his trumpet for the organ, and had Macero record the session without the band's knowledge. After Coryell declined a spot in a band that Davis was beginning to put together, Davis returned to his reclusive lifestyle in New York City. Soon after, Marguerite Eskridge had Davis jailed for failing to pay child support for their son Erin, which cost him $10,000 for release on bail. A recording session that involved Buckmaster and Gil Evans was halted, with Evans leaving after failing to receive the payment he was promised. In August 1978, Davis hired a new manager, Mark Rothbaum, who had worked with him since 1972. Despite the dearth of new material, Davis placed in the Top 10 trumpeter poll of "Down Beat" magazine in 1979.
By 1979, Davis had rekindled his relationship with actress Cicely Tyson, with whom he overcame his cocaine addiction and regained his enthusiasm for music. The two married on November 26, 1981, in a ceremony at Bill Cosby's home in Massachusetts that was officiated by politician and civil rights activist Andrew Young. Their tumultuous marriage ended with Tyson filing for divorce in 1988, which was finalized in 1989.
In October 1979, his contract with Columbia was up for negotiation. Label president Clive Davis was replaced by George Butler, who had visited Davis several times during the previous two years to encourage him to return to the studio. To help his situation, Davis had Buckmaster come over to collaborate on new music. After arriving, Buckmaster organized an intervention for Davis, who was living in squalor among cockroach infestations, in the dark with his curtains always closed. His sister Dorothy cleaned his house with help from Buckmaster, Tyson, and neighbor Chaka Khan. Davis later thanked Buckmaster for helping him.
Davis hadn't played the trumpet much for three years and found it difficult to reclaim his embouchure. His first post-hiatus studio appearance took place on May 1, 1980. A day later, Davis was hospitalized due to a leg infection. He recorded "The Man with the Horn" (1981) from June 1980 to May 1981 with Macero producing. A large band was abandoned in favor of a combo with saxophonist Bill Evans (not to be confused with pianist Bill Evans) and bassist Marcus Miller. Both would collaborate with him during the next decade.
"The Man with the Horn" received a poor critical reception despite selling well. In June 1981, Davis returned to the stage for the first time since 1975 in a ten-minute guest solo as part of Mel Lewis's band at the Village Vanguard. This was followed by appearances with a new band, a four-night run at Kix in Boston, and two shows at Avery Fisher Hall on July 5 as part of the Kool Jazz Festival. Recordings from a mixture of dates from 1981, including the Kix and Avery Fisher Hall gigs, were released on "We Want Miles" (1982), which earned him a Grammy Award for Best Jazz Instrumental Performance by a Soloist.
In January 1982, while Tyson was working in Africa, Davis "went a little wild" with alcohol, and suffered a stroke that temporarily paralyzed his right hand. Tyson returned home and cared for him. After three months of treatment with a Chinese acupuncturist, he was able to play the trumpet again. He listened to his doctor's warnings and gave up alcohol and drugs. He credited Tyson with helping his recovery, which involved exercise, piano playing, and visits to spas. She encouraged him to draw, which he pursued for the rest of his life.
Davis resumed touring in May 1982 with a line-up that included French percussionist Mino Cinelu and guitarist John Scofield, with whom he worked closely on the album "Star People" (1983). In mid-1983, he worked on the tracks for "Decoy", an album mixing soul music and electronica that was released in 1984. He brought in producer, composer, and keyboardist Robert Irving III, who had collaborated with him on "The Man with the Horn". With a seven-piece band that included Scofield, Evans, Irving, Foster, and Darryl Jones, he played a series of European performances that were positively received. In December 1984, while in Denmark, he was awarded the Léonie Sonning Music Prize. Trumpeter Palle Mikkelborg had written a contemporary classical piece titled "Aura" for the event which impressed Davis to the point of returning to Denmark in early 1985 to record his next studio album, "Aura" (1989). Columbia was dissatisfied with the recording and delayed its release.
Also in 1984, Davis met 34-year-old sculptor Jo Gelbard. Gelbard taught Davis how to paint; the two became frequent collaborators and were soon romantically involved.
In May 1985, one month into a tour, Davis signed a contract with Warner Bros. that required him to give up his publishing rights. Trumpeter Wynton Marsalis appeared unannounced onstage during Davis' performance at the inaugural Vancouver International Jazz Festival in 1986. Marsalis whispered into Davis' ear that "someone" had told him to do so. Davis responded by ordering him off the stage. Davis had become increasingly irritated at Columbia's delay in releasing "Aura". The breaking point appears to have come when a producer at Columbia asked him to call Marsalis and wish him a happy birthday. The tour in 1985 included a performance in London in July in which Davis performed on stage for five hours. Jazz critic John Fordham concluded, "The leader is clearly enjoying himself." By 1985, Davis was diabetic and required daily injections of insulin.
He released "You're Under Arrest", his final album for Columbia, in September 1985. It included cover versions of two pop songs: "Time After Time" by Cyndi Lauper and Michael Jackson's "Human Nature". He considered releasing an album of pop songs, and he recorded dozens of them, but the idea was rejected. He said that many of today's jazz standards had been pop songs in Broadway theater and that he was simply updating the standards repertoire.
Davis collaborated with a number of figures from the British post-punk and new wave movements during this period, including Scritti Politti. At the invitation of producer Bill Laswell, he recorded some trumpet parts during sessions for Public Image Ltd.'s "Album", according to John Lydon in the liner notes of their "Plastic Box" box set. In Lydon's words, however, "Strangely enough, we didn't use [his contributions]." According to Lydon in the "Plastic Box" notes, Davis favorably compared Lydon's singing voice to his trumpet sound during these sessions. This period also saw Davis move from his funk-inspired sound of the early 1970s to a more melodic style.
After taking part in the recording of the 1985 protest song "Sun City" as a member of Artists United Against Apartheid, Davis appeared on the instrumental "Don't Stop Me Now" by Toto for their album "Fahrenheit" (1986). Davis intended to collaborate with Prince, but the project was dropped. Davis also collaborated with Zane Giles and Randy Hall on the "Rubberband" sessions in 1985, but those would remain unreleased until 2019. Instead, he worked with Marcus Miller, and "Tutu" (1986) became the first time he used modern studio tools such as programmed synthesizers, sampling, and drum loops. Released in September 1986, its front cover is a portrait of Davis by Irving Penn. In 1987, he won a Grammy Award for Best Jazz Instrumental Performance, Soloist. Also in 1987, Davis contacted American journalist Quincy Troupe to work with him on his autobiography. The two had met the previous year when Troupe conducted a two-day interview, which "Spin" then published as a 45-page article.
In 1988, Davis had a small part as a street musician in the Christmas comedy film "Scrooged", starring Bill Murray. He also collaborated with Zucchero Fornaciari on a version of "Dune Mosse" (from "Blue's"), published in 2004 on the Italian bluesman's "Zu & Co." In November 1988, he was inducted into the Knights of Malta at a ceremony at the Alhambra Palace in Spain (hence the "Sir" title on his gravestone). Later that month, Davis cut his European tour short after he collapsed and fainted following a two-hour show in Madrid, and flew home. Rumors about his health surfaced after the American magazine "Star", in its February 21, 1989 edition, reported that Davis had contracted AIDS, prompting his manager Peter Shukat to issue a statement the following day denying the claim. Shukat revealed that Davis had been in the hospital for a mild case of pneumonia and the removal of a benign polyp on his vocal cords, and was resting comfortably in preparation for his 1989 tours. Davis later blamed one of his former wives or girlfriends for starting the rumor and decided against taking legal action. He was interviewed on "60 Minutes" by Harry Reasoner. In October 1989, he received a Grande Medaille de Vermeil from Paris mayor Jacques Chirac. In 1990, he received a Grammy Lifetime Achievement Award. In early 1991, he appeared in the Rolf de Heer film "Dingo" as a jazz musician.
Davis followed "Tutu" with "Amandla" (1989) and soundtracks to four films: "Street Smart", "Siesta", "The Hot Spot", and "Dingo." His last albums were released posthumously: the hip hop-influenced "Doo-Bop" (1992) and "Miles & Quincy Live at Montreux" (1993), a collaboration with Quincy Jones from the 1991 Montreux Jazz Festival where, for the first time in three decades, he performed songs from "Miles Ahead", "Porgy and Bess", and "Sketches of Spain".
On July 8, 1991, Davis returned to performing material from his past at the 1991 Montreux Jazz Festival with a band and orchestra conducted by Quincy Jones. The set consisted of arrangements from his albums recorded with Gil Evans. The show was followed by a concert billed as "Miles and Friends" at the Grande halle de la Villette in Paris two days later, with guest performances by musicians from throughout his career, including John McLaughlin, Herbie Hancock, and Joe Zawinul. In Paris, he was awarded the Chevalier of the Legion of Honor. After returning to America, he stopped in New York City to record material for "Doo-Bop", then returned to California to play at the Hollywood Bowl on August 25, his final live performance.
Davis became increasingly aggressive in his final year, due in part to the medication he was taking; his aggression at times took the form of violence towards his partner, Jo Gelbard.
In early September 1991, Davis checked into St. John's Hospital near his home in Santa Monica, California, for routine tests. Doctors suggested he have a tracheal tube implanted to relieve his breathing after repeated bouts of bronchial pneumonia. The suggestion provoked an outburst from Davis that led to an intracerebral hemorrhage followed by a coma. According to Jo Gelbard, on September 26, Davis painted his final painting, composed of dark, ghostly figures dripping blood, depicting "his imminent demise". After several days on life support, the machines were turned off and he died on September 28, 1991, in the arms of Gelbard. He was 65 years old. His death was attributed to the combined effects of a stroke, pneumonia, and respiratory failure. According to Troupe, Davis was taking azidothymidine (AZT), a type of antiretroviral drug used for the treatment of HIV and AIDS, during his treatments in hospital. A funeral service was held on October 5, 1991, at St. Peter's Lutheran Church on Lexington Avenue in New York City that was attended by around 500 friends, family members, and musical acquaintances, with many fans standing in the rain. He was buried in Woodlawn Cemetery in the Bronx, New York City, with one of his trumpets, near the site of Duke Ellington's grave.
At the time of his death, Davis' estate was valued at more than $1 million. In his will, Davis left 20 percent to his daughter Cheryl Davis; 40 percent to his son Erin Davis; 10 percent to nephew Vincent Wilburn Jr. and 15 percent each to his brother Vernon Davis and his sister Dorothy Wilburn. He excluded his two sons Gregory and Miles IV.
Late in his life, from the "electric period" onwards, Davis repeatedly explained his reasons for not wishing to perform his earlier works, such as "Birth of the Cool" or "Kind of Blue". In his view, remaining stylistically static was the wrong option. He commented: "'So What' or "Kind of Blue", they were done in that era, the right hour, the right day, and it happened. It's over ... What I used to play with Bill Evans, all those different modes, and substitute chords, we had the energy then and we liked it. But I have no feel for it anymore, it's more like warmed-over turkey." When Shirley Horn insisted in 1990 that Miles reconsider playing the ballads and modal tunes of his "Kind of Blue" period, he demurred. "Nah, it hurts my lip," was the reason he gave. Bill Evans, who played piano on "Kind of Blue", said: "I would like to hear more of the consummate melodic master, but I feel that big business and his record company have had a corrupting influence on his material. The rock and pop thing certainly draws a wider audience." Throughout his later career, Davis turned down offers to reunite his 1960s quintet.
Many books and documentaries focus more extensively on his earlier work, with 1975 typically serving as the cutoff date. According to an article in "The Independent", critical praise for Davis' output from 1975 onwards declined, with many viewing the era as worthless: "There is a surprisingly widespread view that, in terms of the merits of his musical output, Davis might as well have died in 1975." In a 1982 interview in "Downbeat", Wynton Marsalis said: "They call Miles's stuff jazz. That stuff is not jazz, man. Just because somebody played jazz at one time, that doesn't mean they're still playing it." Despite his contempt for Davis' later work, Marsalis' own work is "laden with ironic references to Davis' music of the '60s." Writer Stanley Crouch criticised Davis' work from "In a Silent Way" onwards.
Miles Davis is considered one of the most innovative, influential, and respected figures in the history of music. Based on professional rankings of his albums and songs, the aggregate website Acclaimed Music lists him as the 16th most acclaimed recording artist in history. "The Guardian" described him as "a pioneer of 20th-century music, leading many of the key developments in the world of jazz." He has been called "one of the great innovators in jazz", and had the titles Prince of Darkness and the Picasso of Jazz bestowed upon him. "The Rolling Stone Encyclopedia of Rock & Roll" said, "Miles Davis played a crucial and inevitably controversial role in every major development in jazz since the mid-'40s, and no other jazz musician has had so profound an effect on rock. Miles Davis was the most widely recognized jazz musician of his era, an outspoken social critic and an arbiter of style—in attitude and fashion—as well as music."
William Ruhlmann of AllMusic wrote, "To examine his career is to examine the history of jazz from the mid-1940s to the early 1990s, since he was in the thick of almost every important innovation and stylistic development in the music during that period ... It can even be argued that jazz stopped evolving when Davis wasn't there to push it forward." Francis Davis of The Atlantic notes that Davis' career can be seen as a critique of the jazz music of his time, specifically bebop. Music writer Christopher Smith wrote,
Miles Davis' artistic interest was in the creation and manipulation of ritual space, in which gestures could be endowed with symbolic power sufficient to form a functional communicative, and hence musical, vocabulary. ... Miles' performance tradition emphasized orality and the transmission of information and artistic insight from individual to individual. His position in that tradition, and his personality, talents, and artistic interests, impelled him to pursue a uniquely individual solution to the problems and the experiential possibilities of improvised performance.
His approach, owing largely to the African-American performance tradition that focused on individual expression, emphatic interaction, and creative response to shifting contents, had a profound impact on generations of jazz musicians. Musicians and admirers of Davis' work include Carlos Santana, Herbie Hancock, Flea, The Roots, and Wayne Shorter. In 2016, the digital publication The Pudding, in an article examining Davis' legacy, found that 2,452 Wikipedia pages mention Davis, with more than 286 citing him as an influence.
"Kind of Blue" remains the best-selling jazz album of all time. On November 5, 2009, U.S. Representative John Conyers of Michigan sponsored a measure in the United States House of Representatives to commemorate the album on its 50th anniversary. The measure also affirms jazz as a national treasure and "encourages the United States government to preserve and advance the art form of jazz music". It passed with a vote of 409–0 on December 15, 2009. The trumpet Davis used on the recording is displayed on the campus of the University of North Carolina at Greensboro. It was donated to the school by Arthur "Buddy" Gist, who met Davis in 1949 and became a close friend. The gift was the reason why the jazz program at UNCG is named the Miles Davis Jazz Studies Program.
In 1986, the New England Conservatory awarded Davis an honorary doctorate for his contributions to music. Since 1960, the National Academy of Recording Arts and Sciences (NARAS) has honored him with eight Grammy Awards, a Grammy Lifetime Achievement Award, and three Grammy Hall of Fame Awards.
In 2001, a two-hour documentary film by Mike Dibb entitled "The Miles Davis Story" won an International Emmy Award for arts documentary of the year. Since 2005, the Miles Davis Jazz Committee has held an annual Miles Davis Jazz Festival; the festival was cancelled in 2020 due to the COVID-19 pandemic. Also in 2005, a London exhibition of his paintings was held, "The Last Miles: The Music of Miles Davis, 1980–1991" was released detailing his final years, and eight of his albums from the 1960s and 1970s were reissued in celebration of the 50th anniversary of his signing to Columbia Records. In 2006, Davis was inducted into the Rock and Roll Hall of Fame. In 2012, the U.S. Postal Service issued commemorative stamps featuring Davis.
"Miles Ahead" is a 2015 American music film directed by Don Cheadle, co-written by Cheadle with Steven Baigelman, Stephen J. Rivele, and Christopher Wilkinson, which interprets the life and compositions of Davis. It premiered at the New York Film Festival in October 2015. The film stars Cheadle, Emayatzy Corinealdi as Frances Taylor, Ewan McGregor, Michael Stuhlbarg, and Lakeith Stanfield. That same year, a statue of him was erected in his home city of Alton, Illinois, and listeners of BBC Radio and Jazz FM voted Davis the greatest jazz musician; publications such as The Guardian have also ranked him among the greatest. On May 27, 2016, American pianist and record producer Robert Glasper released a tribute album entitled "Everything's Beautiful", which features 11 reinterpretations of Davis songs.
In 2018, American rapper Q-Tip played Miles Davis in a theatre production entitled "My Funny Valentine"; he had previously played Davis in 2010. In 2019, the documentary "Miles Davis: Birth of the Cool", directed by Stanley Nelson, premiered at the Sundance Film Festival and was later broadcast on PBS' "American Masters" series.
Davis has, however, been subject to criticism. In 1990, writer Stanley Crouch labelled Davis "the most brilliant sellout in the history of jazz." A 1993 essay by Robert Walser in "The Musical Quarterly" claims that "Davis has long been infamous for missing more notes than any other major trumpet player." The essay also quotes music critic James Lincoln Collier, who states that "if his influence was profound, the ultimate value of his work is another matter," and calls Davis an "adequate instrumentalist" but "not a great one." In 2013, The A.V. Club published an article titled "Miles Davis beat his wives "and" made beautiful music", in which writer Sonia Saraiya praises Davis as a musician but criticises him as a person, in particular for his abuse of his wives. Others, such as Francis Davis, have criticised his treatment of women, describing it as "contemptible".
Grammy Awards
Other awards
M-theory
M-theory is a theory in physics that unifies all consistent versions of superstring theory. Edward Witten first conjectured the existence of such a theory at a string-theory conference at the University of Southern California in the spring of 1995. Witten's announcement initiated a flurry of research activity known as the second superstring revolution.
Prior to Witten's announcement, string theorists had identified five versions of superstring theory. Although these theories appeared, at first, to be very different, work by several physicists showed that the theories were related in intricate and nontrivial ways. Physicists found that apparently distinct theories could be unified by mathematical transformations called S-duality and T-duality. Witten's conjecture was based in part on the existence of these dualities and in part on the relationship of the string theories to a field theory called eleven-dimensional supergravity.
Although a complete formulation of M-theory is not known, such a formulation should describe two- and five-dimensional objects called branes and should be approximated by eleven-dimensional supergravity at low energies. Modern attempts to formulate M-theory are typically based on matrix theory or the AdS/CFT correspondence.
According to Witten, M should stand for "magic", "mystery" or "membrane" according to taste, and the true meaning of the title should be decided when a more fundamental formulation of the theory is known.
Investigations of the mathematical structure of M-theory have spawned important theoretical results in physics and mathematics. More speculatively, M-theory may provide a framework for developing a unified theory of all of the fundamental forces of nature. Attempts to connect M-theory to experiment typically focus on compactifying its extra dimensions to construct candidate models of our four-dimensional world, although so far none has been verified to give rise to physics as observed in high-energy physics experiments.
One of the deepest problems in modern physics is the problem of quantum gravity. The current understanding of gravity is based on Albert Einstein's general theory of relativity, which is formulated within the framework of classical physics. However, nongravitational forces are described within the framework of quantum mechanics, a radically different formalism for describing physical phenomena based on probability. A quantum theory of gravity is needed in order to reconcile general relativity with the principles of quantum mechanics, but difficulties arise when one attempts to apply the usual prescriptions of quantum theory to the force of gravity.
String theory is a theoretical framework that attempts to reconcile gravity and quantum mechanics. In string theory, the point-like particles of particle physics are replaced by one-dimensional objects called strings. String theory describes how strings propagate through space and interact with each other. In a given version of string theory, there is only one kind of string, which may look like a small loop or segment of ordinary string, and it can vibrate in different ways. On distance scales larger than the string scale, a string will look just like an ordinary particle, with its mass, charge, and other properties determined by the vibrational state of the string. In this way, all of the different elementary particles may be viewed as vibrating strings. One of the vibrational states of a string gives rise to the graviton, a quantum mechanical particle that carries gravitational force.
There are several versions of string theory: type I, type IIA, type IIB, and two flavors of heterotic string theory (SO(32) and E8 × E8). The different theories allow different types of strings, and the particles that arise at low energies exhibit different symmetries. For example, the type I theory includes both open strings (which are segments with endpoints) and closed strings (which form closed loops), while types IIA and IIB include only closed strings. Each of these five string theories arises as a special limiting case of M-theory. This theory, like its string theory predecessors, is an example of a quantum theory of gravity. It describes a force just like the familiar gravitational force subject to the rules of quantum mechanics.
In everyday life, there are three familiar dimensions of space: height, width and depth. Einstein's general theory of relativity treats time as a dimension on par with the three spatial dimensions; in general relativity, space and time are not modeled as separate entities but are instead unified to a four-dimensional spacetime, three spatial dimensions and one time dimension. In this framework, the phenomenon of gravity is viewed as a consequence of the geometry of spacetime.
In spite of the fact that the universe is well described by four-dimensional spacetime, there are several reasons why physicists consider theories in other dimensions. In some cases, by modeling spacetime in a different number of dimensions, a theory becomes more mathematically tractable, and one can perform calculations and gain general insights more easily. There are also situations where theories in two or three spacetime dimensions are useful for describing phenomena in condensed matter physics. Finally, there exist scenarios in which there could actually be more than four dimensions of spacetime which have nonetheless managed to escape detection.
One notable feature of string theory and M-theory is that these theories require extra dimensions of spacetime for their mathematical consistency. In string theory, spacetime is "ten-dimensional" (nine spatial dimensions, and one time dimension), while in M-theory it is "eleven-dimensional" (ten spatial dimensions, and one time dimension). In order to describe real physical phenomena using these theories, one must therefore imagine scenarios in which these extra dimensions would not be observed in experiments.
Compactification is one way of modifying the number of dimensions in a physical theory. In compactification, some of the extra dimensions are assumed to "close up" on themselves to form circles. In the limit where these curled up dimensions become very small, one obtains a theory in which spacetime has effectively a lower number of dimensions. A standard analogy for this is to consider a multidimensional object such as a garden hose. If the hose is viewed from a sufficient distance, it appears to have only one dimension, its length. However, as one approaches the hose, one discovers that it contains a second dimension, its circumference. Thus, an ant crawling on the surface of the hose would move in two dimensions.
Theories that arise as different limits of M-theory turn out to be related in highly nontrivial ways. One of the relationships that can exist between these different physical theories is called S-duality. This is a relationship which says that a collection of strongly interacting particles in one theory can, in some cases, be viewed as a collection of weakly interacting particles in a completely different theory. Roughly speaking, a collection of particles is said to be strongly interacting if they combine and decay often and weakly interacting if they do so infrequently. Type I string theory turns out to be equivalent by S-duality to the SO(32) heterotic string theory. Similarly, type IIB string theory is related to itself in a nontrivial way by S-duality.
Another relationship between different string theories is T-duality. Here one considers strings propagating around a circular extra dimension. T-duality states that a string propagating around a circle of radius R is equivalent to a string propagating around a circle of radius 1/R in the sense that all observable quantities in one description are identified with quantities in the dual description. For example, a string has momentum as it propagates around a circle, and it can also wind around the circle one or more times. The number of times the string winds around a circle is called the winding number. If a string has momentum p and winding number n in one description, it will have momentum n and winding number p in the dual description. For example, type IIA string theory is equivalent to type IIB string theory via T-duality, and the two versions of heterotic string theory are also related by T-duality.
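The momentum–winding exchange can be made concrete with the standard closed-string mass formula (a textbook expression not quoted in the text; α' is the square of the fundamental string length, and the radii R and 1/R above correspond to string units with α' = 1):

```latex
% Mass spectrum of a closed string on a circle of radius R,
% with momentum number n and winding number w:
M^2 = \left(\frac{n}{R}\right)^2 + \left(\frac{wR}{\alpha'}\right)^2 + \text{oscillator terms}

% The spectrum is unchanged under the T-duality map, which exchanges
% the radius with its inverse and momentum with winding:
R \;\longleftrightarrow\; \frac{\alpha'}{R}, \qquad n \;\longleftrightarrow\; w
```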
In general, the term "duality" refers to a situation where two seemingly different physical systems turn out to be equivalent in a nontrivial way. If two theories are related by a duality, it means that one theory can be transformed in some way so that it ends up looking just like the other theory. The two theories are then said to be "dual" to one another under the transformation. Put differently, the two theories are mathematically different descriptions of the same phenomena.
Another important theoretical idea that plays a role in M-theory is supersymmetry. This is a mathematical relation that exists in certain physical theories between a class of particles called bosons and a class of particles called fermions. Roughly speaking, fermions are the constituents of matter, while bosons mediate interactions between particles. In theories with supersymmetry, each boson has a counterpart which is a fermion, and vice versa. When supersymmetry is imposed as a local symmetry, one automatically obtains a quantum mechanical theory that includes gravity. Such a theory is called a supergravity theory.
A theory of strings that incorporates the idea of supersymmetry is called a superstring theory. There are several different versions of superstring theory which are all subsumed within the M-theory framework. At low energies, the superstring theories are approximated by supergravity in ten spacetime dimensions. Similarly, M-theory is approximated at low energies by supergravity in eleven dimensions.
In string theory and related theories such as supergravity theories, a brane is a physical object that generalizes the notion of a point particle to higher dimensions. For example, a point particle can be viewed as a brane of dimension zero, while a string can be viewed as a brane of dimension one. It is also possible to consider higher-dimensional branes. In dimension p, these are called p-branes. Branes are dynamical objects which can propagate through spacetime according to the rules of quantum mechanics. They can have mass and other attributes such as charge. A p-brane sweeps out a (p+1)-dimensional volume in spacetime called its "worldvolume". Physicists often study fields analogous to the electromagnetic field which live on the worldvolume of a brane. The word brane comes from the word "membrane" which refers to a two-dimensional brane.
In string theory, the fundamental objects that give rise to elementary particles are the one-dimensional strings. Although the physical phenomena described by M-theory are still poorly understood, physicists know that the theory describes two- and five-dimensional branes. Much of the current research in M-theory attempts to better understand the properties of these branes.
In the early 20th century, physicists and mathematicians including Albert Einstein and Hermann Minkowski pioneered the use of four-dimensional geometry for describing the physical world. These efforts culminated in the formulation of Einstein's general theory of relativity, which relates gravity to the geometry of four-dimensional spacetime.
The success of general relativity led to efforts to apply higher dimensional geometry to explain other forces. In 1919, work by Theodor Kaluza showed that by passing to five-dimensional spacetime, one can unify gravity and electromagnetism into a single force. This idea was improved by physicist Oskar Klein, who suggested that the additional dimension proposed by Kaluza could take the form of a circle with radius around 10^-30 cm.
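Kaluza's mechanism can be sketched by the usual decomposition of the five-dimensional metric (schematic, with conventional scalar-field factors suppressed):

```latex
% Five-dimensional metric decomposed into four-dimensional fields
% (mu, nu = 0,...,3; the fifth coordinate runs around Klein's circle):
g_{MN} \;\sim\;
\begin{pmatrix}
 g_{\mu\nu} + \phi\, A_\mu A_\nu & \phi\, A_\mu \\
 \phi\, A_\nu & \phi
\end{pmatrix}
% g_{\mu\nu} : the four-dimensional gravitational field
% A_\mu     : the electromagnetic potential
% \phi      : an extra scalar field setting the size of the circle
```

The extra dimension thus packages the electromagnetic potential inside pure five-dimensional gravity, which is the sense in which the two forces are unified.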
The Kaluza–Klein theory and subsequent attempts by Einstein to develop unified field theory were never completely successful. In part this was because Kaluza–Klein theory predicted a particle that has never been shown to exist, and in part because it was unable to correctly predict the ratio of an electron's mass to its charge. In addition, these theories were being developed just as other physicists were beginning to discover quantum mechanics, which would ultimately prove successful in describing known forces such as electromagnetism, as well as new nuclear forces that were being discovered throughout the middle part of the century. Thus it would take almost fifty years for the idea of new dimensions to be taken seriously again.
New concepts and mathematical tools provided fresh insights into general relativity, giving rise to a period in the 1960s–70s now known as the golden age of general relativity. In the mid-1970s, physicists began studying higher-dimensional theories combining general relativity with supersymmetry, the so-called supergravity theories.
General relativity does not place any limits on the possible dimensions of spacetime. Although the theory is typically formulated in four dimensions, one can write down the same equations for the gravitational field in any number of dimensions. Supergravity is more restrictive because it places an upper limit on the number of dimensions. In 1978, work by Werner Nahm showed that the maximum spacetime dimension in which one can formulate a consistent supersymmetric theory is eleven. In the same year, Eugene Cremmer, Bernard Julia, and Joel Scherk of the École Normale Supérieure showed that supergravity not only permits up to eleven dimensions but is in fact most elegant in this maximal number of dimensions.
Initially, many physicists hoped that by compactifying eleven-dimensional supergravity, it might be possible to construct realistic models of our four-dimensional world. The hope was that such models would provide a unified description of the four fundamental forces of nature: electromagnetism, the strong and weak nuclear forces, and gravity. Interest in eleven-dimensional supergravity soon waned as various flaws in this scheme were discovered. One of the problems was that the laws of physics appear to distinguish between clockwise and counterclockwise, a phenomenon known as chirality. Edward Witten and others observed that this chirality property cannot be readily derived by compactifying from eleven dimensions.
In the first superstring revolution in 1984, many physicists turned to string theory as a unified theory of particle physics and quantum gravity. Unlike supergravity theory, string theory was able to accommodate the chirality of the standard model, and it provided a theory of gravity consistent with quantum effects. Another feature of string theory that many physicists were drawn to in the 1980s and 1990s was its high degree of uniqueness. In ordinary particle theories, one can consider any collection of elementary particles whose classical behavior is described by an arbitrary Lagrangian. In string theory, the possibilities are much more constrained: by the 1990s, physicists had argued that there were only five consistent supersymmetric versions of the theory.
Although there were only a handful of consistent superstring theories, it remained a mystery why there was not just one consistent formulation. However, as physicists began to examine string theory more closely, they realized that these theories are related in intricate and nontrivial ways.
In the late 1970s, Claus Montonen and David Olive had conjectured a special property of certain physical theories. A sharpened version of their conjecture concerns a theory called N = 4 supersymmetric Yang–Mills theory, which describes theoretical particles formally similar to the quarks and gluons that make up atomic nuclei. The strength with which the particles of this theory interact is measured by a number called the coupling constant. The result of Montonen and Olive, now known as Montonen–Olive duality, states that N = 4 supersymmetric Yang–Mills theory with coupling constant g is equivalent to the same theory with coupling constant 1/g. In other words, a system of strongly interacting particles (large coupling constant) has an equivalent description as a system of weakly interacting particles (small coupling constant) and vice versa.
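Schematically, the duality acts on the coupling as follows (a standard sketch; the refinement with the theta angle goes beyond the statement above):

```latex
% Montonen–Olive duality for N = 4 super Yang–Mills exchanges
% strong and weak coupling:
g \;\longleftrightarrow\; \frac{1}{g}

% With the theta angle included, one forms the complexified coupling
\tau = \frac{\theta}{2\pi} + \frac{4\pi i}{g^2}
% and the duality extends to an SL(2,Z) symmetry generated by
\tau \to \tau + 1 \quad\text{and}\quad \tau \to -\frac{1}{\tau}
```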
In the 1990s, several theorists generalized Montonen–Olive duality to the S-duality relationship, which connects different string theories. Ashoke Sen studied S-duality in the context of heterotic strings in four dimensions. Chris Hull and Paul Townsend showed that type IIB string theory with a large coupling constant is equivalent via S-duality to the same theory with small coupling constant. Theorists also found that different string theories may be related by T-duality. This duality implies that strings propagating on completely different spacetime geometries may be physically equivalent.
String theory extends ordinary particle physics by replacing zero-dimensional point particles by one-dimensional objects called strings. In the late 1980s, it was natural for theorists to attempt to formulate other extensions in which particles are replaced by two-dimensional supermembranes or by higher-dimensional objects called branes. Such objects had been considered as early as 1962 by Paul Dirac, and they were reconsidered by a small but enthusiastic group of physicists in the 1980s.
Supersymmetry severely restricts the possible number of dimensions of a brane. In 1987, Eric Bergshoeff, Ergin Sezgin, and Paul Townsend showed that eleven-dimensional supergravity includes two-dimensional branes. Intuitively, these objects look like sheets or membranes propagating through the eleven-dimensional spacetime. Shortly after this discovery, Michael Duff, Paul Howe, Takeo Inami, and Kellogg Stelle considered a particular compactification of eleven-dimensional supergravity with one of the dimensions curled up into a circle. In this setting, one can imagine the membrane wrapping around the circular dimension. If the radius of the circle is sufficiently small, then this membrane looks just like a string in ten-dimensional spacetime. In fact, Duff and his collaborators showed that this construction reproduces exactly the strings appearing in type IIA superstring theory.
In 1990, Andrew Strominger published a similar result which suggested that strongly interacting strings in ten dimensions might have an equivalent description in terms of weakly interacting five-dimensional branes. Initially, physicists were unable to prove this relationship for two important reasons. On the one hand, the Montonen–Olive duality was still unproven, and so Strominger's conjecture was even more tenuous. On the other hand, there were many technical issues related to the quantum properties of five-dimensional branes. The first of these problems was solved in 1993 when Ashoke Sen established that certain physical theories require the existence of objects with both electric and magnetic charge which were predicted by the work of Montonen and Olive.
In spite of this progress, the relationship between strings and five-dimensional branes remained conjectural because theorists were unable to quantize the branes. Starting in 1991, a team of researchers including Michael Duff, Ramzi Khuri, Jianxin Lu, and Ruben Minasian considered a special compactification of string theory in which four of the ten dimensions curl up. If one considers a five-dimensional brane wrapped around these extra dimensions, then the brane looks just like a one-dimensional string. In this way, the conjectured relationship between strings and branes was reduced to a relationship between strings and strings, and the latter could be tested using already established theoretical techniques.
Speaking at the string theory conference at the University of Southern California in 1995, Edward Witten of the Institute for Advanced Study made the surprising suggestion that all five superstring theories were in fact just different limiting cases of a single theory in eleven spacetime dimensions. Witten's announcement drew together all of the previous results on S- and T-duality and the appearance of two- and five-dimensional branes in string theory. In the months following Witten's announcement, hundreds of new papers appeared on the Internet confirming that the new theory involved membranes in an important way. Today this flurry of work is known as the second superstring revolution.
One of the important developments following Witten's announcement was Witten's work in 1996 with string theorist Petr Hořava. Witten and Hořava studied M-theory on a special spacetime geometry with two ten-dimensional boundary components. Their work shed light on the mathematical structure of M-theory and suggested possible ways of connecting M-theory to real world physics.
Initially, some physicists suggested that the new theory was a fundamental theory of membranes, but Witten was skeptical of the role of membranes in the theory. In a paper from 1996, Hořava and Witten wrote: "As it has been proposed that the eleven-dimensional theory is a supermembrane theory but there are some reasons to doubt that interpretation, we will non-committally call it the M-theory, leaving to the future the relation of M to membranes."
In the absence of an understanding of the true meaning and structure of M-theory, Witten has suggested that the "M" should stand for "magic", "mystery", or "membrane" according to taste, and the true meaning of the title should be decided when a more fundamental formulation of the theory is known.
In mathematics, a matrix is a rectangular array of numbers or other data. In physics, a matrix model is a particular kind of physical theory whose mathematical formulation involves the notion of a matrix in an important way. A matrix model describes the behavior of a set of matrices within the framework of quantum mechanics.
One important example of a matrix model is the BFSS matrix model proposed by Tom Banks, Willy Fischler, Stephen Shenker, and Leonard Susskind in 1997. This theory describes the behavior of a set of nine large matrices. In their original paper, these authors showed, among other things, that the low energy limit of this matrix model is described by eleven-dimensional supergravity. These calculations led them to propose that the BFSS matrix model is exactly equivalent to M-theory. The BFSS matrix model can therefore be used as a prototype for a correct formulation of M-theory and a tool for investigating the properties of M-theory in a relatively simple setting.
In geometry, it is often useful to introduce coordinates. For example, in order to study the geometry of the Euclidean plane, one defines the coordinates x and y as the distances between any point in the plane and a pair of axes. In ordinary geometry, the coordinates of a point are numbers, so they can be multiplied, and the product of two coordinates does not depend on the order of multiplication. That is, xy = yx. This property of multiplication is known as the commutative law, and this relationship between geometry and the commutative algebra of coordinates is the starting point for much of modern geometry.
Noncommutative geometry is a branch of mathematics that attempts to generalize this situation. Rather than working with ordinary numbers, one considers some similar objects, such as matrices, whose multiplication does not satisfy the commutative law (that is, objects x and y for which xy is not necessarily equal to yx). One imagines that these noncommuting objects are coordinates on some more general notion of "space" and proves theorems about these generalized spaces by exploiting the analogy with ordinary geometry.
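A minimal illustration of noncommuting "coordinates", using 2×2 matrices in place of ordinary numbers (plain Python, purely for illustration):

```python
def matmul(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Two matrices standing in for a pair of noncommuting coordinates.
x = [[0, 1],
     [0, 0]]
y = [[0, 0],
     [1, 0]]

xy = matmul(x, y)
yx = matmul(y, x)

print(xy)        # [[1, 0], [0, 0]]
print(yx)        # [[0, 0], [0, 1]]
print(xy != yx)  # True: the product depends on the order of the factors
```

Ordinary numbers would give xy = yx in every case; with matrices, the failure of commutativity is the rule rather than the exception.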
In a paper from 1998, Alain Connes, Michael R. Douglas, and Albert Schwarz showed that some aspects of matrix models and M-theory are described by a noncommutative quantum field theory, a special kind of physical theory in which the coordinates on spacetime do not satisfy the commutativity property. This established a link between matrix models and M-theory on the one hand, and noncommutative geometry on the other hand. It quickly led to the discovery of other important links between noncommutative geometry and various physical theories.
The application of quantum mechanics to physical objects such as the electromagnetic field, which are extended in space and time, is known as quantum field theory. In particle physics, quantum field theories form the basis for our understanding of elementary particles, which are modeled as excitations in the fundamental fields. Quantum field theories are also used throughout condensed matter physics to model particle-like objects called quasiparticles.
One approach to formulating M-theory and studying its properties is provided by the anti-de Sitter/conformal field theory (AdS/CFT) correspondence. Proposed by Juan Maldacena in late 1997, the AdS/CFT correspondence is a theoretical result which implies that M-theory is in some cases equivalent to a quantum field theory. In addition to providing insights into the mathematical structure of string and M-theory, the AdS/CFT correspondence has shed light on many aspects of quantum field theory in regimes where traditional calculational techniques are ineffective.
In the AdS/CFT correspondence, the geometry of spacetime is described in terms of a certain vacuum solution of Einstein's equation called anti-de Sitter space. In very elementary terms, anti-de Sitter space is a mathematical model of spacetime in which the notion of distance between points (the metric) is different from the notion of distance in ordinary Euclidean geometry. It is closely related to hyperbolic space, which can be viewed as a disk as illustrated on the left. This image shows a tessellation of a disk by triangles and squares. One can define the distance between points of this disk in such a way that all the triangles and squares are the same size and the circular outer boundary is infinitely far from any point in the interior.
Now imagine a stack of hyperbolic disks where each disk represents the state of the universe at a given time. The resulting geometric object is three-dimensional anti-de Sitter space. It looks like a solid cylinder in which any cross section is a copy of the hyperbolic disk. Time runs along the vertical direction in this picture. The surface of this cylinder plays an important role in the AdS/CFT correspondence. As with the hyperbolic plane, anti-de Sitter space is curved in such a way that any point in the interior is actually infinitely far from this boundary surface.
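In formulas, three-dimensional anti-de Sitter space can be written in global coordinates as follows (a standard expression, not given in the text above):

```latex
% Three-dimensional anti-de Sitter space in global coordinates
% (L is the curvature radius; rho >= 0, and phi is periodic):
ds^2 = L^2\left(-\cosh^2\!\rho\, dt^2 + d\rho^2 + \sinh^2\!\rho\, d\phi^2\right)
% The boundary cylinder sits at rho -> infinity; the proper radial
% distance \int L\, d\rho to it diverges, which is the precise sense
% in which interior points are infinitely far from the boundary.
```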
This construction describes a hypothetical universe with only two space dimensions and one time dimension, but it can be generalized to any number of dimensions. Indeed, hyperbolic space can have more than two dimensions and one can "stack up" copies of hyperbolic space to get higher-dimensional models of anti-de Sitter space.
An important feature of anti-de Sitter space is its boundary (which looks like a cylinder in the case of three-dimensional anti-de Sitter space). One property of this boundary is that, within a small region on the surface around any given point, it looks just like Minkowski space, the model of spacetime used in nongravitational physics. One can therefore consider an auxiliary theory in which "spacetime" is given by the boundary of anti-de Sitter space. This observation is the starting point for AdS/CFT correspondence, which states that the boundary of anti-de Sitter space can be regarded as the "spacetime" for a quantum field theory. The claim is that this quantum field theory is equivalent to the gravitational theory on the bulk anti-de Sitter space in the sense that there is a "dictionary" for translating entities and calculations in one theory into their counterparts in the other theory. For example, a single particle in the gravitational theory might correspond to some collection of particles in the boundary theory. In addition, the predictions in the two theories are quantitatively identical so that if two particles have a 40 percent chance of colliding in the gravitational theory, then the corresponding collections in the boundary theory would also have a 40 percent chance of colliding.
One particular realization of the AdS/CFT correspondence states that M-theory on the product space AdS7 × S4 is equivalent to the so-called (2,0)-theory on the six-dimensional boundary. Here "(2,0)" refers to the particular type of supersymmetry that appears in the theory. In this example, the spacetime of the gravitational theory is effectively seven-dimensional (hence the notation AdS7), and there are four additional "compact" dimensions (encoded by the S4 factor). In the real world, spacetime is four-dimensional, at least macroscopically, so this version of the correspondence does not provide a realistic model of gravity. Likewise, the dual theory is not a viable model of any real-world system since it describes a world with six spacetime dimensions.
Nevertheless, the (2,0)-theory has proven to be important for studying the general properties of quantum field theories. Indeed, this theory subsumes many mathematically interesting effective quantum field theories and points to new dualities relating these theories. For example, Luis Alday, Davide Gaiotto, and Yuji Tachikawa showed that by compactifying this theory on a surface, one obtains a four-dimensional quantum field theory, and there is a duality known as the AGT correspondence which relates the physics of this theory to certain physical concepts associated with the surface itself. More recently, theorists have extended these ideas to study the theories obtained by compactifying down to three dimensions.
In addition to its applications in quantum field theory, the (2,0)-theory has spawned important results in pure mathematics. For example, the existence of the (2,0)-theory was used by Witten to give a "physical" explanation for a conjectural relationship in mathematics called the geometric Langlands correspondence. In subsequent work, Witten showed that the (2,0)-theory could be used to understand a concept in mathematics called Khovanov homology. Developed by Mikhail Khovanov around 2000, Khovanov homology provides a tool in knot theory, the branch of mathematics that studies and classifies the different shapes of knots. Another application of the (2,0)-theory in mathematics is the work of Davide Gaiotto, Greg Moore, and Andrew Neitzke, which used physical ideas to derive new results in hyperkähler geometry.
Another realization of the AdS/CFT correspondence states that M-theory on AdS4 × S7 is equivalent to a quantum field theory called the ABJM theory in three dimensions. In this version of the correspondence, seven of the dimensions of M-theory are curled up, leaving four non-compact dimensions. Since the spacetime of our universe is four-dimensional, this version of the correspondence provides a somewhat more realistic description of gravity.
The ABJM theory appearing in this version of the correspondence is also interesting for a variety of reasons. Introduced by Aharony, Bergman, Jafferis, and Maldacena, it is closely related to another quantum field theory called Chern–Simons theory. The latter theory was popularized by Witten in the late 1980s because of its applications to knot theory. In addition, the ABJM theory serves as a semi-realistic simplified model for solving problems that arise in condensed matter physics.
In addition to being an idea of considerable theoretical interest, M-theory provides a framework for constructing models of real world physics that combine general relativity with the standard model of particle physics. Phenomenology is the branch of theoretical physics in which physicists construct realistic models of nature from more abstract theoretical ideas. String phenomenology is the part of string theory that attempts to construct realistic models of particle physics based on string and M-theory.
Typically, such models are based on the idea of compactification. Starting with the ten- or eleven-dimensional spacetime of string or M-theory, physicists postulate a shape for the extra dimensions. By choosing this shape appropriately, they can construct models roughly similar to the standard model of particle physics, together with additional undiscovered particles, usually supersymmetric partners to analogues of known particles. One popular way of deriving realistic physics from string theory is to start with the heterotic theory in ten dimensions and assume that the six extra dimensions of spacetime are shaped like a six-dimensional Calabi–Yau manifold. This is a special kind of geometric object named after mathematicians Eugenio Calabi and Shing-Tung Yau. Calabi–Yau manifolds offer many ways of extracting realistic physics from string theory. Other similar methods can be used to construct models with physics resembling to some extent that of our four-dimensional world based on M-theory.
Partly because of theoretical and mathematical difficulties and partly because of the extremely high energies (beyond what is technologically possible for the foreseeable future) needed to test these theories experimentally, there is so far no experimental evidence that would unambiguously point to any of these models being a correct fundamental description of nature. This has led some in the community to criticize these approaches to unification and question the value of continued research on these problems.
In one approach to M-theory phenomenology, theorists assume that the seven extra dimensions of M-theory are shaped like a G2 manifold. This is a special kind of seven-dimensional shape constructed by mathematician Dominic Joyce of the University of Oxford. These G2 manifolds are still poorly understood mathematically, and this fact has made it difficult for physicists to fully develop this approach to phenomenology.
For example, physicists and mathematicians often assume that space has a mathematical property called smoothness, but this property cannot be assumed in the case of a G2 manifold if one wishes to recover the physics of our four-dimensional world. Another problem is that G2 manifolds are not complex manifolds, so theorists are unable to use tools from the branch of mathematics known as complex analysis. Finally, there are many open questions about the existence, uniqueness, and other mathematical properties of G2 manifolds, and mathematicians lack a systematic way of searching for these manifolds.
Because of the difficulties with G2 manifolds, most attempts to construct realistic theories of physics based on M-theory have taken a more indirect approach to compactifying eleven-dimensional spacetime. One approach, pioneered by Witten, Hořava, Burt Ovrut, and others, is known as heterotic M-theory. In this approach, one imagines that one of the eleven dimensions of M-theory is shaped like a circle. If this circle is very small, then the spacetime becomes effectively ten-dimensional. One then assumes that six of the ten dimensions form a Calabi–Yau manifold. If this Calabi–Yau manifold is also taken to be small, one is left with a theory in four dimensions.
Heterotic M-theory has been used to construct models of brane cosmology in which the observable universe is thought to exist on a brane in a higher dimensional ambient space. It has also spawned alternative theories of the early universe that do not rely on the theory of cosmic inflation.
Multicast
In computer networking, multicast is group communication where data transmission is addressed to a group of destination computers simultaneously. Multicast can be one-to-many or many-to-many distribution. Multicast should not be confused with physical layer point-to-multipoint communication.
Group communication may either be "application layer multicast" or "network assisted multicast", where the latter makes it possible for the source to efficiently send to the group in a single transmission. Copies are automatically created in other network elements, such as routers, switches and cellular network base stations, but only to network segments that currently contain members of the group. Network assisted multicast may be implemented at the data link layer using one-to-many addressing and switching such as Ethernet multicast addressing, Asynchronous Transfer Mode (ATM), point-to-multipoint virtual circuits (P2MP) or Infiniband multicast. Network assisted multicast may also be implemented at the Internet layer using IP multicast. In IP multicast the implementation of the multicast concept occurs at the IP routing level, where routers create optimal distribution paths for datagrams sent to a multicast destination address.
Multicast is often employed in Internet Protocol (IP) applications of streaming media, such as IPTV and multipoint videoconferencing.
Ethernet frames with a value of 1 in the least-significant bit of the first octet of the destination address are treated as multicast frames and are flooded to all points on the network. This mechanism constitutes multicast at the data link layer. This mechanism is used by IP multicast to achieve one-to-many transmission for IP on Ethernet networks. Modern Ethernet controllers filter received packets to reduce CPU load, by looking up the hash of a multicast destination address in a table, initialized by software, which controls whether a multicast packet is dropped or fully received.
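The multicast bit described above is easy to test for programmatically. A minimal sketch (the function name and sample addresses are illustrative, not from the source):

```python
def is_multicast_mac(mac: str) -> bool:
    """A destination MAC address denotes a multicast frame when the
    least-significant bit of its first octet (the I/G bit) is 1."""
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 0x01)

# IPv4 multicast frames map onto the 01:00:5e Ethernet prefix, so the bit is set:
print(is_multicast_mac("01:00:5e:00:00:fb"))  # True
print(is_multicast_mac("00:1a:2b:3c:4d:5e"))  # False (ordinary unicast address)
print(is_multicast_mac("ff:ff:ff:ff:ff:ff"))  # True (broadcast is a special multicast)
```

Note that the all-ones broadcast address also satisfies this test, which is consistent with broadcast being treated as a special case of multicast at the data link layer.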
Ethernet multicast is available on all Ethernet networks. Multicasts span the broadcast domain of the network. Multiple Registration Protocol can be used to control Ethernet multicast delivery.
IP multicast is a technique for one-to-many communication over an IP network. The destination nodes send Internet Group Management Protocol "join" and "leave" messages, for example in the case of IPTV when the user changes from one TV channel to another. IP multicast scales to a larger receiver population by not requiring prior knowledge of who or how many receivers there are. Multicast uses network infrastructure efficiently by requiring the source to send a packet only once, even if it needs to be delivered to a large number of receivers. The nodes in the network take care of replicating the packet to reach multiple receivers only when necessary.
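The join/leave mechanism above is exposed to applications through socket options. A minimal receiver-side sketch in Python (the group address and port are hypothetical; the join is wrapped in a `try` because it requires a multicast-capable interface):

```python
import socket
import struct

GROUP = "239.1.2.3"   # hypothetical group in the administratively scoped IPv4 range
PORT = 5007           # hypothetical application port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# The membership request is the group address plus the local interface
# (0.0.0.0 = let the OS choose). Passing it to IP_ADD_MEMBERSHIP makes the
# OS emit an IGMP "join"; IP_DROP_MEMBERSHIP (or closing the socket)
# triggers the corresponding "leave".
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
try:
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    joined = True
    # data, addr = sock.recvfrom(1500)  # would now receive datagrams sent to the group
except OSError:
    joined = False  # no multicast-capable interface available in this environment
sock.close()
print("membership request bytes:", len(mreq))
```

Note that the sender needs no special setup at all: it simply sends a UDP datagram to the group address, and the network replicates it toward every joined receiver.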
The most common transport layer protocol to use multicast addressing is User Datagram Protocol (UDP). By its nature, UDP is not reliable—messages may be lost or delivered out of order. By adding loss detection and retransmission mechanisms, reliable multicast has been implemented on top of UDP or IP by various middleware products, e.g. those that implement the Real-Time Publish-Subscribe (RTPS) Protocol of the Object Management Group (OMG) Data Distribution Service (DDS) standard, as well as by special transport protocols such as Pragmatic General Multicast (PGM).
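The loss-detection half of such a reliable-multicast layer can be sketched simply, assuming each application datagram carries a monotonically increasing sequence number (the function and its NAK-based recovery framing are illustrative, not taken from any particular protocol):

```python
def detect_gaps(received_seqs):
    """Given sequence numbers in arrival order (assumed non-decreasing apart
    from gaps), return the numbers that were never delivered and would be
    requested for retransmission in a NAK-based reliable multicast scheme."""
    missing = []
    expected = received_seqs[0]
    for seq in received_seqs:
        while expected < seq:          # gap: these datagrams were lost
            missing.append(expected)
            expected += 1
        expected = max(expected, seq + 1)
    return missing

print(detect_gaps([1, 2, 5, 6, 8]))  # [3, 4, 7]
```

Real protocols such as PGM add timers, NAK suppression among receivers, and retransmission from the source or designated repairers on top of this basic idea.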
IP multicast is always available within the local subnet. Achieving IP multicast service over a wider area requires multicast routing. Many networks, including the Internet, do not support multicast routing. Multicast routing functionality is present in enterprise-grade network equipment but is typically left disabled until configured by a network administrator. The Internet Group Management Protocol is used to control IP multicast delivery.
Application layer multicast overlay services are not based on IP multicast or data link layer multicast. Instead they use multiple unicast transmissions to simulate a multicast. These services are designed for application-level group communication. Internet Relay Chat (IRC) implements a single spanning tree across its overlay network for all conference groups. The lesser-known PSYC technology uses custom multicast strategies per conference. Some peer-to-peer technologies employ the multicast concept known as peercasting when distributing content to multiple recipients.
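The "multiple unicast transmissions" strategy amounts to the sender looping over the group membership list itself. A minimal sketch over loopback, with two hypothetical group members:

```python
import socket

def app_layer_multicast(sock, payload, members):
    """Emulate one-to-many delivery with repeated unicast sends — the
    strategy overlay services fall back on when no network-assisted
    multicast is available."""
    for addr in members:
        sock.sendto(payload, addr)

# Demonstration over loopback with two "group members".
receivers = []
for _ in range(2):
    r = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    r.bind(("127.0.0.1", 0))       # ephemeral port
    r.settimeout(5)
    receivers.append(r)

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
app_layer_multicast(sender, b"hello group",
                    [r.getsockname() for r in receivers])

for r in receivers:
    data, _ = r.recvfrom(1500)
    print(data)                    # each member gets its own copy
    r.close()
sender.close()
```

The trade-off is clear from the loop: the sender's uplink carries one full copy per member, whereas network-assisted multicast would carry a single copy that the network replicates.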
Explicit multi-unicast (Xcast) is another multicast strategy that includes addresses of all intended destinations within each packet. As such, given maximum transmission unit limitations, Xcast cannot be used for multicast groups with many destinations. The Xcast model generally assumes that stations participating in the communication are known ahead of time, so that distribution trees can be generated and resources allocated by network elements in advance of actual data traffic.
Wireless communications (with the exception of point-to-point radio links using directional antennas) are inherently broadcasting media. However, the communication service provided may be unicast, multicast or broadcast, depending on whether the data is addressed to one, to a group, or to all receivers in the covered network, respectively.
In digital television, the concept of multicast service sometimes is used to refer to content protection by broadcast encryption, i.e. encrypted content over a simplex broadcast channel only addressed to paying viewers (pay television). In this case, data is broadcast (or distributed) to all receivers but only addressed to a specific group.
The concept of "interactive multicast", for example using IP multicast, may be used over TV broadcast networks to improve efficiency, offer more TV programs, or reduce the required spectrum. Interactive multicast implies that TV programs are sent only over transmitters where there are viewers and that only the most popular programs are transmitted. It relies on an additional interaction channel (a back-channel or return channel), where user equipment may send join and leave messages when the user changes TV channel. Interactive multicast has been suggested as an efficient transmission scheme in DVB-H and DVB-T2 terrestrial digital television systems. A similar concept is "switched broadcast" over cable-TV networks, where only the currently most popular content is delivered in the cable-TV network. Scalable video multicast is an application of interactive multicast, where a subset of the viewers receive additional data for high-resolution video.
TV gateways convert satellite (DVB-S, DVB-S2), cable (DVB-C, DVB-C2) and terrestrial television (DVB-T, DVB-T2) signals to IP for distribution using unicast and multicast in home, hospitality and enterprise applications.
Another similar concept is Cell-TV, which implies TV distribution over 3G cellular networks using the network-assisted multicasting offered by the Multimedia Broadcast Multicast Service (MBMS), or over 4G/LTE cellular networks with the eMBMS (enhanced MBMS) service.
Marie Curie
Marie Skłodowska Curie, born Maria Salomea Skłodowska (7 November 1867 – 4 July 1934), was a Polish and naturalized-French physicist and chemist who conducted pioneering research on radioactivity. She was the first woman to win a Nobel Prize, the first person and the only woman to win the Nobel Prize twice, and the only person to win the Nobel Prize in two different scientific fields. She was part of the Curie family legacy of five Nobel Prizes. She was also the first woman to become a professor at the University of Paris, and in 1995 became the first woman to be entombed on her own merits in the Panthéon in Paris.
She was born in Warsaw, in what was then the Kingdom of Poland, part of the Russian Empire. She studied at Warsaw's clandestine Flying University and began her practical scientific training in Warsaw. In 1891, aged 24, she followed her older sister Bronisława to study in Paris, where she earned her higher degrees and conducted her subsequent scientific work. She shared the 1903 Nobel Prize in Physics with her husband Pierre Curie and physicist Henri Becquerel. She won the 1911 Nobel Prize in Chemistry.
Her achievements include the development of the theory of "radioactivity" (a term she coined), techniques for isolating radioactive isotopes, and the discovery of two elements, polonium and radium. Under her direction, the world's first studies were conducted into the treatment of neoplasms using radioactive isotopes. She founded the Curie Institutes in Paris and in Warsaw, which remain major centres of medical research today. During World War I she developed mobile radiography units to provide X-ray services to field hospitals.
While a French citizen, Marie Skłodowska Curie, who used both surnames, never lost her sense of Polish identity. She taught her daughters the Polish language and took them on visits to Poland. She named the first chemical element she discovered "polonium", after her native country.
Marie Curie died in 1934, aged 66, at a sanatorium in Sancellemoz (Haute-Savoie), France, of aplastic anaemia from exposure to radiation in the course of her scientific research and in the course of her radiological work at field hospitals during World War I.
Maria Skłodowska was born in Warsaw, in Congress Poland in the Russian Empire, on 7 November 1867, the fifth and youngest child of well-known teachers Bronisława, "née" Boguska, and Władysław Skłodowski. The elder siblings of Maria (nicknamed "Mania") were Zofia (born 1862, nicknamed "Zosia"), Józef (born 1863, nicknamed "Józio"), Bronisława (born 1865, nicknamed "Bronia") and Helena (born 1866, nicknamed "Hela").
On both the paternal and maternal sides, the family had lost their property and fortunes through patriotic involvements in Polish national uprisings aimed at restoring Poland's independence (the most recent had been the January Uprising of 1863–65). This condemned the subsequent generation, including Maria and her elder siblings, to a difficult struggle to get ahead in life. Maria's paternal grandfather, Józef Skłodowski, had been principal of the Lublin primary school attended by Bolesław Prus, who became a leading figure in Polish literature.
Władysław Skłodowski taught mathematics and physics, subjects that Maria was to pursue, and was also director of two Warsaw "gymnasia" (secondary schools) for boys. After Russian authorities eliminated laboratory instruction from the Polish schools, he brought much of the laboratory equipment home and instructed his children in its use. He was eventually fired by his Russian supervisors for pro-Polish sentiments and forced to take lower-paying posts; the family also lost money on a bad investment and eventually chose to supplement their income by lodging boys in the house. Maria's mother Bronisława operated a prestigious Warsaw boarding school for girls; she resigned from the position after Maria was born. She died of tuberculosis in May 1878, when Maria was ten years old. Less than three years earlier, Maria's oldest sibling, Zofia, had died of typhus contracted from a boarder. Maria's father was an atheist; her mother a devout Catholic. The deaths of Maria's mother and sister caused her to give up Catholicism and become agnostic.
When she was ten years old, Maria began attending the boarding school of J. Sikorska; next, she attended a "gymnasium" for girls, from which she graduated on 12 June 1883 with a gold medal. After a collapse, possibly due to depression, she spent the following year in the countryside with relatives of her father, and the next year with her father in Warsaw, where she did some tutoring. Unable to enroll in a regular institution of higher education because she was a woman, she and her sister Bronisława became involved with the clandestine Flying University (sometimes translated as "Floating University"), a Polish patriotic institution of higher learning that admitted women students.
Maria made an agreement with her sister, Bronisława, that she would give her financial assistance during Bronisława's medical studies in Paris, in exchange for similar assistance two years later. In connection with this, Maria took a position as governess: first as a home tutor in Warsaw; then for two years as a governess in Szczuki with a landed family, the Żorawskis, who were relatives of her father. While working for the latter family, she fell in love with their son, Kazimierz Żorawski, a future eminent mathematician. His parents rejected the idea of his marrying the penniless relative, and Kazimierz was unable to oppose them. Maria's loss of the relationship with Żorawski was tragic for both. He soon earned a doctorate and pursued an academic career as a mathematician, becoming a professor and rector of Kraków University. Still, as an old man and a mathematics professor at the Warsaw Polytechnic, he would sit contemplatively before the statue of Maria Skłodowska that had been erected in 1935 before the Radium Institute, which she had founded in 1932.
At the beginning of 1890, Bronisława—who a few months earlier had married Kazimierz Dłuski, a Polish physician and social and political activist—invited Maria to join them in Paris. Maria declined because she could not afford the university tuition; it would take her a year and a half longer to gather the necessary funds. She was helped by her father, who was able to secure a more lucrative position again. All that time she continued to educate herself, reading books, exchanging letters, and being tutored herself. In early 1889 she returned home to her father in Warsaw. She continued working as a governess and remained there till late 1891. She tutored, studied at the Flying University, and began her practical scientific training (1890–91) in a chemical laboratory at the Museum of Industry and Agriculture at "Krakowskie Przedmieście" 66, near Warsaw's Old Town. The laboratory was run by her cousin Józef Boguski, who had been an assistant in Saint Petersburg to the Russian chemist Dmitri Mendeleev.
In late 1891, she left Poland for France. In Paris, Maria (or Marie, as she would be known in France) briefly found shelter with her sister and brother-in-law before renting a garret closer to the university, in the Latin Quarter, and proceeding with her studies of physics, chemistry, and mathematics at the University of Paris, where she enrolled in late 1891. She subsisted on her meagre resources, keeping herself warm during cold winters by wearing all the clothes she had. She focused so hard on her studies that she sometimes forgot to eat.
Skłodowska studied during the day and tutored evenings, barely earning her keep. In 1893, she was awarded a degree in physics and began work in an industrial laboratory of Professor Gabriel Lippmann. Meanwhile, she continued studying at the University of Paris and with the aid of a fellowship she was able to earn a second degree in 1894.
Skłodowska had begun her scientific career in Paris with an investigation of the magnetic properties of various steels, commissioned by the Society for the Encouragement of National Industry ("Société d'encouragement pour l'industrie nationale"). That same year Pierre Curie entered her life; it was their mutual interest in natural sciences that drew them together. Pierre Curie was an instructor at The City of Paris Industrial Physics and Chemistry Higher Educational Institution ("École supérieure de physique et de chimie industrielles de la ville de Paris" [ESPCI]). They were introduced by the Polish physicist, Professor Józef Wierusz-Kowalski, who had learned that she was looking for a larger laboratory space, something that Wierusz-Kowalski thought Pierre Curie had access to. Though Curie did not have a large laboratory, he was able to find some space for Skłodowska where she was able to begin work.
Their mutual passion for science brought them increasingly closer, and they began to develop feelings for one another. Eventually Pierre Curie proposed marriage, but at first Skłodowska did not accept as she was still planning to go back to her native country. Curie, however, declared that he was ready to move with her to Poland, even if it meant being reduced to teaching French. Meanwhile, for the 1894 summer break, Skłodowska returned to Warsaw, where she visited her family. She was still labouring under the illusion that she would be able to work in her chosen field in Poland, but she was denied a place at Kraków University because she was a woman. A letter from Pierre Curie convinced her to return to Paris to pursue a Ph.D. At Skłodowska's insistence, Curie had written up his research on magnetism and received his own doctorate in March 1895; he was also promoted to professor at the School. A contemporary quip would call Skłodowska "Pierre's biggest discovery." On 26 July 1895 they were married in Sceaux (Seine); neither wanted a religious service. Curie's dark blue outfit, worn instead of a bridal gown, would serve her for many years as a laboratory outfit. They shared two pastimes: long bicycle trips and journeys abroad, which brought them even closer. In Pierre, Marie had found a new love, a partner, and a scientific collaborator on whom she could depend.
In 1895, Wilhelm Roentgen discovered the existence of X-rays, though the mechanism behind their production was not yet understood. In 1896, Henri Becquerel discovered that uranium salts emitted rays that resembled X-rays in their penetrating power. He demonstrated that this radiation, unlike phosphorescence, did not depend on an external source of energy but seemed to arise spontaneously from uranium itself. Influenced by these two important discoveries, Curie decided to look into uranium rays as a possible field of research for a thesis.
She used an innovative technique to investigate samples. Fifteen years earlier, her husband and his brother had developed a version of the electrometer, a sensitive device for measuring electric charge. Using her husband's electrometer, she discovered that uranium rays caused the air around a sample to conduct electricity. Using this technique, her first result was the finding that the activity of the uranium compounds depended only on the quantity of uranium present. She hypothesized that the radiation was not the outcome of some interaction of molecules but must come from the atom itself. This hypothesis was an important step in disproving the assumption that atoms were indivisible.
In 1897, her daughter Irène was born. To support her family, Curie began teaching at the École Normale Supérieure. The Curies did not have a dedicated laboratory; most of their research was carried out in a converted shed next to ESPCI. The shed, formerly a medical school dissecting room, was poorly ventilated and not even waterproof. They were unaware of the deleterious effects of radiation exposure attendant on their continued unprotected work with radioactive substances. ESPCI did not sponsor her research, but she would receive subsidies from metallurgical and mining companies and from various organizations and governments.
Curie's systematic studies included two uranium minerals, pitchblende and torbernite (also known as chalcolite). Her electrometer showed that pitchblende was four times as active as uranium itself, and chalcolite twice as active. She concluded that, if her earlier results relating the quantity of uranium to its activity were correct, then these two minerals must contain small quantities of another substance that was far more active than uranium. She began a systematic search for additional substances that emit radiation, and by 1898 she discovered that the element thorium was also radioactive. Pierre Curie was increasingly intrigued by her work. By mid-1898 he was so invested in it that he decided to drop his work on crystals and to join her.
She was acutely aware of the importance of promptly publishing her discoveries and thus establishing her priority. Had not Becquerel, two years earlier, presented his discovery to the "Académie des Sciences" the day after he made it, credit for the discovery of radioactivity (and even a Nobel Prize) would instead have gone to Silvanus Thompson. Curie chose the same rapid means of publication. Her paper, giving a brief and simple account of her work, was presented for her to the "Académie" on 12 April 1898 by her former professor, Gabriel Lippmann. Even so, just as Thompson had been beaten by Becquerel, so Curie was beaten in the race to tell of her discovery that thorium gives off rays in the same way as uranium; two months earlier, Gerhard Carl Schmidt had published his own finding in Berlin.
At that time, no one else in the world of physics had noticed what Curie recorded in a sentence of her paper, describing how much greater were the activities of pitchblende and chalcolite than uranium itself: "The fact is very remarkable, and leads to the belief that these minerals may contain an element which is much more active than uranium." She later would recall how she felt "a passionate desire to verify this hypothesis as rapidly as possible." On 14 April 1898, the Curies optimistically weighed out a 100-gram sample of pitchblende and ground it with a pestle and mortar. They did not realize at the time that what they were searching for was present in such minute quantities that they would eventually have to process tonnes of the ore.
In July 1898, Curie and her husband published a joint paper announcing the existence of an element they named "polonium", in honour of her native Poland, which would for another twenty years remain partitioned among three empires (Russian, Austrian, and Prussian). On 26 December 1898, the Curies announced the existence of a second element, which they named "radium", from the Latin word for "ray". In the course of their research, they also coined the word "radioactivity".
To prove their discoveries beyond any doubt, the Curies sought to isolate polonium and radium in pure form. Pitchblende is a complex mineral; the chemical separation of its constituents was an arduous task. The discovery of polonium had been relatively easy; chemically it resembles the element bismuth, and polonium was the only bismuth-like substance in the ore. Radium, however, was more elusive; it is closely related chemically to barium, and pitchblende contains both elements. By 1898 the Curies had obtained traces of radium, but appreciable quantities, uncontaminated with barium, were still beyond reach. The Curies undertook the arduous task of separating out radium salt by differential crystallization. From a tonne of pitchblende, one-tenth of a gram of radium chloride was separated in 1902. In 1910, she isolated pure radium metal. She never succeeded in isolating polonium, which has a half-life of only 138 days.
Between 1898 and 1902, the Curies published, jointly or separately, a total of 32 scientific papers, including one that announced that, when exposed to radium, diseased, tumour-forming cells were destroyed faster than healthy cells.
In 1900, Curie became the first woman faculty member at the École Normale Supérieure and her husband joined the faculty of the University of Paris. In 1902 she visited Poland on the occasion of her father's death.
In June 1903, supervised by Gabriel Lippmann, Curie was awarded her doctorate from the University of Paris. That month the couple were invited to the Royal Institution in London to give a speech on radioactivity; being a woman, she was prevented from speaking, and Pierre Curie alone was allowed to. Meanwhile, a new industry began developing, based on radium. The Curies did not patent their discovery and benefited little from this increasingly profitable business.
In December 1903, the Royal Swedish Academy of Sciences awarded Pierre Curie, Marie Curie, and Henri Becquerel the Nobel Prize in Physics, "in recognition of the extraordinary services they have rendered by their joint researches on the radiation phenomena discovered by Professor Henri Becquerel." At first the committee had intended to honour only Pierre Curie and Henri Becquerel, but a committee member and advocate for women scientists, Swedish mathematician Magnus Goesta Mittag-Leffler, alerted Pierre to the situation, and after his complaint, Marie's name was added to the nomination. Marie Curie was the first woman to be awarded a Nobel Prize.
Curie and her husband declined to go to Stockholm to receive the prize in person; they were too busy with their work, and Pierre Curie, who disliked public ceremonies, was feeling increasingly ill. As Nobel laureates were required to deliver a lecture, the Curies finally undertook the trip in 1905. The award money allowed the Curies to hire their first laboratory assistant. Following the award of the Nobel Prize, and galvanized by an offer from the University of Geneva, which offered Pierre Curie a position, the University of Paris gave him a professorship and the chair of physics, although the Curies still did not have a proper laboratory. Upon Pierre Curie's complaint, the University of Paris relented and agreed to furnish a new laboratory, but it would not be ready until 1906.
In December 1904, Curie gave birth to their second daughter, Ève. She hired Polish governesses to teach her daughters her native language, and sent or took them on visits to Poland.
On 19 April 1906, Pierre Curie was killed in a road accident. Walking across the Rue Dauphine in heavy rain, he was struck by a horse-drawn vehicle and fell under its wheels, causing his skull to fracture. Curie was devastated by her husband's death. On 13 May 1906 the physics department of the University of Paris decided to retain the chair that had been created for her late husband and offer it to Marie. She accepted it, hoping to create a world-class laboratory as a tribute to her husband Pierre. She was the first woman to become a professor at the University of Paris.
Curie's quest to create a new laboratory did not end with the University of Paris, however. In her later years, she headed the Radium Institute ("Institut du radium", now Curie Institute, "Institut Curie"), a radioactivity laboratory created for her by the Pasteur Institute and the University of Paris. The initiative for creating the Radium Institute had come in 1909 from Pierre Paul Émile Roux, director of the Pasteur Institute, who had been disappointed that the University of Paris was not giving Curie a proper laboratory and had suggested that she move to the Pasteur Institute. Only then, with the threat of Curie leaving, did the University of Paris relent, and eventually the Curie Pavilion became a joint initiative of the University of Paris and the Pasteur Institute.
In 1910 Curie succeeded in isolating radium; she also defined an international standard for radioactive emissions that was eventually named for her and Pierre: the curie. Nevertheless, in 1911 the French Academy of Sciences failed, by one or two votes, to elect her to membership in the Academy. Elected instead was Édouard Branly, an inventor who had helped Guglielmo Marconi develop the wireless telegraph. It was only over half a century later, in 1962, that a doctoral student of Curie's, Marguerite Perey, became the first woman elected to membership in the Academy.
Despite Curie's fame as a scientist working for France, the public's attitude tended toward xenophobia—the same that had led to the Dreyfus affair—which also fuelled false speculation that Curie was Jewish. During the French Academy of Sciences elections, she was vilified by the right-wing press as a foreigner and atheist. Her daughter later remarked on the French press' hypocrisy in portraying Curie as an unworthy foreigner when she was nominated for a French honour, but portraying her as a French heroine when she received foreign honours such as her Nobel Prizes.
In 1911, it was revealed that Curie was involved in a year-long affair with physicist Paul Langevin, a former student of Pierre Curie's, a married man who was estranged from his wife. This resulted in a press scandal that was exploited by her academic opponents. Curie (then in her mid-40s) was five years older than Langevin and was misrepresented in the tabloids as a foreign Jewish home-wrecker. When the scandal broke, she was away at a conference in Belgium; on her return, she found an angry mob in front of her house and had to seek refuge, with her daughters, in the home of her friend, Camille Marbo.
International recognition for her work had been growing to new heights, and the Royal Swedish Academy of Sciences, overcoming opposition prompted by the Langevin scandal, honoured her a second time, with the 1911 Nobel Prize in Chemistry. This award was "in recognition of her services to the advancement of chemistry by the discovery of the elements radium and polonium, by the isolation of radium and the study of the nature and compounds of this remarkable element." She was the first person to win or share two Nobel Prizes, and remains alone with Linus Pauling as Nobel laureates in two fields each. A delegation of celebrated Polish men of learning, headed by novelist Henryk Sienkiewicz, encouraged her to return to Poland and continue her research in her native country. Curie's second Nobel Prize enabled her to persuade the French government into supporting the Radium Institute, built in 1914, where research was conducted in chemistry, physics, and medicine. A month after accepting her 1911 Nobel Prize, she was hospitalised with depression and a kidney ailment. For most of 1912, she avoided public life but did spend time in England with her friend and fellow physicist, Hertha Ayrton. She returned to her laboratory only in December, after a break of about 14 months.
In 1912, the Warsaw Scientific Society offered her the directorship of a new laboratory in Warsaw but she declined, focusing on the developing Radium Institute, to be completed in August 1914 on a new street named Rue Pierre-Curie. She was appointed Director of the Curie Laboratory in the Radium Institute of the University of Paris, founded in 1914. She visited Poland in 1913 and was welcomed in Warsaw but the visit was mostly ignored by the Russian authorities. The Institute's development was interrupted by the coming war, as most researchers were drafted into the French Army, and it fully resumed its activities in 1919.
During World War I, Curie recognised that wounded soldiers were best served if operated upon as soon as possible. She saw a need for field radiological centres near the front lines to assist battlefield surgeons. After a quick study of radiology, anatomy, and automotive mechanics she procured X-ray equipment, vehicles, auxiliary generators, and developed mobile radiography units, which came to be popularly known as "petites Curies" ("Little Curies"). She became the director of the Red Cross Radiology Service and set up France's first military radiology centre, operational by late 1914. Assisted at first by a military doctor and her 17-year-old daughter Irène, Curie directed the installation of 20 mobile radiological vehicles and another 200 radiological units at field hospitals in the first year of the war. Later, she began training other women as aides.
In 1915, Curie produced hollow needles containing "radium emanation", a colourless, radioactive gas given off by radium, later identified as radon, to be used for sterilizing infected tissue. She provided the radium from her own one-gram supply. It is estimated that over a million wounded soldiers were treated with her X-ray units. Busy with this work, she carried out very little scientific research during that period. In spite of all her humanitarian contributions to the French war effort, Curie never received any formal recognition of it from the French government.
Also, promptly after the war started, she attempted to donate her gold Nobel Prize medals to the war effort but the French National Bank refused to accept them. She did buy war bonds, using her Nobel Prize money. She said: "I am going to give up the little gold I possess. I shall add to this the scientific medals, which are quite useless to me. There is something else: by sheer laziness I had allowed the money for my second Nobel Prize to remain in Stockholm in Swedish crowns. This is the chief part of what we possess. I should like to bring it back here and invest it in war loans. The state needs it. Only, I have no illusions: this money will probably be lost." She was also an active member in committees of Polonia in France dedicated to the Polish cause. After the war, she summarized her wartime experiences in a book, "Radiology in War" (1919).
In 1920, for the 25th anniversary of the discovery of radium, the French government established a stipend for her; its previous recipient was Louis Pasteur (1822–95). In 1921, she was welcomed triumphantly when she toured the United States to raise funds for research on radium. Mrs. William Brown Meloney, after interviewing Curie, created a "Marie Curie Radium Fund" and raised money to buy radium, publicising her trip.
In 1921, U.S. President Warren G. Harding received her at the White House to present her with the 1 gram of radium collected in the United States, and the First Lady praised her as an example of a professional achiever who was also a supportive wife. Before the meeting, recognising her growing fame abroad, and embarrassed by the fact that she had no French official distinctions to wear in public, the French government offered her a Legion of Honour award, but she refused. In 1922 she became a fellow of the French Academy of Medicine. She also travelled to other countries, appearing publicly and giving lectures in Belgium, Brazil, Spain, and Czechoslovakia.
Led by Curie, the Institute produced four more Nobel Prize winners, including her daughter Irène Joliot-Curie and her son-in-law, Frédéric Joliot-Curie. Eventually it became one of the world's four major radioactivity-research laboratories, the others being the Cavendish Laboratory, with Ernest Rutherford; the Institute for Radium Research, Vienna, with Stefan Meyer; and the Kaiser Wilhelm Institute for Chemistry, with Otto Hahn and Lise Meitner.
In August 1922 Marie Curie became a member of the League of Nations' newly created International Committee on Intellectual Cooperation. She sat on the Committee until 1934 and contributed to League of Nations' scientific coordination with other prominent researchers such as Albert Einstein, Hendrik Lorentz, and Henri Bergson. In 1923 she wrote a biography of her late husband, titled "Pierre Curie". In 1925 she visited Poland to participate in a ceremony laying the foundations for Warsaw's Radium Institute. Her second American tour, in 1929, succeeded in equipping the Warsaw Radium Institute with radium; the Institute opened in 1932, with her sister Bronisława its director. These distractions from her scientific labours, and the attendant publicity, caused her much discomfort but provided resources for her work. In 1930 she was elected to the International Atomic Weights Committee, on which she served until her death. In 1931, Curie was awarded the Cameron Prize for Therapeutics of the University of Edinburgh.
Curie visited Poland for the last time in early 1934. A few months later, on 4 July 1934, she died at the Sancellemoz sanatorium in Passy, Haute-Savoie, from aplastic anaemia believed to have been contracted from her long-term exposure to radiation.
The damaging effects of ionising radiation were not known at the time of her work, which had been carried out without the safety measures later developed. She had carried test tubes containing radioactive isotopes in her pocket, and she stored them in her desk drawer, remarking on the faint light that the substances gave off in the dark. Curie was also exposed to X-rays from unshielded equipment while serving as a radiologist in field hospitals during the war. Although her many decades of exposure to radiation caused chronic illnesses (including near-blindness due to cataracts) and ultimately her death, she never really acknowledged the health risks of radiation exposure.
She was interred at the cemetery in Sceaux, alongside her husband Pierre. Sixty years later, in 1995, in honour of their achievements, the remains of both were transferred to the Panthéon, Paris. She became the first woman to be honoured with interment in the Panthéon on her own merits.
Because of their levels of radioactive contamination, her papers from the 1890s are considered too dangerous to handle. Even her cookbook is highly radioactive. Her papers are kept in lead-lined boxes, and those who wish to consult them must wear protective clothing. In her last year, she worked on a book, "Radioactivity", which was published posthumously in 1935.
The physical and societal aspects of the Curies' work contributed to shaping the world of the twentieth and twenty-first centuries. Cornell University professor Williams observes:
If Curie's work helped overturn established ideas in physics and chemistry, it has had an equally profound effect in the societal sphere. To attain her scientific achievements, she had to overcome barriers, in both her native and her adoptive country, that were placed in her way because she was a woman. This aspect of her life and career is highlighted in Françoise Giroud's "Marie Curie: A Life", which emphasizes Curie's role as a feminist precursor.
She was known for her honesty and moderate lifestyle. Having received a small scholarship in 1893, she returned it in 1897 as soon as she began earning her keep. She gave much of her first Nobel Prize money to friends, family, students, and research associates. In an unusual decision, Curie intentionally refrained from patenting the radium-isolation process so that the scientific community could do research unhindered. She insisted that monetary gifts and awards be given to the scientific institutions she was affiliated with rather than to her. She and her husband often refused awards and medals. Albert Einstein reportedly remarked that she was probably the only person who could not be corrupted by fame.
As one of the most famous scientists, Marie Curie has become an icon in the scientific world and has received tributes from across the globe, even in the realm of pop culture. In a 2009 poll carried out by "New Scientist", she was voted the "most inspirational woman in science". Curie received 25.1 percent of all votes cast, nearly twice as many as second-place Rosalind Franklin (14.2 per cent).
Poland and France declared 2011 the Year of Marie Curie, and the United Nations declared 2011 the International Year of Chemistry. An artistic installation celebrating "Madame Curie" filled the Jacobs Gallery at San Diego's Museum of Contemporary Art. On 7 November, Google celebrated the anniversary of her birth with a special Google Doodle. On 10 December, the New York Academy of Sciences celebrated the centenary of Marie Curie's second Nobel Prize in the presence of Princess Madeleine of Sweden.
Marie Curie was the first woman to win a Nobel Prize, the first person to win two Nobel Prizes, the only woman to win in two fields, and the only person to win in multiple sciences. Awards that she received include:
Marie Curie's 1898 publication with her husband and their collaborator Gustave Bémont of their discovery of radium and polonium was honoured by a Citation for Chemical Breakthrough Award from the Division of History of Chemistry of the American Chemical Society presented to the ESPCI Paris in 2015.
In 1995, she became the first woman to be entombed on her own merits in the Panthéon, Paris. The curie (symbol Ci), a unit of radioactivity, is named in honour of her and Pierre Curie (although the commission which agreed on the name never clearly stated whether the standard was named after Pierre, Marie or both of them). The element with atomic number 96 was named curium. Three radioactive minerals are also named after the Curies: curite, sklodowskite, and cuprosklodowskite. She received numerous honorary degrees from universities across the world. The Marie Skłodowska-Curie Actions fellowship program of the European Union for young scientists wishing to work in a foreign country is named after her. In Poland, she had received honorary doctorates from the Lwów Polytechnic (1912), Poznań University (1922), Kraków's Jagiellonian University (1924), and the Warsaw Polytechnic (1926).
In 1920 she became the first female member of The Royal Danish Academy of Sciences and Letters. In 1921, in the U.S., she was awarded membership in the Iota Sigma Pi women scientists' society. In 1924, she became an Honorary Member of the Polish Chemical Society.
Her name is included on the "Monument to the X-ray and Radium Martyrs of All Nations", erected in Hamburg, Germany in 1936.
Numerous locations around the world are named after her. In 2007, a metro station in Paris was renamed to honour both of the Curies. Polish nuclear research reactor Maria is named after her. The 7000 Curie asteroid is also named after her. A KLM McDonnell Douglas MD-11 (registration PH-KCC) is named in her honour.
Several institutions bear her name, starting with the two Curie institutes: the Maria Skłodowska-Curie Institute of Oncology, in Warsaw and the "Institut Curie" in Paris. She is the patron of Maria Curie-Skłodowska University, in Lublin, founded in 1944; and of Pierre and Marie Curie University (Paris VI), France's pre-eminent science university. In Britain, Marie Curie Cancer Care was organized in 1948 to care for the terminally ill.
Two museums are devoted to Marie Curie. In 1967, the Maria Skłodowska-Curie Museum was established in Warsaw's "New Town", at her birthplace on "ulica Freta" (Freta Street). Her Paris laboratory is preserved as the Musée Curie, open since 1992.
Several works of art bear her likeness. In 1935, Michalina Mościcka, wife of Polish President Ignacy Mościcki, unveiled a statue of Marie Curie before Warsaw's Radium Institute. During the 1944 Second World War Warsaw Uprising against the Nazi German occupation, the monument was damaged by gunfire; after the war it was decided to leave the bullet marks on the statue and its pedestal. In 1955 Jozef Mazur created a stained glass panel of her, the Maria Skłodowska-Curie Medallion, featured in the University at Buffalo Polish Room.
A number of biographies are devoted to her. In 1938 her daughter, Ève Curie, published "Madame Curie". In 1987 Françoise Giroud wrote "Marie Curie: A Life". In 2005 Barbara Goldsmith wrote "Obsessive Genius: The Inner World of Marie Curie". In 2011 Lauren Redniss published "Radioactive: Marie and Pierre Curie, a Tale of Love and Fallout".
Marie Curie has been the subject of several biographical films:
Curie is the subject of the 2013 play "False Assumptions" by Lawrence Aronovitch, in which the ghosts of three other women scientists observe events in her life. Curie has also been portrayed by Susan Marie Frontczak in her play "Manya: The Living History of Marie Curie", a one-woman show performed in 30 U.S. states and nine countries, by 2014.
Curie's likeness also has appeared on banknotes, stamps and coins around the world. She was featured on the Polish late-1980s 20,000-"złoty" banknote as well as on the last French 500-franc note, before the franc was replaced by the euro. Curie-themed postage stamps from Mali, the Republic of Togo, Zambia, and the Republic of Guinea actually show a picture of Susan Marie Frontczak portraying Curie in a 2001 picture by Paul Schroeder.
In 2011, on the centenary of Marie Curie's second Nobel Prize, an allegorical mural was painted on the façade of her Warsaw birthplace. It depicted an infant Maria Skłodowska holding a test tube from which emanated the elements that she would discover as an adult: polonium and radium.
Also in 2011, a new Warsaw bridge over the Vistula River was named in her honour.
In January 2020, Satellogic, a high-resolution Earth observation imaging and analytics company, launched a ÑuSat type micro-satellite named in honour of Marie Curie. | https://en.wikipedia.org/wiki?curid=20408 |
MATLAB
MATLAB ("matrix laboratory") is a multi-paradigm numerical computing environment and proprietary programming language developed by MathWorks. MATLAB allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages.
Although MATLAB is intended primarily for numerical computing, an optional toolbox uses the MuPAD symbolic engine allowing access to symbolic computing abilities. An additional package, Simulink, adds graphical multi-domain simulation and model-based design for dynamic and embedded systems.
As of 2020, MATLAB has more than 4 million users worldwide. MATLAB users come from backgrounds in engineering, science, and economics.
Cleve Moler, the chairman of the computer science department at the University of New Mexico, started developing MATLAB in the late 1970s. He designed it to give his students access to LINPACK and EISPACK without them having to learn Fortran. It soon spread to other universities and found a strong audience within the applied mathematics community. Jack Little, an engineer, was exposed to it during a visit Moler made to Stanford University in 1983. Recognizing its commercial potential, he joined with Moler and Steve Bangert. They rewrote MATLAB in C and founded MathWorks in 1984 to continue its development. These rewritten libraries were known as JACKPAC. In 2000, MATLAB was rewritten to use a newer set of libraries for matrix manipulation, LAPACK.
MATLAB was first adopted by researchers and practitioners in control engineering, Little's specialty, but quickly spread to many other domains. It is now also used in education, in particular the teaching of linear algebra and numerical analysis, and is popular amongst scientists involved in image processing.
The MATLAB application is built around the MATLAB programming language. Common usage of the MATLAB application involves using the "Command Window" as an interactive mathematical shell or executing text files containing MATLAB code.
Variables are defined using the assignment operator, =. MATLAB is a weakly typed programming language because types are implicitly converted. It is a dynamically typed language: variables can be assigned without declaring their type (except if they are to be treated as symbolic objects), and their type can change. Values can come from constants, from computation involving values of other variables, or from the output of a function. For example:
» x = 17
x =
    17
» x = 'hat'
x =
hat
» x = [3*4, pi/2]
x =
   12.0000    1.5708
» y = 3*sin(x)
y =
   -1.6097    3.0000
A simple array is defined using the colon syntax: "initial":"increment":"terminator". For instance:
» array = 1:2:9
array =
     1     3     5     7     9
defines a variable named array (or assigns a new value to an existing variable with the name array) which is an array consisting of the values 1, 3, 5, 7, and 9. That is, the array starts at 1 (the "initial" value), increments from the previous value by 2 (the "increment" value) at each step, and stops once it reaches (or would exceed) 9 (the "terminator" value). Similarly:
» array = 1:3:9
array =
     1     4     7
The "increment" value can be left out of this syntax (along with one of the colons) to use a default value of 1:
» ari = 1:5
ari =
     1     2     3     4     5
assigns to the variable named ari an array with the values 1, 2, 3, 4, and 5, since the default value of 1 is used as the increment.
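For comparison, MATLAB's colon ranges are inclusive of the terminator, unlike Python's half-open `range`. The behaviour for positive increments can be sketched in plain Python (an illustrative analogue, not MATLAB itself; `colon` is a hypothetical helper name):

```python
def colon(initial, increment, terminator):
    """Rough Python sketch of MATLAB's initial:increment:terminator
    for positive increments: the terminator is included when a step
    lands on it exactly, and never exceeded."""
    out = []
    value = initial
    while value <= terminator:
        out.append(value)
        value += increment
    return out

print(colon(1, 2, 9))  # → [1, 3, 5, 7, 9]
print(colon(1, 3, 9))  # → [1, 4, 7]
print(colon(1, 1, 5))  # → [1, 2, 3, 4, 5]
```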
Indexing is one-based, which is the usual convention for matrices in mathematics, unlike zero-based indexing commonly used in other programming languages such as C, C++, and Java.
Matrices can be defined by separating the elements of a row with blank space or comma and using a semicolon to terminate each row. The list of elements should be surrounded by square brackets []. Parentheses () are used to access elements and subarrays (they are also used to denote a function argument list).
» A = [16 3 2 13; 5 10 11 8; 9 6 7 12; 4 15 14 1]
A =
    16     3     2    13
     5    10    11     8
     9     6     7    12
     4    15    14     1
» A(2,3)
ans =
    11
Sets of indices can be specified by expressions such as 2:4, which evaluates to [2, 3, 4]. For example, a submatrix taken from rows 2 through 4 and columns 3 through 4 can be written as:
» A(2:4,3:4)
ans =
    11     8
     7    12
    14     1
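For contrast with MATLAB's one-based, inclusive ranges, the same submatrix selection can be expressed with zero-based, half-open slicing in Python (a sketch using plain nested lists, not MATLAB):

```python
A = [[16,  3,  2, 13],
     [ 5, 10, 11,  8],
     [ 9,  6,  7, 12],
     [ 4, 15, 14,  1]]

# MATLAB's one-based, inclusive A(2:4,3:4) becomes
# zero-based, half-open row/column slices in Python:
sub = [row[2:4] for row in A[1:4]]
print(sub)  # → [[11, 8], [7, 12], [14, 1]]
```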
A square identity matrix of size "n" can be generated using the function eye, and matrices of any size with zeros or ones can be generated with the functions zeros and ones, respectively.
» eye(3,3)
ans =
     1     0     0
     0     1     0
     0     0     1
» zeros(2,3)
ans =
     0     0     0
     0     0     0
» ones(2,3)
ans =
     1     1     1
     1     1     1
Transposing a vector or a matrix is done either by the function transpose or by appending .' (dot-prime) to the matrix (without the dot, ' performs the conjugate transpose for complex arrays):
» A = [1 ; 2], B = A.', C = transpose(A)
A =
     1
     2
B =
     1     2
C =
     1     2
» D = [0 3 ; 1 5], D.'
D =
     0     3
     1     5
ans =
     0     1
     3     5
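The effect of .' on a real matrix can be mimicked in plain Python with `zip` (an illustrative sketch, not MATLAB):

```python
D = [[0, 3],
     [1, 5]]

# zip(*D) pairs up the i-th elements of each row,
# yielding the columns of D — i.e., the transpose.
Dt = [list(col) for col in zip(*D)]
print(Dt)  # → [[0, 1], [3, 5]]
```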
Most functions accept arrays as input and operate element-wise on each element. For example, mod(2*J,n) will multiply every element in "J" by 2, and then reduce each element modulo "n". MATLAB does include standard for and while loops, but (as in other similar applications such as R), using the vectorized notation is encouraged and is often faster to execute. The following code, excerpted from the function "magic.m", creates a magic square "M" for odd values of "n" (the MATLAB function meshgrid is used here to generate square matrices "I" and "J" containing "1:n").
[J,I] = meshgrid(1:n);
A = mod(I + J - (n + 3) / 2, n);
B = mod(I + 2 * J - 2, n);
M = n * A + B + 1;
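The same formula can be checked with ordinary integer arithmetic in Python (a sketch; `magic_odd` is a hypothetical helper name, and the one-based index grids I and J are replaced by the loop ranges):

```python
def magic_odd(n):
    """Build an n-by-n magic square for odd n using the same formula:
    M = n*A + B + 1, with A = mod(i + j - (n+3)/2, n) and
    B = mod(i + 2*j - 2, n), where i and j are one-based indices.
    For odd n, (n + 3) is even, so integer division is exact."""
    return [[n * ((i + j - (n + 3) // 2) % n)
             + ((i + 2 * j - 2) % n) + 1
             for j in range(1, n + 1)]
            for i in range(1, n + 1)]

M = magic_odd(3)
print(M)                        # → [[8, 1, 6], [3, 5, 7], [4, 9, 2]]
print([sum(row) for row in M])  # → [15, 15, 15]
```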
MATLAB supports structure data types. Since all variables in MATLAB are arrays, a more accurate name is "structure array", where each element of the array has the same field names. In addition, MATLAB supports dynamic field names (field look-ups by name, field manipulations, etc.).
When creating a MATLAB function, the name of the file should match the name of the first function in the file. Valid function names begin with an alphabetic character, and can contain letters, numbers, or underscores. Variables and functions are case sensitive.
MATLAB supports elements of lambda calculus by introducing function handles, or function references, which are implemented either in .m files or anonymous/nested functions.
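An analogous construct in Python is the lambda expression, which plays a role similar to a MATLAB anonymous function handle such as @(x) x.^2 + 1 (an illustrative sketch, not MATLAB code):

```python
# A lambda bound to a name, comparable to f = @(x) x.^2 + 1 in MATLAB:
f = lambda x: x ** 2 + 1
print(f(3))  # → 10

# Like function handles, lambdas are first-class values
# and can be passed to other functions:
apply_twice = lambda g, x: g(g(x))
print(apply_twice(f, 1))  # → f(f(1)) = f(2) = 5
```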
MATLAB supports object-oriented programming including classes, inheritance, virtual dispatch, packages, pass-by-value semantics, and pass-by-reference semantics. However, the syntax and calling conventions are significantly different from other languages. MATLAB has value classes and reference classes, depending on whether the class has "handle" as a super-class (for reference classes) or not (for value classes).
Method call behavior differs between value and reference classes. For example, a call to a method
object.method();
can alter any member of "object" only if "object" is an instance of a reference class; otherwise, a value class method must return a new instance if it needs to modify the object.
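The distinction can be illustrated in Python, where user-defined objects behave like MATLAB handle (reference) classes unless an explicit copy is made (a sketch; the `Counter` class is a hypothetical example, not part of MATLAB):

```python
import copy

class Counter:
    # Behaves like a MATLAB handle class: methods mutate shared state.
    def __init__(self):
        self.n = 0
    def bump(self):
        self.n += 1

a = Counter()
b = a            # an alias to the same object, not a copy
b.bump()
print(a.n)       # → 1 (the mutation is visible through both names)

c = copy.deepcopy(a)   # value-like semantics require an explicit copy
c.bump()
print(a.n, c.n)        # → 1 2 (the original is unchanged)
```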
An example of a simple class is provided below.
classdef Hello
    methods
        function greet(obj)
            disp('Hello!')
        end
    end
end
When put into a file named hello.m, this can be executed with the following commands:
» x = Hello();
» x.greet();
Hello!
MATLAB has tightly integrated graph-plotting features. For example, the function "plot" can be used to produce a graph from two vectors "x" and "y". The code:
x = 0:pi/100:2*pi;
y = sin(x);
plot(x,y)
produces a figure of the sine function.
MATLAB supports three-dimensional graphics as well.
MATLAB supports developing graphical user interface (GUI) applications. UIs can be generated either programmatically or using visual design environments such as "GUIDE" and "App Designer".
MATLAB can call functions and subroutines written in the programming languages C or Fortran. A wrapper function is created allowing MATLAB data types to be passed and returned. MEX files (MATLAB executables) are the dynamically loadable object files created by compiling such functions. Since 2014, increasing two-way interfacing with Python has been added.
Libraries written in Perl, Java, ActiveX or .NET can be directly called from MATLAB, and many MATLAB libraries (for example XML or SQL support) are implemented as wrappers around Java or ActiveX libraries. Calling MATLAB from Java is more complicated, but can be done with a MATLAB toolbox which is sold separately by MathWorks, or using an undocumented mechanism called JMI (Java-to-MATLAB Interface), (which should not be confused with the unrelated Java Metadata Interface that is also called JMI). Official MATLAB API for Java was added in 2016.
As alternatives to the MuPAD based Symbolic Math Toolbox available from MathWorks, MATLAB can be connected to Maple or Mathematica.
Libraries also exist to import and export MathML.
In 2020 Chinese state media reported that MATLAB had withdrawn services from two Chinese universities as a result of US sanctions, and said this will be responded to by increased use of open-source alternatives and by developing domestic alternatives.
There are a number of competitors to MATLAB. Some notable examples include:
There are also free open source alternatives to MATLAB, in particular:
which are somewhat compatible with the MATLAB language. GNU Octave is unique among them in that it aims to be drop-in compatible with MATLAB syntax (see MATLAB Compatibility of GNU Octave).
Among other languages that treat arrays as basic entities (array programming languages) are:
There are also libraries to add similar functionality to existing languages, such as:
The release number is the version reported by the Concurrent License Manager program FLEXlm.
For a complete list of changes of both MATLAB and official toolboxes, consult the MATLAB release notes. | https://en.wikipedia.org/wiki?curid=20412 |
Meuse
The Meuse or Maas is a major European river, rising in France and flowing through Belgium and the Netherlands before draining into the North Sea from the Rhine–Meuse–Scheldt delta.
From 1301 the upper Meuse roughly marked the western border of the Holy Roman Empire with the Kingdom of France, after Count Henry III of Bar had to receive the western part of the County of Bar ("Barrois mouvant") as a French fief from the hands of King Philip IV. The border remained stable until the annexation of the Three Bishoprics Metz, Toul and Verdun by King Henry II in 1552 and the occupation of the Duchy of Lorraine by the forces of King Louis XIII in 1633. Its lower Belgian (Walloon) portion, part of the sillon industriel, was the first fully industrialized area in continental Europe.
The Afgedamde Maas was created in the late Middle Ages, when a major flood made a connection between the Maas and the Merwede at the town of Woudrichem. From that moment on, the current Afgedamde Maas was the main branch of the lower Meuse. The former main branch eventually silted up and is today called the Oude Maasje. In the late 19th century and early 20th century the connection between the Maas and Rhine was closed off and the Maas was given a new, artificial mouth - the Bergse Maas. The resulting separation of the rivers Rhine and Maas reduced the risk of flooding and is considered to be the greatest achievement in Dutch hydraulic engineering before the completion of the Zuiderzee Works and Delta Works. The former main branch was, after the dam at its southern inlet was completed in 1904, renamed "Afgedamde Maas" and no longer receives water from the Maas.
The Meuse and its crossings were a key objective of the last major German WWII counter-offensive on the Western Front, the Battle of the Bulge (Battle of the Ardennes) in the winter of 1944/45.
The Meuse is represented in the documentary "The River People" released in 2012 by Xavier Istasse.
The name "Meuse" is derived from the French name of the river, derived from its Latin name, "Mosa", which ultimately derives from the Celtic or Proto-Celtic name *"Mosā". This probably derives from the same root as English "maze", referring to the river's twists and turns.
The Dutch name "Maas" descends from Middle Dutch "Mase", which comes from the presumed but unattested Old Dutch form *"Masa", from Proto-Germanic *"Masō". Modern Dutch and German "Maas" and Limburgish "Maos" preserve this Germanic form. Despite the similarity, the Germanic name is not derived from the Celtic name, judging from the change from earlier "o" into "a", which is characteristic of the Germanic languages.
The Meuse rises in Pouilly-en-Bassigny, commune of Le Châtelet-sur-Meuse on the Langres plateau in France from where it flows northwards past Sedan (the head of navigation) and Charleville-Mézières into Belgium.
At Namur it is joined by the Sambre. Beyond Namur the Meuse winds eastwards, skirting the Ardennes, and passes Liège before turning north. The river then forms part of the Belgian-Dutch border, except that at Maastricht the border lies further to the west. In the Netherlands it continues northwards through Venlo closely along the border to Germany, then turns towards the west, where it runs parallel to the Waal and forms part of the extensive Rhine–Meuse–Scheldt delta, together with the Scheldt in its south and the Rhine in the north. The river has been divided near Heusden into the Afgedamde Maas on the right and the Bergse Maas on the left. The Bergse Maas continues under the name of Amer, which is part of De Biesbosch. The Afgedamde Maas joins the Waal, the main stem of the Rhine at Woudrichem, and then flows under the name of Boven Merwede to Hardinxveld-Giessendam, where it splits into Nieuwe Merwede and Beneden Merwede. Near Lage Zwaluwe, the Nieuwe Merwede joins the Amer, forming the Hollands Diep, which splits into Grevelingen and Haringvliet, before finally flowing into the North Sea.
The Meuse is crossed by railway bridges between the following stations (on the left and right banks respectively):
There are also numerous road bridges and around 32 ferry crossings.
The Meuse is navigable over a substantial part of its total length. In the Netherlands and Belgium, the river is part of the major inland navigation infrastructure, connecting the Rotterdam-Amsterdam-Antwerp port areas to the industrial areas upstream: 's-Hertogenbosch, Venlo, Maastricht, Liège, Namur. Between Maastricht and Maasbracht, an unnavigable section of the Meuse is bypassed by the 36 km (22.4 mi) Juliana Canal. South of Namur, further upstream, the river can only carry more modest vessels, although a barge as long as 100 m (328 ft) can still reach the French border town of Givet.
From Givet, the river is canalized over a distance of 272 kilometres (169 mi). The canalized Meuse used to be called the "Canal de l'Est — Branche Nord" but was recently renamed the "Canal de la Meuse". The waterway can be used by the smallest barges still in commercial use, almost 40 metres (131 ft) long and just over 5 metres (16 ft) wide. Just upstream of the town of Commercy, the Canal de la Meuse connects with the Marne–Rhine Canal by means of a short diversion canal.
The Cretaceous sea reptile Mosasaur is named after the river Meuse. The first fossils of it were discovered outside Maastricht in 1780.
An international agreement was signed in 2002 in Ghent, Belgium about the management of the river amongst France, Germany, Luxembourg, the Netherlands, and Belgium. Also participating in the agreement were the Belgian regional governments of Flanders, Wallonia, and Brussels (which is not in the basin of the Meuse but pumps running water into the Meuse).
Most of the basin area (approximately 36,000 km2) is in Wallonia (12,000 km2), followed by France (9,000 km2), the Netherlands (8,000 km2), Germany (2,000 km2), Flanders (2,000 km2) and Luxembourg (a few km2).
An International Commission on the Meuse has the responsibility of the implementation of the treaty.
The costs of this Commission are met by all these countries in proportion to their own territory within the basin of the Meuse: the Netherlands and Wallonia 30% each, France 15%, Germany 14.5%, Flanders 5%, Brussels 4.5%, and the Kingdom of Belgium and Luxembourg 0.5% each.
The map of the basin area of Meuse was joined to the text of the treaty.
Culturally, the river Meuse, as a major communication route, is the origin of Mosan art, principally in Wallonia and France.
The first landscapes painted in the Middle Ages depicted the Meuse valley, for instance by Joachim Patinir. He was likely the uncle of Henri Blès, who is sometimes described as a Mosan landscape painter active during the second third of the 16th century (i.e., the second generation of landscape painters).
The main tributaries of the Meuse are listed below in downstream-upstream order, with the town where the tributary meets the river:
The mean annual discharge rate of the Meuse has been relatively stable over the last few thousand years. One recent study estimates that average flow has increased about 10% since 2000 BC. The hydrological distribution of the Meuse changed during the later Middle Ages, when a major flood forced it to shift its main course northwards towards the river Merwede. From then on, several stretches of the original Merwede were named "Maas" (i.e. Meuse) instead and served as the primary outflow of that river. Those branches are currently known as the Nieuwe Maas and Oude Maas.
However, during another series of severe floods the Meuse found an additional path towards the sea, resulting in the creation of the Biesbosch wetlands and Hollands Diep estuaries. Thereafter, the Meuse split near Heusden into two main distributaries, one flowing north to join the Merwede, and one flowing directly to the sea. The branch of the Meuse leading directly to the sea eventually silted up (and now forms the Oude Maasje stream), but in 1904 the canalised Bergse Maas was dug to take over the functions of the silted-up branch. At the same time, the branch leading to the Merwede was dammed at Heusden (and has since been known as the Afgedamde Maas) so that little water from the Meuse entered the old Maas courses, or the Rhine distributaries. The resulting separation of the rivers Rhine and Meuse is considered to be the greatest achievement in Dutch hydraulic engineering before the completion of the Zuiderzee Works and Delta Works. In 1970 the Haringvlietdam was completed. Since then the reunited Rhine and Meuse waters reach the North Sea either at this site or, during times of lower discharges of the Rhine, at Hoek van Holland.
A 2008 study notes that the difference between summer and winter flow volumes has increased significantly in the last 100–200 years. Its authors point out that the frequency of serious floods ("i.e." flows > 1000% of normal) has increased markedly. They predict that winter flooding of the Meuse may become a recurring problem in the coming decades.
The Meuse flows through the following departments of France, provinces of Belgium, provinces of the Netherlands and towns:
The Meuse ("Maas") is mentioned in the first stanza of Germany's old national anthem, the "Deutschlandlied". However, since its re-adoption as national anthem in 1952, only the third stanza of the "Deutschlandlied" has been sung as the German national anthem, the first and second stanzas being omitted. This was confirmed after German reunification in 1991 when only the third stanza was defined as the official anthem. The lyrics written in 1841 describe a then–disunited Germany with the river as its western boundary, where King William I of the Netherlands had joined the German Confederation with his Duchy of Limburg in 1839. Though the duchy's territory officially became an integral part of the Netherlands by the 1867 Treaty of London, the text passage remained unchanged when the "Deutschlandlied" was declared the national anthem of the Weimar Republic in 1922.
The names of both rivers also form part of the title of "Le Régiment de Sambre et Meuse", written after the French defeat in the Franco-Prussian War of 1870, and a popular patriotic song for the rest of the 19th century and into the 20th.
Michael Bentine
Michael Bentine, (born Michael James Bentin; 26 January 1922 – 26 November 1996) was a British comedian, comic actor and founding member of the Goons. His father was a Peruvian Briton. In 1971, Bentine received the Order of Merit of Peru following his fund-raising work for the 1970 Great Peruvian earthquake.
Bentine was born in Watford, Hertfordshire, to a Peruvian father, Adam Bentin, and a British mother, Florence Dawkins, and grew up in Folkestone, Kent. He was educated at Eton College. With the help of a speech trainer, Harry Burgess, he overcame a stammer and subsequently developed an interest in amateur theatricals, along with the Tomlinson family, including the young David Tomlinson. He spoke fluent Spanish and French.
His father was an early aeronautical engineer for the Sopwith Aviation Company during and after World War I and invented a tensometer for setting the tension on aircraft rigging wires.
In World War II, he volunteered for all services when the war broke out (the RAF was his first choice owing to the influence of his father's experience), but was initially rejected because of his father's nationality.
He started his acting career in 1940, in a touring company in Cardiff playing a juvenile lead in "Sweet Lavender". He went on to join Robert Atkins' Shakespearean company in Regent's Park, London, until he was called up for service in the RAF. He was appearing in a Shakespearean play in doublet and hose in the open-air theatre in London's Hyde Park when two RAF MPs marched on stage and arrested him for desertion. Unknown to him, an RAF conscription notice had been following him for a month as his company toured.
Once in the RAF he went through flying training. He was the penultimate man in a medical line receiving typhoid inoculations with the other flight candidates in his class (they were going to Canada to receive new aircraft) when the vaccine ran out. The bottle was refilled to inoculate him and the last man, but by mistake it was loaded with a pure culture of typhoid. The other man died immediately, and Bentine was in a coma for six weeks. When he regained consciousness his eyesight was ruined, leaving him myopic for the rest of his life. No longer physically qualified for flying, he was transferred to RAF Intelligence and seconded to MI9, a unit dedicated to supporting resistance movements and helping prisoners escape. His immediate superior was the Colditz escapee Airey Neave.
At the end of the war, he took part in the liberation of Bergen-Belsen concentration camp. He said about this experience:
Millions of words have been written about these horror camps, many of them by inmates of those unbelievable places. I’ve tried, without success, to describe it from my own point of view, but the words won’t come. To me Belsen was the ultimate blasphemy. ("The Reluctant Jester", Chapter 17.)
After the war Bentine decided to become a comedian and worked in the Windmill Theatre where he met Harry Secombe. He specialised in off-the-wall humour, often involving cartoons and other types of animation. His acts included giving lectures in an invented language called Slobodian, "Imaginative Young Man with a Walking Stick" and "The Chairback", with a broken chairback having a number of uses from comb to machine gun and taking on a demoniacal life of its own. Peter Sellers told him this was the inspiration for the prosthetic arm routine in "Dr Strangelove". This act led to his engagement by Val Parnell to appear in the Starlight Roof revues starring Vic Oliver, where he met and married his second wife Clementina, with whom he had four children. Also on the bill were Fred Emney and a young Julie Andrews.
He co-founded "The Goon Show" radio show with Spike Milligan, Peter Sellers and Harry Secombe, but appeared in only the first 38 shows on the BBC Light Programme from 1951 to 1953. The series was at first called "Crazy People", subtitled "The Junior Crazy Gang"; the term "Goon" had been used as the headline of a review of Bentine's act in "Picture Post" dated 5 November 1948. Only one episode of this first series (and very few of the following three series, in which he did not appear) has survived, the rest of the original disc recordings having apparently been destroyed or discarded as no longer usable, so there is almost no record of his work as a radio "Goon". He also appeared in the "Goon Show" film "Down Among the Z Men".
In 1951 Bentine was invited to the United States to appear on "The Ed Sullivan Show". On his return he parted amicably from his partners and continued touring in variety, remaining close to Secombe and Sellers for the rest of his life. In 1972, Secombe and Sellers told Michael Parkinson that Bentine was "always calling everyone a genius" and, since he was the only one of the four with a "proper education", they always believed him.
His first appearances on television were as presenter of a 13-part children's series featuring remote-controlled puppets, "The Bumblies", which he also devised, designed and wrote. These were three small creatures from outer space who slept on "Professor Bentine's" ceiling and who had come to Earth to learn the ways of Earthling children. Angelo de Calferta modelled the puppets from Bentine's designs and Richard Dendy moulded them in latex rubber. Bentine sold the series to the BBC for less than it had cost to make. He then spent two years touring in Australia (1954–55).
On his return to Britain, he worked as a scriptwriter for Peter Sellers and then on 39 episodes of his own radio show "Round the Bend in 30 Minutes", which has also been wiped from the BBC archive. He then teamed up with Dick Lester to devise a series of six TV programmes, "Before Midnight", for Associated British Corporation (ABC) in Birmingham in 1958. This led to a 13-programme series called "After Hours", in which he appeared alongside Dick Emery, Clive Dunn, David Lodge, Joe Gibbons and Benny Lee. The show featured the "olde English sport of drats, later known as nurdling". Some of the sketches were adapted into a stage revue, "Don't Shoot, We're British". He also appeared in the film comedy "Raising a Riot", starring Kenneth More, which featured his five-year-old daughter "Fusty". He joked that she got better billing.
From 1960 to 1964, he had a television series, "It's a Square World", which won a BAFTA award in 1962 and Grand Prix de la Presse at Montreux in 1963. A prominent feature of the series was the imaginary flea circus where plays were enacted on tiny sets using nothing but special effects to show the movement of things too small to see and sounds with Bentine's commentary. One, titled "The Beast of the Black Bog Tarn", was set in a (miniature) haunted house.
He was the subject of "This Is Your Life" in April 1963 when he was surprised by Eamonn Andrews at the BBC Television Theatre.
In 1969–70 he was presenter of "The Golden Silents" on BBC TV, which attempted authentic showings of silent films, without the commentaries with which they were usually shown on television before then.
From 1974 to 1980 he wrote, designed, narrated and presented the children's television programme "Michael Bentine's Potty Time" and made one-off comedy specials.
From January to May 1984 Bentine put out 11 half-hour episodes, in two series, of "The Michael Bentine Show" on Radio 4. These have subsequently been repeated, several times, on the BBC's archive radio station BBC7 (now BBC Radio 4 Extra).
He wrote 16 best-selling books, among them novels, comedies and works of non-fiction. Four of his books, "The Long Banana Skin" (1975), "The Door Marked Summer" (1981), "Doors to the Mind" and "The Reluctant Jester" (1992), are autobiographical.
In 1968, travelling on the British Hovercraft Corporation (BHC) SR.N6, "GH–2012", Bentine took part in the first hovercraft expedition up the River Amazon.
In 1995, Bentine received a CBE from Queen Elizabeth II "for services to entertainment". He was also a holder of the Peruvian Order of Merit (as was his grandfather) for his work leading the fundraising for the Peruvian Earthquake Appeal.
Bentine was a crack pistol shot and helped to start the idea of a counter-terrorist wing within 22 SAS Regiment. In doing so, he became the first non-SAS person ever to fire a gun inside the close-quarters battle training house at Hereford.
His interests included parapsychology, an outgrowth of his and his family's extensive research into the paranormal, which led to his writing "The Door Marked Summer" and "The Doors of the Mind". For the final years of his life he was president of the Association for the Scientific Study of Anomalous Phenomena.
On 14 December 1977, he appeared with Arthur C. Clarke on Patrick Moore's BBC "The Sky at Night" programme. The broadcast was entitled "Suns, Spaceships and Bug-Eyed Monsters" – a light-hearted look at how science fiction had become science fact, as well as how ideas of space travel had become reality through the 20th century. In the opening of the programme, Moore introduces Bentine with Bentine confirming that he was the possessor of a "Readers Digest Degree". This remark was typical of Bentine's comic approach to most things in life that concealed his knowledge of science. Bentine appeared in a subsequent broadcast on a similar theme with Moore in 1980. Following the death of Arthur C. Clarke, "BBC Sky at Night" magazine released a copy of the 1977 archive programme on the cover of their May 2008 edition.
Bentine was married twice. With his first wife Marie Barradell, married 1941–1947, he had a daughter:
In 1949 he married his second wife Clementina Stuart, a Royal Ballet dancer. They had four children:
Of his five children, the two eldest daughters, Elaine and Marylla, died from cancer (breast cancer and lymphoma) in the 1980s. His elder son Stuart was killed with a pilot friend when a Piper PA-18 Super Cub crashed into a hillside at Ditcham Park Woods near Petersfield, Hampshire, on 28 August 1971. Their bodies and the aircraft were not found until October 1971. The AAIB, after an 11-month investigation, found that the aircraft went into cloud while taking action to avoid power cables when flying low in poor visibility, and subsequently went out of control. Bentine's subsequent investigation into the regulations governing private airfields resulted in his writing a report for Special Branch on the use of personal aircraft in smuggling operations. He fictionalised much of the material in his novel "Lords of the Levels".
From 1975 until his death in 1996, he and his wife spent their winters at a second home in Palm Springs, California, US.
Shortly before his death from prostate cancer at the age of 74, he was visited in hospital by Prince Charles.
Some of the programmes Bentine appeared in were:
Mania
Mania, also known as manic syndrome, is a state of abnormally elevated arousal, affect, and energy level, or "a state of heightened overall activation with enhanced affective expression together with lability of affect." Although mania is often conceived as a "mirror image" to depression, the heightened mood can be either euphoric or irritable; indeed, as the mania intensifies, irritability can be more pronounced and result in anxiety or violence.
The symptoms of mania include elevated mood (either euphoric or irritable), flight of ideas and pressure of speech, increased energy, decreased need and desire for sleep, and hyperactivity. They are most plainly evident in fully developed hypomanic states. However, in full-blown mania, they undergo progressively severe exacerbations and become more and more obscured by other signs and symptoms, such as delusions and fragmentation of behavior.
Mania is a syndrome with multiple causes. Although the vast majority of cases occur in the context of bipolar disorder, it is a key component of other psychiatric disorders (such as schizoaffective disorder, bipolar type) and may also occur secondary to various general medical conditions, such as multiple sclerosis. Certain medications, for example prednisone, may perpetuate a manic state, as may substances of abuse, especially stimulants such as caffeine and cocaine. In the current DSM-5, hypomanic episodes are separated from the more severe full manic episodes, which, in turn, are characterized as either mild, moderate, or severe, with certain diagnostic specifiers (e.g. catatonia, psychosis). Mania is divided into three stages: hypomania, or stage I; acute mania, or stage II; and delirious mania (delirium), or stage III. This "staging" of a manic episode is useful from a descriptive and differential diagnostic point of view.
Mania varies in intensity, from mild mania (hypomania) to delirious mania, marked by such symptoms as disorientation, florid psychosis, incoherence, and catatonia. Standardized tools such as Altman Self-Rating Mania Scale and Young Mania Rating Scale can be used to measure severity of manic episodes. Because mania and hypomania have also long been associated with creativity and artistic talent, it is not always the case that the clearly manic/hypomanic bipolar patient needs or wants medical help; such persons often either retain sufficient self-control to function normally or are unaware that they have "gone manic" severely enough to be committed or to commit themselves. Manic persons often can be mistaken for being under the influence of drugs.
In a mixed affective state, the individual, though meeting the general criteria for a hypomanic (discussed below) or manic episode, experiences three or more concurrent depressive symptoms. This has caused some speculation, among clinicians, that mania and depression, rather than constituting "true" polar opposites, are, rather, two independent axes in a unipolar—bipolar spectrum.
A mixed affective state, especially with prominent manic symptoms, places the patient at a greater risk for completed suicide. Depression on its own is a risk factor but, when coupled with an increase in energy and goal-directed activity, the patient is far more likely to act with violence on suicidal impulses.
Hypomania, which means "less than mania", is a lowered state of mania that does little to impair function or decrease quality of life. It may, in fact, increase productivity and creativity. In hypomania, there is less need for sleep and both goal-motivated behaviour and metabolism increase. Some studies exploring brain metabolism in subjects with hypomania, however, did not find any conclusive link; while some studies reported abnormalities, others failed to detect differences. Though the elevated mood and energy level typical of hypomania could be seen as a benefit, true mania itself generally has many undesirable consequences, including suicidal tendencies, and hypomania can, if the prominent mood is irritable as opposed to euphoric, be a rather unpleasant experience. In addition, the exaggerated case of hypomania can lead to problems. For instance, trait-based positivity could make a person more engaging and outgoing, with a positive outlook on life; when exaggerated in hypomania, however, the same person can display excessive optimism, grandiosity, and poor decision-making, often with little regard for the consequences.
A single manic episode, in the absence of secondary causes, (i.e., substance use disorders, pharmacologics, or general medical conditions) is often sufficient to diagnose bipolar I disorder. Hypomania may be indicative of bipolar II disorder. Manic episodes are often complicated by delusions and/or hallucinations; and if the psychotic features persist for a duration significantly longer than the episode of typical mania (two weeks or more), a diagnosis of schizoaffective disorder is more appropriate. Certain obsessive-compulsive spectrum disorders as well as impulse control disorders share the suffix "-mania," namely, kleptomania, pyromania, and trichotillomania. Despite the unfortunate association implied by the name, however, no connection exists between mania or bipolar disorder and these disorders.
Furthermore, evidence indicates that a vitamin B12 deficiency can also cause symptoms characteristic of mania and psychosis.
Hyperthyroidism can produce similar symptoms to those of mania, such as agitation, elevated mood, increased energy, hyperactivity, sleep disturbances and sometimes, especially in severe cases, psychosis.
A "manic episode" is defined in the American Psychiatric Association's diagnostic manual as a "distinct period of abnormally and persistently elevated, expansive, or irritable mood and abnormally and persistently increased activity or energy, lasting at least 1 week and present most of the day, nearly every day (or any duration, if hospitalization is necessary)," where the mood is not caused by drugs/medication or a non-mental medical illness (e.g., hyperthyroidism), and: (a) is causing obvious difficulties at work or in social relationships and activities, or (b) requires admission to hospital to protect the person or others, or (c) the person is suffering psychosis.
To be classified as a manic episode, while the disturbed mood and an increase in goal directed activity or energy is present, at least three (or four, if only irritability is present) of the following must have been consistently present:
Though the activities one participates in while in a manic state are not "always" negative, those with the potential to have negative outcomes are far more likely.
If the person is concurrently depressed, they are said to be having a mixed episode.
The World Health Organization's classification system defines "a manic episode" as one where mood is higher than the person's situation warrants and may vary from relaxed high spirits to barely controllable exuberance, is accompanied by hyperactivity, a compulsion to speak, a reduced sleep requirement, difficulty sustaining attention and/or often increased distractibility. Frequently, confidence and self-esteem are excessively enlarged, and grand, extravagant ideas are expressed. Behavior that is out of character and risky, foolish or inappropriate may result from a loss of normal social restraint.
Some people also have physical symptoms, such as sweating, pacing, and weight loss. In full-blown mania, often the manic person will feel as though his or her goal(s) are of paramount importance, that there are no consequences or that negative consequences would be minimal, and that they need not exercise restraint in the pursuit of what they are after. Hypomania is different, as it may cause little or no impairment in function. The hypomanic person's connection with the external world, and its standards of interaction, remain intact, although intensity of moods is heightened. But those who suffer from prolonged unresolved hypomania do run the risk of developing full mania, and indeed may cross that "line" without even realizing they have done so.
One of the signature symptoms of mania (and to a lesser extent, hypomania) is what many have described as racing thoughts. These are usually instances in which the manic person is excessively distracted by objectively unimportant stimuli. This experience creates an absent-mindedness where the manic individual's thoughts totally preoccupy him or her, making him or her unable to keep track of time, or be aware of anything besides the flow of thoughts. Racing thoughts also interfere with the ability to fall asleep.
Manic states are always relative to the normal state of intensity of the afflicted individual; thus, already irritable patients may find themselves losing their tempers even more quickly, and an academically gifted person may, during the hypomanic stage, adopt seemingly "genius" characteristics and an ability to perform and articulate at a level far beyond that which they would be capable of during euthymia. A very simple indicator of a manic state would be if a heretofore clinically depressed patient suddenly becomes inordinately energetic, enthusiastic, cheerful, aggressive, or "over happy". Other, often less obvious, elements of mania include delusions (generally of either grandeur or persecution, according to whether the predominant mood is euphoric or irritable), hypersensitivity, hypervigilance, hypersexuality, hyper-religiosity, hyperactivity and impulsivity, a compulsion to over explain (typically accompanied by pressure of speech), grandiose schemes and ideas, and a decreased need for sleep (for example, feeling rested after only 3 or 4 hours of sleep). In the case of the latter, the eyes of such patients may both look and seem abnormally "wide open", rarely blinking, and may contribute to some clinicians’ erroneous belief that these patients are under the influence of a stimulant drug, when the patient, in fact, is either not on any mind-altering substances or is actually on a depressant drug. Individuals may also engage in out-of-character behavior during the episode, such as questionable business transactions, wasteful expenditures of money (e.g., spending sprees), risky sexual activity, abuse of recreational substances, excessive gambling, reckless behavior (such as extreme speeding or other daredevil activity), abnormal social interaction (e.g. over familiarity and conversing with strangers), or highly vocal arguments. 
These behaviours may increase stress in personal relationships, lead to problems at work, and increase the risk of altercations with law enforcement. There is a high risk of impulsively taking part in activities potentially harmful to the self and others.
Although "severely elevated mood" sounds somewhat desirable and enjoyable, the experience of mania is ultimately often quite unpleasant and sometimes disturbing, if not frightening, for the person involved and for those close to them, and it may lead to impulsive behaviour that may later be regretted. It can also often be complicated by the sufferer's lack of judgment and insight regarding periods of exacerbation of characteristic states. Manic patients are frequently grandiose, obsessive, impulsive, irritable and belligerent, and often deny that anything is wrong with them. Because mania frequently encourages high energy and decreased perception of the need or ability to sleep, within a few days of a manic cycle sleep-deprived psychosis may appear, further complicating the ability to think clearly. Racing thoughts and misperceptions lead to frustration and a decreased ability to communicate with others.
Mania may also, as earlier mentioned, be divided into three “stages”. Stage I corresponds with hypomania and may feature typical hypomanic characteristics, such as gregariousness and euphoria. In stages II and III mania, however, the patient may be extraordinarily irritable, psychotic or even delirious. These latter two stages are referred to as acute and delirious (or Bell's), respectively.
Various triggers have been associated with switching from euthymic or depressed states into mania. One common trigger of mania is antidepressant therapy: studies show that the risk of switching while on an antidepressant is between 6 and 69 percent. Dopaminergic drugs such as reuptake inhibitors and dopamine agonists may also increase the risk of a switch. Other possible pharmacological triggers include glutamatergic agents and drugs that alter the HPA axis. Lifestyle triggers include irregular sleep-wake schedules and sleep deprivation, as well as extremely emotional or stressful stimuli.
Various genes that have been implicated in genetic studies of bipolar disorder have been manipulated in preclinical animal models to produce syndromes reflecting different aspects of mania. CLOCK and DBP polymorphisms have been linked to bipolar disorder in population studies, and behavioral changes induced by knockout are reversed by lithium treatment. Metabotropic glutamate receptor 6 has been genetically linked to bipolar disorder, and found to be under-expressed in the cortex. Pituitary adenylate cyclase-activating peptide has been associated with bipolar disorder in gene linkage studies, and knockout in mice produces mania-like behavior. Manipulation of the targets of various treatments, such as GSK-3 and ERK1, has also produced mania-like behavior in preclinical models.
Mania may be associated with strokes, especially cerebral lesions in the right hemisphere.
Deep brain stimulation of the subthalamic nucleus in Parkinson's disease has been associated with mania, especially with electrodes placed in the ventromedial STN. A proposed mechanism involves increased excitatory input from the STN to dopaminergic nuclei.
Mania can also be caused by physical trauma or illness. When the causes are physical, it is called secondary mania.
The mechanism underlying mania is unknown, but the neurocognitive profile of mania is highly consistent with dysfunction in the right prefrontal cortex, a common finding in neuroimaging studies. Various lines of evidence from post-mortem studies and the putative mechanisms of anti-manic agents point to abnormalities in GSK-3, dopamine, Protein kinase C and Inositol monophosphatase.
Meta analysis of neuroimaging studies demonstrate increased thalamic activity, and bilaterally reduced inferior frontal gyrus activation. Activity in the amygdala and other subcortical structures such as the ventral striatum tend to be increased, although results are inconsistent and likely dependent upon task characteristics such as valence. Reduced functional connectivity between the ventral prefrontal cortex and amygdala along with variable findings supports a hypothesis of general dysregulation of subcortical structures by the prefrontal cortex. A bias towards positively valenced stimuli, and increased responsiveness in reward circuitry may predispose towards mania. Mania tends to be associated with right hemisphere lesions, while depression tends to be associated with left hemisphere lesions.
Post-mortem examinations of bipolar disorder demonstrate increased expression of Protein Kinase C (PKC). While limited, some studies demonstrate manipulation of PKC in animals produces behavioral changes mirroring mania, and treatment with PKC inhibitor tamoxifen (also an anti-estrogen drug) demonstrates antimanic effects. Traditional antimanic drugs also demonstrate PKC inhibiting properties, among other effects such as GSK3 inhibition.
Manic episodes may be triggered by dopamine receptor agonists, and this, combined with tentative reports of increased VMAT2 activity (measured via PET scans of radioligand binding), suggests a role for dopamine in mania. Decreased cerebrospinal fluid levels of the serotonin metabolite 5-HIAA have also been found in manic patients, which may be explained by a failure of serotonergic regulation and dopaminergic hyperactivity.
Limited evidence suggests that mania is associated with behavioral reward hypersensitivity, as well as with neural reward hypersensitivity. Electrophysiological evidence supporting this comes from studies associating left frontal EEG activity with mania. As left frontal EEG activity is generally thought to be a reflection of behavioral activation system activity, this is thought to support a role for reward hypersensitivity in mania. Tentative evidence also comes from one study that reported an association between manic traits and feedback negativity during receipt of monetary reward or loss. Neuroimaging evidence during acute mania is sparse, but one study reported elevated orbitofrontal cortex activity to monetary reward, and another study reported elevated striatal activity to reward omission. The latter finding was interpreted in the context of either elevated baseline activity (resulting in a null finding of reward hypersensitivity), or reduced ability to discriminate between reward and punishment, still supporting reward hyperactivity in mania. Punishment hyposensitivity, as reflected in a number of neuroimaging studies as reduced lateral orbitofrontal response to punishment, has been proposed as a mechanism of reward hypersensitivity in mania.
In the ICD-10 there are several disorders involving the manic syndrome: organic manic disorder; mania without psychotic symptoms; mania with psychotic symptoms; other manic episodes; unspecified manic episode; manic type of schizoaffective disorder; bipolar affective disorder, current episode manic without psychotic symptoms; and bipolar affective disorder, current episode manic with psychotic symptoms.
Before beginning treatment for mania, careful differential diagnosis must be performed to rule out secondary causes.
The acute treatment of a manic episode of bipolar disorder involves the utilization of either a mood stabilizer (valproate, lithium, lamotrigine, or carbamazepine) or an atypical antipsychotic (olanzapine, quetiapine, risperidone, or aripiprazole). Although hypomanic episodes may respond to a mood stabilizer alone, full-blown episodes are treated with an atypical antipsychotic (often in conjunction with a mood stabilizer, as these tend to produce the most rapid improvement).
When the manic behaviours have gone, long-term treatment then focuses on prophylactic treatment to try to stabilize the patient's mood, typically through a combination of pharmacotherapy and psychotherapy. The likelihood of having a relapse is very high for those who have experienced two or more episodes of mania or depression. While medication for bipolar disorder is important to manage symptoms of mania and depression, studies show relying on medications alone is not the most effective method of treatment. Medication is most effective when used in combination with other bipolar disorder treatments, including psychotherapy, self-help coping strategies, and healthy lifestyle choices.
Lithium is the classic mood stabilizer used to prevent further manic and depressive episodes. A systematic review found that long-term lithium treatment substantially reduces the risk of bipolar manic relapse (by 42%). Anticonvulsants such as valproate, oxcarbazepine and carbamazepine are also used for prophylaxis. More recent additions include lamotrigine and topiramate, both also anticonvulsants.
In some cases, long-acting benzodiazepines, particularly clonazepam, are used after other options are exhausted. In more urgent circumstances, such as in emergency rooms, lorazepam has been used to promptly alleviate symptoms of agitation, aggression, and psychosis. Sometimes atypical antipsychotics are used in combination with the previously mentioned medications as well, including olanzapine, which helps treat hallucinations or delusions, asenapine, aripiprazole, risperidone, ziprasidone, and clozapine, which is often used for people who do not respond to lithium or anticonvulsants.
Verapamil, a calcium-channel blocker, is useful in the treatment of hypomania and in those cases where lithium and mood stabilizers are contraindicated or ineffective. Verapamil is effective for both short-term and long-term treatment.
Antidepressant monotherapy is not recommended for the treatment of depression in patients with bipolar disorders I or II, and no benefit has been demonstrated by combining antidepressants with mood stabilizers in these patients. Some atypical antidepressants, however, such as mirtazapine and trazodone, have been occasionally used after other options have failed.
In "Electroboy: A Memoir of Mania" by Andy Behrman, he describes his experience of mania as "the most perfect prescription glasses with which to see the world... life appears in front of you like an oversized movie screen". Behrman indicates early in his memoir that he sees himself not as a person suffering from an uncontrollable disabling illness, but as a director of the movie that is his vivid and emotionally alive life. There is some evidence that people in the creative industries suffer from bipolar disorder more often than those in other occupations.
Winston Churchill had periods of "manic symptoms" that may have been both an asset and a liability.
English actor Stephen Fry, who suffers from bipolar disorder, recounts manic behaviour during his adolescence: "When I was about 17 ... going around London on two stolen credit cards, it was a sort of fantastic reinvention of myself, an attempt to ... I bought ridiculous suits with stiff collars and silk ties from the 1920s, and would go to the Savoy and Ritz and drink cocktails." While he has experienced suicidal thoughts, he says the manic side of his condition has made positive contributions to his life.
The nosology of the various stages of a manic episode has changed over the decades. The word derives from the Ancient Greek μανία ("manía"), "madness, frenzy" and the verb μαίνομαι ("maínomai"), "to be mad, to rage, to be furious".
Multimedia
Multimedia is content that combines different content forms such as text, audio, images, animations, video and interactive content. It contrasts with media that use only rudimentary computer displays, such as text-only displays, or with traditional forms of printed or hand-produced material.
Multimedia can be recorded and played, displayed, interacted with or accessed by information content processing devices, such as computerized and electronic devices, but can also be part of a live performance. Multimedia devices are electronic media devices used to store and experience multimedia content. Multimedia is distinguished from mixed media in fine art; by including audio, for example, it has a broader scope. In the early years of multimedia, the term "rich media" was synonymous with interactive multimedia, and "hypermedia" was an application of multimedia.
The term "multimedia" was coined by singer and artist Bob Goldstein (later 'Bobb Goldsteinn') to promote the July 1966 opening of his "LightWorks at L'Oursin" show at Southampton, Long Island. Goldstein was perhaps aware of an American artist named Dick Higgins, who had two years previously discussed a new approach to art-making he called "intermedia".
On August 10, 1966, Richard Albarino of "Variety" borrowed the terminology, reporting: "Brainchild of songscribe-comic Bob ('Washington Square') Goldstein, the 'Lightworks' is the latest "multi-media" music-cum-visuals to debut as discothèque fare." Two years later, in 1968, the term "multimedia" was re-appropriated to describe the work of a political consultant, David Sawyer, the husband of Iris Sawyer—one of Goldstein's producers at L'Oursin.
In the 1993 first edition of "Multimedia: Making It Work", Tay Vaughan declared "Multimedia is any combination of text, graphic art, sound, animation, and video that is delivered by computer. When you allow the user – the viewer of the project – to control what and when these elements are delivered, it is "interactive multimedia". When you provide a structure of linked elements through which the user can navigate, interactive multimedia becomes "hypermedia"."
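Vaughan's hierarchy — multimedia, then interactive multimedia, then hypermedia — can be sketched as a toy data structure. The following is a hypothetical illustration only: the class and function names are invented for this sketch and do not belong to any real multimedia framework.

```python
# A combination of media elements on a page is "multimedia"; letting the
# user control delivery makes it "interactive multimedia"; linking pages
# for user navigation makes it "hypermedia" (after Vaughan's definitions).
from dataclasses import dataclass, field

@dataclass
class MediaElement:
    kind: str      # e.g. "text", "graphic", "sound", "animation", "video"
    content: str

@dataclass
class Page:
    elements: list                              # combined media = multimedia
    links: dict = field(default_factory=dict)   # named links = hypermedia

def navigate(pages, start, choices):
    """User-controlled navigation through linked pages (hypermedia)."""
    current = start
    for choice in choices:
        current = pages[current].links[choice]
    return current

pages = {
    "home": Page([MediaElement("text", "Welcome"),
                  MediaElement("video", "intro.mp4")],
                 links={"next": "gallery"}),
    "gallery": Page([MediaElement("graphic", "photo1.png")],
                    links={"back": "home"}),
}

print(navigate(pages, "home", ["next", "back"]))  # prints: home
```

The point of the sketch is only the structural distinction: the media elements alone are static content, and it is the user-followed links that make the structure navigable in Vaughan's "hypermedia" sense.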
The German language society Gesellschaft für deutsche Sprache recognized the word's significance and ubiquitousness in the 1990s by awarding it the title of German 'Word of the Year' in 1995. The institute summed up its rationale by stating "[Multimedia] has become a central word in the wonderful new media world".
In common usage, "multimedia" refers to an electronically delivered combination of media including video, still images, audio, and text in such a way that can be accessed interactively. Much of the content on the web today falls within this definition as understood by millions. Some computers which were marketed in the 1990s were called "multimedia" computers because they incorporated a CD-ROM drive, which allowed for the delivery of several hundred megabytes of video, picture, and audio data. That era also saw a boost in the production of educational multimedia CD-ROMs.
The term "video", if not used exclusively to describe motion photography, is ambiguous in multimedia terminology. "Video" is often used to describe the file format, delivery format, or presentation format, as distinct from "footage", which is used to distinguish motion photography from "animation" of rendered motion imagery. Presentations combining multiple forms of content, such as audio with video, are sometimes not considered multimedia, while single forms of information content with single methods of information processing (e.g. non-interactive audio) are sometimes called multimedia, perhaps to distinguish static media from active media. In the fine arts, for example, Leda Luss Luyken's ModulArt brings two key elements of musical composition and film into the world of painting: variation of a theme and movement of and within a picture, making "ModulArt" an interactive multimedia form of art. Performing arts may also be considered multimedia, considering that performers and props are multiple forms of both content and media.
Multimedia presentations may be viewed by person on stage, projected, transmitted, or played locally with a media player. A broadcast may be a live or recorded multimedia presentation. Broadcasts and recordings can be either analog or digital electronic media technology. Digital online multimedia may be downloaded or streamed. Streaming multimedia may be live or on-demand.
Multimedia games and simulations may be used in a physical environment with special effects, with multiple users in an online network, or locally with an offline computer, game system, or simulator.
The various formats of technological or digital multimedia may be intended to enhance the users' experience, for example to make it easier and faster to convey information, or, in entertainment or art, to transcend everyday experience.
Enhanced levels of interactivity are made possible by combining multiple forms of media content. Online multimedia is increasingly becoming object-oriented and data-driven, enabling applications with collaborative end-user innovation and personalization on multiple forms of content over time. Examples of these range from multiple forms of content on Web sites like photo galleries with both images (pictures) and title (text) user-updated, to simulations whose co-efficients, events, illustrations, animations or videos are modifiable, allowing the multimedia "experience" to be altered without reprogramming. In addition to seeing and hearing, haptic technology enables virtual objects to be felt. Emerging technology involving illusions of taste and smell may also enhance the multimedia experience.
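The idea of a simulation "whose co-efficients ... are modifiable, allowing the multimedia 'experience' to be altered without reprogramming" can be shown with a minimal sketch. This is an invented illustration, not code from any cited system: the coefficients live in data, so only the data — not the program — changes between experiences.

```python
# Data-driven simulation: behaviour is governed entirely by a coefficient
# dictionary, so the "experience" changes without touching the program code.
def simulate(coefficients, steps, x0=1.0):
    """Iterate a simple linear update x -> rate*x + offset, returning the history."""
    x = x0
    history = [x]
    for _ in range(steps):
        x = coefficients["rate"] * x + coefficients["offset"]
        history.append(x)
    return history

# The same program yields different experiences when only the data changes:
growth = simulate({"rate": 1.1, "offset": 0.0}, steps=3)  # values rise
decay = simulate({"rate": 0.5, "offset": 0.0}, steps=3)   # values fall
```

An end user (or content author) editing the coefficient dictionary alters what is presented, which is the sense in which such multimedia is "data-driven" rather than reprogrammed.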
Multimedia may be broadly divided into linear and non-linear categories:
Multimedia presentations can be live or recorded:
Multimedia finds its application in various areas including, but not limited to, advertisements, art, education, entertainment, engineering, medicine, mathematics, business, scientific research and spatial-temporal applications. Several examples are as follows:
Creative industries use multimedia for a variety of purposes ranging from fine arts, to entertainment, to commercial art, to journalism, to media and software services provided for any of the industries listed below. An individual multimedia designer may cover the spectrum throughout their career. Requests for their skills range from technical, to analytical, to creative.
Much of the electronic old and new media used by commercial artists and graphic designers is multimedia. Exciting presentations are used to grab and keep attention in advertising. Business to business, and interoffice communications are often developed by creative services firms for advanced multimedia presentations beyond simple slide shows to sell ideas or liven up training. Commercial multimedia developers may be hired to design for governmental services and nonprofit services applications as well.
Multimedia is heavily used in the entertainment industry, especially to develop special effects in movies and animations (VFX, 3D animation, etc.). Multimedia games are a popular pastime and are software programs available either as CD-ROMs or online. Some video games also use multimedia features.
Multimedia applications that allow users to actively participate instead of just sitting by as passive recipients of information are called "interactive multimedia".
In the arts there are multimedia artists, who blend techniques using different media that in some way incorporate interaction with the viewer. One of the most relevant is Peter Greenaway, who melds cinema with opera and all sorts of digital media. Another approach entails the creation of multimedia that can be displayed in a traditional fine-arts arena, such as an art gallery. Although multimedia display material may be volatile, the survivability of the content is as strong as that of any traditional medium. Digital recording material may be just as durable and is infinitely reproducible with perfect copies every time.
In education, multimedia is used to produce computer-based training courses (popularly called CBTs) and reference works such as encyclopedias and almanacs. A CBT lets the user go through a series of presentations, text about a particular topic, and associated illustrations in various information formats. Edutainment is the combination of education with entertainment, especially multimedia entertainment.
Learning theory in the past decade has expanded dramatically because of the introduction of multimedia. Several lines of research have evolved, e.g. cognitive load and multimedia learning.
From multimedia learning (MML) theory, David Roberts has developed a large-group lecture practice using PowerPoint, based on the use of full-slide images in conjunction with a reduction of visible text (all text can be placed in the 'notes view' section of PowerPoint). The method has been applied and evaluated in nine disciplines. In each experiment, students' engagement and active learning have been approximately 66% greater than with the same material delivered using bullet points, text and speech, corroborating a range of theories presented by multimedia learning scholars such as Sweller and Mayer. The idea of media convergence is also becoming a major factor in education, particularly higher education. Defined as separate technologies such as voice (and telephony features), data (and productivity applications) and video that now share resources and interact with each other, media convergence is rapidly changing the curriculum in universities all over the world.
Multimedia provides students with an alternate means of acquiring knowledge designed to enhance teaching and learning through various mediums and platforms. This technology allows students to learn at their own pace and gives teachers the ability to observe the individual needs of each student. The capacity for multimedia to be used in multi-disciplinary settings is structured around the idea of creating a hands-on learning environment through the use of technology. Lessons can be tailored to the subject matter as well as be personalized to the students' varying levels of knowledge on the topic. Learning content can be managed through activities that utilize and take advantage of multimedia platforms. This kind of learning encourages interactive communication between students and teachers and opens feedback channels, introducing an active learning process especially with the prevalence of new media and social media. Technology has impacted multimedia as it is largely associated with the use of computers or other electronic devices and digital media due to its capabilities concerning research, communication, problem-solving through simulations and feedback opportunities.
Multimedia is a robust education and research methodology within the social work context. The five forms of multimedia that support the education process are narrative media, interactive media, communicative media, adaptive media, and productive media. Contrary to long-standing belief, multimedia technology in social work education existed before the prevalence of the internet, taking the form of images, audio, and video incorporated into the curriculum.
First introduced to social work education by Seabury & Maple in 1993, multimedia technology is utilized to teach social work practice skills including interviewing, crisis intervention, and group work. In comparison with conventional teaching methods, including face-to-face courses, multimedia education shortens transportation time, increases knowledge and confidence in a richer and more authentic context for learning, generates interaction between online users, and enhances understanding of conceptual materials for novice students.
In an attempt to examine the impact of multimedia technology on students' study, A. Elizabeth Cauble and Linda P. Thurston conducted a study in which Building Family Foundations (BFF), an interactive multimedia training platform, was used to assess social work students' reactions to multimedia technology on variables of knowledge, attitudes, and self-efficacy. The results state that respondents showed a substantial increase in academic knowledge, confidence, and attitude. Multimedia also benefits students because it brings experts to students online, fits students' schedules, and allows students to choose courses that suit them.
Mayer's Cognitive Theory of Multimedia Learning suggests, "people learn more from words and pictures than from words alone." According to Mayer and other scholars, multimedia technology stimulates people's brains by implementing visual and auditory effects, and thereby assists online users to learn efficiently. Researchers suggest that when users establish dual channels while learning, they tend to understand and memorize better. Mixed findings on this theory are still present in the field of multimedia and social work.
With the spread and development of the English language around the world, it has become an important way of communicating between different people and cultures. Multimedia technology creates a platform where language can be taught. The traditional form of teaching English as a Second Language (ESL) in classrooms has drastically changed with the prevalence of technology, making it easier for students to obtain language-learning skills. Multimedia motivates students to learn more languages through audio, visual and animation support. It also helps create English contexts, since an important aspect of learning a language is developing grammar, vocabulary and knowledge of pragmatics and genres. In addition, cultural connections in terms of forms, contexts, meanings and ideologies have to be constructed. Multimedia develops students' communicative competence by improving their thought patterns and their capacity to understand the language. One study, carried out by Izquierdo, Simard and Pulido, examined the correlation between "Multimedia Instruction (MI) and learners' second language (L2)" and its effects on learning behavior. Basing its findings on Gardner's theory of the "socio-educational model of learner motivation and attitudes", the study shows that there is easier access to language-learning materials as well as increased motivation with MI, along with the use of Computer-Assisted Language Learning.
Newspaper companies all over the world are trying to embrace the new phenomenon of multimedia journalism by implementing its practices in their work. While some have been slow to come around, major newspapers like "The New York Times", "USA Today" and "The Washington Post" are setting the precedent for the positioning of the newspaper industry in a globalized world.
News reporting is not limited to traditional media outlets. Freelance journalists can make use of different new media to produce multimedia pieces for their news stories. It engages global audiences and tells stories with technology, which develops new communication techniques for both media producers and consumers. The Common Language Project, later renamed to The Seattle Globalist, is an example of this type of multimedia journalism production.
Multimedia reporters who are mobile (usually driving around a community with cameras, audio and video recorders, and laptop computers) are often referred to as mojos, from "mo"bile "jo"urnalist.
Software engineers may use multimedia in computer simulations for anything from entertainment to training such as military or industrial training. Multimedia for software interfaces are often done as a collaboration between creative professionals and software engineers.
In mathematical and scientific research, multimedia is mainly used for modeling and simulation. For example, a scientist can look at a molecular model of a particular substance and manipulate it to arrive at a new substance. Representative research can be found in journals such as the "Journal of Multimedia".
In medicine, doctors can get trained by looking at a virtual surgery or they can simulate how the human body is affected by diseases spread by viruses and bacteria and then develop techniques to prevent it. Multimedia applications such as virtual surgeries also help doctors to get practical training.
In Europe, the reference organisation for the multimedia industry is the European Multimedia Associations Convention (EMMAC).
Scholarly conferences about multimedia include:
Multimedia Exchange Network over Satellite (MENOS) is a communications protocol for exchanging multimedia content using communications satellites, most commonly used by professional broadcasters. | https://en.wikipedia.org/wiki?curid=20420 |
Max Headroom (TV series)
Max Headroom is an American satirical science fiction television series by Chrysalis Visual Programming and Lakeside Productions for Lorimar-Telepictures that aired in the United States on ABC from March 31, 1987 to May 12, 1988. The series is set in a futuristic dystopia ruled by an oligarchy of television networks, and features the character and media personality Max Headroom. The story is based on the Channel 4 British TV film produced by Chrysalis, "Max Headroom: 20 Minutes into the Future".
In the future, an oligarchy of television networks rules the world. Even the government functions primarily as a puppet of the network executives, serving mainly to pass laws — such as banning "off" switches on televisions — that protect and consolidate the networks' power. Television technology has advanced to the point that viewers' physical movements and thoughts can be monitored through their television sets. Almost all non-television technology has been discontinued or destroyed. The only real check on the power of the networks is Edison Carter, a crusading investigative journalist who regularly exposes the unethical practices of his own employer, and the team of allies both inside and outside the system who assist him in getting his reports to air and protecting him from the forces that wish to silence or kill him.
Edison Carter (Matt Frewer) is a hard-hitting reporter for Network 23 who sometimes uncovered things that his superiors in the network would have preferred be kept private. Eventually, one of these instances required him to flee his workspace, and in his escape he was injured in a motorcycle accident in a parking garage.
The series depicted very little of the past described by Edison. He met a female televangelist (whom he had dated in college) when his reporting put him at odds with the Vu Age Church that she now headed. Edison was sent on a near-rampage to avenge a former colleague, who died as a result of a story on dream-harvesting.
Edison cares about his co-workers, especially Theora Jones and Bryce Lynch, and he has a deep respect for his producer, Murray (although he rarely shows it).
Max Headroom (Frewer) is a computer reconstruction of Carter, created after Bryce Lynch uploaded a copy of his mind. He appears as a computer-rendered bust of Carter superimposed on a wire-frame background. Since Carter's last sight before the motorcycle crash was the sign "Max. headroom" on a parking garage gate, these were the reconstruction's first words and ultimately his name. While Carter is a dedicated professional, Max is a wisecracking observer of human contradictions.
Despite being the titular character, Max sparsely appeared on the show. While he occasionally played a significant part in a plot — sometimes by traveling through networks to gain information or by revealing secrets about Carter that Carter himself wouldn't divulge — his most frequent role was as comic relief, delivering brief quips in reaction to certain events or giving a humorous soliloquy at the end of an episode.
Theora Jones first appeared in the British-made television pilot film for the series. She was Network 23's star controller ("stolen" from the World One Network by Murray) and, working with Edison, the network's star reporter, she often helped save the day for everyone. She was also a potential love interest for Edison, but that subplot was not explored fully on the show before it was cancelled.
Network 23's personnel files list her father as unknown, her mother as deceased, and her brother as Shawn Jones; Shawn is the focus of the second episode broadcast, "Rakers".
Theora Jones was played by Amanda Pays, who along with Matt Frewer and W. Morgan Sheppard, was one of only three cast members to also appear in the American-made series that followed.
Cheviot (George Coe) is one of the executives on Network 23's board of directors. He becomes the board's new chairman after Ned Grossberg is fired in the wake of the Blipvert incident. He is mostly ethical and almost invariably backs Edison Carter, occasionally against the wishes of the rest of the Network 23 board. However, he has compromised himself on a few occasions when he felt questionable methods would raise the network's ratings, such as allowing the network to copyright the exclusive news of a terrorist organization, and mixing sex and politics. He once had an affair with board member Julia Fornby, though by the start of the show they had ended it long ago. While Cheviot usually defers to the network's biggest client, he refused to do so when that client attempted to supplant the television networks themselves.
Bryce Lynch (Chris Young), a child prodigy and computer hacker, is Network 23's one-man technology research department.
In the stereotypical hacker ethos, Bryce has few principles and fewer loyalties. He seems to accept any task, even morally questionable ones, as long as he is allowed to have the freedom to play with technology however he sees fit. This, in turn, makes him a greater asset to the technological needs and demands of the network, and the whims of its executives and stars. However, he also generally does not hurt or infringe on others, making him a rare neutral character in the Max Headroom universe.
In the pilot episode of the series, Bryce is enlisted by evil network CEO Ned Grossberg (Charles Rocket) to investigate the mental patterns of unconscious reporter Edison Carter, to determine whether or not Carter has discovered the secrets of the "Blipverts" scandal. Bryce uploads the contents of Carter's memory into the Network 23 computer system, creating Max Headroom. It had been Bryce, following orders from Grossberg, who fought a hacking battle of sorts with Theora Jones that led to Edison hitting his head on a traffic barrier and falling unconscious.
After the first episode, Bryce is generally recruited by Carter and his controller, Theora Jones, to provide technical aid to their investigative reporting efforts.
Murray (Jeffrey Tambor) is Carter's serious and high-strung producer, whose job often becomes a balancing act between supporting Carter's stories and pleasing Network 23's executives. In his younger years he was also a field reporter and had some experience with a controller's systems, though those systems have changed so much since then that he could not reliably replace a controller. When the "What I Want To Know Show" was created, it was a toss-up between Edison Carter and another reporter, and Murray "chose the best", a decision that would have future repercussions. Murray is divorced and sees his kids on weekends.
Reg (W. Morgan Sheppard) is a "blank", a person not indexed in the government's database. He broadcasts the underground Big Time Television Network from his bus. He is a good friend of Edison Carter, and saves him on more than one occasion. With colleague/lover Dominique, he operates and is the onscreen voice of Big Time television, "All day every day, making tomorrow seem like yesterday."
He dresses in a punk style and has a Mohawk haircut. He has an energetic personality and a strong nostalgic streak, defending antiquated music videos and printed books in equal measure.
Ned Grossberg is a recurring villain on the series, played by former "Saturday Night Live" cast member Charles Rocket.
In the pilot episode, Grossberg is the chairman of Network 23, a major city television station with the highest-rated investigative-news show in town, hosted by Edison Carter. In the Max Headroom world, real-time ratings equal advertising dollars, and advertisements have replaced stocks as the measure of corporate worth.
Grossberg, with his secret prodigy Bryce Lynch, develops a high-speed advertising delivery method known as Blipverts, which condenses full advertisements into a few seconds. When Carter discovers that Blipverts are killing people, Grossberg orders Lynch to prevent Carter from getting out of the building. Knocked unconscious, Carter's memories are extracted into a computer by Lynch in order to determine whether Carter uncovered Grossberg's knowledge of the danger of Blipverts. The resulting computer file of the memory-extraction process becomes Max Headroom, making Grossberg directly responsible for the creation of the character. In the end, Grossberg is publicly exposed as responsible for the Blipverts scandal, and is removed as chairman of Network 23.
A few episodes later, in "Grossberg's Return", Grossberg reappears as a board member of Network 66. Again, he invents a dubious advertising medium and convinces the chairman of the network to adopt it. When the advertising method is shown to be a complete fraud, the resulting public reaction against the network leads to the chairman being removed, and Grossberg manages to assume the chairmanship.
When under stress, Grossberg exhibits a tic of slightly stretching his neck in his suit's collar, first seen in episode 1 when he confronts Lynch in his lab regarding Max retaining Carter's memory about the blipverts.
In the UK telefilm "Max Headroom: 20 Minutes Into the Future" upon which the American series was based, the character was called Grossman and was played by Nickolas Grace. Rocket portrayed Grossberg as an American yuppie with a characteristic facial and neck-stretching twitch.
The series was based on the Channel 4 British TV film produced by Chrysalis, "Max Headroom: 20 Minutes into the Future". Cinemax aired the UK pilot followed by a six-week run of highlights from "The Max Headroom Show", a UK music video show in which Headroom appears between music videos. ABC took an interest in the pilot and asked Chrysalis/Lakeside to produce the series for American audiences.
"Max Headroom: 20 Minutes into the Future" was re-shot as a pilot program for a new series broadcast by the U.S. ABC television network. The pilot featured plot changes and some minor visual touches, but retained the same basic storyline. The only original cast retained for the series were Matt Frewer (Max Headroom/Edison Carter) and Amanda Pays (Theora Jones); a third original cast member, W. Morgan Sheppard, joined the series as "Blank Reg" in later episodes. Among the non-original cast, Jeffrey Tambor co-starred as "Murray", Edison Carter's neurotic producer.
The show went into production in late 1986 and ran for six episodes in the first season and eight in season two.
The series began as a mid-season replacement in spring of 1987, and did well enough to be renewed for the fall television season, but the viewer ratings could not be sustained in direct competition with CBS's Top 20 hit "Dallas" (also produced by Lorimar) and NBC's Top 30 hit "Miami Vice." "Max Headroom" was canceled part-way into its second season. The entire series, along with two previously unbroadcast episodes, was rerun in spring 1988 during the Writers Guild of America strike. In the late 1990s, U.S. cable TV channels Bravo and the Sci-Fi Channel re-ran the series. Reruns also briefly appeared on TechTV in 2001. A cinema spin-off titled "Max Headroom for President" was announced with production intended to start in early 1988 in order to capitalize on that year's U.S. presidential election, but it was never made.
"Max Headroom" has been called "the first cyberpunk television series", with "deep roots in the Western philosophical tradition".
Shout! Factory (under license from Warner Bros. Home Entertainment) released "Max Headroom: The Complete Series" on DVD in the United States and Canada on August 10, 2010. The bonus features includes a round-table discussion with most of the major cast members (other than Matt Frewer), and interviews with the writers and producers. | https://en.wikipedia.org/wiki?curid=20421 |
Malaria
Malaria is a mosquito-borne infectious disease that affects humans and other animals. Malaria causes symptoms that typically include fever, tiredness, vomiting, and headaches. In severe cases it can cause yellow skin, seizures, coma, or death. Symptoms usually begin ten to fifteen days after being bitten by an infected mosquito. If not properly treated, people may have recurrences of the disease months later. In those who have recently survived an infection, reinfection usually causes milder symptoms. This partial resistance disappears over months to years if the person has no continuing exposure to malaria.
Malaria is caused by single-celled microorganisms of the "Plasmodium" group. The disease is most commonly spread by an infected female "Anopheles" mosquito. The mosquito bite introduces the parasites from the mosquito's saliva into a person's blood. The parasites travel to the liver where they mature and reproduce. Five species of "Plasmodium" can infect and be spread by humans. Most deaths are caused by "P. falciparum", whereas "P. vivax", "P. ovale", and "P. malariae" generally cause a milder form of malaria. The species "P. knowlesi" rarely causes disease in humans. Malaria is typically diagnosed by the microscopic examination of blood using blood films, or with antigen-based rapid diagnostic tests. Methods that use the polymerase chain reaction to detect the parasite's DNA have been developed, but are not widely used in areas where malaria is common due to their cost and complexity.
The risk of disease can be reduced by preventing mosquito bites through the use of mosquito nets and insect repellents, or with mosquito control measures such as spraying insecticides and draining standing water. Several medications are available to prevent malaria in travellers to areas where the disease is common. Occasional doses of the combination medication sulfadoxine/pyrimethamine are recommended in infants and after the first trimester of pregnancy in areas with high rates of malaria. As of 2020, there is one vaccine which has been shown to reduce the risk of malaria by about 40% in children in Africa. Efforts to develop more effective vaccines are ongoing. The recommended treatment for malaria is a combination of antimalarial medications that includes an artemisinin. The second medication may be either mefloquine, lumefantrine, or sulfadoxine/pyrimethamine. Quinine along with doxycycline may be used if an artemisinin is not available. It is recommended that in areas where the disease is common, malaria is confirmed if possible before treatment is started due to concerns of increasing drug resistance. Resistance among the parasites has developed to several antimalarial medications; for example, chloroquine-resistant "P. falciparum" has spread to most malarial areas, and resistance to artemisinin has become a problem in some parts of Southeast Asia.
The disease is widespread in the tropical and subtropical regions that exist in a broad band around the equator. This includes much of sub-Saharan Africa, Asia, and Latin America. In 2018 there were 228 million cases of malaria worldwide resulting in an estimated 405,000 deaths. Approximately 93% of the cases and 94% of deaths occurred in Africa. Rates of disease have decreased from 2010 to 2014, but increased from 2015 to 2017, during which there were 231 million cases. Malaria is commonly associated with poverty and has a major negative effect on economic development. In Africa, it is estimated to result in losses of US$12 billion a year due to increased healthcare costs, lost ability to work, and negative effects on tourism.
The signs and symptoms of malaria typically begin 8–25 days following infection, but may occur later in those who have taken antimalarial medications as prevention. Initial manifestations of the disease—common to all malaria species—are similar to flu-like symptoms, and can resemble other conditions such as sepsis, gastroenteritis, and viral diseases. The presentation may include headache, fever, shivering, joint pain, vomiting, hemolytic anemia, jaundice, hemoglobin in the urine, retinal damage, and convulsions.
The classic symptom of malaria is paroxysm—a cyclical occurrence of sudden coldness followed by shivering and then fever and sweating, occurring every two days (tertian fever) in "P. vivax" and "P. ovale" infections, and every three days (quartan fever) for "P. malariae". "P. falciparum" infection can cause recurrent fever every 36–48 hours, or a less pronounced and almost continuous fever.
Severe malaria is usually caused by "P. falciparum" (often referred to as falciparum malaria). Symptoms of falciparum malaria arise 9–30 days after infection. Individuals with cerebral malaria frequently exhibit neurological symptoms, including abnormal posturing, nystagmus, conjugate gaze palsy (failure of the eyes to turn together in the same direction), opisthotonus, seizures, or coma.
Malaria has several serious complications. Among these is the development of respiratory distress, which occurs in up to 25% of adults and 40% of children with severe "P. falciparum" malaria. Possible causes include respiratory compensation of metabolic acidosis, noncardiogenic pulmonary oedema, concomitant pneumonia, and severe anaemia. Although rare in young children with severe malaria, acute respiratory distress syndrome occurs in 5–25% of adults and up to 29% of pregnant women. Coinfection of HIV with malaria increases mortality. Kidney failure is a feature of blackwater fever, where haemoglobin from lysed red blood cells leaks into the urine.
Infection with "P. falciparum" may result in cerebral malaria, a form of severe malaria that involves encephalopathy. It is associated with retinal whitening, which may be a useful clinical sign in distinguishing malaria from other causes of fever. An enlarged spleen, enlarged liver or both of these, severe headache, low blood sugar, and haemoglobin in the urine with kidney failure may occur. Complications may include spontaneous bleeding, coagulopathy, and shock.
Malaria in pregnant women is an important cause of stillbirths, infant mortality, abortion and low birth weight, particularly in "P. falciparum" infection, but also with "P. vivax".
Malaria parasites belong to the genus "Plasmodium" (phylum Apicomplexa). In humans, malaria is caused by "P. falciparum", "P. malariae", "P. ovale", "P. vivax" and "P. knowlesi". Among those infected, "P. falciparum" is the most common species identified (~75%), followed by "P. vivax" (~20%). Although "P. falciparum" traditionally accounts for the majority of deaths, recent evidence suggests that "P. vivax" malaria is associated with potentially life-threatening conditions about as often as "P. falciparum" infection. "P. vivax" is proportionally more common outside Africa. There have been documented human infections with several species of "Plasmodium" from higher apes; however, except for "P. knowlesi", a zoonotic species that causes malaria in macaques, these are mostly of limited public health importance.
In the life cycle of "Plasmodium", a female "Anopheles" mosquito (the definitive host) transmits a motile infective form (called the sporozoite) to a vertebrate host such as a human (the secondary host), thus acting as a transmission vector. A sporozoite travels through the blood vessels to liver cells (hepatocytes), where it reproduces asexually (tissue schizogony), producing thousands of merozoites. These infect new red blood cells and initiate a series of asexual multiplication cycles (blood schizogony) that produce 8 to 24 new infective merozoites, at which point the cells burst and the infective cycle begins anew.
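The 8-to-24-fold amplification per blood-stage cycle described above compounds quickly. The following is an illustrative sketch only, assuming every released merozoite invades a fresh red blood cell; real parasite loads are limited by immunity, red-cell availability, and treatment.

```python
def parasite_load(initial: int, merozoites_per_cycle: int, cycles: int) -> int:
    """Parasites after a number of blood-stage replication cycles,
    assuming every released merozoite invades a fresh red blood cell."""
    load = initial
    for _ in range(cycles):
        load *= merozoites_per_cycle
    return load

# Bounds after four cycles, using the 8-24 merozoites-per-cycle
# figure from the text:
print(parasite_load(1, 8, 4))   # 4096
print(parasite_load(1, 24, 4))  # 331776
```

Even at the lower bound, a single sporozoite-derived lineage grows by more than three orders of magnitude within a handful of cycles, which is why blood-stage infection escalates so rapidly.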
Other merozoites develop into immature gametocytes, which are the precursors of male and female gametes. When a female mosquito bites an infected person, gametocytes are taken up with the blood and mature in the mosquito gut. There they develop into gametes; the male and female gametes fuse to form an ookinete, a fertilised, motile zygote. Ookinetes penetrate the gut wall and develop into oocysts, which release new sporozoites that migrate to the insect's salivary glands, ready to infect a new vertebrate host. The sporozoites are injected into the skin, in the saliva, when the mosquito takes a subsequent blood meal.
Only female mosquitoes feed on blood; male mosquitoes feed on plant nectar and do not transmit the disease. Females of the mosquito genus "Anopheles" prefer to feed at night. They usually start searching for a meal at dusk, and continue through the night until they succeed. Malaria parasites can also be transmitted by blood transfusions, although this is rare.
Symptoms of malaria can recur after varying symptom-free periods. Depending upon the cause, recurrence can be classified as either recrudescence, relapse, or reinfection. Recrudescence is when symptoms return after a symptom-free period; it is caused by parasites surviving in the blood as a result of inadequate or ineffective treatment. Relapse is when symptoms reappear after the parasites have been eliminated from the blood but persist as dormant hypnozoites in liver cells. Relapse commonly occurs 8–24 weeks after infection and is often seen in "P. vivax" and "P. ovale" infections. However, relapse-like "P. vivax" recurrences are probably being over-attributed to hypnozoite activation. Some of them might have an extra-vascular merozoite origin, making these recurrences recrudescences, not relapses. One newly recognised, non-hypnozoite, possible contributing source to recurrent peripheral "P. vivax" parasitemia is erythrocytic forms in bone marrow. "P. vivax" malaria cases in temperate areas often involve overwintering by hypnozoites, with relapses beginning the year after the mosquito bite. Reinfection means the parasite that caused the past infection was eliminated from the body but a new parasite was introduced. Reinfection cannot readily be distinguished from recrudescence, although recurrence of infection within two weeks of treatment for the initial infection is typically attributed to treatment failure. People may develop some immunity when exposed to frequent infections.
Global climate change is likely to affect malaria transmission, but the degree of effect and the areas affected are uncertain. Greater rainfall in certain areas of India, and following an El Niño event, is associated with increased mosquito numbers.
Malaria infection develops via two phases: one that involves the liver (exoerythrocytic phase), and one that involves red blood cells, or erythrocytes (erythrocytic phase). When an infected mosquito pierces a person's skin to take a blood meal, sporozoites in the mosquito's saliva enter the bloodstream and migrate to the liver where they infect hepatocytes, multiplying asexually and asymptomatically for a period of 8–30 days.
After a potential dormant period in the liver, these organisms differentiate to yield thousands of merozoites, which, following rupture of their host cells, escape into the blood and infect red blood cells to begin the erythrocytic stage of the life cycle. The parasite escapes from the liver undetected by wrapping itself in the cell membrane of the infected host liver cell.
Within the red blood cells, the parasites multiply further, again asexually, periodically breaking out of their host cells to invade fresh red blood cells. Several such amplification cycles occur. Thus, classical descriptions of waves of fever arise from simultaneous waves of merozoites escaping and infecting red blood cells.
Some "P. vivax" sporozoites do not immediately develop into exoerythrocytic-phase merozoites, but instead, produce hypnozoites that remain dormant for periods ranging from several months (7–10 months is typical) to several years. After a period of dormancy, they reactivate and produce merozoites. Hypnozoites are responsible for long incubation and late relapses in "P. vivax" infections, although their existence in "P. ovale" is uncertain.
The parasite is relatively protected from attack by the body's immune system because for most of its human life cycle it resides within the liver and blood cells and is relatively invisible to immune surveillance. However, circulating infected blood cells are destroyed in the spleen. To avoid this fate, the "P. falciparum" parasite displays adhesive proteins on the surface of the infected blood cells, causing the blood cells to stick to the walls of small blood vessels, thereby sequestering the parasite from passage through the general circulation and the spleen. The blockage of the microvasculature causes symptoms such as those in placental malaria. Sequestered red blood cells can breach the blood–brain barrier and cause cerebral malaria.
According to a 2005 review, the high levels of mortality and morbidity caused by malaria, especially by "P. falciparum", have placed the greatest selective pressure on the human genome in recent history. Several genetic factors provide some resistance to it, including sickle cell trait, thalassaemia traits, glucose-6-phosphate dehydrogenase deficiency, and the absence of Duffy antigens on red blood cells.
The impact of sickle cell trait on malaria immunity illustrates some evolutionary trade-offs that have occurred because of endemic malaria. Sickle cell trait causes a change in the haemoglobin molecule in the blood. Normally, red blood cells have a very flexible, biconcave shape that allows them to move through narrow capillaries; however, when the modified haemoglobin S molecules are exposed to low amounts of oxygen, or crowd together due to dehydration, they can stick together, forming strands that cause the cell to distort into a curved sickle shape. In these strands, the molecule is not as effective in taking up or releasing oxygen, and the cell is not flexible enough to circulate freely. In the early stages of malaria, the parasite can cause infected red cells to sickle, so they are removed from circulation sooner. This reduces the frequency with which malaria parasites complete their life cycle in the cell. Individuals who are homozygous (with two copies of the abnormal haemoglobin beta allele) have sickle-cell anaemia, while those who are heterozygous (with one abnormal allele and one normal allele) experience resistance to malaria without severe anaemia. Although the shorter life expectancy for those with the homozygous condition would tend to disfavour the trait's survival, the trait is preserved in malaria-prone regions because of the benefits provided by the heterozygous form.
Liver dysfunction as a result of malaria is uncommon and usually only occurs in those with another liver condition such as viral hepatitis or chronic liver disease. The syndrome is sometimes called "malarial hepatitis". While it has been considered a rare occurrence, malarial hepatopathy has seen an increase, particularly in Southeast Asia and India. Liver compromise in people with malaria correlates with a greater likelihood of complications and death.
Owing to the non-specific nature of the presentation of symptoms, diagnosis of malaria in non-endemic areas requires a high degree of suspicion, which might be elicited by any of the following: recent travel history, enlarged spleen, fever, low number of platelets in the blood, and higher-than-normal levels of bilirubin in the blood combined with a normal level of white blood cells. Reports in 2016 and 2017 from countries where malaria is common suggest high levels of overdiagnosis due to insufficient or inaccurate laboratory testing.
Malaria is usually confirmed by the microscopic examination of blood films or by antigen-based rapid diagnostic tests (RDT). In some areas, RDTs must be able to distinguish whether the malaria symptoms are caused by "Plasmodium falciparum" or by other species of parasites since treatment strategies could differ for non-"P. falciparum" infections. Microscopy is the most commonly used method to detect the malarial parasite—about 165 million blood films were examined for malaria in 2010. Despite its widespread usage, diagnosis by microscopy suffers from two main drawbacks: many settings (especially rural) are not equipped to perform the test, and the accuracy of the results depends on both the skill of the person examining the blood film and the levels of the parasite in the blood. The sensitivity of blood films ranges from 75–90% in optimum conditions, to as low as 50%. Commercially available RDTs are often more accurate than blood films at predicting the presence of malaria parasites, but they are widely variable in diagnostic sensitivity and specificity depending on manufacturer, and are unable to tell how many parasites are present. However, incorporating RDTs into the diagnosis of malaria can reduce antimalarial prescription. Although RDT does not improve the health outcomes of those infected with malaria, it also does not lead to worse outcomes when compared to presumptive antimalarial treatment.
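The practical meaning of the sensitivity figures above depends heavily on how common malaria is in the tested population. The sketch below applies Bayes' rule; the 75% and 90% sensitivities come from the text, while the 95% specificity and the prevalence values are illustrative assumptions, not figures from the source.

```python
def predictive_values(sensitivity: float, specificity: float,
                      prevalence: float) -> tuple[float, float]:
    """Positive and negative predictive value of a diagnostic test."""
    tp = sensitivity * prevalence              # true positives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    fn = (1 - sensitivity) * prevalence        # false negatives
    tn = specificity * (1 - prevalence)        # true negatives
    return tp / (tp + fp), tn / (tn + fn)

for sens in (0.75, 0.90):
    for prev in (0.01, 0.30):  # non-endemic vs highly endemic setting
        ppv, npv = predictive_values(sens, 0.95, prev)
        print(f"sensitivity={sens:.0%} prevalence={prev:.0%}: "
              f"PPV={ppv:.2f} NPV={npv:.2f}")
```

At 1% prevalence, most positive blood films would be false positives even with good specificity, which is one reason diagnosis in non-endemic areas requires confirmation and a high degree of suspicion.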
In regions where laboratory tests are readily available, malaria should be suspected, and tested for, in any unwell person who has been in an area where malaria is endemic. In areas that cannot afford laboratory diagnostic tests, it has become common to use only a history of fever as the indication to treat for malaria—thus the common teaching "fever equals malaria unless proven otherwise". A drawback of this practice is overdiagnosis of malaria and mismanagement of non-malarial fever, which wastes limited resources, erodes confidence in the health care system, and contributes to drug resistance. Although polymerase chain reaction-based tests have been developed, they are not widely used in areas where malaria is common as of 2012, due to their complexity.
Malaria is classified into either "severe" or "uncomplicated" by the World Health Organisation (WHO). It is deemed severe when "any" of the following criteria are present, otherwise it is considered uncomplicated.
Cerebral malaria is defined as a severe "P. falciparum"-malaria presenting with neurological symptoms, including coma (with a Glasgow coma scale less than 11, or a Blantyre coma scale less than 3), or with a coma that lasts longer than 30 minutes after a seizure.
Various types of malaria have been called by the names below:
Methods used to prevent malaria include medications, mosquito elimination and the prevention of bites. As of 2020, there is one vaccine for malaria (known as RTS,S) which is licensed for use. The presence of malaria in an area requires a combination of high human population density, high anopheles mosquito population density and high rates of transmission from humans to mosquitoes and from mosquitoes to humans. If any of these is lowered sufficiently, the parasite eventually disappears from that area, as happened in North America, Europe, and parts of the Middle East. However, unless the parasite is eliminated from the whole world, it could re-establish if conditions revert to a combination that favors the parasite's reproduction. Furthermore, the cost per person of eliminating anopheles mosquitoes rises with decreasing population density, making it economically unfeasible in some areas.
Prevention of malaria may be more cost-effective than treatment of the disease in the long run, but the initial costs required are out of reach of many of the world's poorest people. There is a wide difference in the costs of control (i.e. maintenance of low endemicity) and elimination programs between countries. For example, in China—whose government in 2010 announced a strategy to pursue malaria elimination in the Chinese provinces—the required investment is a small proportion of public expenditure on health. In contrast, a similar programme in Tanzania would cost an estimated one-fifth of the public health budget.
In areas where malaria is common, children under five years old often have anaemia, which is sometimes due to malaria. Giving children with anaemia in these areas preventive antimalarial medication improves red blood cell levels slightly but does not affect the risk of death or need for hospitalisation.
Vector control refers to methods used to decrease malaria by reducing the levels of transmission by mosquitoes. For individual protection, the most effective insect repellents are based on DEET or picaridin. However, there is insufficient evidence that mosquito repellents can prevent malaria infection. Insecticide-treated mosquito nets (ITNs) and indoor residual spraying (IRS) have been shown to be highly effective in preventing malaria among children in areas where malaria is common. Prompt treatment of confirmed cases with artemisinin-based combination therapies (ACTs) may also reduce transmission.
Mosquito nets help keep mosquitoes away from people and reduce infection rates and transmission of malaria. Nets are not a perfect barrier and are often treated with an insecticide designed to kill the mosquito before it has time to find a way past the net. Insecticide-treated nets are estimated to be twice as effective as untreated nets and offer greater than 70% protection compared with no net. Between 2000 and 2008, the use of ITNs saved the lives of an estimated 250,000 infants in Sub-Saharan Africa. About 13% of households in Sub-Saharan countries owned ITNs in 2007, and 31% of African households were estimated to own at least one ITN in 2008. In 2000, 1.7 million (1.8%) African children living in areas of the world where malaria is common were protected by an ITN. That number increased to 20.3 million (18.5%) African children using ITNs in 2007, leaving 89.6 million children unprotected; by 2015, 68% of African children were using mosquito nets. Most nets are impregnated with pyrethroids, a class of insecticides with low toxicity. They are most effective when used from dusk to dawn. It is recommended to hang a large "bed net" above the center of a bed and either tuck the edges under the mattress or make sure it is large enough that it touches the ground. ITNs benefit pregnancy outcomes in malaria-endemic regions in Africa, but more data are needed in Asia and Latin America.
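The coverage figures above can be cross-checked with simple arithmetic: the "20.3 million (18.5%)" figure implies the total population of at-risk African children, from which the number left unprotected follows.

```python
# Cross-check of the 2007 ITN coverage figures quoted in the text.
protected_2007 = 20.3e6   # children protected by an ITN
coverage_2007 = 0.185     # stated coverage fraction

children_at_risk = protected_2007 / coverage_2007   # implied total at risk
unprotected = children_at_risk - protected_2007

print(f"at risk: {children_at_risk / 1e6:.1f} million")
print(f"unprotected: {unprotected / 1e6:.1f} million")
```

The implied total is roughly 110 million children, leaving about 89 million unprotected, consistent with the 89.6 million quoted.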
In areas where mosquitoes show high insecticide resistance, piperonyl butoxide combined with pyrethroids in ITNs is effective in reducing malaria infection rates.
Indoor residual spraying is the spraying of insecticides on the walls inside a home. After feeding, many mosquitoes rest on a nearby surface while digesting the bloodmeal, so if the walls of houses have been coated with insecticides, the resting mosquitoes can be killed before they can bite another person and transfer the malaria parasite. As of 2006, the World Health Organisation recommends 12 insecticides in IRS operations, including DDT and the pyrethroids cyfluthrin and deltamethrin. This public health use of small amounts of DDT is permitted under the Stockholm Convention, which prohibits its agricultural use. One problem with all forms of IRS is insecticide resistance. Mosquitoes affected by IRS tend to rest and live indoors, and due to the irritation caused by spraying, their descendants tend to rest and live outdoors, meaning that they are less affected by the IRS. It is uncertain whether the use of IRS together with ITN is effective in reducing malaria cases due to wide geographical variety of malaria distribution, malaria transmission, and insecticide resistance.
People have tried a number of other methods to reduce mosquito bites and slow the spread of malaria. Efforts to decrease mosquito larvae by decreasing the availability of open water where they develop, or by adding substances to decrease their development, are effective in some locations. Electronic mosquito-repellent devices, which make very high-frequency sounds that are supposed to keep female mosquitoes away, have no supporting evidence of effectiveness. There is low-certainty evidence that fogging may have an effect on malaria transmission. Larviciding by hand delivery of chemical or microbial insecticides into water bodies containing low larval distribution may reduce malaria transmission. There is insufficient evidence to determine whether larvivorous fish can decrease mosquito density and transmission in an area.
There are a number of medications that can help prevent or interrupt malaria in travellers to places where infection is common. Many of these medications are also used in treatment. In places where "Plasmodium" is resistant to one or more medications, three medications—mefloquine, doxycycline, or the combination of atovaquone/proguanil ("Malarone")—are frequently used for prevention. Doxycycline and atovaquone/proguanil are better tolerated, while mefloquine is taken once a week. Areas of the world with chloroquine-sensitive malaria are uncommon. Antimalarial mass drug administration to an entire population at the same time may reduce the risk of contracting malaria in the population.
The protective effect does not begin immediately, and people visiting areas where malaria exists usually start taking the drugs one to two weeks before they arrive, and continue taking them for four weeks after leaving (except for atovaquone/proguanil, which only needs to be started two days before and continued for seven days afterward). The use of preventative drugs is often not practical for those who live in areas where malaria exists, and their use is usually given only to pregnant women and short-term visitors. This is due to the cost of the drugs, side effects from long-term use, and the difficulty in obtaining antimalarial drugs outside of wealthy nations. During pregnancy, medication to prevent malaria has been found to improve the weight of the baby at birth and decrease the risk of anaemia in the mother. The use of preventative drugs where malaria-bearing mosquitoes are present may encourage the development of partial resistance.
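The timing rules above can be expressed as a small schedule helper. This is a sketch, not clinical guidance: the 14-day lead time takes the upper end of the "one to two weeks before" window, the regimen table is an assumption built only from the figures in the text, and `prophylaxis_window` is a hypothetical helper name.

```python
from datetime import date, timedelta

# (days to start before arrival, days to continue after departure),
# following the text: standard drugs begin 1-2 weeks before travel and
# continue 4 weeks after; atovaquone/proguanil is the exception.
REGIMENS = {
    "mefloquine": (14, 28),
    "doxycycline": (14, 28),
    "atovaquone/proguanil": (2, 7),
}

def prophylaxis_window(drug: str, arrival: date, departure: date) -> tuple[date, date]:
    """First and last day of preventative dosing for a trip."""
    lead, tail = REGIMENS[drug]
    return arrival - timedelta(days=lead), departure + timedelta(days=tail)

start, stop = prophylaxis_window("atovaquone/proguanil",
                                 date(2024, 6, 10), date(2024, 6, 24))
print(start, stop)  # 2024-06-08 2024-07-01
```

For a two-week trip, the shorter atovaquone/proguanil schedule means roughly three weeks of dosing in total, versus about eight weeks for the other regimens.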
Giving antimalarial drugs to infants through intermittent preventive therapy can reduce the risk of malaria infection, hospital admission, and anaemia.
Mefloquine is more effective than sulfadoxine-pyrimethamine in preventing malaria in HIV-negative pregnant women. Cotrimoxazole is effective in preventing malaria infection and reducing the risk of anaemia in HIV-positive women. Giving sulfadoxine-pyrimethamine as intermittent preventive therapy in three or more doses is superior to two doses for HIV-positive women living in malaria-endemic areas.
Community participation and health-education strategies promoting awareness of malaria and the importance of control measures have been successfully used to reduce the incidence of malaria in some areas of the developing world. Recognising the disease in its early stages can prevent it from becoming fatal. Education can also inform people to cover areas of stagnant, still water, such as water tanks, which are ideal breeding grounds for the mosquito, thus cutting down the risk of transmission between people. This is generally employed in urban areas, where large centers of population are confined in a small space and transmission is most likely. Intermittent preventive therapy is another intervention that has been used successfully to control malaria in pregnant women and infants, and in preschool children where transmission is seasonal.
Malaria is treated with antimalarial medications; which ones are used depends on the type and severity of the disease. While medications against fever are commonly used, their effects on outcomes are not clear. Providing free antimalarial drugs to households may reduce childhood deaths when used appropriately. Programmes which presumptively treat all causes of fever with antimalarial drugs may lead to overuse of antimalarials and undertreatment of other causes of fever. Nevertheless, the use of malaria rapid-diagnostic kits can help to reduce over-usage of antimalarials.
Simple or uncomplicated malaria may be treated with oral medications. Artemisinin drugs are effective and safe in treating uncomplicated malaria. Artemisinin in combination with other antimalarials (known as artemisinin-based combination therapy, or ACT) is about 90% effective when used to treat uncomplicated malaria. The most effective treatment for "P. falciparum" infection is the use of ACT, which decreases resistance to any single drug component. Artemether-lumefantrine (six-dose regimen) is more effective than the four-dose regimen or other regimens not containing artemisinin derivatives in treating falciparum malaria. Another recommended combination is dihydroartemisinin and piperaquine. Artemisinin-naphthoquine combination therapy has shown promising results in treating falciparum malaria, but more research is needed to establish its efficacy as a reliable treatment. Artesunate plus mefloquine performs better than mefloquine alone in treating uncomplicated falciparum malaria in low-transmission settings. There is limited data to show that atovaquone-proguanil is more effective than chloroquine, amodiaquine, and mefloquine in treating falciparum malaria. Azithromycin monotherapy or combination therapy has not shown effectiveness in treating falciparum or vivax malaria. Amodiaquine plus sulfadoxine-pyrimethamine may achieve fewer treatment failures than sulfadoxine-pyrimethamine alone in uncomplicated falciparum malaria. There is insufficient data on chlorproguanil-dapsone in treating uncomplicated falciparum malaria. The addition of primaquine to artemisinin-based combination therapy for falciparum malaria reduces transmission at days 3–4 and day 8 of infection. Sulfadoxine-pyrimethamine plus artesunate is better than sulfadoxine-pyrimethamine plus amodiaquine in controlling treatment failure at day 28; however, the latter is better at reducing gametocytes in blood at day 7.
Infection with "P. vivax", "P. ovale" or "P. malariae" usually does not require hospitalisation. Treatment of "P. vivax" requires both treatment of blood stages (with chloroquine or ACT) and clearance of liver forms with primaquine. Artemisinin-based combination therapy is as effective as chloroquine in treating uncomplicated "P. vivax" malaria. Treatment with tafenoquine prevents relapses after confirmed "P. vivax" malaria. However, for those treated with chloroquine for blood-stage infection, 14 days of primaquine treatment is required to prevent relapse. Shorter primaquine regimens may lead to higher relapse rates. There is no difference in effectiveness between primaquine given for seven or 14 days for prevention of relapse in vivax malaria; the shorter regimen may be useful for those with treatment-compliance problems.
To treat malaria during pregnancy, the WHO recommends the use of quinine plus clindamycin early in the pregnancy (1st trimester), and ACT in later stages (2nd and 3rd trimesters). There is limited safety data on the antimalarial drugs in pregnancy.
Cases of severe and complicated malaria are almost always caused by infection with "P. falciparum". The other species usually cause only febrile disease. Severe and complicated malaria cases are medical emergencies since mortality rates are high (10% to 50%).
Recommended treatment for severe malaria is the intravenous use of antimalarial drugs. For severe malaria, parenteral artesunate was superior to quinine in both children and adults. In another systematic review, artemisinin derivatives (artemether and arteether) were as efficacious as quinine in the treatment of cerebral malaria in children. Treatment of severe malaria involves supportive measures that are best done in a critical care unit, including the management of high fevers and any resulting seizures, and monitoring for poor breathing effort, low blood sugar, and low blood potassium. Artemisinin derivatives have the same or better efficacy than quinine in preventing deaths in severe or complicated malaria. A quinine loading dose helps to shorten the duration of fever and increases parasite clearance from the body. There is no difference in effectiveness when using intrarectal quinine compared to intravenous or intramuscular quinine in treating uncomplicated/complicated falciparum malaria. There is insufficient evidence for intramuscular arteether to treat severe malaria. The provision of rectal artesunate before transfer to hospital may reduce the rate of death for children with severe malaria.
Cerebral malaria is the form of severe and complicated malaria with the worst neurological symptoms. There is insufficient data on whether osmotic agents such as mannitol or urea are effective in treating cerebral malaria. Routine phenobarbitone in cerebral malaria is associated with fewer convulsions but possibly more deaths. There is no evidence that steroids would bring treatment benefits for cerebral malaria.
There is insufficient evidence to show that blood transfusion is useful in either reducing deaths for children with severe anaemia or in improving their haematocrit in one month. There is insufficient evidence that iron chelating agents such as deferoxamine and deferiprone improve outcomes of those with malaria falciparum infection.
Drug resistance poses a growing problem in 21st-century malaria treatment. In the 2000s, malaria with partial resistance to artemisinins emerged in Southeast Asia. Resistance is now common against all classes of antimalarial drugs apart from the artemisinins, so treatment of resistant strains has become increasingly dependent on this class of drugs. The cost of artemisinins limits their use in the developing world. Malaria strains found on the Cambodia–Thailand border are resistant to combination therapies that include artemisinins, and may therefore be untreatable. Exposure of the parasite population to artemisinin monotherapies in subtherapeutic doses for over 30 years, and the availability of substandard artemisinins, likely drove the selection of the resistant phenotype. Resistance to artemisinin has been detected in Cambodia, Myanmar, Thailand, and Vietnam, and there has been emerging resistance in Laos. Resistance to the combination of artemisinin and piperaquine was first detected in 2013 in Cambodia, and by 2019 had spread across Cambodia and into Laos, Thailand and Vietnam (with up to 80 percent of malaria parasites resistant in some regions).
There is insufficient evidence that unit-packaged antimalarial drugs prevent treatment failures of malaria infection. However, when unit packaging is supported by training of healthcare providers and by patient information, compliance improves among those receiving treatment.
When properly treated, people with malaria can usually expect a complete recovery. However, severe malaria can progress extremely rapidly and cause death within hours or days. In the most severe cases of the disease, fatality rates can reach 20%, even with intensive care and treatment. Over the longer term, developmental impairments have been documented in children who have suffered episodes of severe malaria. Chronic infection without severe disease can occur in an immune-deficiency syndrome associated with a decreased responsiveness to "Salmonella" bacteria and the Epstein–Barr virus.
During childhood, malaria causes anaemia during a period of rapid brain development, and cerebral malaria can cause direct brain damage. Some survivors of cerebral malaria have an increased risk of neurological and cognitive deficits, behavioural disorders, and epilepsy. Malaria prophylaxis was shown to improve cognitive function and school performance in clinical trials when compared to placebo groups.
The WHO estimates that in 2018 there were 228 million new cases of malaria resulting in 405,000 deaths. Children under 5 years old are the most affected, accounting for 67% (272,000) of malaria deaths worldwide in 2018. About 125 million pregnant women are at risk of infection each year; in Sub-Saharan Africa, maternal malaria is associated with up to 200,000 estimated infant deaths yearly. There are about 10,000 malaria cases per year in Western Europe, and 1300–1500 in the United States. The United States eradicated malaria in 1951. About 900 people died from the disease in Europe between 1993 and 2003. Both the global incidence of disease and resulting mortality have declined in recent years. According to the WHO and UNICEF, deaths attributable to malaria in 2015 were reduced by 60% from a 2000 estimate of 985,000, largely due to the widespread use of insecticide-treated nets and artemisinin-based combination therapies. In 2012, there were 207 million cases of malaria. That year, the disease is estimated to have killed between 473,000 and 789,000 people, many of whom were children in Africa. Efforts at decreasing the disease in Africa since 2000 have been partially effective, with rates of the disease dropping by an estimated forty percent on the continent.
Malaria is presently endemic in a broad band around the equator, in areas of the Americas, many parts of Asia, and much of Africa; 85–90% of malaria fatalities occur in Sub-Saharan Africa. An estimate for 2009 reported that countries with the highest death rate per 100,000 of population were Ivory Coast (86.15), Angola (56.93) and Burkina Faso (50.66). A 2010 estimate indicated the deadliest countries per population were Burkina Faso, Mozambique and Mali. The Malaria Atlas Project aims to map global levels of malaria, providing a way to determine the global spatial limits of the disease and to assess disease burden. This effort led to the publication of a map of "P. falciparum" endemicity in 2010 and an update in 2019. As of 2010, about 100 countries have endemic malaria. Every year, 125 million international travellers visit these countries, and more than 30,000 contract the disease.
The geographic distribution of malaria within large regions is complex, and malaria-afflicted and malaria-free areas are often found close to each other. Malaria is prevalent in tropical and subtropical regions because of rainfall, consistent high temperatures and high humidity, along with stagnant waters where mosquito larvae readily mature, providing them with the environment they need for continuous breeding. In drier areas, outbreaks of malaria have been predicted with reasonable accuracy by mapping rainfall. Malaria is more common in rural areas than in cities. For example, several cities in the Greater Mekong Subregion of Southeast Asia are essentially malaria-free, but the disease is prevalent in many rural regions, including along international borders and forest fringes. In contrast, malaria in Africa is present in both rural and urban areas, though the risk is lower in the larger cities.
Although the parasite responsible for "P. falciparum" malaria has been in existence for 50,000–100,000 years, the population size of the parasite did not increase until about 10,000 years ago, concurrently with advances in agriculture and the development of human settlements. Close relatives of the human malaria parasites remain common in chimpanzees. Some evidence suggests that the "P. falciparum" malaria may have originated in gorillas.
References to the unique periodic fevers of malaria are found throughout recorded history. Hippocrates described periodic fevers, labelling them tertian, quartan, subtertian and quotidian. The Roman Columella associated the disease with insects from swamps. Malaria may have contributed to the decline of the Roman Empire, and was so pervasive in Rome that it was known as the "Roman fever". Several regions in ancient Rome were considered at-risk for the disease because of the favourable conditions present for malaria vectors. This included areas such as southern Italy, the island of Sardinia, the Pontine Marshes, the lower regions of coastal Etruria and the city of Rome along the Tiber. Mosquitoes favoured the stagnant water in these places as breeding grounds. Irrigated gardens, swamp-like grounds, run-off from agriculture, and drainage problems from road construction led to the increase of standing water. In Mediaeval West Africa, the people of Djenné successfully identified the mosquito as the vector and cause of malaria.
The term malaria originates from the Mediaeval Italian "mala aria", meaning "bad air"; the disease was formerly called "ague" or "marsh fever" due to its association with swamps and marshland. The term first appeared in the English literature about 1829. Malaria was once common in most of Europe and North America, where it is no longer endemic, though imported cases do occur.
Scientific studies on malaria made their first significant advance in 1880, when Charles Louis Alphonse Laveran—a French army doctor working in the military hospital of Constantine in Algeria—observed parasites inside the red blood cells of infected people for the first time. He, therefore, proposed that malaria is caused by this organism, the first time a protist was identified as causing disease. For this and later discoveries, he was awarded the 1907 Nobel Prize for Physiology or Medicine. A year later, Carlos Finlay, a Cuban doctor treating people with yellow fever in Havana, provided strong evidence that mosquitoes were transmitting disease to and from humans. This work followed earlier suggestions by Josiah C. Nott, and work by Sir Patrick Manson, the "father of tropical medicine", on the transmission of filariasis.
In April 1894, a Scottish physician, Sir Ronald Ross, visited Sir Patrick Manson at his house on Queen Anne Street, London. This visit was the start of four years of collaboration and fervent research that culminated in 1897 when Ross, who was working in the Presidency General Hospital in Calcutta, proved the complete life-cycle of the malaria parasite in mosquitoes. He thus proved that the mosquito was the vector for malaria in humans by showing that certain mosquito species transmit malaria to birds. He isolated malaria parasites from the salivary glands of mosquitoes that had fed on infected birds. For this work, Ross received the 1902 Nobel Prize in Medicine. After resigning from the Indian Medical Service, Ross worked at the newly established Liverpool School of Tropical Medicine and directed malaria-control efforts in Egypt, Panama, Greece and Mauritius. The findings of Finlay and Ross were later confirmed by a medical board headed by Walter Reed in 1900. Its recommendations were implemented by William C. Gorgas in the health measures undertaken during construction of the Panama Canal. This public-health work saved the lives of thousands of workers and helped develop the methods used in future public-health campaigns against the disease.
In 1896, Amico Bignami discussed the role of mosquitoes in malaria. In 1898, Bignami, Giovanni Battista Grassi and Giuseppe Bastianelli succeeded in showing experimentally that malaria is transmitted to humans by infected mosquitoes; they presented their findings to the Accademia dei Lincei in November 1898.
The first effective treatment for malaria came from the bark of the cinchona tree, which contains quinine. This tree grows on the slopes of the Andes, mainly in Peru. The indigenous peoples of Peru made a tincture of cinchona to control fever. After its effectiveness against malaria was recognised, the Jesuits introduced the treatment to Europe around 1640; by 1677, it was included in the London Pharmacopoeia as an antimalarial treatment. It was not until 1820 that the active ingredient, quinine, was extracted from the bark, isolated and named by the French chemists Pierre Joseph Pelletier and Joseph Bienaimé Caventou.
Quinine was the predominant malarial medication until the 1920s when other medications began to appear. In the 1940s, chloroquine replaced quinine as the treatment of both uncomplicated and severe malaria until resistance supervened, first in Southeast Asia and South America in the 1950s and then globally in the 1980s.
"Artemisia annua" has been used by Chinese herbalists in traditional Chinese medicine for 2,000 years. In 1596, Li Shizhen recommended tea made from qinghao specifically to treat malaria symptoms in his "Compendium of Materia Medica". Artemisinins, discovered by Chinese scientist Tu Youyou and colleagues in the 1970s from the plant "Artemisia annua", became the recommended treatment for "P. falciparum" malaria, administered in severe cases in combination with other antimalarials. Tu says she was influenced by a traditional Chinese herbal medicine source, "The Handbook of Prescriptions for Emergency Treatments", written in 340 by Ge Hong. For her work on malaria, Tu Youyou received the 2015 Nobel Prize in Physiology or Medicine.
"Plasmodium vivax" was used between 1917 and the 1940s for malariotherapy—deliberate injection of malaria parasites to induce a fever to combat certain diseases such as tertiary syphilis. In 1927, the inventor of this technique, Julius Wagner-Jauregg, received the Nobel Prize in Physiology or Medicine for his discoveries. The technique was dangerous, killing about 15% of patients, so it is no longer in use.
The first pesticide used for indoor residual spraying was DDT. Although it was initially used exclusively to combat malaria, its use quickly spread to agriculture. In time, pest control, rather than disease control, came to dominate DDT use, and this large-scale agricultural use led to the evolution of pesticide-resistant mosquitoes in many regions. The DDT resistance shown by "Anopheles" mosquitoes can be compared to antibiotic resistance shown by bacteria. During the 1960s, awareness of the negative consequences of its indiscriminate use increased, ultimately leading to bans on agricultural applications of DDT in many countries in the 1970s. Before DDT, malaria was successfully eliminated or controlled in tropical areas like Brazil and Egypt by removing or poisoning the breeding grounds of the mosquitoes or the aquatic habitats of the larval stages, for example by applying the highly toxic arsenic compound Paris Green to places with standing water.
Malaria vaccines have been an elusive goal of research. The first promising studies demonstrating the potential for a malaria vaccine were performed in 1967 by immunising mice with live, radiation-attenuated sporozoites, which provided significant protection to the mice upon subsequent injection with normal, viable sporozoites. Since the 1970s, there has been a considerable effort to develop similar vaccination strategies for humans. The first vaccine, called RTS,S, was approved by European regulators in 2015.
Malaria is not just a disease commonly associated with poverty: some evidence suggests that it is also a cause of poverty and a major hindrance to economic development. Although tropical regions are most affected, malaria's furthest influence reaches into some temperate zones that have extreme seasonal changes. The disease has been associated with major negative economic effects on regions where it is widespread. During the late 19th and early 20th centuries, it was a major factor in the slow economic development of the American southern states.
A comparison of average per capita GDP in 1995, adjusted for parity of purchasing power, between countries with malaria and countries without malaria gives a fivefold difference (US$1,526 versus US$8,268). In the period 1965 to 1990, countries where malaria was common had an average per capita GDP that increased only 0.4% per year, compared to 2.4% per year in other countries.
Poverty can increase the risk of malaria since those in poverty do not have the financial capacities to prevent or treat the disease. In its entirety, the economic impact of malaria has been estimated to cost Africa US$12 billion every year. The economic impact includes costs of health care, working days lost due to sickness, days lost in education, decreased productivity due to brain damage from cerebral malaria, and loss of investment and tourism. The disease has a heavy burden in some countries, where it may be responsible for 30–50% of hospital admissions, up to 50% of outpatient visits, and up to 40% of public health spending.
Cerebral malaria is one of the leading causes of neurological disabilities in African children. Studies comparing cognitive functions before and after treatment for severe malarial illness continued to show significantly impaired school performance and cognitive abilities even after recovery. Consequently, severe and cerebral malaria have far-reaching socioeconomic consequences that extend beyond the immediate effects of the disease.
Sophisticated counterfeits have been found in several Asian countries such as Cambodia, China, Indonesia, Laos, Thailand, and Vietnam, and are an important cause of avoidable death in those countries. The WHO has said that studies indicate that up to 40% of artesunate-based malaria medications are counterfeit, especially in the Greater Mekong region, and has established a rapid alert system to report information about counterfeit drugs quickly to relevant authorities in participating countries. There is no reliable way for doctors or lay people to detect counterfeit drugs without help from a laboratory. Companies are attempting to combat the persistence of counterfeit drugs by using new technology to provide security from source to distribution.
Another clinical and public health concern is the proliferation of substandard antimalarial medicines resulting from inappropriate concentration of ingredients, contamination with other drugs or toxic impurities, poor quality ingredients, poor stability and inadequate packaging. A 2012 study demonstrated that roughly one-third of antimalarial medications in Southeast Asia and Sub-Saharan Africa failed chemical analysis or packaging analysis, or were falsified.
Throughout history, the contraction of malaria has played a prominent role in the fates of government rulers, nation-states, military personnel, and military actions. In 1910, Nobel Prize in Medicine-winner Ronald Ross (himself a malaria survivor), published a book titled "The Prevention of Malaria" that included a chapter titled "The Prevention of Malaria in War." The chapter's author, Colonel C. H. Melville, Professor of Hygiene at Royal Army Medical College in London, addressed the prominent role that malaria has historically played during wars: "The history of malaria in war might almost be taken to be the history of war itself, certainly the history of war in the Christian era. ... It is probably the case that many of the so-called camp fevers, and probably also a considerable proportion of the camp dysentery, of the wars of the sixteenth, seventeenth and eighteenth centuries were malarial in origin." In British-occupied India the cocktail gin and tonic may have come about as a way of taking quinine, known for its antimalarial properties.
Malaria was the most significant health hazard encountered by U.S. troops in the South Pacific during World War II, where about 500,000 men were infected. According to Joseph Patrick Byrne, "Sixty thousand American soldiers died of malaria during the African and South Pacific campaigns."
Significant financial investments have been made to procure existing and create new antimalarial agents. During World War I and World War II, inconsistent supplies of the natural antimalarial drugs cinchona bark and quinine prompted substantial funding into research and development of other drugs and vaccines. American military organisations conducting such research initiatives include the Navy Medical Research Center, the Walter Reed Army Institute of Research, and the U.S. Army Medical Research Institute of Infectious Diseases.
Additionally, initiatives have been founded such as Malaria Control in War Areas (MCWA), established in 1942, and its successor, the Communicable Disease Center (now known as the Centers for Disease Control and Prevention, or CDC) established in 1946. According to the CDC, MCWA "was established to control malaria around military training bases in the southern United States and its territories, where malaria was still problematic".
Several notable attempts are being made to eliminate the parasite from sections of the world, or to eradicate it worldwide. In 2006, the organisation Malaria No More set a public goal of eliminating malaria from Africa by 2015, and claimed it would dissolve if that goal was accomplished; as of 2018, the organisation was still functioning. In 2007, World Malaria Day was established by the 60th session of the World Health Assembly. One malaria vaccine is currently licensed for use, and several others are in clinical trials; these are intended to provide protection for children in endemic areas and to reduce the speed of transmission of the disease. The Global Fund to Fight AIDS, Tuberculosis and Malaria has distributed 230 million insecticide-treated nets intended to stop mosquito-borne transmission of malaria. The U.S.-based Clinton Foundation has worked to manage demand and stabilise prices in the artemisinin market. Other efforts, such as the Malaria Atlas Project, focus on analysing climate and weather information required to accurately predict the spread of malaria based on the availability of habitat for malaria-carrying mosquitoes. The Malaria Policy Advisory Committee (MPAC) of the World Health Organisation (WHO) was formed in 2012, "to provide strategic advice and technical input to WHO on all aspects of malaria control and elimination". In November 2013, WHO and the malaria vaccine funders group set a goal to develop vaccines designed to interrupt malaria transmission with the long-term goal of malaria eradication.
Malaria has been successfully eliminated or greatly reduced in certain areas. Malaria was once common in the United States and southern Europe, but vector control programs, in conjunction with the monitoring and treatment of infected humans, eliminated it from those regions. Several factors contributed, such as the draining of wetland breeding grounds for agriculture and other changes in water management practices, and advances in sanitation, including greater use of glass windows and screens in dwellings. Malaria was eliminated from most parts of the United States in the early 20th century by such methods, and the use of the pesticide DDT and other means eliminated it from the remaining pockets in the South in the 1950s as part of the National Malaria Eradication Program.
In 2015 the WHO targeted a 90% reduction in deaths from malaria by 2030 and Bill Gates said in 2016 that he thought global eradication would be possible by 2040.
In 2018, WHO announced that Paraguay was free of malaria, after an eradication effort that began in 1950.
As of 2019, the eradication process is ongoing, but with current approaches and tools it will be very hard to achieve a world free of malaria. Achieving eradication may require greater investment in research and in universal health care. Continuing surveillance will also be important to prevent the return of malaria in countries where the disease has been eliminated.
The Malaria Eradication Research Agenda (malERA) initiative was a consultative process to identify which areas of research and development (R&D) must be addressed for worldwide eradication of malaria.
A vaccine against malaria called RTS,S/AS01 (RTS,S) was approved by European regulators in 2015. As of 2019 it is undergoing pilot trials in 3 sub-Saharan African countries – Ghana, Kenya and Malawi – as part of the WHO's Malaria Vaccine Implementation Programme (MVIP).
Immunity (or, more accurately, tolerance) to "P. falciparum" malaria does occur naturally, but only in response to years of repeated infection. An individual can be protected from a "P. falciparum" infection if they receive about a thousand bites from mosquitoes that carry a version of the parasite rendered non-infective by a dose of X-ray irradiation. The highly polymorphic nature of many "P. falciparum" proteins results in significant challenges to vaccine design. Vaccine candidates that target antigens on gametes, zygotes, or ookinetes in the mosquito midgut aim to block the transmission of malaria. These transmission-blocking vaccines induce antibodies in the human blood; when a mosquito takes a blood meal from a protected individual, these antibodies prevent the parasite from completing its development in the mosquito. Other vaccine candidates, targeting the blood-stage of the parasite's life cycle, have been inadequate on their own. For example, SPf66 was tested extensively in areas where the disease was common in the 1990s, but trials showed it to be insufficiently effective.
Malaria parasites contain apicoplasts, organelles usually found in plants, complete with their own genomes. These apicoplasts are thought to have originated through the endosymbiosis of algae and play a crucial role in various aspects of parasite metabolism, such as fatty acid biosynthesis. Over 400 proteins have been found to be produced by apicoplasts and these are now being investigated as possible targets for novel antimalarial drugs.
With the onset of drug-resistant "Plasmodium" parasites, new strategies are being developed to combat the widespread disease. One such approach lies in the introduction of synthetic pyridoxal-amino acid adducts, which are taken up by the parasite and ultimately interfere with its ability to create several essential B vitamins. Antimalarial drugs using synthetic metal-based complexes are attracting research interest.
A non-chemical vector control strategy involves genetic manipulation of malaria mosquitoes. Advances in genetic engineering technologies make it possible to introduce foreign DNA into the mosquito genome and either decrease the lifespan of the mosquito, or make it more resistant to the malaria parasite. Sterile insect technique is a genetic control method whereby large numbers of sterile male mosquitoes are reared and released. Mating with wild females reduces the wild population in the subsequent generation; repeated releases eventually eliminate the target population.
Genomics is central to malaria research. With the sequencing of "P. falciparum", one of its vectors "Anopheles gambiae", and the human genome, the genetics of all three organisms in the malaria lifecycle can be studied. Another new application of genetic technology is the ability to produce genetically modified mosquitoes that do not transmit malaria, potentially allowing biological control of malaria transmission.
In one study, a genetically-modified strain of "Anopheles stephensi" was created that no longer supported malaria transmission, and this resistance was passed down to mosquito offspring.
Gene drive is a technique for changing wild populations, for instance to combat or eliminate insects so they cannot transmit diseases (in particular mosquitoes in the cases of malaria, zika, dengue and yellow fever).
Nearly 200 parasitic "Plasmodium" species have been identified that infect birds, reptiles, and mammals, and about 30 species naturally infect non-human primates. Some malaria parasites that affect non-human primates (NHP) serve as model organisms for human malarial parasites, such as "P. coatneyi" (a model for "P. falciparum") and "P. cynomolgi" ("P. vivax"). Diagnostic techniques used to detect parasites in NHP are similar to those employed for humans. Malaria parasites that infect rodents are widely used as models in research, such as "P. berghei". Avian malaria primarily affects species of the order Passeriformes, and poses a substantial threat to birds of Hawaii, the Galapagos, and other archipelagoes. The parasite "P. relictum" is known to play a role in limiting the distribution and abundance of endemic Hawaiian birds. Global warming is expected to increase the prevalence and global distribution of avian malaria, as elevated temperatures provide optimal conditions for parasite reproduction.
Lunar phase
The lunar phase or Moon phase is the shape of the directly sunlit portion of the Moon as viewed from Earth. The lunar phases gradually change over the period of a synodic month (about 29.53 days), as the orbital positions of the Moon around Earth and of Earth around the Sun shift.
The Moon's rotation is tidally locked by Earth's gravity; therefore, most of the same lunar side always faces Earth. This near side is variously sunlit, depending on the position of the Moon in its orbit. Thus, the sunlit portion of this face can vary from 0% (at new moon) to 100% (at full moon). The lunar terminator is the boundary between the illuminated and darkened hemispheres.
Each of the four "intermediate" lunar phases (see below) is around 7.4 days, but this varies slightly due to the elliptical shape of the Moon's orbit. Aside from some craters near the lunar poles, such as Shoemaker, all parts of the Moon see around 14.77 days of daylight, followed by 14.77 days of "night". (The side of the Moon facing away from Earth is sometimes called the "dark side of the Moon", although that is a misnomer.)
There are four "principal" lunar phases: new moon, first quarter, full moon, and last quarter (also known as third or final quarter), when the Moon's ecliptic longitude is at an angle to the Sun (as viewed from Earth) of 0°, 90°, 180°, and 270°, respectively. Each of these phases appears at slightly different times at different locations on Earth. During the intervals between principal phases are "intermediate" phases, during which the Moon's apparent shape is either crescent or gibbous. The intermediate phases last one-quarter of a synodic month, or 7.38 days, on average. The descriptor "waxing" is used for an intermediate phase when the Moon's apparent shape is thickening, from new to full moon, and "waning" when the shape is thinning. The longest duration from full moon to new moon (or from new moon to full moon) is about 15 days and 14.5 hours, while the shortest is only about 13 days and 22.5 hours.
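As a rough illustration of this scheme, a small helper function (hypothetical, not from any standard library) can map the Moon's ecliptic elongation from the Sun to a phase name, treating angles within a small tolerance of 0°, 90°, 180° and 270° as the principal phases:

```python
def phase_name(elongation_deg, tolerance=1.0):
    """Name the lunar phase for a Sun-Moon elongation angle in degrees."""
    e = elongation_deg % 360.0
    # Principal phases occur at these elongations (as viewed from Earth).
    principal = {0.0: "new moon", 90.0: "first quarter",
                 180.0: "full moon", 270.0: "last quarter"}
    for angle, name in principal.items():
        # Treat angles within `tolerance` degrees (including wrap-around
        # near 360) as the corresponding principal phase.
        if abs(e - angle) <= tolerance or abs(e - angle - 360.0) <= tolerance:
            return name
    # Otherwise the Moon is in one of the four intermediate phases.
    if e < 90.0:
        return "waxing crescent"
    elif e < 180.0:
        return "waxing gibbous"
    elif e < 270.0:
        return "waning gibbous"
    return "waning crescent"
```

The one-degree tolerance is an arbitrary choice for illustration; in practice the principal phases are instants in time rather than ranges of angle.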
Non-Western cultures may use a different number of lunar phases; for example, traditional Hawaiian culture has a total of 30 phases (one per day).
When the Sun and Moon are aligned on the same side of the Earth, the Moon is "new", and the side of the Moon facing Earth is not illuminated by the Sun. As the Moon "waxes" (the amount of illuminated surface as seen from Earth is increasing), the lunar phases progress through new moon, crescent moon, first-quarter moon, gibbous moon, and full moon. The Moon is then said to "wane" as it passes through the gibbous moon, third-quarter moon, crescent moon, and back to new moon. The terms "old moon" and "new moon" are not interchangeable. The "old moon" is a waning sliver (which eventually becomes undetectable to the naked eye) until the moment it aligns with the Sun and begins to wax, at which point it becomes new again. "Half moon" is often used to mean the first- and third-quarter moons, while the term "quarter" refers to the extent of the Moon's cycle around the Earth, not its shape.
When an illuminated hemisphere is viewed from a certain angle, the portion of the illuminated area that is visible will have a two-dimensional shape as defined by the intersection of an ellipse and circle (in which the ellipse's major axis coincides with the circle's diameter). If the half-ellipse is convex with respect to the half-circle, then the shape will be gibbous (bulging outwards), whereas if the half-ellipse is concave with respect to the half-circle, then the shape will be a crescent. When a crescent moon occurs, the phenomenon of earthshine may be apparent, where the night side of the Moon dimly reflects indirect sunlight reflected from Earth.
In the Northern Hemisphere, if the left (east) side of the Moon is dark, then the bright part is thickening, and the Moon is described as waxing (shifting toward full moon). If the right (west) side of the Moon is dark, then the bright part is thinning, and the Moon is described as waning (past full and shifting toward new moon). Assuming that the viewer is in the Northern Hemisphere, the right side of the Moon is the part that is always waxing. (That is, if the right side is dark, the Moon is becoming darker; if the right side is lit, the Moon is getting brighter.)
In the Southern Hemisphere, the Moon is observed from a perspective inverted, or rotated 180°, to that of the Northern and to all of the images in this article, so that the opposite sides appear to wax or wane.
Closer to the Equator, the lunar terminator will appear horizontal during the morning and evening. Since the above descriptions of the lunar phases only apply at middle or high latitudes, observers moving towards the tropics from northern or southern latitudes will see the Moon rotated anti-clockwise or clockwise with respect to the images in this article.
The lunar crescent can open upward or downward, with the "horns" of the crescent pointing up or down, respectively. When the Sun appears above the Moon in the sky, the crescent opens downward; when the Moon is above the Sun, the crescent opens upward. The crescent Moon is most clearly and brightly visible when the Sun is below the horizon, which implies that the Moon must be above the Sun, and the crescent must open upward. This is therefore the orientation in which the crescent Moon is most often seen from the tropics. The waxing and waning crescents look very similar. The waxing crescent appears in the western sky in the evening, and the waning crescent in the eastern sky in the morning.
When the Moon as seen from Earth is a thin crescent, Earth as viewed from the Moon is almost fully lit by the Sun. Often, the dark side of the Moon is dimly illuminated by indirect sunlight reflected from Earth, but is bright enough to be easily visible from Earth. This phenomenon is called earthshine and sometimes picturesquely described as "the old moon in the new moon's arms" or "the new moon in the old moon's arms".
The Gregorian calendar month, which is one-twelfth of a tropical year, is about 30.44 days, while the cycle of lunar phases (the Moon's synodic period) repeats every 29.53 days on average. Therefore, the timing of the lunar phases shifts by an average of almost one day for each successive month. (A lunar year lasts about 354 days.)
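The figures above can be checked with a few lines of arithmetic, using the year and month lengths quoted in the text:

```python
# Average calendar month vs. synodic month, per the figures in the text.
tropical_year = 365.2422        # days
synodic_month = 29.53           # days (average cycle of lunar phases)

gregorian_month = tropical_year / 12
shift_per_month = gregorian_month - synodic_month
lunar_year = 12 * synodic_month

print(round(gregorian_month, 2))   # ~30.44 days per average calendar month
print(round(shift_per_month, 2))   # ~0.91 days of phase drift per month
print(round(lunar_year))           # ~354 days in a lunar year
```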
Photographing the Moon's phase every day for a month (starting in the evening after sunset, and repeating roughly 24 hours and 50 minutes later, and ending in the morning before sunrise) and arranging the series of photos on a calendar would create a composite image like the example calendar (May 8 – June 6, 2005) shown on the left. May 20 is blank because a picture would be taken before midnight on May 19 and the next after midnight on May 21.
Similarly, on a calendar listing moonrise or moonset times, some days will appear to be skipped. When moonrise precedes midnight one night, the next moonrise will follow midnight on the next night (so too with moonset). The "skipped day" is just a feature of the Moon's eastward movement in relation to the Sun, which at most latitudes, causes the Moon to rise later each day. The Moon follows a predictable orbit every month.
Each of the four intermediate phases lasts approximately seven days (7.38 days on average), but varies slightly due to lunar apogee and perigee.
The number of days counted from the time of the new moon is the Moon's "age". Each complete cycle of phases is called a "lunation".
The approximate age of the Moon, and hence the approximate phase, can be calculated for any date by calculating the number of days since a known new moon (such as January 1, 1900 or August 11, 1999) and reducing this modulo 29.530588853 (the length of a synodic month). The difference between two dates can be calculated by subtracting the Julian day number of one from that of the other, or there are simpler formulae giving (for instance) the number of days since December 31, 1899. However, this calculation assumes a perfectly circular orbit and makes no allowance for the time of day at which the new moon occurred and therefore may be incorrect by several hours. (It also becomes less accurate the larger the difference between the required date and the reference date). It is accurate enough to use in a novelty clock application showing lunar phase, but specialist usage taking account of lunar apogee and perigee requires a more elaborate calculation.
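The calculation described above can be sketched in Python. The reference epoch and the mean synodic month length are the ones given in the text; as the text notes, the result ignores the time of day of the reference new moon and the ellipticity of the orbit, so it can be off by several hours:

```python
from datetime import datetime

SYNODIC_MONTH = 29.530588853               # mean synodic month, in days
REFERENCE_NEW_MOON = datetime(1900, 1, 1)  # a known new moon cited in the text

def moon_age(date):
    """Approximate the Moon's age in days since the last new moon."""
    days = (date - REFERENCE_NEW_MOON).total_seconds() / 86400.0
    # Reduce modulo the synodic month; Python's % keeps the result
    # non-negative even for dates before the reference epoch.
    return days % SYNODIC_MONTH
```

This is exactly the "novelty clock" level of accuracy the text describes; taking lunar apogee and perigee into account requires a more elaborate calculation.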
The Earth subtends an angle of about two degrees when seen from the Moon. This means that an observer on Earth who sees the Moon when it is close to the eastern horizon sees it from an angle that is about 2 degrees different from the line of sight of an observer who sees the Moon on the western horizon. The Moon moves about 12 degrees around its orbit per day, so, if these observers were stationary, they would see the phases of the Moon at times that differ by about one-sixth of a day, or 4 hours. But in reality, the observers are on the surface of the rotating Earth, so someone who sees the Moon on the eastern horizon at one moment sees it on the western horizon about 12 hours later. This adds an oscillation to the apparent progression of the lunar phases. They appear to occur more slowly when the Moon is high in the sky than when it is below the horizon. The Moon appears to move jerkily, and the phases do the same. The amplitude of this oscillation is never more than about four hours, which is a small fraction of a month. It does not have any obvious effect on the appearance of the Moon. However, it does affect accurate calculations of the times of lunar phases.
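The one-sixth-of-a-day figure follows directly from the two angles quoted in the paragraph:

```python
# Checking the paragraph's arithmetic: Earth subtends about 2 degrees as
# seen from the Moon, and the Moon moves about 12 degrees along its
# orbit per day, so the line-of-sight difference corresponds to 2/12 of
# a day of phase progression.
earth_angular_size = 2.0     # degrees, as seen from the Moon
moon_orbital_motion = 12.0   # degrees of orbital motion per day

offset_days = earth_angular_size / moon_orbital_motion
print(offset_days)           # one-sixth of a day
print(offset_days * 24)      # about 4 hours
```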
It might be expected that once every month, when the Moon passes between Earth and the Sun during a new moon, its shadow would fall on Earth causing a solar eclipse, but this does not happen every month. Nor is it true that during every full moon, the Earth's shadow falls on the Moon, causing a lunar eclipse. Solar and lunar eclipses are not observed "every" month because the plane of the Moon's orbit around the Earth is tilted by about 5° with respect to the plane of Earth's orbit around the Sun (the plane of the ecliptic). Thus, when new and full moons occur, the Moon usually lies to the north or south of a direct line through the Earth and Sun. Although an eclipse can only occur when the Moon is either new (solar) or full (lunar), it must also be positioned very near the intersection of Earth's orbital plane about the Sun and the Moon's orbital plane about the Earth (that is, at one of its nodes). This happens about twice per year, and so there are between four and seven eclipses in a calendar year. Most of these eclipses are partial; total eclipses of the Moon or Sun are less frequent.
Metonic cycle
The Metonic cycle or enneadecaeteris (from Greek "enneakaidekaeteris", "nineteen years") is a period of approximately 19 years after which the phases of the moon recur on the same day of the year. The recurrence is not perfect, and by precise observation the Metonic cycle is defined as 235 synodic lunar months, a period which is just 1h27m33s longer than 19 tropical years. Using these integer numbers facilitates the construction of a luni-solar calendar.
A tropical year is longer than 12 lunar months and shorter than 13 of them. The arithmetical identity 12×12 + 7×13 = 235 shows that a combination of 12 "short" years of 12 months and 7 "long" years of 13 months spans exactly 235 months over 19 = 12 + 7 solar years.
Traditionally, for the Babylonian and Hebrew lunisolar calendars, the years 3, 6, 8, 11, 14, 17, and 19 are the long (13-month) years of the Metonic cycle. This cycle forms the basis of the Greek and Hebrew calendars, and is used for the computation of the date of Easter each year.
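The traditional distribution of long years, and the month count it implies, can be checked with a short sketch (the helper name is ours):

```python
# Long (13-month) years of the Metonic cycle, per the Babylonian/Hebrew convention.
LONG_YEARS = {3, 6, 8, 11, 14, 17, 19}

def months_in_year(year_of_cycle: int) -> int:
    """Number of lunar months in a given year (1..19) of the Metonic cycle."""
    return 13 if year_of_cycle in LONG_YEARS else 12

total = sum(months_in_year(y) for y in range(1, 20))
assert total == 12 * 12 + 7 * 13 == 235   # 19 years span exactly 235 lunar months
```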
The Babylonians applied the 19-year cycle from the late sixth century BC. As they measured the Moon's motion against the stars, the 235:19 relationship may originally have referred to sidereal years, instead of the tropical years used in various later calendars.
According to Livy, the king of Rome Numa Pompilius (753–673 BC) inserted intercalary months in such a way that "in the twentieth year the days should fall in with the same position of the sun from which they had started." As "the twentieth year" takes place nineteen years after "the first year", this seems to indicate that the Metonic cycle was applied to Numa's calendar.
Diodorus Siculus reports that Apollo is said to have visited the Hyperboreans once every 19 years.
The Metonic cycle has been implemented in the Antikythera mechanism, which offers unexpected evidence for the popularity of the calendar based on it.
Meton of Athens approximated the cycle to a whole number (6,940) of days, obtained by 125 long months of 30 days and 110 short months of 29 days. During the next century, Callippus developed the Callippic cycle of four 19-year periods for a 76-year cycle with a mean year of exactly 365.25 days.
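Meton's whole-number bookkeeping, and Callippus's refinement, can be verified directly (a sketch; the variable names are ours):

```python
# Meton's approximation: 125 "full" months of 30 days and 110 "hollow"
# months of 29 days across one 19-year cycle.
full, hollow = 125, 110
days = full * 30 + hollow * 29
assert full + hollow == 235   # all 235 months of the cycle
assert days == 6940           # Meton's whole number of days for 19 years

# Callippus: four 19-year periods minus one day -> mean year of exactly 365.25 days
callippic_days = 4 * days - 1
assert callippic_days / 76 == 365.25
```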
The (19-year) Metonic cycle is a lunisolar cycle, as is the (76-year) Callippic cycle. An important application of the Metonic cycle in the Julian calendar is the 19-year lunar cycle, when provided with a Metonic structure. Around AD 260 the Alexandrian computist Anatolius, who became bishop of Laodicea in AD 268, was the first to construct a version of this efficient computistical instrument for determining the date of Easter Sunday. However, it was a later, somewhat different version of the Metonic 19-year lunar cycle which, as the basic structure of Dionysius Exiguus' and Bede's Easter tables, ultimately prevailed throughout Christendom for a long time, at least until 1582, when the Julian calendar was replaced with the Gregorian calendar.
The Runic calendar is a perpetual calendar based on the 19-year-long Metonic cycle. Also known as a Rune staff or Runic Almanac, it appears to have been a medieval Swedish invention. This calendar does not rely on knowledge of the duration of the tropical year or of the occurrence of leap years. It is set at the beginning of each year by observing the first full moon after the winter solstice. The oldest one known, and the only one from the Middle Ages, is the Nyköping staff, which is believed to date from the 13th century.
The Bahá'í calendar, established during the middle of the 19th century, is also based on cycles of 19 years.
In China, the traditional Chinese calendar used the Metonic cycle from the earliest known ancient Chinese calendars. The cycle remained in use until the 5th century, when it was replaced by a more accurate one.
The importance of the tropical year for agriculture came to be realized much later than the adoption of lunar months for timekeeping. However, it was recognized that the two cannot be easily coordinated over a short time span, so longer intervals were considered, and the Metonic cycle was discovered as a rather good, though not perfect, scheme. The currently accepted values are:
The difference is 0.086 days per cycle, which means that after a dozen cycles there will be a full day of discrepancy between the astronomical data and the calculation. The error amounts to one day every 219 years, or 12.4 parts per million. However, the Metonic cycle turned out to be very close to other periods:
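The stated error figures follow directly from the modern month and year lengths (a sketch using rounded accepted values):

```python
SYNODIC_MONTH = 29.530589    # days (mean synodic month)
TROPICAL_YEAR = 365.242189   # days (mean tropical year)

lunar_cycle = 235 * SYNODIC_MONTH                 # 235 synodic months
solar_cycle = 19 * TROPICAL_YEAR                  # 19 tropical years
drift_per_cycle = lunar_cycle - solar_cycle       # about 0.087 days per 19 years
years_per_day_of_error = 19 / drift_per_cycle     # about one day every ~219 years
```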
Being close (within somewhat more than half a day) to 255 draconic months, the Metonic cycle is also an eclipse cycle, which lasts only for about 4 or 5 recurrences of eclipses. The Octon is 1/5 of a Metonic cycle (47 synodic months, 3.8 years), and it recurs for about 20 to 25 cycles.
This cycle seems to be a coincidence. The periods of the Moon's orbit around the Earth and the Earth's orbit around the Sun are believed to be independent, and not to have any known physical resonance. An example of a non-coincidental cycle is the orbit of Mercury, with its 3:2 spin-orbit resonance.
A lunar year of 12 synodic months is about 354 days, approximately 11 days short of the "365-day" solar year. Therefore, for a lunisolar calendar, every 2 to 3 years there is a difference of more than a full lunar month between the lunar and solar years, and an extra ("embolismic") month needs to be inserted (intercalation). The Athenians initially seem not to have had a regular means of intercalating a 13th month; instead, the question of when to add a month was decided by an official. Meton's discovery made it possible to propose a regular intercalation scheme. The Babylonians seem to have introduced this scheme around 500 BC, thus well before Meton.
The Metonic cycle is related to two less accurate subcycles:
By combining appropriate numbers of 11-year and 19-year periods, it is possible to generate ever more accurate cycles. For example, simple arithmetic shows that:
This gives an error of only about half an hour in 687 years (2.5 seconds a year), although this is subject to secular variation in the length of the tropical year and the lunation.
At the time of Meton, axial precession had not yet been discovered, and he could not distinguish between sidereal years (currently: 365.256363 days) and tropical years (currently: 365.242190 days). Most calendars, like the commonly used Gregorian calendar, are based on the tropical year and maintain the seasons at the same calendar times each year.
Marcello Malpighi
Marcello Malpighi (10 March 1628 – 29 November 1694) was an Italian biologist and physician, who is referred to as the founder of microscopical anatomy and histology, and as the father of physiology and embryology. Malpighi's name is borne by several physiological features related to the biological excretory system, such as the Malpighian corpuscles and Malpighian pyramids of the kidneys and the Malpighian tubule system of insects. The splenic lymphoid nodules are often called the "Malpighian bodies of the spleen" or Malpighian corpuscles. The botanical family Malpighiaceae is also named after him. He was the first person to see capillaries in animals, and he discovered the link between arteries and veins that had eluded William Harvey. Malpighi was one of the earliest people to observe red blood cells under a microscope, after Jan Swammerdam. His treatise "De polypo cordis" (1666) was important for understanding blood composition, as well as how blood clots. In it, Malpighi described how the form of a blood clot differed between the right and left sides of the heart.
The use of the microscope enabled Malpighi to discover that invertebrates do not use lungs to breathe, but small holes in their skin called tracheae. Malpighi also studied the anatomy of the brain and concluded this organ is a gland. In terms of modern endocrinology, this deduction is correct because the hypothalamus of the brain has long been recognized for its hormone-secreting capacity.
Because Malpighi had a wide knowledge of both plants and animals, he made contributions to the scientific study of both. The Royal Society of London published two volumes of his botanical and zoological works in 1675 and 1679. Another edition followed in 1687, and a supplementary volume in 1697. In his autobiography, Malpighi speaks of his "Anatome Plantarum", decorated with the engravings of Robert White, as "the most elegant format in the whole literate world."
His study of plants led him to conclude that plants had tubules similar to those he saw in insects like the silk worm (using his microscope, he probably saw the stomata, through which plants exchange carbon dioxide with oxygen). Malpighi observed that when a ring-like portion of bark was removed on a trunk a swelling occurred in the tissues above the ring, and he correctly interpreted this as growth stimulated by food coming down from the leaves, and being blocked above the ring.
Malpighi was born on 10 March 1628 at Crevalcore near Bologna, Italy. The son of well-to-do parents, Malpighi was educated in his native city, entering the University of Bologna at the age of 17. In a posthumous work delivered and dedicated to the Royal Society in London in 1697, Malpighi says he completed his grammatical studies in 1645, at which point he began to apply himself to the study of peripatetic philosophy. He completed these studies about 1649, when, at the persuasion of his mother, Frances Natalis, he began to study physics. When his parents and grandmother became ill, he returned to his family home near Bologna to care for them.
Malpighi studied Aristotelian philosophy at the University of Bologna while he was very young.
Despite opposition from the university authorities because he was non-Bolognese by birth, in 1653 he was granted doctorates in both medicine and philosophy. He later graduated as a medical doctor at the age of 25. Subsequently, he was appointed as a teacher, whereupon he immediately dedicated himself to further study in anatomy and medicine. For most of his career, Malpighi combined an intense interest in scientific research with a fond love of teaching. He was invited to correspond with the Royal Society in 1667 by Henry Oldenburg, and became a fellow of the society the next year.
In 1656, Ferdinand II of Tuscany invited him to the professorship of theoretical medicine at the University of Pisa. There Malpighi began his lifelong friendship with Giovanni Borelli, mathematician and naturalist, who was a prominent supporter of the Accademia del Cimento, one of the first scientific societies. Malpighi questioned the prevailing medical teachings at Pisa, tried experiments on colour changes in blood, and attempted to recast anatomical, physiological, and medical problems of the day. Family responsibilities and poor health prompted Malpighi's return in 1659 to the University of Bologna, where he continued to teach and do research with his microscopes. In 1661 he identified and described the pulmonary and capillary network connecting small arteries with small veins. Malpighi's views evoked increasing controversy and dissent, mainly from envy and lack of understanding on the part of his colleagues.
In 1653, his father, mother, and grandmother being dead, Malpighi left his family villa and returned to the University of Bologna to study anatomy. In 1656, he was made a reader at Bologna, and then a professor of physics at Pisa, where he began to abandon the disputative method of learning and apply himself to a more experimental method of research. Based on this research, he wrote some "Dialogues against the Peripatetics and Galenists" (those who followed the precepts of Galen), which were destroyed when his house burned down. Weary of philosophical disputation, in 1660, Malpighi returned to Bologna and dedicated himself to the study of anatomy. He subsequently discovered a new structure of the lungs which led him to several disputes with the learned medical men of the times. In 1662, he was made a professor of Physics at the Academy of Messina.
Retiring from university life to his villa in the country near Bologna in 1663, he worked as a physician while continuing to conduct experiments on the plants and insects he found on his estate. There he made discoveries of the structure of plants which he published in his "Observations". At the end of 1666, Malpighi was invited to return to the public academy at Messina, which he did in 1667. Although he accepted temporary chairs at the universities of Pisa and Messina, throughout his life he continuously returned to Bologna to practice medicine, a city that repaid him by erecting a monument in his memory after his death.
In 1668, Malpighi received a letter from Mr. Oldenburg of the Royal Society in London, inviting him to correspond. Malpighi wrote his history of the silkworm in 1668, and sent the manuscript to Mr. Oldenburg. As a result, Malpighi was made a member of the Royal Society in 1669. In 1671, Malpighi's "Anatomy of Plants" was published in London by the Royal Society, and he simultaneously wrote to Mr. Oldenburg, telling him of his recent discoveries regarding the lungs, fibers of the spleen and testicles, and several other discoveries involving the brain and sensory organs. He also shared more information regarding his research on plants. At that time, he related his disputes with some younger physicians who were strenuous supporters of the Galenic principles and opposed to all new discoveries. Following many other discoveries and publications, in 1691, Malpighi was invited to Rome by Pope Innocent XII to become papal physician and professor of medicine at the Papal Medical School. He remained in Rome until his death.
Marcello Malpighi is buried in the church of the Santi Gregorio e Siro, in Bologna, where nowadays can be seen a marble monument to the scientist with an inscription in Latin remembering – among other things – his "SUMMUM INGENIUM / INTEGERRIMAM VITAM / FORTEM STRENUAMQUE MENTEM / AUDACEM SALUTARIS ARTIS AMOREM" (great genius, honest life, strong and tough mind, daring love for the medical art).
Around the age of 38, and with a remarkable academic career behind him, Malpighi decided to dedicate his free time to anatomical studies. Although he conducted some of his studies using vivisection and others through the dissection of corpses, his most illustrative efforts appear to have been based on the use of the microscope. Because of this work, many microscopic anatomical structures are named after Malpighi, including a skin layer (Malpighi layer) and two different Malpighian corpuscles in the kidneys and the spleen, as well as the Malpighian tubules in the excretory system of insects.
Although a Dutch spectacle maker created the compound lens and inserted it in a microscope around the turn of the 17th century, and Galileo had applied the principle of the compound lens to the making of his microscope patented in 1609, the instrument's possibilities remained unexploited for half a century, until Robert Hooke improved it. Following this, Marcello Malpighi, Hooke, and two other early investigators associated with the Royal Society, Nehemiah Grew and Antonie van Leeuwenhoek, were fortunate to have a virtually untried tool in their hands as they began their investigations.
In 1661, Malpighi observed capillary structures in frog lungs. Extrapolating to humans, he offered an explanation for how air and blood mix in the lungs. Malpighi also used the microscope for his studies of the skin, kidneys, and liver. For example, after he dissected a black male, Malpighi made some groundbreaking headway into the discovery of the origin of black skin. He found that the black pigment was associated with a layer of mucus just beneath the skin.
A talented sketch artist, Malpighi seems to have been the first author to have made detailed drawings of individual organs of flowers. In his "Anatome plantarum" is a longitudinal section of a flower of "Nigella" (his Melanthi, literally honey-flower) with details of the nectariferous organs. He adds that it is strange that nature has produced on the leaves of the flower shell-like organs in which honey is produced.
Malpighi had success in tracing the ontogeny of plant organs, and the serial development of the shoot owing to his instinct shaped in the sphere of animal embryology. He specialized in seedling development, and in 1679, he published a volume containing a series of exquisitely drawn and engraved images of the stages of development of Leguminosae (beans) and Cucurbitaceae (squash, melons). Later, he published material depicting the development of the date palm. The great Swedish botanist Linnaeus named the genus "Malpighia" in honor of Malpighi's work with plants; "Malpighia" is the type genus for the Malpighiaceae, a family of tropical and subtropical flowering plants.
Because Malpighi was concerned with teratology (the scientific study of the visible conditions caused by the interruption or alteration of normal development) he expressed grave misgivings about the view of his contemporaries that the galls of trees and herbs gave birth to insects. He conjectured (correctly) that the creatures in question arose from eggs previously laid in the plant tissue.
Malpighi's investigations of the lifecycle of plants and animals led him into the topic of reproduction. He created detailed drawings of his studies of chick embryo development, seed development in plants (such as the lemon tree), and the transformation of caterpillars into insects. His discoveries helped to illuminate philosophical arguments surrounding the topics of "emboîtment", pre-existence, preformation, epigenesis, and metamorphosis.
In 1691 Pope Innocent XII invited him to Rome as papal physician. He taught medicine in the Papal Medical School and wrote a long treatise about his studies which he donated to the Royal Society of London.
Marcello Malpighi died of apoplexy (an old-fashioned term for a stroke or stroke-like symptoms) in Rome on 29 November 1694, at the age of 66. In accordance with his wishes, an autopsy was performed. The Royal Society published his studies in 1696. Asteroid 11121 Malpighi is named in his honor.
Momentum
In Newtonian mechanics, linear momentum, translational momentum, or simply momentum (pl. momenta) is the product of the mass and velocity of an object. It is a vector quantity, possessing a magnitude and a direction. If m is an object's mass and v is its velocity (also a vector quantity), then the object's momentum p is:
p = mv
In SI units, momentum is measured in kilogram meters per second (kg⋅m/s).
Newton's second law of motion states that the rate of change of a body's momentum is equal to the net force acting on it. Momentum depends on the frame of reference, but in any inertial frame it is a "conserved" quantity, meaning that if a closed system is not affected by external forces, its total linear momentum does not change. Momentum is also conserved in special relativity (with a modified formula) and, in a modified form, in electrodynamics, quantum mechanics, quantum field theory, and general relativity. It is an expression of one of the fundamental symmetries of space and time: translational symmetry.
Advanced formulations of classical mechanics, Lagrangian and Hamiltonian mechanics, allow one to choose coordinate systems that incorporate symmetries and constraints. In these systems the conserved quantity is "generalized momentum", and in general this is different from the "kinetic" momentum defined above. The concept of generalized momentum is carried over into quantum mechanics, where it becomes an operator on a wave function. The momentum and position operators are related by the Heisenberg uncertainty principle.
In continuous systems such as electromagnetic fields, fluid dynamics and deformable bodies, a momentum density can be defined, and a continuum version of the conservation of momentum leads to equations such as the Navier–Stokes equations for fluids or the Cauchy momentum equation for deformable solids or fluids.
Momentum is a vector quantity: it has both magnitude and direction. Since momentum has a direction, it can be used to predict the resulting direction and speed of motion of objects after they collide. Below, the basic properties of momentum are described in one dimension. The vector equations are almost identical to the scalar equations (see multiple dimensions).
The momentum of a particle is conventionally represented by the letter p. It is the product of two quantities, the particle's mass (represented by the letter m) and its velocity (v): p = mv.
The unit of momentum is the product of the units of mass and velocity. In SI units, if the mass is in kilograms and the velocity is in meters per second then the momentum is in kilogram meters per second (kg⋅m/s). In cgs units, if the mass is in grams and the velocity in centimeters per second, then the momentum is in gram centimeters per second (g⋅cm/s).
Being a vector, momentum has magnitude and direction. For example, a 1 kg model airplane, traveling due north at 1 m/s in straight and level flight, has a momentum of 1 kg⋅m/s due north measured with reference to the ground.
The momentum of a system of particles is the vector sum of their momenta. If two particles have respective masses m1 and m2, and velocities v1 and v2, the total momentum is
p = m1v1 + m2v2.
The momenta of more than two particles can be added more generally as
p = m1v1 + m2v2 + m3v3 + … = Σi mivi.
A system of particles has a center of mass, a point determined by the weighted sum of their positions:
rcm = (m1r1 + m2r2 + …)/(m1 + m2 + …).
If one or more of the particles is moving, the center of mass of the system will generally be moving as well (unless the system is in pure rotation around it). If the total mass of the particles is m, and the center of mass is moving at velocity vcm, the momentum of the system is:
p = mvcm.
This is known as Euler's first law.
If the net force F applied to a particle is constant, and is applied for a time interval Δt, the momentum of the particle changes by an amount
Δp = FΔt.
In differential form, this is Newton's second law; the rate of change of the momentum of a particle is equal to the instantaneous force F acting on it:
F = dp/dt.
If the net force experienced by a particle changes as a function of time, F(t), the change in momentum (or impulse J) between times t1 and t2 is
J = Δp = ∫ F(t) dt, integrated from t1 to t2.
Impulse is measured in the derived units of the newton second (1 N⋅s = 1 kg⋅m/s) or dyne second (1 dyne⋅s = 1 g⋅cm/s).
Under the assumption of constant mass m, it is equivalent to write
F = d(mv)/dt = m dv/dt = ma,
hence the net force is equal to the mass of the particle times its acceleration.
"Example": A model airplane of mass 1 kg accelerates from rest to a velocity of 6 m/s due north in 2 s. The net force required to produce this acceleration is 3 newtons due north. The change in momentum is 6 kg⋅m/s due north. The rate of change of momentum is 3 (kg⋅m/s)/s due north which is numerically equivalent to 3 newtons.
In a closed system (one that does not exchange any matter with its surroundings and is not acted on by external forces) the total momentum is constant. This fact, known as the "law of conservation of momentum", is implied by Newton's laws of motion. Suppose, for example, that two particles interact. Because of the third law, the forces between them are equal and opposite. If the particles are numbered 1 and 2, the second law states that F1 = dp1/dt and F2 = dp2/dt. Therefore,
dp1/dt = −dp2/dt,
with the negative sign indicating that the forces oppose. Equivalently,
d(p1 + p2)/dt = 0.
If the velocities of the particles are u1 and u2 before the interaction, and afterwards they are v1 and v2, then
m1u1 + m2u2 = m1v1 + m2v2.
This law holds no matter how complicated the force is between particles. Similarly, if there are several particles, the momentum exchanged between each pair of particles adds up to zero, so the total change in momentum is zero. This conservation law applies to all interactions, including collisions and separations caused by explosive forces. It can also be generalized to situations where Newton's laws do not hold, for example in the theory of relativity and in electrodynamics.
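A minimal numerical sketch of this bookkeeping (the helper function is illustrative, not a standard API); note that the post-interaction velocities here are simply chosen to satisfy the law, since conservation alone does not determine the outcome:

```python
def total_momentum(masses: list[float], velocities: list[float]) -> float:
    """Total linear momentum of a system of particles (one dimension)."""
    return sum(m * v for m, v in zip(masses, velocities))

# Velocities before and after some internal interaction (collision, explosion, ...):
before = total_momentum([2.0, 3.0], [4.0, -1.0])   # 2*4 + 3*(-1) = 5
after = total_momentum([2.0, 3.0], [-0.5, 2.0])    # 2*(-0.5) + 3*2 = 5
assert before == after   # internal forces cannot change the total
```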
Momentum is a measurable quantity, and the measurement depends on the motion of the observer. For example: if an apple is sitting in a glass elevator that is descending, an outside observer looking into the elevator sees the apple moving, so, to that observer, the apple has a non-zero momentum. To someone inside the elevator, the apple does not move, so it has zero momentum. The two observers each have a frame of reference in which they observe motions, and, if the elevator is descending steadily, they will both see behavior that is consistent with the same physical laws.
Suppose a particle has position x in a stationary frame of reference. From the point of view of another frame of reference, moving at a uniform speed u, the position (represented by a primed coordinate) changes with time as
x′ = x − ut.
This is called a Galilean transformation. If the particle is moving at speed v = dx/dt in the first frame of reference, in the second, it is moving at speed
v′ = dx′/dt = v − u.
Since u does not change, the accelerations are the same:
a′ = dv′/dt = a.
Thus, momentum is conserved in both reference frames. Moreover, as long as the force has the same form, in both frames, Newton's second law is unchanged. Forces such as Newtonian gravity, which depend only on the scalar distance between objects, satisfy this criterion. This independence of reference frame is called Newtonian relativity or Galilean invariance.
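The invariance claim can be checked numerically: shifting every velocity by the same frame speed leaves conservation of momentum intact (a sketch; the masses and velocities are arbitrary example values):

```python
def galilean(v: float, u: float) -> float:
    """Velocity observed from a frame moving at uniform speed u."""
    return v - u

m1, m2 = 2.0, 1.0
u1, u2 = 3.0, 0.0        # initial velocities in the stationary frame
v1, v2 = 1.0, 4.0        # final velocities chosen so momentum is conserved: 6 = 6
frame_u = 2.5            # speed of the second frame

# Total momentum before and after, measured in the moving frame:
p_initial = m1 * galilean(u1, frame_u) + m2 * galilean(u2, frame_u)
p_final = m1 * galilean(v1, frame_u) + m2 * galilean(v2, frame_u)
assert p_initial == p_final   # conserved in the moving frame as well
```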
A change of reference frame can often simplify calculations of motion. For example, in a collision of two particles, a reference frame can be chosen where one particle begins at rest. Another commonly used reference frame is the center of mass frame – one that is moving with the center of mass. In this frame,
the total momentum is zero.
By itself, the law of conservation of momentum is not enough to determine the motion of particles after a collision. Another property of the motion, kinetic energy, must be known. This is not necessarily conserved. If it is conserved, the collision is called an "elastic collision"; if not, it is an "inelastic collision".
An elastic collision is one in which no kinetic energy is absorbed in the collision. Perfectly elastic "collisions" can occur when the objects do not touch each other, as for example in atomic or nuclear scattering where electric repulsion keeps them apart. A slingshot maneuver of a satellite around a planet can also be viewed as a perfectly elastic collision. A collision between two pool balls is a good example of an "almost" totally elastic collision, due to their high rigidity, but when bodies come in contact there is always some dissipation.
A head-on elastic collision between two bodies can be represented by velocities in one dimension, along a line passing through the bodies. If the velocities are u1 and u2 before the collision and v1 and v2 after, the equations expressing conservation of momentum and kinetic energy are:
m1u1 + m2u2 = m1v1 + m2v2,
½m1u1² + ½m2u2² = ½m1v1² + ½m2v2².
A change of reference frame can simplify analysis of a collision. For example, suppose there are two bodies of equal mass m, one stationary and one approaching the other at a speed u (as in the figure). The center of mass is moving at speed u/2, and both bodies are moving towards it at speed u/2. Because of the symmetry, after the collision both must be moving away from the center of mass at the same speed. Adding the speed of the center of mass to both, we find that the body that was moving is now stopped and the other is moving away at speed u. The bodies have exchanged their velocities. Regardless of the velocities of the bodies, a switch to the center of mass frame leads us to the same conclusion. Therefore, the final velocities are given by
v1 = u2,  v2 = u1.
In general, when the initial velocities are known, the final velocities are given by
v1 = ((m1 − m2)/(m1 + m2))u1 + (2m2/(m1 + m2))u2,
v2 = ((m2 − m1)/(m1 + m2))u2 + (2m1/(m1 + m2))u1.
If one body has much greater mass than the other, its velocity will be little affected by a collision while the other body will experience a large change.
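The general final-velocity formulas for a one-dimensional elastic collision can be sketched as follows (the function name is ours); the equal-mass exchange and heavy-body cases described above fall out directly:

```python
def elastic_1d(m1: float, u1: float, m2: float, u2: float) -> tuple[float, float]:
    """Final velocities (v1, v2) after a head-on elastic collision in 1-D."""
    v1 = ((m1 - m2) * u1 + 2 * m2 * u2) / (m1 + m2)
    v2 = ((m2 - m1) * u2 + 2 * m1 * u1) / (m1 + m2)
    return v1, v2

# Equal masses simply exchange velocities:
assert elastic_1d(1.0, 5.0, 1.0, 0.0) == (0.0, 5.0)

# A much heavier body is barely deflected; the light one flies off:
v_heavy, v_light = elastic_1d(1000.0, 1.0, 1.0, 0.0)
```

Both momentum and kinetic energy are conserved by these formulas, which is what distinguishes the elastic case.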
In an inelastic collision, some of the kinetic energy of the colliding bodies is converted into other forms of energy (such as heat or sound). Examples include traffic collisions, in which the effect of loss of kinetic energy can be seen in the damage to the vehicles; electrons losing some of their energy to atoms (as in the Franck–Hertz experiment); and particle accelerators in which the kinetic energy is converted into mass in the form of new particles.
In a perfectly inelastic collision (such as a bug hitting a windshield), both bodies have the same motion afterwards. A head-on inelastic collision between two bodies can be represented by velocities in one dimension, along a line passing through the bodies. If the velocities are u1 and u2 before the collision, then in a perfectly inelastic collision both bodies will be travelling with velocity v after the collision. The equation expressing conservation of momentum is:
m1u1 + m2u2 = (m1 + m2)v.
If one body is motionless to begin with (e.g. u2 = 0), the equation for conservation of momentum is
m1u1 = (m1 + m2)v,
so
v = (m1/(m1 + m2))u1.
In a different situation, if the frame of reference is moving at the final velocity such that v = 0, the objects would be brought to rest by a perfectly inelastic collision and 100% of the kinetic energy is converted to other forms of energy. In this instance the initial velocities of the bodies would be non-zero, or the bodies would have to be massless.
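A short sketch of the perfectly inelastic case, also reporting the kinetic energy converted to other forms (the function name is illustrative):

```python
def perfectly_inelastic_1d(m1: float, u1: float, m2: float, u2: float):
    """Final common velocity, and kinetic energy lost, when two bodies stick."""
    v = (m1 * u1 + m2 * u2) / (m1 + m2)      # conservation of momentum
    ke_before = 0.5 * m1 * u1 ** 2 + 0.5 * m2 * u2 ** 2
    ke_after = 0.5 * (m1 + m2) * v ** 2
    return v, ke_before - ke_after

# A 2 kg body at 3 m/s hits a stationary 1 kg body and they stick:
v, ke_lost = perfectly_inelastic_1d(2.0, 3.0, 1.0, 0.0)
# v = m1*u1/(m1+m2) = 2 m/s; one third of the kinetic energy is dissipated
```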
One measure of the inelasticity of the collision is the coefficient of restitution CR, defined as the ratio of relative velocity of separation to relative velocity of approach. In applying this measure to a ball bouncing from a solid surface, this can be easily measured using the following formula: CR = √(bounce height / drop height).
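Assuming the usual free-fall relation v = √(2gh) between the drop and bounce heights and the approach and separation speeds (so that g cancels), the bouncing-ball measurement can be sketched as (the function name is ours):

```python
import math

def restitution_from_heights(drop_height: float, bounce_height: float) -> float:
    """Coefficient of restitution e = separation speed / approach speed.

    With v = sqrt(2*g*h) both before and after the bounce, g cancels and
    e reduces to sqrt(bounce_height / drop_height)."""
    return math.sqrt(bounce_height / drop_height)

e = restitution_from_heights(1.0, 0.64)   # drop from 1 m, rebound to 0.64 m
```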
The momentum and energy equations also apply to the motions of objects that begin together and then move apart. For example, an explosion is the result of a chain reaction that transforms potential energy stored in chemical, mechanical, or nuclear form into kinetic energy, acoustic energy, and electromagnetic radiation. Rockets also make use of conservation of momentum: propellant is thrust outward, gaining momentum, and an equal and opposite momentum is imparted to the rocket.
Real motion has both direction and velocity and must be represented by a vector. In a coordinate system with x, y, z axes, velocity has components vx in the x-direction, vy in the y-direction, vz in the z-direction. The vector is represented by a boldface symbol: v = (vx, vy, vz).
Similarly, the momentum is a vector quantity and is represented by a boldface symbol: p = (px, py, pz).
The equations in the previous sections work in vector form if the scalars $p$ and $v$ are replaced by vectors $\mathbf{p}$ and $\mathbf{v}$. Each vector equation represents three scalar equations. For example, $\mathbf{p} = m\mathbf{v}$
represents three equations: $p_x = m v_x$, $p_y = m v_y$, $p_z = m v_z$.
The kinetic energy equations are exceptions to the above replacement rule. The equations are still one-dimensional, but each scalar represents the magnitude of the vector; for example, $v^2 = v_x^2 + v_y^2 + v_z^2$.
Each vector equation represents three scalar equations. Often coordinates can be chosen so that only two components are needed, as in the figure. Each component can be obtained separately and the results combined to produce a vector result.
A simple construction involving the center of mass frame can be used to show that if a stationary elastic sphere is struck by a moving sphere, the two will head off at right angles after the collision (as in the figure).
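The right-angle result can be verified directly: in an elastic collision of identical spheres, the struck sphere carries off the velocity component along the line of centers, and the incident sphere keeps the perpendicular remainder. A sketch (the contact normal chosen below is illustrative):

```python
import math

def equal_mass_elastic(v1, contact_normal):
    """Elastic collision of a moving sphere with an identical stationary
    sphere: the struck sphere takes the component of v1 along the line
    of centers; the incident sphere keeps the rest."""
    nx, ny = contact_normal
    norm = math.hypot(nx, ny)
    nx, ny = nx / norm, ny / norm
    d = v1[0] * nx + v1[1] * ny          # component along line of centers
    v2_after = (d * nx, d * ny)
    v1_after = (v1[0] - v2_after[0], v1[1] - v2_after[1])
    return v1_after, v2_after

a, b = equal_mass_elastic((1.0, 0.0), (1.0, 1.0))
print(a[0] * b[0] + a[1] * b[1])  # 0.0 up to rounding: the paths are perpendicular
```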
The concept of momentum plays a fundamental role in explaining the behavior of variable-mass objects such as a rocket ejecting fuel or a star accreting gas. In analyzing such an object, one treats the object's mass as a function that varies with time: $m(t)$. The momentum of the object at time $t$ is therefore $p(t) = m(t)v(t)$. One might then try to invoke Newton's second law of motion by saying that the external force $F$ on the object is related to its momentum by $F = \frac{dp}{dt}$, but this is incorrect, as is the related expression found by applying the product rule to $\frac{d(mv)}{dt}$: $F = m(t)\frac{dv}{dt} + v(t)\frac{dm}{dt}$ (incorrect).
This equation does not correctly describe the motion of variable-mass objects. The correct equation is $F + u\frac{dm}{dt} = m\frac{dv}{dt}$,
where $u$ is the velocity of the ejected/accreted mass "as seen in the object's rest frame". This is distinct from $v$, which is the velocity of the object itself as seen in an inertial frame.
This equation is derived by keeping track of both the momentum of the object as well as the momentum of the ejected/accreted mass ("dm"). When considered together, the object and the mass ("dm") constitute a closed system in which total momentum is conserved.
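The same bookkeeping yields the classic rocket result: summing the momentum exchanged with each small parcel of expelled propellant reproduces the closed-form Tsiolkovsky value $u\ln(m_0/m_f)$. The masses and exhaust speed below are illustrative:

```python
import math

def delta_v_numeric(m0, mf, u, steps=100_000):
    """Accumulate dv = u * dm / m as propellant of total mass m0 - mf
    is expelled at speed u relative to the rocket."""
    dm = (m0 - mf) / steps
    m, dv = m0, 0.0
    for _ in range(steps):
        dv += u * dm / m
        m -= dm
    return dv

m0, mf, u = 1000.0, 400.0, 2500.0
print(delta_v_numeric(m0, mf, u))  # close to the analytic value below
print(u * math.log(m0 / mf))       # Tsiolkovsky: u * ln(m0 / mf)
```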
Newtonian physics assumes that absolute time and space exist outside of any observer; this gives rise to Galilean invariance. It also results in a prediction that the speed of light can vary from one reference frame to another. This is contrary to observation. In the special theory of relativity, Einstein keeps the postulate that the equations of motion do not depend on the reference frame, but assumes that the speed of light is invariant. As a result, position and time in two reference frames are related by the Lorentz transformation instead of the Galilean transformation.
Consider, for example, one reference frame moving relative to another at velocity $v$ in the $x$ direction. The Galilean transformation gives the coordinates of the moving frame as $x' = x - vt$, $t' = t$,
while the Lorentz transformation gives $x' = \gamma(x - vt)$, $t' = \gamma\left(t - \frac{vx}{c^2}\right)$,
where $\gamma$ is the Lorentz factor: $\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}$.
Newton's second law, with mass fixed, is not invariant under a Lorentz transformation. However, it can be made invariant by making the "inertial mass" of an object a function of velocity: $m = \gamma m_0$, where $m_0$ is the object's invariant mass.
The modified momentum, $\mathbf{p} = \gamma m_0 \mathbf{v}$,
obeys Newton's second law: $\mathbf{F} = \frac{d\mathbf{p}}{dt}$.
Within the domain of classical mechanics, relativistic momentum closely approximates Newtonian momentum: at low velocity, $\gamma m_0 \mathbf{v}$ is approximately equal to $m_0 \mathbf{v}$, the Newtonian expression for momentum.
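Numerically, the correction is the Lorentz factor $\gamma$: appreciable at a substantial fraction of $c$, utterly negligible at everyday speeds. A minimal sketch:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_gamma(v):
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def relativistic_momentum(m, v):
    return lorentz_gamma(v) * m * v  # p = gamma * m0 * v

print(lorentz_gamma(0.6 * C))  # 1.25: a 25% correction at 0.6c
# For 1 kg at 30 m/s, the ratio p / (m*v) differs from 1 only around
# the 15th decimal place, recovering the Newtonian expression:
print(relativistic_momentum(1.0, 30.0) / (1.0 * 30.0))
```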
In the theory of special relativity, physical quantities are expressed in terms of four-vectors that include time as a fourth coordinate along with the three space coordinates. These vectors are generally represented by capital letters, for example $\mathbf{R}$ for position. The expression for the "four-momentum" depends on how the coordinates are expressed. Time may be given in its normal units or multiplied by the speed of light so that all the components of the four-vector have dimensions of length. If the latter scaling is used, an interval of proper time, $d\tau$, defined by $c^2\,d\tau^2 = c^2\,dt^2 - dx^2 - dy^2 - dz^2$,
is invariant under Lorentz transformations (in this expression and in what follows the $(+,-,-,-)$ metric signature has been used; different authors use different conventions). Mathematically this invariance can be ensured in one of two ways: by treating the four-vectors as Euclidean vectors and multiplying time by $\sqrt{-1}$; or by keeping time a real quantity and embedding the vectors in a Minkowski space. In a Minkowski space, the scalar product of two four-vectors $\mathbf{U} = (U_0, U_1, U_2, U_3)$ and $\mathbf{V} = (V_0, V_1, V_2, V_3)$ is defined as $\mathbf{U} \cdot \mathbf{V} = U_0 V_0 - U_1 V_1 - U_2 V_2 - U_3 V_3$.
In all the coordinate systems, the (contravariant) relativistic four-velocity is defined by $\mathbf{U} = \frac{d\mathbf{R}}{d\tau}$,
and the (contravariant) four-momentum is $\mathbf{P} = m\mathbf{U}$,
where $m$ is the invariant mass. If $\mathbf{R} = (ct, x, y, z)$ (in Minkowski space), then $\mathbf{P} = \gamma m(c, \mathbf{v}) = (\gamma m c, \mathbf{p})$.
Using Einstein's mass–energy equivalence, $E = \gamma m c^2$, this can be rewritten as $\mathbf{P} = (E/c, \mathbf{p})$.
Thus, conservation of four-momentum is Lorentz-invariant and implies conservation of both mass and energy.
The magnitude of the momentum four-vector is equal to $mc$: $\|\mathbf{P}\|^2 = \mathbf{P}\cdot\mathbf{P} = \gamma^2 m^2 (c^2 - v^2) = (mc)^2$,
and is invariant across all reference frames.
The relativistic energy–momentum relationship holds even for massless particles such as photons; by setting $m = 0$ it follows that $E = pc$.
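Both the massive and massless cases follow from the single relation $E^2 = (pc)^2 + (mc^2)^2$. A sketch in natural units ($c = 1$ by default):

```python
import math

def energy(p, m, c=1.0):
    """Relativistic energy from E^2 = (p*c)^2 + (m*c^2)^2."""
    return math.sqrt((p * c) ** 2 + (m * c ** 2) ** 2)

print(energy(3.0, 4.0))  # 5.0 for a massive particle (a 3-4-5 triple)
print(energy(2.5, 0.0))  # 2.5: with m = 0 this reduces to E = p*c
```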
In a game of relativistic "billiards", if a stationary particle is hit by a moving particle in an elastic collision, the paths formed by the two afterwards will form an acute angle. This is unlike the non-relativistic case where they travel at right angles.
The four-momentum of a planar wave can be related to a wave four-vector $\mathbf{K}$: $\mathbf{P} = \hbar\mathbf{K}$.
For a particle, the relationship between temporal components, $E = \hbar\omega$, is the Planck–Einstein relation, and the relation between spatial components, $\mathbf{p} = \hbar\mathbf{k}$, describes a de Broglie matter wave.
Newton's laws can be difficult to apply to many kinds of motion because the motion is limited by "constraints". For example, a bead on an abacus is constrained to move along its wire and a pendulum bob is constrained to swing at a fixed distance from the pivot. Many such constraints can be incorporated by changing the normal Cartesian coordinates to a set of "generalized coordinates" that may be fewer in number. Refined mathematical methods have been developed for solving mechanics problems in generalized coordinates. They introduce a "generalized momentum", also known as the "canonical" or "conjugate momentum", that extends the concepts of both linear momentum and angular momentum. To distinguish it from generalized momentum, the product of mass and velocity is also referred to as "mechanical", "kinetic" or "kinematic momentum". The two main methods are described below.
In Lagrangian mechanics, a Lagrangian is defined as the difference between the kinetic energy $T$ and the potential energy $V$: $\mathcal{L} = T - V$.
If the generalized coordinates are represented as a vector $\mathbf{q} = (q_1, q_2, \dots, q_N)$ and time differentiation is represented by a dot over the variable, then the equations of motion (known as the Lagrange or Euler–Lagrange equations) are a set of $N$ equations: $\frac{d}{dt}\left(\frac{\partial\mathcal{L}}{\partial\dot{q}_j}\right) - \frac{\partial\mathcal{L}}{\partial q_j} = 0$.
If a coordinate $q_i$ is not a Cartesian coordinate, the associated generalized momentum component $p_i$ does not necessarily have the dimensions of linear momentum. Even if $q_i$ is a Cartesian coordinate, $p_i$ will not be the same as the mechanical momentum if the potential depends on velocity. Some sources represent the kinematic momentum by the symbol $\Pi$.
In this mathematical framework, a generalized momentum is associated with the generalized coordinates. Its components are defined as $p_j = \frac{\partial\mathcal{L}}{\partial\dot{q}_j}$.
Each component $p_j$ is said to be the "conjugate momentum" for the coordinate $q_j$.
Now if a given coordinate $q_i$ does not appear in the Lagrangian (although its time derivative might appear), then $p_i$ is constant: $\frac{dp_i}{dt} = 0$.
This is the generalization of the conservation of momentum.
Even if the generalized coordinates are just the ordinary spatial coordinates, the conjugate momenta are not necessarily the ordinary momentum coordinates. An example is found in the section on electromagnetism.
In Hamiltonian mechanics, the Lagrangian (a function of generalized coordinates and their derivatives) is replaced by a Hamiltonian that is a function of generalized coordinates and momentum. The Hamiltonian is defined as $\mathcal{H}(\mathbf{q}, \mathbf{p}, t) = \mathbf{p}\cdot\dot{\mathbf{q}} - \mathcal{L}(\mathbf{q}, \dot{\mathbf{q}}, t)$,
where the momentum is obtained by differentiating the Lagrangian as above. The Hamiltonian equations of motion are $\dot{q}_i = \frac{\partial\mathcal{H}}{\partial p_i}$, $\dot{p}_i = -\frac{\partial\mathcal{H}}{\partial q_i}$.
As in Lagrangian mechanics, if a generalized coordinate does not appear in the Hamiltonian, its conjugate momentum component is conserved.
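The cyclic-coordinate conservation law is easy to see numerically. For a projectile with $\mathcal{H} = \frac{p_x^2 + p_z^2}{2m} + mgz$, the coordinate $x$ does not appear in $\mathcal{H}$, so $p_x$ never changes while $p_z$ does. A minimal sketch integrating Hamilton's equations with an explicit first-order step (parameter values are illustrative):

```python
def simulate_projectile(px0, pz0, m=1.0, g=9.81, dt=1e-3, steps=1000):
    """Integrate dq/dt = dH/dp and dp/dt = -dH/dq for
    H = (px**2 + pz**2) / (2*m) + m*g*z."""
    px, pz, x, z = px0, pz0, 0.0, 0.0
    for _ in range(steps):
        x += dt * px / m      # dx/dt = dH/dpx
        z += dt * pz / m      # dz/dt = dH/dpz
        pz += dt * (-m * g)   # dpz/dt = -dH/dz
        # dpx/dt = -dH/dx = 0: px is conserved because x is cyclic
    return px, pz

px, pz = simulate_projectile(2.0, 5.0)
print(px)  # 2.0, unchanged over the whole trajectory
print(pz)  # reduced by the gravitational impulse, about 5.0 - 9.81
```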
Conservation of momentum is a mathematical consequence of the homogeneity (shift symmetry) of space (position in space is the canonical conjugate quantity to momentum). That is, conservation of momentum is a consequence of the fact that the laws of physics do not depend on position; this is a special case of Noether's theorem.
In Maxwell's equations, the forces between particles are mediated by electric and magnetic fields. The electromagnetic force ("Lorentz force") on a particle with charge $q$ due to a combination of electric field $\mathbf{E}$ and magnetic field $\mathbf{B}$ is $\mathbf{F} = q(\mathbf{E} + \mathbf{v}\times\mathbf{B})$
(in SI units).
It has an electric potential $\varphi(\mathbf{r}, t)$ and magnetic vector potential $\mathbf{A}(\mathbf{r}, t)$.
In the non-relativistic regime, its generalized momentum is $\mathbf{p} = m\mathbf{v} + q\mathbf{A}$,
while in relativistic mechanics this becomes
$\mathbf{p} = \gamma m\mathbf{v} + q\mathbf{A}$.
The quantity $q\mathbf{A}$ is sometimes called the "potential momentum". It is the momentum due to the interaction of the particle with the electromagnetic fields. The name is an analogy with the potential energy $q\varphi$, which is the energy due to the interaction of the particle with the electromagnetic fields. These quantities form a four-vector, so the analogy is consistent; besides, the concept of potential momentum is important in explaining the so-called hidden momentum of the electromagnetic fields.
In Newtonian mechanics, the law of conservation of momentum can be derived from the law of action and reaction, which states that every force has a reciprocating equal and opposite force. Under some circumstances, moving charged particles can exert forces on each other in non-opposite directions. Nevertheless, the combined momentum of the particles and the electromagnetic field is conserved.
The Lorentz force imparts a momentum to the particle, so by Newton's second law the particle must impart a momentum to the electromagnetic fields.
In a vacuum, the momentum per unit volume is $\mathbf{g} = \frac{1}{\mu_0 c^2}\,\mathbf{E}\times\mathbf{B}$,
where $\mu_0$ is the vacuum permeability and $c$ is the speed of light. The momentum density is proportional to the Poynting vector $\mathbf{S}$, which gives the directional rate of energy transfer per unit area: $\mathbf{g} = \frac{\mathbf{S}}{c^2}$.
If momentum is to be conserved over a region $V$, changes in the momentum of matter through the Lorentz force must be balanced by changes in the momentum of the electromagnetic field and outflow of momentum. If $\mathbf{P}_{\text{mech}}$ is the momentum of all the particles in $V$, and the particles are treated as a continuum, then Newton's second law gives $\frac{d\mathbf{P}_{\text{mech}}}{dt} = \iiint_V \left(\rho\mathbf{E} + \mathbf{J}\times\mathbf{B}\right) dV$.
The electromagnetic momentum is $\mathbf{P}_{\text{field}} = \frac{1}{\mu_0 c^2}\iiint_V \mathbf{E}\times\mathbf{B}\, dV$,
and the equation for conservation of each component $i$ of the momentum is $\frac{d}{dt}\left(\mathbf{P}_{\text{mech}} + \mathbf{P}_{\text{field}}\right)_i = \oiint_S \left(\textstyle\sum_j T_{ij} n_j\right) dA$.
The term on the right is an integral over the surface $S$ of the volume representing momentum flow into and out of the volume, and $n_j$ is a component of the surface normal of $S$. The quantity $T_{ij}$ is called the Maxwell stress tensor, defined as $T_{ij} = \epsilon_0\left(E_i E_j - \tfrac{1}{2}\delta_{ij}E^2\right) + \tfrac{1}{\mu_0}\left(B_i B_j - \tfrac{1}{2}\delta_{ij}B^2\right)$.
The above results are for the "microscopic" Maxwell equations, applicable to electromagnetic forces in a vacuum (or on a very small scale in media). It is more difficult to define momentum density in media because the division into electromagnetic and mechanical is arbitrary. The definition of electromagnetic momentum density is modified to $\mathbf{g} = \frac{1}{c^2}\,\mathbf{E}\times\mathbf{H} = \frac{\mathbf{S}}{c^2}$,
where the H-field $\mathbf{H}$ is related to the B-field $\mathbf{B}$ and the magnetization $\mathbf{M}$ by $\mathbf{B} = \mu_0(\mathbf{H} + \mathbf{M})$.
The electromagnetic stress tensor depends on the properties of the media.
In quantum mechanics, momentum is defined as a self-adjoint operator on the wave function. The Heisenberg uncertainty principle defines limits on how accurately the momentum and position of a single observable system can be known at once. In quantum mechanics, position and momentum are conjugate variables.
For a single particle described in the position basis the momentum operator can be written as $\hat{p} = -i\hbar\nabla$,
where $\nabla$ is the gradient operator, $\hbar$ is the reduced Planck constant, and $i$ is the imaginary unit. This is a commonly encountered form of the momentum operator, though the momentum operator in other bases can take other forms. For example, in momentum space the momentum operator is represented as $\hat{p}\,\psi(p) = p\,\psi(p)$,
where the operator $\hat{p}$ acting on a wave function $\psi(p)$ yields that wave function multiplied by the value $p$, in an analogous fashion to the way that the position operator acting on a wave function $\psi(x)$ yields that wave function multiplied by the value $x$.
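A quick numerical check of the position-basis form: applying $-i\hbar\,d/dx$ (here via a central difference, in units where $\hbar = 1$) to the plane wave $\psi(x) = e^{ikx}$ returns $\psi$ multiplied by $\hbar k$, confirming that plane waves are momentum eigenstates:

```python
import cmath

HBAR = 1.0  # natural units
k = 3.0     # wavenumber of the plane wave psi(x) = exp(i*k*x)

def psi(x):
    return cmath.exp(1j * k * x)

def momentum_eigenvalue(x, h=1e-6):
    # p_hat psi = -i*hbar * dpsi/dx, approximated by a central difference;
    # dividing by psi(x) extracts the eigenvalue.
    dpsi = (psi(x + h) - psi(x - h)) / (2 * h)
    return (-1j * HBAR * dpsi / psi(x)).real

print(momentum_eigenvalue(0.7))  # close to hbar * k = 3.0, independent of x
```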
For both massive and massless objects, relativistic momentum is related to the phase constant $\beta$ by $p = \hbar\beta$.
Electromagnetic radiation (including visible light, ultraviolet light, and radio waves) is carried by photons. Even though photons (the particle aspect of light) have no mass, they still carry momentum. This leads to applications such as the solar sail. The calculation of the momentum of light within dielectric media is somewhat controversial (see Abraham–Minkowski controversy).
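The solar-sail idea can be made concrete: light absorbed by a surface delivers momentum at a rate equal to the energy flux divided by $c$, and a perfect mirror receives twice that. Taking the solar irradiance near Earth as roughly 1361 W/m² (an assumed round figure):

```python
S = 1361.0         # solar irradiance near Earth, W/m^2 (assumed value)
c = 299_792_458.0  # speed of light, m/s

pressure_absorber = S / c    # radiation pressure on a perfect absorber
pressure_mirror = 2 * S / c  # a perfect reflector reverses the photons
print(pressure_absorber)     # a few micronewtons per square meter
```

The tiny result (a few µN/m²) is why solar sails must be very large and very light.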
In fields such as fluid dynamics and solid mechanics, it is not feasible to follow the motion of individual atoms or molecules. Instead, the materials must be approximated by a continuum in which there is a particle or fluid parcel at each point that is assigned the average of the properties of atoms in a small region nearby. In particular, it has a density $\rho$ and velocity $\mathbf{v}$ that depend on time $t$ and position $\mathbf{r}$. The momentum per unit volume is $\rho\mathbf{v}$.
Consider a column of water in hydrostatic equilibrium. All the forces on the water are in balance and the water is motionless. On any given drop of water, two forces are balanced. The first is gravity, which acts directly on each atom and molecule inside. The gravitational force per unit volume is $\rho\mathbf{g}$, where $\mathbf{g}$ is the gravitational acceleration. The second force is the sum of all the forces exerted on its surface by the surrounding water. The force from below is greater than the force from above by just the amount needed to balance gravity. The normal force per unit area is the pressure $p$. The average force per unit volume inside the droplet is the gradient of the pressure, so the force balance equation is $-\nabla p + \rho\mathbf{g} = 0$.
If the forces are not balanced, the droplet accelerates. This acceleration is not simply the partial derivative $\partial\mathbf{v}/\partial t$ because the fluid in a given volume changes with time. Instead, the material derivative is needed: $\frac{D}{Dt} \equiv \frac{\partial}{\partial t} + \mathbf{v}\cdot\nabla$.
Applied to any physical quantity, the material derivative includes the rate of change at a point and the changes due to advection as fluid is carried past the point. Per unit volume, the rate of change in momentum is equal to $\rho\frac{D\mathbf{v}}{Dt}$. This is equal to the net force on the droplet.
Forces that can change the momentum of a droplet include the gradient of the pressure and gravity, as above. In addition, surface forces can deform the droplet. In the simplest case, a shear stress $\tau$, exerted by a force parallel to the surface of the droplet, is proportional to the rate of deformation or strain rate. Such a shear stress occurs if the fluid has a velocity gradient because the fluid is moving faster on one side than another. If the speed in the $x$ direction varies with $z$, the tangential force in direction $x$ per unit area normal to the $z$ direction is $\sigma_{zx} = -\mu\frac{\partial v_x}{\partial z}$,
where $\mu$ is the viscosity. This is also a flux, or flow per unit area, of $x$-momentum through the surface.
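As a worked number for the shear relation: the magnitude of the tangential stress is the viscosity times the velocity gradient. The values below are illustrative (roughly water's viscosity):

```python
def shear_stress_magnitude(mu, velocity_gradient):
    """|tau| = mu * dv_x/dz for a Newtonian fluid."""
    return mu * velocity_gradient

# mu ~ 1.0e-3 Pa*s (water), sheared at 100 per second:
print(shear_stress_magnitude(1.0e-3, 100.0))  # 0.1 Pa
```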
Including the effect of viscosity, the momentum balance equations for the incompressible flow of a Newtonian fluid are $\rho\frac{D\mathbf{v}}{Dt} = -\nabla p + \mu\nabla^2\mathbf{v} + \rho\mathbf{g}$.
These are known as the Navier–Stokes equations.
The momentum balance equations can be extended to more general materials, including solids. For each surface with normal in direction $i$ and force in direction $j$, there is a stress component $\sigma_{ij}$. The nine components make up the Cauchy stress tensor $\boldsymbol{\sigma}$, which includes both pressure and shear. The local conservation of momentum is expressed by the Cauchy momentum equation: $\rho\frac{D\mathbf{v}}{Dt} = \nabla\cdot\boldsymbol{\sigma} + \mathbf{f}$,
where $\mathbf{f}$ is the body force.
The Cauchy momentum equation is broadly applicable to deformations of solids and liquids. The relationship between the stresses and the strain rate depends on the properties of the material (see Types of viscosity).
A disturbance in a medium gives rise to oscillations, or waves, that propagate away from their source. In a fluid, small changes in pressure $p$ can often be described by the acoustic wave equation: $\frac{\partial^2 p}{\partial t^2} = c^2\nabla^2 p$,
where $c$ is the speed of sound. In a solid, similar equations can be obtained for propagation of pressure (P-waves) and shear (S-waves).
The flux, or transport per unit area, of a momentum component $\rho v_j$ by a velocity $v_i$ is equal to $\rho v_i v_j$. In the linear approximation that leads to the above acoustic equation, the time average of this flux is zero. However, nonlinear effects can give rise to a nonzero average. It is possible for momentum flux to occur even though the wave itself does not have a mean momentum.
In about 530 AD, working in Alexandria, Byzantine philosopher John Philoponus developed a concept of momentum in his commentary to Aristotle's "Physics". Aristotle claimed that everything that is moving must be kept moving by something. For example, a thrown ball must be kept moving by motions of the air. Most writers continued to accept Aristotle's theory until the time of Galileo, but a few were skeptical. Philoponus pointed out the absurdity in Aristotle's claim that motion of an object is promoted by the same air that is resisting its passage. He proposed instead that an impetus was imparted to the object in the act of throwing it. Ibn Sīnā (also known by his Latinized name Avicenna) read Philoponus and published his own theory of motion in "The Book of Healing" in 1020. He agreed that an impetus is imparted to a projectile by the thrower; but unlike Philoponus, who believed that it was a temporary virtue that would decline even in a vacuum, he viewed it as persistent, requiring external forces such as air resistance to dissipate it.
The work of Philoponus, and possibly that of Ibn Sīnā, was read and refined by the European philosophers Peter Olivi and Jean Buridan. Buridan, who in about 1350 was made rector of the University of Paris, referred to impetus being proportional to the weight times the speed. Moreover, Buridan's theory was different from his predecessor's in that he did not consider impetus to be self-dissipating, asserting that a body would be arrested by the forces of air resistance and gravity which might be opposing its impetus.
René Descartes believed that the total "quantity of motion" () in the universe is conserved, where the quantity of motion is understood as the product of size and speed. This should not be read as a statement of the modern law of momentum, since he had no concept of mass as distinct from weight and size, and more important, he believed that it is speed rather than velocity that is conserved. So for Descartes if a moving object were to bounce off a surface, changing its direction but not its speed, there would be no change in its quantity of motion. Galileo, in his "Two New Sciences", used the Italian word "impeto" to similarly describe Descartes' quantity of motion.
Leibniz, in his "Discourse on Metaphysics", gave an argument against Descartes' construction of the conservation of the "quantity of motion" using an example of dropping blocks of different sizes different distances. He points out that force is conserved but quantity of motion, construed as the product of size and speed of an object, is not conserved.
Christiaan Huygens concluded quite early that Descartes's laws for the elastic collision of two bodies must be wrong, and he formulated the correct laws. An important step was his recognition of the Galilean invariance of the problems. His views then took many years to be circulated. He passed them on in person to William Brouncker and Christopher Wren in London, in 1661. What Spinoza wrote to Henry Oldenburg about them, in 1666 which was during the Second Anglo-Dutch War, was guarded. Huygens had actually worked them out in a manuscript "De motu corporum ex percussione" in the period 1652–6. The war ended in 1667, and Huygens announced his results to the Royal Society in 1668. He published them in the "Journal des sçavans" in 1669.
The first correct statement of the law of conservation of momentum was by English mathematician John Wallis in his 1670 work, "Mechanica sive De Motu, Tractatus Geometricus": "the initial state of the body, either of rest or of motion, will persist" and "If the force is greater than the resistance, motion will result". Wallis used "momentum" for quantity of motion, and "vis" for force. Newton's "Philosophiæ Naturalis Principia Mathematica", when it was first published in 1687, showed a similar casting around for words to use for the mathematical momentum. His Definition II defines "quantitas motus", "quantity of motion", as "arising from the velocity and quantity of matter conjointly", which identifies it as momentum. Thus when in Law II he refers to "mutatio motus", "change of motion", being proportional to the force impressed, he is generally taken to mean momentum and not motion. It remained only to assign a standard term to the quantity of motion. The first use of "momentum" in its proper mathematical sense is not clear, but by the time of Jennings's "Miscellanea" in 1721, five years before the final edition of Newton's "Principia Mathematica", momentum or "quantity of motion" was being defined for students as "a rectangle", the product of $Q$ and $V$, where $Q$ is "quantity of material" and $V$ is "velocity", $\frac{s}{t}$.
Mood stabilizer
A mood stabilizer is a psychiatric pharmaceutical drug used to treat mood disorders characterized by intense and sustained mood shifts, such as bipolar disorder type I or type II and schizoaffective disorder.
Used to treat bipolar disorder, mood stabilizers suppress shifts between mania and depression. Mood-stabilizing drugs are also used in schizoaffective disorder.
The term "mood stabilizer" does not describe a mechanism, but rather an effect. More precise terminology is used to classify these agents.
Drugs commonly classed as mood stabilizers include lithium and certain anticonvulsants, such as valproate, carbamazepine, and lamotrigine; some atypical antipsychotics are also used for this purpose.
Many agents described as "mood stabilizers" are also categorized as anticonvulsants. The term "anticonvulsant mood stabilizers" is sometimes used to describe these as a class. Although this group is also defined by effect rather than mechanism, there is at least a preliminary understanding of the mechanism of most of the anticonvulsants used in the treatment of mood disorders.
There is insufficient evidence to support the use of various other anticonvulsants, such as gabapentin and topiramate, as mood stabilizers.
In routine practice, monotherapy is often not sufficiently effective for acute and/or maintenance therapy, and thus most patients are given combination therapies. Combination therapy (an atypical antipsychotic with lithium or valproate) outperforms monotherapy in the manic phase, both in efficacy and in prevention of relapse. However, side effects are more frequent and discontinuation rates due to adverse events are higher with combination therapy than with monotherapy.
Most mood stabilizers are primarily antimanic agents, meaning that they are effective at treating mania and mood cycling and shifting, but are not effective at treating acute depression. The principal exceptions to that rule, because they treat both manic and depressive symptoms, are lamotrigine, lithium carbonate and quetiapine.
Nevertheless, antidepressants are still often prescribed in addition to mood stabilizers during depressive phases. This brings some risks, however, as antidepressants can induce mania, psychosis, and other disturbing problems in people with bipolar disorder—in particular, when taken alone. The risk of antidepressant-induced mania when given to patients concomitantly on antimanic agents is not known for certain but may still exist. The majority of antidepressants appear ineffective in treating bipolar depression.
Antidepressants cause several risks when given to bipolar patients. They are ineffective in treating acute bipolar depression, preventing relapse, and can cause rapid cycling. Studies have shown that antidepressants have no benefit versus a placebo or other treatment. Antidepressants can also lead to a higher rate of non-lethal suicidal behavior. Relapse can also be related to treatment with antidepressants. This is less likely to occur if a mood stabilizer is combined with an antidepressant, rather than an antidepressant being used alone. Evidence from previous studies shows that rapid cycling is linked to use of antidepressants. Rapid cycling is defined as the presence of four or more mood episodes within a year's time. Evidence suggests that rapid cycling and mixed symptoms have become more common since antidepressant medication has come into widespread use. There is a need for caution when treating bipolar patients with antidepressant medication due to the risks that they pose.
Use of mood stabilizers and anticonvulsants such as lamotrigine, carbamazepine, valproate and others may lead to chronic folate deficiency, potentiating depression. Also, "Folate deficiency may increase the risk of depression and reduce the action of antidepressants." L-methylfolate (also formally known as 5-MTHF or levomefolic acid), a centrally acting trimonoamine modulator, boosts the synthesis of three CNS neurotransmitters: dopamine, norepinephrine and serotonin. Mood stabilizers and anticonvulsants may interfere with folic acid absorption and L-methylfolate formation. Augmentation with the medical food L-methylfolate may improve the antidepressant effects of these medicines, including lithium and antidepressants themselves, by boosting the synthesis of antidepressant neurotransmitters. However, the U.S. National Institutes of Health has issued a caution about the use of L-methylfolate for patients with bipolar disorder.
The precise mechanism of action of lithium is still unknown, and it is suspected that it acts at various points of the neuron between the nucleus and the synapse. Lithium is known to inhibit the enzyme GSK-3B. This improves the functioning of the circadian clock—which is thought to be often malfunctioning in people with bipolar disorder—and positively modulates gene transcription of brain-derived neurotrophic factor (BDNF). The resulting increase in neural plasticity may be central to lithium's therapeutic effects. Lithium may also increase the synthesis of serotonin.
All of the anticonvulsants routinely used to treat bipolar disorder are blockers of voltage-gated sodium channels, affecting the brain's glutamate system. For valproic acid, carbamazepine and oxcarbazepine, however, their mood-stabilizing effects may be more related to effects on the GABAergic system. Lamotrigine is known to decrease the patient's cortisol response to stress.
One possible downstream target of several mood stabilizers such as lithium, valproate, and carbamazepine is the arachidonic acid cascade.
Mere Christianity
Mere Christianity is a 1952 theological book by C. S. Lewis, adapted from a series of BBC radio talks made between 1941 and 1944, while Lewis was at Oxford during the Second World War. Considered a classic of Christian apologetics, the transcripts of the broadcasts originally appeared in print as three separate pamphlets: "The Case for Christianity" ("Broadcast Talks" in the UK) (1942), "Christian Behaviour" (1943), and "Beyond Personality" (1944). Lewis was invited to give the talks by James Welch, the BBC Director of Religious Broadcasting, who had read his 1940 book, "The Problem of Pain".
James Welch, the Director of Religious Broadcasting for the BBC, read Lewis's "The Problem of Pain" and subsequently wrote to him the following: "I write to ask whether you would be willing to help us in our work of religious broadcasting ... The microphone is a limiting, and rather irritating, instrument, but the quality of thinking and depth of conviction which I find in your book ought surely to be shared with a great many other people." Welch suggested two potential subjects. Lewis responded with thanks and observed that the first, modern literature, did not suit him, electing instead the latter option: the Christian faith as Lewis saw it. Ultimately, this was the course he set upon. In the radio talks and the derived book, "Mere Christianity", he articulated the congruous tenets of Christian faith.
In the preface to later editions of the book, Lewis described his intention of avoiding contested theological doctrine, focusing instead on foundational principles and their practical derivations. Along with his use of pithy and succinct language, this allowed Lewis to make his subject more accessible and appealing to the commonly educated person, who comprised the vast majority of his audience. Still, he kept the work plausibly erudite for the intellectuals of his generation, for whom the jargon of formal Christian theology did not retain its original meaning.
The core of the first section centers on an argument from morality, the basis of which is the "law of human nature", a "rule about right and wrong," which, Lewis maintained, is commonly available and known to all human beings. He cites, as an example, the case of Nazi Germany, writing: "This law was called the Law of nature because people thought that every one knew it by nature and did not need to be taught it. They did not mean, of course, that you might not find an odd individual here and there who did not know it, just as you find a few people who are colour-blind or have no ear for a tune. But taking the race as a whole, they thought that the human idea of decent behaviour was obvious to every one. And I believe they were right. If they were not, then all the things we said about the war were nonsense. What was the sense in saying the enemy were in the wrong unless Right is a real thing which the Nazis at bottom knew as well as we did and ought to have practised? If they had had no notion of what we mean by right, then, though we might still have had to fight them, we could no more have blamed them for that than for the colour of their hair."
On a more mundane level, it is generally accepted that stealing is a violation of this moral law. Lewis argues that the moral law is like scientific laws (e.g. gravity) or mathematics in that it was not contrived by humans. However, it is unlike scientific laws in that it can be broken or ignored, and it is known intuitively, rather than through experimentation. After introducing the moral law, Lewis argues that thirst reflects the fact that people naturally need water, and there is no other substance which satisfies that need. Lewis points out that earthly experience does not satisfy the human craving for "joy" and that only God could fit the bill; humans cannot know to yearn for something if it does not exist.
After providing reasons for his conversion to theism, Lewis goes over rival conceptions of God to Christianity. Pantheism, he argues, is incoherent, and atheism too simple. Eventually he arrives at Jesus Christ, and invokes a well-known argument now known as "Lewis's trilemma". Lewis, arguing that Jesus was claiming to be God, uses logic to advance three possibilities: either he really was God, was deliberately lying, or was not God but thought himself to be (which would make him delusional and likely insane). The book goes on to say that the latter two possibilities are not consistent with Jesus' character and it was most likely that he was being truthful.
The next third of the book explores the ethics resulting from Christian belief. He cites the four cardinal virtues: prudence, justice, temperance, and fortitude. After touching on these, he goes into the three theological virtues: hope, faith, and charity. Lewis also explains morality as being composed of three "layers": relationships between man and man, the motivations and attitudes of the man himself, and contrasting worldviews.
Lewis also covers such topics as social relations and forgiveness, sexual ethics and the tenets of Christian marriage, and the relationship between morality and psychoanalysis. He also writes about "the great sin": pride, which he argues to be the root cause of all evil and rebellion.
His most important point is that Christianity mandates that one "love your neighbour as yourself." He points out that all persons unconditionally love themselves. Even if one does not "like" oneself, one would still love oneself. Christians, he writes, must also apply this attitude to others, even if they do not like them. Lewis calls this one of the "great secrets": when one acts as if he loves others, he will presently come to love them.
Lewis's voice became nearly as recognizable as that of Winston Churchill during World War II, when the talks were given. The book has since become one of the most popular evangelical works in existence. In 2006, "Mere Christianity" was placed third on a list, compiled by "Christianity Today", of the most influential books amongst evangelicals since 1945. The title has influenced "Touchstone Magazine: A Journal of Mere Christianity" and William Dembski's book "Mere Creation". Charles Colson's conversion to Christianity resulted from his reading this book, as did the conversions of Francis Collins, Jonathan Aitken, Josh Caterer, and the philosopher C. E. M. Joad.
A passage in the book also influenced the name of the Grammy-nominated Texan Christian pop/rock group Sixpence None the Richer. The phrase "the hammering process" was used by the Christian metal band Living Sacrifice for the name of their album "The Hammering Process". The metalcore band Norma Jean derived the title of their song "No Passenger: No Parasite" from the section in the book in which Lewis describes a fully Christian society as having "No passengers or parasites".
Mathematical game
A mathematical game is a game whose rules, strategies, and outcomes are defined by clear mathematical parameters. Often, such games have simple rules and match procedures, such as Tic-tac-toe and Dots and Boxes. Generally, mathematical games need not be conceptually intricate to involve deeper computational underpinnings. For example, even though the rules of Mancala are relatively basic, the game can be rigorously analyzed through the lens of combinatorial game theory.
Mathematical games differ sharply from mathematical puzzles in that mathematical puzzles require specific mathematical expertise to complete, whereas mathematical games do not require a deep knowledge of mathematics to play. Often, the mathematical core of such games is not readily apparent to players untrained to notice the statistical or mathematical aspects.
Some mathematical games are of deep interest in the field of recreational mathematics.
When studying a game's core mathematics, mathematical analysis is generally more useful than actively playing or observing the game itself. To analyze a game numerically, it is particularly useful to study its rules insofar as they can yield equations or relevant formulas. This is frequently done to determine winning strategies or to establish whether the game has a solution.
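Nim is a standard illustration of this kind of analysis: the rules reduce to a single formula, the bitwise XOR (the "nim-sum") of the heap sizes, which is zero exactly when the player to move is in a losing position under optimal play. A minimal sketch (not from the source; the heap sizes are arbitrary examples):

```python
from functools import reduce
from operator import xor

def nim_sum(heaps):
    """Bitwise XOR of all heap sizes; zero means the position is a loss
    for the player about to move, assuming optimal play."""
    return reduce(xor, heaps, 0)

def winning_move(heaps):
    """Return (heap_index, new_size) for a winning move, or None if the
    position is already lost."""
    s = nim_sum(heaps)
    if s == 0:
        return None  # every move hands the opponent a non-zero nim-sum
    for i, h in enumerate(heaps):
        target = h ^ s
        if target < h:  # shrinking heap i to `target` zeroes the nim-sum
            return i, target

print(winning_move([3, 4, 5]))  # (0, 1): reduce the first heap to 1
```

Because the nim-sum criterion fully characterizes the game, this short function is a complete winning strategy; no search over the game tree is needed.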
Sometimes it is not immediately obvious that a particular game involves chance. A card game is often described as "pure strategy", but any game with random shuffling or face-down dealing of cards should not be considered to be "no chance". Several abstract strategy games are listed below:
Martin Gardner
Martin Gardner (October 21, 1914 – May 22, 2010) was an American popular mathematics and popular science writer, with interests also encompassing scientific skepticism, micromagic, philosophy, religion, and literature—especially the writings of Lewis Carroll, L. Frank Baum, and G. K. Chesterton. He was also a leading authority on Lewis Carroll. "The Annotated Alice", which incorporated the text of Carroll's two Alice books, was his most successful work and sold over a million copies. He had a lifelong interest in magic and illusion and was regarded as one of the most important magicians of the twentieth century. He was considered the doyen of American puzzlers. He was a prolific and versatile author, publishing more than 100 books.
Gardner was best known for creating and sustaining interest in recreational mathematics—and by extension, mathematics in general—throughout the latter half of the 20th century, principally through his "Mathematical Games" columns, which appeared for twenty-five years in "Scientific American", and his subsequent books collecting them.
Gardner was one of the foremost anti-pseudoscience polemicists of the 20th century. His 1957 book "Fads and Fallacies in the Name of Science" became a classic and seminal work of the skeptical movement. In 1976 he joined with fellow skeptics to found CSICOP, an organization promoting scientific inquiry and the use of reason in examining extraordinary claims.
Martin Gardner was born into a prosperous family in Tulsa, Oklahoma, to James Henry Gardner, a prominent petroleum geologist, and his wife, Willie Wilkerson Spiers, a Montessori-trained teacher. His mother taught Martin to read before he started school, reading him "The Wizard of Oz", and this began a lifelong interest in the Oz books of L. Frank Baum. His fascination with puzzles started in his boyhood when his father gave him a copy of Sam Loyd's "Cyclopedia of 5000 Puzzles, Tricks and Conundrums".
He attended the University of Chicago, where he earned his bachelor's degree in philosophy in 1936. Early jobs included reporter on the "Tulsa Tribune", writer at the University of Chicago Office of Press Relations, and case worker in Chicago's Black Belt for the city's Relief Administration. During World War II, he served for four years in the U.S. Navy as a yeoman on board the destroyer escort USS "Pope" in the Atlantic. His ship was still in the Atlantic when the war came to an end with the surrender of Japan in August 1945.
After the war, Gardner returned to the University of Chicago. He attended graduate school for a year there, but he did not earn an advanced degree.
In 1950 he wrote an article in the "Antioch Review" entitled "The Hermit Scientist". It was one of Gardner's earliest articles about junk science, and in 1952 a much-expanded version became his first published book: "In the Name of Science: An Entertaining Survey of the High Priests and Cultists of Science, Past and Present".
In the late 1940s, Gardner moved to New York City and became a writer and editor at "Humpty Dumpty" magazine, where for eight years he wrote features and stories for it and several other children's magazines. His paper-folding puzzles at that magazine led to his first work at "Scientific American." For many decades, Gardner, his wife Charlotte, and their two sons, Jim and Tom, lived in Hastings-on-Hudson, New York, where he earned his living as a freelance author, publishing books with several different publishers, and also publishing hundreds of magazine and newspaper articles. The year 1960 saw the original edition of the best-selling book of his career, "The Annotated Alice".
In 1979, Gardner retired from Scientific American and he and his wife Charlotte moved to Hendersonville, North Carolina. Gardner never really retired as an author, but continued to write math articles, sending them to The Mathematical Intelligencer, Math Horizons, The College Mathematics Journal, and Scientific American. He also revised some of his older books such as "Origami, Eleusis, and the Soma Cube". Charlotte died in 2000 and in 2004 Gardner returned to Oklahoma, where his son, James Gardner, was a professor of education at the University of Oklahoma in Norman. He died there on May 22, 2010. An autobiography — "Undiluted Hocus-Pocus: The Autobiography of Martin Gardner" — was published posthumously.
Martin Gardner had a major impact on mathematics in the second half of the 20th century. His column was called "Mathematical Games" but it was much more than that. His writing introduced many readers to real mathematics for the first time in their lives. The column lasted for 25 years and was read avidly by the generation of mathematicians and physicists who grew up in the years 1956 to 1981. It was the original inspiration for many of them to become mathematicians or scientists themselves.
David Auerbach wrote: "A case can be made, in purely practical terms, for Martin Gardner as one of the most influential writers of the 20th century. His popularizations of science and mathematical games in Scientific American, over the 25 years he wrote for them, might have helped create more young mathematicians and computer scientists than any other single factor prior to the advent of the personal computer."
His admirers included such diverse people as W. H. Auden, Arthur C. Clarke, Carl Sagan, Isaac Asimov, Richard Dawkins, Stephen Jay Gould, and the entire French literary group known as the Oulipo. Salvador Dali once sought him out to discuss four-dimensional hypercubes. Gardner wrote to M.C. Escher in 1961 to ask permission to use his Horseman tessellation in an upcoming column about H.S.M. Coxeter. Escher replied, saying that he knew Gardner as author of "The Annotated Alice", which had been sent to Escher by Coxeter. The correspondence led to Gardner introducing Escher's then largely unknown art to the world. His writing was both broad and deep. Noam Chomsky once wrote, "Martin Gardner's contribution to contemporary intellectual culture is unique—in its range, its insight, and understanding of hard questions that matter." Gardner repeatedly alerted the public (and other mathematicians) to recent discoveries in mathematics–recreational and otherwise. In addition to introducing many first-rate puzzles and topics such as Penrose tiles and Conway's Game of Life, he was equally adept at writing captivating columns about traditional mathematical topics such as knot theory, Fibonacci numbers, Pascal's triangle, the Möbius strip, transfinite numbers, four-dimensional space, Zeno's paradoxes, Fermat's last theorem, and the four-color problem.
Gardner set a new high standard for writing about mathematics. In a 2004 interview he said, "I go up to calculus, and beyond that I don't understand any of the papers that are being written. I consider that that was an advantage for the type of column I was doing because I had to understand what I was writing about, and that enabled me to write in such a way that an average reader could understand what I was saying. If you are writing popularly about math, I think it's good not to know too much math." And he was fearsomely bright. John Horton Conway called him "the most learned man I have ever met." Colm Mulcahy spoke for many when he said, "Gardner was without doubt the best friend mathematics ever had."
For over a quarter century Gardner wrote a monthly column on the subject of recreational mathematics for "Scientific American". It all began with his free-standing article on hexaflexagons which ran in the December 1956 issue. Flexagons became a bit of a fad and soon people all over New York City were making them. Gerry Piel, the "SA" publisher at the time, asked Gardner, "Is there enough similar material to this to make a regular feature?" Gardner said he thought so. The January 1957 issue contained his first column, entitled "Mathematical Games". Almost 300 more columns were to follow.
The "Mathematical Games" column became the most popular feature of the magazine and was the first thing that many readers turned to. In September 1977 Scientific American acknowledged the prestige and popularity of Gardner's column by moving it from the back to the very front of the magazine. It ran from 1956 to 1981 with sporadic columns afterwards and was the first introduction of many subjects to a wider audience, notably:
Ironically, Gardner had problems learning calculus and never took a mathematics course after high school. While editing "Humpty Dumpty's Magazine" he constructed many paper folding puzzles, and this led to his interest in the flexagons invented by British mathematician Arthur H Stone. The subsequent article he wrote on hexaflexagons led directly to the column.
Gardner's son Jim once asked him what was his favorite puzzle, and Gardner answered almost immediately: "The monkey and the coconuts". It had been the subject of his April 1958 Games column and in 2001 he chose to make it the first chapter of his "best of" collection, "The Colossal Book of Mathematics".
In the 1980s "Mathematical Games" began to appear only irregularly. Other authors began to share the column, and the June 1986 issue saw the final installment under that title. In 1981, on Gardner's retirement from "Scientific American", the column was replaced by Douglas Hofstadter's "Metamagical Themas", a name that is an anagram of "Mathematical Games".
Virtually all of the games columns were collected in book form starting in 1959 with "The Scientific American Book of Mathematical Puzzles & Diversions". Over the next four decades fourteen more books followed. Donald Knuth called them the .
There is a group of people who lie at the intersection of mathematics, philosophy, magic, and scientific skepticism who all knew and worked with Martin Gardner. They are united by a striking originality in their work and they sometimes owe much of their fame to being featured by Gardner in his column. The fact that Gardner had an unerring eye for spotting and promoting such people is one of the reasons his column was so influential. Indeed, Gardner filled a role a little bit like the 17th century French polymath Marin Mersenne in that he maintained this network and made these people aware of each other–and this led to further fruitful collaborations. For example, if it were not for Gardner, mathematicians Conway, Berlekamp, and Guy probably would have never gotten together to write "Winning Ways for your Mathematical Plays", a foundational book in combinatorial game theory. Gardner also introduced Conway to Benoit Mandelbrot because he knew they both were interested in Penrose tiles. If it were not for Gardner, Doris Schattschneider and Marjorie Rice would not have gotten together to document the newly discovered pentagon tilings.
Late in his life Gardner once said, "When I first started the column, I was not in touch with any mathematicians, and gradually mathematicians who were creative in the field found out about the column and began corresponding with me. So my most interesting columns were columns based on the material I got from them, so I owe them a big debt of gratitude." The games column was not just Gardner. It was this group of people he collected, nurtured, and acted as a conduit for—a group that came to be known as "Martin Gardner’s mathematical grapevine."
Gardner prepared each of his columns in a painstaking and scholarly fashion and conducted copious correspondence to be sure that everything was fact-checked for mathematical accuracy. But this grapevine, with Gardner at the center, was also a rich source of ideas, mathematical and otherwise, with gossip flowing in many directions. Communication was often by postcard or telephone and Gardner kept meticulous notes of everything, typically on index cards. Archives of just some of his correspondence stored at Stanford University occupy some 63 linear feet of shelf space. This correspondence led to columns about the rep-tiles and pentominoes of Solomon W. Golomb; the space filling curves of Bill Gosper; the aperiodic tiles of Roger Penrose; the Game of Life invented by John H. Conway; the superellipse and the Soma cube of Piet Hein; the trapdoor functions of Diffie, Hellman, and Merkle; the flexagons of Stone, Tuckerman, Feynman, and Tukey; the geometrical delights in a book by H. S. M. Coxeter; the game of Hex invented by John Nash; Tutte's account of squaring the square; and many other topics.
The wide array of mathematicians, physicists, computer scientists, philosophers, magicians, artists, writers, and other influential thinkers who can be counted as part of Gardner's mathematical grapevine includes:
Doris Schattschneider sometimes refers to this circle of collaborators as MG2.
Gardner died in 2010 but the grapevine lives on, and is even adding new members from time to time. Many of the people in the Gardner circle continue to meet every two years at Gathering 4 Gardner (G4G), founded in 1993 by Berlekamp, Tom Rodgers, and other admirers of Martin Gardner. A keynote speaker at G4G13 was Fields medalist Manjul Bhargava.
Gardner was an uncompromising critic of fringe science. His book "Fads and Fallacies in the Name of Science" (1952, revised 1957) launched the modern skeptical movement. It debunked dubious movements and theories including Fletcherism, Lamarckism, food faddism, Dowsing Rods, Charles Fort, Rudolf Steiner, Dianetics, the Bates method for improving eyesight, Einstein deniers, the Flat Earth theory, the lost continents of Atlantis and Lemuria, Immanuel Velikovsky's Worlds in Collision, the reincarnation of Bridey Murphy, Wilhelm Reich's orgone theory, the spontaneous generation of life, extra-sensory perception and psychokinesis, homeopathy, phrenology, palmistry, graphology, and numerology. This book and his subsequent efforts ("Science: Good, Bad and Bogus", 1981; "Order and Surprise", 1983, "Gardner's Whys & Wherefores", 1989, etc.) provoked a lot of criticism from the advocates of alternative science and New Age philosophy; he kept up running dialogues (both public and private) with many of them for decades.
In a review of "Science: Good, Bad and Bogus", Stephen Jay Gould called Gardner "The Quack Detector", a writer who "expunge[d] nonsense" and in so doing had "become a priceless national resource."
In 1976 Gardner joined with fellow skeptics philosopher Paul Kurtz, psychologist Ray Hyman, sociologist Marcello Truzzi, and stage magician James Randi to found the Committee for the Scientific Investigation of Claims of the Paranormal (now called the Committee for Skeptical Inquiry). Luminaries such as astronomer Carl Sagan, author and biochemist Isaac Asimov, psychologist B. F. Skinner, and journalist Philip J. Klass became fellows of the program. From 1983 to 2002 he wrote a monthly column called "Notes of a Fringe Watcher" (originally "Notes of a Psi-Watcher") for "Skeptical Inquirer", that organization's monthly magazine. These columns have been collected in five books starting with "The New Age: Notes of a Fringe Watcher" in 1988.
Gardner was a relentless critic of self-proclaimed Israeli psychic Uri Geller and wrote two satirical booklets about him in the 1970s using the pen name "Uriah Fuller" in which he explained how such purported psychics do their seemingly impossible feats such as mentally bending spoons and reading minds.
Martin Gardner continued to criticize junk science throughout his life–and he was fearless. His targets included not just safe subjects like astrology and UFO sightings, but topics such as chiropractic, vegetarianism, Madame Blavatsky, creationism, Scientology, the Laffer curve, Christian Science, and the Hutchins-Adler Great Books Movement. The last thing he wrote in the spring of 2010 (a month before his death) was an article excoriating the "dubious medical opinions and bogus science" of Oprah Winfrey—particularly her support for the thoroughly discredited theory that vaccinations cause autism; it went on to bemoan the "needless deaths of children" that such notions are likely to cause.
"Skeptical Inquirer" named him one of the Ten Outstanding Skeptics of the Twentieth Century. In 2010 he was posthumously honored with an award for his contributions in the skeptical field from the Independent Investigations Group. In 1982 the Committee for Skeptical Inquiry awarded Gardner its "In Praise of Reason Award" for his "heroic efforts in defense of reason and the dignity of the skeptical attitude", and in 2011 it added Gardner to its Pantheon of Skeptics.
Martin Gardner's father once showed him a magic trick when he was a little boy. Young Martin was fascinated to see physical laws seemingly violated and this led to a lifelong passion for magic and illusion. He wrote for a magic magazine in high school and worked in a department store demonstrating magic tricks while he was at the University of Chicago. The very first thing that Martin Gardner ever published (at the age of fifteen) was a magic trick in "The Sphinx", the official magazine of the Society of American Magicians. He focused mainly on micromagic (table or close-up magic) and, from the 1930s on, published a significant number of original contributions to this secretive field. Magician Joe M. Turner said, "The Encyclopedia of Impromptu Magic", which Gardner wrote in 1985, "is guaranteed to show up in any poll of magicians' favorite magic books." His first magic book for the general public, "Mathematics, Magic and Mystery" (Dover, 1956), is still considered a classic in the field. He was well known for his innovative tapping and spelling effects, with and without playing cards, and was most proud of the effect he called the "Wink Change".
Many of Gardner's lifelong friends were magicians. These included William Simon, who introduced Gardner to Charlotte Greenwald, whom he married in 1952; fellow CSICOP founder and pseudoscience fighter James Randi; Dai Vernon; Jerry Andrus; statistician Persi Diaconis; and polymath Raymond Smullyan. Diaconis and Smullyan, like Gardner, straddled the two worlds of mathematics and magic, which were frequently intertwined in Gardner's work. One of his earliest books, "Mathematics, Magic and Mystery" (1956), was about mathematically based magic tricks. Mathematical magic tricks were often featured in his "Mathematical Games" column–for example, his August 1962 column was titled "A variety of diverting tricks collected at a fictitious convention of magicians." From 1998 to 2002 he wrote a monthly column on magic tricks called "Trick of the Month" in The Physics Teacher, a journal published by the American Association of Physics Teachers.
In 1999 "Magic magazine" named Gardner one of the "100 Most Influential Magicians of the Twentieth Century". In 2005 he received a 'Lifetime Achievement Fellowship' from the Academy of Magical Arts. The last work to be published during his lifetime was a magic trick in the May 2010 issue of "".
Gardner was raised as a Methodist (his mother was very religious) but rejected established religion as an adult. He considered himself a philosophical theist and a fideist, believing in a personal God, an afterlife, and the efficacy of prayer. Nevertheless, he had an abiding fascination with religious belief. In his autobiography, he stated: "When many of my fans discovered that I believed in God and even hoped for an afterlife, they were shocked and dismayed ... I do not mean the God of the Bible, especially the God of the Old Testament, or any other book that claims to be divinely inspired. For me God is a "Wholly Other" transcendent intelligence, impossible for us to understand. He or she is somehow responsible for our universe and capable of providing, how I have no inkling, an afterlife."
Gardner described his own belief as philosophical theism inspired by the works of philosopher Miguel de Unamuno. While eschewing systematic religious doctrine, he retained a belief in God, asserting that this belief cannot be confirmed or disconfirmed by reason or science. At the same time, he was skeptical of claims that any god has communicated with human beings through spoken or telepathic revelation or through miracles in the natural world. Gardner has been quoted as saying that he regarded parapsychology and other research into the paranormal as tantamount to "tempting God" and seeking "signs and wonders". He stated that while he would expect tests on the efficacy of prayers to be negative, he would not rule out "a priori" the possibility that as yet unknown paranormal forces may allow prayers to influence the physical world.
Gardner wrote repeatedly about what public figures such as Robert Maynard Hutchins, Mortimer Adler, and William F. Buckley, Jr. believed and whether their beliefs were logically consistent. In some cases, he attacked prominent religious figures such as Mary Baker Eddy on the grounds that their claims are unsupportable. His semi-autobiographical novel "The Flight of Peter Fromm" depicts a traditionally Protestant Christian man struggling with his faith, examining 20th century scholarship and intellectual movements and ultimately rejecting Christianity while remaining a theist.
Gardner said that he suspected that the fundamental nature of human consciousness may not be knowable or discoverable, unless perhaps a physics more profound than ("underlying") quantum mechanics is some day developed. In this regard, he said, he was an adherent of the "New Mysterianism".
Gardner was considered a leading authority on Lewis Carroll. His annotated version of "Alice's Adventures in Wonderland" and "Through the Looking Glass", explaining the many mathematical riddles, wordplay, and literary references found in the Alice books, was first published as "The Annotated Alice" (Clarkson Potter, 1960). Sequels were published with new annotations as "More Annotated Alice" (Random House, 1990), and finally as "The Annotated Alice: The Definitive Edition" (Norton, 1999), combining notes from the earlier editions and new material. The original book arose when Gardner found the Alice books "sort of frightening" when he was young, but found them fascinating as an adult. He felt that someone ought to annotate them, and suggested to a publisher that Bertrand Russell be asked; when the publisher was unable to get past Russell's secretary, Gardner was asked to take on the project himself.
There had long been annotated books written by scholars for other scholars, but Gardner was the first to write such a work for the general public, and soon many other writers followed his lead. Gardner himself went on to produce annotated editions of G. K. Chesterton's "The Innocence Of Father Brown" and "The Man Who Was Thursday", as well as of celebrated poems including "The Rime of the Ancient Mariner", "Casey at the Bat", "The Night Before Christmas", and "The Hunting of the Snark".
Gardner wrote two novels. He was a perennial fan of the Oz books written by L. Frank Baum, and in 1988 he published "Visitors from Oz", based on the characters in Baum's various Oz books. Gardner was a founding member of the International Wizard of Oz Club, and winner of its 1971 L. Frank Baum Memorial Award. His other novel was "The Flight of Peter Fromm" (1973), which reflected his lifelong fascination with religious belief and the problem of faith.
His short stories were collected in "The No-Sided Professor and Other Tales of Fantasy, Humor, Mystery, and Philosophy" (1987).
At the age of 95 Gardner wrote "Undiluted Hocus-Pocus: The Autobiography of Martin Gardner". He was living in a one-room apartment in Norman, Oklahoma and, as was his custom, wrote it on a typewriter and edited it using scissors and rubber cement. He took the title from a poem, a so-called grook, by his good friend Piet Hein, which perfectly expresses Gardner's abiding sense of mystery and wonder about existence.
Gardner's interest in wordplay led him to conceive of a magazine on recreational linguistics. In 1967 he pitched the idea to Greenwood Periodicals and nominated Dmitri Borgmann as editor. The resulting journal, "Word Ways", carried many of his articles; it was still publishing his submissions posthumously. He also wrote a "Puzzle Tale" column for "Asimov's Science Fiction" magazine from 1977 to 1986. Gardner was a member of the all-male literary banqueting club, the Trap Door Spiders, which served as the basis of Isaac Asimov's fictional group of mystery solvers, the Black Widowers.
Gardner often used pen names. In 1952, while working for the children's magazine "Humpty Dumpty", he contributed stories written by "Humpty Dumpty Jnr". For several years starting in 1953 he was a managing editor of "Polly Pigtails", a magazine for young girls, and also wrote under that name. His "Annotated Casey at the Bat" (1967) included a parody of the poem, attributed to "Nitram Rendrag" (his name spelled backwards). Using the pen name "Uriah Fuller", he wrote two books attacking the alleged psychic Uri Geller. In later years, Gardner often wrote parodies of his favorite poems under the name "Armand T. Ringer", an anagram of his name. In 1983 one George Groth panned Gardner's book "The Whys of a Philosophical Scrivener" in the "New York Review of Books". Only in the last line of the review was it revealed that George Groth was Martin Gardner himself.
In his January 1960 "Mathematical Games" column, Gardner introduced the fictitious "Dr. Matrix" and wrote about him often over the next two decades. Dr. Matrix was not exactly a pen name, although Gardner did pretend that everything in these columns came from the fertile mind of the good doctor. Then in 1979 Dr. Matrix himself published an article in the quite respectable "Two-Year College Mathematics Journal". It was called "Martin Gardner: Defending the Honor of the Human Mind" and contained a biography of Gardner and a history of his "Mathematical Games" column. It would be a further decade before Martin published an article in such a mathematics journal under his own name.
Gardner was known for his sometimes controversial philosophy of mathematics. He wrote negative reviews of "The Mathematical Experience" by Philip J. Davis and Reuben Hersh and "What Is Mathematics, Really?" by Hersh, both of which were critical of aspects of mathematical Platonism, and the first of which was well received by the mathematical community. While Gardner was often perceived as a hard-core Platonist, his reviews demonstrated some formalist tendencies. Gardner maintained that his views are widespread among mathematicians, but Hersh has countered that in his experience as a professional mathematician and speaker, this is not the case.
Over the years Gardner held forth on many contemporary issues, arguing for his points of view in fields from general semantics to fuzzy logic to watching TV (he once wrote a negative review of Jerry Mander's book "Four Arguments for the Elimination of Television"). He was a frequent contributor to "The New York Review of Books". His philosophical views are described and defended in his book "The Whys of a Philosophical Scrivener" (1983, revised 1999).
The numerous awards Gardner received include:
The Mathematical Association of America has established a Martin Gardner Lecture to be given each year on the last day of MAA MathFest, the summer meeting of the MAA. The first annual lecture, "Recreational Mathematics and Computer Science: Martin Gardner's Influence on Research", was given by Erik Demaine of the Massachusetts Institute of Technology on Saturday, August 3, 2019, at MathFest in Cincinnati.
There are eight bricks honoring Gardner in the Paul R. Halmos Commemorative Walk, installed by The Mathematical Association of America (MAA) at their Conference Center in Washington, D.C. Gardner has an Erdős number of 1.
Martin Gardner continued to write up until his death in 2010, and his community of fans grew to span several generations. Moreover, his influence was so broad that many of his fans had little or no contact with each other. This led Atlanta entrepreneur and puzzle collector Tom Rodgers to the idea of hosting a weekend gathering celebrating Gardner's contributions to recreational mathematics, rationality, magic, puzzles, literature, and philosophy. Although Gardner was famously shy, and would usually decline an honor if it required him to make a personal appearance, Rodgers persuaded him to attend the first such "Gathering 4 Gardner" (G4G), held in Atlanta in January 1993.
A second such get-together was held in 1996, again with Gardner in attendance, and this led Rodgers and his friends to make the gathering a regular, biennial event. Participants over the years have ranged from long-time Gardner friends such as John Horton Conway, Elwyn Berlekamp, Ronald Graham, Donald Coxeter, and Richard K. Guy, to newcomers like mathematician and mathematical artist Erik Demaine and mathematical video maker Vi Hart.
The program at the "G4G" meetings presents topics that Gardner wrote about. The first gathering in 1993 was G4G1 and the 1996 event was G4G2. Since then the event has been held in even-numbered years, so far always in Atlanta. The 2018 event was G4G13.
In a publishing career spanning 80 years (1930–2010), Gardner authored or edited over 100 books and countless articles, columns and reviews.
All Gardner's works were non-fiction except for two novels — "The Flight of Peter Fromm" (1973) and "Visitors from Oz" (1998) — and two collections of short pieces — "The Magic Numbers of Dr. Matrix" (1967, 1985) and "The No-Sided Professor" (1987).
MIDI timecode
MIDI time code (MTC) embeds the same timing information as standard SMPTE timecode as a series of small 'quarter-frame' MIDI messages. There is no provision for the user bits in the standard MTC messages; system-exclusive messages are used to carry this information instead. The quarter-frame messages are transmitted in a sequence of eight messages, thus a complete timecode value is specified every two frames. If the MIDI data stream is running close to capacity, the MTC data may arrive a little behind schedule, which has the effect of introducing a small amount of jitter. To avoid this, it is best to use a completely separate MIDI port for MTC data. Larger full-frame messages, which encapsulate a full frame's worth of timecode in a single message, are used to locate to a time while timecode is not running.
Unlike standard SMPTE timecode, MIDI timecode's quarter-frame and full-frame messages carry a two-bit flag value that identifies the rate of the timecode, specifying it as either 24 frame/s (film), 25 frame/s (PAL video), 29.97 frame/s (drop-frame NTSC), or 30 frame/s (non-drop NTSC).
MTC distinguishes between film speed and video speed only by the rate at which timecode advances, not by the information contained in the timecode messages; thus, 29.97 frame/s dropframe is represented as 30 frame/s dropframe at 0.1% pulldown.
MTC allows the synchronisation of a sequencer or DAW with other devices that can synchronise to MTC or for these devices to 'slave' to a tape machine that is striped with SMPTE. For this to happen a SMPTE to MTC converter needs to be employed. It is possible for a tape machine to synchronise to an MTC signal (if converted to SMPTE), if the tape machine is able to 'slave' to incoming timecode via motor control, which is a rare feature.
The MIDI time code is 32 bits long, of which 24 are used, while 8 bits are unused and always zero. Because the full-frame messages require that the most significant bit of each byte be zero (to form valid MIDI data bytes), there are really only 28 available bits, of which 4 are spare.
Like most audiovisual timecodes such as SMPTE time code, it encodes only time of day, repeating each 24 hours. Time is given in units of hours, minutes, seconds, and frames. There may be 24, 25, or 30 frames per second.
Unlike most other timecodes, the components are encoded in straight binary, not binary-coded decimal.
Each component is assigned one byte:
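The byte layout can be sketched in Python. This is an illustrative encoding following the MIDI specification (the two-bit rate flag occupies the top used bits of the hours byte); the helper names are ours:

```python
# Hypothetical helper packing a timecode into the four MTC data bytes.
# Layout per the MIDI spec: hours byte = 0rrhhhhh (rr = rate flag),
# minutes and seconds use 6 bits each, frames use 5 bits.
RATES = {24: 0b00, 25: 0b01, 29.97: 0b10, 30: 0b11}  # two-bit rate flag

def mtc_bytes(hours, minutes, seconds, frames, rate=25):
    hr = (RATES[rate] << 5) | (hours & 0x1F)   # 0rrhhhhh
    mn = minutes & 0x3F                        # 00mmmmmm
    sc = seconds & 0x3F                        # 00ssssss
    fr = frames & 0x1F                         # 000fffff
    return bytes([hr, mn, sc, fr])

print(mtc_bytes(1, 2, 3, 4).hex())  # -> '21020304' at 25 frame/s
```

Note that the used bits total 24 (7 + 6 + 6 + 5), matching the count given above.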
When there is a jump in the time code, a single full-frame time code message is sent to synchronize attached equipment. This takes the form of a special global system exclusive message:
The manufacturer ID of 0x7F indicates a real-time universal message, and the channel of 0x7F indicates it is a global broadcast. The following ID of 0x01 identifies this as a time code type message, and the second 0x01 indicates it is a full-frame time code message. The 4 bytes of time code follow. Although MIDI is generally little-endian, the 4 time code bytes follow in big-endian order, followed by a 0xF7 "end of exclusive" byte.
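As a sketch, the full-frame message can be assembled as follows (byte values as given in the MIDI specification; the function name is ours, and the four time code bytes are assumed to be pre-packed with the rate flag in the hours byte):

```python
def full_frame_message(hr, mn, sc, fr):
    """Build the ten-byte full-frame MTC system exclusive message."""
    return bytes([0xF0,            # start of system exclusive
                  0x7F,            # real-time universal manufacturer ID
                  0x7F,            # channel: global broadcast
                  0x01, 0x01,      # sub-IDs: time code, full-frame message
                  hr, mn, sc, fr,  # time code bytes, big-endian order
                  0xF7])           # end of exclusive
```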
After a jump, the time clock stops until the first following quarter-frame message is received.
When the time is running continuously, the 32-bit time code is broken into 8 4-bit pieces, and one piece is transmitted each quarter frame, i.e. 96–120 times per second depending on the frame rate. Since it takes eight quarter frames for a complete time code message, the complete SMPTE time is updated every two frames. A quarter-frame message consists of a status byte of 0xF1, followed by a single 7-bit data value: 3 bits to identify the piece, and 4 bits of partial time code. When time is running forward, the piece numbers increment from 0–7; the time at which piece 0 is transmitted is the coded instant, and the remaining pieces are transmitted later.
If the MIDI data stream is being rewound, the piece numbers count backward. Again, piece 0 is transmitted at the coded moment.
The time code is divided little-endian as follows:
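A sketch of that division, assuming the standard nibble ordering from the MIDI specification (low nibble of each byte first, frames through hours; the helper name is ours):

```python
def quarter_frames(hr, mn, sc, fr):
    """Split the four MTC bytes into eight 2-byte quarter-frame messages.

    Pieces 0-7 carry, in order, the low and high nibbles of the frames,
    seconds, minutes, and hours/rate bytes (little-endian: low nibble first).
    """
    msgs = []
    for piece, nibble_src in enumerate([fr, fr >> 4, sc, sc >> 4,
                                        mn, mn >> 4, hr, hr >> 4]):
        # status byte 0xF1, then 3 bits of piece number + 4 bits of time code
        msgs.append(bytes([0xF1, (piece << 4) | (nibble_src & 0x0F)]))
    return msgs
```

For example, the hours/rate byte 0x21 is sent as its low nibble in piece 6 and its high nibble (containing the rate flag) in piece 7.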
Mass transfer
Mass transfer is the net movement of mass from one location, usually meaning stream, phase, fraction or component, to another. Mass transfer occurs in many processes, such as absorption, evaporation, drying, precipitation, membrane filtration, and distillation. Mass transfer is used by different scientific disciplines for different processes and mechanisms. The phrase is commonly used in engineering for physical processes that involve diffusive and convective transport of chemical species within physical systems.
Some common examples of mass transfer processes are the evaporation of water from a pond to the atmosphere, the purification of blood in the kidneys and liver, and the distillation of alcohol. In industrial processes, mass transfer operations include separation of chemical components in distillation columns, absorbers such as scrubbers or strippers, adsorbers such as activated carbon beds, and liquid-liquid extraction. Mass transfer is often coupled to additional transport processes, for instance in industrial cooling towers. These towers couple heat transfer to mass transfer by allowing hot water to flow in contact with air. The water is cooled by expelling some of its content in the form of water vapour.
In astrophysics, mass transfer is the process by which matter gravitationally bound to a body, usually a star, fills its Roche lobe and becomes gravitationally bound to a second body, usually a compact object (white dwarf, neutron star or black hole), and is eventually accreted onto it. It is a common phenomenon in binary systems, and may play an important role in some types of supernovae and pulsars.
Mass transfer finds extensive application in chemical engineering problems. It is used in reaction engineering, separations engineering, heat transfer engineering, and many other sub-disciplines of chemical engineering like electrochemical engineering.
The driving force for mass transfer is usually a difference in chemical potential, when it can be defined, though other thermodynamic gradients may couple to the flow of mass and drive it as well. A chemical species moves from areas of high chemical potential to areas of low chemical potential. Thus, the maximum theoretical extent of a given mass transfer is typically determined by the point at which the chemical potential is uniform. For single-phase systems, this usually translates to uniform concentration throughout the phase, while for multiphase systems chemical species will often prefer one phase over the others and reach a uniform chemical potential only when most of the chemical species has been absorbed into the preferred phase, as in liquid-liquid extraction.
While thermodynamic equilibrium determines the theoretical extent of a given mass transfer operation, the actual rate of mass transfer will depend on additional factors including the flow patterns within the system and the diffusivities of the species in each phase. This rate can be quantified through the calculation and application of mass transfer coefficients for an overall process. These mass transfer coefficients are typically published in terms of dimensionless numbers, often including Péclet numbers, Reynolds numbers, Sherwood numbers and Schmidt numbers, among others.
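As an illustration of how such dimensionless numbers are used, a film mass transfer coefficient for flow past a sphere can be estimated from the classical Ranz-Marshall correlation, which ties together the Sherwood, Reynolds and Schmidt numbers mentioned above (the numerical values below are arbitrary examples, not data from this article):

```python
def ranz_marshall(Re, Sc):
    """Sherwood number for flow past a sphere: Sh = 2 + 0.6 Re^0.5 Sc^(1/3)."""
    return 2.0 + 0.6 * Re**0.5 * Sc**(1.0 / 3.0)

D = 2.0e-9   # example diffusivity of the species in water, m^2/s
d = 1.0e-3   # example sphere diameter, m
Sh = ranz_marshall(Re=100.0, Sc=500.0)
k = Sh * D / d   # film mass transfer coefficient, m/s
print(f"Sh = {Sh:.1f}, k = {k:.2e} m/s")
```

In the quiescent limit (Re = 0) the correlation reduces to Sh = 2, the pure-diffusion result for a sphere.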
There are notable similarities in the commonly used approximate differential equations for momentum, heat, and mass transfer. The molecular transfer equations of Newton's law for fluid momentum at low Reynolds number (Stokes flow), Fourier's law for heat, and Fick's law for mass are very similar, since they are all linear approximations to transport of conserved quantities in a flow field.
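The analogy can be made concrete by writing the three linear flux laws side by side in one-dimensional form, where $\mu$, $k$ and $D_{AB}$ are the respective transport coefficients:

```latex
\underbrace{\tau = -\mu \frac{du}{dy}}_{\text{Newton (momentum)}} \qquad
\underbrace{q = -k \frac{dT}{dx}}_{\text{Fourier (heat)}} \qquad
\underbrace{J_A = -D_{AB} \frac{dC_A}{dx}}_{\text{Fick (mass)}}
```

In each case the flux of a conserved quantity is proportional to the gradient of an intensive variable, which is why solutions to one transport problem can often be carried over to the others.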
At higher Reynolds number, the analogy between mass and heat transfer and momentum transfer becomes less useful due to the nonlinearity of the Navier-Stokes equation (or more fundamentally, the general momentum conservation equation), but the analogy between heat and mass transfer remains good. A great deal of effort has been devoted to developing analogies among these three transport processes so as to allow prediction of one from any of the others.
Museum of Jurassic Technology
The Museum of Jurassic Technology at 9341 Venice Boulevard in the Palms district of Los Angeles, California, was founded by David Hildebrand Wilson and Diana Drake Wilson in 1988. It calls itself "an educational institution dedicated to the advancement of knowledge and the public appreciation of the Lower Jurassic", the relevance of the term "Lower Jurassic" to the museum's collections being left uncertain and unexplained.
The museum's collection includes a mixture of artistic, scientific, ethnographic, and historic items, as well as some unclassifiable exhibits; the diversity evokes the cabinets of curiosities that were the 16th-century predecessors of modern natural-history museums. The factual claims of many of the museum's exhibits strain credibility, provoking an array of interpretations.
David Hildebrand Wilson received a MacArthur Foundation fellowship in 2001.
The museum contains an unusual collection of exhibits and objects with varying and uncertain degrees of authenticity. "The New York Times" critic Edward Rothstein described it as a "museum about museums", "where the persistent question is: what kind of place is this?" "Smithsonian" magazine called it "a witty, self-conscious homage to private museums of yore . . . when natural history was only barely charted by science, and museums were closer to Renaissance cabinets of curiosity." In a similar vein, "The Economist" said the museum "captures a time chronicled in Richard Holmes's recent book "The Age of Wonder", when science mingled with poetry in its pursuit of answers to life's mysterious questions."
Lawrence Weschler's book, "Mr. Wilson's Cabinet of Wonder: Pronged Ants, Horned Humans, Mice on Toast, And Other Marvels of Jurassic Technology", attempts to explain the mystery of the Museum of Jurassic Technology. Weschler deeply explores the museum through conversations with its founder, David Wilson, and through outside research on several exhibitions. His investigations into the history of certain exhibits led to varying results of authenticity; some exhibits seem to have been created by Wilson's imagination while other exhibits might be suitable for display in a natural history museum. The Museum of Jurassic Technology at its heart, according to Wilson, is "a museum interested in presenting phenomena that other natural history museums are unwilling to present."
The museum's introductory slideshow recounts that "In its original sense, the term, 'museum' meant 'a spot dedicated to the Muses, a place where man's mind could attain a mood of aloofness above everyday affairs'". In this spirit, the dimly lit atmosphere, wood and glass vitrines, and labyrinthine floorplan lead visitors through an eclectic range of exhibits on art, natural history, history of science, philosophy, and anthropology, with a special focus on the history of museums and the variety of paths to knowledge. The museum attracts approximately 25,000 visitors per year.
The museum maintains more than thirty permanent exhibits, including:
From 1992 to 2006, the museum's Foundation Collection was on display in its Tochtermuseum at the Karl Ernst Osthaus-Museum in Hagen, Germany. This exhibition was part of the Museum of Museums wing at the KEOM, which came into being under the stewardship of director Michael Fehr.
In 2005, the museum opened its Tula Tea Room, a Russian-style tea room where Georgian tea is served. This room is a miniature reconstruction of the study of Tsar Nicholas II from the Winter Palace in St. Petersburg, Russia. The Borzoi Kabinet Theater screens a series of poetic documentaries produced by the Museum of Jurassic Technology in collaboration with the St. Petersburg–based arts and science collective Kabinet. The series of films, entitled "A Chain of Flowers", draws its name from the quotation by Charles Willson Peale: "The Learner must be led always from familiar objects toward the unfamiliar, guided along, as it were, a chain of flowers into the mysteries of life". The titles of the films are "Levsha: The Cross-eyed Lefty from Tula and the Steel Flea" (2001), "Obshee Delo: The Common Task" (2005), "Bol'shoe Sovietskaia Zatmenie: The Great Soviet Eclipse" (2008), "The Book of Wisdom and Lies" (2011), and "Language of the Birds" (2012).
The museum was the subject of a 1995 book by Lawrence Weschler entitled "Mr. Wilson's Cabinet of Wonder: Pronged Ants, Horned Humans, Mice on Toast, and Other Marvels of Jurassic Technology", which describes in detail many of its exhibits. The museum is mentioned in the novel "The Museum of Innocence", by Turkish Nobel-laureate Orhan Pamuk.
Men at Work
Men at Work are an Australian rock band formed in 1979 and best known for hits such as "Who Can It Be Now?" and "Down Under". Its founding member was Colin Hay on lead vocals and guitar. After playing as an acoustic duo with Ron Strykert during 1978–79, he formed the group with Strykert playing bass guitar, and Jerry Speiser on drums. They were soon joined by Greg Ham on flute, saxophone, and keyboards and John Rees on bass guitar, with Strykert then switching to lead guitar. The group was managed by Russell Depeller, a friend of Hay, whom he met at Latrobe University. This line-up achieved national and international success in the early 1980s. In January 1983, they were the first Australian artists to have a simultaneous No. 1 album and No. 1 single in the United States "Billboard" charts: "Business as Usual" (released on 9 November 1981) and "Down Under" (1981), respectively. With the same works, they achieved the distinction of a simultaneous No. 1 album and No. 1 single on the Australian, New Zealand, and United Kingdom charts. Their second album "Cargo" (2 May 1983) was also No. 1 in Australia, No. 2 in New Zealand, No. 3 in the US, and No. 8 in the UK. Their third album "Two Hearts" (3 April 1985) reached the top 20 in Australia and top 50 in the US.
They won the Grammy Award for Best New Artist in 1983, they were inducted into the ARIA Hall of Fame in 1994, and they have sold over 30 million albums worldwide. In May 2001, "Down Under" was listed at No. 4 on the APRA Top 30 Australian songs and "Business as Usual" appeared in the book "100 Best Australian Albums" (October 2010).
The original band line-up split in two in 1984, with Speiser and Rees being asked to leave the group. This left Hay, Ham and Strykert. During the recording of the "Two Hearts" album, Strykert decided to leave. Soon after the release of "Two Hearts", Ham left also, leaving Hay as the sole remaining member. Hay and Ham toured the world as Men at Work from 1996 until 2002. On 19 April 2012, Ham was found dead at his home from an apparent heart attack. In 2019, Hay revived the moniker and began touring as Men at Work with a backing band.
The nucleus of Men at Work formed in Melbourne around June 1979 with Colin Hay on lead vocals and guitar, Ron Strykert on bass guitar, and Jerry Speiser on drums. They were soon joined by Greg Ham on flute, sax and keyboards, and then John Rees on bass guitar, with Strykert switching to lead guitar. Hay had emigrated to Australia in 1967 from Scotland with his family. In 1978, he had formed an acoustic duo with Strykert, which expanded by mid-1979 with the addition of Speiser. Around this time, as a side project, keyboardist Greg Sneddon (ex-Alroy Band), a former bandmate of Jerry Speiser, together with Speiser, Hay and Strykert, performed and recorded the music to "Riff Raff", a low-budget stage musical on which Sneddon had worked.
Hay had asked Greg Ham to join the group, but Greg had hesitated, as he was finishing his music degree. Ultimately, he decided to join the band in October 1979. John Rees, a friend of Jerry, joined soon after. The name Men At Work was thrown into the hat by Colin Hay, and was seconded by Ron Strykert, when a name was required to put on the blackboard outside The Cricketer's Arms Hotel, Richmond. The band built a "grass roots" reputation as a pub rock band. In 1980, the group issued their debut single, "Keypunch Operator" backed by "Down Under", with both tracks co-written by Hay and Strykert. It was "self-financed" and appeared on their own independent, M. A. W. label. Australian musicologist, Ian McFarlane, felt the A-side was "a fast-paced country-styled rocker with a clean sound and quirky rhythm". Despite not appearing in the top 100 on the Australian Kent Music Report Singles Chart, by the end of that year the group had "grown in stature to become the most in-demand and highly paid, unsigned band of the year".
Early in 1981 Men at Work signed with CBS Records, the Australian branch of CBS Records International, (which became Sony Music) on the recommendation of Peter Karpin, the label's A&R person. The group's first single with CBS Records in Australia "Who Can It Be Now?", was released in June 1981 which reached No. 2 and remained in the chart for 24 weeks. It had been produced by United States-based Peter McIan, who was also working on their debut album, "Business as Usual".
McIan, together with the band, worked on the arrangements for all the songs that appeared on "Business As Usual". Their next single was a re-arranged and "popified" version of "Down Under". It appeared in October that year and reached No. 1 in November, where it remained for six weeks. "Business as Usual" was also released in October and went to No. 1 on the Australian Kent Music Report Albums Chart, spending a total of nine weeks at the top spot. Garry Raffaele of "The Canberra Times" opined that it "generally stays at a high level, tight and jerky ... There is a delicacy about this music — and that is not a thing you can say about too many rock groups. The flute and reeds of Greg Ham do much to further that". McFarlane noted that "[a]side from the strength of the music, part of the album's appeal was its economy. The production sound was low-key, but clean and uncluttered. Indeed, the songs stood by themselves with little embellishment save for a bright, melodic, singalong quality".
By February the following year both "Down Under" and "Business as Usual" had reached No. 1 on the respective Official New Zealand Music Charts – the latter was the first Australian album to reach that peak in New Zealand. Despite its strong Australian and New Zealand showing, and having an American producer (McIan), "Business as Usual" was twice rejected by Columbia's US parent company. Thanks to the persistence of Russell Depeller and Karpin, the album was finally released in the US and the United Kingdom in April 1982 – six months after its Australian release. Their next single, "Be Good Johnny", was issued in Australia in April 1982 and reached No. 8 in Australia, and No. 3 in New Zealand.
Men at Work initially broke through to North American audiences in the western provinces of Canada with "Who Can It Be Now?" hitting the top 10 on radio stations in Winnipeg by May 1982. It peaked at No. 8 on the Canadian "RPM" Top Singles Chart in July. In August the group toured Canada and the US to promote the album and related singles, supporting Fleetwood Mac. The band became more popular on Canadian radio in the following months and also started receiving top 40 US airplay by August. In October "Who Can It Be Now?" reached No. 1 on the US "Billboard" Hot 100, while Canada was one single ahead with "Down Under" topping the Canadian charts that same month. In the following month "Business as Usual" began a 15-week run at No. 1 on the "Billboard" 200.
While "Who Can It Be Now?" was still in the top ten in the US, "Down Under" was finally released in that market. It entered the US charts at No. 79 and ten weeks later, it was No. 1. By January 1983 Men at Work had the top album and single in both the US and the UK – never previously achieved by an Australian act. "Be Good Johnny" received moderate airplay in the US; it reached the top 20 in Canada.
"Down Under" gained international media exposure in September 1983 through television coverage of the Australian challenge for the America's Cup yacht trophy, when it was adopted as the theme song by the crew of the successful "Australia II".
The band released their second album, "Cargo", in April 1983, which also peaked at No. 1 – for two weeks – on the Australian charts. In New Zealand it reached No. 2. It had been finished in mid-1982 with McIan producing again, but was held back due to the success of their debut album on the international market, where "Business as Usual" was still riding high. "Cargo" appeared at No. 3 on the "Billboard" 200, and No. 8 in the UK. The lead single, "Overkill", was issued in Australia ahead of the album in October 1982 and reached No. 6; it peaked at No. 3 in the US. "Dr. Heckyll & Mr. Jive" followed in March 1983 and made it to No. 5 in Australia, and No. 28 in the US. "It's a Mistake" reached No. 6 in the US. The band toured the world extensively in 1983.
In 1984, long-standing tensions between Hay and Speiser led to a split in the band. Both Rees and Speiser were told they were "not required", as Hay, Ham and Strykert used session musicians to record their third album, "Two Hearts" (23 April 1985). Studio musicians included Jeremy Alsop on bass guitar (ex-Ram Band, Pyramid, Broderick Smith Band); and Mark Kennedy on drums (Spectrum, Ayers Rock, Marcia Hines Band). "Two Hearts" was produced by Hay and Ham. It was a critical and commercial failure compared to their previous albums and only peaked at No. 16 in Australia, and No. 50 on the US chart. Strykert had left during its production.
Four tracks were released as singles, "Everything I Need" (May 1985), "Man with Two Hearts", "Maria" (August), and "Hard Luck Story" (October); only the lead single charted in Australia (No. 37) and the US (No. 47). The album relied heavily on drum machines and synthesisers, and reduced the presence of Ham's saxophone, giving it a different feel compared to its predecessors. Hay and Ham hired new bandmates, to tour in support of "Two Hearts", with Alsop and Kennedy joined by James Black on guitar and keyboards (Mondo Rock, The Black Sorrows). Soon after a third guitarist, Colin Bayley (Mi-Sex), was added and Kennedy was replaced on drums by Chad Wackerman (Frank Zappa). Australian singers Kate Ceberano and Renée Geyer had also worked on the album and performed live as guest vocalists.
On 13 July 1985 Men at Work performed three tracks for the Oz for Africa concert (part of the global Live Aid program)—"Maria", "Overkill", and an unreleased one, "The Longest Night". They were broadcast in Australia (on both Seven Network and Nine Network) and on MTV in the US. "Maria" and "Overkill" were also broadcast by American Broadcasting Company (ABC) during their Live Aid telecast. Ham left during the band's time touring behind the album. The final Men at Work performances during 1985 had jazz saxophonist Paul Williamson (The Black Sorrows) replacing Ham. By early 1986 the band was defunct and Hay started recording his first solo album, "Looking for Jack" (January 1987), which had Alsop and Wackerman as session musicians.
By mid-1996, after a ten-year absence, Hay and Ham reformed Men at Work to tour South America. They had enjoyed strong fan support there during their earlier career and demands for a reunion had persisted. The 1996 line up had Stephen Hadley on bass guitar and backing vocals (ex-The Black Sorrows, Paul Kelly Band); Simon Hosford on guitar and backing vocals (Colin Hay backing band); and John Watson on drums (The Black Sorrows). The tour culminated in a performance in São Paulo, which was recorded for the Brazilian release of a live album, "Brazil '96", in 1997, which was co-produced by Hay and Ham for Sony Records. It was re-released worldwide in 1998 as "Brazil" with a bonus track, "The Longest Night", the first new studio track since "Two Hearts".
In 1997 drummer Tony Floyd replaced Watson but by 1998 the lineup was Hay, Ham, James Ryan (guitar, backing vocals), Rick Grossman (of the Hoodoo Gurus) on bass and Peter Maslen (ex-Boom Crash Opera) on drums. In 1999 Ryan, Grossman and Maslen were out and Hosford and Floyd were back in, along with bassist Stuart Speed. Rodrigo Aravena was brought in on bass in 2000, along with Heta Moses on drums. Moses was replaced by Warren Trout in 2001 as Stephen Hadley returned on bass.
The band toured Australia, South America, Europe and the US from 1998 to 2000. Men at Work performed "Down Under" at the closing ceremony of the 2000 Summer Olympics in Sydney, alongside Paul Hogan of ""Crocodile" Dundee" (1986).
One of their European tours for mid-2000 was cancelled and the group had disbanded by 2002, although Hay and Ham periodically reunited Men at Work with guest musicians (including an appearance in February 2009, when they performed "Down Under" as a duo at the Australia Unites Victorian Bushfire Appeal Telethon).
In February 2010 Larrikin Music Publishing won a case against Hay and Strykert, their record label (Sony BMG Music Entertainment) and music publishing company (EMI Songs Australia) arising from the uncredited appropriation of "Kookaburra", originally written in 1932 by Marion Sinclair and for which Larrikin owned the publishing rights, as the flute line in the Men at Work song, "Down Under". Back in early 2009 the Australian music-themed TV quiz, "Spicks and Specks", had posed a question which suggested that "Down Under" contained elements of "Kookaburra".
Larrikin, then headed by Norman Lurie, filed suit after Larrikin was sold to another company and had demanded between 40% and 60% of the previous six years of earnings from the song. In February 2010 the judge ruled that "Down Under" did contain a flute riff based on "Kookaburra" but stipulated that neither was it necessarily the hook nor a substantial part of the hit song (Hay and Strykert had written the track years before the flute riff was added by Ham). In July 2010 a judge ruled that Larrikin should be paid 5% of past (since 2002) and future profits. Ham took the verdict particularly hard, feeling responsible for having performed the flute riff at the centre of the lawsuit and worried that he would only be remembered for copying someone else's music, resulting in depression and anxiety. Ham's body was found in his Carlton North home on 19 April 2012 after he suffered a fatal heart attack at age 58.
In June 2019, Hay toured Europe with a group of Los Angeles-based session musicians under the name Men at Work, despite the line-up featuring no other original members.
Hay maintained a solo career and played with Ringo Starr & His All-Starr Band. Strykert relocated to Hobart in 2009 from Los Angeles, and continued to play music and released his first solo album, "Paradise", in September that year. He expressed resentment towards Hay, mainly over royalties. Ham remained musically active and played sax with the Melbourne-based group The Nudist Funk Orchestra until his death. Rees was a music teacher in Melbourne and also played the violin and bass guitar for the band Beggs 2 Differ. Speiser played drums for the band, The Afterburner.
The group won the 1983 Grammy Award for Best New Artist; the other nominees were Asia, Jennifer Holliday, The Human League and Stray Cats. In August 1983 they were given a Crystal Globe Award for $100 million worth of record business by their US label. That same year in Canada they were awarded a Juno Award for "International LP of the Year". Men at Work have sold over 30 million albums worldwide.
At the ARIA Music Awards of 1994 they were inducted into the related Hall of Fame. On 28 May 2001 "Down Under" was listed at No. 4 on the APRA Top 30 Australian songs. In October 2010, "Business as Usual" was listed in the book, "100 Best Australian Albums".
Colin Hay has been the only constant member in all configurations.
Meconium aspiration syndrome
Meconium aspiration syndrome (MAS), also known as neonatal aspiration of meconium, is a medical condition affecting newborn infants. It describes the spectrum of disorders and pathophysiology of newborns born through meconium-stained amniotic fluid (MSAF) who have meconium within their lungs. MAS therefore has a wide range of severity depending on what conditions and complications develop after parturition. Furthermore, the pathophysiology of MAS is multifactorial and extremely complex, which is why it is the leading cause of morbidity and mortality in term infants.
The word "meconium" is derived from the Greek word "mēkōnion" meaning "juice from the opium poppy" as the sedative effects it had on the foetus were observed by Aristotle.
Meconium is a sticky dark-green substance which contains gastrointestinal secretions, amniotic fluid, bile acids, bile, blood, mucus, cholesterol, pancreatic secretions, lanugo, vernix caseosa and cellular debris. Meconium accumulates in the foetal gastrointestinal tract throughout the third trimester of pregnancy and it is the first intestinal discharge released within the first 48 hours after birth. Notably, since meconium and the whole content of the gastrointestinal tract is located ‘extracorporeally,’ its constituents are hidden and normally not recognised by the foetal immune system.
For the meconium within the amniotic fluid to successfully cause MAS, it has to enter the respiratory system during the period when the fluid-filled lungs transition into an air-filled organ capable of gas exchange.
The main theories of meconium passage into amniotic fluid are caused by foetal maturity or from foetal stress as a result of hypoxia or infection. Other factors that promote the passage of meconium "in utero" include placental insufficiency, maternal hypertension, pre-eclampsia and maternal drug use of tobacco and cocaine. However, the exact mechanism for meconium passage into the amniotic fluid is not completely understood and it may be a combination of several factors.
There may be an important association between foetal distress and hypoxia with MSAF. It is believed that foetal distress develops into foetal hypoxia, causing the foetus to defecate meconium, resulting in MSAF and then perhaps MAS. Other stressors which cause foetal distress, and therefore meconium passage, include umbilical vein oxygen saturation below 30%.
Foetal hypoxic stress during parturition can stimulate colonic activity, by enhancing intestinal peristalsis and relaxing the anal sphincter, which results in the passage of meconium. Then, because of intrauterine gasping or from the first few breaths after delivery, MAS may develop. Furthermore, aspiration of thick meconium leads to obstruction of airways resulting in a more severe hypoxia.
It is important to note that the association between foetal distress and meconium passage is not a definite cause-effect relationship, as over ¾ of infants with MSAF are vigorous at birth and do not have any distress or hypoxia. Additionally, foetal distress frequently occurs without the passage of meconium.
Although meconium is present in the gastrointestinal tract early in development, MSAF rarely occurs before 34 weeks gestation.
Peristalsis of the foetal intestines is present as early as 8 weeks gestation and the anal sphincter develops at about 20–22 weeks. The early control mechanisms of the anal sphincter are not well understood; however, there is evidence that the foetus defecates routinely into the amniotic cavity even in the absence of distress. Foetal intestinal enzymes have been found in the amniotic fluid of women who are as early as 14–22 weeks pregnant, suggesting that there is free passage of the intestinal contents into the amniotic fluid.
Motilin is found in higher concentrations in post-term than pre-term foetal gastrointestinal tracts. Similarly, intestinal parasympathetic innervation and myelination also increases in later gestations. Therefore, the increased incidence of MAS in post-term pregnancies may reflect the maturation and development of the peristalsis within the gastrointestinal tract in the newborn.
As MAS describes a spectrum of disorders of newborns born through MSAF, without any congenital respiratory disorders or other underlying pathology, there are numerous hypothesised mechanisms and causes for the onset of this syndrome. Long-term consequences may arise from these disorders, for example, infants that develop MAS have higher rates of developing neurodevelopmental defects due to poor respiration.
In the first 15 minutes of meconium aspiration, there is obstruction of larger airways which causes increased lung resistance, decreased lung compliance, acute hypoxemia, hypercapnia, atelectasis and respiratory acidosis. After 60 minutes of exposure, the meconium travels further down into the smaller airways. Once within the terminal bronchioles and alveoli, the meconium triggers inflammation, pulmonary edema, vasoconstriction, bronchoconstriction, collapse of airways and inactivation of surfactant.
The lung areas which do not or only partially participate in ventilation, because of obstruction and/or destruction, will become hypoxic and an inflammatory response may consequently occur. Partial obstruction will lead to air trapping and hyperinflation of certain lung areas and pneumothorax may follow. Chronic hypoxia will lead to an increase in pulmonary vascular smooth muscle tone and persistent pulmonary hypertension causing respiratory and circulatory failure.
Microorganisms, most commonly Gram-negative rods, and endotoxins are found in samples of MSAF at a higher rate than in clear amniotic fluid; for example, 46.9% of patients with MSAF also had endotoxins present. A microbial invasion of the amniotic cavity (MIAC) is more common in patients with MSAF and could ultimately lead to an intra-amniotic inflammatory response. MIAC is associated with high concentrations of cytokines (such as IL-6), chemokines (such as IL-8 and monocyte chemoattractant protein-1), complement, phospholipase A2 and matrix-degrading enzymes. Therefore, these mediators within the amniotic fluid during MIAC and intra-amniotic infection could, when aspirated "in utero", induce lung inflammation within the foetus.
Meconium has a complex chemical composition, so it is difficult to identify a single agent responsible for the several diseases that arise. As meconium is stored inside the intestines, and is partly unexposed to the immune system, when it is aspirated the innate immune system recognises it as a foreign and dangerous substance. The innate immune system, which is present at birth, responds within minutes, with low specificity and no memory, to try to eliminate microbes. Meconium perhaps leads to chemical pneumonitis as it is a potent activator of inflammatory mediators, including cytokines, complement, prostaglandins and reactive oxygen species.
Meconium is a source of pro-inflammatory cytokines, including tumour necrosis factor (TNF) and interleukins (IL-1, IL-6, IL-8), and mediators produced by neutrophils, macrophages and epithelial cells that may injure the lung tissue directly or indirectly. For example, proteolytic enzymes are released from neutrophilic granules and these may damage the lung membrane and surfactant proteins. Additionally, activated leukocytes and cytokines generate reactive nitrogen and oxygen species which have cytotoxic effects. Oxidative stress results in vasoconstriction, bronchoconstriction, platelet aggregation and accelerated cellular apoptosis. Recently, it has been hypothesised that meconium is a potent activator of toll-like receptors (TLRs) and complement, key mediators in inflammation, and may thus contribute to the inflammatory response in MAS.
Meconium contains high amounts of phospholipase A2 (PLA2), a potent proinflammatory enzyme, which may directly (or through the stimulation of arachidonic acid) lead to surfactant dysfunction, lung epithelium destruction, tissue necrosis and an increase in apoptosis. Meconium can also activate the coagulation cascade, production of platelet-activating factor (PAF) and other vasoactive substances that may lead to destruction of capillary endothelium and basement membranes. Injury to the alveolocapillary membrane results in leakage of liquid, plasma proteins, and cells into the interstitium and alveolar spaces.
Surfactant is synthesised by type II alveolar cells and is made of a complex of phospholipids, proteins and saccharides. It functions to lower surface tension (to allow lung expansion during inspiration), stabilise alveoli at the end of expiration (to prevent alveolar collapse) and prevent lung oedema. Surfactant also contributes to lung protection and defence, as it is an anti-inflammatory agent and enhances the removal of inhaled particles and senescent cells away from the alveolar structure.
The extent of surfactant inhibition depends on both the concentration of surfactant and meconium. If the surfactant concentration is low, even very highly diluted meconium can inhibit surfactant function whereas, in high surfactant concentrations, the effects of meconium are limited. Meconium may impact surfactant mechanisms by preventing surfactant from spreading over the alveolar surface, decreasing the concentration of surfactant proteins (SP-A and SP-B), and by changing the viscosity and structure of surfactant. Several morphological changes occur after meconium exposure, the most notable being the detachment of airway epithelium from stroma and the shedding of epithelial cells into the airway. These indicate a direct detrimental effect on lung alveolar cells because of the introduction of meconium into the lungs.
Persistent pulmonary hypertension (PPHN) is the failure of the foetal circulation to adapt to extra-uterine conditions after birth. PPHN is associated with various respiratory diseases, including MAS (as 15-20% of infants with MAS develop PPHN), but also pneumonia and sepsis. A combination of hypoxia, pulmonary vasoconstriction and ventilation/perfusion mismatch can trigger PPHN, depending on the concentration of meconium within the respiratory tract. PPHN in newborns is the leading cause of death in MAS.
Apoptosis is an important mechanism in the clearance of injured cells and in tissue repair; however, too much apoptosis may cause harm, such as acute lung injury. Meconium induces apoptosis and DNA cleavage of lung airway epithelial cells, detected by the presence of fragmented DNA within the airways and in alveolar epithelial nuclei. Meconium induces an inflammatory reaction within the lungs, as there is an increase in autophagocytic cells and levels of caspase 3 after exposure. After 8 hours of meconium exposure in rabbit foetuses, the total amount of apoptotic cells is 54%. Therefore, the majority of meconium-induced lung damage may be due to the apoptosis of lung epithelium.
Respiratory distress in an infant born through darkly coloured MSAF, together with meconium obstructing the airways, is usually sufficient to diagnose MAS. Newborns with MAS can also show other signs of respiratory compromise, such as tachypnea and hypercapnia. MAS can sometimes be difficult to diagnose, as it can be confused with other diseases that also cause respiratory distress, such as pneumonia. X-rays and lung ultrasound are quick, easy and cheap imaging techniques that can assist in diagnosing lung diseases like MAS.
In general, the incidence of MAS has declined significantly over the past two decades as the number of post-term deliveries has fallen. Currently, labour is induced in women whose pregnancies exceed 41 weeks gestation.
Prevention during pregnancy may include amnioinfusion and antibiotics, but the effectiveness of these treatments is questionable.
As previously mentioned, oropharyngeal and nasopharyngeal suctioning is not an ideal preventative treatment for both vigorous and depressed (not breathing) infants.
Most infants born through MSAF do not require any treatment (other than routine postnatal care) as they show no signs of respiratory distress; only approximately 5% of infants born through MSAF develop MAS. However, infants who do develop MAS need to be admitted to a neonatal unit, where they will be closely observed and given any treatment needed. Observations include monitoring heart rate, respiratory rate, oxygen saturation and blood glucose (to detect worsening respiratory acidosis or the development of hypoglycemia). In general, treatment of MAS is largely supportive.
To clear the airways of meconium, tracheal suctioning can be used; however, the efficacy of this method is in question and it can cause harm.
In cases of MAS, supplemental oxygen is needed for at least 12 hours in order to maintain oxygen saturation of haemoglobin at 92% or more. The severity of respiratory distress varies significantly between newborns with MAS: some require minimal or no supplemental oxygen and, in severe cases, mechanical ventilation may be needed. The desired oxygen saturation is between 90 and 95%, and PaO2 may be as high as 90 mmHg. In cases where there is thick meconium deep within the lungs, mechanical ventilation may be required. In extreme cases, extracorporeal membrane oxygenation (ECMO) may be used in infants who fail to respond to ventilation therapy. While on ECMO, the body has time to absorb the meconium and for the associated disorders to resolve. There has been an excellent response to this treatment, as the survival rate of MAS while on ECMO is more than 94%.
Ventilation of infants with MAS can be challenging and, as MAS can affect each individual differently, ventilation administration may need to be customised. Some newborns with MAS can have homogenous lung changes and others can have inconsistent and patchy changes to their lungs. It is common for sedation and muscle relaxants to be used to optimise ventilation and minimise the risk of pneumothorax associated with dyssynchronous breathing.
Inhaled nitric oxide (iNO) acts on vascular smooth muscle causing selective pulmonary vasodilation. This is ideal in the treatment of PPHN as it causes vasodilation within ventilated areas of the lung thus, decreasing the ventilation-perfusion mismatch and thereby, improves oxygenation. Treatment utilising iNO decreases the need for ECMO and mortality in newborns with hypoxic respiratory failure and PPHN as a result of MAS. However, approximately 30-50% of infants with PPHN do not respond to iNO therapy.
As inflammation is such a significant component of MAS, treatment has included anti-inflammatory agents.
Glucocorticoids (GCs) have strong anti-inflammatory activity and work to reduce the migration and activation of neutrophils, eosinophils, mononuclear cells and other cells. GCs reduce the migration of neutrophils into the lungs by decreasing their adherence to the endothelium. Thus, there is a reduction in the action of mediators released from these cells and, therefore, a reduced inflammatory response.
GCs also possess a genomic mechanism of action in which, once bound to a glucocorticoid receptor, the activated complex moves into the nucleus and inhibits transcription of mRNA, ultimately affecting whether various proteins are produced. Inhibiting the transcription of nuclear factor (NF-κB) and activator protein (AP-1) attenuates the expression of pro-inflammatory cytokines (IL-1, IL-6, IL-8, TNF, etc.), enzymes (PLA2, COX-2, iNOS, etc.) and other biologically active substances. The anti-inflammatory effect of GCs is also demonstrated by enhancing the activity of lipocortins, which inhibit the activity of PLA2 and, therefore, decrease the production of arachidonic acid and of mediators of the lipoxygenase and cyclooxygenase pathways.
Anti-inflammatories need to be administered as quickly as possible, as the effect of these drugs can diminish even an hour after meconium aspiration. For example, early administration of dexamethasone significantly enhanced gas exchange, reduced ventilatory pressures, decreased the number of neutrophils in the bronchoalveolar area, reduced oedema formation and reduced oxidative lung injury. However, GCs may increase the risk of infection, and this risk increases with the dose and duration of glucocorticoid treatment. Other issues can arise, such as aggravation of diabetes mellitus, osteoporosis, skin atrophy and growth retardation in children.
Phosphodiesterases (PDEs) degrade cAMP and cGMP and, within the respiratory system of a newborn with MAS, various isoforms of PDE may be involved due to their pro-inflammatory and smooth-muscle contractile activity. Therefore, non-selective and selective PDE inhibitors could potentially be used in MAS therapy, although they can cause cardiovascular side effects. Non-selective PDE inhibitors, such as methylxanthines, increase concentrations of cAMP and cGMP in the cells, leading to bronchodilation and vasodilation. Additionally, methylxanthines decrease the concentrations of calcium, acetylcholine and monoamines, which controls the release of various mediators of inflammation and bronchoconstriction, including prostaglandins. Selective PDE inhibitors target one subtype of phosphodiesterase; in MAS, the activities of PDE-3, PDE-4, PDE-5 and PDE-7 may be enhanced. For example, milrinone (a selective PDE3 inhibitor) improved oxygenation and survival of neonates with MAS.
Arachidonic acid is metabolised, via cyclooxygenase (COX) and lipoxygenase, to various substances including prostaglandins and leukotrienes, which exhibit potent pro-inflammatory and vasoactive effects. By inhibiting COX, and more specifically COX-2 (through either selective or non-selective drugs), inflammation and oedema can be reduced. However, COX inhibitors may induce peptic ulcers and cause hyperkalemia and hypernatremia, and they have not shown great efficacy in the treatment of MAS.
Meconium is typically sterile; however, it can contain various cultures of bacteria, so appropriate antibiotics may need to be prescribed.
Lung lavage with diluted surfactant is a new treatment with potentially beneficial results depending on how early it is administered in newborns with MAS. This treatment shows promise as it has a significant effect on air leaks, pneumothorax, the need for ECMO and death. Early intervention and using it on newborns with mild MAS is more effective. However, there are risks as a large volume of fluid instillation to the lung of a newborn can be dangerous (particularly in cases of severe MAS with pulmonary hypertension) as it can exacerbate hypoxia and lead to mortality.
Originally, it was believed that MAS developed as a result of meconium physically blocking the airways. Thus, to prevent newborns born through MSAF from developing MAS, suctioning of the oropharyngeal and nasopharyngeal area before delivery of the shoulders, followed by tracheal aspiration, was used for 20 years. This treatment was believed to be effective, as it was reported to significantly decrease the incidence of MAS compared with untreated newborns born through MSAF. This claim was later disproved, and subsequent studies concluded that oropharyngeal and nasopharyngeal suctioning before delivery of the shoulders in infants born through MSAF does not prevent MAS or its complications. In fact, it can cause more issues and damage (e.g. mucosal damage), so it is not a recommended preventative treatment. Suctioning may not significantly reduce the incidence of MAS because meconium passage and aspiration may occur "in utero", making the suctioning redundant as the meconium may already be deep within the lungs at the time of birth.
Historically, amnioinfusion has been used when MSAF was present, which involves a transcervical infusion of fluid during labour. The idea was to dilute the thick meconium to reduce its potential pathophysiology and reduce cases of MAS, since MAS is more prevalent in cases of thick meconium. However, there are associated risks, such as umbilical cord prolapse and prolongation of labour. The UK National Institute of Health and Clinical Excellence (NICE) Guidelines recommend against the use of amnioinfusion in women with MSAF.
MSAF occurs in 1 in every 7 pregnancies and, of these cases, approximately 5% of infants develop MAS. MSAF is observed in 23–52% of pregnancies at 42 weeks; the frequency of MAS therefore increases with the length of gestation, such that the prevalence is greatest in post-term pregnancies. Conversely, preterm births are not frequently associated with MSAF (only approximately 5% involve MSAF). The rate of MAS declines in populations where labour is induced in women whose pregnancies exceed 41 weeks. There are many suspected pre-disposing factors thought to increase the risk of MAS. For example, the risk of MSAF is higher in African American, African and Pacific Islander mothers, compared with mothers from other ethnic groups.
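The overall birth prevalence implied by these figures can be sketched with a short calculation (an illustrative back-of-the-envelope estimate based only on the rates quoted above, not an epidemiological model):

```python
# Rough arithmetic from the figures above: about 1 in 7 pregnancies
# show MSAF, and roughly 5% of infants born through MSAF develop MAS.

msaf_rate = 1 / 7        # proportion of pregnancies with MSAF
mas_given_msaf = 0.05    # proportion of MSAF infants who develop MAS

# Combined estimate of MAS among all births
overall_mas_rate = msaf_rate * mas_given_msaf
print(f"{overall_mas_rate:.4f}")  # prints 0.0071, i.e. about 7 per 1,000 births
```

This suggests an overall MAS rate on the order of 0.7% of births, consistent with MAS being uncommon despite MSAF being relatively frequent.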
Research is focused on developing both a successful method for preventing MAS and an effective treatment. For example, investigations are being made into the efficacy of anti-inflammatory agents, surfactant replacement therapy and antibiotic therapy. More research needs to be conducted on the pharmacological properties of, for example, glucocorticoids, including dosage, administration, timing and drug interactions. Additionally, research continues on whether intubation and suctioning of meconium in newborns with MAS is beneficial, harmful or simply a redundant and outdated treatment. In general, there is still no generally accepted therapeutic protocol or effective treatment plan for MAS. | https://en.wikipedia.org/wiki?curid=20452 |
Meconium
Meconium is the earliest stool of a mammalian infant. Unlike later feces, meconium is composed of materials ingested during the time the infant spends in the uterus: intestinal epithelial cells, lanugo, mucus, amniotic fluid, bile and water. It is viscous and sticky like tar, its color usually being a very dark olive green, and it is almost odorless. When diluted in amniotic fluid, it may appear in various shades of green, brown, or yellow. It should be completely passed by the end of the first few days after birth, with the stools progressing toward yellow (digested milk).
Meconium is normally retained in the infant's bowel until after birth, but sometimes it is expelled into the amniotic fluid (also called "amniotic liquor") prior to birth or during labor and delivery. The stained amniotic fluid (called "meconium liquor" or "meconium-stained liquor") is recognized by medical staff as a possible sign of fetal distress. Some post-dates pregnancies (where the woman is more than 40 weeks pregnant) may also have meconium-stained liquor without fetal distress. Medical staff may aspirate the meconium from the nose and mouth of a newborn immediately after delivery in the event the baby shows signs of respiratory distress to decrease the risk of meconium aspiration syndrome, which can occur in meconium-stained amniotic fluid.
Most of the time that the amniotic fluid is stained with meconium, it will be homogeneously distributed throughout the fluid, making it brown. This indicates that the fetus passed the meconium some time ago such that sufficient mixing occurred as to establish the homogeneous mixture. Terminal meconium occurs when the fetus passes the meconium a short enough time before birth/cesarean section that the amniotic fluid remains clear, but individual clumps of meconium are in the fluid.
The failure to pass meconium is a symptom of several diseases including Hirschsprung's disease and cystic fibrosis.
The meconium sometimes becomes thickened and congested in the intestines, a condition known as meconium ileus. Meconium ileus is often the first sign of cystic fibrosis. In cystic fibrosis, the meconium can form a bituminous black-green mechanical obstruction in a segment of the ileum. Beyond this, there may be a few separate grey-white globular pellets. Below this level, the bowel is a narrow and empty micro-colon. Above the level of the obstruction, there are several loops of hypertrophied bowel distended with fluid. No meconium is passed, and abdominal distension and vomiting appear soon after birth. About 20% of cases of cystic fibrosis present with meconium ileus, while approximately 20% of one series of cases of meconium ileus did not have cystic fibrosis. The presence of meconium ileus is not related to the severity of the cystic fibrosis. The obstruction can be relieved in a number of different ways.
Meconium ileus should be distinguished from meconium plug syndrome, in which a tenacious mass of mucus prevents the meconium from passing and there is no risk of intestinal perforation. Meconium ileus has a significant risk of intestinal perforation. In a barium enema, meconium plug syndrome shows a normal or dilated colon as compared to micro-colon in meconium ileus.
Meconium can be tested for various drugs, to check for "in utero" exposure. Using meconium, a Canadian research group showed that, by measuring a by-product of alcohol, fatty acid ethyl esters (FAEE), they could objectively detect excessive maternal drinking of alcohol during pregnancy. In the US, the results of meconium testing may be used by child protective services and other law enforcement agencies to determine the eligibility of the parents to keep the newborn. Meconium can also be analyzed to detect maternal tobacco use during pregnancy, which is commonly under-reported.
The issue of whether meconium is sterile remains debated and is an area of ongoing research. Although some researchers have reported evidence of bacteria in meconium, this has not been consistently confirmed. Other researchers have raised questions about whether these findings may be due to contamination after sample collection and that meconium is, in fact, sterile until after birth. Further researchers have hypothesized that there may be bacteria in the womb, but these are a normal part of pregnancy and could have an important role in shaping the developing immune system and are not harmful to the baby.
The Latin term "meconium" derives from the Greek μηκώνιον ("mēkōnion"), a diminutive of μήκων ("mēkōn"), i.e. poppy, in reference either to its tar-like appearance that may resemble some raw opium preparations or to Aristotle's belief that it induces sleep in the fetus. | https://en.wikipedia.org/wiki?curid=20453 |
Montreux Convention Regarding the Regime of the Straits
The Montreux Convention Regarding the Regime of the Straits is a 1936 agreement that gives Turkey control over the Turkish Straits (the Bosporus and Dardanelles straits) and regulates the transit of naval warships. The Convention guarantees the free passage of civilian vessels in peacetime, and restricts the passage of naval ships not belonging to Black Sea states. The terms of the Convention have been a source of controversy over the years, most notably about the Soviet Union's military access to the Mediterranean Sea.
Signed on 20 July 1936 at the Montreux Palace in Switzerland, the Convention permitted Turkey to remilitarise the Straits. It went into effect on 9 November 1936 and was registered in the "League of Nations Treaty Series" on 11 December 1936. It remains in force.
The proposed 21st-century Kanal Istanbul project may be a possible bypass to the Montreux Convention and allow greater Turkish autonomy with respect to the passage of military ships (which are limited in number, tonnage, and weaponry) from the Black Sea to the Sea of Marmara.
The convention was one of a series of agreements in the 19th and 20th centuries that sought to address the long-running "Straits Question" of who should control the strategically vital link between the Black Sea and Mediterranean Sea. In 1923, the Treaty of Lausanne had demilitarised the Dardanelles and opened the Straits to unrestricted civilian and military traffic, under the supervision of the International Straits Commission of the League of Nations.
By the late 1930s, the strategic situation in the Mediterranean had altered with the rise of Fascist Italy, which controlled the Greek-inhabited Dodecanese islands off the west coast of Turkey and had constructed fortifications on Rhodes, Leros and Kos. The Turks feared that Italy would seek to exploit access to the Straits to expand its power into Anatolia and the Black Sea region. There were also fears of Bulgarian rearmament. Although Turkey was not permitted to refortify the Straits, it nonetheless did so secretly.
In April 1935, the Turkish government dispatched a lengthy diplomatic note to the signatories of the Treaty of Lausanne proposing a conference on the agreement of a new regime for the Straits and requested that the League of Nations authorise the reconstruction of the Dardanelles forts. In the note, Turkish foreign minister Tevfik Rüştü Aras explained that the international situation had changed greatly since 1923. At that time, Europe had been moving towards disarmament and an international guarantee to defend the Straits. The Abyssinia Crisis of 1934–35, the denunciation by Germany of the Treaty of Versailles and international moves towards rearmament meant that "the only guarantee intended to guard against the total insecurity of the Straits has just disappeared in its turn." Indeed, Aras said, "the Powers most closely concerned are proclaiming the existence of a threat of general conflagration." The key weaknesses of the present regime were that the machinery for collective guarantees were too slow and ineffective, there was no contingency for a general threat of war and no provision for Turkey to defend itself. Turkey was therefore prepared
The response to the note was generally favourable, and Australia, Bulgaria, France, Germany, Greece, Japan, Romania, the Soviet Union, Turkey, the United Kingdom and Yugoslavia agreed to attend negotiations at Montreux in Switzerland, which began on 22 June 1936. Two major powers were not represented: Italy, whose aggressively expansionist policies had prompted the conference in the first place, refused to attend and the United States declined even to send an observer.
Turkey, the UK and the Soviet Union each put forward their own set of proposals, aimed chiefly at protecting their own interests. The British favoured the continuation of a relatively restrictive approach, while the Turks sought a more liberal regime that reasserted their own control over the Straits and the Soviets proposed a regime that would guarantee absolute freedom of passage. The British, supported by France, sought to exclude the Soviet fleet from the Mediterranean Sea, where it might have threatened the vital shipping lanes to India, Egypt and the Far East. In the end, the British conceded some of their requests while the Soviets succeeded in ensuring that the Black Sea countries – including the USSR – were given some exemptions from the military restrictions imposed on non-Black Sea nations. The agreement was ratified by all of the conference attendees with the exception of Germany, which had not been a signatory to the Treaty of Lausanne, and with reservations by Japan, and came into force on 9 November 1936.
Britain's willingness to make concessions has been attributed to a desire to avoid Turkey being driven to ally itself with, or fall under the influence of, Adolf Hitler or Benito Mussolini. It was thus the first in a series of steps by Britain and France to ensure that Turkey would either remain neutral or tilt towards the Western Allies in the event of any future conflict with the Axis.
The Convention consists of 29 Articles, four annexes and one protocol. Articles 2–7 consider the passage of merchant ships. Articles 8–22 consider the passage of war vessels. The key principle of freedom of passage and navigation is stated in articles 1 and 2. Article 1 provides, "The High Contracting Parties recognise and affirm the principle of freedom of passage and navigation by sea in the Straits". Article 2 states, "In time of peace, merchant vessels shall enjoy complete freedom of passage and navigation in the Straits, by day and by night, under any flag with any kind of cargo."
The International Straits Commission was abolished, authorising the full resumption of Turkish military control over the Straits and the refortification of the Dardanelles. Turkey was authorised to close the Straits to all foreign warships in wartime or when it was threatened by aggression. Also, it was authorised to refuse transit from merchant ships belonging to countries at war with Turkey.
A number of highly specific restrictions were imposed on the types of warship allowed passage. Non-Black Sea powers wishing to send a vessel must notify Turkey eight days before the intended passage. No more than nine foreign warships, with a total aggregate tonnage of 15,000 tons, may pass at any one time, and no single ship heavier than 10,000 tons may pass. The aggregate tonnage of all non-Black Sea warships in the Black Sea must be no more than 30,000 tons (or 45,000 tons under special conditions), and they are permitted to stay in the Black Sea for no longer than twenty-one days. Only Black Sea states may transit capital ships of any tonnage, escorted by no more than two destroyers.
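The numeric limits above can be illustrated as a small rule check (a hypothetical helper written only to restate the tonnage and count limits summarised here; the Convention itself contains many further conditions not modelled):

```python
# Illustrative sketch of the Montreux numeric limits for non-Black Sea
# warships, as summarised above. Hypothetical helper, not a legal tool.

def transit_allowed(ship_tonnages, tonnage_already_in_black_sea=0,
                    special_conditions=False):
    """Check a proposed simultaneous passage of non-Black Sea warships."""
    if len(ship_tonnages) > 9:                  # at most nine ships at once
        return False
    if any(t > 10_000 for t in ship_tonnages):  # no single ship over 10,000 tons
        return False
    if sum(ship_tonnages) > 15_000:             # aggregate passage limit
        return False
    # Aggregate tonnage allowed in the Black Sea at any time
    cap = 45_000 if special_conditions else 30_000
    if tonnage_already_in_black_sea + sum(ship_tonnages) > cap:
        return False
    return True

# Three cruisers totalling 14,000 tons pass; a single 12,000-ton ship does not.
print(transit_allowed([5_000, 5_000, 4_000]))  # True
print(transit_allowed([12_000]))               # False
```

The twenty-one-day stay limit is a time constraint rather than a tonnage one, so it is omitted from this sketch.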
Under Article 12, Black Sea states are also allowed to send submarines through the Straits, with prior notice, as long as the vessels have been constructed, purchased or sent for repair outside the Black Sea. The less restrictive rules applicable to Black Sea states were agreed as, effectively, a concession to the Soviet Union, the only Black Sea state other than Turkey with any significant number of capital ships or submarines. The passage of civil aircraft between the Mediterranean and Black Seas is permitted but only along routes authorised by the Turkish government.
The terms of the Convention were largely a reflection of the international situation in the mid-1930s. They largely served Turkish and Soviet interests, enabling Turkey to regain military control of the Straits and assuring Soviet dominance of the Black Sea. Although the Convention restricted the Soviets' ability to send naval forces into the Mediterranean Sea, thereby satisfying British concerns about Soviet intrusion into what was considered a British sphere of influence, it also ensured that outside powers could not exploit the Straits to threaten the Soviet Union. That was to have significant repercussions during World War II when the Montreux regime prevented the Axis powers from sending naval forces through the Straits to attack the Soviet Union. The Axis powers were thus severely limited in naval capability in their Black Sea campaigns, relying principally on small vessels that had been transported overland by rail and canal networks.
Auxiliary vessels and armed merchant ships occupied a grey area, however, and the transit of such vessels through the straits led to friction between the Allies and Turkey. Repeated protests from Moscow and London led to the Turkish government banning the movements of "suspicious" Axis ships with effect from June 1944 after a number of German auxiliary ships had been permitted to transit the Straits.
Although the Montreux Convention is cited by the Turkish government as prohibiting aircraft carriers from transiting the straits, the treaty actually contains no explicit prohibition on aircraft carriers. However, modern aircraft carriers are heavier than the 15,000 ton limit imposed on warships, making it impossible for non-Black Sea powers to transit modern aircraft carriers through the Straits.
Under Article 11, Black Sea states are permitted to transit capital ships of any tonnage through the straits, but Annex II specifically excludes aircraft carriers from the definition of capital ship. In 1936, it was common for battleships to carry observation aircraft. Therefore, aircraft carriers were defined as ships that were "designed or adapted primarily for the purpose of carrying and operating aircraft at sea." The inclusion of aircraft on any other ship does not classify it as an aircraft carrier.
The Soviet Union designated its "Kiev"-class and "Kuznetsov"-class ships as "aircraft-carrying cruisers" because these ships were armed with P-500 and P-700 cruise missiles, which also form the main armament of the "Slava"-class cruiser and the "Kirov"-class battlecruiser. The result was that the Soviet Navy could send its aircraft-carrying cruisers through the Straits in compliance with the Convention, but at the same time the Convention denied access to NATO aircraft carriers, which exceeded the 15,000 ton limit.
Turkey chose to accept the designation of the Soviet aircraft carrying cruisers as aircraft cruisers, as any revision of the Convention could leave Turkey with less control over the Turkish Straits, and the UN Convention on the Law of the Sea had already established more liberal passage through other straits. By allowing the Soviet aircraft carrying cruisers to transit the Straits, Turkey could leave the more restrictive Montreux Convention in place.
The Convention remains in force, but not without dispute. It was repeatedly challenged by the Soviet Union during World War II and the Cold War. As early as 1939, Joseph Stalin sought to reopen the Straits Question and proposed joint Turkish and Soviet control of the Straits, complaining that "a small state [i.e. Turkey] supported by Great Britain held a great state by the throat and gave it no outlet". After the Molotov–Ribbentrop Pact was signed by the Soviet Union and Nazi Germany, the Soviet Foreign Minister Vyacheslav Molotov informed his German counterparts that the Soviet Union wished to take military control of the Straits and establish its own military base there. The Soviets returned to the issue in 1945 and 1946, demanding a revision of the Montreux Convention at a conference excluding most of the Montreux signatories, a permanent Soviet military presence and joint control of the Straits. That was firmly rejected by Turkey, despite an ongoing Soviet "strategy of tension". For several years after World War II, the Soviets exploited the restriction on the number of foreign warships by ensuring that one of theirs was always in the Straits, thus effectively blocking any state other than Turkey from sending warships through the Straits. Soviet pressure expanded into full demands to revise the Montreux Convention, culminating in the Turkish Straits crisis of 1946, which prompted Turkey to abandon its policy of neutrality. In 1947, Turkey became a recipient of US military and economic assistance under the Truman Doctrine of containment and, along with Greece, joined the NATO alliance in 1952.
The passage of US warships through the Straits also raised controversy, as the convention forbids the transit of non-Black Sea nations' warships with guns of a calibre larger than eight inches (203 mm). In the 1960s, the US sent warships carrying 420 mm calibre ASROC missiles through the Straits, prompting Soviet protests. The Turkish government rejected the Soviet complaints, pointing out that guided missiles were not guns and that since such weapons had not existed at the time of the Convention, they were not restricted.
According to Antiwar.com news editor Jason Ditz, the Montreux Convention is an obstacle to a US naval buildup in the Black Sea because of its stipulations regulating warship traffic by nations that do not border the Black Sea. The US think tank Stratfor has written that these stipulations place Turkey's relationship with the US and its obligations as a NATO alliance member in conflict with Russia and the regulations of the Montreux Convention. The US Navy has stated that its forces will continue to operate in the Black Sea in accordance with the Convention.
The United Nations Convention on the Law of the Sea (UNCLOS), which entered into force in November 1994, has prompted calls for the Montreux Convention to be revised and adapted to make it compatible with UNCLOS's regime governing straits used for international navigation. However, Turkey's longstanding refusal to sign UNCLOS has meant that Montreux remains in force without further amendments.
The safety of vessels passing through the Bosporus has become a major concern in recent years, as the volume of traffic has increased greatly since the Convention was signed: from 4,500 transits in 1934 to 49,304 by 1998. Beyond the obvious environmental concerns, the Straits bisect the city of Istanbul, home to over 14 million people, so maritime incidents in the Straits pose a considerable risk to public safety. The Convention does not, however, make any provision for the regulation of shipping for the purposes of safety and environmental protection. In January 1994, the Turkish government adopted new "Maritime Traffic Regulations for the Turkish Straits and the Marmara Region". These introduced a new regulatory regime "to ensure the safety of navigation, life and property and to protect the environment in the region" without violating the Montreux principle of free passage. The new regulations provoked some controversy when Russia, Greece, Cyprus, Romania, Ukraine and Bulgaria raised objections. However, they were approved by the International Maritime Organization on the grounds that they were not intended to prejudice "the rights of any ship using the Straits under international law". The regulations were revised in November 1998 to address Russian concerns.
Michael Jordan
Michael Jeffrey Jordan (born February 17, 1963), also known by his initials MJ, is an American former professional basketball player and the principal owner of the Charlotte Hornets of the National Basketball Association (NBA). He played 15 seasons in the NBA, winning six championships with the Chicago Bulls. His biography on the official NBA website states: "By acclamation, Michael Jordan is the greatest basketball player of all time." He is considered instrumental in popularizing the NBA around the world in the 1980s and 1990s.
Jordan played three seasons for coach Dean Smith with the North Carolina Tar Heels. As a freshman, he was a member of the Tar Heels' national championship team in 1982. Jordan joined the Bulls in 1984 as the third overall draft pick, and quickly emerged as a league star, entertaining crowds with his prolific scoring. His leaping ability, demonstrated by performing slam dunks from the free throw line in Slam Dunk Contests, earned him the nicknames "Air Jordan" and "His Airness". He also gained a reputation as one of the best defensive players in basketball. Jordan won his first NBA championship with the Bulls in 1991, and followed that achievement with titles in 1992 and 1993, securing a "three-peat". Although Jordan abruptly retired from basketball before the 1993–94 NBA season and started a new career in Minor League Baseball, he returned to the Bulls in March 1995 and led them to three additional championships in 1996, 1997, and 1998, as well as a then-record 72 regular-season wins in the 1995–96 NBA season. He retired for a second time in January 1999 but returned for two more NBA seasons from 2001 to 2003 as a member of the Washington Wizards.
Jordan's individual accolades and accomplishments include six NBA Finals Most Valuable Player (MVP) Awards, ten scoring titles (both all-time records), five MVP Awards, ten All-NBA First Team designations, nine All-Defensive First Team honors, fourteen NBA All-Star Game selections, three All-Star Game MVP Awards, three steals titles, and the 1988 NBA Defensive Player of the Year Award. He holds the NBA records for highest career regular season scoring average (30.12 points per game) and highest career playoff scoring average (33.45 points per game). In 1999, he was named the greatest North American athlete of the 20th century by ESPN, and was second to Babe Ruth on the Associated Press' list of athletes of the century. Jordan is a two-time inductee into the Naismith Memorial Basketball Hall of Fame, having been enshrined in 2009 for his individual career and again in 2010 as part of the group induction of the 1992 United States men's Olympic basketball team ("The Dream Team"). He became a member of the FIBA Hall of Fame in 2015.
One of the most effectively marketed athletes of his generation, Jordan is also known for his product endorsements. He fueled the success of Nike's Air Jordan sneakers which were introduced in 1984 and remain popular today. Jordan also starred as himself in the 1996 film "Space Jam". He became part-owner and head of basketball operations for the Charlotte Bobcats (now Hornets) in 2006, and bought a controlling interest in 2010. In 2014, Jordan became the first billionaire player in NBA history. With a net worth of $2.1 billion, he is the fourth-richest African American, behind Robert F. Smith, David Steward, and Oprah Winfrey.
Michael Jeffrey Jordan was born at Cumberland Hospital in the Fort Greene neighborhood of New York City's Brooklyn borough on February 17, 1963, the son of bank employee Deloris (née Peoples) and equipment supervisor James R. Jordan Sr. When he was a toddler, he moved with his family to Wilmington, North Carolina. Jordan attended Emsley A. Laney High School in Wilmington, where he highlighted his athletic career by playing basketball, baseball, and football. He tried out for the varsity basketball team during his sophomore year but, at 5'11" (1.80 m), he was deemed too short to play at that level. His taller friend, Harvest Leroy Smith, was the only sophomore to make the team.
Motivated to prove his worth, Jordan became the star of Laney's junior varsity team, and tallied several 40-point games. The following summer, he grew four inches (10 cm) and trained rigorously. Upon earning a spot on the varsity roster, Jordan averaged more than 25 points per game (ppg) over his final two seasons of high school play. As a senior, he was selected to play in the 1981 McDonald's All-American Game and scored 30 points, after averaging 27 points, 12 rebounds, and 6 assists per game for the season. Jordan was recruited by numerous college basketball programs, including Duke, North Carolina, South Carolina, Syracuse, and Virginia. In 1981, he accepted a basketball scholarship to the University of North Carolina at Chapel Hill, where he majored in cultural geography.
As a freshman in coach Dean Smith's team-oriented system, he was named ACC Freshman of the Year after he averaged 13.4 ppg on 53.4% shooting (field goal percentage). He made the game-winning jump shot in the 1982 NCAA Championship game against Georgetown, which was led by future NBA rival Patrick Ewing. Jordan later described this shot as the major turning point in his basketball career. During his three seasons with the Tar Heels, he averaged 17.7 ppg on 54.0% shooting, and added 5.0 rpg. He was selected by consensus to the NCAA All-American First Team in both his sophomore (1983) and junior (1984) seasons. After winning the Naismith and the Wooden College Player of the Year awards in 1984, Jordan left North Carolina one year before his scheduled graduation to enter the 1984 NBA draft. The Chicago Bulls selected Jordan with the third overall pick, after Hakeem Olajuwon (Houston Rockets) and Sam Bowie (Portland Trail Blazers). One of the primary reasons Jordan was not drafted sooner was that the first two teams needed a center. However, Trail Blazers general manager Stu Inman contended that it was not a matter of drafting a center, but more a matter of taking Sam Bowie over Jordan, in part because Portland already had Clyde Drexler, a guard with similar skills to Jordan. ESPN, citing Bowie's injury-laden college career, named the Blazers' choice of Bowie as the worst draft pick in North American professional sports history. Jordan returned to North Carolina and completed his Bachelor of Arts degree in geography in 1986.
During his rookie season with the Bulls, Jordan averaged 28.2 ppg on 51.5% shooting, and helped turn a team that had won just 35% of its games over the previous three seasons into a playoff contender. He quickly became a fan favorite even in opposing arenas. Roy S. Johnson of "The New York Times" described him as "the phenomenal rookie of the Bulls" in November, and Jordan appeared on the cover of "Sports Illustrated" with the heading "A Star Is Born" in December. Fans also voted Jordan in as an All-Star starter during his rookie season. Controversy arose before the All-Star game when word surfaced that several veteran players, led by Isiah Thomas, were upset by the amount of attention Jordan was receiving. This led to a so-called "freeze-out" on Jordan, where players refused to pass the ball to him throughout the game. The controversy left Jordan relatively unaffected when he returned to regular season play, and he went on to be voted Rookie of the Year. The Bulls finished the season 38–44 and lost to the Milwaukee Bucks in four games in the first round of the playoffs.
Jordan's second season was cut short when he broke his foot in the third game of the year, causing him to miss 64 games. The Bulls made the playoffs despite Jordan's injury and a 30–52 record, at the time the fifth worst record of any team to qualify for the playoffs in NBA history. Jordan recovered in time to participate in the postseason and performed well upon his return. Against a 1985–86 Boston Celtics team that is often considered one of the greatest in NBA history, Jordan set the still-unbroken record for points in a playoff game with 63 in Game 2. The Celtics, however, managed to sweep the series.
Jordan had completely recovered in time for the 1986–87 season, and he had one of the most prolific scoring seasons in NBA history. He became the only player other than Wilt Chamberlain to score 3,000 points in a season, averaging a league high 37.1 points on 48.2% shooting. In addition, Jordan demonstrated his defensive prowess, as he became the first player in NBA history to record 200 steals and 100 blocked shots in a season. Despite Jordan's success, Magic Johnson won the league's Most Valuable Player Award. The Bulls reached 40 wins, and advanced to the playoffs for the third consecutive year. However, they were again swept by the Celtics.
Jordan again led the league in scoring during the 1987–88 season, averaging 35.0 ppg on 53.5% shooting, and won his first league MVP Award. He was also named the Defensive Player of the Year, as he had averaged 1.6 blocks and a league high 3.16 steals per game. The Bulls finished 50–32, and made it out of the first round of the playoffs for the first time in Jordan's career, as they defeated the Cleveland Cavaliers in five games. However, the Bulls then lost in five games to the more experienced Detroit Pistons, who were led by Isiah Thomas and a group of physical players known as the "Bad Boys".
In the 1988–89 season, Jordan again led the league in scoring, averaging 32.5 ppg on 53.8% shooting from the field, along with 8 rpg and 8 apg. The Bulls finished with a 47–35 record, and advanced to the Eastern Conference Finals, defeating the Cavaliers and New York Knicks along the way. The Cavaliers series included a career highlight for Jordan when he hit "The Shot" over Craig Ehlo at the buzzer in the fifth and final game of the series. However, the Pistons again defeated the Bulls, this time in six games, by utilizing their "Jordan Rules" method of guarding Jordan, which consisted of double and triple teaming him every time he touched the ball.
The Bulls entered the 1989–90 season as a team on the rise, with their core group of Jordan and young improving players like Scottie Pippen and Horace Grant, and under the guidance of new coach Phil Jackson. On March 28, 1990, Jordan scored a career-high 69 points in a 117–113 road win over the Cavaliers. He averaged a league leading 33.6 ppg on 52.6% shooting, to go with 6.9 rpg and 6.3 apg in leading the Bulls to a 55–27 record. They again advanced to the Eastern Conference Finals after beating the Bucks and Philadelphia 76ers. However, despite pushing the series to seven games, the Bulls lost to the Pistons for the third consecutive season.
In the 1990–91 season, Jordan won his second MVP award after averaging 31.5 ppg on 53.9% shooting, 6.0 rpg, and 5.5 apg for the regular season. The Bulls finished in first place in their division for the first time in 16 years and set a franchise record with 61 wins in the regular season. With Scottie Pippen developing into an All-Star, the Bulls had elevated their play. The Bulls defeated the New York Knicks and the Philadelphia 76ers in the opening two rounds of the playoffs. They advanced to the Eastern Conference Finals where their rival, the Detroit Pistons, awaited them. However, this time the Bulls beat the Pistons in a four-game sweep.
The Bulls advanced to the NBA Finals for the first time in franchise history to face the Los Angeles Lakers, who had Magic Johnson and James Worthy, two formidable opponents. The Bulls won the series four games to one, and compiled a 15–2 playoff record along the way. Perhaps the best known moment of the series came in Game 2 when, attempting a dunk, Jordan avoided a potential Sam Perkins block by switching the ball from his right hand to his left in mid-air to lay the shot into the basket. In his first Finals appearance, Jordan posted per game averages of 31.2 points on 56% shooting from the field, 11.4 assists, 6.6 rebounds, 2.8 steals, and 1.4 blocks. Jordan won his first NBA Finals MVP award, and he cried while holding the NBA Finals trophy.
Jordan and the Bulls continued their dominance in the 1991–92 season, establishing a 67–15 record, topping their franchise record from 1990–91. Jordan won his second consecutive MVP award with averages of 30.1 points, 6.4 rebounds and 6.1 assists per game on 52% shooting. After winning a physical seven-game series over the New York Knicks in the second round of the playoffs and finishing off the Cleveland Cavaliers in the Conference Finals in six games, the Bulls met Clyde Drexler and the Portland Trail Blazers in the Finals. The media, hoping to recreate a Magic–Bird rivalry, highlighted the similarities between "Air" Jordan and Clyde "The Glide" during the pre-Finals hype. In the first game, Jordan scored a Finals-record 35 points in the first half, including a record-setting six three-point field goals. After the sixth three-pointer, he jogged down the court shrugging as he looked courtside. Marv Albert, who broadcast the game, later stated that it was as if Jordan was saying, "I can't believe I'm doing this." The Bulls went on to win Game 1, and defeated the Blazers in six games. Jordan was named Finals MVP for the second year in a row and finished the series averaging 35.8 ppg, 4.8 rpg, and 6.5 apg, while shooting 53% from the floor.
In the 1992–93 season, despite a 32.6 ppg, 6.7 rpg, and 5.5 apg campaign and a second-place finish in Defensive Player of the Year voting, Jordan's streak of consecutive MVP seasons ended as he lost the award to his friend Charles Barkley. Coincidentally, Jordan and the Bulls met Barkley and his Phoenix Suns in the 1993 NBA Finals. The Bulls won their third NBA championship on a game-winning shot by John Paxson and a last-second block by Horace Grant, but Jordan was once again Chicago's leader. He averaged a Finals-record 41.0 ppg during the six-game series, and became the first player in NBA history to win three straight Finals MVP awards. He scored more than 30 points in every game of the series, including 40 or more points in four consecutive games. With his third Finals triumph, Jordan capped off a seven-year run in which he attained seven scoring titles and three championships, but there were signs that Jordan was tiring of his massive celebrity and all of the non-basketball hassles in his life.
During the Bulls' playoff run in 1993, Jordan was seen gambling in Atlantic City, New Jersey, the night before a game against the New York Knicks. The previous year, he admitted that he had to cover $57,000 in gambling losses, and author Richard Esquinas wrote a book in 1993 claiming he had won $1.25 million from Jordan on the golf course. NBA Commissioner David Stern denied in 1995 and 2006 that Jordan's 1993 retirement was a secret suspension by the league for gambling, but the rumor spread widely.
In 2005, Jordan discussed his gambling with Ed Bradley of "60 Minutes" and admitted that he made reckless decisions. Jordan stated, "Yeah, I've gotten myself into situations where I would not walk away and I've pushed the envelope. Is that compulsive? Yeah, it depends on how you look at it. If you're willing to jeopardize your livelihood and your family, then yeah". When Bradley asked him if his gambling ever got to the level where it jeopardized his livelihood or family, Jordan replied, "No". In 2010 Ron Shelton, director of "Jordan Rides the Bus", said that he began working on the documentary believing that the NBA had suspended him, but that research "convinced [him it] was nonsense".
On October 6, 1993, Jordan announced his retirement, citing a loss of desire to play the game. Jordan later stated that the death of his father three months earlier also shaped his decision. Jordan's father was murdered on July 23, 1993, at a highway rest area in Lumberton, North Carolina, by two teenagers, Daniel Green and Larry Martin Demery, who carjacked his luxury Lexus bearing the license plate "UNC 0023". His body was dumped in a South Carolina swamp and was not discovered until August 3. The assailants were traced from calls that they made on James Jordan's cell phone. The two criminals were caught, convicted at trial, and sentenced to life in prison. Jordan was close to his father; as a child he had imitated his father's proclivity to stick out his tongue while absorbed in work. He later adopted it as his own signature, displaying it each time he drove to the basket. In 1996, he founded a Chicago area Boys & Girls Club and dedicated it to his father.
In his 1998 autobiography "For the Love of the Game", Jordan wrote that he had been preparing for retirement as early as the summer of 1992. The added exhaustion due to the Dream Team run in the 1992 Olympics solidified Jordan's feelings about the game and his ever-growing celebrity status. Jordan's announcement sent shock waves throughout the NBA and appeared on the front pages of newspapers around the world.
Jordan then further surprised the sports world by signing a Minor League Baseball contract with the Chicago White Sox on February 7, 1994. He reported to spring training in Sarasota, Florida, and was assigned to the team's minor league system on March 31, 1994. Jordan has stated this decision was made to pursue the dream of his late father, who had always envisioned his son as a Major League Baseball player. The White Sox were another team owned by Bulls owner Jerry Reinsdorf, who continued to honor Jordan's basketball contract during the years he played baseball.
In 1994, Jordan played for the Birmingham Barons, a Double-A minor league affiliate of the Chicago White Sox, batting .202 with three home runs, 51 runs batted in, 30 stolen bases, 114 strikeouts, 51 bases on balls, and 11 errors. He also appeared for the Scottsdale Scorpions in the 1994 Arizona Fall League, batting .252 against the top prospects in baseball. On November 1, 1994, his number 23 was retired by the Bulls in a ceremony that included the erection of a permanent sculpture known as "The Spirit" outside the new United Center.
In the 1993–94 season, the Bulls achieved a 55–27 record without Jordan in the lineup, and lost to the New York Knicks in the second round of the playoffs. The 1994–95 Bulls were a shell of the championship team of just two years earlier. Struggling at mid-season to ensure a spot in the playoffs, Chicago was 31–31 at one point in mid-March. The team received help, however, when Jordan decided to return to the Bulls.
In March 1995, Jordan decided to quit baseball due to the ongoing Major League Baseball strike, as he wanted to avoid becoming a potential replacement player. On March 18, 1995, Jordan announced his return to the NBA through a two-word press release: "I'm back." The next day, Jordan took to the court with the Bulls to face the Indiana Pacers in Indianapolis, scoring 19 points. The game had the highest Nielsen rating of a regular season NBA game since 1975. Although he could have opted to wear his normal number in spite of the Bulls having retired it, Jordan instead wore number 45, as he had while playing baseball.
Although he had not played an NBA game in a year and a half, Jordan played well upon his return, making a game-winning jump shot against Atlanta in his fourth game back. He then scored 55 points in the next game against the Knicks at Madison Square Garden on March 28, 1995. Boosted by Jordan's comeback, the Bulls went 13–4 to make the playoffs and advanced to the Eastern Conference Semifinals against the Orlando Magic. At the end of Game 1, Orlando's Nick Anderson stripped the ball from Jordan from behind, leading to the game-winning basket for the Magic; Anderson later commented that Jordan "didn't look like the old Michael Jordan" and that "No. 45 doesn't explode like No. 23 used to."
Jordan responded by scoring 38 points in the next game, which Chicago won. Before the game, Jordan decided that he would immediately resume wearing his former number, 23. The Bulls were fined $25,000 for failing to report the impromptu number change to the NBA. Jordan was fined an additional $5,000 for opting to wear white sneakers when the rest of the Bulls wore black. He averaged 31 points per game in the series, but Orlando won the series in six games.
Jordan was freshly motivated by the playoff defeat, and he trained aggressively for the 1995–96 season. The Bulls were strengthened by the addition of rebound specialist Dennis Rodman, and the team dominated the league, starting the season at 41–3. The Bulls eventually finished with the then-best regular season record in NBA history, 72–10; this record was later surpassed by the 2015–16 Golden State Warriors. Jordan led the league in scoring with 30.4 ppg and won the league's regular season and All-Star Game MVP awards.
In the playoffs, the Bulls lost only three games in four series (Miami Heat 3–0, New York Knicks 4–1, Orlando Magic 4–0). They defeated the Seattle SuperSonics 4–2 in the NBA Finals to win their fourth championship. Jordan was named Finals MVP for a record fourth time, surpassing Magic Johnson's three Finals MVP awards. He also became only the second player to sweep the MVP awards for the All-Star Game, the regular season, and the NBA Finals in a single season; Willis Reed had been the first, in 1969–70. Because this was Jordan's first championship since his father's murder, and it was won on Father's Day, Jordan reacted very emotionally upon winning the title, including a memorable scene of him crying on the locker room floor with the game ball.
In the 1996–97 season, the Bulls started out 69–11, but missed out on a second consecutive 70-win season by losing their final two games to finish 69–13. However, this year Jordan was beaten for the NBA MVP Award by Karl Malone. The Bulls again advanced to the Finals, where they faced Malone and the Utah Jazz. The series against the Jazz featured two of the more memorable clutch moments of Jordan's career. He won Game 1 for the Bulls with a buzzer-beating jump shot. In Game 5, with the series tied at 2, Jordan played despite being feverish and dehydrated from a stomach virus. In what is known as the "Flu Game", Jordan scored 38 points, including the game-deciding 3-pointer with 25 seconds remaining. The Bulls won 90–88 and went on to win the series in six games. For the fifth time in as many Finals appearances, Jordan received the Finals MVP award. During the 1997 NBA All-Star Game, Jordan posted the first triple double in All-Star Game history in a victorious effort; however, he did not receive the MVP award.
Jordan and the Bulls compiled a 62–20 record in the 1997–98 season. Jordan led the league with 28.7 points per game, securing his fifth regular-season MVP award, plus honors for All-NBA First Team, First Defensive Team and the All-Star Game MVP. The Bulls won the Eastern Conference Championship for a third straight season, including surviving a seven-game series with the Indiana Pacers in the Eastern Conference Finals; it was the first time Jordan had played in a Game 7 since the 1992 Eastern Conference Semifinals with the Knicks. After winning, they moved on for a rematch with the Jazz in the Finals.
The Bulls returned to the Delta Center for Game 6 on June 14, 1998, leading the series 3–2. Jordan executed a series of plays, considered to be one of the greatest clutch performances in NBA Finals history. With the Bulls trailing 86–83 with 41.9 seconds remaining in the fourth quarter, Phil Jackson called a timeout. When play resumed, Jordan received the inbound pass, drove to the basket, and hit a shot over several Jazz defenders, cutting the Utah lead to 86–85. The Jazz brought the ball upcourt and passed the ball to forward Karl Malone, who was set up in the low post and was being guarded by Rodman. Malone jostled with Rodman and caught the pass, but Jordan cut behind him and took the ball out of his hands for a steal. Jordan then dribbled down the court and paused, eyeing his defender, Jazz guard Bryon Russell. With 10 seconds remaining, Jordan started to dribble right, then crossed over to his left, possibly pushing off Russell, although the officials did not call a foul. With 5.2 seconds left, Jordan made the climactic shot of his Bulls career. He gave Chicago an 87–86 lead with a jumper over Russell. Afterwards, the Jazz' John Stockton narrowly missed a game-winning three-pointer. The buzzer sounded, and Jordan and the Bulls won their sixth NBA championship, achieving a second three-peat in the decade. Once again, Jordan was voted Finals MVP, having led all scorers averaging 33.5 points per game, including 45 in the deciding Game 6. Jordan's six Finals MVPs is a record; Shaquille O'Neal, Magic Johnson, LeBron James, and Tim Duncan are tied for second place with three apiece. The 1998 Finals holds the highest television rating of any Finals series in history. Game 6 also holds the highest television rating of any game in NBA history.
With Phil Jackson's contract expiring, the pending departures of Scottie Pippen and Dennis Rodman looming, and being in the latter stages of an owner-induced lockout of NBA players, Jordan retired for the second time on January 13, 1999. On January 19, 2000, Jordan returned to the NBA not as a player, but as part owner and president of basketball operations for the Washington Wizards. Jordan's responsibilities with the Wizards were comprehensive. He controlled all aspects of the Wizards' basketball operations, and had the final say in all personnel matters. Opinions of Jordan as a basketball executive were mixed. He managed to purge the team of several highly paid, unpopular players (such as forward Juwan Howard and point guard Rod Strickland), but used the first pick in the 2001 NBA draft to select high schooler Kwame Brown, who did not live up to expectations and was traded away after four seasons.
Despite his January 1999 claim that he was "99.9% certain" that he would never play another NBA game, in the summer of 2001 Jordan expressed interest in making another comeback, this time with his new team. Inspired by the NHL comeback of his friend Mario Lemieux the previous winter, Jordan spent much of the spring and summer of 2001 in training, holding several invitation-only camps for NBA players in Chicago. In addition, Jordan hired his old Chicago Bulls head coach, Doug Collins, as Washington's coach for the upcoming season, a decision that many saw as foreshadowing another Jordan return.
On September 25, 2001, Jordan announced his return to the NBA to play for the Washington Wizards, indicating his intention to donate his salary as a player to a relief effort for the victims of the September 11 attacks. In an injury-plagued 2001–02 season, he led the team in scoring (22.9 ppg), assists (5.2 apg), and steals (1.42 spg). However, torn cartilage in his right knee ended Jordan's season after only 60 games, the fewest he had played in a regular season since playing 17 games after returning from his first retirement during the 1994–95 season. Jordan started 53 of his 60 games for the season, averaging 24.3 points, 5.4 assists, and 6.0 rebounds, and shooting 41.9% from the field in his 53 starts. His last seven appearances were in a reserve role, in which he averaged just over 20 minutes per game.
Playing in his 14th and final NBA All-Star Game in 2003, Jordan passed Kareem Abdul-Jabbar as the all-time leading scorer in All-Star Game history (a record since broken by LeBron James and Kobe Bryant). That year, Jordan was the only Washington player to play in all 82 games, starting in 67 of them. He averaged 20.0 points, 6.1 rebounds, 3.8 assists, and 1.5 steals per game. He also shot 45% from the field, and 82% from the free throw line. Even though he turned 40 during the season, he scored 20 or more points 42 times, 30 or more points nine times, and 40 or more points three times. On February 21, 2003, Jordan became the first 40-year-old to tally 43 points in an NBA game. During his stint with the Wizards, all of Jordan's home games at the MCI Center were sold out, and the Wizards were the second most-watched team in the NBA, averaging 20,172 fans a game at home and 19,311 on the road. However, neither of Jordan's final two seasons resulted in a playoff appearance for the Wizards, and Jordan was often unsatisfied with the play of those around him. At several points he openly criticized his teammates to the media, citing their lack of focus and intensity, notably that of the number one draft pick in the 2001 NBA draft, Kwame Brown.
With the recognition that 2002–03 would be Jordan's final season, tributes were paid to him throughout the NBA. In his final game at the United Center in Chicago, which was his old home court, Jordan received a four-minute standing ovation. The Miami Heat retired the number 23 jersey on April 11, 2003, even though Jordan never played for the team. At the 2003 All-Star Game, Tracy McGrady and Allen Iverson each offered Jordan their starting spots, but he declined both; in the end, he accepted Vince Carter's spot.
Jordan played in his final NBA game on April 16, 2003, in Philadelphia. After scoring only 13 points in the game, Jordan went to the bench with 4 minutes and 13 seconds remaining in the third quarter and his team trailing the Philadelphia 76ers, 75–56. Just after the start of the fourth quarter, the First Union Center crowd began chanting "We want Mike!" After much encouragement from coach Doug Collins, Jordan finally rose from the bench and re-entered the game, replacing Larry Hughes with 2:35 remaining. At 1:45, Jordan was intentionally fouled by the 76ers' Eric Snow, and stepped to the line to make both free throws. After the second foul shot, the 76ers in-bounded the ball to rookie John Salmons, who in turn was intentionally fouled by Bobby Simmons one second later, stopping time so that Jordan could return to the bench. Jordan received a three-minute standing ovation from his teammates, his opponents, the officials, and the crowd of 21,257 fans.
Jordan played on two Olympic gold medal-winning American basketball teams. He won a gold medal as a college player in the 1984 Summer Olympics. The team was coached by Bob Knight and featured players such as Patrick Ewing, Sam Perkins, Chris Mullin, Steve Alford, and Wayman Tisdale. Jordan led the team in scoring, averaging 17.1 ppg for the tournament.
In the 1992 Summer Olympics, he was a member of the star-studded squad that included Magic Johnson and Larry Bird that was dubbed the "Dream Team". Jordan was the only player to start all eight games in the Olympics. Playing limited minutes due to the frequent blowouts, Jordan averaged 14.9 ppg, finishing second on the team in scoring. Jordan and fellow Dream Team members Ewing and Mullin are the only American men's basketball players to win Olympic gold medals as amateurs and professionals.
After his third retirement, Jordan assumed that he would be able to return to his front office position as Director of Basketball Operations with the Wizards. However, his previous tenure in the Wizards' front office had produced the aforementioned mixed results and may have also influenced the trade of Richard "Rip" Hamilton for Jerry Stackhouse (although Jordan was not technically Director of Basketball Operations in 2002). On May 7, 2003, Wizards owner Abe Pollin fired Jordan as the team's president of basketball operations. Jordan later stated that he felt betrayed, and that if he had known he would be fired upon retiring he never would have come back to play for the Wizards.
Jordan kept busy over the next few years. He stayed in shape, played golf in celebrity charity tournaments, and spent time with his family in Chicago. He also promoted his Jordan Brand clothing line and rode motorcycles. Since 2004, Jordan has owned Michael Jordan Motorsports, a professional closed-course motorcycle road racing team that competed with two Suzukis in the premier Superbike championship sanctioned by the American Motorcyclist Association (AMA) until the end of the 2013 season.
On June 15, 2006, Jordan bought a minority stake in the Charlotte Bobcats (now known as the Hornets), becoming the team's second-largest shareholder behind majority owner Robert L. Johnson. As part of the deal, Jordan took full control over the basketball side of the operation, with the title "Managing Member of Basketball Operations". Despite Jordan's previous success as an endorser, he has made an effort not to be included in Charlotte's marketing campaigns. A decade earlier, Jordan had made a bid to become part-owner of Charlotte's original NBA team, the Charlotte Hornets, but talks collapsed when owner George Shinn refused to give Jordan complete control of basketball operations.
In February 2010, it was reported that Jordan was seeking majority ownership of the Bobcats. As February wore on, it became apparent that Jordan and former Houston Rockets president George Postolos were the leading contenders for ownership of the team. On February 27, the Bobcats announced that Johnson had reached an agreement with Jordan and his group, MJ Basketball Holdings, to buy the team pending NBA approval. On March 17, the NBA Board of Governors unanimously approved Jordan's purchase, making him the first former player to become the majority owner of an NBA team. It also made him the league's only African-American majority owner of an NBA team.
During the 2011 NBA lockout, "The New York Times" wrote that Jordan led a group of 10 to 14 hardline owners who wanted to cap the players' share of basketball-related income at 50 percent and as low as 47. Journalists observed that, during the labor dispute in 1998, Jordan had told Washington Wizards then-owner Abe Pollin, "If you can't make a profit, you should sell your team." Jason Whitlock of FoxSports.com called Jordan a "sellout" wanting "current players to pay for his incompetence." He cited Jordan's executive decisions to draft disappointing players Kwame Brown and Adam Morrison.
During the 2011–12 NBA season that was shortened to 66 games by the lockout, the Bobcats posted a 7–59 record. The team closed out the season with a 23-game losing streak. Their .106 winning percentage was the worst in NBA history. "I'm not real happy about the record book scenario last year. It's very, very frustrating," Jordan said before the next season.
During the 2019 NBA offseason, Jordan sold a minority piece of the Hornets to Gabe Plotkin and Daniel Sundheim, retaining the majority of the team for himself.
Jordan was a shooting guard who was also capable of playing as a small forward (the position he would primarily play during his second return to professional basketball with the Washington Wizards), and as a point guard. Jordan was known throughout his career for being a strong clutch performer. With the Bulls, he decided 25 games with field goals or free throws in the last 30 seconds, including two NBA Finals games and five other playoff contests. His competitiveness was visible in his prolific trash-talk and well-known work ethic. Jordan often used perceived slights to fuel his performances. Sportswriter Wright Thompson described him as "a killer, in the Darwinian sense of the word, immediately sensing and attacking someone's weakest spot." As the Bulls organization built the franchise around Jordan, management had to trade away players who were not "tough enough" to compete with him in practice. To help improve his defense, he spent extra hours studying film of opponents. On offense, he relied more upon instinct and improvisation at game time.
Noted as a durable player, Jordan did not miss four or more games while active for a full season from 1986–87 until 2001–02, when he injured his right knee. Of the 15 seasons Jordan was in the NBA, he played all 82 regular season games nine times. Jordan has frequently cited David Thompson, Walter Davis, and Jerry West as influences. At least at the start of his career, and possibly later on, Jordan had a special "Love of the Game Clause" written into his contract (unusual at the time) which allowed him to play basketball against anyone at any time, anywhere.
Jordan had a versatile offensive game. He was capable of aggressively driving to the basket, as well as drawing fouls from his opponents at a high rate; his 8,772 free throw attempts are the 11th-highest total in NBA history. As his career progressed, Jordan also developed the ability to post up his opponents and score with his trademark fadeaway jump shot, using his leaping ability to "fade away" from block attempts. According to Hubie Brown, this move alone made him nearly unstoppable. Despite media criticism as a "selfish" player early in his career, Jordan's 5.3 assists per game also indicate his willingness to defer to his teammates. After shooting under 30% from three-point range in his first five seasons in the NBA, including a career-low 13% in one season, Jordan later improved to a career-high 50%. The three-point shot became more of a focus of his game from 1994–95 to 1996–97, when the NBA shortened its three-point line. His three-point field-goal percentages ranged from 35% to 43% in seasons in which he attempted at least 230 three-pointers between 1989–90 and 1996–97. For a guard, Jordan was also a good rebounder (6.2 per game).
In 1988, Jordan was honored with the NBA's Defensive Player of the Year Award and became the first NBA player to win both the Defensive Player of the Year and Most Valuable Player awards in a career (since equaled by Hakeem Olajuwon, David Robinson, and Kevin Garnett; Olajuwon is the only player other than Jordan to win both during the same season). In addition, he set both seasonal and career records for blocked shots by a guard, and combined this with his ball-thieving ability to become a standout defensive player. He ranks third in NBA history in total steals with 2,514, trailing John Stockton and Jason Kidd. Jerry West often stated that he was more impressed with Jordan's defensive contributions than his offensive ones. He was also known to have strong eyesight; broadcaster Al Michaels said that he was able to read baseball box scores on a 27-inch (69 cm) television clearly from about 50 feet (15 m) away.
Source:
Jordan's talent was clear from his first NBA season; by November he was being compared to Julius Erving. Larry Bird said of the rookie Jordan that he was the best player he had ever seen and that Jordan was "one of a kind" and comparable to Wayne Gretzky as an athlete. In his first game in Madison Square Garden against the New York Knicks, Jordan received a standing ovation of almost one minute. After he scored a playoff record 63 points against the Boston Celtics on April 20, 1986, Bird described him as "God disguised as Michael Jordan".
Jordan led the NBA in scoring in 10 seasons (NBA record) and tied Wilt Chamberlain's record of seven consecutive scoring titles. He was also a fixture on the NBA All-Defensive First Team, making the roster nine times (NBA record shared with Gary Payton, Kevin Garnett and Kobe Bryant). Jordan also holds the top career regular season and playoff scoring averages of 30.1 and 33.4 points per game, respectively. By 1998, the season of his Finals-winning shot against the Jazz, he was well known throughout the league as a clutch performer. In the regular season, Jordan was the Bulls' primary threat in the final seconds of a close game, and in the playoffs he would always ask for the ball at crunch time. Jordan's total of 5,987 points in the playoffs is the second-highest in NBA history. He retired with 32,292 points in regular season play, placing him fifth on the NBA's all-time scoring list behind Kareem Abdul-Jabbar, Karl Malone, Kobe Bryant, and LeBron James.
With five regular-season MVPs (tied for second place with Bill Russell—only Kareem Abdul-Jabbar has won more, with six), six Finals MVPs (NBA record), and three All-Star Game MVPs, Jordan is the most decorated player in NBA history. Jordan finished among the top three in regular-season MVP voting 10 times, and was named one of the 50 Greatest Players in NBA History in 1996. He is one of only seven players in history to win an NCAA championship, an NBA championship, and an Olympic gold medal (doing so twice with the 1984 and 1992 U.S. men's basketball teams). Since 1976, the year of the NBA's merger with the American Basketball Association, Jordan and Pippen are the only two players to win six NBA Finals playing for one team. In the All-Star Game fan ballot, Jordan received the most votes nine times, more than any other player.
Many of Jordan's contemporaries have said that Jordan is the greatest basketball player of all time. In 1999, an ESPN survey of journalists, athletes and other sports figures ranked Jordan the greatest North American athlete of the 20th century, above such luminaries as Babe Ruth and Muhammad Ali. Jordan placed second to Babe Ruth in the Associated Press' December 1999 list of 20th century athletes. In addition, the Associated Press voted him the greatest basketball player of the 20th century. Jordan has also appeared on the front cover of "Sports Illustrated" a record 50 times. In the September 1996 issue of "Sport", which was the publication's 50th-anniversary issue, Jordan was named the greatest athlete of the past 50 years.
Jordan's athletic leaping ability, highlighted in his back-to-back Slam Dunk Contest championships in 1987 and 1988, is credited by many people with having influenced a generation of young players. Several current NBA players—including LeBron James and Dwyane Wade—have stated that they considered Jordan their role model while they were growing up. In addition, commentators have dubbed a number of next-generation players "the next Michael Jordan" upon their entry to the NBA, including Penny Hardaway, Grant Hill, Allen Iverson, Bryant, James, Vince Carter, and Dwyane Wade. Although Jordan was a well-rounded player, his "Air Jordan" image is also often credited with inadvertently decreasing the jump shooting skills, defense, and fundamentals of young players, a fact Jordan himself has lamented.
During his heyday, Jordan did much to increase the status of the game. Television ratings rose during his time in the league, and the popularity of the NBA in the U.S. declined after his last title. As late as 2015, Finals ratings had not returned to the level reached during his last championship-winning season.
In August 2009, the Naismith Memorial Basketball Hall of Fame in Springfield, Massachusetts, opened a Michael Jordan exhibit that contained items from his college and NBA careers, as well as from the 1992 "Dream Team". The exhibit also has a batting glove to signify Jordan's short career in Minor League Baseball. After Jordan received word of his acceptance into the Hall of Fame, he selected Class of 1996 member David Thompson to present him. As Jordan would later explain during his induction speech in September 2009, when he was growing up in North Carolina, he was not a fan of the Tar Heels and greatly admired Thompson, who played at rival North Carolina State. In September, he was inducted into the Hall with several former Bulls teammates in attendance, including Scottie Pippen, Dennis Rodman, Charles Oakley, Ron Harper, Steve Kerr, and Toni Kukoč. Two of Jordan's former coaches, Dean Smith and Doug Collins, were also among those present. His emotional reaction during his speech—when he began to cry—was captured by Associated Press photographer Stephan Savoia and would later go viral on social media as the Crying Jordan Internet meme. In 2016, President Barack Obama honored Jordan with the Presidential Medal of Freedom.
Jordan married Juanita Vanoy in September 1989. They had two sons, Jeffrey and Marcus, and a daughter, Jasmine. Jordan and Juanita filed for divorce on January 4, 2002, citing irreconcilable differences, but reconciled shortly thereafter. They again filed for divorce and were granted a final decree of dissolution of marriage on December 29, 2006, commenting that the decision was made "mutually and amicably". It is reported that Juanita received a $168 million settlement, making it the largest celebrity divorce settlement on public record at the time.
In 1991, Jordan purchased a lot in Highland Park, Illinois, on which he planned to build a 56,000 square-foot (5,200 m2) mansion. It was completed in 1995. He listed the mansion for sale in 2012. His two sons attended Loyola Academy, a private Catholic school in Wilmette, Illinois. Jeffrey graduated in 2007 and played his first collegiate basketball game for the University of Illinois on November 11, 2007. After two seasons, he left the Illinois basketball team in 2009. He later rejoined the team for a third season, then received a release to transfer to the University of Central Florida, where Marcus was attending. Marcus transferred to Whitney Young High School after his sophomore year at Loyola Academy and graduated in 2009. He began attending UCF in the fall of 2009, and played three seasons of basketball for the school.
Jordan is the fourth of five children. He has two older brothers, Larry Jordan and James R. Jordan Jr., one older sister, Deloris, and one younger sister, Roslyn. James retired in 2006 as the Command Sergeant Major of the 35th Signal Brigade of the XVIII Airborne Corps in the U.S. Army. Jordan's nephew through Larry, Justin, played Division I basketball at UNC Greensboro and is a scout for the Charlotte Hornets.
On July 21, 2006, a judge in Cook County, Illinois, determined that Jordan did not owe his alleged former lover Karla Knafel $5 million in a breach of contract claim. Jordan had allegedly paid Knafel $250,000 to keep their relationship a secret. Knafel claimed Jordan promised her $5 million for remaining silent and agreeing not to file a paternity suit after Knafel learned she was pregnant in 1991. A DNA test showed Jordan was not the father of the child.
In 2008, Lisa Miceli was arrested for breaking a restraining order that Jordan had against her. She was seeking a third paternity test for her son, who had been shown not to be Jordan's child by two DNA tests in 2005. She was released after agreeing not to contact Jordan again.
In 2013, Pamela Smith filed a paternity suit against Jordan, claiming that he was the father of her 16-year-old son. The claim was dismissed. Jordan countersued and Smith ultimately had to pay his legal fees.
Jordan proposed to his longtime girlfriend, Cuban-American model Yvette Prieto, on Christmas 2011, and they were married on April 27, 2013, at Bethesda-by-the-Sea Episcopal Church. It was announced on November 30, 2013, that the two were expecting their first child together. On February 11, 2014, Prieto gave birth to identical twin daughters named Victoria and Ysabel. Jordan became a grandfather in 2019 when his daughter Jasmine gave birth to a son, whose father is professional basketball player Rakeem Christmas.
Jordan is one of the most marketed sports figures in history. He has been a major spokesman for such brands as Nike, Coca-Cola, Chevrolet, Gatorade, McDonald's, Ball Park Franks, Rayovac, Wheaties, Hanes, and MCI. Jordan has had a long relationship with Gatorade, appearing in over 20 commercials for the company since 1991, including the "Be Like Mike" commercials in which a song was sung by children wishing to be like Jordan.
Nike created a signature shoe for Jordan, called the Air Jordan, in 1984. One of Jordan's more popular commercials for the shoe involved Spike Lee playing the part of Mars Blackmon. In the commercials Lee, as Blackmon, attempted to find the source of Jordan's abilities and became convinced that "it's gotta be the shoes". The hype and demand for the shoes even brought on a spate of "shoe-jackings" where people were robbed of their sneakers at gunpoint. Subsequently, Nike spun off the Jordan line into its own division named the "Jordan Brand". The company features an impressive list of athletes and celebrities as endorsers. The brand has also sponsored college sports programs such as those of North Carolina, California, Georgetown, and Marquette.
Jordan also has been associated with the Looney Tunes cartoon characters. A Nike commercial shown during 1992's Super Bowl XXVI featured Jordan and Bugs Bunny playing basketball. The Super Bowl commercial inspired the 1996 live action/animated film "Space Jam", which starred Jordan and Bugs in a fictional story set during the former's first retirement from basketball. They have subsequently appeared together in several commercials for MCI. Jordan also made an appearance in the music video for Michael Jackson's "Jam" (1992).
Jordan's yearly income from the endorsements is estimated to be over $40 million. In addition, when Jordan's power at the ticket gates was at its highest point, the Bulls regularly sold out both their home and road games. Due to this, Jordan set records in player salary by signing annual contracts worth in excess of US$30 million per season. An academic study found that Jordan's first NBA comeback resulted in an increase in the market capitalization of his client firms of more than $1 billion.
Most of Jordan's endorsement deals, including his first deal with Nike, were engineered by his agent, David Falk. Jordan has described Falk as "the best at what he does" and that "marketing-wise, he's great. He's the one who came up with the concept of 'Air Jordan.'"
In June 2010, Jordan was ranked by "Forbes" magazine as the 20th-most powerful celebrity in the world with $55 million earned between June 2009 and June 2010. According to the "Forbes" article, Jordan Brand generates $1 billion in sales for Nike. In June 2014, Jordan was named the first NBA player to become a billionaire, after he increased his stake in the Charlotte Hornets from 80% to 89.5%. On January 20, 2015, Jordan was honored with the "Charlotte Business Journal"'s Business Person of the Year for 2014. In 2017, he became a part owner of the Miami Marlins of Major League Baseball.
"Forbes" designated Jordan as the athlete with the highest career earnings in 2017. From his Jordan Brand income and endorsements, Jordan's 2015 income was an estimated $110 million, the most of any retired athlete. His net worth is estimated at $2.1 billion by "Forbes". Jordan is the fourth-richest African-American as of 2019, behind Robert F. Smith, David Steward, and Oprah Winfrey.
Jordan co-owns an automotive group which bears his name. The company has a Nissan dealership in Durham, North Carolina, acquired in 1990, and formerly had a Lincoln–Mercury dealership from 1995 until its closure in June 2009. The company also owned a Nissan franchise in Glen Burnie, Maryland. The restaurant industry is another business interest of Jordan's. His restaurants include a steakhouse in New York City's Grand Central Terminal, among others. Jordan is the majority investor in a golf course, Grove XXIII, under construction in Hobe Sound, Florida.
From 2001 to 2014, Jordan hosted an annual golf tournament, the Michael Jordan Celebrity Invitational, that raised money for various charities. In 2006, Jordan and his wife Juanita pledged $5 million to Chicago's Hales Franciscan High School. The Jordan Brand has made donations to Habitat for Humanity and a Louisiana branch of the Boys & Girls Clubs of America.
The Make-A-Wish Foundation named Jordan its Chief Wish Ambassador in 2008. Five years later, he granted his 200th wish for the organization. As of 2019, he has raised more than $5 million for the Make-A-Wish Foundation.
In 2015, Jordan donated a settlement of undisclosed size from a lawsuit against supermarkets that had used his name without permission to 23 different Chicago charities. Jordan funded two Novant Health Michael Jordan Family Clinics in Charlotte, North Carolina, in 2017 by giving $7 million, the biggest donation he had made at the time. One year later, after Hurricane Florence damaged parts of North Carolina, including his former hometown of Wilmington, Jordan donated $2 million to relief efforts. He gave $1 million to aid the Bahamas' recovery following Hurricane Dorian in 2019.
On June 5, 2020, in the wake of the protests following the killing of George Floyd, Jordan and his brand announced in a joint statement that they will be donating $100 million over the next 10 years to organizations dedicated to "ensuring racial equality, social justice and greater access to education."
Jordan played himself in the 1996 comedy film "Space Jam". The film was a box office success, making $230 million worldwide, although the reviews were mixed.
In 2000, Jordan was the subject of an IMAX documentary about his career with the Chicago Bulls, especially the 1998 championship season, entitled "Michael Jordan to the Max". Two decades later, the same period of Jordan's life was covered in much greater and more personal detail by "The Last Dance", a 10-part TV documentary which debuted on ESPN in April and May 2020. "The Last Dance" relied heavily on about 500 hours of candid film of Jordan's and his teammates' off-court activities which an NBA Entertainment crew had shot over the course of the 1997–98 NBA season for use in a documentary. The project was delayed for many years because Jordan had not yet given his permission for the footage to be used.
Jordan has authored several books focusing on his life, basketball career, and world view.
Musicology
Musicology (from Greek 'μουσική' (mousikē) for 'music' and 'λογος' (logos) for 'domain of study') is the scholarly analysis and research-based study of music. Musicology departments traditionally belong to the humanities, although some music research is scientific in focus (psychological, sociological, acoustical, neurological, computational). A scholar who participates in musical research is a musicologist.
Musicology traditionally is divided into three main branches: historical musicology, systematic musicology and ethnomusicology. Historical musicologists mostly study the history of the so-called Western classical tradition, though the study of music history need not be limited to that. Ethnomusicologists draw from anthropology (particularly field research) to understand how and why people make music. Systematic musicology includes music theory, aesthetics, pedagogy, musical acoustics, the science and technology of musical instruments, and the musical implications of physiology, psychology, sociology, philosophy and computing. Cognitive musicology studies the phenomena surrounding the cognitive modeling of music. When musicologists carry out research using computers, their research often falls under the field of computational musicology. Music therapy is a specialized form of applied musicology which is sometimes considered more closely affiliated with health fields, and at other times regarded as part of musicology proper.
The 19th century philosophical trends that led to the re-establishment of formal musicology education in German and Austrian universities had combined methods of systematization with evolution. These models were established not only in the field of physical anthropology, but also cultural anthropology. This was influenced by Hegel's ideas on ordering "phenomena" from the simple to complex as the stages of evolution are classified from primitive to developed, and stages of history from ancient to modern. Comparative methods became more widespread in diverse disciplines from anatomy to Indo-European linguistics, and beginning around 1880, also in comparative musicology.
The parent disciplines of musicology include:
Musicology also has two central, practically oriented sub-disciplines with no parent discipline: performance practice and research (sometimes viewed as a form of artistic research), and the theory, analysis and composition of music. The disciplinary neighbors of musicology address other forms of art, performance, ritual and communication, including the history and theory of the visual and plastic arts and of architecture; linguistics, literature and theater; religion and theology; and sport. Musical knowledge is applied in medicine, education and music therapy—which, effectively, are parent disciplines of applied musicology.
Music history or historical musicology is concerned with the composition, performance, reception and criticism of music over time. Historical studies of music are for example concerned with a composer's life and works, the developments of styles and genres, "e.g.", baroque concertos, the social function of music for a particular group of people, "e.g.", court music, or modes of performance at a particular place and time, "e.g.", Johann Sebastian Bach's choir in Leipzig. Like the comparable field of art history, different branches and schools of historical musicology emphasize different types of musical works and approaches to music. There are also national differences in various definitions of historical musicology. In theory, "music history" could refer to the study of the history of any type or genre of music, "e.g.", the history of Indian music or the history of rock. In practice, these research topics are more often considered within ethnomusicology (see below) and "historical musicology" is typically assumed to imply Western Art music of the European tradition.
The methods of historical musicology include source studies (especially manuscript studies), paleography, philology (especially textual criticism), style criticism, historiography (the choice of historical method), musical analysis (analysis of music to find "inner coherence") and iconography. The application of musical analysis to further these goals is often a part of music history, though pure analysis or the development of new tools of music analysis is more likely to be seen in the field of music theory. Music historians create a number of written products, ranging from journal articles describing their current research, new editions of musical works, biographies of composers and other musicians, book-length studies or university textbook chapters or entire textbooks. Music historians may examine issues in a close focus, as in the case of scholars who examine the relationship between words and music for a given composer's art songs. On the other hand, some scholars take a broader view, and assess the place of a given type of music, such as the symphony in society using techniques drawn from other fields, such as economics, sociology or philosophy.
"New musicology" is a term applied since the late 1980s to a wide body of work emphasizing cultural study, analysis and criticism of music. Such work may be based on feminist, gender studies, queer theory or postcolonial theory, or the work of Theodor W. Adorno. Although New Musicology emerged from within historical musicology, the emphasis on cultural study within the Western art music tradition places New Musicology at the junction between historical, ethnological and sociological research in music.
New musicology was a reaction against traditional historical musicology, which according to Susan McClary, "fastidiously declares issues of musical signification off-limits to those engaged in legitimate scholarship." Charles Rosen, however, retorts that McClary, "sets up, like so many of the 'new musicologists', a straw man to knock down, the dogma that music has no meaning, and no political or social significance." Today, many musicologists no longer distinguish between musicology and new musicology since it has been recognized that many of the scholarly concerns once associated with new musicology already were mainstream in musicology, so that the term "new" no longer applies.
Ethnomusicology, formerly comparative musicology, is the study of music in its cultural context. It is often considered the anthropology or ethnography of music. Jeff Todd Titon has called it the study of "people making music". Although it is most often concerned with the study of non-Western musics, it also includes the study of Western music from an anthropological or sociological perspective, cultural studies and sociology as well as other disciplines in the social sciences and humanities. Some ethnomusicologists primarily conduct historical studies, but the majority are involved in long-term participant observation or combine ethnographic, musicological, and historical approaches in their fieldwork. Therefore, ethnomusicological scholarship can be characterized as featuring a substantial, intensive fieldwork component, often involving long-term residence within the community studied.
Closely related to ethnomusicology is the emerging branch of sociomusicology. For instance, Ko (2011) proposed the hypothesis of "Biliterate and Trimusical" in Hong Kong sociomusicology.
Popular music studies, known, misleadingly, as "popular musicology", emerged in the 1980s as an increasing number of musicologists, ethnomusicologists and other varieties of historians of American and European culture began to write about popular musics past and present. The first journal focusing on popular music studies was "Popular Music", which began publication in 1981. The same year, an academic society devoted solely to the topic was formed: the International Association for the Study of Popular Music. The Association's founding was partly motivated by the interdisciplinary agenda of popular musicology, though the group has been characterized by a polarized 'musicological' and 'sociological' approach also typical of popular musicology.
Music theory is a field of study that describes the elements of music and includes the development and application of methods for composing and for analyzing music through both notation and, on occasion, musical sound itself. Broadly, theory may include any statement, belief or conception of or about music (Boretz, 1995). A person who studies or practices music theory is a music theorist.
Some music theorists attempt to explain the techniques composers use by establishing rules and patterns. Others model the experience of listening to or performing music. Though extremely diverse in their interests and commitments, many Western music theorists are united in their belief that the acts of composing, performing and listening to music may be explicated to a high degree of detail (this, as opposed to a conception of musical expression as fundamentally ineffable except in musical sounds). Generally, works of music theory are both descriptive and prescriptive, attempting both to define practice and to influence later practice.
Musicians study music theory to understand the structural relationships in the (nearly always notated) music. Composers study music theory to understand how to produce effects and structure their own works. Composers may study music theory to guide their precompositional and compositional decisions. Broadly speaking, music theory in the Western tradition focuses on harmony and counterpoint, and then uses these to explain large scale structure and the creation of melody.
Music psychology applies the content and methods of all subdisciplines of psychology (perception, cognition, motivation, etc.) to understand how music is created, perceived, responded to, and incorporated into individuals' and societies' daily lives. Its primary branches include cognitive musicology, which emphasizes the use of computational models for human musical abilities and cognition, and the cognitive neuroscience of music, which studies the way that music perception and production manifests in the brain using the methodologies of cognitive neuroscience. While aspects of the field can be highly theoretical, much of modern music psychology seeks to optimize the practices and professions of music performance, composition, education and therapy.
Performance practice draws on many of the tools of historical musicology to answer the specific question of how music was performed in various places at various times in the past. Although previously confined to early music, recent research in performance practice has embraced questions such as how the early history of recording affected the use of vibrato in classical music, or how instruments were used in klezmer.
Within the rubric of musicology, performance practice tends to emphasize the collection and synthesis of evidence about how music should be performed. The other important side, learning how to sing authentically or to perform on a historical instrument, is usually part of conservatory or other performance training. However, many top researchers in performance practice are also excellent musicians.
Music performance research (or music performance science) is strongly associated with music psychology. It aims to document and explain the psychological, physiological, sociological and cultural details of how music is actually performed (rather than how it should be performed). The approach to research tends to be systematic and empirical and to involve the collection and analysis of both quantitative and qualitative data. The findings of music performance research can often be applied in music education.
Musicologists in tenure track professor positions typically hold a Ph.D in musicology. In the 1960s and 1970s, some musicologists obtained professor positions with an M.A. as their highest degree, but in the 2010s, the Ph.D is the standard minimum credential for tenure track professor positions. As part of their initial training, musicologists typically complete a B.Mus or a B.A. in music (or a related field such as history) and in many cases an M.A. in musicology. Some individuals apply directly from a bachelor's degree to a Ph.D, and in these cases, they may not receive an M.A. In the 2010s, given the increasingly interdisciplinary nature of university graduate programs, some applicants for musicology Ph.D programs may have academic training both in music and outside of music (e.g., a student may apply with a B.Mus and an M.A. in psychology). In music education, individuals may hold an M.Ed and an Ed.D.
Most musicologists work as instructors, lecturers or professors in colleges, universities or conservatories. The job market for tenure track professor positions is very competitive. Entry-level applicants must hold a completed Ph.D or the equivalent degree, and applicants to more senior professor positions must have a strong record of publishing in peer-reviewed journals. Some Ph.D-holding musicologists are only able to find insecure positions as sessional lecturers. The job tasks of a musicologist are the same as those of a professor in any other humanities discipline: teaching undergraduate and/or graduate classes in their area of specialization and, in many cases, some general courses (such as Music Appreciation or Introduction to Music History); conducting research in their area of expertise, publishing articles about their research in peer-reviewed journals, and authoring book chapters, books or textbooks; traveling to conferences to give talks on their research and learn about research in their field; and, if their program includes a graduate school, supervising M.A. and Ph.D students, giving them guidance on the preparation of their theses and dissertations. Some musicology professors may take on senior administrative positions in their institution, such as Dean or Chair of the School of Music.
The vast majority of major musicologists and music historians of past generations have been men; in the 19th and early 20th centuries, women's involvement in teaching music was mainly in elementary and secondary music teaching. Nevertheless, some women musicologists have reached the top ranks of the profession. Carolyn Abbate (born 1956) is an American musicologist who did her PhD at Princeton University. She has been described by the "Harvard Gazette" as "one of the world's most accomplished and admired music historians".
Susan McClary (born 1946) is a musicologist associated with new musicology who incorporates feminist music criticism in her work. McClary holds a PhD from Harvard University. One of her best known works is "Feminine Endings" (1991), which covers musical constructions of gender and sexuality, gendered aspects of traditional music theory, gendered sexuality in musical narrative, music as a gendered discourse and issues affecting women musicians.
Other notable women scholars include:
Many musicology journals are only available in print or through pay-for-access portals. This list, however, contains a sample of peer reviewed and open-access journals in various subfields as examples of musicological writings:
A list of open-access European journals in the domains of music theory and/or analysis is available on the website of the European Network for Theory & Analysis of Music. A more complete list of open-access journals in theory and analysis can be found on the website of the Société Belge d'Analyse Musicale (in French).
Film promotion
Film promotion is the practice of promotion specifically in the film industry, and usually occurs in coordination with the process of film distribution. Sometimes called the press junket or film junket, film promotion generally includes press releases, advertising campaigns, merchandising, franchising, media and interviews with the key people involved with the making of the film, like actors and directors. As in all business, promotion is an important part of any release because of the inherent high financial risk; film studios will invest in expensive marketing campaigns to maximize revenue early in the release cycle. Marketing budgets tend to equal about half the production budget. Publicity is generally handled by the distributor and exhibitors.
Trailers are a mainstay of film promotion, because they are delivered directly to movie-goers. They screen in theatres before movie showings. Generally they tell the story of the movie in a highly condensed fashion, compressing maximum appeal into two and a half minutes.
Film actors, directors, and producers appear for television, cable, radio, print, and online media interviews, which can be conducted in person or remotely. During film production, these can take place "on set". After the film's premiere, key personnel make appearances in major market cities or participate remotely via satellite videoconference or telephone. The purpose of interviews is to encourage journalists to publish stories about their "exclusive interviews" with the film's stars, thereby creating "marketing buzz" around the film and stimulating audience interest in watching the film.
When it comes to feature films picked up by a major film studio for international distribution, promotional tours are notoriously grueling. Key cast and crew are often contracted to travel to several major cities around the world to promote the film and sit for dozens of interviews. In every interview they are supposed to stay "on message" by energetically expressing their enthusiasm for the film in a way that appears candid, fun, and fresh, even though it may be their fifth or sixth interview that day. They are expected to disclose just enough juicy "behind-the-scenes" information about the filmmaking process or the filmmakers' artistic vision to make each journalist feel like he or she got a scoop, while tactfully avoiding disclosure of anything embarrassing, humiliating or truly negative that could hurt the film's box office gross and profit, sway critics' reviews, or damage public opinion.
There are seven distinct types of research conducted by film distributors in connection with domestic theatrical releases, according to "Marketing to Moviegoers: Second Edition." Such audience research can cost $1 million per film, especially when scores of TV advertisements are tested and re-tested. The bulk of research is done by major studios for the roughly 170 major releases they mount each year that are supported by tens of millions of advertising buys for each film. Independent film distributors, which typically spend less than $10 million in media buys per film, don’t have the budget or breadth of advertising materials to analyze, so they spend little or nothing on pre-release audience research.
When audience research is conducted for domestic theatrical release, it involves these areas:
Marketing can play a big role in whether or not a film gets the green light. Audience research is a strong factor in determining the ability of a film to sell in theaters, which is ultimately how films make their money. As part of a movie's marketing strategy, audience research comes into account as producers create promotional materials. These promotional materials consistently change and evolve as a direct consequence of audience research up until the film opens in theaters.
M25 motorway
The M25 or London Orbital Motorway is a major road encircling almost all of Greater London, England. The Dartford Crossing (A282) is part of the orbital route but is not part of the motorway. The M25 is one of the most important roads in Britain and one of the busiest. The final section was opened by the Prime Minister Margaret Thatcher in 1986; on opening, it was the longest ring road in Europe.
Patrick Abercrombie had proposed an orbital motorway around London in the 1944 "Greater London Plan", which evolved into the London Ringways project in the early 1960s. By 1966, planning had started on two projects, Ringway 3 to the north and Ringway 4 to the south. By the time the first sections opened in 1975, it was decided the ringways would be combined into a single orbital motorway. The M25 was one of the first motorway projects to consider environmental concerns, and almost 40 public inquiries took place. Despite some protest, the road was built as planned, including the section over the North Downs and the section around Epping Forest, which required an extension of the Bell Common Tunnel.
Although the M25 was popular during construction, it quickly became apparent that there was insufficient traffic capacity. Because of the public inquiries, several junctions merely served local roads where office and retail developments were built, attracting even more traffic onto the M25 than it was designed for. The congestion has led to traffic management schemes that include variable speed limits and smart motorways. Since opening, the M25 has been progressively widened, particularly near Heathrow Airport.
In some cases, such as the Communications Act 2003, it is used as a de facto reference to Greater London.
The M25 almost completely encircles Greater London and passes briefly through it to the east. Junctions 1A–5 are in Kent, 6–14 are in Surrey, 15–16 are in Buckinghamshire, 17–25 are in Hertfordshire, and 26–31 are in Essex. Policing of the road is carried out by an integrated group made up of the Metropolitan, Thames Valley, Essex, Kent, Hertfordshire and Surrey forces. Primary destinations signed ahead on the motorway include the Dartford Crossing, Sevenoaks, Gatwick Airport, Heathrow Airport, Watford, Stansted Airport and Brentwood.
To the east of London the two ends of the M25 are joined to complete a loop by the non-motorway A282 Dartford Crossing of the River Thames between Thurrock and Dartford. The crossing consists of twin two-lane tunnels and the four-lane QE2 (Queen Elizabeth II) bridge. Passage across the bridge or through the tunnels is subject to a charge between 6 am and 10 pm, its level depending on the kind of vehicle. The road is not under motorway regulations so that other traffic can cross the Thames east of the Woolwich Ferry; the only crossing further to the east is a passenger ferry between Gravesend, Kent, and Tilbury, Essex.
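The charging rule described above — a crossing charge that applies only between 6 am and 10 pm and varies by vehicle class — can be sketched as a simple lookup. The rates below are made-up placeholders for illustration, not the actual Dartford Crossing tariff; only the time window comes from the text.

```python
# Hypothetical sketch of a time-windowed, vehicle-class-dependent charge,
# as described for the Dartford Crossing. Rates are illustrative only.
RATES = {"motorcycle": 0.0, "car": 2.5, "van": 3.0, "hgv": 6.0}  # assumed values

def crossing_charge(vehicle_class: str, hour: int) -> float:
    """Charge for one crossing at the given hour of day (0-23)."""
    if not 6 <= hour < 22:   # free between 10 pm and 6 am
        return 0.0
    return RATES[vehicle_class]

print(crossing_charge("car", 8))    # daytime crossing -> 2.5
print(crossing_charge("car", 23))   # overnight crossing -> 0.0
```

A real implementation would also handle pre-paid accounts and exemptions, which the text does not detail.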
At Junction 5, the clockwise carriageway of the M25 is routed off the main north–south dual carriageway onto the main east–west dual carriageway, with the main north–south carriageway becoming the A21. In the opposite direction, to the east of the point where the M25 diverges from the main east–west carriageway, that carriageway becomes the M26 motorway. From here to Junction 8, the M25 follows the edge of the North Downs close to several historic buildings such as Chevening, Titsey Place, Hever Castle and Chartwell. The interchange with the M23 motorway near Reigate is a four-level stack, one of only a few examples in Britain. Past this, the M25 runs close to the Surrey Hills AONB.
To the west, the M25 passes close to the edge of Heathrow, and within sight of Windsor Castle. North of this, it crosses the Chiltern Main Line under the Chalfont Viaduct, a 19th-century railway bridge. Red kites can often be seen overhead to the north of this, up to Junction 21. The northern section of the M25 passes close to All Saints Pastoral Centre near London Colney, Waltham Abbey and Copped Hall. This section also features two cut-and-cover tunnels, including the Bell Common Tunnel. The north-eastern section of the motorway passes close to North Ockendon, the only settlement of Greater London situated outside the M25. It then runs close to the Rainham Marshes Nature Reserve before reaching the northern end of the Dartford Crossing.
In 2004, following an opinion poll, the London Assembly proposed aligning the Greater London boundary with the M25. "Inside the M25" and "outside/beyond the M25" are colloquial, looser alternatives to "Greater London" sometimes used in haulage. The Communications Act 2003 explicitly uses the M25 as the boundary in requiring a proportion of television programmes to be made outside the London area; it states a requirement of "a suitable proportion of the programmes made in the United Kingdom" to be made "in the United Kingdom outside the M25 area", defined in Section 362 as "the area the outer boundary of which is represented by the London Orbital Motorway (M25)".
Sections of the M25 form part of two long-distance E-roads, designated by the United Nations Economic Commission for Europe. The E15, which runs from Inverness to Algeciras, follows the M25 and A282 clockwise from the A1(M) at junction 23 to the M20 at junction 3; while the E30 Cork to Omsk route runs from the M4 at junction 15, clockwise to the A12 at junction 28. The United Kingdom is formally part of the E-roads network but, unlike in other countries, these routes are not marked on any road signs.
The M25 was originally built mostly as a dual three-lane motorway. Almost half of it has since been widened to dual four lanes, with a dual five-lane section between Junctions 12 and 14 and a dual six-lane section between Junctions 14 and 15. Further widening of minor sections is in progress, with plans for smart motorways in many others.
Two motorway service areas are on the M25, and two others are directly accessible from it. Those on the M25 are Clacket Lane between Junctions 5 and 6 (in the south-east) and Cobham between Junctions 9 and 10 (in the south-west). Those directly accessible from it are South Mimms off Junction 23 (to the north of London) and Thurrock off Junction 31 (to the east of London).
As is common with other motorways, the M25 is equipped with emergency ("SOS") telephones. These connect to two Highways England operated control centres at Godstone (for Junctions 1 to 15 inclusive) and South Mimms (for 16–31). The Dartford Crossing has a dedicated control centre. There is an extensive network of closed circuit television (CCTV) on the motorway so incidents can be easily identified and located. A number of 4×4 vehicles patrol the motorway, attempting to keep traffic moving where possible, and assisting the local police. They can act as a rolling roadblock when there are obstacles on the road.
When completed, the M25 had street lighting along only part of its length. Originally, low-pressure sodium (SOX) lighting was the most prominent technology used, but this has been gradually replaced with high-pressure sodium (SON) lighting. The motorway now has more than 10,000 streetlights. The M25 has a number of pollution control valves along its length, which can shut off drainage in the event of a chemical or fuel spill.
The idea of a general bypass around London was first proposed early in the 20th century. An outer orbital route around the capital had been suggested in 1913, and was re-examined as a motorway route in Sir Charles Bressey's and Sir Edwin Lutyens' "The Highway Development Survey, 1937". Sir Patrick Abercrombie's "County of London Plan, 1943" and "Greater London Plan, 1944" proposed a series of five roads encircling the capital. The northern sections of the M25 follow a similar route to the Outer London Defence Ring, a concentric series of anti-tank defences and pillboxes designed to slow down a potential German invasion of the capital during World War II. This was marked as the D Ring on Abercrombie's plans. Following the war, 11 separate county councils told the Ministry of Transport that an orbital route was "first priority" for London.
Plans stalled because the route was planned to pass through several urban areas, which attracted criticism. The original D Ring through north-west London was intended to be a simple upgrade of streets. In 1951, Middlesex County Council planned a route for the orbital road through the county, passing through Eastcote and west of Bushey, connecting with the proposed M1 motorway, but it was rejected by the Ministry two years later. An alternative route via Harrow and Ealing was proposed, but this was abandoned after the council revealed the extent of property demolition required.
In 1964, the London County Council announced the London Ringways plan, to consist of four concentric motorway rings around London. The following year, the transport minister Barbara Castle announced that building the D Ring would be essential. The component parts of what became the M25 came from Ringway 3 / M16 motorway in the north and Ringway 4 in the south.
The Ringways plan was controversial owing to the destruction required for the inner two ring roads, (Ringway 1 and Ringway 2). Parts of Ringway 1 were constructed (including the West Cross Route), despite stiff opposition, before the overall plan was postponed in February 1972. In April 1973, the Greater London Council elections resulted in a Labour Party victory; the party then formally announced the cancellation of the Ringways running inside Greater London. This did not affect the routes that would become the M25, because they were planned as central government projects from the outset.
There was no individual public inquiry into the M25 as a whole. Each section was presented to planning authorities in its own right and was individually justified, with 39 separate public inquiries relating to sections of the route. The need for the ministry to negotiate with local councils meant that more junctions with local traffic were built than originally proposed. A report in 1981 showed that the M25 had the potential to attract office and retail development along its route, negating the proposed traffic improvements and making Central London a less desirable place to work. None of the motorway was prevented from being built by objections at the public inquiries. However, as a consequence of the backlash against the Ringways, and criticism at the public inquiries, the motorway was built with environmental concerns in mind. New features included additional earth mounds, cuttings and fences that reduced noise, and over two million trees and shrubs to hide the view of the road.
Construction of parts of the two outer ring roads, Ringways 3 and 4, began in 1973. The first section, between South Mimms and Potters Bar in Hertfordshire (Junctions 23 to 24), opened in September 1975. It was provisionally known as the M16 and was given the temporary general-purpose road designation A1178. A section of the North Orbital Road between Rickmansworth and Hunton Bridge was proposed in 1966, with detailed planning in 1971. The road was constructed to motorway standards and opened in October 1976. It eventually became part of the M25's route. The section to the south, from Heathrow Airport to Rickmansworth, had five separate routes proposed when a public inquiry was launched in 1974. The Department of Transport sent out 15,000 questionnaires about the preferred route, with 5,000 replies. A route was fixed in 1978, with objections delaying the start of construction until 1982.
The southern section of what became the M25 through Surrey and Kent was first conceived to be an east–west road south of London to relieve the A25, and running parallel to it, with its eastern end following the route of what is now the M26. It was originally proposed as an all-purpose route, but was upgraded to motorway standard in 1966. It was the first section of the route announced as M25 from the beginning. The first section from Godstone to Reigate (Junctions 6 to 8) was first planned in 1966 and opened in February 1976. A section of Ringway 3 south of the river between Dartford and Swanley (Junctions 1 to 3) was constructed between May 1974 and April 1977.
In 1975, following extensive opposition to some parts of Ringway 3 through Middlesex and South London, the transport minister John Gilbert announced that the north section of Ringway 3 already planned would be combined with the southern section of Ringway 4, forming a single orbital motorway to be known as the M25, and the M16 designation was dropped. This scheme required two additional sections to join what were two different schemes, from Swanley to Sevenoaks in the south-east and Hunton Bridge to Potters Bar in the north-west. The section of Ringway 3 west of South Mimms anti-clockwise around London to Swanley in Kent was cancelled.
The section from Potters Bar to the Dartford Tunnel was constructed in stages from June 1979 onwards, with the final section between Waltham Cross (Junction 25) to Theydon Garnon (Junction 27) opening in January 1984. This section, running through Epping Forest, attracted opposition and protests. In 1973, local residents had parked combine harvesters in Parliament Square in protest against the road, draped with large banners reading "Not Epping Likely". As a consequence of this, the Bell Common Tunnel that runs in this area is twice as long as originally proposed.
The most controversial section of the M25 was that between Swanley and Sevenoaks (Junctions 3 to 5) in Kent, across the Darenth Valley, Badgers Mount and the North Downs. An 1,800-member group named Defend Darenth Valley and the North Downs Action Group (DANDAG) argued that the link was unnecessary, that it would damage an Area of Outstanding Natural Beauty, and that it would primarily be used by local traffic as a bypass for the old A21 road between Farnborough and Sevenoaks. After a lengthy inquiry process, chaired by George Dobry QC, the transport minister Kenneth Clarke announced the motorway would be built as proposed.
The section from the M40 motorway to the 1970s North Orbital Road construction (Junctions 16 to 17) opened in January 1985. The route under the Chalfont Viaduct meant the motorway was restricted to a width of three lanes in each direction.
The Prime Minister Margaret Thatcher officially opened the M25 on 29 October 1986, with a ceremony in the section between Junctions 22 and 23 (London Colney and South Mimms). To avoid the threat of road protesters, the ceremony was held a quarter of a mile from the nearest bridge. The total estimated cost of the motorway was around £1 billion, and construction required vast quantities of concrete and asphalt as well as the removal of large volumes of spoil. Upon completion, it was the longest orbital motorway in the world. At the opening ceremony, Thatcher announced that 98 miles had been constructed while the Conservatives were in office, calling it "a splendid achievement for Britain". A 58-page brochure was published, commemorating the completion of the motorway.
The M25 was initially popular with the public. In the 1987 general election, the Conservatives won in every constituency that the motorway passed through, in particular gaining Thurrock from Labour. Coach tours were organised for a trip around the new road. However, it quickly became apparent that the M25 suffered from chronic congestion. A report in "The Economist" said it "had taken 70 years to plan [the motorway], 12 to build it and just one to find it was inadequate". Thatcher dismissed the negative response as "carping and criticism".
Traffic levels quickly exceeded the maximum design capacity. Two months before opening, the government admitted that the three-lane section between Junctions 11 and 13 was inadequate, and that it would have to be widened to four. In 1990 the Secretary of State for Transport announced plans to widen the whole of the M25 to four lanes. By 1993 the motorway, designed for a maximum of 88,000 vehicles per day, was carrying 200,000. At this time, the M25 carried 15% of UK motorway traffic and there were plans to add six lanes to the section from Junctions 12 to 15 as well as widening the rest of the motorway to four lanes.
In parts, particularly the western third, this plan went ahead, due to consistent congestion. Again, however, plans to widen further sections to eight lanes (four each way) were scaled back in 2009 in response to rising costs. The plans were reinstated in the agreed Highways Agency 2013–14 business plan.
In June 1992, the Department for Transport (DfT) announced a proposal to widen the section close to Heathrow Airport to fourteen lanes by way of three additional link roads. This attracted fierce opposition from road protesters opposing the Newbury Bypass and other schemes, but also from local authorities. Surrey County Council led a formal objection to the widening scheme. It was cancelled shortly afterwards. In 1994, the Standing Advisory Committee on Trunk Road Appraisal published a report saying that "the M25 experience most probably does ... serve as an example of a case where roads generate traffic" and that further improvements to the motorway were counterproductive. In April 1995, the Transport Minister Brian Mawhinney announced that the Heathrow link roads would be scrapped.
In 1995 a contract was awarded to widen the section between Junctions 8 and 10 from six to eight lanes for a cost of £93.4 million, and a Motorway Incident Detection and Automatic Signalling (MIDAS) system was introduced to the M25 from Junction 10 to Junction 15 at a cost of £13.5m in 1995. This was then extended to Junction 16 at a cost of £11.7m in 2002. This consists of a distributed network of traffic and weather sensors, speed cameras and variable-speed signs that control traffic speeds with little human supervision, and has improved traffic flow slightly, reducing the amount of start-stop driving.
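The MIDAS approach described above — sensors feeding variable-speed signs that slow traffic as flow rises, reducing stop-start driving — can be illustrated with a threshold rule. The function name, thresholds and sign values below are assumptions for illustration, not the real system's parameters.

```python
# Hypothetical sketch of threshold-based variable speed control, loosely
# modelled on the MIDAS idea: as measured flow rises, lower speeds are
# displayed to smooth traffic. All numbers are illustrative assumptions.

def advisory_speed_mph(vehicles_per_hour_per_lane: int) -> int:
    """Return the speed (mph) shown on variable signs for a measured flow."""
    if vehicles_per_hour_per_lane < 1200:
        return 70   # free-flowing: national speed limit
    elif vehicles_per_hour_per_lane < 1650:
        return 60   # busy: begin smoothing the flow
    elif vehicles_per_hour_per_lane < 2000:
        return 50
    else:
        return 40   # near capacity: low speed damps stop-start shockwaves

# Example: a loop detector reports 1700 vehicles per hour per lane
print(advisory_speed_mph(1700))  # -> 50
```

The real system also incorporates weather data, camera enforcement and queue-protection logic, which this sketch omits.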
After Labour won the 1997 election, the road budget was cut from £6 billion to £1.4 billion. However, the DfT announced new proposals to widen the section between Junction 12 (M3) and Junction 15 (M4) to twelve lanes. At the Heathrow Terminal 5 public inquiry, a Highways Agency official said that the widening was needed to accommodate traffic to the proposed new terminal; however, the transport minister said that no such evidence had been given. Environmental groups objected to the decision to go ahead with a scheme that would create the widest motorways in the UK without holding a public inquiry. Friends of the Earth claimed the real reason for the widening was to support Terminal 5. The decision was again deferred. A ten-lane scheme was announced in 1998 and the £148 million 'M25 Jct 12 to 15 Widening' contract was awarded to Balfour Beatty in 2003. The scheme was completed in 2005 as dual-five lanes between Junctions 12 and 14 and dual-six lanes from Junctions 14 to 15.
In 2007, Junction 25 (A10/Waltham Cross) was remodelled to increase capacity. The nearby Holmesdale Tunnel was widened to three lanes in an easterly direction, and an additional left-turn lane added from the A10 onto the motorway. The total cost was £75 million.
Work to widen the exit slip-roads in both directions at Junction 28 (A12 / A1023) was completed in 2008. It was designed to reduce the amount of traffic queueing on the slip roads at busy periods, particularly traffic from the clockwise M25 joining the northbound A12. In 2018, a new scheme was proposed as the junction had reached capacity at over 7,500 vehicles per hour. This would involve building a two-lane link road between the M25 and the A12. The work is expected to be completed around 2021/22.
In 2006 the Highways Agency proposed widening of the M25 from six to eight lanes, between Junctions 5 and 6, and 16 to 30, as part of a Design, Build, Finance and Operate (DBFO) project. A shortlist of contractors was announced in October 2006 for the project, which was expected to cost £4.5 billion. Contractors were asked to resubmit their bids in January 2008, and in June 2009 the new transport minister indicated that the cost had risen to £5.5 billion and the benefit to cost ratio had dropped considerably. In January 2009 the government announced that plans to widen the sections from Junctions 5 to 7 and 23 to 27 had been 'scrapped' and that hard shoulder running would be introduced instead. However, widening to four lanes was reinstated in the 2013–14 Highways Agency Business Plan.
In 2009 a £6.2 billion M25 DBFO private finance initiative contract was awarded to Connect Plus to widen the sections between Junctions 16 to 23 and 27 to 30, and maintain the M25 and the Dartford Crossing for a 30-year period.
Works to widen the section between Junctions 16 (M40) and 23 (A1(M)) to dual four lanes started in July 2009 at an estimated cost of £580 million. The Junction 16 to 21 (M1) section was completed by July 2011 and the Junction 21 to 23 by June 2012. Works to widen the Junctions 27 (M11) to 30 (A13) section to dual four lanes also started in July 2009. The Junction 27 to 28 (A12) section was completed in July 2010, and the Junction 28 to 29 (A127) in June 2011, and finally the Junction 29 to 30 (A13) section opened in May 2012.
Works to introduce smart motorway technology and permanent hard shoulder running on two sections of the M25 began in 2013. The first section between Junctions 5 (A21/M26) and 7 (M23) started construction in May 2013 with the scheme being completed and opened in April 2014. The second section, between Junctions 23 (A1/A1(M)) and 27 (M11), began construction in February 2013 and was completed and opened in November 2014.
In December 2016, Highways England completed the capacity project at Junction 30 (Thurrock) as part of the Thames Gateway Delivery Plan. The £100m scheme included widening the M25 to four lanes, adding additional link roads, and improvements to drainage.
The M25 is one of Europe's busiest motorways. In 2003, a maximum of 196,000 vehicles a day were recorded on the motorway just south of Heathrow, between Junctions 13 and 14. The nearby stretch between Junctions 14 and 15 consistently records the highest daily traffic counts on the British strategic road network, with an average daily flow in 2018 of 219,492 vehicles (below the record peak of 262,842 measured in 2014).
Traffic on the M25 is monitored by Connect Plus Services on behalf of Highways England. The company operates a series of transportable CCTV cameras that can be easily moved into congestion hotspots. This allows operators to see a clear view of the motorway and what can be done to tackle individual areas of congestion. Prior to its liquidation, Carillion was subcontracted to manage traffic on the M25, delivering live alerts from body-worn cameras via 3G, 4G and Wi-Fi.
Since 1995, sections of the M25 have been equipped with variable speed limits. These purposefully slow traffic down in the event of congestion or an obstruction and help manage the traffic flow. The scheme was originally trialled between Junctions 10 and 16, and was made a permanent fixture in 1997.
The Dartford Crossing is the only fixed vehicle crossing of the Thames east of Greater London. It is also the busiest crossing in the United Kingdom, and consequently puts pressure on M25 traffic. Users of the crossing do not pay a toll, but rather a congestion charge; the signs at the crossing are the same as those deployed over the London congestion charge zone. In 2009 the Department for Transport published options for a new Lower Thames Crossing to add capacity to the Dartford Crossing or create a new road and crossing linking to the M2 and M20 motorways. Plans for this stalled, and were cancelled by the Mayor of London, Boris Johnson, in 2013, to be replaced by the Gallions Reach Crossing. Initially a straight ferry replacement for the Woolwich Ferry, this was later changed to be a possible bridge or tunnel.
On 11 December 1984, nine people died and ten were injured in a multiple-vehicle collision between junctions 5 and 6. Dense fog had descended suddenly, and 26 vehicles were involved.
On 16 December 1988, several stolen vehicles were used as getaway cars in a series of robberies and a murder, with the M25 used to move quickly between targets. Three men, including Raphael Rowe, were tried and sentenced to life imprisonment in 1990, but maintained their innocence. They became known as the M25 Three. Rowe studied journalism while in prison and, following his release, became an investigative journalist for the BBC.
In 1996, Kenneth Noye murdered Stephen Cameron in a road rage incident while stopped at traffic lights on an M25 junction. He was convicted in 2000 and sentenced to life imprisonment. He was released in June 2019.
In November 2014, during overnight roadworks, a piece of road surface near Junction 9 at Leatherhead failed to set correctly because of rain. This created a pothole in the road and caused a tailback. The Minister for Transport John Hayes criticised the work and the resulting traffic problems.
The M25 has had problems with animals or birds on the carriageway. In 2009, the Highways Agency reported that they were called out several times a week to remove a swan from the motorway around Junction 13. There have been several reported accidents resulting from horses running onto the main carriageway.
The orbital nature of the motorway, in common with racetracks, lent itself to unofficial, and illegal, motor racing. At the end of the 1980s, before the advent of speed enforcement devices, owners of supercars would meet at night at service stations such as South Mimms and conduct time trials. Times below 1 hour were achieved – an average speed of over 117 mph (188 km/h), which included coming to a halt at the Dartford Tunnel road user charge payment booths. The winner received champagne rather than money. The "Enfield Gazette" referred to an "M25 club", and posters appeared near the M25 advertising the "First London Cannonball Run". The racing had mostly disappeared by the end of the 1980s and was effectively ended by the introduction of speed cameras on the M25.
The video game "M25 Racer", produced by Davilex Games in 1997, simulated the racing phenomenon. A similar game, "TOCA Race Driver 3" by Codemasters, was released in 2006. It was criticised by road safety experts for its realistic portrayal of a V8 supercar completing a lap of the M25 in 42 minutes.
The M25 and the Dartford Crossing are known for frequent traffic jams. This was noticed before the entire road had been completed; at the official opening ceremony Margaret Thatcher complained about "those who carp and criticise". The jams have inspired derogatory names, such as "Britain's Biggest Car Park" and songs (e.g., Chris Rea's "The Road to Hell"). Nevertheless, coach tours around the M25 have continued to run into the 21st century.
The M25 plays a role in the comedy-fantasy novel "Good Omens", as "evidence for the hidden hand of Satan in the affairs of Man". The demon character, Crowley, had manipulated the design of the M25 to resemble a Satanic sigil, and tried to ensure it would anger as many people as possible to drive them off the path of good. The lengthy series of public inquiries for motorways throughout the 1970s, particularly the M25, influenced the opening of "The Hitchhiker's Guide to the Galaxy", where the Earth is destroyed to make way for a hyperspace bypass.
The M25 enjoyed a more positive reputation among ravers in the late 1980s, when this new orbital motorway became a popular route to the parties that took place around the outskirts of London. The use of the M25 for these raves inspired the name of the electronic duo Orbital.
Iain Sinclair's 2002 book and film "London Orbital" is based on a year-long journey around the M25 on foot.
A piece of graffiti on the Chalfont Viaduct, clearly visible from the M25 and reading "Give Peas a Chance" (parodying John Lennon's "Give Peace a Chance"), became popular with the public, attracting its own Facebook group. The message originally read "Peas", supposedly the tag of a London graffiti artist; the rest of the wording is reported to have referred to his frequent clashes with the law. In September 2018, after almost 20 years, the graffiti was vandalised and then removed and replaced with the message "give Helch a break". A spokesman for Network Rail sympathised with the requests to restore the "much-loved graffiti", but said they do not condone people putting their lives at risk by trespassing.
Data from driver location signs provide carriageway identifier information. The numbers on the signs are kilometres from a point on the north side of the Dartford Crossing, while the letter is "A" for the clockwise carriageway and "B" for the anticlockwise. The signs are spaced every 500 metres.
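As an illustration of the sign scheme just described (the sign string format and the decoder function here are hypothetical, not an official specification), a driver location sign reading can be decoded mechanically:

```python
# Hypothetical decoder for M25 driver location signs, assuming the format
# described above: a road number, a carriageway letter ("A" clockwise,
# "B" anticlockwise) and a distance in kilometres from the Dartford datum.

def decode_location_sign(sign: str) -> str:
    """Turn a sign reading like 'M25 A 123.4' into a plain description."""
    parts = sign.split()
    road, carriageway, km = parts[0], parts[1], float(parts[2])
    direction = {"A": "clockwise", "B": "anticlockwise"}[carriageway]
    return f"{road} {direction} carriageway, {km} km from the Dartford datum"

print(decode_location_sign("M25 A 123.4"))
# M25 clockwise carriageway, 123.4 km from the Dartford datum
```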
The M25 has been criticised for having too many junctions; 14 of them serve only local roads. In 2016, Edmund King, president of the Automobile Association, attributed congestion on the M25 to excessive junctions. This leads to "junction hoppers" who only use the motorway for a short distance before exiting; their difference in speed when entering and leaving the main carriageway causes a domino effect, resulting in all vehicles slowing down.
The M25 originally opened without any service areas. The first, at South Mimms, was opened by Margaret Thatcher in June 1987, a week before the election. Thatcher admired the practical and no-frills architecture of Charles Forte and praised him in her opening speech. The second, Clacket Lane, was opened by Robert Key, Minister for Roads and Traffic, on 21 July 1993. Construction was delayed as the remains of a Roman villa were found on the site, requiring archaeological research. The third service area, Cobham, between Junctions 9 and 10, opened on 13 September 2012.
[Junction table omitted: the original table listed distances in miles and km, clockwise exits (A carriageway), anticlockwise exits (B carriageway) and opening dates for the River Thames crossings, Junctions 1A, 1B, 2–21, 21A and 22–31, and the motorway service areas; only the row labels survive in this extract.]
| https://en.wikipedia.org/wiki?curid=20469 |
Mohs scale of mineral hardness
The Mohs scale of mineral hardness is a qualitative ordinal scale characterizing scratch resistance of various minerals through the ability of a harder material to scratch a softer material. Created in 1812 by the German geologist and mineralogist Friedrich Mohs, it is one of several definitions of hardness in materials science, some of which are more quantitative. The method of comparing hardness by observing which minerals can scratch others is of great antiquity, having been mentioned by Theophrastus in his treatise "On Stones" (c. 300 BC), followed by Pliny the Elder in his "Naturalis Historia" (c. AD 77). While greatly facilitating the identification of minerals in the field, the Mohs scale does not show how well hard materials perform in an industrial setting.
Despite its lack of precision, the Mohs scale is relevant for field geologists, who use it to roughly identify minerals with scratch kits. The Mohs hardness of minerals is commonly listed in reference sheets.
Mohs hardness is useful in milling, as it allows assessment of which kind of mill will best reduce a given product whose hardness is known. The scale is also used by electronics manufacturers for testing the resilience of flat panel display components (such as cover glass for LCDs or encapsulation for OLEDs).
As an example, most modern smartphone displays use a glass panel that scratches at level 6 with deeper grooves at level 7 (on the Mohs scale of hardness).
The Mohs scale of mineral hardness is based on the ability of one natural sample of mineral to scratch another mineral visibly. The samples of matter used by Mohs are all different minerals. Minerals are chemically pure solids found in nature. Rocks are made up of one or more minerals. As the hardest known naturally occurring substance when the scale was designed, diamonds are at the top of the scale. The hardness of a material is measured against the scale by finding the hardest material that the given material can scratch, or the softest material that can scratch the given material. For example, if some material is scratched by apatite but not by fluorite, its hardness on the Mohs scale would fall between 4 and 5. "Scratching" a material for the purposes of the Mohs scale means creating non-elastic dislocations visible to the naked eye. Frequently, materials that are lower on the Mohs scale can create microscopic, non-elastic dislocations on materials that have a higher Mohs number. While these microscopic dislocations are permanent and sometimes detrimental to the harder material's structural integrity, they are not considered "scratches" for the determination of a Mohs scale number.
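The bracketing procedure just described can be sketched as a short routine (a hypothetical illustration; in the field the scratch results come from physical tests against the reference minerals):

```python
# A minimal sketch of the Mohs bracketing method: test the unknown sample
# against reference minerals of known hardness and report the interval it
# falls into. The scratch results are supplied by the caller.

MOHS_REFERENCES = [
    (1, "talc"), (2, "gypsum"), (3, "calcite"), (4, "fluorite"),
    (5, "apatite"), (6, "orthoclase"), (7, "quartz"),
    (8, "topaz"), (9, "corundum"), (10, "diamond"),
]

def bracket_hardness(scratched_by):
    """scratched_by: function mapping a reference hardness -> True if that
    reference mineral visibly scratches the sample."""
    lower = 0  # hardest reference so far that fails to scratch the sample
    for hardness, _name in MOHS_REFERENCES:
        if scratched_by(hardness):
            return (lower, hardness)  # the sample lies between these values
        lower = hardness
    return (10, None)  # nothing scratches it: at least diamond-hard

# Example from the text: scratched by apatite (5) but not fluorite (4)
print(bracket_hardness(lambda h: h >= 5))  # (4, 5)
```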
The Mohs scale is a purely ordinal scale. For example, corundum (9) is twice as hard as topaz (8), but diamond (10) is four times as hard as corundum. The table below shows the comparison with the absolute hardness measured by a sclerometer, with pictorial examples.
On the Mohs scale, a streak plate (unglazed porcelain) has a hardness of approximately 7.0. Ordinary materials of known hardness, such as a fingernail (about 2.5), a copper coin (about 3.5) or a steel knife blade (about 5.5), provide a simple way to approximate the position of a mineral on the scale.
The table below incorporates additional substances that may fall between levels:
Comparison between Mohs hardness and Vickers hardness: | https://en.wikipedia.org/wiki?curid=20474 |
Murray Gell-Mann
Murray Gell-Mann (September 15, 1929 – May 24, 2019) was an American physicist who received the 1969 Nobel Prize in Physics for his work on the theory of elementary particles. He was the Robert Andrews Millikan Professor of Theoretical Physics Emeritus at the California Institute of Technology, a distinguished fellow and one of the co-founders of the Santa Fe Institute, a professor of physics at the University of New Mexico, and the Presidential Professor of Physics and Medicine at the University of Southern California.
Gell-Mann spent several periods at CERN, a nuclear research facility in Switzerland, among others as a John Simon Guggenheim Memorial Foundation fellow in 1972.
Gell-Mann was born in lower Manhattan into a family of Jewish immigrants from the Austro-Hungarian Empire, specifically from Chernivtsi (historical name: Czernowitz) in present-day Ukraine. His parents were Pauline (née Reichstein) and Arthur Isidore Gell-Mann, who taught English as a second language (ESL).
Propelled by an intense boyhood curiosity and love for nature and mathematics, he graduated valedictorian from the Columbia Grammar & Preparatory School aged 14 and subsequently entered Yale College as a member of Jonathan Edwards College. At Yale, he participated in the William Lowell Putnam Mathematical Competition and was on the team representing Yale University (along with Murray Gerstenhaber and Henry O. Pollak) that won the second prize in 1947.
Gell-Mann graduated from Yale with a bachelor's degree in physics in 1948 and intended to pursue graduate studies in physics. He sought to remain in the Ivy League and applied to Princeton University as well as Harvard University. He was rejected by Princeton and accepted by Harvard, but the latter was unable to offer him the financial assistance he needed. He was accepted by the Massachusetts Institute of Technology (MIT) and received a letter from Victor Weisskopf urging him to attend MIT and become Weisskopf's research assistant, which would provide the financial assistance he needed. Unaware of MIT's eminent status in physics research, Gell-Mann was "miserable" at the prospect of not attending Princeton or Harvard and considered suicide. He reasoned that he could first try MIT and commit suicide afterwards if he found it truly terrible, whereas he could not choose suicide first and then attend MIT; the two operations did not "commute", as Gell-Mann joked.
Gell-Mann received his Ph.D. in physics from MIT in 1951 after completing a doctoral dissertation, titled "Coupling strength and nuclear reactions", under the supervision of Victor Weisskopf.
Gell-Mann was a postdoctoral fellow at the Institute for Advanced Study in 1951, and a visiting research professor at the University of Illinois at Urbana–Champaign from 1952 to 1953. He was a visiting associate professor at Columbia University and an associate professor at the University of Chicago in 1954–1955 before moving to the California Institute of Technology, where he taught from 1955 until he retired in 1993.
In 1958, Gell-Mann, in collaboration with Richard Feynman and in parallel with the independent team of E. C. George Sudarshan and Robert Marshak, discovered the chiral structure of the weak interaction and developed the V-A theory (vector minus axial vector theory). This work followed the experimental discovery of parity violation by Chien-Shiung Wu, which had been suggested theoretically by Chen-Ning Yang and Tsung-Dao Lee.
Gell-Mann's work in the 1950s involved recently discovered cosmic ray particles that came to be called kaons and hyperons. Classifying these particles led him to propose that a quantum number called strangeness would be conserved by the strong and the electromagnetic interactions, but not by the weak interactions. (Kazuhiko Nishijima arrived at this idea independently, calling the quantity η-charge after the eta meson.) Another of Gell-Mann's ideas is the Gell-Mann–Okubo formula, which was, initially, a formula based on empirical results, but was later explained by his quark model. Gell-Mann and Abraham Pais were involved in explaining the puzzling aspect of the neutral kaon mixing.
Murray Gell-Mann's fortunate encounter with mathematician Richard Earl Block at Caltech, in the fall of 1960, "enlightened" him to introduce a novel classification scheme, in 1961, for hadrons, elementary particles that participate in the strong interaction. A similar scheme had been independently proposed by Yuval Ne'eman, and is now explained by the quark model. Gell-Mann referred to the scheme as the "eightfold way", because of the "octets" of particles in the classification (the term is a reference to the Eightfold Path of Buddhism).
Gell-Mann, along with Maurice Lévy, developed the sigma model of pions, which describes low-energy pion interactions.
In 1964, Gell-Mann and, independently, George Zweig went on to postulate the existence of quarks, particles of which the hadrons of this scheme are composed. The name was coined by Gell-Mann and is a reference to the novel "Finnegans Wake", by James Joyce ("Three quarks for Muster Mark!" book 2, episode 4). Zweig had referred to the particles as "aces", but Gell-Mann's name caught on. Quarks, antiquarks, and gluons were soon established as the underlying elementary objects in the study of the structure of hadrons. He was awarded a Nobel Prize in Physics in 1969 for his contributions and discoveries concerning the classification of elementary particles and their interactions.
In the 1960s, he introduced current algebra as a method of systematically exploiting symmetries to extract predictions from quark models, in the absence of reliable dynamical theory. This method led to model-independent sum rules confirmed by experiment and provided starting points underpinning the development of the Standard Model (SM), the widely accepted theory of elementary particles.
In 1972 he and Harald Fritzsch introduced the conserved quantum number "color charge", and later, together with Heinrich Leutwyler, they coined the term quantum chromodynamics (QCD) as the gauge theory of the strong interaction. The quark model is a part of QCD, and it has been robust enough to accommodate in a natural fashion the discovery of new "flavors" of quarks, which superseded the eightfold way scheme.
Gell-Mann was responsible, together with Pierre Ramond and Richard Slansky, and independently of Peter Minkowski, Rabindra Mohapatra, Goran Senjanović, Sheldon Lee Glashow, and Tsutomu Yanagida, for the seesaw theory of neutrino masses, that produces masses at the large scale in any theory with a right-handed neutrino. He is also known to have played a role in keeping string theory alive through the 1970s and early 1980s, supporting that line of research at a time when it was a topic of niche interest.
At the time of his death, Gell-Mann was the Robert Andrews Millikan Professor of Theoretical Physics Emeritus at California Institute of Technology as well as a University Professor in the Physics and Astronomy Department of the University of New Mexico in Albuquerque, New Mexico, and the Presidential Professor of Physics and Medicine at the University of Southern California. He was a member of the editorial board of the "Encyclopædia Britannica". In 1984 Gell-Mann was one of several co-founders of the Santa Fe Institute—a non-profit theoretical research institute in Santa Fe, New Mexico intended to study complex systems and disseminate the notion of a separate interdisciplinary study of complexity theory.
He wrote a popular science book about physics and complexity science, "The Quark and the Jaguar: Adventures in the Simple and the Complex" (1994). | https://en.wikipedia.org/wiki?curid=20476 |
Magnetopause
The magnetopause is the abrupt boundary between a magnetosphere and the surrounding plasma. For planetary science, the magnetopause is the boundary between the planet's magnetic field and the solar wind. The location of the magnetopause is determined by the balance between the pressure of the planetary magnetic field and the dynamic pressure of the solar wind. As the solar wind pressure increases and decreases, the magnetopause moves inward and outward in response. Waves (ripples and flapping motion) along the magnetopause move in the direction of the solar wind flow in response to small-scale variations in the solar wind pressure and to the Kelvin–Helmholtz instability.
The solar wind is supersonic and passes through a bow shock where the direction of flow is changed so that most of the solar wind plasma is deflected to either side of the magnetopause, much like water is deflected before the bow of a ship. The zone of shocked solar wind plasma is the magnetosheath. At Earth and all the other planets with intrinsic magnetic fields, some solar wind plasma succeeds in entering and becoming trapped within the magnetosphere. At Earth, the solar wind plasma which enters the magnetosphere forms the plasma sheet. The amount of solar wind plasma and energy that enters the magnetosphere is regulated by the orientation of the interplanetary magnetic field, which is embedded in the solar wind.
The Sun and other stars with magnetic fields and stellar winds have a solar magnetopause or heliopause where the stellar environment is bounded by the interstellar environment.
Prior to the age of space exploration, interplanetary space was considered to be a vacuum. The coincidence of the Carrington super flare and the super geomagnetic event of 1859 was evidence that plasma was ejected from the Sun during a flare event. Chapman and Ferraro proposed that a plasma was emitted by the Sun in a burst as part of a flare event which disturbed the planet's magnetic field in a manner known as a geomagnetic storm. The collision frequency of particles in the plasma in the interplanetary medium is very low and the electrical conductivity is so high that it could be approximated to an infinite conductor. A magnetic field in a vacuum cannot penetrate a volume with infinite conductivity. Chapman and Bartels (1940) illustrated this concept by postulating a plate with infinite conductivity placed on the dayside of a planet's dipole as shown in the schematic. The field lines on the dayside are bent. At low latitudes, the magnetic field lines are pushed inward. At high latitudes, the magnetic field lines are pushed backwards and over the polar regions. The boundary between the region dominated by the planet's magnetic field (i.e., the magnetosphere) and the plasma in the interplanetary medium is the magnetopause. The configuration equivalent to a flat, infinitely conductive plate is achieved by placing an image dipole (green arrow at left of schematic) at twice the distance from the planet's dipole to the magnetopause along the planet-Sun line. Since the solar wind is continuously flowing outward, the magnetopause above, below and to the sides of the planet are swept backward into the geomagnetic tail as shown in the artist's concept. The region (shown in pink in the schematic) which separates field lines from the planet which are pushed inward from those which are pushed backward over the poles is an area of weak magnetic field or day-side cusp. Solar wind particles can enter the planet's magnetosphere through the cusp region. 
Because the solar wind exists at all times and not just times of solar flares, the magnetopause is a permanent feature of the space near any planet with a magnetic field.
The magnetic field lines of the planet's magnetic field are not stationary. They continuously join or merge with magnetic field lines of the interplanetary magnetic field. The joined field lines are swept back over the poles into the planetary magnetic tail. In the tail, field lines from the planet's magnetic field rejoin and start moving toward the night side of the planet. The physics of this process was first explained by Dungey (1961).
If one assumed that magnetopause was just a boundary between a magnetic field in a vacuum and a plasma with a weak magnetic field embedded in it, then the magnetopause would be defined by electrons and ions penetrating one gyroradius into the magnetic field domain. Since the gyro-motion of electrons and ions is in opposite directions, an electric current flows along the boundary. The actual magnetopause is much more complex.
If the pressure from particles within the magnetosphere is neglected, it is possible to estimate the distance to the part of the magnetosphere that faces the Sun. The condition governing this position is that the dynamic ram pressure from the solar wind is equal to the magnetic pressure from the Earth's magnetic field:

$\rho v^2 = \frac{B(r)^2}{2\mu_0}$

where $\rho$ and $v$ are the density and velocity of the solar wind, and $B(r)$ is the magnetic field strength of the planet in SI units ($B$ in T, $\mu_0$ in H/m).
Since the dipole magnetic field strength varies with distance as $1/r^3$, the magnetic field strength can be written as $B(r) = m/r^3$, where $m$ is the planet's magnetic moment, expressed in $\mathrm{T \cdot m^3}$. Solving this equation for $r$ leads to an estimate of the distance

$r = \left(\frac{m^2}{2\mu_0 \rho v^2}\right)^{1/6}$

The distance from Earth to the subsolar magnetopause varies over time due to solar activity, but typical distances range from 6–15 $R_E$ (Earth radii). Empirical models using real-time solar wind data can provide a real-time estimate of the magnetopause location. A bow shock stands upstream from the magnetopause. It serves to decelerate and deflect the solar wind flow before it reaches the magnetopause.
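As a rough numerical check of this pressure-balance estimate (the solar wind values below are typical illustrative figures, not from the text):

```python
# Numerical sketch of the subsolar standoff-distance estimate, using
# illustrative solar wind values: n ~ 5 protons/cm^3, v ~ 400 km/s, and an
# equatorial surface field B0 ~ 3.12e-5 T for Earth.

MU0 = 4e-7 * 3.141592653589793   # vacuum permeability, H/m
M_PROTON = 1.672e-27             # proton mass, kg

def standoff_distance_re(n_per_cm3, v_km_s, b0_tesla=3.12e-5):
    """Subsolar magnetopause distance in Earth radii from the balance
    rho*v^2 = (B0^2 / (2*mu0)) * (R_E/r)^6."""
    rho = n_per_cm3 * 1e6 * M_PROTON          # mass density, kg/m^3
    ram_pressure = rho * (v_km_s * 1e3) ** 2  # Pa
    magnetic_pressure_surface = b0_tesla**2 / (2 * MU0)
    return (magnetic_pressure_surface / ram_pressure) ** (1 / 6)

print(round(standoff_distance_re(5, 400), 1))  # 8.1, i.e. roughly 8 Earth radii
```

The simple vacuum-dipole balance comes out near 8 Earth radii; the observed typical distance is closer to 10, partly because currents flowing on the magnetopause compress and strengthen the field just inside the boundary.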
Research on the magnetopause is conducted using the LMN coordinate system, a set of orthogonal axes analogous to XYZ. N points along the outward normal to the magnetopause, toward the magnetosheath; L lies along the projection of the dipole axis onto the magnetopause (positive northward); and M completes the triad by pointing dawnward.
Venus and Mars do not have a planetary magnetic field and do not have a magnetopause. The solar wind interacts with the planet's atmosphere and a void is created behind the planet. In the case of the Earth's moon and other bodies without a magnetic field or atmosphere, the body's surface interacts with the solar wind and a void is created behind the body. | https://en.wikipedia.org/wiki?curid=20478 |
Magnetosphere
A magnetosphere is a region of space surrounding an astronomical object in which charged particles are affected by that object's magnetic field. It is created by a star or planet with an active interior dynamo.
In the space environment close to a planetary body, the magnetic field resembles a magnetic dipole. Farther out, field lines can be significantly distorted by the flow of electrically conducting plasma emitted from the Sun or a nearby star, i.e. the solar wind. Planets with active magnetospheres, like the Earth, are capable of mitigating or blocking the effects of solar and cosmic radiation, protecting living organisms from potentially harmful consequences. This is studied under the specialized scientific subjects of plasma physics, space physics and aeronomy.
Study of Earth's magnetosphere began in 1600, when William Gilbert discovered that the magnetic field on the surface of Earth resembled that of a terrella, a small, magnetized sphere. In the 1940s, Walter M. Elsasser proposed the model of dynamo theory, which attributes Earth's magnetic field to the motion of Earth's iron outer core. Through the use of magnetometers, scientists were able to study the variations in Earth's magnetic field as functions of both time and latitude and longitude.
Beginning in the late 1940s, rockets were used to study cosmic rays. In 1958, Explorer 1, the first of the Explorer series of space missions, was launched to study the intensity of cosmic rays above the atmosphere and measure fluctuations in this activity. The mission observed the existence of the Van Allen radiation belt (located in the inner region of Earth's magnetosphere), and the follow-up Explorer 3 later that year definitively proved its existence. Also in 1958, Eugene Parker proposed the idea of the solar wind, and the term 'magnetosphere' was proposed by Thomas Gold in 1959 to explain how the solar wind interacted with Earth's magnetic field. The Explorer 12 mission in 1961 was followed by the observation, by Cahill and Amazeen in 1963, of a sudden decrease in magnetic field strength near the noon-time meridian, which was later named the magnetopause. By 1983, the International Cometary Explorer had observed the magnetotail, or the distant magnetic field.
Magnetospheres are dependent on several variables: the type of astronomical object, the nature of sources of plasma and momentum, the period of the object's spin, the nature of the axis about which the object spins, the axis of the magnetic dipole, and the magnitude and direction of the flow of solar wind.
The planetary distance at which the magnetosphere can withstand the solar wind pressure is called the Chapman–Ferraro distance. This is usefully modeled by the formula

R_CF = R_P (B_surf² / (μ₀ ρ u²))^(1/6)

wherein R_P represents the radius of the planet, B_surf the magnetic field on the surface of the planet at the equator, u the velocity of the solar wind, μ₀ the vacuum permeability, and ρ the mass density of the incoming solar wind.
A magnetosphere is classified as "intrinsic" when R_CF ≫ R_P, or when the primary opposition to the flow of solar wind is the magnetic field of the object. Mercury, Earth, Jupiter, Ganymede, Saturn, Uranus, and Neptune, for example, exhibit intrinsic magnetospheres. A magnetosphere is classified as "induced" when R_CF ≪ R_P, or when the solar wind is not opposed by the object's magnetic field. In this case, the solar wind interacts with the atmosphere or ionosphere of the planet (or surface of the planet, if the planet has no atmosphere). Venus has an induced magnetic field, which means that because Venus appears to have no internal dynamo effect, the only magnetic field present is that formed by the solar wind's wrapping around the physical obstacle of Venus (see also Venus' induced magnetosphere). When R_CF ≈ R_P, the planet itself and its magnetic field both contribute. It is possible that Mars is of this type.
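As a rough check of the Chapman–Ferraro scaling, the standoff distance can be evaluated for Earth. The solar-wind density and speed used here are assumed, typical quiet-time figures, not values from the text:

```python
import math

MU0 = 4 * math.pi * 1e-7     # vacuum permeability, T·m/A
M_PROTON = 1.6726e-27        # proton mass, kg

# Assumed, typical quiet solar-wind values at 1 AU (illustrative):
n = 7e6           # number density, m^-3 (~7 protons per cm^3)
u = 4.0e5         # solar wind speed, m/s (~400 km/s)
B_surf = 3.12e-5  # Earth's equatorial surface field, T

rho = n * M_PROTON  # mass density of the wind

# Balancing the dipole's magnetic pressure against the wind's ram
# pressure gives the standoff distance in planetary radii:
r_cf = (B_surf**2 / (MU0 * rho * u**2)) ** (1.0 / 6.0)

print(f"Chapman–Ferraro distance ≈ {r_cf:.1f} Earth radii")
```

This yields roughly 8–9 Earth radii, the right order for the observed subsolar magnetopause (about 10 Earth radii); the one-sixth power means even large swings in solar wind pressure move the boundary only modestly.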
The bow shock forms the outermost layer of the magnetosphere: the boundary between the magnetosphere and the ambient medium. For stars, this is usually the boundary between the stellar wind and the interstellar medium; for planets, it is where the solar wind slows as it approaches the magnetopause.
The magnetosheath is the region of the magnetosphere between the bow shock and the magnetopause. It is formed mainly from shocked solar wind, though it contains a small amount of plasma from the magnetosphere. It is an area exhibiting high particle energy flux, where the direction and magnitude of the magnetic field varies erratically. This is caused by the collection of solar wind gas that has effectively undergone thermalization. It acts as a cushion that transmits the pressure from the flow of the solar wind and the barrier of the magnetic field from the object.
The magnetopause is the area of the magnetosphere wherein the pressure from the planetary magnetic field is balanced with the pressure from the solar wind. It is the convergence of the shocked solar wind from the magnetosheath with the magnetic field of the object and plasma from the magnetosphere. Because both sides of this convergence contain magnetized plasma, the interactions between them are complex. The structure of the magnetopause depends upon the Mach number and beta of the plasma, as well as the magnetic field. The magnetopause changes size and shape as the pressure from the solar wind fluctuates.
Opposite the compressed magnetic field is the magnetotail, where the magnetosphere extends far beyond the astronomical object. It contains two lobes, referred to as the northern and southern tail lobes. Magnetic field lines in the northern tail lobe point towards the object while those in the southern tail lobe point away. The tail lobes are almost empty, with few charged particles opposing the flow of the solar wind. The two lobes are separated by a plasma sheet, an area where the magnetic field is weaker, and the density of charged particles is higher.
Over Earth's equator, the magnetic field lines become almost horizontal, then return to reconnect at high latitudes. However, at high altitudes, the magnetic field is significantly distorted by the solar wind and the interplanetary magnetic field it carries. On the dayside of Earth, the magnetic field is significantly compressed by the solar wind to a distance of approximately . Earth's bow shock is about thick and located about from Earth. The magnetopause exists at a distance of several hundred kilometers above Earth's surface. Earth's magnetopause has been compared to a sieve because it allows solar wind particles to enter. Kelvin–Helmholtz instabilities occur when large swirls of plasma travel along the edge of the magnetosphere at a velocity different from that of the magnetosphere, causing the plasma to slip past. This results in magnetic reconnection, and as the magnetic field lines break and reconnect, solar wind particles are able to enter the magnetosphere. On Earth's nightside, the magnetic field extends into the magnetotail, which lengthwise exceeds . Earth's magnetotail is the primary source of the polar aurora. NASA scientists have also suggested that Earth's magnetotail might cause "dust storms" on the Moon by creating a potential difference between the day side and the night side.
Many astronomical objects generate and maintain magnetospheres. In the Solar System this includes the Sun, Mercury, Jupiter, Saturn, Uranus, and Neptune. The magnetosphere of Jupiter is the largest planetary magnetosphere in the Solar System, extending up to on the dayside and almost to the orbit of Saturn on the nightside. Jupiter's magnetosphere is stronger than Earth's by an order of magnitude, and its magnetic moment is approximately 18,000 times larger.
Venus, Mars, and Pluto, on the other hand, have no magnetic field. This may have had significant effects on their geological history. It is theorized that Venus and Mars may have lost their primordial water to photodissociation and the solar wind. A strong magnetosphere greatly slows this process.
Manama
Manama is the capital and largest city of Bahrain, with an approximate population of 157,000 people. Long an important trading center in the Persian Gulf, Manama is home to a very diverse population. After periods of Portuguese and Persian control and invasions from the ruling dynasties of Saudi Arabia and Oman, Bahrain established itself as an independent nation during the 19th century period of British hegemony.
Although the current twin cities of Manama and Muharraq appear to have been founded simultaneously in the 1300s, Muharraq took prominence due to its defensive location and was thus the capital of Bahrain until 1923. Manama became the mercantile capital and was the gateway to the main Bahrain Island. In the 20th century, Bahrain's oil wealth helped spur fast growth, and in the 1990s a concerted diversification effort led to expansion in other industries and helped transform Manama into an important financial hub in the Middle East. Manama was designated the 2012 capital of Arab culture by the Arab League, and a beta global city by the Globalization and World Cities Research Network in 2018.
The name is derived from the Arabic word المنامة (transliterated:"al-manãma") meaning "the place of rest" or "the place of dreams".
There is evidence of human settlement on the northern coastline of Bahrain dating back to the Bronze Age. The Dilmun civilisation inhabited the area in 3000 BC, serving as a key regional trading hub between Mesopotamia, Magan and the Indus Valley civilisation. Approximately 100,000 Dilmun burial mounds were found across the north and central regions of the country, some originating 5,000 years ago. Despite the discovery of the mounds, there is no significant evidence to suggest heavy urbanisation took place during the Dilmun era. It is believed that the majority of the population lived in rural areas, numbering several thousand. Evidence of an ancient large rural population was confirmed by one of Alexander the Great's ship captains, during voyages in the Persian Gulf. A vast system of aqueducts in northern Bahrain helped facilitate ancient horticulture and agriculture.
The commercial network of Dilmun lasted for almost 2,000 years, after which the Assyrians took control of the island in 700 BC for more than a century. This was followed by Babylonian and Achaemenid rule, which later gave way to Greek influence during the time of Alexander the Great's conquests. In the first century AD, the Roman writer Pliny the Elder wrote of Tylos, the Hellenic name of Bahrain in the classical era, and its pearls and cotton fields. The island came under the control of the Parthian and Sassanid empires respectively, by which time Nestorian Christianity started to spread in Bahrain. By 410–420 AD, a Nestorian bishopric and monastery was established in Al Dair, on the neighbouring island of Muharraq. Following the conversion of Bahrain to Islam in 628 AD, work on one of the earliest mosques in the region, the Khamis Mosque, began as early as the seventh century AD. During this time, Bahrain was engaged in long distance marine trading, evident from the discovery of Chinese coins dating between 600–1200 AD, in Manama.
In 1330, under the Jarwanid dynasty, the island became a tributary of the Kingdom of Hormuz. The town of Manama was mentioned by name for the first time in a manuscript dating to 1345 AD. Bahrain, particularly Manama and the nearby settlement of Bilad Al Qadeem, became a centre of Shia scholarship and training for the ulema; it would remain so for centuries. The ulema would help fund pearling expeditions and finance grain production in the rural areas surrounding the city. In 1521, Bahrain fell to the expanding Portuguese Empire in the Persian Gulf, which had already defeated Hormuz. The Portuguese consolidated their hold on the island by constructing the Bahrain Fort, on the outskirts of Manama. After numerous revolts and an expanding Safavid empire in Persia, the Portuguese were expelled from Bahrain and the Safavids took control in 1602.
The Safavids, sidelining Manama, designated the nearby town of Bilad Al Qadeem as the provincial capital. The town was also the seat of the Persian governor and the Shaikh al-Islam of the islands. The position of Shaikh al-Islam lay under the jurisdiction of the central Safavid government and as such, candidates were carefully vetted by the Isfahan courts. During the Safavid era, the islands continued to be a centre for Twelver Shi'ism scholarship, producing clerics for use in mainland Persia. Additionally, the rich agricultural northern region of Bahrain continued to flourish due to an abundance of date palm farms and orchards. The Portuguese traveler Pedro Teixeira commented on the extensive cultivation of crops like barley and wheat. The opening of Persian markets to Bahraini exports, especially pearls, boosted the islands' export economy. The yearly income of exported Bahraini pearls was 600,000 ducats, collected by around 2,000 pearling dhows. Another factor that contributed to Bahrain's agricultural wealth was the migration of Shia cultivators from Ottoman-occupied Qatif and al-Hasa, fearing religious persecution, in 1537. Sometime after 1736, Nader Shah constructed a fort on the southern outskirts of Manama (likely the Diwan Fort).
Persian control over the Persian Gulf waned during the later half of the 18th century. At this time, the Bahrain archipelago was a dependency of the emirate of Bushehr, itself a part of Persia. In 1783, the Bani Utbah tribal confederation invaded Bahrain and expelled the resident governor Nasr Al-Madhkur. As a result, the Al Khalifa family became the rulers of the country, and all political relations with Bushehr and Persia/Iran were terminated. Ahmed ibn Muhammad ibn Khalifa (later called Ahmed al-Fateh, lit. "Ahmed the conqueror") became the dynasty's first Hakim of Bahrain. Political instability in the 19th century had disastrous effects on Manama's economy; invasions by the Omanis in 1800 and by the Wahhabis in 1810–11, in addition to a civil war in 1842 between Bahrain's co-rulers, saw the town become a major battleground. The instability paralysed commercial trade in Manama; the town's port was closed, and most merchants fled abroad to Kuwait and the Persian coast until hostilities ceased. The English scholar William Gifford Palgrave, on a visit to Manama in 1862, described the town as having a few ruined stone buildings, with a landscape dominated by the huts of poor fishermen and pearl-divers.
The Pax Britannica of the 19th century resulted in British consolidation of trade routes, particularly those close to the British Raj. In response to piracy in the Persian Gulf region, the British deployed warships and forced much of the Persian Gulf States at the time (including Bahrain) to sign the General Maritime Treaty of 1820, which prohibited piracy and slavery. In 1861, the Perpetual Truce of Peace and Friendship was signed between Britain and Bahrain, which placed the British in charge of defending Bahrain in exchange for British control over Bahraini foreign affairs. With the accession of Isa ibn Ali Al Khalifa as the Hakim of Bahrain in 1869, Manama became the centre of British activity in the Persian Gulf, though its interests were initially strictly commercial. Trading recovered fully by 1873, and the country's earnings from pearl exports increased sevenfold between 1873 and 1900. Representing the British were native agents, usually from minorities such as Persians or Huwala, who regularly reported back to British India and the British political residency in Bushehr. The position of native agent was later replaced by a British political agent, following the construction of the British political residency in 1900, which further solidified Britain's position in Manama.
Following the outbreak of World War I in 1914, the British Raj used Manama as a military base of operations during the Mesopotamian campaign. Prompted by the presence of oil in the region, the British political agency in Bushire concluded an oil agreement with the Hakim to prohibit the exploration and exploitation of oil for a five-year period. In 1919, Bahrain was officially integrated into the British Empire as an overseas imperial territory under the Bahrain order-in-council, a decree issued in 1913. The decree gave the resident political agent greater powers and placed Bahrain under the residency of Bushire and therefore under the governance of the British Raj. The British pressed for a series of administrative reforms in Bahrain during the 1920s (a move met with opposition from tribal leaders), during which the aging Hakim Isa ibn Ali Al Khalifa was forced to abdicate in favour of his reform-minded son Hamad ibn Isa Al Khalifa. A municipal government was established in Manama in 1919, the Customs office was reorganised in 1923 and placed under the supervision of an English businessman, and the pearling industry was reformed in 1924. Earnings from the customs office were kept in the newly created state treasury. Civil courts were established for the first time in 1923, followed by the establishment of the Department of Land Registration in 1924. Charles Belgrave, from the Colonial Office, was appointed by the British in 1926 to carry out further reforms and manage the administration as financial advisor to the ruler. He later organised the State Police and was in charge of the Finance and Land departments of the government.
In 1927, the country's pearling economy collapsed due to the introduction of Japanese cultured pearls in the world market. It is estimated that between 1929 and 1931, pearling entrepreneurs lost more than two-thirds of their income. Further aggravated by the Great Depression, many leading Bahraini businessmen, shopkeepers, and pearl-divers fell into debt. With the discovery of oil in 1932 and the subsequent production of oil exports in 1934, the country gained a greater significance in geopolitics. The security of oil supplies in the Middle East was a priority of the British, especially in the run-up to the Second World War. The discovery of oil led to gradual employment of bankrupt divers from the pearling industry in the 1930s, eventually causing the pearling industry to disappear. During the war, the country served as a strategic airbase between Britain and India as well as hosting RAF Muharraq and a naval base in Juffair. Bahrain was bombed by the Italian Air Force in 1940. In 1947, following the end of the war and subsequent Indian independence, the British residency of the Persian Gulf moved to Manama from Bushire.
Following the rise of Arab nationalism across the Middle East and sparked by the Suez Crisis in 1956, anti-British unrest broke out in Manama, organised by the National Union Committee (NUC). Though the NUC advocated peaceful demonstrations, buildings and enterprises belonging to Europeans (the British in particular), as well as the main Catholic church in the city and petrol stations, were targeted and set ablaze. Demonstrations held in front of the British political residency called for the dismissal of Charles Belgrave, who was dismissed by the direct intervention of the Foreign Office the following year. A subsequent crackdown on the NUC led to the dissolution of the body. Another anti-British uprising erupted in March 1965, though this one was led predominantly by students aspiring to independence rather than by Arab nationalists. In 1968, the British announced their withdrawal from Bahrain by 1971. The newly independent State of Bahrain designated Manama as the capital city.
Post-independence Manama was characterised by the rapid urbanisation of the city and the swallowing-up of neighboring villages and hamlets into a single urbanised area, incorporating new neighbourhoods such as Adliya and Salmaniya. The construction boom attracted large numbers of foreigners from the Indian subcontinent and by 1981, foreigners outnumbered Bahrainis two-to-one. The construction of the Diplomatic Area district in the city's northeast helped facilitate diversification of the country's economy from oil by exploiting the lucrative financial industry. Financial institutions in the district numbered 187 by 1986. The scarcity of land suitable for construction led to land reclamation. Religious activism migrated from Manama to the suburban districts of Bani Jamra, Diraz and Bilad Al Qadeem, hotspots of unrest in the 1990s uprising that called for the reinstatement of an elected parliament. In 2001, the National Action Charter, presented by King Hamad bin Isa al-Khalifa was approved by Bahrainis. The charter led to the first parliamentary and municipal elections in decades. Further elections in 2006 and 2010 led to the election of Islamist parties, Al Wefaq, Al Menbar, and Al Asalah, as well as independent candidates. In 2011, a month-long uprising led to the intervention of GCC forces and the proclamation of a three-month state of emergency. The Bahrain Independent Commission of Inquiry published a 500-page report on the events of 2011.
Historically, Manama has been restricted to what is now known as the Manama Souq and the Manama Fort (now the Ministry of Interior) to its south. However the city has now grown to include a number of newer suburban developments as well as older neighboring villages that have been engulfed by the growth of the city. The districts that make up Manama today include:
Manama is part of the Capital Governorate, one of five Governorates of Bahrain. Until 2002 it was part of the municipality of Al-Manamah. Councils exist within the governorates; eight constituencies were voted upon within the Capital Governorate in 2006.
Manama is the focal point of the Bahraini economy. While petroleum has decreased in importance in recent years due to depleting reserves and growth in other industries, it is still the mainstay of the economy. Heavy industry (e.g. aluminium smelting, ship repair), banking and finance, and tourism are among the industries which have experienced recent growth. Several multinationals have facilities and offices in and around Manama. The primary industry in Manama itself is financial services, with over two hundred financial institutions and banks based in the CBD and the Diplomatic Area. Manama is a financial hub for the Persian Gulf region and a center of Islamic banking. There is also a large retail sector in the shopping malls around Seef, while the center of Manama is dominated by small workshops and traders.
Manama's economy in the early 20th century relied heavily on pearling; in 1907, the pearling industry was estimated to include 917 boats providing employment for up to 18,000 people. Shipbuilding also employed several hundred in both Manama and Muharraq. The estimated income earned from pearling in 1926 and subsequent years prior to the Great Depression was £1.5 million annually. Custom duties and tariffs served as the prime source of revenue for the government. With the onset of the Great Depression, the collapse of the pearling industry and the discovery of oil in 1932, the country's economy began to shift towards oil.
Historically, the ports at Manama had a poor reputation. In 1911, the British described the port's importing systems as being "very bad – goods were exposed to the weather and there were long delays in delivery". Indians began maintaining the ports and new facilities were built on site, improving the situation. As of 1920, Manama was one of the main exporters of Bahraini pearls, attracting steamships from India. During this time, the port also imported goods from India and other regional countries: rice, textiles, ghee, coffee, dates, tea, tobacco, fuel, and livestock. Exports were less varied, focusing on pearls, oysters, and sailcloth. In the year 1911–12, Manama was visited by 52 steamships, the majority British and the rest Turkish-Arabian.
The role of Manama as a regional port city in the Persian Gulf made it a hub for migrant workers in search of a better living. As a result, Manama has often been described, both in the pre-oil and post-oil era, as a cosmopolitan city. In 1904, it was estimated that Manama's population numbered 25,000, out of which half were believed to have been foreigners from Basra, Najd, al-Hasa and Iran, as well as from India and Europe.
The two main branches of Islam, Shia Islam and Sunni Islam, coexisted in Manama for centuries and are represented by distinct ethnic groups. The Shia community is represented by the native Arab Baharna, the Hasawis and Qatifis of mainland Arabia and the Persian Ajam. The Sunni community is represented by Arab Bedouin tribes who migrated in the eighteenth century along with the Bani Utbah and the Huwala, Arabic-speaking Persians. There is also a sizable native Bahraini Christian population in the country, numbering more than a thousand, in addition to immigrant Hindus and a small native Jewish community numbering 37.
Manama is the main hub of the country's road network, which is undergoing substantial development to ease traffic congestion in the city. Because Manama is the capital and main city of the country, hosting most government and commercial offices and facilities along with entertainment centers, and because of the country's fast growth, the vehicle population is increasing rapidly.
The widening of roads in the old districts of Manama and the development of a national network linking the capital to other settlements commenced as early as the arrival of the first car in 1914. The continuous increase in the number of cars from 395 in 1944, to 3,379 in 1954 and to 18,372 cars in 1970 caused urban development to primarily focus on expanding the road network, widening carriageways and the establishment of more parking spaces. Many tracks previously laid in the pre-oil era (prior to the 1930s) were resurfaced and widened, turning them into 'road arteries'. Initial widening of the roads started in the Manama Souq district, widening its main roads by demolishing encroaching houses.
A series of ring roads were constructed (Isa al Kabeer avenue in the 1930s, Exhibition avenue in the 1960s and Al Fateh highway in the 1980s) to push back the coastline and extend the city area in belt-like forms. To the north, the foreshore used to be around "Government Avenue" in the 1920s, but it shifted to a new road, "King Faisal Road", in the early 1930s, which became the coastal road. To the east, a bridge had connected Manama to Muharraq since 1929; a new causeway built in 1941 replaced the old wooden bridge. Transits between the two islands peaked after the construction of the Bahrain International Airport in 1932.
To the south of Manama, roads connected the groves, lagoons and marshes of Hoora, Adliya, Gudaibiya and Juffair. Villages such as Mahooz, Ghuraifa and Seqaya served as the endpoints of these roads. To the west, a major highway linked Manama to the isolated village port of Budaiya; this highway crossed through the 'green belt' villages of Sanabis, Jidhafs and Duraz. To the south, a road was built connecting Manama to Riffa. The discovery of oil accelerated the growth of the city's road network.
The four main islands and all the towns and villages are linked by well-constructed roads. There were of roadways in 2002, of which were paved. A causeway stretching over , connects Manama with Muharraq Island, and another bridge joins Sitra to the main island. A four-lane highway atop a causeway, linking Bahrain with the Saudi Arabian mainland via the island of Umm an-Nasan, was completed in December 1986 and financed by Saudi Arabia. In 2000, there were 172,684 passenger vehicles and 41,820 commercial vehicles.
Bahrain's port of Mina Salman can accommodate 16 oceangoing vessels drawing up to . In 2001, Bahrain had a merchant fleet of eight ships of 1,000 GRT or over, totaling 270,784 GRT. Private vehicles and taxis are the primary means of transportation in the city.
Manama has a recently reformed comprehensive bus service that launched on 1 April 2015, with a fleet of 141 MAN buses. Regulated by the Ministry of Transportation, bus routes extend across Bahrain and around Manama with fares of a minimum 200 Fils (BD0.200) (around $0.50(USD); £0.30).
Bahrain International Airport is located on the nearby Muharraq Island, approximately from the CBD. It is a premier hub airport in the Middle East. Strategically located in the Northern Persian Gulf between the major markets of Saudi Arabia and Iran, the airport has one of the widest range and highest frequency of regional services with connections to major international destinations in Europe, Asia, Africa, and North America.
Bahrain also has a military airbase, the Isa Air Base, located in the south at Sakhir. This is the base of the Bahrain Defence Force, or BDF.
Quranic schools were the only source of education in Bahrain prior to the 20th century; such schools were primarily dedicated to the study of the Qur'an. The first modern school to open in the country was a missionary elementary school set up in 1892 (according to one account) in Manama by the Reformed Church in America, with the school's syllabus comprising English, Mathematics and the study of Christianity. Leading merchants in the country sent their children to the school until it was closed down in 1933 due to financial difficulties. The school reopened some years later under the name of Al Raja School, where it operates to the present day. In addition to the American Mission School, another foreign private school was opened in 1910: Al-Ittihad school, funded by the Persian community of Bahrain.
Following the end of the First World War, Western ideas became more widespread in the country, culminating in the opening of the first public school of Bahrain, Al-Hidaya Al-Khalifia Boys school, in the island of Muharraq in 1919. The school was founded by prominent citizens of Muharraq and was endorsed by the Bahraini royal family. The country's first Education Committee was established by several leading Bahraini merchants, headed by Shaikh Abdulla bin Isa Al-Khalifa, the son of the then-ruler of Bahrain Isa ibn Ali Al Khalifa, who acted as the de facto Minister of Education. The Education Committee was also responsible for managing the Al-Hidaya Boys school. The school was, in fact, the brainchild of Shaikh Abdulla, who suggested the idea after returning from post-World War I celebrations in England.
In 1926, a second public school for boys opened up in Manama called the Jafaria School. Two years later, in 1928, the first public school for girls was established. Due to financial constraints suffered by the Education Committee, the Bahraini government took control of the schools in 1930.
Presently, Manama has a wide range of private and public universities and colleges such as Ahlia University, Applied Science University, Arab Open University, Arabian Gulf University, Bahrain Institute of Banking and Finance, and the College of Health and Sport Sciences. Other notable primary and secondary schools situated in the city include the Bahrain School, the Indian School, Al Raja School amongst others.
The city is located in the north-eastern corner of Bahrain on a small peninsula. As in the rest of Bahrain, the land is generally flat (or gently rolling) and arid.
Manama has an arid climate. In common with the rest of Bahrain, Manama experiences extreme climatic conditions, with summer temperatures up to , and winter as low as with even hail on rare occasions. Average temperatures of the summer and winter seasons are generally from about 17 °C (63 °F) to about 34 °C (93 °F). The most pleasant time in Bahrain is autumn when sunshine is comparatively low, coupled with warm temperatures tempered by soft breezes.
The country attracts a large number of foreigners and foreign influences, with just under one-third of the population hailing from abroad. Alcohol is legal in the country, with bars and nightclubs operating in the city. Bahrain gave women the right to vote in elections for the first time in 2002. Football is the most popular sport in Manama (and the rest of the country), with three teams from Manama participating in the Bahraini Premier League.
Notable cultural sites within Manama include the Bab Al Bahrain and the adjacent souq area. In the 2010s, the historic core of Manama underwent revitalisation efforts alongside the Manama souq, which are due to be completed in 2020. The central areas of Manama are also the main location for Muharram processions in the country, attracting hundreds of thousands of people annually from Bahrain and across the Gulf.
These student protests were led by intellectuals and poets such as Qassim Haddad.
Melbourne Cup
The Melbourne Cup is Australia's most famous annual Thoroughbred horse race. It is a 3200-metre race for three-year-olds and over, conducted by the Victoria Racing Club on the Flemington Racecourse in Melbourne, Victoria as part of the Melbourne Spring Racing Carnival. It is the richest "two-mile" handicap in the world and one of the richest turf races. The event starts at 3:00 pm on the first Tuesday of November and is known locally as "the race that stops the nation".
The Melbourne Cup has a long tradition, with the first race held in 1861. It was originally run over but was shortened to in 1972, when Australia adopted the metric system. This reduced the distance by , and Rain Lover's 1968 race record of 3:19.1 was accordingly adjusted to 3:17.9. The present record holder is the 1990 winner Kingston Rule, with a time of 3:16.3.
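The adjusted record is consistent with simple pro-rata scaling: holding Rain Lover's average speed fixed over the shorter metric distance reproduces the revised figure. The metre-per-mile conversion is standard; treating the adjustment as a pure speed-preserving rescale is an inference from the numbers in the text:

```python
# Rain Lover's 1968 record was set over two miles (3218.69 m) in 3:19.1.
# Scaling that time down to the metric 3200 m at the same average speed
# reproduces the adjusted record of 3:17.9.

TWO_MILES_M = 2 * 1609.344          # metres in two miles
old_time_s = 3 * 60 + 19.1          # 3:19.1 expressed in seconds

speed = TWO_MILES_M / old_time_s    # average speed, m/s (~16.2)
new_time_s = 3200 / speed           # time over 3200 m at the same speed

minutes, seconds = divmod(new_time_s, 60)
print(f"{int(minutes)}:{seconds:04.1f}")  # → 3:17.9
```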
The race is a quality handicap for horses three years old and over, run over a distance of 3200 metres, on the first Tuesday in November at Flemington Racecourse. The minimum handicap weight is 50 kg. There is no maximum weight, but the top allocated weight must not be less than 57 kg. The weight allocated to each horse is declared by the VRC Handicapper in early September.
The Melbourne Cup race is a handicap contest in which the weight of the jockey and riding gear is adjusted with ballast to a nominated figure. Older horses carry more weight than younger ones and weights are adjusted further according to the horse's previous results.
In the past, weights were theoretically calculated to give each horse an equal chance of winning, but in recent years the rules were adjusted to a "quality handicap" formula in which superior horses are given less severe weight penalties than under pure handicap rules.
After the declaration of weights for the Melbourne Cup, the winner of any handicap flat race of the advertised value of A$55,000 or over to the winner, or an internationally recognised Listed, Group, or Graded handicap flat race, shall carry such additional weight (if any), for each win, as the VRC Handicapper shall determine.
Entries for the Melbourne Cup usually close during the first week of August. The initial entry fee is $600 per horse. Around 300 to 400 horses are nominated each year, but the final field is limited to 24 starters. Following the allocation of weights, the owner of each horse must on the four occasions before the race in November, declare the horse as an acceptor and pay a fee. First acceptance is $960, second acceptance is $1,450 and third acceptance is $2,420. The final acceptance fee, on the Saturday prior to the race, is $45,375. Should a horse be balloted out of the final field, the final declaration fee is refunded.
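The fee schedule above adds up quickly; a quick sketch (figures taken directly from the text) tallies the total cost of getting a horse to the starting gates:

```python
# Cumulative fees to start a horse in the Melbourne Cup,
# using the schedule quoted in the text (amounts in A$).
fees = {
    "initial entry": 600,
    "first acceptance": 960,
    "second acceptance": 1450,
    "third acceptance": 2420,
    "final acceptance": 45375,
}
total = sum(fees.values())
print(f"Total fees to start: ${total:,}")  # $50,805
```

Note that the final acceptance fee dominates, and it is refunded if the horse is balloted out of the final field of 24.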
The race directors may exclude any horse from the race, or exempt any horse from the ballot, but in order to reduce the field to the safety limit of 24 starters, horses are balloted out based on a number of factors, including prize money earned in the previous two years, wins or placings in certain lead-up races, and allocated handicap weight.
The winners of the following races are exempt from the ballot:
The limit of 24 starters is stated explicitly to be for safety reasons. In the past, however, far larger fields were allowed: the largest ever raced was 39 runners in 1890.
International horses (New Zealand not included) entered for the Melbourne Cup must undergo quarantine in an approved premises in their own country for a minimum period of 14 days before travelling to Australia. The premises must meet the Australian Government Standards. The Werribee International Horse Centre at Werribee racecourse is the Victorian quarantine station for international horses competing in the Melbourne Spring Racing Carnival. The facility has stabling for up to 24 horses in five separate stable complexes and is located 32 km from the Melbourne CBD.
The total prize money for the 2019 race is A$8,000,000, plus trophies valued at $250,000. The first 12 past the post receive prize money, with the winner being paid $4.4 million, second $1.1 million, third $550,000, fourth $350,000, fifth $230,000, with sixth through to twelfth place earning $160,000. Prize money is distributed to the connections of each horse in the ratio of 85 percent to the owner, 10 percent to the trainer and 5 percent to the jockey.
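The 85/10/5 distribution described above can be sketched as a small calculation (the `split_prize` helper is illustrative, not an official formula; figures are from the 2019 prize schedule quoted above):

```python
# Split a prize between a horse's connections using the
# 85% owner / 10% trainer / 5% jockey ratio given in the text.
def split_prize(prize):
    """Return (owner, trainer, jockey) shares, rounded to whole dollars."""
    return tuple(round(prize * share) for share in (0.85, 0.10, 0.05))

# 2019 winner's prize of A$4.4 million:
owner, trainer, jockey = split_prize(4_400_000)
print(owner, trainer, jockey)  # 3740000 440000 220000
```

The same split applies to each of the twelve prize-earning placings, down to the $160,000 paid for sixth through twelfth.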
The 1985 Melbourne Cup, won by "What a Nuisance", was the first race run in Australia with prize money of $1 million.
The Cup currently has a $500,000 bonus for the owner of the winner if it has also won the group one Irish St. Leger run the previous September.
The winner of the first Melbourne Cup in 1861 received a gold watch. The first Melbourne Cup trophy was awarded in 1865 and was an elaborate silver bowl on a stand that had been manufactured in England. The oldest surviving, unaltered Melbourne Cup trophy is from 1866, presented to the owners of The Barb; as of 2013, it is in the National Museum of Australia. The silver trophy presented in 1867, now also in the National Museum of Australia, was likewise made in England, but jewellers in Victoria complained to the Victorian Racing Club that the trophy should have been made locally. They believed the work of the Melburnian William Edwards to be superior in both design and workmanship to the English-made trophy. No trophy was awarded to the Melbourne Cup winner for the next eight years.
In 1876 Edward Fischer, an immigrant from Austria, produced the first Australian-made trophy. It was an Etruscan shape with two handles. One side depicted a horse race with the grandstand and hill of Flemington in the background; the opposite side bore the words "Melbourne Cup, 1876" and the name of the winning horse. A silver-plated base sporting three silver horses was added in 1888, but in 1891 the prize changed to a trophy showing a Victory figure offering an olive wreath to a jockey. From 1899 the trophy took the form of a silver galloping horse embossed on a plaque, although some said it looked more like a greyhound.
The last Melbourne Cup trophy manufactured in England was made for the 1914 event. It was a chalice centred on a long base which had a horse at each end. The trophy awarded in 1916, the first gold trophy, was a three-legged, three-armed rose bowl. The three-handled loving cup design was first awarded in 1919. In that year the Victorian Racing Club had commissioned James Steeth to design a trophy that would be in keeping with the prestige of the race, little realising that it would become the iconic Melbourne Cup still presented today. In the Second World War years (1942, 1943 and 1944) the winning owner received war bonds valued at 200 pounds.
A new trophy is struck each year and becomes the property of the winning owner. In the event of a dead heat, a second cup is on hand. The present trophy is hand spun from 1.65 kg of 18-carat gold. The winning trainer and jockey also receive a miniature replica of the cup (since 1973) and the strapper is awarded the Tommy Woodcock Trophy, named after the strapper of Phar Lap.
In 2003 an annual tour of the Melbourne Cup trophy was initiated to give communities across Australia and New Zealand an opportunity to view the trophy and to highlight the contribution the Melbourne Cup has made to Australia's social, sporting and racing culture. Each year communities in Australia and New Zealand apply for the Cup to tour their area, and the tour also takes in cities around the world as part of the Victoria Racing Club's strategy to promote the Melbourne Cup and the Melbourne Cup Carnival internationally.
The Tour has visited schools and aged-care and hospital facilities, and participated in community events and celebrations including race days across Australia and New Zealand.
Frederick Standish, member of the Victorian Turf Club and steward on the day of the first Cup, was credited with forming the idea to hold a horse race and calling it the "Melbourne Cup".
Seventeen horses contested the first Melbourne Cup on Thursday 7 November 1861, racing for the modest prize of 710 gold sovereigns (£710) cash and a hand-beaten gold watch, winner takes all. The prize was not, as some have suggested, the largest purse up to that time. A large crowd of 4,000 men and women watched the race, although it has been suggested this was less than expected because of news reaching Melbourne of the death of explorers Burke and Wills five days earlier on 2 November. Nevertheless, the attendance was the largest at Flemington on any day for the past two years, with the exception of the recently run Two Thousand Guinea Stakes.
The winner of this first Melbourne Cup was Archer, a 16.3-hand bay stallion, in a time of 3.52.00, ridden by John Cutts and trained by Etienne de Mestre, who leased the horse and consequently raced him in his own name. As lessee, de Mestre "owned" and was fully responsible for Archer during the lease. Archer was leased from the "Exeter Farm" of Jembaicumbene near Braidwood, New South Wales. His owners were Thomas John "Tom" Roberts (a good school-friend of de Mestre's), Rowland H. Hassall (Roberts' brother-in-law), and Edmund Molyneux Royds and William Edward Royds (Roberts' nephews).
The inaugural Melbourne Cup of 1861 was an eventful affair: one horse bolted before the start, and three of the seventeen starters fell during the race, two of them fatally. Archer, a Sydney "outsider" who drew scant favour in the betting, spread-eagled the field and defeated the favourite, the Victorian champion Mormon, by six lengths. Dismissed by the bookies, Archer took a lot of money away from Melbourne, 'refuelling interstate rivalry' and adding to the excitement of the Cup. The next day, Archer ran in and won another two-mile distance race, the Melbourne Town Plate.
It has become legend that Archer walked over 800 km (over 500 miles) to Flemington from de Mestre's stable at "Terara" near Nowra, New South Wales. However, newspaper archives of the day reveal that he had travelled south from Sydney to Melbourne on the steamboat "City of Melbourne", together with de Mestre, and two of de Mestre's other horses Exeter and Inheritor. Before being winched aboard the steamboat for the trip to Melbourne, the horses had arrived in Sydney in September 1861.
Archer traveled to Melbourne by steamboat again the following year (1862) to run in the second Melbourne Cup. This time he won 810 gold sovereigns (£810) cash and a gold watch before a crowd of 7,000, nearly twice the size of the previous year's, in a time of 3.47.00, taking his Melbourne Cup tally to two. Archer had already won the 1862 AJC Queen Elizabeth Stakes at Randwick, Sydney, and returned to win his second Melbourne Cup carrying 10 stone 2 pounds. He defeated a field of twenty starters by eight lengths, with Mormon again running second, a winning margin that has never been beaten and was not matched for over 100 years. Winning the Melbourne Cup twice was a feat not repeated until more than seventy years later, when Peter Pan won the race in 1932 and 1934, and winning the Cup two years in a row was not achieved again until Rain Lover won in 1968 and 1969.
Archer traveled to Melbourne by steamboat yet again the next year (1863). Despite his weight of 11 stone 4 pounds, Archer would have contested the third cup in 1863, but due to a Victorian public holiday trainer Etienne de Mestre's telegraphed acceptance form arrived late, and Archer was scratched on a technicality. In protest of this decision and in a show of solidarity, many of de Mestre's owners boycotted the third race and scratched their horses in sympathy. As a result, the Melbourne Cup of that year ran with only 7 starters, the smallest number in the history of the Cup.
In 1865, Adam Lindsay Gordon wrote a verse in which the Melbourne Cup winner was called Tim Whiffler. Two years later in 1867 two horses with the name Tim Whiffler ran in the Melbourne Cup. (The year before in 1866 two horses with the same name, Falcon, also ran in the Melbourne Cup.) To distinguish between the two Tim Whifflers they were called "Sydney" Tim Whiffler and "Melbourne" Tim Whiffler. "Sydney" Tim Whiffler actually won the Cup. He was trained by Etienne de Mestre, and like Archer before him raced in de Mestre's name but was leased from the "Exeter Farm".
As early as 1865, Cup day was a half-holiday in Melbourne for public servants and bank officials. Various businesses also closed at lunchtime.
It took some years before the purpose of the declared holiday was acknowledged in the Victoria Government Gazette. The Gazette of 31 October 1873 announced that the following Thursday (Cup Day) be observed as a bank and civil (public) service holiday.
The Melbourne Cup was first run on a Tuesday in 1875, the first Tuesday of November.
On 7 November 1876, the three-year-old filly Briseis, owned and trained by James Wilson Snr., won in a time of 3.36.25. Briseis then went on to create a record that is never likely to be equalled, winning the VRC Derby, the Melbourne Cup and the VRC Oaks in the space of six days. She was ridden in the Melbourne Cup by the tiny featherweight jockey Peter St. Albans. In 1876, at the recorded age of thirteen (he was actually twelve, eight days short of his thirteenth birthday), St. Albans became the youngest person ever to win a Melbourne Cup. Before 75,000 at Flemington, Briseis, with St. Albans in the saddle, comfortably won by one length in the biggest field of all time. "At 4 o'clock the starter released the 33 runners and they swept down the long Flemington straight in a thundering rush. Briseis, ridden by what one writer termed a mere child, (in the Cup) captured a rare double, the Victoria Race Club Derby and the Melbourne Cup. Shouts and hurrahs were heard, hats were thrown in the air and one excited individual fell on his back in the attempt to do a somersault. The boy who rode the winner was carried around the pack and is the hero of the day," reported the "Australasian Sketcher" in 1876. Both Peter St. Albans and Briseis have become racing legends, and Briseis is regarded as one of the greatest mares foaled in Australia.
Briseis wasn't the only sensation surrounding the 1876 Melbourne Cup. Two months before the event, on Saturday 9 September, the "City of Melbourne" sailed for Melbourne from Sydney with a cargo including 13 racehorses, many of them considered serious contenders for the Melbourne Cup. The following day the ship ran into a savage storm and was hit by several rogue waves, with Nemesis (the winner of the 1876 AJC Metropolitan Handicap at Randwick, Sydney, favourite for the Cup and owned by John Moffat) and Robin Hood (another favourite, owned by Etienne de Mestre) among the 11 horses killed. Betting on the big race was paralysed. To the dismay and anger of the public, bookmakers, showing no feeling, presented a purse loaded with coins to the captain as a token of their appreciation for his part in saving them many thousands of pounds in bets already laid on the favourites who had perished. Perhaps they should have kept their money: the outsider Briseis won comfortably in an extremely good time, so it is unlikely that the horses who perished could have beaten her.
1877 was also the year that the trainer Etienne de Mestre won his fourth Melbourne Cup, with Chester, owned by the Hon. James White. In 1878, as in previous years, de Mestre fielded more than one horse. He entered the favourite Firebell (owned by W.S. Cox), who finished last; Chester (owned by the Hon. James White), the previous year's winner, who fell; and Calamia (owned by de Mestre himself), who, though less fancied, won easily by two lengths. First prize was £1,790, the crowd was 80,000 and there were 30 starters. De Mestre's 1878 win with Calamia brought to five the number of Melbourne Cups he had won, a record that was not matched for nearly 100 years, until the trainer Bart Cummings won his fifth Melbourne Cup in 1975. Cummings, regarded as the best Australian horse trainer of all time, went on to win 12 Melbourne Cups to 2008.
In 1883, the hardy New Zealand-bred Martini-Henry won the VRC Derby and the Melbourne Cup, and on the following Monday retained his undefeated record by winning the Mares' Produce Stakes.
Phar Lap, the most famous horse in the world of his day, won the 1930 Melbourne Cup at 11/8 odds on, the shortest-priced favourite in the history of the race. He had to be hidden away at Geelong before the race after an attempt was made to shoot him and only emerged an hour before the race time of the Cup. Phar Lap also competed in 1929 and 1931, but came 3rd and 8th respectively, despite heavy favouritism in both years.
There are a few legends about the first Aboriginal jockey to ride in a Melbourne Cup. It was long believed to be John Cutts, who won the first and second Cups in 1861 and 1862 riding Archer. He was reputedly an Aboriginal stockman born in the area where Archer was trained, but was actually John 'Cutts' Dillon, the son of a Sydney clerk, a jockey who rode for many trainers in his long career and was one of the best-known, best-liked and most respected jockeys in New South Wales. It is also thought that Peter St. Albans was the first Aboriginal jockey to win the Cup, on Briseis in 1876. Because St. Albans was not quite 13 years old, he was officially too young to ride in the Cup; to allow him to ride Briseis, it was argued that his birthdate and parents were unknown, and from this the legend of his being Aboriginal grew. Both these legends, however, can definitely be disproved, and history had to wait nearly another 100 years: the first jockey of Indigenous heritage to ride a Melbourne Cup winner was Frank Reys, in 1973 on Gala Supreme; Reys had a Filipino father and a half-Aboriginal mother.
The race has undergone several alterations in recent years, the most visible being the entry of many foreign-trained horses. Most have failed to cope with the conditions; the three successful "foreign raids" include two by Irish trainer Dermot K. Weld successful in 1993 and 2002, and one in 2006 by Katsumi Yoshida of Japan's renowned Yoshida racing and breeding family. The attraction for foreigners to compete was, primarily, the low-profile change to the new "quality handicap" weighting system.
The 1910 Melbourne Cup was won by Comedy King, the first foreign-bred horse to do so. Subsequent foreign-bred winners were Backwood (1924); Phar Lap (1930); Wotan (1936); Beldale Ball (1980); At Talaq (1986); Kingston Rule (1990); Vintage Crop (1993); Jeune (1994); Media Puzzle (2002); Makybe Diva (2003, 2004, 2005); Americain (2010) and Dunaden (2011).
The 1938 Melbourne Cup winner, Catalogue, was conditioned by Mrs. Allan McDonald, a successful trainer in New Zealand. At the time, however, women were not allowed to compete as trainers in Australia, so her husband's name was officially recorded as the winning trainer. The 2001 edition was won by the New Zealand mare Ethereal, trained by Sheila Laxon, the first woman to formally train a Melbourne Cup winner. Ethereal also won the Caulfield Cup, a 2,400-metre race also held in Melbourne, thereby completing the "Cups Double".
Maree Lyndon became the first female jockey to ride in the Melbourne Cup when she partnered Argonaut Style in 1987, finishing second last in the 21-horse field.
In 2004, Makybe Diva became the first mare to win two cups, and also the first horse to win with different trainers, after David Hall moved to Hong Kong and transferred her to the Lee Freedman stables.
The 2005 Melbourne Cup was held before a crowd of 106,479. Makybe Diva made history by becoming the only horse to win the race three times. Trainer Lee Freedman said after the race, "Go and find the youngest child on the course because that's the only person here who will have a chance of seeing this happen again in their lifetime."
Due to the 2007 Australian equine influenza outbreak, believed to have been started by a horse brought into Australia from Japan, neither Delta Blues nor Pop Rock participated in the 2007 Melbourne Cup; both horses had been stabled in Japan. The Corowa, NSW-trained Leica Falcon was also not permitted to race in Victoria, despite Corowa being close to the Victorian border. Leica Falcon had been hailed as the new staying star of Australian racing in 2005, when he ran fourth in both the Caulfield Cup and Makybe Diva's famous third Melbourne Cup victory, but serious leg injuries then kept him from racing for another 20 months. Efficient, the previous year's VRC Derby winner, won the race.
In 2013, Damien Oliver returned from an eight-month ban, after betting against his own mount at a previous race meet, to win his 3rd Melbourne cup.
The 2019 Melbourne Cup was overshadowed by recent news of the ill-treatment of horses in the Australian racing industry, and by the withdrawal of notable celebrities including Taylor Swift, Megan Gale, and X-Men actress Lana Condor.
Melbourne Cup day is a public holiday for all who work within metropolitan Melbourne and some parts of regional Victoria, but not for some country Victorian cities and towns, which hold their own spring carnivals. For federal public servants it is also observed as a holiday across the entire state of Victoria, and from 2007 to 2009 in the Australian Capital Territory as well, where it was known as Family and Community Day, replacing Picnic Day. The Melbourne Cup captures the public's imagination to the extent that people, whether at work, home, school, or out and about, usually stop to watch or listen to the race. Many people from outside Melbourne take a half or full day off work to celebrate the occasion, and many feel the day should be a national public holiday, as sick leave is said to increase and productivity to wane on the day.
The event is one of the most popular spectator events in Australia, with sometimes over 110,000 people, some dressed in traditional formal raceday wear and others in all manner of exotic and amusing costumes, attending the race. The record crowd was 122,736 in 2003. The 1926 running of the Cup was the first time the 100,000 mark had been passed. Today the record at Flemington is held by the 2006 Victoria Derby when almost 130,000 attended.
In 2007, a limit was placed on Spring Carnival attendance at Flemington Racecourse, and race-goers are now required to pre-purchase tickets. Attendance nonetheless continues to grow: in 2016 there was a 7.8 per cent increase in the number of out-of-state visitors (80,472) attending the Melbourne Cup Carnival.
'Fashions on the Field' is a major focus of the day, with substantial prizes awarded for the best-dressed man and woman. The requirement for elegant hats, and more recently the alternative of a fascinator, almost single-handedly keeps Melbourne's milliners in business. Raceday fashion has occasionally drawn almost as much attention as the race itself. The miniskirt received worldwide publicity when model Jean Shrimpton wore a white shift version on Derby Day during Melbourne Cup week in 1965.
Flowers, especially roses, are an important component of the week's racing at Flemington. The racecourse has around 12,000 roses within its large expanse, with over 200 varieties of the fragrant flower nurtured by a team of up to 12 gardeners. Each of the major racedays at Flemington has an official flower: Victoria Derby Day the cornflower, Melbourne Cup Day the yellow rose, Oaks Day the pink rose and Stakes Day the red rose.
In the Melbourne metropolitan area, the race day has been a gazetted public holiday since 1877, but around both Australia and New Zealand a majority of people watch the race on television and gamble, either through direct betting or participating in workplace cup "sweeps". In 2000, a betting agency claimed that 80 percent of the adult Australian population placed a bet on the race that year. In 2010 it was predicted that $183 million would be spent by 83,000 tourists during the Spring Racing Carnival. In New Zealand, the Melbourne Cup is the country's single biggest betting event, with carnival race-days held at several of the country's top tracks showing the cup live on big screens.
It is commonly billed as "The race that stops a nation", but it is more accurately "The race that stops two nations", as many people in New Zealand, as well as Australia, pause to watch the race. | https://en.wikipedia.org/wiki?curid=20485 |
Messerschmitt Me 163 Komet
The Messerschmitt Me 163 Komet was a German rocket-powered interceptor aircraft. Designed by Alexander Lippisch, it is the only rocket-powered fighter aircraft ever to have been operational and the first piloted aircraft of any type to exceed 1000 km/h (621 mph) in level flight. Its performance and aspects of its design were unprecedented. In early July 1944, German test pilot Heini Dittmar set an unofficial flight airspeed record that went unmatched by turbojet-powered aircraft for almost a decade. Over 300 Komets were built, but the aircraft proved lackluster in its dedicated role as an interceptor, destroying between 9 and 18 Allied aircraft against 10 losses. Aside from combat losses, many pilots were killed during testing and training.
Work on the design started around 1937 under the aegis of the "Deutsche Forschungsanstalt für Segelflug" (DFS)—the German Institute for the study of sailplane flight. Their first design was a conversion of the earlier Lippisch Delta IV known as the DFS 39, used purely as a glider testbed of the airframe. A larger follow-on version with a small propeller engine started as the DFS 194. This version used wingtip-mounted rudders, which Lippisch felt would cause problems at high speed, so he changed the system of vertical stabilization for the DFS 194's airframe from the earlier DFS 39's wingtip rudders to a conventional vertical stabilizer at the rear of the aircraft. The design included a number of features from its origins as a glider, notably a skid used for landings, which could be retracted into the aircraft's keel in flight. For takeoff, a pair of wheels mounted on the ends of a specially designed cross-axle was needed due to the weight of the fuel, but the wheels, forming a takeoff dolly under the landing skid, were released shortly after takeoff.
The designers planned to use the forthcoming Walter R-1-203 "cold engine", which, like the self-contained Walter HWK 109-500 "Starthilfe" RATO booster rocket unit, used a monopropellant of stabilized high-test peroxide (HTP) known as "T-Stoff". Heinkel had also been working with Hellmuth Walter on his rocket engines, mounting them in the He 112R's tail for testing – this was done in competition with Wernher von Braun's bi-propellant, alcohol/LOX-fed rocket motors, also with the He 112 as a test airframe – and with the Walter catalyzed HTP propulsion format for the first purpose-designed, liquid-fueled rocket aircraft, the He 176. Heinkel had also been selected to produce the fuselage for the DFS 194 when it entered production, as it was felt that the highly volatile monopropellant's reactivity with organic matter would be too dangerous in a wooden fuselage structure. Work continued under the code name "Projekt X".
The division of work between DFS and Heinkel led to problems, notably that DFS seemed incapable of building even a prototype fuselage. Lippisch eventually asked to leave DFS and join Messerschmitt instead. On 2 January 1939, he moved with his team and the partly completed DFS 194 to the Messerschmitt works at Augsburg. The delays caused by this move allowed the engine development to catch up. Once at Messerschmitt, the team decided to abandon the propeller-powered version and move directly to rocket power. The airframe was completed in Augsburg and in early 1940 was shipped to receive its engine at Peenemünde-West, one of the quartet of "Erprobungsstelle"-designated military aviation test facilities of the Reich. Although the engine proved to be extremely unreliable, the aircraft displayed excellent performance in testing.
In the Me 163B and -C subtypes, a ram-air turbine on the extreme nose of the fuselage, and the backup lead-acid battery inside the fuselage that it charged, provided the electrical power for the radio, the Revi16B, -C, or -D reflector gunsight, the direction finder, the compass, the firing circuits of the cannon, and some of the lighting in the cockpit instrumentation.
The onboard lead-acid battery had limited capacity and endurance, providing no more than about 10 minutes of power on its own, hence the fitted ram-air turbine generator.
The airspeed indicator averaged readings from two sources: the pitot tube on the leading edge of the port wing, and a small pitot inlet in the nose, just above the top edge of the underskid channel. There was a further tapping-off of pressure-ducted air from the pitot tube which also provided the rate of climb indicator with its source.
The resistance group around the later executed Austrian priest Heinrich Maier had contacts with the Heinkelwerke in Jenbach in Tyrol, where important components for the Messerschmitt Me 163 were also produced. The group passed on relevant information to the Allies. With the location sketches of the production facilities, the Allied bombers were able to carry out targeted air strikes.
In early 1941, production of a prototype series, known as the "Me 163", began. Secrecy was such that the RLM's "GL/C" airframe number, "8-163", was actually that of the earlier Messerschmitt Bf 163, three prototypes of which (V1 to V3) were built. It was thought that intelligence services would conclude any reference to the number "163" referred to that earlier design. In May 1941, the first prototype Me 163A, V4, was shipped to Peenemünde to receive the HWK RII-203 engine. On 2 October 1941, Me 163A V4, bearing the radio call sign letters, or "Stammkennzeichen", "KE+SW", set a new world speed record of just over 1,000 km/h, piloted by Heini Dittmar, with no apparent damage to the aircraft during the attempt. Some postwar aviation history publications stated that the Me 163A V3 was thought to have set the record.
The record figure was only officially surpassed after the war, by the American Douglas D-558-1 on 20 August 1947. Ten Me 163As (V4 to V13) were built for pilot training and further tests.
During testing of the prototype (A-series) aircraft, the jettisonable undercarriage presented a serious problem. The original dollies possessed well-sprung independent suspension for each wheel, and as the aircraft took off, the large springs rebounded and threw the dolly upward, striking the aircraft. The production (B-series) aircraft used much simpler, crossbeam-axled dollies and relied on the landing skid's oleo-pneumatic strut to absorb ground-running impacts during the takeoff run, as well as the shock of landing. If the hydraulic cylinder malfunctioned, or the skid was mistakenly left in the "locked and lowered" position during a landing (as it had to be for takeoff), the impact of a hard touchdown on the skid could cause back injuries to the pilot.
Once on the ground, the aircraft had to be retrieved by a "Scheuch-Schlepper", a converted small agricultural vehicle, originally based on the concept of the two-wheel tractor, carrying a detachable third swiveling wheel at the extreme rear of its design for stability in normal use—this swiveling third wheel was replaced with a pivoting, special retrieval trailer that rolled on a pair of short, triple-wheeled continuous track setups (one per side) for military service wherever the "Komet" was based. This retrieval trailer usually possessed twin trailing lifting arms, that lifted the stationary aircraft off the ground from under each wing whenever it was not already on its twin-wheel dolly main gear, as when the aircraft had landed on its ventral skid and tailwheel after a mission. Another form of trailer, known also to have been trialled with the later B-series examples, was tried during the "Komet"s test phase, which used a pair of sausage-shaped air bags in place of the lifting arms and could also be towed by the "Scheuch-Schlepper" tractor, inflating the air bags to lift the aircraft. The three-wheeled "Scheuch-Schlepper" tractor used for the task was originally meant for farm use, but such a vehicle with a specialized trailer—which could also lift the Me 163's airframe completely clear of the ground to effect the recovery as a normal part of the Me 163's intended use—was required as the "Komet" was unpowered after exhausting its rocket propellants, and lacked main wheels after landing, from the jettisoning of its "dolly" main gear at takeoff. The slightly larger Sd Kfz 2 "Kettenkrad" half-track motorcycle, known to be used with the Me 262 jet fighter for ground handling needs, and documented as also being used with the Arado Ar 234B jet recon-bomber, was not known to have ever been used for ground handling operations with the "Komet" at any time.
During flight testing, the superior gliding capability of the "Komet" proved detrimental to safe landing. As the now unpowered aircraft completed its final descent, it could rise back into the air with the slightest updraft, and since the approach was unpowered, there was no opportunity to make another landing pass. For production models, a set of landing flaps allowed somewhat more controlled landings, but the issue remained a problem throughout the program. Nevertheless, the overall performance was tremendous, and plans were made to station Me 163 squadrons all over Germany around any potential target. Development of an operational version was given the highest priority.
In December 1941, work on an upgraded design began. A simplified construction format for the airframe was deemed necessary, as the Me 163A version was not truly optimized for large-scale production. The result was the Me 163B subtype, which had the desired, more mass-producible fuselage, wing panel, retractable landing skid and tailwheel designs with the previously mentioned unsprung dolly takeoff gear, and a generally one-piece conical nose for the forward fuselage which could incorporate a turbine for supplementary electrical power while in flight, as well as a one-piece, perimeter frame-only hinged canopy for ease of production.
Meanwhile, Walter had started work on the newer HWK 109-509 bipropellant "hot engine", which added a true fuel of hydrazine hydrate and methanol, designated "C-Stoff", that burned with the oxygen-rich exhaust from the "T-Stoff", used as the oxidizer, for added thrust (see: List of Stoffs). The new powerplant and numerous detail design changes meant to simplify production over the general A-series airframe design resulted in the significantly modified Me 163B of late 1941. Due to the "Reichsluftfahrtministerium" requirement that it should be possible to throttle the engine, the original power plant grew complicated and lost reliability.
The fuel system was particularly troublesome, as leaks incurred during hard landings easily caused fires and explosions. Metal fuel lines and fittings, which failed in unpredictable ways, were used as this was the best technology available. Both fuel and oxidizer were toxic and required extreme care when loading into the aircraft, yet there were occasions when "Komets" exploded on the tarmac from the propellants' hypergolic nature. Both propellants were clear fluids, so separate tanker trucks were used to deliver each propellant to a particular "Komet", usually the "C-Stoff" hydrazine/methanol-based fuel first. For safety, that truck left the immediate area of the aircraft once its load was delivered and the "Komet"s fuel tanks were capped off at a filling point on the rear dorsal fuselage, just ahead of the vertical stabilizer. Only then would the other tanker truck, carrying the very reactive "T-Stoff" hydrogen peroxide oxidizer, deliver its load through a different filling point on the dorsal fuselage, not far behind the rear edge of the canopy.
The corrosive nature of the liquids, especially the "T-Stoff" oxidizer, required special protective gear for the pilots. To help prevent explosions, the engine and the propellant storage and delivery systems were frequently and thoroughly flushed with water before and after flights, to clean out any propellant remnants. Some 120 litres (31.7 US gal) of the chemically active T-Stoff oxidizer sat close to the pilot, split between two auxiliary tanks of equal volume in the lower flanks of the cockpit area, in addition to the main oxidizer tank of some 1,040 litres (275 US gal) just behind the cockpit's rear wall; in a propellant-related mishap, this proximity could present a serious or even fatal hazard to the pilot.
Two prototypes were followed by 30 Me 163 B-0 pre-production aircraft armed with two 20 mm MG 151/20 cannon, and some 400 Me 163 B-1 production aircraft armed with two 30 mm (1.18 in) MK 108 cannons but otherwise similar to the B-0. Early in the war, when German aircraft firms created versions of their aircraft for export, the suffix "a" was added to export ("Ausland") variants (B-1a) or to foreign-built variants (Ba-1), but for the Me 163 there was neither an export nor a foreign-built version. Later in the war, "a" and successive letters were used for aircraft using different engine types: the Me 262 A-1a had Jumo engines, the Me 262 A-1b BMW engines. As the Me 163 was planned with an alternative BMW P3330A rocket engine, it is likely the "a" was used for this purpose on early examples. Only one Me 163, the V10, was tested with the BMW engine, so this designation suffix was soon dropped. The Me 163 B-1a did not have any wingtip "washout" built into it, and as a result it had a much higher critical Mach number than the Me 163 B-1.
The Me 163B had very docile landing characteristics, mostly due to its integrated leading-edge slots, located directly forward of the elevon control surfaces and just behind, and at the same angle as, the wing's leading edge. It would neither stall nor spin. One could fly the "Komet" with the stick full back, hold it in a turn, and then use the rudder to take it out of the turn without fear of it snapping into a spin. It would also slip well. Because the Me 163B's airframe design was derived from glider design concepts, it had excellent gliding qualities and a tendency to keep flying just above the ground due to ground effect. On the other hand, if the turn from base leg onto final was made too tight, the sink rate would increase and one could quickly lose altitude and come in short. Another key difference from a propeller-driven aircraft was the absence of slipstream over the rudder. On takeoff, one had to attain the speed at which the aerodynamic controls become effective—about —and that was always a critical factor. Pilots accustomed to flying propeller-driven aircraft had to be careful that the control stick was not somewhere in the corner when the control surfaces began working. These, like many other specific Me 163 problems, would be resolved by specific training.
The performance of the Me 163 far exceeded that of contemporary piston engine fighters. At a speed of over the aircraft would take off, in a so-called "scharfer Start" ("sharp start", with "Start" being the German word for "take-off") from the ground, from its two-wheeled dolly. The aircraft would be kept at level flight at low altitude until the best climbing speed of around was reached, at which point it would jettison the dolly, retract its extendable skid using a knob-topped release lever just forward of the throttle (as both levers were located atop the cockpit's portside 120 litre "T-Stoff" oxidizer tank) that engaged the aforementioned pneumatic cylinder, and then pull up into a 70° angle of climb, to a bomber's altitude. It could go higher if required, reaching in an unheard-of three minutes. Once there, it would level off and quickly accelerate to around or faster, which no Allied fighter could match. The usable Mach number was similar to that of the Me 262, but because of the high thrust-to-drag ratio, it was much easier for the pilot to lose track of the onset of severe compressibility and risk loss of control. A Mach warning system was installed as a result. The aircraft was remarkably agile and docile to fly at high speed. According to Rudolf Opitz, chief test pilot of the Me 163, it could "fly circles around any other fighter of its time".
By this point, Messerschmitt was completely overloaded with production of the Messerschmitt Bf 109 and attempts to bring the Me 210 into service. Production in a dispersed network was handed over to Klemm, but quality-control problems were such that the work was later given to Junkers, who were at that time underworked. As with many German designs of World War II's later years, parts of the airframe (especially the wings) were made of wood by furniture manufacturers. The older Me 163A and first Me 163B prototypes were used for training. It was planned to introduce the Me 163S, which removed the rocket engine and tank capacity and placed a second seat for the instructor, with his own canopy, above and behind the pilot. The Me 163S would be used for glider-landing training, which, as explained above, was essential to operating the Me 163. It appears the Me 163Ss were converted from the earlier Me 163B-series prototypes.
In service, the Me 163 turned out to be difficult to use against enemy aircraft. Its tremendous speed and climb rate meant a target was reached and passed in a matter of seconds. Although the Me 163 was a stable gun platform, it required excellent marksmanship to bring down an enemy bomber. The "Komet" was equipped with two 30 mm (1.18 in) MK 108 cannons, which had a relatively low muzzle velocity of 540 meters per second (1,772 ft/s) and were accurate only at short range; combined with the enormous closing speed, this made it almost impossible to hit a slow-moving bomber. Four or five hits were typically needed to take down a B-17.
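The firing-window arithmetic behind this problem can be sketched numerically. Only the 540 m/s muzzle velocity comes from the text above; the closure speed, open-fire range, and break-away range below are illustrative assumptions, not figures from the source.

```python
def firing_window_s(closure_speed_mps: float,
                    open_fire_range_m: float,
                    break_away_range_m: float) -> float:
    """Seconds available to aim and fire while closing from the
    open-fire range down to the break-away range."""
    return (open_fire_range_m - break_away_range_m) / closure_speed_mps

# Illustrative assumptions: a Komet overtaking a bomber with ~650 km/h
# of closure, opening fire at 600 m (near the short useful range of a
# low-velocity cannon) and breaking away at 200 m to avoid collision.
closure_mps = 650.0 / 3.6          # km/h -> m/s
window = firing_window_s(closure_mps, 600.0, 200.0)
print(f"{window:.1f} s to fire")   # roughly two seconds per pass
```

With only about two seconds of aimed fire per pass, landing the four or five hits needed on a B-17 demanded exceptional marksmanship, as the text notes.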
Innovative methods were employed to help pilots achieve kills. The most promising was a weapon called the "Sondergerät 500 Jägerfaust". This included 10 single-shot, short-barreled 50 mm (2 inch) guns pointing upwards, similar to "Schräge Musik". Five were mounted in the wing roots on each side of the aircraft. A photocell in the upper surface of the "Komet" triggered the weapons by detecting the change in brightness when the aircraft flew under a bomber. As each shell shot upwards, the disposable gun barrel that fired it was ejected downwards, thus making the weapon recoilless. It appears that this weapon was used in combat only once, resulting in the destruction of a Lancaster bomber on 10 April 1945.
The biggest concern about the design was the short flight time, which never met the projections made by Walter. With only seven and a half minutes of powered flight, the fighter truly was a dedicated point-defense interceptor. To improve this, the Walter firm began developing two more advanced versions of the 509A rocket engine, the 509B and C, each with two separate combustion chambers of differing sizes, one above the other, for greater efficiency. The B-version possessed a main combustion chamber—usually termed the "Hauptofen" in German on these dual-chamber subtypes—with an exterior shape much like that of the single-chamber 509A version; the C-version had a forward chamber of a more cylindrical shape, designed for a higher top thrust level of some 2,000 kg (4,410 lb), while dropping the cubic-shaped frame for the forward engine propellant flow/turbopump mechanisms used by the earlier A and B versions. The 509B and 509C motors' main combustion chambers were supported by the thrust tube exactly as the 509A motor's single chamber had been, and were tuned for high power for takeoff and climb. The added, smaller-volume lower chamber on the two later models, nicknamed the "Marschofen", with approximately of thrust at its top performance level, was intended for more efficient, lower-power cruise flight. These HWK 109-509B and C motors would improve endurance by as much as 50%. Two Me 163 Bs, the V6 and V18, were experimentally fitted with the lower-thrust B-version of the new twin-chamber engine (mandating twin combustion chamber pressure gauges on the instrument panel of any "Komet" so equipped) and a retractable tailwheel, and were tested in spring 1944.
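The endurance gain from a small cruise chamber follows from simple propellant-flow arithmetic: at a fixed specific impulse, mass flow is proportional to thrust, so burn time is inversely proportional to thrust. All numbers below (propellant load, thrust levels, specific impulse) are rough illustrative assumptions, not figures from the source:

```python
def burn_time_s(propellant_kg: float, thrust_kgf: float, isp_s: float) -> float:
    """Constant-thrust burn time: mass flow (kg/s) = thrust (kgf) / Isp (s)."""
    return propellant_kg / (thrust_kgf / isp_s)

PROPELLANT_KG = 2000.0   # assumed total T-Stoff + C-Stoff load
ISP_S = 190.0            # assumed specific impulse of the Walter motor

full_thrust = burn_time_s(PROPELLANT_KG, 1700.0, ISP_S)   # main chamber only
cruise = burn_time_s(PROPELLANT_KG, 300.0, ISP_S)         # small cruise chamber

print(f"full thrust: {full_thrust / 60:.1f} min")   # a few minutes
print(f"cruise:      {cruise / 60:.1f} min")        # far longer
```

A real sortie would climb on the main chamber and cruise on the small one, so the practical gain falls between these extremes, consistent with the roughly 50% endurance improvement claimed for the dual-chamber motors.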
The main combustion chamber of the 509B engine used for the B V6 and V18 occupied the same location as the A-series engine's, with the lower "Marschofen" cruise chamber housed within the retractable tailwheel's appropriately widened ventral tail fairing. On 6 July 1944, the Me 163B V18 (VA+SP), like the B V6 a basically standard production Me 163B airframe outfitted with the new twin-chamber "cruiser" rocket motor and the aforementioned airframe modifications beneath the original rocket motor orifice to accept the extra combustion chamber, set a new unofficial world speed record of , piloted by Heini Dittmar, and landed with almost all of the vertical rudder surface broken away from flutter. In terms of absolute speed, this record was not broken until 6 November 1947, by Chuck Yeager on flight 58 of the Bell X-1 test program, with a , or Mach 1.35 supersonic speed, recorded at an altitude of nearly . However, it is unclear whether Dittmar's flight achieved sufficient altitude for its speed to be considered supersonic, as the X-1 did.
The X-1 never exceeded Dittmar's speed from a normal runway takeoff; Dittmar had reached his performance after a normal "scharfer Start" ground takeoff, without an air drop from a mother ship. Neville Duke exceeded Dittmar's mark for a ground-started aircraft roughly five and a half years after Yeager's achievement (and some 263 km/h short of it), on 31 August 1953, with the Hawker Hunter F Mk 3 at a speed of , after a normal ground start. Postwar experimental aircraft of the aerodynamic configuration the Me 163 used, such as the similarly configured, turbojet-powered Northrop X-4 Bantam and de Havilland DH 108, were found to have serious stability problems when entering transonic flight, which makes the V18's record with the Walter 509B "cruiser" rocket motor all the more remarkable.
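Whether a given true airspeed counts as supersonic depends on altitude, because the speed of sound falls with air temperature. A minimal International Standard Atmosphere (ISA) sketch makes the point; the 1,000 km/h airspeed below is purely illustrative, not Dittmar's record figure:

```python
import math

def speed_of_sound_mps(altitude_m: float) -> float:
    """ISA speed of sound, valid through the troposphere (<= 11 km):
    T = 288.15 K - 6.5 K per km, a = sqrt(gamma * R * T)."""
    t_kelvin = 288.15 - 0.0065 * min(altitude_m, 11000.0)
    return math.sqrt(1.4 * 287.05 * t_kelvin)

def mach_number(speed_kmh: float, altitude_m: float) -> float:
    return (speed_kmh / 3.6) / speed_of_sound_mps(altitude_m)

# The same airspeed is a higher Mach number up high, where the air is colder.
print(f"{mach_number(1000.0, 0.0):.2f}")      # ~0.82 at sea level
print(f"{mach_number(1000.0, 10000.0):.2f}")  # ~0.93 at 10 km
```

This is why the altitude of Dittmar's flight matters when comparing it with the X-1's Mach 1.35: a given speed sits closer to Mach 1 in the colder air at altitude.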
Waldemar Voigt of Messerschmitt's "Oberammergau" project and development offices started a redesign of the 163 to incorporate the new twin-chamber Walter rocket engine and to fix other problems. The resulting Me 163C design featured a larger wing through the addition of an insert at the wing root, an extended fuselage with extra tank capacity through a plug insert behind the wing, a ventral fairing whose aft section housed a retractable tailwheel closely resembling that pioneered on the Me 163B V6, and a new pressurized cockpit topped with a bubble canopy for improved visibility, on a fuselage that dispensed with the earlier B-version's dorsal fairing. The additional tank capacity and cockpit pressurization allowed the maximum altitude to increase to , as well as improving powered time to about 12 minutes, almost doubling combat time (from about five minutes to nine). Three Me 163 C-1a prototypes were planned, but it appears only one was flown, and without its intended engine.
By this time the project had been moved to Junkers, where a new design effort under the direction of Heinrich Hertel at Dessau attempted to improve the "Komet". The Hertel team had to compete with the Lippisch team and their Me 163C. Hertel investigated the Me 163 and found it neither well suited for mass production nor optimized as a fighter aircraft, the most glaring deficiency being the lack of retractable landing gear. To remedy this, what would eventually become the Me 263 V1 prototype was to be fitted with the desired tricycle gear, and to accommodate the twin-chamber Walter rocket from the start; the design was later assigned to the Ju 248 program.
The resulting "Junkers Ju 248" used a three-section fuselage to ease construction. The V1 prototype was completed for testing in August 1944, and was glider-tested behind a Junkers Ju 188. Some sources state that the Walter 109–509C engine was fitted in September, but it was probably never tested under power. At this point the RLM reassigned the project to Messerschmitt, where it became the Messerschmitt Me 263. This appears to have been a formality only, with Junkers continuing the work and planning production. By the time the design was ready to go into production, the plant where it was to be built was overrun by Soviet forces. While it did not reach operational status, the work was briefly continued by the Soviet Mikoyan-Gurevich (MiG) design bureau as the Mikoyan-Gurevich I-270.
The initial test deployment of the Me 163A, to acquaint prospective pilots with the world's first rocket-powered fighter, occurred with "Erprobungskommando 16" (Service Test Unit 16, EK 16), led by "Major" Wolfgang Späte and first established in late 1942; the unit received its eight A-model service-test aircraft by July 1943. Its initial base was the "Erprobungsstelle" (test facility) at the Peenemünde-West field. The unit departed permanently the day after an RAF bombing raid on the area on 17 August 1943, moving southwards to the base at Anklam, near the Baltic coast. The stay was brief, as a few weeks later the unit was placed in northwest Germany, based at the military airfield at Bad Zwischenahn from August 1943 to August 1944. EK 16 received its first armed B-series Komets in January 1944 and was ready for action by May while at Bad Zwischenahn. "Major" Späte flew the first-ever Me 163B combat sortie on 13 May 1944 from the Bad Zwischenahn base, with the armed Me 163B prototype (V41), bearing the "Stammkennzeichen" PK+QL.
As EK 16 commenced small-scale combat operations with the Me 163B in May 1944, the Me 163B's unsurpassed velocity was something Allied fighter pilots were at a loss to counter. The "Komets" attacked singly or in pairs, often even faster than the intercepting fighters could dive. A typical Me 163 tactic was to fly vertically upward through the bombers at , climb to , then dive through the formation again, firing as they went. This approach afforded the pilot two brief chances to fire a few rounds from his cannons before gliding back to his airfield. The pilots reported it was possible to make four passes on a bomber, but only if it was flying alone. As the cockpit was unpressurized, the operational ceiling was limited by what the pilot could endure for several minutes while breathing oxygen from a mask, without losing consciousness. Pilots underwent altitude chamber training to harden them against the rigors of operating in the thin air of the stratosphere without a pressure suit. Special low fiber diets were prepared for pilots, as gas in the gastrointestinal tract would expand rapidly during ascent.
Following the initial combat trial missions of the Me 163B with EK 16, during the winter and spring of 1944 "Major" Späte formed the Luftwaffe's first dedicated Me 163 fighter wing, "Jagdgeschwader" 400 (JG 400), in Brandis, near Leipzig. JG 400's purpose was to provide additional protection for the Leuna synthetic gasoline works which were raided frequently during almost all of 1944. A further group was stationed at Stargard near Stettin to protect the large synthetic fuel plant at Pölitz (today Police, Poland). Further defensive units of rocket fighters were planned for Berlin, the Ruhr and the German Bight.
The first actions involving the Me 163B in regular Luftwaffe active service occurred on 28 July 1944, from I./JG 400's base at Brandis, when two USAAF B-17 Flying Fortresses were attacked without confirmed kills. Combat operations continued from May 1944 to spring 1945. During this time, there were nine confirmed kills, with 14 Me 163s lost. "Feldwebel" Siegfried Schubert was the most successful pilot, with three bombers to his credit. Allied fighter pilots soon noted the short duration of the powered flight: they would wait and, when the engine exhausted its propellant, pounce on the unpowered "Komet", although the "Komet" remained extremely manoeuvrable in gliding flight. Another Allied tactic was to attack the fields the Komets operated from and strafe the aircraft after they landed. Because of the skid-based landing gear, the Komet was immobile until the "Scheuch-Schlepper" tractor could back its trailer up to the nose of the aircraft, place the two rear arms under the wing panels, and jack up the arms to hoist the aircraft off the ground or set it back on its take-off dolly for towing to its maintenance area. A defensive perimeter of anti-aircraft guns was therefore established around these bases to deter Allied fighters.
At the end of 1944, 91 aircraft had been delivered to JG 400, but lack of fuel had kept most of them grounded. It was clear that the original plan for a huge network of Me 163 bases would never be realized. Up to that point, JG 400 had lost only six aircraft due to enemy action; nine were lost to other causes, remarkably few for such a revolutionary and technically advanced aircraft. In the last days of the Third Reich, the Me 163 was given up in favor of the more successful Me 262. At the beginning of May 1945, Me 163 operations were stopped, JG 400 was disbanded, and many of its pilots were sent to fly Me 262s. In any operational sense, the "Komet" was a failure: although it shot down 16 aircraft, mainly four-engined bombers, this did not warrant the effort put into the project. Due to fuel shortages late in the war, few went into combat, and it took an experienced pilot with excellent shooting skills to achieve "kills". The "Komet" also spawned later weapons like the vertical-launch, similarly rocket-powered Bachem Ba 349 Natter and the postwar American turbojet-powered Convair XF-92 delta-wing interceptor. Ultimately, the point-defense role that the Me 163 played would be taken over by the surface-to-air missile (SAM), Messerschmitt's own example being the Enzian.
Captain Eric Brown RN, Chief Naval Test Pilot and commanding officer of the Captured Enemy Aircraft Flight, who tested the Me 163 at the Royal Aircraft Establishment (RAE) at Farnborough, said, "The Me 163 was an aeroplane that you could not afford to just step into the aircraft and say 'You know, I'm going to fly it to the limit.' You had very much to familiarise yourself with it because it was state-of-the-art and the technology used." Acting unofficially, after a spate of accidents involving Allied personnel flying captured German aircraft resulted in official disapproval of such flights, Brown was determined to fly a powered Komet. On around 17 May 1945, he flew an Me 163B at Husum with the help of a cooperative German ground crew, after initial towed flights in an Me 163A to familiarise himself with the handling.
The day before the flight, Brown and his ground crew had performed an engine run on the chosen Me 163B to ensure that everything was running correctly, the German crew being apprehensive should an accident befall Brown, until being given a disclaimer signed by him to the effect that they were acting under his orders. On the rocket-powered "scharfer-start" takeoff the next day, after dropping the takeoff dolly and retracting the skid, Brown later described the resultant climb as "like being in charge of a runaway train", the aircraft reaching 32,000 ft (9.76 km) altitude in 2 minutes, 45 seconds. During the flight, while practicing attacking passes at an imaginary bomber, he was surprised at how well the Komet accelerated in the dive with the engine shut down. When the flight was over Brown had no problems on the approach to the airfield, apart from the rather restricted view from the cockpit due to the flat angle of glide, the aircraft touching down at . Once down safely, Brown and his much-relieved ground crew celebrated with a drink.
Beyond Brown's unauthorised flight, the British never tested the Me 163 under power themselves; due to the danger of its hypergolic propellants it was only flown unpowered. Brown himself piloted RAE's Komet "VF241" on a number of occasions, the rocket motor being replaced with test instrumentation. When interviewed for a 1990s television programme, Brown said he had flown five tailless aircraft (which did not include the pair of American Northrop X-4s) in his career (including the British de Havilland DH 108). Referring to the Komet, he said "this is the only one that had good flight characteristics"; he called the other four "killers".
It has been claimed that at least 29 "Komets" were shipped out of Germany after the war and that of those at least 10 are known to have survived to be put on display in museums around the world. Most of the 10 surviving Me 163s were part of JG 400 and were captured by the British at Husum, the squadron's base at the time of Germany's surrender in 1945. According to the RAF Museum, 48 aircraft were captured intact and 24 were shipped to the United Kingdom for evaluation, although only one, "VF241", was test flown (unpowered).
Eventually an elderly German woman came forward with Me 163 instruments that her late husband had collected after the war, and the engine was reproduced by a machine shop owned by Me 163 enthusiast Reinhold Opitz. The factory closed in the early 1990s and "Yellow 25" was moved to a small museum created on the site. The museum contained aircraft that had once served as gate guards, monuments and other damaged aircraft previously located on the air base. In 1997 "Yellow 25" was moved to the official Luftwaffe Museum located at the former RAF base at Berlin-Gatow, where it is displayed today alongside a restored Walter HWK 109–509 rocket engine. This particular Me 163B is one of the very few World War II–era German military aircraft, restored and preserved in a German aviation museum, to have a swastika national marking of the Third Reich, in a "low visibility" white outline form, currently displayed on the tailfin.
Of the 21 aircraft that were captured by the British, at least three have survived. They were assigned the British serial numbers AM200 to AM220.
As part of their alliance, Germany provided the Japanese Empire with plans and an example of the Me 163. One of the two submarines carrying Me 163 parts did not arrive in Japan, leaving the Japanese without major components and construction blueprints, including the turbopump, which they could not make themselves; this forced them to reverse-engineer their own design from information in the Me 163 Erection & Maintenance manual obtained from Germany. The prototype J8M crashed on its first powered flight and was completely destroyed, but several variants were built and flown, including trainers, fighters, and interceptors, with only minor differences between the versions.
The Navy version, the Mitsubishi J8M1 "Shūsui", replaced the Ho 155 cannon with the Navy's 30 mm (1.18 in) Type 5. Mitsubishi also planned on producing a version of the 163C for the Navy, known as the J8M2 "Shūsui" Model 21. A version of the 163 D/263 was known as the J8M3 "Shusui" for the Navy with the Type 5 cannon, and a Ki-202 with the Ho 155-II for the Army. Trainers were planned, roughly the equivalent of the Me 163 A-0/S; these were known as the Kugisho/Yokosuka MXY8 (Yokoi Ki-13) (an unpowered glider trainer) and Kugisho/Yokosuka MXY9 (a Tsu-11-powered motorjet trainer).
One complete example of the Japanese aircraft survives at the Planes of Fame Air Museum in California. The fuselage of a second aircraft is displayed at the Mitsubishi company's Komaki Plant Museum, at Komaki, Aichi in Japan.
A flying replica Me 163 was constructed between 1994 and 1996 by Joseph Kurtz, a former "Luftwaffe" pilot who trained to fly Me 163s, but who never flew in combat. He subsequently sold the aircraft to EADS. The replica is an unpowered glider whose shape matches that of an Me 163, although its construction completely differs: the glider is built of wood with an empty weight of , a fraction of the weight of a wartime aircraft. Reportedly, it has excellent flying characteristics. The glider is painted red to represent the Me 163 flown by Wolfgang Späte. As of 2011, it was still flying with the civil registration D-1636.
In the early 2000s, a rocket-powered airworthy replica, the "Komet II", was proposed by XCOR Aerospace, a former aerospace company that had previously built the XCOR EZ-Rocket rocket-plane. Although outwardly the same as a wartime aircraft, the "Komet II" design would have differed considerably for safety reasons. It would have been partially constructed with composite materials, powered by one of XCOR's own simpler and safer, pressure fed, liquid oxygen/alcohol engines, and retractable undercarriage would have been used instead of a takeoff dolly and landing skid.
Several static replica Me 163s are exhibited at museums.
Mohamed Atta
Mohamed Mohamed el-Amir Awad el-Sayed Atta (September 1, 1968 – September 11, 2001) was an Egyptian hijacker and one of the ringleaders of the September 11 attacks in which four United States commercial aircraft were commandeered with the intention of destroying specific civilian and military targets. He served as the hijacker-pilot of American Airlines Flight 11, which he crashed into the North Tower of the World Trade Center as part of the coordinated attacks. At 33 years of age, he was the oldest of the 19 hijackers who took part in the attacks.
Born and raised in Egypt, Atta studied architecture at Cairo University, graduating in 1990, and continued his studies in Germany at the Hamburg University of Technology. In Hamburg, Atta became involved with the al-Quds Mosque, where he met Marwan al-Shehhi, Ramzi bin al-Shibh, and Ziad Jarrah, together forming the Hamburg cell. Atta disappeared from Germany for periods of time, embarking on the hajj in 1995 but also meeting Osama bin Laden and other top al-Qaeda leaders in Afghanistan from late 1999 to early 2000. Atta and the other Hamburg cell members were recruited by bin Laden and Khalid Sheikh Mohammed for a "planes operation" in the United States. Atta returned to Hamburg in February 2000, and began inquiring about flight training in the United States.
In June 2000, Atta and Marwan al-Shehhi arrived in the United States to learn how to pilot planes, obtaining instrument ratings in November. Beginning in May 2001, Atta assisted with the arrival of the muscle hijackers, and in July he traveled to Spain to meet with bin al-Shibh to finalize the plot. In August 2001, Atta traveled as a passenger on several "surveillance" flights, to establish in detail how the attacks could be carried out.
On the morning of September 11, Atta boarded American Airlines Flight 11, which he and his team then hijacked. Atta took control of the plane and crashed it into the North Tower of the World Trade Center as planned. The crash led to the collapse of the tower and the deaths of over 1,600 people.
Mohamed Atta varied his name on documents, also using "Mehan Atta", "Mohammad El Amir", "Muhammad Atta", "Mohamed El Sayed", "Mohamed Elsayed", "Muhammad al-Amir", "Awag Al Sayyid Atta", and "Awad Al Sayad". In Germany, he registered his name as "Mohamed el-Amir Awad el-Sayed Atta", and went by the name Mohamed el-Amir at the Hamburg University of Technology. In his will, written in 1996, Atta gives his name as "Mohamed the son of Mohamed Elamir awad Elsayed". Atta also claimed different nationalities, sometimes Egyptian and other times telling people he was from the United Arab Emirates.
Atta was born on September 1, 1968, in Kafr el-Sheikh, located in Egypt's Nile Delta region. His father, Mohamed el-Amir Awad el-Sayed Atta, was a lawyer, educated in both sharia and civil law. His mother, Bouthayna Mohamed Mustapha Sheraqi, came from a wealthy farming and trading family and was also educated. Bouthayna and Mohamed married when she was 14, via an arranged marriage. The family had few relatives on the father's side and kept their distance from Bouthayna's family. In-laws characterized Atta's father as "austere, strict, and private," and neighbors viewed the family as reclusive. Atta was the only son; he had two older sisters who are both well-educated and successful in their careers — one as a medical doctor and the other as a professor.
When Atta was ten, his family moved to the Cairo neighborhood of Abdeen, near the city center. His father, who kept the family insulated, forbade young Atta to fraternize with the other children in their neighborhood. Having little else to do, he mostly studied at home and easily excelled in school. In 1985, Atta enrolled at Cairo University, focusing his studies on engineering. He was among the highest-scoring students, and by his senior year he was admitted to an exclusive architecture program. After graduating in 1990 with an architecture degree, he joined the Engineers Syndicate, an organization under the control of the Muslim Brotherhood. He then worked for several months at the Urban Development Center in Cairo, where he took part in various building projects and handled a range of architectural tasks. Also in 1990, Atta's family moved into the eleventh floor of an apartment building in the Egyptian city of Giza.
Atta graduated from Cairo University with marks insufficient for the graduate program. As his father insisted that he go abroad for graduate studies, Atta entered a German-language program at the Goethe Institute in Cairo. In 1992, his father met a German couple who were visiting Egypt's capital. Over dinner, the couple explained that they ran an exchange program and invited Atta to continue his studies in Germany, offering him room and board at their home. Mohamed Atta accepted and was in Germany two weeks later, in July.
In Germany, he enrolled in the urban planning graduate program at the Hamburg University of Technology. Atta initially lived with two high school teachers; however, they eventually found his closed-mindedness and introverted personality to be too much for them. Atta began adhering to the strictest Islamic diet, frequenting the most conservative mosques, socializing seldom, and acting disdainfully towards the couple's unmarried daughter who had a young child. After six months, they asked him to leave.
By early 1993, Atta had moved into university housing with two roommates, in Centrumshaus. He stayed there until 1998. During that period, his roommates grew annoyed with him. He seldom bathed, and they could not bear his "complete, almost aggressive insularity". He kept so much to himself that he would often answer a greeting with disdainful silence.
At the Hamburg University of Technology, Atta studied under the guidance of the department chair, Dittmar Machule, who specialized in the Middle East. Atta was averse to modern development, including the construction of high-rise buildings in Cairo and other ancient cities in the region. He believed that the drab, impersonal apartment blocks built in the 1960s and 1970s ruined the beauty of old neighborhoods and robbed their people of privacy and dignity. Atta's family had moved into one such building in 1990; to him, it was "a shabby symbol of Egypt's haphazard attempts to modernize and its shameless embrace of the West." For his thesis, Atta concentrated on the ancient Syrian city of Aleppo. He researched the history of the urban landscape in relation to the general theme of conflict between Arab and modern civilization. He criticized how new skyscrapers and other modernizing projects were disrupting the fabric of communities by blocking common streets and altering the skyline.
Atta's professor, Dittmar Machule, brought him along on an archaeological expedition to Aleppo in 1994. The invitation had been for a three-day visit, but Atta ended up staying several weeks that August, and he visited Aleppo again that December. While in Syria, he met Amal, a young Palestinian woman who worked for a planning bureau in the city. Volker Hauth, who was traveling with Atta, described Amal as "attractive and self-confident". She observed Muslim customs, taking taxis to and from the office so as not to come into close physical contact with men on buses, but she was also said to be "emancipated" and "challenging". Atta and Amal appeared to be attracted to each other, but Atta soon decided that "she had a quite different orientation and that the emancipation of the young lady did not fit." This reluctantly abandoned infatuation was the closest Atta came to romance. In mid-1995, he spent three months in Cairo on a grant from the Carl Duisberg Society, along with fellow students Volker Hauth and Ralph Bodenstein. The group studied the effects of redevelopment in Islamic Cairo, the old quarter, which the government had undertaken to remodel for tourism. Atta stayed on in Cairo with his family for a while after Hauth and Bodenstein flew back to Germany.
While in Hamburg, Atta worked part-time at Plankontor, an urban planning firm, beginning in 1992. He was let go in 1997, however, because the firm's business had declined and "his draughtsmanship was not needed" after it bought a CAD system. To supplement his income, Atta took odd jobs, at times working for a cleaning company and buying and selling cars. He had wanted to return to Cairo ever since finishing his studies in Hamburg, but job prospects there were poor, and his family lacked the "right connections" to benefit from the customary nepotism. Moreover, after the Egyptian government imprisoned droves of political activists, he feared that, given his social and political beliefs, it might target him as well.
After coming to Hamburg in 1992, Atta grew more religiously fanatical and frequented the mosque with greater regularity. His friends in Germany described him as an intelligent man in whom religious convictions and political motives held equal sway. He harbored anger and resentment toward the U.S. for its policies in the Islamic nations of the Middle East, particularly the Oslo Accords and the Gulf War. He was also bitter toward the elite in his native Egypt, who hoarded power for themselves, and toward the Egyptian government, which cracked down on the dissident Muslim Brotherhood.
On August 1, 1995, Atta returned to Egypt for three months of study. Before this trip he grew out a beard, intending to present himself as a devout Muslim and to make a political statement. Atta returned to Hamburg on October 31, 1995, and joined the pilgrimage to Mecca shortly thereafter.
In Hamburg, Atta was intensely drawn to the al-Quds Mosque, which adhered to a "harsh, uncompromisingly fundamentalist, and resoundingly militant" version of Sunni Islam. He made acquaintances at al-Quds, some of whom visited him on occasion at Centrumshaus. He also began teaching classes both at al-Quds and at a Turkish mosque near the Harburg district. Atta also started and led a prayer group, which Ahmed Maklat and Mounir El Motassadeq joined. Ramzi bin al-Shibh taught occasional classes there as well and became Atta's friend.
On April 11, 1996, Atta signed his last will and testament at the mosque, officially declaring his Muslim beliefs and giving 18 instructions regarding his burial. This was the same day that Israel, much to Atta's outrage, attacked Lebanon in Operation Grapes of Wrath; signing the will "offering his life" was his response. The instructions in the will reflect both Sunni funeral practices and some more puritanical Salafist demands, including asking people not "to weep and cry" and to refrain generally from showing emotion. The will was signed by el-Motassadeq and a second person at the mosque.
After leaving Plankontor in the summer of 1997, Atta disappeared and did not resurface until 1998. He had made no progress on his thesis. Atta phoned his graduate advisor, Machule, and mentioned family problems at home, saying, "Please understand, I don't want to talk about this." Over the winter break in 1997, Atta left and did not return to Hamburg for three months. He said that he had gone on pilgrimage to Mecca again, just 18 months after his first trip. This claim has been disputed; Terry McDermott has argued that it is unusual for someone to make the pilgrimage again so soon and to spend three months there (far longer than the Hajj requires). When Atta returned, he claimed that his passport had been lost and applied for a new one, a common tactic for erasing evidence of travel to places such as Afghanistan. When he reappeared in spring 1998, after several months away, he had grown a thick, long beard and "seemed more serious and aloof" than before to those who knew him.
By mid-1998, Atta was no longer eligible for university housing in Centrumshaus. He moved into a nearby apartment in the Wilhelmsburg district, where he lived with Said Bahaji and Ramzi bin al-Shibh. By early 1999, Atta had completed his thesis, and formally defended it in August 1999.
In mid-1998, Atta worked alongside Shehhi, bin al-Shibh, and Belfas at a warehouse, packing computers into crates for shipping. The Hamburg group did not stay in Wilhelmsburg for long. The next winter, they moved into an apartment at Marienstrasse 54 in the borough of Harburg, near the Hamburg University of Technology, where they were enrolled. It was here that the Hamburg cell developed and began acting more as a group. They met three or four times a week to discuss their anti-American feelings and to plot possible attacks. Many al-Qaeda members lived in this apartment at various times, including hijacker Marwan al-Shehhi, Zakariya Essabar, and others.
In late 1999, Atta, Shehhi, Jarrah, Bahaji, and bin al-Shibh decided to travel to Chechnya to fight against the Russians, but were convinced by Khalid al-Masri and Mohamedou Ould Slahi at the last minute to change their plans. They instead traveled to Afghanistan over a two-week period in late November. On November 29, 1999, Mohamed Atta boarded Turkish Airlines Flight TK1662 from Hamburg to Istanbul, where he changed to flight TK1056 to Karachi, Pakistan. After they arrived, they were selected by al-Qaeda leader Mohammed Atef as suitable candidates for the "planes operation" plot: they were all well-educated, had experience living in Western society, spoke some English, and would be able to obtain visas. Even before bin al-Shibh had arrived, Atta, Shehhi, and Jarrah were sent to the House of Ghamdi near bin Laden's home in Kandahar, where bin Laden was waiting to meet them. Bin Laden asked them to pledge loyalty and commit to suicide missions, which Atta and the other three Hamburg men all accepted. Bin Laden sent them to see Atef for a general overview of the mission, and then they were sent to Karachi to go over specifics with Khalid Sheikh Mohammed.
German investigators said that they had evidence that Mohamed Atta trained at al-Qaeda camps in Afghanistan from late 1999 to early 2000. The timing was outlined on August 23, 2002, by Klaus Ulrich Kersten, director of Germany's federal anticrime agency, the Bundeskriminalamt, who provided the first official confirmation that Atta and two other pilots had been in Afghanistan, as well as the first dates for the training. Kersten said in an interview at the agency's headquarters in Wiesbaden that Atta was in Afghanistan from late 1999 until early 2000, and that there was evidence that Atta met with Osama bin Laden there.
A video surfaced in October 2006. The first chapter of the video showed bin Laden at Tarnak Farms on January 8, 2000. The second chapter showed Atta and Ziad Jarrah reading their wills together ten days later on January 18. On his return journey, Atta left Karachi on February 24, 2000, by flight TK1057 to Istanbul where he changed to flight TK1661 to Hamburg. Immediately after returning to Germany, Atta, al-Shehhi, and Jarrah reported their passports stolen, possibly to discard travel visas to Afghanistan.
On March 22, 2000, Atta was still in Germany when he sent an e-mail to the Academy of Lakeland in Florida. He inquired about flight training, "Dear sir, we are a small group of young men from different Arab countries. Now, we are living in Germany since a while for study purposes. We would like to start training for the career of airline professional pilots. In this field, we haven't yet any knowledge but we are ready to undergo an intensive training program (up to ATP and eventually higher)." Atta sent 50–60 similar e-mails to other flight training schools in the United States.
On May 17, Atta applied for a United States visa. The next day, he received a five-year B-1/B-2 (tourist/business) visa from the United States embassy in Berlin. Because Atta had lived in Germany for approximately five years and had a "strong record as a student", he was treated favorably and not scrutinized. After obtaining his visa, Atta took a bus on June 2 from Hamburg to Prague, where he stayed overnight before traveling on to the United States the next day. Bin al-Shibh later explained that they believed it would contribute to operational security for Atta to fly out of Prague rather than Hamburg, from which he had previously traveled. Likewise, Shehhi traveled from a different location, in his case via Brussels.
On June 6, 2002, ABC's "World News Tonight" broadcast an interview with Johnelle Bryant, a former loan officer at the U.S. Department of Agriculture in south Florida, who described her encounter with Mohamed Atta. This encounter took place "around the third week of April to the third week of May of 2000", before Atta's official entry date into the United States (see below). According to Bryant, Atta wanted to finance the purchase of a crop-duster. "He wanted to finance a twin-engine, six-passenger aircraft and remove the seats," Bryant told ABC's "World News Tonight". He insisted that she write his name as ATTA, and said that he was originally from Egypt but had moved to Afghanistan, that he was an engineer, and that his dream was to attend a flight school. He asked about the Pentagon and the White House. He said he wanted to visit the World Trade Center and asked Bryant about the security there. He mentioned al-Qaeda and said the organization "could use memberships from Americans". He mentioned Osama bin Laden and said "this man would someday be known as the world's greatest leader." Bryant said "the picture that came out in the newspaper, that's exactly what that man looked like." Bryant contacted the authorities after recognizing Atta in news reports. Law-enforcement officials said Bryant passed a lie-detector exam.
According to official reports, Atta flew from Prague to Newark International Airport, arriving on June 3, 2000. That month, Atta and Shehhi stayed in hotels and rented rooms in New York City on a short-term basis. They continued to inquire about flight schools and personally visited some, including Airman Flight School in Norman, Oklahoma, which they visited on July 3, 2000. Days later, Shehhi and Atta ended up in Venice, Florida. Atta and Shehhi established accounts at SunTrust Bank and received wire transfers from Ali Abdul Aziz Ali, Khalid Sheikh Mohammed's nephew in the United Arab Emirates. On July 6, 2000, Atta and Shehhi enrolled at Huffman Aviation in Venice, where they entered the Accelerated Pilot Program, while Ziad Jarrah took flight training from a different school also based in Venice. When Atta and Shehhi arrived in Florida, they initially stayed with Huffman's bookkeeper and his wife in a spare room of their house. After a week, they were asked to leave because they were rude. Atta and Shehhi then moved into a small house nearby in Nokomis where they stayed for six months.
Atta began flight training on July 6, 2000, and continued training nearly every day. By the end of July, both Atta and Shehhi had flown solo. Atta earned his private pilot certificate in September, and then he and Shehhi decided to switch flight schools. Both enrolled at Jones Aviation in Sarasota and trained there for a brief time. They had problems following instructions and were both very upset when they failed their Stage 1 exam at Jones Aviation. They inquired about multi-engine planes and told the instructor that "they wanted to move quickly, because they had a job waiting in their country upon completion of their training in the U.S." In mid-October, Atta and Shehhi returned to Huffman Aviation to continue training. In November 2000, Atta earned his instrument rating, followed in December by a commercial pilot's license from the Federal Aviation Administration.
Atta continued with flight training that included solo flights and simulator time. On December 22, Atta and Shehhi applied to Eagle International for large-jet and simulator training on McDonnell Douglas DC-9 and Boeing 737-300 models. On December 26, Atta and Shehhi needed a tow for their rented Piper Cherokee on a taxiway of Miami International Airport after the engine shut down. On December 29 and 30, they went to the Opa-locka Airport, where they practiced on a Boeing 727 simulator, and on December 31 they obtained Boeing 767 simulator training from Pan Am International. In November and December 2000, Atta purchased cockpit videos for Boeing 747-200, Boeing 757-200, Airbus A320, and Boeing 767-300ER models via mail order from Sporty's Pilot Shop in Batavia, Ohio.
Records from Atta's cellphone indicated that he phoned the Moroccan embassy in Washington on January 2, just before Shehhi flew to Morocco. Atta flew to Spain on January 4, 2001, to coordinate with bin al-Shibh and returned to the United States on January 10. Back in the United States, he traveled to Lawrenceville, Georgia, where he and Shehhi visited an LA Fitness health club. During that time Atta flew out of Briscoe Field in Lawrenceville with a pilot, and Atta and either the pilot or Shehhi flew around the Atlanta area. They lived in the area for several months. On April 3, Atta and Shehhi rented a postal box in Virginia Beach, Virginia.
On April 11, Atta and Shehhi rented an apartment at 10001 Atlantic Blvd, Apt. 122 in Coral Springs, Florida, for $840 per month, and assisted with the arrival of the muscle hijackers. On April 16, Atta was given a citation for not having a valid driver's license, and he began steps to get the license. On May 2, Atta received his driver's license in Lauderdale Lakes, Florida. While in the United States, Atta owned a red 1989 Pontiac Grand Prix.
On June 27, Atta flew from Fort Lauderdale to Boston, Massachusetts, where he spent a day, and then continued to San Francisco for a short time, and from there to Las Vegas. On June 28, Atta arrived at McCarran International Airport in Las Vegas to meet with the three other pilots. He rented a Chevrolet Malibu from an Alamo Rent A Car agency. It is not known where he stayed that night, but on the 29th he registered at the Econo Lodge at 1150 South Las Vegas Boulevard. Here he presented an AAA membership for a discount, and paid cash for the $49.50/night room. During his trip to Las Vegas, he is thought to have used a video camera that he had rented from a Select Photo outlet back in Delray Beach, Florida.
In July 2001, Atta again left for Spain in order to meet with bin al-Shibh for the last time. On July 7, 2001, Atta flew on Swissair Flight 117 from Miami to Zürich, where he had a stopover. On July 8, Atta was recorded on surveillance video when he withdrew 1700 Swiss francs from an ATM. He used his credit card to purchase two Swiss Army knives and some chocolate in a shop at the Zürich Airport. After the stopover in Zürich, he arrived in Madrid at 4:45 pm on Swissair Flight 656, and spent several hours at the airport. Then at 8:50 pm, he checked into the Hotel Diana Cazadora in Barajas, a town near the airport. That night and twice the next morning, he called Bashar Ahmad Ali Musleh, a Jordanian student in Hamburg who served as a liaison for bin al-Shibh.
On the morning of July 9, Mohamed Atta rented a silver Hyundai Accent, which he booked from SIXT Rent-A-Car for July 9 to 16, and later extended to the 19th. He drove east out of Madrid towards the Mediterranean beach area of Tarragona. On the way, Atta stopped in Reus to pick up Ramzi bin al-Shibh at the airport. They drove to Cambrils, where they spent a night at the Hotel Monica. They checked out the next morning, and spent the next few days at an unknown location in Tarragona. The absence of other hotel stays, signed receipts or credit card stubs has led investigators to believe that the men may have met in a safe house provided by other al-Qaeda operatives in Spain. There, Atta and bin al-Shibh held a meeting to complete the planning of the attacks. Several clues have been found to link their stay in Spain to Syrian-born Imad Eddin Barakat Yarkas (Abu Dahdah), and Amer el Azizi, a Moroccan in Spain. They may have helped arrange and host the meeting in Tarragona. Yosri Fouda, who interviewed bin al-Shibh and Khalid Sheikh Mohammed (KSM) before the arrest, believes that Said Bahaji and KSM may have also been present at the meeting. Spanish investigators have said that Marwan al-Shehhi and two others later joined the meeting. Bin al-Shibh would not discuss this meeting with Fouda.
During the Spain meetings, Atta and bin al-Shibh coordinated the details of the attacks. The 9/11 Commission obtained details about the meeting from interrogations of bin al-Shibh in the weeks after his arrest in September 2002. Bin al-Shibh explained that he passed along instructions from Osama bin Laden, including his desire for the attacks to be carried out as soon as possible; bin Laden was concerned about having so many operatives in the United States. Atta confirmed that all the muscle hijackers had arrived in the United States without any problems, but said that he needed five to six more weeks to work out details. Bin Laden also asked that the other operatives not be informed of the specific dates until the last minute. During the meeting, Atta and bin al-Shibh also decided on the targets to be hit, ruling out a strike on a nuclear plant. Bin al-Shibh passed along bin Laden's list of targets: bin Laden wanted the U.S. Capitol, the Pentagon, and the World Trade Center to be attacked, as they were deemed "symbols of America." If any of the hijackers could not reach their intended targets, Atta said, they were to crash the plane. They also discussed the personal difficulties Atta was having with fellow hijacker Ziad Jarrah; bin al-Shibh was worried that Jarrah might even abandon the plan. The 9/11 Commission Report speculated that the now-convicted terrorist conspirator Zacarias Moussaoui was being trained as a possible replacement for Jarrah.
From July 13 to 16, Atta stayed at the Hotel Sant Jordi in Tarragona. After bin al-Shibh returned to Germany on July 16, 2001, Atta had three more days in Spain. He spent two nights in Salou at the beachside Casablanca Playa Hotel, then spent the last two nights at the Hotel Residencia Montsant. On July 19, Atta returned to the United States, flying on Delta Air Lines from Madrid to Fort Lauderdale, via Atlanta.
On July 22, 2001, Atta rented a Mitsubishi Galant from Alamo Rent a Car, putting 3,836 miles on the vehicle before returning it on July 26. On July 25, Atta dropped Ziad Jarrah off at Miami International Airport for a flight back to Germany. On July 26, Atta traveled via Continental Airlines to Newark, New Jersey, checked into the Kings Inn Hotel in Wayne, New Jersey, and stayed there until July 30 when he took a flight from Newark back to Fort Lauderdale.
On August 4, Atta is believed to have been at Orlando International Airport waiting to pick up suspected "20th Hijacker" Mohammed al-Qahtani from Dubai, who ended up being held by immigration as "suspicious." Atta was believed to have used a payphone at the airport to phone a number "linked to al-Qaeda" after Qahtani was denied entry.
On August 6, Atta and Shehhi rented a white four-door 1995 Ford Escort from Warrick's Rent-A-Car, which was returned on August 13. The same day, Atta booked a flight on Spirit Airlines from Fort Lauderdale to Newark, leaving on August 7 and returning on August 9. The reservation was not used and was canceled on August 9, with the reason given as "Family Medical Emergency". Instead, he went to Central Office & Travel in Pompano Beach to purchase a ticket for a flight to Newark, leaving on the evening of August 7 and scheduled to return on the evening of August 9. Atta did not take the return flight. On August 7, Atta checked into the Wayne Inn in Wayne, New Jersey, and checked out on August 9. The same day, he booked a one-way first-class ticket via the Internet on America West Flight 244 from Ronald Reagan Washington National Airport to Las Vegas. Atta traveled twice to Las Vegas on "surveillance flights", rehearsing how the 9/11 attacks would be carried out. Other hijackers traveled to Las Vegas at different times in the summer of 2001.
Throughout the summer, Atta met with Nawaf al-Hazmi to discuss the status of the operation on a monthly basis.
On August 23, Atta's driver's license was revoked "in absentia" after he failed to appear in traffic court to answer the earlier citation for driving without a license. On the same day, the Israeli Mossad reportedly gave his name to the CIA as part of a list of 19 people it said were planning an attack in the near future. Only four of the names are known for certain, the others being Marwan al-Shehhi, Khalid al-Mihdhar, and Nawaf al-Hazmi. On August 30, he was recorded purchasing a utility knife at a Wal-Mart store near the hotel where he stayed prior to 9/11.
On September 10, 2001, Atta picked up Omari from the Milner Hotel in Boston, Massachusetts, and the two terrorists drove their rented Nissan Altima to a Comfort Inn in South Portland, Maine. On the way, they were seen getting gasoline at an Exxon gas station and visited the Longfellow House in Portland that afternoon; they arrived at the hotel at 5:43 p.m. and spent the night in Room 233. While in South Portland, they were seen making two ATM withdrawals and stopping at Wal-Mart. The FBI also reported that "two middle-eastern men" were seen in the parking lot of a Pizza Hut, where Atta is known to have eaten that day.
Atta and Omari arrived early the next morning, at 5:40 a.m., at Portland International Jetport, where they left their rental car in the parking lot and boarded a 6:00 a.m. Colgan Air (US Airways Express) BE-1900C flight to Boston's Logan International Airport. In Portland, Mohamed Atta was selected by the Computer Assisted Passenger Prescreening System (CAPPS), which required his checked bags to undergo extra screening for explosives but involved no extra screening at the passenger security checkpoint.
The connection between the two flights at Logan International Airport was within Terminal B, but the two gates were not connected within security: passengers had to leave the secured area, go outdoors, cross a covered roadway, and enter another building before going through security once again. There are two separate concourses in Terminal B; the south concourse is mainly used by US Airways and the north one mostly by American Airlines. It had apparently been overlooked that, because of this detail of the terminal's layout, there would be a second security screening to pass in Boston. At 6:45 a.m., while at the Boston airport, Atta took a call from Flight 175 hijacker Marwan al-Shehhi, apparently to confirm that the attacks were ready to begin. Atta checked in for American Airlines Flight 11, passed through security again, and boarded the flight. He was seated in business class, in seat 8D. At 7:59 a.m., the plane departed from Boston for Los Angeles, California, carrying 81 passengers.
The hijacking began at 8:14 a.m., 15 minutes after the flight departed, when beverage service would have been starting. At this time, the pilots stopped responding to air traffic control, and the aircraft began deviating from the planned route. At 8:18 a.m., flight attendants Betty Ong and Madeline Amy Sweeney began making phone calls to American Airlines to report what was happening. Ong provided information about the lack of communication with the cockpit, the lack of access to the cockpit, and passenger injuries. The plane's transponder was turned off at 8:21 a.m. At 8:24:38 a.m., a voice believed to be Atta's was heard by air traffic controllers, saying: "We have some planes. Just stay quiet and you will be OK. We are returning to the airport." "Nobody move, everything will be OK. If you try to make any moves you'll endanger yourself and the airplane. Just stay quiet." "Nobody move, please. We are going back to the airport. Don't try to make any stupid moves." At 8:46:35 a.m., the plane flew into the North Tower of the World Trade Center in New York City.
Because the flight from Portland to Boston had been delayed, Atta's bags did not make it onto Flight 11. The bags were later recovered at Logan International Airport; they contained airline uniforms, flight manuals, and other items, including a copy of Atta's will, written in Arabic, as well as a list of instructions called "The Last Night". This document is divided into three sections: the first is a fifteen-point list providing detailed instructions for the last night of a martyr's life; the second gives instructions for traveling to the plane; and the third covers the time between boarding the plane and martyrdom. Almost all of these points concern spiritual preparation, such as prayer and reciting religious scripture.
On October 1, 2006, "The Sunday Times" released a video it had obtained "through a previously tested channel", purporting to show Mohamed Atta and Ziad Jarrah recording a martyrdom message at a training camp in Afghanistan. The video, bearing the date of January 18, 2000, is of good resolution but contains no audio; lip readers have failed to decipher it. Atta and Jarrah appear in high spirits, laughing and smiling in front of the camera. They had never been pictured together before. Unidentified sources from both al-Qaeda and the United States confirmed the video's authenticity to "The Sunday Times". A separate section of the video shows Osama bin Laden addressing his followers at a complex near Kandahar. Ramzi bin al-Shibh is also identified in the video. According to "The Sunday Times", "American and German investigators have struggled to find evidence of Atta's whereabouts in January 2000 after he disappeared from Hamburg. The hour-long tape places him in Afghanistan at a decisive moment in the development of the conspiracy when he was given operational command. Months later both he and Jarrah enrolled at flying schools in America."
In the aftermath of the September 11, 2001, attacks, the names of the hijackers were released, and there was some confusion over Mohamed Atta's identity, including cases of mistaken identity. Initially, Mohamed Atta was confused with Mahmoud Mahmoud Atta, a native Jordanian 14 years his senior who had bombed an Israeli bus in the West Bank in 1986, killing one person and severely injuring three. Mahmoud Atta, a naturalized U.S. citizen, was subsequently deported from Venezuela to the United States, extradited to Israel, tried, and sentenced to life in prison. The Israeli Supreme Court later overturned his extradition and set him free. After 9/11, there were also reports stating that Mohamed Atta had attended International Officers School at Maxwell Air Force Base in Montgomery, Alabama. "The Washington Post" quoted a United States Air Force official who explained, "discrepancies in their biographical data, such as birth dates 20 years off, indicate we are probably not talking about the same people."
In the months following the September 11 attacks, officials at the Czech Interior Ministry asserted that Atta had made a trip to Prague on April 8, 2001, to meet with an Iraqi intelligence agent named Ahmed Khalil Ibrahim Samir al-Ani. This piece of information was passed on to the FBI as "unevaluated raw intelligence". Intelligence officials have since concluded that no such meeting occurred. A Pakistani businessman named Mohammed Atta had come to Prague from Saudi Arabia on May 31, 2000, and this second Atta may have contributed to the confusion. The Egyptian Mohamed Atta arrived at the Florenc bus terminal in Prague, from Germany, on June 2, 2000, and left the next day, flying on Czech Airlines to Newark, New Jersey. In the Czech Republic, some intelligence officials say the source of the purported meeting was an Arab informant who approached the Czech intelligence service with his sighting of Atta only after Atta's photograph had appeared in newspapers all over the world. United States and Czech intelligence officials have since concluded that the person seen with al-Ani was mistakenly identified as Atta, and the consensus among investigators is that Atta never attended a meeting in Prague.
In 2005, Army Lt. Col. Anthony Shaffer and Congressman Curt Weldon alleged that the Defense Department data-mining project Able Danger had produced a chart identifying Atta, along with Nawaf al-Hazmi, Khalid al-Mihdhar, and Marwan al-Shehhi, as members of a Brooklyn-based al-Qaeda cell in early 2000. Shaffer largely based his allegations on the recollections of Navy Captain Scott Phillpott, who later recanted, telling investigators that he was "convinced that Atta was not on the chart that we had." Phillpott said that Shaffer was "relying on my recollection 100 percent," and the Defense Department Inspector General's report indicated that Phillpott "may have exaggerated knowing Atta's identity because he supported using Able Danger's techniques to fight terrorism."
Five witnesses who had worked on Able Danger and had been questioned by the Defense Department's Inspector General later told investigative journalists that their statements had been distorted by investigators in the final IG report, or that the report had omitted essential information they provided. The alleged distortions centered on the exclusion of any evidence that Able Danger had identified and tracked Atta years before 9/11.
Lt. Col. Shaffer's book likewise asserts that Able Danger directly identified the Brooklyn cell and Mohamed Atta.
Atta's father, Mohamed el-Amir Awad el-Sayed Atta, a retired lawyer in Egypt, vehemently rejected allegations that his son was involved in the September 11 attacks, instead accusing the Mossad and the United States government of framing him. Atta Sr. dismissed media reports that his son drank heavily, describing him instead as a shy, quiet boy, uninvolved in politics and devoted to studying architecture. The elder Atta said he had spoken with Mohamed by phone on September 12, 2001, the day after the attacks. In interviews with the German newspaper "Bild am Sonntag" in late 2002, he said his son was alive and in hiding, fearing for his life, and that American Christians were responsible for the attacks. In a subsequent interview in 2005, Atta Sr. stated, "My son is gone. He is now with God. The Mossad killed him."
There are multiple, conflicting explanations of Atta's behavior and motivation. Political psychologist Jerrold Post has suggested that Atta and his fellow hijackers were simply following orders from al-Qaeda leadership, "and whatever their destructive, charismatic leader Osama bin Laden said was the right thing to do for the sake of the cause was what they would do." Political scientist Robert Pape, in turn, has argued that Atta was motivated by commitment to a political cause and was psychologically normal: "not readily characterized as depressed, not unable to enjoy life, not detached from friends and society." By contrast, criminal justice professor Adam Lankford has found evidence indicating that Atta was suicidal, and that his struggles with social isolation, depression, guilt, shame, hopelessness, and rage were extraordinarily similar to those of people who commit conventional suicide and murder-suicide. On this view, Atta's political and religious beliefs shaped the method of his suicide and his choice of target, but they were not the underlying causes of his behavior.
Messerschmitt Me 262
The Messerschmitt Me 262, nicknamed Schwalbe (German: "Swallow") in fighter versions, or Sturmvogel (German: "Storm Bird") in fighter-bomber versions, was the world's first operational jet-powered fighter aircraft. Design work started before World War II began, but problems with engines, metallurgy and top-level interference kept the aircraft from operational status with the Luftwaffe until mid-1944. The Me 262 was faster and more heavily armed than any Allied fighter, including the British jet-powered Gloster Meteor. One of the most advanced aviation designs in operational use during World War II, the Me 262's roles included light bomber, reconnaissance and experimental night fighter versions.
Me 262 pilots claimed a total of 542 Allied aircraft shot down, although higher claims are sometimes made. The Allies countered its effectiveness in the air by attacking the aircraft on the ground and during takeoff and landing. Strategic materials shortages and design compromises on the Junkers Jumo 004 axial-flow turbojet engines led to reliability problems. Attacks by Allied forces on fuel supplies during the deteriorating late-war situation also reduced the effectiveness of the aircraft as a fighting force. Armament production within Germany was focused on more easily manufactured aircraft. In the end, the Me 262 had a negligible impact on the course of the war as a result of its late introduction and the consequently small numbers put in operational service.
While German use of the aircraft ended with the close of World War II, a small number were operated by the Czechoslovak Air Force until 1951. The Me 262 also heavily influenced contemporary designs such as the Sukhoi Su-9 (1946) and the Nakajima Kikka. Captured Me 262s were studied and flight-tested by the major powers, and ultimately influenced post-war aircraft such as the North American F-86 Sabre, the MiG-15 and the Boeing B-47 Stratojet. Several aircraft survive on static display in museums, and there are several privately built flying reproductions that use modern General Electric J85 engines.
Several years before World War II, the Germans foresaw the great potential of aircraft using the jet engine constructed by Hans Joachim Pabst von Ohain in 1936. After the successful test flights of the world's first jet aircraft, the Heinkel He 178, within a week of the invasion of Poland that began the war, they adopted the jet engine for an advanced fighter aircraft. As a result, the Me 262 was already under development as "Projekt" 1065 (P.1065) before the start of World War II. The project originated with a request by the "Reichsluftfahrtministerium" (RLM, Ministry of Aviation) for a jet aircraft capable of one hour's endurance and a speed of at least . Dr Waldemar Voigt headed the design team, with Messerschmitt's chief of development, Robert Lusser, overseeing.
Plans were first drawn up in April 1939, and the original design, submitted in June 1939, was very different from the aircraft that eventually entered service, with wing-root-mounted engines rather than podded ones. The progression of the original design was delayed greatly by technical issues involving the new jet engine. Because the engines were slow to arrive, Messerschmitt moved them from the wing roots to underwing pods, allowing them to be changed more readily if needed; this would turn out to be important, both for availability and maintenance. Since the BMW 003 jets proved heavier than anticipated, the wing was swept slightly, by 18.5°, to accommodate a change in the center of gravity. Funding for the jet-engine program was also initially lacking, as many high-ranking officials thought the war could easily be won with conventional aircraft. Among them were Hermann Göring, head of the Luftwaffe, who cut the engine development program to just 35 engineers in February 1940 (the month before the first wooden mock-up was completed); Willy Messerschmitt, who wanted to maintain mass production of the piston-powered, 1935-origin Bf 109 and the projected Me 209; and Major General Adolf Galland, who had initially supported Messerschmitt through the early development years, flying the Me 262 himself on 22 April 1943. By that time, problems with engine development had slowed production of the aircraft considerably. One particularly acute problem was the lack of an alloy with a melting point high enough to endure the high temperatures involved, a problem that had not been adequately resolved by the end of the war. The aircraft made its first successful flight entirely on jet power on 18 July 1942, powered by a pair of Jumo 004 engines, after a November 1941 flight (with BMW 003s) ended in a double flameout.
The project aerodynamicist on the design of the Me 262 was Ludwig Bölkow. He initially designed the wing using NACA airfoils modified with an elliptical nose section. Later in the design process, these were changed to AVL derivatives of NACA airfoils, the NACA 00011-0.825-35 being used at the root and the NACA 00009-1.1-40 at the tip. The elliptical-nose derivatives of the NACA airfoils were also used on the horizontal and vertical tail surfaces. The wings were of single-spar cantilever construction, with stressed skins whose thickness tapered from root to tip. To expedite construction, save weight and use less strategic material, wing interiors were not painted late in the war. The wings were fastened to the fuselage at four points, using a pair of large bolts and forty-two smaller ones.
In mid-1943, Adolf Hitler envisioned the Me 262 as a ground-attack/bomber aircraft rather than a defensive interceptor. The configuration of a high-speed, light-payload "Schnellbomber" ("fast bomber") was intended to penetrate enemy airspace during the expected Allied invasion of France. His edict resulted in the development of (and concentration on) the "Sturmvogel" variant. It is debatable to what extent Hitler's interference extended the delay in bringing the "Schwalbe" into operation; it appears engine vibration issues were at least as costly, if not more so. Albert Speer, then Minister of Armaments and War Production, in his memoirs claimed Hitler originally had blocked mass production of the Me 262, before agreeing in early 1944. Hitler rejected arguments the aircraft would be more effective as a fighter against the Allied bombers destroying large parts of Germany, and wanted it as a bomber for revenge attacks. According to Speer, Hitler felt its superior speed compared to other fighters of the era meant it could not be attacked, and so preferred it for high altitude straight flying.
The Me 262 is often referred to as a "swept wing" design as the production aircraft had a small, but significant leading edge sweep of 18.5° which likely provided an advantage by increasing the critical Mach number. Sweep, uncommon at the time, was added after the initial design of the aircraft. The engines proved heavier than originally expected, and the sweep was added primarily to position the center of lift properly relative to the center of mass. (The original 35° sweep, proposed by Adolf Busemann, was not adopted.) On 1 March 1940, instead of moving the wing backward on its mount, the outer wing was re-positioned slightly aft; the trailing edge of the midsection of the wing remained unswept. Based on data from the AVA Göttingen and wind tunnel results, the inboard section's leading edge (between the nacelle and wing root) was later swept to the same angle as the outer panels, from the "V6" sixth prototype onward throughout volume production.
Test flights began on 18 April 1941 with the Me 262 V1, bearing its "Stammkennzeichen" radio code letters PC+UA. Since its intended BMW 003 turbojets were not ready for fitting, a conventional Junkers Jumo 210 piston engine was mounted in the V1 prototype's nose, driving a propeller, to test the airframe. When the BMW 003 engines were installed, the Jumo was retained for safety, which proved wise: both 003s failed during the first flight and the pilot had to land on the nose-mounted engine alone. The V1 through V4 prototype airframes all had what would become an uncharacteristic feature for most later jet aircraft, a fully retracting conventional undercarriage with a retracting tailwheel. By contrast, the first prospective German jet-fighter airframe ever flown, the Heinkel He 280, used a retractable tricycle landing gear from the outset and flew on jet power alone as early as the end of March 1941.
The V3 third prototype airframe, with the code PC+UC, became a true jet when it flew on 18 July 1942 in Leipheim near Günzburg, Germany, piloted by test pilot Fritz Wendel. This was almost nine months ahead of the British Gloster Meteor's first flight on 5 March 1943. Its retracting conventional tail wheel gear (similar to other contemporary piston powered propeller aircraft), a feature shared with the first four Me 262 V-series airframes, caused its jet exhaust to deflect off the runway, with the wing's turbulence negating the effects of the elevators, and the first takeoff attempt was cut short.
On the second attempt, Wendel solved the problem by tapping the aircraft's brakes at takeoff speed, lifting the horizontal tail out of the wing's turbulence. The initial four prototypes (V1–V4) were built with this conventional gear configuration. Changing to a tricycle arrangement—a permanently fixed undercarriage on the fifth prototype (V5, code PC+UE), followed by the definitive fully retractable nosewheel gear on the V6 (with "Stammkennzeichen" code VI+AA, from a new code block) and subsequent aircraft—corrected this problem.
Test flights continued over the next year, but engine problems continued to plague the project, the Jumo 004 being only marginally more reliable than the lower-thrust (7.83 kN/1,760 lbf) BMW 003. Airframe modifications were complete by 1942 but, hampered by the lack of engines, serial production did not begin until 1944, and deliveries were low, with 28 Me 262s in June, 59 in July, but only 20 in August.
By summer 1943, the Jumo 004A engine had passed several 100-hour tests, with a time between overhauls of 50 hours being achieved. However, the 004A proved unsuitable for full-scale production because of its considerable weight and its heavy use of strategic materials (nickel, cobalt, molybdenum), which were in short supply. Consequently, the 004B engine was designed to use a minimum of strategic materials: all high-heat-resistant metal parts, including the combustion chamber, were changed to mild steel (SAE 1010) and protected against oxidation only by an aluminum coating. The engine as a whole was a design compromise to minimize the use of strategic materials and to simplify manufacture. With the lower-quality steels used in the 004B, the engine required an overhaul after just 25 hours for a metallurgical test of the turbine. If it passed the test, the engine was refitted for a further 10 hours of use, but 35 hours marked the absolute limit for the turbine wheel. While the axial-compressor turbojets of BMW and Junkers were characterised by a sophisticated design that could offer considerable advantages – a layout also used in generalized form in the contemporary American Westinghouse J30 turbojet – the lack of rare materials put the Jumo 004 at a disadvantage compared with the "partly axial-flow" Power Jets W.2/700, which, despite its own largely centrifugal-compressor-influenced design, provided an operational life span of 125 hours with an overhaul interval of 60–65 hours. Frank Whittle concluded in his final assessment of the two engines: "it was in the quality of high temperature materials that the difference between German and British engines was most marked."
Operationally, carrying fuel in two tanks, one each fore and aft of the cockpit, plus a ventral fuselage tank beneath, the Me 262 had a total flight endurance of 60 to 90 minutes. Fuel was usually J2 (derived from brown coal), with the option of diesel or a mixture of oil and high-octane B4 aviation petrol. Fuel consumption was double that of typical twin-engine fighter aircraft of the era, which led to the installation of a low-fuel warning indicator in the cockpit that notified pilots when the remaining fuel fell below .
Unit cost for an Me 262 airframe, less engines, armament, and electronics, was "RM"87,400. To build one airframe took around 6,400 man-hours.
On 19 April 1944, "Erprobungskommando" 262 was formed at Lechfeld, just south of Augsburg, as a test unit ("Jäger Erprobungskommando Thierfelder", commanded by "Hauptmann" Werner Thierfelder) to introduce the 262 into service and train a corps of pilots to fly it. On 26 July 1944, Leutnant Alfred Schreiber, flying the 262 A-1a W.Nr. 130 017, damaged a Mosquito of No. 540 Squadron RAF, a photo-reconnaissance unit; the Mosquito was allegedly lost in a crash upon landing at an air base in Italy. Other sources state the aircraft was damaged during evasive manoeuvres and escaped.
Major Walter Nowotny was assigned as commander after the death of Thierfelder in July 1944, and the unit was redesignated "Kommando Nowotny". Essentially a trials and development unit, it mounted the world's first jet-fighter operations. Trials continued slowly, with initial operational missions against the Allies in August 1944, and the unit made claims for 19 Allied aircraft in exchange for six Me 262s lost.
Despite orders to stay grounded, Nowotny chose to fly a mission against an enemy bomber formation passing above on 8 November 1944. He claimed two P-51Ds destroyed before suffering engine failure at high altitude. While diving and trying to restart his engines, he was attacked by other Mustangs, forced to bail out, and died. The "Kommando" was then withdrawn for further flight training and a revision of combat tactics to optimise the 262's strengths.
On 26 November 1944, an Me 262A-2a Sturmvogel of III. "Gruppe"/KG 51 'Edelweiß', based at Rheine-Hopsten Air Base near Osnabrück, became the first jet combat aircraft confirmed shot down by ground fire, falling to a Bofors gun of B.11 Detachment of 2875 Squadron RAF Regiment at the RAF forward airfield of Helmond, near Eindhoven. Others were lost to ground fire on 17 and 18 December, when the same airfield was attacked at intervals by a total of 18 Me 262s; the guns of 2873 and 2875 Squadrons RAF Regiment damaged several, causing at least two to crash within a few miles of the airfield. In February 1945, a B.6 gun detachment of 2809 Squadron RAF Regiment shot down another Me 262 over the airfield of Volkel. The final appearance of 262s over Volkel came later in 1945, when yet another fell to 2809's guns.
By January 1945, "Jagdgeschwader" 7 (JG 7) had been formed as a pure jet-fighter wing, partly based at Parchim, although it was several weeks before it became operational. In the meantime, a bomber unit—I. "Gruppe", "Kampfgeschwader" 54 (KG(J) 54)—had been redesignated as such on 1 October 1944, re-equipped with the Me 262A-2a fighter-bomber and trained for the ground-attack role. However, the unit lost 12 jets in action in two weeks for minimal returns. "Jagdverband" 44 (JV 44) was another Me 262 fighter unit, of squadron ("Staffel") size given the low numbers of available personnel, formed in February 1945 by Lieutenant General Adolf Galland, who had recently been dismissed as Inspector of Fighters. Galland was able to draw into the unit many of the most experienced and decorated Luftwaffe fighter pilots from other units grounded by lack of fuel.
During March, Me 262 fighter units were able, for the first time, to mount large-scale attacks on Allied bomber formations. On 18 March 1945, thirty-seven Me 262s of JG 7 intercepted a force of 1,221 bombers and 632 escorting fighters. They shot down 12 bombers and one fighter for the loss of three Me 262s. Although a 4:1 ratio was exactly what the Luftwaffe would have needed to make an impact on the war, the absolute scale of their success was minor, as it represented only 1% of the attacking force.
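As a rough sanity check, the engagement figures quoted above can be worked through directly (a sketch using only the numbers given in this paragraph):

```python
# 18 March 1945: JG 7 claimed 12 bombers and 1 escort fighter
# for the loss of 3 Me 262s, against a force of 1,221 bombers.
kills = 12 + 1
losses = 3
bombers_attacked = 1221

kill_loss_ratio = kills / losses        # ~4.3 : 1, roughly the 4:1 cited
bomber_share = 12 / bombers_attacked    # ~0.98%, i.e. about 1% of the force
print(f"{kill_loss_ratio:.1f}:1", f"{bomber_share:.1%}")
```

The computed 4.3:1 exchange rate and ~1% share of the attacking force match the figures stated in the text.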
In the last days of the war, Me 262s from JG 7 and other units were committed to ground-assault missions in an attempt to support German troops fighting Red Army forces. Just south of Berlin, halfway between Spremberg and the capital, the Wehrmacht's 9th Army (with elements of the 12th Army and 4th Panzer Army) was assaulting the Red Army's 1st Ukrainian Front. To support this attack, on 24 April JG 7 dispatched thirty-one Me 262s on a strafing mission in the Cottbus–Bautzen area. Luftwaffe pilots claimed six lorries and seven Soviet aircraft, but three German jets were lost. On the evening of 27 April, thirty-six Me 262s from JG 7, III./KG(J) 6 and KG(J) 54 were sent against Soviet forces attacking German troops in the forests north-east of Baruth. They strafed 65 Soviet lorries, after which the Me 262s intercepted low-flying Il-2 Sturmoviks searching for German tanks; the jet pilots claimed six Sturmoviks for the loss of three Messerschmitts. During operations between 28 April and 1 May, Soviet fighters and ground fire downed at least ten more Me 262s from JG 7.
JG 7 nevertheless managed to keep its jets operational until the end of the war. On 8 May, at around 4:00 p.m., "Oberleutnant" Fritz Stehle of 2./JG 7, flying an Me 262 over the Erzgebirge, attacked a formation of Soviet aircraft. He claimed a Yakovlev Yak-9, but the aircraft shot down was probably a P-39 Airacobra: Soviet records show the loss of two Airacobras, one of them probably downed by Stehle, who would thus have scored the last Luftwaffe air victory of the war.
Several two-seat trainer variants of the Me 262, the Me 262 B-1a, had been adapted through the "Umrüst-Bausatz 1" factory refit package as night fighters, complete with on-board FuG 218 "Neptun" high-VHF band radar, using "Hirschgeweih" ("stag's antlers") antennae with a set of dipole elements shorter than the "Lichtenstein SN-2" had used, as the B-1a/U1 version. Serving with 10. "Staffel" "Nachtjagdgeschwader" 11, near Berlin, these few aircraft (alongside several single-seat examples) accounted for most of the 13 Mosquitoes lost over Berlin in the first three months of 1945. Intercepts were generally or entirely made using "Wilde Sau" methods, rather than AI radar-controlled interception. As the two-seat trainer was largely unavailable, many pilots made their first jet flight in a single-seater without an instructor.
Despite its deficiencies, the Me 262 clearly marked the beginning of the end of piston-engined aircraft as effective fighting machines. Once airborne, it could accelerate to speeds over , about faster than any Allied fighter operational in the European Theater of Operations.
The Me 262's top ace was probably "Hauptmann" Franz Schall with 17 kills, including six four-engine bombers and ten P-51 Mustang fighters, although fighter ace "Oberleutnant" Kurt Welter claimed 25 Mosquitoes and two four-engine bombers shot down by night and two further Mosquitoes by day. Most of Welter's claimed night kills were achieved by eye, even though he had tested a prototype Me 262 fitted with FuG 218 "Neptun" radar. Another candidate for top ace on the aircraft was "Oberstleutnant" Heinrich Bär, credited with 16 enemy aircraft while flying Me 262s, out of his total of 240 aircraft shot down.
The Me 262 was so fast that German pilots needed new tactics to attack Allied bombers. In the head-on attack, the combined closing speed was too high for accurate shooting with ordnance that could fire only about 44 shells a second in total (650 rounds/min from each of the four cannon). Even from astern, the closing speed was too great to use the short-ranged quartet of MK 108 cannon to maximum effect. Therefore, a roller-coaster attack was devised. The 262s approached from astern, higher than the bombers. From about , they went into a shallow dive that took them through the escort fighters with little risk of interception. Once below the bombers, they pulled up sharply to reduce speed. On levelling off, they were overtaking the bombers from behind, well placed to attack them.
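The combined rate of fire mentioned above follows directly from the armament (a sketch; 650 rounds/min per cannon is the figure given in the text):

```python
# Four MK 108 cannon, each firing ~650 rounds per minute.
rounds_per_minute_each = 650
cannon_count = 4
shells_per_second = rounds_per_minute_each * cannon_count / 60
print(round(shells_per_second, 1))  # 43.3, i.e. "about 44 shells a second"
```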
Since the 30 mm MK 108 cannon's short barrels and low muzzle velocity rendered it inaccurate beyond , and the jet's speed required breaking off at to avoid colliding with the target, Me 262 pilots normally commenced firing at . Gunners of Allied bomber aircraft found that their electrically powered gun turrets had problems tracking the jets: target acquisition was difficult because the jets, flying their standard attack profile, closed into firing range quickly and remained in firing position only briefly.
A prominent Royal Navy test pilot, Captain Eric Brown, chief naval test pilot and commanding officer of the Captured Enemy Aircraft Flight Royal Aircraft Establishment, who tested the Me 262 noted: "This was a Blitzkrieg aircraft. You whack in at your bomber. It was never meant to be a dogfighter, it was meant to be a destroyer of bombers... The great problem with it was it did not have dive brakes. For example, if you want to fight and destroy a B-17, you come in on a dive. The 30mm cannon were not so accurate beyond . So you normally came in at and would open fire on your B-17. And your closing speed was still high and since you had to break away at to avoid a collision, you only had two seconds firing time. Now, in two seconds, you can't sight. You can fire randomly and hope for the best. If you want to sight and fire, you need to double that time to four seconds. And with dive brakes, you could have done that."
Eventually, German pilots developed new combat tactics to counter Allied bombers' defences. Me 262s, equipped with up to 24 unguided folding-fin R4M rockets—12 in each of two underwing racks, outboard of the engine nacelle—approached from the side of a bomber formation, where the bombers' silhouettes were widest, and, while still out of range of the bombers' machine guns, fired a salvo of rockets with strongly brisant Hexogen-filled warheads, the same explosive used in the shells fired by the Me 262A's quartet of MK 108 cannon. One or two of these rockets could down even the famously rugged Boeing B-17 Flying Fortress, owing to the "metal-shattering" brisant effect of the fast-flying rocket's explosive warhead. The much more massive BR 21 large-calibre rockets, fired from tubular launchers mounted under the nose of the Me 262A (one on either side of the nosewheel well), were only about as fast as the MK 108's shells.
Though this broadside-attack tactic was effective, it came too late to have a real effect on the war, and only small numbers of Me 262s were equipped with the rocket packs. Most of those so equipped were Me 262A-1a models, members of "Jagdgeschwader" 7. This method of attacking bombers became the standard, and mass deployment of Ruhrstahl X-4 guided missiles was cancelled. Some nicknamed this tactic the Luftwaffe's Wolf Pack, as the fighters often made runs in groups of two or three, fired their rockets, then returned to base. On 1 September 1944, USAAF General Carl Spaatz expressed the fear that if greater numbers of German jets appeared, they could inflict losses heavy enough to force cancellation of the Allied bombing offensive by daylight.
The Me 262 was difficult to counter because its high speed and rate of climb made it hard to intercept. However, as with other turbojet engines of the time, the Me 262's engines did not provide sufficient thrust at low airspeeds and throttle response was slow, so in certain circumstances, such as takeoff and landing, the aircraft became a vulnerable target. Another disadvantage that pioneering jet aircraft of the World War II era shared was the high risk of compressor stall: if throttle movements were too rapid, the engines could suffer a flameout. Opening the throttle too coarsely caused fuel surging and excessive jet-pipe temperatures, so pilots were instructed to operate the throttle gently and avoid quick changes. German engineers introduced an automatic throttle regulator later in the war, but it only partly alleviated the problem.
The plane had, by contemporary standards, a high wing loading (294.0 kg/m², 60.2 lb/ft²) that required higher takeoff and landing speeds, and because of the poor throttle response, the engines' tendency toward airflow disruption that could stall the compressor was an ever-present hazard. The high speed of the Me 262 also presented problems when engaging enemy aircraft: the high-speed convergence allowed Me 262 pilots little time to line up their targets or judge the appropriate amount of deflection. This problem faces any aircraft that approaches another from behind at much higher speed, as the slower aircraft in front can always pull a tighter turn, forcing the faster aircraft to overshoot.
Luftwaffe pilots eventually learned how to handle the Me 262's higher speed and the Me 262 soon proved a formidable air superiority fighter, with pilots such as Franz Schall managing to shoot down seventeen enemy fighters in the Me 262, ten of them American P-51 Mustangs. Other notable Me 262 aces included Georg-Peter Eder, with twelve enemy fighters to his credit (including nine P-51s), Erich Rudorffer also with twelve enemy fighters to his credit, Walther Dahl with eleven (including three Lavochkin La-7s and six P-51s) and Heinz-Helmut Baudach with six (including one Spitfire and two P-51s) amongst many others.
Pilots soon learned that the Me 262 was quite maneuverable despite its high wing loading and lack of low-speed thrust, especially when flown within its effective maneuvering speeds. The controls were light, effective right up to the maximum permissible speed, and perfectly harmonised. The inclusion of full-span automatic leading-edge slats, something of a "tradition" on Messerschmitt fighters dating back to the original Bf 109's outer-wing slots of a similar type, increased the overall lift produced by the wing by as much as 35% in tight turns or at low speeds, greatly improving the aircraft's turn performance as well as its landing and takeoff characteristics. As many pilots soon found, the Me 262's clean design also meant that, like all jets, it held its speed in tight turns much better than conventional propeller-driven fighters, a great potential advantage in a dogfight as it meant better energy retention in maneuvers.
Too fast for the escorting Allied fighters to catch, the Me 262s were almost impossible to head off. As a result, Me 262 pilots were relatively safe from the Allied fighters as long as they did not allow themselves to be drawn into low-speed turning contests and saved their maneuvering for higher speeds. The Allied fighters could be fought in much the same way as U.S. fighters in the Pacific fought the more nimble but slower Japanese fighters.
Allied pilots soon found that the only reliable way to destroy the jets, as with the even faster Me 163B "Komet" rocket fighters, was to attack them on the ground or during takeoff or landing. Luftwaffe airfields identified as jet bases were frequently bombed by medium bombers, and Allied fighters patrolled over the fields to attack jets trying to land. The Luftwaffe countered by installing extensive "flak" alleys of anti-aircraft guns along the approach lines to protect the Me 262s from the ground—and by providing top cover during the jets' takeoff and landing with the most advanced Luftwaffe single-engined fighters, the Focke-Wulf Fw 190D and (just becoming available in 1945) Focke-Wulf Ta 152H. Nevertheless, in March–April 1945, Allied fighter patrol patterns over Me 262 airfields resulted in numerous jet losses.
As the Me 262A's pioneering Junkers Jumo 004 axial-flow jet engines needed careful nursing by their pilots, these jet aircraft were particularly vulnerable during takeoff and landing. Lt. Chuck Yeager of the 357th Fighter Group was one of the first American pilots to shoot down an Me 262, which he caught during its landing approach. On 7 October 1944, Lt. Urban Drew of the 365th Fighter Group shot down two Me 262s that were taking off, while on the same day Lt. Col. Hubert Zemke, who had transferred to the Mustang equipped 479th Fighter Group, shot down what he thought was a Bf 109, only to have his gun camera film reveal that it may have been an Me 262. On 25 February 1945, Mustangs of the 55th Fighter Group surprised an entire "Staffel" of Me 262As at takeoff and destroyed six jets.
The British Hawker Tempest scored several kills against the new German jets, including the Messerschmitt Me 262. Hubert Lange, a Me 262 pilot, said: "the Messerschmitt Me 262's most dangerous opponent was the British Hawker Tempest—extremely fast at low altitudes, highly manoeuvrable and heavily armed." Some were destroyed with a tactic known to the Tempest 135 Wing as the "Rat Scramble": Tempests on immediate alert took off when an Me 262 was reported airborne. They did not intercept the jet, but instead flew towards the Me 262 and Ar 234 base at Hopsten air base. The aim was to attack jets on their landing approach, when they were at their most vulnerable, travelling slowly, with flaps down and incapable of rapid acceleration. The German response was the construction of a "flak lane" of over 150 emplacements of the 20 mm "Flakvierling" quadruple autocannon batteries at Rheine-Hopsten to protect the approaches. After seven Tempests were lost to flak at Hopsten in a week, the "Rat Scramble" was discontinued.
Adolf Busemann had proposed swept wings as early as 1935; Messerschmitt researched the topic from 1940. In April 1941, Busemann proposed fitting a 35° swept wing ("Pfeilflügel II", literally "arrow wing II") to the Me 262, the same wing-sweep angle later used on both the American F-86 Sabre and Soviet Mikoyan-Gurevich MiG-15 fighter jets. Though this was not implemented, he continued with the projected HG II and HG III ("Hochgeschwindigkeit", "high-speed") derivatives in 1944, designed with a 35° and 45° wing sweep, respectively.
Interest in high-speed flight, which led him to initiate work on swept wings starting in 1940, is evident from the advanced developments Messerschmitt had on his drawing board in 1944. While the Me 262 V9 "Hochgeschwindigkeit I" (HG I) flight-tested in 1944 had only small changes compared to combat aircraft, most notably a low-profile canopy—tried as the "Rennkabine" (literally "racing cabin") on the ninth Me 262 prototype for a short time—to reduce drag, the HG II and HG III designs were far more radical. The projected HG II combined the low-drag canopy with a 35° wing sweep and a V-tail (butterfly tail). The HG III had a conventional tail, but a 45° wing sweep and turbines embedded in the wing roots.
Messerschmitt also conducted a series of flight tests with the series production Me 262. Dive tests determined that the Me 262 went out of control in a dive at Mach 0.86, and that higher Mach numbers would cause a nose-down trim that the pilot could not counter. The resulting steepening of the dive would lead to even higher speeds and the airframe would disintegrate from excessive negative g loads.
The HG series of Me 262 derivatives was believed capable of reaching transonic Mach numbers in level flight, with the top speed of the HG III being projected as Mach 0.96 at altitude. After the war, the Royal Aircraft Establishment, at that time one of the leading institutions in high-speed research, re-tested the Me 262 to help with British attempts at exceeding Mach 1. The RAE achieved speeds of up to Mach 0.84 and confirmed the results from the Messerschmitt dive-tests. The Soviets ran similar tests.
After Willy Messerschmitt's death in 1978, the former Me 262 pilot Hans Guido Mutke claimed to have exceeded Mach 1 on 9 April 1945 in an Me 262 in a "straight-down" 90° dive. The claim rests solely on Mutke's memory of the incident, which recalls effects that other Me 262 pilots observed below the speed of sound at high indicated airspeed, and no altitude reference was available to determine the actual speed. The pitot tube used to measure airspeed can give falsely elevated readings as pressure builds up inside the tube at high speeds. Moreover, the Me 262 wing had only a slight sweep, incorporated for trim (center of gravity) reasons, and would likely have suffered structural failure due to divergence at high transonic speeds. One airframe, the aforementioned Me 262 V9, Werknummer 130 004, with "Stammkennzeichen" VI+AD, was prepared as the HG I test airframe with the low-profile "Rennkabine" racing canopy and may have achieved an unofficial record speed for a turbojet-powered aircraft, altitude unspecified, even though the recorded wartime airspeed record was set on 6 July 1944 by another Messerschmitt design, the Me 163B V18 rocket fighter, which landed with a nearly disintegrated rudder surface.
About 1,400 Me 262s were produced, but a maximum of 200 were operational at any one time. Estimates of the enemy aircraft they destroyed range from 300 to 450, against about one hundred Me 262s lost to the Allies in the air. While Germany was bombed intensively, production of the Me 262 was dispersed into low-profile production facilities, sometimes little more than clearings in the forests of Germany and the occupied countries. From the end of February to the end of March 1945, approximately sixty Me 262s were destroyed in attacks on Obertraubling and thirty at Leipheim; the Neuburg jet plant itself was bombed on 19 March 1945.
Large, heavily protected underground factories, such as the partly buried Weingut I complex for Jumo 004 jet engine production, were constructed to take up production of the Me 262 safe from bomb attacks, but the war ended before they could be completed. Wings were produced in Germany's oldest motorway tunnel at Engelberg, west of Stuttgart. At "B8 Bergkristall-Esche II" at St. Georgen/Gusen, Austria, slave labourers of the Gusen II concentration camp produced fully equipped fuselages for the Me 262 at a monthly rate of 450 units on large assembly lines from early 1945. Gusen II was known as one of the harshest concentration camps; the typical life expectancy there was six months. An estimated 35,000 to 50,000 people died on the forced-labour details for the Me 262.
After the end of the war, the Me 262 and other advanced German technologies were quickly swept up by the Soviets, British and Americans, the American effort forming part of the USAAF's Operation Lusty. Many Me 262s were found in readily repairable condition and were confiscated. The Soviets, British and Americans all wished to evaluate the technology, particularly the engines.
During testing, the Me 262 was found to be faster than the British Gloster Meteor jet fighter, to have better visibility to the sides and rear (mostly due to the Meteor's canopy frames and the discoloration caused by the plastics used in its construction), and to be a superior gun platform to the Meteor F.1, which tended to snake at high speed and exhibited "weak" aileron response. The Me 262 had a shorter range than the Meteor and less reliable engines.
The USAAF compared the P-80 Shooting Star and Me 262, concluding that the Me 262 was superior in acceleration and speed, with similar climb performance. The Me 262 appeared to have a higher critical Mach number than any American fighter.
The Americans also tested an Me 262A-1a/U3, an unarmed photo-reconnaissance version, which was fitted with a fighter nose and a smooth finish. Between May and August 1946, the aircraft completed eight flights, totalling four hours and forty minutes. Testing was discontinued after four engine changes were required during the course of the tests, culminating in two single-engine landings. These aircraft were extensively studied, aiding the development of early US, British and Soviet jet fighters. The F-86, designed by engineer Edgar Schmued, used a slat design based on the Me 262's.
The Czechoslovak aircraft industry continued to produce single-seat (Avia S-92) and two-seat (Avia CS-92) variants of the Me 262 after World War II. From August 1946, a total of nine S-92s and three two-seater CS-92s were completed and test flown. They were introduced in 1947 and in 1950 were supplied to the 5th Fighter Squadron, becoming the first jet fighters to serve in the Czechoslovak Air Force. These were kept flying until 1951, when they were replaced in service by Soviet jet fighters. Both versions are on display at the Prague Aviation Museum in Kbely.
In January 2003, the American Me 262 Project, based in Everett, Washington, completed flight testing to allow the delivery of partially updated spec reproductions of several versions of the Me 262 including at least two B-1c two-seater variants, one A-1c single seater and two "convertibles" that could be switched between the A-1c and B-1c configurations. All are powered by General Electric CJ610 engines and feature additional safety features, such as upgraded brakes and strengthened landing gear. The "c" suffix refers to the new CJ610 powerplant and has been informally assigned with the approval of the Messerschmitt Foundation in Germany (the Werknummer of the reproductions picked up where the last wartime produced Me 262 left off – a continuous airframe serial number run with a near 60-year production break).
Flight testing of the first newly manufactured Me 262 A-1c (single-seat) variant (Werknummer 501244) was completed in August 2005. The first of these machines (Werknummer 501241) went to a private owner in the southwestern United States, while the second (Werknummer 501244) was delivered to the Messerschmitt Foundation at Manching, Germany. This aircraft conducted a private test flight in late April 2006, and made its public debut in May at the ILA 2006. The new Me 262 flew during the public flight demonstrations. Me 262 Werknummer 501241 was delivered to the Collings Foundation as White 1 of JG 7; this aircraft offered ride-along flights starting in 2008. The third replica, a non-flyable Me 262 A-1c, was delivered to the Evergreen Aviation & Space Museum in May 2010.
Note: U = "Umrüst-Bausatz" – conversion kit installed at factory level, denoted as a suffix in the form /U"n".
Rüstsätze are kits that may be applied to various sub-types of their respective aircraft type, denoted as a suffix in the form /R"n".
Data from: Messerschmitt Me 262A Schwalbe
A series of reproductions was constructed by the American company Legend Flyers (later Me 262 Project) of Everett, Washington. The Jumo 004 engines of the original are replaced by more reliable General Electric CJ610 engines. The first Me 262 reproduction (a two-seater) took off for the first time in December 2002 and the second in August 2005. The latter was delivered to the Messerschmitt Foundation and was presented at the ILA airshow in 2006.
Masuria
Masuria (Masurian: "Mazurÿ") is a region in northeastern Poland, famous for its 2,000 lakes. Masuria occupies much of the Masurian Lake District. Administratively, it is part of the Warmian-Masurian Voivodeship (administrative area/province). Its biggest city, often regarded as its capital, is Ełk. The region covers a territory of some 10,000 km2 and had a population in 2013 of 59,790.
Before the 13th century, the territory was inhabited by the Old Prussians, also called Baltic Prussians, a Baltic ethnic group that lived in Prussia (the southeastern coastal region of the Baltic Sea, around the Vistula Lagoon and the Curonian Lagoon). The territory later called Masuria was then known as Galindia and was probably a peripheral, deeply forested and lightly populated area. Its inhabitants spoke a language now known as Old Prussian and had their own mythology. Although a 19th-century German political entity bore their name, the Old Prussians were not Germans. They were converted to Roman Catholicism in the 13th century, after conquest by the Knights of the Teutonic Order.
Estimates range from about 170,000 to 220,000 Old Prussians living in the whole of Prussia around 1200. The wilderness was their natural barrier against attack by would-be invaders. During the Northern Crusades of the early 13th century, the Old Prussians used this wide forest as a broad zone of defence. They did so again against the Knights of the Teutonic Order, who had been invited to Poland by Konrad I of Masovia in 1226. The order's goal was to convert the native population to Christianity, by force if necessary. In the subsequent conquest, which lasted over 50 years, the original population was partly exterminated, particularly during the major Prussian rebellion of 1261–83. But several Prussian noble families also accommodated themselves to the Knights in order to hold their power and possessions.
After the Order's acquisition of Prussia, Poles (or more specifically, Mazurs, that is, inhabitants of the adjacent region of Mazovia) began to settle in the southeastern part of the conquered region. German, Dutch, Flemish, and Danish colonists entered the area afterward, from the northwest. The number of Polish settlers grew significantly again in the beginning of the 15th century, especially after the first and second treaties of Thorn, in 1411 and 1466 respectively, following the Thirteen Years' War and the final defeat of the order. The Battle of Grunwald took place in western Masuria in 1410. In 1440 the anti-Teutonic Prussian Confederation was founded, and in 1454, upon the Confederation's request, King Casimir IV of Poland signed the act of incorporation of the entire region, including Masuria, into Poland. After the subsequent Thirteen Years' War, Masuria came under the suzerainty of the Polish Crown, though still ruled by the grand master of the Teutonic Order. The later assimilation of the German settlers, the Polish immigrants and the native Prussian inhabitants created a new Prussian identity, although the subregional difference between the German- and Polish-speaking parts remained.
The secularization of the Teutonic Order in Prussia and the conversion of Albert of Prussia to Lutheranism in 1525 brought Prussia, including the area later called Masuria, to Protestantism. The Knights untied their bonds to the Catholic Church and became landowning noblemen, and the Duchy of Prussia was established as a vassal state of Poland. The Polish language predominated due to the many immigrants from Mazovia, who in the 16th century additionally settled the until then largely uncultivated southern parts of Ducal Prussia (later Masuria). While the southern countryside was inhabited by these, by then Protestant, Polish speakers who had taken refuge there, the very small southern towns had a mixed German- and Polish-speaking population. The ancient Old Prussian language survived in parts of the countryside in the northern and central parts of Ducal Prussia until the early 18th century, by which time its speakers had been assimilated into the mass of German-speaking villagers and farmers. Areas with many Polish speakers were known as the Polish Departments.
Masuria became one of the leading centers of Polish Protestantism. In the mid-16th century Lyck (Ełk) and Angerburg (Węgorzewo) became significant Polish printing centers. A renowned Polish high school, which attracted Polish students from different regions, was founded in Ełk in eastern Masuria in 1546 by Hieronim Malecki, Polish translator and publisher, who contributed to the creation of the standards and patterns of the Polish literary language. The westernmost part of Masuria, the Osterode (Ostróda) county, in 1633 came under the administration of one of the last dukes of the Piast dynasty, John Christian of Brieg.
In 1656, during the Battle of Prostki, the forces of the Polish-Lithuanian Commonwealth, including 2,000 Tatar raiders, defeated the allied Swedish and Brandenburg army, capturing Bogusław Radziwiłł. The war resulted in the destruction of most towns; 249 villages and settlements and 37 churches were destroyed. Over 50% of the population of Masuria died in the years 1656–1657: 23,000 were killed, another 80,000 died of disease and famine, and 3,400 people were enslaved and deported to Russia. From 1709 to 1711, between 200,000 and 245,000 of Ducal Prussia's 600,000 inhabitants died of plague. In Masuria the death toll varied regionally; while 6,789 people died in the district of Rhein (Ryn), only 677 died in Seehesten (Szestno). In Lötzen (Giżycko), 800 out of 919 people died. Losses in population were compensated by the migration of Protestant settlers or refugees from Scotland, Salzburg (expulsion of Protestants in 1731), France (Huguenot refugees after the Edict of Fontainebleau in 1685), and especially from the counter-reformed Polish-Lithuanian Commonwealth, including Polish Brethren expelled from Poland in 1657. The last group of refugees to emigrate to Masuria were the Russian Philipons ('Old Believers' opposed to the State Church) in 1830, when King Frederick William III of Prussia granted them asylum.
After the death of Albert Frederick, Duke of Prussia in 1618, his son-in-law John Sigismund, Margrave of Brandenburg, inherited the duchy (including Masuria), combining the two territories under a single dynasty and forming Brandenburg-Prussia. The Treaty of Wehlau revoked the sovereignty of the King of Poland in 1657.
The region became part of the Kingdom of Prussia with the coronation of King Frederick I of Prussia in 1701 in Königsberg. Masuria became part of the newly created administrative province of East Prussia upon its creation in 1773. The name "Masuria" began to be used officially after new administrative reforms in Prussia after 1818. Masurians referred to themselves during that period as "Polish Prussians" or as "Staroprusaki" (Old Prussians). During the Napoleonic Wars and the Polish national liberation struggles, in 1807, several towns of northern and eastern Masuria were taken over by Polish troops under the command of generals Jan Henryk Dąbrowski and Józef Zajączek. Some Masurians showed considerable support for the Polish uprising in 1831, and maintained many contacts with Russian-held areas of Poland beyond the border of Prussia, the areas being connected by common culture and language; before the uprising people visited each other's country fairs and much trade took place, with smuggling also widespread. Nevertheless, their Lutheran belief and a traditional adherence to the Prussian royal family kept Masurians and Poles separated. Some early writers about Masurians, like Max Toeppen, postulated that Masurians in general were mediators between German and Slav cultures.
Germanisation policies in Masuria included various strategies; first and foremost, they included attempts to propagate the German language and to eradicate the Polish (Masurian) language as much as possible. German became the obligatory language in schools from 1834 on. The Lutheran churches and their vicars, however, generally ministered in Masurian to parishioners whose mother tongue it was.
Mother tongue of the inhabitants of Masuria, by county, during the first half of the 19th century:
After the Unification of Germany into the German Empire in 1871, the last lessons that made use of the Polish language were removed from schools in 1872. Masurians who expressed sympathy for Poland were deemed "national traitors" by German public opinion, especially after 1918, when the new Polish republic laid claim to areas inhabited by Polish speakers that had until then been German. According to Stefan Berger, after 1871 the Masurians in the German Empire were seen as "objectively" Polish (in terms of culture and language) yet "subjectively" German, and thus as a group to be tightly integrated into the German nation-state; Berger concludes that such arguments of German nationalists were aimed at integrating Masurian (and Silesian) territory firmly into the German Reich.
During the period of the German Empire, Germanisation policies in Masuria became more widespread; children using Polish in playgrounds and classrooms were commonly subjected to corporal punishment, and the authorities tried to appoint Protestant pastors who would preach only in German rather than bilingually, which provoked protests from local parishioners. According to Jerzy Mazurek, the native Polish-speaking population, as in other areas with Polish inhabitants, faced discrimination against Polish-language activities from the Germanised local administration. In this climate a first resistance defending the rights of the rural population was organised, according to Mazurek usually by teachers engaged in publishing Polish-language newspapers.
Despite anti-Polish policies, such Polish language newspapers as the "Pruski Przyjaciel Ludu" (Prussian Friend of People) or the "Kalendarz Królewsko-Pruski Ewangelicki" (Royal Prussian Evangelical Calendar) or bilingual journals like the "Oletzkoer Kreisblatt - Tygodnik Obwodu Oleckiego" continued to be published in Masuria. In contrast to the Prussian-oriented periodicals, in the late 19th century such newspapers as "Przyjaciel Ludu Łecki" and "Mazur" were founded by members of the Warsaw-based "Komitet Centralny dla Śląska, Kaszub i Mazur" (Central Committee for Silesia, Kashubia and Masuria), influenced by Polish politicians like Antoni Osuchowski or Juliusz Bursche, to strengthen the Polish identity in Masuria. The "Gazeta Ludowa" (The Folk's Newspaper) was published in Lyck in 1896–1902, with 2,500 copies in 1897 and the "Mazur" in Ortelsburg after 1906 with 500 copies in 1908 and 2,000 prior to World War I.
Polish activists started to regard Masurians as "Polish brothers" after Wojciech Kętrzyński published his pamphlet "O Mazurach" in 1872, and engaged in active self-help against repression by the German state. Kętrzyński himself fought against attempts to Germanise Masuria.
The attempts to create a Masurian Polish national consciousness, largely originating from nationalist circles of Provinz Posen, however, faced the resistance of the Masurians, who, despite having folk traditions and a language similar to those of the Poles, regarded themselves as Prussians and later Germans and were loyal to the Hohenzollern dynasty and the Prussian and German state. After World War I the editor of the Polish-language "Mazur" described the Masurians as "not nationally conscious, on the contrary, the most loyal subjects of the Prussian king". However, a minority of Masurians did exist who expressed a Polish identity.
After 1871 there appeared resistance among the Masurians towards Germanisation efforts: the so-called Gromadki movement was formed, which supported the use of the Polish language and came into conflict with the German authorities. While most of its members viewed themselves as loyal to the Prussian state, a part of them joined the pro-Polish faction of Masurians. The programme of Germanisation started to unite and mobilise Polish people in the Polish-inhabited territories held by Germany, including Masuria. A Polish-oriented party, the "Mazurska Partia Ludowa" ("People's Party of Masuria"), was founded in 1897. The eastern areas of the German Empire were systematically Germanised through the changing of names and public signs, and the German state fostered cultural imperialism, in addition to giving financial and other support to German farmers, officials, and teachers to settle in the east.
The German authorities, in their Germanisation efforts, tried to claim that the Masurian language was separate from Polish by classifying it as a non-Slavic language distinct from Polish; this was reflected in the official censuses.
Thus in 1890 the Prussian census reported 143,397 inhabitants of Masuria as having German as their language (either primary or secondary), 152,186 Polish and 94,961 Masurian. In 1910, the German language was reported by the German authorities as used by 197,060, Polish by 30,121 and Masurian by 171,413. Roman Catholics generally opted for the Polish language, while Protestants appreciated Masurian. In 1925, the German authorities reported 40,869 inhabitants as having declared Masurian as their native tongue and 2,297 as Polish. However, the last figures may have resulted from the politics of the time: after the trauma of the 1920 plebiscite, much of the population desired to be counted as German, so that the province could be presented as so-called 'purely German'; in reality the Masurian dialect remained in use among bilinguals.
During the industrialisation of the late 19th century, about 10 percent of the Masurian populace emigrated to the Ruhr Area, where about 180,000 Masurians lived in 1914. Wattenscheid, Wanne and Gelsenkirchen were the centers of Masurian emigration, and Gelsenkirchen-Schalke was even called Klein-Ortelsburg ("Little Ortelsburg") before 1914. Masurian newspapers such as the "Przyjaciel Ewangeliczny" and the "Gazeta Polska dla Ludu staropruskiego w Westfalii i na Mazurach", but also the German-language "Altpreußische Zeitung", were published there.
During World War I, the Battle of Tannenberg and the First and Second Battles of the Masurian Lakes between Imperial Germany and the Russian Empire took place within the borders of Masuria in 1914. After the war, the League of Nations held the East Prussian plebiscite on 11 July 1920 to determine whether the people of the southern districts of East Prussia wanted to remain within East Prussia or to join the Second Polish Republic. Before the plebiscite, the German side terrorised the local population with violence; Polish organisations and activists were harassed by German militias, whose actions included attacks and alleged murders of Polish activists. Masurs who supported voting for Poland were singled out and subjected to terror and repression.
The names of Masurs supporting the Polish side were published in German newspapers and their photos displayed in German shops; German militias then organised regular hunts for them, terrorising the Polish-minded population. At least 3,000 Warmian and Masurian activists who had campaigned for the Polish side decided to flee the region. At the same time, local police officials engaged in active surveillance of the Polish minority and attacks against Polish activists. Before the plebiscite, Poles had begun to flee the region to escape German harassment and Germanisation policies.
The results determined that 99.32% of the voters in Masuria proper chose to remain with East Prussia. Notwithstanding nationalist German agitation and intimidation, these results reflect that the majority of Masurians had adopted a German national identity alongside a regional one. Their traditional adherence to Lutheranism kept them away from a Polish national consciousness dominated by Roman Catholicism. In fact, almost only Catholics voted for Poland in the plebiscite. They formed a majority in the villages around the capital, Allenstein, the same area where Polish cultural activism took hold between 1919 and 1932. However, the contemporary Polish ethnographer Adam Chętnik accused the German authorities of abuses and falsifications during the plebiscite. Moreover, the plebiscite took place at a time when the Polish–Soviet War threatened to erase the Polish state; as a result, even many Poles of the region voted for Germany out of fear that if the area were allocated to Poland it would fall under Soviet rule. After the plebiscite, German mobs in German-held areas of Masuria attacked the Polish population, and Polish priests and politicians were driven from their homes. At least 10,000 Poles had to flee German-held Masuria to Poland after the plebiscite.
The region of Działdowo (Soldau), where according to the official German census of 1910 ethnic Germans formed a minority of 37.3%, was excluded from the plebiscite and became part of Poland. The justification was to place the railway connection between Warsaw and Danzig (Gdańsk), of vital importance to Poland as it connected central Poland with its recently obtained seacoast, completely under Polish sovereignty. Działdowo itself had about 24,000 inhabitants, of whom 18,000 were Masurians.
According to the municipal administration of Rybno, after World War I Poles in Działdowo believed that they would quickly be joined with Poland, and they organised secret gatherings at which the issue of rejoining the Polish state with the help of the Polish military was discussed. According to the Rybno administration, the most active Poles in that subregion included the Jóżwiakowscy, Wojnowscy and Grzeszczowscy families, working under the guidance of the politician Leon Wojnowski, who protested against German attempts to keep Działdowo a part of Germany after the war; other local pro-Polish activists were Alfred Wellenger, Paczyński, Tadeusz Bogdański and Jóźwiakowski.
The historian Andreas Kossert writes that the incorporation happened despite protests by the local populace, the municipal authorities and the German government. According to Kossert, 6,000 inhabitants of the region soon left the area.
In 1920 the candidate of the German Party in Poland, Ernst Barczewski, was elected to the Sejm with 74.6 percent of the vote, and in 1928 to the Polish Senate with 34.6% of the vote for the Bloc of National Minorities. During the Polish–Soviet War, Działdowo was briefly occupied by the Red Army, which the local German population regarded as a liberator from Polish authority and greeted by hoisting the German flag, but the town was soon recovered by the Polish Army.
During the interwar period, many native inhabitants of the Działdowo subregion left and migrated to Germany.
With the start of the German war against Poland on 1 September 1939, the German minority in the parts of Masuria attached to Poland after World War I, such as Działdowo, as well as in large parts of former West Prussia, organised itself into paramilitary formations called Selbstschutz (self-defence) and began to engage in massacres of the local Polish population; Poles were imprisoned, tortured and murdered, while Masurians were sometimes forcibly placed on the Volksliste.
From then on, conscripted Masurians had to serve without exception in the German army that invaded Poland, and Russia two years later.
The Soldau concentration camp near Działdowo was established in the winter of 1939; 13,000 people were murdered there by the Nazi German state during the war. Notable victims included the Polish bishops Antoni Julian Nowowiejski and Leon Wetmański, as well as the nun Mieczysława Kowalska. Additionally, almost 1,900 mentally ill patients from East Prussia and the annexed areas of Poland were murdered there in what was known as Action T4.
Polish resistance in Masuria was organised by Paweł Nowakowski ("Leśnik"), commander of the Home Army's Działdowo district.
Masuria was the only region of Germany directly affected by the battles of World War I. Damaged towns and villages were reconstructed with the aid of several twin towns: Cologne aided Neidenburg, Frankfurt aided Lötzen, and even Vienna aided Ortelsburg. The resulting architecture is still surprisingly distinctive, of a modern Central European character. However, Masuria remained largely agrarian and suffered from the economic decline after World War I, additionally badly affected by the creation of the Polish Corridor, which raised freight costs to the traditional markets in Germany.
The later implemented Osthilfe had only a minor influence on Masuria as it privileged larger estates, while Masurian farms were generally small.
The interwar period was characterised by ongoing Germanisation policies, intensified especially under the Nazis.
In the 1920s Masuria remained a heartland of conservatism, with the German National People's Party as the strongest party. The Nazi Party, having absorbed the conservative vote, became the strongest party in the Masurian constituencies as early as the elections of 1930, and received its best results in the poorest areas of Masuria with the highest proportion of Polish speakers. In the elections of 1932 and 1933 especially, it reached up to 81 percent of the vote in the district of Neidenburg and 80 percent in the district of Lyck. The Nazis exploited the economic crisis, which had significant effects in far-off Masuria, as well as traditional anti-Polish sentiments, while at the same time holding political rallies in the Masurian dialect during the campaigning.
In 1938, the Nazi government (1933–1945) changed thousands of the still-existing toponyms (especially names of cities and villages) of Old Prussian, Lithuanian and Polish origin to newly created German names; six thousand names, about 50% of those existing, were changed, but the countryside population stuck to their traditional names. Within six years a new renaming would take place, after Poland annexed Masuria in 1945.
According to German author Andreas Kossert, the Polish parties were financed and aided by the Polish government in Warsaw and remained splinter groups without any political influence; in the 1932 elections, for example, the Polish Party received 147 votes in Masuria proper. According to Wojciech Wrzesiński (1963), the Polish organisations in Masuria had decided to lower their activity in order to escape acts of terror carried out against Polish minority activists and organisations by Nazi activists. Jerzy Lanc, a teacher and Polish national who had moved to Masuria in 1931 to establish a Polish school in Piassutten (Piasutno), died in his home of carbon monoxide poisoning, most likely murdered by local German nationalists.
Before the war the Nazi German state sent undercover operatives to spy on Polish organisations and drew up lists of people to be executed or sent to concentration camps. This took place mainly in Silesia; in Masuria it concerned only the few Catholic schools. Information was gathered on who sent children to Polish schools, bought the Polish press or took part in Polish ceremonies, and repressions against these people were carried out by Nazi militias. Polish schools, printing presses and the headquarters of Polish institutions were attacked, as were the homes of the most active Poles; shops owned by Poles were vandalised or demolished. Polish masses were dispersed, and Polish teachers were intimidated as members of the SS gathered outside their premises singing songs like "Wenn das Polenblut vom Messer spritzt, dann geht's noch mal so gut" ("When Polish blood spurts from the knife, then things go twice as well").
Anti-Polish activities intensified in 1939. Poles who were most active in politics were evicted from their homes, while Polish newspapers and cultural houses in the region were closed down. In June and July, masses in Polish were banned in Warmia and Masuria.
In late August 1939 all remaining political and cultural life of the Polish minority was eradicated by the Nazis through the imprisonment of Polish activists and the liquidation of Polish institutions. Seweryn Pieniężny, the chief editor of "Gazeta Olsztyńska", who opposed the Germanisation of Masuria, was interned, as were Juliusz Malewski (director of the Bank Ludowy in Olsztyn), Stefan Różycki and Leon Włodarczyk (an activist of Polonia Warmińsko-Mazurska).
Directors of Polish schools and teachers were imprisoned, as was the staff of Polish pre-schools in the Masuria region. They were often forced to destroy Polish signs, emblems and symbols of Polish institutions.
The Nazis believed that in the future the Masurians, as a separate non-German entity, would 'naturally' disappear, while those who, as one Nazi report put it, clung to their "foreignness" would be deported. Local Jews were considered by the Nazis to be subhuman and were slated for extermination. The Nazi authorities also executed Polish activists in Masuria; those who survived were sent to concentration camps.
In August 1943 the Uderzeniowe Bataliony Kadrowe attacked the village of Mittenheide (Turośl) in southern Masuria.
In 1943 the "Związek Mazurski" was secretly reactivated by Masurian activists of the Polish Underground State in Warsaw, led by Karol Małłek. Związek Mazurski opposed Nazi Germany and asked the Polish authorities during the war to liquidate the German large estates after the victory over Nazi Germany, in order to aid agricultural reform and the settlement of the Masurian population. Masurian iconoclasts opposed to Nazi Germany requested the removal of German heritage sites "regardless of their cultural value". Additionally, a Masurian Institute was founded by Masurian activists in Radość near Warsaw in 1943.
In the final stages of World War II, Masuria was partially devastated by the retreating German and advancing Soviet armies during the Vistula-Oder Offensive. The region came under Polish rule at the war's end under the terms of the Potsdam Conference. Most of the population fled to Germany or was killed during or after the war, while those who stayed were subjected to a "nationality verification" organised by the communist government of Poland. As a result, the number of native Masurians remaining in Masuria was initially relatively high, but most of them were subsequently expelled. Poles from central Poland and from the Polish areas annexed by the Soviet Union, as well as Ukrainians expelled from southern Poland during Operation Vistula, were resettled in Masuria.
According to the Masurian Institute, the Masurian members of the resistance against Nazi Germany who survived the war became active in the region in 1945, working in Olsztyn in cooperation with the new state authorities in administration, education and cultural affairs. Historic Polish names were restored for most Masurian towns, but for some places new names were imposed even where historic Polish names existed.
German author Andreas Kossert describes the post-war process of "national verification" as based on an ethnic racism which categorised the local populace according to their alleged ethnic background. A Polish-sounding last name or a Polish-speaking ancestor was sufficient to be regarded as "autochthonous" Polish.
In October 1946, 37,736 persons had been "verified" as Polish citizens while 30,804 remained "unverified". A centre of such "unverified" Masurians was the district of Mrągowo, where in early 1946, out of 28,280 persons, 20,580 were "unverified"; in October, 16,385 still refused to adopt Polish citizenship. However, even those who yielded to the frequent pressure from the Polish authorities were in practice treated as Germans because of their Lutheran faith and their often rudimentary knowledge of Polish. Names were "Polonised" and the use of the German language in public was forbidden. In the late 1940s the pressure to sign the "verification documents" grew, and in February 1949 Mieczysław Moczar, the former chief of the Stalinist secret police (UB) in Łódź, started the "Great Verification" campaign. Many unverified Masurians were imprisoned and accused of pro-Nazi or pro-American propaganda; even former pro-Polish activists and inmates of Nazi concentration camps were jailed and tortured. After the end of this campaign, only 166 Masurians in the district of Mrągowo (Sensburg) were still "unverified".
In 1950, 1,600 Masurians left the country, and in 1951, 35,000 people from Masuria and Warmia managed to obtain a declaration of their German nationality from the embassies of the United States and Great Britain in Warsaw. Sixty-three percent of the Masurians in the district of Mrągowo (Sensburg) received such a document. In December 1956, Masurian pro-Polish activists signed a memorandum to the Communist Party leadership:
"The history of the people of Warmia and Masuria is full of tragedy and suffering. Injustice, hardship and pain often pressed on the shoulders of Warmians and Masurians... Dislike, injustice and violence surrounds us...They (Warmians and Masurians) demand respect for their differentness, grown in the course of seven centuries and for freedom to maintain their traditions".
Soon after the political reforms of 1956, Masurians were given the opportunity to join their families in West Germany, and the majority (over 100,000) gradually left. After the improvement of German-Polish relations brought by the German Ostpolitik of the 1970s, a further 55,227 persons from Warmia and Masuria moved to West Germany between 1971 and 1988. Today approximately 5,000 to 6,000 Masurians still live in the area, about half of them members of the German minority in Poland, the other half ethnic Poles. As the Polish journalist Andrzej K. Wróblewski put it, the Polish post-war policy achieved what the Prussian state never managed: the creation of a German national consciousness among the Masurians.
Most of the originally Protestant churches in Masuria are now used by the Polish Roman Catholic Church, as the number of Lutherans in Masuria declined from 68,500 in 1950 to 21,174 in 1961 and further to 3,536 in 1981. In some cases, as on 23 September 1979 in the village of Spychowo (Puppen), a Lutheran parish was even forcibly driven out of its church while the liturgy was being held.
In modern Masuria the native population has virtually disappeared. Masuria was incorporated into the voivodeship system of administration in 1945. In 1999 Masuria was constituted with neighbouring Warmia as a single administrative province through the creation of the Warmian-Masurian Voivodeship.
Today, numerous summer music festivals take place in Masuria, including the largest reggae festival in Poland in Ostróda, the largest country music festival in Poland in Mrągowo, and one of Poland's largest hip hop music festivals in Giżycko and Ełk.
The Masurian Szczytno-Szymany International Airport gained international attention as press reports alleged the airport to be a so-called "black site" involved in the CIA's network of extraordinary renditions.
Masuria and the Masurian Lake District are known in Polish as "Kraina Tysiąca Jezior" and in German as "Land der Tausend Seen", meaning "land of a thousand lakes". These lakes were ground out of the land by glaciers during the Pleistocene ice age around 14,000–15,000 years ago, when ice covered northeastern Europe. A reindeer antler found in the vicinity of Giżycko dates from that period. By 10,000 BC the ice had started to melt. Great geological changes took place, and even in the last 500 years the maps of the lagoons and peninsulas on the Baltic Sea have greatly altered in appearance. This continuous stretch of lakes draws more tourists than other parts of northern Poland, such as Pomerania (between the Oder and the Vistula). The terrain is rather hilly, with connecting lakes, rivers and streams. Forests account for about 30% of the area. The northern part of Masuria is covered mostly by broadleaved forest, while the southern part is dominated by pine and mixed forests.
The two largest lakes in Poland, Śniardwy and Mamry, are located in Masuria. | https://en.wikipedia.org/wiki?curid=20493 |
Médecins Sans Frontières
MSF's principles and operational guidelines are highlighted in its Charter, the Chantilly Principles, and the later La Mancha Agreement. Governance is addressed in Section 2 of the Rules portion of this final document. MSF has an associative structure, where operational decisions are made, largely independently, by the five operational centres (Amsterdam, Barcelona-Athens, Brussels, Geneva and Paris). Common policies on core issues are coordinated by the International Council, in which each of the 24 sections (national offices) is represented. The International Council meets in Geneva, Switzerland, where the International Office, which coordinates international activities common to the operational centres, is also based.
MSF has general consultative status with the United Nations Economic and Social Council. It received the 1999 Nobel Peace Prize in recognition of its members' continued efforts to provide medical care in acute crises, as well as raising international awareness of potential humanitarian disasters. James Orbinski, who was the president of the organization at the time, accepted the prize on behalf of MSF. Prior to this, MSF also received the 1996 Seoul Peace Prize. Christos Christou succeeded Joanne Liu as international president in June 2019.
During the Nigerian Civil War of 1967 to 1970, the Nigerian military formed a blockade around the nation's newly independent south-eastern region, Biafra. At this time, France was one of the few major countries supportive of the Biafrans (the United Kingdom, the Soviet Union and the United States sided with the Nigerian government), and the conditions within the blockade were unknown to the world. A number of French doctors volunteered with the French Red Cross to work in hospitals and feeding centres in besieged Biafra. One of them, and later a co-founder of the organisation, was Bernard Kouchner, who went on to become a high-ranking French politician.
After entering the country, the volunteers, in addition to Biafran health workers and hospitals, were subjected to attacks by the Nigerian army, and witnessed civilians being murdered and starved by the blockading forces. The doctors publicly criticised the Nigerian government and the Red Cross for their seemingly complicit behaviour. These doctors concluded that a new aid organisation was needed that would ignore political/religious boundaries and prioritise the welfare of victims.
The "Groupe d'intervention médicale et chirurgicale en urgence" ("Emergency Medical and Surgical Intervention Group") was formed in 1971 by French doctors who had worked in Biafra, to provide aid and to emphasize the importance of victims' rights over neutrality. At the same time, Raymond Borel, the editor of the French medical journal "TONUS", had started a group called "Secours Médical Français" ("French Medical Relief") in response to the 1970 Bhola cyclone, which killed at least 625,000 in East Pakistan (now Bangladesh). Borel had intended to recruit doctors to provide aid to victims of natural disasters. On 22 December 1971, the two groups of colleagues merged to form "Médecins Sans Frontières".
MSF's first mission was to the Nicaraguan capital, Managua, where a 1972 earthquake had destroyed most of the city and killed between 10,000 and 30,000 people. The organization, today known for its quick response in an emergency, arrived three days after the Red Cross had set up a relief mission. On 18 and 19 September 1974, Hurricane Fifi caused major flooding in Honduras and killed thousands of people (estimates vary), and MSF set up its first long-term medical relief mission.
Between 1975 and 1979, after South Vietnam had fallen to North Vietnam, millions of Cambodians fled to Thailand to escape the Khmer Rouge. In response MSF set up its first refugee camp missions in Thailand. When Vietnam withdrew from Cambodia in 1989, MSF started long-term relief missions to help survivors of the mass killings and to reconstruct the country's health care system. Although its missions to Thailand to help victims of war in Southeast Asia could arguably be seen as its first war-time missions, MSF undertook its first mission to a true war zone, including exposure to hostile fire, in 1976. MSF spent nine years (1976–1984) assisting surgeries in the hospitals of various cities in Lebanon during the Lebanese Civil War, and established a reputation for its neutrality and willingness to work under fire. Throughout the war, MSF helped both Christian and Muslim soldiers alike, aiding whichever group required the most medical attention at the time. In 1984, as the situation in Lebanon deteriorated further and security for aid groups diminished, MSF withdrew its volunteers.
Claude Malhuret was elected as the new president of Médecins Sans Frontières in 1977, and soon after, debates began over the future of the organisation. In particular, the concept of "témoignage" ("witnessing"), which refers to speaking out about the suffering one sees rather than remaining silent, was opposed or played down by Malhuret and his supporters. Malhuret thought MSF should avoid criticising the governments of countries in which it was working, while Kouchner believed that documenting and broadcasting the suffering in a country was the most effective way to solve a problem.
In 1979, after four years of refugee movement from South Vietnam and the surrounding countries by foot and by boat, French intellectuals made an appeal in "Le Monde" for "A Boat for Vietnam", a project intended to provide medical aid to the refugees. Although the project did not receive support from the majority of MSF, some members, including the later minister Bernard Kouchner, chartered a ship called "L'Île de Lumière" ("The Island of Light") and, along with doctors, journalists and photographers, sailed to the South China Sea and provided some medical aid to the boat people. The splinter organisation that undertook this, Médecins du Monde, later developed the idea of humanitarian intervention as a duty, in particular on the part of Western nations such as France. In 2007 MSF clarified that for nearly 30 years MSF and Kouchner had had public disagreements on such issues as the right to intervene and the use of armed force for humanitarian reasons. Kouchner is in favour of the latter, whereas MSF stands for impartial humanitarian action, independent of all political, economic and religious powers.
In 1982, Malhuret and Rony Brauman (who became the organisation's president that year) brought increased financial independence to MSF by introducing fundraising-by-mail to better collect donations. The 1980s also saw the establishment of the other operational sections alongside MSF-France (1971): MSF-Belgium (1980), MSF-Switzerland (1981), MSF-Holland (1984), and MSF-Spain (1986). MSF-Luxembourg was the first support section, created in 1986. The early 1990s saw the establishment of the majority of the support sections: MSF-Greece (1990), MSF-USA (1990), MSF-Canada (1991), MSF-Japan (1992), MSF-UK (1993), MSF-Italy (1993), MSF-Australia (1994), as well as Germany, Austria, Denmark, Sweden, Norway, and Hong Kong (MSF-UAE was formed later). Malhuret and Brauman were instrumental in professionalising MSF. In December 1979, after the Soviet army had invaded Afghanistan, field missions were immediately set up to provide medical aid to the mujahideen, and in February 1980, MSF publicly denounced the Khmer Rouge. During the 1983–1985 famine in Ethiopia, MSF set up nutrition programmes in the country in 1984, but was expelled in 1985 after denouncing the abuse of international aid and the forced resettlements. MSF's explicit attacks on the Ethiopian government led other NGOs to criticise its abandonment of its supposed neutrality, and contributed to a series of debates in France around humanitarian ethics. The group also set up equipment to produce clean drinking water for the population of San Salvador, capital of El Salvador, after the 10 October 1986 earthquake that struck the city. In 2014, the European Speedster Assembly contributed $717,000 to MSF.
Since 1979, MSF has been providing medical humanitarian assistance in Sudan, a nation plagued by civil war, starvation, prevalent malnutrition and one of the highest maternal mortality rates in the world. As of March 2009, MSF reportedly employed 4,590 field staff in Sudan, tackling issues such as armed conflict, epidemic disease, health care and social exclusion. MSF's continued presence and work in Sudan is one of the organisation's largest interventions. MSF provides a range of health care services including nutritional support, reproductive healthcare, kala-azar treatment, counselling services and surgery to the people living in Sudan. Common diseases prevalent in Sudan include tuberculosis, kala-azar (also known as visceral leishmaniasis), meningitis, measles, cholera, and malaria.
Kala-azar, also known as visceral leishmaniasis, has been one of the major health problems in Sudan. After the Comprehensive Peace Agreement between northern and southern Sudan on 9 January 2005, the increase in stability within the region helped further efforts in healthcare delivery. In 2008, Médecins Sans Frontières tested a combination of sodium stibogluconate and paromomycin, which reduced treatment duration (from 30 to 17 days) and cost. In March 2010, MSF set up its first kala-azar treatment centre in eastern Sudan, providing free treatment for this otherwise deadly disease. If left untreated, the fatality rate is 99% within 1–4 months of infection. Since the treatment centre was set up, MSF has cured more than 27,000 kala-azar patients with a success rate of approximately 90–95%. There are plans to open an additional kala-azar treatment centre in Malakal, Southern Sudan, to cope with the overwhelming number of patients seeking treatment. MSF has been providing necessary medical supplies to hospitals and training Sudanese health professionals to help them deal with kala-azar. MSF, the Sudanese Ministry of Health and other national and international institutions are combining efforts to improve the treatment and diagnosis of kala-azar. Research on cures and vaccines is currently being conducted. In December 2010, South Sudan was hit with the worst outbreak of kala-azar in eight years; the number of patients seeking treatment increased eight-fold compared to the year before.
Sudan's latest civil war began in 1983 and ended in 2005 when a peace agreement was signed between North Sudan and South Sudan. MSF medical teams were active before and throughout the civil war, providing emergency medical humanitarian assistance in multiple locations. The South's poor infrastructure was aggravated by the civil war, worsening the region's appalling health indicators. An estimated 75 percent of people in the nascent nation have no access to basic medical care, and one in seven women dies during childbirth. Malnutrition and disease outbreaks are perennial concerns as well. In 2011, an MSF clinic in Jonglei State, South Sudan, was looted and attacked by raiders. Hundreds of people, including women and children, were killed. Valuable items including medical equipment and drugs were lost during the raid, and parts of the MSF facilities were destroyed in a fire. The incident had serious repercussions, as MSF is the only primary health care provider in this part of Jonglei State.
The early 1990s saw MSF open a number of new national sections, and at the same time, set up field missions in some of the most dangerous and distressing situations it had ever encountered.
In 1990, MSF first entered Liberia to help civilians and refugees affected by the Liberian Civil War. Constant fighting throughout the 1990s and the Second Liberian Civil War have kept MSF volunteers actively providing nutrition, basic health care, and mass vaccinations, and speaking out against attacks on hospitals and feeding stations, especially in Monrovia.
Field missions were set up to provide relief to Kurdish refugees who had survived the al-Anfal Campaign, for which evidence of atrocities was being collected in 1991. That year also saw the beginning of the civil war in Somalia, during which MSF set up field missions in 1992 alongside a UN peacekeeping mission. Although the UN aborted its operations by 1993, MSF representatives continued their relief work, running clinics and hospitals for civilians.
MSF first began work in Srebrenica (in Bosnia and Herzegovina) as part of a UN convoy in 1993, one year after the Bosnian War had begun. The city had become surrounded by the Bosnian Serb Army and, containing about 60,000 Bosniaks, had become an enclave guarded by a United Nations Protection Force. MSF was the only organisation providing medical care to the surrounded civilians, and as such, did not denounce the genocide for fear of being expelled from the country (it did, however, denounce the lack of access for other organisations). MSF was forced to leave the area in 1995 when the Bosnian Serb Army captured the town. 40,000 Bosniak civilian inhabitants were deported, and approximately 7,000 were killed in mass executions.
When the genocide in Rwanda began in April 1994, some delegates of MSF working in the country were incorporated into the International Committee of the Red Cross (ICRC) medical team for protection. Both groups succeeded in keeping all main hospitals in Rwanda's capital Kigali operational throughout the main period of the genocide. MSF, together with several other aid organisations, had to leave the country in 1995, although many MSF and ICRC volunteers worked together under the ICRC's rules of engagement, which held that neutrality was of the utmost importance. These events led to a debate within the organisation about the concept of balancing neutrality of humanitarian aid workers against their witnessing role. As a result of its Rwanda mission, the position of MSF with respect to neutrality moved closer to that of the ICRC, a remarkable development in the light of the origin of the organisation.
The ICRC lost 56 and MSF almost one hundred of their respective local staff in Rwanda, and MSF-France, which had chosen to evacuate its team from the country (the local staff were forced to stay), denounced the murders and demanded that a French military intervention stop the genocide. MSF-France introduced the slogan "One cannot stop a genocide with doctors" to the media, and the controversial Opération Turquoise followed less than one month later. This intervention directly or indirectly resulted in the movement of hundreds of thousands of Rwandan refugees to Zaire and Tanzania in what became known as the Great Lakes refugee crisis, with subsequent cholera epidemics, starvation and more mass killings among the large groups of civilians. MSF-France returned to the area and provided medical aid to refugees in Goma.
At the time of the genocide, competition between the medical efforts of MSF, the ICRC, and other aid groups had reached an all-time high, but the conditions in Rwanda prompted a drastic change in the way humanitarian organisations approached aid missions. The "Code of Conduct for the International Red Cross and Red Crescent Movement and NGOs in Disaster Relief Programmes" was created by the ICRC in 1994 to provide a framework for humanitarian missions and MSF is a signatory of this code. The code advocates the provision of humanitarian aid only, and groups are urged not to serve any political or religious interest, or be used as a tool for foreign governments. MSF has since still found it necessary to condemn the actions of governments, such as in Chechnya in 1999, but has not demanded another military intervention since then.
In the late 1990s, MSF missions were set up to treat tuberculosis and anaemia in residents of the Aral Sea area, and look after civilians affected by drug-resistant disease, famine, and epidemics of cholera and AIDS. They vaccinated 3 million Nigerians against meningitis during an epidemic in 1996 and denounced the Taliban's neglect of health care for women in 1997. Arguably, the most significant country in which MSF set up field missions in the late 1990s was Sierra Leone, which was involved in a civil war at the time. In 1998, volunteers began assisting in surgeries in Freetown to help with an increasing number of amputees, and collecting statistics on civilians (men, women and children) being attacked by large groups of men claiming to represent ECOMOG. The groups of men were travelling between villages and systematically chopping off one or both of each resident's arms, raping women, gunning down families, razing houses, and forcing survivors to leave the area. Long-term projects following the end of the civil war included psychological support and phantom limb pain management.
The Campaign for Access to Essential Medicines was created in late 1999, providing MSF with a new voice with which to bring awareness to the lack of effective treatments and vaccines available in developing countries. In 1999, the organisation also spoke out about the lack of humanitarian support in Kosovo and Chechnya, having set up field missions to help civilians affected by the respective political situations. Although MSF had worked in the Kosovo region since 1993, the onset of the Kosovo War prompted the movement of tens of thousands of refugees, and a decline in suitable living conditions. MSF provided shelter, water and health care to civilians affected by NATO's strategic bombing campaigns.
A serious crisis within MSF erupted in connection with the organisation's work in Kosovo when the Greek section of MSF was expelled from the organisation. The Greek MSF section had gained access to Serbia at the cost of accepting Serb-government-imposed limits on where it could go and what it could see, terms that the rest of the MSF movement had refused. A non-MSF source alleged that the exclusion of the Greek section happened because its members extended aid to both Albanian and Serbian civilians in Pristina during NATO's bombing.
The rift was healed only in 2005 with the re-admission of the Greek section to MSF.
A similar situation was found in Chechnya, whose civilian population was largely forced from their homes into unhealthy conditions and subjected to the violence of the Second Chechen War.
MSF has been working in Haiti since 1991, but since President Jean-Bertrand Aristide was forced from power, the country has seen a large increase in civilian attacks and rape by armed groups. In addition to providing surgical and psychological support in existing hospitals, including the only free surgery available in Port-au-Prince, field missions have been set up to rebuild water and waste management systems and to treat survivors of major flooding caused by Hurricane Jeanne; patients with HIV/AIDS and malaria, both widespread in the country, also receive better treatment and monitoring. As a result of the 12 January 2010 Haiti earthquake, reports from Haiti indicated that all three of the organisation's hospitals had been severely damaged: one collapsed completely and the other two had to be abandoned. Following the quake, MSF sent about nine planes loaded with medical equipment and a field hospital to help treat the victims. However, the landings of some of the planes had to be delayed due to the massive number of humanitarian and military flights coming in.
The Kashmir Conflict in northern India resulted in a more recent MSF intervention (the first field mission was set up in 1999) to help civilians displaced by fighting in Jammu and Kashmir, as well as in Manipur. Psychological support is a major target of missions, but teams have also set up programmes to treat tuberculosis, HIV/AIDS and malaria. Mental health support has been of significant importance for MSF in much of southern Asia since the 2004 Indian Ocean earthquake.
MSF went through a long process of self-examination and discussion in 2005–2006. Many issues were debated, including the treatment of "nationals", "fair employment" and self-criticism.
MSF has been active in a large number of African countries for decades, sometimes serving as the sole provider of health care, food, and water. Although MSF has consistently attempted to increase media coverage of the situation in Africa to build international support, long-term field missions are still necessary. Treating and educating the public about HIV/AIDS in sub-Saharan Africa, which sees the most deaths and cases of the disease in the world, is a major task for volunteers. The WHO estimated that, of the 14.6 million people in developing countries in need of anti-retroviral treatment, only 5.25 million were receiving it, and MSF continues to urge governments and companies to increase research and development into HIV/AIDS treatments to decrease cost and increase availability. "(See AIDS in Africa for more information)"
Although active in the Congo region of Africa since 1985, the First and Second Congo Wars brought increased violence and instability to the area. MSF has had to evacuate its teams from areas such as Bunia, in the Ituri district, due to extreme violence, but continues to work in other areas to provide food to tens of thousands of displaced civilians, as well as to treat survivors of mass rapes and widespread fighting. The treatment of, and where possible vaccination against, diseases such as cholera, measles, polio, Marburg fever, sleeping sickness, HIV/AIDS, and bubonic plague is also important to prevent or slow down epidemics.
MSF has been active in Uganda since 1980, and provided relief to civilians during the country's guerrilla war. However, the formation of the Lord's Resistance Army saw the beginning of a long campaign of violence in northern Uganda and southern Sudan. Civilians were subjected to mass killings and rapes, torture, and abductions of children, who would later serve as sex slaves or child soldiers. Faced with more than 1.5 million people displaced from their homes, MSF set up relief programmes in internally displaced person (IDP) camps to provide clean water, food and sanitation. Diseases such as tuberculosis, measles, polio, cholera, Ebola, and HIV/AIDS occur in epidemics in the country, and volunteers provide vaccinations (in the cases of measles and polio) and/or treatment to the residents. Mental health is also an important aspect of medical treatment for MSF teams in Uganda, since most people refuse to leave the IDP camps for constant fear of being attacked.
MSF first set up a field mission in Côte d'Ivoire in 1990, but ongoing violence and the 2002 division of the country between rebel groups and the government led to several massacres, and MSF teams have even begun to suspect that an ethnic cleansing is occurring. Mass measles vaccinations, tuberculosis treatment and the re-opening of hospitals closed by fighting are projects run by MSF, which is the only group providing aid in much of the country.
MSF has strongly promoted the use of contraception in Africa.
During the Ebola outbreak in West Africa in 2014, MSF met serious medical demands largely on its own, after the organisation's early warnings were largely ignored.
In 2014 MSF partnered with satellite operator SES, other NGOs Archemed, Fondation Follereau, Friendship Luxembourg and German Doctors, and the Luxembourg government in the pilot phase of SATMED, a project to use satellite broadband technology to bring eHealth and telemedicine to isolated areas of developing countries. SATMED was first deployed in Sierra Leone in support of the fight against Ebola.
MSF-Burundi has aided in attending to casualties suffered in the 2019 Burundi landslides.
MSF first provided medical help in 1979 to civilians and refugees who had escaped to camps along the Thai-Cambodian border. After long decades of war, a proper health care system in the country was severely lacking, and MSF moved inland in 1989 to help restructure basic medical facilities.
In 1999, Cambodia was hit with a malaria epidemic. The epidemic was aggravated by a lack of qualified practitioners and poor quality control, which led to a market in counterfeit antimalarial drugs; the fake drugs were responsible for the deaths of at least 30 people during the epidemic. This prompted MSF to set up and fund a malaria outreach project and to utilise Village Malaria Workers. MSF also switched first-line treatment to a combination therapy (artesunate and mefloquine) to combat the resistance and fatality associated with the older drugs traditionally used to treat the disease.
Cambodia is one of the Southeast Asian countries hardest hit by HIV/AIDS. In 2001, MSF started introducing antiretroviral (ARV) therapy to AIDS patients for free. This therapy prolongs the patients' lives and is a long-term treatment. In 2002, MSF established chronic diseases clinics with the Cambodian Ministry of Health in various provinces to integrate HIV/AIDS treatment alongside hypertension, diabetes, and arthritis, which have high prevalence rates. This aims to reduce facility-related stigma, as patients are able to seek treatment in a multi-purpose clinic rather than an HIV/AIDS-specialised treatment centre.
MSF also provided humanitarian aid in times of natural disaster, such as a major flood in 2002 which affected up to 1.47 million people. MSF introduced a community-based tuberculosis programme in 2004 in remote villages, where village volunteers are delegated to facilitate the medication of patients. In partnership with local health authorities and other NGOs, MSF encouraged decentralized clinics and extended localized treatment to more rural areas from 2006. Since 2007, MSF has extended general health care, counselling, HIV/AIDS and TB treatment to prisons in Phnom Penh via mobile clinics. However, poor sanitation and lack of health care still prevail in most Cambodian prisons, which remain among the world's most crowded.
In 2007, MSF worked with the Cambodian Ministry of Health to provide psychosocial and technical support in offering pediatric HIV/AIDS treatment to affected children. MSF also provided medical supplies and staff during one of the country's worst dengue outbreaks in 2007, which saw more than 40,000 people hospitalized and killed 407, primarily children.
In 2010, the southern and eastern provinces of Cambodia were hit with a cholera epidemic, and MSF responded by providing medical support adapted for use in the country.
Cambodia is one of 22 countries listed by WHO as having a high burden of tuberculosis. WHO estimates that 64% of all Cambodians carry the tuberculosis mycobacterium. Hence, MSF has since shifted its focus away from HIV/AIDS to tuberculosis, handing over most HIV-related programs to local health authorities.
The 2011 Libyan civil war prompted MSF to set up a hospital and mental health services to help locals affected by the conflict. The fighting created a backlog of patients needing surgery. With parts of the country slowly returning to livable conditions, MSF has started working with local health personnel to address the needs. The need for psychological counseling has increased, and MSF has set up mental health services to address the fears and stress of people living in tents without water and electricity. MSF is currently the only international aid organisation with an actual presence in the country.
MSF provides maritime search and rescue (SAR) services on the Mediterranean Sea to save the lives of migrants attempting the crossing in unseaworthy boats. The mission started in 2015, after the ending of the major SAR operation Mare Nostrum severely diminished much-needed SAR capacities in the Mediterranean. Throughout the mission MSF has operated its own vessels, such as the Bourbon Argos (2015–2016), Dignity I (2015–2016) and Prudence (2016–2017). MSF has also provided medical teams to support other NGOs and their ships, such as the MOAS Phoenix (2015) and the Aquarius with SOS Méditerranée (2017–2018). In August 2017 MSF decided to suspend the activities of the Prudence in protest at restrictions and threats by the Libyan "Coast Guard".
In December 2018, MSF and SOS Méditerranée were forced to end operations of the Aquarius, the last remaining vessel supported by MSF, after EU states stripped the vessel of its registration and brought criminal accusations against MSF. By then, some 80,000 people had been rescued or assisted since the beginning of the mission.
MSF is involved in Sri Lanka, where a 26-year civil war ended in 2009, and has adapted its activities there to continue its mission. For example, it helps with physical therapy for patients with spinal cord injuries, conducts counseling sessions, and has set up an “operating theatre for reconstructive orthopaedic surgery and supplied specialist surgeons, anaesthetists and nurses to operate on patients with complicated war-related injuries.”
MSF is involved in trying to help with the humanitarian crisis caused by the Yemeni Civil War. The organisation operates eleven hospitals and health centres in Yemen and provides support to another 18 hospitals or health centres. According to MSF, since October 2015, four of its hospitals and one ambulance have been destroyed by Saudi-led coalition airstrikes. In August 2016, an airstrike on Abs hospital killed 19 people, including one MSF staff member, and wounded 24. According to MSF, the GPS coordinates of the hospital were repeatedly shared with all parties to the conflict, including the Saudi-led coalition, and its location was well-known.
Before a field mission is established in a country, an MSF team visits the area to determine the nature of the humanitarian emergency, the level of safety in the area and what type of aid is needed (this is called an "exploratory mission").
Medical aid is the main objective of most missions, although some missions help in such areas as water purification and nutrition.
A field mission team usually consists of a small number of coordinators who head each component of the mission, and a "head of mission". The head of mission is usually the member of the team with the most experience in humanitarian situations, and it is his or her job to deal with the media, national governments and other humanitarian organizations. The head of mission does not necessarily have a medical background.
Medical volunteers include physicians, surgeons, nurses, and various other specialists. In addition to operating the medical and nutrition components of the field mission, these volunteers are sometimes in charge of a group of local medical staff and provide training for them.
Although the medical volunteers almost always receive the most media attention when the world becomes aware of an MSF field mission, there are a number of non-medical volunteers who help keep the field mission functioning. Logisticians are responsible for providing everything that the medical component of a mission needs, ranging from security and vehicle maintenance to food and electricity supplies. They may be engineers and/or foremen, but they usually also help with setting up treatment centres and supervising local staff. Other non-medical staff are water/sanitation specialists, who are usually experienced engineers in the fields of water treatment and management and financial/administration/human resources experts who are placed with field missions.
Vaccination campaigns are a major part of the medical care provided during MSF missions. Diseases such as diphtheria, measles, meningitis, tetanus, pertussis, yellow fever, polio, and cholera, all of which are uncommon in developed countries, may be prevented with vaccination. Some of these diseases, such as cholera and measles, spread rapidly in large populations living in close proximity, such as in a refugee camp, and people must be immunised by the hundreds or thousands in a short period of time. For example, in Beira, Mozambique in 2004, approximately 50,000 residents received two doses of an experimental cholera vaccine over about one month.
An equally important part of the medical care provided during MSF missions is AIDS treatment (with antiretroviral drugs), AIDS testing, and education. MSF is the only source of treatment for many countries in Africa, whose citizens make up the majority of people with HIV and AIDS worldwide. Because antiretroviral drugs (ARVs) are not readily available, MSF usually provides treatment for opportunistic infections and educates the public on how to slow transmission of the disease.
In most countries, MSF increases the capabilities of local hospitals by improving sanitation, providing equipment and drugs, and training local hospital staff. When the local staff is overwhelmed, MSF may open new specialised clinics for treatment of an endemic disease or surgery for victims of war. International staff start these clinics but MSF strives to increase the local staff's ability to run the clinics themselves through training and supervision. In some countries, like Nicaragua, MSF provides public education to increase awareness of reproductive health care and venereal disease.
Since most of the areas that require field missions have been affected by a natural disaster, civil war, or endemic disease, the residents usually require psychological support as well. Although the presence of an MSF medical team may decrease stress somewhat among victims, often a team of psychologists or psychiatrists work with victims of depression, domestic violence and substance abuse. The doctors may also train local mental health staff.
Often in situations where an MSF mission is set up, there is moderate or severe malnutrition as a result of war, drought, or government economic mismanagement. Intentional starvation is also sometimes used during a war as a weapon, and MSF, in addition to providing food, brings awareness to the situation and insists on foreign government intervention. Infectious diseases and diarrhoea, both of which cause weight loss and weakening of a person's body (especially in children), must be treated with medication and proper nutrition to prevent further infections and weight loss. A combination of the above situations, as when a civil war is fought during times of drought and infectious disease outbreaks, can create famine.
In emergency situations where there is a lack of nutritious food, but not to the level of a true famine, protein-energy malnutrition is most common among young children. Marasmus, a form of calorie deficiency, is the most common form of childhood malnutrition and is characterised by severe wasting and often fatal weakening of the immune system. Kwashiorkor, a form of calorie and protein deficiency, is a more serious type of malnutrition in young children, and can negatively affect physical and mental development. Both types of malnutrition can make opportunistic infections fatal. In these situations, MSF sets up "Therapeutic Feeding Centres" for monitoring the children and any other malnourished individuals.
A Therapeutic Feeding Centre (or Therapeutic Feeding Programme) is designed to treat severe malnutrition through the gradual introduction of a special diet intended to promote weight gain after the individual has been treated for other health problems. The treatment programme is split between two phases:
MSF uses foods designed specifically for treatment of severe malnutrition. During phase 1, a type of therapeutic milk called F-75 is fed to patients. F-75 is a relatively low energy, low fat/protein milk powder that must be mixed with water and given to patients to prepare their bodies for phase 2. During phase 2, therapeutic milk called F-100, which is higher in energy/fat/protein content than F-75, is given to patients, usually along with a peanut butter mixture called Plumpy'nut. F-100 and Plumpy'nut are designed to quickly provide large amounts of nutrients so that patients can be treated efficiently. Other special food fed to populations in danger of starvation includes enriched flour and porridge, as well as a high protein biscuit called BP5. BP5 is a popular food for treating populations because it can be distributed easily and sent home with individuals, or it can be crushed and mixed with therapeutic milk for specific treatments.
Dehydration, sometimes due to diarrhoea or cholera, may also be present in a population, and MSF sets up rehydration centres to combat it. Oral rehydration solution (ORS), which contains glucose and electrolytes, is given to patients to replace lost fluids. Antibiotics are also sometimes given to individuals with diarrhoea if it is known that they have cholera or dysentery.
Clean water is essential for hygiene, for consumption and for feeding programmes (for mixing with powdered therapeutic milk or porridge), as well as for preventing the spread of water-borne disease. As such, MSF water engineers and volunteers must create a source of clean water. This is usually achieved by modifying an existing water well, by digging a new well and/or starting a water treatment project to obtain clean water for a population. Water treatment in these situations may consist of storage sedimentation, filtration and/or chlorination depending on available resources.
Sanitation is an essential part of field missions, and it may include education of local medical staff in proper sterilisation techniques, sewage treatment projects, proper waste disposal, and education of the population in personal hygiene. Proper wastewater treatment and water sanitation are the best way to prevent the spread of serious water-borne diseases, such as cholera. Simple wastewater treatment systems can be set up by volunteers to protect drinking water from contamination. Garbage disposal could include pits for normal waste and incineration for medical waste. However, the most important subject in sanitation is the education of the local population, so that proper waste and water treatment can continue once MSF has left the area.
In order to accurately report the conditions of a humanitarian emergency to the rest of the world and to governing bodies, data on a number of factors are collected during each field mission. The rate of malnutrition in children is used to determine the malnutrition rate in the population, and then to determine the need for feeding centres. Various types of mortality rates are used to report the seriousness of a humanitarian emergency, and a common method used to measure mortality in a population is to have staff constantly monitoring the number of burials at cemeteries. By compiling data on the frequency of diseases in hospitals, MSF can track the occurrence and location of epidemic increases (or "seasons") and stockpile vaccines and other drugs. For example, the "Meningitis Belt" (sub-Saharan Africa, which sees the most cases of meningitis in the world) has been "mapped" and the meningitis season occurs between December and June. Shifts in the location of the Belt and the timing of the season can be predicted using cumulative data over many years.
In addition to epidemiological surveys, MSF also uses population surveys to determine the rates of violence in various regions. By estimating the scopes of massacres, and determining the rate of kidnappings, rapes, and killings, psychosocial programmes can be implemented to lower the suicide rate and increase the sense of security in a population. Large-scale forced migrations, excessive civilian casualties and massacres can be quantified using surveys, and MSF can use the results to put pressure on governments to provide help, or even expose genocide. MSF conducted the first comprehensive mortality survey in Darfur in 2004.
However, there may be ethical problems in collecting these statistics.
The Campaign for Access to Essential Medicines was initiated in 1999 to increase access to essential medicines in developing countries. "Essential medicines" are those drugs that are needed in sufficient supply to treat a disease common to a population. However, most diseases common to populations in developing countries are no longer common to populations in developed countries; therefore, pharmaceutical companies find that producing these drugs is no longer profitable and may raise the price per treatment, decrease development of the drug (and new treatments) or even stop production of the drug. MSF often lacks effective drugs during field missions, and started the campaign to put pressure on governments and pharmaceutical companies to increase funding for essential medicines.
In recent years, the organization has tried to use its influence to urge the drug maker Novartis to drop its case against India's patent law, which prevents Novartis from patenting its drugs in India. A few years earlier, Novartis had also sued South Africa to prevent it from importing cheaper AIDS drugs. Dr. Tido von Schoen-Angerer, director of MSF's Campaign for Access to Essential Medicines, says, "Just like five years ago, Novartis, with its legal actions, is trying to stand in the way of people's right to access the medicines they need."
On 1 April 2013, it was announced that the Indian court invalidated Novartis's patent on Gleevec. This decision makes the drug available via generics on the Indian market at a considerably lower price.
Aside from injuries and death associated with stray bullets, mines and epidemic disease, MSF volunteers are sometimes attacked or kidnapped for political reasons. In some countries afflicted by civil war, humanitarian-aid organizations are viewed as helping the enemy. If an aid mission is perceived to be exclusively set up for victims on one side of the conflict, it may come under attack for that reason. However, the War on Terrorism has generated attitudes among some groups in US-occupied countries that non-governmental aid organizations such as MSF are allied with or even work for the Coalition forces. Since the United States has labelled its operations "humanitarian actions," independent aid organizations have been forced to defend their positions, or even evacuate their teams. Insecurity in cities in Afghanistan and Iraq rose significantly following United States operations, and MSF has declared that providing aid in these countries was too dangerous. The organization was forced to evacuate its teams from Afghanistan on 28 July 2004, after five volunteers (Afghans Fasil Ahmad and Besmillah, Belgian Hélène de Beir, Norwegian Egil Tynæs, and Dutchman Willem Kwint) were killed on 2 June in an ambush by unidentified militia near Khair Khāna in Badghis Province. In June 2007, Elsa Serfass, a volunteer with MSF-France, was killed in the Central African Republic and in January 2008, two expatriate staff (Damien Lehalle and Victor Okumu) and a national staff member (Mohammed Bidhaan Ali) were killed in an organized attack in Somalia resulting in the closing of the project.
Arrests and abductions in politically unstable regions can also occur for volunteers, and in some cases, MSF field missions can be expelled entirely from a country. Arjan Erkel, Head of Mission in Dagestan in the North Caucasus, was kidnapped and held hostage in an unknown location by unknown abductors from 12 August 2002 until 11 April 2004. Paul Foreman, head of MSF-Holland, was arrested in Sudan in May 2005 for refusing to divulge documents used in compiling a report on rapes carried out by the pro-government Janjaweed militias (see Darfur conflict). Foreman cited the privacy of the women involved, and MSF alleged that the Sudanese government had arrested him because it disliked the bad publicity generated by the report.
On 14 August 2013, MSF announced that it was closing all of its programmes in Somalia due to attacks on its staff by Al-Shabaab militants and the perceived indifference or acceptance of these attacks by governmental authorities and wider society.
On 3 October 2015, 14 staff and 28 others died when an MSF hospital was bombed by American forces during the Battle of Kunduz.
On 27 October 2015, an MSF hospital in Sa'dah, Yemen was bombed by the Saudi Arabia-led military coalition.
On 28 November 2015, an MSF-supported hospital was barrel-bombed by a Syrian Air Force helicopter, killing seven and wounding forty-seven people near Homs, Syria.
On 10 January 2016, an MSF-supported hospital in Sa'dah was bombed by the Saudi Arabia-led military coalition, killing six people.
On 15 February 2016, two MSF-supported hospitals in Idlib District and Aleppo, Syria were bombed, killing at least 20 and injuring dozens of patients and medical personnel. Both Russia and the United States denied responsibility and being in the area at the time.
On 28 April 2016, an MSF hospital in Aleppo was bombed, killing 50, including six staff and patients.
On 12 May 2020, an MSF-supported hospital in Dasht-e-Barchi, Kabul, Afghanistan was attacked by unknown assailants. The attack left 24 people dead and at least 20 more injured.
"" is an award-winning documentary film by Mark N. Hopkins that tells the story of four MSF volunteer doctors confronting the challenges of medical work in war-torn areas of Liberia and Congo. It premiered at the 2008 Venice Film Festival and was theatrically released in the United States in 2010.
The then-president of MSF, James Orbinski, gave the Nobel Peace Prize speech on behalf of the organization. In the opening, he discussed the conditions of the victims of the Rwandan genocide, focusing on one of his female patients.
Orbinski affirmed the organization's commitment to publicizing the issues MSF encountered.
On 7 October 2015, President Barack Obama, himself a Nobel Peace Prize laureate and at that time commander-in-chief, issued an apology to Doctors Without Borders for the Kunduz hospital airstrike. Doctors Without Borders was not mollified by Obama's apology.
MSF received the Lasker-Bloomberg Public Service Award in 2015 from the New York-based Lasker Foundation.
A number of other non-governmental organizations have adopted names ending in "Sans Frontières" or "Without Borders", inspired by Médecins Sans Frontières: for example, Engineers Without Borders, Payasos Sin Fronteras (Clowns Without Borders) and Reporters Without Borders.
During the 2019 Hong Kong protests, there were conflicts between the civilians and the police. Some injured protesters feared arrest if they sought medical assistance from government hospitals, and some sought assistance from MSF.
In October 2019, conflicts erupted at multiple universities in Hong Kong, with police using tear gas. MSF stated that it considered the state medical assistance to be adequate, but was heavily criticised by supporters of the protests, including artists Gregory Wong and Gloria Yip, who stated that they would no longer donate to MSF. | https://en.wikipedia.org/wiki?curid=20498 |
History of Germany
The concept of Germany as a distinct region in central Europe can be traced to Roman commander Julius Caesar, who referred to the unconquered area east of the Rhine as "Germania", thus distinguishing it from Gaul (France), which he had conquered. The victory of the Germanic tribes in the Battle of the Teutoburg Forest (AD 9) prevented annexation by the Roman Empire, although the Roman provinces of Germania Superior and Germania Inferior were established along the Rhine. Following the Fall of the Western Roman Empire, the Franks conquered the other West Germanic tribes. When the Frankish Empire was divided among Charles the Great's heirs in 843, the eastern part became East Francia. In 962, Otto I became the first Holy Roman Emperor of the Holy Roman Empire, the medieval German state.
In the Late Middle Ages, the regional dukes, princes, and bishops gained power at the expense of the emperors. Martin Luther led the Protestant Reformation against the Catholic Church after 1517, as the northern states became Protestant, while the southern states remained Catholic. The two parts of the Holy Roman Empire clashed in the Thirty Years' War (1618–1648), which was ruinous to the twenty million civilians living in both parts. The Thirty Years' War brought tremendous destruction to Germany; more than a quarter of the population and half of the male population in the German states were killed by the catastrophic war. The year 1648 marked the effective end of the Holy Roman Empire and the beginning of the modern nation-state system, with Germany divided into numerous independent states, such as Prussia, Bavaria, Saxony and Austria, which also controlled land outside of the area considered "Germany".
After the French Revolution and the Napoleonic Wars of 1803–1815, feudalism fell away, and liberalism and nationalism clashed with reaction. The German revolutions of 1848–49 failed. The Industrial Revolution modernized the German economy and led to the rapid growth of cities and the emergence of the socialist movement in Germany. Prussia, with its capital Berlin, grew in power. German universities became world-class centers for science and humanities, while music and art flourished. The unification of Germany (excluding Austria and the German-speaking areas of Switzerland) was achieved under the leadership of the Chancellor Otto von Bismarck with the formation of the German Empire in 1871. This resulted in the "Kleindeutsche Lösung" ("small Germany solution", Germany without Austria), rather than the "Großdeutsche Lösung" ("greater Germany solution", Germany with Austria). The new "Reichstag", an elected parliament, had only a limited role in the imperial government. Germany joined the other powers in colonial expansion in Africa and the Pacific.
By 1900, Germany was the dominant power on the European continent, and its rapidly expanding industry had surpassed Britain's, provoking a naval arms race. Germany led the Central Powers in World War I (1914–1918) against France, Great Britain, Russia and (by 1917) the United States. Defeated and partly occupied, Germany was forced to pay war reparations by the Treaty of Versailles, was stripped of its colonies as well as of home territory ceded to Belgium, France, and Poland, and was banned from uniting with German-settled regions of Austria. The German Revolution of 1918–19 put an end to the federal constitutional monarchy, which resulted in the establishment of the Weimar Republic, an unstable parliamentary democracy.
In the early 1930s, the worldwide Great Depression hit Germany hard, as unemployment soared and people lost confidence in the government. In January 1933, Adolf Hitler was appointed Chancellor of Germany. His Nazi Party quickly established a totalitarian regime, and Nazi Germany made increasingly aggressive territorial demands, threatening war if they were not met. Remilitarization of the Rhineland came in 1936, then annexation of Austria in the "Anschluss" and German-speaking regions of Czechoslovakia with the Munich Agreement in 1938, and further territory of Czechoslovakia in 1939. On 1 September 1939, Germany initiated World War II in Europe with the invasion of Poland. After forming a pact with the Soviet Union in 1939, Hitler and Stalin divided Eastern Europe. After a "Phoney War" in spring 1940, the Germans swept Denmark and Norway, the Low Countries, and France, giving Germany control of nearly all of Western Europe. Hitler invaded the Soviet Union in June 1941.
Racism, especially antisemitism, was a central feature of the Nazi regime. In Germany, but predominantly in the German-occupied areas, the systematic genocide program known as the Holocaust killed 17 million, including Jews, German dissidents, disabled people, Poles, Romanies, Soviets (Russian and non-Russian), and others. In 1942, the German invasion of the Soviet Union faltered, and after the United States entered the war, Britain became the base for massive Anglo-American bombings of German cities. Following the Allied invasion of Normandy (June 1944), the German Army was pushed back on all fronts until the final collapse in May 1945.
Under occupation by the Allies, German territories were split up, Austria was again made a separate country, denazification took place, and the Cold War resulted in the division of the country into democratic West Germany and communist East Germany, reduced in territory by the establishment of the Oder-Neisse line. Millions of ethnic Germans were deported from pre-war Eastern Germany, Sudetenland, and from all over Eastern Europe, in what is described as the largest scale of ethnic cleansing in history. Germans also fled from Communist areas into West Germany, which experienced rapid economic expansion, and became the dominant economy in Western Europe. West Germany was rearmed in the 1950s under the auspices of NATO but without access to nuclear weapons. The Franco-German friendship became the basis for the political integration of Western Europe in the European Union. In 1989, the Berlin Wall was destroyed, the Soviet Union collapsed, and East Germany was reunited with West Germany in 1990. In 1998–1999, Germany was one of the founding countries of the eurozone. Germany remains one of the economic powerhouses of Europe, contributing about one-quarter of the eurozone's annual gross domestic product. In the early 2010s, Germany played a critical role in trying to resolve the escalating euro crisis, especially concerning Greece and other Southern European nations. In the middle of the decade, the country faced the European migrant crisis as the main receiver of asylum seekers from Syria and other troubled regions.
"For more events, see Timeline of German history."
The discovery of the Homo heidelbergensis mandible in 1907 affirms archaic human presence in Germany by at least 600,000 years ago. The oldest complete set of hunting weapons ever found anywhere in the world was excavated from a coal mine in Schöningen, Lower Saxony; between 1994 and 1998, eight 380,000-year-old wooden javelins were unearthed there.
In 1856 the fossilized bones of an extinct human species were salvaged from a limestone grotto in the Neander valley near Düsseldorf, North Rhine-Westphalia. The archaic nature of the fossils, now known to be around 40,000 years old, was recognized and the characteristics published in the first-ever paleoanthropologic species description in 1858 by Hermann Schaaffhausen. The species was named "Homo neanderthalensis" – Neanderthal man in 1864.
The remains of Paleolithic early modern human occupation uncovered and documented in several caves in the Swabian Jura include various mammoth ivory sculptures that rank among the oldest uncontested works of art and several flutes, made of bird bone and mammoth ivory that are confirmed to be the oldest musical instruments ever found. The 40,000-year-old Löwenmensch figurine represents the oldest uncontested figurative work of art and the 35,000-year-old Venus of Hohle Fels has been asserted as the oldest uncontested object of human figurative art ever discovered.
The first groups of early farmers, distinct from the indigenous hunter-gatherers, migrated into Europe from a population in western Anatolia at the beginning of the Neolithic period, between 10,000 and 8,000 years ago.
The settlers of the Corded Ware culture, which had spread all over the fertile plains of Central Europe during the Late Neolithic, were of Indo-European ancestry. The Indo-Europeans had arrived in the heartland of Europe via mass migration around 4,500 years ago.
By the late Bronze Age, the Urnfield culture (c. 1300 BC to 750 BC) had replaced the Bell Beaker, Unetice and Tumulus cultures in central Europe. The Hallstatt culture, which had developed from the Urnfield culture, was the predominant Western and Central European culture from the 12th to the 8th century BC and during the early Iron Age (8th to 6th centuries BC). The people who adopted these cultural characteristics are regarded as Celts; whether and how the Celts are related to the Urnfield culture remains disputed. Celtic cultural centers developed in central Europe during the late Bronze Age (c. 1200 BC to 700 BC). Some, like the Heuneburg on the Danube, grew to become important cultural centres of the Iron Age in Central Europe that maintained trade routes to the Mediterranean. In the 5th century BC the Greek historian Herodotus mentioned a Celtic city on the Danube, "Pyrene", which historians attribute to the Heuneburg. Beginning around 700 BC, Germanic peoples from southern Scandinavia and northern Germany expanded south and gradually replaced the Celtic peoples in Central Europe.
The ethnogenesis of the Germanic tribes remains debated, though for the historian Averil Cameron "it is obvious that a steady process" occurred during the Nordic Bronze Age, or at the latest during the Pre-Roman Iron Age. From their homes in southern Scandinavia and northern Germany the tribes began expanding south, east and west during the 1st century BC, coming into contact with the Celtic tribes of Gaul as well as with Iranian, Baltic, and Slavic cultures in Central and Eastern Europe.
Factual and detailed knowledge about the early history of the Germanic tribes is rare. Researchers have to be content with accounts of the tribes' dealings with the Romans, linguistic conclusions, archaeological discoveries and the relatively new but promising results of archaeogenetic research. The Suebi under their chieftain Ariovistus had around 60 BC conquered lands of the Gallic Aedui tribe west of the Rhine; subsequent plans to populate the region with Germanic settlers from the east were vehemently opposed by Julius Caesar, who had already launched his ambitious campaign to subjugate all Gaul. Caesar confronted and defeated the Suebi forces in 58 BC at the Battle of Vosges, forcing Ariovistus to retreat across the Rhine. In the mid-1st century BC Caesar erected the first known bridges across the Rhine and led a military contingent into the territories of the local Germanic tribes; after several days, having made no contact with Germanic troops (who had retreated inland), he returned to the west of the river.
Emperor Augustus considered conquest beyond the Rhine and the Danube not only regular foreign policy but also necessary to counter Germanic incursions into a still rebellious Gaul. A series of forts and commercial centers were established along the two rivers. Some tribes, such as the Ubii, consequently allied with Rome and readily adopted advanced Roman culture. During the 1st century CE Roman legions conducted extended campaigns into Germania magna, the area north of the Upper Danube and east of the Rhine, attempting to subdue the various tribes. Roman ideas of administration, the imposition of taxes and a legal framework were frustrated by the total absence of an infrastructure. The campaigns of Germanicus, for example, were almost exclusively characterized by frequent massacres of villagers and indiscriminate pillaging. The tribes, however, maintained their elusive identities. In 9 AD a coalition of tribes under the Cherusci chieftain Arminius, who was familiar with Roman tactical doctrines, defeated a sizeable Roman force in the Battle of the Teutoburg Forest. Consequently, Rome resolved to permanently establish the Rhine/Danube border and refrain from further territorial advance into Germania. By AD 100 the frontier along the Rhine and the Danube and the Limes Germanicus was firmly established. Several Germanic tribes lived under Roman rule south and west of the border, as described in Tacitus's "Germania". These lands correspond to parts of the modern states of Baden-Württemberg, southern Bavaria, southern Hesse, Saarland and the Rhineland, while Austria formed the regular provinces of Noricum and Raetia. The provinces Germania Inferior (with the capital situated at Colonia Claudia Ara Agrippinensium, modern Cologne) and Germania Superior (with its capital at Mogontiacum, modern Mainz) were formally established in 85 AD, after long and painful campaigns, as lasting military control was confined to the lands surrounding the rivers.
The 3rd century saw the emergence of a number of large West Germanic tribes: the Alamanni, Franks, Bavarii, Chatti, Saxons, Frisii, Sicambri, and Thuringii. By the 3rd century the Germanic peoples began to migrate beyond the "limes" and the Danube frontier. Several large tribes – the Visigoths, Ostrogoths, Vandals, Burgundians, Lombards, Saxons and Franks – migrated and played their part in the decline and transformation of the Western Roman Empire.
Christianity was introduced to Roman-controlled southwestern Germania and Christian religious structures such as the Aula Palatina of Trier were built during the reign of Constantine I (r. 306–337 AD). By the end of the 4th century the Huns invaded eastern and central Europe. The event triggered the Migration Period. Hunnic hegemony over a vast territory in central and eastern Europe lasted until the death of Attila's son Dengizich in 469.
Stem duchies ("Stammesherzogtümer") in Germany refer to the traditional territories of the various Germanic tribes. The concept of such duchies survived especially in the areas which by the 9th century would constitute East Francia – the Duchy of Bavaria, the Duchy of Swabia, the Duchy of Saxony, the Duchy of Franconia and the Duchy of Thuringia – in contrast to territories further west such as the County of Burgundy or Lorraine in Middle Francia.
The Salian emperors (reigned 1027–1125) retained the stem duchies as the major divisions of Germany, but they became increasingly obsolete during the early high-medieval period under the Hohenstaufen, and Frederick Barbarossa finally abolished them in 1180 in favour of more numerous territorial duchies.
Successive kings of Germany founded a series of border counties or marches in the east and the north. These included Lusatia, the North March (which would become Brandenburg and the heart of the future Prussia), and the Billung March. In the south, the marches included Carniola, Styria, and the March of Austria that would become Austria.
After the fall of the Western Roman Empire in the 5th century, the Franks, like other post-Roman Western European peoples, emerged as a tribal confederacy in the Middle Rhine–Weser region, in the territory soon to be called Austrasia (the "eastern land"), the northeastern portion of the future Kingdom of the Merovingian Franks. As a whole, Austrasia comprised parts of present-day France, Germany, Belgium, Luxembourg and the Netherlands. Unlike the Alamanni to their south in Swabia, the Franks absorbed large swaths of former Roman territory as they spread west into Gaul, beginning in 250. Clovis I of the Merovingian dynasty conquered northern Gaul in 486 and at the Battle of Tolbiac in 496 defeated the Alemanni tribe in Swabia, whose territory eventually became the Duchy of Swabia.
By 500, Clovis had united all the Frankish tribes, ruled all of Gaul and was proclaimed "King of the Franks" between 509 and 511. Clovis, unlike most Germanic rulers of the time, was baptized directly into Roman Catholicism instead of Arianism. His successors would cooperate closely with papal missionaries, among them Saint Boniface. After the death of Clovis in 511, his four sons partitioned his kingdom including Austrasia. Authority over Austrasia passed back and forth from autonomy to royal subjugation, as successive Merovingian kings alternately united and subdivided the Frankish lands.
During the 5th and 6th centuries the Merovingian kings conquered the Thuringii (531 to 532), the Kingdom of the Burgundians and the principality of Metz, and defeated the Danes, the Saxons and the Visigoths. King Chlothar I (558 to 561) ruled the greater part of what is now Germany and undertook military expeditions into Saxony, while the south-east of modern Germany remained under the influence of the Ostrogoths. The Saxons controlled the area from the northern seaboard to the Harz Mountains and the Eichsfeld in the south.
The Merovingians placed the various regions of their Frankish Empire under the control of semi-autonomous dukes – either Franks or local rulers, and followed imperial Roman strategic traditions of social and political integration of the newly conquered territories. While allowed to preserve their own legal systems, the conquered Germanic tribes were pressured to abandon the Arian Christian faith.
In 718 Charles Martel waged war against the Saxons in support of the Neustrians. In 743 his son Carloman, in his role as Mayor of the Palace, renewed the war against the Saxons, who had allied with and aided Duke Odilo of Bavaria. The Catholic Franks, who by 750 controlled a vast territory in Gaul, north-western Germany, Swabia, Burgundy and western Switzerland that included the alpine passes, allied with the Curia in Rome against the Lombards, who posed a permanent threat to the Holy See. Pressed by Liutprand, King of the Lombards, the Pope had already sent an envoy seeking help from the de facto ruler Charles Martel after his victory in 732 over the forces of the Umayyad Caliphate at the Battle of Tours; however, a lasting and mutually beneficial alliance would only materialize after Charles' death, under his successor as Duke of the Franks, Pepin the Short.
In 751 Pepin III, Mayor of the Palace under the Merovingian king, himself assumed the title of king and was anointed by the Church. Pope Stephen II bestowed on him the hereditary title of "Patricius Romanorum", protector of Rome and St. Peter, in response to the Donation of Pepin, which guaranteed the sovereignty of the Papal States. Charles the Great (who ruled the Franks from 768 to 814) launched a decades-long military campaign against the Franks' heathen rivals, the Saxons and the Avars. The campaigns and insurrections of the Saxon Wars lasted from 772 to 804. The Franks eventually overwhelmed the Saxons and Avars, forcibly converted the people to Christianity, and annexed their lands to the Carolingian Empire.
After the death of the Frankish king Pepin the Short in 768, his oldest son "Charlemagne" ("Charles the Great") consolidated his power over and expanded the kingdom. Charlemagne ended 200 years of royal Lombard rule with the Siege of Pavia, and in 774 he installed himself as King of the Lombards. Loyal Frankish nobles replaced the old Lombard aristocracy following a rebellion in 776. The next 30 years of his reign were spent ruthlessly strengthening his power in Francia and conquering the Slavs and Pannonian Avars in the east as well as tribes such as the Saxons and the Bavarians. On Christmas Day, 800 AD, Charlemagne was crowned "Imperator Romanorum" (Emperor of the Romans) in Rome by Pope Leo III.
Fighting among Charlemagne's three grandsons over the continuation of the custom of partible inheritance or the introduction of primogeniture caused the Carolingian empire to be partitioned into three parts by the Treaty of Verdun of 843. Louis the German received the eastern portion of the kingdom, East Francia, comprising all lands east of the Rhine river and to the north of Italy. This encompassed the territories of the German stem duchies – Franks, Saxons, Swabians, and Bavarians – which were united in a federation under the first non-Frankish king, Henry the Fowler, who ruled from 919 to 936. The royal court moved continually among a series of strongholds, called "Kaiserpfalzen", that developed into economic and cultural centers. Aachen Palace played a central role, as its Palatine Chapel served as the official site of all royal coronation ceremonies throughout the medieval period, until 1531.
In 936, Otto I was crowned German king at Aachen, in 961 "King of Italy" in Pavia, and in 962 emperor by Pope John XII in Rome. The tradition of the German king as protector of the Kingdom of Italy and the Latin Church resulted in the term Holy Roman Empire in the 12th century. The name, which came to be identified with Germany, remained in official use – with the extension "Nationis Germanicæ" (of the German Nation) added after the last imperial coronation in Rome in 1452 – until the empire's dissolution in 1806. Otto strengthened the royal authority by re-asserting the old Carolingian rights over ecclesiastical appointments. Otto wrested from the nobles the powers of appointment of the bishops and abbots, who controlled large land holdings. Additionally, Otto revived the old Carolingian program of appointing missionaries in the border lands. Otto continued to support celibacy for the higher clergy, so ecclesiastical appointments never became hereditary. By granting lands to the abbots and bishops he appointed, Otto actually turned these bishops into "princes of the Empire" ("Reichsfürsten"). In this way, Otto was able to establish a national church. Outside threats to the kingdom were contained with the decisive defeat of the Hungarian Magyars at the Battle of Lechfeld in 955. The Slavs between the Elbe and the Oder rivers were also subjugated. Otto marched on Rome, drove John XII from the papal throne and for years controlled the election of the pope, setting a firm precedent for imperial control of the papacy.
During the reign of Conrad II's son, Henry III (1039 to 1056), the empire supported the Cluniac reforms of the Church: the Peace of God, the prohibition of simony (the purchase of clerical offices), and required celibacy of priests. Imperial authority over the Pope reached its peak. However, Rome reacted with the creation of the College of Cardinals and Pope Gregory VII's series of clerical reforms. Pope Gregory insisted in his "Dictatus Papae" on absolute papal authority over appointments to ecclesiastical offices. The subsequent conflict, in which emperor Henry IV was compelled to submit to the Pope at Canossa in 1077 after having been excommunicated, came to be known as the Investiture Controversy. In 1122 a temporary reconciliation was reached between Henry V and the Pope with the Concordat of Worms. With the conclusion of the dispute the Roman church and the papacy regained supreme control over all religious affairs. Consequently, the imperial Ottonian church system ("Reichskirche") declined. The settlement also ended the royal/imperial tradition of appointing selected powerful clerical leaders to counter the secular Imperial princes.
Between 1095 and 1291 the various campaigns of the crusades to the Holy Land took place. Knightly religious orders were established, including the Knights Templar, the Knights of St John (Knights Hospitaller), and the Teutonic Order.
The term "sacrum imperium" (Holy Empire) was first used officially by Friedrich I in 1157, but the words "Sacrum Romanum Imperium" (Holy Roman Empire) were only combined in July 1180 and appeared consistently in official documents only from 1254 onwards.
The Hanseatic League was a commercial and defensive alliance of the merchant guilds of towns and cities in northern and central Europe that dominated maritime trade in the Baltic Sea, the North Sea and along the connected navigable rivers during the Late Middle Ages (12th to 15th centuries). Each of the affiliated cities retained the legal system of its sovereign and, with the exception of the Free Imperial Cities, had only a limited degree of political autonomy. Beginning with an agreement between the cities of Lübeck and Hamburg, guilds cooperated in order to strengthen and combine their economic assets, such as securing trading routes and tax privileges, controlling prices, and better protecting and marketing their local commodities. Important centers of commerce within the empire, such as Cologne on the Rhine river and Bremen on the North Sea, joined the union, which resulted in greater diplomatic esteem. Regional princes, recognizing the league's great economic potential, granted favorable and often exclusive charters for commercial operations. During its zenith the alliance maintained trading posts and "kontors" in virtually all cities between London and Edinburgh in the west and Novgorod in the east, as well as Bergen in Norway. By the late 14th century the powerful league enforced its interests with military means if necessary, culminating in a war with the sovereign Kingdom of Denmark from 1361 to 1370. Lübeck remained the league's principal city; there, in 1356, the first general diet was held and its official structure was announced. The league declined after 1450 due to a number of factors, such as the 15th-century crisis, the territorial lords' shifting policies towards greater commercial control, the silver crisis and marginalization in the wider Eurasian trade network, among others.
The "Ostsiedlung" (lit. eastern settlement) is the term for a process of largely uncoordinated immigration and chartering of settlement structures by ethnic Germans into territories already inhabited by Slavs and Balts east of the Saale and Elbe rivers, such as modern Poland and Silesia, and to the south into Bohemia, modern Hungary and Romania, during the High Middle Ages from the 11th to the 14th century. The primary purpose of the early imperial military campaigns into the lands to the east during the 10th and 11th centuries was to punish and subjugate the local heathen tribes. Conquered territories were mostly lost after the troops had retreated, but eventually were incorporated into the empire as marches: fortified borderlands with garrisoned troops in strongholds and castles, who were to ensure military control and enforce the exaction of tributes. Contemporary sources do not support the idea of policies or plans for the organized settlement of civilians.
Emperor Lothair II re-established feudal sovereignty over Poland, Denmark and Bohemia from 1135 and appointed margraves to turn the borderlands into hereditary fiefs and install a civilian administration. There is no discernible chronology of the immigration process, as it took place in many individual efforts and stages, often even encouraged by the Slavic regional lords. However, the new communities were subjected to German law and customs. Total numbers of settlers were generally rather low and, depending on who held a numerical majority, populations usually assimilated into each other. In many regions only enclaves would persist, like Hermannstadt, founded by the Transylvanian Saxons in modern Romania.
In 1230 the Catholic monastic order of the Teutonic Knights launched the Prussian Crusade. The campaign, which was supported by the forces of the Polish duke Konrad I of Masovia and initially intended to Christianize the Baltic Old Prussians, succeeded primarily in the conquest of large territories. The order, emboldened by imperial approval, quickly resolved to establish an independent state without the consent of duke Konrad. Recognizing only papal authority and based on a solid economy, the order steadily expanded the Teutonic state during the following 150 years, engaging in several land disputes with its neighbors. Permanent conflicts with the Kingdom of Poland, the Grand Duchy of Lithuania, and the Novgorod Republic eventually led to military defeat and containment by the mid-15th century. The last Grand Master, Albert of Brandenburg, converted to Lutheranism in 1525 and turned the remaining lands of the order into the secular Duchy of Prussia.
Henry V (1086–1125), great-grandson of Conrad II, who had overthrown his father Henry IV, became Holy Roman Emperor in 1111. Hoping to gain greater control over the church inside the Empire, Henry V appointed Adalbert of Saarbrücken as the powerful archbishop of Mainz in the same year. Adalbert began to assert the powers of the Church against secular authorities, that is, the Emperor. This precipitated the "Crisis of 1111", yet another chapter of the long-term Investiture Controversy. In 1137 the prince-electors turned back to the Hohenstaufen family for a candidate, Conrad III. Conrad tried to divest his rival Henry the Proud of his two duchies – Bavaria and Saxony – which led to war in southern Germany as the empire was divided into two powerful factions. The faction of the "Welfs" or "Guelphs" (in Italian) supported the House of Welf of Henry the Proud, which was the ruling dynasty in the Duchy of Bavaria. The rival faction of the "Waiblings" or "Ghibellines" (in Italian) pledged allegiance to the Swabian House of Hohenstaufen. During this early period, the Welfs generally maintained ecclesiastical independence under the papacy and political particularism (the focus on ducal interests against the central imperial authority). The Waiblings, on the other hand, championed strict control of the church and a strong central imperial government.
During the reign of the Hohenstaufen emperor Frederick I (Barbarossa), an accommodation was reached in 1156 between the two factions. The Duchy of Bavaria was returned to Henry the Proud's son Henry the Lion, duke of Saxony, who represented the Guelph party. However, the Margraviate of Austria was separated from Bavaria and turned into the independent Duchy of Austria by virtue of the Privilegium Minus in 1156.
Having become wealthy through trade, the confident cities of Northern Italy, supported by the Pope, increasingly opposed Barbarossa's claim of feudal rule ("Honor Imperii") over Italy. The cities united in the Lombard League and finally defeated Barbarossa in the Battle of Legnano in 1176. The following year a reconciliation was reached between the emperor and Pope Alexander III in the Treaty of Venice. The 1183 Peace of Constance eventually settled that the Italian cities remained loyal to the empire but were granted local jurisdiction and full regal rights in their territories.
In 1180, Henry the Lion was outlawed, Saxony was divided, and Bavaria was given to Otto of Wittelsbach, who founded the Wittelsbach dynasty, which was to rule Bavaria until 1918.
From 1184 to 1186, the empire under Frederick I Barbarossa reached its cultural peak with the "Diet of Pentecost" held at Mainz and the marriage of his son Henry, in Milan, to the Norman princess Constance of Sicily. The power of the feudal lords was undermined by the appointment of ministerials (unfree servants of the Emperor) as officials. Chivalry and court life flowered, as expressed in the scholastic philosophy of Albertus Magnus and the literature of Wolfram von Eschenbach.
Between 1212 and 1250, Frederick II established a modern, professionally administered state from his base in Sicily. He resumed the conquest of Italy, leading to further conflict with the Papacy. In the Empire, extensive sovereign powers were granted to ecclesiastical and secular princes, leading to the rise of independent territorial states. The struggle with the Pope sapped the Empire's strength, as Frederick II was excommunicated three times. After his death, the Hohenstaufen dynasty fell, followed by an interregnum during which there was no Emperor.
The failure of negotiations between Emperor Louis IV and the papacy led to the 1338 Declaration at Rhense by six princes of the Imperial Estate, to the effect that election by all or the majority of the electors automatically conferred the royal title and rule over the empire, without papal confirmation. As a result, the monarch was no longer subject to papal approbation and became increasingly dependent on the favour of the electors. Between 1346 and 1378 Emperor Charles IV of Luxembourg, king of Bohemia, sought to restore imperial authority. The 1356 decree of the Golden Bull stipulated that all future emperors were to be chosen by a college of only seven electors – four secular and three clerical. The secular electors were the King of Bohemia, the Count Palatine of the Rhine, the Duke of Saxony, and the Margrave of Brandenburg; the clerical electors were the Archbishops of Mainz, Trier, and Cologne.
Between 1347 and 1351 Germany and almost the entire European continent were consumed by the most severe outbreak of the Black Death pandemic. Estimated to have caused the abrupt death of 30 to 60% of Europe's population, it led to widespread social and economic disruption and deep religious disaffection and fanaticism. Minority groups, and Jews in particular, were blamed, singled out and attacked. As a consequence, many Jews fled and resettled in Eastern Europe.
The early-modern European society gradually developed after the disasters of the 14th century, as religious obedience and political loyalties declined in the wake of the Great Plague, the schism of the Church and prolonged dynastic wars. The rise of the cities and the emergence of the new burgher class eroded the societal, legal and economic order of feudalism. The commercial enterprises of the mercantile patrician family of the Fuggers of Augsburg generated unprecedented financial means. As financiers to both the leading ecclesiastical and secular rulers, the Fuggers fundamentally influenced political affairs in the empire during the 15th and 16th centuries. The increasingly money-based economy also provoked social discontent among knights and peasants, and predatory "robber knights" became common. The knightly classes had traditionally established their monopoly through warfare and military skill. However, the shift to practical mercenary infantry armies and military-technical advances led to a marginalization of heavy cavalry.
From 1438 the Habsburg dynasty, which had acquired control in the south-eastern empire over the Duchy of Austria and, after the death of King Louis II in 1526, over Bohemia and Hungary, managed to occupy the position of Holy Roman Emperor permanently until 1806 (with the exception of the years between 1742 and 1745). However, this strict policy of dynastic rule over a vast multi-ethnic territory prevented the development of concepts of patriotism and unity among the empire's territorial rulers, and of a national identity such as formed in France and England.
During his reign from 1493 to 1519, Maximilian I tried to reform the empire. An Imperial supreme court ("Reichskammergericht") was established, imperial taxes were levied, and the power of the Imperial Diet ("Reichstag") was increased. The reforms, however, were frustrated by the continued territorial fragmentation of the Empire.
Total population estimates of the German territories range around 5 to 6 million by the end of Henry III's reign in 1056 and about 7 to 8 million after Friedrich Barbarossa's rule in 1190. The vast majority were farmers, typically in a state of serfdom under feudal lords and monasteries. Towns gradually emerged, and in the 12th century many new cities were founded along the trading routes and near imperial strongholds and castles. The towns were subjected to the municipal legal system. Cities such as Cologne, which had acquired the status of Imperial Free Cities, were no longer answerable to the local landlords or bishops but were immediate subjects of the Emperor and enjoyed greater commercial and legal liberties. The towns were ruled by a council of the – usually mercantile – elite, the patricians. Craftsmen formed guilds, governed by strict rules, which sought to obtain control of the towns; a few guilds were open to women. Society had diversified, but was divided into sharply demarcated classes of the clergy, physicians, merchants, various guilds of artisans, unskilled day labourers and peasants. Full citizenship was not available to paupers. Political tensions arose from issues of taxation, public spending, regulation of business, and market supervision, as well as the limits of corporate autonomy.
Cologne's central location on the Rhine river placed it at the intersection of the major trade routes between east and west and was the basis of the city's growth. The economic structures of medieval and early modern Cologne were characterized by the city's status as a major harbor and transport hub upon the Rhine. It was the seat of an archbishop, under whose patronage the vast Cologne Cathedral was built from 1248. The cathedral houses sacred Christian relics and has since become a well-known pilgrimage destination. By 1288 the city had secured its independence from the archbishop (who relocated to Bonn), and was ruled by its burghers.
From the early medieval period and continuing through to the 18th century, Germanic law assigned women to a subordinate and dependent position relative to men. Salic (Frankish) law, on which the laws of the German lands would be based, placed women at a disadvantage with regard to property and inheritance rights. Germanic widows required a male guardian to represent them in court. Unlike Anglo-Saxon law or the Visigothic Code, Salic law barred women from royal succession. Social status was based on military and biological roles, a reality demonstrated in rituals associated with newborns, when female infants were given a lesser value than male infants. The use of physical force against wives was condoned until the 18th century in Bavarian law.
Some women of means asserted their influence during the Middle Ages, typically in royal court or convent settings. Hildegard of Bingen, Gertrude the Great, Elisabeth of Bavaria (1478–1504), and Argula von Grumbach are among the women who pursued independent accomplishments in fields as diverse as medicine, music composition, religious writing, and government and military politics.
Benedictine abbess Hildegard von Bingen (1098–1179) wrote several influential theological, botanical, and medicinal texts, as well as letters, liturgical songs, poems, and arguably the oldest surviving morality play, while supervising brilliant miniature illuminations. About 100 years later, Walther von der Vogelweide (c. 1170 – c. 1230) became the most celebrated of the Middle High German lyric poets.
Around 1439, Johannes Gutenberg of Mainz used movable-type printing to issue the Gutenberg Bible. His printing press, the first in Europe to use movable type, started the Printing Revolution; cheap printed books and pamphlets played central roles in the spread of the Reformation and the Scientific Revolution.
Around the transition from the 15th to the 16th century, Albrecht Dürer of Nuremberg established his reputation across Europe as a painter, printmaker, mathematician, engraver, and theorist while still in his twenties, and secured his place as one of the most important figures of the Northern Renaissance.
The addition "Nationis Germanicæ" (of German Nation) to the emperor's title appeared first in the 15th century: in a 1486 law decreed by Frederick III and in 1512 in reference to the Imperial Diet in Cologne by Maximilian I. By then, the emperors had lost their influence in Italy and Burgundy. In 1525, the Heilbronn reform plan – the most advanced document of the German Peasants' War ("Deutscher Bauernkrieg") – referred to the "Reich" as "von Teutscher Nation" (of German nation).
In order to manage their ever-growing expenses, the Renaissance popes of the 15th and early 16th centuries promoted the excessive sale of indulgences and of offices and titles of the Roman Curia.
In 1517, the monk Martin Luther published a pamphlet with 95 theses that he posted in the town square of Wittenberg, handing copies to feudal lords; whether he also nailed them to a church door in Wittenberg remains unclear. The list detailed 95 assertions that, he argued, represented corrupt practice of the Christian faith and misconduct within the Catholic Church. Although perhaps not Luther's chief concern, he received popular support for his condemnation of the sale of indulgences and clerical offices, of the pope's and higher clergy's abuse of power, and for his doubts about the very idea of the institution of the Church and the papacy.
The Protestant Reformation was the first successful challenge to the Catholic Church and began in 1521, when Luther was outlawed at the Diet of Worms after his refusal to repent. The ideas of the Reformation spread rapidly, as the new technology of the printing press ensured cheap mass copies and distribution of the theses, helped by Emperor Charles V's wars with France and the Turks. Hiding in the Wartburg Castle, Luther translated the Bible into German, thereby greatly contributing to the establishment of the modern German language; although Luther himself spoke only a local dialect of minor importance, the language of his Bible supplanted other variants and constitutes to a great extent what is now modern German. With the protestation of the Lutheran princes at the Imperial Diet of Speyer in 1529 and the acceptance and adoption of the Lutheran Augsburg Confession by the Lutheran princes beginning in 1530, the separate Lutheran church was established.
The German Peasants' War of 1524/25, which began in the southwest in Alsace and Swabia and spread further east into Franconia, Thuringia and Austria, was a series of economic and religious revolts of the rural lower classes against the ruling feudal lords, encouraged by the rhetoric of various radical religious reformers and Anabaptists. Although occasionally assisted by war-experienced noblemen like Götz von Berlichingen and Florian Geyer (in Franconia) and the theologian Thomas Müntzer (in Thuringia), the peasant forces lacked military structure, skill, logistics and equipment, and as many as 100,000 insurgents were eventually defeated and massacred by the territorial princes.
The Catholic Counter-Reformation, initiated in 1545 at the Council of Trent, was spearheaded by the scholarly Jesuit order, founded just five years earlier by several clerics around Ignatius of Loyola. Its intent was to challenge and contain the Protestant Reformation via apologetic and polemical writings and decrees, ecclesiastical reconfiguration, wars and imperial political maneuverings. In 1547 Emperor Charles V defeated the Schmalkaldic League, a military alliance of Protestant rulers. The 1555 Peace of Augsburg decreed the recognition of the Lutheran faith and the religious division of the empire, and stipulated the ruler's right to determine the official confession in his principality ("Cuius regio, eius religio"). The Counter-Reformation eventually failed to reintegrate the central and northern German Lutheran states. In 1608/1609 the Protestant Union and the Catholic League were formed.
The Thirty Years' War of 1618 to 1648, which took place almost exclusively in the Holy Roman Empire, had its origins, which remain widely debated, in the unresolved and recurring conflicts between the Catholic and Protestant factions. The Catholic emperor Ferdinand II attempted to achieve the religious and political unity of the empire, while the opposing Protestant Union forces were determined to defend their religious rights. The religious motive served as the universal justification for the various territorial and foreign princes, who over the course of several stages joined either of the two warring parties in order to gain land and power.
The conflict was sparked by the revolt of the Protestant nobility of Bohemia against emperor Matthias' succession policies. After imperial triumph at the Battle of White Mountain and a short-lived peace, the war grew to become a political European conflict by the intervention of King Christian IV of Denmark from 1625 to 1630, Gustavus Adolphus of Sweden from 1630 to 1648 and France under Cardinal Richelieu from 1635 to 1648. The conflict increasingly evolved into a struggle between the French House of Bourbon and the House of Habsburg for predominance in Europe, for which the central German territories of the empire served as the battle ground.
The war ranks among the most catastrophic in history, as three decades of constant warfare and destruction left the land devastated. Marauding armies incessantly pillaged the countryside, seized and levied heavy taxes on cities, and indiscriminately plundered the food stocks of the peasantry. There were also countless bands of murderous outlaws, and masses of sick, homeless and displaced people and invalid soldiers. Overall social and economic disruption caused a dramatic decline in population as a result of indiscriminate killings and rape, endemic infectious diseases, crop failures, famine, declining birth rates, wanton burglary, witch-hunts and the emigration of terrified people. Estimates vary between a 38% drop, from 16 million people in 1618 to 10 million by 1650, and a mere 20% drop, from 20 million to 16 million. The Altmark and Württemberg regions were especially hard hit; it took generations for them to fully recover.
The war was the last major religious struggle in mainland Europe and ended in 1648 with the Peace of Westphalia. It resulted in increased autonomy for the constituent states of the Holy Roman Empire, limiting the power of the emperor. Alsace was permanently lost to France, Pomerania was temporarily lost to Sweden, and the Netherlands officially left the Empire.
The population of Germany reached about twenty million people by the mid-16th century, the great majority of whom were peasant farmers.
The Protestant Reformation was a triumph for literacy and the new printing press. Luther's translation of the Bible into German was a decisive impulse for the increase of literacy and stimulated the printing and distribution of religious books and pamphlets. From 1517 onward religious pamphlets flooded Germany and much of Europe. The Reformation instigated a media revolution: by 1530 over 10,000 individual works had been published, with a total of ten million copies. Luther strengthened his attacks on Rome by depicting a "good" against a "bad" church. It soon became clear that print could be used for propaganda in the Reformation for particular agendas. Reform writers used pre-Reformation styles, clichés, and stereotypes and changed items as needed for their own purposes. Especially effective were Luther's "Small Catechism", for parents to use in teaching their children, and "Larger Catechism", for pastors. Using the German vernacular they expressed the Apostles' Creed in simpler, more personal, Trinitarian language. Illustrations in the newly translated Bible and in many tracts popularized Luther's ideas. Lucas Cranach the Elder (1472–1553), the great painter patronized by the electors of Wittenberg, was a close friend of Luther, and illustrated Luther's theology for a popular audience. He dramatized Luther's views on the relationship between the Old and New Testaments, while remaining mindful of Luther's careful distinctions about proper and improper uses of visual imagery.
Luther's German translation of the Bible was also decisive for the German language and its evolution from Early New High German to Modern Standard German. His Bible promoted the development of non-local forms of language and exposed all speakers to forms of German from outside their own area.
The German astronomical community played a central role in Europe in the early modern period. Several non-German scientists contributed to the community, such as Copernicus, the instigator of the Scientific Revolution, who lived in Royal Prussia, a dependency of the King of Poland, and Tycho Brahe, who worked in Denmark and Bohemia. Astronomer Johannes Kepler from Weil der Stadt was one of the pioneering minds of empirical and rational research. Through rigorous application of the principles of the scientific method he derived his laws of planetary motion. His ideas influenced the contemporary Italian scientist Galileo Galilei and provided fundamental mechanical principles for Isaac Newton's theory of universal gravitation.
In 1640, Frederick William, also called the Great Elector, became ruler of Brandenburg-Prussia and immediately set about throwing off his vassalage under the Kingdom of Poland and reorganizing his loose and scattered territories. In 1648 he acquired East Pomerania through the territorial adjustments of the Peace of Westphalia. King Frederick William I, known as the "Soldier King", who reigned from 1713 to 1740, established the structures of the highly centralized future Prussian state and raised a standing army, which was to play a central role. In order to address the demographic problem of Prussia's largely rural population of about three million, he supported the immigration and settlement of French Huguenots, many of them craftsmen, in urban areas.
The total population of Germany (in its 1914 territorial extent) grew from 16 million in 1700 to 17 million in 1750 and reached 24 million in 1800. The 18th-century economy noticeably profited from widespread practical application of the Scientific method as greater yields and a more reliable agricultural production and the introduction of hygienic standards positively affected the birth rate – death rate balance.
Louis XIV of France waged a series of successful wars in order to extend the French territory. He conquered Alsace and Lorraine (1678–1681), which included the free imperial city of Straßburg, and invaded the Electorate of the Palatinate (1688–1697) in the War of the Grand Alliance. Louis established a number of courts whose sole function was to reinterpret historic decrees and treaties, the Treaties of Nijmegen (1678) and the Peace of Westphalia (1648) in particular, in favor of his policies of conquest. He considered the conclusions of these courts, the "Chambres de réunion", sufficient justification for his boundless annexations. Louis' forces operated inside the Holy Roman Empire largely unopposed, because all available imperial contingents fought in Austria in the Great Turkish War. The Grand Alliance of 1689 took up arms against France and countered any further military advances of Louis. The conflict ended in 1697, as both parties agreed to peace talks after each side had realized that total victory was financially unattainable. The Treaty of Ryswick provided for the return of the Electorate of the Palatinate to the empire.
After the last-minute relief of Vienna from siege and imminent seizure by a Turkish force in 1683, the combined troops of the Holy League, founded the following year, embarked on the military containment of the Ottoman Empire and reconquered Hungary in 1687. The Papal States, the Holy Roman Empire, the Polish–Lithuanian Commonwealth, the Republic of Venice and, from 1686, Russia had joined the league under the leadership of Pope Innocent XI. Prince Eugene of Savoy, who served under Emperor Leopold I, took supreme command in 1697 and decisively defeated the Ottomans in a series of spectacular battles and manoeuvres. The 1699 Treaty of Karlowitz marked the end of the Great Turkish War, and Prince Eugene continued his service for the Habsburg Monarchy as president of the War Council. He effectively ended Turkish rule over most of the territorial states in the Balkans during the Austro-Turkish War of 1716–18. The Treaty of Passarowitz left Austria free to establish royal domains in Serbia and the Banat and to maintain hegemony in Southeast Europe, on which the future Austrian Empire was based.
Frederick II "the Great" is best known for his military genius and unique utilisation of the highly organized army to make Prussia one of the great powers in Europe as well as escaping from almost certain national disaster at the last minute. However he was also an artist, author and philosopher, who conceived and promoted the concept of Enlightened absolutism. 19th-century historians created the romantic image of the glorified warrior and accomplished leader and he served as heroic role model for an aggressive Germany militarism down to 1945 and beyond.
Austrian empress Maria Theresa succeeded in bringing the 1740 to 1748 war over recognition of her succession to the throne to a favorable conclusion. However, Silesia was permanently lost to Prussia as a consequence of the Silesian Wars and the Seven Years' War. The 1763 Treaty of Hubertusburg ruled that Austria and Saxony had to relinquish all claims to Silesia. Prussia, which had nearly doubled its territory, was eventually recognized as a great European power, with the consequence that the politics of the following century were fundamentally influenced by German dualism, the rivalry of Austria and Prussia for supremacy in Central Europe.
The concept of Enlightened absolutism, although rejected by the nobility and citizenry, was advocated in Prussia and Austria and implemented from 1763 onward. Prussian king Frederick II defended the idea in an essay, arguing that the benevolent monarch is simply the "first servant of the state", who exercises his absolute political power for the benefit of the population as a whole. A number of legal reforms (e.g. the abolition of torture and the emancipation of the rural population and the Jews), the reorganization of the Prussian Academy of Sciences, the introduction of compulsory education for boys and girls, and the promotion of religious tolerance, among others, caused rapid social and economic development.
Between 1772 and 1795 Prussia instigated the partitions of Poland by occupying the western territories of the former Polish–Lithuanian Commonwealth. Austria and Russia resolved to acquire the remaining lands, with the effect that Poland ceased to exist as a sovereign state until 1918.
Completely overshadowed by Prussia and Austria, according to historian Hajo Holborn, the smaller German states were generally characterized by political lethargy and administrative inefficiency, often compounded by rulers who were more concerned with their mistresses and their hunting dogs than with the affairs of state. Bavaria was especially unfortunate in this regard; it was a rural land with very heavy debts and few growth centers. Saxony was in economically good shape, although its government was seriously mismanaged, and numerous wars had taken their toll. During the time when Prussia rose rapidly within Germany, Saxony was distracted by foreign affairs. The house of Wettin concentrated on acquiring and then holding on to the Polish throne, which ultimately proved unsuccessful. In Württemberg the duke lavished funds on palaces, mistresses, great celebrations, and hunting expeditions. Many of the city-states of Germany were run by bishops, who in reality were from powerful noble families and showed scant interest in religion. None developed a significant reputation for good government.
In Hesse-Kassel, Landgrave Frederick II, who ruled 1760–1785 as an enlightened despot, raised money by renting soldiers (called "Hessians") to Great Britain to help fight the American Revolutionary War. He combined Enlightenment ideas with Christian values, cameralist plans for central control of the economy, and a militaristic approach toward diplomacy.
Hanover did not have to support a lavish court—its rulers were also kings of England and resided in London. George III, elector (ruler) from 1760 to 1820, never once visited Hanover. The local nobility who ran the country opened the University of Göttingen in 1737; it soon became a world-class intellectual center. Baden sported perhaps the best government of the smaller states. Karl Friedrich ruled well for 73 years (1738–1811) and was an enthusiast for The Enlightenment; he abolished serfdom in 1783.
The smaller states failed to form coalitions with each other, and were eventually overwhelmed by Prussia. Between 1807 and 1871, Prussia swallowed up many of the smaller states, with minimal protest, then went on to found the German Empire. In the process, Prussia became too heterogeneous, lost its identity, and by the 1930s had become an administrative shell of little importance.
The nobility represented the first estate in a typical early modern kingdom of Christian Europe, and Germany was no exception. The empire's pluralistic character also applied to its nobility, which varied greatly in power and wealth, ideas, ambition, loyalty and education. There was, however, a distinction between the "Imperial nobility", the direct vassals of the emperor, and the "Territorial nobility", who had received their fiefs from the territorial princes. Many of the latter had been impoverished, as their standard of life and culture had declined since the end of the medieval period. In an ever more complex economy, they struggled to compete with the patricians and merchants of the cities. The Thirty Years' War marked a reversal of fortunes for those noblemen who seized the initiative and understood the requirements of higher education for a lucrative position in the post-war territorial administration. In the Prussian lands east of the Elbe river the system of manorial jurisdiction guaranteed near universal legal power and economic freedom for the local lords, called Junkers, who dominated not only the localities, but also the Prussian court, and especially the Prussian army. Increasingly after 1815, a centralized Prussian government based in Berlin took over the powers of the nobles, which in terms of control over the peasantry had been almost absolute. To help the nobility avoid indebtedness, Berlin set up a credit institution to provide capital loans in 1809, and extended the loan network to peasants in 1849. When the German Empire was established in 1871, the Junker nobility controlled the army and the navy, the bureaucracy, and the royal court; they generally set governmental policies.
Peasants continued to center their lives in the village, where they were members of a corporate body, and to help manage the community resources and monitor the community life. In the East, they were serfs who were bound permanently to parcels of land. In most of Germany, farming was handled by tenant farmers who paid rents and obligatory services to the landlord, who was typically a nobleman. Peasant leaders supervised the fields and ditches and grazing rights, maintained public order and morals, and supported a village court which handled minor offenses. Inside the family the patriarch made all the decisions, and tried to arrange advantageous marriages for his children. Much of the villages' communal life centered around church services and holy days. In Prussia, the peasants drew lots to choose conscripts required by the army. The noblemen handled external relationships and politics for the villages under their control, and were not typically involved in daily activities or decisions.
The emancipation of the serfs came in 1770–1830, beginning with Schleswig in 1780. The peasants were now ex-serfs and could own their land, buy and sell it, and move about freely. The nobles approved, for now they could buy land owned by the peasants. The chief reformer was Baron vom Stein (1757–1831), who was influenced by The Enlightenment, especially the free market ideas of Adam Smith. The end of serfdom raised the personal legal status of the peasantry. A bank was set up so that landowners could borrow government money to buy land from peasants (the peasants were not allowed to use it to borrow money to buy land until 1850). The result was that the large landowners obtained larger estates, and many peasants became landless tenants, or moved to the cities or to America. The other German states imitated Prussia after 1815. In sharp contrast to the violence that characterized land reform in the French Revolution, Germany handled it peacefully. In Schleswig the peasants, who had been influenced by the Enlightenment, played an active role; elsewhere they were largely passive. Indeed, for most peasants, customs and traditions continued largely unchanged, including the old habits of deference to the nobles whose legal authority remained quite strong over the villagers. Although the peasants were no longer tied to the same land as serfs had been, the old paternalistic relationship in East Prussia lasted into the 20th century.
The agrarian reforms in northwestern Germany in the era 1770–1870 were driven by progressive governments and local elites. They abolished feudal obligations and divided collectively owned common land into private parcels, thus creating a more efficient market-oriented rural economy, which increased productivity and population growth. The reforms also strengthened the traditional social order, because wealthy peasants obtained most of the former common land, while the rural proletariat was left without land; many left for the cities or America. Meanwhile, the division of the common land served as a buffer preserving social peace between nobles and peasants. In the east the serfs were emancipated, but the Junker class maintained its large estates and monopolized political power.
Around 1800 the Catholic monasteries, which had large land holdings, were nationalized and sold off by the government. In Bavaria they had controlled 56% of the land.
A major social change occurring between 1750 and 1850, depending on region, was the end of the traditional "whole house" ("ganzes Haus") system, in which the owner's family lived together in one large building with the servants and craftsmen he employed. They reorganized into separate living arrangements. No longer did the owner's wife take charge of all the females in the different families in the whole house. In the new system, farm owners became more professionalized and profit-oriented. They managed the fields and the household exterior according to the dictates of technology, science, and economics. Farm wives supervised family care and the household interior, to which strict standards of cleanliness, order, and thrift applied. The result was the spread of formerly urban bourgeois values into rural Germany.
The lesser families were now living separately on wages. They had to provide for their own supervision, health, schooling, and old age. At the same time, because of the demographic transition, there were far fewer children, allowing for much greater attention to each child. Increasingly the middle-class family valued its privacy and its inward direction, shedding too-close links with the world of work. Furthermore, the working classes, the middle classes and the upper classes became physically, psychologically and politically more separate. This allowed for the emergence of working-class organizations. It also allowed for declining religiosity among the working class, who were no longer monitored on a daily basis.
From the mid-18th century onward, the recognition and application of Enlightenment ideas and rising cultural, intellectual and spiritual standards led to higher quality works of art in music, philosophy, science and literature. The philosopher Christian Wolff (1679–1754) was a pioneering author on a nearly universal range of Enlightenment rationality topics in Germany and established German as a language of philosophic reasoning, scholarly instruction and research.
In 1685 Margrave Frederick William of Prussia issued the Edict of Potsdam within a week of French king Louis XIV's Edict of Fontainebleau, which decreed the abolition of the 1598 concession of free religious practice to Protestants. Frederick William offered his "co-religionists, who are oppressed and assailed for the sake of the Holy Gospel and its pure doctrine...a secure and free refuge in all Our Lands". Around 20,000 Huguenot refugees arrived in an immediate wave and settled in the cities, 40% in Berlin, the ducal residence, alone. The French Lyceum in Berlin was established in 1689, and by the end of the 17th century French had replaced Latin as the universal language of international diplomacy. The nobility and the educated middle class of Prussia and the various German states increasingly used the French language in public conversation, in combination with universally cultivated manners. Like no other German state, Prussia had access to and the skill set for the application of pan-European Enlightenment ideas to develop more rational political and administrative institutions. The princes of Saxony carried out a comprehensive series of fundamental fiscal, administrative, judicial, educational, cultural and general economic reforms. The reforms were aided by the country's strong urban structure and influential commercial groups, who modernized pre-1789 Saxony along the lines of classic Enlightenment principles.
Johann Gottfried von Herder (1744–1803) broke new ground in philosophy and poetry, as a leader of the Sturm und Drang movement of proto-Romanticism. Weimar Classicism ("Weimarer Klassik") was a cultural and literary movement based in Weimar that sought to establish a new humanism by synthesizing Romantic, classical, and Enlightenment ideas. The movement, from 1772 until 1805, involved Herder as well as polymath Johann Wolfgang von Goethe (1749–1832) and Friedrich Schiller (1759–1805), a poet and historian. Herder argued that every folk had its own particular identity, which was expressed in its language and culture. This legitimized the promotion of German language and culture and helped shape the development of German nationalism. Schiller's plays expressed the restless spirit of his generation, depicting the hero's struggle against social pressures and the force of destiny.
German music, sponsored by the upper classes, came of age under composers Johann Sebastian Bach (1685–1750), Joseph Haydn (1732–1809), and Wolfgang Amadeus Mozart (1756–1791).
Königsberg philosopher Immanuel Kant (1724–1804) tried to reconcile rationalism and religious belief, individual freedom, and political authority. Kant's work contained basic tensions that would continue to shape German thought – and indeed all of European philosophy – well into the 20th century. The ideas of the Enlightenment and their implementation received general approval and recognition as principal cause for widespread cultural progress.
Before the 19th century, young women lived under the economic and disciplinary authority of their fathers until they married and passed under the control of their husbands. In order to secure a satisfactory marriage, a woman needed to bring a substantial dowry. In the wealthier families, daughters received their dowry from their families, whereas poorer women needed to work in order to save their wages and so improve their chances to wed. Under German law, women had property rights over their dowries and inheritances, a valuable benefit as high mortality rates resulted in successive marriages. Before 1789, the majority of women lived confined to society's private sphere, the home.
The Age of Reason did not bring much more for women: men, including Enlightenment aficionados, believed that women were naturally destined to be principally wives and mothers. Within the educated classes, there was the belief that women needed to be sufficiently educated to be intelligent and agreeable interlocutors to their husbands. However, the lower-class women were expected to be economically productive in order to help their husbands make ends meet.
German reaction to the French Revolution was mixed at first. German intellectuals celebrated the outbreak, hoping to see the triumph of Reason and The Enlightenment. The royal courts in Vienna and Berlin denounced the overthrow of the king and the threatened spread of notions of liberty, equality, and fraternity. By 1793, the execution of the French king and the onset of the Terror disillusioned the Bildungsbürgertum (educated middle classes). Reformers said the solution was to have faith in the ability of Germans to reform their laws and institutions in peaceful fashion.
Europe was racked by two decades of war revolving around France's efforts to spread its revolutionary ideals, and the opposition of reactionary royalty. War broke out in 1792 as Austria and Prussia invaded France, but were defeated at the Battle of Valmy (1792). The German lands saw armies marching back and forth, bringing devastation (albeit on a far lower scale than the Thirty Years' War, almost two centuries before), but also bringing new ideas of liberty and civil rights for the people. Prussia and Austria ended their failed wars with France but (with Russia) partitioned Poland among themselves in 1793 and 1795.
France took control of the Rhineland, imposed French-style reforms, abolished feudalism, established constitutions, promoted freedom of religion, emancipated Jews, opened the bureaucracy to ordinary citizens of talent, and forced the nobility to share power with the rising middle class. Napoleon created the Kingdom of Westphalia (1807–1813) as a model state. These reforms proved largely permanent and modernized the western parts of Germany. When the French tried to impose the French language, German opposition grew in intensity. A Second Coalition of Britain, Russia, and Austria then attacked France but failed. Napoleon established direct or indirect control over most of western Europe, including the German states apart from Prussia and Austria. The old Holy Roman Empire was little more than a farce; Napoleon simply abolished it in 1806 while forming new countries under his control. In Germany Napoleon set up the "Confederation of the Rhine," comprising most of the German states except Prussia and Austria.
Under Frederick William II's weak rule (1786–1797) Prussia had undergone a serious economic, political and military decline. His successor, king Frederick William III, tried to remain neutral during the War of the Third Coalition and French emperor Napoleon's dissolution of the Holy Roman Empire and reorganisation of the German principalities. Induced by the queen and a pro-war party, Frederick William joined the Fourth Coalition in October 1806. Napoleon easily defeated the Prussian army at the Battle of Jena and occupied Berlin. Prussia lost its recently acquired territories in western Germany, its army was reduced to 42,000 men, no trade with Britain was allowed, and Berlin had to pay Paris high reparations and fund the French army of occupation. Saxony changed sides to support Napoleon and joined the Confederation of the Rhine. Its ruler, Frederick Augustus I, was rewarded with the title of king and given a slice of Poland taken from Prussia.
After Napoleon's military fiasco in Russia in 1812, Prussia allied with Russia in the Sixth Coalition. A series of battles followed and Austria joined the alliance. Napoleon was decisively defeated in the Battle of Leipzig in late 1813. The German states of the Confederation of the Rhine defected to the Coalition against Napoleon, who rejected any peace terms. Coalition forces invaded France in early 1814, Paris fell, and in April Napoleon surrendered. Prussia, as one of the winners at the Congress of Vienna, gained extensive territory.
In 1815 continental Europe was in a state of overall turbulence and exhaustion, as a consequence of the French Revolutionary and Napoleonic Wars. The liberal spirit of the Enlightenment and Revolutionary era diverged toward the Romanticism of Edmund Burke, Joseph de Maistre and Novalis. The victorious members of the Coalition had negotiated a new peaceful balance of powers in Vienna and agreed to maintain a stable German heartland that would keep French imperialism at bay. However, the idea of reviving the defunct Holy Roman Empire was discarded. Napoleon's reorganization of the German states was continued and the remaining princes were allowed to keep their titles. Already in 1813, in return for guarantees from the Allies that the sovereignty and integrity of the southern German states (Baden, Württemberg, and Bavaria) would be preserved, they had broken with France.
During the 1815 Congress of Vienna, the 39 German states joined the German Confederation, a loose agreement for mutual defense. Attempts at economic integration and customs coordination were frustrated by repressive anti-national policies. Great Britain approved of the union, convinced that a stable, peaceful entity in central Europe could discourage aggressive moves by France or Russia. Most historians, however, have concluded that the Confederation was weak and ineffective, and an obstacle to German nationalism. The union was undermined by the creation of the Zollverein in 1834, the 1848 revolutions and the rivalry between Prussia and Austria, and was finally dissolved in the wake of the Austro-Prussian War of 1866, to be replaced by the North German Confederation in the same year.
Between 1815 and 1865 the population of the German Confederation (excluding Austria) grew by around 60%, from 21 million to 34 million. Simultaneously, the demographic transition took place, as the high birth and death rates of the pre-industrial country shifted to the low birth and death rates of a fast-growing, industrialized urban economy. Increased agricultural productivity secured a steady food supply as famines and epidemics declined, which allowed people to marry earlier and have more children. The high birthrate was offset by a very high rate of infant mortality and, after 1840, by large-scale emigration to the United States. Emigration totaled 480,000 in the 1840s, 1,200,000 in the 1850s, and 780,000 in the 1860s. The upper and middle classes were the first to practice birth control, which was soon universally adopted.
In 1800, Germany's social structure was poorly suited to entrepreneurship or economic development. Domination by France during the French Revolutionary era (1790s to 1815), however, produced important institutional reforms, including the abolition of feudal restrictions on the sale of large landed estates, the reduction of the power of the guilds in the cities, and the introduction of a new, more efficient commercial law. The idea that these reforms were beneficial for industrialization has been contested. Nevertheless, traditionalism remained strong in the many small principalities. Until 1850, the guilds, the landed aristocracy, the churches and the government bureaucracies maintained many rules and restrictions that held entrepreneurship in low esteem and gave it little opportunity to develop. From the 1830s and 1840s, Prussia, Saxony and other states introduced agriculture based on sugar beets, turnips and potatoes, which produced higher yields and enabled a surplus rural population to move to industrial areas.
In the early 19th century the Industrial Revolution was in full swing in Britain, France, and Belgium. The various small German states developed only slowly and independently, as competition among them was strong. Early investments in the railway network during the 1830s came almost exclusively from private hands; without a central regulatory agency, construction projects were realized quickly. Industrialization proper only took off after 1850, in the wake of railroad construction. The textile industry grew rapidly, profiting from the elimination of tariff barriers by the Zollverein. During the second half of the 19th century German industry grew exponentially, and by 1900 Germany was an industrial world leader along with Britain and the United States.
Historian Thomas Nipperdey remarks:
On the whole, industrialisation in Germany must be considered to have been positive in its effects. Not only did it change society and the countryside, and finally the world...it created the modern world we live in. It solved the problems of population growth, under-employment and pauperism in a stagnating economy, and abolished dependency on the natural conditions of agriculture, and finally hunger. It created huge improvements in production and both short- and long-term improvements in living standards. However, in terms of social inequality, it can be assumed that it did not change the relative levels of income. Between 1815 and 1873 the statistical distribution of wealth was on the order of 77% to 23% for entrepreneurs and workers respectively. On the other hand, new problems arose, in the form of interrupted growth and new crises, such as urbanisation, 'alienation', new underclasses, proletariat and proletarian misery, new injustices and new masters and, eventually, class warfare.
In 1800 the population was predominantly rural, as only 10% lived in communities of 5,000 or more people, and only 2% lived in cities of more than 100,000 people. After 1815, the urban population grew rapidly, due to the influx of young people from the rural areas. Berlin grew from 172,000 in 1800, to 826,000 inhabitants in 1870, Hamburg from 130,000 to 290,000, Munich from 40,000 to 269,000 and Dresden from 60,000 to 177,000.
The takeoff stage of economic development came with the railroad revolution in the 1840s, which opened up new markets for local products, created a pool of middle managers, increased the demand for engineers, architects and skilled machinists and stimulated investments in coal and iron. Political disunity of three dozen states and a pervasive conservatism made it difficult to build railways in the 1830s. However, by the 1840s, trunk lines did link the major cities; each German state was responsible for the lines within its own borders. Economist Friedrich List summed up the advantages to be derived from the development of the railway system in 1841:
Lacking a technological base at first, Germany imported its engineering and hardware from Britain. In many cities, the new railway shops were centres of technological awareness and training, so that by 1850 Germany was self-sufficient in meeting the demands of railroad construction, and the railways were a major impetus for the growth of the new steel industry. Observers found that even as late as 1890, German engineering was inferior to Britain's. However, German unification in 1871 stimulated consolidation, nationalisation into state-owned companies, and further rapid growth. Unlike the situation in France, the goal was the support of industrialisation. Eventually numerous lines criss-crossed the Ruhr area and other industrial centers and provided good connections to the major ports of Hamburg and Bremen. By 1880, 9,400 locomotives pulled 43,000 passengers and 30,000 tons of freight a day.
While no national newspaper existed, the many states issued a great variety of printed media, though these rarely exceeded regional significance. A typical town had one or two outlets; urban centers such as Berlin and Leipzig had dozens. The audience was limited to a few percent of male adults, chiefly from the aristocratic and upper middle classes. Liberal publishers outnumbered conservative ones by a wide margin. Foreign governments bribed editors to guarantee a favorable image. Censorship was strict, and the imperial government issued the political news that was supposed to be published. After 1871, strict press laws were enforced by Bismarck to contain the Socialists and hostile editors. Editors focused on political commentary, culture, the arts, high culture and popular serialized novels. Magazines were politically more influential and attracted intellectual authors.
19th-century artists and intellectuals were greatly inspired by the ideas of the French Revolution and by the great poets and writers Johann Wolfgang von Goethe (1749–1832), Gotthold Ephraim Lessing (1729–1781) and Friedrich Schiller (1759–1805). The Sturm und Drang romantic movement was embraced and emotion was given free expression in reaction to the perceived rationalism of the Enlightenment. Philosophical principles and methods were revolutionized by Immanuel Kant's paradigm shift. Ludwig van Beethoven (1770–1827) was the most influential composer of the transition from Classical to Romantic music. His use of tonal architecture to allow significant expansion of musical forms and structures was immediately recognized as bringing a new dimension to music. His later piano music and string quartets, especially, showed the way to a completely unexplored musical universe and influenced Franz Schubert (1797–1828) and Robert Schumann (1810–1856). In opera, a new Romantic atmosphere combining supernatural terror and melodramatic plot in a folkloric context was first successfully achieved by Carl Maria von Weber (1786–1826) and perfected by Richard Wagner (1813–1883) in his Ring cycle. The Brothers Grimm (1785–1863 and 1786–1859) collected folk stories into the popular Grimm's Fairy Tales and are ranked among the founding fathers of German studies; they initiated work on the Deutsches Wörterbuch ("The German Dictionary"), the most comprehensive work on the German language.
University professors developed international reputations, especially in the humanities, led by history and philology, which brought a new historical perspective to the study of political history, theology, philosophy, language, and literature. Georg Wilhelm Friedrich Hegel (1770–1831), Friedrich Wilhelm Joseph Schelling (1775–1854), Arthur Schopenhauer (1788–1860), Friedrich Nietzsche (1844–1900), Max Weber (1864–1920), Karl Marx (1818–1883) and Friedrich Engels (1820–1895) in philosophy, Friedrich Schleiermacher (1768–1834) in theology, and Leopold von Ranke (1795–1886) in history became famous. The University of Berlin, founded in 1810, became the world's leading university. Von Ranke, for example, professionalized history and set the world standard for historiography. By the 1830s mathematics, physics, chemistry, and biology had emerged as world-class sciences, led by Alexander von Humboldt (1769–1859) in natural science and Carl Friedrich Gauss (1777–1855) in mathematics. Young intellectuals often turned to politics, but their support for the failed revolution of 1848 forced many into exile.
Two main developments reshaped religion in Germany. Across the land, there was a movement to unite the larger Lutheran and the smaller Reformed Protestant churches. The churches themselves brought this about in Baden, Nassau, and Bavaria. However, in Prussia King Frederick William III was determined to handle unification entirely on his own terms, without consultation. His goal was to unify the Protestant churches and to impose a single standardized liturgy, organization and even architecture. The long-term goal was fully centralized royal control of all the Protestant churches. In a series of proclamations over several decades the Church of the Prussian Union was formed, bringing together the more numerous Lutherans and the less numerous Reformed Protestants. The government of Prussia now had full control over church affairs, with the king himself recognized as the leading bishop. Opposition to unification came from the "Old Lutherans" in Silesia, who clung tightly to the theological and liturgical forms they had followed since the days of Luther. The government attempted to crack down on them, so they went underground. Tens of thousands migrated to South Australia and especially to the United States, where they formed the Missouri Synod, which is still in operation as a conservative denomination. Finally, in 1845, the new king Frederick William IV offered a general amnesty and allowed the Old Lutherans to form a separate church association with only nominal government control.
From the religious point of view of the typical Catholic or Protestant, major changes were underway in terms of a much more personalized religiosity that focused on the individual more than on the church or the ceremony. The rationalism of the late 18th century faded away, and there was a new emphasis on the psychology and feeling of the individual, especially in terms of contemplating sinfulness, redemption, and the mysteries and revelations of Christianity. Pietistic revivals were common among Protestants. Among Catholics there was a sharp increase in popular pilgrimages: in 1844 alone, half a million pilgrims made a pilgrimage to the city of Trier in the Rhineland to view the Seamless Robe of Jesus, said to be the robe that Jesus wore on the way to his crucifixion. Catholic bishops in Germany had historically been largely independent of Rome, but now the Vatican exerted increasing control, a new "ultramontanism" of Catholics highly loyal to Rome. A sharp controversy broke out in 1837–38 in the largely Catholic Rhineland over the religious education of children of mixed marriages, where the mother was Catholic and the father Protestant. The government passed laws to require that these children always be raised as Protestants, contrary to the Napoleonic law that had previously prevailed and had allowed the parents to make the decision; it also put the Catholic Archbishop of Cologne under house arrest. In 1840, the new King Frederick William IV sought reconciliation and ended the controversy by agreeing to most of the Catholic demands. However, Catholic memories remained deep and led to a sense that Catholics always needed to stick together in the face of an untrustworthy government.
After the fall of Napoleon, Europe's statesmen convened in Vienna in 1815 for the reorganisation of European affairs, under the leadership of the Austrian Prince Metternich. The political principles agreed upon at this Congress of Vienna included the restoration, legitimacy and solidarity of rulers for the repression of revolutionary and nationalist ideas.
The German Confederation was founded, a loose union of 39 states (35 ruling princes and 4 free cities) under Austrian leadership, with a Federal Diet meeting in Frankfurt am Main. The Confederation failed to satisfy most nationalists: the member states largely went their own way, and Austria pursued its own interests.
In 1819 a student radical assassinated the reactionary playwright August von Kotzebue, who had scoffed at liberal student organisations. In one of the few major actions of the German Confederation, Prince Metternich called a conference that issued the repressive Carlsbad Decrees, designed to suppress liberal agitation against the conservative governments of the German states. The Decrees terminated the fast-fading nationalist fraternities, removed liberal university professors, and expanded the censorship of the press. They began the "persecution of the demagogues", directed against individuals accused of spreading revolutionary and nationalist ideas. Among the persecuted were the poet Ernst Moritz Arndt, the publisher Johann Joseph Görres and the "Father of Gymnastics" Ludwig Jahn.
In 1834 the Zollverein was established, a customs union between Prussia and most other German states, but excluding Austria. As industrialisation developed, the need for a unified German state with a uniform currency, legal system, and government became more and more obvious.
Growing discontent with the political and social order imposed by the Congress of Vienna led to the outbreak, in 1848, of the March Revolution in the German states. In May the German National Assembly (the Frankfurt Parliament) met in Frankfurt to draw up a national German constitution.
But the 1848 revolution turned out to be unsuccessful: King Frederick William IV of Prussia refused the imperial crown, the Frankfurt parliament was dissolved, the ruling princes repressed the risings by military force, and the German Confederation was re-established by 1850. Many leaders went into exile, including a number who went to the United States and became a political force there.
The 1850s were a period of extreme political reaction. Dissent was vigorously suppressed, and many Germans emigrated to America following the collapse of the 1848 uprisings. Frederick William IV became extremely depressed and melancholic during this period, and was surrounded by men who advocated clericalism and absolute divine monarchy. The Prussian people once again lost interest in politics. Prussia not only expanded its territory but began to industrialize rapidly, while maintaining a strong agricultural base.
In 1857, the Prussian king Frederick William IV suffered a stroke, and his brother William served as regent until 1861, when he became King William I. Although conservative, William was very pragmatic. His most significant accomplishment was the naming of Otto von Bismarck as Prussian minister president in 1862. The cooperation of Bismarck, Defense Minister Albrecht von Roon and Field Marshal Helmuth von Moltke set the stage for the military victories over Denmark, Austria and France that led to the unification of Germany.
In 1863–64, disputes between Prussia and Denmark escalated over Schleswig, which was not part of the German Confederation and which Danish nationalists wanted to incorporate into the Danish kingdom. The conflict led to the Second War of Schleswig in 1864. Prussia, joined by Austria, easily defeated Denmark and occupied Jutland. The Danes were forced to cede both the Duchy of Schleswig and the Duchy of Holstein to Austria and Prussia. The subsequent management of the two duchies led to tensions between Austria and Prussia: Austria wanted the duchies to become an independent entity within the German Confederation, while Prussia intended to annex them. The disagreement served as a pretext for the Seven Weeks War between Austria and Prussia, which broke out in June 1866. In July, the two armies clashed at Sadowa-Königgrätz (Bohemia) in an enormous battle involving half a million men. Prussia's superior logistics and the superiority of its modern breech-loading needle guns over the Austrians' slow muzzle-loading rifles proved decisive for its victory. The battle also decided the struggle for hegemony in Germany, and Bismarck was deliberately lenient with defeated Austria, which was to play only a subordinate role in future German affairs.
After the Seven Weeks War the German Confederation was dissolved and the North German Confederation (German: Norddeutscher Bund) was established under the leadership of Prussia. Austria was excluded, and its immense influence over Germany finally came to an end. The North German Confederation was a transitional organisation that existed from 1867 to 1871, between the dissolution of the German Confederation and the founding of the German Empire.
Chancellor Otto von Bismarck determined the political course of the German Empire until 1890. He fostered alliances in Europe to contain France on the one hand and aspired to consolidate Germany's influence in Europe on the other. His principal domestic policies focused on the suppression of socialism and the reduction of the Roman Catholic Church's strong influence on its adherents. He issued a series of anti-socialist laws, accompanied by a set of social laws that included universal health care, pension plans and other social security programs. His Kulturkampf policies were vehemently resisted by Catholics, who organized political opposition in the Center (Zentrum) Party. German industrial and economic power had grown to match Britain's by 1900.
In 1888, the young and ambitious Kaiser Wilhelm II became emperor. He rejected advice from experienced politicians and ordered Bismarck's resignation in 1890. He opposed Bismarck's careful and delicate foreign policy and was determined to pursue colonialist policies, as Britain and France had been doing for centuries. The Kaiser promoted the active colonization of Africa and Asia in the lands that were not already colonies of other European powers. He took a mostly unilateral approach in Europe, allied only with Austria-Hungary, and embarked on a dangerous naval arms race with Britain. His aggressive and misguided policies greatly contributed to the situation in which the assassination of the Austro-Hungarian crown prince would spark off World War I.
In 1868, the Spanish queen Isabella II was deposed in the Glorious Revolution, leaving the country's throne vacant. When Prussia suggested the Hohenzollern candidate, Prince Leopold, as successor, France vehemently objected. The matter evolved into a diplomatic scandal, and in July 1870 France resolved to settle it in a full-scale war. The conflict was quickly decided, as Prussia, joined by the forces of a pan-German alliance, never gave up the tactical initiative. A series of victories in north-eastern France followed, and another French army group was simultaneously encircled at Metz. A few weeks later, the French army contingent under Emperor Napoleon III's personal command was finally forced to capitulate in the fortress of Sedan. Napoleon was taken prisoner and a provisional government was hastily proclaimed in Paris. The new government resolved to fight on and tried to reorganize the remaining armies while the Germans settled down to besiege Paris. The starving city surrendered in January 1871, and Jules Favre signed the surrender at Versailles. France was forced to pay indemnities of 5 billion francs and cede Alsace-Lorraine to Germany. This conclusion left the French national psyche deeply humiliated and further aggravated Franco-German enmity.
During the Siege of Paris, the German princes assembled in the Hall of Mirrors of the Palace of Versailles on 18 January 1871 and announced the establishment of the German Empire, proclaiming the Prussian King Wilhelm I as German Emperor. The act unified all ethnic German states except Austria in the Little German solution of a federal economic, political and administrative unit. Bismarck was appointed to serve as Chancellor.
The new empire was a federal union of 25 states that varied considerably in size, demography, constitution, economy, culture, religion and socio-political development. However, even Prussia itself, which accounted for two-thirds of the empire's territory and population, had emerged from the empire's periphery as a newcomer. It also faced colossal internal cultural and economic divisions. The Prussian provinces of Westphalia and the Rhineland, for example, had been under French control during the previous decades. The local people, who had benefited from the liberal civil reforms derived from the ideas of the French Revolution, had little in common with the predominantly rural communities of the authoritarian and disjointed Junker estates of Pomerania.
The inhabitants of the smaller territorial lands, especially in central and southern Germany, largely rejected the Prussianized concept of the nation and preferred to associate such terms with their individual home states. The Hanseatic port cities of Hamburg, Bremen and Lübeck ranked among the most ferocious opponents of the "so-called contract with Prussia". As advocates of free trade, they objected to Prussian ideas of economic integration and refused to sign the renewed Zollverein (customs union) treaties until 1888. The Hanseatic merchants' overseas economic success corresponded with their globalist mindset. The citizens of Hamburg, whom Bismarck characterized as "extremely irritating" and the German ambassador in London called "the worst Germans we have", were particularly appalled by Prussian militarism and its unopposed growing influence.
The Prusso-German authorities were aware of the need for integration concepts, as the results and the 52% voter turnout of the first imperial elections had clearly demonstrated. Historians increasingly argue that the nation-state was "forged through empire". National identity was expressed in bombastic imperial stone iconography and was to be achieved as an imperial people, with "an emperor as head of state and it was to develop imperial ambitions" – domestic, European and global.
Bismarck's domestic policies as Chancellor of Germany were based on his effort to universally adopt the idea of the Protestant Prussian state and achieve the clear separation of church and state in all imperial principalities. In the Kulturkampf (lit.: culture struggle) from 1871 to 1878, he tried to minimize the influence of the Roman Catholic Church and its political arm, the Catholic Centre Party, via secularization of all education and introduction of civil marriage, but without success. The Kulturkampf antagonised many Protestants as well as Catholics and was eventually abandoned. The millions of non-German imperial subjects, like the Polish, Danish and French minorities, were left with no choice but to endure discrimination or accept the policies of Germanisation.
The new Empire provided attractive top-level career opportunities for the national nobility in the various branches of the consular and civil services and in the army. As a consequence, near-total aristocratic control of the civil sector guaranteed a dominant voice in decision making in the universities and the churches. The 1914 German diplomatic corps consisted of 8 princes, 29 counts, 20 barons, 54 representatives of the lower nobility and a mere 11 commoners, the latter recruited from elite industrialist and banking families. The consular corps employed numerous commoners, who, however, occupied positions of little to no executive power. The Prussian tradition of reserving the highest military ranks for young aristocrats was adopted, and the new constitution put all military affairs under the direct control of the Emperor and beyond the control of the Reichstag. With its large corps of reserve officers across Germany, the military strengthened its role as "the estate which upheld the nation", and historian Hans-Ulrich Wehler added: "it became an almost separate, self-perpetuating caste."
Power was increasingly centralized among the 7,000 aristocrats who resided in the national capital of Berlin and neighboring Potsdam. Berlin's rapidly growing, wealthy middle class copied the aristocracy and tried to marry into it. A peerage could permanently boost a rich industrial family into the upper reaches of the establishment. However, the process tended to work in the other direction as well, with the nobility becoming industrialists: for example, 221 of the 243 mines in Silesia were owned by nobles or by the King of Prussia himself.
The middle class in the cities grew exponentially, although it never acquired the powerful parliamentary representation and legislative rights found in France, Britain or the United States. The Association of German Women's Organizations (BDF) was established in 1894 to encompass the proliferating women's organizations that had emerged since the 1860s. From the beginning the BDF was a bourgeois organization, its members working toward equality with men in such areas as education, financial opportunities, and political life. Working-class women were not welcome; they were organized by the Socialists.
The Socialist Workers' Party (later known as the Social Democratic Party of Germany, SPD) aimed to peacefully establish a socialist order through the transformation of the existing political and social conditions. From 1878, Bismarck tried to oppose the growing social democratic movement by outlawing the party's organisation, its assemblies and most of its newspapers. Nonetheless, the Social Democrats grew stronger, and Bismarck initiated his social welfare program in 1883 in order to appease the working class.
Bismarck built on a tradition of welfare programs in Prussia and Saxony that began as early as the 1840s. In the 1880s he introduced old age pensions, accident insurance, medical care, and unemployment insurance that formed the basis of the modern European welfare state. His paternalistic programs won the support of German industry because its goals were to win the support of the working classes for the Empire and reduce the outflow of immigrants to America, where wages were higher but welfare did not exist. Bismarck further won the support of both industry and skilled workers by his high tariff policies, which protected profits and wages from American competition, although they alienated the liberal intellectuals who wanted free trade.
Bismarck would not tolerate any power outside Germany—as in Rome—having a say in domestic affairs. He launched the Kulturkampf ("culture war") against the power of the pope and the Catholic Church in 1873, but only in the state of Prussia. This gained strong support from German liberals, who saw the Catholic Church as the bastion of reaction and their greatest enemy. The Catholic element, in turn, saw the National Liberals as its worst enemy and formed the Center Party.
Catholics, although nearly a third of the national population, were seldom allowed to hold major positions in the Imperial government, or the Prussian government. After 1871, there was a systematic purge of the remaining Catholics; in the powerful interior ministry, which handled all police affairs, the only Catholic was a messenger boy. Jews were likewise heavily discriminated against.
Most of the Kulturkampf was fought out in Prussia, but Imperial Germany passed the Pulpit Law which made it a crime for any cleric to discuss public issues in a way that displeased the government. Nearly all Catholic bishops, clergy, and laymen rejected the legality of the new laws and defiantly faced the increasingly heavy penalties and imprisonments imposed by Bismarck's government. Historian Anthony Steinhoff reports the casualty totals:
As of 1878, only three of eight Prussian dioceses still had bishops, some 1,125 of 4,600 parishes were vacant, and nearly 1,800 priests ended up in jail or in exile ... Finally, between 1872 and 1878, numerous Catholic newspapers were confiscated, Catholic associations and assemblies were dissolved, and Catholic civil servants were dismissed merely on the pretence of having Ultramontane sympathies.
Bismarck underestimated the resolve of the Catholic Church and did not foresee the extremes that this struggle would attain. The Catholic Church denounced the harsh new laws as anti-Catholic and mustered the support of its rank and file voters across Germany. In the following elections, the Center Party won a quarter of the seats in the Imperial Diet. The conflict ended after 1879 because Pope Pius IX died in 1878 and Bismarck broke with the Liberals to put his main emphasis on tariffs, foreign policy, and attacking socialists. Bismarck negotiated with the conciliatory new pope Leo XIII. Peace was restored, the bishops returned and the jailed clerics were released. Laws were toned down or taken back (Mitigation Laws 1880–1883 and Peace Laws 1886/87), but the laws concerning education, civil registry of marriages and religious disaffiliation remained in place. The Center Party gained strength and became an ally of Bismarck, especially when he attacked socialism.
Chancellor Bismarck's imperial foreign policy basically aimed at security and the prevention of a Franco-Russian alliance, in order to avoid a likely two-front war. The League of Three Emperors was signed in 1873 by Russia, Austria, and Germany. It stated that republicanism and socialism were common enemies and that the three powers would discuss any matters concerning foreign policy. Bismarck needed good relations with Russia in order to keep France isolated. Russia fought a victorious war against the Ottoman Empire from 1877 to 1878 and attempted to establish the Principality of Bulgaria, which was strongly opposed by France and Britain in particular, as both were long concerned with the preservation of the Ottoman Empire and with containing Russia at the Bosphorus Strait and the Black Sea. Germany hosted the Congress of Berlin in 1878, where a more moderate peace settlement was agreed upon.
In 1879, Germany formed the Dual Alliance with Austria-Hungary, an agreement of mutual military assistance in the case of an attack by Russia, which was not satisfied with the agreement reached at the Congress of Berlin. The establishment of the Dual Alliance led Russia to take a more conciliatory stance, and in 1887 the so-called Reinsurance Treaty was signed between Germany and Russia. In it, the two powers agreed on mutual support in the case that France attacked Germany or Austria attacked Russia. Russia turned its attention eastward to Asia and remained largely inactive in European politics for the next 25 years. In 1882, Italy, seeking supporters for its interests in North Africa against France's colonial policy, joined the Dual Alliance, which became the Triple Alliance. In return for German and Austrian support, Italy committed itself to assisting Germany in the case of a French attack.
Bismarck had always argued that the acquisition of overseas colonies was impractical and that the burden of administration and maintenance would outweigh the benefits. Eventually, Bismarck gave way, and a number of colonies were established in Africa (Togo, the Cameroons, German South-West Africa, and German East Africa) and in Oceania (German New Guinea, the Bismarck Archipelago, and the Marshall Islands). In this context, Bismarck initiated the Berlin Conference of 1884–85, a formal meeting of the European colonial powers, which sought to "establish international guidelines for the acquisition of African territory" (see Colonisation of Africa). Its outcome, the "General Act of the Berlin Conference", can be seen as the formalisation of the "Scramble for Africa" and of "New Imperialism".
Emperor William I died in 1888. His son Frederick III, open to a more liberal political course, reigned for only ninety-nine days; stricken with throat cancer, he died three months after his accession. His son Wilhelm II followed him on the throne at the age of 29. Wilhelm rejected the liberal ideas of his parents and embarked on a conservative autocratic rule. Early on he decided to replace the political elite, and in March 1890 he forced chancellor Bismarck into retirement. Following his principle of "Personal Regiment", Wilhelm was determined to exercise maximum influence on all government affairs.
The young Kaiser Wilhelm set out to apply his imperialist ideas of "Weltpolitik" ("world politics"), as he envisaged a gratuitously aggressive political course to increase the empire's influence in and control over the world. After the removal of Bismarck, foreign policy was handled by the Kaiser and the Federal Foreign Office under Friedrich von Holstein. Wilhelm's increasingly erratic and reckless conduct was unmistakably related to his character deficits and lack of diplomatic skill. The foreign office's rather sketchy assessment of the situation and its recommendations for the empire's most suitable course of action were:
First, a long-term coalition between France and Russia would fall apart; second, Russia and Britain would never come together; and finally, Britain would eventually seek an alliance with Germany.
Subsequently, Wilhelm refused to renew the Reinsurance Treaty with Russia. Russia promptly formed a closer relationship with France in the Dual Alliance of 1894, as both countries were alarmed by Germany's new unpredictability. Anglo-German relations, from the British point of view, offered no basis for consensus either, as the Kaiser refused to abandon his aggressive, if somewhat desperately anachronistic, imperial engagement, and the naval arms race in particular. Von Holstein's analysis thus proved mistaken on every point, and Wilhelm failed as well, since he never adopted a nuanced political dialogue. Germany was left gradually isolated and dependent on the Triple Alliance with Austria-Hungary and Italy, an agreement hampered by differences between Austria and Italy; in 1915 Italy left the alliance.
In 1897 Admiral Alfred von Tirpitz, state secretary of the German Imperial Naval Office, devised his initially rather practical, yet nonetheless ambitious, plan to build a sizeable naval force. Although it posed only an indirect threat as a fleet in being, Tirpitz theorized that its mere existence would force Great Britain, dependent on unrestricted movement on the seas, to agree to diplomatic compromises. Tirpitz started the warship construction program in 1898 and enjoyed the full support of Kaiser Wilhelm. Wilhelm entertained less rational ideas about the fleet, which circled around his romantic childhood dream of having a "fleet of his own some day" and his obsessive adherence to Alfred Thayer Mahan's work The Influence of Sea Power upon History. In exchange for concessions over the East African island of Zanzibar, Germany had acquired from Britain in 1890 the island of Heligoland in the German Bight, converting it into a naval base with immense coastal defence batteries. Britain considered the imperial German endeavours a dangerous infringement on the century-old delicate balance of global affairs and seaborne trade under British control. The British resolved to keep up the naval arms race and introduced the highly advanced new "Dreadnought" battleship concept in 1907. Germany quickly adopted the concept, and by 1910 the arms race had escalated again.
In the First Moroccan Crisis of 1905, Germany nearly clashed with Britain and France when the latter attempted to establish a protectorate over Morocco. Upset at not having been informed of French intentions, Kaiser Wilhelm II declared Germany's support for Moroccan independence in a highly provocative speech at Tangier. The following year a conference was held in which all of the European powers except Austria-Hungary (by now little more than a German satellite) sided with France. A compromise was brokered by the United States in which the French relinquished some, but not all, control over Morocco.
The Second Moroccan Crisis of 1911 saw another dispute erupt when France tried to suppress a revolt in Morocco. Germany, still smarting from the previous quarrel, agreed to a settlement whereby the French ceded some territory in central Africa in exchange for Germany's renouncing any right to intervene in Moroccan affairs. This confirmed French control over Morocco, which became a full French protectorate in 1912.
By 1890 the economy continued to industrialize and grow at an even higher rate than during the previous two decades, and it expanded dramatically in the years leading up to World War I. Growth rates for the individual branches and sectors often varied considerably, and the periodical figures provided by the "Kaiserliches Statistisches Amt" ("Imperial Statistical Bureau") are often disputed or mere assessments. Classification and naming of internationally traded commodities and exported goods were still in progress, and the structure of production and export changed over these four decades. Published documents provide numbers such as: the proportion of goods manufactured by modern industry was approximately 25% in 1900, while the proportion of consumer-related products in manufactured exports stood at 40%. Reasonably exact are the figures for total industrial production between 1870 and 1914, which increased about 500%.
Historian J. A. Perkins argued that more important than Bismarck's new tariff on imported grain was the introduction of the sugar beet as a main crop. Farmers quickly abandoned traditional, inefficient practices in favor of modern methods, including the use of artificial fertilizers and mechanical tools. Intensive methodical farming of sugar and other root crops made Germany the most efficient agricultural producer in Europe by 1914. Even so, farms were usually small in size and women did much of the field work. An unintended consequence was the increased dependence on migratory, especially foreign, labor.
The basics of the modern chemical research laboratory layout and the introduction of essential equipment and instruments such as Bunsen burners, the Petri dish, the Erlenmeyer flask, task-oriented working principles and team research originated in 19th-century Germany and France. The organisation of knowledge acquisition was further refined by the integration of laboratories into the research institutes of the universities and of industry. Germany acquired the leading role in the world's chemical industry by the late 19th century through strictly organized methodology. In 1913, the German chemical industry produced almost 90 percent of the global supply of dyestuffs and sold about 80 percent of its production abroad.
Germany became Europe's leading steel-producing nation in the 1890s, thanks in large part to the protection from American and British competition afforded by tariffs and cartels. The leading firm was "Friedrich Krupp AG", run by the Krupp family. The 1926 merger of several major firms into the "Vereinigte Stahlwerke" (United Steel Works) was modeled on the United States Steel Corporation. The new company emphasized rationalization of management structures and modernization of technology; it employed a multi-divisional structure and used return on investment as its measure of success. By 1913, American and German exports dominated the world steel market, as Britain slipped to third place.
In machinery, iron and steel, and other industries, German firms avoided cut-throat competition and instead relied on trade associations. Germany was a world leader because of its prevailing "corporatist mentality", its strong bureaucratic tradition, and the encouragement of the government. These associations regulated competition and allowed small firms to function in the shadow of much larger companies.
Germany's unification process after 1871 was heavily dominated by men and gave priority to the "Fatherland" theme and related male issues, such as military prowess. Nevertheless, middle-class women enrolled in the "Bund Deutscher Frauenvereine" (BDF), the Union of German Feminist Organizations. Founded in 1894, it grew to include 137 separate women's rights groups by 1907 and existed until 1933, when the Nazi regime disbanded it. The BDF gave national direction to the proliferating women's organizations that had sprung up since the 1860s. From the beginning the BDF was a bourgeois organization, its members working toward equality with men in such areas as education, financial opportunities, and political life. Working-class women were not welcome; they were organized by the Socialists.
Formal organizations for promoting women's rights grew in numbers during the Wilhelmine period. German feminists began to network with feminists from other countries, and participated in the growth of international organizations.
By the 1890s, German colonial expansion in Asia and the Pacific (Kiautschou in China, the Marianas, the Caroline Islands, Samoa) led to frictions with Britain, Russia, Japan and the United States. The construction of the Baghdad Railway, financed by German banks, was designed eventually to connect Germany with the Ottoman Empire and the Persian Gulf, but it also collided with British and Russian geopolitical interests.
The largest colonial enterprises were in Africa. The harsh treatment of the Nama and Herero in what is now Namibia in Africa in 1906–07 led to charges of genocide against the Germans. Historians are examining the links and precedents between the Herero and Namaqua Genocide and the Holocaust of the 1940s.
Ethnic demands for nation states upset the balance between the empires that dominated Europe, leading to World War I, which started in August 1914. Germany stood behind its ally Austria in a confrontation with Serbia, but Serbia was under the protection of Russia, which was allied to France. Germany was the leader of the Central Powers, which included Austria-Hungary, the Ottoman Empire, and later Bulgaria; arrayed against them were the Allies, consisting chiefly of Russia, France, Britain, and in 1915 Italy.
In explaining why neutral Britain went to war with Germany, author Paul M. Kennedy recognized that Germany's becoming economically more powerful than Britain was critical to the coming of war, but he downplayed the disputes over economic trade imperialism, the Baghdad Railway, confrontations in Central and Eastern Europe, highly charged political rhetoric, and domestic pressure groups. Germany's reliance time and again on sheer power, while Britain increasingly appealed to moral sensibilities, played a role, especially in whether the invasion of Belgium was seen as a necessary military tactic or a profound moral crime. In Kennedy's view, the German invasion of Belgium was not in itself decisive, because the British decision had already been made and the British were more concerned with the fate of France. Kennedy argues that by far the main reason was London's fear that a repeat of 1870 – when Prussia and the German states smashed France – would mean that Germany, with a powerful army and navy, would control the English Channel and northwest France. British policy makers insisted that would be a catastrophe for British security.
In the west, Germany sought a quick victory by encircling Paris using the Schlieffen Plan. But it failed due to Belgian resistance, Berlin's diversion of troops, and very stiff French resistance on the Marne, north of Paris.
The Western Front became an extremely bloody battleground of trench warfare. The stalemate lasted from 1914 until early 1918, with ferocious battles that moved forces a few hundred yards at best along a line that stretched from the North Sea to the Swiss border. The British imposed a tight naval blockade in the North Sea which lasted until 1919, sharply reducing Germany's overseas access to raw materials and foodstuffs. Food scarcity became a serious problem by 1917.
The United States joined with the Allies in April 1917. The entry of the United States into the war – following Germany's declaration of unrestricted submarine warfare – marked a decisive turning-point against Germany.
The fighting on the Eastern Front was more open. In the east, there were decisive victories against the Russian army, beginning with the trapping and defeat of large parts of the Russian contingent at the Battle of Tannenberg, followed by huge Austrian and German successes. The breakdown of Russian forces – exacerbated by internal turmoil caused by the 1917 Russian Revolution – led to the Treaty of Brest-Litovsk, which the Bolsheviks were forced to sign on 3 March 1918 as Russia withdrew from the war. It gave Germany control of Eastern Europe. Spencer Tucker says, "The German General Staff had formulated extraordinarily harsh terms that shocked even the German negotiator." When Germany later complained that the Treaty of Versailles of 1919 was too harsh, the Allies responded that it was more benign than Brest-Litovsk.
By defeating Russia in 1917 Germany was able to bring hundreds of thousands of combat troops from the east to the Western Front, giving it a numerical advantage over the Allies. By retraining the soldiers in new storm-trooper tactics, the Germans expected to unfreeze the battlefield and win a decisive victory before the American army arrived in strength. However, the spring offensives all failed, as the Allies fell back and regrouped, and the Germans lacked the reserves necessary to consolidate their gains. In the summer, with the Americans arriving at 10,000 a day and the German reserves exhausted, it was only a matter of time before multiple Allied offensives destroyed the German army.
Germany plunged unexpectedly into World War I (1914–1918). It rapidly mobilized its civilian economy for the war effort, but the economy was handicapped by the British blockade that cut off food supplies.
Meanwhile, conditions deteriorated rapidly on the home front, with severe food shortages reported in all urban areas. Causes involved the transfer of many farmers and food workers into the military, an overburdened railroad system, shortages of coal, and the British blockade that cut off imports from abroad. The winter of 1916–1917 was known as the "turnip winter," because that vegetable, usually fed to livestock, was used by people as a substitute for potatoes and meat, which were increasingly scarce. Thousands of soup kitchens were opened to feed the hungry people, who grumbled that the farmers were keeping the food for themselves. Even the army had to cut the rations for soldiers. Morale of both civilians and soldiers continued to sink.
1918 was also the year of the deadly Spanish flu pandemic, which struck hard at a population weakened by years of malnutrition.
The end of October 1918, in Wilhelmshaven, in northern Germany, saw the beginning of the German Revolution of 1918–19. Units of the German Navy refused to set sail for a last, large-scale operation in a war which they saw as good as lost, initiating the uprising. On 3 November, the revolt spread to other cities and states of the country, in many of which workers' and soldiers' councils were established. Meanwhile, Hindenburg and the senior commanders had lost confidence in the Kaiser and his government. The Kaiser and all German ruling princes abdicated. On 9 November 1918, the Social Democrat Philipp Scheidemann proclaimed a Republic.
On 11 November, the Compiègne armistice was signed, ending the war. The Treaty of Versailles was signed on 28 June 1919. Germany was to cede Alsace-Lorraine to France. Eupen-Malmédy would temporarily be ceded to Belgium, with a plebiscite to be held to allow the people the choice of the territory either remaining with Belgium or being returned to German control. Following a plebiscite, the territory was allotted to Belgium on 20 September 1920.

The future of North Schleswig was to be decided by plebiscite. In the Schleswig Plebiscites, the Danish-speaking population in the north voted for Denmark and the German-speaking populace in the south voted for Germany; Schleswig was thus partitioned. Holstein remained German without a referendum.

Memel was ceded to the Allied and Associated Powers, which were to decide the future of the area. On 9 January 1923, Lithuanian forces invaded the territory. Following negotiations, on 8 May 1924 the League of Nations ratified the annexation on the grounds that Lithuania accepted the Memel Statute, a power-sharing arrangement to protect non-Lithuanians in the territory and preserve its autonomous status. Until 1929, German-Lithuanian co-operation increased and this power-sharing arrangement worked.

Poland was restored, and most of the provinces of Posen and West Prussia, as well as some areas of Upper Silesia, were reincorporated into the re-formed country after plebiscites and independence uprisings. All German colonies were to be handed over to the League of Nations, which then assigned them as mandates to Australia, France, Japan, New Zealand, Portugal, and the United Kingdom. The new owners were required to act as disinterested trustees over the regions, promoting the welfare of their inhabitants in a variety of ways until they were able to govern themselves. The left and right banks of the Rhine were to be permanently demilitarised.
The industrially important Saarland was to be governed by the League of Nations for 15 years and its coalfields administered by France. At the end of that time a plebiscite was to determine the Saar's future status. To ensure execution of the treaty's terms, Allied troops would occupy the left (German) bank of the Rhine for a period of 5–15 years. The German army was to be limited to 100,000 officers and men; the general staff was to be dissolved; vast quantities of war material were to be handed over and the manufacture of munitions rigidly curtailed. The navy was to be similarly reduced, and no military aircraft were allowed. Germany was also required to pay reparations for all civilian damage caused during the war.
The humiliating peace terms in the Treaty of Versailles provoked bitter indignation throughout Germany and seriously weakened the new democratic regime. In December 1918, the Communist Party of Germany (KPD) was founded, and in 1919 it tried and failed to overthrow the new republic. Adolf Hitler joined the German Workers' Party in 1919 and in 1921 took control of its successor, the National Socialist German Workers' Party (NSDAP), which failed in a coup in Munich in 1923. Both parties, as well as parties supporting the republic, built militant auxiliaries that engaged in increasingly violent street battles. Electoral support for both parties increased after 1929 as the Great Depression hit the economy hard, producing many unemployed men who became available for the paramilitary units. The Nazis, with a mostly rural and lower-middle-class base, overthrew the Weimar regime and ruled Germany from 1933 to 1945.
On 11 August 1919, the Weimar constitution came into effect, with Friedrich Ebert as first President.
On 30 December 1918, the Communist Party of Germany was founded by the Spartacus League, who had split from the Social Democratic Party during the war. It was headed by Rosa Luxemburg and Karl Liebknecht, and rejected the parliamentary system. In 1920, about 300,000 members from the Independent Social Democratic Party of Germany joined the party, transforming it into a mass organization. The Communist Party had a following of about 10% of the electorate.
In the first months of 1920, the Reichswehr was to be reduced to 100,000 men, in accordance with the Treaty of Versailles. This included the dissolution of many Freikorps – units made up of volunteers. In an attempt at a coup d'état in March 1920, the Kapp Putsch, extreme right-wing politician Wolfgang Kapp had Freikorps soldiers march on Berlin and proclaimed himself Chancellor of the Reich. After four days the coup d'état collapsed, due to popular opposition and lack of support by the civil servants and the officers. Other cities were shaken by strikes and rebellions, which were bloodily suppressed.
Germany was the first state to establish diplomatic relations with the new Soviet Union. Under the Treaty of Rapallo, Germany accorded the Soviet Union "de jure" recognition, and the two signatories mutually cancelled all pre-war debts and renounced war claims.
When Germany defaulted on its reparation payments, French and Belgian troops occupied the heavily industrialised Ruhr district (January 1923). The German government encouraged the population of the Ruhr to passive resistance: shops would not sell goods to the foreign soldiers, coal-miners would not dig for the foreign troops, trams in which members of the occupation army had taken a seat would be left abandoned in the middle of the street. The passive resistance proved effective, insofar as the occupation became a loss-making deal for the French government. But the Ruhr fight also led to hyperinflation, and many who lost all their fortune would become bitter enemies of the Weimar Republic, and voters of the anti-democratic right. See 1920s German inflation.
In September 1923, the deteriorating economic conditions led Chancellor Gustav Stresemann to call an end to the passive resistance in the Ruhr. In November, his government introduced a new currency, the Rentenmark (later: Reichsmark), together with other measures to stop the hyperinflation. In the following six years the economic situation improved. In 1928, Germany's industrial production even regained the pre-war levels of 1913.
The national elections of 1924 led to a swing to the right. Field Marshal Paul von Hindenburg was elected President in 1925.
In October 1925 the Treaty of Locarno was signed by Germany, France, Belgium, Britain and Italy; it recognised Germany's borders with France and Belgium. Moreover, Britain, Italy and Belgium undertook to assist France in the case that German troops marched into the demilitarised Rheinland. Locarno paved the way for Germany's admission to the League of Nations in 1926.
The actual amount of reparations that Germany was obliged to pay out was not the 132 billion marks decided in the London Schedule of 1921 but rather the 50 billion marks stipulated in the A and B Bonds. Historian Sally Marks says the 112 billion marks in "C bonds" were entirely chimerical—a device to fool the public into thinking Germany would pay much more. The actual total payout from 1920 to 1931 (when payments were suspended indefinitely) was 20 billion German gold marks, worth about US$5 billion or £1 billion. 12.5 billion was cash that came mostly from loans from New York bankers. The rest was goods such as coal and chemicals, or assets such as railway equipment. The reparations bill was fixed in 1921 on the basis of German capacity to pay, not on the basis of Allied claims. The highly publicized rhetoric of 1919 about paying for all the damages and all the veterans' benefits was irrelevant to the total, but it did determine how the recipients spent their share. Germany owed reparations chiefly to France, Britain, Italy and Belgium; the US received $100 million.
The Wall Street Crash of 1929 marked the beginning of the worldwide Great Depression, which hit Germany as hard as any nation. In July 1931, the "Darmstätter und Nationalbank" – one of the biggest German banks – failed. In early 1932, the number of unemployed had soared to more than 6,000,000.
On top of the collapsing economy came a political crisis: the political parties represented in the "Reichstag" were unable to build a governing majority in the face of escalating extremism from the far right (the Nazis, NSDAP). In March 1930, President Hindenburg appointed Heinrich Brüning Chancellor, invoking article 48 of the Weimar constitution, which allowed him to override the Parliament. To push through his package of austerity measures against a majority of Social Democrats, Communists and the NSDAP, Brüning made use of emergency decrees and dissolved Parliament. In the two-round presidential election of March and April 1932, Hindenburg was re-elected.
The Nazi Party was the largest party in the national elections of 1932. On 31 July 1932 it received 37.3% of the votes, and in the election on 6 November 1932 it received less, but still the largest share, 33.1%, making it the biggest party in the "Reichstag". The Communist KPD came third, with 15%. Together, the anti-democratic parties of the far right were now able to hold a considerable share of seats in Parliament, but they were at sword's point with the political left, fighting it out in the streets. The Nazis were particularly successful among Protestants, among unemployed young voters, among the lower middle class in the cities and among the rural population. It was weakest in Catholic areas and in large cities. On 30 January 1933, pressured by former Chancellor Franz von Papen and other conservatives, President Hindenburg appointed Hitler as Chancellor.
The Weimar years saw a flowering of German science and high culture, before the Nazi regime resulted in a decline in the scientific and cultural life in Germany and forced many renowned scientists and writers to flee.
German recipients dominated the Nobel prizes in science. Germany dominated the world of physics before 1933, led by Hermann von Helmholtz, Wilhelm Conrad Röntgen, Albert Einstein, Otto Hahn, Max Planck and Werner Heisenberg. Chemistry likewise was dominated by German professors and researchers at the great chemical companies such as BASF and Bayer, and by figures like Justus von Liebig, Fritz Haber and Emil Fischer. Theoretical mathematics was advanced by Georg Cantor in the 19th century and David Hilbert in the 20th. Karl Benz, the inventor of the automobile, and Rudolf Diesel were pivotal figures of engineering, as was the rocket engineer Wernher von Braun. Ferdinand Cohn, Robert Koch and Rudolph Virchow were three key figures in microbiology.
Among the most important German writers were Thomas Mann (1875–1955), Hermann Hesse (1877–1962) and Bertolt Brecht (1898–1956). The reactionary historian Oswald Spengler wrote "The Decline of the West" (1918–23) on the inevitable decay of Western Civilization, and influenced intellectuals in Germany such as Martin Heidegger, Max Scheler, and the Frankfurt School, as well as intellectuals around the world.
After 1933, Nazi proponents of "Aryan physics," led by the Nobel Prize-winners Johannes Stark and Philipp Lenard, attacked Einstein's theory of relativity as a degenerate example of Jewish materialism in the realm of science. Many scientists and humanists emigrated; Einstein moved permanently to the U.S. but some of the others returned after 1945.
The Nazi regime restored economic prosperity and ended mass unemployment using heavy spending on the military, while suppressing labor unions and strikes. The return of prosperity gave the Nazi Party enormous popularity, with only minor, isolated and subsequently unsuccessful cases of resistance among the German population over the 12 years of rule. The Gestapo (secret police) under Heinrich Himmler destroyed the political opposition and persecuted the Jews, trying to force them into exile, while taking their property. The Party took control of the courts, local government, and all civic organizations except the Protestant and Catholic churches. All expressions of public opinion were controlled by Hitler's propaganda minister, Joseph Goebbels, who made effective use of film, mass rallies, and Hitler's hypnotic speaking. The Nazi state idolized Hitler as its Führer (leader), putting all powers in his hands. Nazi propaganda centered on Hitler and was quite effective in creating what historians called the "Hitler Myth"—that Hitler was all-wise and that any mistakes or failures by others would be corrected when brought to his attention. In fact Hitler had a narrow range of interests and decision making was diffused among overlapping, feuding power centers; on some issues he was passive, simply assenting to pressures from whoever had his ear. All top officials reported to Hitler and followed his basic policies, but they had considerable autonomy on a daily basis.
In order to secure a majority for his Nazi Party in the "Reichstag", Hitler called for new elections. On the evening of 27 February 1933, the "Reichstag" building was set afire. Hitler swiftly blamed an alleged Communist uprising, and convinced President Hindenburg to sign the Reichstag Fire Decree, which rescinded most German civil liberties, including rights of assembly and freedom of the press. The decree allowed the police to detain people indefinitely without charges or a court order. Four thousand members of the Communist Party of Germany were arrested. Communist agitation was banned, but at this time not the Communist Party itself. Communists and Socialists were brought into hastily prepared Nazi concentration camps such as Kemna concentration camp, where they were at the mercy of the Gestapo, the newly established secret police force. Communist "Reichstag" deputies were taken into protective custody (despite their constitutional privileges).
Despite the terror and unprecedented propaganda, the last free general elections of 5 March 1933, while giving the NSDAP 43.9% of the vote, failed to deliver the majority Hitler had hoped for. Together with the German National People's Party (DNVP), however, he was able to form a slim majority government. In March 1933, the Enabling Act, an amendment to the Weimar Constitution, passed in the Reichstag by a vote of 444 to 94. To obtain the two-thirds majority needed to pass the bill, accommodations were made to the Catholic Centre Party, and the Nazis used the provisions of the Reichstag Fire Decree to keep several Social Democratic deputies from attending; the Communist deputies had already been banned. The amendment allowed Hitler and his cabinet to pass laws—even laws that violated the constitution—without the consent of the president or the Reichstag. The Enabling Act formed the basis for the dictatorship: the Länder were dissolved, and the trade unions and all political parties other than the Nazi Party were suppressed. A centralised totalitarian state was established, no longer based on the liberal Weimar constitution, and Germany left the League of Nations. The coalition parliament was rigged by defining the absence of arrested and murdered deputies as voluntary and therefore cause for their exclusion as wilful absentees. Subsequently, in July the Centre Party voluntarily dissolved itself in a quid pro quo with the anti-communist Pope Pius XI for the "Reichskonkordat"; by these manoeuvres Hitler won over these Catholic voters to the Nazi Party and gained long-awaited international diplomatic acceptance of his regime. According to Professor Dick Geary, the Nazis gained a larger share of their vote in Protestant areas than in Catholic areas in the elections held between 1928 and November 1932. The Communist Party was proscribed in April 1933.
Thereafter, the Chief of Staff of the SA, Ernst Röhm, demanded more political and military power for himself and his men, causing anxiety among military, industrial, and political leaders. In response, Hitler used the SS and Gestapo to purge the entire SA leadership, along with a number of his political adversaries (such as Gregor Strasser and former chancellor Kurt von Schleicher). The purge, known as the Night of the Long Knives, took place from 30 June to 2 July 1934. As a reward, the SS became an independent organisation under the command of the "Reichsführer-SS" Heinrich Himmler, who already controlled the concentration camp system and would rise to become Chief of German Police in June 1936. Upon Hindenburg's death on 2 August 1934, Hitler's cabinet passed a law proclaiming the presidency vacant and transferred the role and powers of the head of state to Hitler as Chancellor and Führer (Leader).
The Nazi regime was particularly hostile towards Jews, who became the target of unending antisemitic propaganda attacks. The Nazis attempted to convince the German people to view and treat Jews as "subhumans", and immediately after winning almost 44% of parliamentary seats in the March 1933 elections they imposed a nationwide boycott of Jewish businesses. In March 1933 the first official Nazi concentration camp was established at Dachau in Bavaria, and from 1933 to 1935 the regime consolidated its power. The Law for the Restoration of the Professional Civil Service, passed on 7 April 1933, forced all Jewish civil servants to retire from the legal profession and the civil service. The Nuremberg Laws of 1935 banned sexual relations between Jews and Germans and restricted citizenship to those of German or related blood; the remainder were classed as state subjects without citizenship rights. This stripped Jews, Roma and others of their legal rights. Jews continued to suffer persecution under the Nazi regime, exemplified by the Kristallnacht pogrom of 1938, and about half of Germany's 500,000 Jews fled the country before 1939, after which escape became almost impossible.
In 1941, the Nazi leadership decided to implement a plan that they called the "Final Solution", which came to be known as the Holocaust. Under the plan, Jews and other "lesser races", along with political opponents from Germany and the occupied countries, were systematically murdered at killing sites, German concentration camps, and, starting in 1942, extermination camps. Between 1941 and 1945, Jews, Gypsies, Slavs, communists, homosexuals, the mentally and physically disabled and members of other groups were targeted and methodically murdered, a campaign that gave rise to the word "genocide". In total, approximately 17 million people were killed during the Holocaust, including 1.5 million Jewish children.
In 1935, Hitler officially re-established the Luftwaffe (air force) and reintroduced universal military service. This was in breach of the Treaty of Versailles; Britain, France and Italy issued notes of protest. Hitler had the officers swear their personal allegiance to him. In 1936 German troops marched into the demilitarised Rhineland. As the territory was part of Germany, the British and French governments did not feel that attempting to enforce the treaty was worth the risk of war. The move strengthened Hitler's standing in Germany. His reputation swelled further with the 1936 Summer Olympics in Berlin, which proved another great propaganda success for the regime, orchestrated by master propagandist Joseph Goebbels.
Historians have paid special attention to the efforts by Nazi Germany to reverse the gains women made before 1933, especially in the relatively liberal Weimar Republic. It appears the role of women in Nazi Germany changed according to circumstances. Theoretically the Nazis advocated a patriarchal society in which the German woman would recognise that her "world is her husband, her family, her children, and her home". However, before 1933, women played important roles in the Nazi organization and were allowed some autonomy to mobilize other women. After Hitler came to power in 1933, feminist groups were shut down or incorporated into the National Socialist Women's League, which coordinated groups throughout the country to promote feminine virtues, motherhood and household activities. Courses were offered on childrearing, sewing and cooking. The Nazi regime did promote a liberal code of conduct regarding heterosexual relations among Germans and was sympathetic to women who bore children out of wedlock. The "Lebensborn" (Fountain of Life) association, founded by Himmler in 1935, created a series of maternity homes where single mothers could be accommodated during their pregnancies.
As Germany prepared for war, large numbers of women were incorporated into the public sector, and with the need for full mobilization of factories by 1943, all women under the age of fifty were required to register with the employment office for work assignments to help the war effort. Women's wages remained unequal and women were denied positions of leadership or control. In 1944–45 more than 500,000 women were volunteer uniformed auxiliaries in the German armed forces (Wehrmacht). About the same number served in civil aerial defense, 400,000 volunteered as nurses, and many more replaced drafted men in the wartime economy. In the Luftwaffe they served in auxiliary roles helping to operate the anti-aircraft systems that shot down Allied bombers.
Hitler's diplomatic strategy in the 1930s was to make seemingly reasonable demands, threatening war if they were not met. When opponents tried to appease him, he accepted the gains that were offered, then went to the next target. That aggressive strategy worked as Germany pulled out of the League of Nations (1933), rejected the Versailles Treaty and began to re-arm (1935), won back the Saar (1935), remilitarized the Rhineland (1936), formed an alliance ("axis") with Mussolini's Italy (1936), sent massive military aid to Franco in the Spanish Civil War (1936–39), annexed Austria (1938), took over Czechoslovakia after the British and French "appeasement" of the Munich Agreement of 1938, formed a peace pact with Joseph Stalin's Soviet Union in August 1939, and finally invaded Poland on 1 September 1939. Britain and France declared war on Germany two days later and World War II in Europe began.
After establishing the "Rome-Berlin axis" with Benito Mussolini, and signing the Anti-Comintern Pact with Japan – which Italy joined a year later in 1937 – Hitler felt able to take the offensive in foreign policy. On 12 March 1938, German troops marched into Austria, where an attempted Nazi coup had failed in 1934. When Austrian-born Hitler entered Vienna, he was greeted by loud cheers. Four weeks later, 99% of Austrians voted in favour of the annexation (Anschluss) of their country to the German Reich. After Austria, Hitler turned to Czechoslovakia, where the 3.5 million-strong Sudeten German minority was demanding equal rights and self-government. At the Munich Conference of September 1938, Hitler, the Italian leader Benito Mussolini, British Prime Minister Neville Chamberlain and French Prime Minister Édouard Daladier agreed upon the cession of Sudeten territory to the German Reich by Czechoslovakia. Hitler thereupon declared that all of the German Reich's territorial claims had been fulfilled. However, barely six months after the Munich Agreement, in March 1939, Hitler used the smoldering quarrel between Slovaks and Czechs as a pretext for taking over the rest of Czechoslovakia as the Protectorate of Bohemia and Moravia. In the same month, he secured the return of Memel from Lithuania to Germany. Chamberlain was forced to acknowledge that his policy of appeasement towards Hitler had failed.
At first Germany was very successful in its military operations, including the invasions of Poland (1939), Norway (1940), the Low Countries (1940), and France (1940). The unexpectedly swift defeat of France resulted in an upswing in Hitler's popularity and an upsurge in war fever. Hitler made peace overtures to the new British leader Winston Churchill in July 1940, but Churchill remained dogged in his defiance, aided by major financial, military, and diplomatic help from President Franklin D. Roosevelt in the U.S. Hitler's emphasis on maintaining a higher living standard postponed the full mobilization of the national economy until 1942. Germany's armed forces invaded the Soviet Union in June 1941 – weeks behind schedule due to the invasion of Yugoslavia – but swept forward until they reached the gates of Moscow.
The tide began to turn in December 1941, when the invasion of the Soviet Union hit determined resistance in the Battle of Moscow and Hitler declared war on the United States in the wake of the Japanese attack on Pearl Harbor. After the surrender in North Africa and the loss of the Battle of Stalingrad in 1942–43, the Germans were forced onto the defensive. By late 1944, the United States, Canada, France, and Great Britain were closing in on Germany in the West, while the Soviets were victoriously advancing in the East. Overy estimated in 2014 that in all about 353,000 civilians were killed by British and American strategic bombing of German cities, and nine million left homeless.
Nazi Germany collapsed as Berlin was taken by the Red Army in a fight to the death on the city streets. Hitler committed suicide on 30 April 1945. The final German Instrument of Surrender was signed on 8 May 1945.
By September 1945, Nazi Germany and its Axis partners (Italy and Japan) had all been defeated, chiefly by the forces of the Soviet Union, the United States, and Great Britain. Much of Europe lay in ruins, over 60 million people worldwide had been killed (most of them civilians), including approximately 6 million Jews and 11 million non-Jews in what became known as the Holocaust. World War II resulted in the destruction of Germany's political and economic infrastructure and led directly to its partition, considerable loss of territory (especially in the East), and historical legacy of guilt and shame.
As a consequence of the defeat of Nazi Germany in 1945 and the onset of the Cold War in 1947, the country was split between the two global blocs in the East and West, a period known as the division of Germany. Millions of refugees from Central and Eastern Europe moved west, most of them to West Germany. Two countries emerged: West Germany was a parliamentary democracy, a NATO member, a founding member of what later became the European Union, one of the world's largest economies, and under Allied military control until 1955, while East Germany was a totalitarian Communist dictatorship and a satellite of Moscow. With the collapse of Communism in 1989, reunification on West Germany's terms followed.
No one doubted Germany's economic and engineering prowess; the question was how long bitter memories of the war would cause Europeans to distrust Germany, and whether Germany could demonstrate it had rejected totalitarianism and militarism and embraced democracy and human rights.
The total of German war dead was 8% to 10% out of a prewar population of 69,000,000, or between 5.5 million and 7 million people. This included 4.5 million in the military, and between 1 and 2 million civilians. There was chaos as 11 million foreign workers and POWs left, while soldiers returned home and more than 14 million displaced German-speaking refugees from East-Central and Eastern Europe were expelled from their native land and came to German lands often foreign to them. During the Cold War, the West German government estimated a death toll of 2.2 million civilians due to the flight and expulsion of Germans and through forced labour in the Soviet Union. This figure remained unchallenged until the 1990s, when some historians put the death toll at 500,000–600,000 confirmed deaths. In 2006 the German government reaffirmed its position that 2.0–2.5 million deaths occurred.
At the Potsdam Conference, Germany was divided into four military occupation zones by the Allies and did not regain independence until 1949. The provinces east of the Oder and Neisse rivers (the Oder-Neisse line) were transferred to Poland, Lithuania, and Russia (Kaliningrad oblast); the 6.7 million Germans living in "west-shifted" Poland mostly within previously German lands and the 3 million in German-settled regions of Czechoslovakia were forced to move west.
Denazification removed, imprisoned, or executed most top officials of the old regime, but most middle and lower ranks of civilian officialdom were not seriously affected. In accordance with the Allied agreement made at the Yalta Conference millions of POWs were used as forced labor by the Soviet Union and other European countries.
In the East, the Soviets crushed dissent and imposed another police state, often employing ex-Nazis in the dreaded Stasi. The Soviets extracted about 23% of the East German GNP for reparations, while in the West reparations were a minor factor.
In 1945–46 housing and food conditions were bad, as the disruption of transport, markets, and finances slowed a return to normal. In the West, bombing had destroyed a fourth of the housing stock, and over 10 million refugees from the east had crowded in, most living in camps. Food production in 1946–48 was only two-thirds of the prewar level, while grain and meat shipments – which usually supplied 25% of the food – no longer arrived from the East. Furthermore, the end of the war brought the end of large shipments of food seized from occupied nations that had sustained Germany during the war. Coal production was down 60%, which had cascading negative effects on railroads, heavy industry, and heating. Industrial production fell more than half and reached prewar levels only at the end of 1949.
Allied economic policy originally was one of industrial disarmament plus building the agricultural sector. In the western sectors, most of the industrial plants had minimal bomb damage and the Allies dismantled 5% of the industrial plants for reparations.
However, deindustrialization became impractical and the U.S. instead called for a strong industrial base in Germany so it could stimulate European economic recovery. The U.S. shipped food in 1945–47 and made a $600 million loan in 1947 to rebuild German industry. By May 1946 the removal of machinery had ended, thanks to lobbying by the U.S. Army. The Truman administration finally realised that economic recovery in Europe could not go forward without the reconstruction of the German industrial base on which it had previously been dependent. Washington decided that an "orderly, prosperous Europe requires the economic contributions of a stable and productive Germany."
In 1945 the occupying powers took over all newspapers in Germany and purged them of Nazi influence. The American occupation headquarters, the Office of Military Government, United States (OMGUS) began its own newspaper based in Munich, "Die Neue Zeitung." It was edited by German and Jewish émigrés who fled to the United States before the war. Its mission was to encourage democracy by exposing Germans to how American culture operated. The paper was filled with details on American sports, politics, business, Hollywood, and fashions, as well as international affairs.
In 1949 the Soviet occupation zone became the "Deutsche Demokratische Republik" – "DDR" ("German Democratic Republic" – "GDR", often simply "East Germany"), under the control of the Socialist Unity Party. Neither country had a significant army until the 1950s, but East Germany built the Stasi into a powerful secret police that infiltrated every aspect of the society.
East Germany was an Eastern bloc state under the political and military control of the Soviet Union through its occupation forces and the Warsaw Treaty. Political power was exercised solely by leading members ("Politburo") of the communist-controlled Socialist Unity Party (SED). A Soviet-style command economy was set up; later the GDR became the most advanced Comecon state. While East German propaganda was based on the benefits of the GDR's social programs and the alleged constant threat of a West German invasion, many of its citizens looked to the West for political freedoms and economic prosperity.
Walter Ulbricht (1893–1973) was the party boss from 1950 to 1971. In 1933, Ulbricht had fled to Moscow, where he served as a Comintern agent loyal to Stalin. As World War II was ending, Stalin assigned him the job of designing the postwar German system that would centralize all power in the Communist Party. Ulbricht became deputy prime minister in 1949 and secretary (chief executive) of the Socialist Unity (Communist) party in 1950. Some 2.6 million people had fled East Germany by 1961, when he built the Berlin Wall to stop them – with border guards shooting those who attempted to cross. What the GDR called the "Anti-Fascist Protective Wall" was a major embarrassment for the regime during the Cold War, but it did stabilize East Germany and postpone its collapse. Ulbricht lost power in 1971, but was kept on as a nominal head of state. He was replaced because he failed to solve growing national crises, such as the worsening economy in 1969–70, the fear of another popular uprising as had occurred in 1953, and the disgruntlement between Moscow and Berlin caused by Ulbricht's détente policies toward the West.
The transition to Erich Honecker (General Secretary from 1971 to 1989) led to a change in the direction of national policy and efforts by the Politburo to pay closer attention to the grievances of the proletariat. Honecker's plans were not successful, however, with dissent growing among East Germany's population.
In 1989, the socialist regime collapsed after 40 years, despite its omnipresent secret police, the Stasi. Main reasons for the collapse include severe economic problems and growing emigration towards the West.
East Germany's culture was shaped by Communism and particularly Stalinism. It was characterized by East German psychoanalyst Hans-Joachim Maaz in 1990 as having produced a "Congested Feeling" among Germans in the East, a result of Communist policies that criminalized personal expression deviating from government-approved ideals and of the enforcement of Communist principles by physical force and intellectual repression by government agencies, particularly the Stasi. Critics of the East German state have claimed that the state's commitment to communism was a hollow and cynical tool of a ruling elite. This argument has been challenged by some scholars who claim that the Party was committed to the advance of scientific knowledge, economic development, and social progress. However, the vast majority regarded the state's Communist ideals to be nothing more than a deceptive method for government control.
In 1949, the three western occupation zones (American, British, and French) were combined into the Federal Republic of Germany (FRG, West Germany). The government was formed under Chancellor Konrad Adenauer and his conservative CDU/CSU coalition. The CDU/CSU was in power during most of the period since 1949. The capital was Bonn until it was moved to Berlin in 1990. In 1990 the FRG absorbed East Germany and gained full sovereignty over Berlin. Throughout this period West Germany was much larger and richer than East Germany, which became a dictatorship under the control of the Communist Party and was closely monitored by Moscow. Germany, especially Berlin, was a cockpit of the Cold War, with NATO and the Warsaw Pact assembling major military forces in west and east. However, there was never any combat.
West Germany enjoyed prolonged economic growth beginning in the early 1950s ("Wirtschaftswunder" or "Economic Miracle"). Industrial production doubled from 1950 to 1957, and gross national product grew at a rate of 9 or 10% per year, providing the engine for economic growth of all of Western Europe. Labor unions supported the new policies with postponed wage increases, minimized strikes, support for technological modernization, and a policy of co-determination ("Mitbestimmung"), which involved a satisfactory grievance resolution system as well as requiring representation of workers on the boards of large corporations. The recovery was accelerated by the currency reform of June 1948, U.S. gifts of $1.4 billion as part of the Marshall Plan, the breaking down of old trade barriers and traditional practices, and the opening of the global market. West Germany gained legitimacy and respect, as it shed the horrible reputation Germany had gained under the Nazis.
West Germany played a central role in the creation of European cooperation; it joined NATO in 1955 and was a founding member of the European Economic Community in 1958.
To house the German-speakers expelled from all over Eastern Europe, quarters of cheap housing were erected on the outskirts of major and minor towns and villages throughout West Germany. These quarters are often recognizable by streets named after towns of the former eastern German territories, the Sudetenland, or other previously German-settled areas.
The most dramatic and successful policy event was the currency reform of 1948. Since the 1930s, prices and wages had been controlled, but money had been plentiful. That meant that people had accumulated large paper assets, and that official prices and wages did not reflect reality, as the black market dominated the economy and more than half of all transactions were taking place unofficially. On 21 June 1948, the Western Allies withdrew the old currency and replaced it with the new Deutsche Mark at the rate of 1 new per 10 old. This wiped out 90% of government and private debt, as well as private savings. Prices were decontrolled, and labor unions agreed to accept a 15% wage increase, despite the 25% rise in prices. The result was that prices of German export products held steady, while profits and earnings from exports soared and were poured back into the economy. The currency reforms were simultaneous with the $1.4 billion in Marshall Plan money coming in from the United States, which was used primarily for investment.
In addition, the Marshall Plan forced German companies, as well as those in all of Western Europe, to modernize their business practices and take account of the international market. Marshall Plan funding helped overcome bottlenecks in the surging economy caused by remaining controls (which were removed in 1949), and Marshall Plan business reforms opened up a greatly expanded market for German exports. Overnight, consumer goods appeared in the stores, because they could be sold for realistic prices, emphasizing to Germans that their economy had turned a corner.
The success of the currency reform angered the Soviets, who cut off all road, rail, and canal links between the western zones and West Berlin. This was the Berlin Blockade, which lasted from 24 June 1948 to 12 May 1949. In response, the U.S. and Britain launched an airlift of food and coal and distributed the new currency in West Berlin as well. The city thereby became economically integrated into West Germany.
Konrad Adenauer (1876–1967) was the dominant leader in West Germany. He was the first chancellor (top official) of the FRG, 1949–63, and was a founder and leader of the Christian Democratic Union (CDU), a coalition of conservatives, ordoliberals, and adherents of Protestant and Catholic social teaching that dominated West German politics for most of its history. During his chancellorship, the West German economy grew quickly, and West Germany established friendly relations with France, participated in the emerging European Union, established the country's armed forces (the "Bundeswehr"), and became a pillar of NATO as well as a firm ally of the United States. Adenauer's government also commenced the long process of reconciliation with the Jews and Israel after the Holocaust.
Ludwig Erhard (1897–1977) was in charge of economic policy as economics director for the British and American occupation zones and was Adenauer's long-time economics minister. Erhard's decision to lift many price controls in 1948 (despite opposition from both the social democratic opposition and Allied authorities), plus his advocacy of free markets, helped set the Federal Republic on its strong growth from wartime devastation. Norbert Walter, a former chief economist at Deutsche Bank, argues that "Germany owes its rapid economic advance after World War II to the system of the Social Market Economy, established by Ludwig Erhard." Erhard was politically less successful when he served as the CDU Chancellor from 1963 until 1966. Erhard followed the concept of a social market economy, and was in close touch with professional economists. Erhard viewed the market itself as social and supported only a minimum of welfare legislation. However, Erhard suffered a series of decisive defeats in his effort to create a free, competitive economy in 1957; he had to compromise on such key issues as the anti-cartel legislation. Thereafter, the West German economy evolved into a conventional west European welfare state.
Meanwhile, in adopting the Godesberg Program in 1959, the Social Democratic Party of Germany (SPD) largely abandoned Marxist ideas and embraced the concept of the market economy and the welfare state. It now sought to move beyond its old working-class base to appeal to the full spectrum of potential voters, including the middle class and professionals. Labor unions cooperated increasingly with industry, achieving labor representation on corporate boards and increases in wages and benefits.
In 1966 Erhard lost support and Kurt Kiesinger (1904–1988) was elected as Chancellor by a new CDU/CSU-SPD alliance combining the two largest parties. Socialist (SPD) leader Willy Brandt was Deputy Federal Chancellor and Foreign Minister. The Grand Coalition lasted 1966–69 and is best known for reducing tensions with the Soviet bloc nations and establishing diplomatic relations with Czechoslovakia, Romania and Yugoslavia.
With a booming economy short of unskilled workers, especially after the Berlin Wall cut off the steady flow of East Germans, the FRG negotiated migration agreements with Italy (1955), Spain (1960), Greece (1960), and Turkey (1961) that brought in hundreds of thousands of temporary guest workers, called "Gastarbeiter". In 1968 the FRG signed a guest worker agreement with Yugoslavia that employed additional guest workers. "Gastarbeiter" were young men who were paid full-scale wages and benefits, but were expected to return home in a few years.
The agreement with Turkey ended in 1973, but few workers returned because there were few good jobs in Turkey. By 2010 there were about 4 million people of Turkish descent in Germany. The generation born in Germany attended German schools, but many had a poor command of both German and Turkish, and had either low-skilled jobs or were unemployed.
Willy Brandt (1913–1992) was the leader of the Social Democratic Party in 1964–87 and West German Chancellor in 1969–1974. Under his leadership, the German government sought to reduce tensions with the Soviet Union and improve relations with the German Democratic Republic, a policy known as the "Ostpolitik". Relations between the two German states had been icy at best, with propaganda barrages in each direction. The heavy outflow of talent from East Germany prompted the building of the Berlin Wall in 1961, which worsened Cold War tensions and prevented East Germans from travel. Although anxious to relieve serious hardships for divided families and to reduce friction, Brandt's "Ostpolitik" was intent on holding to its concept of "two German states in one German nation."
"Ostpolitik" was opposed by the conservative elements in Germany, but won Brandt an international reputation and the Nobel Peace Prize in 1971. In September 1973, both West and East Germany were admitted to the United Nations. The two countries exchanged permanent representatives in 1974, and, in 1987, East Germany's leader Erich Honecker paid an official state visit to West Germany.
After 1973, Germany was hard hit by a worldwide economic crisis, soaring oil prices, and stubbornly high unemployment, which jumped from 300,000 in 1973 to 1.1 million in 1975. The Ruhr region was hardest hit, as its easy-to-reach coal mines petered out, and expensive German coal was no longer competitive. Likewise the Ruhr steel industry went into sharp decline, as its prices were undercut by lower-cost suppliers such as Japan. The welfare system provided a safety net for the large number of unemployed workers, and many factories reduced their labor force and began to concentrate on high-profit specialty items. After 1990 the Ruhr moved into service industries and high technology. Cleaning up the heavy air and water pollution became a major industry in its own right. Meanwhile, formerly rural Bavaria became a high-tech center of industry.
A spy scandal forced Brandt to step down as Chancellor while remaining as party leader. He was replaced by Helmut Schmidt (1918–2015) of the SPD, who served as Chancellor in 1974–1982. Schmidt continued the "Ostpolitik" with less enthusiasm. He had a PhD in economics and was more interested in domestic issues, such as reducing inflation. The debt grew rapidly as he borrowed to cover the cost of the ever more expensive welfare state. After 1979, foreign policy issues grew central as the Cold War turned hot again. The German peace movement mobilized hundreds of thousands of demonstrators to protest against American deployment in Europe of new medium-range ballistic missiles. Schmidt supported the deployment but was opposed by the left wing of the SPD and by Brandt.
The pro-business Free Democratic Party (FDP) had been in coalition with the SPD, but now it changed direction. Led by Finance Minister Otto Graf Lambsdorff (1926–2009) the FDP adopted the market-oriented "Kiel Theses" in 1977; it rejected the Keynesian emphasis on consumer demand, and proposed to reduce social welfare spending, and try to introduce policies to stimulate production and facilitate jobs. Lambsdorff argued that the result would be economic growth, which would itself solve both the social problems and the financial problems. As a consequence, the FDP switched allegiance to the CDU and Schmidt lost his parliamentary majority in 1982. For the only time in West Germany's history, the government fell on a vote of no confidence.
Helmut Kohl (1930–2017) brought the conservatives back to power with a CDU/CSU-FDP coalition in 1982, and served as Chancellor until 1998. After repeated victories in 1983, 1987, 1990 and 1994, he was defeated in the 1998 federal elections by the biggest landslide for the left on record, and was succeeded as Chancellor by Gerhard Schröder of the SPD. Kohl is best known for orchestrating reunification with the approval of all Four Powers from World War II, who still had a voice in German affairs.
During the summer of 1989, rapid changes known as "peaceful revolution" or "Die Wende" took place in East Germany, which quickly led to German reunification. Growing numbers of East Germans emigrated to West Germany, many via Hungary after Hungary's reformist government opened its borders.
The opening of the Iron Curtain between Austria and Hungary at the Pan-European Picnic in August 1989 then triggered a chain reaction, at the end of which there was no longer a GDR and the Eastern Bloc had disintegrated. The picnic, based on an idea of Otto von Habsburg, produced the greatest mass exodus since the construction of the Berlin Wall and demonstrated that the USSR and the rulers of the Eastern European satellite states were not prepared to keep the Iron Curtain effective. This made their loss of power visible and made clear that the GDR no longer received effective support from the other communist Eastern Bloc countries. Thousands of East Germans then tried to reach the West by staging sit-ins at West German diplomatic facilities in other East European capitals, most notably in Prague. The exodus generated demands within East Germany for political change, and mass demonstrations in several cities continued to grow.
Unable to stop the growing civil unrest, Erich Honecker was forced to resign in October, and on 9 November, East German authorities unexpectedly allowed East German citizens to enter West Berlin and West Germany. Hundreds of thousands of people took advantage of the opportunity; new crossing points were opened in the Berlin Wall and along the border with West Germany. This led to the acceleration of the process of reforms in East Germany that ended with the German reunification that came into force on 3 October 1990.
The SPD in coalition with the Greens won the elections of 1998. SPD leader Gerhard Schröder positioned himself as a centrist "Third Way" candidate in the mold of British Prime Minister Tony Blair and United States President Bill Clinton.
Schröder, in March 2003, reversed his position and proposed a significant downsizing of the welfare state, known as Agenda 2010. He had enough support to overcome opposition from the trade unions and the SPD's left wing. Agenda 2010 had five goals: tax cuts; labor market deregulation, especially relaxing rules protecting workers from dismissal and setting up Hartz concept job training; modernizing the welfare state by reducing entitlements; decreasing bureaucratic obstacles for small businesses; and providing new low-interest loans to local governments.
On 26 December 2004, during the Christmas and Boxing Day holidays, several thousand Germans in Thailand and elsewhere across South and Southeast Asia were caught in the catastrophic tsunami triggered by the magnitude 9.0 earthquake off the west coast of the Indonesian island of Sumatra, and many German lives were lost. A memorial service on behalf of all Germans was held at Berlin Cathedral and in the Bundestag on 20 January 2005.
On 22 May 2005, after the SPD lost to the Christian Democratic Union (CDU) in North Rhine-Westphalia, Gerhard Schröder announced he would call federal elections "as soon as possible". A motion of confidence was subsequently defeated in the Bundestag on 1 July 2005 by 151 to 296 (with 148 abstaining), after Schröder urged members not to vote for his government in order to trigger new elections. In response, a grouping of left-wing SPD dissidents and the neo-communist Party of Democratic Socialism agreed to run on a joint ticket in the general election, with Schröder's rival Oskar Lafontaine leading the new group.
From 2005 to 2009, Germany was ruled by a grand coalition led by the CDU's Angela Merkel as chancellor. Since the 2009 elections, Merkel has headed a centre-right government of the CDU/CSU and FDP.
Together with France and other EU states, Germany has played the leading role in the European Union. Germany (especially under Chancellor Helmut Kohl) was one of the main supporters of admitting many East European countries to the EU. Germany is at the forefront of European states seeking to exploit the momentum of monetary union to advance the creation of a more unified and capable European political, defence and security apparatus. German Chancellor Schröder expressed an interest in a permanent seat for Germany in the UN Security Council, identifying France, Russia, and Japan as countries that explicitly backed Germany's bid. Germany formally adopted the Euro on 1 January 1999 after permanently fixing the Deutsche Mark rate on 31 December 1998.
Since 1990, the German Bundeswehr has participated in a number of peacekeeping and disaster relief operations abroad. Since 2002, German troops formed part of the International Security Assistance Force in the war in Afghanistan, resulting in the first German casualties in combat missions since World War II.
In the worldwide economic recession that began in 2008, Germany did relatively well. However, the economic instability of Greece and several other EU nations in 2010–11 forced Germany to reluctantly sponsor a massive financial rescue.
In the wake of the 2011 nuclear disaster in Japan, which followed a magnitude 9.0 earthquake and tsunami, German public opinion turned sharply against nuclear power, which at the time produced about a quarter of Germany's electricity supply. In response, Merkel announced plans to phase out nuclear power over the following decade and to rely even more heavily on wind and other alternative energy sources, in addition to coal and natural gas. For further information, see Germany in 2011.
Germany was affected by the European migrant crisis in 2015 as it became the final destination of choice for many asylum seekers from Africa and the Middle East entering the EU. The country took in over a million refugees and migrants and developed a quota system which redistributed migrants around its federal states based on their tax income and existing population density. The decision by Merkel to authorize unrestricted entry led to heavy criticism in Germany as well as within Europe.
A major historiographical debate about German history concerns the "Sonderweg", the alleged "special path" that separated German history from the normal course of historical development, and whether or not Nazi Germany was the inevitable result of the "Sonderweg". Proponents of the "Sonderweg" theory such as Fritz Fischer point to such events as the Revolution of 1848, the authoritarianism of the Second Empire, and the continuation of the Imperial elite into the Weimar and Nazi periods. Opponents of the "Sonderweg" theory, such as Gerhard Ritter, argue that its proponents are guilty of seeking selective examples, and that there was much contingency and chance in German history. In addition, there has been much debate among supporters of the "Sonderweg" concept as to the reasons for the "Sonderweg", and whether or not the "Sonderweg" ended in 1945. Was there a Sonderweg? Winkler says:
GNU Hurd
GNU Hurd is the multiserver microkernel written as part of GNU. It has been under development since 1990 by the GNU Project of the Free Software Foundation, designed as a replacement for the Unix kernel, and released as free software under the GNU General Public License. When the Linux kernel proved to be a viable solution, development of GNU Hurd slowed, alternating between periods of stasis and renewed activity and interest.
The Hurd's design consists of a set of protocols and server processes (or daemons, in Unix terminology) that run on the GNU Mach microkernel. The Hurd aims to surpass the Unix kernel in functionality, security, and stability, while remaining largely compatible with it. The GNU Project chose the multiserver microkernel for the operating system, due to perceived advantages over the traditional Unix monolithic kernel architecture, a view that had been advocated by some developers in the 1980s.
In December 1991 the primary architect of the Hurd described the name as a mutually recursive acronym: "Hurd" stands for "Hird of Unix-Replacing Daemons", and "Hird" in turn stands for "Hurd of Interfaces Representing Depth".
As both "hurd" and "hird" are homophones of the English word "herd", the full name "GNU Hurd" is also a play on the words "herd of gnus", reflecting how the kernel works.
The logo is called the "Hurd boxes", and it too reflects the architecture: it is a graph in which nodes represent the Hurd's servers and directed edges represent IPC messages.
Richard Stallman founded the GNU Project in September 1983 with the aim of creating a free GNU operating system. Initially, the components required for the operating system were written: editors, a shell, a compiler and the others. By 1989, the GNU GPL had come into being, and the only major component missing was the kernel.
Development on the Hurd began in 1990 after an abandoned kernel attempt in 1986, based on the research TRIX operating system developed by Professor Steve Ward and his group at MIT's Laboratory for Computer Science (LCS). According to Thomas Bushnell, the initial Hurd architect, their early plan was to adapt the 4.4BSD-Lite kernel and, in hindsight, "It is now perfectly obvious to me that this would have succeeded splendidly and the world would be a very different place today". In 1987 Richard Stallman proposed using the Mach microkernel developed at Carnegie Mellon University. Work on this was delayed for three years due to uncertainty over whether CMU would release the Mach code under a suitable license.
With the release of the Linux kernel in 1991, the primary user of GNU's userland components soon became operating systems based on the Linux kernel (Linux distributions), prompting the coining of the term "GNU/Linux".
Development of the Hurd has proceeded slowly. Despite an optimistic announcement by Stallman in 2002 predicting a release of GNU/Hurd later that year, the Hurd is still not considered suitable for production environments. Development in general has not met expectations, and there are still a significant number of bugs and missing features. This has resulted in a poorer product than many (including Stallman) had expected. In 2010, after twenty years under development, Stallman said that he was "not very optimistic about the GNU Hurd. It makes some progress, but to be really superior it would require solving a lot of deep problems", but added that "finishing it is not crucial" for the GNU system because a free kernel already existed (Linux), and completing Hurd would not address the main remaining problem for a free operating system: device support.
The Debian project, among others, has worked on the Hurd to produce binary distributions of Hurd-based GNU operating systems for IBM PC compatible systems.
After years of stagnation, development picked up again in 2015 and 2016, with four releases during these two years.
On August 20, 2015, during the Google Summer of Code, it was announced that GNU Guix had been ported to GNU Hurd.
Unlike most Unix-like kernels, the Hurd uses a server–client architecture, built on a microkernel that is responsible for providing only the most basic kernel services – coordinating access to the hardware: the CPU (through process management and scheduling), RAM (via memory management), and various other input/output devices (via I/O scheduling) for sound, graphics, mass storage, etc. In theory the microkernel design would allow all device drivers to be built as servers working in user space, but today most drivers of this kind are still contained in the GNU Mach kernel space.
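The division of labour described above – a kernel that only routes messages while services live in user-space servers – can be sketched as a toy model. This is not Hurd code and all names here are invented for illustration:

```python
# Toy model of a microkernel: the "kernel" only delivers messages;
# all higher-level functionality lives in user-space servers.
# (Invented names -- a conceptual sketch, not the Hurd's actual API.)

class Microkernel:
    """Provides only the most basic service: message delivery."""
    def __init__(self):
        self.servers = {}

    def register(self, port, server):
        self.servers[port] = server

    def send(self, port, message):
        # The kernel does not interpret the message; it just routes it
        # to whichever server listens on the named port.
        return self.servers[port].handle(message)

class FilesystemServer:
    """A user-space server implementing one slice of functionality."""
    def __init__(self):
        self.files = {"/etc/motd": "Welcome to the toy Hurd"}

    def handle(self, message):
        op, path = message
        if op == "read":
            return self.files.get(path, "ENOENT")
        return "EOPNOTSUPP"

kernel = Microkernel()
kernel.register("fs", FilesystemServer())
print(kernel.send("fs", ("read", "/etc/motd")))  # Welcome to the toy Hurd
```

A crash in `FilesystemServer` here would only break filesystem requests, not the router itself – the isolation property the Hurd developers cite below.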
According to Hurd developers, the main advantage of the microkernel-based design is the ability to extend the system: developing a new module does not require in-depth knowledge of the rest of the kernel, and a bug in one module will not crash the entire system. The Hurd provides the concept of "translators", a framework of modules used to extend file system functionality.
From early on, the Hurd was developed to use GNU Mach as the microkernel. This was a technical decision made by Richard Stallman, who thought it would speed up the work by saving a large part of it. He has admitted that he was wrong about that. Other Unix-like systems working on the Mach microkernel include OSF/1, Lites, and MkLinux. macOS and NeXTSTEP use hybrid kernels based on Mach.
From 2004 onward, various efforts were launched to port the Hurd to more modern microkernels. The L4 microkernel was the original choice in 2004, but progress eventually slowed to a halt. Nevertheless, during 2005, Hurd developer Neal Walfield finished the initial memory management framework for the L4/Hurd port, and Marcus Brinkmann ported essential parts of glibc, getting the process startup code working so that the first trivial user programs in C (such as a hello world program) could run.
In 2005, Brinkmann and Walfield began researching Coyotos as a new kernel for the Hurd. In 2006, Brinkmann met with Jonathan Shapiro (a primary architect of the Coyotos operating system) to discuss the use of the Coyotos kernel for GNU/Hurd. In further discussion, the Hurd developers realised that Coyotos (like other similar kernels) was not suitable for the Hurd.
In 2007, Hurd developers Neal Walfield and Marcus Brinkmann gave a critique of the Hurd architecture, known as "the critique", and a proposal for how a future system might be designed, known as "the position paper". In 2008, Neal Walfield began working on the Viengoos microkernel as a modern native kernel for the Hurd. Development on Viengoos has since been paused because Walfield lacks the time to work on it.
In the meantime, others have continued working on the Mach variant of Hurd.
A number of traditional Unix concepts are replaced or extended in the Hurd.
Under Unix, every running program has an associated user id, which normally corresponds to the user that started the process. This id largely dictates the actions permitted to the program. No outside process can change the user id of a running program. A Hurd process, on the other hand, runs under a "set" of user ids, which can contain multiple ids, one, or none. A sufficiently privileged process can add and remove ids to another process. For example, there is a password server that will hand out ids in return for a correct login password.
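The difference between a single fixed user id and the Hurd's id *sets* can be made concrete with a small sketch. This is an invented toy model (the class and method names are not Hurd APIs), intended only to mirror the behaviour described above:

```python
# Toy model (invented names, not Hurd code): a process carries a *set*
# of user ids, and only a sufficiently privileged process -- e.g. a
# password server after a correct login -- may add ids to another.

class Process:
    def __init__(self, uids, privileged=False):
        self.uids = set(uids)        # may hold several ids, one, or none
        self.privileged = privileged

    def add_uid(self, target, uid):
        # Unlike Unix, another process CAN change this process's ids,
        # but only if it is privileged.
        if not self.privileged:
            raise PermissionError("not privileged")
        target.uids.add(uid)

    def may_access(self, allowed_uids):
        # Access is permitted if any held id is in the allowed set.
        return bool(self.uids & set(allowed_uids))

password_server = Process({0}, privileged=True)
shell = Process(set())               # starts with no ids at all
print(shell.may_access({1000}))      # False
password_server.add_uid(shell, 1000)  # granted after a correct password
print(shell.may_access({1000}))      # True
```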
Regarding the file system, a suitable program can be designated as a "translator" for a single file or a whole directory hierarchy. Every access to the translated file, or files below a hierarchy in the second case, is in fact handled by the program. For example, a file translator may simply redirect read and write operations to another file, like a Unix symbolic link. The effect of Unix "mounting" is achieved by setting up a filesystem translator (using the "settrans" command). Translators can also be used to provide services to the user. For example, the ftpfs translator allows a user to encapsulate remote FTP sites within a directory. Then, standard tools such as ls, cp, and rm can be used to manipulate files on the remote system. Even more powerful translators are ones such as UnionFS, which allows a user to unify multiple directories into one; thus listing the unified directory reveals the contents of all the directories.
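The simplest case described above – a translator that redirects operations to another file, like a symbolic link – can be sketched as a toy in-memory filesystem. All names here are invented for illustration; the real mechanism uses the "settrans" command and Hurd server libraries:

```python
# Conceptual sketch (invented names, not Hurd code): a "translator" is a
# program attached to a path; every access to that path is handled by it.

class RedirectTranslator:
    """Forwards read operations on one path to another file,
    behaving like a symbolic link."""
    def __init__(self, fs, target):
        self.fs, self.target = fs, target

    def read(self):
        return self.fs.read(self.target)

class ToyFS:
    def __init__(self):
        self.files, self.translators = {}, {}

    def settrans(self, path, translator):
        # Loosely analogous to the Hurd's `settrans` command:
        # attach a translator program to a filesystem node.
        self.translators[path] = translator

    def read(self, path):
        # Accesses to a translated node are handled by the translator.
        if path in self.translators:
            return self.translators[path].read()
        return self.files[path]

fs = ToyFS()
fs.files["/real/data"] = "hello"
fs.settrans("/link", RedirectTranslator(fs, "/real/data"))
print(fs.read("/link"))  # hello
```

A translator like ftpfs would follow the same shape, except its `read` would fetch data from a remote FTP server instead of another local file.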
The Hurd requires a multiboot-compliant boot loader, such as GRUB.
According to the Debian documentation, there are 24 servers (18 core servers and 6 file system servers) named as follows:
The servers collectively implement the POSIX API, with each server implementing a part of the interface. For instance, the various filesystem servers each implement the filesystem calls. The storage server works as a wrapping layer, similar to the block layer of Linux. The equivalent of the Linux VFS is achieved through the libdiskfs and libpager libraries.
"Computer bought the farm" is the message displayed when an error occurs using some commands. According to the GNU Hurd FAQ:
This is the error message for EIEIO. This error code is used for a variety of "hopeless" error conditions. Most probably you will encounter it when a translator crashes while you were trying to use a file that it serves. You can thus think of it as an equivalent of the "blue screen of the death" or "Oops"... except that it's just an error! It doesn't take your whole system away with it, only the particular operation that was going on.
Hurd-based GNU distributions include:
Health care reform
Health care reform is for the most part governmental policy that affects health care delivery in a given place. Health care reform typically attempts to:
In the United States, the debate regarding health care reform includes questions of a right to health care, access, fairness, sustainability, quality and amounts spent by government. The mixed public-private health care system in the United States is the most expensive in the world, with health care costing more per person than in any other nation, and a greater portion of gross domestic product (GDP) is spent on it than in any other United Nations member state except for East Timor (Timor-Leste).
Both Hawaii and Massachusetts have implemented some incremental reforms in health care, but neither state has complete coverage of its citizens. For example, data from the Kaiser Family Foundation shows that 5% of Massachusetts and 8% of Hawaii residents are uninsured. To date, the U.S. Uniform Law Commission, sponsored by the National Conference of Commissioners on Uniform State Laws, has not submitted a uniform act or model legislation regarding health care insurance or health care reform.
In the United Kingdom, healthcare was reformed in 1948, after the Second World War, broadly along the lines of the 1942 Beveridge Report, with the creation of the National Health Service (NHS). It was originally established as part of a wider reform of social services and funded by a system of National Insurance, though receipt of healthcare was never contingent upon making contributions towards the National Insurance Fund. Private health care was not abolished but had to compete with the NHS. About 15% of all spending on health in the UK is still privately funded, but this includes patient contributions towards NHS-provided prescription drugs, so private sector healthcare in the UK is quite small. As part of a wider reform of social provision, it was originally thought that the focus would be as much on the prevention of ill-health as on curing disease. The NHS, for example, distributed baby formula milk fortified with vitamins and minerals in an effort to improve the health of children born in the post-war years, as well as other supplements such as cod liver oil and malt. Many of the common childhood diseases such as measles, mumps, and chicken pox were mostly eradicated with a national programme of vaccinations.
The NHS has been through many reforms since 1974. The Conservative Thatcher administrations attempted to bring competition into the NHS by developing a supplier/buyer relationship between hospitals as suppliers and health authorities as buyers. This necessitated the detailed costing of activities, something which the NHS had never had to do in such detail, and which some felt was unnecessary. The Labour Party generally opposed these changes, although after the party became New Labour, the Blair government retained elements of competition and even extended them, allowing private health care providers to bid for NHS work. Some treatment and diagnostic centres are now run by private enterprise and funded under contract. However, the extent of this privatisation of NHS work is still small, though it remains controversial. The administration committed more money to the NHS, raising its funding to almost the European average, and as a result there was a large expansion and modernisation programme, and waiting times improved.
The government of Gordon Brown proposed new reforms for care in England. One is to take the NHS back more towards health prevention by tackling issues that are known to cause long term ill health. The biggest of these is obesity and related diseases such as diabetes and cardio-vascular disease. The second reform is to make the NHS a more personal service, and it is negotiating with doctors to provide more services at times more convenient to the patient, such as in the evenings and at weekends. This personal service idea would introduce regular health check-ups so that the population is screened more regularly. Doctors will give more advice on ill-health prevention (for example encouraging and assisting patients to control their weight, diet, exercise more, cease smoking etc.) and so tackle problems before they become more serious. Waiting times, which fell considerably under Blair (median wait time is about 6 weeks for elective non-urgent surgery) are also in focus. A target was set from December 2008, to ensure that no person waits longer than 18 weeks from the date that a patient is referred to the hospital to the time of the operation or treatment. This 18-week period thus includes the time to arrange a first appointment, the time for any investigations or tests to determine the cause of the problem and how it should be treated. An NHS Constitution was published which lays out the legal rights of patients as well as promises (not legally enforceable) the NHS strives to keep in England.
Numerous healthcare reforms in Germany have been legislative interventions to stabilise public health insurance since 1983. Nine out of ten citizens are publicly insured, and only 8% privately. Health care in Germany, including its industry and all services, is one of the largest sectors of the German economy. Total expenditure on health in Germany was about 287.3 billion euros in 2010, equivalent to 11.6 percent of gross domestic product (GDP) that year and about 3,510 euros per capita. Direct inpatient and outpatient care account for just about a quarter of the entire expenditure, depending on the perspective. Expenditure on pharmaceutical drugs is almost twice that for the entire hospital sector. Pharmaceutical drug expenditure grew by an annual average of 4.1% between 2004 and 2010.
These developments have prompted numerous healthcare reforms since the 1980s. One concrete example from 2010–11: for the first time since 2004, drug expenditure fell, from 30.2 billion euros in 2010 to 29.1 billion euros in 2011, i.e. a decrease of 1.1 billion euros, or 3.6%. This was caused by a restructuring of the Social Security Code: the manufacturer discount was raised from 6% to 16%, a price moratorium was imposed, and discount contracts, as well as discounts by wholesalers and pharmacies, were increased.
The Netherlands has introduced a new system of health care insurance based on risk equalization through a risk equalization pool. In this way, a compulsory insurance package is available to all citizens at affordable cost without the need for the insured to be assessed for risk by the insurance company. Furthermore, health insurers are now willing to take on high risk individuals because they receive compensation for the higher risks.
A 2008 article in the journal Health Affairs suggested that the Dutch health system, which combines mandatory universal coverage with competing private health plans, could serve as a model for reform in the US.
Following the collapse of the Soviet Union, Russia embarked on a series of reforms intended to deliver better healthcare through compulsory medical insurance with privately owned providers in addition to the state-run institutions. According to the OECD, none of the 1991–93 reforms worked out as planned, and in many respects they made the system worse. Russia has more physicians, hospitals, and healthcare workers than almost any other country in the world on a per capita basis, but since the collapse of the Soviet Union the health of the Russian population has declined considerably as a result of social, economic, and lifestyle changes. However, after Putin became president in 2000 there was significant growth in spending on public healthcare, and in 2006 it exceeded the pre-1991 level in real terms. Life expectancy also increased from its 1991–93 levels, and the infant mortality rate dropped from 18.1 in 1995 to 8.4 in 2008. Russian Prime Minister Vladimir Putin announced a large-scale health care reform in 2011 and pledged to allocate more than 300 billion rubles ($10 billion) over the following few years to improve health care in the country.
Taiwan changed its healthcare system in 1995 to a National Health Insurance model similar to the US Medicare system for seniors. As a result, the 40% of Taiwanese people who had previously been uninsured are now covered. It is said to deliver universal coverage with free choice of doctors and hospitals and no waiting lists. Polls in 2005 are reported to have shown that 72.5% of Taiwanese are happy with the system, and when they are unhappy, it's with the cost of premiums (equivalent to less than US$20 a month).
Employers and the self-employed are legally bound to pay National Health Insurance (NHI) premiums which are similar to social security contributions in other countries. However, the NHI is a pay-as-you-go system. The aim is for the premium income to pay costs. The system is also subsidized by a tobacco tax surcharge and contributions from the national lottery.
As evidenced by the large variety of healthcare systems seen across the world, there are several different pathways that a country could take when thinking about reform. In comparison to the UK, physicians in Germany have more bargaining power through professional organizations (i.e., physician associations); this ability to negotiate affects reform efforts. Germany makes use of sickness funds, which citizens are obliged to join but are able to opt out of if they have a very high income (Belien 87). The Netherlands used a similar system, but the financial threshold for opting out was lower (Belien 89). The Swiss, on the other hand, use a more privately based health insurance system in which citizens are risk-rated by age and sex, among other factors (Belien 90). The United States government provides healthcare to just over 25% of its citizens through various agencies, but otherwise does not employ a unified system; healthcare is generally centered on regulated private insurance.
One key component to healthcare reform is the reduction of healthcare fraud and abuse. In the U.S. and the EU, it is estimated that as much as 10 percent of all healthcare transactions and expenditures may be fraudulent. See Terry L. Leap, "Phantom Billing, Fake Prescriptions, and the High Cost of Medicine: Health Care Fraud and What to do about It" (Cornell University Press, 2011).
Also of interest are the advantages and disadvantages of the oldest healthcare system in the world; see Health in Germany.
In "“Getting Health Reform Right: A Guide to Improving Performance and Equity,”" Marc Roberts, William Hsiao, Peter Berman, and Michael Reich of the Harvard T.H. Chan School of Public Health aim to provide decision-makers with tools and frameworks for health care system reform. They propose five “control knobs” of health reform: financing, payment, organization, regulation, and behavior. These control knobs refer to the “mechanisms and processes that reformers can adjust to improve system performance”. The authors selected these control knobs as representative of the most important factors upon which a policymaker can act to determine health system outcomes.
Their method emphasizes the importance of “identifying goals explicitly, diagnosing causes of poor performance systematically, and devising reforms that will produce real changes in performance”. The authors view health care systems as a means to an end. Accordingly, the authors advocate for three intrinsic performance goals of the health system that can be adjusted through the control knobs. These goals include:
The authors also propose three intermediate performance measures, which are useful in determining the performance of system goals, but are not final objectives. These include:
While final performance goals are largely agreed upon, other frameworks suggest alternative intermediate goals to those mentioned here, such as equity, productivity, safety, innovation, and choice.
The five proposed control knobs represent the mechanisms and processes that policy-makers can use to design effective health care reforms. These control knobs are not only the most important elements of a healthcare system; they also represent the aspects that can be deliberately adjusted by reforms to effect change. The five control knobs are:
The five control knobs of health care reform are not designed to work in isolation; health care reform may require the adjustment of more than one knob or of multiple knobs simultaneously. Further, there is no agreed-upon order of turning control knobs to achieve specific reforms or outcomes. Health care reform varies by setting and reforms from one context may not necessarily apply in another. It is important to note that the knobs interact with cultural and structural factors that are not illustrated within this framework, but which have an important effect on health care reform in a given context.
In summary, the authors of "“Getting Health Reform Right: A Guide to Improving Performance and Equity”" propose a framework for assessing health systems that guides decision-makers’ understanding of the reform process. Rather than a prescriptive proposal of recommendations, the framework allows users to adapt their analysis and actions based on cultural context and relevance of interventions. As noted above, many frameworks for health care reform exist in the literature. Using a comprehensive yet responsive approach such as the control knobs framework proposed by Roberts, Hsiao, Berman, and Reich allows decision-makers to more precisely determine the “mechanisms and processes” that can be changed in order to achieve improved health status, customer satisfaction, and financial risk protection.
Henry Mayhew
Henry Mayhew (25 November 1812 – 25 July 1887) was a journalist, playwright and advocate of reform. He was one of the co-founders of the satirical magazine "Punch" in 1841, and was the magazine's joint-editor, with Mark Lemon, in its early days. He is also known for his work as a social researcher, publishing an extensive series of newspaper articles in the "Morning Chronicle" that was later compiled into the book series "London Labour and the London Poor" (1851), a groundbreaking and influential survey of the city's poor.
He was born in London, one of seventeen children of Joshua Mayhew. He was educated at Westminster School before running away from his studies to sea. He then served with the East India Company as a midshipman on a ship bound for Calcutta. He returned after several years, in 1829, becoming a trainee lawyer in Wales. He left this and became a freelance journalist. He contributed to "The Thief", a readers' digest, followed quickly by editing a weekly journal – "Figaro in London". Mayhew reputedly fled his creditors and holed up at The Erwood Inn, a small public house in the village of Erwood, south of Builth Wells in Wales.
In 1835 Mayhew found himself in debt and, along with a fellow writer, escaped to Paris to avoid his creditors. He spent his time writing and in the company of other writers including William Thackeray and Douglas Jerrold. Mayhew spent over ten years in Paris before returning to England in the 1850s, where he was involved in several literary ventures, mostly the writing of plays. Two of his plays – "But, However" and the "Wandering Minstrel" – were successful, whilst his early work "Figaro in London" was less so.
On 17 July 1841 Mayhew cofounded Punch magazine. At its founding the magazine was jointly edited by Mayhew and Mark Lemon. The two men hired a group of writers and also illustrators to aid them. These included Douglas Jerrold, Angus Reach, John Leech, Richard Doyle and Shirley Brooks. Initially it was subtitled "The London Charivari", this being a reference to a satirical humour magazine published in France under the title "Le Charivari" (a work read often whilst Mayhew was in Paris). Reflecting their satiric and humorous intent, the two editors took for their name and masthead the anarchic glove puppet Mr. Punch.
"Punch" was an unexpected success, selling about 6,000 copies a week in the early years. However, sales of as many as 10,000 issues a week were required to cover all costs of the magazine. In December 1842, the magazine was sold to Bradbury and Evans; Mayhew resigned as joint editor, and he continued at the magazine as "suggestor in chief" with Mark Lemon reappointed as editor. Mayhew eventually severed his connection with the magazine, writing his last article in February 1845. His brother Horace stayed on the board of Punch until his own death.
The "Punch" years gave Mayhew the opportunity to meet talented illustrators whom he later employed to work from daguerreotypes on "London Labour and the London Poor". After "Punch", Mayhew launched "Iron Times", a railway magazine. This venture, however, lost Mayhew so much money that he was forced to appear in the Court of Bankruptcy in 1846.
In 1842 Mayhew contributed to the pioneering "Illustrated London News". By this time Mayhew had become reasonably secure financially, had settled his debts and had married Jane Jerrold, the daughter of his friend Douglas Jerrold. She lived until 1880.
The articles comprising "London Labour and the London Poor" were initially collected into three volumes in 1851; the 1861 edition included a fourth volume, co-written with Bracebridge Hemyng, John Binny and Andrew Halliday, on the lives of prostitutes, thieves and beggars. This Extra Volume took a more general and statistical approach to its subject than Volumes 1 to 3.
Mayhew wrote in volume one: "I shall consider the whole of the metropolitan poor under three separate phases, according as they "will" work, they "can't" work, and they "won't" work". He interviewed everyone – beggars, street-entertainers (such as Punch and Judy men), market traders, prostitutes, labourers, sweatshop workers, even down to the "mudlarks" who searched the stinking mud on the banks of the River Thames for wood, metal, rope and coal from passing ships, and the "pure-finders" who gathered dog faeces to sell to tanners. He described their clothes, how and where they lived, their entertainments and customs, and made detailed estimates of the numbers and incomes of those practising each trade. The books show how marginal and precarious many people's lives were, in what, at that time, was the richest city in the world.
Mayhew's richly detailed descriptions give an impression of what the street markets of his day were like. An example from Volume 1:
Some of the London street traders did not like the way Mayhew wrote about them. In spring/summer 1851 they established a "Street Trader's Protection Association" to guard themselves against the journalist.
Mayhew was the grandfather of Audrey Mayhew Allen (b. 1870), an author of a number of children's stories published in various periodicals, and of "Gladys in Grammarland", an imitation of Lewis Carroll's "Wonderland" books.
Mayhew's work was embraced by and was an influence on the Christian Socialists, such as Thomas Hughes, Charles Kingsley, and F. D. Maurice. Radicals also published sizeable excerpts from the reports in the "Northern Star", the "Red Republican", and other newspapers. The often sympathetic investigations, with their immediacy and unswerving eye for detail, offered unprecedented insights into the condition of the Victorian poor. Alongside the earlier work of Edwin Chadwick, they are also regarded as a decisive influence on the thinking of Charles Dickens.
Mayhew's work inspired the script of director Christine Edzard's 1990 film "The Fool". Mayhew has appeared as a character in television and radio histories of Victorian London; he was played by Timothy West in the documentary "London" (2004) and by David Haig in the Afternoon Play "A Chaos of Wealth and Want" (2010). In the 2012 novel "Dodger" by Terry Pratchett, Mayhew and his wife appear as fictionalised versions of themselves, and he is mentioned in the dedication.
Hydrogen
Hydrogen is the chemical element with the symbol H and atomic number 1. With a standard atomic weight of , hydrogen is the lightest element in the periodic table. Hydrogen is the most abundant chemical substance in the Universe, constituting roughly 75% of all baryonic mass. Non-remnant stars are mainly composed of hydrogen in the plasma state. The most common isotope of hydrogen, termed "protium" (name rarely used, symbol 1H), has one proton and no neutrons.
The universal emergence of atomic hydrogen first occurred during the recombination epoch (Big Bang). At standard temperature and pressure, hydrogen is a colorless, odorless, tasteless, non-toxic, nonmetallic, highly combustible diatomic gas with the molecular formula H2. Since hydrogen readily forms covalent compounds with most nonmetallic elements, most of the hydrogen on Earth exists in molecular forms such as water or organic compounds. Hydrogen plays a particularly important role in acid–base reactions because most acid–base reactions involve the exchange of protons between soluble molecules. In ionic compounds, hydrogen can take the form of a negatively charged ion (anion), when it is known as a hydride, or of a positively charged (cationic) species denoted by the symbol H+. The hydrogen cation is written as though composed of a bare proton, but in reality, hydrogen cations in ionic compounds are always more complex. As the only neutral atom for which the Schrödinger equation can be solved analytically, the hydrogen atom and the study of its energetics and bonding have played a key role in the development of quantum mechanics.
Hydrogen gas was first artificially produced in the early 16th century by the reaction of acids on metals. In 1766–81, Henry Cavendish was the first to recognize that hydrogen gas was a discrete substance, and that it produces water when burned, the property for which it was later named: in Greek, hydrogen means "water-former".
Industrial production is mainly from steam reforming natural gas, and less often from more energy-intensive methods such as the electrolysis of water. Most hydrogen is used near the site of its production, the two largest uses being fossil fuel processing (e.g., hydrocracking) and ammonia production, mostly for the fertilizer market. Hydrogen is problematic in metallurgy because it can embrittle many metals, complicating the design of pipelines and storage tanks.
Hydrogen gas (dihydrogen or molecular hydrogen) is highly flammable:
The enthalpy of combustion is −286 kJ/mol:
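The molar figure above translates into a specific energy as follows; a minimal sketch, assuming the standard molar mass of H2 (2.016 g/mol), which is not quoted in the text:

```python
# Specific energy of hydrogen combustion from the molar enthalpy
# quoted above (-286 kJ per mol of H2, higher heating value).
MOLAR_ENTHALPY_KJ = 286.0   # kJ released per mol of H2 burned (HHV)
MOLAR_MASS_G = 2.016        # g/mol for H2 (standard value, assumed here)

specific_energy_mj_per_kg = MOLAR_ENTHALPY_KJ / MOLAR_MASS_G  # kJ/g == MJ/kg
print(f"{specific_energy_mj_per_kg:.1f} MJ/kg")  # ~141.9 MJ/kg
```

This roughly 142 MJ/kg figure is the highest specific energy of any chemical fuel, about three times that of gasoline.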
Hydrogen gas forms explosive mixtures with air in concentrations from 4–74% and with chlorine at 5–95%. The explosive reactions may be triggered by spark, heat, or sunlight. The hydrogen autoignition temperature, the temperature of spontaneous ignition in air, is .
Pure hydrogen-oxygen flames emit ultraviolet light and with high oxygen mix are nearly invisible to the naked eye, as illustrated by the faint plume of the Space Shuttle Main Engine, compared to the highly visible plume of a Space Shuttle Solid Rocket Booster, which uses an ammonium perchlorate composite. The detection of a burning hydrogen leak may require a flame detector; such leaks can be very dangerous. Hydrogen flames in other conditions are blue, resembling blue natural gas flames. The destruction of the Hindenburg airship was a notorious example of hydrogen combustion and the cause is still debated. The visible orange flames in that incident were the result of a rich mixture of hydrogen to oxygen combined with carbon compounds from the airship skin.
H2 is relatively unreactive. The thermodynamic basis of this low reactivity is the very strong H–H bond, with a bond dissociation energy of 435.7 kJ/mol. The kinetic basis of the low reactivity is the nonpolar nature of H2 and its weak polarizability. It spontaneously reacts with chlorine and fluorine to form hydrogen chloride and hydrogen fluoride, respectively. Molten sodium and potassium react with the gas to give the respective hydrides NaH and KH. The reactivity of H2 is strongly affected by the presence of metal catalysts. Thus, while H2 combusts readily, mixtures of H2 and O2 do not react in the absence of a catalyst.
The ground state energy level of the electron in a hydrogen atom is −13.6 eV, which is equivalent to an ultraviolet photon of roughly 91 nm wavelength.
The energy levels of hydrogen can be calculated fairly accurately using the Bohr model of the atom, which conceptualizes the electron as "orbiting" the proton in analogy to the Earth's orbit of the Sun. However, the atomic electron and proton are held together by electromagnetic force, while planets and celestial objects are held by gravity. Because of the discretization of angular momentum postulated in early quantum mechanics by Bohr, the electron in the Bohr model can only occupy certain allowed distances from the proton, and therefore only certain allowed energies.
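The Bohr-model energies can be made concrete with a few lines of arithmetic. This sketch assumes only the −13.6 eV ground-state energy quoted above and the standard constant hc ≈ 1239.84 eV·nm:

```python
# Bohr-model energy levels of hydrogen, E_n = -13.6 eV / n^2, and the
# wavelength of a photon carrying the full ionization energy.
E1_EV = 13.6          # ground-state binding energy of hydrogen, eV
HC_EV_NM = 1239.84    # Planck constant times speed of light, eV*nm

def bohr_energy(n: int) -> float:
    """Energy of level n in eV (negative = bound)."""
    return -E1_EV / n**2

for n in (1, 2, 3):
    print(f"n={n}: {bohr_energy(n):+.2f} eV")

# Photon that just ionizes ground-state hydrogen:
wavelength_nm = HC_EV_NM / E1_EV
print(f"ionization photon: {wavelength_nm:.1f} nm")  # ~91.2 nm, ultraviolet
```

The n = 1 to n = 2 gap (10.2 eV) is the Lyman-alpha transition; the 91 nm value is the Lyman series limit mentioned in the paragraph above.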
A more accurate description of the hydrogen atom comes from a purely quantum mechanical treatment that uses the Schrödinger equation, Dirac equation or even the Feynman path integral formulation to calculate the probability density of the electron around the proton. The most complicated treatments allow for the small effects of special relativity and vacuum polarization. In the quantum mechanical treatment, the electron in a ground state hydrogen atom has no angular momentum at all—illustrating how the "planetary orbit" differs from electron motion.
Molecular H2 exists as two spin isomers, i.e. compounds that differ only in the spin states of their nuclei. In the orthohydrogen form, the spins of the two nuclei are parallel and form a triplet state with a molecular spin quantum number of 1; in the parahydrogen form the spins are antiparallel and form a singlet with a molecular spin quantum number of 0. At standard temperature and pressure, hydrogen gas contains about 25% of the para form and 75% of the ortho form, also known as the "normal form". The equilibrium ratio of orthohydrogen to parahydrogen depends on temperature, but because the ortho form is an excited state with a higher energy than the para form, it is unstable and cannot be purified. At very low temperatures, the equilibrium state is composed almost exclusively of the para form. The liquid and gas phase thermal properties of pure parahydrogen differ significantly from those of the normal form because of differences in rotational heat capacities, as discussed more fully in "spin isomers of hydrogen". The ortho/para distinction also occurs in other hydrogen-containing molecules or functional groups, such as water and methylene, but is of little significance for their thermal properties.
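The temperature dependence of the equilibrium ratio can be sketched from the rotational partition functions: odd rotational levels (ortho) carry nuclear-spin degeneracy 3, even levels (para) degeneracy 1. The rotational temperature used here (≈87.6 K for H2) is a standard literature value, not from the text:

```python
import math

THETA_ROT = 87.6  # rotational temperature of H2 in kelvin (assumed value)

def ortho_fraction(T: float, j_max: int = 20) -> float:
    """Equilibrium ortho fraction of H2 at temperature T (kelvin)."""
    # Odd-J rotational levels are ortho (spin weight 3), even-J are para.
    ortho = sum(3 * (2*j + 1) * math.exp(-j*(j + 1) * THETA_ROT / T)
                for j in range(1, j_max, 2))
    para = sum((2*j + 1) * math.exp(-j*(j + 1) * THETA_ROT / T)
               for j in range(0, j_max, 2))
    return ortho / (ortho + para)

print(f"300 K: {ortho_fraction(300):.0%} ortho")   # ~75%, "normal" hydrogen
print(f" 20 K: {1 - ortho_fraction(20):.1%} para") # almost pure para
```

This reproduces both claims in the paragraph: roughly 3:1 ortho:para at room temperature, and nearly pure parahydrogen near the boiling point of the liquid.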
The ortho form converts to the para form slowly at low temperatures. The ortho/para ratio in condensed H2 is an important consideration in the preparation and storage of liquid hydrogen: the conversion from ortho to para is exothermic and produces enough heat to evaporate some of the hydrogen liquid, leading to loss of liquefied material. Catalysts for the ortho-para interconversion, such as ferric oxide, activated carbon, platinized asbestos, rare earth metals, uranium compounds, chromic oxide, or some nickel compounds, are used during hydrogen cooling.
While H2 is not very reactive under standard conditions, it does form compounds with most elements. Hydrogen can form compounds with elements that are more electronegative, such as halogens (F, Cl, Br, I), or oxygen; in these compounds hydrogen takes on a partial positive charge. When bonded to a more electronegative element, particularly fluorine, oxygen, or nitrogen, hydrogen can participate in a form of medium-strength noncovalent bonding with another electronegative element with a lone pair, a phenomenon called hydrogen bonding that is critical to the stability of many biological molecules. Hydrogen also forms compounds with less electronegative elements, such as metals and metalloids, where it takes on a partial negative charge. These compounds are often known as hydrides.
Hydrogen forms a vast array of compounds with carbon called the hydrocarbons, and an even vaster array with heteroatoms that, because of their general association with living things, are called organic compounds. The study of their properties is known as organic chemistry and their study in the context of living organisms is known as biochemistry. By some definitions, "organic" compounds are only required to contain carbon. However, most of them also contain hydrogen, and because it is the carbon-hydrogen bond which gives this class of compounds most of its particular chemical characteristics, carbon-hydrogen bonds are required in some definitions of the word "organic" in chemistry. Millions of hydrocarbons are known, and they are usually formed by complicated pathways that seldom involve elemental hydrogen.
Hydrogen is highly soluble in many rare earth and transition metals and is soluble in both nanocrystalline and amorphous metals. Hydrogen solubility in metals is influenced by local distortions or impurities in the crystal lattice. These properties may be useful when hydrogen is purified by passage through hot palladium disks, but the gas's high solubility is a metallurgical problem, contributing to the embrittlement of many metals, complicating the design of pipelines and storage tanks.
Compounds of hydrogen are often called hydrides, a term that is used fairly loosely. The term "hydride" suggests that the H atom has acquired a negative or anionic character, denoted H−, and is used when hydrogen forms a compound with a more electropositive element. The existence of the hydride anion, suggested by Gilbert N. Lewis in 1916 for group 1 and 2 salt-like hydrides, was demonstrated by Moers in 1920 by the electrolysis of molten lithium hydride (LiH), producing a stoichiometric quantity of hydrogen at the anode. For hydrides other than group 1 and 2 metals, the term is quite misleading, considering the low electronegativity of hydrogen. An exception in group 2 hydrides is , which is polymeric. In lithium aluminium hydride, the anion carries hydridic centers firmly attached to the Al(III).
Although hydrides can be formed with almost all main-group elements, the number and combination of possible compounds varies widely; for example, more than 100 binary borane hydrides are known, but only one binary aluminium hydride. Binary indium hydride has not yet been identified, although larger complexes exist.
In inorganic chemistry, hydrides can also serve as bridging ligands that link two metal centers in a coordination complex. This function is particularly common in group 13 elements, especially in boranes (boron hydrides) and aluminium complexes, as well as in clustered carboranes.
Oxidation of hydrogen removes its electron and gives H+, which contains no electrons and a nucleus which is usually composed of one proton. That is why H+ is often called a proton. This species is central to discussion of acids. Under the Brønsted–Lowry acid–base theory, acids are proton donors, while bases are proton acceptors.
A bare proton cannot exist in solution or in ionic crystals because of its irresistible attraction to other atoms or molecules with electrons. Except at the high temperatures associated with plasmas, such protons cannot be removed from the electron clouds of atoms and molecules, and will remain attached to them. However, the term 'proton' is sometimes used loosely and metaphorically to refer to positively charged or cationic hydrogen attached to other species in this fashion, and as such is denoted "H+", without any implication that any single protons exist freely as a species.
To avoid the implication of the naked "solvated proton" in solution, acidic aqueous solutions are sometimes considered to contain a less unlikely fictitious species, termed the "hydronium ion" (). However, even in this case, such solvated hydrogen cations are more realistically conceived as being organized into clusters that form species closer to H. Other oxonium ions are found when water is in acidic solution with other solvents.
Although exotic on Earth, one of the most common ions in the universe is the ion, known as protonated molecular hydrogen or the trihydrogen cation.
NASA has investigated the use of atomic hydrogen as a rocket propellant. It could be stored in liquid helium to prevent it from recombining into molecular hydrogen. When the helium is vaporized, the atomic hydrogen would be released and combine back to molecular hydrogen. The result would be an intensely hot stream of hydrogen and helium gas. The liftoff weight of rockets could be reduced by 50% by this method.
Most interstellar hydrogen is in the form of atomic hydrogen because the atoms can seldom collide and combine. They are the source of the important 21 cm hydrogen line in astronomy at 1420 MHz.
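The 21 cm figure follows directly from the quoted frequency via lambda = c / f; a minimal check:

```python
# Wavelength of the hydrogen hyperfine line from the frequency
# quoted above (1420 MHz).
C = 2.998e8        # speed of light, m/s
F_HZ = 1420.0e6    # hyperfine transition frequency, Hz

wavelength_cm = C / F_HZ * 100
print(f"{wavelength_cm:.1f} cm")  # ~21.1 cm
```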
Hydrogen has three naturally occurring isotopes, denoted , and . Other, highly unstable nuclei ( to ) have been synthesized in the laboratory but not observed in nature.
Hydrogen is unique among the elements in that distinct names are assigned to its isotopes in common use today. During the early study of radioactivity, various heavy radioactive isotopes were given their own names, but such names are no longer used, except for deuterium and tritium. The symbols D and T (instead of and ) are sometimes used for deuterium and tritium, but the corresponding symbol for protium, P, is already in use for phosphorus and thus is not available for protium. In its nomenclatural guidelines, the International Union of Pure and Applied Chemistry (IUPAC) allows any of D, T, , and to be used, although and are preferred.
The exotic atom muonium (symbol Mu), composed of an antimuon and an electron, is also sometimes considered as a light radioisotope of hydrogen, due to the mass difference between the antimuon and the electron. Muonium was discovered in 1960. During the muon's lifetime, muonium can enter into compounds such as muonium chloride (MuCl) or sodium muonide (NaMu), analogous to hydrogen chloride and sodium hydride respectively.
In 1671, Robert Boyle discovered and described the reaction between iron filings and dilute acids, which results in the production of hydrogen gas. In 1766, Henry Cavendish was the first to recognize hydrogen gas as a discrete substance, naming the gas from a metal-acid reaction "inflammable air". He speculated that "inflammable air" was in fact identical to the hypothetical substance called "phlogiston", and in 1781 he found that the gas produces water when burned. He is usually given credit for the discovery of hydrogen as an element. In 1783, Antoine Lavoisier gave the element the name hydrogen (from the Greek ὑδρο- "hydro" meaning "water" and -γενής "genes" meaning "creator") when he and Laplace reproduced Cavendish's finding that water is produced when hydrogen is burned.
Lavoisier produced hydrogen for his experiments on mass conservation by reacting a flux of steam with metallic iron through an incandescent iron tube heated in a fire. Anaerobic oxidation of iron by the protons of water at high temperature can be schematically represented by the set of following reactions:
Many metals such as zirconium undergo a similar reaction with water leading to the production of hydrogen.
Hydrogen was liquefied for the first time by James Dewar in 1898 by using regenerative cooling and his invention, the vacuum flask. He produced solid hydrogen the next year. Deuterium was discovered in December 1931 by Harold Urey, and tritium was prepared in 1934 by Ernest Rutherford, Mark Oliphant, and Paul Harteck. Heavy water, which consists of deuterium in the place of regular hydrogen, was discovered by Urey's group in 1932. François Isaac de Rivaz built the first de Rivaz engine, an internal combustion engine powered by a mixture of hydrogen and oxygen in 1806. Edward Daniel Clarke invented the hydrogen gas blowpipe in 1819. The Döbereiner's lamp and limelight were invented in 1823.
The first hydrogen-filled balloon was invented by Jacques Charles in 1783. Hydrogen provided the lift for the first reliable form of air-travel following the 1852 invention of the first hydrogen-lifted airship by Henri Giffard. German count Ferdinand von Zeppelin promoted the idea of rigid airships lifted by hydrogen that later were called Zeppelins; the first of which had its maiden flight in 1900. Regularly scheduled flights started in 1910 and by the outbreak of World War I in August 1914, they had carried 35,000 passengers without a serious incident. Hydrogen-lifted airships were used as observation platforms and bombers during the war.
The first non-stop transatlantic crossing was made by the British airship "R34" in 1919. Regular passenger service resumed in the 1920s and the discovery of helium reserves in the United States promised increased safety, but the U.S. government refused to sell the gas for this purpose. Therefore, H2 was used in the "Hindenburg" airship, which was destroyed in a midair fire over New Jersey on 6 May 1937. The incident was broadcast live on radio and filmed. Ignition of leaking hydrogen is widely assumed to be the cause, but later investigations pointed to the ignition of the aluminized fabric coating by static electricity. But the damage to hydrogen's reputation as a lifting gas was already done and commercial hydrogen airship travel ceased. Hydrogen is still used, in preference to non-flammable but more expensive helium, as a lifting gas for weather balloons.
The first hydrogen-cooled turbogenerator went into service in 1937 at Dayton, Ohio, operated by the Dayton Power & Light Co., with gaseous hydrogen as a coolant in the rotor and the stator. Because of the high thermal conductivity and very low viscosity of hydrogen gas, and thus lower drag than air, this is the most common type in its field today for large generators (typically 60 MW and bigger; smaller generators are usually air-cooled).
The nickel hydrogen battery was used for the first time in 1977 aboard the U.S. Navy's Navigation technology satellite-2 (NTS-2). For example, the ISS, Mars Odyssey and the Mars Global Surveyor are equipped with nickel-hydrogen batteries. In the dark part of its orbit, the Hubble Space Telescope is also powered by nickel-hydrogen batteries, which were finally replaced in May 2009, more than 19 years after launch and 13 years beyond their design life.
Because of its simple atomic structure, consisting only of a proton and an electron, the hydrogen atom, together with the spectrum of light produced from it or absorbed by it, has been central to the development of the theory of atomic structure. Furthermore, study of the corresponding simplicity of the hydrogen molecule and the corresponding cation brought understanding of the nature of the chemical bond, which followed shortly after the quantum mechanical treatment of the hydrogen atom had been developed in the mid-1920s.
One of the first quantum effects to be explicitly noticed (but not understood at the time) was a Maxwell observation involving hydrogen, half a century before full quantum mechanical theory arrived. Maxwell observed that the specific heat capacity of H2 unaccountably departs from that of a diatomic gas below room temperature and begins to increasingly resemble that of a monatomic gas at cryogenic temperatures. According to quantum theory, this behavior arises from the spacing of the (quantized) rotational energy levels, which are particularly wide-spaced in H2 because of its low mass. These widely spaced levels inhibit equal partition of heat energy into rotational motion in hydrogen at low temperatures. Diatomic gases composed of heavier atoms do not have such widely spaced levels and do not exhibit the same effect.
Antihydrogen () is the antimatter counterpart to hydrogen. It consists of an antiproton with a positron. Antihydrogen is the only type of antimatter atom to have been produced.
Hydrogen, as atomic H, is the most abundant chemical element in the universe, making up 75% of normal matter by mass and more than 90% by number of atoms. (Most of the mass of the universe, however, is not in the form of chemical-element type matter, but rather is postulated to occur as yet-undetected forms of mass such as dark matter and dark energy.) This element is found in great abundance in stars and gas giant planets. Molecular clouds of H2 are associated with star formation. Hydrogen plays a vital role in powering stars through the proton-proton reaction in stars of up to approximately one solar mass, and through the CNO cycle of nuclear fusion in stars more massive than the Sun.
Throughout the universe, hydrogen is mostly found in the atomic and plasma states, with properties quite distinct from those of molecular hydrogen. As a plasma, hydrogen's electron and proton are not bound together, resulting in very high electrical conductivity and high emissivity (producing the light from the Sun and other stars). The charged particles are highly influenced by magnetic and electric fields. For example, in the solar wind they interact with the Earth's magnetosphere giving rise to Birkeland currents and the aurora. Hydrogen is found in the neutral atomic state in the interstellar medium. The large amount of neutral hydrogen found in the damped Lyman-alpha systems is thought to dominate the cosmological baryonic density of the Universe up to redshift "z"=4.
Under ordinary conditions on Earth, elemental hydrogen exists as the diatomic gas, H2. However, hydrogen gas is very rare in the Earth's atmosphere (1 ppm by volume) because of its light weight, which enables it to escape from Earth's gravity more easily than heavier gases. However, hydrogen is the third most abundant element on the Earth's surface, mostly in the form of chemical compounds such as hydrocarbons and water. Hydrogen gas is produced by some bacteria and algae and is a natural component of flatus, as is methane, itself a hydrogen source of increasing importance.
A molecular form called protonated molecular hydrogen () is found in the interstellar medium, where it is generated by ionization of molecular hydrogen from cosmic rays. This ion has also been observed in the upper atmosphere of the planet Jupiter. The ion is relatively stable in the environment of outer space due to the low temperature and density. is one of the most abundant ions in the Universe, and it plays a notable role in the chemistry of the interstellar medium. Neutral triatomic hydrogen H3 can exist only in an excited form and is unstable. By contrast, the positive hydrogen molecular ion () is a rare molecule in the universe.
H2 is produced in chemistry and biology laboratories, often as a by-product of other reactions; in industry for the hydrogenation of unsaturated substrates; and in nature as a means of expelling reducing equivalents in biochemical reactions.
The electrolysis of water is a simple method of producing hydrogen. A low voltage current is run through the water, and gaseous oxygen forms at the anode while gaseous hydrogen forms at the cathode. Typically the cathode is made from platinum or another inert metal when producing hydrogen for storage. If, however, the gas is to be burnt on site, oxygen is desirable to assist the combustion, and so both electrodes would be made from inert metals. (Iron, for instance, would oxidize, and thus decrease the amount of oxygen given off.) The theoretical maximum efficiency (electricity used vs. energetic value of hydrogen produced) is in the range 88–94%.
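The theoretical energy requirement can be estimated from the enthalpy of formation of liquid water (286 kJ per mol of H2, the higher heating value); a sketch, with the molar mass of H2 assumed as a standard constant:

```python
# Minimum electrical energy to produce one kilogram of hydrogen by
# electrolysis, based on the 286 kJ/mol enthalpy (HHV basis).
DELTA_H_KJ_PER_MOL = 286.0
MOLAR_MASS_KG = 2.016e-3      # kg per mol of H2 (standard value)
KJ_PER_KWH = 3600.0

kwh_per_kg = DELTA_H_KJ_PER_MOL / MOLAR_MASS_KG / KJ_PER_KWH
print(f"theoretical minimum: {kwh_per_kg:.1f} kWh per kg H2")  # ~39.4

# At the 88-94% efficiency range quoted above, real consumption would be:
for eff in (0.88, 0.94):
    print(f"  at {eff:.0%} efficiency: {kwh_per_kg / eff:.1f} kWh/kg")
```

So a practical electrolyser in the quoted efficiency range would draw roughly 42 to 45 kWh of electricity per kilogram of hydrogen.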
When determining the electrical efficiency of PEM (proton exchange membrane) electrolysis, the higher heat value (HHV) is used. This is because the catalyst layer interacts with water as steam. As the process operates at 80 °C for PEM electrolysers the waste heat can be redirected through the system to create the steam, resulting in a higher overall electrical efficiency. The lower heat value (LHV) must be used for alkaline electrolysers as the process within these electrolysers requires water in liquid form and uses alkalinity to facilitate the breaking of the bond holding the hydrogen and oxygen atoms together. The lower heat value must also be used for fuel cells, as steam is the output rather than input.
Hydrogen is often produced using natural gas, which involves the removal of hydrogen from hydrocarbons at very high temperatures, with about 95% of hydrogen production coming from steam reforming around the year 2000. Commercial bulk hydrogen is usually produced by the steam reforming of natural gas. This method is also known as the Bosch process and is widely used for the industrial preparation of hydrogen.
At high temperatures (1000–1400 K, 700–1100 °C or 1300–2000 °F), steam (water vapor) reacts with methane to yield carbon monoxide and H2.
This reaction is favored at low pressures but is nonetheless conducted at high pressures (2.0 MPa, 20 atm or 600 inHg), because high-pressure H2 is the most marketable product and pressure swing adsorption (PSA) purification systems work better at higher pressures. The product mixture is known as "synthesis gas" because it is often used directly for the production of methanol and related compounds. Hydrocarbons other than methane can be used to produce synthesis gas with varying product ratios. One of the many complications to this highly optimized technology is the formation of coke or carbon:
Consequently, steam reforming typically employs an excess of . Additional hydrogen can be recovered from the steam by use of carbon monoxide through the water gas shift reaction, especially with an iron oxide catalyst. This reaction is also a common industrial source of carbon dioxide:
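The combined stoichiometry of reforming plus shift can be tallied in a few lines; the balanced equations in the comments are the standard forms of the two reactions the text describes:

```python
# Stoichiometric hydrogen yield of steam reforming followed by the
# water gas shift, per mole of methane:
#   CH4 + H2O -> CO  + 3 H2   (steam reforming)
#   CO  + H2O -> CO2 +   H2   (water gas shift)
H2_FROM_REFORMING = 3   # mol H2 per mol CH4 reformed
H2_FROM_SHIFT = 1       # mol H2 per mol CO shifted

total_h2 = H2_FROM_REFORMING + H2_FROM_SHIFT
print(f"{total_h2} mol H2 per mol CH4")  # 4 mol, if all CO is shifted

# Mass basis: 16.04 g of CH4 can yield up to 4 * 2.016 g of H2.
print(f"{4 * 2.016 / 16.04:.0%} of the methane mass ends up as H2")
```

Note that half of the four moles of hydrogen ultimately come from the steam, not from the methane itself.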
Other important methods for production include partial oxidation of hydrocarbons:
and the coal reaction, which can serve as a prelude to the shift reaction above:
Hydrogen is sometimes produced and consumed in the same industrial process, without being separated. In the Haber process for the production of ammonia, hydrogen is generated from natural gas. Electrolysis of brine to yield chlorine also produces hydrogen as a co-product.
Many metals react with water to produce H2, but the rate of hydrogen evolution depends on the metal, the pH, and the presence of alloying agents. Most commonly, hydrogen evolution is induced by acids. The alkali and alkaline earth metals, aluminium, zinc, manganese, and iron react readily with aqueous acids. This reaction is the basis of the Kipp's apparatus, which was once used as a laboratory source of hydrogen:
In the absence of acid, the evolution of H2 is slower. Of technological significance, because iron is a widely used structural material, is its anaerobic corrosion:
Many metals, e.g. aluminium, are slow to react with water because they form passivated coatings of oxides. An alloy of aluminium and gallium however does react with water.
At high pH, aluminium can produce H2:
Some metal-containing compounds react with acids to evolve H2. Under anaerobic conditions, ferrous hydroxide () can be oxidized by the protons of water to form magnetite and H2. This process is described by the Schikorr reaction:
This process occurs during the anaerobic corrosion of iron and steel in oxygen-free groundwater and in reducing soils below the water table.
More than 200 thermochemical cycles can be used for water splitting. Many of these cycles, such as the iron oxide cycle, cerium(IV) oxide–cerium(III) oxide cycle, zinc-zinc oxide cycle, sulfur-iodine cycle, copper-chlorine cycle and hybrid sulfur cycle, have been evaluated for their commercial potential to produce hydrogen and oxygen from water and heat without using electricity. A number of laboratories (including in France, Germany, Greece, Japan, and the USA) are developing thermochemical methods to produce hydrogen from solar energy and water.
In deep geological conditions prevailing far from the Earth's atmosphere, hydrogen () is produced during the process of serpentinization. In this process, water protons (H+) are reduced by ferrous (Fe2+) ions provided by fayalite (). The reaction forms magnetite (), quartz (SiO2), and hydrogen ():
This reaction closely resembles the Schikorr reaction observed in anaerobic oxidation of ferrous hydroxide in contact with water.
Large quantities of H2 are used in the "upgrading" of fossil fuels. Key consumers of H2 include hydrodealkylation, hydrodesulfurization, and hydrocracking. Many of these reactions can be classified as hydrogenolysis, i.e., the cleavage of bonds to carbon. Illustrative is the separation of sulfur from liquid fossil fuels:
Hydrogenation, the addition of H2 to various substrates, is conducted on a large scale. The hydrogenation of N2 to give ammonia by the Haber-Bosch process consumes a few percent of the energy budget of the entire industrial sector. The resulting ammonia is used to supply the majority of the protein consumed by mankind. Hydrogenation is used to convert unsaturated fats and oils to saturated fats and oils; the major application is the production of margarine. Methanol is produced by hydrogenation of carbon dioxide. It is similarly the source of hydrogen in the manufacture of hydrochloric acid. H2 is also used as a reducing agent for the conversion of some ores to the metals.
Hydrogen is commonly used in power stations as a coolant in generators due to a number of favorable properties that are a direct result of its light diatomic molecules. These include low density, low viscosity, and the highest specific heat and thermal conductivity of all gases.
Hydrogen is not an energy resource, except in the hypothetical context of commercial nuclear fusion power plants using deuterium or tritium, a technology presently far from development. The Sun's energy comes from nuclear fusion of hydrogen, but this process is difficult to achieve controllably on Earth. Elemental hydrogen from solar, biological, or electrical sources requires more energy to make than is obtained by burning it, so in these cases hydrogen functions as an energy carrier, like a battery. Hydrogen may be obtained from fossil sources (such as methane), but these sources are unsustainable.
The energy density per unit "volume" of both liquid hydrogen and compressed hydrogen gas at any practicable pressure is significantly less than that of traditional fuel sources, although the energy density per unit fuel "mass" is higher. Nevertheless, elemental hydrogen has been widely discussed in the context of energy, as a possible future carrier of energy on an economy-wide scale. For example, CO2 sequestration followed by carbon capture and storage could be conducted at the point of H2 production from fossil fuels. Hydrogen used in transportation would burn relatively cleanly, with some NOx emissions but without carbon emissions. However, the infrastructure costs associated with full conversion to a hydrogen economy would be substantial. Fuel cells can convert hydrogen and oxygen directly to electricity more efficiently than internal combustion engines.
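The mass-versus-volume tradeoff can be made concrete with approximate figures. All numbers below are ballpark assumptions, not authoritative data: lower heating values of roughly 120 MJ/kg for hydrogen and 44 MJ/kg for gasoline, with densities of about 0.0708 kg/L for liquid hydrogen and 0.745 kg/L for gasoline:

```python
# Rough comparison of energy density by mass vs by volume.
# All values are approximate assumptions (lower-heating-value basis).
FUELS = {
    "liquid hydrogen": {"mj_per_kg": 120.0, "kg_per_l": 0.0708},
    "gasoline":        {"mj_per_kg": 44.0,  "kg_per_l": 0.745},
}

def mj_per_litre(fuel):
    """Volumetric energy density = gravimetric density x mass density."""
    f = FUELS[fuel]
    return f["mj_per_kg"] * f["kg_per_l"]

for name, f in FUELS.items():
    print(f"{name}: {f['mj_per_kg']:.0f} MJ/kg, {mj_per_litre(name):.1f} MJ/L")
```

Under these assumed figures hydrogen carries roughly three times the energy of gasoline per kilogram but only about a quarter of it per litre, which is the point the paragraph above makes.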
Hydrogen is employed to saturate broken ("dangling") bonds of amorphous silicon and amorphous carbon, which helps stabilize material properties. It is also a potential electron donor in various oxide materials, including ZnO, SnO2, CdO, MgO, ZrO2, HfO2, La2O3, Y2O3, TiO2, SrTiO3, LaAlO3, SiO2, Al2O3, ZrSiO4, HfSiO4, and SrZrO3.
Apart from its use as a reactant, H2 has a variety of smaller applications. It is used as a shielding gas in welding methods such as atomic hydrogen welding. H2 is used as the rotor coolant in electrical generators at power stations, because it has the highest thermal conductivity of any gas. Liquid H2 is used in cryogenic research, including superconductivity studies. Because H2 is lighter than air, having a little more than 1/14 of the density of air, it was once widely used as a lifting gas in balloons and airships.
Pure hydrogen, or hydrogen mixed with nitrogen (sometimes called forming gas), is a tracer gas for the detection of minute leaks. Applications can be found in the automotive, chemical, power generation, aerospace, and telecommunications industries. Hydrogen is an authorized food additive (E 949) that allows food package leak testing and also has anti-oxidizing properties.
Hydrogen's rarer isotopes also each have specific applications. Deuterium (hydrogen-2) is used in nuclear fission applications as a moderator to slow neutrons, and in nuclear fusion reactions. Deuterium compounds have applications in chemistry and biology in studies of reaction isotope effects. Tritium (hydrogen-3), produced in nuclear reactors, is used in the production of hydrogen bombs, as an isotopic label in the biosciences, and as a radiation source in luminous paints.
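Tritium's usefulness as a radiation source in luminous paints follows from simple exponential decay. A minimal sketch, assuming tritium's commonly cited half-life of about 12.32 years (an assumed figure, not stated in the text):

```python
import math

# Exponential decay of a tritium source, e.g. in luminous paint.
# Assumed half-life of tritium: ~12.32 years.
HALF_LIFE_Y = 12.32

def fraction_remaining(years):
    """Fraction of the original tritium left after `years`: N/N0 = 2^(-t/t_half)."""
    return math.pow(2.0, -years / HALF_LIFE_Y)

print(round(fraction_remaining(12.32), 3))  # 0.5  — one half-life
print(round(fraction_remaining(24.64), 3))  # 0.25 — two half-lives
```

This is why tritium-based luminous devices dim noticeably over a decade or two and must eventually be replaced.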
The triple point temperature of equilibrium hydrogen is a defining fixed point on the ITS-90 temperature scale at 13.8033 K.
H2 is a product of some types of anaerobic metabolism and is produced by several microorganisms, usually via reactions catalyzed by iron- or nickel-containing enzymes called hydrogenases. These enzymes catalyze the reversible redox reaction between H2 and its component two protons and two electrons. Creation of hydrogen gas occurs in the transfer of reducing equivalents produced during pyruvate fermentation to water. The natural cycle of hydrogen production and consumption by organisms is called the hydrogen cycle. H2 occurs at parts-per-million (ppm) levels in the breath of healthy humans; it results from the metabolic activity of hydrogenase-containing microorganisms in the large intestine.
Water splitting, in which water is decomposed into its component protons, electrons, and oxygen, occurs in the light reactions in all photosynthetic organisms. Some such organisms, including the alga "Chlamydomonas reinhardtii" and cyanobacteria, have evolved a second step in the dark reactions in which protons and electrons are reduced to form H2 gas by specialized hydrogenases in the chloroplast. Efforts have been undertaken to genetically modify cyanobacterial hydrogenases to efficiently synthesize H2 gas even in the presence of oxygen. Efforts have also been undertaken with genetically modified algae in a bioreactor.
Hydrogen poses a number of hazards to human safety, from potential detonations and fires when mixed with air to being an asphyxiant in its pure, oxygen-free form. In addition, liquid hydrogen is a cryogen and presents dangers (such as frostbite) associated with very cold liquids. Hydrogen dissolves in many metals, and, in addition to leaking out, may have adverse effects on them, such as hydrogen embrittlement, leading to cracks and explosions. Hydrogen gas leaking into external air may spontaneously ignite. Moreover, hydrogen fire, while being extremely hot, is almost invisible, and thus can lead to accidental burns.
Even interpreting the hydrogen data (including safety data) is confounded by a number of phenomena. Many physical and chemical properties of hydrogen depend on the parahydrogen/orthohydrogen ratio (it often takes days or weeks at a given temperature to reach the equilibrium ratio, for which the data is usually given). Hydrogen detonation parameters, such as critical detonation pressure and temperature, strongly depend on the container geometry. | https://en.wikipedia.org/wiki?curid=13255 |
Helium
Helium is a chemical element with the symbol He and atomic number 2. It is a colorless, odorless, tasteless, non-toxic, inert, monatomic gas, the first in the noble gas group in the periodic table. Its boiling point is the lowest among all the elements. Helium is the second lightest and second most abundant element in the observable universe (hydrogen is the lightest and most abundant). It is present at about 24% of the total elemental mass, which is more than 12 times the mass of all the heavier elements combined. Its abundance is similar to this figure in both the Sun and Jupiter. This is due to the very high nuclear binding energy (per nucleon) of helium-4 with respect to the next three elements after helium. This binding energy also accounts for why helium-4 is a product of both nuclear fusion and radioactive decay. Most helium in the universe is helium-4, the vast majority of which was formed during the Big Bang. Large amounts of new helium are being created by nuclear fusion of hydrogen in stars.
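The stated abundances can be sanity-checked with simple arithmetic: if hydrogen is roughly 74% of the elemental mass (an assumed round figure) and helium about 24% as stated, everything heavier together amounts to only about 2%, consistent with the "12 times" claim:

```python
# Consistency check on the cosmic mass fractions quoted above.
# hydrogen ~74% is an assumed round figure; helium ~24% is from the text.
hydrogen_pct = 74
helium_pct = 24
heavier_pct = 100 - hydrogen_pct - helium_pct  # everything heavier than He

print(heavier_pct)                 # 2
print(helium_pct / heavier_pct)    # 12.0 — helium outweighs the rest ~12-fold
```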
Helium is named for the Greek Titan of the Sun, Helios. It was first detected as an unknown, yellow spectral line signature in sunlight, during a solar eclipse in 1868 by Georges Rayet, Captain C. T. Haig, Norman R. Pogson, and Lieutenant John Herschel, and was subsequently confirmed by French astronomer, Jules Janssen. Janssen is often jointly credited with detecting the element, along with Norman Lockyer. Janssen recorded the helium spectral line during the solar eclipse of 1868, while Lockyer observed it from Britain. Lockyer was the first to propose that the line was due to a new element, which he named. The formal discovery of the element was made in 1895 by two Swedish chemists, Per Teodor Cleve and Nils Abraham Langlet, who found helium emanating from the uranium ore, "cleveite", which is now not regarded as a separate mineral species but as a variety of uraninite. In 1903, large reserves of helium were found in natural gas fields in parts of the United States, which is by far the largest supplier of the gas today.
Liquid helium is used in cryogenics (its largest single use, absorbing about a quarter of production), particularly in the cooling of superconducting magnets, with the main commercial application being in MRI scanners. Helium's other industrial uses—as a pressurizing and purge gas, as a protective atmosphere for arc welding, and in processes such as growing crystals to make silicon wafers—account for half of the gas produced. A well-known but minor use is as a lifting gas in balloons and airships. As with any gas whose density differs from that of air, inhaling a small volume of helium temporarily changes the timbre and quality of the human voice. In scientific research, the behavior of the two fluid phases of helium-4 (helium I and helium II) is important to researchers studying quantum mechanics (in particular the property of superfluidity) and to those looking at the phenomena, such as superconductivity, produced in matter near absolute zero.
On Earth, it is relatively rare—5.2 ppm by volume in the atmosphere. Most terrestrial helium present today is created by the natural radioactive decay of heavy radioactive elements (thorium and uranium, although there are other examples), as the alpha particles emitted by such decays consist of helium-4 nuclei. This radiogenic helium is trapped with natural gas in concentrations as great as 7% by volume, from which it is extracted commercially by a low-temperature separation process called fractional distillation. Previously, terrestrial helium—a non-renewable resource because once released into the atmosphere, it readily escapes into space—was thought to be in increasingly short supply. However, recent studies suggest that helium produced deep in the earth by radioactive decay can collect in natural gas reserves in larger than expected quantities, in some cases, having been released by volcanic activity.
The first evidence of helium was observed on August 18, 1868, as a bright yellow line with a wavelength of 587.49 nanometers in the spectrum of the chromosphere of the Sun. The line was detected by French astronomer Jules Janssen during a total solar eclipse in Guntur, India, and was initially assumed to be sodium. On October 20 of the same year, English astronomer Norman Lockyer observed a yellow line in the solar spectrum, which he named the D3 line because it was near the known D1 and D2 Fraunhofer lines of sodium. He concluded that it was caused by an element in the Sun unknown on Earth. Lockyer and English chemist Edward Frankland named the element with the Greek word for the Sun, ἥλιος ("helios").
In 1881, Italian physicist Luigi Palmieri detected helium on Earth for the first time through its D3 spectral line, when he analyzed a material that had been sublimated during a recent eruption of Mount Vesuvius.
On March 26, 1895, Scottish chemist Sir William Ramsay isolated helium on Earth by treating the mineral cleveite (a variety of uraninite with at least 10% rare earth elements) with mineral acids. Ramsay was looking for argon, but after separating nitrogen and oxygen from the gas liberated by sulfuric acid, he noticed a bright yellow line that matched the D3 line observed in the spectrum of the Sun. These samples were identified as helium by Lockyer and British physicist William Crookes. | https://en.wikipedia.org/wiki?curid=13256 |
Hydrocarbon
As defined by IUPAC nomenclature of organic chemistry, the classifications for hydrocarbons are:
Hydrocarbons can be gases (e.g. methane and propane), liquids (e.g. hexane and benzene), waxes or low melting solids (e.g. paraffin wax and naphthalene) or polymers (e.g. polyethylene, polypropylene and polystyrene).
The predominant use of hydrocarbons is as a combustible fuel source. Methane is the predominant component of natural gas. The C6 through C10 alkanes, alkenes and isomeric cycloalkanes are the top components of gasoline, naphtha, jet fuel and specialized industrial solvent mixtures. With the progressive addition of carbon units, the simple non-ring structured hydrocarbons have higher viscosities, lubricating indices, boiling points, solidification temperatures, and deeper color. At the opposite extreme from methane lie the heavy tars that remain as the "lowest fraction" in a crude oil refining retort. They are collected and widely utilized as roofing compounds, pavement composition (bitumen), wood preservatives (the creosote series) and as extremely high viscosity shear-resisting liquids.
Some large-scale nonfuel applications of hydrocarbons begin with ethane and propane, which are obtained from petroleum and natural gas. These two gases are converted to ethylene and propylene, alkenes that are precursors to polymers including polyethylene, polystyrene, acrylates, and polypropylene. Another class of special hydrocarbons is BTX, a mixture of benzene, toluene, and the three xylene isomers. Global consumption of benzene was estimated at more than 40,000,000 tons in 2009.
Hydrocarbons are also prevalent in nature. Some eusocial arthropods, such as the Brazilian stingless bee, "Schwarziana quadripunctata", use unique hydrocarbon "scents" in order to determine kin from non-kin. The chemical hydrocarbon composition varies between age, sex, nest location, and hierarchal position.
The noteworthy feature of hydrocarbons is their inertness, especially for saturated members. Otherwise, three main types of reactions can be identified:
Substitution reactions only occur in saturated hydrocarbons (single carbon–carbon bonds). Such reactions require highly reactive reagents, such as chlorine and fluorine. In the case of chlorination, one of the chlorine atoms replaces a hydrogen atom. The reactions proceed via free-radical pathways.
CH4 + Cl2 → CH3Cl + HCl, continuing all the way to CCl4 (carbon tetrachloride)
C2H6 + Cl2 → C2H5Cl + HCl, continuing all the way to C2Cl6 (hexachloroethane)
Of the classes of hydrocarbons, aromatic compounds uniquely (or nearly so) undergo substitution reactions. The chemical process practiced on the largest scale is an example: the reaction of benzene and ethylene to give ethylbenzene.
Addition reactions apply to alkenes and alkynes. In this reaction a variety of reagents add "across" the pi-bond(s). Chlorine, hydrogen chloride, and hydrogen are illustrative reagents. Alkenes and some alkynes also undergo polymerization, alkene metathesis, and alkyne metathesis.
Hydrocarbons are currently the main source of the world's electrical energy and heat (such as home heating) because of the energy produced when they are combusted. Often this energy is used directly as heat, as in home heaters that use petroleum or natural gas: the hydrocarbon is burnt and the heat is used to heat water, which is then circulated. A similar principle is used to create electrical energy in power plants.
Common properties of hydrocarbons are the facts that they produce steam, carbon dioxide and heat during combustion and that oxygen is required for combustion to take place. The simplest hydrocarbon, methane, burns as follows:
CH4 + 2 O2 → CO2 + 2 H2O
In an inadequate supply of air, carbon monoxide gas and water vapour are formed:
2 CH4 + 3 O2 → 2 CO + 4 H2O
Another example is the combustion of propane:
C3H8 + 5 O2 → 3 CO2 + 4 H2O
And finally, for any linear alkane of n carbon atoms:
CnH2n+2 + (3n+1)/2 O2 → n CO2 + (n+1) H2O
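The general combustion equation for a linear alkane fixes all coefficients once n is chosen. A small sketch computing the balanced coefficients for CnH2n+2:

```python
from fractions import Fraction

# Balanced complete combustion of a linear alkane CnH(2n+2):
#   CnH(2n+2) + (3n+1)/2 O2 -> n CO2 + (n+1) H2O
def combustion_coefficients(n):
    """Return (alkane, O2, CO2, H2O) coefficients for CnH(2n+2), n >= 1."""
    o2 = Fraction(3 * n + 1, 2)   # exact fraction; integer when n is odd
    return (1, o2, n, n + 1)

# Methane (n=1): CH4 + 2 O2 -> CO2 + 2 H2O
print(combustion_coefficients(1))  # (1, Fraction(2, 1), 1, 2)
# Propane (n=3): C3H8 + 5 O2 -> 3 CO2 + 4 H2O
print(combustion_coefficients(3))  # (1, Fraction(5, 1), 3, 4)
```

Using exact fractions keeps the O2 coefficient correct for even n (e.g. ethane, n=2, needs 7/2 O2 per molecule).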
Partial oxidation characterizes the reactions of alkenes and oxygen. This process is the basis of rancidification and paint drying.
The vast majority of hydrocarbons found on Earth occur in petroleum, coal, and natural gas. Petroleum (literally "rock oil" – petrol for short) and coal result from the decomposition of organic matter. Coal, in contrast to petroleum, is richer in carbon and poorer in hydrogen. Natural gas is the product of methanogenesis.
A seemingly limitless variety of compounds make up petroleum, hence the necessity of refineries. These hydrocarbons consist of saturated hydrocarbons, aromatic hydrocarbons, or combinations of the two. Alkenes and alkynes are absent from petroleum; their production requires refinery processing. Petroleum-derived hydrocarbons are mainly consumed for fuel, but they are also the source of virtually all synthetic organic compounds, including plastics and pharmaceuticals. Natural gas, like coal, is consumed almost exclusively as fuel.
A small fraction of hydrocarbon found on earth is thought to be abiological.
Some hydrocarbons are also abundant in the solar system. Lakes of liquid methane and ethane have been found on Titan, Saturn's largest moon, as confirmed by the Cassini–Huygens mission. Hydrocarbons are also abundant in nebulae, forming polycyclic aromatic hydrocarbon (PAH) compounds.
Remediation of hydrocarbons from contaminated soil or water is a formidable challenge because of the chemical inertness that characterizes hydrocarbons (hence their survival for millions of years in the source rock). Many strategies have been devised, bioremediation being prominent; its basic problem is the paucity of enzymes that act on hydrocarbons. Nonetheless, the area has received regular attention.
Bacteria in the gabbroic layer of the ocean's crust can degrade hydrocarbons; but the extreme environment makes research difficult. Other bacteria such as "Lutibacterium anuloederans" can also degrade hydrocarbons.
Mycoremediation, the breaking down of hydrocarbons by fungal mycelium and mushrooms, is also possible.
Hydrocarbons are generally of low toxicity, hence the widespread use of gasoline and related volatile products. Aromatic compounds such as benzene are narcotic and chronic toxins and are carcinogenic. Certain rare polycyclic aromatic compounds are carcinogenic.
Hydrocarbons are highly flammable.
Burning hydrocarbons as fuel, producing carbon dioxide and water, is a major contributor to anthropogenic global warming.
Hydrocarbons are introduced into the environment through their extensive use as fuels and chemicals as well as through leaks or accidental spills during exploration, production, refining, or transport. Anthropogenic hydrocarbon contamination of soil is a serious global issue due to contaminant persistence and the negative impact on human health. | https://en.wikipedia.org/wiki?curid=13257 |
Halogen
The halogens () are a group in the periodic table consisting of five chemically related elements: fluorine (F), chlorine (Cl), bromine (Br), iodine (I), and astatine (At). The artificially created element 117, tennessine (Ts), may also be a halogen. In the modern IUPAC nomenclature, this group is known as group 17.
The name "halogen" means "salt-producing". When halogens react with metals, they produce a wide range of salts, including calcium fluoride, sodium chloride (common table salt), silver bromide and potassium iodide.
The group of halogens is the only periodic table group that contains elements in three of the main states of matter at standard temperature and pressure. All of the halogens form acids when bonded to hydrogen. Most halogens are typically produced from minerals or salts. The middle halogens—chlorine, bromine, and iodine—are often used as disinfectants. Organobromides are the most important class of flame retardants, while elemental halogens are dangerous and can be lethally toxic.
The fluorine mineral fluorspar was known as early as 1529. Early chemists realized that fluorine compounds contain an undiscovered element, but were unable to isolate it. In 1860, George Gore, an English chemist, ran a current of electricity through hydrofluoric acid and probably produced fluorine, but he was unable to prove his results at the time. In 1886, Henri Moissan, a chemist in Paris, performed electrolysis on potassium bifluoride dissolved in anhydrous hydrogen fluoride, and successfully isolated fluorine.
Hydrochloric acid was known to alchemists and early chemists. However, elemental chlorine was not produced until 1774, when Carl Wilhelm Scheele heated hydrochloric acid with manganese dioxide. Scheele called the element "dephlogisticated muriatic acid", which is how chlorine was known for 33 years. In 1807, Humphry Davy investigated chlorine and discovered that it is an actual element. Chlorine gas was used as a poison gas during World War I: it displaced the common oxygenated air in contaminated areas, and it burned human tissue externally and internally, especially the lungs, making breathing difficult or impossible depending on the level of exposure.
Bromine was discovered in the 1820s by Antoine Jérôme Balard. Balard discovered bromine by passing chlorine gas through a sample of brine. He originally proposed the name "muride" for the new element, but the French Academy changed the element's name to bromine.
Iodine was discovered by Bernard Courtois, who was using seaweed ash as part of a process for saltpeter manufacture. Courtois typically boiled the seaweed ash with water to generate potassium chloride. However, in 1811, Courtois added sulfuric acid to his process and found that his process produced purple fumes that condensed into black crystals. Suspecting that these crystals were a new element, Courtois sent samples to other chemists for investigation. Iodine was proven to be a new element by Joseph Gay-Lussac.
In 1931, Fred Allison claimed to have discovered element 85 with a magneto-optical machine, and named the element Alabamine, but was mistaken. In 1937, Rajendralal De claimed to have discovered element 85 in minerals, and called the element dakine, but he was also mistaken. An attempt at discovering element 85 in 1939 by Horia Hulubei and Yvette Cauchois via spectroscopy was also unsuccessful, as was an attempt in the same year by Walter Minder, who claimed to have discovered an iodine-like element resulting from beta decay of polonium. Element 85, now named astatine, was produced successfully in 1940 by Dale R. Corson, K. R. Mackenzie, and Emilio G. Segrè, who bombarded bismuth with alpha particles.
In 2010, a team led by nuclear physicist Yuri Oganessian involving scientists from the JINR, Oak Ridge National Laboratory, Lawrence Livermore National Laboratory, and Vanderbilt University successfully bombarded berkelium-249 atoms with calcium-48 atoms to make tennessine-294. As of 2019, it is the most recent element to be discovered.
In 1811, the German chemist Johann Schweigger proposed that the name "halogen" – meaning "salt producer", from αλς [als] "salt" and γενειν [genein] "to beget" – replace the name "chlorine", which had been proposed by the English chemist Humphry Davy. Davy's name for the element prevailed. However, in 1826, the Swedish chemist Baron Jöns Jacob Berzelius proposed the term "halogen" for the elements fluorine, chlorine, and iodine, which produce a sea-salt-like substance when they form a compound with an alkaline metal.
The names of the elements all have the ending -ine. Fluorine's name comes from the Latin word "fluere", meaning "to flow", because it was derived from the mineral fluorspar, which was used as a flux in metalworking. Chlorine's name comes from the Greek word "chloros", meaning "greenish-yellow". Bromine's name comes from the Greek word "bromos", meaning "stench". Iodine's name comes from the Greek word "iodes", meaning "violet". Astatine's name comes from the Greek word "astatos", meaning "unstable". Tennessine is named after the US state of Tennessee.
The halogens show trends in chemical bond energy moving from top to bottom of the periodic table column, with fluorine deviating slightly: fluorine has the highest bond energy in compounds with other atoms, but very weak bonds within the diatomic F2 molecule. Further down group 17, the reactivity of the elements decreases because of the increasing size of the atoms.
Halogens are highly reactive, and as such can be harmful or lethal to biological organisms in sufficient quantities. This high reactivity is due to the high electronegativity of the atoms due to their high effective nuclear charge. Because the halogens have seven valence electrons in their outermost energy level, they can gain an electron by reacting with atoms of other elements to satisfy the octet rule. Fluorine is one of the most reactive elements: it is the only element more electronegative than oxygen, it attacks otherwise-inert materials such as glass, and it forms compounds with the usually inert noble gases. It is a corrosive and highly toxic gas. The reactivity of fluorine is such that, if used or stored in laboratory glassware, it can react with glass in the presence of small amounts of water to form silicon tetrafluoride (SiF4). Thus, fluorine must be handled with substances such as Teflon (which is itself an organofluorine compound), extremely dry glass, or metals such as copper or steel, which form a protective layer of fluoride on their surface.
The high reactivity of fluorine paradoxically allows some of the strongest bonds possible, especially to carbon. For example, Teflon is fluorine bonded with carbon; it is extremely resistant to thermal and chemical attack and has a high melting point.
The halogens form homonuclear diatomic molecules (not proven for astatine).
Due to relatively weak intermolecular forces, chlorine and fluorine form part of the group known as "elemental gases".
The elements become less reactive and have higher melting points as the atomic number increases. The higher melting points are caused by stronger London dispersion forces resulting from more electrons.
All of the halogens have been observed to react with hydrogen to form hydrogen halides. For fluorine, chlorine, and bromine, this reaction is in the form of:
H2 + X2 → 2 HX
However, hydrogen iodide and hydrogen astatide can split back into their constituent elements.
The hydrogen-halogen reactions get gradually less reactive toward the heavier halogens. A fluorine-hydrogen reaction is explosive even when it is dark and cold. A chlorine-hydrogen reaction is also explosive, but only in the presence of light and heat. A bromine-hydrogen reaction is even less explosive; it is explosive only when exposed to flames. Iodine and astatine only partially react with hydrogen, forming equilibria.
All halogens form binary compounds with hydrogen known as the hydrogen halides: hydrogen fluoride (HF), hydrogen chloride (HCl), hydrogen bromide (HBr), hydrogen iodide (HI), and hydrogen astatide (HAt). All of these compounds form acids when mixed with water. Hydrogen fluoride is the only hydrogen halide that forms hydrogen bonds. Hydrochloric acid, hydrobromic acid, hydroiodic acid, and hydroastatic acid are all strong acids, but hydrofluoric acid is a weak acid.
All of the hydrogen halides are irritants. Hydrogen fluoride and hydrogen chloride are highly acidic. Hydrogen fluoride is used as an industrial chemical, and is highly toxic, causing pulmonary edema and damaging cells. Hydrogen chloride is also a dangerous chemical. Breathing in gas with more than fifty parts per million of hydrogen chloride can cause death in humans. Hydrogen bromide is even more toxic and irritating than hydrogen chloride. Breathing in gas with more than thirty parts per million of hydrogen bromide can be lethal to humans. Hydrogen iodide, like other hydrogen halides, is toxic.
All the halogens are known to react with sodium to form sodium fluoride, sodium chloride, sodium bromide, sodium iodide, and sodium astatide. Heated sodium's reaction with halogens produces bright-orange flames. Sodium's reaction with chlorine is in the form of:
2 Na + Cl2 → 2 NaCl
Iron reacts with fluorine, chlorine, and bromine to form iron(III) halides. These reactions are in the form of:
2 Fe + 3 X2 → 2 FeX3
However, when iron reacts with iodine, it forms only iron(II) iodide.
Iron wool can react rapidly with fluorine to form the white compound iron(III) fluoride even at cold temperatures. When chlorine comes into contact with heated iron, they react to form black iron(III) chloride; if the reaction conditions are moist, the reaction instead yields a reddish-brown product. Iron can also react with bromine to form iron(III) bromide, a reddish-brown compound under dry conditions; iron's reaction with bromine is less vigorous than its reactions with fluorine or chlorine. Hot iron can also react with iodine, but it forms iron(II) iodide. This compound may be gray, but the product is always contaminated with excess iodine, so this is not known for certain. Iron's reaction with iodine is less vigorous than its reaction with the lighter halogens.
Interhalogen compounds are of the form XYn, where X and Y are halogens and n is one, three, five, or seven. Interhalogen compounds contain at most two different halogens. Large interhalogens, such as ClF3, can be produced by a reaction of a pure halogen with a smaller interhalogen such as ClF. All interhalogens except IF7 can be produced by directly combining pure halogens in various conditions.
Interhalogens are typically more reactive than all diatomic halogen molecules except F2 because interhalogen bonds are weaker. However, the chemical properties of interhalogens are still roughly the same as those of diatomic halogens. Many interhalogens consist of one or more atoms of fluorine bonding to a heavier halogen. Chlorine can bond with up to 3 fluorine atoms, bromine can bond with up to five fluorine atoms, and iodine can bond with up to seven fluorine atoms. Most interhalogen compounds are covalent gases. However, some interhalogens are liquids, such as BrF3, and many iodine-containing interhalogens are solids.
Many synthetic organic compounds such as plastic polymers, and a few natural ones, contain halogen atoms; these are known as "halogenated" compounds or organic halides. Chlorine is by far the most abundant of the halogens in seawater, and the only one needed in relatively large amounts (as chloride ions) by humans. For example, chloride ions play a key role in brain function by mediating the action of the inhibitory transmitter GABA and are also used by the body to produce stomach acid. Iodine is needed in trace amounts for the production of thyroid hormones such as thyroxine. Organohalogens are also synthesized through the nucleophilic abstraction reaction.
Polyhalogenated compounds are industrially created compounds substituted with multiple halogens. Many of them are very toxic and bioaccumulate in humans, and have a very wide application range. They include PCBs, PBDEs, and perfluorinated compounds (PFCs), as well as numerous other compounds.
Fluorine reacts vigorously with water to produce oxygen (O2) and hydrogen fluoride (HF):
2 F2 + 2 H2O → 4 HF + O2
Chlorine has maximum solubility of ca. 7.1 g Cl2 per kg of water at ambient temperature (21 °C). Dissolved chlorine reacts to form hydrochloric acid (HCl) and hypochlorous acid, a solution that can be used as a disinfectant or bleach:
Cl2 + H2O ⇌ HCl + HClO
Bromine has a solubility of 3.41 g per 100 g of water, but it slowly reacts to form hydrogen bromide (HBr) and hypobromous acid (HBrO):
Br2 + H2O ⇌ HBr + HBrO
Iodine, however, is minimally soluble in water (0.03 g/100 g water at 20 °C) and does not react with it. However, iodine will form an aqueous solution in the presence of iodide ion, such as by addition of potassium iodide (KI), because the triiodide ion is formed.
The table below is a summary of the key physical and atomic properties of the halogens. Data marked with question marks are either uncertain or are estimations partially based on periodic trends rather than observations.
Fluorine has one stable and naturally occurring isotope, fluorine-19. However, there are trace amounts in nature of the radioactive isotope fluorine-23, which occurs via cluster decay of protactinium-231. A total of eighteen isotopes of fluorine have been discovered, with atomic masses ranging from 14 to 31. Chlorine has two stable and naturally occurring isotopes, chlorine-35 and chlorine-37. However, there are trace amounts in nature of the isotope chlorine-36, which occurs via spallation of argon-36. A total of 24 isotopes of chlorine have been discovered, with atomic masses ranging from 28 to 51.
There are two stable and naturally occurring isotopes of bromine, bromine-79 and bromine-81. A total of 33 isotopes of bromine have been discovered, with atomic masses ranging from 66 to 98. There is one stable and naturally occurring isotope of iodine, iodine-127. However, there are trace amounts in nature of the radioactive isotope iodine-129, which occurs via spallation and from the radioactive decay of uranium in ores. Several other radioactive isotopes of iodine have also been created naturally via the decay of uranium. A total of 38 isotopes of iodine have been discovered, with atomic masses ranging from 108 to 145.
There are no stable isotopes of astatine. However, there are four naturally occurring radioactive isotopes of astatine produced via radioactive decay of uranium, neptunium, and plutonium. These isotopes are astatine-215, astatine-217, astatine-218, and astatine-219. A total of 31 isotopes of astatine have been discovered, with atomic masses ranging from 191 to 227.
Tennessine has only two known synthetic radioisotopes, tennessine-293 and tennessine-294.
Approximately six million metric tons of the fluorine mineral fluorite are produced each year. Four hundred thousand metric tons of hydrofluoric acid are made each year. Fluorine gas is made from hydrofluoric acid produced as a by-product in phosphoric acid manufacture. Approximately 15,000 metric tons of fluorine gas are made per year.
The mineral halite is the mineral that is most commonly mined for chlorine, but the minerals carnallite and sylvite are also mined for chlorine. Forty million metric tons of chlorine are produced each year by the electrolysis of brine.
Approximately 450,000 metric tons of bromine are produced each year. Half of the world's bromine is produced in the United States, 35% in Israel, and most of the remainder in China. Historically, bromine was produced by adding sulfuric acid and bleaching powder to natural brine. In modern times, however, bromine is produced by electrolysis, a method invented by Herbert Dow. It is also possible to produce bromine by passing chlorine through seawater and then passing air through the seawater.
In 2003, 22,000 metric tons of iodine were produced. Chile produces 40% of the world's iodine, Japan produces 30%, and smaller amounts are produced in Russia and the United States. Until the 1950s, iodine was extracted from kelp. In modern times, however, iodine is produced in other ways. One way that iodine is produced is by mixing sulfur dioxide with nitrate ores, which contain some iodates. Iodine is also extracted from natural gas fields.
Even though astatine is naturally occurring, it is usually produced by bombarding bismuth with alpha particles.
Tennessine is made by fusing berkelium-249 and calcium-48.
Both chlorine and bromine are used as disinfectants for drinking water, swimming pools, fresh wounds, spas, dishes, and surfaces. They kill bacteria and other potentially harmful microorganisms through a process known as sterilization. Their reactivity is also put to use in bleaching. Sodium hypochlorite, which is produced from chlorine, is the active ingredient of most fabric bleaches, and chlorine-derived bleaches are used in the production of some paper products. Chlorine also reacts with sodium to create sodium chloride, which is table salt.
Halogen lamps are a type of incandescent lamp using a tungsten filament in bulbs that have small amounts of a halogen, such as iodine or bromine, added. This enables the production of lamps that are much smaller than non-halogen incandescent lightbulbs at the same wattage. The gas reduces the thinning of the filament and blackening of the inside of the bulb, resulting in a bulb with a much longer life. Halogen lamps glow at a higher temperature (2800 to 3400 kelvins) with a whiter color than other incandescent bulbs. However, this requires bulbs to be manufactured from fused quartz rather than silica glass to reduce breakage.
In drug discovery, the incorporation of halogen atoms into a lead drug candidate results in analogues that are usually more lipophilic and less water-soluble. As a consequence, halogen atoms are used to improve penetration through lipid membranes and tissues. It follows that there is a tendency for some halogenated drugs to accumulate in adipose tissue.
The chemical reactivity of halogen atoms depends on both their point of attachment to the lead and the nature of the halogen. Aromatic halogen groups are far less reactive than aliphatic halogen groups, which can exhibit considerable chemical reactivity. For aliphatic carbon-halogen bonds, the C-F bond is the strongest and usually less chemically reactive than aliphatic C-H bonds. The other aliphatic-halogen bonds are weaker, their reactivity increasing down the periodic table. They are usually more chemically reactive than aliphatic C-H bonds. As a consequence, the most common halogen substitutions are the less reactive aromatic fluorine and chlorine groups.
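The bond-strength ordering described above can be made concrete with approximate average bond dissociation energies. The numbers below are textbook ballpark values (kJ/mol), included only to illustrate the trend, not exact measured constants:

```python
# Approximate average bond dissociation energies in kJ/mol (illustrative
# textbook values, not precise measurements for any specific compound).
BOND_ENERGY = {"C-F": 485, "C-H": 413, "C-Cl": 328, "C-Br": 285, "C-I": 213}

# The aliphatic C-F bond is stronger than C-H, so it is usually less
# reactive; C-Cl, C-Br and C-I are progressively weaker, so reactivity
# increases down the periodic table, matching the trend in the text.
assert BOND_ENERGY["C-F"] > BOND_ENERGY["C-H"]
assert BOND_ENERGY["C-H"] > BOND_ENERGY["C-Cl"] > BOND_ENERGY["C-Br"] > BOND_ENERGY["C-I"]
```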
Fluoride anions are found in ivory, bones, teeth, blood, eggs, urine, and hair of organisms. Fluoride anions in very small amounts may be essential for humans. There are 0.5 milligrams of fluorine per liter of human blood. Human bones contain 0.2 to 1.2% fluorine. Human tissue contains approximately 50 parts per billion of fluorine. A typical 70-kilogram human contains 3 to 6 grams of fluorine.
Chloride anions are essential to a large number of species, humans included. The concentration of chlorine in the dry weight of cereals is 10 to 20 parts per million, while in potatoes the concentration of chloride is 0.5%. Plant growth is adversely affected by chloride levels in the soil falling below 2 parts per million. Human blood contains an average of 0.3% chlorine. Human bone typically contains 900 parts per million of chlorine. Human tissue contains approximately 0.2 to 0.5% chlorine. There is a total of 95 grams of chlorine in a typical 70-kilogram human.
Some bromine in the form of the bromide anion is present in all organisms. A biological role for bromine in humans has not been proven, but some organisms contain organobromine compounds. Humans typically consume 1 to 20 milligrams of bromine per day. There are typically 5 parts per million of bromine in human blood, 7 parts per million of bromine in human bones, and 7 parts per million of bromine in human tissue. A typical 70-kilogram human contains 260 milligrams of bromine.
Humans typically consume less than 100 micrograms of iodine per day. Iodine deficiency can cause intellectual disability. Organoiodine compounds occur in humans in some of the glands, especially the thyroid gland, as well as the stomach, epidermis, and immune system. Foods containing iodine include cod, oysters, shrimp, herring, lobsters, sunflower seeds, seaweed, and mushrooms. However, iodine is not known to have a biological role in plants. There are typically 0.06 milligrams per liter of iodine in human blood, 300 parts per billion of iodine in human bones, and 50 to 700 parts per billion of iodine in human tissue. There are 10 to 20 milligrams of iodine in a typical 70-kilogram human.
Astatine, although very scarce, has been found in microgram quantities in the Earth's crust.
Tennessine is purely man-made and has no other roles in nature.
Toxicity among the halogens tends to decrease toward the heavier members of the group.
Fluorine gas is extremely toxic; breathing in fluorine at a concentration of 25 parts per million is potentially lethal. Hydrofluoric acid is also toxic, being able to penetrate skin and cause highly painful burns. In addition, fluoride anions are toxic, but not as toxic as pure fluorine. Fluoride can be lethal in amounts of 5 to 10 grams. Prolonged consumption of fluoride above concentrations of 1.5 mg/L is associated with a risk of dental fluorosis, an aesthetic condition of the teeth. At concentrations above 4 mg/L, there is an increased risk of developing skeletal fluorosis, a condition in which bone fractures become more common due to the hardening of bones. Current recommended levels in water fluoridation, a way to prevent dental caries, range from 0.7 to 1.2 mg/L to avoid the detrimental effects of fluoride while at the same time reaping the benefits. People with levels between normal levels and those required for skeletal fluorosis tend to have symptoms similar to arthritis.
Chlorine gas is highly toxic. Breathing in chlorine at a concentration of 3 parts per million can rapidly cause a toxic reaction. Breathing in chlorine at a concentration of 50 parts per million is highly dangerous. Breathing in chlorine at a concentration of 500 parts per million for a few minutes is lethal. Breathing in chlorine gas is highly painful.
Pure bromine is somewhat toxic but less toxic than fluorine and chlorine. One hundred milligrams of bromine is lethal. Bromide anions are also toxic, but less so than bromine. Bromide has a lethal dose of 30 grams.
Iodine is somewhat toxic, being able to irritate the lungs and eyes, with a safety limit of 1 milligram per cubic meter. When taken orally, 3 grams of iodine can be lethal. Iodide anions are mostly nontoxic, but these can also be deadly if ingested in large amounts.
Astatine is very radioactive and thus highly dangerous, but it has not been produced in macroscopic quantities and hence it is most unlikely that its toxicity will be of much relevance to the average individual.
Tennessine cannot be chemically investigated due to how short its half-life is, although its radioactivity would make it very dangerous.
Certain aluminium clusters have superatom properties. These clusters are generated as anions (Aln− with "n" = 1, 2, 3, ...) in helium gas and reacted with a gas containing iodine. When analyzed by mass spectrometry, one main reaction product turns out to be Al13I−. These clusters of 13 aluminium atoms with an extra electron added do not appear to react with oxygen when it is introduced in the same gas stream. Assuming each atom liberates its 3 valence electrons, this means 40 electrons are present, which is one of the magic numbers for sodium clusters, and implies that these magic numbers are a reflection of the closed electron shells of the noble gases.
Calculations show that the additional electron is located in the aluminium cluster directly opposite the iodine atom. The cluster must therefore have a higher electron affinity than iodine, and the aluminium cluster is accordingly called a superhalogen (i.e., the vertical electron detachment energies of the moieties that make up the negative ions are larger than those of any halogen atom). The Al13 cluster component in the Al13I− ion is similar to an iodide ion or a bromide ion. The related Al13I2− cluster is expected to behave chemically like the triiodide ion. | https://en.wikipedia.org/wiki?curid=13258
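The electron-count bookkeeping behind the superatom argument is simple arithmetic: each aluminium atom contributes three valence electrons, the anionic charge adds one more, and closed-shell ("magic") counts mark clusters expected to be inert. The helper function and magic-number set below are illustrative, not from any particular library:

```python
# Standard jellium-model shell-closing ("magic") electron counts.
JELLIUM_MAGIC = {2, 8, 18, 20, 34, 40, 58, 92}

def valence_electrons(n_al: int, charge: int = 0) -> int:
    """Valence electron count of an Al_n cluster with the given net charge."""
    return 3 * n_al - charge  # charge = -1 (anion) adds one electron

# Al13 with an extra electron: 13 * 3 + 1 = 40 electrons, a magic number,
# consistent with the observed inertness of these clusters toward oxygen.
assert valence_electrons(13, charge=-1) == 40
assert valence_electrons(13, charge=-1) in JELLIUM_MAGIC

# Neutral Al13 has 39 electrons, one short of a shell closing, which is why
# it has a high electron affinity and behaves like a halogen atom.
assert valence_electrons(13) == 39
```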
Hee Haw
Hee Haw is an American television variety show featuring country music and humor with the fictional rural "Kornfield Kounty" as the backdrop. It aired first-run on CBS from 1969 to 1971, in syndication from 1971 to 1993, and on TNN from 1996 to 1997. Reruns of the series ran on RFD-TV (from September 2008 through April 2020) and currently air on Circle (beginning January 2020).
The show was inspired by "Rowan & Martin's Laugh-In", with the major differences being that "Hee Haw" was centered on country music and rural culture rather than pop culture, and was far less topical. Hosted by country music artists Buck Owens and Roy Clark for most of its run, the show was as well known for its corn pone humor as for its voluptuous, scantily clad women (called the Hee Haw Honeys) in stereotypical farmer's daughter outfits.
"Hee Haw"'s appeal, however, was not limited to a rural audience. It was successful in all of the major markets, including network-based Los Angeles and New York City, as well as other large cities like Boston and Chicago. Other niche programs such as "The Lawrence Welk Show" (which targeted older audiences) and "Soul Train" (which targeted black audiences) also rose to prominence in syndication during the era. Like "Laugh-In", the show minimized production costs by taping all of the recurring sketches for a season in batches, setting up for the Cornfield one day, the Joke Fence on another day, etc. At the height of its popularity, an entire season's worth of shows was taped in two separate week-long sessions, and individual shows were then assembled from edited sections. Only musical performances were taped with a live audience, while a laugh track was added to all other segments.
The series was taped for the CBS Television Network at its station affiliate WLAC-TV (now WTVF) in downtown Nashville, Tennessee, and later at Opryland USA in the city's Donelson area. The show was produced by Yongestreet Productions through the mid-1980s; it was later produced by Gaylord Entertainment, which distributed the show in syndication. The show's name, derived from a common English onomatopoeia used to describe a donkey's braying, was coined by show business talent manager and producer Bernie Brillstein.
After 25 seasons, the series initially ended its run in June 1993, where it was soon picked up by TNN for reruns. TNN eventually ordered an additional season of first-run episodes, beginning November 23, 1996. The show ultimately ended on December 27, 1997.
"Hee Haw's" creators, Frank Peppiatt and John Aylesworth, were both Canadian-born writers who had extensive experience in writing for variety shows. Inspired by the enormous prior success of rural sitcoms of the 1960s, especially on CBS, which included the sympathetic small-town "The Andy Griffith Show", followed by the country-parodying "The Beverly Hillbillies" and its spin-offs "Petticoat Junction" and "Green Acres", Peppiatt and Aylesworth sought to create a variety show catering to the same audience. This was despite neither one having a firm grasp on rural comedy.
The producers selected a pair of hosts who represented each side in a divide in country/western music at the time: Buck Owens was a prominent architect of the California-based Bakersfield sound and one of the biggest country hitmakers of the 1960s. Roy Clark, who had worked in Washington, D.C. and Las Vegas, was a stalwart of Nashville's Music Row known for his skill at mixing music and comedy onstage. Both Clark and Owens had been regular guests on "The Jimmy Dean Show" during Peppiatt and Aylesworth's time writing for that series. Peppiatt and Aylesworth brought on two fellow Canadian writers with more experience in rural humor, Gordie Tapp and Don Harron; Harron would appear in the recurring role of "Charlie Farquharson", the rural anchorman for station KORN. The producers also scored a country comedy expert familiar to rural audiences in Archie Campbell, who co-starred and wrote many of the jokes and sketches, along with Tapp, George Yanok and comedian Jack Burns (a former cast member of the rural comedy "The Andy Griffith Show") in the first season.
"Hee Haw" premiered on CBS in 1969 as a summer series. The network picked it up as a last-minute replacement for "The Smothers Brothers Comedy Hour", a popular but controversial variety show that had been canceled amid feuds between the Smothers Brothers and the network censors over the show's topical humor. Though the show had solid ratings overall (it sat at #16 for the 1970-71 season), it was dropped in July 1971 by CBS as part of the so-called "Rural Purge" that abruptly cancelled all of the network's country-themed shows, including those with still-respectable ratings. The success of shows like "Hee Haw" was the source of a heated dispute in CBS's corporate offices: Vice President of network programming Michael Dann, although he personally disliked the shows, argued in favor of ratings (reflecting audience size), while his subordinate, Fred Silverman, head of daytime programming, held that certain demographics within total television viewership — in which "Hee Haw" and the others performed poorly — could draw more advertising dollars. Silverman's view won out, Dann was fired, Silverman promoted, and CBS cancelled its rural shows in the summer of 1971.
Undaunted, however, "Hee Haw's" producers put together a syndication deal for the show, which continued in roughly the same format for the rest of its run. Peppiatt and Aylesworth's company, Yongestreet Productions (named for Yonge Street, a prominent thoroughfare in their home city of Toronto), maintained ownership of the series.
During the show's peak in popularity, "Hee Haw" often competed in syndication against "The Lawrence Welk Show", a long-running ABC program which had likewise been cancelled in 1971, in its case in a purge of the networks' older demographic-leaning programs. Like "Hee Haw", "Lawrence Welk" was picked up for syndication in the fall of 1971, in some markets by the same stations. The success of the two shows in syndication, and the network decisions that led to their respective cancellations, were the inspiration for a novelty song, "The Lawrence Welk-Hee Haw Counter-Revolution Polka", performed by Clark; it rose to become a top 10 hit on the "Billboard" Hot Country Singles chart in the fall of 1972.
"Welk" and "Hee Haw" also competed against another music-oriented niche program that moved to syndication in 1971, "Soul Train". Originally a local program based in Chicago, the black-oriented program also went on to a very long run in syndication; unlike either program, "Soul Train" entered the market after achieving success at the local scale.
In 1981, Yongestreet was acquired by Gaylord Entertainment (best known for the "Grand Ole Opry" and its related businesses). Mirroring the long downward trend in the popularity of variety shows in general that had taken place in the 1970s, ratings began to decline for "Hee Haw" around 1986. That year, Owens departed as host, leaving Clark to continue with a celebrity guest host each week. The ratings decline continued into the early 1990s. In the fall of 1991, in an attempt to win back viewers, attract a younger audience, and keep pace with sweeping changes in the country music industry of the era, the show's format and setting underwent a dramatic overhaul. The changes included a new title ("The Hee Haw Show"), more pop-oriented country music, and the barnyard-cornfield setting replaced by a city street and shopping mall set. The first of the new episodes aired in January 1992. The changes alienated many of the show's longtime viewers while failing to gain the hoped-for younger viewers, and the ratings continued their decline.
During the summer of 1992, a decision was made to end first-run production, and instead air highlights of the show's earlier years in a revamped program called "Hee Haw Silver" (as part of celebrating the show's 25th season). Under the new format, Clark hosted a mixture of classic clips and new footage.
"Hee Haw Silver" episodes also aired a series of retrospective looks at performers who had died since performing in highlighted content, such as David "Stringbean" Akeman, Archie Campbell, Junior Samples, and Kenny Price. According to the show's producer, Sam Lovullo, the ratings showed improvement with these classic reruns; however, the series was finally cancelled in June 1993 at the conclusion of its 25th season. "Hee Haw" continued to pop up in reruns throughout the 1990s and later during the following decade in a series of successful DVD releases from Time Life.
After the show's syndication run ended, reruns aired on The Nashville Network from 1993 until 1996. Upon the cancellation of reruns in 1996, the program resurfaced for another first-run season, ultimately concluding the series in 1997. Its 22 years in TV syndication (1971–93) was, during its latter years, tied with "Soul Train" for the record for the longest-running U.S. syndicated TV program ("Soul Train" continued until 2006); "Hee Haw", as of 2019, ranks as the sixth longest-running syndicated American TV program and the longest-running of its genre (the current record holder is "Entertainment Tonight", which has been on the air for years; aside from that and "Soul Train", "Wheel of Fortune", "Jeopardy!" and "Inside Edition" rank ahead of it, with "Judge Judy" surpassing "Hee Haw" in September 2019).
During the 2006–07 season CMT aired a series of reruns and TV Land also recognized the series with an award presented by k.d. lang; in attendance were Roy Clark, Gunilla Hutton, Barbi Benton, the Hager twins, Linda Thompson, Misty Rowe, and others. It was during this point, roughly between the years of 2004 and 2007, that Time Life began selling selected episodes of the show on DVD. Among the DVD content offered was the 1978 10th anniversary special that had not been seen since its original airing. CMT sporadically aired the series, usually in graveyard slots, and primarily held the rights in order to be able to air the musical performances as part of their music video library (such as during the "Pure Vintage" block on CMT Pure Country).
Reruns of "Hee Haw" began airing on RFD-TV in September 2008, where the show ran for 12 years, anchoring the network's Sunday night lineup; beginning in January 2014, an episode aired on Saturday afternoon and the same episode was rerun the following Sunday night. Those episodes were cut down to comply with the 44-minute minimum. In 2011, the network began re-airing the earliest episodes from 1969–70 on Thursday evenings. That summer, many of the surviving cast members, along with a number of country artists who were guest stars on the show, taped a "Country's Family Reunion" special, entitled "Salute to the Kornfield", which aired on RFD-TV in January 2012. The special is also part of "Country's Family Reunion" 's DVD series. Concurrent with the special was the unveiling of a "Hee Haw" exhibit, titled "Pickin' and Grinnin' ", at the Oklahoma History Center in Oklahoma City.
"Hee Haw" left RFD-TV in 2020 and began airing on the Grand Ole Opry-operated Circle network.
As part of the promotions for its DVD products, Time-Life also compiles and syndicates a half-hour clip show series "The Hee Haw Collection".
Two rural-style comedians, already well known in their native Canada, gained their first major U.S. exposure: Gordie Tapp and Don Harron (whose KORN Radio character, newscaster Charlie Farquharson, had been a fixture of Canadian television since 1952 and later appeared on "The Red Green Show").
Other cast members over the years included:
Roy Acuff,
Cathy Baker (as the show's emcee),
Billy Jim Baker,
Barbi Benton,
Kelly Billingsley,
Vicki Bird,
Jennifer Bishop,
Archie Campbell,
Phil Campbell,
Harry Cole (Weeping Willie),
Mackenzie Colt,
John Henry Faulk,
Tennessee Ernie Ford,
Diana Goodman,
Marianne Gordon (Rogers),
Jim and Jon Hager,
Victoria Hallman,
Little Jimmy Henley,
Gunilla Hutton,
Linda Johnson,
Grandpa Jones,
Zella Lehr (the "unicycle girl"),
George Lindsey (reprising his "Goober" character from "The Andy Griffith Show"),
Little Jimmy Dickens,
Irlene Mandrell,
Charlie McCoy,
Dawn McKinley,
Patricia McKinnon,
Sherry Miles,
Rev. Grady Nutt,
Minnie Pearl,
Claude "Jackie" Phelps,
Slim Pickens,
Kenny Price,
Anne Randall,
Chase Randolph,
Susan Raye,
Jimmie Riddle,
Jeannine Riley,
Alice Ripley,
Lulu Roman,
Misty Rowe,
Junior Samples,
Ray Sanders,
Terry Sanders,
Gailard Sartain,
Diana Scott,
Shotgun Red,
Gerald Smith (the "Georgia Quacker"),
Jeff Smith,
Donna Stokes,
Dennis Stone,
Roni Stoneman,
Mary Taylor,
Nancy Taylor,
Linda Thompson,
Lisa Todd,
Pedro Tomas,
Nancy Traylor,
Buck Trent,
Jackie Waddell,
Pat Woodell, and
Jonathan Winters, among many others.
The Buckaroos (Buck Owens' band) initially served as the house band on the show and consisted of members Don Rich, Jim Shaw, Jerry Brightman, Jerry Wiggins, Rick Taylor, Doyle Singer (Doyle Curtsinger), Don Lee, Ronnie Jackson, Terry Christoffersen, Doyle Holly and, in later seasons, fiddle player Jana Jae and Victoria Hallman, who replaced Don Rich on harmony vocals (Rich was killed in a motorcycle accident in 1974). In later seasons, the show hired Nashville musicians to serve as the show's "house band." George Richey was the first music director. When he left to marry Tammy Wynette, harmonica player Charlie McCoy, already a member of the band when he was not playing on recording sessions, became the show's music director, forming the "Hee Haw Band", which became the house band for the remainder of the series' run. The Nashville Edition, a four-member (two male, two female) singing group, served as the background singers for most of the musical performances, along with performing songs on their own.
Some of the cast members made national headlines: Lulu Roman was twice charged with drug possession in 1971; David "Stringbean" Akeman and his wife were murdered in November 1973 during a robbery at their home; Slim Pickens, less than two years after joining the series, was diagnosed with a fatal brain tumor, and, as mentioned above, Don Rich of the Buckaroos was killed in a motorcycle crash in 1974.
Some cast members, such as Charlie McCoy and Tennessee Ernie Ford, originally appeared on the show as guest stars, while Barbi Benton was the only former cast member known to return in later seasons solely as a guest star.
After Buck Owens left the show, a different country music artist would accompany Roy Clark as a guest co-host each week, who would give the episode's opening performance, participate with Clark in the "Pickin' and Grinnin'" sketch, and assist Clark in introducing the other guest stars' performances. The show's final season ("Hee Haw Silver") was hosted by Clark alone.
Some of the most popular sketches and segments on "Hee Haw" included, but were not limited to:
Guest stars often participated in some of the sketches (mostly the "PFFT! You Was Gone" and "The Cornfield" sketches); however, this did not occur until later seasons.
"Hee Haw" featured a premiere showcase on commercial television throughout its run for country, bluegrass, gospel, and other styles of American traditional music, featuring hundreds of elite musical performances that were paramount to the success, popularity and legacy of the series for a broad audience of Southern, rural and purely music fans alike. Although country music was the primary genre of music featured on the show, guest stars and cast members alike also performed music from other genres, such as rock 'n' roll oldies, big band, and pop standards.
Some of the music-based segments on the show (other than guest stars' performances) included:
Lovullo also has made the claim the show presented "what were, in reality, the first musical videos." Lovullo said his videos were conceptualized by having the show's staff go to nearby rural areas and film animals and farmers, before editing the footage to fit the storyline of a particular song. "The video material was a very workable production item for the show," he wrote. "It provided picture stories for songs. However, some of our guests felt the videos took attention away from their live performances, which they hoped would promote record sales. If they had a hit song, they didn't want to play it under comic barnyard footage." The concept's mixed reaction eventually spelled an end to the "video" concept on "Hee Haw". However, several of co-host Owens' songs – including "Tall, Dark Stranger," "Big in Vegas", and "I Wouldn't Live in New York City (If They Gave Me the Whole Dang Town)" – aired on the series and have since aired on Great American Country and CMT as part of their classic country music programming blocks.
"Hee Haw" featured at least two, and sometimes three or four, guest celebrities each week. While most of the guest stars were country music artists, a wide range of other famous luminaries were featured from actors and actresses to sports stars to politicians.
Sheb Wooley, one of the original cast members, wrote the show's theme song. After filming the initial 13 episodes, other professional demands caused him to leave the show, but he returned from time to time as a guest star.
Loretta Lynn was the first guest star of "Hee Haw" and made more guest appearances (24) than any other artist. She also co-hosted the show more than any other guest co-host and therefore appears on more of the DVD releases for retail sale than any other guest star. Tammy Wynette was second with 21 guest appearances, and Wynette married George Richey (the musical director for "Hee Haw" from 1970 to 1977) in 1978.
From 1990–92, country megastar Garth Brooks appeared on the show four times. In 1992, producer Sam Lovullo tried unsuccessfully to contact Brooks because he wanted him for the final show. Brooks then surprised Lovullo by showing up at the last minute, ready to don his overalls and perform for the final episode.
A barn interior set was used as the main stage for most of the musical performances from the show's premiere until the debut of the "Hee Haw Honky Tonk" sketch in the early 1980s. Afterwards, the "Hee Haw Honky Tonk" set would serve as the main stage for the remainder of the series' run. Buck Owens then began using the barn interior set for his performances after it was replaced by the "Hee Haw Honky Tonk" set and was named "Buck's Place" (as a nod to one of Owens' hits, "Sam's Place"). Other settings for the musical performances throughout the series' run included a haystack (where the entire cast performed songs), the living room of a Victorian house, the front porch and lawn of the Samuel B. Sternwheeler home, a grist mill (where Roy Clark performed many of his songs in earlier seasons), and a railroad depot, where Buck Owens performed his songs before acquiring "Buck's Place."
Elvis Presley was a fan of "Hee Haw" and wanted to appear as a guest on the program, but Presley was afraid that his manager, Colonel Tom Parker, would not allow him to do so. Two of the Hee Haw Honeys dated Presley long before they joined the cast: Linda Thompson in the mid-1970s, whom Presley had a long-term relationship with after his divorce from Priscilla; and Diana Goodman shortly afterwards. Shortly after Presley's death, his father, Vernon Presley, made a cameo appearance on the show, alongside Thompson and Buck Owens, and paid tribute to his late son, noting how much Elvis enjoyed watching the show, and introduced one of his favorite gospel songs, as performed by the Hee Haw Gospel Quartet.
"Hee Haw" produced a short-lived spin-off series, "Hee Haw Honeys" (not to be confused with "Hee Haw's" female cast members), for the 1978–79 television seasons. This musical sitcom starred Kathie Lee Johnson (Gifford) along with "Hee Haw" regulars Misty Rowe, Gailard Sartain, Lulu Roman, and Kenny Price as a family who owned a truck stop restaurant (likely inspired by the "Lulu's Truck Stop" sketch on "Hee Haw"). Their restaurant included a bandstand, where guest country artists would perform a couple of their hits of the day, sometimes asking the cast to join them. Cast members would also perform songs occasionally; and the Nashville Edition, "Hee Haw's" backup singing group, frequently appeared on the show, portraying regular patrons of the restaurant. Notable guest stars on "Honeys" included, but were not limited to: Loretta Lynn, The Oak Ridge Boys, Larry Gatlin, Dave & Sugar, and the Kendalls. Some stations that carried "Hee Haw" would air an episode of "Honeys" prior to "Hee Haw".
The Hee Haw Theater opened in Branson, Missouri, in 1981 and operated through 1983. It featured live shows using the cast of the television series, as well as guests and other talent. The format was similar with a country variety show-type family theme.
Charlton Comics also published humor comics based on "Hee Haw". They were drawn by Frank Roberge.
When "Hee Haw" went into syndication, its normal time slot was on Saturday evening in the early fringe hour (7:00pm ET).
"Hee Haw" continues to remain popular with its long-time fans and those who have discovered the program through DVD releases or its reruns on RFD-TV. In spite of the popularity among its fans, the program has never been a favorite of television critics or reviewers; the "Hee Haw Honeys" spin-off, in particular, was cited in a 2002 "TV Guide" article as one of the 10 worst television series ever.
In the third season episode of "The Simpsons" "Colonel Homer", "Hee Haw" is parodied as the TV show "Ya Hoo!".
On at least four episodes of the animated Fox series "Family Guy", when the storyline hits a dead-end, a cutaway to Conway Twitty performing a song is inserted. The hand-off is done in "Hee Haw" style, and often uses actual footage of Twitty performing on the show.
Lulu Roman released a new album titled "At Last" on January 15, 2013. The album features Lulu's versions of 12 classics and standards, including guest appearances by Dolly Parton, T. Graham Brown, Linda Davis, and Georgette Jones (daughter of George Jones and Tammy Wynette).
The series was referenced in "The Critic" as a parody crossover with "Star Trek: The Next Generation" under the title "Hee Haw: The Next Generation", in which the characters of the "Star Trek" series act as the cast of "Hee Haw".
Hexadecimal
In mathematics and computing, hexadecimal (also base 16, or hex) is a positional system that represents numbers using a base of 16. Unlike the common way of representing numbers with ten symbols, it uses sixteen distinct symbols, most often the symbols "0"–"9" to represent values zero to nine, and "A"–"F" (or alternatively "a"–"f") to represent values ten to fifteen.
Hexadecimal numerals are widely used by computer system designers and programmers, as they provide a human-friendly representation of binary-coded values. Each hexadecimal digit represents four binary digits, also known as a nibble, which is half a byte. For example, a single byte can have values ranging from 00000000 to 11111111 in binary form, which can be conveniently represented as 00 to FF in hexadecimal.
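The byte-to-two-digits correspondence can be sketched in JavaScript; the helper name `byteToHex` is illustrative, not from any standard library:

```javascript
// Each byte (0-255) maps to exactly two hex digits, one per nibble.
function byteToHex(b) {
  const hi = (b >> 4).toString(16);   // high nibble (upper four bits)
  const lo = (b & 0x0f).toString(16); // low nibble (lower four bits)
  return (hi + lo).toUpperCase();
}
console.log(byteToHex(0));   // "00"
console.log(byteToHex(255)); // "FF"
```

Because each nibble is converted independently, a byte never needs more (or fewer) than two hexadecimal digits.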
In mathematics, a subscript is typically used to specify the base, also known as the radix. For example, the decimal value would be expressed in hexadecimal as . In programming, a number of notations are used to support hexadecimal representation, usually involving a prefix or suffix; the prefix codice_1 is used in C and related languages to denote hexadecimal values.
Hexadecimal is used in the transfer encoding Base16, in which each byte of the plaintext is broken into two 4-bit values and represented by two hexadecimal digits.
Almost all modern usage employs the letters A–F to represent the digits with values 10–15. There is no universal convention for lowercase versus uppercase; each is prevalent or preferred in particular environments by community standards or convention, and mixed case is often used. Seven-segment displays use mixed-case AbCdEF to make digits that can be distinguished from each other.
In contexts where the base is not clear, hexadecimal numbers can be ambiguous and confused with numbers expressed in other bases. There are several conventions for expressing values unambiguously. A numerical subscript (itself written in decimal) can give the base explicitly: 159₁₀ is decimal 159; 159₁₆ is hexadecimal 159, which is equal to 345₁₀. Some authors prefer a text subscript, such as 159decimal and 159hex, or 159d and 159h.
Donald Knuth introduced the use of a particular typeface to represent a particular radix in his book "The TeXbook". Hexadecimal representations are written there in a typewriter typeface: 5A3
In linear text systems, such as those used in most computer programming environments, a variety of methods have arisen:
The use of the letters "A" through "F" to represent the digits above 9 was not universal in the early history of computers.
There are no traditional numerals to represent the quantities from ten to fifteen – letters are used as a substitute – and most European languages lack non-decimal names for the numerals above ten. Even though English has names for several non-decimal powers ("pair" for the first binary power, "score" for the first vigesimal power, "dozen", "gross" and "great gross" for the first three duodecimal powers), no English name describes the hexadecimal powers (decimal 16, 256, 4096, 65536, ... ). Some people read hexadecimal numbers digit by digit like a phone number, or using the NATO phonetic alphabet, the Joint Army/Navy Phonetic Alphabet, or a similar ad hoc system. In the wake of the adoption of hexadecimal among IBM System/360 programmers, Robert A. Magnuson suggested in 1968 in Datamation Magazine a pronunciation guide that gave short names to the hexadecimal letters: for instance, "A" was pronounced "ann", "B" "bet", "C" "chris", etc. Another naming system was invented independently by Tim Babb in 2015. An additional naming system was published online by S. R. Rogers in 2007 that tries to make the verbal representation distinguishable in any case, even when the actual number does not contain the letters A–F. Examples are listed in the tables below.
Systems of counting on digits have been devised for both binary and hexadecimal.
Arthur C. Clarke suggested using each finger as an on/off bit, allowing finger counting from zero to 1023₁₀ on ten fingers. Another system for counting up to FF₁₆ (255₁₀) is illustrated on the right.
The hexadecimal system can express negative numbers the same way as in decimal: −2A₁₆ to represent −42₁₀ and so on.
Hexadecimal can also be used to express the exact bit patterns used in the processor, so a sequence of hexadecimal digits may represent a signed or even a floating point value. This way, the negative number −42₁₀ can be written as FFFF FFD6 in a 32-bit CPU register (in two's-complement), as C228 0000 in a 32-bit FPU register or C045 0000 0000 0000 in a 64-bit FPU register (in the IEEE floating-point standard).
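The two's-complement pattern for −42 can be checked in JavaScript, where the unsigned right shift by zero reinterprets a number as an unsigned 32-bit integer (a quick sketch, not a general-purpose tool):

```javascript
// Reinterpret -42 as an unsigned 32-bit pattern and print it in hex.
// (-42) >>> 0 coerces to the unsigned 32-bit value 2^32 - 42.
const bits = ((-42) >>> 0).toString(16).toUpperCase();
console.log(bits); // "FFFFFFD6"
```

The same trick works for any value that fits in 32 bits; floating-point register patterns would need a typed-array view instead.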
Just as decimal numbers can be represented in exponential notation, so too can hexadecimal numbers. By convention, the letter "P" (or "p", for "power") represents "times two raised to the power of", whereas "E" (or "e") serves a similar purpose in decimal as part of the E notation. The number after the "P" is "decimal" and represents the "binary" exponent. Increasing the exponent by 1 multiplies by 2, not 16. 10.0p1 = 8.0p2 = 4.0p3 = 2.0p4 = 1.0p5. Usually, the number is normalized so that the leading hexadecimal digit is 1 (unless the value is exactly 0).
Example: 1.3DEp42 represents .
Hexadecimal exponential notation is required by the IEEE 754-2008 binary floating-point standard.
This notation can be used for floating-point literals in the C99 edition of the C programming language.
Using the "%a" or "%A" conversion specifiers, this notation can be produced by implementations of the "printf" family of functions following the C99 specification and
Single Unix Specification (IEEE Std 1003.1) POSIX standard.
Most computers manipulate binary data, but it is difficult for humans to work with a large number of digits for even a relatively small binary number. Although most humans are familiar with the base 10 system, it is much easier to map binary to hexadecimal than to decimal because each hexadecimal digit maps to a whole number of bits (four).
This example converts 1111₂ to base ten. Since each position in a binary numeral can contain either a 1 or a 0, its value may be easily determined by its position from the right:
Therefore:
With little practice, mapping 1111₂ to F₁₆ in one step becomes easy: see table in written representation. The advantage of using hexadecimal rather than decimal increases rapidly with the size of the number. When the number becomes large, conversion to decimal is very tedious. However, when mapping to hexadecimal, it is trivial to regard the binary string as 4-digit groups and map each to a single hexadecimal digit.
This example shows the conversion of a binary number to decimal, mapping each digit to the decimal value, and adding the results.
Compare this to the conversion to hexadecimal, where each group of four digits can be considered independently, and converted directly:
The conversion from hexadecimal to binary is equally direct.
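The group-of-four mapping described above can be sketched in JavaScript (the function name `binToHex` is illustrative):

```javascript
// Convert a binary string to hexadecimal by padding to a multiple of
// four bits and mapping each 4-bit group to one hex digit.
function binToHex(bin) {
  const padded = bin.padStart(Math.ceil(bin.length / 4) * 4, "0");
  let hex = "";
  for (let i = 0; i < padded.length; i += 4) {
    hex += parseInt(padded.slice(i, i + 4), 2).toString(16).toUpperCase();
  }
  return hex;
}
console.log(binToHex("1111"));                 // "F"
console.log(binToHex("01011110101101010010")); // "5EB52"
```

Each group is converted independently, which is exactly why the binary-hex mapping is so much easier than binary-decimal conversion.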
Although quaternary (base 4) is little used, it can easily be converted to and from hexadecimal or binary. Each hexadecimal digit corresponds to a pair of quaternary digits and each quaternary digit corresponds to a pair of binary digits. In the above example 5 E B 5 2₁₆ = 11 32 23 11 02₄.
The octal (base 8) system can also be converted with relative ease, although not quite as trivially as with bases 2 and 4. Each octal digit corresponds to three binary digits, rather than four. Therefore we can convert between octal and hexadecimal via an intermediate conversion to binary followed by regrouping the binary digits in groups of either three or four.
As with all bases there is a simple algorithm for converting a representation of a number to hexadecimal by doing integer division and remainder operations in the source base. In theory, this is possible from any base, but for most humans only decimal and for most computers only binary (which can be converted by far more efficient methods) can be easily handled with this method.
Let d be the number to represent in hexadecimal, and the series hᵢhᵢ₋₁...h₂h₁ be the hexadecimal digits representing the number.
"16" may be replaced with any other base that may be desired.
The following is a JavaScript implementation of the above algorithm for converting any number to its hexadecimal string representation. Its purpose is to illustrate the above algorithm; to work with data seriously, however, it is much more advisable to work with bitwise operators.
function toHex(d) {
  function toChar(n) { return "0123456789ABCDEF".charAt(n); }
  let hex = "";
  do { hex = toChar(d % 16) + hex; d = Math.floor(d / 16); } while (d > 0);
  return hex;
}
It is also possible to make the conversion by assigning each place in the source base the hexadecimal representation of its place value — before carrying out multiplication and addition to get the final representation.
For example, to convert the number B3AD to decimal, one can split the hexadecimal number into its digits: B (11₁₀), 3 (3₁₀), A (10₁₀) and D (13₁₀), and then get the final result by multiplying each decimal representation by 16^"p" ("p" being the corresponding hex digit position, counting from right to left, beginning with 0). In this case, we have that:
which is 45997 in base 10.
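The positional expansion can be verified with a short JavaScript sketch (the helper name `hexToDec` is ours; `parseInt(hex, 16)` would do the same job directly):

```javascript
// Accumulate digit values left to right: acc * 16 + digit is equivalent
// to summing digit * 16^p over positions p counted from the right.
function hexToDec(hex) {
  return [...hex.toUpperCase()].reduce(
    (acc, ch) => acc * 16 + "0123456789ABCDEF".indexOf(ch), 0);
}
console.log(hexToDec("B3AD")); // 45997
```

Running the worked example through the function reproduces the value 45997 derived above.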
Most modern computer systems with graphical user interfaces provide a built-in calculator utility capable of performing conversions between the various radices, and in most cases hexadecimal is among them.
In Microsoft Windows, the Calculator utility can be set to Scientific mode (called Programmer mode in some versions), which allows conversions between radix 16 (hexadecimal), 10 (decimal), 8 (octal) and 2 (binary), the bases most commonly used by programmers. In Scientific Mode, the on-screen numeric keypad includes the hexadecimal digits A through F, which are active when "Hex" is selected. In hex mode, however, the Windows Calculator supports only integers.
Elementary operations such as addition, subtraction, multiplication and division can be carried out indirectly through conversion to an alternate numeral system, such as the decimal system (the most commonly adopted system) or the binary system (where each hex digit corresponds to four binary digits).
Alternatively, one can also perform elementary operations directly within the hex system itself — by relying on its addition/multiplication tables and its corresponding standard algorithms such as long division and the traditional subtraction algorithm.
As with other numeral systems, the hexadecimal system can be used to represent rational numbers, although repeating expansions are common since sixteen (10₁₆) has only a single prime factor: two.
For any base, 0.1 (or "1/10") is always equivalent to one divided by the representation of that base value in its own number system. Thus, whether dividing one by two for binary or dividing one by sixteen for hexadecimal, both of these fractions are written as 0.1. Because the radix 16 is a perfect square (4²), fractions expressed in hexadecimal have an odd period much more often than decimal ones, and there are no cyclic numbers (other than trivial single digits). Recurring digits are exhibited when the denominator in lowest terms has a prime factor not found in the radix; thus, when using hexadecimal notation, all fractions with denominators that are not a power of two result in an infinite string of recurring digits (such as thirds and fifths). This makes hexadecimal (and binary) less convenient than decimal for representing rational numbers since a larger proportion lie outside its range of finite representation.
All rational numbers finitely representable in hexadecimal are also finitely representable in decimal, duodecimal and sexagesimal: that is, any hexadecimal number with a finite number of digits also has a finite number of digits when expressed in those other bases. Conversely, only a fraction of those finitely representable in the latter bases are finitely representable in hexadecimal. For example, decimal 0.1 corresponds to the infinite recurring representation 0.19999... in hexadecimal. However, hexadecimal is more efficient than duodecimal and sexagesimal for representing fractions with powers of two in the denominator. For example, 0.0625₁₀ (one-sixteenth) is equivalent to 0.1₁₆, 0.09₁₂, and 0;3,45₆₀.
The table below gives the expansions of some common irrational numbers in decimal and hexadecimal.
Powers of two have very simple expansions in hexadecimal. The first sixteen powers of two are shown below.
The word "hexadecimal" is composed of "hexa-", derived from the Greek ἕξ (hex) for "six", and "-decimal", derived from the Latin for "tenth". Webster's Third New International online derives "hexadecimal" as an alteration of the all-Latin "sexadecimal" (which appears in the earlier Bendix documentation). The earliest date attested for "hexadecimal" in Merriam-Webster Collegiate online is 1954, placing it safely in the category of international scientific vocabulary (ISV). It is common in ISV to mix Greek and Latin combining forms freely. The word "sexagesimal" (for base 60) retains the Latin prefix. Donald Knuth has pointed out that the etymologically correct term is "senidenary" (or possibly, "sedenary"), from the Latin term for "grouped by 16". (The terms "binary", "ternary" and "quaternary" are from the same Latin construction, and the etymologically correct terms for "decimal" and "octal" arithmetic are "denary" and "octonary", respectively.) Alfred B. Taylor used "senidenary" in his mid-1800s work on alternative number bases, although he rejected base 16 because of its "incommodious number of digits". Schwartzman notes that the expected form from usual Latin phrasing would be "sexadecimal", but computer hackers would be tempted to shorten that word to "sex". The etymologically proper Greek term would be "hexadecadic" / "ἑξαδεκαδικός" / "hexadekadikós" (although in Modern Greek, "decahexadic" / "δεκαεξαδικός" / "dekaexadikos" is more commonly used).
The traditional Chinese units of measurement were base-16. For example, one jīn (斤) in the old system equals sixteen taels. The suanpan (Chinese abacus) can be used to perform hexadecimal calculations such as additions and subtractions.
As with the duodecimal system, there have been occasional attempts to promote hexadecimal as the preferred numeral system. These attempts often propose specific pronunciation and symbols for the individual numerals. Some proposals unify standard measures so that they are multiples of 16.
An example of unified standard measures is hexadecimal time, which subdivides a day by 16 so that there are 16 "hexhours" in a day.
Base16 (as a proper name without a space) can also refer to a binary to text encoding belonging to the same family as Base32, Base58, and Base64.
In this case, data is broken into 4-bit sequences, and each value (between 0 and 15 inclusively) is encoded using 16 symbols from the ASCII character set. Although any 16 symbols from the ASCII character set can be used, in practice the ASCII digits '0'–'9' and the letters 'A'–'F' (or the lowercase 'a'–'f') are always chosen in order to align with standard written notation for hexadecimal numbers.
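A minimal Base16 encoder in JavaScript might look as follows (the function name is illustrative; real applications would typically use `Buffer.toString("hex")` in Node.js):

```javascript
// Base16-encode an array of byte values: each byte becomes two ASCII
// hex characters, so the output is exactly twice the input length.
function base16Encode(bytes) {
  return bytes.map(b => b.toString(16).padStart(2, "0").toUpperCase()).join("");
}
console.log(base16Encode([72, 105])); // "4869" - the ASCII bytes of "Hi"
```

The doubling of length is the main cost of Base16 relative to denser encodings such as Base64.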
There are several advantages of Base16 encoding:
The main disadvantages of Base16 encoding are:
Support for Base16 encoding is ubiquitous in modern computing. It is the basis for the W3C standard for URL percent encoding, where a character is replaced with a percent sign "%" and its Base16-encoded form. Most modern programming languages directly include support for formatting and parsing Base16-encoded numbers. | https://en.wikipedia.org/wiki?curid=13263 |
Histogram
A histogram is an approximate representation of the distribution of numerical or categorical data. It was first introduced by Karl Pearson. To construct a histogram, the first step is to "bin" (or "bucket") the range of values—that is, divide the entire range of values into a series of intervals—and then count how many values fall into each interval. The bins are usually specified as consecutive, non-overlapping intervals of a variable. The bins (intervals) must be adjacent, and are often (but not required to be) of equal size.
If the bins are of equal size, a rectangle is erected over the bin with height proportional to the frequency—the number of cases in each bin. A histogram may also be normalized to display "relative" frequencies. It then shows the proportion of cases that fall into each of several categories, with the sum of the heights equaling 1.
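The equal-width binning step can be sketched in JavaScript (the function name `histogram` is ours; it assumes the data are not all identical, so the bin width is nonzero):

```javascript
// Count how many values fall into each of k equal-width bins
// spanning [min, max]; the maximum value is clamped into the last bin.
function histogram(data, k) {
  const min = Math.min(...data), max = Math.max(...data);
  const width = (max - min) / k;
  const counts = new Array(k).fill(0);
  for (const x of data) {
    const i = Math.min(k - 1, Math.floor((x - min) / width));
    counts[i] += 1;
  }
  return counts;
}
console.log(histogram([1, 2, 2, 3, 7, 9], 2)); // [4, 2]
```

Dividing each count by the total number of observations would give the normalized ("relative frequency") form described above.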
However, bins need not be of equal width; in that case, the erected rectangle is defined to have its "area" proportional to the frequency of cases in the bin. The vertical axis is then not the frequency but "frequency density"—the number of cases per unit of the variable on the horizontal axis. Examples of variable bin width are displayed on Census bureau data below.
As the adjacent bins leave no gaps, the rectangles of a histogram touch each other to indicate that the original variable is continuous.
Histograms give a rough sense of the density of the underlying distribution of the data, and are often used for density estimation: estimating the probability density function of the underlying variable. The total area of a histogram used for probability density is always normalized to 1. If the lengths of the intervals on the "x"-axis are all 1, then a histogram is identical to a relative frequency plot.
A histogram can be thought of as a simplistic kernel density estimation, which uses a kernel to smooth frequencies over the bins. This yields a smoother probability density function, which will in general more accurately reflect the distribution of the underlying variable. The density estimate could be plotted as an alternative to the histogram, and is usually drawn as a curve rather than a set of boxes. Histograms are nevertheless preferred in applications when their statistical properties need to be modeled. The correlated variation of a kernel density estimate is very difficult to describe mathematically, while it is simple for a histogram where each bin varies independently.
An alternative to kernel density estimation is the average shifted histogram,
which is fast to compute and gives a smooth curve estimate of the density without using kernels.
The histogram is one of the seven basic tools of quality control.
Histograms are sometimes confused with bar charts. A histogram is used for continuous data, where the bins represent ranges of data, while a bar chart is a plot of categorical variables. Some authors recommend that bar charts have gaps between the rectangles to clarify the distinction.
This is the data for the histogram to the right, using 500 items:
The words used to describe the patterns in a histogram are: "symmetric", "skewed left" or "right", "unimodal", "bimodal" or "multimodal".
It is a good idea to plot the data using several different bin widths to learn more about it. Here is an example on tips given in a restaurant.
The U.S. Census Bureau found that there were 124 million people who work outside of their homes. Using their data on the time occupied by travel to work, the table below shows the absolute number of people who responded with travel times "at least 30 but less than 35 minutes" is higher than the numbers for the categories above and below it. This is likely due to people rounding their reported journey time. The problem of reporting values as somewhat arbitrarily rounded numbers is a common phenomenon when collecting data from people.
This histogram shows the number of cases per unit interval as the height of each block, so that the area of each block is equal to the number of people in the survey who fall into its category. The area under the curve represents the total number of cases (124 million). This type of histogram shows absolute numbers, with Q in thousands.
This histogram differs from the first only in the vertical scale. The area of each block is the fraction of the total that each category represents, and the total area of all the bars is equal to 1 (the fraction meaning "all"). The curve displayed is a simple density estimate. This version shows proportions, and is also known as a unit area histogram.
In other words, a histogram represents a frequency distribution by means of rectangles whose widths represent class intervals and whose areas are proportional to the corresponding frequencies: the height of each is the average frequency density for the interval. The intervals are placed together in order to show that the data represented by the histogram, while exclusive, is also contiguous. (E.g., in a histogram it is possible to have two connecting intervals of 10.5–20.5 and 20.5–33.5, but not two connecting intervals of 10.5–20.5 and 22.5–32.5. Empty intervals are represented as empty and not skipped.)
In a more general mathematical sense, a histogram is a function "m"ᵢ that counts the number of observations that fall into each of the disjoint categories (known as "bins"), whereas the graph of a histogram is merely one way to represent a histogram. Thus, if we let "n" be the total number of observations and "k" be the total number of bins, the histogram "m"ᵢ meets the following conditions:
A cumulative histogram is a mapping that counts the cumulative number of observations in all of the bins up to the specified bin. That is, the cumulative histogram "M"ᵢ of a histogram "m"ⱼ is defined as:
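In code, the cumulative histogram is a running total over the bin counts (a minimal sketch; the function name is illustrative):

```javascript
// M_i = sum of m_j for all j <= i: a prefix sum over the bin counts.
function cumulativeHistogram(counts) {
  let total = 0;
  return counts.map(c => (total += c));
}
console.log(cumulativeHistogram([4, 2, 3])); // [4, 6, 9]
```

The last entry of the result always equals the total number of observations "n".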
There is no "best" number of bins, and different bin sizes can reveal different features of the data. Grouping data is at least as old as Graunt's work in the 17th century, but no systematic guidelines were given until Sturges's work in 1926.
Using wider bins where the density of the underlying data points is low reduces noise due to sampling randomness; using narrower bins where the density is high (so the signal drowns the noise) gives greater precision to the density estimation. Thus varying the bin-width within a histogram can be beneficial. Nonetheless, equal-width bins are widely used.
Some theoreticians have attempted to determine an optimal number of bins, but these methods generally make strong assumptions about the shape of the distribution. Depending on the actual data distribution and the goals of the analysis, different bin widths may be appropriate, so experimentation is usually needed to determine an appropriate width. There are, however, various useful guidelines and rules of thumb.
The number of bins "k" can be assigned directly or can be calculated from a suggested bin width "h" as:
k = ⌈(max "x" − min "x") / "h"⌉.
The braces indicate the ceiling function.
The square-root choice, k = ⌈√"n"⌉, takes the square root of the number of data points in the sample (used by Excel histograms and many others) and rounds to the next integer.
Sturges' formula, k = ⌈log₂ "n"⌉ + 1, is derived from a binomial distribution and implicitly assumes an approximately normal distribution.
It implicitly bases the bin sizes on the range of the data and can perform poorly if "n" is small. The Rice rule, k = ⌈2∛"n"⌉, presented in "Online Statistics Education: A Multimedia Course of Study" (http://onlinestatbook.com/; Project Leader: David M. Lane, Rice University; chapter 2 "Graphing Distributions", section "Histograms"), is a simple alternative to Sturges's rule.
Doane's formula is a modification of Sturges' formula which attempts to improve its performance with non-normal data.
where formula_8 is the estimated 3rd-moment-skewness of the distribution and
where formula_11 is the sample standard deviation. Scott's normal reference rule, h = 3.5σ/∛"n", is optimal for random samples of normally distributed data, in the sense that it minimizes the integrated mean squared error of the density estimate.
The Freedman–Diaconis rule is: h = 2 IQR/∛"n",
which is based on the interquartile range, denoted by IQR. It replaces 3.5σ of Scott's rule with 2 IQR, which is less sensitive than the standard deviation to outliers in data.
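The common rules of thumb above can be collected in a short JavaScript sketch (names are illustrative; the Freedman–Diaconis helper takes the interquartile range as an input rather than computing it):

```javascript
// Three common bin-selection rules of thumb.
const sqrtChoice = n => Math.ceil(Math.sqrt(n));           // k = ceil(sqrt(n))
const sturges    = n => Math.ceil(Math.log2(n)) + 1;       // k = ceil(log2(n)) + 1
const fdWidth    = (iqr, n) => 2 * iqr * Math.cbrt(1 / n); // h = 2*IQR*n^(-1/3)

console.log(sqrtChoice(100)); // 10
console.log(sturges(100));    // 8
```

Note that the first two rules give a bin count "k", while Freedman–Diaconis gives a bin width "h", which must then be turned into a count via the ceiling formula above.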
This approach of minimizing integrated mean squared error from Scott's rule can be generalized beyond normal distributions, by using leave-one out cross validation:
Here, formula_14 is the number of datapoints in the "k"th bin, and choosing the value of "h" that minimizes "J" will minimize integrated mean squared error.
The choice is based on minimization of an estimated "L"2 risk function
where formula_16 and formula_17 are mean and biased variance of a histogram with bin-width formula_18, formula_19 and formula_20.
Rather than choosing evenly spaced bins, for some applications it is preferable to vary the bin width. This avoids bins with low counts. A common case is to choose "equiprobable bins", where the number of samples in each bin is expected to be approximately equal. The bins may be chosen according to some known distribution or may be chosen based on the data so that each bin has formula_21 samples. When plotting the histogram, the "frequency density" is used for the dependent axis. While all bins have approximately equal area, the heights of the histogram approximate the density distribution.
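One simple way to obtain approximately equiprobable bins is to place the edges at evenly spaced sample quantiles; the sketch below uses a crude nearest-rank quantile and assumes the sample is larger than the bin count:

```javascript
// Bin edges at evenly spaced sample quantiles, so each bin holds
// roughly the same number of points ("equiprobable bins").
function equiprobableEdges(data, k) {
  const sorted = [...data].sort((a, b) => a - b);
  const edges = [];
  for (let i = 0; i <= k; i++) {
    const idx = Math.min(sorted.length - 1,
                         Math.round((i * (sorted.length - 1)) / k));
    edges.push(sorted[idx]);
  }
  return edges;
}
console.log(equiprobableEdges([1, 2, 3, 4, 5, 6, 7, 8], 4)); // [1, 3, 5, 6, 8]
```

With edges chosen this way, the plotted heights are frequency densities (count divided by bin width), so narrow bins in dense regions appear tall, approximating the underlying density.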
For equiprobable bins, the following rule for the number of bins is suggested:
This choice of bins is motivated by maximizing the power of a Pearson chi-squared test testing whether the bins do contain equal numbers of samples. More specifically, for a given confidence interval formula_23 it is recommended to choose between 1/2 and 1 times the following equation:
Where formula_25 is the probit function. Following this rule for formula_26 would give between formula_27 and formula_28; the coefficient of 2 is chosen as an easy-to-remember value from this broad optimum.
A good reason why the number of bins should be proportional to formula_29 is the following: suppose that the data are obtained as formula_30 independent realizations of a bounded probability distribution with smooth density. Then the histogram remains equally "rugged" as formula_30 tends to infinity. If formula_32 is the "width" of the distribution (e. g., the standard deviation or the inter-quartile range), then the number of units in a bin (the frequency) is of order formula_33 and the "relative" standard error is of order formula_34. Comparing to the next bin, the relative change of the frequency is of order formula_35 provided that the derivative of the density is non-zero. These two are of the same order if formula_36 is of order formula_37, so that formula_38 is of order formula_29. This simple cubic root choice can also be applied to bins with non-constant width. | https://en.wikipedia.org/wiki?curid=13266 |
Hawaii
Hawaii is a state of the United States of America located in the Pacific Ocean. It is the only U.S. state located outside North America and the only island state.
The state encompasses nearly the entire Hawaiian archipelago, 137 islands spread over . The volcanic archipelago is physiographically and ethnologically part of the Polynesian subregion of Oceania. At the southeastern end of the archipelago, the eight main islands are, in order from northwest to southeast: Niihau, Kauai, Oahu, Molokai, Lānai, Kahoolawe, Maui, and the largest, Hawaii, which the state is named after. It is often called the "Big Island" or "Hawaii Island" to avoid confusion with the state or archipelago.
Hawaii is the 8th-smallest geographically and the 11th-least populous, but the 13th-most densely populated of the 50 states. It is the only state with an Asian American plurality. Hawaii has more than 1.4 million permanent residents, along with many visitors and U.S. military personnel. The state capital and largest city is Honolulu on the island of Oahu. The state's ocean coastline is about long, the fourth longest in the U.S., after the coastlines of Alaska, Florida, and California. Hawaii is the most recent state to join the union, on August 21, 1959. It was an independent nation until 1898.
Hawaii's diverse natural scenery, warm tropical climate, abundance of public beaches, oceanic surroundings, and active volcanoes make it a popular destination for tourists, surfers, biologists, and volcanologists. Because of its central location in the Pacific and 19th-century labor migration, Hawaii's culture is strongly influenced by North American and East Asian cultures, in addition to its indigenous Hawaiian culture.
The state of Hawaii derives its name from the name of its largest island, Hawaiʻi. A common Hawaiian explanation is that the island was named for a legendary figure from Hawaiian myth, who is said to have discovered the islands when they were first settled.
The Hawaiian language word "Hawaiʻi" is very similar to Proto-Polynesian "Sawaiki", with the reconstructed meaning "homeland". Cognates of "Hawaiʻi" are found in other Polynesian languages, including Māori, Rarotongan and Samoan. According to linguists Pukui and Elbert, "elsewhere in Polynesia, or a cognate is the name of the underworld or of the ancestral home, but in Hawaii, the name has no meaning".
A somewhat divisive political issue arose in 1978 when the Constitution of the State of Hawaii added Hawaiian as a second official state language. The title of the state constitution is "The Constitution of the State of Hawaii". Article XV, Section 1 of the Constitution uses "The State of Hawaii". Diacritics were not used because the document, drafted in 1949, predates the use of the ʻokina and the kahakō in modern Hawaiian orthography. The exact spelling of the state's name in the Hawaiian language is "Hawaiʻi". In the Hawaii Admission Act that granted Hawaiian statehood, the federal government recognized "Hawaii" as the official state name. Official government publications, department and office titles, and the Seal of Hawaii use the traditional spelling with no symbols for glottal stops or vowel length. In contrast, the National and State Park Services, the University of Hawaii and some private enterprises implement these symbols. No precedent for changes to U.S. state names exists since the adoption of the United States Constitution in 1789. However, the Constitution of Massachusetts formally changed the "Province of Massachusetts Bay" to the Commonwealth of Massachusetts in 1780, and in 1819, the Territory of Arkansaw was created but was later admitted to statehood as the State of Arkansas.
There are eight main Hawaiian islands, seven of which are permanently inhabited. The island of Niihau is privately managed by brothers Bruce and Keith Robinson; access is restricted to those who have permission from the island's owners. Access to uninhabited Kahoʻolawe island is also restricted.
The Hawaiian archipelago is located southwest of the contiguous United States. Hawaii is the southernmost U.S. state and the second westernmost after Alaska. Hawaii, like Alaska, does not border any other U.S. state. It is the only U.S. state that is not geographically located in North America, the only state completely surrounded by water and that is entirely an archipelago, and the only state in which coffee is commercially cultivable.
In addition to the eight main islands, the state has many smaller islands and islets. Kaula is a small island near Niihau. The Northwest Hawaiian Islands is a group of nine small, older islands to the northwest of Kauai that extend from Nihoa to Kure Atoll; these are remnants of once much larger volcanic mountains. Across the archipelago are around 130 small rocks and islets, such as Molokini, which are either volcanic, marine sedimentary or erosional in origin.
Hawaii's tallest mountain, Mauna Kea, stands 13,796 feet (4,205 m) above mean sea level; measured from its base on the floor of the Pacific Ocean, it rises about 33,500 feet (10,200 m), making it taller than Mount Everest.
The Hawaiian islands were formed by volcanic activity initiated at an undersea magma source called the Hawaii hotspot. The process is continuing to build islands; the tectonic plate beneath much of the Pacific Ocean continually moves northwest and the hot spot remains stationary, slowly creating new volcanoes. Because of the hotspot's location, all currently active land volcanoes are located on the southern half of Hawaii Island. The newest volcano, Lōihi Seamount, is located south of the coast of Hawaii Island.
The last volcanic eruption outside Hawaii Island occurred at Haleakalā on Maui before the late 18th century, possibly hundreds of years earlier. In 1790, Kīlauea exploded; it was the deadliest eruption known to have occurred in the modern era in what is now the United States. Up to 5,405 warriors and their families marching on Kīlauea were killed by the eruption. Volcanic activity and subsequent erosion have created impressive geological features. Hawaii Island has the second-highest point among the world's islands.
On the flanks of the volcanoes, slope instability has generated damaging earthquakes and related tsunamis, particularly in 1868 and 1975. Steep cliffs have been created by catastrophic debris avalanches on the submerged flanks of ocean island volcanoes.
Kīlauea erupted in May 2018, opening 22 fissure vents on its East Rift Zone. The Leilani Estates and Lanipuna Gardens subdivisions are situated within this zone. Lava flows destroyed at least 36 buildings; this, together with the sulfur dioxide fumes, necessitated the evacuation of more than 2,000 local inhabitants from the neighborhoods.
Because the islands of Hawaii are distant from other land habitats, life is thought to have arrived there by wind, waves (i.e. by ocean currents) and wings (i.e. birds, insects, and any seeds they may have carried on their feathers). This isolation, in combination with the diverse environment (including extreme altitudes, tropical climates, and arid shorelines), allowed for the evolution of new endemic flora and fauna. Hawaii has more endangered species and has lost a higher percentage of its endemic species than any other U.S. state. One endemic plant, "Brighamia", now requires hand-pollination because its natural pollinator is presumed to be extinct. The two species of "Brighamia"—"B. rockii" and "B. insignis"—are represented in the wild by around 120 individual plants. To ensure these plants set seed, biologists rappel down cliffs to brush pollen onto their stigmas.
The extant main islands of the archipelago have been above the surface of the ocean for fewer than 10 million years, a fraction of the time during which biological colonization and evolution have occurred there. The islands are well known for the environmental diversity that occurs on high mountains within the trade-wind field. On a single island, the climate around the coasts can range from dry tropical to wet tropical; on the slopes, environments range from tropical rainforest, through a temperate climate, to alpine conditions with a cold, dry climate. The rainy climate impacts soil development, which largely determines ground permeability, affecting the distribution of streams and wetlands.
Several areas in Hawaii are under the protection of the National Park Service. Hawaii has two national parks: Haleakalā National Park located near Kula on the island of Maui, which features the dormant volcano Haleakalā that formed east Maui, and Hawaii Volcanoes National Park in the southeast region of the Hawaii Island, which includes the active volcano Kīlauea and its rift zones.
There are three national historical parks; Kalaupapa National Historical Park in Kalaupapa, Molokai, the site of a former leper colony; Kaloko-Honokōhau National Historical Park in Kailua-Kona on Hawaii Island; and Puuhonua o Hōnaunau National Historical Park, an ancient place of refuge on Hawaii Island's west coast. Other areas under the control of the National Park Service include Ala Kahakai National Historic Trail on Hawaii Island and the USS "Arizona" Memorial at Pearl Harbor on Oahu.
The Papahānaumokuākea Marine National Monument was proclaimed by President George W. Bush on June 15, 2006. The monument covers reefs, atolls, and areas of shallow and deep sea in the Pacific Ocean, an area larger than all the national parks in the U.S. combined.
Hawaii's climate is typical for the tropics, although temperatures and humidity tend to be less extreme because of near-constant trade winds from the east. Summer highs usually reach the upper 80s °F (around 31 °C) during the day, with nighttime lows in the mid-70s °F (around 24 °C). Winter day temperatures are usually in the low 80s °F (around 28 °C); at low elevation they seldom dip below the mid-60s °F (around 18 °C) at night. Snow, not usually associated with the tropics, falls at high elevations on Mauna Kea and Mauna Loa on Hawaii Island in some winter months. Snow rarely falls on Haleakalā. Mount Waialeale on Kauai has the second-highest average annual rainfall on Earth, about 450 inches (11,400 mm) per year. Most of Hawaii experiences only two seasons; the dry season runs from May to October and the wet season is from October to April.
The warmest temperature recorded in the state, 100 °F (38 °C) in Pahala on April 27, 1931, makes Hawaii tied with Alaska as having the lowest record high temperature of any U.S. state. Hawaii's record low temperature, 12 °F (−11 °C), was observed in May 1979 on the summit of Mauna Kea. Hawaii is the only state never to have recorded sub-zero Fahrenheit temperatures.
Climates vary considerably on each island; they can be divided into windward and leeward ("koolau" and "kona", respectively) areas based upon location relative to the higher mountains. Windward sides face the prevailing trade winds and receive more rain and cloud cover; leeward sides are drier and sunnier.
Hawaii is one of two states that were widely recognized independent nations prior to joining the United States. The Kingdom of Hawaii was sovereign from 1810 until 1893 when the monarchy was overthrown by resident American and European capitalists and landholders. Hawaii was an independent republic from 1894 until August 12, 1898, when it officially became a territory of the United States. Hawaii was admitted as a U.S. state on August 21, 1959.
Based on archaeological evidence, the earliest habitation of the Hawaiian Islands dates to around 300 CE, probably by Polynesian settlers from the Marquesas Islands. A second wave of migration from Raiatea and Bora Bora took place centuries later. The date of the human discovery and habitation of the Hawaiian Islands is the subject of academic debate. Some archaeologists and historians think it was a later wave of immigrants from Tahiti around 1000 CE who introduced a new line of high chiefs, the kapu system, the practice of human sacrifice, and the building of "heiau". This later immigration is detailed in Hawaiian mythology ("moolelo") about Paao. Other authors say there is no archaeological or linguistic evidence for a later influx of Tahitian settlers and that Paao must be regarded as a myth.
The history of the islands is marked by a slow, steady growth in population and the size of the chiefdoms, which grew to encompass whole islands. Local chiefs, called alii, ruled their settlements, and launched wars to extend their influence and defend their communities from predatory rivals. Ancient Hawaii was a caste-based society, much like that of Hindus in India.
The 1778 arrival of British explorer Captain James Cook marked the first documented contact by a European explorer with Hawaii. Cook named the archipelago "the Sandwich Islands" in honor of his sponsor John Montagu, 4th Earl of Sandwich, publishing the islands' location and rendering the native name as "Owyhee". The form 'Owyhee' or 'Owhyhee' is preserved in the names of certain locations in the American part of the Pacific Northwest, among them Owyhee County and Owyhee Mountains in Idaho, named after three native Hawaiian members of a trapping party who went missing in the area.
It is very possible that Spanish explorers arrived in the Hawaiian Islands in the 16th century, two hundred years before Cook's first documented visit in 1778. Ruy López de Villalobos commanded a fleet of six ships that left Acapulco in 1542 bound for the Philippines, with a Spanish sailor named Juan Gaetano aboard as pilot. Depending on the interpretation, Gaetano's reports describe an encounter with either Hawaii or the Marshall Islands. If de Villalobos' crew spotted Hawaii, Gaetano would thus be considered the first European to see the islands. Some scholars have dismissed these claims due to a lack of credibility.
Nonetheless, Spanish archives contain a chart that depicts islands at the same latitude as Hawaii, but with a longitude ten degrees east of the islands. In this manuscript, the island of Maui is named "La Desgraciada" (The Unfortunate Island), and what appears to be Hawaii Island is named "La Mesa" (The Table). Islands resembling Kahoolawe, Lānai, and Molokai are named "Los Monjes" (The Monks). For two-and-a-half centuries, Spanish galleons crossed the Pacific from Mexico along a route that passed south of Hawaii on their way to Manila. The exact route was kept secret to protect the Spanish trade monopoly against competing powers. Hawaii thus maintained independence, despite being situated on an east–west sea route between nations that were subjects of the Viceroyalty of New Spain, an empire that exercised jurisdiction over many subject civilizations and kingdoms on both sides of the Pacific.
Despite such contested claims, Cook is generally credited as being the first European to land at Hawaii, having visited the Hawaiian Islands twice. As he prepared for departure after his second visit in 1779, a quarrel ensued as Cook took temple idols and fencing as "firewood", and a minor chief and his men stole a boat from his ship. Cook abducted the King of Hawaii Island, Kalaniōpuu, and held him for ransom aboard his ship in order to gain return of Cook's boat, as this tactic had previously worked in Tahiti and other islands. Instead, the supporters of Kalaniōpuu attacked, killing Cook and four sailors as Cook's party retreated along the beach to their ship. The ship departed without retrieving the stolen boat.
After Cook's visit and the publication of several books relating his voyages, the Hawaiian Islands attracted many European visitors: explorers, traders, and eventually whalers, who found the islands to be a convenient harbor and source of supplies. Early British influence can be seen in the design of the flag of Hawaii, which bears the Union Jack in the top-left corner. These visitors introduced diseases to the once-isolated islands, causing the Hawaiian population to drop precipitously. Native Hawaiians had no resistance to Eurasian diseases, such as influenza, smallpox and measles. By 1820, disease, famine and wars between the chiefs killed more than half of the Native Hawaiian population. During the 1850s, measles killed a fifth of Hawaii's people.
Historical records indicate that the earliest Chinese immigrants to Hawaii originated from Guangdong Province; a few sailors arrived in 1778 with Captain Cook's voyage, and more arrived in 1789 with an American trader who settled in Hawaii in the late 18th century. Leprosy is said to have been introduced by Chinese workers by 1830; as with the other new infectious diseases, it proved damaging to the Hawaiians.
During the 1780s and 1790s, chiefs often fought for power. After a series of battles that ended in 1795, all inhabited islands were subjugated under a single ruler, who became known as King Kamehameha the Great. He established the House of Kamehameha, a dynasty that ruled the kingdom until 1872.
After Kamehameha II inherited the throne in 1819, American Protestant missionaries to Hawaii converted many Hawaiians to Christianity. They used their influence to end many traditional practices of the people. During the reign of King Kamehameha III, Hawaiʻi turned into a Christian monarchy with the signing of the 1840 Constitution. Hiram Bingham I, a prominent Protestant missionary, was a trusted adviser to the monarchy during this period. Other missionaries and their descendants became active in commercial and political affairs, leading to conflicts between the monarchy and its restive American subjects. Catholic and Mormon missionaries were also active in the kingdom, but they converted a minority of the Native Hawaiian population. Missionaries from each major group ministered to the leper colony at Kalaupapa on Molokai, which was established in 1866 and operated well into the 20th century. The best known were Father Damien and Mother Marianne Cope, both of whom were canonized in the early 21st century as Roman Catholic saints.
The death of the bachelor King Kamehameha V—who did not name an heir—resulted in the popular election of Lunalilo over Kalākaua. Lunalilo died the next year, also without naming an heir. In 1874, the election was contested within the legislature between Kalākaua and Emma, Queen Consort of Kamehameha IV. After riots broke out, the United States and Britain landed troops on the islands to restore order. King Kalākaua was chosen as monarch by the Legislative Assembly by a vote of 39 to 6 on February 12, 1874.
In 1887, Kalākaua was forced to sign the 1887 Constitution of the Kingdom of Hawaii. Drafted by white businessmen and lawyers, the document stripped the king of much of his authority. It established a property qualification for voting that effectively disenfranchised most Hawaiians and immigrant laborers and favored the wealthier, white elite. Resident whites were allowed to vote but resident Asians were not. As the 1887 Constitution was signed under threat of violence, it is known as the Bayonet Constitution. King Kalākaua, reduced to a figurehead, reigned until his death in 1891. His sister, Queen Liliuokalani, succeeded him; she was the last monarch of Hawaii.
In 1893, Queen Liliuokalani announced plans for a new constitution to proclaim herself an absolute monarch. On January 14, 1893, a group of mostly Euro-American business leaders and residents formed the Committee of Safety to stage a coup d'état against the kingdom and seek annexation by the United States. United States Government Minister John L. Stevens, responding to a request from the Committee of Safety, summoned a company of U.S. Marines. The Queen's soldiers did not resist. According to historian William Russ, the monarchy was unable to protect itself.
On January 17, 1893, Queen Liliuokalani was overthrown and replaced by a provisional government composed of members of the Committee of Safety. The United States Minister to the Kingdom of Hawaii, John L. Stevens, conspired with U.S. citizens to overthrow the monarchy. After the overthrow, lawyer Sanford B. Dole, a citizen of Hawaii, became President of the Republic when the Provisional Government of Hawaii ended on July 4, 1894. Controversy ensued in the following years as the Queen tried to regain her throne. The administration of President Grover Cleveland commissioned the Blount Report, which concluded that the removal of Liliuokalani had been illegal. The U.S. government first demanded that Queen Liliuokalani be reinstated, but the Provisional Government refused.
Congress conducted an independent investigation, and on February 26, 1894, submitted the Morgan Report, which found all parties, including Minister Stevens—with the exception of the Queen—"not guilty" and not responsible for the coup. Partisans on both sides of the debate questioned the accuracy and impartiality of both the Blount and Morgan reports over the events of 1893.
In 1993, the US Congress passed a joint Apology Resolution regarding the overthrow; it was signed by President Bill Clinton. The resolution apologized and said that the overthrow was illegal in the following phrase: "The Congress—on the occasion of the 100th anniversary of the illegal overthrow of the Kingdom of Hawaii on January 17, 1893, acknowledges the historical significance of this event which resulted in the suppression of the inherent sovereignty of the Native Hawaiian people." The Apology Resolution also "acknowledges that the overthrow of the Kingdom of Hawaii occurred with the active participation of agents and citizens of the United States and further acknowledges that the Native Hawaiian people never directly relinquished to the United States their claims to their inherent sovereignty as a people over their national lands, either through the Kingdom of Hawaii or through a plebiscite or referendum".
After William McKinley won the 1896 U.S. presidential election, advocates pressed to annex the Republic of Hawaii. The previous president, Grover Cleveland, was a friend of Queen Liliuokalani. McKinley was open to persuasion by U.S. expansionists and by annexationists from Hawaii. He met with three non-native annexationists: Lorrin A. Thurston, Francis March Hatch and William Ansel Kinney. After negotiations in June 1897, Secretary of State John Sherman agreed to a treaty of annexation with these representatives of the Republic of Hawaii. The U.S. Senate never ratified the treaty. Despite the opposition of most native Hawaiians, the Newlands Resolution was used to annex the Republic to the U.S.; it became the Territory of Hawaii. The Newlands Resolution was passed by the House on June 15, 1898, by 209 votes in favor to 91 against, and by the Senate on July 6, 1898, by a vote of 42 to 21.
In 1900, Hawaii was granted self-governance and retained Iolani Palace as the territorial capitol building. Despite several attempts to become a state, Hawaii remained a territory for 60 years. Plantation owners and capitalists, who maintained control through financial institutions such as the Big Five, found territorial status convenient because they remained able to import cheap, foreign labor. Such immigration and labor practices were prohibited in many states.
Puerto Rican immigration to Hawaii began in 1899, when Puerto Rico's sugar industry was devastated by a hurricane, causing a worldwide shortage of sugar and a huge demand for sugar from Hawaii. Hawaiian sugarcane plantation owners began to recruit experienced, unemployed laborers in Puerto Rico. Two waves of Korean immigration to Hawaii occurred in the 20th century. The first wave arrived between 1903 and 1924; the second wave began in 1965 after President Lyndon B. Johnson signed the Immigration and Nationality Act of 1965, which removed racial and national barriers and resulted in significantly altering the demographic mix in the U.S.
Oahu was the target of a surprise attack on Pearl Harbor by Imperial Japan on December 7, 1941. The attack on Pearl Harbor and other military and naval installations, carried out by aircraft and by midget submarines, brought the United States into World War II.
In the 1950s, the power of the plantation owners was broken by the descendants of immigrant laborers, who were born in Hawaii and were U.S. citizens. They voted against the Hawaii Republican Party, which was strongly supported by plantation owners. The new majority voted for the Democratic Party of Hawaii, which dominated territorial and state politics for more than 40 years. Eager to gain full representation in Congress and the Electoral College, residents actively campaigned for statehood. In Washington there was talk that Hawaii would be a Republican Party stronghold, so it was matched with the admission of Alaska, seen as a Democratic Party stronghold. These predictions turned out to be inaccurate; today, Hawaii votes predominantly Democratic, while Alaska votes Republican.
In March 1959, Congress passed the Hawaii Admission Act, which U.S. President Dwight D. Eisenhower signed into law. The act excluded Palmyra Atoll from statehood; it had been part of the Kingdom and Territory of Hawaii. On June 27, 1959, a referendum asked residents of Hawaii to vote on the statehood bill; 94.3% voted in favor of statehood and 5.7% opposed it. The referendum asked voters to choose between accepting the Act and remaining a U.S. territory. The United Nations' Special Committee on Decolonization later removed Hawaii from its list of non-self-governing territories.
After attaining statehood, Hawaii quickly modernized through construction and a rapidly growing tourism economy. Later, state programs promoted Hawaiian culture. The Hawaii State Constitutional Convention of 1978 created institutions such as the Office of Hawaiian Affairs to promote indigenous language and culture.
Coincidentally, the "wiki" collaborative websites that transformed the Internet take their name from the Hawaiian word "wiki", meaning "quick".
After Europeans and mainland Americans first arrived during the Kingdom of Hawaii period, the overall population of Hawaii, until that time composed solely of indigenous Hawaiians, fell dramatically. The indigenous Hawaiian population succumbed to foreign diseases, declining from 300,000 in the 1770s, to 60,000 in the 1850s, to 24,000 in 1920. In that year, 43% of the population was of Japanese descent. The population of Hawaii finally began to increase after an influx of primarily Asian settlers who arrived as migrant laborers at the end of the 19th century.
The unmixed indigenous Hawaiian population has still not returned to its 300,000 pre-contact level. Only 156,000 persons have declared themselves to be of Native Hawaiian-only ancestry, just over half the pre-contact Native Hawaiian population, although an additional 371,000 persons have declared themselves to possess Native Hawaiian ancestry in combination with one or more other races (including other Polynesian groups, but mostly Asian and/or Caucasian).
The United States Census Bureau estimates the population of Hawaii was 1,420,491 on July 1, 2018; an increase of 4.42% since the 2010 United States Census.
As of July 1, 2018, Hawaii had an estimated population of 1,420,491, a decrease of 7,047 from the previous year and an increase of 60,190 (4.42%) since 2010. This includes a natural increase of 48,111 (96,028 births minus 47,917 deaths) and an increase due to net migration of 16,956 people into the state. Immigration from outside the United States resulted in a net increase of 30,068; migration within the country produced a net loss of 13,112 people.
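The component figures in this paragraph can be cross-checked with simple arithmetic; a minimal sketch in Python, using only the values stated above:

```python
# Cross-check Hawaii's reported population-change components
# (all figures as stated in the text).

births = 96_028
deaths = 47_917
natural_increase = births - deaths          # births minus deaths
print(natural_increase)                     # 48111, matching the stated natural increase

international_net = 30_068                  # net gain from immigration outside the U.S.
domestic_net = -13_112                      # net loss from migration within the country
net_migration = international_net + domestic_net
print(net_migration)                        # 16956, matching the stated net migration
```

Note that the two components sum to 65,067, which differs from the stated 60,190 increase since 2010; the text does not say whether the components cover the same reporting period as the cumulative change.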
The center of population of Hawaii is located on the island of O'ahu. Large numbers of Native Hawaiians have moved to Las Vegas, which has been called the "ninth island" of Hawaii.
Hawaii has a "de facto" population of over 1.4 million, due in part to a large number of military personnel and tourist residents. Oahu is the most populous island; it has the highest population density, with a resident population of just under one million, approximately 1,650 people per square mile. Hawaii's 1.4 million residents yield an average population density of 188.6 persons per square mile. The state has a lower population density than Ohio and Illinois.
The average projected lifespan of people born in Hawaii in 2000 is 79.8 years (77.1 years for males, 82.5 for females), longer than the average lifespan of any other U.S. state. The U.S. military reported it had 42,371 personnel on the islands.
According to the 2010 United States Census, Hawaii had a population of 1,360,301. The state's population identified as 38.6% Asian; 24.7% White (22.7% Non-Hispanic White Alone); 23.6% from two or more races; 10.0% Native Hawaiians and other Pacific Islanders; 8.9% Hispanics and Latinos of any race; 1.6% Black or African American; 1.2% from some other race; and 0.3% Native American and Alaska Native.
Hawaii has the highest percentage of Asian Americans and multiracial Americans and the lowest percentage of White Americans of any state. It is the only state where people who identify as Asian Americans are the largest ethnic group. In 2012, 14.5% of the resident population under age 1 was non-Hispanic white. Hawaii's Asian population consists mainly of 198,000 (14.6%) Filipino Americans, 185,000 (13.6%) Japanese Americans, roughly 55,000 (4.0%) Chinese Americans, and 24,000 (1.8%) Korean Americans. There are more than 80,000 Indigenous Hawaiians—5.9% of the population. Including those with partial ancestry, Samoan Americans constitute 2.8% of Hawaii's population, and Tongan Americans constitute 0.6%.
Over 120,000 (8.8%) Hispanic and Latino Americans live in Hawaii. Mexican Americans number over 35,000 (2.6%); Puerto Ricans exceed 44,000 (3.2%). Multiracial Americans constitute almost 25% of Hawaii's population, exceeding 320,000 people. Eurasian Americans are a prominent mixed-race group, numbering about 66,000 (4.9%). The Non-Hispanic White population numbers around 310,000—just over 20% of the population. The multi-racial population outnumbers the non-Hispanic white population by about 10,000 people. In 1970, the Census Bureau reported Hawaii's population was 38.8% white and 57.7% Asian and Pacific Islander.
The five largest European ancestries in Hawaii are German (7.4%), Irish (5.2%), English (4.6%), Portuguese (4.3%) and Italian (2.7%). About 82.2% of the state's residents were born in the United States. Roughly 75% of foreign-born residents originate in Asia. Hawaii is a majority-minority state. It was expected to be one of three states that would not have a non-Hispanic white plurality in 2014; the other two are California and New Mexico.
The third group of foreigners to arrive in Hawaii were from China. Chinese workers on Western trading ships settled in Hawaii starting in 1789. In 1820, the first American missionaries arrived to preach Christianity and teach the Hawaiians Western ways. Today, a large proportion of Hawaii's population has Asian ancestry—especially Filipino, Japanese and Chinese. Many are descendants of immigrants brought to work on the sugarcane plantations in the mid-to-late 19th century. The first 153 Japanese immigrants arrived in Hawaii on June 19, 1868. They were not approved by the then-current Japanese government because the contract was between a broker and the Tokugawa shogunate, which had by then been replaced by the Meiji Restoration. The first Japanese immigrants approved by the Japanese government arrived on February 9, 1885, after Kalākaua's petition to Emperor Meiji during Kalākaua's 1881 visit to Japan.
Almost 13,000 Portuguese migrants had arrived by 1899; they also worked on the sugarcane plantations. By 1901, more than 5,000 Puerto Ricans were living in Hawaii.
English and Hawaiian are listed as Hawaii's official languages in the state's 1978 constitution, in Article XV, Section 4. However, the use of Hawaiian is limited because the constitution specifies that "Hawaiian shall be required for public acts and transactions only as provided by law". Hawaii Creole English, locally referred to as "Pidgin", is the native language of many native residents and is a second language for many others.
As of the 2000 Census, 73.4% of Hawaii residents age 5 and older exclusively speak English at home. According to the 2008 American Community Survey, 74.6% of Hawaii's residents older than 5 speak only English at home. In their homes, 21.0% of state residents speak an additional Asian language, 2.6% speak Spanish, 1.6% speak other Indo-European languages and 0.2% speak another language.
After English, other languages popularly spoken in the state are Tagalog, Japanese and Ilocano. Significant numbers of European immigrants and their descendants also speak their native languages; the most numerous are German, Portuguese, Italian and French. 5.4% of residents speak Tagalog (including non-native speakers of Filipino, the Tagalog-based national and co-official language of the Philippines); 5.0% speak Japanese and 4.0% speak Ilocano; 1.2% speak Chinese, 1.7% speak Hawaiian, 1.7% speak Spanish, 1.6% speak Korean, and 1.0% speak Samoan.
The keyboard layout used for Hawaiian is QWERTY.
The Hawaiian language has about 2,000 native speakers, about 0.15% of the total population. According to the United States Census, there were more than 24,000 total speakers of the language in Hawaii in 2006–2008. Hawaiian is a Polynesian member of the Austronesian language family. It is closely related to other Polynesian languages, such as Marquesan, Tahitian, Māori, Rapa Nui (the language of Easter Island), and less closely to Samoan and Tongan.
According to Schütz, the Marquesans colonized the archipelago in roughly 300 CE and were later followed by waves of seafarers from the Society Islands, Samoa and Tonga.
These Polynesians remained in the islands; they eventually became the Hawaiian people, and their languages evolved into the Hawaiian language. Kimura and Wilson say, "[l]inguists agree that Hawaiian is closely related to Eastern Polynesian, with a particularly strong link in the Southern Marquesas, and a secondary link in Tahiti, which may be explained by voyaging between the Hawaiian and Society Islands". Before the arrival of Captain James Cook, the Hawaiian language had no written form. That form was developed mainly by American Protestant missionaries between 1820 and 1826, who assigned letters from the Latin alphabet to the Hawaiian phonemes.
Interest in Hawaiian increased significantly in the late 20th century. With the help of the Office of Hawaiian Affairs, specially designated immersion schools in which all subjects are taught in Hawaiian were established. The University of Hawaii developed a Hawaiian language graduate studies program. Municipal codes were altered to favor Hawaiian place and street names for new civic developments. Hawaiʻi Sign Language, a sign language for the deaf based on the Hawaiian language, has been in use in the islands since the early 1800s. Its use is dwindling as American Sign Language supplants HSL in schooling and various other domains.
Hawaiian distinguishes between long and short vowel sounds; in modern practice, vowel length is indicated with a macron ("kahakō"). The Hawaiian language also uses the glottal stop ("okina") as a consonant, written as a symbol similar to the apostrophe or left-hanging (opening) single quotation mark. Hawaiian-language newspapers ("nūpepa") published from 1834 to 1948 and traditional native speakers of Hawaiian generally omit the marks in their own writing; the okina and kahakō are intended to help non-native speakers.
Some residents of Hawaii speak Hawaii Creole English (HCE), endonymically called "pidgin" or "pidgin English". The lexicon of HCE derives mainly from English but also uses words derived from Hawaiian, Chinese, Japanese, Portuguese, Ilocano and Tagalog. During the 19th century, the increase in immigration, mainly from China, Japan, Portugal (especially the Azores and Madeira) and Spain, catalyzed the development of a hybrid variant of English known to its speakers as "pidgin". By the early 20th century, pidgin speakers had children who acquired it as their first language. HCE speakers use some Hawaiian words without those words being considered archaic. Most place names are retained from Hawaiian, as are some names for plants and animals. For example, tuna fish is often called by its Hawaiian name, "ahi".
HCE speakers have modified the meanings of some English words. For example, "aunty" and "uncle" may either refer to any adult who is a friend or be used to show respect to an elder. Syntax and grammar follow distinctive rules different from those of General American English. For example, instead of "it is hot today, isn't it?", an HCE speaker would say simply "stay hot, eh?" The term "da kine" is used as a filler; a substitute for virtually any word or phrase. During the surfing boom in Hawaii, HCE was influenced by surfer slang. Some HCE expressions, such as "brah" and "da kine", have found their ways elsewhere through surfing communities.
Christianity is the most widespread religion in Hawaii, mainly represented by various Protestants, Roman Catholics and Mormons. The second-most popular religion is Buddhism, especially among the archipelago's Japanese community. The religiously unaffiliated account for one-quarter of the population.
The Cathedral Church of Saint Andrew in Honolulu was formerly the seat of the Hawaiian Reformed Catholic Church. When the Hawaiian Reformed Catholic Church, a province of the Anglican Communion, was merged into the Episcopal Church in the 1890s following the overthrow of the Kingdom of Hawaii, it became the seat of the Episcopal Diocese of Hawaii. The Cathedral Basilica of Our Lady of Peace and the Co-Cathedral of Saint Theresa of the Child Jesus serve as seats of the Roman Catholic Diocese of Honolulu. The Eastern Orthodox community is centered around the Saints Constantine and Helen Greek Orthodox Cathedral of the Pacific.
The largest denominations by number of adherents were the Roman Catholic Church with 249,619 adherents in 2010, the Church of Jesus Christ of Latter-day Saints with 68,128 adherents in 2009, the United Church of Christ with 115 congregations and 20,000 members, and the Southern Baptist Convention with 108 congregations and 18,000 members. Non-denominational churches collectively have 128 congregations and 32,000 members.
Hawaii has had a long history of LGBT identities. "Māhū" ("in the middle") were a precolonization third gender with traditional spiritual and social roles; "māhū" were a respected group of people widely regarded as healers. The concept of "aikāne" referred to homosexual relationships, widely accepted as a normal part of ancient Hawaiian society. Among men, "aikāne" relationships often began as teens and continued throughout their adult lives, even if they also maintained heterosexual partners. While "aikāne" usually refers to male homosexuality, some stories also refer to women, implying that women may have been involved in "aikāne" relationships as well. Journals written by Captain Cook's crew record that many "alii" (hereditary nobles) also engaged in "aikāne" relationships, and Kamehameha the Great, the founder and first ruler of the Kingdom of Hawaii, was also known to participate. Cook's second lieutenant and co-astronomer James King observed that "all the chiefs had them", and recounts that Cook was actually asked by one chief to leave King behind, considering the role a great honor.
According to Hawaiian scholar Lilikalā Kameeleihiwa, "If you didn't sleep with a man, how could you trust him when you went into battle? How would you know if he was going to be the warrior that would protect you at all costs, if he wasn't your lover?"
During the late 19th and early 20th century, the word "aikāne" was expurgated of its original sexual meaning by colonialism, and in print simply meant "friend". Nonetheless, in Hawaiian language publications its metaphorical meaning can still mean either "friend" or "lover" without stigmatization.
A 2012 poll by Gallup found that Hawaii had the largest proportion of lesbian, gay, bisexual and transgender (LGBT) adults in the U.S., at 5.1%, comprising an estimated adult LGBT population of 53,966 individuals. The number of same-sex couple households in 2010 was 3,239, a 35.5% increase over the figure from a decade earlier. In 2013, Hawaii became the fifteenth U.S. state to legalize same-sex marriage; a University of Hawaii researcher reported at the time that the law might boost tourism by $217 million.
The history of Hawaii's economy can be traced through a succession of dominant industries: sandalwood, whaling, sugarcane, pineapple, the military, tourism and education. Since statehood in 1959, tourism has been the largest industry, contributing 24.3% of the gross state product (GSP) in 1997, despite efforts to diversify. The state's gross output for 2003 was billion; per capita income for Hawaii residents in 2014 was . Hawaiian exports include food and clothing. These industries play a small role in the Hawaiian economy, due to the shipping distance to viable markets, such as the West Coast of the contiguous U.S. The state's food exports include coffee, macadamia nuts, pineapple, livestock, sugarcane and honey.
By weight, honey bees may be the state's most valuable export. According to the Hawaii Agricultural Statistics Service, agricultural sales were million from diversified agriculture, million from pineapple, and million from sugarcane. Hawaii's relatively consistent climate has attracted the seed industry, which is able to test three generations of crops per year on the islands, compared with one or two on the mainland. Seeds yielded million in 2012, supporting 1,400 workers.
The state's unemployment rate was 3.2%. In 2009, the United States military spent billion in Hawaii, accounting for 18% of spending in the state for that year. Some 75,000 United States Department of Defense personnel live in Hawaii. According to a 2013 study by Phoenix Marketing International, Hawaii had the fourth-largest number of millionaires per capita in the United States, with a ratio of 7.2%.
Tax is collected by the Hawaii Department of Taxation.
Hawaii residents pay the most per person in state taxes in the United States. Millions of tourists pay general excise tax and hotel room tax.
The Hawaii Tax Foundation considers the state's tax burden too high, which it says contributes to higher prices and the perception of an unfriendly business climate.
State Senator Sam Slom says state taxes are comparatively higher than other states because the state government handles education, health care, and social services that are usually handled at a county or municipal level in most other states.
The cost of living in Hawaii, specifically Honolulu, is high compared to that of most major U.S. cities, although it is 6.7% lower than in New York City and 3.6% lower than in San Francisco. These numbers may not take into account some costs, such as increased travel costs for flights, additional shipping fees, and the loss of promotional participation opportunities for customers outside the contiguous U.S. While some online stores offer free shipping on orders to Hawaii, many merchants exclude Hawaii, Alaska, Puerto Rico and certain other U.S. territories.
Hawaiian Electric Industries, a privately owned company, provides 95% of the state's population with electricity, mostly from fossil-fuel power stations. Average electricity prices in October 2014 () were nearly three times the national average () and 80% higher than the second-highest state, Connecticut.
The median home value in Hawaii in the 2000 U.S. Census was , while the national median home value was . Hawaii home values were the highest of all states, including California with a median home value of . Research from the National Association of Realtors places the 2010 median sale price of a single family home in Honolulu, Hawaii, at and the U.S. median sales price at . The sale price of single family homes in Hawaii was the highest of any U.S. city in 2010, just above that of the Silicon Valley area of California ().
Hawaii's very high cost of living is the result of several interwoven factors of the global economy in addition to domestic U.S. government trade policy. Like other regions with desirable weather throughout the year, such as areas of California, Arizona and Florida, Hawaii's residents can be considered to be subject to a "Sunshine tax". This situation is further exacerbated by the natural factors of geography and world distribution that lead to higher prices for goods due to increased shipping costs, a problem which many island states and territories suffer from as well.
The higher costs to ship goods across an ocean may be further increased by the requirements of the Jones Act, which generally requires that goods be transported between places within the U.S., including between the mainland U.S. west coast and Hawaii, using only U.S.-owned, built, and crewed ships. Jones Act-compliant vessels are often more expensive to build and operate than foreign equivalents, which can drive up shipping costs. While the Jones Act does not affect transportation of goods to Hawaii directly from Asia, this type of trade is nonetheless not common; this is a result of other primarily economic reasons including additional costs associated with stopping over in Hawaii (e.g. pilot and port fees), the market size of Hawaii, and the economics of using ever-larger ships that cannot be handled in Hawaii for transoceanic voyages. Therefore, Hawaii relies on receiving most inbound goods on Jones Act-qualified vessels originating from the U.S. west coast, which may contribute to the increased cost of some consumer goods and therefore the overall cost of living.
Hawaiian consumers ultimately bear the expense of transporting goods imposed by the Jones Act. This law makes Hawaii less competitive than West Coast ports as a shopping destination for tourists from high-tax countries such as Japan, even though prices for Asian-manufactured goods should be cheaper because Hawaii is much closer to Asia than the mainland states are.
The aboriginal culture of Hawaii is Polynesian. Hawaii represents the northernmost extension of the vast Polynesian Triangle of the south and central Pacific Ocean. While traditional Hawaiian culture remains as vestiges in modern Hawaiian society, there are re-enactments of the ceremonies and traditions throughout the islands. Some of these cultural influences, including the popularity (in greatly modified form) of "lūau" and "hula", are strong enough to affect the wider United States.
The cuisine of Hawaii is a fusion of many foods brought by immigrants to the Hawaiian Islands, including the earliest Polynesians and Native Hawaiian cuisine, and American, Chinese, Filipino, Japanese, Korean, Polynesian, Puerto Rican, and Portuguese origins. Plant and animal food sources are imported from around the world for agricultural use in Hawaii. "Poi", a starch made by pounding taro, is one of the traditional foods of the islands. Many local restaurants serve the ubiquitous plate lunch, which features two scoops of rice, a simplified version of American macaroni salad, and a variety of toppings, including the hamburger patty, fried egg, and gravy of a "loco moco"; Japanese-style "tonkatsu"; or traditional lūau favorites such as "kālua" pork and "laulau". "Spam musubi" is an example of the fusion of ethnic cuisine that developed on the islands among the mix of immigrant groups and military personnel. In the 1990s, a group of chefs developed Hawaii regional cuisine as a contemporary fusion cuisine.
Some key customs and etiquette in Hawaii are as follows: when visiting a home, it is considered good manners to bring a small gift for one's host (for example, a dessert). Thus, parties are usually in the form of potlucks. Most locals take their shoes off before entering a home. It is customary for Hawaiian families, regardless of ethnicity, to hold a luau to celebrate a child's first birthday. It is also customary at Hawaiian weddings, especially at Filipino weddings, for the bride and groom to do a money dance (also called the pandanggo). Print media and local residents recommend that one refer to non-Hawaiians as "locals of Hawaii" or "people of Hawaii".
Hawaiian mythology comprises the legends, historical tales, and sayings of the ancient Hawaiian people. It is considered a variant of a more general Polynesian mythology that developed a unique character for several centuries before "circa" 1800. It is associated with the Hawaiian religion, which was officially suppressed in the 19th century but was kept alive by some practitioners to the modern day. Prominent figures and terms include Aumakua, the spirit of an ancestor or family god and Kāne, the highest of the four major Hawaiian deities.
Polynesian mythology is the oral traditions of the people of Polynesia, a grouping of Central and South Pacific Ocean island archipelagos in the Polynesian triangle together with the scattered cultures known as the Polynesian outliers. Polynesians speak languages that descend from a language reconstructed as Proto-Polynesian that was probably spoken in the area around Tonga and Samoa in around 1000 BCE.
Prior to the 15th century, Polynesian people migrated east to the Cook Islands, and from there to other island groups such as Tahiti and the Marquesas. Their descendants later discovered Rapa Nui and, later still, the Hawaiian Islands and New Zealand.
The Polynesian languages are part of the Austronesian language family. Many are close enough in terms of vocabulary and grammar to be mutually intelligible. There are also substantial cultural similarities between the various groups, especially in terms of social organization, childrearing, horticulture, building and textile technologies. Their mythologies in particular demonstrate local reworkings of commonly shared tales. The Polynesian cultures each have distinct but related oral traditions; legends or myths are traditionally considered to recount ancient history (the time of "pō") and the adventures of gods ("atua") and deified ancestors.
There are many Hawaiian state parks.
The literature of Hawaii is diverse and includes authors Kiana Davenport, Lois-Ann Yamanaka, and Kaui Hart Hemmings. Hawaiian magazines include "Hana Hou!", "Hawaii Business Magazine" and "Honolulu", among others.
The music of Hawaii includes traditional and popular styles, ranging from native Hawaiian folk music to modern rock and hip hop. Hawaii's musical contributions to the music of the United States are out of proportion to the state's small size.
Styles such as slack-key guitar are well known worldwide, while Hawaiian-tinged music is a frequent part of Hollywood soundtracks. Hawaii also made a major contribution to country music with the introduction of the steel guitar.
Traditional Hawaiian folk music is a major part of the state's musical heritage. The Hawaiian people have inhabited the islands for centuries and have retained much of their traditional musical knowledge. Their music is largely religious in nature, and includes chanting and dance music.
Hawaiian music has had an enormous impact on the music of other Polynesian islands; according to Peter Manuel, the influence of Hawaiian music has been a "unifying factor in the development of modern Pacific musics". Native Hawaiian musician and Hawaiian sovereignty activist Israel Kamakawiwoʻole, famous for his medley of "Somewhere Over the Rainbow/What a Wonderful World", was named "The Voice of Hawaii" by NPR in 2010 in its 50 great voices series.
Surfing has been a central part of Polynesian culture for centuries. Since the late 19th century, Hawaii has become a major site for surfers from around the world. Notable competitions include the Triple Crown of Surfing and The Eddie.
The only NCAA Division I team in Hawaii is the Hawaii Rainbow Warriors and Rainbow Wahine, which competes at the Big West Conference (major sports), Mountain West Conference (football) and Mountain Pacific Sports Federation (minor sports). There are three teams in NCAA Division II: Chaminade Silverswords, Hawaii Pacific Sharks and Hawaii-Hilo Vulcans, all of which compete at the Pacific West Conference.
Notable college sports events in Hawaii include the Maui Invitational Tournament, Diamond Head Classic (basketball) and Hawaii Bowl (football).
Notable professional teams include The Hawaiians, which played at the World Football League in 1974 and 1975; the Hawaii Islanders, a Triple-A minor league baseball team that played at the Pacific Coast League from 1961 to 1987; and Team Hawaii, a North American Soccer League team that played in 1977.
Hawaii has hosted the Sony Open in Hawaii golf tournament since 1965, the Tournament of Champions golf tournament since 1999, the Lotte Championship golf tournament since 2012, the Honolulu Marathon since 1973, the Ironman World Championship triathlon race since 1978, the Ultraman triathlon since 1983, the National Football League's Pro Bowl from 1980 to 2016, the 2000 FINA World Open Water Swimming Championships, and the 2008 Pan-Pacific Championship and 2012 Hawaiian Islands Invitational soccer tournaments.
Tourism is an important part of the Hawaiian economy. In 2003, according to state government data, there were more than 6.4 million visitors, with expenditures of over $10 billion, to the Hawaiian Islands. Due to the mild year-round weather, tourist travel is popular throughout the year. The major holidays are the most popular times for outsiders to visit, especially in the winter months. Substantial numbers of Japanese tourists still visit the islands but have now been surpassed by Chinese and Korean visitors, owing to the weak Japanese economy and the decline in the value of the yen. The average Japanese visitor stays only five days, while other Asian visitors stay over 9.5 days and spend 25% more.
Hawaii hosts numerous cultural events. The annual Merrie Monarch Festival is an international Hula competition. The Hawaii International Film Festival is the premier film festival for Pacific rim cinema. Honolulu hosts the state's long-running LGBT film festival, the Rainbow Film Festival.
Hawaii's health care system insures 92% of residents. Under the state's plan, businesses are required to provide insurance to employees who work more than twenty hours per week. Heavy regulation of insurance companies helps reduce the cost to employers. Due in part to heavy emphasis on preventive care, Hawaiians require hospital treatment less frequently than the rest of the United States, while total health care expenses measured as a percentage of state GDP are substantially lower. Proponents of universal health care elsewhere in the U.S. sometimes use Hawaii as a model for proposed federal and state health care plans.
Hawaii has the only school system within the U.S. that is unified statewide. Policy decisions are made by the fourteen-member state Board of Education, which sets policy and hires the superintendent of schools, who oversees the state Department of Education. The Department of Education is divided into seven districts; four on Oahu and one for each of the other three counties. The main rationale for centralization is to combat inequalities between highly populated Oahu and the more rural Neighbor Islands, and between lower-income and more affluent areas.
Public elementary, middle and high school test scores in Hawaii are below national averages on tests mandated under the No Child Left Behind Act. The Hawaii Board of Education requires all eligible students to take these tests and report all student test scores. This may have skewed the results reported in August 2005, which showed that of 282 schools across the state, 185 failed to reach federal minimum performance standards in mathematics and reading. The ACT college placement tests show that in 2005, seniors scored slightly above the national average (21.9 compared with 20.9), but in the widely accepted SAT examinations, Hawaii's college-bound seniors tend to score below the national average in all categories except mathematics.
Hawaii has the highest rates of private school attendance in the nation. During the 2011–2012 school year, Hawaii public and charter schools had an enrollment of 181,213, while private schools had 37,695. Private schools educated over 17% of students in Hawaii that school year, nearly three times the approximate national average of 6%. The state has four of the largest independent schools in the nation: Iolani School, Kamehameha Schools, Mid-Pacific Institute and Punahou School. Pacific Buddhist Academy, the second Buddhist high school in the U.S. and first such school in Hawaii, was founded in 2003. The first native-controlled public charter school was the Kanu O Ka Aina New Century Charter School.
Independent and charter schools can select their students, while the public schools are open to all students in their district. The Kamehameha Schools are the only schools in the U.S. that openly grant admission to students based on ancestry; collectively, they are one of the wealthiest schools in the United States, if not the world, having over eleven billion US dollars in estate assets. In 2005, Kamehameha enrolled 5,398 students, 8.4% of the Native Hawaiian children in the state.
The largest institution of higher learning in Hawaii is the University of Hawaii System, which consists of the research university at Mānoa, two comprehensive campuses at Hilo and West Oahu, and seven community colleges. Private universities include Brigham Young University–Hawaii, Chaminade University of Honolulu, Hawaii Pacific University, and Wayland Baptist University. Saint Stephen Diocesan Center is a seminary of the Roman Catholic Diocese of Honolulu. Kona hosts the University of the Nations, which is not an accredited university.
A system of state highways encircles each main island. Only Oahu has federal highways, and is the only area outside the contiguous 48 states to have signed Interstate highways. Narrow, winding roads and congestion in populated places can slow traffic. Each major island has a public bus system.
Honolulu International Airport (IATA:HNL), which shares runways with the adjacent Hickam Field (IATA:HIK), is the major commercial aviation hub of Hawaii. The commercial aviation airport offers intercontinental service to North America, Asia, Australia and Oceania. Hawaiian Airlines, Mokulele Airlines and go! use jets to provide services between the large airports in Honolulu, Līhue, Kahului, Kona and Hilo. Island Air and Pacific Wings serve smaller airports. These airlines also provide air freight services between the islands. On May 30, 2017, the airport was officially renamed as the Daniel K. Inouye International Airport (HNL), after U.S. Senator Daniel K. Inouye.
Until air passenger services began in the 1920s, private boats were the sole means of traveling between the islands. Seaflite operated hydrofoils between the major islands in the mid-1970s.
The Hawaii Superferry operated between Oahu and Maui between December 2007 and March 2009, with additional routes planned for other islands. Protests and legal problems over environmental impact statements ended the service, though the company operating Superferry has expressed a wish to recommence ferry services in the future. A passenger ferry service operates in Maui County between Lanai and Maui; it does not carry vehicles. A passenger ferry to Molokai ended in 2016. Norwegian Cruise Line and Princess Cruises provide passenger cruise ship services between the larger islands.
At one time Hawaii had a network of railroads on each of the larger islands that transported farm commodities and passengers. Most were narrow-gauge systems, though some of the smaller islands used other gauges. By far the largest railroad was the Oahu Railway and Land Company (OR&L), which ran lines from Honolulu across the western and northern parts of Oahu.
The OR&L was important for moving troops and goods during World War II. Traffic on this line was busy enough for signals to be used to facilitate movement of trains and to require wigwag signals at some railroad crossings for the protection of motorists. The main line was officially abandoned in 1947, although part of it was bought by the U.S. Navy and operated until 1970. A portion of the track remains, and preservationists occasionally run trains over it. The Honolulu High-Capacity Transit Corridor Project aims to add elevated passenger rail on Oahu to relieve highway congestion.
The movement of the Hawaiian royal family from Hawaii Island to Maui, and subsequently to Oahu, explains the modern-day distribution of population centers. Kamehameha III chose the largest city, Honolulu, as his capital because of its natural harbor—the present-day Honolulu Harbor. Now the state capital, Honolulu is located along the southeast coast of Oahu. The previous capital was Lahaina, Maui, and before that Kailua-Kona, Hawaii. Some major towns are Hilo, Kāneohe, Kailua, Pearl City, Waipahu, Kahului, Kailua-Kona, Kīhei, and Līhue.
Hawaii comprises five counties: the City and County of Honolulu, Hawaii County, Maui County, Kauai County, and Kalawao County.
Hawaii has the fewest local governments among U.S. states. Unique to this state is the lack of municipal governments. All local governments are generally administered at the county level. The only incorporated area in the state is Honolulu County, a consolidated city–county that governs the entire island of Oahu. County executives are referred to as mayors; these are the Mayor of Hawaii County, Mayor of Honolulu, Mayor of Kauai, and the Mayor of Maui. The mayors are all elected in nonpartisan elections. Kalawao County has no elected government, and as mentioned above there are no local school districts and instead all local public education is administered at the state level by the Hawaii Department of Education. The remaining local governments are special districts.
The state government of Hawaii is modeled after the federal government with adaptations originating from the kingdom era of Hawaiian history. As codified in the Constitution of Hawaii, there are three branches of government: executive, legislative and judicial. The executive branch is led by the Governor of Hawaii, who is assisted by the Lieutenant Governor of Hawaii, both of whom are elected on the same ticket. The governor is the only state public official elected statewide; all others are appointed by the governor. The lieutenant governor acts as the Secretary of State. The governor and lieutenant governor oversee twenty agencies and departments from offices in the State Capitol. The official residence of the governor is Washington Place.
The legislative branch consists of the bicameral Hawaii State Legislature, which is composed of the 51-member Hawaii House of Representatives led by the Speaker of the House, and the 25-member Hawaii Senate led by the President of the Senate. The Legislature meets at the State Capitol. The unified judicial branch of Hawaii is the Hawaii State Judiciary. The state's highest court is the Supreme Court of Hawaii, which uses Aliiōlani Hale as its chambers.
Hawaii is represented in the United States Congress by two senators and two representatives; all four seats are held by Democrats. Former representative Ed Case was elected in 2018 to the 1st congressional district. Tulsi Gabbard represents the 2nd congressional district, which covers the rest of the state and is largely rural and semi-rural.
Brian Schatz is the senior United States Senator from Hawaii. He was appointed to the office on December 26, 2012, by Governor Neil Abercrombie, following the death of former senator Daniel Inouye. The state's junior senator is Mazie Hirono, the former representative from the second congressional district. Hirono is the first female Asian American senator and the first Buddhist senator. Hawaii incurred the biggest seniority shift between the 112th and 113th Congresses. The state went from a delegation consisting of senators who were first and twenty-first in seniority to their respective replacements, relative newcomers Schatz and Hirono.
Federal officials in Hawaii are based at the Prince Kūhiō Federal Building near the Aloha Tower and Honolulu Harbor. The Federal Bureau of Investigation, Internal Revenue Service and the Secret Service maintain their offices there; the building is also the site of the federal District Court for the District of Hawaii and the United States Attorney for the District of Hawaii.
Since gaining statehood and participating in its first election in 1960, Hawaii has supported Democrats in all but two presidential elections: 1972 and 1984, both of which were landslide reelection victories for Republicans Richard Nixon and Ronald Reagan respectively. In Hawaii's statehood tenure, only Minnesota has supported Republican candidates fewer times in presidential elections. The 2016 Cook Partisan Voting Index ranks Hawaii as the most heavily Democratic state in the nation.
Hawaii has not elected a Republican to represent the state in the U.S. Senate since Hiram Fong in 1970; since 1977, both of the state's U.S. Senators have been Democrats.
In 2004, John Kerry won the state's four electoral votes by a margin of nine percentage points with 54% of the vote. Every county supported the Democratic candidate. In 1964, favorite-son candidate Senator Hiram Fong of Hawaii sought the Republican presidential nomination, while Patsy Mink ran in the Oregon primary in 1972.
Honolulu-born Barack Obama, then serving as United States Senator from Illinois, was elected the 44th President of the United States on November 4, 2008 and was re-elected for a second term on November 6, 2012. Obama had won the Hawaii Democratic caucus on February 19, 2008, with 76% of the vote. He was the third Hawaii-born candidate to seek the nomination of a major party and the first presidential nominee from Hawaii.
Hawaii is the only state in the United States that does not maintain a separate, state-wide police force. Instead, state law enforcement responsibilities are taken on by the municipal police agencies of the four main islands. Forensic services for all agencies in the state are provided by the Honolulu Police Department.
While Hawaii is internationally recognized as a state of the United States while also being broadly accepted as such in mainstream understanding, the legality of this status has been questioned in U.S. District Court, the U.N., and other international forums. Domestically, the debate is a topic covered in the Kamehameha Schools curriculum, and in classes at the University of Hawaiʻi at Mānoa.
Political organizations seeking some form of sovereignty for Hawaii have been active since the late 19th century. Generally, their focus is on self-determination and self-governance, either for Hawaii as an independent nation (in many proposals, for "Hawaiian nationals" descended from subjects of the Hawaiian Kingdom or declaring themselves as such by choice), or for people of whole or part native Hawaiian ancestry in an indigenous "nation to nation" relationship akin to tribal sovereignty, with U.S. federal recognition of Native Hawaiians. The pro-federal recognition Akaka Bill drew substantial opposition among Hawaiian residents in the 2000s. Opponents of the tribal approach argue it is not a legitimate path to Hawaiian nationhood; they also argue that the U.S. government should not be involved in re-establishing Hawaiian sovereignty.
The Hawaiian sovereignty movement views the overthrow of the Kingdom of Hawaii in 1893 as illegal, and views the subsequent annexation of Hawaii by the United States as illegal; the movement seeks some form of greater autonomy for Hawaii, such as free association or independence from the United States.
Some groups also advocate some form of redress from the United States for the 1893 overthrow of Queen Liliuokalani, and for what is described as a prolonged military occupation beginning with the 1898 annexation. The Apology Resolution passed by US Congress in 1993 is cited as a major impetus by the movement for Hawaiian sovereignty. The sovereignty movement considers Hawaii to be an illegally occupied nation.
Hearse
A hearse is a vehicle used to carry the dead in a coffin/casket. They range from deliberately anonymous vehicles to very formal heavily decorated vehicles.
In the funeral trade of some countries hearses are called funeral coaches.
The name is derived, through the French "herse", from a Latin word meaning a harrow. The funeral hearse was originally a wooden or metal framework, which stood over the bier or coffin and supported the pall. It was provided with numerous spikes to hold burning candles, and, owing to the resemblance of these spikes to the teeth of a harrow, was called a hearse. Later on, the word was applied, not only to the construction above the coffin, but to any receptacle in which the coffin was placed. Thus from about 1650 it came to denote the vehicle on which the dead are carried to the grave.
Hearses were originally hand-drawn then horse-drawn after the decoration and weight of the hearse increased. The first electric motorized hearses were introduced to the United States in the early 1900s. Petrol-powered hearses began to be produced from 1907 and, after slow initial uptake due to their high cost, became widely accepted in the 1920s. The vast majority of hearses since then have been based on larger, more powerful car chassis, generally retaining the front end up to and possibly including the front doors but with custom bodywork to the rear to contain the coffin.
A First Call vehicle is used to pick up the remains of a recently deceased person, and transport that person to the funeral home or morgue for preparation.
A few big cities provided special rail lines, funeral trolley cars, or subway cars to carry bodies and mourners to remote cemeteries; such services operated in Sydney, NSW and London, where funeral tram services were common. Chicago, Illinois operated three different funeral trolley cars over the elevated tracks in downtown Chicago to outlying cemeteries in the western suburbs. A special funeral bureau handled the funeral trains, which sometimes ran three to four times a week over the "L".
The motorcycle hearse has become popular and is often used during the funeral of motorcycle enthusiasts.
This type of hearse is either a motorcycle with a special sidecar built to carry a casket or an urn at the side of the rider, or it is a trike that carries the casket behind the rider.
Two styles of formal hearse bodywork are common. One style has opaque rear panels so the coffin is barely glimpsed. This American style is fitted with a heavily padded leather or vinyl roof and each side decorated with large mock landau bars resembling the braces used for the folding leather tops on some horse-drawn carriages. The other has narrow pillars and large windows revealing the coffin.
Since the working life of a hearse is generally one of light duty and short, sedate drives, hearses often remain serviceable for a long time and hearses 30 years old or more may still be in service. Due to the costs of owning an expensive custom vehicle that sits idle "80 to 90 percent of the week", individual funeral homes reduce costs by renting or utilizing a shared motor pool.
Perhaps owing to the morbid associations of the hearse, its luxurious accommodations for the driver, or both, the hearse has a number of enthusiasts who own and drive retired hearses. There are several hearse clubs.
Usually more luxurious automobile brands are used as a base for funeral cars; since the mid-20th century, the vast majority of hearses in the United States and Canada have been Cadillacs and, less frequently, Lincolns.
The Cadillac Commercial Chassis was a longer and strengthened version of the long-wheelbase Fleetwood limousine frame, built to carry the extra weight of the bodywork, rear deck and cargo. The rear of the Cadillac commercial chassis was considerably lower than the passenger car frame, thereby lowering the rear deck height for ease of loading and unloading. Cadillac hearses were shipped as incomplete cars to coachbuilders for final assembly. Since the late 1990s, most Cadillac-based funeral cars were constructed from modified Cadillac sedans, until late 2019, when General Motors discontinued the XTS chassis; new Cadillac hearses are now built on the XT5 SUV chassis.
The fleet division of Ford Motor Company sells a Lincoln Town Car with a special "hearse package" strictly to coachbuilders. Shipped without rear seat, rear interior trim, rear window or decklid, the hearse package also features upgraded suspension, brakes, charging system and tires. This was replaced with the Lincoln MKT, which has also been discontinued.
The limousine style of hearse is more popular in the United States. It is common practice in the US for the windows to be curtained, while in other countries the windows are normally left unobscured.
Until the 1970s, it was common for many hearses to also be used as ambulances, due to the large cargo capacity in the rear of the vehicle. These vehicles were called "combination cars" and were especially used in small towns and rural areas. Car-based ambulances and combination coaches were unable to meet stricter Federal specifications for such vehicles and were discontinued after 1979.
Coachbuilders modify Mercedes-Benz, Jaguar, Opel, Ford, Vauxhall Motors and Volvo products into hearses. Some second-hand Rolls-Royce cars have traditionally been used as hearses, though the high cost of newer models is generally considered prohibitive.
In Japan, hearses come in two styles: the "Foreign" style, which is similar in build and style to an American hearse, and the "Japanese" style, in which the rear area of the vehicle is modified to resemble a small, ornate Buddhist temple.
The Japanese-style hearse generally requires the rear of the vehicle to be extensively altered; commonly, the rear roof is cut away from the front windows back and all interior parts are removed from the rear as well. The ornate Buddhist-style rear area, generally constructed of wood and in which the casket or urn is placed, is built on top of this empty cavity and most often is wider than the base of the vehicle, so that it sticks out on the sides, over the rear body panels. Popular bases for these hearses are large sedans, minivans and pickup trucks.
The ornaments on a Japanese-style hearse vary by region. Nagoya style decorates both the upper and lower halves of the car body.
Kansai style has relatively modest decorations, which are left unpainted.
Kanazawa style is known for having a red body (other styles mostly have black bodies) with gilded ornaments.
Tokyo style, found throughout the rest of Japan, features painted or gilded ornaments on the upper half of the body.
"Foreign" style hearses are mostly similar in appearance to their US counterparts, although their exterior dimensions and interiors reflect the Japanese preference for smaller, less ornate caskets (this in light of the national preference for cremation). This means that, in contrast to American hearses, the rear quarter panels require less, and sometimes no, alteration. These are generally built from station wagons such as the Nissan Stagea, or from executive sedans such as the Toyota Celsior (Lexus LS in the US) and Nissan Cima (Infiniti Q45 in the US). American market vehicles such as the Lincoln Town Car and Cadillac DeVille, which are otherwise fairly uncommon in Japan, are often converted to hearses in both styles.
In Hong Kong, light goods vehicles of Isuzu, Volkswagen and Ford are used as hearses by most of the privately operated funeral homes.
In Singapore, most hearses are built upon commercial van chassis, such as the Toyota Hiace and the Nissan Urvan, while grand or traditional Chinese and Indian hearses are built on a lorry chassis.
Amongst enthusiasts, the 1959 Cadillac Miller-Meteor hearse is considered one of the most desirable, due to its especially ornate styling and appearances in feature films, notably an ambulance version (Ecto-1) in the motion picture "Ghostbusters".
In the 1971 film "Harold and Maude" the character Harold, played by Bud Cort, drives two hearses: originally a 1959 Cadillac Superior 3-way; and then later a custom hearse he makes from a 1971 Jaguar XK-E 4.2 Series II. The Cadillac hearse used in the film is now privately owned in central California and is preserved, looking essentially identical to the way it did in the film. Only one Jaguar "hearse" was built and was destroyed as part of the film's storyline. Several "Harold and Maude" fans have since built similar hearses from E-Types and photos of them can be found online. Jane Goldman, wife of British TV and radio personality Jonathan Ross, owns a similar style "hearse" built from a Jaguar XK8 convertible.
The Rogues prowl around in a graffitied 1955 Cadillac Hearse in the film "The Warriors".
Musician Neil Young's first car was a hearse, which was used to transport the band's equipment.
Celebrity hearse enthusiasts include rock singer Neil Young and three-time NASCAR Sprint Cup champion Tony Stewart, who had his hearse customised for a television show. Sam the Sham of the Pharaohs (known for "Wooly Bully" and "Lil' Red Riding Hood") was known for transporting all his equipment in a 1952 Packard hearse.
In the 2014 film "Dumb and Dumber To", Harry (Jeff Daniels) and Lloyd (Jim Carrey) are lent a customized 1972 Cadillac Miller-Meteor hearse by Fraida (Kathleen Turner) to drive to Oxford, Maryland.
The HBO television show "Six Feet Under", which dealt with death every week, premiered with the Fisher family patriarch Nathaniel, a funeral director, killed in an accident involving his new hearse. His daughter Claire also owned and drove a hearse.
In the Canadian television program "", character Eli Goldsworthy, a 'death obsessed' 16-year-old, drives a 1960s era vintage hearse, affectionately nicknamed Morty. Cleve Hall, of the Syfy television show "Monster Man", drives a 1980 Superior, with added coach lights on each side, in the 1st season of the show. He now drives a 1963 Miller Meteor named "Lucy".
| https://en.wikipedia.org/wiki?curid=13274
Hungary
Hungary is a country in Central Europe. Occupying much of the Carpathian Basin, it borders Slovakia to the north, Ukraine to the northeast, Romania to the east and southeast, Serbia to the south, Croatia and Slovenia to the southwest, and Austria to the west. With about 10 million inhabitants, Hungary is a medium-sized member state of the European Union. The official language is Hungarian, which is the most widely spoken Uralic language in the world, and among the few non-Indo-European languages to be widely spoken in Europe. Hungary's capital and largest city is Budapest; other major urban areas include Debrecen, Szeged, Miskolc, Pécs, and Győr.
The territory of modern Hungary was for centuries inhabited by a succession of peoples, including Celts, Romans, Germanic tribes, Huns, West Slavs and the Avars. The foundations of the Hungarian state were established in the late ninth century AD by the Hungarian grand prince Árpád following the conquest of the Carpathian Basin. His great-grandson Stephen I ascended the throne in 1000, converting his realm to a Christian kingdom. By the 12th century, Hungary became a regional power, reaching its cultural and political height in the 15th century. Following the Battle of Mohács in 1526, Hungary was partially occupied by the Ottoman Empire (1541–1699). It came under Habsburg rule at the turn of the 18th century, and later joined Austria to form the Austro–Hungarian Empire, a major European power.
The Austro-Hungarian Empire collapsed after World War I, and the subsequent Treaty of Trianon established Hungary's current borders, resulting in the loss of 71% of its territory, 58% of its population, and 32% of ethnic Hungarians. Following the tumultuous interwar period, Hungary joined the Axis Powers in World War II, suffering significant damage and casualties. Hungary became a satellite state of the Soviet Union, which contributed to the establishment of a socialist republic spanning four decades (1949–1989). The country gained widespread international attention as a result of its 1956 revolution and the seminal opening of its previously-restricted border with Austria in 1989, which accelerated the collapse of the Eastern Bloc. On 23 October 1989, Hungary became a democratic parliamentary republic.
Hungary is an OECD high-income economy, and has the world's 54th-largest economy by nominal GDP, and the 53rd-largest by PPP. It ranks 45th on the Human Development Index, owing in large part to its social security system, universal health care, and tuition-free secondary education. Hungary's rich cultural history includes significant contributions to the arts, music, literature, sports, science and technology. It is the thirteenth-most popular tourist destination in Europe, attracting 15.8 million international tourists in 2017, owing to attractions such as the largest thermal water cave system in the world, second largest thermal lake, the largest lake in Central Europe and the largest natural grasslands in Europe. Hungary's cultural, historical, and academic prominence classify it as a middle power in global affairs. Hungary joined the European Union in 2004 and has been part of the Schengen Area since 2007. It is a member of numerous international organizations, including the United Nations, NATO, WTO, World Bank, IIB, the AIIB, the Council of Europe, and the Visegrád Group.
The "H" in the name of Hungary (and Latin "Hungaria") most likely reflects early historical associations with the Huns, who had settled Hungary prior to the Avars. The rest of the word comes from the Latinized form of Byzantine Greek "Oungroi" (Οὔγγροι). The Greek name was borrowed from Old Bulgarian "ągrinŭ", in turn borrowed from Oghur-Turkic "Onogur" ('ten [tribes of the] Ogurs'). "Onogur" was the collective name for the tribes who later joined the Bulgar tribal confederacy that ruled the eastern parts of Hungary after the Avars.
The Hungarian endonym is "Magyarország", composed of "magyar" ('Hungarian') and "ország" ('country'). The name "Magyar", which refers to the people of the country, more accurately reflects the name of the country in some other languages such as Turkish, Persian and other languages as "Magyaristan" or "Land of Magyars" or similar. The word "magyar" is taken from the name of one of the seven major semi-nomadic Hungarian tribes, "magyeri". The first element "magy" is likely from Proto-Ugric *"mäńć-" 'man, person', also found in the name of the Mansi people ("mäńćī, mańśi, måńś"). The second element "eri", 'man, men, lineage', survives in Hungarian "férj" 'husband', and is cognate with Mari "erge" 'son', Finnish archaic "yrkä" 'young man'.
The Roman Empire conquered the territory west of the Danube River between 35 and 9 BC. From 9 BC to the end of the 4th century AD, Pannonia, the western part of the Carpathian Basin, was part of the Roman Empire. After the Western Roman Empire collapsed in the 5th century AD under the stress of the migration of Germanic tribes and Carpian pressure, the Migration Period continued to bring many invaders into Central Europe, beginning with the Hunnic Empire (c. 370–469). The most powerful ruler of the Hunnic Empire was Attila the Hun (434–453) who later became a central figure in Hungarian mythology. After the disintegration of the Hunnic Empire, the Gepids, an Eastern Germanic tribe, who had been vassalized by the Huns, established their own kingdom in the Carpathian Basin. Other groups which reached the Carpathian Basin in the Migration Period were the Goths, Vandals, Lombards, and Slavs. In the 560s, the Avars founded the Avar Khaganate, a state that maintained supremacy in the region for more than two centuries. The Franks under Charlemagne defeated the Avars in a series of campaigns during the 790s. By the mid-9th century, the Balaton Principality, also known as Lower Pannonia, was established west of the Danube river as part of the Frankish March of Pannonia. The First Bulgarian Empire conquered the lands east of the Danube river and took over rule of the local Slavic tribes and remnants of the Avars.
The freshly unified Hungarians led by Árpád (by tradition a descendant of Attila), settled in the Carpathian Basin starting in 895. According to the Finno-Ugrian theory, they originated from an ancient Uralic-speaking population that formerly inhabited the forested area between the Volga River and the Ural Mountains.
As a federation of united tribes, Hungary was established in 895, some 50 years after the division of the Carolingian Empire at the Treaty of Verdun in 843, before the unification of the Anglo-Saxon kingdoms. Initially, the rising Principality of Hungary ("Western Tourkia" in medieval Greek sources) was a state created by a semi-nomadic people. It accomplished an enormous transformation into a Christian realm during the 10th century.
This state was well-functioning and the nation's military power allowed the Hungarians to conduct successful fierce campaigns and raids, from Constantinople to as far as today's Spain. The Hungarians defeated no fewer than three major East Frankish imperial armies between 907 and 910. A later defeat at the Battle of Lechfeld in 955 signaled a provisory end to most campaigns on foreign territories, at least towards the West.
The year 972 marked the date when the ruling prince Géza of the Árpád dynasty officially started to integrate Hungary into Christian Western Europe. His first-born son, Saint Stephen I, became the first King of Hungary after defeating his pagan uncle Koppány, who also claimed the throne. Under Stephen, Hungary was recognized as a Catholic Apostolic Kingdom. Applying to Pope Sylvester II, Stephen received the insignia of royalty (probably including a part of the Holy Crown of Hungary, currently kept in the Hungarian Parliament) from the papacy.
By 1006, Stephen had consolidated his power and started sweeping reforms to convert Hungary into a Western feudal state. The country switched to using the Latin language, and until as late as 1844, Latin remained the official language of Hungary. Around this time, Hungary began to become a powerful kingdom. Ladislaus I extended Hungary's frontier in Transylvania and invaded Croatia in 1091. The Croatian campaign culminated in the Battle of Gvozd Mountain in 1097 and a personal union of Croatia and Hungary in 1102, ruled by Coloman (Könyves Kálmán).
The most powerful and wealthiest king of the Árpád dynasty was Béla III, who disposed of the equivalent of 23 tonnes of pure silver a year. This exceeded the income of the French king (estimated at 17 tonnes) and was double the receipts of the English Crown.
Andrew II issued the Diploma Andreanum, which secured the special privileges of the Transylvanian Saxons and is considered the first autonomy law in the world. He led the Fifth Crusade to the Holy Land in 1217, raising the largest royal army in the history of the Crusades. His Golden Bull of 1222 was the first constitution in continental Europe. The lesser nobles also began to present Andrew with grievances, a practice that evolved into the institution of the parliament ("parlamentum publicum").
In 1241–1242, the kingdom received a major blow with the Mongol (Tatar) invasion. Up to half of Hungary's then population of 2,000,000 were victims of the invasion. King Béla IV let Cumans and Jassic people into the country, who were fleeing the Mongols. Over the centuries, they were fully assimilated into the Hungarian population.
As a consequence, after the Mongols retreated, King Béla ordered the construction of hundreds of stone castles and fortifications, to defend against a possible second Mongol invasion. The Mongols returned to Hungary in 1285, but the newly built stone-castle systems and new tactics (using a higher proportion of heavily armed knights) stopped them. The invading Mongol force was defeated near Pest by the royal army of Ladislaus IV of Hungary. As with later invasions, it was repelled handily, the Mongols losing much of their invading force.
The Kingdom of Hungary reached one of its greatest extents during the Árpádian kings, yet royal power was weakened at the end of their rule in 1301. After a destructive period of interregnum (1301–1308), the first Angevin king, Charles I of Hungary – a bilineal descendant of the Árpád dynasty – successfully restored royal power, and defeated oligarch rivals, the so-called "little kings". The second Angevin Hungarian king, Louis the Great (1342–1382), led many successful military campaigns from Lithuania to Southern Italy (Kingdom of Naples), and was also King of Poland from 1370. After King Louis died without a male heir, the country was stabilized only when Sigismund of Luxembourg (1387–1437) succeeded to the throne, who in 1433 also became Holy Roman Emperor. Sigismund was also (in several ways) a bilineal descendant of the Árpád dynasty.
The first Hungarian Bible translation was completed in 1439. For half a year in 1437, there was an antifeudal and anticlerical peasant revolt in Transylvania, the Budai Nagy Antal Revolt, which was strongly influenced by Hussite ideas.
From a small noble family in Transylvania, John Hunyadi grew to become one of the country's most powerful lords, thanks to his outstanding capabilities as a mercenary commander. He was elected governor then regent. He was a successful crusader against the Ottoman Turks, one of his greatest victories being the Siege of Belgrade in 1456.
The last strong king of medieval Hungary was the Renaissance king Matthias Corvinus (1458–1490), son of John Hunyadi. His election was the first time that a member of the nobility mounted to the Hungarian royal throne without dynastic background. He was a successful military leader and an enlightened patron of the arts and learning. His library, the Bibliotheca Corviniana, was Europe's greatest collection of historical chronicles, philosophic and scientific works in the 15th century, and second only in size to the Vatican Library. Items from the Bibliotheca Corviniana were inscribed on UNESCO's Memory of the World Register in 2005.
The serfs and common people considered him a just ruler because he protected them from excessive demands and other abuses by the magnates. Under his rule, in 1479, the Hungarian army destroyed the Ottoman and Wallachian troops at the Battle of Breadfield. Abroad, he defeated the Polish and German imperial armies of Frederick at Breslau (Wrocław). Matthias' mercenary standing army, the Black Army of Hungary, was unusually large for its time, and it conquered parts of Austria, including Vienna (1485), and parts of Bohemia.
King Matthias died without lawful sons, and the Hungarian magnates procured the accession of the Pole Vladislaus II (1490–1516), supposedly because of his weak influence on Hungarian aristocracy. Hungary's international role declined, its political stability shaken, and social progress was deadlocked. In 1514, the weakened old King Vladislaus II faced a major peasant rebellion led by György Dózsa, which was ruthlessly crushed by the nobles, led by John Zápolya.
The resulting degradation of order paved the way for Ottoman pre-eminence. In 1521, the strongest Hungarian fortress in the South, Nándorfehérvár (today's Belgrade, Serbia), fell to the Turks. The early appearance of Protestantism further worsened internal relations in the country.
After some 150 years of wars with the Hungarians and other states, the Ottomans gained a decisive victory over the Hungarian army at the Battle of Mohács in 1526, where King Louis II died while fleeing. Amid political chaos, the divided Hungarian nobility elected two kings simultaneously, John Zápolya and Ferdinand I of the Habsburg dynasty. With the conquest of Buda by the Turks in 1541, Hungary was divided into three parts and remained so until the end of the 17th century. The north-western part, termed as Royal Hungary, was annexed by the Habsburgs who ruled as Kings of Hungary. The eastern part of the kingdom became independent as the Principality of Transylvania, under Ottoman (and later Habsburg) suzerainty. The remaining central area, including the capital Buda, was known as the Pashalik of Buda.
The vast majority of the seventeen to nineteen thousand Ottoman soldiers in service in the Ottoman fortresses in the territory of Hungary were Orthodox and Muslim Balkan Slavs rather than ethnic Turkish people. Orthodox Southern Slavs also served as akinjis and other light troops intended for pillaging in the territory of present-day Hungary. In 1686, the Holy League's army, containing over 74,000 men from various nations, reconquered Buda from the Turks. After further crushing defeats of the Ottomans in the next few years, the entire Kingdom of Hungary was removed from Ottoman rule by 1718. The last raid into Hungary by the Ottomans' Crimean Tatar vassals took place in 1717. The constrained Habsburg Counter-Reformation efforts in the 17th century reconverted the majority of the kingdom to Catholicism. The ethnic composition of Hungary was fundamentally changed as a consequence of the prolonged warfare with the Turks. A large part of the country became devastated, population growth was stunted, and many smaller settlements perished. The Austrian-Habsburg government settled large groups of Serbs and other Slavs in the depopulated south, and settled Germans (called Danube Swabians) in various areas, but Hungarians were not allowed to settle or re-settle in the south of the Great Plain.
Between 1703 and 1711, there was a large-scale uprising led by Francis II Rákóczi, who after the dethronement of the Habsburgs in 1707 at the Diet of Ónod, took power provisionally as the Ruling Prince of Hungary for the wartime period, but refused the Hungarian Crown and the title "King". The uprisings lasted for years. The Hungarian Kuruc army, although taking over most of the country, lost the main battle at Trencsén (1708). Three years later, because of the growing desertion, defeatism and low morale, the Kuruc forces finally surrendered.
During the Napoleonic Wars and afterwards, the Hungarian Diet had not convened for decades. In the 1820s, the Emperor was forced to convene the Diet, which marked the beginning of a Reform Period (1825–1848). Count István Széchenyi, one of the most prominent statesmen of the country, recognized the urgent need for modernization, and his message got through. The Hungarian Parliament was reconvened in 1825 to handle financial needs. A liberal party emerged and focused on providing for the peasantry. Lajos Kossuth, a famous journalist at that time, emerged as leader of the lower gentry in the Parliament. A remarkable upswing started as the nation concentrated its forces on modernization even though the Habsburg monarchs obstructed all important liberal laws relating to civil and political rights and economic reforms. Many reformers (Lajos Kossuth, Mihály Táncsics) were imprisoned by the authorities.
On 15 March 1848, mass demonstrations in Pest and Buda enabled Hungarian reformists to push through a list of 12 demands. Under governor and president Lajos Kossuth and the first Prime Minister, Lajos Batthyány, the House of Habsburg was dethroned.
The Habsburg ruler and his advisors skillfully manipulated the Croatian, Serbian and Romanian peasantry, led by priests and officers firmly loyal to the Habsburgs, and induced them to rebel against the Hungarian government, though the Hungarians were supported by the vast majority of the Slovak, German and Rusyn nationalities and by all the Jews of the kingdom, as well as by a large number of Polish, Austrian and Italian volunteers. In July 1849 the Hungarian Parliament proclaimed and enacted the first laws of ethnic and minority rights in the world. Many members of the nationalities gained the coveted highest positions within the Hungarian Army, like General János Damjanich, an ethnic Serb who became a Hungarian national hero through his command of the 3rd Hungarian Army Corps, or Józef Bem, who was Polish and also became a national hero in Hungary. The Hungarian forces ("Honvédség") defeated Austrian armies. To counter the successes of the Hungarian revolutionary army, Habsburg Emperor Franz Joseph I asked for help from the "Gendarme of Europe", Tsar Nicholas I, whose Russian armies invaded Hungary. This forced Artúr Görgey to surrender in August 1849. The leader of the Austrian army, Julius Jacob von Haynau, became governor of Hungary for a few months, and ordered the execution of the 13 Martyrs of Arad, leaders of the Hungarian army, and Prime Minister Batthyány in October 1849. Lajos Kossuth escaped into exile. Following the war of 1848–1849, the whole country was in "passive resistance".
Because of external and internal problems, reforms seemed inevitable and major military defeats of Austria forced the Habsburgs to negotiate the Austro-Hungarian Compromise of 1867, by which the dual Monarchy of Austria–Hungary was formed. This Empire had the second largest area in Europe (after the Russian Empire), and it was the third most populous (after Russia and the German Empire). The two realms were governed separately by two parliaments from two capital cities, with a common monarch and common external and military policies. Economically, the empire was a customs union. The old Hungarian Constitution was restored, and Franz Joseph I was crowned as King of Hungary. The era witnessed impressive economic development. The formerly backward Hungarian economy became relatively modern and industrialized by the turn of the 20th century, although agriculture remained dominant until 1890. In 1873, the old capital Buda and Óbuda were officially united with Pest, thus creating the new metropolis of Budapest. Many of the state institutions and the modern administrative system of Hungary were established during this period.
After the assassination of Archduke Franz Ferdinand in Sarajevo, the Hungarian prime minister István Tisza and his cabinet tried to avoid the outbreak and escalation of a war in Europe, but their diplomatic efforts were unsuccessful. Austria–Hungary drafted 9 million soldiers (7.8 million fighting forces) in World War I (over 4 million from the Kingdom of Hungary) on the side of Germany, Bulgaria and Turkey. The troops raised in the Kingdom of Hungary spent little time defending the actual territory of Hungary, with the exceptions of the Brusilov Offensive in June 1916 and, a few months later, the Romanian army's attack into Transylvania, both of which were repelled. Hungary's loss ratio was higher than that of any other nation of Austria-Hungary. The Central Powers conquered Serbia. Romania declared war. The Central Powers then conquered Southern Romania and the Romanian capital Bucharest. In 1916 Emperor Franz Joseph died, and the new monarch Charles IV sympathized with the pacifists. With great difficulty, the Central Powers stopped and repelled the attacks of the Russian Empire.
The Eastern front of the Allied (Entente) Powers completely collapsed. The Austro-Hungarian Empire then withdrew from all defeated countries. On the Italian front, the Austro-Hungarian army made no progress against Italy after January 1918. Despite great Eastern successes, Germany suffered complete defeat on the more important Western front. By 1918, the economic situation had deteriorated (strikes in factories were organized by leftist and pacifist movements) and uprisings in the army had become commonplace. In the capital cities, the Austrian and Hungarian leftist liberal movements (the maverick parties) and their leaders supported the separatism of ethnic minorities. Austria-Hungary signed a general armistice in Padua on 3 November 1918. In October 1918, Hungary's union with Austria was dissolved.
Following the First World War, Hungary underwent a period of profound political upheaval, beginning with the Aster Revolution in 1918, which brought the social-democratic Mihály Károlyi to power as Prime Minister. The Hungarian Royal Honvéd army still had more than 1,400,000 soldiers when Mihály Károlyi was announced as prime minister of Hungary. Károlyi yielded to U.S. President Woodrow Wilson's demand for pacifism by ordering the disarmament of the Hungarian army. This happened under the direction of Béla Linder, minister of war in the Károlyi government. Due to the full disarmament of its army, Hungary was left without a national defence at a time of particular vulnerability. During the rule of Károlyi's pacifist cabinet, Hungary lost control over approximately 75% of its former pre-World War I territories (325,411 km²) without a fight and was subject to foreign occupation. The Little Entente, sensing an opportunity, invaded the country from three sides: Romania invaded Transylvania, Czechoslovakia annexed Upper Hungary (today's Slovakia), and a joint Serb-French coalition annexed Vojvodina and other southern regions. In March 1919, communists led by Béla Kun ousted the Károlyi government and proclaimed the Hungarian Soviet Republic ("Tanácsköztársaság"), followed by a thorough Red Terror campaign. Despite some successes on the Czechoslovak front, Kun's forces were ultimately unable to resist the Romanian invasion; by August 1919, Romanian troops occupied Budapest and ousted Kun.
In November 1919, rightist forces led by former Austro-Hungarian admiral Miklós Horthy entered Budapest; exhausted by the war and its aftermath, the populace accepted Horthy's leadership. In January 1920, parliamentary elections were held and Horthy was proclaimed Regent of the reestablished Kingdom of Hungary, inaugurating the so-called "Horthy era" ("Horthy-kor"). The new government worked quickly to normalize foreign relations while turning a blind eye to a White Terror that swept through the countryside; extrajudicial killings of suspected communists and Jews lasted well into 1920. On 4 June of that year, the Treaty of Trianon established new borders for Hungary. The country lost 71% of its territory and 66% of its antebellum population, as well as many sources of raw materials and its sole port, Fiume. Though the revision of the Treaty quickly rose to the top of the national political agenda, the Horthy government was not willing to resort to military intervention to do so.
The initial years of the Horthy regime were preoccupied by putsch attempts by Charles IV, the Austro-Hungarian pretender; continued suppression of communists; and a migration crisis triggered by the Trianon territorial changes. Though free elections continued, Horthy's personality, and those of his personally selected prime ministers, dominated the political scene. The government's actions continued to drift right with the passage of antisemitic laws and, due to the continued isolation of the Little Entente, economic and then political gravitation toward Italy and Germany. The Great Depression further exacerbated the situation, boosting the popularity of fascist politicians such as Gyula Gömbös and Ferenc Szálasi, who promised economic and social recovery.
Horthy's nationalist agenda reached its apogee in 1938 and 1940, when the Nazis rewarded Hungary's staunchly pro-Germany foreign policy in the First and Second Vienna Awards, respectively, peacefully restoring ethnic-Hungarian-majority areas lost after Trianon. In 1939, Hungary regained further territory from Czechoslovakia through force. Hungary formally joined the Axis Powers on 20 November 1940, and in 1941, participated in the invasion of Yugoslavia, gaining some of its former territories in the south.
Hungary formally entered World War II as an Axis Power on 26 June 1941, declaring war on the Soviet Union after unidentified planes bombed Kassa, Munkács, and Rahó. Hungarian troops fought on the Eastern Front for two years. Despite some early successes, the Hungarian government began seeking a secret peace pact with the Allies after the Second Army suffered catastrophic losses at the River Don in January 1943. Learning of the planned defection, German troops occupied Hungary on 19 March 1944 to guarantee Horthy's compliance. In October, as the Soviet front approached and the Hungarian government made further efforts to disengage from the war, German troops ousted Horthy and installed a puppet government under Szálasi's fascist Arrow Cross Party. Szálasi pledged all the country's capabilities in service of the German war machine. By October 1944, the Soviets had reached the river Tisza, and despite some losses, succeeded in encircling and besieging Budapest in December.
After German occupation, Hungary participated in the Holocaust. During the German occupation in May–June 1944, the Arrow Cross and Hungarian police deported nearly 440,000 Jews, mainly to Auschwitz. Nearly all of them were murdered. The Swedish diplomat Raoul Wallenberg managed to save a considerable number of Hungarian Jews by giving them Swedish passports. Rezső Kasztner, one of the leaders of the Hungarian Aid and Rescue Committee, bribed senior SS officers such as Adolf Eichmann to allow some Jews to escape. The Horthy government's complicity in the Holocaust remains a point of controversy and contention.
The war left Hungary devastated, destroying over 60% of the economy and causing significant loss of life. In addition to the over 600,000 Hungarian Jews killed, as many as 280,000 other Hungarians were raped, murdered, executed, or deported for slave labor by Czechoslovaks, Soviet Red Army troops, and Yugoslavs.
On 13 February 1945, Budapest surrendered; by April, German troops left the country under Soviet military occupation. 200,000 Hungarians were expelled from Czechoslovakia in exchange for 70,000 Slovaks living in Hungary. 202,000 ethnic Germans were expelled to Germany, and through the 1947 Paris Peace Treaties, Hungary was again reduced to its immediate post-Trianon borders.
Following the defeat of Nazi Germany, Hungary became a satellite state of the Soviet Union. The Soviet leadership selected Mátyás Rákosi to front the Stalinization of the country, and Rákosi "de facto" ruled Hungary from 1949 to 1956. His government's policies of militarization, industrialization, collectivization, and war compensation led to a severe decline in living standards. In imitation of the Soviet secret police, the Rákosi government established a secret political police, the ÁVH, to enforce the new regime. In the ensuing purges approximately 350,000 officials and intellectuals were imprisoned or executed from 1948 to 1956. Many freethinkers, democrats, and Horthy-era dignitaries were secretly arrested and extrajudicially interned in domestic and foreign Gulags. Some 600,000 Hungarians were deported to Soviet labor camps, where at least 200,000 died.
After Stalin's death in 1953, the Soviet Union pursued a program of destalinization that was inimical to Rákosi, leading to his deposition. The following political cooling saw the ascent of Imre Nagy to the premiership, and the growing interest of students and intellectuals in political life. Nagy promised market liberalization and political openness, while Rákosi opposed both vigorously. Rákosi eventually managed to discredit Nagy and replace him with the more hard-line Ernő Gerő. Hungary joined the Warsaw Pact in May 1955, as societal dissatisfaction with the regime swelled. Following rallies throughout the country on 23 October 1956, and the firing on peaceful demonstrators by Soviet soldiers and secret police, protesters took to the streets in Budapest, initiating the 1956 Revolution. In an effort to quell the chaos, Nagy returned as premier, promised free elections, and took Hungary out of the Warsaw Pact.
The violence nonetheless continued as revolutionary militias sprang up against the Soviet Army and the ÁVH; the roughly 3,000-strong resistance fought Soviet tanks using Molotov cocktails and machine-pistols. Though Soviet superiority was immense, the Soviets suffered heavy losses, and by 30 October 1956 most Soviet troops had withdrawn from Budapest to garrison the countryside. For a time, the Soviet leadership was unsure how to respond to developments in Hungary, but eventually decided to intervene to prevent a destabilization of the Soviet bloc. On 4 November reinforcements of more than 150,000 troops and 2,500 tanks entered the country from the Soviet Union. Nearly 20,000 Hungarians were killed resisting the intervention, while an additional 21,600 were imprisoned afterwards for political reasons. Some 13,000 were interned and 230 brought to trial and executed. Nagy was secretly tried, found guilty, sentenced to death and executed by hanging in June 1958. Because borders were briefly opened, nearly a quarter of a million people fled the country by the time the revolution was suppressed.
After a second, briefer period of Soviet military occupation, János Kádár, Nagy's former Minister of State, was chosen by the Soviet leadership to head the new government and chair the new ruling Socialist Workers' Party (MSzMP). Kádár quickly normalized the situation. In 1963, the government granted a general amnesty and released the majority of those imprisoned for their active participation in the uprising. Kádár proclaimed a new policy line, according to which the people were no longer compelled to profess loyalty to the party if they tacitly accepted the Socialist regime as a fact of life. In many speeches, he described this as, "Those who are not against us are with us." Kádár introduced new planning priorities in the economy, such as allowing farmers significant plots of private land within the collective farm system ("háztáji gazdálkodás"). The living standard rose as consumer goods and food production took precedence over military production, which was reduced to one tenth of pre-revolutionary levels.
In 1968, the New Economic Mechanism (NEM) introduced free-market elements into the socialist command economy. From the 1960s through the late 1980s, Hungary was often referred to as "the happiest barrack" within the Eastern bloc. During the latter part of the Cold War, Hungary's GDP per capita was the fourth highest in the Eastern bloc, behind only East Germany, Czechoslovakia, and the Soviet Union itself. As a result of this relatively high standard of living, a more liberalized economy, a less censored press, and less restricted travel rights, Hungary was generally considered one of the more liberal countries in which to live in Central Europe during communism. In the 1980s, however, living standards steeply declined again due to a worldwide recession to which communism was unable to respond. By the time Kádár died in 1989, the Soviet Union was in steep decline and a younger generation of reformists saw liberalization as the solution to economic and social issues.
Hungary's transition from communism to democracy and capitalism ("rendszerváltás", "regime change") was peaceful and prompted by economic stagnation, domestic political pressure, and changing relations with other Warsaw Pact countries. Although the MSzMP began Round Table Talks with various opposition groups in March 1989, the reburial of Imre Nagy as a revolutionary martyr that June is widely considered the symbolic end of communism in Hungary. Over 100,000 people attended the Budapest ceremony without any significant government interference, and many speakers openly called for Soviet troops to leave the country. Free elections were held in May 1990, and the Hungarian Democratic Forum, a major conservative opposition group, was elected to lead a coalition government. József Antall became the first democratically elected Prime Minister since World War II.
With the removal of state subsidies and rapid privatization in 1991, Hungary was affected by a severe economic recession. The Antall government's austerity measures proved unpopular, and the Communist Party's legal and political heir, the Socialist Party, won the subsequent 1994 elections. This abrupt shift in the political landscape was repeated in 1998 and 2002; each electoral cycle, the governing party was ousted and the erstwhile opposition elected. Like most other post-communist European states, however, Hungary broadly pursued an integrationist agenda, joining NATO in 1999 and the European Union in 2004. As a NATO member, Hungary was involved in the Yugoslav Wars.
In 2006, major protests erupted after it was revealed that Prime Minister Ferenc Gyurcsány had claimed in a private speech that his party "lied" to win the recent elections. The popularity of left-wing parties plummeted in the ensuing political upheaval, and in 2010, Viktor Orbán's national-conservative Fidesz was elected to a parliamentary supermajority. The legislature consequently approved a new constitution, among other sweeping governmental and legal changes. Although these developments were met with and still engender controversy, Fidesz secured a second parliamentary supermajority in 2014 and a third in 2018.
In September 2018, the European Parliament voted to act against Hungary under the terms of Article 7 of the Treaty on European Union. Proponents of the vote claimed that the Hungarian government posed a "systematic threat" to democracy and the rule of law. The vote was carried with the support of 448 MEPs, narrowly clearing the two-thirds majority required. The vote marked the first time the European Parliament had triggered an Article 7 procedure against an EU member state. Péter Szijjártó, the Hungarian foreign minister, described the vote as "petty revenge" provoked by Hungary's tough anti-migration policies. Szijjártó alleged that the vote was fraudulent because abstentions were not counted, which made it easier to reach the two-thirds majority required to pass the vote.
At the European elections in May 2019, Viktor Orbán's Fidesz Party secured another sweeping victory, receiving more than 50 percent of the votes.
In March 2020, during the coronavirus pandemic, the Hungarian parliament passed a law granting the government the power to rule by decree to the extent necessary to diminish the consequences of the pandemic, suspending by-elections and outlawing the "spreading of misinformation". The government's special authorization was to remain in force until the pandemic was declared to have ended. The law was lifted on 16 June 2020.
Historiography
Historiography is the study of the methods of historians in developing history as an academic discipline, and by extension is any body of historical work on a particular subject. The historiography of a specific topic covers how historians have studied that topic using particular sources, techniques, and theoretical approaches. Scholars discuss historiography by topic—such as the historiography of the United Kingdom, that of WWII, the British Empire, early Islam, and China—and different approaches and genres, such as political history and social history. Beginning in the nineteenth century, with the development of academic history, there developed a body of historiographic literature. The extent to which historians are influenced by their own groups and loyalties—such as to their nation state—remains a debated question.
In the ancient world, chronological annals were produced in civilizations such as ancient Egypt and Mesopotamia. However, the discipline of historiography was first established in the 5th century BC with the "Histories" of Herodotus, the founder of Greek historiography. The Roman statesman Cato the Elder produced the first history in Latin, the "Origines", in the 2nd century BC. His near contemporaries Sima Tan and Sima Qian in the Han Empire of China established Chinese historiography with the compiling of the "Shiji" ("Records of the Grand Historian"). During the Middle Ages, medieval historiography included the works of chronicles in medieval Europe, Islamic histories by Muslim historians, and the Korean and Japanese historical writings based on the existing Chinese model. During the 18th century Age of Enlightenment, historiography in the Western world was shaped and developed by figures such as Voltaire, David Hume, and Edward Gibbon, who among others set the foundations for the modern discipline.
The research interests of historians change over time, and there has been a shift away from traditional diplomatic, economic, and political history toward newer approaches, especially social and cultural studies. From 1975 to 1995 the proportion of professors of history in American universities identifying with social history increased from 31 to 41 percent, while the proportion of political historians decreased from 40 to 30 percent. In 2007, of 5,723 faculty in the departments of history at British universities, 1,644 (29%) identified themselves with social history and 1,425 (25%) identified themselves with political history. Since the 1980s there has been a special interest in the memories and commemoration of past events—the histories as remembered and presented for popular celebration.
In the early modern period, the term "historiography" meant "the writing of history", and "historiographer" meant "historian". In that sense certain official historians were given the title "Historiographer Royal" in Sweden (from 1618), England (from 1660), and Scotland (from 1681). The Scottish post is still in existence.
Historiography was more recently defined as "the study of the way history has been and is written – the history of historical writing", which means that, "When you study 'historiography' you do not study the events of the past directly, but the changing interpretations of those events in the works of individual historians."
Understanding the past appears to be a universal human need, and the "telling of history" has emerged independently in civilizations around the world.
What constitutes history is a philosophical question (see philosophy of history).
The earliest chronologies date back to Mesopotamia and ancient Egypt, in the form of chronicles and annals. However, no historical writers in these early civilizations were known by name. By contrast, the term "historiography" is taken to refer to written history recorded in a narrative format for the purpose of informing future generations about events. In this limited sense, "ancient history" begins with the early historiography of Classical Antiquity, in about the 5th century BCE.
The earliest known systematic historical thought emerged in ancient Greece, a development which would be an important influence on the writing of history elsewhere around the Mediterranean region. Greek historians greatly contributed to the development of historical methodology. The earliest known critical historical works were "The Histories", composed by Herodotus of Halicarnassus (484–425 BCE) who became known as the "father of history". Herodotus attempted to distinguish between more and less reliable accounts, and personally conducted research by travelling extensively, giving written accounts of various Mediterranean cultures. Although Herodotus' overall emphasis lay on the actions and characters of men, he also attributed an important role to divinity in the determination of historical events.
The generation following Herodotus witnessed a spate of local histories of the individual city-states ("poleis"), written by the first of the local historians who employed the written archives of city and sanctuary. Dionysius of Halicarnassus characterized these historians as the forerunners of Thucydides, and these local histories continued to be written into Late Antiquity, as long as the city-states survived. Two early figures stand out: Hippias of Elis, who produced the lists of winners in the Olympic Games that provided the basic chronological framework as long as the pagan classical tradition lasted, and Hellanicus of Lesbos, who compiled more than two dozen histories from civic records, all of them now lost.
Thucydides largely eliminated divine causality in his account of the war between Athens and Sparta, establishing a rationalistic element which set a precedent for subsequent Western historical writings. He was also the first to distinguish between cause and immediate origins of an event, while his successor Xenophon introduced autobiographical elements and character studies in his "Anabasis".
The proverbial Philippic attacks of the Athenian orator Demosthenes on Philip II of Macedon marked the height of ancient political agitation. The now lost history of Alexander's campaigns by the diadoch Ptolemy I may represent the first historical work composed by a ruler. Polybius wrote on the rise of Rome to world prominence, and attempted to harmonize the Greek and Roman points of view.
The Chaldean priest Berossus composed a Greek-language "History of Babylonia" for the Seleucid king Antiochus I, combining Hellenistic methods of historiography and Mesopotamian accounts to form a unique composite. Reports exist of other near-eastern histories, such as that of the Phoenician historian Sanchuniathon; but he is considered semi-legendary and writings attributed to him are fragmentary, known only through the later historians Philo of Byblos and Eusebius, who asserted that he wrote before even the Trojan war.
The Romans adopted the Greek tradition, writing at first in Greek, but eventually chronicling their history in their own language. While early Roman works were still written in Greek, the "Origines", composed by the Roman statesman Cato the Elder, was written in Latin in a conscious effort to counteract Greek cultural influence. It marked the beginning of Latin historical writing. Hailed for its lucid style, Julius Caesar's "de Bello Gallico" exemplifies autobiographical war coverage. The politician and orator Cicero introduced rhetorical elements in his political writings.
Strabo was an important exponent of the Greco-Roman tradition of combining geography with history, presenting a descriptive history of peoples and places known to his era. Livy records the rise of Rome from city-state to empire. His speculation about what would have happened if Alexander the Great had marched against Rome represents the first known instance of alternate history.
Biography, although popular throughout antiquity, was introduced as a branch of history by the works of Plutarch and Suetonius, who described the deeds and characters of ancient personalities, stressing their human side. Tacitus denounces Roman immorality by praising German virtues, elaborating on the topos of the Noble savage.
The Han dynasty eunuch Sima Qian was the first in China to lay the groundwork for professional historical writing. His work superseded the older style of the "Spring and Autumn Annals", compiled in the 5th century BC, the "Bamboo Annals" and other court and dynastic annals that recorded history in a chronological form that abstained from analysis. Sima's "Shiji" ("Records of the Grand Historian") pioneered the "Annals-biography" format, which would become the standard for prestige history writing in China. In this genre a history opens with a chronological outline of court affairs, and then continues with detailed biographies of prominent people who lived during the period in question. The scope of his work extended far into the past, and included many treatises on specific subjects and individual biographies of prominent people. He also explored the lives and deeds of commoners, both contemporary and those of previous eras.
Whereas Sima's had been a universal history from the beginning of time down to the time of writing, his successor Ban Gu wrote an annals-biography history limiting its coverage to only the Western Han dynasty, the Book of Han (96 CE). This established the notion of using dynastic boundaries as start- and end-points, and most later Chinese histories would focus on a single dynasty or group of dynasties.
The Records of the Grand Historian and Book of Han were eventually joined by the Book of the Later Han (488 CE) (replacing the earlier, and now only partially extant, Han Records from the Eastern Pavilion) and the Records of the Three Kingdoms (297 CE) to form the "Four Histories". These became mandatory reading for the Imperial Examinations and have therefore exerted an influence on Chinese culture comparable to the Confucian Classics. More annals-biography histories were written in subsequent dynasties, eventually bringing the number to between twenty-four and twenty-six, but none ever reached the popularity and impact of the first four.
Traditional Chinese historiography describes history in terms of dynastic cycles. In this view, each new dynasty is founded by a morally righteous founder. Over time, the dynasty becomes morally corrupt and dissolute. Eventually, the dynasty becomes so weak as to allow its replacement by a new dynasty.
In 281 CE the tomb of King Xiang of Wei (d. 296 BC) was opened, inside of which was found a historical text called the Bamboo Annals, after the writing material. It is similar in style to the Spring and Autumn Annals and covers the time from the Yellow Emperor to 299 BC. Opinions on the authenticity of the text have varied throughout the centuries, and in any event it was re-discovered too late to gain anything like the same status as the Spring and Autumn.
Christian historiography began early, perhaps as early as Luke-Acts, which is the primary source for the Apostolic Age, though its historical reliability is disputed. In the first Christian centuries, the New Testament canon was developed. The growth of Christianity and its enhanced status in the Roman Empire after Constantine I (see State church of the Roman Empire) led to the development of a distinct Christian historiography, influenced by both Christian theology and the nature of the Christian Bible, encompassing new areas of study and views of history. The central role of the Bible in Christianity is reflected in the preference of Christian historians for written sources, compared to the classical historians' preference for oral sources and is also reflected in the inclusion of politically unimportant people. Christian historians also focused on development of religion and society. This can be seen in the extensive inclusion of written sources in the "Ecclesiastical History" written by Eusebius of Caesarea around 324 and in the subjects it covers. Christian theology considered time as linear, progressing according to divine plan. As God's plan encompassed everyone, Christian histories in this period had a universal approach. For example, Christian writers often included summaries of important historical events prior to the period covered by the work.
Writing history was popular among Christian monks and clergy in the Middle Ages. They wrote about the history of Jesus Christ, that of the Church and that of their patrons, the dynastic history of the local rulers. In the Early Middle Ages historical writing often took the form of annals or chronicles recording events year by year, but this style tended to hamper the analysis of events and causes. An example of this type of writing is the "Anglo-Saxon Chronicle", which was the work of several different writers: it was started during the reign of Alfred the Great in the late 9th century, but one copy was still being updated in 1154. Some writers in the period did construct a more narrative form of history. These included Gregory of Tours and more successfully Bede, who wrote both secular and ecclesiastical history and who is known for writing the "Ecclesiastical History of the English People".
During the Renaissance, history was written about states or nations. The study of history changed during the Enlightenment and Romanticism. Voltaire described the history of certain ages that he considered important, rather than describing events in chronological order. History became an independent discipline. It was not called "philosophia historiae" anymore, but merely history ("historia").
Muslim historical writings first began to develop in the 7th century, with the reconstruction of the Prophet Muhammad's life in the centuries following his death. With numerous conflicting narratives regarding Muhammad and his companions from various sources, it was necessary to verify which sources were more reliable. In order to evaluate these sources, various methodologies were developed, such as the "science of biography", "science of hadith" and "Isnad" (chain of transmission). These methodologies were later applied to other historical figures in the Islamic civilization. Famous historians in this tradition include Urwah (d. 712), Wahb ibn Munabbih (d. 728), Ibn Ishaq (d. 761), al-Waqidi (745–822), Ibn Hisham (d. 834), Muhammad al-Bukhari (810–870) and Ibn Hajar (1372–1449). Historians of the medieval Islamic world also developed an interest in world history. Islamic historical writing eventually culminated in the works of the Arab Muslim historian Ibn Khaldun (1332–1406), who published his historiographical studies in the "Muqaddimah" (translated as "Prolegomena") and "Kitab al-I'bar" ("Book of Advice"). His work was forgotten until it was rediscovered in the late 19th century.
The earliest works of history produced in Japan were the "Rikkokushi" (Six National Histories), a corpus of six national histories covering the history of Japan from its mythological beginnings until the 9th century. The first of these works was the "Nihon Shoki", compiled by Prince Toneri in 720.
The tradition of Korean historiography was established with the "Samguk Sagi", a history of Korea from its allegedly earliest times. It was compiled by Goryeo court historian Kim Busik after its commission by King Injong of Goryeo (r. 1122 – 1146). It was completed in 1145 and relied not only on earlier Chinese histories for source material, but also on the "Hwarang Segi" written by the Silla historian Kim Daemun in the 8th century. The latter work is now lost.
In 1084 the Song dynasty official Sima Guang completed the "Zizhi Tongjian" (Comprehensive Mirror to Aid in Government), which laid out the entire history of China from the beginning of the Warring States period (403 BCE) to the end of the Five Dynasties period (959 CE) in chronological annals form, rather than in the traditional annals-biography form. This work is considered much more accessible than the "Official Histories" for the Six dynasties, Tang dynasty, and Five Dynasties, and in practice superseded those works in the mind of the general reader.
The great Song Neo-Confucian Zhu Xi found the Mirror to be overly long for the average reader, as well as too morally nihilistic, and therefore prepared a didactic summary of it called the "Zizhi Tongjian Gangmu" (Digest of the Comprehensive Mirror to Aid in Government), posthumously published in 1219. It reduced the original's 249 chapters to just 59, and for the rest of imperial Chinese history would be the first history book most people ever read.
Philippine historiography includes historical and archival research and writing on the history of the Philippine archipelago, including the islands of Luzon, the Visayas, and Mindanao.
Historiography of the Philippines refers to the studies, sources, critical methods and interpretations used by scholars to study the history of the Philippines. The archipelago was part of many empires before the Spanish Empire arrived in the 16th century.
Before the arrival of the Spanish colonial powers, the Philippines did not exist as a unified polity. Southeast Asia is classified as part of the Indosphere and the Sinosphere. The archipelago had direct contact with China during the Song dynasty (960–1279) and was part of the Srivijaya and Majapahit empires.
Pre-colonial Filipinos used abugida writing systems, widely employed for communication and for seals on documents, though no early literature or histories survive in written form. Ancient Filipinos usually wrote documents on bamboo, bark, and leaves, which did not survive, unlike inscriptions on clay, metal, and ivory, such as the Laguna Copperplate Inscription and the Butuan Ivory Seal. The discovery of the Butuan Ivory Seal also attests to the use of paper documents in the ancient Philippines.
With the arrival of the Spanish colonizers, pre-colonial Filipino manuscripts and documents were gathered and burned to eliminate pagan beliefs. This destruction has burdened historians in the accumulation of data and the development of theories, leaving many aspects of Philippine history unexplained. The reliance on secondary sources written by later historians to evaluate the primary sources does not provide a critical examination of the methodology of early Philippine historical study.
During the Age of Enlightenment, the modern development of historiography through the application of scrupulous methods began. Among the many Italians who contributed to this were Leonardo Bruni (c. 1370–1444), Francesco Guicciardini (1483–1540), and Cesare Baronio (1538–1607).
French "philosophe" Voltaire (1694–1778) had an enormous influence on the development of historiography during the Age of Enlightenment through his demonstration of fresh new ways to look at the past. Guillaume de Syon argues:
Voltaire's best-known histories are "The Age of Louis XIV" (1751), and his "Essay on the Customs and the Spirit of the Nations" (1756). He broke from the tradition of narrating diplomatic and military events, and emphasized customs, social history and achievements in the arts and sciences. He was the first scholar to make a serious attempt to write the history of the world, eliminating theological frameworks, and emphasizing economics, culture and political history. Although he repeatedly warned against political bias on the part of the historian, he did not miss many opportunities to expose the intolerance and frauds of the church over the ages. Voltaire advised scholars that anything contradicting the normal course of nature was not to be believed. Although he found evil in the historical record, he fervently believed reason and educating the illiterate masses would lead to progress.
Voltaire explains his view of historiography in his article on "History" in Diderot's "Encyclopédie": "One demands of modern historians more details, better ascertained facts, precise dates, more attention to customs, laws, mores, commerce, finance, agriculture, population." Already in 1739 he had written: "My chief object is not political or military history, it is the history of the arts, of commerce, of civilization – in a word – of the human mind." Voltaire's histories used the values of the Enlightenment to evaluate the past. He helped free historiography from antiquarianism, Eurocentrism, religious intolerance and a concentration on great men, diplomacy, and warfare. Peter Gay says Voltaire wrote "very good history", citing his "scrupulous concern for truths", "careful sifting of evidence", "intelligent selection of what is important", "keen sense of drama", and "grasp of the fact that a whole civilization is a unit of study".
At the same time, philosopher David Hume was having a similar effect on the study of history in Great Britain. In 1754 he published "The History of England", a six-volume work which extended "From the Invasion of Julius Caesar to the Revolution in 1688". Hume adopted a scope similar to Voltaire's: alongside the history of kings, parliaments, and armies, he examined the history of culture, including literature and science. His short biographies of leading scientists explored the process of scientific change, and he developed new ways of seeing scientists in the context of their times by looking at how they interacted with society and each other; he paid special attention to Francis Bacon, Robert Boyle, Isaac Newton and William Harvey.
He also argued that the quest for liberty was the highest standard for judging the past, and concluded that after considerable fluctuation, England at the time of his writing had achieved "the most entire system of liberty, that was ever known amongst mankind".
The apex of Enlightenment history was reached with Edward Gibbon's monumental six-volume work, "The History of the Decline and Fall of the Roman Empire", published on 17 February 1776. Because of its relative objectivity and heavy use of primary sources, its methodology became a model for later historians. This has led to Gibbon being called the first "modern historian". The book sold impressively, earning its author a total of about £9000. Biographer Leslie Stephen wrote that thereafter, "His fame was as rapid as it has been lasting."
Gibbon's work has been praised for its style, its piquant epigrams and its effective irony. Winston Churchill memorably noted, "I set out upon...Gibbon's "Decline and Fall of the Roman Empire" [and] was immediately dominated both by the story and the style... I devoured Gibbon. I rode triumphantly through it from end to end and enjoyed it all." Gibbon was pivotal in the secularizing and 'desanctifying' of history, remarking, for example, on the "want of truth and common sense" of biographies composed by Saint Jerome. Unusually for an 18th-century historian, Gibbon was never content with secondhand accounts when the primary sources were accessible (though most of these were drawn from well-known printed editions). "I have always endeavoured," he says, "to draw from the fountain-head; that my curiosity, as well as a sense of duty, has always urged me to study the originals; and that, if they have sometimes eluded my search, I have carefully marked the secondary evidence, on whose faith a passage or a fact were reduced to depend." In this insistence upon the importance of primary sources, Gibbon broke new ground in the methodical study of history:
In accuracy, thoroughness, lucidity, and comprehensive grasp of a vast subject, the 'History' is unsurpassable. It is the one English history which may be regarded as definitive... Whatever its shortcomings the book is artistically imposing as well as historically unimpeachable as a vast panorama of a great period.
The tumultuous events surrounding the French Revolution inspired much of the historiography and analysis of the early 19th century. Interest in the 1688 Glorious Revolution was also rekindled by the Great Reform Act of 1832 in England.
Thomas Carlyle published his three-volume "The French Revolution: A History" in 1837. The first volume was accidentally burned by John Stuart Mill's maid, and Carlyle rewrote it from scratch. Carlyle's style of historical writing stressed the immediacy of action, often using the present tense. He emphasised the role of forces of the spirit in history and thought that chaotic events demanded what he called 'heroes' to take control over the competing forces erupting within society. He considered the dynamic forces of history to be the hopes and aspirations of people, which took the form of ideas and were often ossified into ideologies. Carlyle's "The French Revolution" was written in a highly unorthodox style, far removed from the neutral and detached tone of the tradition of Gibbon. He presented the history as dramatic events unfolding in the present, as though he and the reader were participants on the streets of Paris at the famous events. His invented style was epic poetry combined with philosophical treatise. The work has rarely been read or cited in the last century.
In his main work "Histoire de France" (1855), French historian Jules Michelet (1798–1874) coined the term Renaissance (meaning "rebirth" in French), as a period in Europe's cultural history that represented a break from the Middle Ages, creating a modern understanding of humanity and its place in the world. The 19-volume work covered French history from Charlemagne to the outbreak of the French Revolution. His inquiry into manuscript and printed authorities was most laborious, but his lively imagination, and his strong religious and political prejudices, made him regard all things from a singularly personal point of view.
Michelet was one of the first historians to shift the emphasis of history to the common people, rather than the leaders and institutions of the country. He had a decisive impact on scholars. Gayana Jurkevich argues that, led by Michelet:
Hippolyte Taine (1828–1893), although unable to secure an academic position, was the chief theoretical influence of French naturalism, a major proponent of sociological positivism, and one of the first practitioners of historicist criticism. He pioneered the idea of "the milieu" as an active historical force which amalgamated geographical, psychological, and social factors. Historical writing for him was a search for general laws. His brilliant style kept his writing in circulation long after his theoretical approaches were passé.
One of the major progenitors of the history of culture and art was the Swiss historian Jacob Burckhardt. Siegfried Giedion described Burckhardt's achievement in the following terms: "The great discoverer of the age of the Renaissance, he first showed how a period should be treated in its entirety, with regard not only for its painting, sculpture and architecture, but for the social institutions of its daily life as well."
His most famous work was "The Civilization of the Renaissance in Italy", published in 1860; it was the most influential interpretation of the Italian Renaissance in the nineteenth century and is still widely read. According to John Lukacs, he was the first master of cultural history, which seeks to describe the spirit and the forms of expression of a particular age, a particular people, or a particular place. His innovative approach to historical research stressed the importance of art and its inestimable value as a primary source for the study of history. He was one of the first historians to rise above the narrow nineteenth-century notion that "history is past politics and politics current history."
By the mid-19th century, scholars were beginning to analyse the history of institutional change, particularly the development of constitutional government. William Stubbs's "Constitutional History of England" (3 vols., 1874–78) was an important influence on this developing field. The work traced the development of the English constitution from the Teutonic invasions of Britain until 1485, and marked a distinct step in the advance of English historical learning. He argued that the theory of the unity and continuity of history should not remove distinctions between ancient and modern history. He believed that, though work on ancient history is a useful preparation for the study of modern history, either may advantageously be studied apart. He was a good palaeographer, and excelled in textual criticism, in examination of authorship, and other such matters, while his vast erudition and retentive memory made him second to none in interpretation and exposition.
The modern academic study of history and methods of historiography were pioneered in 19th-century German universities, especially the University of Göttingen. Leopold von Ranke (1795–1886) at Berlin was a pivotal influence in this regard, and was the founder of modern source-based history. According to Caroline Hoefferle, "Ranke was probably the most important historian to shape historical profession as it emerged in Europe and the United States in the late 19th century."
Specifically, he implemented the seminar teaching method in his classroom, and focused on archival research and analysis of historical documents. Beginning with his first book in 1824, the "History of the Latin and Teutonic Peoples from 1494 to 1514", Ranke used an unusually wide variety of sources for a historian of the age, including "memoirs, diaries, personal and formal missives, government documents, diplomatic dispatches and first-hand accounts of eye-witnesses". Over a career that spanned much of the century, Ranke set the standards for much of later historical writing, introducing such ideas as reliance on primary sources, an emphasis on narrative history and especially international politics ("aussenpolitik"). Sources had to be solid, not speculations and rationalizations. His credo was to write history the way it was. He insisted on primary sources with proven authenticity.
Ranke also rejected the 'teleological approach' to history, which traditionally viewed each period as inferior to the period which follows. In Ranke's view, the historian had to understand a period on its own terms, and seek to find only the general ideas which animated every period of history. In 1831 and at the behest of the Prussian government, Ranke founded and edited the first historical journal in the world, called "Historisch-Politische Zeitschrift".
Another important German thinker was Georg Wilhelm Friedrich Hegel, whose theory of historical progress ran counter to Ranke's approach. In Hegel's own words, his philosophical theory of "World history... represents the development of the spirit's consciousness of its own freedom and of the consequent realization of this freedom." This realization is seen by studying the various cultures that have developed over the millennia, and trying to understand the way that freedom has worked itself out through them:
World history is the record of the spirit's efforts to attain knowledge of what it is in itself. The Orientals do not know that the spirit or man as such are free in themselves. And because they do not know that, they are not themselves free. They only know that One is free... The consciousness of freedom first awoke among the Greeks, and they were accordingly free; but, like the Romans, they only knew that Some, and not all men as such, are free... The Germanic nations, with the rise of Christianity, were the first to realize that All men are by nature free, and that freedom of spirit is his very essence.
Karl Marx introduced the concept of historical materialism into the study of world historical development. In his conception, the economic conditions and dominant modes of production determined the structure of society at that point. In his view five successive stages in the development of material conditions would occur in Western Europe. The first stage was primitive communism, where property was shared and there was no concept of "leadership". This progressed to a slave society, where the idea of class emerged and the State developed. Feudalism was characterized by an aristocracy working in partnership with a theocracy and the emergence of the Nation-state. Capitalism appeared after the bourgeois revolution, when the capitalists (or their merchant predecessors) overthrew the feudal system and established a market economy, with private property and Parliamentary democracy. Marx then predicted the eventual proletarian revolution that would result in the attainment of socialism, followed by Communism, where property would be communally owned.
Previous historians had focused on cyclical events of the rise and decline of rulers and nations. The process of nationalizing history, as part of the national revivals of the 19th century, separated "one's own" history from common universal history through a way of perceiving, understanding and treating the past that constructed history as the history of a nation. A new discipline, sociology, emerged in the late 19th century and analyzed and compared these perspectives on a larger scale.
The term Whig history, coined by Herbert Butterfield in his short book "The Whig Interpretation of History" in 1931, means the approach to historiography which presents the past as an inevitable progression towards ever greater liberty and enlightenment, culminating in modern forms of liberal democracy and constitutional monarchy. In general, Whig historians emphasized the rise of constitutional government, personal freedoms and scientific progress. The term has also been applied widely in historical disciplines outside of British history (the history of science, for example) to criticize any teleological (or goal-directed), hero-based, and transhistorical narrative.
Paul Rapin de Thoyras's history of England, published in 1723, became "the classic Whig history" for the first half of the 18th century. It was later supplanted by the immensely popular "The History of England" by David Hume. Whig historians emphasized the achievements of the Glorious Revolution of 1688. This included James Mackintosh's "History of the Revolution in England in 1688", William Blackstone's "Commentaries on the Laws of England" and Henry Hallam's "Constitutional History of England".
The most famous exponent of 'Whiggery' was Thomas Babington Macaulay. His writings are famous for their ringing prose and for their confident, sometimes dogmatic, emphasis on a progressive model of British history, according to which the country threw off superstition, autocracy and confusion to create a balanced constitution and a forward-looking culture combined with freedom of belief and expression. This model of human progress has been called the Whig interpretation of history. He published the first volumes of his most famous work of history, "The History of England from the Accession of James II", in 1848. It proved an immediate success and replaced Hume's history to become the new orthodoxy. His 'Whiggish convictions' are spelled out in his first chapter:
His legacy continues to be controversial; Gertrude Himmelfarb wrote that "most professional historians have long since given up reading Macaulay, as they have given up writing the kind of history he wrote and thinking about history as he did." However, J. R. Western wrote that: "Despite its age and blemishes, Macaulay's "History of England" has still to be superseded by a full-scale modern history of the period".
The Whig consensus was steadily undermined during the post-World War I re-evaluation of European history, and Butterfield's critique exemplified this trend. Intellectuals no longer believed the world was automatically getting better and better. Subsequent generations of academic historians have similarly rejected Whig history because of its presentist and teleological assumption that history is driving toward some sort of goal. Other criticized 'Whig' assumptions included viewing the British system as the apex of human political development, assuming that political figures in the past held current political beliefs (anachronism), considering British history as a march of progress with inevitable outcomes and presenting political figures of the past as heroes, who advanced the cause of this political progress, or villains, who sought to hinder its inevitable triumph. J. Hart says "a Whig interpretation requires human heroes and villains in the story."
20th-century historiography in major countries is characterized by a move to universities and academic research centers. Popular history continued to be written by self-educated amateurs, but scholarly history increasingly became the province of PhDs trained in research seminars at a university. The training emphasized working with primary sources in archives. Seminars taught graduate students how to review the historiography of their topics, so that they could understand the conceptual frameworks currently in use and the criticisms regarding their strengths and weaknesses. Western Europe and the United States took leading roles in this development. The emergence of area studies of other regions also shaped historiographical practices.
The French "Annales" school radically changed the focus of historical research in France during the 20th century by stressing long-term social history, rather than political or diplomatic themes. The school emphasized the use of quantification and the paying of special attention to geography.
The "Annales d'histoire économique et sociale" journal was founded in 1929 in Strasbourg by Marc Bloch and Lucien Febvre. These authors, the former a medieval historian and the latter an early modernist, quickly became associated with the distinctive "Annales" approach, which combined geography, history, and the sociological approaches of the Année Sociologique (many of whose members were their colleagues at Strasbourg). The approach rejected the predominant emphasis on politics, diplomacy and war of many 19th and early 20th-century historians, spearheaded by those whom Febvre called Les Sorbonnistes. Instead, the "Annales" historians pioneered the study of long-term historical structures ("la longue durée") over events and political transformations. Geography, material culture, and what later Annalistes called "mentalités", or the psychology of the epoch, are also characteristic areas of study. The goal of the "Annales" was to undo the work of the "Sorbonnistes" and to turn French historians away from the narrowly political and diplomatic toward new vistas in social and economic history. In early modern Mexican history, the work of Marc Bloch's student François Chevalier on the formation of landed estates (haciendas) from the sixteenth to the seventeenth century had a major impact on Mexican history and historiography, setting off an important debate about whether landed estates were basically feudal or capitalistic.
An eminent member of this school, Georges Duby, described his approach to history as one that relegated the sensational to the sidelines and was reluctant to give a simple accounting of events, striving instead to pose and solve problems and, neglecting surface disturbances, to observe the long- and medium-term evolution of economy, society and civilisation. The Annalistes, especially Lucien Febvre, advocated a "histoire totale", or "histoire tout court", a complete study of a historical problem.
The second era of the school was led by Fernand Braudel and was very influential throughout the 1960s and 1970s, especially for his work on the Mediterranean region in the era of Philip II of Spain. Braudel developed the idea, often associated with Annalistes, of different modes of historical time: "l'histoire quasi immobile" (motionless history) of historical geography, the history of social, political and economic structures ("la longue durée"), and the history of men and events, in the context of their structures. His 'longue durée' approach stressed the slow, and often imperceptible, effects of space, climate and technology on the actions of human beings in the past. The "Annales" historians, after living through two world wars and major political upheavals in France, were deeply uncomfortable with the notion that multiple ruptures and discontinuities created history. They preferred to stress slow change and the longue durée. They paid special attention to geography, climate, and demography as long-term factors. They considered the continuities of the deepest structures to be central to history, beside which upheavals in institutions or the superstructure of social life were of little significance, for history lies beyond the reach of conscious actors, especially the will of revolutionaries.
Noting the political upheavals in Europe and especially in France in 1968, Eric Hobsbawm argued that "in France the virtual hegemony of Braudelian history and the "Annales" came to an end after 1968, and the international influence of the journal dropped steeply." Multiple responses were attempted by the school. Scholars moved in multiple directions, covering in disconnected fashion the social, economic, and cultural history of different eras and different parts of the globe. By the time of the crisis, the school had built a vast publishing and research network reaching across France, Europe, and the rest of the world. Influence indeed spread out from Paris, but few new ideas came in. Much emphasis was given to quantitative data, seen as the key to unlocking all of social history. However, the "Annales" ignored the developments in quantitative studies underway in the U.S. and Britain, which reshaped economic, political and demographic research.
Marxist historiography developed as a school of historiography influenced by the chief tenets of Marxism, including the centrality of social class and economic constraints in determining historical outcomes (historical materialism). Friedrich Engels wrote "The Peasant War in Germany", which analysed social warfare in early Protestant Germany in terms of emerging capitalist classes. Although it lacked a rigorous engagement with archival sources, it indicated an early interest in history from below and class analysis, and it attempted a dialectical analysis. Another treatise of Engels, "The Condition of the Working Class in England in 1844", was salient in creating the socialist impetus in British politics from then on, e.g. the Fabian Society.
R. H. Tawney was an early historian working in this tradition. "The Agrarian Problem in the Sixteenth Century" (1912) and "Religion and the Rise of Capitalism" (1926), reflected his ethical concerns and preoccupations in economic history. He was profoundly interested in the issue of the enclosure of land in the English countryside in the sixteenth and seventeenth centuries and in Max Weber's thesis on the connection between the appearance of Protestantism and the rise of capitalism. His belief in the rise of the gentry in the century before the outbreak of the Civil War in England provoked the 'Storm over the Gentry' in which his methods were subjected to severe criticisms by Hugh Trevor-Roper and John Cooper.
Historiography in the Soviet Union was greatly influenced by Marxist historiography, as historical materialism was extended into the Soviet version of dialectical materialism.
A circle of historians inside the Communist Party of Great Britain (CPGB) formed in 1946 and became a highly influential cluster of British Marxist historians, who contributed to history from below and class structure in early capitalist society. While some members of the group (most notably Christopher Hill and E. P. Thompson) left the CPGB after the 1956 Hungarian Revolution, the common points of British Marxist historiography continued in their works. They placed a great emphasis on the subjective determination of history.
Christopher Hill's studies on 17th-century English history were widely acknowledged and recognised as representative of this school. His books include "Puritanism and Revolution" (1958), "Intellectual Origins of the English Revolution" (1965 and revised in 1996), "The Century of Revolution" (1961), "AntiChrist in 17th-century England" (1971), "The World Turned Upside Down" (1972) and many others.
E. P. Thompson pioneered the study of history from below in his work, "The Making of the English Working Class", published in 1963. It focused on the forgotten history of the first working-class political left in the world in the late-18th and early-19th centuries. In his preface to this book, Thompson set out his approach to writing history from below:
Thompson's work was also significant because of the way he defined "class". He argued that class was not a structure, but a relationship that changed over time. He opened the gates for a generation of labor historians, such as David Montgomery and Herbert Gutman, who made similar studies of the American working classes.
Other important Marxist historians included Eric Hobsbawm, C. L. R. James, Raphael Samuel, A. L. Morton and Brian Pearce.
Although Marxist historiography made important contributions to the history of the working class, oppressed nationalities, and the methodology of history from below, its chief problematic aspect was its argument on the nature of history as "determined" or "dialectical"; this can also be stated as the relative importance of subjective and objective factors in creating outcomes. It increasingly fell out of favour in the 1960s and '70s. Geoffrey Elton was important in undermining the case for a Marxist historiography, which he argued was presenting seriously flawed interpretations of the past. In particular, Elton was opposed to the idea that the English Civil War was caused by socioeconomic changes in the 16th and 17th centuries, arguing instead that it was due largely to the incompetence of the Stuart kings.
In dealing with the era of the Second World War, Addison notes that in Britain by the 1990s, labour history was, "in sharp decline", because:
Biography has been a major form of historiography since the days when Plutarch wrote the parallel lives of great Roman and Greek leaders. It is a field especially attractive to nonacademic historians, and often to the spouses or children of famous people, who have access to the trove of letters and documents. Academic historians tend to downplay biography because it pays too little attention to broad social, cultural, political and economic forces, and perhaps too much attention to popular psychology. The "Great Man" tradition in Britain originated in the multi-volume "Dictionary of National Biography" (which originated in 1882 and issued updates into the 1970s); it continues to this day in the new "Oxford Dictionary of National Biography". In the United States, the "Dictionary of American Biography" was planned in the late 1920s and appeared with numerous supplements into the 1980s. It has now been displaced by the "American National Biography" as well as numerous smaller historical encyclopedias that give thorough coverage to Great Persons. Bookstores do a thriving business in biographies, which sell far more copies than the esoteric monographs based on post-structuralism, cultural, racial or gender history. Michael Holroyd says the last forty years "may be seen as a golden age of biography", but nevertheless calls it the "shallow end of history". Nicolas Barker argues that "more and more biographies command an ever larger readership", as he speculates that biography has come "to express the spirit of our age".
Daniel R. Meister argues that:
Marxist historian E. H. Carr developed a controversial theory of history in his 1961 book "What Is History?", which proved to be one of the most influential books ever written on the subject. He presented a middle-of-the-road position between the empirical or (Rankean) view of history and R. G. Collingwood's idealism, and rejected the empirical view of the historian's work being an accretion of "facts" that they have at their disposal as nonsense. He maintained that there is such a vast quantity of information that the historian always chooses the "facts" they decide to make use of. In Carr's famous example, he claimed that millions had crossed the Rubicon, but only Julius Caesar's crossing in 49 BC is declared noteworthy by historians. For this reason, Carr argued that Leopold von Ranke's famous dictum "wie es eigentlich gewesen" (show what actually happened) was wrong because it presumed that the "facts" influenced what the historian wrote, rather than the historian choosing what "facts of the past" they intended to turn into "historical facts". At the same time, Carr argued that the study of the facts may lead the historian to change his or her views. In this way, Carr argued that history was "an unending dialogue between the past and present".
Carr is held by some critics to have had a deterministic outlook in history. Others have modified or rejected this use of the label "determinist". He took a hostile view of those historians who stress the workings of chance and contingency in history. In Carr's view, no individual is truly free of the social environment in which they live, but he contended that within those limitations there was room, albeit very narrow, for people to make decisions that affect history. Carr emphatically contended that history was a social science, not an art, because historians, like scientists, seek generalizations that help to broaden the understanding of one's subject.
One of Carr's most forthright critics was Hugh Trevor-Roper, who argued that Carr's dismissal of the "might-have-beens of history" reflected a fundamental lack of interest in examining historical causation. Trevor-Roper asserted that examining possible alternative outcomes of history, far from being a "parlour-game", was rather an essential part of the historian's work, as only by considering all possible outcomes of a given situation could a historian properly understand the period.
The controversy inspired Sir Geoffrey Elton to write his 1967 book "The Practice of History". Elton criticized Carr for his "whimsical" distinction between the "historical facts" and the "facts of the past", arguing that it reflected "...an extraordinarily arrogant attitude both to the past and to the place of the historian studying it". Elton, instead, strongly defended the traditional methods of history and was also appalled by the inroads made by postmodernism. Elton saw the duty of historians as empirically gathering evidence and objectively analyzing what the evidence has to say. As a traditionalist, he placed great emphasis on the role of individuals in history instead of abstract, impersonal forces. Elton saw political history as the highest kind of history. He had no use for those who would use history to make myths, to create laws to explain the past, or to produce theories such as Marxism.
Classical and European history was part of the 19th-century grammar school curriculum. American history became a topic later in the 19th century.
In the historiography of the United States, there were a series of major approaches in the 20th century. In 2009–2012, there were an average of 16,000 new academic history books published in the U.S. every year.
From 1910 to the 1940s, "Progressive" historiography was dominant, especially in political studies. It stressed the central importance of class conflict in American history. Important leaders included Vernon L. Parrington, Carl L. Becker, Arthur M. Schlesinger, Sr., John Hicks, and C. Vann Woodward. The movement established a strong base at the History Department at the University of Wisconsin with Curtis Nettels, William Hesseltine, Merle Curti, Howard K. Beale, Merrill Jensen, Fred Harvey Harrington (who became the university president), William Appleman Williams, and a host of graduate students. Charles A. Beard was the most prominent representative with his "Beardian" approach that reached both scholars and the general public.
In covering the Civil War, Charles and Mary Beard did not find it useful to examine nationalism, unionism, states' rights, slavery, abolition or the motivations of soldiers in battle. Instead, they proclaimed it was a:
Arthur Schlesinger, Jr. wrote the "Age of Jackson" (1945), one of the last major books from this viewpoint. Schlesinger made Jackson a hero for his successful attacks on the Second Bank of the United States. His own views were clear enough: "Moved typically by personal and class, rarely by public, considerations, the business community has invariably brought national affairs to a state of crisis and exasperated the rest of society into dissatisfaction bordering on revolt."
Consensus history emphasizes the basic unity of American values and downplays conflict as superficial. It was especially attractive in the 1950s and 1960s. Prominent leaders included Richard Hofstadter, Louis Hartz, Daniel Boorstin, Allan Nevins, Clinton Rossiter, Edmund Morgan, and David M. Potter. In 1948 Hofstadter made a compelling statement of the consensus model of the U.S. political tradition:
Consensus history was rejected by New Left viewpoints that attracted a younger generation of radical historians in the 1960s. These viewpoints stress conflict and emphasize the central roles of class, race, and gender. The history of dissent and the experiences of racial minorities and disadvantaged classes were central to the narratives produced by New Left historians.
Social history, sometimes called the "new social history", is a broad branch that studies the experiences of ordinary people in the past. It had major growth as a field in the 1960s and 1970s, and still is well represented in history departments. However, after 1980 the "cultural turn" directed the next generation to new topics. In the two decades from 1975 to 1995, the proportion of professors of history in U.S. universities identifying with social history rose from 31% to 41%, while the proportion of political historians fell from 40% to 30%.
The growth was enabled by the social sciences, computers, statistics, new data sources such as individual census information, and summer training programs at the Newberry Library and the University of Michigan. The New Political History saw the application of social history methods to politics, as the focus shifted from politicians and legislation to voters and elections.
The Social Science History Association was formed in 1976 as an interdisciplinary group with a journal "Social Science History" and an annual convention. The goal was to incorporate in historical studies perspectives from all the social sciences, especially political science, sociology and economics. The pioneers shared a commitment to quantification. However, by the 1980s the first blush of quantification had worn off, as traditional historians counterattacked. Harvey J. Graff says:
Meanwhile, quantitative history became well established in other disciplines, especially economics (where it was called "cliometrics") and political science. In history, however, quantification remained central to demographic studies but slipped behind in political and social history as traditional narrative approaches made a comeback.
Latin America comprises the former Spanish American empire in the Western Hemisphere plus Portuguese Brazil. Professional historians pioneered the creation of this field, starting in the late nineteenth century, though the term "Latin America" did not come into general usage until the twentieth century and was in some cases rejected. The historiography of the field has been more fragmented than unified, with historians of Spanish America and Brazil generally remaining in separate spheres. Another standard division within the historiography is temporal, with works falling into either the early modern period (or "colonial era") or the post-independence (or "national") period, from the early nineteenth century onward. Relatively few works span the two eras, and few works except textbooks unite Spanish America and Brazil. There is a tendency to focus on histories of particular countries or regions (the Andes, the Southern Cone, the Caribbean), with relatively little comparative work.
Historians of Latin America have contributed to various types of historical writing, but one major, innovative development in Spanish American history is the emergence of ethnohistory, the history of indigenous peoples, especially in Mexico, based on alphabetic sources in Spanish or in indigenous languages.
For the early modern period, the emergence of Atlantic history, a field in its own right based on comparisons and linkages of Europe, the Americas, and Africa from 1450 to 1850, has integrated early modern Latin American history into a larger framework. For all periods, global or world history has focused on the connections between areas, likewise integrating Latin America into a larger perspective. Latin America's importance to world history is notable but often overlooked. "Latin America's central, and sometimes pioneering, role in the development of globalization and modernity did not cease with the end of colonial rule and the early modern period. Indeed, the region's political independence places it at the forefront of two trends that are regularly considered thresholds of the modern world. The first is the so-called liberal revolution, the shift from monarchies of the ancien régime, where inheritance legitimated political power, to constitutional republics... The second, and related, trend consistently considered a threshold of modern history that saw Latin America in the forefront is the development of nation-states."
Historical research appears in a number of specialized journals. These include "Hispanic American Historical Review" (est. 1918), published by the Conference on Latin American History; "The Americas" (est. 1944); "Journal of Latin American Studies" (est. 1969); "Canadian Journal of Latin American and Caribbean Studies" (est. 1976); "Bulletin of Latin American Research" (est. 1981); "Colonial Latin American Review" (est. 1992); and "Colonial Latin American Historical Review" (est. 1992). "Latin American Research Review" (est. 1969), published by the Latin American Studies Association, does not focus primarily on history, but it has often published historiographical essays on particular topics.
General works on Latin American history have appeared since the 1950s, when the teaching of Latin American history expanded in U.S. universities and colleges. Most attempt full coverage of Spanish America and Brazil from the conquest to the modern era, focusing on institutional, political, social, and economic history. An important eleven-volume treatment of Latin American history is "The Cambridge History of Latin America", with separate volumes on the colonial era, the nineteenth century, and the twentieth century. A small number of general works have gone through multiple editions. Major trade publishers have also issued edited volumes on Latin American history and historiography. Reference works include the "Handbook of Latin American Studies", which publishes articles by area experts with annotated bibliographic entries, and the "Encyclopedia of Latin American History and Culture".
World history, as a distinct field of historical study, emerged as an independent academic field in the 1980s. It focused on the examination of history from a global perspective and looked for common patterns that emerged across all cultures. The basic thematic approach of this field was to analyse two major focal points: integration – (how processes of world history have drawn people of the world together), and difference – (how patterns of world history reveal the diversity of the human experience).
Arnold J. Toynbee's ten-volume "A Study of History" took an approach that was widely discussed in the 1930s and 1940s, though by the 1960s his work was virtually ignored by scholars and the general public. He compared 26 independent civilizations and argued that they displayed striking parallels in their origin, growth, and decay. He proposed a universal model for each of these civilizations, detailing the stages through which they all pass: genesis, growth, time of troubles, universal state, and disintegration. The later volumes placed too much emphasis on spirituality to satisfy critics.
Chicago historian William H. McNeill wrote "The Rise of the West" (1965) to show how the separate civilizations of Eurasia interacted from the very beginning of their history, borrowing critical skills from one another and thus precipitating still further change as adjustment between traditional old and borrowed new knowledge and practice became necessary. He then discusses the dramatic effect of Western civilization on others in the past 500 years of history. McNeill took a broad approach organized around the interactions of peoples across the globe; such interactions have become both more numerous and more continual and substantial in recent times. Before about 1500, the network of communication between cultures was that of Eurasia. The terms for these areas of interaction differ from one world historian to another and include "world-system" and "ecumene". His emphasis on cultural fusions influenced historical theory significantly.
The "cultural turn" of the 1980s and 1990s affected scholars in most areas of history. Inspired largely by anthropology, it turned away from leaders, ordinary people and famous events to look at the use of language and cultural symbols to represent the changing values of society.
The British historian Peter Burke finds that cultural studies has numerous spinoffs, or topical themes it has strongly influenced. The most important include gender studies and postcolonial studies, as well as memory studies, and film studies.
Diplomatic historian Melvyn P. Leffler finds that the problem with the "cultural turn" is that the culture concept is imprecise, and may produce excessively broad interpretations, because it:
Memory studies is a new field focused on how nations and groups (and historians) construct and select their memories of the past in order to celebrate (or denounce) key features, thus making a statement of their current values and beliefs. Historians have played a central role in shaping the memories of the past as their work is diffused through popular history books and school textbooks. The French sociologist Maurice Halbwachs opened the field with "La mémoire collective" (Paris, 1950).
Many historians examine how the memory of the past has been constructed, memorialized or distorted. Historians examine how legends are invented. For example, there are numerous studies of the memory of atrocities from World War II, notably the Holocaust in Europe and Japanese behavior in Asia. British historian Heather Jones argues that the historiography of the First World War in recent years has been reinvigorated by the cultural turn. Scholars have raised entirely new questions regarding military occupation, radicalization of politics, race, and the male body.
Representative of recent scholarship is a collection of studies on the "Dynamics of Memory and Identity in Contemporary Europe". SAGE has published the scholarly journal "Memory Studies" since 2008, and the book series "Memory Studies" was launched by Palgrave Macmillan in 2010 with 5–10 titles a year.
The historical journal, a forum where academic historians could exchange ideas and publish newly discovered information, came into being in the 19th century. The early journals were similar to those for the physical sciences and were seen as a means for history to become more professional. Journals also helped historians to establish various historiographical approaches, the most notable example of which was "Annales. Économies, sociétés, civilisations", a publication of the "Annales" school in France. Journals now typically have one or more editors and associate editors, an editorial board, and a pool of scholars to whom submitted articles are sent for confidential evaluation. The editors send out new books to recognized scholars for reviews that usually run 500 to 1,000 words. The vetting and publication process often takes months or longer. Publication in a prestigious journal (one that accepts 10% or fewer of the articles submitted) is an asset in the academic hiring and promotion process, as it demonstrates that the author is conversant with the scholarly field. Page charges and fees for publication are uncommon in history; journals are subsidized by universities, historical societies, scholarly associations, and subscription fees from libraries and scholars. Increasingly, journals are available through consortia that allow many academic institutions to share subscriptions to online versions, and most libraries have a system for obtaining specific articles through interlibrary loan.
According to Lawrence Stone, narrative has traditionally been the main rhetorical device used by historians. In 1979, at a time when the new Social History was demanding a social-science model of analysis, Stone detected a move back toward the narrative. Stone defined narrative as follows: it is organized chronologically; it is focused on a single coherent story; it is descriptive rather than analytical; it is concerned with people not abstract circumstances; and it deals with the particular and specific rather than the collective and statistical. He reported that, "More and more of the 'new historians' are now trying to discover what was going on inside people's heads in the past, and what it was like to live in the past, questions which inevitably lead back to the use of narrative."
Historians committed to a social science approach, however, have criticized the narrowness of narrative and its preference for anecdote over analysis, and its use of clever examples rather than statistically verified empirical regularities.
Some of the common topics in historiography are:
How a historian approaches historical events is one of the most important decisions within historiography. It is commonly recognised by historians that, in themselves, individual historical facts dealing with names, dates and places are not particularly meaningful. Such facts will only become useful when assembled with other historical evidence, and the process of assembling this evidence is understood as a particular historiographical approach.
The most influential historiographical approaches are:
Important related fields include:
Holy Roman Empire
The Holy Roman Empire, occasionally but unofficially referred to as the Holy Roman Empire of the German Nation, was a multi-ethnic complex of territories in Western and Central Europe that developed during the Early Middle Ages and continued until its dissolution in 1806 during the Napoleonic Wars. The largest territory of the empire after 962 was the Kingdom of Germany, though it also included the neighboring Kingdom of Bohemia and Kingdom of Italy, plus numerous other territories, and soon after the Kingdom of Burgundy was added. However, while by the 15th century the Empire was still in theory composed of three major blocks – Italy, Germany, and Burgundy – in practice, the links between these blocks had become so unsubstantial that only the Kingdom of Germany remained, nearly all the Italian territories for instance having become in effect part of a narrowly-defined Habsburg dynastic patrimony, unconnected to the Empire. The external borders of the Empire did not change noticeably from the Peace of Westphalia – which acknowledged the exclusion of Switzerland and the Northern Netherlands, and the French protectorate over Alsace – to the dissolution of the Empire. By then, it largely contained only German-speaking territories, plus the Kingdom of Bohemia. At the conclusion of the Napoleonic Wars in 1815, most of the Holy Roman Empire was included in the German Confederation.
On 25 December 800, Pope Leo III crowned the Frankish king Charlemagne as Emperor, reviving the title in Western Europe, more than three centuries after the fall of the earlier ancient Western Roman Empire in 476. The title continued in the Carolingian family until 888 and from 896 to 899, after which it was contested by the rulers of Italy in a series of civil wars until the death of the last Italian claimant, Berengar I, in 924. The title was revived again in 962 when Otto I was crowned emperor, fashioning himself as the successor of Charlemagne and beginning a continuous existence of the empire for over eight centuries. Some historians refer to the coronation of Charlemagne as the origin of the empire, while others prefer the coronation of Otto I as its beginning. Scholars generally concur, however, in relating an evolution of the institutions and principles constituting the empire, describing a gradual assumption of the imperial title and role.
The exact term "Holy Roman Empire" was not used until the 13th century, before which the empire was referred to variously as "universum regnum" ("the whole kingdom", as opposed to the regional kingdoms), "imperium christianum" ("Christian empire"), or "Romanum imperium" ("Roman empire"), but the Emperor's legitimacy always rested on the concept of "translatio imperii", that he held supreme power inherited from the ancient emperors of Rome. The dynastic office of Holy Roman Emperor was traditionally elective through the mostly German prince-electors, the highest-ranking noblemen of the empire; they would elect one of their peers as "King of the Romans" to be crowned emperor by the Pope, although the tradition of papal coronations was discontinued in the 16th century.
The empire never achieved the degree of political unification that emerged to the west in France, evolving instead into a decentralized, limited elective monarchy composed of hundreds of sub-units: kingdoms, principalities, duchies, counties, prince-bishoprics, Free Imperial Cities, and other domains. The power of the emperor was limited, and while the various princes, lords, bishops, and cities of the empire were vassals who owed the emperor their allegiance, they also possessed a range of privileges that gave them "de facto" independence within their territories. Emperor Francis II dissolved the empire on 6 August 1806 following the creation of the Confederation of the Rhine by Emperor Napoleon I the month before.
Before 1157, the realm was merely referred to as the Roman Empire. The term "sacrum" ("holy", in the sense of "consecrated") in connection with the medieval Roman Empire was used beginning in 1157 under Frederick I Barbarossa ("Holy Empire"): the term was added to reflect Frederick's ambition to dominate Italy and the Papacy. The form "Holy Roman Empire" is attested from 1254 onward.
In a decree following the 1512 Diet of Cologne, the name was changed to the "Holy Roman Empire of the German Nation", a form first used in a document in 1474. The new title was adopted partly because the Empire had lost most of its territories in Italy and Burgundy (the Kingdom of Arles) to the south and west by the late 15th century, but also to emphasize the new importance of the German Imperial Estates in ruling the Empire due to the Imperial Reform. By the end of the 18th century, the term "Holy Roman Empire of the German Nation" had fallen out of official use. Contradicting the traditional view of that designation, Hermann Weisert has argued in a study of imperial titulature that, despite the claims of many textbooks, the name "Holy Roman Empire of the German Nation" never had official status, pointing out that documents were thirty times as likely to omit the national suffix as to include it.
In a famous assessment of the name, the political philosopher Voltaire remarked sardonically: "This body which was called and which still calls itself the Holy Roman Empire was in no way holy, nor Roman, nor an empire."
In the modern period, the Empire was often informally called the German Empire or the Roman-German Empire. From its dissolution until the end of the German Empire, it was often called "the old Empire". Beginning in 1923, German nationalists and later Nazi propaganda would identify the Holy Roman Empire as the First Reich ("Reich" meaning empire), with the German Empire as the Second Reich and either a future German nationalist state or Nazi Germany as the Third Reich.
As Roman power in Gaul declined during the 5th century, local Germanic tribes assumed control. In the late 5th and early 6th centuries, the Merovingians, under Clovis I and his successors, consolidated Frankish tribes and extended hegemony over others to gain control of northern Gaul and the middle Rhine river valley region. By the middle of the 8th century, however, the Merovingians had been reduced to figureheads, and the Carolingians, led by Charles Martel, had become the "de facto" rulers. In 751, Martel's son Pepin became King of the Franks, and later gained the sanction of the Pope. The Carolingians would maintain a close alliance with the Papacy.
In 768, Pepin's son Charlemagne became King of the Franks and began an extensive expansion of the realm. He eventually incorporated the territories of present-day France, Germany, northern Italy, the Low Countries and beyond, linking the Frankish kingdom with Papal lands.
Although antagonism about the expense of Byzantine domination had long persisted within Italy, a political rupture was set in motion in earnest in 726 by the iconoclasm of Emperor Leo III the Isaurian, in what Pope Gregory II saw as the latest in a series of imperial heresies. In 797, the Eastern Roman Emperor Constantine VI was removed from the throne by his mother Irene who declared herself Empress. As the Latin Church, influenced by Gothic law forbidding female leadership and property ownership, only regarded a male Roman Emperor as the head of Christendom, Pope Leo III sought a new candidate for the dignity, excluding consultation with the Patriarch of Constantinople. Charlemagne's good service to the Church in his defense of Papal possessions against the Lombards made him the ideal candidate. On Christmas Day of 800, Pope Leo III crowned Charlemagne emperor, restoring the title in the West for the first time in over three centuries. This can be seen as symbolic of the papacy turning away from the declining Byzantine Empire towards the new power of Carolingian Francia. Charlemagne adopted the formula "Renovatio imperii Romanorum" ("renewal of the Roman Empire"). In 802, Irene was overthrown and exiled by Nikephoros I and henceforth there were two Roman Emperors.
After Charlemagne died in 814, the imperial crown passed to his son, Louis the Pious. Upon Louis' death in 840, it passed to his son Lothair, who had been his co-ruler. By this point the territory of Charlemagne had been divided into several territories, and over the course of the later ninth century the title of Emperor was disputed by the Carolingian rulers of Western Francia and Eastern Francia, with first the western king (Charles the Bald) and then the eastern (Charles the Fat), who briefly reunited the Empire, attaining the prize; however, after the death of Charles the Fat in 888 the Carolingian Empire broke apart, and was never restored. According to Regino of Prüm, the parts of the realm "spewed forth kinglets", and each part elected a kinglet "from its own bowels". After the death of Charles the Fat, those crowned emperor by the pope controlled only territories in Italy. The last such emperor was Berengar I of Italy, who died in 924.
Around 900, autonomous stem duchies (Franconia, Bavaria, Swabia, Saxony, and Lotharingia) reemerged in East Francia. After the Carolingian king Louis the Child died without issue in 911, East Francia did not turn to the Carolingian ruler of West Francia to take over the realm but instead elected one of the dukes, Conrad of Franconia, as "Rex Francorum Orientalium". On his deathbed, Conrad yielded the crown to his main rival, Henry the Fowler of Saxony (r. 919–36), who was elected king at the Diet of Fritzlar in 919. Henry reached a truce with the raiding Magyars, and in 933 he won a first victory against them in the Battle of Riade.
Henry died in 936, but his descendants, the Liudolfing (or Ottonian) dynasty, would continue to rule the Eastern kingdom for roughly a century. Upon Henry the Fowler's death, Otto, his son and designated successor, was elected King in Aachen in 936. He overcame a series of revolts from a younger brother and from several dukes. After that, the king managed to control the appointment of dukes and often also employed bishops in administrative affairs.
In 951, Otto came to the aid of Adelaide, the widowed queen of Italy, defeating her enemies, marrying her, and taking control over Italy. In 955, Otto won a decisive victory over the Magyars in the Battle of Lechfeld. In 962, Otto was crowned emperor by Pope John XII, thus intertwining the affairs of the German kingdom with those of Italy and the Papacy. Otto's coronation as Emperor marked the German kings as successors to the Empire of Charlemagne, which through the concept of "translatio imperii", also made them consider themselves as successors to Ancient Rome.
The kingdom had no permanent capital city. Kings traveled between residences (called Kaiserpfalz) to discharge affairs, though each king preferred certain places; in Otto's case, this was the city of Magdeburg. Kingship continued to be transferred by election, but Kings often ensured their own sons were elected during their lifetimes, enabling them to keep the crown for their families. This only changed after the end of the Salian dynasty in the 12th century.
In 963, Otto deposed the current Pope John XII and chose Pope Leo VIII as the new pope (although John XII and Leo VIII both claimed the papacy until 964, when John XII died). This also renewed the conflict with the Eastern Emperor in Constantinople, especially after Otto's son Otto II (r. 967–83) adopted the designation "imperator Romanorum". Still, Otto II formed marital ties with the east when he married the Byzantine princess Theophanu. Their son, Otto III, came to the throne aged only three, and was subjected to a power struggle and a series of regencies until he reached his majority in 994. Up to that time, he had remained in Germany, while a deposed duke, Crescentius II, ruled over Rome and part of Italy, ostensibly in his stead.
In 996 Otto III appointed his cousin Gregory V the first German Pope. A foreign pope and foreign papal officers were seen with suspicion by Roman nobles, who were led by Crescentius II to revolt. Otto III's former mentor Antipope John XVI briefly held Rome, until the Holy Roman Emperor seized the city.
Otto died young in 1002, and was succeeded by his cousin Henry II, who focused on Germany.
Henry II died in 1024 and Conrad II, first of the Salian Dynasty, was elected king only after some debate among dukes and nobles. This group eventually developed into the college of Electors.
The Holy Roman Empire eventually came to be composed of four kingdoms. The kingdoms were:
Kings often employed bishops in administrative affairs and often determined who would be appointed to ecclesiastical offices. In the wake of the Cluniac Reforms, this involvement was increasingly seen as inappropriate by the Papacy. The reform-minded Pope Gregory VII was determined to oppose such practices, which led to the Investiture Controversy with Henry IV (r. 1056–1106), the King of the Romans and Holy Roman Emperor. Henry IV repudiated the Pope's interference and persuaded his bishops to excommunicate the Pope, whom he famously addressed by his born name "Hildebrand", rather than his regnal name "Pope Gregory VII". The Pope, in turn, excommunicated the king, declared him deposed, and dissolved the oaths of loyalty made to Henry. The king found himself with almost no political support and was forced to make the famous Walk to Canossa in 1077, by which he achieved a lifting of the excommunication at the price of humiliation. Meanwhile, the German princes had elected another king, Rudolf of Swabia. Henry managed to defeat him but was subsequently confronted with more uprisings, renewed excommunication, and even the rebellion of his sons. After his death, his second son, Henry V, reached an agreement with the Pope and the bishops in the 1122 Concordat of Worms. The political power of the Empire was maintained, but the conflict had demonstrated the limits of the ruler's power, especially in regard to the Church, and it robbed the king of the sacral status he had previously enjoyed. The Pope and the German princes had surfaced as major players in the political system of the empire.
When the Salian dynasty ended with Henry V's death in 1125, the princes chose not to elect the next of kin, but rather Lothair, the moderately powerful but already old Duke of Saxony. When he died in 1137, the princes again aimed to check royal power; accordingly they did not elect Lothair's favoured heir, his son-in-law Henry the Proud of the Welf family, but Conrad III of the Hohenstaufen family, the grandson of Emperor Henry IV and thus a nephew of Emperor Henry V. This led to over a century of strife between the two houses. Conrad ousted the Welfs from their possessions, but after his death in 1152, his nephew Frederick I "Barbarossa" succeeded him and made peace with the Welfs, restoring his cousin Henry the Lion to his – albeit diminished – possessions.
The Hohenstaufen rulers increasingly lent land to "ministerialia", formerly non-free servicemen, who Frederick hoped would be more reliable than dukes. Initially used mainly for war services, this new class of people would form the basis for the later knights, another basis of imperial power. A further important constitutional move at Roncaglia was the establishment of a new peace mechanism for the entire empire, the Landfrieden, with the first imperial one being issued in 1103 under Henry IV at Mainz. This was an attempt to abolish private feuds, between the many dukes and other people, and to tie the Emperor's subordinates to a legal system of jurisdiction and public prosecution of criminal acts – a predecessor of the modern concept of "rule of law". Another new concept of the time was the systematic foundation of new cities by the Emperor and by the local dukes. These were partly caused by the explosion in population, and they also concentrated economic power at strategic locations. Before this, cities had only existed in the form of old Roman foundations or older bishoprics. Cities that were founded in the 12th century include Freiburg, possibly the economic model for many later cities, and Munich.
Frederick I, also called Frederick Barbarossa, was crowned Emperor in 1155. He emphasized the "Romanness" of the empire, partly in an attempt to justify the power of the Emperor independent of the (now strengthened) Pope. An imperial assembly at the fields of Roncaglia in 1158 reclaimed imperial rights in reference to Justinian's Corpus Juris Civilis. Imperial rights had been referred to as "regalia" since the Investiture Controversy but were enumerated for the first time at Roncaglia. This comprehensive list included public roads, tariffs, coining, collecting punitive fees, and the investiture or seating and unseating of office holders. These rights were now explicitly rooted in Roman Law, a far-reaching constitutional act.
Frederick's policies were primarily directed at Italy, where he clashed with the increasingly wealthy and free-minded cities of the north, especially Milan. He also embroiled himself in another conflict with the Papacy by supporting a candidate elected by a minority against Pope Alexander III (1159–81). Frederick supported a succession of antipopes before finally making peace with Alexander in 1177. In Germany, the Emperor had repeatedly protected Henry the Lion against complaints by rival princes or cities (especially in the cases of Munich and Lübeck). Henry gave only lackluster support to Frederick's policies, and in a critical situation during the Italian wars, Henry refused the Emperor's plea for military support. After returning to Germany, an embittered Frederick opened proceedings against the Duke, resulting in a public ban and the confiscation of all his territories. In 1190, Frederick participated in the Third Crusade and died in the Armenian Kingdom of Cilicia.
During the Hohenstaufen period, German princes facilitated a successful, peaceful eastward settlement of lands that were uninhabited or sparsely inhabited by West Slavs. German-speaking farmers, traders, and craftsmen from the western part of the Empire, both Christians and Jews, moved into these areas. The gradual Germanization of these lands was a complex phenomenon that should not be interpreted in the biased terms of 19th-century nationalism. The eastward settlement expanded the influence of the empire to include Pomerania and Silesia, as did the intermarriage of the local, still mostly Slavic, rulers with German spouses. The Teutonic Knights were invited to Prussia by Duke Konrad of Masovia to Christianize the Prussians in 1226. The monastic state of the Teutonic Order and its later German successor state of Prussia were never part of the Holy Roman Empire.
Under the son and successor of Frederick Barbarossa, Henry VI, the Hohenstaufen dynasty reached its apex. Henry added the Norman kingdom of Sicily to his domains and held English king Richard the Lionheart captive, and he aimed to establish a hereditary monarchy, but died in 1197 before he could do so. As his son, Frederick II, though already elected king, was still a small child living in Sicily, the German princes chose to elect an adult king, resulting in the dual election of Frederick Barbarossa's youngest son Philip of Swabia and Henry the Lion's son Otto of Brunswick, who competed for the crown. Otto prevailed for a while after Philip was murdered in a private squabble in 1208, until he, too, began to claim Sicily.
Pope Innocent III, who feared the threat posed by a union of the empire and Sicily, was now supported by Frederick II, who marched to Germany and defeated Otto. After his victory, Frederick did not act upon his promise to keep the two realms separate. Though he had made his son Henry king of Sicily before marching on Germany, he still reserved real political power for himself. This continued after Frederick was crowned Emperor in 1220. Fearing Frederick's concentration of power, the Pope finally excommunicated the Emperor. Another point of contention was the crusade, which Frederick had promised but repeatedly postponed. Now, although excommunicated, Frederick led the Sixth Crusade in 1228, which ended in negotiations and a temporary restoration of the Kingdom of Jerusalem.
Despite his imperial claims, Frederick's rule was a major turning point towards the disintegration of central rule in the Empire. While concentrated on establishing a modern, centralized state in Sicily, he was mostly absent from Germany and issued far-reaching privileges to Germany's secular and ecclesiastical princes: in the 1220 "Confoederatio cum principibus ecclesiasticis," Frederick gave up a number of "regalia" in favour of the bishops, among them tariffs, coining, and fortification. The 1232 "Statutum in favorem principum" mostly extended these privileges to secular territories. Although many of these privileges had existed earlier, they were now granted globally, and once and for all, to allow the German princes to maintain order north of the Alps while Frederick concentrated on Italy. The 1232 document marked the first time that the German dukes were called "domini terræ," owners of their lands, a remarkable change in terminology as well.
The Kingdom of Bohemia was a significant regional power during the Middle Ages. In 1212, King Ottokar I (who had borne the title "king" since 1198) extracted the Golden Bull of Sicily (a formal edict) from the emperor Frederick II, confirming the royal title for Ottokar and his descendants and raising the Duchy of Bohemia to a kingdom. Bohemian kings were to be exempt from all future obligations to the Holy Roman Empire except for participation in the imperial councils. Charles IV later made Prague the seat of the Holy Roman Emperor.
After the death of Frederick II in 1250, the German kingdom was divided between his son Conrad IV (died 1254) and the anti-king, William of Holland (died 1256). Conrad's death was followed by the Interregnum, during which no king could achieve universal recognition, allowing the princes to consolidate their holdings and become even more independent rulers. After 1257, the crown was contested between Richard of Cornwall, who was supported by the Guelph party, and Alfonso X of Castile, who was recognized by the Hohenstaufen party but never set foot on German soil. After Richard's death in 1273, Rudolf I of Germany, a minor pro-Staufen count, was elected. He was the first of the Habsburgs to hold a royal title, but he was never crowned emperor. Rudolf, who died in 1291, was followed by Adolf and Albert, two further weak kings who were never crowned emperor.
Albert was assassinated in 1308. Almost immediately, King Philip IV of France began aggressively seeking support for his brother, Charles of Valois, to be elected the next King of the Romans. Philip thought he had the backing of the French Pope Clement V (established at Avignon in 1309), and that his prospects of bringing the empire into the orbit of the French royal house were good. He lavishly spread French money in the hope of bribing the German electors. Although Charles of Valois had the backing of Henry, Archbishop of Cologne, a French supporter, many were not keen to see an expansion of French power, least of all Clement V. The principal rival to Charles appeared to be Rudolf, the Count Palatine.
Instead, Henry VII, of the House of Luxembourg, was elected with six votes at Frankfurt on 27 November 1308. Although he was a vassal of King Philip, Henry was bound by few national ties, which made him suitable as a compromise candidate among the electors, the great territorial magnates who had lived without a crowned emperor for decades and who were unhappy with both Charles and Rudolf. Henry VII's brother, Baldwin, Archbishop of Trier, won over a number of the electors, including Henry of Cologne, in exchange for some substantial concessions. Henry VII was crowned king at Aachen on 6 January 1309, and emperor by Pope Clement V on 29 June 1312 in Rome, ending the interregnum.
During the 13th century, a general structural change in how land was administered prepared the shift of political power towards the rising bourgeoisie at the expense of the aristocratic feudalism that would characterize the Late Middle Ages. The rise of the cities and the emergence of the new burgher class eroded the societal, legal and economic order of feudalism. Instead of personal duties, money increasingly became the common means to represent economic value in agriculture. Peasants were increasingly required to pay tribute for their lands. The concept of "property" began to replace more ancient forms of jurisdiction, although they were still very much tied together. In the territories (not at the level of the Empire), power became increasingly bundled: whoever owned the land had jurisdiction, from which other powers derived. However, that jurisdiction at the time did not include legislation, which was virtually non-existent until well into the 15th century. Court practice heavily relied on traditional customs or rules described as customary.
During this time territories began to transform into the predecessors of modern states. The process varied greatly among the various lands and was most advanced in those territories that were almost identical to the lands of the old Germanic tribes, "e.g.", Bavaria. It was slower in those scattered territories that were founded through imperial privileges.
In the 12th century the Hanseatic League established itself as a commercial and defensive alliance of the merchant guilds of towns and cities in the empire and all over northern and central Europe. It dominated marine trade in the Baltic Sea, the North Sea and along the connected navigable rivers. Each of the affiliated cities retained the legal system of its sovereign and, with the exception of the Free imperial cities, had only a limited degree of political autonomy. By the late 14th century the powerful league enforced its interests with military means, if necessary. This culminated in a war with the sovereign Kingdom of Denmark from 1361 to 1370. The league declined after 1450.
The difficulties in electing the king eventually led to the emergence of a fixed college of prince-electors ("Kurfürsten"), whose composition and procedures were set forth in the Golden Bull of 1356, which remained valid until 1806. This development probably best symbolizes the emerging duality between emperor and realm ("Kaiser und Reich"), which were no longer considered identical. The Golden Bull also set forth the system for electing the Holy Roman Emperor: the emperor was now to be elected by a majority rather than by the consent of all seven electors. The electors' titles became hereditary, and they were granted the right to mint coins and to exercise jurisdiction. It was also recommended that their sons learn the imperial languages – German, Latin, Italian, and Czech.
The shift in power away from the emperor is also revealed in the way the post-Hohenstaufen kings attempted to sustain their power. Earlier, the Empire's strength (and finances) greatly relied on the Empire's own lands, the so-called "Reichsgut", which always belonged to the king of the day and included many Imperial Cities. After the 13th century, the relevance of the "Reichsgut" faded, even though some parts of it did remain until the Empire's end in 1806. Instead, the "Reichsgut" was increasingly pawned to local dukes, sometimes to raise money for the Empire, but more frequently to reward faithful duty or as an attempt to establish control over the dukes. The direct governance of the "Reichsgut" no longer matched the needs of either the king or the dukes.
The kings beginning with Rudolf I of Germany increasingly relied on the lands of their respective dynasties to support their power. In contrast with the "Reichsgut", which was mostly scattered and difficult to administer, these territories were relatively compact and thus easier to control. In 1282, Rudolf I thus lent Austria and Styria to his own sons. In 1312, Henry VII of the House of Luxembourg was crowned as the first Holy Roman Emperor since Frederick II. After him all kings and emperors relied on the lands of their own family ("Hausmacht"): Louis IV of Wittelsbach (king 1314, emperor 1328–47) relied on his lands in Bavaria; Charles IV of Luxembourg, the grandson of Henry VII, drew strength from his own lands in Bohemia. It was thus increasingly in the king's own interest to strengthen the power of the territories, since the king profited from doing so in his own lands as well.
The "constitution" of the Empire still remained largely unsettled at the beginning of the 15th century. Although some procedures and institutions had been fixed, for example by the Golden Bull of 1356, the rules of how the king, the electors, and the other dukes should cooperate in the Empire depended largely on the personality of the respective king. It therefore proved somewhat damaging that Sigismund of Luxemburg (king 1410, emperor 1433–1437) and Frederick III of Habsburg (king 1440, emperor 1452–1493) neglected the old core lands of the empire and mostly resided in their own lands. Without the presence of the king, the old institution of the "Hoftag", the assembly of the realm's leading men, deteriorated. The "Imperial Diet" as a legislative organ of the Empire did not exist at that time. The dukes often conducted feuds against each other – feuds that, more often than not, escalated into local wars.
Simultaneously, the Catholic Church experienced crises of its own, with wide-reaching effects in the Empire. The conflict between several papal claimants (two anti-popes and the "legitimate" Pope) ended only with the Council of Constance (1414–1418); after 1419 the Papacy directed much of its energy to suppressing the Hussites. The medieval idea of unifying all Christendom into a single political entity, with the Church and the Empire as its leading institutions, began to decline.
With these drastic changes, much discussion emerged in the 15th century about the Empire itself. Rules from the past no longer adequately described the structure of the time, and a reinforcement of earlier "Landfrieden" was urgently needed. During this time, the concept of "reform" emerged, in the original sense of the Latin verb "re-formare" – to regain an earlier shape that had been lost.
When Frederick III needed the dukes to finance a war against Hungary in 1486, and at the same time had his son (later Maximilian I) elected king, he faced a demand from the united dukes for their participation in an Imperial Court. For the first time, the assembly of the electors and other dukes was now called the Imperial Diet (German "Reichstag") (to be joined by the Imperial Free Cities later). While Frederick refused, his more conciliatory son finally convened the Diet at Worms in 1495, after his father's death in 1493. Here, the king and the dukes agreed on four bills, commonly referred to as the "Reichsreform" (Imperial Reform): a set of legal acts to give the disintegrating Empire some structure. For example, this act produced the Imperial Circle Estates and the "Reichskammergericht" (Imperial Chamber Court), institutions that would – to a degree – persist until the end of the Empire in 1806. It took a few more decades for the new regulation to gain universal acceptance and for the new court to begin functioning effectively; the Imperial Circles were finalized in 1512. The King also made sure that his own court, the "Reichshofrat", continued to operate in parallel to the "Reichskammergericht". Also in 1512, the Empire received its new title, the "Heiliges Römisches Reich Deutscher Nation" ("Holy Roman Empire of the German Nation").
In 1516, Ferdinand II of Aragon, grandfather of the future Holy Roman Emperor Charles V, died. Due to a combination of (1) the traditions of dynastic succession in Aragon, which permitted maternal inheritance with no precedence for female rule; (2) the insanity of Charles's mother, Joanna of Castile; and (3) the insistence by his remaining grandfather, Maximilian I, that he take up his royal titles, Charles initiated his reign in Castile and Aragon, a union which evolved into Spain, in conjunction with his mother. This ensured for the first time that all the realms of what is now Spain would be united by one monarch under one nascent Spanish crown. The founding territories retained their separate governance codes and laws. In 1519, already reigning as "Carlos I" in Spain, Charles took up the imperial title as "Karl V". The balance (and imbalance) between these separate inheritances would be defining elements of his reign and would ensure that personal union between the Spanish and German crowns would be short-lived. The latter would end up going to a more junior branch of the Habsburgs in the person of Charles's brother Ferdinand, while the senior branch continued to rule in Spain and in the Burgundian inheritance in the person of Charles's son, Philip II of Spain.
In addition to conflicts between his Spanish and German inheritances, conflicts of religion would be another source of tension during the reign of Charles V. Before Charles's reign in the Holy Roman Empire began, in 1517, Martin Luther launched what would later be known as the Reformation. At this time, many local dukes saw it as a chance to oppose the hegemony of Emperor Charles V. The empire then became fatally divided along religious lines, with the north, the east, and many of the major cities – Strasbourg, Frankfurt, and Nuremberg – becoming Protestant while the southern and western regions largely remained Catholic.
Charles V continued to battle the French and the Protestant princes in Germany for much of his reign. After his son Philip married Queen Mary of England, it appeared that France would be completely surrounded by Habsburg domains, but this hope proved unfounded when the marriage produced no children. In 1555, Paul IV was elected pope and took the side of France, whereupon an exhausted Charles finally gave up his hopes of a world Christian empire. He abdicated and divided his territories between Philip and Ferdinand of Austria. The Peace of Augsburg ended the war in Germany and accepted the existence of Protestantism in the form of Lutheranism, while Calvinism was still not recognized. Anabaptist, Arminian and other minor Protestant communities were also forbidden.
Germany would enjoy relative peace for the next six decades. On the eastern front, the Turks continued to loom large as a threat, although war would mean further compromises with the Protestant princes, and so the Emperor sought to avoid it. In the west, the Rhineland increasingly fell under French influence. After the Dutch revolt against Spain erupted, the Empire remained neutral, "de facto" allowing the Netherlands to depart the empire in 1581, a secession acknowledged in 1648. A side effect was the Cologne War, which ravaged much of the upper Rhine.
After Ferdinand died in 1564, his son Maximilian II became Emperor, and like his father accepted the existence of Protestantism and the need for occasional compromise with it. Maximilian was succeeded in 1576 by Rudolf II, a strange man who preferred classical Greek philosophy to Christianity and lived an isolated existence in Bohemia. He became afraid to act when the Catholic Church was forcibly reasserting control in Austria and Hungary, and the Protestant princes became upset over this. Imperial power sharply deteriorated by the time of Rudolf's death in 1612. When Bohemians rebelled against the Emperor, the immediate result was the series of conflicts known as the Thirty Years' War (1618–48), which devastated the Empire. Foreign powers, including France and Sweden, intervened in the conflict and strengthened those fighting Imperial power, but also seized considerable territory for themselves. The long conflict so bled the Empire that it never recovered its strength.
The actual end of the empire came in several steps. The Peace of Westphalia in 1648, which ended the Thirty Years' War, gave the territories almost complete independence. Calvinism was now allowed, but Anabaptists, Arminians and other Protestant communities still lacked any recognition and continued to be persecuted until the end of the Empire. The Swiss Confederation, which had already established quasi-independence in 1499, as well as the Northern Netherlands, left the Empire. The Habsburg Emperors focused on consolidating their own estates in Austria and elsewhere.
At the Battle of Vienna (1683), the Army of the Holy Roman Empire, led by the Polish King John III Sobieski, decisively defeated a large Turkish army, stopping the western Ottoman advance and leading to the eventual dismemberment of the Ottoman Empire in Europe. The army was half forces of the Polish–Lithuanian Commonwealth, mostly cavalry, and half forces of the Holy Roman Empire (German/Austrian), mostly infantry.
By the rise of Louis XIV, the Habsburgs were chiefly dependent on their hereditary lands to counter the rise of Prussia, some of whose territories lay inside the Empire. Throughout the 18th century, the Habsburgs were embroiled in various European conflicts, such as the War of the Spanish Succession (1701–1714), the War of the Polish Succession (1733–1735), and the War of the Austrian Succession (1740–1748). The German dualism between Austria and Prussia dominated the empire's history after 1740.
From 1792 onwards, revolutionary France was at war with various parts of the Empire intermittently.
The German mediatization was the series of mediatizations and secularizations that occurred between 1795 and 1814, during the latter part of the era of the French Revolution and then the Napoleonic Era. "Mediatization" was the process of annexing the lands of one imperial estate to another, often leaving the annexed some rights. For example, the estates of the Imperial Knights were formally mediatized in 1806, having "de facto" been seized by the great territorial states in 1803 in the so-called "Rittersturm". "Secularization" was the abolition of the temporal power of an ecclesiastical ruler such as a bishop or an abbot and the annexation of the secularized territory to a secular territory.
The empire was dissolved on 6 August 1806, when the last Holy Roman Emperor Francis II (from 1804, Emperor Francis I of Austria) abdicated, following a military defeat by the French under Napoleon at Austerlitz (see Treaty of Pressburg). Napoleon reorganized much of the Empire into the Confederation of the Rhine, a French satellite. Francis' House of Habsburg-Lorraine survived the demise of the empire, continuing to reign as Emperors of Austria and Kings of Hungary until the Habsburg empire's final dissolution in 1918 in the aftermath of World War I.
The Napoleonic Confederation of the Rhine was replaced by a new union, the German Confederation, in 1815, following the end of the Napoleonic Wars. It lasted until 1866 when Prussia founded the North German Confederation, a forerunner of the German Empire which united the German-speaking territories outside of Austria and Switzerland under Prussian leadership in 1871. This state developed into modern Germany.
The only princely member state of the Holy Roman Empire that has preserved its status as a monarchy until today is the Principality of Liechtenstein. The only Free Imperial Cities still being states within Germany are Hamburg and Bremen. All other historic member states of the HRE were either dissolved or are republican successor states to their princely predecessor states.
The Holy Roman Empire was neither a centralized state nor a nation-state. Instead, it was divided into dozens – eventually hundreds – of individual entities governed by kings, dukes, counts, bishops, abbots, and other rulers, collectively known as princes. There were also some areas ruled directly by the Emperor. At no time could the Emperor simply issue decrees and govern autonomously over the Empire. His power was severely restricted by the various local leaders.
From the High Middle Ages onwards, the Holy Roman Empire was marked by an uneasy coexistence with the princes of the local territories who were struggling to take power away from it. To a greater extent than in other medieval kingdoms such as France and England, the emperors were unable to gain much control over the lands that they formally owned. Instead, to secure their own position from the threat of being deposed, emperors were forced to grant more and more autonomy to local rulers, both nobles and bishops. This process began in the 11th century with the Investiture Controversy and was more or less concluded with the 1648 Peace of Westphalia. Several Emperors attempted to reverse this steady dilution of their authority but were thwarted both by the papacy and by the princes of the Empire.
The number of territories represented in the Imperial Diet was considerable, numbering about 300 at the time of the Peace of Westphalia. Many of these "Kleinstaaten" ("little states") covered no more than a few square miles, and/or included several non-contiguous pieces, so the Empire was often called a "Flickenteppich" ("patchwork carpet").
An entity was considered a "Reichsstand" (imperial estate) if, according to feudal law, it had no authority above it except the Holy Roman Emperor himself. The imperial estates comprised secular territories ruled by princes, dukes, and counts, ecclesiastical territories ruled by bishops and abbots, and the Free Imperial Cities.
A sum total of 1,500 Imperial estates has been reckoned. For a list of "Reichsstände" in 1792, see List of Imperial Diet participants (1792).
A prospective Emperor had first to be elected King of the Romans (Latin: "Rex Romanorum"; German: "römischer König"). German kings had been elected since the 9th century; at that point they were chosen by the leaders of the five most important tribes (the Salian Franks of Lorraine, Ripuarian Franks of Franconia, Saxons, Bavarians, and Swabians). In the Holy Roman Empire, the main dukes and bishops of the kingdom elected the King of the Romans. In 1356, Emperor Charles IV issued the Golden Bull, which limited the electors to seven: the King of Bohemia, the Count Palatine of the Rhine, the Duke of Saxony, the Margrave of Brandenburg, and the archbishops of Cologne, Mainz, and Trier. During the Thirty Years' War, the Duke of Bavaria was given the right to vote as the eighth elector, and the Duke of Brunswick-Lüneburg (colloquially, Hanover) was granted a ninth electorate; additionally, the Napoleonic Wars resulted in several electorates being reallocated, but these new electors never voted before the Empire's dissolution. A candidate for election would be expected to offer concessions of land or money to the electors in order to secure their vote.
After being elected, the King of the Romans could theoretically claim the title of "Emperor" only after being crowned by the Pope. In many cases, this took several years while the King was held up by other tasks: frequently he first had to resolve conflicts in rebellious northern Italy or was quarreling with the Pope himself. Later Emperors dispensed with the papal coronation altogether, being content with the styling "Emperor-Elect": the last Emperor to be crowned by the Pope was Charles V in 1530.
The Emperor had to be male and of noble blood. No law required him to be a Catholic, but as the majority of the Electors adhered to this faith, no Protestant was ever elected. Whether and to what degree he had to be German was disputed among the Electors, contemporary experts in constitutional law, and the public. During the Middle Ages, some Kings and Emperors were not of German origin, but since the Renaissance, German heritage was regarded as vital for a candidate in order to be eligible for imperial office.
The Imperial Diet ("Reichstag", or "Reichsversammlung") was not a legislative body as the term is understood today; its members envisioned it more as a central forum, where negotiating was more important than deciding. The Diet was theoretically superior to the emperor himself. It was divided into three classes. The first class, the Council of Electors, consisted of the electors, the princes who could vote for the King of the Romans. The second class, the Council of Princes, consisted of the other princes. The Council of Princes was divided into two "benches", one for secular rulers and one for ecclesiastical ones. Higher-ranking princes had individual votes, while lower-ranking princes were grouped into "colleges" by geography. Each college had one vote.
The third class was the Council of Imperial Cities, which was divided into two colleges: Swabia and the Rhine. The Council of Imperial Cities was not fully equal with the others; it could not vote on several matters such as the admission of new territories. The representation of the Free Cities at the Diet had become common since the late Middle Ages. Nevertheless, their participation was formally acknowledged only as late as 1648 with the Peace of Westphalia ending the Thirty Years' War.
The Empire also had two courts: the "Reichshofrat" (also known in English as the Aulic Council) at the court of the King/Emperor, and the "Reichskammergericht" (Imperial Chamber Court), established with the Imperial Reform of 1495 by Maximilian I. The "Reichskammergericht" and the Aulic Council were the two highest judicial instances in the Old Empire. The Imperial Chamber Court's composition was determined by both the Holy Roman Emperor and the subject states of the Empire. Within this court, the Emperor appointed the chief justice, always a highborn aristocrat, several divisional chief judges, and some of the other puisne judges. The Aulic Council held standing over many judicial disputes of state, both concurrently with the Imperial Chamber Court and exclusively on its own. The jurisdiction of the Imperial Chamber Court extended to breaches of the public peace, cases of arbitrary distraint or imprisonment, pleas which concerned the treasury, violations of the Emperor's decrees or the laws passed by the Imperial Diet, disputes about property between immediate tenants of the Empire or the subjects of different rulers, and finally suits against immediate tenants of the Empire – with the exception of criminal charges and matters relating to imperial fiefs, which went to the Aulic Council.
As part of the Imperial Reform, six Imperial Circles were established in 1500; four more were established in 1512. These were regional groupings of most (though not all) of the various states of the Empire for the purposes of defense, imperial taxation, supervision of coining, peace-keeping functions, and public security. Each circle had its own parliament, known as a "Kreistag" ("Circle Diet"), and one or more directors, who coordinated the affairs of the circle. Not all imperial territories were included within the imperial circles, even after 1512; the Lands of the Bohemian Crown were excluded, as were Switzerland, the imperial fiefs in northern Italy, the lands of the Imperial Knights, and certain other small territories like the Lordship of Jever.
The Army of the Holy Roman Empire (German "Reichsarmee", "Reichsheer" or "Reichsarmatur"; Latin "exercitus imperii") was created in 1422 and came to an end even before the Empire as the result of the Napoleonic Wars. It must not be confused with the Imperial Army ("Kaiserliche Armee") of the Emperor.
Despite appearances to the contrary, the Army of the Empire did not constitute a permanent standing army that was always at the ready to fight for the Empire. When there was danger, an Army of the Empire was mustered from among the elements constituting it, in order to conduct an imperial military campaign or "Reichsheerfahrt". In practice, the imperial troops often had local allegiances stronger than their loyalty to the Emperor.
Throughout the first half of its history the Holy Roman Empire was governed by a travelling court. Kings and emperors toured between the numerous Kaiserpfalzes (Imperial palaces), usually residing in each for several weeks or months and attending to local legal matters, law, and administration. Most rulers maintained one or more favourite Imperial palace sites, where they advanced development and spent most of their time: Charlemagne (Aachen from 794), Frederick II (Palermo 1220–1254), the Wittelsbachs (Munich 1328–1347 and 1744–1745), and the Habsburgs (Prague 1355–1437 and 1576–1611; Vienna 1438–1576, 1611–1740 and 1745–1806). This practice eventually ended during the 14th century, as the emperors of the Habsburg dynasty chose Vienna and Prague, and the Wittelsbach rulers chose Munich, as their permanent residences. These sites served, however, only as the individual residence of a particular sovereign. A number of cities held official status, where the Imperial Estates would assemble at Imperial Diets, the deliberative assembly of the empire.
The Imperial Diet ("Reichstag") resided variously in Paderborn, Bad Lippspringe, Ingelheim am Rhein, Diedenhofen (now Thionville), Aachen, Worms, Forchheim, Trebur, Fritzlar, Ravenna, Quedlinburg, Dortmund, Verona, Minden, Mainz, Frankfurt am Main, Merseburg, Goslar, Würzburg, Bamberg, Schwäbisch Hall, Augsburg, Nuremberg, Quierzy-sur-Oise, Speyer, Gelnhausen, Erfurt, Eger (now Cheb), Esslingen, Lindau, Freiburg, Cologne, Konstanz and Trier before it was moved permanently to Regensburg.
Until the 15th century the elected emperor was crowned and anointed by the Pope in Rome, with some exceptions in Ravenna, Bologna and Reims. From 1508 (Emperor Maximilian I) Imperial elections took place in Frankfurt am Main, Augsburg, Rhens, Cologne or Regensburg.
In December 1497 the Aulic Council ("Reichshofrat") was established in Vienna.
In 1495 the "Reichskammergericht" was established, which variously resided in Worms, Augsburg, Nuremberg, Regensburg, Speyer and Esslingen before it was moved permanently to Wetzlar.
The Habsburg royal family had its own diplomats to represent its interests. The larger principalities in the HRE, beginning around 1648, also did the same. The HRE did not have its own dedicated ministry of foreign affairs and therefore the Imperial Diet had no control over these diplomats; occasionally the Diet criticised them.
When Regensburg served as the site of the Diet, France and, in the late 1700s, Russia, had diplomatic representatives there. Denmark, Great Britain, and Sweden had land holdings in Germany and so had representation in the Diet itself. The Netherlands also had envoys in Regensburg. Regensburg was the place where envoys met as it was where representatives of the Diet could be reached.
Overall population figures for the Holy Roman Empire are extremely vague and vary widely. Given the political fragmentation of the Empire, there were no central agencies that could compile such figures. According to an overgenerous contemporary estimate of the Austrian War Archives for the first decade of the 18th century, the Empire, including Bohemia and the Spanish Netherlands, had a population of close to 28 million with a breakdown as follows:
German demographic historians have traditionally worked on estimates of the population of the Holy Roman Empire based on assumed population within the frontiers of Germany in 1871 or 1914. More recent estimates use less outdated criteria, but they remain guesswork. One estimate based on the frontiers of Germany in 1870 gives a population of some 15–17 million around 1600, which declined to 10–13 million around 1650 (following the Thirty Years' War). Other historians who work on estimates of the population of the early modern Empire suggest the population declined from 20 million to some 16–17 million by 1650.
A credible estimate for 1800 gives 27 million inhabitants for the Empire, with an overall breakdown as follows:
Largest cities or towns of the Empire by year:
Roman Catholicism constituted the single official religion of the Empire until 1555. The Holy Roman Emperor was always a Roman Catholic.
Lutheranism was officially recognized in the Peace of Augsburg of 1555, and Calvinism in the Peace of Westphalia of 1648. Those two constituted the only officially recognized Protestant denominations, while various other Protestant confessions such as Anabaptism, Arminianism, etc. coexisted illegally within the Empire. Anabaptism came in a variety of denominations, including Mennonites, Schwarzenau Brethren, Hutterites, the Amish, and multiple other groups.
Following the Peace of Augsburg, the official religion of a territory was determined by the principle cujus regio, ejus religio according to which a ruler's religion determined that of his subjects. The Peace of Westphalia abrogated that principle by stipulating that the official religion of a territory was to be what it had been on 1 January 1624, considered to have been a "normal year". Henceforth, the conversion of a ruler to another faith did not entail the conversion of his subjects. In addition, all Protestant subjects of a Catholic ruler and vice versa were guaranteed the rights that they had enjoyed on that date. While the adherents of a territory's official religion enjoyed the right of public worship, the others were allowed the right of private worship (in chapels without either spires or bells). In theory, no one was to be discriminated against or excluded from commerce, trade, craft or public burial on grounds of religion. For the first time, the permanent nature of the division between the Christian Churches of the empire was more or less assumed.
In addition, a Jewish minority existed in the Holy Roman Empire.
Holiday
A holiday is a day set aside by custom or by law on which normal activities, especially business or work including school, are suspended or reduced. Generally, holidays are intended to allow individuals to celebrate or commemorate an event or tradition of cultural or religious significance. Holidays may be designated by governments, religious institutions, or other groups or organizations. The degree to which normal activities are reduced by a holiday may depend on local laws, customs, the type of job held or personal choices.
The concept of holidays often originated in connection with religious observances. The intention of a holiday was typically to allow individuals to tend to religious duties associated with important dates on the calendar. In most modern societies, however, holidays serve as much of a recreational function as any other weekend days or activities.
In many societies there are important distinctions between holidays designated by governments and holidays designated by religious institutions. For example, in many predominantly Christian nations, government-designated holidays may center on Christian holidays, though non-Christians may instead observe religious holidays associated with their faith. In some cases, a holiday may only be nominally observed. For example, many Jews in the Americas and Europe treat the relatively minor Jewish holiday of Hanukkah as a "working holiday", changing very little of their daily routines for this day.
The word "holiday" has differing connotations in different regions. In the United States the word is used exclusively to refer to nationally, religiously or culturally observed day(s) of rest or celebration, or the events themselves, whereas in the United Kingdom and other Commonwealth nations the word may also refer to a period of agreed leave from one's duties, synonymous with the US-preferred "vacation". This time is usually set aside for rest, travel or participation in recreational activities, with entire industries targeted to coincide with or enhance these experiences. The days of leave may not coincide with any specific customs or laws. Employers and educational institutions may designate "holidays" themselves, which may or may not overlap with nationally or culturally relevant dates; these also fall under the second connotation, but it is the first sense that this article is concerned with.
The word "holiday" comes from the Old English word "hāligdæg" ("hālig" "holy" + "dæg" "day"). The word originally referred only to special religious days. In modern use, it means any special day of rest or relaxation, as opposed to normal days away from work or school.
Winter in the Northern Hemisphere features many holidays that involve festivals and feasts. The Christmas and holiday season surrounds Christmas and other holidays, and is celebrated by many religions and cultures. Usually, this period begins near the start of November and ends with New Year's Day. "Holiday season" in the US corresponds to the period that begins with Thanksgiving and ends with New Year's Eve. Some Christian countries consider the end of the festive season to be after the feast of Epiphany.
Sovereign nations and territories observe holidays based on events of significance to their history. For example, Americans celebrate Independence Day, celebrating the signing of the Declaration of Independence in 1776.
Other secular (non-religious) holidays are observed nationally, internationally (often in conjunction with organizations such as the United Nations), and across multi-country regions. The United Nations Calendar of Observances dedicates decades to a specific topic, but also a complete year, month, week and days. Holidays dedicated to an observance such as the commemoration of the ending of World War II, or the Shoah, can also be part of the reparation obligation as per UN General Assembly Resolution 60/147 Basic Principles and Guidelines on the Right to a Remedy and Reparation for Victims of Gross Violations of International Human Rights Law and Serious Violations of International Humanitarian Law.
Another example of a major secular holiday is the Lunar New Year, which is celebrated across East Asia and South East Asia. Many other days are marked to celebrate events or people, but are not strictly holidays as time off work is rarely given; examples include Arbor Day (originally U.S.), Labor Day (celebrated sometimes under different names and on different days in different countries), and Earth Day (22 April).
These are holidays that are not traditionally marked on calendars. These holidays are celebrated by various groups and individuals. Some promote a cause, others recognize historical events not officially recognized, and others are "funny" holidays celebrated with humorous intent. For example, Monkey Day is celebrated on December 14, International Talk Like a Pirate Day is observed on September 19, and Blasphemy Day is held on September 30. Other examples are April Fools' Day on April 1 and World No Tobacco Day on May 31. Various community organizers and marketers promote odd social media holidays.
Many holidays are linked to faiths and religions (see etymology above). Christian holidays are defined as part of the liturgical year, the chief ones being Easter and Christmas. The Orthodox Christian and Western Roman Catholic patronal feast day or "name day" is celebrated on each place's patron saint's day, according to the Calendar of saints. Jehovah's Witnesses annually commemorate "The Memorial of Jesus Christ's Death", but do not celebrate other holidays with any religious significance such as Easter, Christmas or New Year's. This holds especially true for those holidays that have combined and absorbed rituals, overtones or practices from non-Christian beliefs into the celebration, as well as those holidays that distract from or replace the worship of Jehovah. In Islam, the largest holidays are Eid al-Fitr (immediately after Ramadan) and Eid al-Adha (at the end of the Hajj). Ahmadi Muslims additionally celebrate Promised Messiah Day, Promised Reformer Day, and Khilafat Day, though, contrary to popular belief, none of these are regarded as holidays. Hindus, Jains and Sikhs observe several holidays, one of the largest being Diwali (Festival of Light). Japanese holidays, as well as a few Catholic holidays, contain heavy references to several different faiths and beliefs. Celtic, Norse, and Neopagan holidays follow the order of the Wheel of the Year. For example, Christmas customs such as decorating trees and the colours green, red, and white closely resemble those of Yule, a lesser Sabbat of the Wheel of the Year in modern Wicca (a contemporary Pagan belief). Some are closely linked to Swedish festivities. The Baháʼí Faith observes 11 annual holidays on dates determined using the Baháʼí calendar. Jews have two holiday seasons: the Spring Feasts of Pesach (Passover) and Shavuot (Weeks, called Pentecost in Greek); and the Fall Feasts of Rosh Hashanah (Head of the Year), Yom Kippur (Day of Atonement), Sukkot (Tabernacles), and Shemini Atzeret (Eighth Day of Assembly).
If a holiday coincides with another holiday or a weekend day a substitute holiday may be recognised in lieu. In the United Kingdom the government website states that "If a bank holiday is on a weekend, a 'substitute' weekday becomes a bank holiday, normally the following Monday.", and the list of bank holidays for the year 2020 includes Monday 28 December as "Boxing Day (substitute day)", as 26 December is a Saturday. The process of moving a holiday from a weekend day to the following Monday is known as Mondayisation in New Zealand.
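The weekend-substitution rule described above can be sketched as a small function. This is only an illustration of the basic convention, not any government's official algorithm; real calendars also handle cases where consecutive holidays (such as Christmas Day and Boxing Day) both fall on a weekend and must be stacked onto successive weekdays.

```python
from datetime import date, timedelta

def substitute_holiday(holiday: date) -> date:
    """Return the observed date for a holiday: if it falls on a
    Saturday or Sunday, shift it forward to the next weekday
    (the UK 'substitute day' / NZ 'Mondayisation' convention)."""
    observed = holiday
    while observed.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        observed += timedelta(days=1)
    return observed

# Boxing Day 2020 (Saturday 26 December) is observed on Monday 28 December,
# matching the UK bank-holiday list cited above.
print(substitute_holiday(date(2020, 12, 26)))  # 2020-12-28
```

A holiday that already falls on a weekday is returned unchanged, so the function can be applied uniformly to a whole list of holidays.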
Hobby
A hobby is a regular activity done for enjoyment, typically during one's leisure time, not professionally and not for pay. Hobbies include collecting themed items and objects, engaging in creative and artistic pursuits, playing sports, or pursuing other amusements. Participation in hobbies encourages acquiring substantial skills and knowledge in that area. A list of hobbies changes with renewed interests and developing fashions, making it diverse and lengthy. Hobbies tend to follow trends in society: stamp collecting, for example, was popular during the nineteenth and twentieth centuries when postal systems were the main means of communication, while video games are more popular nowadays following technological advances. The advancing production and technology of the nineteenth century gave workers more leisure time in which to engage in hobbies. Because of this, the effort people invest in hobbies has increased with time.
Hobbyists may be identified under three sub-categories: "casual leisure" which is intrinsically rewarding, short-lived, pleasurable activity requiring little or no preparation, "serious leisure" which is the systematic pursuit of an amateur, hobbyist, or volunteer that is substantial, rewarding and results in a sense of accomplishment, and finally "project-based leisure" which is a short-term often a one-off project that is rewarding.
In the 16th century, the term "hobyn" had the meaning of "small horse or pony". The term "hobby horse" was documented in a 1557 payment confirmation for a "Hobbyhorse" from Reading, England. The item, originally called a "Tourney Horse", was made of a wooden or basketwork frame with an artificial tail and head. It was designed for a child to mimic riding a real horse. In the 17th century the term was used in a pejorative sense, suggesting that a hobby was a childish pursuit; however, in the 18th century, with a more industrial society and more leisure time, hobbies took on greater respectability. By 1816 the derivative, "hobby", had been introduced into the vocabulary of a number of English people, and over the course of subsequent centuries the term came to be associated with recreation and leisure. A hobby is also called a pastime, derived from the use of hobbies to pass the time. A hobby became an activity that is practised regularly and usually with some worthwhile purpose. Hobbies are usually, but not always, practised primarily for interest and enjoyment, rather than financial reward.
Hobbies were originally described as pursuits that others thought somewhat childish or trivial. However, as early as 1676 Sir Matthew Hale, in "Contemplations Moral and Divine", wrote "Almost every person hath some hobby horse or other wherein he prides himself." He was acknowledging that a "hobby horse" produces a legitimate sense of pride. By the mid 18th century there was a flourishing of hobbies as working people had more regular hours of work and greater leisure time. They spent more time to pursue interests that brought them satisfaction. However, there was concern that these working people might not use their leisure time in worthwhile pursuits. "The hope of weaning people away from bad habits by the provision of counter-attractions came to the fore in the 1830s, and has rarely waned since. Initially the bad habits were perceived to be of a sensual and physical nature, and the counter attractions, or perhaps more accurately alternatives, deliberately cultivated rationality and the intellect." The flourishing book and magazine trade of the day encouraged worthwhile hobbies and pursuits. The burgeoning manufacturing trade made materials used in hobbies cheap and was responsive to the changing interests of hobbyists.
The English have been identified as enthusiastic hobbyists, as George Orwell observed. "[A]nother English characteristic which is so much a part of us that we barely notice it … is the addiction to hobbies and spare-time occupations, the privateness of English life. We are a nation of flower-lovers, but also a nation of stamp-collectors, pigeon-fanciers, amateur carpenters, coupon-snippers, darts-players, crossword-puzzle fans. All the culture that is most truly native centres round things which even when they are communal are not official—the pub, the football match, the back garden, the fireside and the 'nice cup of tea'."
Deciding what to include in a list of hobbies provokes debate because it is difficult to decide which pleasurable pass-times can also be described as hobbies. During the 20th century the term hobby suggested activities, such as stamp collecting, embroidery, knitting, painting, woodwork, and photography. Typically the description did not include activities like listening to music, watching television, or reading. These latter activities bring pleasure, but lack the sense of achievement usually associated with a hobby. They are usually not structured, organised pursuits, as most hobbies are. The pleasure of a hobby is usually associated with making something of value or achieving something of value. "Such leisure is socially valorised precisely because it produces feelings of satisfaction with something that looks very much like work but that is done of its own sake." "Hobbies are a contradiction: they take work and turn it into leisure, and take leisure and turn it into work."
Hobbies change with time. In the 21st century, the video game industry is a very large hobby involving millions of kids and adults in various forms of 'play'. Stamp collecting declined along with the importance of the postal system. Woodwork and knitting declined as hobbies, because manufactured goods provide cheap alternatives for handmade goods. Through the internet, an online community has become a hobby for many people; sharing advice, information and support, and in some cases, allowing a traditional hobby, such as collecting, to flourish and support trading in a new environment.
Hobbyists are a part of a wider group of people engaged in leisure pursuits where the boundaries of each group overlap to some extent. The "Serious Leisure Perspective" groups hobbyists with amateurs and volunteers and identifies three broad groups of leisure activity with hobbies being found mainly in the Serious leisure category. "Casual leisure" is intrinsically rewarding, short-lived, pleasurable activity requiring little or no preparation. "Serious leisure" is the systematic pursuit of an amateur, hobbyist, or volunteer that is substantial, rewarding and results in a sense of accomplishment. Finally, "project-based leisure" is a short-term often a one-off project that is rewarding.
The terms amateur and hobbyist are often used interchangeably. Stebbins has a framework which distinguishes the terms in a useful categorisation of leisure in which "casual leisure" is separated from "serious Leisure". He describes serious leisure as undertaken by "amateurs", "hobbyists" and "volunteers". "Amateurs" engage in pursuits that have a professional counterpart, such as playing an instrument or astronomy. Hobbyists engage in five broad types of activity: "collecting", "making and tinkering" (like embroidery and car restoration), "activity participation" (like fishing and singing), "sports and games", and "liberal-arts" hobbies (like languages, cuisine, literature). Volunteers commit to organisations where they work as guides, counsellors, gardeners and so on. The separation of the amateur from the hobbyist is because the amateur has the ethos of the professional practitioner as a guide to practice. An amateur clarinetist is conscious of the role and procedures of a professional clarinetist.
A large proportion of hobbies are mainly solitary in nature. However, individual pursuit of a hobby often includes club memberships, organised sharing of products and regular communication between participants. For many hobbies there is an important role in being in touch with fellow hobbyists. Some hobbies are of communal nature, like choral singing and volunteering.
People who engage in hobbies have an interest in and time to pursue them. Children have been an important group of hobbyists because they are enthusiastic for collecting, making and exploring, in addition to this they have the leisure time that allows them to pursue those hobbies. The growth in hobbies occurred during industrialisation which gave workers set time for leisure. During the Depression there was an increase in the participation in hobbies because the unemployed had the time and a desire to be purposefully occupied. Hobbies are often pursued with an increased interest by retired people because they have the time and seek the intellectual and physical stimulation a hobby provides.
Hobbies are a diverse set of activities and it is difficult to categorize them in a logical manner. The following categorization of hobbies was developed by Stebbins.
Collecting includes seeking, locating, acquiring, organizing, cataloging, displaying and storing. Collecting is appealing to many people due to their interest in a particular subject and a desire to categorise and make order out of complexity. Some collectors are generalists, accumulating items from many countries of the world. Others focus on a subtopic within their area of interest, perhaps 19th-century postage stamps, milk-bottle labels from Sussex, Mongolian harnesses and tack, or firearms (both modern and vintage).
Collecting is an ancient hobby, with the list of coin collectors showing Caesar Augustus as one. Sometimes collectors have turned their hobby into a business, becoming commercial dealers that trade in the items being collected.
An alternative to collecting physical objects is collecting records of events of a particular kind. Examples include train spotting, bird-watching, aircraft spotting, railfans, and any other form of systematic recording a particular phenomenon. The recording form can be written, photographic, online, etc.
"Making" and "tinkering" includes working on self-motivated projects for fulfillment. These projects may be progressive, irregular tasks performed over a long period of time. Making and tinkering hobbies include higher-end projects, such as building or restoring a car or building a computer from individual parts, like CPUs and SSDs. For computer-savvy do-it-yourself hobbyists, CNC (Computer Numerical Control) machining may also be popular: a CNC machine can be assembled and programmed to make different parts from wood or metal.
Tinkering is 'dabbling' with the making process, often applied to the hobby of tinkering with car repairs, and various kinds of restoration: of furniture, antique cars, etc. It also applies to household tinkering: repairing a wall, laying a pathway, etc. Examples of Making and Tinkering hobbies include Scale modeling, model engineering, 3D printing, dressmaking, and cooking.
Scale modeling is making a replica of a real-life object in a smaller scale and dates back to prehistoric times with small clay "dolls" and other children's toys that have been found near known populated areas. The Persians, Greeks, and Romans took the form to a greater depth during their years of domination of the Western World, using scale replicas of enemy fortifications, coastal defense lines, and other geographic fixtures to plan battles.
At the turn of the Industrial Age and through the 1920s, some families could afford things such as electric trains, wind-up toys (typically boats or cars) and the increasingly valuable tin toy soldiers. Scale modeling as we know it today became popular shortly after World War II. Before 1946, children as well as adults were content in carving and shaping wooden replicas from block wood kits, often depicting enemy aircraft to help with identification in case of an invasion.
With the advent of modern plastics, the amount of skill required to get the basic shape accurately shown for any given subject was lessened, making it easier for people of all ages to begin assembling replicas in varying scales. Superheroes, aeroplanes, boats, cars, tanks, artillery, and even figures of soldiers became quite popular subjects to build, paint and display. Although almost any subject can be found in almost any scale, there are common scales for such miniatures which remain constant today.
Model engineering refers to building functioning machinery in metal, such as internal combustion motors and live steam models or locomotives. This is a demanding hobby that requires a multitude of large and expensive tools, such as lathes and mills. This hobby originated in the United Kingdom in the late 19th century, later spreading and flourishing in the mid-20th century. Due to the expense and space required, it is becoming rare.
3D Printing is a relatively new technology and already a major hobby as the cost of printers has fallen sharply. It is a good example of how hobbyists quickly engage with new technologies, communicate with one another and become producers related to their former hobby. 3D modeling is the process of making mathematical representations of three dimensional items and is an aspect of 3D printing.
Dressmaking was a major hobby until the late 20th century, pursued both to make cheap clothes and as a creative design and craft challenge. It has declined owing to the low cost of manufactured clothes.
Cooking is for some people an interest, a hobby, a challenge and a source of significant satisfaction. For many other people it is a job, a chore, a duty, like cleaning. In the early 21st century the importance of cooking as a hobby was demonstrated by the high popularity of competitive television cooking programs.
Activity participation includes partaking in "non-competitive, rule-based pursuits."
Outdoor pursuits are the group of activities which occur outdoors. These hobbies include gardening, hill walking, hiking, backpacking, cycling, canoeing, climbing, caving, fishing, hunting, target shooting (informal or formal), wildlife viewing (as birdwatching) and engaging in watersports and snowsports.
One large subset of outdoor pursuits is gardening. Residential gardening most often takes place in or about one's own residence, in a space referred to as the garden. Although a garden typically is located on the land near a residence, it may also be located on a roof, in an atrium, on a balcony, in a windowbox, or on a patio or vivarium.
Gardening also takes place in non-residential green areas, such as parks, public or semi-public gardens (botanical gardens or zoological gardens), amusement and theme parks, along transportation corridors, and around tourist attractions and hotels. In these situations, a staff of gardeners or groundskeepers maintains the gardens.
Indoor gardening is concerned with growing houseplants within a residence or building, in a conservatory, or in a greenhouse. Indoor gardens are sometimes incorporated into air conditioning or heating systems.
Water gardening is concerned with growing plants that have adapted to pools and ponds. Bog gardens are also considered a type of water garden. A simple water garden may consist solely of a tub containing the water and plant(s).
Container gardening is concerned with growing plants in containers that are placed above the ground.
Many hobbies involve performances by the hobbyist, such as singing, acting, juggling, magic, dancing, playing a musical instrument, martial arts, and other performing arts.
Some hobbies may result in an end product. Examples include woodworking, photography, moviemaking, jewelry making, software projects such as photoshopping and home music or video production, making bracelets, artistic projects such as drawing, painting and writing, cosplay (the design, creation, and wearing of a costume based on an existing creative property), and creating models out of card stock or paper, called papercraft. Many of these fall under the category of visual arts.
Reading, books, ebooks, magazines, comics, or newspapers, along with browsing the internet is a common hobby, and one that can trace its origins back hundreds of years. A love of literature, later in life, may be sparked by an interest in reading children's literature as a child. Many of these fall under the category literary arts.
Stebbins distinguishes an amateur sports person from a hobbyist by suggesting that a hobbyist plays in less formal sports, or games that are rule-bound but have no professional equivalent, while an amateur sports person plays a sport with a professional equivalent, such as football or tennis. Amateur sport may range from informal play to highly competitive practice, such as deck tennis or long-distance trekking.
The Department for Culture, Media and Sport in England suggests that playing sports benefits physical and mental health. A positive relationship has been found between engaging in sports and improved overall health.
During the 20th century there was extensive research into the important role that play has in human development. While most evident in childhood, play continues throughout life for many adults in the form of games, hobbies, and sport. Moreover, studies of ageing and society support the value of hobbies in healthy ageing.
There have been many instances where hobbyists and amateurs have achieved significant discoveries and developments. These are a small sample.
Holland
Holland is a region and former province on the western coast of the Netherlands. The name "Holland" is also frequently used informally to refer to the whole of the country of the Netherlands. This usage is commonly accepted in other countries, and sometimes employed by the Dutch themselves. However, some in the Netherlands, particularly those from regions outside Holland, may find it undesirable or misrepresentative to use the term for the whole country.
From the 10th to the 16th century, Holland proper was a unified political region within the Holy Roman Empire as a county ruled by the Counts of Holland. By the 17th century, the province of Holland had risen to become a maritime and economic power, dominating the other provinces of the newly independent Dutch Republic.
The area of the former County of Holland roughly coincides with the two current Dutch provinces of North Holland and South Holland into which it was divided, and which together include the Netherlands's three largest cities: the "de jure" capital city of Amsterdam; Rotterdam, home of Europe's largest port; and The Hague, the seat of government. Holland has a population of 6,583,534 as of November 2019.
The name "Holland" first appeared in sources for the region around Haarlem, and by 1064 was being used as the name of the entire county. By the early twelfth century, the inhabitants of Holland were called "Hollandi" in a Latin text. "Holland" is derived from the Old Dutch term "holtlant" ("wood-land"). This spelling variation remained in use until around the 14th century, at which time the name stabilised as "Holland" (alternative spellings at the time were "Hollant" and "Hollandt"). A popular but erroneous folk etymology holds that "Holland" is derived from "hol land" ("hollow land" in Dutch) purportedly inspired by the low-lying geography of the land.
"Holland" is informally used in English and other languages, including sometimes the Dutch language itself, to mean the whole of the modern country of the Netherlands. This example of "pars pro toto" or synecdoche is similar to the tendency to refer to the United Kingdom as "England", and developed due to Holland's becoming the dominant province and thus having the majority of political and economic interactions with other countries.
Between 1806 and 1810 "Holland" was the official name for the country as a whole, after Napoleon made his brother Louis Bonaparte the monarch of the Kingdom of Holland.
The people of Holland are referred to as "Hollanders" in both Dutch and English, though in English this is now unusual. Today this refers specifically to people from the current provinces of North Holland and South Holland. Strictly speaking, the term "Hollanders" does not refer to people from the other provinces in the Netherlands, but colloquially "Hollanders" is sometimes used in this wider sense.
In Dutch, "Hollands" is the adjectival form of "Holland". The word "Hollands" is also colloquially used by some Dutch people in the sense of "Nederlands" (Dutch), occasionally with the intention of contrasting with other types of Dutch people or language, for example Limburgish, the Belgian varieties of the Dutch language ("Flemish"), or even any southern variety of Dutch within the Netherlands itself.
In English, "Dutch" refers to the Netherlands as a whole, but there is no commonly used adjective for "Holland". The word "Hollandish" is no longer in common use. "Hollandic" is the name linguists give to the dialect spoken in Holland, and is occasionally also used by historians and when referring to pre-Napoleonic Holland.
Initially, Holland was a remote corner of the Holy Roman Empire. Gradually, its regional importance increased until it began to have a decisive, and ultimately dominant, influence on the history of the Netherlands.
Until the start of the 12th century, the inhabitants of the area that became Holland were known as Frisians. The area was initially part of Frisia. At the end of the 9th century, West Frisia became a separate county in the Holy Roman Empire. The first count known with certainty was Dirk I, who ruled from 896 to 931. He was succeeded by a long line of counts of the House of Holland (who were in fact known as counts of Frisia until 1101). When John I, Count of Holland, died childless in 1299, the county was inherited by John II of Avesnes, count of Hainaut. By the time of William V (House of Wittelsbach; 1354–1388), the count of Holland was also the count of Hainaut and Zealand.
After the St. Lucia's flood in 1287 the part of Frisia west of the later Zuiderzee, West Friesland, was conquered. As a result, most provincial institutions, including the States of Holland and West Frisia, would for more than five centuries refer to "Holland and West Frisia" as a unit. The Hook and Cod wars started around this time and ended when the countess of Holland, Jacoba or Jacqueline was forced to cede Holland to the Burgundian Philip III, known as Philip the Good, in 1432.
In 1432, Holland became part of the Burgundian Netherlands, and from 1477 part of the Habsburg Seventeen Provinces. In the 16th century the county became the most densely urbanised region in Europe, with the majority of the population living in cities. Within the Burgundian Netherlands, Holland was the dominant province in the north; the political influence of Holland largely determined the extent of Burgundian dominion in that area. The last count of Holland was Philip III, better known as King Philip II of Spain. He was deposed in 1581 by the Act of Abjuration, although the kings of Spain continued to carry the titular appellation of Count of Holland until the Peace of Münster, signed in 1648.
In the Dutch Revolt against the Habsburgs during the Eighty Years' War, the naval forces of the rebels, the Watergeuzen, established their first permanent base in 1572 in the town of Brill. In this way, Holland, now a sovereign state in a larger Dutch confederation, became the centre of the rebellion. It became the cultural, political and economic centre of the United Provinces, which in the 17th century, the Dutch Golden Age, was the wealthiest nation in the world. After the King of Spain was deposed as the count of Holland, executive and legislative power rested with the States of Holland, led by a political figure who held the office of Grand Pensionary.
The largest cities in the Dutch Republic were in the province of Holland, such as Amsterdam, Rotterdam, Leiden, Alkmaar, The Hague, Delft, Dordrecht and Haarlem. From the great ports of Holland, Hollandic merchants sailed to and from destinations all over Europe, and merchants from all over Europe gathered to trade in the warehouses of Amsterdam and other trading cities of Holland.
Many Europeans thought of the United Provinces first as "Holland" rather than as the "Republic of the Seven United Provinces of the Netherlands". A strong impression of "Holland" was planted in the minds of other Europeans, which was then projected back onto the Republic as a whole. Within the provinces themselves, a gradual process of cultural expansion took place, leading to a "Hollandification" of the other provinces and a more uniform culture for the whole of the Republic. The dialect of urban Holland became the standard language.
The formation of the Batavian Republic, inspired by the French revolution, led to a more centralised government. Holland became a province of a unitary state. Its independence was further reduced by an administrative reform in 1798, in which its territory was divided into several departments called "Amstel", "Delf", "Texel", and part of "Schelde en Maas".
From 1806 to 1810 Napoleon styled his vassal state, governed by his brother Louis Napoleon and briefly by Louis's son, Napoleon Louis Bonaparte, as the "Kingdom of Holland". This kingdom encompassed much of what would become the modern Netherlands. The name reflects how natural it had become at the time to equate Holland with the non-Belgian Netherlands as a whole.
During the period when the Low Countries were annexed by the French Empire and actually incorporated into France (from 1810 to 1813), Holland was divided into the départements Zuyderzée and Bouches-de-la-Meuse. From 1811 to 1813, Charles-François Lebrun, duc de Plaisance, served as governor-general. He was assisted by Antoine de Celles, Goswin de Stassart and François Jean-Baptiste d'Alphonse. In 1813, Dutch dignitaries proclaimed the Sovereign Principality of the United Netherlands.
In 1815, Holland was restored as a province of the United Kingdom of the Netherlands. Holland was divided into the present provinces North Holland and South Holland in 1840, after the Belgian Revolution of 1830. This reflected a historical division of Holland along the IJ into a Southern Quarter ("Zuiderkwartier") and a Northern Quarter ("Noorderkwartier"), but the present division is different from the old division. From 1850, a strong process of nation formation took place, the Netherlands being culturally unified and economically integrated by a modernisation process, with the cities of Holland as its centre.
Holland is located in the west of the Netherlands. A maritime region, Holland lies on the North Sea at the mouths of the Rhine and the Meuse (Maas). It contains numerous rivers and lakes, and has an extensive inland canal and waterway system. To the south is Zealand. The region is bordered on the east by the IJsselmeer and four Dutch provinces.
Holland is protected from the sea by a long line of coastal dunes. The highest point in Holland is in the Schoorl Dunes. Most of the land area behind the dunes consists of polder landscape lying well below sea level. At present the lowest point in Holland is a polder near Rotterdam, which lies below sea level. Continuous drainage is necessary to keep Holland from flooding. In earlier centuries windmills were used for this task. The landscape was (and in places still is) dotted with windmills, which have become a symbol of Holland.
Holland makes up roughly 13% of the area of the Netherlands, land and water included. Its combined population was 6.5 million in 2018.
The main cities in Holland are Amsterdam, Rotterdam and The Hague. Amsterdam is formally the capital of the Netherlands and its largest city. The Port of Rotterdam is Europe's largest and most important port. The Hague is the seat of government of the Netherlands. These cities, combined with Utrecht and other smaller municipalities, effectively form a single conurbation known as the Randstad.
The Randstad area is one of the most densely populated regions of Europe, but still relatively free of urban sprawl. There are strict zoning laws. Population pressures are enormous, property values are high, and new housing is constantly under development on the edges of the built-up areas. Surprisingly, much of the province still has a rural character. The remaining agricultural land and natural areas are highly valued and protected. Most of the arable land is used for intensive agriculture, including horticulture and greenhouse agri-businesses.
The land that is now Holland has not been "stable" since prehistoric times. The western coastline shifted eastwards, and storm surges regularly broke through the row of coastal dunes. The Frisian Isles, originally joined to the mainland, became detached islands in the north. The main rivers, the Rhine and the Meuse (Maas), flooded regularly and changed course repeatedly and dramatically.
The people of Holland found themselves living in an unstable, watery environment. Behind the dunes on the coast of the Netherlands a high peat plateau had grown, forming a natural protection against the sea. Much of the area was marsh and bog. By the tenth century the inhabitants set about cultivating this land by draining it. However, the drainage resulted in extreme soil shrinkage, drastically lowering the surface of the land.
To the south of Holland, in Zeeland, and to the north, in Frisia, this development led to catastrophic storm floods literally washing away entire regions, as the peat layer disintegrated or became detached and was carried away by the flood water. From the Frisian side the sea even flooded the area to the east, gradually hollowing Holland out from behind and forming the Zuiderzee (the present IJsselmeer). This inland sea threatened to link up with the "drowned lands" of Zealand in the south, reducing Holland to a series of narrow dune barrier islands in front of a lagoon. Only drastic administrative intervention saved the county from utter destruction. The counts and large monasteries took the lead in these efforts, building the first heavy emergency dikes to bolster critical points. Later special autonomous administrative bodies were formed, the "waterschappen" ("water control boards"), which had the legal power to enforce their regulations and decisions on water management. They eventually constructed an extensive dike system that covered the coastline and the polders, thus protecting the land from further incursions by the sea.
However, the Hollanders did not stop there. Starting around the 16th century, they took the offensive and began land reclamation projects, converting lakes, marshy areas and adjoining mudflats into polders. This continued well into the 20th century. As a result, historical maps of medieval and early modern Holland bear little resemblance to present maps.
This ongoing struggle to master the water played an important role in the development of Holland as a maritime and economic power, and has traditionally been seen as developing the presumed collective character of its inhabitants: stubborn, egalitarian and frugal.
The stereotypical image of Holland is an artificial amalgam of tulips, windmills, clogs, Edam cheese and the traditional dress ("klederdracht") of the village of Volendam, far from the reality of everyday Holland. These stereotypes were deliberately created in the late 19th century by official "Holland Promotion" to attract tourists.
The predominance of Holland in the Netherlands has resulted in regionalism on the part of the other provinces, a reaction to the perceived threat that Holland poses to their local culture and identity. The other provinces have a strong, and often negative, image of Holland and the Hollanders, to whom certain qualities are ascribed within a mental geography, a conceptual mapping of spaces and their inhabitants. On the other hand, some Hollanders take Holland's cultural dominance for granted and treat the concepts of "Holland" and "the Netherlands" as coinciding. Consequently, they see themselves not primarily as Hollanders, but simply as Dutch ("Nederlanders"). This phenomenon has been called "hollandocentrism".
The predominant language spoken in Holland is Dutch. Hollanders sometimes call the Dutch language "Hollands" instead of the standard term "Nederlands". Inhabitants of Belgium and other provinces of the Netherlands use "Hollands" to mean a Hollandic dialect or strong accent.
Standard Dutch was historically largely based on the dialect of the County of Holland, incorporating many traits derived from the dialects of the previously more powerful Duchy of Brabant and County of Flanders. Strong dialectal variation still exists throughout the Low Countries. Today, Holland proper is the region where the original dialects are least spoken, in many areas having been completely replaced by standard Dutch, and the Randstad has the largest influence on the development of the standard language, with the exception of the Dutch spoken in Belgium.
Despite this correspondence between standard Dutch and the Dutch spoken in the Randstad, there are local variations within Holland itself that differ from standard Dutch. The main cities each have their own modern urban dialect, which can be considered a sociolect. Some people, especially in the area north of Amsterdam, still speak the original dialect of the county, Hollandic. This dialect is present in the north, in Volendam and Marken and the surrounding area, West Friesland and the Zaanstreek, and in a southeastern fringe bordering the provinces of North Brabant and Utrecht. In the south, on the island of Goeree-Overflakkee, Zeelandic is spoken.
The province of Holland gave its name to a number of colonial settlements and discovered regions that were called "Nieuw Holland" or New Holland. The largest was the island continent presently known as Australia: the name New Holland was first applied to Australia in 1644 by the Dutch seafarer Abel Tasman as the Latin "Nova Hollandia", and remained in international use for 190 years. Tasman also named New Zealand after the Dutch province of Zealand. In the Netherlands "Nieuw Holland" would remain the usual name of the continent until the end of the 19th century; it is now no longer in use there, the Dutch name today being "Australië".
While "Holland" is no longer used in English as an official name for the country of the Netherlands, other languages still use it, or a variant of it, to officially refer to the Netherlands. This is still the case in the languages of Indonesia, for example.
History of the Netherlands
The History of the Netherlands is a history of seafaring people thriving on a lowland river delta on the North Sea in northwestern Europe. Records begin with the four centuries during which the region formed a militarized border zone of the Roman Empire. This came under increasing pressure from Germanic peoples moving westwards. As Roman power collapsed and the Middle Ages began, three dominant Germanic peoples coalesced in the area, Frisians in the north and coastal areas, Low Saxons in the northeast, and the Franks in the south.
During the Middle Ages, the descendants of the Carolingian dynasty came to dominate the area and then extended their rule to a large part of Western Europe. The region of the Netherlands therefore became part of Lower Lotharingia within the Frankish Holy Roman Empire. For several centuries, lordships such as Brabant, Holland, Zeeland, Friesland, Guelders and others held a changing patchwork of territories. There was no unified equivalent of the modern Netherlands.
By 1433, the Duke of Burgundy had assumed control over most of the lowlands territories in Lower Lotharingia; he created the Burgundian Netherlands which included modern Netherlands, Belgium, Luxembourg, and a part of France.
The Catholic kings of Spain took strong measures against Protestantism, which polarised the peoples of present-day Belgium and the Netherlands. The subsequent Dutch revolt led to splitting the Burgundian Netherlands into a Catholic French and Dutch-speaking "Spanish Netherlands" (approximately corresponding to modern Belgium and Luxembourg), and a northern "United Provinces", which spoke Dutch and were predominantly Protestant with a Catholic minority. It became the modern Netherlands.
In the Dutch Golden Age, which had its zenith around 1667, there was a flowering of trade, industry, the arts and the sciences. A rich worldwide Dutch empire developed and the Dutch East India Company became one of the earliest and most important of national mercantile companies based on entrepreneurship and trade.
During the eighteenth century, the power, wealth and influence of the Netherlands declined. A series of wars with its more powerful British and French neighbours weakened it. The British seized the North American colony of New Amsterdam and renamed it "New York". There was growing unrest and conflict between the Orangists and the Patriots. The French Revolution spilled over after 1789, and a pro-French Batavian Republic was established in 1795–1806. Napoleon made it a satellite state, the Kingdom of Holland (1806–1810), and later simply a French imperial province.
After the collapse of Napoleon in 1813–15, an expanded "United Kingdom of the Netherlands" was created with the House of Orange as monarchs, also ruling Belgium and Luxembourg. The King imposed unpopular Protestant reforms on Belgium, which revolted in 1830 and became independent in 1839. After an initially conservative period and the introduction of the 1848 constitution, the country became a parliamentary democracy with a constitutional monarch. Modern-day Luxembourg became officially independent from the Netherlands in 1839, but a personal union remained until 1890; since then it has been ruled by another branch of the House of Nassau.
The Netherlands was neutral during the First World War, but during the Second World War, it was invaded and occupied by Nazi Germany. The Nazis, including many collaborators, rounded up and killed almost all of the country's Jewish population. When the Dutch resistance increased, the Nazis cut off food supplies to much of the country, causing severe starvation in 1944–45. In 1942, the Dutch East Indies were conquered by Japan, but prior to this the Dutch destroyed the oil wells for which Japan was desperate. Indonesia proclaimed its independence from the Netherlands in 1945, followed by Suriname in 1975. The post-war years saw rapid economic recovery (helped by the American Marshall Plan), followed by the introduction of a welfare state during an era of peace and prosperity. The Netherlands formed a new economic alliance with Belgium and Luxembourg, the Benelux, and all three became founding members of the European Union and NATO. In recent decades, the Dutch economy has been closely linked to that of Germany and is highly prosperous. Germany and the Benelux countries adopted the euro on 1 January 2002, along with eight other EU member states.
The prehistory of the area that is now the Netherlands was largely shaped by its constantly shifting, low-lying geography.
The area that is now the Netherlands was inhabited by early humans at least 37,000 years ago, as attested by flint tools discovered in Woerden in 2010. In 2009 a fragment of a 40,000-year-old Neanderthal skull was found in sand dredged from the North Sea floor off the coast of Zeeland.
During the last ice age, the Netherlands had a tundra climate with scarce vegetation and the inhabitants survived as hunter-gatherers. After the end of the ice age, various Paleolithic groups inhabited the area. It is known that around 8000 BC a Mesolithic tribe resided near Burgumer Mar (Friesland). Another group residing elsewhere is known to have made canoes. The oldest recovered canoe in the world is the Pesse canoe. According to C14 dating analysis it was constructed somewhere between 8200 BC and 7600 BC. This canoe is exhibited in the Drents Museum in Assen.
Autochthonous hunter-gatherers from the Swifterbant culture are attested from around 5600 BC onwards. They are strongly linked to rivers and open water and were related to the southern Scandinavian Ertebølle culture (5300–4000 BC). To the west, the same tribes might have built hunting camps to hunt winter game, including seals.
Agriculture arrived in the Netherlands somewhere around 5000 BC with the Linear Pottery culture, who were probably central European farmers. Agriculture was practiced only on the loess plateau in the very south (southern Limburg), but even there it was not established permanently. Farms did not develop in the rest of the Netherlands.
There is also some evidence of small settlements in the rest of the country. These people made the switch to animal husbandry sometime between 4800 BC and 4500 BC. Dutch archaeologist Leendert Louwe Kooijmans wrote, "It is becoming increasingly clear that the agricultural transformation of prehistoric communities was a purely indigenous process that took place very gradually." This transformation took place as early as 4300 BC–4000 BC and featured the introduction of grains in small quantities into a traditional broad-spectrum economy.
The Funnelbeaker culture was a farming culture extending from Denmark through northern Germany into the northern Netherlands. In this period of Dutch prehistory, the first notable remains were erected: the dolmens, large stone grave monuments. They are found in Drenthe, and were probably built between 4100 BC and 3200 BC.
To the west, the Vlaardingen culture (around 2600 BC), an apparently more primitive culture of hunter-gatherers, survived well into the Neolithic period.
Around 2950 BC there was a transition from the Funnelbeaker farming culture to the Corded Ware pastoralist culture, a large archeological horizon appearing in western and central Europe that is associated with the advance of Indo-European languages. This transition was probably caused by developments in eastern Germany, and it occurred within two generations.
The Bell Beaker culture was also present in the Netherlands.
The Corded Ware and Bell Beaker cultures were not indigenous to the Netherlands but were pan-European in nature, extending across much of northern and central Europe.
The first evidence of the use of the wheel dates from this period, about 2400 BC. This culture also experimented with working with copper. Evidence of this, including stone anvils, copper knives, and a copper spearhead, was found on the Veluwe. Copper finds show that there was trade with other areas in Europe, as natural copper is not found in Dutch soil.
The Bronze Age probably started somewhere around 2000 BC and lasted until around 800 BC. The earliest bronze tools have been found in the grave of a Bronze Age individual called "the smith of Wageningen". More Bronze Age objects from later periods have been found in Epe, Drouwen and elsewhere. Broken bronze objects found in Voorschoten were apparently destined for recycling. This indicates how valuable bronze was considered in the Bronze Age. Typical bronze objects from this period included knives, swords, axes, fibulae and bracelets.
Most of the Bronze Age objects found in the Netherlands have been found in Drenthe. One item shows that trading networks during this period extended a far distance. Large bronze "situlae" (buckets) found in Drenthe were manufactured somewhere in eastern France or in Switzerland. They were used for mixing wine with water (a Roman/Greek custom). The many finds in Drenthe of rare and valuable objects, such as tin-bead necklaces, suggest that Drenthe was a trading centre in the Netherlands in the Bronze Age.
The Bell Beaker cultures (2700–2100 BC) locally developed into the Bronze Age Barbed-Wire Beaker culture (2100–1800 BC). In the second millennium BC, the region was the boundary between the Atlantic and Nordic horizons and was split into a northern and a southern region, roughly divided by the course of the Rhine.
In the north, the Elp culture (c. 1800 to 800 BC) was a Bronze Age archaeological culture whose marker is earthenware pottery of low quality known as "Kümmerkeramik" (or "Grobkeramik"). The initial phase was characterized by tumuli (1800–1200 BC) that were strongly tied to contemporary tumuli in northern Germany and Scandinavia, and were apparently related to the Tumulus culture (1600–1200 BC) in central Europe. This phase was followed by a subsequent change featuring Urnfield (cremation) burial customs (1200–800 BC). The southern region became dominated by the Hilversum culture (1800–800 BC), which apparently inherited the cultural ties with Britain of the previous Barbed-Wire Beaker culture.
The Iron Age brought a measure of prosperity to the people living in the area of the present-day Netherlands. Iron ore was available throughout the country, including bog iron extracted from the ore in peat bogs ("moeras ijzererts") in the north, the natural iron-bearing balls found in the Veluwe and the red iron ore near the rivers in Brabant. Smiths travelled from small settlement to settlement with bronze and iron, fabricating tools on demand, including axes, knives, pins, arrowheads and swords. Some evidence even suggests the making of Damascus steel swords using an advanced method of forging that combined the flexibility of iron with the strength of steel.
In Oss, a grave dating from around 500 BC was found in a burial mound 52 metres wide (and thus the largest of its kind in western Europe). Dubbed the "king's grave" ("Vorstengraf (Oss)"), it contained extraordinary objects, including an iron sword with an inlay of gold and coral.
In the centuries just before the arrival of the Romans, northern areas formerly occupied by the Elp culture emerged as the probably Germanic Harpstedt culture while the southern parts were influenced by the Hallstatt culture and assimilated into the Celtic La Tène culture. The contemporary southern and western migration of Germanic groups and the northern expansion of the Hallstatt culture drew these peoples into each other's sphere of influence. This is consistent with Caesar's account of the Rhine forming the boundary between Celtic and Germanic tribes.
The Germanic tribes originally inhabited southern Scandinavia, Schleswig-Holstein and Hamburg, but subsequent Iron Age cultures of the same region, like Wessenstedt (800–600 BC) and Jastorf, may also have belonged to this grouping.
A deterioration of the climate in Scandinavia, starting around 850–760 BC and accelerating around 650 BC, might have triggered migrations. Archaeological evidence suggests that by around 750 BC a relatively uniform Germanic people inhabited the area from the Netherlands to the Vistula and southern Scandinavia. In the west, the newcomers settled the coastal floodplains for the first time, since the population in the adjacent higher grounds had increased and the soil had become exhausted.
By the time this migration was complete, around 250 BC, a few general cultural and linguistic groupings had emerged.
One grouping – labelled the "North Sea Germanic" – inhabited the northern part of the Netherlands (north of the great rivers) and extending along the North Sea and into Jutland. This group is also sometimes referred to as the "Ingvaeones". Included in this group are the peoples who would later develop into, among others, the early Frisians and the early Saxons.
A second grouping, which scholars subsequently dubbed the "Weser-Rhine Germanic" (or "Rhine-Weser Germanic"), extended along the middle Rhine and Weser and inhabited the southern part of the Netherlands (south of the great rivers). This group, also sometimes referred to as the "Istvaeones", consisted of tribes that would eventually develop into the Salian Franks.
The Celtic culture had its origins in the central European Hallstatt culture (c. 800–450 BC), named for the rich grave finds in Hallstatt, Austria. By the later La Tène period (c. 450 BC up to the Roman conquest), this Celtic culture had, whether by diffusion or migration, expanded over a wide range, including into the southern area of the Netherlands. This would have been the northern reach of the Gauls.
In March 2005, 17 Celtic coins were found in Echt (Limburg). The silver coins, mixed with copper and gold, date from around 50 BC to 20 AD. In October 2008, a hoard of 39 gold coins and 70 silver Celtic coins was found in the Amby area of Maastricht. The gold coins were attributed to the Eburones people. Celtic objects have also been found in the area of Zutphen.
Although it is rare for hoards to be found, in past decades loose Celtic coins and other objects have been found throughout the central, eastern and southern part of the Netherlands. According to archaeologists these finds confirmed that at least the Maas river valley in the Netherlands was within the influence of the La Tène culture. Dutch archaeologists even speculate that Zutphen (which lies in the centre of the country) was a Celtic area before the Romans arrived, not a Germanic one at all.
Scholars debate the actual extent of the Celtic influence. The Celtic influence and contacts between Gaulish and early Germanic culture along the Rhine are assumed to be the source of a number of Celtic loanwords in Proto-Germanic. But according to the Belgian linguist Luc van Durme, toponymic evidence of a former Celtic presence in the Low Countries is almost completely absent. Although there were Celts in the Netherlands, Iron Age innovations did not involve substantial Celtic intrusion and featured a local development from Bronze Age culture.
Some scholars (De Laet, Gysseling, Hachmann, Kossack & Kuhn) have speculated that a separate ethnic identity, neither Germanic nor Celtic, survived in the Netherlands until the Roman period. They see the Netherlands as having been part of an Iron Age "Nordwestblock" stretching from the Somme to the Weser. Their view is that this culture, which had its own language, was being absorbed by the Celts to the south and the Germanic peoples from the east as late as the immediate pre-Roman period.
During the Gallic Wars, the Belgic area south of the Oude Rijn and west of the Rhine was conquered by Roman forces under Julius Caesar in a series of campaigns from 57 BC to 53 BC. The tribes located in the area of the Netherlands at this time did not leave behind written records, so all the information known about them during this pre-Roman period is based on what the Romans and Greeks wrote about them. One of the most important sources is Caesar's own "Commentarii de Bello Gallico". Two main tribes he described as living in what is now the Netherlands were the Menapii and the Eburones, both in the south, which is where Caesar was active. He established the principle that the Rhine defined a natural boundary between Gaul and Germania magna. But the Rhine was not a strong border, and he made it clear that there was a part of Belgic Gaul where many of the local tribes (including the Eburones) were "Germani cisrhenani", or in other cases, of mixed origin.
The Menapii stretched from the south of Zeeland, through North Brabant (and possibly South Holland), into the southeast of Gelderland. In later Roman times their territory seems to have been divided or reduced, so that it became mainly contained in what is now western Belgium.
The Eburones, the largest of the "Germani Cisrhenani" group, covered a large area including at least part of modern Dutch Limburg, stretching east to the Rhine in Germany, and also northwest to the delta, giving them a border with the Menapii. Their territory may have stretched into Gelderland.
In the delta itself, Caesar makes a passing comment about the "Insula Batavorum" ("Island of the Batavi") in the Rhine river, without discussing who lived there. Later, in imperial times, a tribe called the Batavi became very important in this region. Much later Tacitus wrote that they had originally been a tribe of the Chatti, a tribe in Germany never mentioned by Caesar. However, archaeologists find evidence of continuity, and suggest that the Chattic group may have been a small group, moving into a pre-existing (and possibly non-Germanic) people, who could even have been part of a known group such as the Eburones.
The approximately 450 years of Roman rule that followed would profoundly change the area that would become the Netherlands. Very often this involved large-scale conflict with the free Germanic tribes over the Rhine.
Other tribes who eventually inhabited the islands in the delta during Roman times are mentioned by Pliny the Elder: the Cananefates in South Holland; the Frisii, covering most of the modern Netherlands north of the Oude Rijn; the Frisiabones, who apparently stretched from the delta into the north of North Brabant; the Marsacii, who stretched from the Flemish coast into the delta; and the Sturii.
Caesar reported that he eliminated the name of the Eburones. In their place, the Texuandri inhabited most of North Brabant, and the modern province of Limburg, with the Maas running through it, appears to have been inhabited in imperial times by (from north to south) the Baetasii, the Catualini, the Sunuci and the Tungri. (Tacitus reported that "Tungri" was a new name for the earlier "Germani cisrhenani".)
North of the Old Rhine, apart from the Frisii, Pliny reports some Chauci reached into the delta, and two other tribes known from the eastern Netherlands were the Tuihanti (or Tubantes) from Twenthe in Overijssel, and the Chamavi, from Hamaland in northern Gelderland, who became one of the first tribes to be named as Frankish (see below). The Salians, also Franks, probably originated in Salland in Overijssel, before they moved into the empire, forced by Saxons in the 4th century, first into Batavia, and then into Toxandria.
Starting about 15 BC, the Rhine in the Netherlands came to be defended by the Lower Limes Germanicus. After a series of military actions, the Rhine became fixed around 12 AD as Rome's northern frontier on the European mainland. A number of towns and developments would arise along this line. The area to the south would be integrated into the Roman Empire. At first part of Gallia Belgica, this area became part of the province of Germania Inferior. The tribes already within, or relocated to, this area became part of the Roman Empire. The area to the north of the Rhine, inhabited by the Frisii and the Chauci, remained outside Roman rule, though not beyond Roman presence and influence.
Romans built military forts along the Limes Germanicus and a number of towns and smaller settlements in the Netherlands. The more notable Roman towns were at Nijmegen and at Voorburg (Forum Hadriani).
Perhaps the most evocative Roman ruin is the mysterious Brittenburg, which emerged from the sand at the beach in Katwijk several centuries ago, only to be buried again.
Other Roman settlements, fortifications, temples and other structures have been found at Alphen aan de Rijn (Albaniana); Bodegraven; Cuijk; Elst, Overbetuwe; Ermelo; Esch; Heerlen; Houten; Kessel, North Brabant; Oss, i.e. De Lithse Ham near Maren-Kessel; Kesteren in Neder-Betuwe; Leiden (Matilo); Maastricht; Meinerswijk (now part of Arnhem); Tiel; Utrecht (Traiectum); Valkenburg (South Holland) (Praetorium Agrippinae); Vechten (Fectio), now part of Bunnik; Velsen; Vleuten; Wijk bij Duurstede; Woerden; and Zwammerdam.
The Batavians, Cananefates, and the other border tribes were held in high regard as soldiers throughout the empire, and traditionally served in the Roman cavalry. The frontier culture was influenced by the Romans, Germanic people, and Gauls. In the first centuries after Rome's conquest of Gaul, trade flourished, and Roman, Gaulish and Germanic material culture is found combined in the region.
However, the Batavians rose against the Romans in the Batavian rebellion of 69 AD. The leader of this revolt was the Batavian Gaius Julius Civilis. One of the causes of the rebellion was that the Romans had taken young Batavians as slaves. A number of Roman "castella" were attacked and burnt. Roman soldiers stationed in Xanten and elsewhere, along with auxiliary troops of Batavians and Cananefates serving in the legions of Vitellius, joined the revolt, thus splitting the northern part of the Roman army. In April 70 AD, a few legions sent by Vespasianus and commanded by Quintus Petillius Cerialis eventually defeated the Batavians and negotiated a surrender with Gaius Julius Civilis somewhere between the Waal and the Maas near Noviomagus (Nijmegen), which was probably called "Batavodurum" by the Batavians. The Batavians later merged with other tribes and became part of the Salian Franks.
Dutch writers in the 17th and 18th centuries saw the rebellion of the independent and freedom-loving Batavians as mirroring the Dutch revolt against Spain and other forms of tyranny. According to this nationalist view, the Batavians were the "true" forefathers of the Dutch, which explains the recurring use of the name over the centuries. Jakarta was named "Batavia" by the Dutch in 1619. The Dutch republic created in 1795 on the basis of French revolutionary principles was called the Batavian Republic. Even today "Batavian" is a term sometimes used to describe the Dutch people; this is similar to use of "Gallic" to describe the French and "Teutonic" to describe the Germans.
Modern scholars of the Migration Period are in agreement that the Frankish identity emerged in the first half of the 3rd century out of various earlier, smaller Germanic groups, including the Salii, Sicambri, Chamavi, Bructeri, Chatti, Chattuarii, Ampsivarii, Tencteri, Ubii, Batavi and the Tungri, who inhabited the lower and middle Rhine valley between the Zuyder Zee and the river Lahn and extended eastwards as far as the Weser, but were most densely settled around the IJssel and between the Lippe and the Sieg. The Frankish confederation probably began to coalesce in the 210s.
The Franks eventually were divided into two groups: the Ripuarian Franks (Latin: Ripuari), who were the Franks that lived along the middle-Rhine River during the Roman Era, and the Salian Franks, who were the Franks that originated in the area of the Netherlands.
Franks appear in Roman texts as both allies and enemies ("laeti" and "dediticii"). By about 320, the Franks had the region of the Scheldt river (present day west Flanders and southwest Netherlands) under control, and were raiding the Channel, disrupting transportation to Britain. Roman forces pacified the region, but did not expel the Franks, who continued to be feared as pirates along the shores at least until the time of Julian the Apostate (358), when Salian Franks were allowed to settle as "foederati" in Toxandria, according to Ammianus Marcellinus.
Three factors contributed to the probable disappearance of the Frisii from the northern Netherlands. First, according to the "Panegyrici Latini" (Manuscript VIII), the ancient Frisii were forced to resettle within Roman territory as "laeti" (i.e., Roman-era serfs) in c. 296. This is the last reference to the ancient Frisii in the historical record. What happened to them, however, is suggested in the archaeological record. The discovery of a type of earthenware unique to 4th-century Frisia, called "terp Tritzum", shows that an unknown number of them were resettled in Flanders and Kent, likely as "laeti" under Roman coercion.
Second, the low-lying coastal regions of northwestern Europe began to subside c. 250, a process that continued gradually over the next 200 years. Tectonic subsidence, a rising water table and storm surges combined to flood some areas with marine transgressions. This was accelerated by a shift to a cooler, wetter climate in the region. Any Frisii left in the lower areas of Frisia would have drowned.
Third, after the collapse of the Roman Empire, there was a decline in population as Roman activity stopped and Roman institutions withdrew. As a result of these three factors, it has been postulated that the Frisii and Frisiaevones disappeared from the area, leaving the coastal lands largely unpopulated for the next two centuries. However, recent excavations in the coastal dunes of Kennemerland show clear indications of permanent habitation.
As climatic conditions improved, there was another mass migration of Germanic peoples into the area from the east. This is known as the "Migration Period" ("Volksverhuizingen"). The northern Netherlands received an influx of new migrants and settlers, mostly Saxons, but also Angles and Jutes. Many of these migrants did not stay in the northern Netherlands but moved on to England and are known today as the Anglo-Saxons. The newcomers who stayed in the northern Netherlands would eventually be referred to as "Frisians", although they were not descended from the ancient Frisii. These new Frisians settled in the northern Netherlands and would become the ancestors of the modern Frisians. (Because the early Frisians and Anglo-Saxons were formed from largely identical tribal confederacies, their respective languages were very similar. Old Frisian is the most closely related language to Old English and the modern Frisian dialects are in turn the closest related languages to contemporary English.) By the end of the 6th century, the Frisian territory in the northern Netherlands had expanded west to the North Sea coast and, by the 7th century, south to Dorestad. During this period most of the northern Netherlands was known as Frisia. This extended Frisian territory is sometimes referred to as "Frisia Magna" (or Greater Frisia).
In the 7th and 8th centuries, the Frankish chronologies mention this area as the kingdom of the Frisians. This kingdom comprised the coastal provinces of the Netherlands and the German North Sea coast. During this time, the Frisian language was spoken along the entire southern North Sea coast. The 7th-century Frisian Kingdom (650–734) under King Aldegisel and King Redbad, had its centre of power in Utrecht.
Dorestad was the largest settlement (emporium) in northwestern Europe. It had grown around a former Roman fortress. It was a large, flourishing trading place, three kilometers long and situated where the rivers Rhine and Lek diverge southeast of Utrecht near the modern town of Wijk bij Duurstede. Although inland, it was a North Sea trading centre that primarily handled goods from the Middle Rhineland. Wine was among the major products traded at Dorestad, likely from vineyards south of Mainz. It was also widely known because of its mint. Between 600 and around 719, Dorestad was often fought over between the Frisians and the Franks.
After Roman government in the area collapsed, the Franks expanded their territories until there were numerous small Frankish kingdoms, especially at Cologne, Tournai, Le Mans and Cambrai. The kings of Tournai eventually came to subdue the other Frankish kings. By the 490s, Clovis I had conquered and united all the Frankish territories to the west of the Meuse, including those in the southern Netherlands. He continued his conquests into Gaul.
After the death of Clovis I in 511, his four sons partitioned his kingdom amongst themselves, with Theuderic I receiving the lands that were to become Austrasia (including the southern Netherlands). A line of kings descended from Theuderic ruled Austrasia until 555, when it was united with the other Frankish kingdoms of Chlothar I, who inherited all the Frankish realms by 558. He redivided the Frankish territory amongst his four sons, but the four kingdoms coalesced into three on the death of Charibert I in 567. Austrasia (including the southern Netherlands) was given to Sigebert I. The southern Netherlands remained the northern part of Austrasia until the rise of the Carolingians.
The Franks who expanded south into Gaul settled there and eventually adopted the Vulgar Latin of the local population. However, a Germanic language was spoken as a second tongue by public officials in western Austrasia and Neustria as late as the 850s. It completely disappeared as a spoken language from these regions during the 10th century. During this expansion to the south, many Frankish people remained in the north (i.e. southern Netherlands, Flanders and a small part of northern France). A widening cultural divide grew between the Franks remaining in the north and the rulers far to the south in what is now France. Salian Franks continued to reside in their original homeland and the area directly to the south and to speak their original language, Old Frankish, which by the 9th century had evolved into Old Dutch. A Dutch-French language boundary came into existence (but this was originally south of where it is today). In the Maas and Rhine areas of the Netherlands, the Franks had political and trading centres, especially at Nijmegen and Maastricht. These Franks remained in contact with the Frisians to the north, especially in places like Dorestad and Utrecht.
In the late 19th century, Dutch historians believed that the Franks, Frisians, and Saxons were the original ancestors of the Dutch people. Some went further by ascribing certain attributes, values and strengths to these various groups and proposing that they reflected 19th-century nationalist and religious views. In particular, it was believed that this theory explained why Belgium and the southern Netherlands (i.e. the Franks) had become Catholic and the northern Netherlands (Frisians and Saxons) had become Protestant. The success of this theory was partly due to anthropological theories based on a tribal paradigm. Being politically and geographically inclusive, and yet accounting for diversity, this theory was in accordance with the need for nation-building and integration during the 1890–1914 period. The theory was taught in Dutch schools.
However, the disadvantages of this historical interpretation became apparent. This tribal-based theory suggested that external borders were weak or non-existent and that there were clear-cut internal borders. This origins myth provided an historical premise, especially during the Second World War, for regional separatism and annexation to Germany. After 1945 the tribal paradigm lost its appeal for anthropological scholars and historians. When the accuracy of the three-tribe theme was fundamentally questioned, the theory fell out of favour.
Due to the scarcity of written sources, knowledge of this period depends to a large degree on the interpretation of archaeological data. The traditional view of a clear-cut division between Frisians in the north and coast, Franks in the south and Saxons in the east has proven historically problematic. Archeological evidence suggests dramatically different models for different regions, with demographic continuity for some parts of the country and depopulation and possible replacement in other parts, notably the coastal areas of Frisia and Holland.
The language from which Old Dutch (also sometimes called Old West Low Franconian, Old Low Franconian or Old Frankish) arose is not known with certainty, but it is thought to be the language spoken by the Salian Franks. Even though the Franks are traditionally categorized as Weser-Rhine Germanic, Dutch has a number of Ingvaeonic characteristics and is classified by modern linguists as an Ingvaeonic language. Dutch also has a number of Old Saxon characteristics. There was a close relationship between Old Dutch, Old Saxon, Old English and Old Frisian. Because texts written in the language spoken by the Franks are almost non-existent, and Old Dutch texts scarce and fragmentary, not much is known about the development of Old Dutch. Old Dutch made the transition to Middle Dutch around 1150.
The Christianity that arrived in the Netherlands with the Romans appears not to have died out completely (in Maastricht, at least) after the withdrawal of the Romans in about 411.
The Franks became Christians after their king Clovis I converted to Catholicism, an event which is traditionally set in 496. Christianity was introduced in the north after the conquest of Friesland by the Franks. The Saxons in the east were converted before the conquest of Saxony, and became Frankish allies.
Hiberno-Scottish and Anglo-Saxon missionaries, particularly Willibrord, Wulfram and Boniface, played an important role in converting the Frankish and Frisian peoples to Christianity by the 8th century. Boniface was martyred by the Frisians in Dokkum (754).
In the early 8th century the Frisians came increasingly into conflict with the Franks to the south, resulting in a series of wars in which the Frankish Empire eventually subjugated Frisia. In 734, at the Battle of the Boarn, the Frisians in the Netherlands were defeated by the Franks, who thereby conquered the area west of the Lauwers. The Franks then conquered the area east of the Lauwers in 785 when Charlemagne defeated Widukind.
The linguistic descendants of the Franks, the modern Dutch-speakers of the Netherlands and Flanders, seem to have broken with the endonym "Frank" around the 9th century. By this time Frankish identity had changed from an ethnic identity to a national identity, becoming localized and confined to the modern "Franconia" and principally to the French province of "Île-de-France".
Although the people no longer referred to themselves as "Franks", the Netherlands was still part of the Frankish empire of Charlemagne. Indeed, because of the Austrasian origins of the Carolingians in the area between the Rhine and the Maas, the cities of Aachen, Maastricht, Liège and Nijmegen were at the heart of Carolingian culture. Charlemagne maintained his "palatium" in Nijmegen at least four times.
The Carolingian empire would eventually include France, Germany, northern Italy and much of Western Europe. In 843, the Frankish empire was divided into three parts, giving rise to West Francia in the west, East Francia in the east, and Middle Francia in the centre. Most of what is today the Netherlands became part of Middle Francia; Flanders became part of West Francia. This division was an important factor in the historical distinction between Flanders and the other Dutch-speaking areas.
Middle Francia was an ephemeral Frankish kingdom that had no historical or ethnic identity to bind its varied peoples. It was created by the Treaty of Verdun in 843, which divided the Carolingian Empire among the sons of Louis the Pious. Situated between the realms of East and West Francia, Middle Francia comprised the Frankish territory between the rivers Rhine and Scheldt, the Frisian coast of the North Sea, the former Kingdom of Burgundy (except for a western portion, later known as "Bourgogne"), Provence and the Kingdom of Italy.
Middle Francia fell to Lothair I, the eldest son and successor of Louis the Pious, after an intermittent civil war with his younger brothers Louis the German and Charles the Bald. In acknowledgement of Lothair's Imperial title, Middle Francia contained the imperial cities of Aachen, the residence of Charlemagne, as well as Rome. In 855, on his deathbed at Prüm Abbey, Emperor Lothair I again partitioned his realm amongst his sons. Most of the lands north of the Alps, including the Netherlands, passed to Lothair II and came to be known as Lotharingia. After Lothair II died in 869, Lotharingia was partitioned by his uncles Louis the German and Charles the Bald in the Treaty of Meerssen in 870. Although some of the Netherlands had come under Viking control, in 870 it technically became part of East Francia, which became the Holy Roman Empire in 962.
In the 9th and 10th centuries, the Vikings raided the largely defenceless Frisian and Frankish towns lying on the coast and along the rivers of the Low Countries. Although Vikings never settled in large numbers in those areas, they did set up long-term bases and were even acknowledged as lords in a few cases. In Dutch and Frisian historical tradition, the trading centre of Dorestad declined after Viking raids from 834 to 863; however, since no convincing Viking archaeological evidence has been found at the site (as of 2007), doubts about this have grown in recent years.
One of the most important Viking families in the Low Countries was that of Rorik of Dorestad (based in Wieringen) and his brother the "younger Harald" (based in Walcheren), both thought to be nephews of Harald Klak. Around 850, Lothair I acknowledged Rorik as ruler of most of Friesland. And again in 870, Rorik was received by Charles the Bald in Nijmegen, to whom he became a vassal. Viking raids continued during that period. Harald's son Rodulf and his men were killed by the people of Oostergo in 873. Rorik died sometime before 882.
Buried Viking treasures consisting mainly of silver have been found in the Low Countries. Two such treasures have been found in Wieringen. A large treasure found in Wieringen in 1996 dates from around 850 and is thought perhaps to have been connected to Rorik. The burial of such a valuable treasure is seen as an indication that there was a permanent settlement in Wieringen.
Around 879, Godfrid arrived in Frisian lands as the head of a large force that terrorised the Low Countries. Using Ghent as his base, his forces ravaged Ghent, Maastricht, Liège, Stavelot, Prüm, Cologne, and Koblenz. Controlling most of Frisia between 882 and his death in 885, Godfrid became known to history as Godfrid, Duke of Frisia. His lordship over Frisia was acknowledged by Charles the Fat, to whom he became a vassal. Godfrid was assassinated in 885, after which Gerolf of Holland assumed lordship and Viking rule of Frisia came to an end.
Viking raids of the Low Countries continued for over a century. Remains of Viking attacks dating from 880 to 890 have been found in Zutphen and Deventer. In 920, King Henry of Germany liberated Utrecht. According to a number of chronicles, the last attacks took place in the first decade of the 11th century and were directed at Tiel and/or Utrecht.
These Viking raids occurred about the same time that French and German lords were fighting for supremacy over the middle empire that included the Netherlands, so their sway over this area was weak. Resistance to the Vikings, if any, came from local nobles, who gained in stature as a result.
The German kings and emperors ruled the Netherlands in the 10th and 11th centuries. Germany was called the Holy Roman Empire after the coronation of King Otto the Great as emperor. The Dutch city of Nijmegen was the site of an important domain of the German emperors. Several German emperors and empresses were born or died there; the Byzantine-born empress Theophanu, for example, died in Nijmegen. Utrecht was also an important city and trading port at the time.
The Holy Roman Empire was not able to maintain political unity. In addition to the growing independence of the towns, local rulers turned their counties and duchies into private kingdoms and felt little sense of obligation to the emperor who reigned over large parts of the nation in name only. Large parts of what now comprise the Netherlands were governed by the Count of Holland, the Duke of Gelre, the Duke of Brabant and the Bishop of Utrecht. Friesland and Groningen in the north maintained their independence and were governed by the lower nobility.
The various feudal states were in a state of almost continual war. Gelre and Holland fought for control of Utrecht. Utrecht, whose bishop had in 1000 ruled over half of what is today the Netherlands, was marginalised as it experienced continuing difficulty in electing new bishops. At the same time, the dynasties of neighbouring states were more stable. Groningen, Drenthe and most of Gelre, which used to be part of Utrecht, became independent. Brabant tried to conquer its neighbours, but was not successful. Holland also tried to assert itself in Zeeland and Friesland, but its attempts failed.
The language and culture of most of the people who lived in the area that is now Holland were originally Frisian. The sparsely populated area was known as "West Friesland" ("Westfriesland"). As Frankish settlement progressed, the Frisians migrated away or were absorbed and the area quickly became Dutch. (The part of North Holland situated north of Alkmaar is still colloquially known as West Friesland).
The rest of Friesland in the north continued to maintain its independence during this time. It had its own institutions (collectively called the "Frisian freedom") and resented the imposition of the feudal system and the patriciate found in other European towns. They regarded themselves as allies of Switzerland. The Frisian battle cry was "better dead than a slave". They later lost their independence when they were defeated in 1498 by the German Landsknecht mercenaries of Duke Albrecht of Saxony-Meissen.
The center of power in these emerging independent territories was in the County of Holland. Originally granted as a fief to the Danish chieftain Rorik in return for loyalty to the emperor in 862, the region of Kennemara (the region around modern Haarlem) rapidly grew under Rorik's descendants in size and importance. By the early 11th century, Dirk III, Count of Holland was levying tolls on the Meuse estuary and was able to resist military intervention from his overlord, the Duke of Lower Lorraine.
In 1083, the name "Holland" first appears in a deed referring to a region corresponding more or less to the current province of South Holland and the southern half of what is now North Holland. Holland's influence continued to grow over the next two centuries. The counts of Holland conquered most of Zeeland but it was not until 1289 that Count Floris V was able to subjugate the Frisians in West Friesland (that is, the northern half of North Holland).
Around 1000 AD there were several agricultural developments (described sometimes as an agricultural revolution) that resulted in an increase in production, especially food production. The economy started to develop at a fast pace, and the higher productivity allowed workers to farm more land or to become tradesmen.
Much of the western Netherlands was barely inhabited between the end of the Roman period until around 1100 AD, when farmers from Flanders and Utrecht began purchasing the swampy land, draining it and cultivating it. This process happened quickly and the uninhabited territory was settled in a few generations. They built independent farms that were not part of villages, something unique in Europe at the time.
Guilds were established and markets developed as production exceeded local needs. Also, the introduction of currency made trading a much easier affair than it had been before. Existing towns grew and new towns sprang into existence around monasteries and castles, and a mercantile middle class began to develop in these urban areas. Commerce and town development increased as the population grew.
The Crusades were popular in the Low Countries and drew many to fight in the Holy Land. At home, there was relative peace. Viking pillaging had stopped. Both the Crusades and the relative peace at home contributed to trade and the growth in commerce.
Cities arose and flourished, especially in Flanders and Brabant. As the cities grew in wealth and power, they started to buy certain privileges for themselves from the sovereign, including city rights, the right to self-government and the right to pass laws. In practice, this meant that the wealthiest cities became quasi-independent republics in their own right. Two of the most important cities were Brugge and Antwerp (in Flanders) which would later develop into some of the most important cities and ports in Europe.
The Hook and Cod Wars were a series of wars and battles in the County of Holland between 1350 and 1490. Most of these wars were fought over the title of count of Holland, but some have argued that the underlying cause was a power struggle between the bourgeois in the cities and the ruling nobility.
The Cod faction generally consisted of the more progressive cities of Holland. The Hook faction consisted for a large part of the conservative noblemen. Some of the main figures in this multi-generational conflict were William IV, Margaret, William V, William VI, Count of Holland and Hainaut, John and Philip the Good, Duke of Burgundy. But perhaps the best known is Jacqueline, Countess of Hainaut.
The conquest of the county of Holland by the Duke Philip the Good of Burgundy was an odd affair. Leading noblemen in Holland invited the duke to conquer Holland, even though he had no historical claim to it. Some historians say that the ruling class in Holland wanted Holland to integrate with the Flemish economic system and adopt Flemish legal institutions. Europe had been wracked by many civil wars in the 14th and 15th centuries, while Flanders had grown rich and enjoyed peace.
Most of what is now the Netherlands and Belgium was eventually united by the Duke of Burgundy in 1433. Before the Burgundian union, the Dutch identified themselves by the town they lived in, their local duchy or county or as subjects of the Holy Roman Empire. The Burgundian period is when the Dutch began the road to nationhood.
Holland's trade developed rapidly, especially in the areas of shipping and transport. The new rulers defended Dutch trading interests. The fleets of Holland defeated the fleets of the Hanseatic League several times. Amsterdam grew and in the 15th century became the primary trading port in Europe for grain from the Baltic region. Amsterdam distributed grain to the major cities of Belgium, Northern France and England. This trade was vital to the people of Holland, because Holland could no longer produce enough grain to feed itself. Land drainage had caused the peat of the former wetlands to subside to a level at which drainage could no longer be maintained.
Charles V (1500–58) was born and raised in the Flemish city of Ghent; he spoke French. Charles extended the Burgundian territory with the annexation of Tournai, Artois, Utrecht, Groningen and Guelders. The Seventeen Provinces had been unified by Charles's Burgundian ancestors, but nominally were fiefs of either France or the Holy Roman Empire. When he was a minor, his aunt Margaret acted as regent until 1515. France relinquished its ancient claim on Flanders in 1528.
From 1515 to 1523, Charles's government in the Netherlands had to contend with the rebellion of Frisian peasants (led by Pier Gerlofs Donia and Wijard Jelckama). Gelre attempted to build up its own state in northeast Netherlands and northwest Germany. Lacking funds in the 16th century, Gelre had its soldiers provide for themselves by pillaging enemy terrain. These soldiers were a great menace to the Burgundian Netherlands, as when they pillaged The Hague.
Over the years, the dukes of Burgundy, through astute marriages, purchases and wars, had taken control of the Seventeen Provinces that made up the Low Countries. These lands are now the Netherlands in the north, the Southern Netherlands (now Belgium) in the south, and Luxemburg in the southeast. Known as the "Burgundian Circle," these lands came under the control of the Habsburg family. Charles (1500–58) became the owner in 1506, but in 1515 he left to become king of Spain and later became the Holy Roman Emperor. Charles turned over control to regents (his close relatives), and in practice rule was exercised by Spaniards he controlled. The provinces each had their own governments and courts, controlled by the local nobility, and their own traditions and rights ("liberties") dating back centuries. Likewise the numerous cities had their own legal rights and local governments, usually controlled by the merchants. On top of this the Spanish had imposed an overall government, the Estates General of the Netherlands, with its own officials and courts. The Spanish officials sent by Charles ignored traditions and the Dutch nobility as well as local officials, inciting an anti-Spanish sense of nationalism and leading to the Dutch Revolt. With the emergence of the Protestant Reformation, Charles, now the Emperor, was determined to crush Protestantism and never compromise with it. Unrest began in the south, centered in the large rich metropolis of Antwerp. The Netherlands was an especially rich unit of the Spanish realm, especially after the Treaty of Cateau-Cambresis of 1559; it ended four decades of warfare between France and Spain and allowed Spain to reposition its army.
In 1548, Charles granted the Netherlands status as an entity in which many of the laws of the Holy Roman Empire became obsolete. The "Transaction of Augsburg" created the Burgundian Circle of the Holy Roman Empire, which comprised the Netherlands and Franche-Comté. A year later the Pragmatic Sanction of 1549 stated that the Seventeen Provinces could only be passed on to his heirs as a composite entity.
During the 16th century, the Protestant Reformation rapidly gained ground in northern Europe, especially in its Lutheran and Calvinist forms. Dutch Protestants, after initial repression, were tolerated by local authorities. By the 1560s, the Protestant community had become a significant influence in the Netherlands, although it was still clearly a minority. In a society dependent on trade, freedom and tolerance were considered essential. Nevertheless, the Catholic rulers Charles V, and later Philip II, made it their mission to defeat Protestantism, which was considered a heresy by the Catholic Church and a threat to the stability of the whole hierarchical political system. On the other hand, the intensely moralistic Dutch Protestants insisted that their Biblical theology, sincere piety and humble lifestyle were morally superior to the luxurious habits and superficial religiosity of the ecclesiastical nobility. The rulers' harsh punitive measures led to increasing grievances in the Netherlands, where the local governments had embarked on a course of peaceful coexistence. In the second half of the century, the situation escalated. Philip sent troops to crush the rebellion and make the Netherlands once more a Catholic region.
In the first wave of the Reformation, Lutheranism won over the elites in Antwerp and the South. The Spanish successfully suppressed it there, and Lutheranism only flourished in east Friesland.
The second wave of the Reformation came in the form of Anabaptism, which was popular among ordinary farmers in Holland and Friesland. Anabaptists were socially very radical and egalitarian; they believed that the apocalypse was very near. They refused to live in the old way and began new communities, creating considerable chaos. A prominent Dutch Anabaptist was Menno Simons, who initiated the Mennonite church. The movement was allowed in the north but never grew to a large scale.
The third wave of the Reformation, the one that ultimately proved permanent, was Calvinism. It arrived in the Netherlands in the 1540s, attracting both the elite and the common population, especially in Flanders. The Catholic Spanish responded with harsh persecution and introduced the Inquisition of the Netherlands. Calvinists rebelled. First came the iconoclasm of 1566, the systematic destruction of statues of saints and other Catholic devotional depictions in churches. In 1566, William the Silent, a Calvinist, started the Eighty Years' War to liberate all Dutch, of whatever religion, from Catholic Spain. Blum says, "His patience, tolerance, determination, concern for his people, and belief in government by consent held the Dutch together and kept alive their spirit of revolt." The provinces of Holland and Zeeland, being mainly Calvinist by 1572, submitted to the rule of William. The other states remained almost entirely Catholic.
The Netherlands was a valuable part of the Spanish Empire, especially after the Treaty of Cateau-Cambresis of 1559. This treaty ended a forty-year period of warfare between France and Spain conducted in Italy from 1521 to 1559. The Treaty of Cateau-Cambresis was something of a watershed—not only for the battleground that Italy had been, but also for northern Europe. Spain had been keeping troops in the Netherlands to be ready to attack France from the north as well as from the south.
With the settlement of so many major issues between France and Spain by the Treaty of Cateau-Cambresis, there was no longer any reason to keep Spanish troops in the Netherlands. Thus, the people of the Netherlands could get on with their peacetime pursuits. As they did so they found that there was a great deal of demand for their products. Fishing had long been an important part of the economy of the Netherlands. However, the fishing of herring alone now came to occupy 2,000 boats operating out of Dutch ports. Spain, still the Dutch traders' best customer, was buying fifty large ships full of furniture and household utensils from Flanders merchants. Additionally, Dutch woolen goods were desired everywhere. The Netherlands bought and processed enough Spanish wool to sell four million florins of wool products through merchants in Bruges. So strong was the Dutch appetite for raw wool at this time that they bought nearly as much English wool as they did Spanish wool. Total commerce with England alone amounted to 24 million florins. Much of the export going to England resulted in pure profit to the Dutch because the exported items were of their own manufacture. The Netherlands was just starting to enter its "Golden Age." Brabant and Flanders were the richest and most flourishing parts of the Low Countries at the time. The Netherlands was one of the richest places in the world. The population reached 3 million in 1560, with 25 cities of 10,000 people or more, by far the largest urban presence in Europe; the trading and financial center of Antwerp (population 100,000) was especially important. Spain could not afford to lose this rich land, nor allow it to fall from Catholic control. Thus came 80 years of warfare.
A devout Catholic, Philip was appalled by the success of the Reformation in the Low Countries, which had led to an increasing number of Calvinists. His attempts to enforce religious persecution of the Protestants, and his centralization of government, law enforcement, and taxes, made him unpopular and led to a revolt. Fernando Alvarez de Toledo, Duke of Alba, was sent with a Spanish Army to punish the unruly Dutch in 1567.
The only opposition the Duke of Alba faced on his march across the Netherlands came from nobles such as Lamoral, Count of Egmont, and Philippe de Montmorency, Count of Horn. With the approach of Alba and the Spanish army, William the Silent of Orange fled to Germany with his three brothers and his whole family on 11 April 1567. The Duke of Alba sought to meet and negotiate with the nobles who now faced him with armies. However, when the nobles arrived in Brussels they were all arrested, and Egmont and Horn were executed. Alba then revoked all the prior treaties that Margaret, the Duchess of Parma, had signed with the Protestants of the Netherlands and instituted the Inquisition to enforce the decrees of the Council of Trent.
The Dutch War for Independence from Spain is frequently called the Eighty Years' War (1568–1648). The first fifty years (1568 through 1618) were solely a war between Spain and the Netherlands. During the last thirty years (1618–1648) the conflict between Spain and the Netherlands was submerged in the general European war that became known as the Thirty Years' War. The seven rebellious provinces of the Netherlands were eventually united by the Union of Utrecht in 1579 and formed the Republic of the Seven United Netherlands (also known as the "United Provinces"). The Act of Abjuration, or "Plakkaat van Verlatinghe," was signed on 26 July 1581 and was the formal declaration of independence of the northern Low Countries from the Spanish king.
William of Orange (Slot Dillenburg, 24 April 1533 – Delft, 10 July 1584), the founder of the Dutch royal family, led the Dutch during the first part of the war, following the death of Egmont and Horn in 1568. The very first years were a success for the Spanish troops. However, the Dutch countered subsequent sieges in Holland. In November and December 1572, all the citizens of Zutphen and Naarden were slaughtered by the Spanish. From 11 December that year the city of Haarlem was besieged, holding out for seven months until 13 July 1573. Oudewater was conquered by the Spanish on 7 August 1575, and most of its inhabitants were killed. Maastricht was besieged, sacked and destroyed twice in succession (in 1576 and 1579) by the Spanish.
In a war composed mostly of sieges rather than battles, Governor-General Alexander Farnese proved his mettle. His strategy was to offer generous terms for the surrender of a city: there would be no more massacres or looting; historic urban privileges were retained; there was a full pardon and amnesty; return to the Catholic Church would be gradual. The conservative Catholics in the south and east supported the Spanish. Farnese recaptured Antwerp and nearly all of what became Belgium. Most of the Dutch-speaking territory in the Netherlands was taken from Spain, but not Flanders, which to this day remains part of Belgium, even though it had been the most radically anti-Spanish territory. Many Flemish fled to Holland, among them half of the population of Antwerp, three-quarters of that of Bruges and Ghent, and the entire populations of Nieuwpoort, Dunkerque and the surrounding countryside. His successful campaign gave the Catholics control of the lower half of the Low Countries, and was part of the Catholic Counter-Reformation.
The war dragged on for another half century, but the main fighting was over. The Peace of Westphalia, signed in 1648, confirmed the independence of the United Provinces from Spain. The Dutch people had begun to develop a national identity in the 15th century, but they officially remained part of the Holy Roman Empire until 1648. National identity was shaped mainly by the province people came from. Holland was the most important province by far, and the Republic of the Seven Provinces came to be known as Holland across Europe.
The Catholics in the Netherlands were an outlawed minority that had been suppressed by the Calvinists. After 1572, however, they made a striking comeback (also as part of the Catholic Counter-Reformation), setting up seminaries, reforming their Church, and sending missionaries into Protestant districts. Laity often took the lead; the Calvinist government often arrested or harassed priests who seemed too effective. Catholic numbers stabilized at about a third of the population in the Netherlands; they were strongest in the southeast.
During the Eighty Years' War the Dutch provinces became the most important trading centre of Northern Europe, replacing Flanders in this respect. During the Golden Age, there was a great flowering of trade, industry, the arts and the sciences in the Netherlands. In the 17th and 18th centuries, the Dutch were arguably the most economically wealthy and scientifically advanced of all European nations. This new, officially Calvinist nation flourished culturally and economically, creating what historian Simon Schama has called an "embarrassment of riches". Speculation in the tulip trade led to a first stock market crash in 1637, but the economic crisis was soon overcome. Due to these developments the 17th century has been dubbed the Golden Age of the Netherlands.
The invention of the sawmill enabled the construction of a massive fleet of ships for worldwide trading and for defence of the republic's economic interests by military means. National industries such as shipyards and sugar refineries expanded as well.
The Dutch, traditionally able seafarers and keen mapmakers, obtained an increasingly dominant position in world trade, a position which before had been occupied by the Portuguese and Spaniards. In 1602 the Dutch East India Company (Dutch: "Verenigde Oostindische Compagnie" or "VOC") was founded. It was the first-ever multinational corporation, financed by shares that established the first modern stock exchange. It became the world's largest commercial enterprise of the 17th century. To finance the growing trade within the region, the Bank of Amsterdam was established in 1609; it was the precursor to, if not itself, the first true central bank.
Dutch ships hunted whales off Svalbard, traded spices in India and Indonesia (via the Dutch East India Company) and founded colonies in New Amsterdam (now New York), South Africa and the West Indies. In addition some Portuguese colonies were conquered, namely in Northeastern Brazil, Angola, Indonesia and Ceylon. In 1640 the Dutch East India Company began a trade monopoly with Japan through the trading post on Dejima.
The Dutch also dominated trade between European countries. The Low Countries were favorably positioned on a crossing of east-west and north-south trade routes and connected to a large German hinterland through the Rhine river. Dutch traders shipped wine from France and Portugal to the Baltic lands and returned with grain destined for countries around the Mediterranean Sea. By the 1680s, an average of nearly 1000 Dutch ships entered the Baltic Sea each year. The Dutch were able to gain control of much of the trade with the nascent English colonies in North America and following the end of war with Spain in 1648, Dutch trade with that country also flourished.
Renaissance Humanism, of which Desiderius Erasmus (c. 1466–1536) was an important advocate, had also gained a firm foothold and was partially responsible for a climate of tolerance. Overall, levels of tolerance were sufficiently high to attract religious refugees from other countries, notably Jewish merchants from Portugal who brought much wealth with them. The revocation of the Edict of Nantes in France in 1685 resulted in the immigration of many French Huguenots, many of whom were shopkeepers or scientists. Still, tolerance had its limits, as the philosopher Baruch de Spinoza (1632–1677) would find out. Due to its climate of intellectual tolerance the Dutch Republic attracted scientists and other thinkers from all over Europe. The renowned University of Leiden (established in 1575 by the Dutch stadtholder, William of Orange, as a token of gratitude for Leiden's fierce resistance against Spain during the Eighty Years' War) in particular became a gathering place for these people. For instance, the French philosopher René Descartes lived in Leiden from 1628 until 1649.
Dutch lawyers were famous for their knowledge of international law of the sea and commercial law. Hugo Grotius (1583–1645) played a leading part in the foundation of international law. Again due to the Dutch climate of tolerance, book publishers flourished. Many books about religion, philosophy and science that might have been deemed controversial abroad were printed in the Netherlands and secretly exported to other countries. Thus during the 17th century the Dutch Republic became more and more Europe's publishing house.
Christiaan Huygens (1629–1695) was a famous astronomer, physicist and mathematician. He invented the pendulum clock, which was a major step forward towards exact timekeeping. He also contributed to the field of optics. The most famous Dutch scientist in the area of optics is certainly Anton van Leeuwenhoek, who invented or greatly improved the microscope (opinions differ) and was the first to methodically study microscopic life, thus laying the foundations for the field of microbiology. The famous Dutch hydraulic engineer Jan Leeghwater (1575–1650) gained important victories in the Netherlands' eternal battle against the sea. Leeghwater added a considerable amount of land to the republic by converting several large lakes into polders, pumping all the water out with windmills.
Painting was the dominant art form in 17th-century Holland. Dutch Golden Age painting followed many of the tendencies that dominated Baroque art in other parts of Europe, as with the Utrecht Caravaggisti, but was the leader in developing the subjects of still life, landscape, and genre painting. Portraiture was also popular, but history painting – traditionally the most elevated genre – struggled to find buyers. Church art was virtually non-existent, and little sculpture of any kind was produced. While art collecting and painting for the open market was also common elsewhere, art historians point to the growing number of wealthy Dutch middle-class and successful mercantile patrons as driving forces in the popularity of certain pictorial subjects. Today, the best-known painters of the Dutch Golden Age are the period's most dominant figure Rembrandt, the Delft master of genre Johannes Vermeer, the innovative landscape painter Jacob van Ruisdael, and Frans Hals, who infused new life into portraiture. Some notable artistic styles and trends include Haarlem Mannerism, Utrecht Caravaggism, the School of Delft, the Leiden fijnschilders, and Dutch classicism.
Due to the thriving economy, cities expanded greatly. New town halls, weighhouses and storehouses were built. Merchants who had gained a fortune ordered a new house built along one of the many new canals that were dug out in and around many cities (for defence and transport purposes), a house with an ornamented façade that befitted their new status. In the countryside, many new castles and stately homes were built; most of them have not survived. Starting in 1595, Reformed churches were commissioned, many of which are still landmarks today. The most famous Dutch architects of the 17th century were Jacob van Campen, Pieter Post, Philips Vingboons, Lieven de Key and Hendrick de Keyser. Overall, Dutch architecture, which generally combined traditional building styles with some foreign elements, did not develop to the level of painting.
The Golden Age was also an important time for developments in literature. Some of the major figures of this period were Gerbrand Adriaenszoon Bredero, Jacob Cats, Pieter Corneliszoon Hooft and Joost van den Vondel. Since Latin was the lingua franca of education, relatively few men could speak, write, and read Dutch all at the same time.
Music did not develop very much in the Netherlands since the Calvinists considered it an unnecessary extravagance, and organ music was forbidden in Reformed Church services, although it remained common at secular functions.
The Dutch West India Company was a chartered company (known as the "GWC") of Dutch merchants. On 2 June 1621, it was granted a charter for a trade monopoly in the West Indies (meaning the Caribbean) by the Republic of the Seven United Netherlands and given jurisdiction over the African slave trade, Brazil, the Caribbean, and North America. Its area of operations stretched from West Africa to the Americas and the Pacific islands. The company became instrumental in the Dutch colonization of the Americas. The first forts and settlements in Guyana and on the Amazon River date from the 1590s. Actual colonization, with Dutch settling in the new lands, was not as common as with England and France. Many of the Dutch settlements were lost or abandoned by the end of that century, but the Netherlands managed to retain possession of Suriname and a number of Dutch Caribbean islands.
The colony of New Netherland was a private business venture to exploit the fur trade in beaver pelts. It was slowly settled during its first decades, partially as a result of policy mismanagement by the Dutch West India Company (WIC) and conflicts with Native Americans. During the 1650s, the colony experienced dramatic growth and became a major port for trade in the Atlantic World, tolerating a highly diverse ethnic mix. The surrender of Fort Amsterdam to the English in 1664, which contributed to the outbreak of the Second Anglo–Dutch War, was formalized in 1667. In 1673 the Dutch re-took the area, but later relinquished it under the Treaty of Westminster of 5 April 1674, which ended the Third Anglo-Dutch War.
Descendants of the original settlers played a prominent role in the History of the United States, as typified by the Roosevelt and Vanderbilt families. The Hudson Valley still boasts a Dutch heritage. The concepts of civil liberties and pluralism introduced in the province became mainstays of American political and social life.
Although slavery was illegal inside the Netherlands it flourished in the Dutch Empire, and helped support the economy. In 1619 The Netherlands took the lead in building a large-scale slave trade between Africa and Virginia, by 1650 becoming the pre-eminent slave trading country in Europe. It was overtaken by Britain around 1700. Historians agree that in all the Dutch shipped about 550,000 African slaves across the Atlantic, about 75,000 of whom died on board before reaching their destinations. From 1596 to 1829, the Dutch traders sold 250,000 slaves in the Dutch Guianas, 142,000 in the Dutch Caribbean islands, and 28,000 in Dutch Brazil. In addition, tens of thousands of slaves, mostly from India and some from Africa, were carried to the Dutch East Indies and slaves from the East Indies to Africa and the West Indies.
The Dutch East India Company, called the VOC, began in 1602, when the government gave it a monopoly to trade with Asia, mainly to Mughal India. It had many world firsts—the first multinational corporation, the first company to issue stock, and was the first megacorporation, possessing quasi-governmental powers, including the ability to wage war, negotiate treaties, coin money, and establish colonial settlements.
England and France soon copied its model but could not match its record. Between 1602 and 1796 the VOC sent almost a million Europeans to work in the Asia trade on 4,785 ships, and it returned over 2.5 million tons of Asian trade goods. The VOC enjoyed huge profits from its spice monopoly through most of the 17th century, remaining an important trading concern and paying an 18% annual dividend for almost 200 years. It was active chiefly in the Dutch East Indies, now Indonesia, where its base was Batavia (now Jakarta); it also colonized parts of Taiwan between 1624–1662 and 1664–1667 and held Dejima, the only western trading post in Japan.
During the period of proto-industrialization, the empire received 50% of its textile imports and 80% of its silk imports from India's Mughal Empire, chiefly from its most developed region, the Bengal Subah.
By the 17th century, the Dutch East India Company had established its base in parts of Ceylon (modern-day Sri Lanka). Afterward, it established ports in Dutch-occupied Malabar, leading to Dutch settlements and trading posts in India. However, its expansion into India was halted after its defeat in the Battle of Colachel by the Kingdom of Travancore during the Travancore-Dutch War. The Dutch never recovered from the defeat and no longer posed a large colonial threat to India.
Eventually weighed down by corruption, the VOC went bankrupt in 1800. Its possessions were taken over by the government and turned into the Dutch East Indies.
In 1647, a Dutch vessel was wrecked in the present-day Table Bay at Cape Town. The marooned crew, the first Europeans to attempt settlement in the area, built a fort and stayed for a year until they were rescued. Shortly thereafter, the Dutch East India Company (in the Dutch of the day: "Vereenigde Oostindische Compagnie", or VOC) decided to establish a permanent settlement. The VOC, one of the major European trading houses sailing the spice route to East Asia, had no intention of colonizing the area, instead wanting only to establish a secure base camp where passing ships could shelter, and where hungry sailors could stock up on fresh supplies of meat, fruit, and vegetables. To this end, a small VOC expedition under the command of Jan van Riebeeck reached Table Bay on 6 April 1652.
To remedy a labour shortage, the VOC released a small number of VOC employees from their contracts and permitted them to establish farms with which they would supply the VOC settlement from their harvests. This arrangement proved highly successful, producing abundant supplies of fruit, vegetables, wheat, and wine; they also later raised livestock. The small initial group of "free burghers", as these farmers were known, steadily increased in number and began to expand their farms further north and east.
The majority of burghers had Dutch ancestry and belonged to the Calvinist Reformed Church of the Netherlands, but there were also numerous Germans as well as some Scandinavians. In 1688 the Dutch and the Germans were joined by French Huguenots, also Calvinists, who were fleeing religious persecution in France under King Louis XIV. The Huguenots in South Africa were absorbed into the Dutch population but they played a prominent role in South Africa's history.
From the beginning, the VOC used the cape as a place to supply ships travelling between the Netherlands and the Dutch East Indies. There was a close association between the cape and these Dutch possessions in the far east. Van Riebeeck and the VOC began to import large numbers of slaves, primarily from Madagascar and Indonesia. These slaves often married Dutch settlers, and their descendants became known as the Cape Coloureds and the Cape Malays.
During the 18th century, the Dutch settlement in the area of the cape grew and prospered. By the late 1700s, the Cape Colony was one of the best developed European settlements outside Europe or the Americas. The two bases of the Cape Colony's economy for almost the entirety of its history were shipping and agriculture. Its strategic position meant that almost every ship sailing between Europe and Asia stopped off at the colony's capital Cape Town. The supplying of these ships with fresh provisions, fruit, and wine provided a very large market for the surplus produce of the colony.
As some free burghers continued to expand into the rugged hinterlands of the north and east, many began to take up a semi-nomadic pastoralist lifestyle, in some ways not far removed from that of the Khoikhoi they had displaced. In addition to its herds, a family might have a wagon, a tent, a Bible, and a few guns. As they became more settled, they would build a mud-walled cottage, frequently located, by choice, days of travel from the nearest European settlement. These were the first of the Trekboers (Wandering Farmers, later shortened to Boers), completely independent of official controls, extraordinarily self-sufficient, and isolated from the government and the main settlement in Cape Town.
Dutch was the official language, but a dialect had formed that was quite distinct from Dutch. The Afrikaans language originated mainly from 17th-century Dutch dialects.
This Dutch dialect, sometimes referred to as the "kitchen language" ("kombuistaal"), would eventually, in the late 19th century, be recognised as a distinct language called Afrikaans and replace Dutch as the official language of the Afrikaners.
As the 18th century drew to a close, Dutch mercantile power began to fade and the British moved in to fill the vacuum. They seized the Cape Colony in 1795 to prevent it from falling into French hands, then briefly relinquished it back to the Dutch (1803), before definitively conquering it in 1806. British sovereignty of the area was recognised at the Congress of Vienna in 1815. By the time the Dutch colony was seized by the British in 1806, it had grown into an established settlement with 25,000 slaves, 20,000 white colonists, 15,000 Khoisan, and 1,000 freed black slaves. Outside Cape Town and the immediate hinterland, isolated black and white pastoralists populated the country.
Dutch interest in South Africa was mainly as a strategically located VOC port. Yet in the 17th and 18th centuries the Dutch created the foundation of the modern state of South Africa. The Dutch legacy in South Africa is evident everywhere, but particularly in the Afrikaner people and the Afrikaans language.
The Netherlands gained independence from Spain as a result of the Eighty Years' War, during which the Dutch Republic was founded. As the Netherlands was a republic, it was largely governed by an aristocracy of city-merchants called the regents, rather than by a king. Every city and province had its own government and laws, and a large degree of autonomy. After attempts to find a competent sovereign proved unsuccessful, it was decided that sovereignty would be vested in the various provincial Estates, the governing bodies of the provinces. The Estates-General, with its representatives from all the provinces, would decide on matters important to the Republic as a whole. However, at the head of each province was the stadtholder of that province, a position held by a descendant of the House of Orange. Usually the stadtholdership of several provinces was held by a single man.
After having gained its independence in 1648, the Netherlands tried in various coalitions to help to contain France, which had replaced Spain as the strongest nation of Europe. The end of the War of the Spanish Succession (1713) marked the end of the Dutch Republic as a major player. In the 18th century, it just tried to maintain its independence and stuck to a policy of neutrality.
The economy, based on Amsterdam's role as the center of world trade, remained robust. In 1670 the Dutch merchant marine totalled 568,000 tons of shipping—about half the European total. The province of Holland was highly commercial and dominated the country. Its nobility had little influence, for it was numerically small, politically weak, and formed a strictly closed caste. Most land in the province of Holland was commercialized for cash crops and was owned by urban capitalists, not nobles; there were few links between Holland's nobility and the merchants. By 1650 the burgher families which had grown wealthy through commerce and become influential in government controlled the province of Holland, and to a large extent shaped national policies. The other six provinces were more rural and traditional in lifestyle, had an active nobility, and played a small role in commerce and national politics. Instead they concentrated on their flood protections and land reclamation projects.
The Netherlands sheltered many notable refugees, including Protestants from Antwerp and Flanders, Portuguese and German Jews, French Protestants (Huguenots) (including Descartes) and English Dissenters (including the Pilgrim Fathers). Many immigrants came to the cities of Holland in the 17th and 18th centuries from the Protestant parts of Germany and elsewhere. The share of first-generation immigrants from outside the Netherlands in Amsterdam was nearly 50% in the 17th and 18th centuries. Indeed, Amsterdam's population consisted primarily of immigrants, if one includes second- and third-generation immigrants and migrants from the Dutch countryside. People in most parts of Europe were poor and many were unemployed, but in Amsterdam there was always work. Tolerance was important, because a continuous influx of immigrants was necessary for the economy. Travellers visiting Amsterdam reported their surprise at the lack of control over the influx.
The era of explosive economic growth is roughly coterminous with the period of social and cultural bloom that has been called the Dutch Golden Age, and it actually formed the material basis for that cultural era. Amsterdam became the hub of world trade, the center into which staples and luxuries flowed for sorting, processing, and distribution, from which they were then re-exported around Europe and the world.
From 1585 through 1622 there was a rapid accumulation of trade capital, often brought in by refugee merchants from Antwerp and other ports. The money was typically invested in high-risk ventures such as pioneering expeditions to the East Indies to engage in the spice trade. These ventures were soon consolidated in the Dutch East India Company (VOC). There were, however, similar ventures in other fields, such as the trade with Russia and the Levant. The profits of these ventures were ploughed back into the financing of new trade, which led to its exponential growth.
Rapid industrialization led to fast growth of the nonagricultural labor force and to rising real wages during the same period. In the half-century between 1570 and 1620 this labor supply increased 3 percent per annum, a truly phenomenal growth. Despite this, nominal wages were repeatedly increased, outstripping price increases. In consequence, real wages for unskilled laborers were 62 percent higher in 1615–1619 than in 1575–1579.
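To see why a 3 percent annual growth rate over half a century counts as phenomenal, a quick compound-growth calculation (a hypothetical illustration, not a figure from the source) shows that it implies the labor supply more than quadrupled between 1570 and 1620:

```python
# Hypothetical illustration: compound growth at the rate quoted in the text.
# A steady 3% per-annum increase sustained for the 50 years from 1570 to 1620
# multiplies the starting labor supply by (1.03)^50.

def growth_factor(annual_rate: float, years: int) -> float:
    """Total multiple after `years` of steady annual growth."""
    return (1 + annual_rate) ** years

factor = growth_factor(0.03, 50)
print(f"Labor supply multiple, 1570-1620: {factor:.2f}x")  # about 4.38x
```

The same compounding logic underlies the real-wage figure: nominal wage growth outpacing price growth by even a modest margin per year accumulates, over four decades, into the 62 percent real gain cited above.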
By the mid-1660s Amsterdam had reached the optimum population (about 200,000) for the level of trade, commerce and agriculture then available to support it. The city contributed the largest quota in taxes to the States of Holland which in turn contributed over half the quota to the States General. Amsterdam was also one of the most reliable in settling tax demands and therefore was able to use the threat to withhold such payments to good effect.
Amsterdam was governed by a body of regents, a large, but closed, oligarchy with control over all aspects of the city's life, and a dominant voice in the foreign affairs of Holland. Only men with sufficient wealth and a long enough residence within the city could join the ruling class. The first step for an ambitious and wealthy merchant family was to arrange a marriage with a long-established regent family. In the 1670s one such union, that of the Trip family (the Amsterdam branch of the Swedish arms makers) with the son of Burgomaster Valckenier, extended the influence and patronage available to the latter and strengthened his dominance of the council. The oligarchy in Amsterdam thus gained strength from its breadth and openness. In the smaller towns, family interest could unite members on policy decisions, but contraction through intermarriage could lead to a decline in the quality of the members.
In Amsterdam the network was so large that members of the same family could be related to opposing factions and pursue widely separated interests. The young men who had risen to positions of authority in the 1670s and 1680s consolidated their hold on office well into the 1690s and even the new century.
Amsterdam's regents provided good services to residents. They spent heavily on the water-ways and other essential infrastructure, as well as municipal almshouses for the elderly, hospitals and churches.
Amsterdam's wealth was generated by its commerce, which was in turn sustained by the judicious encouragement of entrepreneurs whatever their origin. This open-door policy has been interpreted as proof of a tolerant ruling class. But toleration was practiced for the convenience of the city. Therefore, the wealthy Sephardic Jews from Portugal were welcomed and accorded all privileges except those of citizenship, but the poor Ashkenazi Jews from Eastern Europe were far more carefully vetted, and those who became dependent on the city were encouraged to move on. Similarly, provision for the housing of Huguenot immigrants was made in 1681, when Louis XIV's religious policy was beginning to drive these Protestants out of France; no encouragement was given to the dispossessed Dutch from the countryside or other towns of Holland. The regents encouraged immigrants to build churches and provided sites or buildings for churches and temples for all except the most radical sects and, until the 1670s, the Catholics (although even the Catholics could practice quietly in a chapel within the Begijnhof).
During the wars a tension had arisen between the Orange-Nassau leaders and the patrician merchants. The former—the Orangists—were soldiers and centralizers who seldom spoke of compromise with the enemy and looked for military solutions. They included many rural gentry as well as ordinary folk attached to the banner of the House of Orange. The latter group, the Republicans, led by the Grand Pensionary (a sort of prime minister) and the regents, stood for localism, municipal rights, commerce, and peace. In 1650, the stadtholder William II, Prince of Orange suddenly died; his son was a baby and the Orangists were leaderless. The regents seized the opportunity: there would be no new stadtholder in Holland for 22 years. Johan de Witt, a brilliant politician and diplomat, emerged as the dominant figure. Princes of Orange returned as stadtholders, in effect hereditary rulers, in 1672 and again in 1748. The Dutch Republic of the United Provinces was thus a true republic from 1650 to 1672 and from 1702 to 1748, periods known as the First Stadtholderless Period and the Second Stadtholderless Period.
The Republic and England were major rivals in world trade and naval power. Halfway through the 17th century the Republic's navy was the rival of Britain's Royal Navy as the most powerful navy in the world. The Republic fought a series of three naval wars against England in 1652–74.
In 1651, England imposed its first Navigation Act, which severely hurt Dutch trade interests. An incident at sea concerning the Act resulted in the First Anglo-Dutch War, which lasted from 1652 to 1654, ending in the Treaty of Westminster (1654), which left the Navigation Act in effect.
After the English Restoration in 1660, Charles II tried to serve his dynastic interests by attempting to make Prince William III of Orange, his nephew, stadtholder of the Republic, using some military pressure. King Charles thought a naval war would weaken the Dutch traders and strengthen the English economy and empire, so the Second Anglo-Dutch War was launched in 1665. At first many Dutch ships were captured and the English scored great victories. However, the Raid on the Medway in June 1667 ended the war with a Dutch victory. The Dutch recovered their trade, while the English economy was seriously hurt and its treasury nearly bankrupt. The greatly expanded Dutch navy was for years afterwards the world's strongest. The Dutch Republic was at the zenith of its power.
The year 1672 is known in the Netherlands as the "Disaster Year" ("Rampjaar"). England declared war on the Republic (the Third Anglo-Dutch War), followed by France, Münster and Cologne, which had all signed alliances against the Republic. France, Cologne and Münster invaded the Republic. Johan de Witt and his brother Cornelis, who had accomplished a diplomatic balancing act for a long time, were now the obvious scapegoats. They were lynched, and a new stadtholder, William III, was appointed.
An Anglo-French attempt to land on the Dutch shore was barely repelled in three desperate naval battles under the command of Admiral Michiel de Ruyter. The advance of French troops from the south was halted by a costly inundation of the Republic's own heartland, achieved by breaching river dikes. With the aid of friendly German princes, the Dutch succeeded in fighting off Cologne and Münster, after which peace was signed with both of them, although some territory in the east was lost forever. Peace was signed with England as well, in 1674 (Second Treaty of Westminster). In 1678, peace was made with France at the Treaty of Nijmegen, although France's Spanish and German allies felt betrayed by this.
In 1688, relations with England reached crisis level once again. Stadtholder William III decided he had to take a huge gamble when he was invited to invade England by Protestant British nobles feuding with William's father-in-law, the Catholic James II of England. This led to the Glorious Revolution and cemented the principle of parliamentary rule and Protestant ascendancy in England. James fled to France, and William ascended to the English throne as co-monarch with his wife Mary, James' eldest daughter. This manoeuvre secured England as a critical ally of the United Provinces in its ongoing wars with Louis XIV of France. William was the commander of the Dutch and English armies and fleets until his death in 1702. During William's reign as King of England, his primary focus was leveraging British manpower and finances to aid the Dutch against the French. The combination continued after his death as the combined Dutch, British, and mercenary army conquered Flanders and Brabant, and invaded French territory before the alliance collapsed in 1713 due to British political infighting.
The "Second Stadtholderless Period" is the designation in Dutch historiography of the period between the death of stadtholder William III on 19 March 1702 and the appointment of William IV, Prince of Orange as stadtholder and captain general in all provinces of the Dutch Republic on 2 May 1747. During this period the office of stadtholder was left vacant in the provinces of Holland, Zeeland, and Utrecht, though in other provinces that office was filled by members of the House of Nassau-Dietz (later called Orange-Nassau) during various periods.
During the period, the Republic lost its Great-Power status and its primacy in world trade, processes that went hand-in-hand, the latter causing the former. Though the economy declined considerably, causing deindustrialization and deurbanization in the maritime provinces, a "rentier" class kept accumulating a large capital fund that formed the basis for the leading position the Republic achieved in the international capital market. A military crisis at the end of the period caused the fall of the States-Party regime and the restoration of the Stadtholderate in all provinces. However, though the new stadtholder acquired near-dictatorial powers, this did not improve the situation.
The slow economic decline after 1730 was relative: other countries grew faster, eroding the Dutch lead and surpassing it. Wilson identifies three causes. First, Holland lost its world dominance in trade as competitors emerged and copied its practices, built their own ships and ports, and traded on their own account directly without going through Dutch intermediaries. Second, there was no growth in manufacturing, due perhaps to a weaker sense of industrial entrepreneurship and to the high wage scale. Third, the wealthy turned their investments to foreign loans. This helped jump-start other nations and provided the Dutch with a steady income from collecting interest, but left them with few domestic sectors with a potential for rapid growth.
After the Dutch fleet declined, merchant interests became dependent on the goodwill of Britain. The main focus of Dutch leaders was reducing the country's considerable budget deficits. Dutch trade and shipping remained at a fairly steady level through the 18th century, but no longer had a near monopoly and also could not match growing English and French competition. The Netherlands lost its position as the trading centre of Northern Europe to London.
Although the Netherlands remained wealthy, investment opportunities for the nation's money became more difficult to find. Some capital went into purchases of land for estates, but most went into foreign bonds, and Amsterdam remained one of Europe's banking capitals.
Dutch culture also declined both in the arts and sciences. Literature for example largely imitated English and French styles with little in the way of innovation or originality. The most influential intellectual was Pierre Bayle (1647–1706), a Protestant refugee from France who settled in Rotterdam where he wrote the massive Dictionnaire Historique et Critique ("Historical and Critical Dictionary", 1696). It had a major impact on the thinking of The Enlightenment across Europe, giving an arsenal of weapons to critics who wanted to attack religion. It was an encyclopaedia of ideas that argued that most "truths" were merely opinions, and that gullibility and stubbornness were prevalent.
Life for the average Dutchman became slower and more relaxed than in the 17th century. The upper and middle classes continued to enjoy prosperity and high living standards. The drive to succeed seemed less urgent. Unskilled laborers remained locked in poverty and hardship. The large underclass of unemployed beggars and riffraff required government and private charity to survive.
Religious life became more relaxed as well. Catholics grew from 18% to 23% of the population during the 18th century and enjoyed greater tolerance, even as they continued to be outside the political system. They became divided by the feud between moralistic Jansenists (who denied free will) and orthodox believers. One group of Jansenists formed a splinter sect, the Old Catholic Church in 1723. The upper classes willingly embraced the ideas of the Enlightenment, tempered by the tolerance that meant less hostility to organized religion compared to France.
During the term of Anthonie van der Heim as Grand Pensionary from 1737 to 1746, the Republic slowly drifted into the War of the Austrian Succession. This started as a Prusso-Austrian conflict, but eventually all the neighbours of the Dutch Republic became involved. On one side were Prussia, France and their allies, and on the other Austria, Britain (after 1744) and their allies. At first the Republic strove to remain neutral in this European conflict, but it maintained garrisons in a number of fortresses in the Austrian Netherlands. French grievances and threats spurred the Republic into bringing its army up to European standards (84,000 men in 1743).
In 1744 and 1745 the French attacked Dutch fortresses at Menen and Tournai. This prompted the Dutch Republic in 1745 to join the Quadruple Alliance, but this alliance was severely defeated at the Battle of Fontenoy in May 1745. In 1746 the French occupied most of the large cities in the Austrian Netherlands. Then, in April 1747, apparently as an exercise in armed diplomacy, a relatively small French military force occupied Zeelandic Flanders, part of the Dutch Republic.
This relatively innocuous invasion fully exposed the rot underlying the Dutch defences. The consequences were spectacular. Still mindful of the French invasion in the "Disaster Year" of 1672, many fearful people clamored for the restoration of the stadtholderate. William IV, Prince of Orange, had been waiting impatiently in the wings since acquiring his princely title in 1732. Over the next year he and his supporters engaged in a number of political battles in various provinces and towns in the Netherlands to wrest control from the regents. The aim was for William IV to obtain a firm grip on government patronage and place loyal officials in all strategic government positions. Eventually he managed to achieve this aim in all provinces.
Willem Bentinck van Rhoon was a prominent Orangist. People like Bentinck hoped that gathering the reins of power in the hands of a single "eminent head" would soon help restore the state of the Dutch economy and finances. The regents they opposed included the Grand Pensionary Jacob Gilles and Adriaen van der Hoop. This popular revolt had religious, anti-Catholic and democratic overtones and sometimes involved mob violence. It eventually involved political agitation by Daniel Raap, Jean Rousset de Missy and the Doelisten, attacks on tax farmers (pachtersoproer), religious agitation for enforcement of the Sabbath laws and preference for followers of Gisbertus Voetius and various demands by the civil militia.
The war against the French was itself brought to a not-too-devastating end for the Dutch Republic with the Treaty of Aix-la-Chapelle (1748). The French retreated of their own accord from the Dutch frontier. William IV died unexpectedly, at the age of 40, on 22 October 1751.
His son, William V, was 3 years old when his father died, and a long regency characterised by corruption and misrule began. His mother delegated most of the powers of the regency to Bentinck and her favorite, Duke Louis Ernest of Brunswick-Lüneburg. All power was concentrated in the hands of an unaccountable few, including the Frisian nobleman Douwe Sirtema van Grovestins. Still a teenager, William V assumed the position of stadtholder in 1766, the last to hold that office. In 1767, he married Princess Wilhelmina of Prussia, the daughter of Augustus William of Prussia, niece of Frederick the Great.
The position of the Dutch during the American War of Independence was one of neutrality. William V, leading the pro-British faction within the government, blocked attempts by pro-independence, and later pro-French, elements to drag the government to war. However, things came to a head with the Dutch attempt to join the Russian-led League of Armed Neutrality, leading to the outbreak of the disastrous Fourth Anglo-Dutch War in 1780. After the signing of the Treaty of Paris (1783), the impoverished nation grew restless under William's rule.
An English historian summed him up uncharitably as "a Prince of the profoundest lethargy and most abysmal stupidity." And yet he would guide his family through the difficult French-Batavian period and his son would be crowned king.
The Fourth Anglo–Dutch War (1780–1784) was a conflict between the Kingdom of Great Britain and the Dutch Republic. The war, tangentially related to the American Revolutionary War, broke out over British and Dutch disagreements on the legality and conduct of Dutch trade with Britain's enemies in that war.
Although the Dutch Republic did not enter into a formal alliance with the United States and their allies, U.S. ambassador (and future President) John Adams managed to establish diplomatic relations with the Dutch Republic, making it the second European country to diplomatically recognize the Continental Congress in April 1782. In October 1782, a treaty of amity and commerce was concluded as well.
Most of the war consisted of a series of largely successful British operations against Dutch colonial economic interests, although British and Dutch naval forces also met once off the Dutch coast. The war ended disastrously for the Dutch and exposed the weakness of the political and economic foundations of the country. The Treaty of Paris (1784), according to Fernand Braudel, "sounded the knell of Dutch greatness."
After the war with Great Britain ended disastrously in 1784, there was growing unrest and a rebellion by the anti-Orangist Patriots. The French Revolution resulted first in the establishment of a pro-French Batavian Republic (1795–1806), then the creation of the Kingdom of Holland, ruled by a member of the House of Bonaparte (1806–1810), and finally annexation by the French Empire (1810–1813).
Influenced by the American Revolution, the Patriots sought a more democratic form of government. The opening shot of this revolution is often considered to be the 1781 publication of a manifesto called "Aan het Volk van Nederland" ("To the People of the Netherlands") by Joan van der Capellen tot den Pol, who would become an influential leader of the Patriot movement. Their aim was to reduce corruption and the power held by the stadtholder, William V, Prince of Orange.
Support for the Patriots came mostly from the middle class. They formed militias called "exercitiegenootschappen". In 1785, there was an open Patriot rebellion, which took the form of an armed insurrection by local militias in certain Dutch towns, "Freedom" being the rallying cry. Herman Willem Daendels attempted to organise an overthrow of various municipal governments (vroedschap). The goal was to oust government officials and force new elections. "Seen as a whole this revolution was a string of violent and confused events, accidents, speeches, rumours, bitter enmities and armed confrontations", wrote French historian Fernand Braudel, who saw it as a forerunner of the French Revolution.
In 1785 the stadtholder left The Hague and moved his court to Nijmegen in Guelders, a city remote from the heart of Dutch political life. In June 1787, his energetic wife Wilhelmina (the sister of Frederick William II of Prussia) tried to travel to The Hague. Outside Schoonhoven, she was stopped by Patriot militiamen and taken to a farm near Goejanverwellesluis. Within two days she was forced to return to Nijmegen, an insult not unnoticed in Prussia.
The House of Orange reacted with severity, relying on Prussian troops led by Charles William Ferdinand, Duke of Brunswick and a small contingent of British troops to suppress the rebellion. Dutch banks at this time still held much of the world's capital. Government-sponsored banks owned up to 40% of Great Britain's national debt and there were close connections to the House of Stuart. The stadtholder had supported British policies after the American Revolution.
This severe military response overwhelmed the Patriots and put the stadtholder firmly back in control. A small unpaid Prussian army was billeted in the Netherlands and supported itself by looting and extortion. The exercitiegenootschappen continued urging citizens to resist the government. They distributed pamphlets, formed "Patriot Clubs" and held public demonstrations. The government responded by pillaging those towns where opposition continued. Five leaders were sentenced to death (but fled first). Lynchings also occurred. For a while, no one dared appear in public without an orange cockade to show their support for Orangism. Many Patriots, perhaps around 40,000 in all, fled to Brabant, France (especially Dunkirk and St. Omer) and elsewhere. However, before long the French became involved in Dutch politics and the tide turned.
The French Revolution was popular, and numerous underground clubs were promoting it when in January 1795 the French army invaded. The underground rose up, overthrew the municipal and provincial governments, and proclaimed the Batavian Republic in Amsterdam. Stadtholder William V fled to England and the States General dissolved itself. The new government was virtually a puppet of France. The Batavian Republic enjoyed widespread support and sent soldiers to fight in the French armies. The 1799 Anglo-Russian invasion of Holland was repulsed by Batavian–French forces. Nevertheless, Napoleon replaced it because the regime of Grand Pensionary Rutger Jan Schimmelpenninck (1805–06) was insufficiently docile.
The confederal structure of the old Dutch Republic was permanently replaced by a unitary state. The 1798 constitution had a genuinely democratic character, though a coup d'état of 1801 put an authoritarian regime in power. Ministerial government was introduced for the first time in Dutch history and many of the current government departments date their history back to this period. Meanwhile, the exiled stadholder handed over the Dutch colonies in "safekeeping" to Great Britain and ordered the colonial governors to comply. This permanently ended the colonial empire in Guyana, Ceylon and the Cape Colony. The Dutch East Indies was returned to the Netherlands under the Anglo-Dutch Treaty of 1814.
In 1806 Napoleon restyled the Netherlands (along with a small part of what is now Germany) into the Kingdom of Holland, putting his brother Louis Bonaparte (1778–1846), on the throne. The new king was unpopular, but he was willing to cross his brother for the benefit of his new kingdom. Napoleon forced his abdication in 1810 and incorporated the Netherlands directly into the French empire, imposing economic controls and conscription of all young men as soldiers. When the French retreated from the northern provinces in 1813, a Triumvirate took over at the helm of a provisional government. Although most members of the provisional government had been among the men who had driven out William V 18 years earlier, the leaders of the provisional government knew that any new regime would have to be headed by his son, William Frederick. They also knew that it would be better in the long term if the Dutch people themselves installed the prince, rather than have him imposed on the country by the anti-French alliance. Accordingly, the Triumvirate called William Frederick back on 30 November and offered him the crown. He refused, but instead proclaimed himself "hereditary sovereign prince" on 6 December.
The Great Powers had secretly agreed to merge the northern Netherlands with the more populated Austrian Netherlands and the smaller Prince-Bishopric of Liège into a single constitutional monarchy. Having a stronger country on France's northern border was considered (especially by Tsar Alexander) to be an important part of the strategy to keep France's power in check. In 1814, William Frederick gained sovereignty over the Austrian Netherlands and Liège as well. Thus, William Frederick had fulfilled his family's three-century quest to unite the Low Countries under a single rule.
On 15 March 1815, with the encouragement of the powers gathered at the Congress of Vienna, William Frederick raised the Netherlands to the status of a kingdom and proclaimed himself King William I. This was made official later in 1815, when the Low Countries were formally recognized as the United Kingdom of the Netherlands. The crown was made a hereditary office of the House of Orange-Nassau.
William I became king and also hereditary Grand Duke of Luxembourg, which was part of the Netherlands but at the same time part of the German Confederation. The newly created country had two capitals: Amsterdam and Brussels. The new nation had two equal parts. The north (the Netherlands proper) had 2 million people. They spoke chiefly Dutch but were divided religiously between a Protestant majority and a large Catholic minority. The south (which would be known as "Belgium" after 1830) had a population of 3.4 million people. Nearly all were Catholic, but the population was divided between French-speaking Walloons and Dutch-speaking Flemings. The upper and middle classes in the south were mostly French-speaking. About 60,000 Belgians were eligible to vote, compared to about 80,000 Dutchmen. Officially Amsterdam was the capital, but in a compromise the government met alternately in Brussels and The Hague.
Adolphe Quetelet (1796–1874), the great Belgian statistician, calculated that the new nation was significantly better off than other states. Mortality was low, the food supply was good, education was good, public awareness was high and the charity rate was the highest in the world. The best years were in the mid-1820s.
The quality of schooling was dismal, however. According to Schama, about 1800 the local school teacher was the "humble auxiliary of the local priest. Despised by his co-villagers and forced to subsist on the gleanings of the peasants, he combined drumming the catechism into the heads of his unruly charges with the duties of winding the town clock, ringing the church bells or digging its graves. His principal use to the community was to keep its boys out of mischief when there was no labour for them in the fields, or setting the destitute orphans of the town to the 'useful arts' of picking tow or spinning crude flax. As one would expect, standards in such an occupation were dismal." But in 1806 the Dutch, led by Adriaan van den Ende, energetically set out to modernise education, focusing on a new system for advanced training of teachers with an elaborate system of inspectors, training courses, teacher examinations and teaching societies. By 1826, although the Netherlands was much smaller than France, its national government was spending 12 times more than Paris on education.
William I, who reigned from 1815 to 1840, had great constitutional power. An enlightened despot, he accepted the modernizing transformations of the previous 25 years, including equality of all before the law. However, he resurrected the estates as a political class and elevated a large number of people to the nobility. Voting rights were still limited, and only the nobility were eligible for seats in the upper house. The old provinces were reestablished in name only. The government was now fundamentally unitary, and all authority flowed from the center.
William I was a Calvinist and unsympathetic to the religious culture and practices of the Catholic majority. He promulgated the "Fundamental Law of Holland", with some modifications. This entirely overthrew the old order of things in the southern Netherlands: it abolished the privileges of the Catholic Church, and guaranteed equal protection to every religious creed and the enjoyment of the same civil and political rights to every subject of the king. It reflected the spirit of the French Revolution and in so doing did not please the Catholic bishops in the south, who had detested the Revolution.
William I actively promoted economic modernization. The first 15 years of the Kingdom showed progress and prosperity, as industrialization proceeded rapidly in the south, where the Industrial Revolution allowed entrepreneurs and labor to combine in a new textile industry, powered by local coal mines. There was little industry in the northern provinces, but most overseas colonies were restored, and highly profitable trade resumed after a 25-year hiatus. Economic liberalism combined with moderate monarchical authoritarianism to accelerate the adaptation of the Netherlands to the new conditions of the 19th century. The country prospered until a crisis arose in relations with the southern provinces.
William was determined to create a united people, even though the north and south had drifted far apart in the past three centuries. Protestants were the largest denomination in the North (population 2 million), but formed a quarter of the population in the overwhelmingly Catholic South (population 3.5 million). Nevertheless, Protestants dominated William's government and army. The Catholics did not consider themselves an integral part of the United Netherlands, preferring instead to identify with mediaeval Dutch culture. Other factors that contributed to this feeling were economic (the South was industrialising, while the North had always been a merchants' nation) and linguistic (French was spoken in Wallonia and by a large part of the bourgeoisie in Flemish cities).
After having been dominant for centuries, the French-speaking elite in the Southern Netherlands now felt like second-class citizens.
In the Catholic South, William's policies were unpopular. The French-speaking Walloons strenuously rejected his attempt to make Dutch the universal language of government, while the population of Flanders was divided. Some Flemings spoke a Dutch dialect ("Flemish") and welcomed the encouragement of Dutch with a revival of literature and popular culture. Other Flemings, notably the educated bourgeoisie, preferred to speak French. Although Catholics possessed legal equality, they resented their subordination to a government that was fundamentally Protestant in spirit and membership, given that Catholicism had been the state church in the south for centuries. Few Catholics held high office in state or army. Furthermore, political liberals in the south complained about the king's authoritarian methods. All southerners complained of underrepresentation in the national legislature. Although the south was industrializing and was more prosperous than the north, the accumulated grievances allowed the multiple opposition forces to coalesce.
The outbreak of revolution in France in 1830 was a signal for action, at first on behalf of autonomy for Belgium, as the southern provinces were now called, and later on behalf of total independence. William dithered and his half-hearted efforts to reconquer Belgium were thwarted both by the efforts of the Belgians themselves and by the diplomatic opposition of the great powers.
At the London Conference of 1830, the chief powers of Europe ordered (in November 1830) an armistice between the Dutch and the Belgians. The first draft for a treaty of separation of Belgium and the Netherlands was rejected by the Belgians. A second draft (June 1831) was rejected by William I, who resumed hostilities. Franco-British intervention forced William to withdraw Dutch forces from Belgium late in 1831, and in 1833 an armistice of indefinite duration was concluded. Belgium was effectively independent but William's attempts to recover Luxembourg and Limburg led to renewed tension. The London Conference of 1838–39 prepared the final Dutch-Belgian separation treaty of 1839. It divided Luxembourg and Limburg between the Dutch and Belgian crowns. The Kingdom of the Netherlands thereafter was made up of the 11 northern provinces.
The Netherlands did not industrialize as rapidly as Belgium after 1830, but it was prosperous enough. Griffiths argues that certain government policies facilitated the emergence of a national economy in the 19th century. They included the abolition of internal tariffs and guilds, a unified coinage system, modern methods of tax collection, standardized weights and measures, and the building of many roads, canals, and railroads. However, compared to Belgium, which was leading in industrialization on the Continent, the Netherlands moved slowly. Possible explanations for this difference are the higher costs due to geography and high wages, and the emphasis of entrepreneurs on trade rather than industry.
For example, in the Dutch coastal provinces agricultural productivity was relatively high. Hence, industrial growth arrived relatively late – after 1860 – because incentives to move to labour-intensive industry were quite weak.
However, the provinces of North Brabant and Overijssel did industrialize, and they became the most economically advanced areas of the country.
As in the rest of Europe, the 19th century saw the gradual transformation of the Netherlands into a modern middle-class industrial society. The number of people employed in agriculture decreased, while the country made a strong effort to revive its stake in the highly competitive shipping and trade business. The Netherlands lagged behind Belgium in industrialization until the late 19th century, catching up around 1920. Major industries included textiles and (later) the great Philips industrial conglomerate. Rotterdam became a major shipping and manufacturing center. Poverty slowly declined and begging largely disappeared, as working conditions for the population steadily improved.
In 1840 William I abdicated in favor of his son, William II, who attempted to carry on the policies of his father in the face of a powerful liberal movement. In 1848 unrest broke out all over Europe. Although there were no major events in the Netherlands, these foreign developments persuaded King William II to agree to liberal and democratic reform. That same year Johan Rudolf Thorbecke, a prominent liberal, was asked by the king to draft a constitution that would turn the Netherlands into a constitutional monarchy. The new constitution was proclaimed on 3 November 1848. It severely limited the king's powers (making the government accountable only to an elected parliament), and it protected civil liberties. The new liberal constitution, which put the government under the control of the States General, was accepted by the legislature in 1848. The relationship between monarch, government and parliament has remained essentially unchanged ever since.
William II was succeeded by William III in 1849. The new king reluctantly chose Thorbecke to head the new government, which introduced several liberal measures, notably the extension of suffrage. However, Thorbecke's government soon fell, when Protestants rioted against the Vatican's reestablishment of the Catholic episcopate, in abeyance since the 16th century. A conservative government was formed, but it did not undo the liberal measures, and the Catholics were finally given equality after two centuries of subordination. Dutch political history from the middle of the 19th century until the First World War was fundamentally one of the extension of liberal reforms in government, the reorganization and modernization of the Dutch economy, and the rise of trade unionism and socialism as working-class movements independent of traditional liberalism. The growth in prosperity was enormous, as real per capita GNP soared from 106 guilders in 1804 to 403 in 1913.
Religion was a contentious issue with repeated struggles over the relations of church and state in the field of education. In 1816, the government took full control of the Dutch Reformed Church ("Nederlands Hervormde Kerk"). In 1857, all religious instruction was ended in public schools, but the various churches set up their own schools, and even universities. Dissident members broke away from the Dutch Reformed Church in the Secession of 1834. They were harassed by the government under an onerous Napoleonic law prohibiting gatherings of more than 20 members without a permit. After the harassment ended in the 1850s, a number of these dissidents eventually created the Christian Reformed Church in 1869; thousands migrated to Michigan, Illinois, and Iowa in the United States. By 1900, the dissidents represented about 10% of the population, compared to 45% of the population who were in the Dutch Reformed Church, which continued to be the only church to receive state money.
At mid-century, most Dutch belonged either to the Dutch Reformed Church or dissenter groups that separated from it (around 55%), or the Roman Catholic Church (35% to 40%), together with smaller Protestant (for example, Lutheran) and Jewish groups. A large and powerful sector of nominal Protestants were in fact secular liberals seeking to minimize religious influence. In reaction a novel alliance developed with Catholics and devout Calvinists joining against secular liberals. The Catholics, who had been loosely allied with the liberals in earlier decades, turned against them on the issue of state support, which the liberals insisted should be granted only to public schools, and joined with Protestant political parties in demanding equal state support to schools maintained by religious groups.
The Netherlands remained one of the most tolerant countries in Europe towards religious belief, although conservative Protestants objected to the liberalization of the Dutch Reformed Church during the 19th century and faced opposition from the government when they tried to establish separate communities (Catholics and other non-Protestants were left unmolested by Dutch authorities). Some moved to the United States as a consequence, but as the century drew to a close, religious persecution had totally ceased.
Dutch social and political life became divided by fairly clear-cut internal borders that were emerging as the society pillarized into three separate parts based on religion. The economy was not affected. One of the people most responsible for designing pillarization was Abraham Kuyper (1837–1920), a leading politician, neo-Calvinist theologian, and journalist. Kuyper established orthodox Calvinist organizations, and also provided a theoretical framework by developing such concepts as "sphere-sovereignty" that celebrated Dutch society as a society of organized minorities. "Verzuiling" ("pillarization" or "pluralism") after 1850 became the solution to the danger of internal conflict. Everyone was part of one (and only one) pillar ("zuil") based chiefly on religion (Protestant, Catholic, secular). The secular pillar eventually split into a socialist/working class pillar and a liberal (pro-business) secular pillar. Each pillar built a full set of its own social organizations, including churches (for the religious pillars), political parties, schools, universities, labor unions, sport clubs, boy scout unions and other youth clubs, and newspapers. The members of different "zuilen" lived in close proximity in cities and villages, spoke the same language, and did business with one another, but seldom interacted informally and rarely intermarried. In politics Kuyper formed the Anti-Revolutionary Party (ARP) in 1879, and headed it until 1905.
Pillarization was officially recognized in the Pacification of 1917, whereby socialists and liberals achieved their goal of universal male suffrage and the religious parties were guaranteed equal funding of all schools. In 1930 radio was organized so that each pillar had full control of its own network. When television began in the late 1940s, the pillars divided up time equally on the one station. In politics and civic affairs, leaders of the pillar organizations cooperated and acknowledged the rights of the other pillars, so public life generally ran smoothly.
The late 19th century saw a cultural revival. The Hague School brought a revival of realist painting, 1860–1890. The most famous Dutch painter of the era was Vincent van Gogh, though he spent most of his career in France. Literature, music, architecture and science also flourished. A representative leader of science was Johannes Diderik van der Waals (1837–1923), a working-class youth who taught himself physics, earned a PhD at the nation's leading university, Leiden, and in 1910 won the Nobel Prize for his discoveries in thermodynamics. Hendrik Lorentz (1853–1928) and his student Pieter Zeeman (1865–1943) shared the 1902 Nobel Prize in physics. Other notable scientists included biologist Hugo de Vries (1848–1935), who rediscovered Mendelian genetics.
In 1890, William III died after a long reign and was succeeded by his young daughter, Queen Wilhelmina (1880–1962). She would rule the Netherlands for 58 years. On her accession to the throne, the personal union between the Netherlands and Luxembourg ended because Luxembourg law excluded women from rule. Her remote cousin Adolphe became the Grand Duke of Luxembourg.
This was a time of further growth and colonial development, but it was marked by the difficulties of the First World War (in which the Netherlands remained neutral) and the Great Depression. The Dutch population grew rapidly in the 20th century, as death rates fell, more land was opened up, and industrialisation created urban jobs. Between 1900 and 1950 the population doubled from 5.1 to 10 million people.
The Dutch empire comprised the Dutch East Indies (Indonesia), as well as Surinam in South America and some minor possessions. It was smaller in 1945 than in 1815 because the Netherlands was the only colonial power that did not expand into Africa or elsewhere. The empire was run from Batavia (in Java), where the governor and his technical experts had almost complete authority with little oversight from The Hague. Successive governors improved their bureaucratic and military controls and allowed very little voice to the locals until the 1920s.
The colony brought economic opportunity to the mother country, and there was little concern about it at the time. One exception came in 1860, when Eduard Dekker, under the pen name "Multatuli", wrote the novel "Max Havelaar", one of the most notable books in the history of Dutch literature. He criticized the exploitation of the colony and also had harsh words for the indigenous princes who collaborated with the governor. The book helped inspire the Indonesian independence movement in the mid-20th century, as well as the "Fair trade" movement for coffee at the end of the century.
The military forces in the Dutch East Indies were controlled by the governor and were not part of the regular Dutch army. The Dutch slowly expanded their holdings from their base in Java to include all of modern Indonesia by 1920. Most islands were not a problem, but there was a long, costly campaign against the Achin (Aceh) state in northern Sumatra.
The Netherlands had not fought a major military campaign since the 1760s, and the strength of its armed forces had gradually dwindled. The Dutch decided not to ally themselves with any power and kept out of all European wars, especially the First World War, which swirled around it.
The German war plan of 1905 (the Schlieffen Plan) was modified in 1908 to invade Belgium on the way to Paris, but not the Netherlands. The Netherlands supplied many essential raw materials to Germany, such as rubber, tin, quinine, oil and food, while the British used their blockade to limit what the Dutch could pass on. Other factors also made it expedient for both the Allies and the Central Powers that the Netherlands remain neutral. The Netherlands controlled the mouths of the Scheldt, the Rhine and the Meuse. Germany had an interest in the Rhine, which ran through the industrial areas of the Ruhr and connected them with the Dutch port of Rotterdam; Britain had an interest in the Scheldt, and the Meuse flowed from France. Each country had an interest in keeping the others out of the Netherlands so that its own access to the rivers could not be taken away or changed. If one country had invaded the Netherlands, another would certainly have counterattacked to defend its own interest in the rivers. That was too big a risk for any of the belligerents, and none wanted to open another front.
The Dutch were nevertheless affected by the war. Troops were mobilized and conscription was introduced in the face of harsh criticism from opposition parties, and in 1918 mutinies broke out in the military. Food shortages were extensive, because the belligerents each demanded their share of Dutch produce; the price of potatoes rose sharply because Britain demanded so much from the Dutch, and food riots broke out in the country. Smuggling became a major problem. Once Germany conquered Belgium, the Allies treated it as enemy territory and stopped exporting to it; food became scarce for the Belgian people, since the Germans seized what there was, and this gave the Dutch the opportunity to smuggle. That in turn caused great problems in the Netherlands itself, including inflation and further food shortages. The Allies demanded that the Dutch stop the smuggling, and the government took measures to preserve its neutrality: many cities were placed under a 'state of siege', and on 8 January 1916 a zone was created along the border in which goods could be moved on main roads only with a permit. The German authorities in Belgium erected an electrified fence along the entire Belgian–Dutch border, which cost many Belgian refugees their lives; it was guarded by older German Landsturm soldiers.
Although both houses of the Dutch Parliament were elected by the people, only men with high incomes were eligible to vote until 1917, when pressure from socialist movements led to elections in which all men, regardless of income, were entitled to vote. In 1919, women also obtained the right to vote.
The worldwide Great Depression, which began after the Black Tuesday crash of 1929 and continued into the early 1930s, had crippling effects on the Dutch economy, lasting longer there than in most other European countries. The long duration of the depression in the Netherlands is often explained by the very strict fiscal policy of the Dutch government at the time and its decision to adhere to the gold standard much longer than most of its trading partners. The depression led to high unemployment and widespread poverty, as well as increasing social unrest.
The rise of Nazism in Germany did not go unnoticed in the Netherlands, and there was growing concern at the possibility of armed conflict, but most Dutch people expected that Germany would again respect Dutch neutrality.
There were separate fascist and Nazi movements in the 1930s. Dutch Fascists admired Mussolini's Italy and called for a traditional corporate ideology. The membership was small, elitist and ineffective. The pro-Nazi movement, however, won support from Berlin and attempted to build a mass base by 1935. It failed because most Dutch rejected its racial ideology and calls for violence.
The defence budget was not increased until Germany remilitarised the Rhineland in 1936. The budget was further increased in 1938 (after the annexation of Austria and occupation of the Czech Sudetenland). The colonial government also increased its military budget because of increasing tensions with Japan. The Dutch did not mobilise their armed forces until shortly before France and the UK declared war on Germany in September 1939 after the invasion of Poland. Neutrality was still the official policy, but the Dutch government tried to buy new arms for their badly equipped forces; however, a considerable share of ordered weapons never arrived.
At the outbreak of World War II in 1939, the Netherlands once again declared its neutrality. However, on 10 May 1940, Nazi Germany launched an attack on the Netherlands and Belgium and quickly overran most of the two countries. Fighting against the Dutch army proved to be more of a burden than foreseen; the northern attack was stopped dead, the one in the middle came to a grinding halt near the Grebbeberg and many airborne assault troops were killed and taken prisoner in the west of the country.
Only in the south were defences broken, but the one passage over the River Maas at Rotterdam was held by the Dutch. By 14 May, fighting in many locations had ceased and the German army could make little or no headway, so the Luftwaffe bombed Rotterdam, the second-largest city of the Netherlands, killing about 900 people, destroying most of the inner city and leaving 78,000 people homeless.
Following the bombing, and German threats of the same treatment for Utrecht, the Netherlands capitulated on 15 May, except for the province of Zeeland, where French and French-Moroccan troops stood side by side with the Dutch army. The Dutch Royal Family, along with some armed forces, fled to the United Kingdom; some members of the family eventually moved to Ottawa, Ontario, Canada, where they remained until the Netherlands was liberated five years later. Princess Margriet was born in Canada during the family's exile.
Resentment of the Germans grew as the occupation became more harsh, prompting many Dutch in the latter years of the war to join the resistance. But collaboration was not uncommon either; many thousands of young Dutch males volunteered for combat service on the Russian Front with the Waffen-SS and many companies worked for the German occupiers.
About 140,000 Jews lived in the Netherlands at the beginning of the war. Persecution of Dutch Jews started shortly after the occupation. At the end of the war, 40,000 Jews were still alive. Of the 100,000 Jews who did not go into hiding, about 1,000 survived the war.
One famous victim of the Holocaust was Anne Frank, who gained worldwide fame when her diary, written in the "achterhuis" ('backhouse') while hiding from the Nazis, was found and published posthumously by her father, Otto Frank, the only member of the family to survive the Holocaust.
On 8 December 1941, the day after the attack on Pearl Harbor, the Netherlands declared war on Japan. The Dutch government-in-exile in London had long been working with the UK and US governments to cut off oil supplies to Japan. Japanese forces invaded the Dutch East Indies on 11 January 1942, and the Dutch surrendered on 8 March after Japanese troops landed on Java. Dutch citizens, and everyone of Dutch ancestry (the so-called "Indos"), were captured and put to work in labour camps or interned. As in the Netherlands, many Dutch ships, planes and military personnel managed to reach safety, in this case in Australia, from where they were able to fight again.
In Europe, after the Allies landed in Normandy in June 1944, progress was slow until the Battle of Normandy ended in August 1944. German resistance then collapsed in Western Europe, and the Allied armies advanced quickly towards the Dutch border. The First Canadian Army and the Second British Army conducted operations on Dutch soil from September onwards. On 17 September, a daring operation, Operation Market Garden, was executed with the goal of capturing bridges across three major rivers in the southern Netherlands. Despite desperate fighting by American, British and Polish forces, the bridge at Arnhem, across the Neder Rijn, could not be captured.
Areas south of the Rhine river were liberated in the period September–December 1944, including the province of Zeeland, which was liberated in October and November in the Battle of the Scheldt. This opened Antwerp to allied shipping. The First Canadian Army held a static line along the river Meuse (Maas) from December 1944 through February 1945.
The rest of the country remained occupied until the spring of 1945. In the face of Dutch defiance, the Nazis deliberately cut off food supplies resulting in near-starvation in the cities during the "Hongerwinter" (Hunger winter) of 1944–45. Soup kitchens were set up but many vulnerable people died. A few days before the Allied victory, the Germans allowed emergency shipments of food.
The First Canadian Army launched Operation Veritable in early February, cracking the Siegfried Line and reaching the banks of the Rhine in early March. In the final weeks of the war in Europe, the First Canadian Army was charged with clearing the Netherlands of German forces.
The Liberation of Arnhem began on 12 April 1945 and proceeded to plan, as the three infantry brigades of the 49th Division leapfrogged each other through the city. Within four days Arnhem, now a ruined city, was totally under Allied control.
The Canadians then immediately advanced further into the country, encountering and defeating a German counterattack at Otterlo and Dutch SS resistance at Ede. On 27 April a temporary truce came into effect, allowing the distribution of food aid to the starving Dutch civilians in areas under German control (Operation Manna). On 5 May 1945, Generaloberst Blaskowitz agreed to the unconditional surrender of all German forces in the Netherlands, signing the surrender to Canadian general Charles Foulkes at Wageningen. (The Fifth of May is now celebrated annually in the Netherlands as Liberation Day.) Three days later Germany unconditionally surrendered, bringing the war in Europe to an end.
After the euphoria and settling of scores had ended, the Dutch were a traumatised people with a ruined economy, a shattered infrastructure and several destroyed cities including Rotterdam, Nijmegen, Arnhem and part of The Hague.
After the war, there were reprisals against those who had collaborated with the Nazis. Artur Seyss-Inquart, the Nazi Reich Commissioner of the Netherlands, was tried at Nuremberg.
In the early post-war years, the Netherlands made repeated attempts to expand its territory by annexing neighbouring German land. The larger annexation plans were consistently rejected by the United States, but the London conference of 1949 permitted the Netherlands to carry out a smaller-scale annexation. Most of the annexed territory was returned to Germany on 1 August 1963.
Operation Black Tulip was a plan in 1945 by Dutch Minister of Justice Kolfschoten to evict all Germans from the Netherlands. The operation lasted from 1946 to 1948 and in the end 3,691 Germans (15% of Germans resident in the Netherlands) were deported. The operation started on 10 September 1946 in Amsterdam, where Germans and their families were taken from their homes in the middle of the night and given one hour to collect 50 kg of luggage. They were allowed to take 100 guilders. The rest of their possessions went to the state. They were taken to concentration camps near the German border, the biggest of which was Mariënbosch concentration camp near Nijmegen.
The post-war years were a time of hardship, shortages and natural disaster. This was followed by large-scale public works programmes, economic recovery, European integration and the gradual introduction of a welfare state.
Immediately after the war, rationing was imposed on many goods, including cigarettes, textiles, washing powder and coffee; even traditional wooden shoes were rationed. There were severe housing shortages in the Netherlands as a result of the war. In the 1950s there was mass emigration, especially to Canada, Australia and New Zealand: government-encouraged emigration efforts to reduce population density prompted some 500,000 Dutch people to leave the country after the war. The Netherlands failed to hold the Dutch East Indies; as Indonesia became independent, 300,000 Dutch inhabitants (and their Indonesian allies) left the islands.
Post-war politics saw shifting coalition governments. The 1946 Parliamentary elections saw the Catholic People's Party (KVP) emerge as the largest party, just ahead of the socialist Labour party (PvdA). Louis J. M. Beel formed a new coalition cabinet. The United States began providing economic assistance as part of the Marshall Plan in 1948 that injected valuable funds into the economy, fostered modernisation of business, and encouraged economic cooperation.
The 1948 elections led to a new coalition led by Labour's Willem Drees. He led four successive cabinets (Drees I through Drees IV) until 1958. His tenure in office saw four major political developments: the traumas of decolonisation, economic reconstruction, the establishment of the Dutch welfare state, and international integration and co-operation, including the formation of Benelux, the OEEC, NATO, the ECSC, and the EEC.
Despite the socio-economic problems, this was a period of optimism for many. A baby boom followed the war, as young Dutch couples started the families that the war had forced them to postpone. They had lived through the hardships of the Great Depression and the hell of war, and wanted to start afresh and live better lives without the poverty, starvation, terror and extreme frugality they knew so well. They had little taste for a strictly imposed, rule-oriented traditional system with its rigid hierarchies, sharp pillarised boundaries and strictly orthodox religious doctrines. The translation of "The Common Sense Book of Baby and Child Care" (1946) by American pediatrician Benjamin Spock was a best-seller. His vision of family life as companionate, permissive, enjoyable and even fun took hold, and seemed the best way to achieve family happiness in a dawning age of freedom and prosperity.
Wages were kept low and the recovery of consumption to pre-war levels was delayed to permit rapid rebuilding of the infrastructure. In the years after the war, unemployment fell and the economy grew at an astonishing pace, despite the high birth rate. The shattered infrastructure and destroyed cities were rebuilt. A key contribution to the recovery in the post-war Netherlands came from the Marshall Plan, which provided the country with funds, goods, raw materials and produce.
The Dutch became internationally active again. Dutch corporations, particularly Royal Dutch Shell and Philips, became internationally prominent. Businesspeople, scientists, engineers and artists from the Netherlands made important international contributions. For example, Dutch economists, especially Jan Tinbergen (1903–1994), Tjalling Koopmans (1910–1985) and Henri Theil (1924–2000), made major contributions to the mathematical and statistical methodology known as econometrics.
Across Western Europe, the period from 1973 to 1981 marked the end of the booming economy of the 1960s. The Netherlands also experienced years of negative growth after that. Unemployment increased steadily, causing rapid growth in social-security expenditures; inflation reached double digits, government surpluses disappeared, and public deficits were high. On the positive side, rich natural gas resources were developed, providing a current-account trade surplus during most of the period. According to the long-term economic analysis of Horlings and Smits, the major gains in the Dutch economy were concentrated between 1870 and 1930 and between 1950 and 1970; growth rates were much lower in 1930–45 and after 1987.
The last major flood in the Netherlands took place in early February 1953, when a huge storm caused the collapse of several dikes in the southwest of the country. More than 1,800 people drowned in the ensuing inundation.
The Dutch government subsequently decided on a large-scale programme of public works (the "Delta Works") to protect the country against future floods. The project took more than thirty years to complete. The Oosterscheldedam, an advanced sea storm barrier, became operational in 1986. According to Dutch government engineers, the odds of a major inundation anywhere in the Netherlands are now one in 10,000 years.
The European Coal and Steel Community (ECSC) was founded in 1951 by its six founding members: Belgium, the Netherlands and Luxembourg (the Benelux countries), and West Germany, France and Italy. Its purpose was to pool the steel and coal resources of the member states and to support the economies of the participating countries. As a side effect, the ECSC helped defuse tensions between countries which had recently fought each other in the war. In time, this economic merger grew, adding members and broadening in scope, to become the European Economic Community and later the European Union (EU).
The United States started to have more influence. After the war, higher education changed from a German model to more of an American-influenced model. American influences had been small in the interwar era, and during the war, the Nazis had emphasised the dangers of a "degraded" American culture as represented by jazz. However, the Dutch became more attracted to the United States during the post-war era, perhaps partly because of antipathy towards the Nazis but certainly because of American films and consumer goods. The Marshall Plan also introduced the Dutch to American management practices. NATO brought in American military doctrine and technology. Intellectuals, artists and the political left, however, remained more reserved about the Americans. According to Rob Kroes, the anti-Americanism in the Netherlands was ambiguous: American culture was both accepted and criticised at the same time.
The Netherlands is a founding member of the EU, NATO, OECD and WTO. Together with Belgium and Luxembourg it forms the Benelux economic union. The country is host to the Organisation for the Prohibition of Chemical Weapons and five international courts: the Permanent Court of Arbitration, the International Court of Justice, the International Criminal Tribunal for the Former Yugoslavia, the International Criminal Court and the Special Tribunal for Lebanon. The first four are situated in The Hague, as is the EU's criminal intelligence agency Europol and judicial co-operation agency Eurojust. This has led to the city being dubbed "the world's legal capital".
The Dutch East Indies had long been a valuable resource to the Netherlands, generating about 14% of Dutch national income in the 1930s and home to thousands of Dutch people: officials, businessmen and missionaries. By the first half of the twentieth century, new organisations and leadership had developed in the Dutch East Indies. Under its Ethical Policy, the government had helped create an educated Indonesian elite. These profound changes constituted the "Indonesian National Revival". Increased political activism, and a Japanese occupation that undermined Dutch rule, culminated in nationalists proclaiming independence on 17 August 1945, two days after the surrender of Japan. The Dutch did not plan to let go, for without the colony they would be left a minor second-class power, ranking perhaps with Denmark. However, the Netherlands was much too weak to reconquer Indonesia. The Japanese had imprisoned all the Dutch residents and turned the islands over to a native government, which was widely popular. The British military arrived to disarm the Japanese. The Dutch finally returned and attempted to suppress the Indonesian National Revolution by force, at times brutally.
Hundreds of thousands of Indonesians supported the Dutch position; when independence finally arrived, most of them were relocated to the Netherlands. The UK mediated a compromise, signed in March 1947, whereby the new Indonesian Republic's de facto control over Java, Madura and Sumatra was acknowledged, while Dutch control over the numerous smaller and far less important islands was recognised. Supposedly there would be a federated Indonesian state and a union with the Netherlands, but that never happened. The Indonesians wanted a complete transfer of power, and the Dutch refused. By 1946 the United States was financing the Dutch effort in Indonesia and was able to exert pressure on The Hague. Increasing international pressure, including American hints about cutting off military funds, forced the Netherlands to withdraw. A decisive episode was the success of the Indonesian Republic in crushing a Communist revolt: Washington now realised that Indonesia was part of the Cold War fight against communism, that the Indonesian government was a necessary ally, and that Dutch tactics were counterproductive and chaotic and could only aid Communist insurgencies. The Netherlands formally recognised Indonesian independence on 27 December 1949. Dutch public opinion blamed Washington for the colonial failure. Only Irian, the western half of New Guinea, remained under Dutch control, as Netherlands New Guinea, until 1962, when the Netherlands agreed to transfer the area; it passed to Indonesian administration in 1963.
During and after the Indonesian National Revolution, over 350,000 people left Indonesia for the Netherlands. They included 250,000 Europeans and "Indos" (Dutch-Indonesian Eurasians), along with 100,000 military conscripts and 12,000 South Moluccans. This out-migration occurred in five distinct waves over a period of twenty years. It included Indos (many of whom had spent the war years in Japanese concentration camps), former South Moluccan soldiers and their families, Dutch citizens from Netherlands New Guinea during the "New Guinea Issue" (including Papuan civil servants and their families), and other Indos who had remained behind but later regretted their decision to take Indonesian citizenship. Similarly, after independence in 1975, Suriname sent about 115,000 Surinamese to the Netherlands.
The Indos of Indonesian descent (now numbering around 680,000) form the largest ethnic minority group in the Netherlands. They are integrated into Dutch society, but they have also retained many aspects of their culture and have added a distinct Indonesian flavour to the Netherlands.
Although it was originally feared that the loss of the Dutch East Indies would contribute to an economic decline, the Dutch economy experienced exceptional growth in the 1950s and 1960s (partly because a disproportionate amount of Marshall Plan aid was received). In fact, the demand for labour was so strong that immigration was actively encouraged, first from Italy and Spain, and then later, in larger numbers, from Turkey and Morocco.
Suriname became independent on 25 November 1975. The Dutch government supported independence because it wanted to stem the flow of immigrants from Suriname and also to end its colonial status. However, about one-third of the entire population of Suriname, fearing political unrest and economic decline, relocated to the Netherlands, creating a Surinamese community in the Netherlands that is now roughly as large as the population of Suriname itself.
When the post-war baby boom children grew up, they led the revolt in the 1960s against all rigidities in Dutch life. The 1960s and 1970s were a time of great social and cultural change, such as rapid de-pillarization leading to the erosion of the old divisions along class and religious lines. A youth culture emerged all across Western Europe and the United States, characterised by student rebellion, informality, sexual freedom, informal clothes, new hairstyles, protest music, drugs and idealism. Young people, and students in particular, rejected traditional mores, and pushed for change over matters such as: women's rights, sexuality, disarmament and environmental issues.
Secularisation, or the decline in religiosity, first became noticeable after 1960 in the Protestant rural areas of Friesland and Groningen. Then, it spread to Amsterdam, Rotterdam and the other major cities in the west. Finally, the Catholic southern areas showed religious decline. As the social distance between the Calvinists and Catholics narrowed (and they began to intermarry), it became possible to merge their parties. The Anti-Revolutionary Party (ARP) in 1977 merged with the Catholic People's Party (KVP) and the Protestant Christian Historical Union (CHU) to form the Christian Democratic Appeal (CDA). However, a countervailing trend later appeared as the result of a religious revival in the Protestant Bible Belt, and the growth of the Muslim and Hindu communities as a result of immigration from overseas and high fertility levels.
After 1982, there was a retrenchment of the welfare system, especially regarding old-age pensions, unemployment benefits, and disability pensions/early retirement benefits.
Following the 1994 general election, in which the Christian democratic CDA lost a considerable portion of its representatives, the social-liberal Democrats 66 (D66) doubled in size and formed a coalition with the Labour Party (PvdA) and the People's Party for Freedom and Democracy (VVD). This "purple" coalition marked the first absence of the CDA from government in decades. During the Purple Coalition years, a period lasting until the rise of the populist politician Pim Fortuyn, the government addressed issues previously viewed as taboo under Christian-influenced cabinets. At this time, the Dutch government introduced unprecedented legislation based on a policy of official tolerance ("gedoogbeleid"). Abortion and euthanasia were decriminalised, but stricter guidelines were set for their implementation. Drug policy, especially with regard to the regulation of cannabis, was reformed. Prostitution was legalised, but confined to brothels where the health and safety of those involved could be properly monitored. With the 2001 Same-Sex Marriage Act, the Netherlands became the first country in the world to legalise same-sex marriage. In addition to social reforms, the Purple Coalition also presided over a period of remarkable economic prosperity.
At the 1998 general election, the Purple Coalition, consisting of Social Democrats and left- and right-wing Liberals, increased its majority. Both the social democratic PvdA and the conservative liberal VVD grew at the cost of their junior partner in the cabinet, the progressive liberal D66. The voters rewarded the Purple Coalition for its economic performance, which had included reduction of unemployment and the budget deficit, and steady growth and job creation combined with wage freezes, trimming of the welfare state, and a policy of fiscal restraint. The result was the second Kok cabinet.
The power of the coalition waned with the rise of the List Pim Fortuyn (LPF) at the 2002 general election, a populist party which ran a distinctly anti-immigration and anti-purple campaign, citing "Purple Chaos" ("Puinhopen van Paars") as the source of the country's social woes. In the country's first political assassination in three centuries, Fortuyn was murdered a little over a week before the election. In the wake of its leader's death, the LPF swept the elections, entering parliament with one-sixth of the seats, while the PvdA (Labour) lost half of its seats. The ensuing cabinet was formed by the CDA, VVD and LPF, led by Prime Minister Jan Peter Balkenende. Though the party succeeded in displacing the rival Purple Coalition, without the charismatic Fortuyn at its helm it proved short-lived, lasting a mere 87 days in government.
By 2000, the population had increased to 15,900,000 people, making the Netherlands one of the most densely populated countries in the world. Urban development has led to a conurbation called the Randstad, which includes the four largest cities (Amsterdam, Rotterdam, The Hague and Utrecht) and their surrounding areas. With a population of 7,100,000, it is one of the largest conurbations in Europe.
On 26 December 2004, during the Christmas holiday and Boxing Day, Dutch people in Thailand and elsewhere in South and Southeast Asia were among the thousands killed by the magnitude 9.0 earthquake and tsunami off the west coast of the Indonesian island of Sumatra. A memorial service attended on behalf of the Queen of the Netherlands was held at the Basilica of St. Nicholas in Amsterdam in January 2005.
This small nation has successfully developed into one of the most open, dynamic and prosperous countries in the world. It had the tenth-highest per capita income in the world in 2011. It has an open, market-based mixed economy, ranking thirteenth out of 157 countries according to the Index of Economic Freedom. In May 2011, the OECD ranked the Netherlands as the "happiest" country in the world.
On Koningsdag ("King's Day"), 30 April 2013, Prince Willem-Alexander became King, ascending the throne upon the abdication of his mother, Queen Beatrix. At the time of her abdication, aged 75, Beatrix was the oldest reigning monarch in the country's history.
On 17 July 2014, 193 Dutch citizens were among the 298 people aboard Malaysia Airlines Flight 17 who were killed when the plane was shot down by a surface-to-air missile over eastern Ukraine, near the Russian border. A referendum on the approval of the Association Agreement between the European Union and Ukraine was held in the Netherlands on 6 April 2016.
VVD Prime Minister Mark Rutte won the 2017 general election and went on to form his third government.
The American John Lothrop Motley was the first foreign historian to write a major history of the Dutch Republic. In 3,500 pages he crafted a literary masterpiece that was translated into numerous languages; his dramatic story reached a wide audience in the 19th century. Motley relied heavily on Dutch scholarship and immersed himself in the sources. His style no longer attracts readers, and scholars have moved away from his simplistic dichotomies of good versus evil, Dutch versus Spanish, Catholic versus Protestant, freedom versus authoritarianism. His theory of causation over-emphasized ethnicity as an unchanging characteristic, exaggerated the importance of William of Orange, and gave undue importance to the issue of religious tolerance. His "History of the United Netherlands, 1584–1609" (4 vol., 1860–67) is available in Gutenberg editions online. For a criticism of Motley, see Robert Wheaton, "Motley and the Dutch Historians," "New England Quarterly" 35(3) (1962): 318–36, in JSTOR.
The pioneering Dutch cultural historian Johan Huizinga (1872–1945) was the author of "The Autumn of the Middle Ages" (1919; translated into English as "The Waning of the Middle Ages") and "Homo Ludens: A Study of the Play Element in Culture" (1935), works which expanded the field of cultural history and influenced the historical anthropology of younger historians of the French Annales School. He was influenced by art history and advised historians to trace "patterns of culture" by studying "themes, figures, motifs, symbols, styles and sentiments."
The "polder model" continues to strongly influence historians as well as Dutch political discussion. The polder model stresses the need for finding consensus; it discourages furious debate and angry dissent in both academia and politics – in contrast to the highly developed, intense debates in Germany.
The H-Net list H-Low-Countries is published free by email and is edited by scholars. Its occasional messages serve an international community with diverse methodological approaches, archival experiences, teaching styles, and intellectual traditions, and promote discussion relevant to the region and to the different national histories in particular, with an emphasis on the Netherlands. H-Low-Countries publishes conference announcements, questions and discussions; reviews of books, journals, and articles; and tables of contents of journals on the history of the Low Countries (in both Dutch and English). Since World War II, both research-oriented and teaching-oriented historians have been rethinking their interpretive approaches to Dutch history, balancing traditional memories and modern scholarship. In terms of popular history, there has been an effort to ensure greater historical accuracy in museums and at historic tourist sites.
Once heralded as the leading event of modern Dutch history, the Dutch Revolt lasted from 1568 to 1648, and historians have worked to interpret it for even longer. Cruz (2007) explains the major debates among scholars regarding the Dutch bid for independence from Spanish rule. While agreeing that the intellectual milieus of late 19th and 20th centuries affected historians' interpretations, Cruz argues that writings about the revolt trace changing perceptions of the role played by small countries in the history of Europe. In recent decades grand theory has fallen out of favor among most scholars, who emphasize the particular over the general. Dutch and Belgian historiography since 1945 no longer says the revolt was the culmination of an inevitable process leading to independence and freedom. Instead scholars have put the political and economic details of the towns and provinces under the microscope, while agreeing on the weaknesses of attempts at centralization by the Habsburg rulers. The most influential new studies have been rooted in demographic and economic history, though scholars continue to debate the relationship between economics and politics. The religious dimension has been viewed in terms of mentalities, exposing the minority position of Calvinism, while the international aspects have been studied more seriously by foreign historians than by the Dutch themselves.
Pieter Geyl was the leading historian of the Dutch Revolt, and a highly influential professor at the University of London (1919–1935) and at the State University of Utrecht (1936–1958). He wrote a six-volume history of the Dutch-speaking peoples. The Nazis imprisoned him during World War II. In his political views, Geyl adopted those of the 17th-century Dutch Loevestein faction, led by Johan van Oldenbarnevelt (1547–1619) and Johan de Witt (1625–72). The faction stood for liberty, toleration, and national interests, in contrast to the Orange stadholders, who sought to promote their own self-interest. According to Geyl, the Dutch Republic reached the peak of its powers during the 17th century. He was also a staunch nationalist and suggested that Flanders could split off from Belgium and join the Netherlands. Later he decried what he called radical nationalism and stressed instead the vitality of Western civilization. Geyl was highly critical of the world history approach of Arnold J. Toynbee.
Jan Romein (1893–1962) created a "theoretical history" in an attempt to reestablish the relevance of history to public life in the 1930s at a time of immense political uncertainty and cultural crisis, when Romein thought that history had become too inward-looking and isolated from other disciplines. Romein, a Marxist, wanted history to contribute to social improvement. At the same time, influenced by the successes of theoretical physics and his study of Oswald Spengler, Arnold J. Toynbee, Frederick John Teggart, and others, he spurred on the development of theoretical history in the Netherlands, to the point where it became a subject in its own right at the university level after the war. Romein used the term integral history as a substitute for cultural history and focused his attention on the period around the turn of the century. He concluded that a serious crisis occurred in European civilization in 1900 because of the rise of anti-Semitism, extreme nationalism, discontent with the parliamentary system, depersonalization of the state, and the rejection of positivism. European civilization waned as the result of this crisis which was accompanied by the rise of the United States, the Americanization of the world, and the emergence of Asia. His interpretation is reminiscent of that of his mentor Johan Huizinga and was criticized by his colleague Pieter Geyl.
Harold and Maude
Harold and Maude is a 1971 American coming-of-age dark comedy drama film directed by Hal Ashby and released by Paramount Pictures. It incorporates elements of dark humor and existentialist drama. The plot revolves around the exploits of a young man named Harold Chasen (Bud Cort) who is intrigued with death. Harold drifts away from the life that his detached mother (Vivian Pickles) prescribes for him, and slowly develops a strong friendship, and eventually a romantic relationship, with a 79-year-old woman named Maude (Ruth Gordon) who teaches Harold about living life to its fullest and that life is the most precious gift of all.
The film was based on a screenplay written by Colin Higgins and published as a novel in 1971. Filming locations in the San Francisco Bay Area included both Holy Cross Cemetery and Golden Gate National Cemetery, and the ruins of the Sutro Baths.
Critically and commercially unsuccessful when originally released, the film developed a cult following and in 1983 began making a profit. The film is ranked number 45 on the American Film Institute's list of 100 Funniest Movies of all Time and was selected for preservation in the National Film Registry of the Library of Congress in 1997, for being "culturally, historically or aesthetically significant". The Criterion Collection special-edition Blu-ray and DVD were released June 12, 2012.
Harold Chasen is a young man obsessed with death. He stages elaborate, shocking fake suicides, attends funerals, and drives a hearse, all to the chagrin of his cold, narcissistic, wealthy socialite mother. His mother sets up appointments for him with a psychoanalyst, but the analyst is befuddled by the case and fails to get Harold to talk about his real emotions. Harold's mother also sets him up with dates he does not want, buys him presents he hates and destroys, and answers career questionnaires for him based on her own preferences rather than his. Harold does not exist for her except as an extension of herself.
At another stranger's funeral service, Harold meets Maude, a seventy-nine-year-old woman who shares Harold's hobby of attending funerals. He is entranced by her quirky outlook on life, which is bright and excessively carefree in contrast with his morbidity. Maude lives in a decommissioned railroad car. She thinks nothing of breaking the law in any way she pleases, including “taking” other people’s cars, uprooting a tree from a public space to re-plant it, speeding, and parking on the city sidewalk. She and Harold form a bond and Maude shows Harold the pleasures of art and music (including how to play banjo), and teaches him how to make “the most of his time on earth.” Meanwhile, Harold's mother is determined, against Harold's wishes, to find him a wife. One by one, Harold frightens and horrifies each of his appointed dates, by appearing to commit gruesome acts such as self-immolation, self-mutilation and seppuku. His mother tries enlisting him in the military by sending him to his uncle, who had served under General MacArthur in the Second World War and lost an arm, but Harold deters his recruiting-officer uncle by staging a scene in which Maude poses as a pacifist protester and Harold seemingly murders her out of militarist fanaticism.
When Harold and Maude talk at her home he tells her, without prompting, the motive for his fake suicides: when he was at boarding school, he accidentally caused an explosion in his chemistry lab, leading police to assume that he had been killed. Harold had returned home just in time to witness his mother react to the news of his death by faking a ludicrously dramatized fainting. As he reaches this part of the story, Harold bursts into tears and says, “I decided then I enjoyed being dead.”
As they become closer, their friendship blossoms into a romance. Holding her hand, Harold discovers a number tattooed on her forearm, meaning that Maude is a Jewish survivor of the German Nazi death camps. Harold announces that he will marry Maude, resulting in disgusted outbursts from his family, analyst, and priest. Unbeknownst to Harold, Maude is secretly planning to commit suicide on her eightieth birthday. Maude’s birthday arrives, and Harold throws a surprise party for her. As the couple dance, Maude tells Harold that she “couldn't imagine a lovelier farewell.” Confused, he questions Maude as to her meaning and she reveals that she has taken an overdose of sleeping pills and will be dead by midnight. She restates her firm belief that eighty is the proper age to die.
Harold rushes Maude to the hospital, where she is treated unsuccessfully and dies. In the final sequence, Harold’s car is seen going off a seaside cliff but after the crash, the final shot reveals Harold standing calmly atop the cliff, holding his banjo. After gazing down at the wreckage, he dances away, picking out on his banjo Cat Stevens’s song "If You Want to Sing Out, Sing Out", which Maude had played and sung for him.
Director Hal Ashby appears in an uncredited cameo, watching a model train at an amusement park, the penny arcade at the Santa Cruz Beach Boardwalk in California.
UCLA student Colin Higgins wrote "Harold and Maude" as his master's thesis. While working as producer Edward Lewis's pool boy, Higgins showed the script to Lewis's wife, Mildred, who was so impressed that she got Edward to give it to Stanley Jaffe at Paramount. Higgins sold the script with the understanding that he would direct the film, but after tests he shot proved unsatisfactory to the studio heads, he was told he was not ready. Ashby would only commit to directing the film after getting Higgins's blessing, and he then made Higgins a co-producer so that Higgins could watch and learn from him on the set. Higgins said he originally thought of the story as a play; it then became a 20-minute thesis while he was at film school. After the film came out, the script was turned into a novel and then a play, which ran for several years in Paris.
Ashby felt that Maude should ideally be European and his list of possible actresses included dames Peggy Ashcroft, Edith Evans, Gladys Cooper and Celia Johnson as well as Lotte Lenya, Luise Rainer, Pola Negri, Minta Durfee, and Agatha Christie. Ruth Gordon indicated that in addition she heard that Edwige Feuillère, Elisabeth Bergner, Mildred Natwick, Mildred Dunnock, and Dorothy Stickney had been considered.
For Harold, in addition to Bud Cort, Ashby considered other promising unknowns: Richard Dreyfuss, Bob Balaban, and John Savage. Also on his list were John Rubinstein, for whom Higgins had written the part, and then-up-and-coming British pop star Elton John, whom Ashby had seen live and hoped would also do the music.
Anne Brebner, the casting director, was almost cast as Harold's mother, when Vivian Pickles was briefly unable to do the role.
"Harold and Maude" received mixed reviews, with several critics being offended by the film's dark humor. Roger Ebert, in a review dated January 1, 1972, gave the film one-and-a-half out of four stars. He wrote, "And so what we get, finally, is a movie of attitudes. Harold is death, Maude life, and they manage to make the two seem so similar that life's hardly worth the extra bother. The visual style makes everyone look fresh from the Wax Museum, and all the movie lacks is a lot of day-old gardenias and lilies and roses in the lobby, filling the place with a cloying sweet smell. Nothing more to report today. Harold doesn't even make pallbearer." Vincent Canby also panned the film, stating that the actors "are so aggressive, so creepy and off-putting, that Harold and Maude are obviously made for each other, a point the movie itself refuses to recognize with a twist ending that betrays, I think, its life-affirming pretensions." "The film was a runaway cult favorite, and, most memorably, in Minneapolis, residents actually picketed the Westgate Theater, and tried to get the management to replace the picture after a consecutive three-year run."
The reputation of the film has increased greatly; Rotten Tomatoes, which labeled the film "Certified Fresh", gave it a score of 84% based on 45 reviews, with an average score of 7.7/10. The consensus on the site reads, "Hal Ashby's comedy is too dark and twisted for some, and occasionally oversteps its bounds, but there's no denying the film's warm humor and big heart." In 2005, the Writers Guild of America ranked the screenplay #86 on its list of the 101 Greatest Screenplays ever written. "Sight & Sound" magazine conducts a poll of the world's finest film directors every ten years to determine the Ten Greatest Films of All Time; this poll has been conducted since 1992 and has become the most recognized of its kind in the world. In 2012, Niki Caro, Wanuri Kahiu, and Cyrus Frisch voted for "Harold and Maude". Frisch commented: "An encouragement to think beyond the obvious!" In 2017, "Chicago Tribune" critic Mark Caro wrote a belated appreciation: "I'm sorry, "Harold and Maude", for denying you for so long. You're my favorite movie once again."
On June 12, 2012, The Criterion Collection released "Harold and Maude" for Region 1 on DVD and Blu-ray, both of which include a collection of audio excerpts of director Hal Ashby from January 11, 1972 and of screenwriter Colin Higgins from January 10, 1979, a new video interview with Yusuf/Cat Stevens, a new audio commentary by Ashby biographer Nick Dawson and producer Charles B. Mulvehill, and a booklet which includes a new film essay by film and television critic Matt Zoller Seitz. Exclusive to the Blu-ray edition are a new digital restoration of the film with uncompressed monaural soundtrack and an optional remastered uncompressed stereo soundtrack. Other exclusives are a "New York Times" profile of actress Ruth Gordon from 1971, an interview from 1997 with actor Bud Cort and cinematographer John Alonzo, and an interview from 2001 with executive producer Mildred Lewis.
"Harold and Maude" is #45 on the American Film Institute's list of 100 Years... 100 Laughs, the list of the top 100 films in American comedy, released in 2000. Two years later, AFI released AFI's 100 Years... 100 Passions, honoring the most romantic films of the past 100 years; "Harold and Maude" ranked #69. In September 2008, "Empire" listed "Harold and Maude" as #65 in its 500 Greatest Movies of All Time. "Entertainment Weekly" ranked the film #4 on its list of "The Top 50 Cult Films."
In June 2008, AFI revealed its "Ten Top Ten"—the best ten films in ten "classic" American film genres—after polling over 1,500 people from the creative community. "Harold and Maude" was acknowledged as the ninth-best film in the romantic comedy genre.
At the 29th Golden Globe Awards, Bud Cort and Ruth Gordon received a nomination for Best Actor and Best Actress in a Musical or Comedy film, respectively.
The music in "Harold and Maude" was composed and performed by Cat Stevens. He had been suggested by Elton John to do the music after John dropped out of the project. Stevens composed two original songs for the film, "Don't Be Shy" and "If You Want to Sing Out, Sing Out", and performed instrumental and alternative versions of the songs "On the Road to Find Out", "I Wish, I Wish", "Miles from Nowhere", "Tea for the Tillerman", "I Think I See the Light", "Where Do the Children Play?" and "Trouble", which appeared on the albums "Mona Bone Jakon" and "Tea for the Tillerman", both released before the film. "Don't Be Shy" and "If You Want to Sing Out, Sing Out" were not released on an album until a 1984 compilation.
There is some additional non–Cat Stevens music in the film. "Greensleeves" is played on the harp during dinner. During the scene in which Harold floats face-down in the swimming pool, the opening bars of Tchaikovsky's Piano Concerto No. 1 are heard. A marching band is also heard playing a march titled "The Klaxon" by Henry Fillmore outside the church following a funeral. At the amusement park, a calliope version of the waltz "Over the Waves" by Juventino Rosas is played.
The first soundtrack was released in Japan in 1972 on vinyl and cassette (A&M Records GP-216). It omitted the two original songs and all instrumental and alternative versions of songs and was generally composed of re-released material that was in the film, along with five songs that were not in the film.
The second soundtrack was released in December 2007, by Vinyl Films Records, as a vinyl-only limited-edition release of 2,500 copies. It contained a 30-page oral history of the making of the film, comprising the most extensive series of interviews yet conducted on "Harold and Maude".
Colin Higgins later adapted the story into a stage play. The original Broadway production, starring Janet Gaynor as Maude and Keith McDermott as Harold, closed after four performances in February 1980.
A French adaptation for television, translated and written by Jean-Claude Carrière, appeared in 1978. It was also adapted for the stage by the Compagnie Viola Léger in Moncton, New Brunswick, starring Roy Dupuis.
Higgins expressed interest in 1978 in both a sequel and a prequel to "Harold and Maude". The sequel, "Harold's Story", would have had Cort portray Harold's life after Maude. The prequel, "Grover and Maude", would have shown Maude's life before Harold, with Maude learning how to steal cars from Grover Muldoon, the character portrayed by Richard Pryor in Higgins' 1976 film "Silver Streak". Higgins wanted Gordon and Pryor to reprise their respective roles.
Habitus (sociology)
In sociology, habitus comprises socially ingrained habits, skills and dispositions. It is the way that individuals perceive the social world around them and react to it. These dispositions are usually shared by people with similar backgrounds (such as social class, religion, nationality, ethnicity, education and profession). The habitus is acquired through imitation ("mimesis") and is the reality in which individuals are socialized, which includes their individual experience and opportunities. Thus, the habitus represents the way group culture and personal history shape the body and the mind; as a result, it shapes the present social actions of an individual.
French sociologist Pierre Bourdieu suggested that the habitus consists of both the "hexis" (the tendency to hold and use one's body in a certain way, such as posture and accent) and more abstract mental habits, schemes of perception, classification, appreciation, feeling, as well as action. These schemes are not mere habits: Bourdieu suggested they allow individuals to find new solutions to new situations without calculated deliberation, based on their gut feelings and intuitions, which he believed were collective and socially shaped. These attitudes, mannerisms, tastes, moral intuitions and habits have influence on the individual's life chances, so the habitus not only is structured by an individual's objective past position in the social structure but also structures the individual's future life path. Bourdieu argued that the reproduction of the social structure results from the habitus of individuals.
The notion of habitus is extremely influential (with 400,000 Google Scholar publications using it), yet it also evoked criticism for its alleged determinism, as Bourdieu compared social actors to "automata" (while relying on Leibniz's theory of Monads).
The concept of habitus has been used as early as Aristotle, but in contemporary usage it was introduced by Marcel Mauss and later Maurice Merleau-Ponty. However, it was Pierre Bourdieu who turned it into a cornerstone of his sociology and used it to address the sociological problem of agency and structure: the habitus is shaped by structural position and generates action, so that when people act and demonstrate agency they simultaneously reflect and reproduce social structure. Bourdieu elaborated his theory of the habitus while borrowing ideas on cognitive and generative schemes from Noam Chomsky and Jean Piaget, emphasizing its dependency on history and human memory. For instance, a certain behaviour or belief becomes part of a society's structure when the original purpose of that behaviour or belief can no longer be recalled and becomes socialized into individuals of that culture.
Loïc Wacquant wrote that habitus is an old philosophical notion, originating in the thought of Aristotle, whose notion of "hexis" ("state") was translated as "habitus" by the Medieval Scholastics. Bourdieu first adapted the term in his 1967 postface to Erwin Panofsky's "Gothic Architecture and Scholasticism". The term was earlier used in sociology by Norbert Elias in "The Civilizing Process" (1939) and in Marcel Mauss's account of "body techniques". The concept is also present in the work of Max Weber, Gilles Deleuze, and Edmund Husserl.
Mauss defined habitus as those aspects of culture that are anchored in the body or daily practices of individuals, groups, societies, and nations. It includes the totality of learned habits, bodily skills, styles, tastes, and other non-discursive knowledges that might be said to "go without saying" for a specific group (Bourdieu 1990:66-67)—in that way it can be said to operate beneath the level of rational ideology.
According to Bourdieu, habitus is composed of:
The term has also been adopted in literary criticism, adapting from Bourdieu's usage of the term. For example, Joe Moran's examination of authorial identities in "Star Authors: Literary Celebrity in America" uses the term in discussion of how authors develop a habitus formed around their own celebrity and status as authors, which manifests in their writing.
Bourdieu's principle of habitus is interwoven with the concept of structuralism in literary theory. Peter Barry explains, "in the structuralist approach to literature there is a constant movement away from interpretation of the individual literary work and a parallel drive towards understanding the larger structures which contain them" (2009, p. 39). There is therefore a strong desire to understand the larger influencing factors which shape an individual literary work. As Bourdieu explains, habitus "are structured structures, generative principles of distinct and distinctive practices – what the worker eats, and especially the way he eats it, the sport he practices and the way he practices it, his political opinions and the way he expresses them are systematically different from the industrial proprietor's corresponding activities. [...] habitus are also structuring structures, different classifying schemes, classification principles, different principles of vision and division, different tastes. Habitus make different differences; they implement distinctions between what is good and what is bad, what is right and what is wrong, between what is distinguished and what is vulgar, and so on, but they are not the same. Thus, for instance, the same behaviour or even the same good can appear distinguished to one person, pretentious to someone else, and cheap or showy to yet another" (Bourdieu, 1996). As a result, habitus may be employed in literary theory in order to understand those larger, external structures which influence individual theories and works of literature.
Body habitus (or "bodily habitus") is the medical term for physique, and is categorized as either endomorphic (relatively short and stout), ectomorphic (relatively long and thin) or mesomorphic (muscular proportions). In this sense, habitus has in the past been interpreted as the physical and constitutional characteristics of an individual, especially as related to the tendency to develop a certain disease. For example, "Marfanoid bodily habitus".
Hypoxia (medical)
Hypoxia is a condition in which the body or a region of the body is deprived of adequate oxygen supply at the tissue level. Hypoxia may be classified as either "generalized", affecting the whole body, or "local", affecting a region of the body. Although hypoxia is often a pathological condition, variations in arterial oxygen concentrations can be part of the normal physiology, for example, during hypoventilation training or strenuous physical exercise.
Hypoxia differs from hypoxemia and anoxemia in that hypoxia refers to a state in which oxygen supply is insufficient, whereas hypoxemia and anoxemia refer specifically to states that have low or zero arterial oxygen supply. Hypoxia in which there is complete deprivation of oxygen supply is referred to as anoxia.
Generalized hypoxia occurs in healthy people when they ascend to high altitude, where it causes altitude sickness leading to potentially fatal complications: high altitude pulmonary edema (HAPE) and high altitude cerebral edema (HACE). Hypoxia also occurs in healthy individuals when breathing mixtures of gases with a low oxygen content, e.g. while diving underwater especially when using closed-circuit rebreather systems that control the amount of oxygen in the supplied air. Mild, non-damaging intermittent hypoxia is used intentionally during altitude training to develop an athletic performance adaptation at both the systemic and cellular level.
In acute or silent hypoxia, a person's blood and tissue oxygen levels can drop without any initial warning, even when a chest x-ray shows diffuse pneumonia and oxygen levels well below normal. Doctors have reported cases of silent hypoxia in COVID-19 patients who did not experience shortness of breath or coughing until their oxygen levels had plummeted to such a degree that they risked acute respiratory distress syndrome (ARDS) and organ failure. In a New York Times opinion piece (April 20, 2020), emergency room doctor Richard Levitan reported that "a vast majority of Covid pneumonia patients I met had remarkably low oxygen saturations at triage — seemingly incompatible with life — but they were using their cellphones as we put them on monitors."
Hypoxia is a common complication of preterm birth in newborn infants. Because the lungs develop late in pregnancy, premature infants frequently possess underdeveloped lungs. To improve lung function, doctors frequently place infants at risk of hypoxia inside incubators (also known as humidicribs) that provide warmth, humidity, and oxygen. More serious cases are treated with continuous positive airway pressure.
The 2019 Nobel Prize in Physiology or Medicine was awarded to William G. Kaelin Jr., Sir Peter J. Ratcliffe, and Gregg L. Semenza in recognition of their discovery of cellular mechanisms to sense and adapt to different oxygen concentrations, establishing a basis for how oxygen levels affect physiological function.
The symptoms of generalized hypoxia depend on its severity and the speed of its onset.
In the case of altitude sickness, where hypoxia develops gradually, the symptoms include fatigue, numbness / tingling of extremities, nausea, and cerebral anoxia. These symptoms are often difficult to identify, but early detection of symptoms can be critical.
Severe hypoxia, or hypoxia of very rapid onset, can cause ataxia, confusion, disorientation, hallucinations, behavioral change, severe headaches, reduced level of consciousness, papilloedema, breathlessness, pallor, tachycardia, and pulmonary hypertension, eventually leading to the late signs of cyanosis, slow heart rate, cor pulmonale, and low blood pressure, followed by heart failure and eventually shock and death.
Because hemoglobin is a darker red when it is not bound to oxygen (deoxyhemoglobin), as opposed to the rich red color that it has when bound to oxygen (oxyhemoglobin), when seen through the skin it has an increased tendency to reflect blue light back to the eye. In cases where the oxygen is displaced by another molecule, such as carbon monoxide, the skin may appear 'cherry red' instead of cyanotic. Hypoxia can cause premature birth, and injure the liver, among other deleterious effects.
If tissue is not being perfused properly, it may feel cold and appear pale; if severe, hypoxia can result in cyanosis, a blue discoloration of the skin. If hypoxia is very severe, a tissue may eventually become gangrenous.
Extreme pain may also be felt at or around the site.
Tissue hypoxia from low oxygen delivery may be due to low haemoglobin concentration (anaemic hypoxia), low cardiac output (stagnant hypoxia) or low haemoglobin saturation (hypoxic hypoxia). The consequence of oxygen deprivation in tissues is a switch to anaerobic metabolism at the cellular level. As such, reduced systemic blood flow may result in increased serum lactate. Serum lactate levels have been correlated with illness severity and mortality in critically ill adults and in ventilated neonates with respiratory distress.
Oxygen passively diffuses in the lung alveoli according to a pressure gradient. Oxygen diffuses from the breathed air, mixed with water vapour, to arterial blood, where its partial pressure is around 100 mmHg (13.3 kPa). In the blood, oxygen is bound to hemoglobin, a protein in red blood cells. The binding capacity of hemoglobin is influenced by the partial pressure of oxygen in the environment, as described in the oxygen–hemoglobin dissociation curve. A smaller amount of oxygen is transported in solution in the blood.
In peripheral tissues, oxygen again diffuses down a pressure gradient into cells and their mitochondria, where it is used to produce energy in conjunction with the breakdown of glucose, fats, and some amino acids.
Hypoxia can result from a failure at any stage in the delivery of oxygen to cells. This can include decreased partial pressures of oxygen, problems with diffusion of oxygen in the lungs, insufficient available hemoglobin, problems with blood flow to the end tissue, and problems with breathing rhythm.
Experimentally, oxygen diffusion becomes rate limiting (and lethal) when arterial oxygen partial pressure falls to 60 mmHg (5.3 kPa) or below.
Almost all the oxygen in the blood is bound to hemoglobin, so interfering with this carrier molecule limits oxygen delivery to the periphery. Hemoglobin increases the oxygen-carrying capacity of blood by about 40-fold, with the ability of hemoglobin to carry oxygen influenced by the partial pressure of oxygen in the environment, a relationship described in the oxygen–hemoglobin dissociation curve. When the ability of hemoglobin to carry oxygen is interfered with, a hypoxic state can result.
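The split between hemoglobin-bound and dissolved oxygen described above can be illustrated with the standard textbook arterial oxygen content formula (CaO2 ≈ 1.34 × Hb × SaO2 + 0.0031 × PaO2). The constants and example values below are common textbook figures, not taken from this article, so treat this as a rough sketch rather than a clinical calculation:

```python
def arterial_o2_content(hb_g_dl: float, sao2_frac: float, pao2_mmhg: float) -> float:
    """Arterial O2 content (mL O2 per dL blood): hemoglobin-bound plus dissolved.

    Uses the common textbook constants 1.34 mL O2 per g Hb and
    0.0031 mL O2/dL/mmHg (assumed values, not from the article).
    """
    bound = 1.34 * hb_g_dl * sao2_frac   # O2 carried by hemoglobin
    dissolved = 0.0031 * pao2_mmhg       # O2 dissolved in plasma
    return bound + dissolved

# Typical healthy values: Hb 15 g/dL, SaO2 98%, PaO2 100 mmHg
with_hb = arterial_o2_content(15, 0.98, 100)
without_hb = arterial_o2_content(0, 0.98, 100)  # dissolved oxygen only
print(f"with hemoglobin: {with_hb:.1f} mL O2/dL")
print(f"dissolved only:  {without_hb:.2f} mL O2/dL")
```

At these illustrative numbers the bound fraction dwarfs the dissolved fraction, consistent with the point that almost all blood oxygen travels on hemoglobin; the exact multiple depends on the reference conditions chosen.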
Ischemia, meaning insufficient blood flow to a tissue, can also result in hypoxia. This is called 'ischemic hypoxia'. This can include an embolic event, a heart attack that decreases overall blood flow, or trauma to a tissue that results in damage. An example of insufficient blood flow causing local hypoxia is gangrene that occurs in diabetes.
Diseases such as peripheral vascular disease can also result in local hypoxia. For this reason, symptoms are worse when a limb is used. Pain may also be felt as a result of increased hydrogen ions leading to a decrease in blood pH (acidity) created as a result of anaerobic metabolism.
This refers specifically to hypoxic states where the arterial content of oxygen is insufficient. This can be caused by alterations in respiratory drive, such as in respiratory alkalosis, physiological or pathological shunting of blood, diseases interfering in lung function resulting in a ventilation-perfusion mismatch, such as a pulmonary embolus, or alterations in the partial pressure of oxygen in the environment or lung alveoli, such as may occur at altitude or when diving.
Carbon monoxide competes with oxygen for binding sites on hemoglobin molecules. As carbon monoxide binds with hemoglobin hundreds of times tighter than oxygen, it can prevent the carriage of oxygen.
Carbon monoxide poisoning can occur acutely, as with smoke intoxication, or over a period of time, as with cigarette smoking. Due to physiological processes, carbon monoxide is maintained at a resting level of 4–6 ppm. This is increased in urban areas (7–13 ppm) and in smokers (20–40 ppm). A carbon monoxide level of 40 ppm is equivalent to a reduction in hemoglobin levels of 10 g/L.
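The stated equivalence (40 ppm CO ≈ 10 g/L effective hemoglobin reduction) can be turned into a simple scaling rule. Assuming the relationship is linear is a simplification of my own, since real carboxyhemoglobin kinetics are more complex; the sketch below only illustrates the numbers given in the text:

```python
def hb_equivalent_loss_g_per_l(co_ppm: float) -> float:
    """Effective hemoglobin reduction (g/L) for a given CO level (ppm),
    linearly scaling the text's 40 ppm ~ 10 g/L equivalence (assumed linear)."""
    return co_ppm * (10.0 / 40.0)

# Representative values from the ranges quoted in the text
for label, ppm in [("resting", 5), ("urban", 10), ("smoker", 30)]:
    print(f"{label}: {ppm} ppm -> ~{hb_equivalent_loss_g_per_l(ppm):.1f} g/L effective Hb loss")
```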
Historical revisionism
In historiography, the term historical revisionism identifies the re-interpretation of a historical account.
It usually involves challenging the orthodox (established, accepted or traditional) views held by professional scholars about a historical event or time-span or phenomenon, introducing contrary evidence, or reinterpreting the motivations and decisions of the people involved. The revision of the historical record can reflect new discoveries of fact, evidence, and interpretation, which then results in revised history. In dramatic cases, revisionism involves a reversal of older moral judgments.
At a basic level, legitimate historical revisionism is a common and not especially controversial process of developing and refining the writing of histories. Much more controversial is the reversal of moral findings, whereby what mainstream historians had considered (for example) positive forces are depicted as negative. Such revisionism, if challenged (especially in heated terms) by the supporters of the previous view, can become an illegitimate form of historical revisionism known as historical negationism if it involves inappropriate methods such as:
This type of historical revisionism can present a re-interpretation of the moral meaning of the historical record.
Negationists use the term "revisionism" to portray their efforts as legitimate historical revisionism. This is especially the case when "revisionism" relates to Holocaust denial.
Historical revisionism is the means by which the historical record, the history of a society, as understood in its collective memory, continually integrates new facts and interpretations of the events that are commonly understood as history. The historian and American Historical Association member James M. McPherson has said:
In the field of historiography, the historian who works within the existing establishment of society, and who has produced a body of history books from which he or she can claim authority, usually benefits from the "status quo". As such, the professional-historian paradigm is manifested as a denunciative stance towards any form of historical revisionism, whether of fact, of interpretation, or of both. The philosopher of science Thomas Kuhn noted that, unlike the quantifiable hard sciences, which are characterized by a single paradigm, the social sciences are characterized by several paradigms that derive from a "tradition of claims, counterclaims, and debates over [the] fundamentals" of research. On resistance to works of revised history that present a culturally comprehensive historical narrative of the US – the perspectives of black people, women, and the labour movement – the historian David Williams said:
After the Second World War, the study and production of history in the US was expanded by the G.I. Bill, whose funding allowed "a new and more broadly-based generation of scholars" with perspectives and interpretations drawn from the feminist movement, the Civil Rights Movement, and the American Indian Movement. That expansion and deepening of the pool of historians voided the existence of a definitive and universally accepted history; the revisionist historian therefore presents the national public with a history that has been corrected and augmented with new facts, evidence, and interpretations of the historical record. In "The Cycles of American History" (1986), contrasting and comparing the US and the Soviet Union during the Cold War (1945–1991), the historian Arthur M. Schlesinger Jr. said:
Revisionist historians contest the mainstream or traditional view of historical events and raise views at odds with traditionalists, which must be freshly judged. Revisionist history is often practiced by those who are in the minority, such as feminist historians, ethnic minority historians, those working outside mainstream academia in smaller and less known universities, or the youngest scholars – in short, historians who have the most to gain and the least to lose in challenging the status quo. In the friction between the mainstream of accepted beliefs and the new perspectives of historical revisionism, received historical ideas are either changed, solidified, or clarified. If, over a period of time, the revisionist ideas become the new establishment status quo, a paradigm shift is said to have occurred. The historian Forrest McDonald is often critical of the turn that revisionism has taken, but admits that the turmoil of 1960s America changed the way history was written:
Historians are influenced by the "zeitgeist" (spirit of the time), and the usually progressive changes to society, politics, and culture, such as occurred after the Second World War (1939–1945); in "The Future of the Past" (1989), the historian C. Vann Woodward said:
Developments in the academy, culture, and politics shaped the contemporary model of writing history, the accepted paradigm of historiography. The philosopher Karl Popper said that "each generation has its own troubles and problems, and, therefore, its own interests and its own point of view".
As the social, political, and cultural influences upon a society change, most historians revise and update their explanations of historical events. The old consensus, based upon limited evidence, might no longer be considered historically valid in explaining the particulars – of cause and effect, of motivation and self-interest – that tell "How?" and "Why?" the past occurred as it occurred; therefore, the factual record is revised to accord with the contemporary understanding of history. As such, in 1986, the historian John Hope Franklin described four stages in the historiography of the African experience of life in the US, which were based upon different models of historical consensus.
The historian Deborah Lipstadt ("Denying the Holocaust: The Growing Assault on Truth and Memory", 1993), and the historians Michael Shermer and Alex Grobman ("Denying History: Who Says the Holocaust Never Happened and Why Do They Say It?", 2002), distinguish between historical revisionism and historical negationism, the latter of which is a form of denialism. Lipstadt said that Holocaust deniers, such as Harry Elmer Barnes, disingenuously self-identify as "historical revisionists" in order to obscure their denialism as academic revision of the historical record.
As such, Lipstadt, Shermer, and Grobman said that legitimate historical revisionism entails the refinement of existing knowledge about a historical event, not a denial of the event, itself; that such refinement of history emerges from the examination of new, empirical evidence, and a re-examination, and consequent re-interpretation of the existing documentary evidence. That legitimate historical revisionism acknowledges the existence of a "certain body of irrefutable evidence" and the existence of a "convergence of evidence", which suggest that an event – such as the Black Death, American slavery, and the Holocaust – did occur; whereas the denialism of history rejects the entire foundation of historical evidence, which is a form of historical negationism.
Some of the influences on historians that may change over time are the following:
As non-Latin texts, such as Welsh, Gaelic and the Norse sagas, have been analysed and added to the canon of knowledge about the period, and as much more archaeological evidence has come to light, the period known as the Dark Ages has narrowed to the point that many historians no longer believe the term is useful. Moreover, the term "dark" implies not so much a void of culture and law as a lack of surviving source texts in mainland Europe. Many modern scholars who study the era tend to avoid the term altogether for its negative connotations, finding it misleading and inaccurate for any part of the Middle Ages.
The concept of feudalism has been questioned. Revisionist scholars led by historian Elizabeth A. R. Brown have rejected the term.
For centuries, historians thought the Battle of Agincourt was an engagement in which the English army, overwhelmingly outnumbered four to one by the French army, pulled off a stunning victory – a version especially popularised by Shakespeare's play "Henry V". However, recent research by Professor Anne Curry, using the original enrollment records, has called this interpretation into question. Though her research is not finished, she has published her initial findings that the French outnumbered the English and the Welsh by only 12,000 to 8,000. If so, the numbers may have been exaggerated for patriotic reasons by the English.
In recounting the European colonization of the Americas, some history books of the past paid little attention to the indigenous peoples of the Americas, usually mentioning them only in passing and making no attempt to understand the events from their point of view. That was reflected in the description of Christopher Columbus having discovered America. Those events' portrayal has since been revised to avoid the word "discovery."
In his 1990 revisionist book, "The Conquest of Paradise: Christopher Columbus and the Columbian Legacy", Kirkpatrick Sale argued that Christopher Columbus was an imperialist bent on conquest from his first voyage. In a "New York Times" book review, historian and member of the Christopher Columbus Quincentenary Jubilee Committee William Hardy McNeill wrote about Sale:
McNeill declares Sale's work to be "unhistorical, in the sense that [it] selects from the often cloudy record of Columbus's actual motives and deeds what suits the researcher's 20th-century purposes." McNeill states that detractors and advocates of Columbus present a "sort of history [that] caricatures the complexity of human reality by turning Columbus into either a bloody ogre or a plaster saint, as the case may be."
The military historian James R. Arnold argues:
In reaction to the orthodox interpretation enshrined in the Versailles Treaty, which declared that Germany was guilty of starting World War I, the self-described "revisionist" historians of the 1920s rejected the orthodox view and presented a complex causation in which several other countries were equally guilty. Intense debate continues among scholars.
The military leadership of the British Army during World War I was frequently condemned as poor by historians and politicians for decades after the war ended. Common charges were that the generals commanding the army were blind to the realities of trench warfare, ignorant of the conditions of their men and unable to learn from their mistakes, thus causing enormous numbers of casualties ("lions led by donkeys").
History of the petroleum industry in the United States
The first successful oil well in North America was established in Oil Springs, Ontario, Canada in 1858. The field is still in production although quantities are low.
The history of the petroleum industry in the United States goes back to the early 19th century, although indigenous peoples, like many ancient societies, had used petroleum seeps since prehistoric times; where found, these seeps often marked the sites of the industry's earliest development.
Petroleum became a major industry following the oil discovery at Oil Creek, Pennsylvania, in 1859. For much of the 19th and 20th centuries, the US was the largest oil-producing country in the world. As of October 2015, the US was the world's third-largest producer of crude oil.
Native Americans had known of the oil in western Pennsylvania, and had made some use of it for many years before the mid-19th century. Early European explorers noted seeps of oil and natural gas in western Pennsylvania and New York. Interest grew substantially in the mid-1850s as scientists reported on the potential to manufacture kerosene from crude oil, if a sufficiently large oil supply could be found.
The Jesuit Relations of 1657 states:
Salt was a valuable commodity, and an industry developed near salt springs in the Ohio River Valley, producing salt by evaporating brine from the springs. Salt wells were sunk at the salt springs to increase the supply of brine for evaporation. Some of the wells were hand-dug, but salt producers also learned to drill wells by percussion (cable tool) methods. In a number of locations in western Virginia, Ohio, and Kentucky, oil and natural gas came up the wells along with the brine. The oil was mostly a nuisance, but some salt producers saved it and sold it as illuminating oil or medicine. In some locations, enough natural gas was produced to be used as fuel for the salt evaporating pans. Early salt brine wells that produced byproduct oil included the Thorla-McKee Well of Ohio in 1814, a well near Burkesville, Kentucky, in 1828, and wells at Burning Springs, West Virginia, by 1836.
The US natural gas industry started in 1821 at Fredonia, Chautauqua County, New York, when William Hart dug a well into gas-bearing shale, then drilled the borehole deeper, and piped the natural gas to a nearby inn where it was burned for illumination. Soon many gas wells were drilled in the area, and the gas-lit streets of Fredonia became a tourist attraction.
On August 27, 1859, George Bissell and Edwin L. Drake made the first successful use of a drilling rig on a well drilled especially to produce oil, at a site on Oil Creek near Titusville, Pennsylvania. The Drake partners were encouraged by Benjamin Silliman (1779-1864), a chemistry professor at Yale, who tested a sample of the oil, and assured them that it could be distilled into useful products such as illuminating oil.
The Drake well is often referred to as the "first" commercial oil well, although that title is also claimed for wells in Azerbaijan, Ontario, West Virginia, and Poland, among others. However, before the Drake well, oil-producing wells in the United States were wells that were drilled for salt brine, and produced oil and gas only as accidental byproducts. An intended drinking water well at Oil Springs, Ontario found oil in 1858, a year before the Drake well, but it had not been drilled for oil. Historians have noted that the importance of the Drake well was not in being the first well to produce oil, but in attracting the first great wave of investment in oil drilling, refining, and marketing:
The success of the Drake well quickly led to oil drilling in other locations in the western Appalachian mountains, where oil was seeping to the surface, or where salt drillers had previously found oil fouling their salt wells. During the American Civil War, the oil-producing region spread over much of western Pennsylvania, up into western New York state, and down the Ohio River valley into the states of Ohio, Kentucky, and the western part of Virginia (now West Virginia). The Appalachian Basin continued to be the leading oil-producing region in the United States through 1904.
The first commercial oil well in New York was drilled in 1865. New York's (and northwestern Pennsylvania's) crude oil is very high in paraffin.
The principal product of the oil in the 19th century was kerosene, which quickly replaced whale oil for illuminating purposes in the United States. Originally dealing in whale oil, which was widely used for illumination, Charles Pratt (1830–1891) of Massachusetts was an early pioneer of the natural oil industry in the United States. He was founder of Astral Oil Works in the Greenpoint section of Brooklyn, New York. Pratt's product later gave rise to the slogan, "The holy lamps of Tibet are primed with Astral Oil." He joined with his protégé Henry H. Rogers to form Charles Pratt and Company in 1867. Both companies became part of John D. Rockefeller's Standard Oil in 1874.
The Mid-continent area is an area generally including Kansas, Oklahoma, Arkansas, North Louisiana and the part of Texas away from the Gulf Coast. The first commercially successful oil well drilled in Kansas was the Norman No. 1 near Neodesha, Kansas, on November 28, 1892.
Oil was discovered at Bartlesville and Burbank in 1897. But the initial discoveries created no great excitement until the discovery gusher of the Glenn Pool in 1905. The Glenn discovery came when Gulf Coast production was declining rapidly, and the operators were eager for new areas to drill. The increased drilling resulted in major discoveries at Cushing in 1912 and Healdton in 1913.
The largest oil field in the lower 48 states, the East Texas oil field, was not discovered until 1930, when wildcatter Columbus Marion Joiner (more commonly known as "Dad" Joiner) drilled the Daisy Bradford No. 3 well, in Rusk County, Texas.
In 1906, the Caddo-Pine Island Field in northern Caddo Parish, Louisiana was discovered, and a rush of leasing and drilling activity ensued. In 1908, the first natural gas pipeline was constructed to transport gas from Caddo-Pine Island to Shreveport, Louisiana. This was one of the earliest commercial uses of natural gas, which was commonly viewed as an undesirable by-product of oil production and often "flared" or burnt off at the well site.
Other innovations in the Caddo-Pine Island Field included the first over-water oil platform, which was constructed in the field on Caddo Lake in 1910. In that same year, a major oil pipeline was constructed from Caddo-Pine Island Field to a refinery built and operated by Standard Oil Company of Louisiana in Baton Rouge, Louisiana. The refinery continues to operate today.
Other early petroleum discoveries in North Louisiana included the Bull Bayou Field, Red River Parish, Louisiana (1913), Monroe Gas Field, Ouachita Parish, Louisiana (1916), Homer Field, Claiborne Parish, Louisiana (1919) and Haynesville Field, Claiborne Parish, Louisiana (1921).
Native Americans had known of the tar seeps in southern California for thousands of years, and used the tar to waterproof their canoes. Spanish settlers also knew of the seeps, such as at Rancho La Brea (Spanish for "Tar Ranch") in present-day Los Angeles, from which the priests obtained tar to waterproof the roofs of the Los Angeles and San Gabriel missions.
Despite the abundance of well-known seeps in southern California, the first commercial oil well in California was drilled in Humboldt County, northern California in 1865.
Some attempts were made in the 1860s to exploit oil deposits under tar seeps in the Ventura Basin of Ventura County and northeastern Los Angeles county. The early efforts failed because of complex geology, and, more importantly, because the refining techniques then available could not manufacture high-quality kerosene from California crude oil, which differed chemically from Pennsylvania crude oil. Most California crude oil in the early years was turned into the less lucrative products of fuel oil and asphalt.
Oil production in the Los Angeles Basin started with the discovery of the Brea-Olinda Oil Field in 1880, and continued with the development of the Los Angeles City Oil Field in 1893, the Beverly Hills Oil Field in 1900, the Salt Lake Oil Field in 1902, and many others. The discovery of the Long Beach Oil Field in 1921, which proved to be the world's richest in production per-acre of the time, increased the importance of the Los Angeles Basin as a worldwide oil producer. This increased again with the discovery of the Wilmington Oil Field in 1932, and the development of the Port of Los Angeles as a means of shipping crude oil overseas.
Production in Santa Barbara County began in the 1890s with the development of the Summerland Oil Field, which included the world's first offshore oil wells. With the discovery of the Orcutt and Lompoc fields, northern Santa Barbara County became a regional center of production; towns such as Orcutt owe their existence to the quickly growing industry.
Oil in the San Joaquin Basin was first discovered at the Coalinga field in 1890. By 1901, the San Joaquin Basin was the main oil-producing region of California, and it remains so in the 21st century, with huge oil fields including the Midway-Sunset, Kern River, and Belridge fields producing much of California's onshore oil.
The first commercial oil well in the Rocky Mountains was drilled near Cañon City, Colorado in 1862. The wells in the Cañon City-Florence field, drilled near surface oil seeps, produced from fractures in the Pierre Shale.
A Russian sea captain noted oil seeps along the shore of the Cook Inlet as early as 1853, and oil drilling began in 1898 in a number of locations along the southern coast of Alaska. Production was relatively small, however, until huge discoveries were made on Alaska's remote North Slope.
Petroleum seeps on the North Slope have been known for many years, and in 1923 the federal government created US Naval Petroleum Reserve No. 4 to cover the presumed oil fields beneath the seeps. Some exploration drilling was done in the reserve during World War II and the 1950s, but the remote location deterred intensive exploration until the 1960s. The Prudhoe Bay Oil Field, the largest oil field in the United States in terms of total oil produced, was discovered in 1968. Production began in 1977, following completion of the Trans-Alaska Pipeline. Through 2005, the field produced oil at an average rate of 1.5 million barrels per day, and it is estimated to contain a substantial volume of additional economically recoverable oil.
Capt. Anthony Francis Lucas, an experienced mining engineer and salt driller, drilled a well to find oil at Spindletop Hill. On the morning of January 10, 1901, the little hill south of Beaumont, Texas began to tremble and mud bubbled up over the rotary table. A low rumbling sound came from underground, and then, with a force that shot 6 tons of 4-inch (100 mm) diameter pipe out over the top of the derrick, knocking off the crown block, the Lucas Gusher roared in and the Spindletop oil field was born. Spindletop became the focus of frenzied drilling; oil production from the field peaked in 1902, but by 1905 production had declined 90% from the peak.
Spindletop Hill turned out to be the surface expression of an underground salt dome, around which the oil accumulated. The Spindletop gusher started serious oil exploration of the Gulf Coast in Texas and Louisiana, an area that had previously been dismissed by oil men. Other salt dome mounds were quickly drilled, resulting in discoveries at Sour Lake (1902), Batson (1904) and Humble (1905).
The Standard Oil Company was slow to appreciate the economic potential of the Spindletop oil field, and the Gulf Coast generally, which gave greater opportunity to others; Spindletop became the birthplace of oil giants Texaco and Gulf Oil. Although in 1899 Standard Oil controlled more than 85% of the oil production in the older oil regions in the Appalachian Basin and the Lima-Indiana trend, it never controlled more than 10% of the oil production in the new Gulf Coast province.
By the Natural Gas Act of 1938, the federal government imposed price controls on natural gas in interstate commerce. The Federal Power Commission was mandated to set interstate gas prices at "just and reasonable" rates. The FPC at first only regulated the price at which pipelines sold gas to utilities and industry, but later put limits on the wellhead price of gas sold to an interstate pipeline. Gas producers challenged the controls, but lost in the Supreme Court in Phillips Petroleum Co. v. Wisconsin (1954).
The federal government had controlled the price of natural gas that crossed state lines, but not of gas produced and sold within a state. In the 1970s, the low interstate price set by the federal government caused supply shortages of gas in consuming states, because gas producers sold as much as they could of their product for higher prices in the local markets within gas-producing states. In the Natural Gas Policy Act of 1978, the federal government extended price controls to all natural gas in the country. At the same time, the government created a complex price system in which the price paid to the producer depended on the date the well was drilled, the depth of the well, the geological formation, the distance to other gas wells, and several other factors. The price system was an attempt to keep the average price low while encouraging new production.
The last federal price controls on natural gas were removed by the Natural Gas Decontrol Act of 1989, which phased out the last remaining price control as of 1 January 1993.
As of December 2012, North Dakota was producing oil at a rate of 750,000 barrels per day.
Hezekiah
Hezekiah, or Ezekias, was, according to the Hebrew Bible, the son of Ahaz and the 13th king of Judah. He is considered a very righteous king in both the Second Book of Kings and the Second Book of Chronicles. He is also one of the more prominent kings of Judah mentioned in the Bible and is one of the kings mentioned in the genealogy of Jesus in the Gospel of Matthew. "No king of Judah, among either his predecessors or his successors, could ... be compared to him", according to 2 Kings 18:5.
Edwin Thiele concluded that his reign was between c. 715 and 686 BC.
According to the biblical narrative, Hezekiah witnessed the destruction of the northern Kingdom of Israel by Sargon's Assyrians in c. 722 BC and was king of Judah during the siege of Jerusalem by Sennacherib in 701 BC. Hezekiah enacted sweeping religious reforms, including a strict mandate for the sole worship of Yahweh and a prohibition on venerating other deities within the Temple of Jerusalem. Isaiah and Micah prophesied during his reign.
The name Hezekiah means "Yahweh strengthens" in Hebrew. Alternately it may be translated as "Yahweh is my strength".
The main accounts of Hezekiah's reign are found in 2 Kings 18–20, Isaiah 36–39, and 2 Chronicles 29–32 of the Hebrew Bible. Proverbs 25:1 mentions that the chapters which follow are a collection of King Solomon's proverbs that were "copied by the officials of King Hezekiah of Judah". His reign is also referred to in the books of the prophets Isaiah, Jeremiah, Hosea, and Micah. The books of Hosea and Micah record that their prophecies were made during Hezekiah's reign.
Hezekiah was the son of King Ahaz and Abijah. His mother, Abijah (also called Abi), was a daughter of the high priest Zechariah. Based on Thiele's dating, Hezekiah was born in c. 741 BCE. He was married to Hephzi-bah. He died from natural causes at the age of 54 in c. 687 BCE and was succeeded by his son Manasseh.
According to the biblical narrative, Hezekiah assumed the throne of Judah at the age of 25 and reigned for 29 years. Some writers have proposed that Hezekiah served as coregent with his father Ahaz for about 14 years. His sole reign is dated by William F. Albright as 715–687 BCE, and by Edwin R. Thiele as 716–687 BCE (the last ten years being a co-regency with his son Manasseh).
Hezekiah purified and repaired the Temple, purged its idols, and reformed the priesthood. In an effort to abolish idolatry from his kingdom, he destroyed the high places (or "bamot") and the "bronze serpent" (or "Nehushtan"), recorded as being made by Moses, which had become objects of idolatrous worship. In place of this, he centralized the worship of God at the Temple in Jerusalem. Hezekiah also defeated the Philistines, "as far as Gaza and its territory", and resumed the Passover pilgrimage and the tradition of inviting the scattered tribes of Israel to take part in a Passover festival.
2 Chronicles 30 (but not the parallel account in 2 Kings) records that Hezekiah sent messengers to Ephraim and Manasseh inviting them to Jerusalem for the celebration of the Passover. The messengers, however, were not only ignored but even laughed at, although a few men of the tribes of Asher, Manasseh and Zebulun "were humble enough to come" to the city. Nevertheless, the Passover was celebrated with great solemnity and such rejoicing as had not been seen in Jerusalem since the days of Solomon. The celebration took place during the second month, Iyar, because not enough priests had consecrated themselves in the first month.
Biblical writer H. P. Mathys suggests that Hezekiah, being unable to restore the union of Judah and Israel by political means, used the invitation to the northern tribes as a final religious "attempt to restore the unity of the cult". He also notes that this account "is often considered to contain historically reliable elements, especially since negative aspects are also reported on", although he questions the full extent to which it may be considered historically reliable.
After the death of Assyrian king Sargon II in 705 BCE, Sargon's son Sennacherib became king of Assyria. In 703 BCE, Sennacherib began a series of major campaigns to quash opposition to Assyrian rule, starting with cities in the eastern part of the realm. In 701 BCE, Sennacherib turned toward cities in the west. Hezekiah then had to face the invasion of Judah. According to the Bible, Hezekiah did not rely on Egypt for support, but relied on God and prayed to Him for deliverance of his capital city Jerusalem.
The Assyrians recorded that Sennacherib lifted his siege of Jerusalem after Hezekiah paid Sennacherib tribute. The Bible records that Hezekiah paid him three hundred talents of silver and thirty of gold as tribute, even stripping the gold from the doors of the Temple to produce the promised amount, but, even after the payment was made, Sennacherib renewed his assault on Jerusalem. Sennacherib surrounded the city and sent his Rabshakeh to the walls as a messenger. The Rabshakeh addressed the soldiers manning the city wall in Hebrew ("Yĕhuwdiyth"), asking them to distrust Yahweh and Hezekiah, and claiming that Hezekiah's righteous reforms (destroying the idols and High Places) were a sign that the people should not trust their god to be favorably disposed. The Bible records that Hezekiah then went to the Temple and there prayed to God.
Knowing that Jerusalem would eventually be subject to siege, Hezekiah had been preparing for some time by fortifying the walls of the capital, building towers, and constructing a tunnel to bring fresh water to the city from a spring outside its walls. He made at least two major preparations that would help Jerusalem to resist conquest: the construction of the Siloam Tunnel, and construction of the Broad Wall.
"When Sennacherib had come, intent on making war against Jerusalem, Hezekiah consulted with his officers about stopping the flow of the springs outside the city … for otherwise, they thought, the King of Assyria would come and find water in abundance" ().
The narratives of the Bible state that Sennacherib's army besieged Jerusalem.
According to the biblical record, Sennacherib sent threatening letters warning Hezekiah that he had not desisted from his determination to take the Judean capital. Although they besieged Jerusalem, the biblical accounts state that the Assyrians did not so much as "shoot an arrow there, ... nor cast up a siege rampart against it", and that God sent out an angel who, in one night, struck down "a hundred and eighty-five thousand in the camp of the Assyrians," sending Sennacherib back "with shame of face to his own land".
Sennacherib's inscriptions make no mention of the disaster suffered by his forces. But, as Professor Jack Finegan comments: "In view of the general note of boasting which pervades the inscriptions of the Assyrian kings, ... it is hardly to be expected that Sennacherib would record such a defeat." The Cambridge Bible for Schools and Colleges refers to an "Egyptian tradition, according to which Sennacherib had already reached Pelusium in Egypt, when in a single night his army was rendered helpless by a plague of field-mice which gnawed the bows of the soldiers and the thongs of their shields". The version of the matter that Sennacherib presents, as found inscribed on what is known as the Sennacherib Prism preserved in the University of Chicago Oriental Institute, in part says: "As to Hezekiah, the Jew, he did not submit to my yoke ... Hezekiah himself ... did send me, later, to Nineveh, my lordly city, together with 30 talents of gold, 800 talents of silver, ..." This version inflates the number of silver talents sent from 300 to 800; but in other regards it confirms the biblical record and shows that Sennacherib made no claim that he captured Jerusalem. However, Sennacherib presents the matter of Hezekiah's paying tribute as having come after the Assyrian threat of a siege against Jerusalem, whereas the Bible states it was paid before.
Of Sennacherib's death, 2 Kings 19:37 records:
"It came about as he was worshiping in the house of Nisroch his god, that Adrammelech and Sharezer killed him [Sennacherib] with the sword; and they escaped into the land of Ararat. And Esarhaddon his son became king in his place."
According to Assyrian records, Sennacherib was assassinated in 681 BCE, twenty years after the 701 BCE invasion of Judah. A Neo-Babylonian letter corroborates the biblical account of a plot by Sennacherib's sons to assassinate him, an event Assyriologists have reconstructed as historical. The son Ardi-Mulishi, who is described in the letter as killing anyone who would reveal his conspiracy, murdered his father in c. 681 BCE and was most likely the Adrammelech of 2 Kings, though Sharezer is not known elsewhere. Assyriologists posit that the murder was motivated by Esarhaddon's having been chosen as heir to the throne instead of Ardi-Mulishi, the next eldest son. Assyrian records and Hebrew biblical history agree that Esarhaddon ultimately did succeed to the throne. Other Assyriologists assert that Sennacherib was murdered in revenge for his destruction of Babylon, a city sacred to all Mesopotamians, including the Assyrians.
Later in his life, Hezekiah was ill with a boil or an inflammation which Isaiah initially thought would be fatal. The narrative of his sickness and miraculous recovery is found in 2 Kings 20, Isaiah 38 and 2 Chronicles 32. Various ambassadors came to congratulate him on his recovery, among them an embassy from Merodach-baladan, son of Baladan, king of Babylon, "for he had heard that Hezekiah had been sick". Hezekiah, his vanity flattered by the visit, showed the Babylonian embassy all the wealth, arms and stores of Jerusalem, revealing too much information to the king of Babylon (or perhaps boasting about his wealth): he was then confronted by Isaiah, who foretold that a future generation of the people of Judah would be taken as captives to Babylon. Hezekiah was reassured that his own lifetime would see peace and security.
According to 2 Kings 20:6, Hezekiah lived another 15 years after praying to God. His son and successor, Manasseh, was born during this time: he was 12 years of age when he succeeded Hezekiah.
According to the Talmud, the disease came about because of a dispute between him and Isaiah over who should pay whom a visit and over Hezekiah's refusal to marry and have children, although in the end he married Isaiah's daughter. Some Talmudists also considered that it might have come about as a way for Hezekiah to purge his sins or due to his arrogance in assuming his righteousness.
Extra-biblical sources do more than supply a general Near Eastern backdrop for Hezekiah: several specify him by name and bear directly on his reign and influence. "Historiographically, his reign is noteworthy for the convergence of a variety of biblical sources and diverse extrabiblical evidence often bearing on the same events. Significant data concerning Hezekiah appear in the Deuteronomistic History, the Chronicler, Isaiah, Assyrian annals and reliefs, Israelite epigraphy, and, increasingly, stratigraphy". Archaeologist Amihai Mazar calls the tensions between Assyria and Judah "one of the best-documented events of the Iron Age" (172). Hezekiah's reign is thus among the easiest in the Hebrew Bible to cross-reference with the historical records of the wider Near East.
A seal impression dating back to 727–698 BCE, reading "לחזקיהו [בן] אחז מלך יהדה" "Belonging to Hezekiah [son of] Ahaz king of Judah" was uncovered in a dig at the Ophel in Jerusalem. The impression on this inscription was set in ancient Hebrew script.
A lintel inscription, found over the doorway of a tomb, has been ascribed to his secretary, Shebnah.
LMLK storage jars along the border with Assyria "demonstrate careful preparations to counter Sennacherib's likely route of invasion" and show "a notable degree of royal control of towns and cities which would facilitate Hezekiah's destruction of rural sacrificial sites and his centralization of worship in Jerusalem". Evidence suggests they were used throughout his 29-year reign. There are some bullae from sealed documents that may have belonged to Hezekiah himself. There are also some that name his servants ("ah-vah-deem" in Hebrew, ayin-bet-dalet-yod-mem).
In 2015 Eilat Mazar discovered a bulla that bears an inscription in ancient Hebrew script that translates as: "Belonging to Hezekiah [son of] Ahaz king of Judah." This is the first seal impression of an Israelite or Judean king to come to light in a scientific archaeological excavation. While another, unprovenanced bulla of King Hezekiah was known, this was the first time a seal impression of Hezekiah had been discovered in situ in the course of actual excavations. Archaeological findings like the Hezekiah seal led scholars to surmise that the ancient Judahite kingdom had a highly developed administrative system. In 2018 Mazar published a report discussing the discovery of a bulla (a type of seal) which she says may have belonged to Isaiah. She believes the fragment to have been part of a seal whose complete text might have read "Belonging to Isaiah the prophet." Several other biblical archaeologists, including George Washington University's Christopher Rollston, have pointed out that the bulla is incomplete and that the surviving inscription is not enough to establish a reference to the biblical figure.
According to the work of archaeologists and philologists, the reign of Hezekiah saw a notable increase in the power of the Judean state. At this time Judah was the strongest nation on the Assyrian–Egyptian frontier. There were increases in literacy and in the production of literary works. The massive construction of the Broad Wall took place during his reign, the city was enlarged to accommodate a large influx of people, and Jerusalem's population increased to as many as 25,000, "five times the population under Solomon." Archaeologist Amihai Mazar explains, "Jerusalem was a virtual city-state where the majority of the state's population was concentrated," in comparison to the rest of Judah's cities (167). Archaeologist Israel Finkelstein says, "The key phenomenon—which cannot be explained solely against the background of economic prosperity—was the sudden growth of the population of Jerusalem in particular, and of Judah in general" (153). He says the cause of this growth must be a large influx of Israelites fleeing from the Assyrian destruction of the northern state: it is "[t]he only reasonable way to explain this unprecedented demographic development" (154). This, according to Finkelstein, set the stage for motivations to compile and reconcile Hebrew history into a text at that time (157). Mazar questions this explanation, since, he argues, it is "no more than an educated guess" (167).
The Siloam Tunnel was chiseled through 533 meters (1,750 feet) of solid rock in order to provide Jerusalem underground access to the waters of the Gihon Spring or Siloam Pool, which lay outside the city.
The Siloam Inscription from the Siloam Tunnel is now in the Istanbul Archaeology Museum. It "commemorates the dramatic moment when the two original teams of tunnelers, digging with picks from opposite ends of the tunnel, met each other" (564). It is "[o]ne of the most important ancient Hebrew inscriptions ever discovered." Finkelstein and Mazar cite this tunnel as an example of Jerusalem's impressive state-level power at the time.
Archaeologists like William G. Dever have pointed at archaeological evidence for the iconoclasm during the period of Hezekiah's reign. The central cult room of the temple at Arad (a royal Judean fortress) was deliberately and carefully dismantled, "with the altars and massebot" concealed "beneath a Str. 8 plaster floor". This stratum correlates with the late 8th century; Dever concludes that "the deliberate dismantling of the temple and its replacement by another structure in the days of Hezekiah is an archeological fact. I see no reason for skepticism here."
Under Rehoboam, Lachish became the second-most important city of the kingdom of Judah. During the revolt of king Hezekiah against Assyria, it was captured by Sennacherib despite determined resistance (see Siege of Lachish).
As the Lachish relief attests, Sennacherib began his siege of the city of Lachish in 701 BCE. The Lachish Relief graphically depicts the battle, and the defeat of the city, including Assyrian archers marching up a ramp and Judahites pierced through on mounted stakes. "The reliefs on these slabs" discovered in the Assyrian palace at Nineveh "originally formed a single, continuous work, measuring 8 feet ... tall by 80 feet ... long, which wrapped around the room" (559). Visitors "would have been impressed not only by the magnitude of the artwork itself but also by the magnificent strength of the Assyrian war machine."
Sennacherib's Prism was found buried in the foundations of the Nineveh palace. It was written in cuneiform, the Mesopotamian form of writing of the day. The prism records the conquest of 46 strong towns and "uncountable smaller places," along with the siege of Jerusalem where Sennacherib says he just "shut him up ... like a bird in a cage," subsequently enforcing a larger tribute upon him.
The Hebrew Bible states that during the night the angel of Jehovah (Hebrew: YHWH) brought death to 185,000 Assyrian troops, forcing the army to abandon the siege, yet it also records a tribute of 300 silver talents paid to Sennacherib following the siege. There is no account of the supernatural event in the prism. Sennacherib's account records his levying of a tribute from Hezekiah, the king of Judah, who remained within Jerusalem, the only city left intact after the exile of the northern ten-tribe kingdom of Israel for its idolatry (2 Kings 17:22, 23; 2 Kings 18:1–8). Sennacherib recorded a payment of 800 silver talents, which suggests a capitulation to end the siege. Inscriptions have also been discovered describing Sennacherib's defeat of the Ethiopian forces. These say: "As to Hezekiah, the Jew, he did not submit to my yoke, I laid siege to 46 of his strong cities ... and conquered (them). ... Himself I made a prisoner in Jerusalem, his royal residence, like a bird in a cage." He does not claim to have captured the city. This is consistent with the biblical account of Hezekiah's revolt against Assyria in the sense that neither account indicates that Sennacherib ever entered or formally captured the city. Sennacherib in this inscription claims that Hezekiah paid 800 talents of silver in tribute, in contrast with the Bible's 300; this could, however, be boastful exaggeration, which was not uncommon among kings of the period. Furthermore, the annals record a list of booty sent from Jerusalem to Nineveh. In the inscription, Sennacherib claims that Hezekiah accepted servitude, and some theorize that Hezekiah remained on his throne as a vassal ruler. The campaign is recorded with differences in the Assyrian records and in the biblical Books of Kings; there is agreement, however, that the Assyrian records have a propensity for exaggeration.
One theory that takes the biblical view posits that the defeat was caused by "possibly an outbreak of the bubonic plague". Another holds that the account is a composite text which makes use of a 'legendary motif' analogous to that of the Exodus story.
The Talmud (Bava Batra 15a) credits Hezekiah with overseeing the compilation of the biblical books of Isaiah, Proverbs, Song of Songs and Ecclesiastes.
According to Jewish tradition, the victory over the Assyrians and Hezekiah's return to health happened at the same time, the first night of Passover.
The Greek historian Herodotus (c. 484 BCE – c. 425 BCE) wrote of the invasion and acknowledges many Assyrian deaths, which he claims were the result of a plague of mice. The Jewish historian Josephus followed the writings of Herodotus. That both historians record Sennacherib's failure to take Jerusalem is "uncontested".
Abi saved the life of her son Hezekiah, whom her godless husband, Ahaz, had destined as an offering to Moloch. By anointing him with the blood of the salamander, she enabled him to pass through the fire of Moloch unscathed (Sanh. 63b).
Hezekiah is considered as the model of those who put their trust in the Lord. Only during his sickness did he waver in his hitherto unshaken trust and require a sign, for which he was blamed by Isaiah (Lam. R. i.). The Hebrew name "Ḥizḳiyyah" is considered by the Talmudists to be a surname, meaning either "strengthened by Yhwh" or "he who made a firm alliance between the Israelites and Yhwh"; his eight other names are enumerated in Isa. ix. 5 (Sanh. 94a). He is called the restorer of the study of the Law in the schools, and is said to have planted a sword at the door of the bet ha-midrash, declaring that he who would not study the Law should be struck with the weapon (ib. 94b).
Hezekiah's piety, which, according to the Talmudists, alone occasioned the destruction of the Assyrian army and the signal deliverance of the Israelites when Jerusalem was attacked by Sennacherib, caused him to be considered by some as the Messiah (ib. 99a). According to Bar Ḳappara, Hezekiah was destined to be the Messiah, but the attribute of justice ("middat ha-din") protested against this, saying that as David, who sang so much the glory of God, had not been made the Messiah, still less should Hezekiah, for whom so many miracles had been performed, yet who did not sing the praise of God (ib. 94a).
Hezekiah's dangerous illness was caused by the discord between him and Isaiah, each of whom desired that the other should pay him the first visit. In order to reconcile them God struck Hezekiah with a malady and ordered Isaiah to visit the sick king. Isaiah told the latter that he would die, and that his soul also would perish because he had not married and had thus neglected the commandment to perpetuate the human species. Hezekiah did not despair, however, holding to the principle that one must always have recourse to prayer. He finally married Isaiah's daughter, who bore him Manasseh (Ber. 10a). However, in Gen. R. lxv. 4, as quoted in Yalḳ., II Kings, 243, it is said that Hezekiah prayed for illness and for recovery in order that he might be warned and be able to repent of his sins. He was thus the first who recovered from illness. But in his prayer he was rather arrogant, praising himself; and this resulted in the banishment of his descendants (Sanh. 104a). R. Levi said that Hezekiah's words, "and I have done what is good in thy eyes" (II Kings xx. 3), refer to his concealing a book of healing. According to the Talmudists, Hezekiah did six things, of which three agreed with the dicta of the Rabbis and three disagreed therewith (Pes. iv., end). The first three were these: (1) he concealed the book of healing because people, instead of praying to God, relied on medical prescriptions; (2) he broke in pieces the brazen serpent (see Biblical Data, above); and (3) he dragged his father's remains on a pallet, instead of giving them kingly burial. The second three were: (1) stopping the water of Gihon; (2) cutting the gold from the doors of the Temple; and (3) celebrating the Passover in the second month (Ber. 10b; comp. Ab. R. N. ii., ed. Schechter, p. 11).
The question that puzzled Heinrich Ewald ("Gesch. des Volkes Israel," iii. 669, note 5) and others, "Where was the brazen serpent till the time of Hezekiah?" occupied the Talmudists also. They answered it in a very simple way: Asa and Jehoshaphat, when clearing away the idols, purposely left the brazen serpent behind, in order that Hezekiah might also be able to do a praiseworthy deed in breaking it (Ḥul. 6b).
The Midrash reconciles the two different narratives (II Kings xviii. 13-16 and II Chron. xxxii. 1-8) of Hezekiah's conduct at the time of Sennacherib's invasion (see Biblical Data, above). It says that Hezekiah prepared three means of defense: prayer, presents, and war (Eccl. R. ix. 27), so that the two Biblical statements complement each other. The reason why Hezekiah's display of his treasures to the Babylonian ambassadors aroused the anger of God (II Chron. xxxii. 25) was that Hezekiah opened before them the Ark, showing them the tablets of the covenant, and saying, "It is with this that we are victorious" (Yalḳ., l.c. 245).
Notwithstanding Hezekiah's immense riches, his meal consisted only of a pound of vegetables (Sanh. 94b). The honor accorded to him after death consisted, according to R. Judah, in his bier being preceded by 36,000 men whose shoulders were bare in sign of mourning. According to R. Nehemiah, a scroll of the Law was placed on Hezekiah's bier. Another statement is that a yeshibah was established on his grave—for three days, according to some: for seven, according to others; or for thirty, according to a third authority (Yalḳ., II Chron. 1085). The Talmudists attribute to Hezekiah the redaction of the books of Isaiah, Proverbs, Song of Solomon, and Ecclesiastes (B. B. 15a).
Whether the biblically recorded sequence of events in Hezekiah's life is chronological is critical to the contextual interpretation of his reign. According to scholar Stephen L. Harris, chapter 20 of 2 Kings does not follow the events of chapters 18 and 19 (161). Rather, the Babylonian envoys precede the Assyrian invasion and siege. Chapter 20 would have been added during the exile, and Harris says it "evidently took place before Sennacherib's invasion" when Hezekiah was "trying to recruit Babylon as an ally against Assyria." Consequently, "Hezekiah ends his long reign impoverished and ruling over only a tiny scrap of his former domain." Likewise, "The Archaeological Study Bible" says, "The presence of these riches" that Hezekiah shows to the Babylonians "indicates that this event took place before Hezekiah's payment of tribute to Sennacherib in 701 BC" (564). Again, "Though the king's illness and the subsequent Babylonian mission are described at the end of the accounts of his reign, they must have occurred before the war with Assyria." Thus, Isaiah's chastening of Hezekiah is due to the alliances he made with other countries during the Assyrian conflict as insurance. To a reader who interprets the chapters chronologically, it would appear that Hezekiah ended his reign at a climax; with scholarly analysis, his end would instead be interpreted as a long fall from where he began.
There has been considerable academic debate about the actual dates of reigns of the Israelite kings. Scholars have endeavored to synchronize the chronology of events referred to in the Hebrew Bible with those derived from other external sources. In the case of Hezekiah, scholars have noted that the apparent inconsistencies are resolved by accepting the evidence that Hezekiah, like his predecessors for four generations in the kings of Judah, had a coregency with his father, and this coregency began in 729 BCE.
As an example of the reasoning that finds inconsistencies in calculations when coregencies are "a priori" ruled out, 2 Kings 18:10 dates the fall of Samaria (the Northern Kingdom) to the 6th year of Hezekiah's reign. William F. Albright has dated the fall of the Kingdom of Israel to 721 BCE, while E. R. Thiele calculates the date as 723 BCE. If Albright's or Thiele's dating is correct, then Hezekiah's reign would begin in either 729 or 727 BCE. On the other hand, 2 Kings 18:13 states that Sennacherib invaded Judah in the 14th year of Hezekiah's reign. Dating based on Assyrian records places this invasion in 701 BCE, and Hezekiah's reign would therefore begin in 716/715 BCE. This dating would be confirmed by the account of Hezekiah's illness in chapter 20, which immediately follows Sennacherib's departure. This would date his illness to Hezekiah's 14th year, which is confirmed by Isaiah's statement that he will live fifteen more years (29 − 15 = 14). As shown below, these problems are all addressed by scholars who make reference to the ancient Near Eastern practice of coregency.
Following the approach of Wellhausen, another set of calculations shows it is probable that Hezekiah did not ascend the throne before 722 BCE. By Albright's calculations, Jehu's initial year is 842 BCE; and between it and Samaria's destruction the "Books of Kings" give the total number of years the kings of Israel ruled as 143 7/12, while for the kings of Judah the number is 165. This discrepancy, amounting in the case of Judah to 45 years (the 165 recorded regnal years against the 120 years that elapsed between 842 and 722 BCE), has been accounted for in various ways; but every one of those theories must allow that Hezekiah's first six years fell before 722 BCE. (That Hezekiah began to reign before 722 BCE, however, is entirely consistent with the principle that the Ahaz/Hezekiah coregency began in 729 BCE.) Nor is it clearly known how old Hezekiah was when called to the throne, although 2 Kings 18:2 states he was twenty-five years of age. His father died at the age of thirty-six; it is not likely that Ahaz at the age of eleven should have had a son. Hezekiah's own son Manasseh ascended the throne twenty-nine years later, at the age of twelve. This places his birth in the seventeenth year of his father's reign, or gives Hezekiah's age as forty-two, if he was twenty-five at his ascension. It is more probable that Ahaz was twenty-one or twenty-five when Hezekiah was born (suggesting an error in the text), and that the latter was thirty-two at the birth of his son and successor, Manasseh.
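The regnal-year arithmetic behind these synchronisms can be sketched in a few lines. The following is a toy check only, assuming the convention that a king's first regnal year is his accession year and noting that each Judean (Tishri-based) regnal year straddles two Julian years BCE; it is not a historical argument, merely an illustration of why the coregency hypothesis makes the numbers cohere:

```python
# Toy regnal-year arithmetic for the coregency hypothesis.
# Assumption: a king's first regnal year is his accession year, and a
# Tishri-based regnal year beginning in Julian year Y BCE spans Y/Y-1.

def regnal_year_bce(accession_bce: int, n: int) -> int:
    """Julian year BCE in which the n-th regnal year begins."""
    return accession_bce - (n - 1)

# Coregency with Ahaz from 729 BCE: the 6th year (2 Kings 18:10, fall
# of Samaria) begins in 724, i.e. the span 724/723 BCE, overlapping
# Thiele's 723 BCE date for Samaria's fall.
fall_of_samaria = regnal_year_bce(729, 6)

# Sole reign from 715 BCE: the 14th year (2 Kings 18:13, Sennacherib's
# invasion) begins in 702, i.e. 702/701 BCE, overlapping the
# Assyrian-dated invasion of 701 BCE.
invasion = regnal_year_bce(715, 14)

# Hezekiah's illness: a 29-year reign minus the 15 years granted after
# his recovery places the illness in his 14th year.
illness_year = 29 - 15

print(fall_of_samaria, invasion, illness_year)  # 724 702 14
```

Measured from a single accession in 716/715 BCE, by contrast, the 6th year would begin around 710 BCE, nowhere near Samaria's fall; this is the inconsistency the coregency proposal resolves.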
Since Albright and Friedman, several scholars have explained these dating problems on the basis of a coregency between Hezekiah and his father Ahaz between 729 and 716/715 BCE. Assyriologists and Egyptologists recognize that coregency was a practice both in Assyria and Egypt. After noting that coregencies were only used sporadically in the northern kingdom (Israel), Nadav Na'aman writes,
In the kingdom of Judah, on the other hand, the nomination of a co-regent was the common procedure, beginning from David who, before his death, elevated his son Solomon to the throne. When taking into account the permanent nature of the co-regency in Judah from the time of Joash, one may dare to conclude that dating the co-regencies accurately is indeed the key for solving the problems of biblical chronology in the eighth century BC.
Among the numerous scholars who have recognized the coregency between Ahaz and Hezekiah are Kenneth Kitchen in his various writings, Leslie McFall, and Jack Finegan. McFall, in his 1991 article, argues that if 729 BCE (that is, the Judean regnal year beginning in Tishri of 729) is taken as the start of the Ahaz/Hezekiah coregency, and 716/715 BCE as the date of the death of Ahaz, then all the extensive chronological data for Hezekiah and his contemporaries in the late eighth century BCE are in harmony. Further, McFall found that no textual emendations are required among the numerous dates, reign lengths, and synchronisms given in the Hebrew Bible for this period. In contrast, those who do not accept the ancient Near Eastern principle of coregencies require multiple emendations of the Scriptural text, and there is no general agreement on which texts should be emended, nor any consensus among these scholars on the resultant chronology for the eighth century BCE. Those who accept the biblical and Near Eastern practice of coregencies, by contrast, generally agree that Hezekiah was installed as coregent with his father Ahaz in 729 BCE, that the synchronisms of 2 Kings 18 must be measured from that date, and that the synchronisms to Sennacherib are measured from the sole reign starting in 716/715 BCE. The two synchronisms to Hoshea of Israel in 2 Kings 18 are then in exact agreement with the dates of Hoshea's reign that can be determined from Assyrian sources, as is the date of Samaria's fall as stated in 2 Kings 18:10.
An analogous situation of two ways of measurement, both equally valid, is encountered in the dates given for Jehoram of Israel, whose first year is synchronized to the 18th year of the sole reign of Jehoshaphat of Judah in 2 Kings 3:1 (853/852 BCE), but his reign is also reckoned according to another method as starting in the second year of the coregency of Jehoshaphat and his son Jehoram of Judah (2 Kings 1:17); both methods refer to the same calendrical year.
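The coregency solution's synchronisms can be spot-checked in a few lines (a sketch that uses accession-year counting, treating the accession year as year 0, and plain BCE numbers rather than Tishri-based regnal years):

```python
def regnal_year(event_bce, accession_bce):
    # Accession-year counting: the accession year itself is year 0.
    return accession_bce - event_bce

# Synchronisms of 2 Kings 18 measured from the 729 BCE coregency start:
assert regnal_year(723, 729) == 6    # fall of Samaria in Hezekiah's 6th year
# Synchronisms to Sennacherib measured from the 715 BCE sole reign:
assert regnal_year(701, 715) == 14   # invasion in Hezekiah's 14th year
# Hezekiah's illness: a 29-year reign with 15 years remaining puts it in year 14.
assert 29 - 15 == 14
print("synchronisms consistent")
```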
Scholars who accept the principle of coregencies note that abundant evidence for their use is found in the biblical material itself. The agreement of scholarship built on these principles with both biblical and secular texts was such that the Thiele/McFall chronology was accepted as the best chronology for the kingdom period in Jack Finegan's encyclopedic "Handbook of Biblical Chronology".
Haemophilia
Haemophilia is a mostly inherited genetic disorder that impairs the body's ability to make blood clots, a process needed to stop bleeding. This results in people bleeding for a longer time after an injury, easy bruising, and an increased risk of bleeding inside joints or the brain. Those with a mild case of the disease may have symptoms only after an accident or during surgery. Bleeding into a joint can result in permanent damage while bleeding in the brain can result in long term headaches, seizures, or a decreased level of consciousness.
There are two main types of haemophilia: haemophilia A, which occurs due to low amounts of clotting factor VIII, and haemophilia B, which occurs due to low levels of clotting factor IX. They are typically inherited from one's parents through an X chromosome carrying a nonfunctional gene. Rarely a new mutation may occur during early development or haemophilia may develop later in life due to antibodies forming against a clotting factor. Other types include haemophilia C, which occurs due to low levels of factor XI, and parahaemophilia, which occurs due to low levels of factor V. Acquired haemophilia is associated with cancers, autoimmune disorders, and pregnancy. Diagnosis is by testing the blood for its ability to clot and its levels of clotting factors.
Prevention may occur by removing an egg, fertilizing it, and testing the embryo before transferring it to the uterus. Treatment is by replacing the missing blood clotting factors. This may be done on a regular basis or during bleeding episodes. Replacement may take place at home or in hospital. The clotting factors are made either from human blood or by recombinant methods. Up to 20% of people develop antibodies to the clotting factors which makes treatment more difficult. The medication desmopressin may be used in those with mild haemophilia A. Studies of gene therapy are in early human trials.
Haemophilia A affects about 1 in 5,000–10,000 males at birth, while haemophilia B affects about 1 in 40,000. As haemophilia A and B are both X-linked recessive disorders, females are rarely severely affected. Some females with a nonfunctional gene on one of their X chromosomes may be mildly symptomatic. Haemophilia C occurs equally in both sexes and is mostly found in Ashkenazi Jews. In the 1800s haemophilia B was common within the royal families of Europe. The difference between haemophilia A and B was determined in 1952. The word is from the Greek "haima" (αἷμα), meaning blood, and "philia" (φιλία), meaning love.
Characteristic symptoms vary with severity. In general, symptoms involve internal or external bleeding episodes, which are called "bleeds". People with more severe haemophilia suffer more severe and more frequent bleeds, while people with mild haemophilia usually suffer only minor symptoms except after surgery or serious trauma. In moderate haemophilia, symptoms vary along a spectrum between the severe and mild forms.
In both haemophilia A and B, there is spontaneous bleeding but a normal bleeding time, normal prothrombin time, normal thrombin time, but prolonged partial thromboplastin time. Internal bleeding is common in people with severe haemophilia and some individuals with moderate haemophilia. The most characteristic type of internal bleed is a joint bleed where blood enters into the joint spaces. This is most common with severe haemophiliacs and can occur spontaneously (without evident trauma). If not treated promptly, joint bleeds can lead to permanent joint damage and disfigurement. Bleeding into soft tissues such as muscles and subcutaneous tissues is less severe but can lead to damage and requires treatment.
Children with mild to moderate haemophilia may not have any signs or symptoms at birth, especially if they do not undergo circumcision. Their first symptoms are often frequent and large bruises and haematomas from frequent bumps and falls as they learn to walk. Swelling and bruising from bleeding in the joints, soft tissue, and muscles may also occur. Children with mild haemophilia may not have noticeable symptoms for many years. Often, the first sign in very mild haemophiliacs is heavy bleeding from a dental procedure, an accident, or surgery. Females who are carriers usually have enough clotting factors from their one normal gene to prevent serious bleeding problems, though some may present as mild haemophiliacs.
Severe complications are much more common in cases of severe and moderate haemophilia. Complications may arise from the disease itself or from its treatment:
Haemophilic arthropathy is characterized by chronic proliferative synovitis and cartilage destruction. If an intra-articular bleed is not drained early, it may cause apoptosis of chondrocytes and affect the synthesis of proteoglycans. The hypertrophied and fragile synovial lining, while attempting to clear the excess blood, is prone to rebleed, leading to a vicious cycle of hemarthrosis-synovitis-hemarthrosis. In addition, iron deposition in the synovium may induce an inflammatory response, activating the immune system and stimulating angiogenesis, resulting in cartilage and bone destruction.
Typically, females possess two X-chromosomes, and males have one X and one Y-chromosome. Since the mutations causing the disease are X-linked recessive, a female carrying the defect on one of her X-chromosomes may not be affected by it: despite random X inactivation, the equivalent dominant allele on her other chromosome is expressed in enough cells to produce the necessary clotting factors. Therefore, heterozygous females are usually just carriers of this genetic disposition. The Y-chromosome in the male, however, has no gene for factors VIII or IX. If the genes responsible for production of factor VIII or factor IX present on a male's X-chromosome are deficient, there is no equivalent on the Y-chromosome to compensate, so the deficient gene is not masked and the disorder will develop.
Since a male receives his single X-chromosome from his mother, the son of a healthy female silently carrying the deficient gene has a 50% chance of inheriting that gene from her, and with it the disease; if his mother is herself affected with haemophilia, he will have a 100% chance of being a haemophiliac. In contrast, for a female to inherit the disease, she must receive two deficient X-chromosomes, one from her mother and the other from her father (who must therefore be a haemophiliac himself). Hence, haemophilia is expressed far more commonly among males than females, while double-X females are far more likely to be silent carriers who survive childhood and expose each of their genetic children to an at least 50% risk of receiving the deficient gene. However, it is possible for female carriers to become mild haemophiliacs due to lyonisation (inactivation) of the X-chromosomes. Haemophiliac daughters are more common than they once were, as improved treatments for the disease have allowed more haemophiliac males to survive to adulthood and become parents. Adult females may experience menorrhagia (heavy periods) due to the bleeding tendency. The pattern of inheritance is criss-cross, a pattern also seen in colour blindness.
A mother who is a carrier has a 50% chance of passing the faulty X-chromosome to her daughter, while an affected father will always pass on the affected gene to his daughters. A son cannot inherit the defective gene from his father. Because the trait is recessive, carriers can pass it on without being affected themselves. Genetic testing and genetic counselling are recommended for families with haemophilia. Prenatal testing, such as amniocentesis, is available to pregnant women who may be carriers of the condition.
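The X-linked transmission odds described above can be illustrated with a small Punnett-square enumeration (a hypothetical sketch; 'X' marks the normal allele, 'x' the deficient one):

```python
from itertools import product

def offspring(mother_alleles, father_alleles):
    """Enumerate equally likely child genotypes, one allele from each parent."""
    return [tuple(sorted((m, f))) for m, f in product(mother_alleles, father_alleles)]

children = offspring(('X', 'x'), ('X', 'Y'))   # carrier mother, unaffected father
sons = [c for c in children if 'Y' in c]
daughters = [c for c in children if 'Y' not in c]

print(sum('x' in c for c in sons) / len(sons))            # 0.5: half of sons affected
print(sum('x' in c for c in daughters) / len(daughters))  # 0.5: half of daughters carriers
```

Running the same enumeration with an affected father and non-carrier mother, `offspring(('X', 'X'), ('x', 'Y'))`, shows all daughters as carriers and no affected sons, matching the pattern described in the text.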
As with all genetic disorders, it is also possible to acquire haemophilia spontaneously rather than by inheritance, through a new mutation in one of the parents' gametes. Spontaneous mutations account for about 33% of all cases of haemophilia A and about 30% of cases of haemophilia B.
If a female gives birth to a haemophiliac son, either the female is a carrier for the blood disorder or the haemophilia was the result of a spontaneous mutation. Until modern direct DNA testing, however, it was impossible to determine if a female with only healthy children was a carrier or not. Generally, the more healthy sons she bore, the higher the probability that she was not a carrier.
If a male is afflicted with the disease and has children with a female who is not a carrier, his daughters will be carriers of haemophilia. His sons, however, will not be affected with the disease. The disease is X-linked and the father cannot pass haemophilia through the Y-chromosome. Males with the disorder are then no more likely to pass on the gene to their children than carrier females, though all daughters they sire will be carriers and all sons they father will not have haemophilia (unless the mother is a carrier).
There are numerous different mutations which cause each type of haemophilia. Due to differences in changes to the genes involved, people with haemophilia often have some level of active clotting factor. Individuals with less than 1% active factor are classified as having severe haemophilia, those with 1–5% active factor have moderate haemophilia, and those with mild haemophilia have between 5% and 40% of normal levels of active clotting factor.
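The severity bands given here translate directly into a classification rule (a minimal sketch; the placement of the exact 1% and 5% boundary cases is an assumption, as the text leaves them ambiguous):

```python
def haemophilia_severity(factor_activity_percent):
    """Classify by residual clotting-factor activity (% of normal levels)."""
    if factor_activity_percent < 1:
        return "severe"
    if factor_activity_percent <= 5:
        return "moderate"
    if factor_activity_percent <= 40:
        return "mild"
    return "within normal range"

for level in (0.5, 3, 25):
    print(level, haemophilia_severity(level))
```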
Haemophilia can be diagnosed before, during or after birth if there is a family history of the condition. Several options are available to parents. If there is no family history of haemophilia, it is usually only diagnosed when a child begins to walk or crawl. They may experience joint bleeds or easy bruising.
Mild haemophilia may only be discovered later, usually after an injury or a dental or surgical procedure.
Genetic testing and counselling are available to help determine the risk of passing the condition onto a child. This may involve testing a sample of tissue or blood to look for signs of the genetic mutation that causes haemophilia.
A pregnant woman with a history of haemophilia in her family can test for the haemophilia gene. Such tests include:
There is a small risk of these procedures causing problems such as miscarriage or premature labour, so the woman may discuss this with the doctor in charge of her care.
If haemophilia is suspected after a child has been born, a blood test can usually confirm the diagnosis. Blood from the umbilical cord can be tested at birth if there's a family history of haemophilia. A blood test will also be able to identify whether a child has haemophilia A or B, and how severe it is.
There are several types of haemophilia: haemophilia A, haemophilia B, haemophilia C, "parahaemophilia", "acquired haemophilia A", and "acquired haemophilia B".
Haemophilia A is a recessive X-linked genetic disorder resulting in a deficiency of functional clotting Factor VIII. Haemophilia B is also a recessive X-linked genetic disorder, involving a lack of functional clotting Factor IX. Haemophilia C is an autosomal genetic disorder involving a lack of functional clotting Factor XI. Haemophilia C is not completely recessive, as heterozygous individuals also show increased bleeding.
The type of haemophilia known as "parahaemophilia" is a mild and rare form and is due to a deficiency in factor V. This type can be inherited or acquired.
A non-genetic form of haemophilia is caused by autoantibodies against factor VIII and so is known as "acquired haemophilia A". Acquired haemophilia can be associated with cancers, autoimmune disorders and following childbirth.
There is no long-term cure. Treatment and prevention of bleeding episodes is done primarily by replacing the missing blood clotting factors.
Clotting factors are usually not needed in mild haemophilia. In moderate haemophilia clotting factors are typically only needed when bleeding occurs or to prevent bleeding with certain events. In severe haemophilia preventive use is often recommended two or three times a week and may continue for life. Rapid treatment of bleeding episodes decreases damage to the body.
Factor VIII is used in haemophilia A and factor IX in haemophilia B. Factor replacement can be either isolated from human blood serum, recombinant, or a combination of the two. Some people develop antibodies (inhibitors) against the replacement factors given to them, so the amount of the factor has to be increased or non-human replacement products must be given, such as porcine factor VIII.
If a person becomes refractory to replacement coagulation factor as a result of high levels of circulating inhibitors, this may be partially overcome with recombinant human factor VIII.
In early 2008, the US Food and Drug Administration (FDA) approved an anti-haemophilic factor genetically engineered in Chinese hamster ovary cells. Since 1993, recombinant factor products (which are typically cultured in Chinese hamster ovary (CHO) tissue-culture cells and involve little, if any, human plasma product) have been available and have been widely used in wealthier western countries. While recombinant clotting factor products offer higher purity and safety, they are, like concentrate, extremely expensive, and not generally available in the developing world. In many cases, factor products of any sort are difficult to obtain in developing countries.
Clotting factors are either given preventively or on-demand. Preventive use involves the infusion of clotting factor on a regular schedule in order to keep clotting levels sufficiently high to prevent spontaneous bleeding episodes. On-demand (or episodic) treatment involves treating bleeding episodes once they arise. In 2007, a trial compared on-demand treatment of boys (under 30 months) with haemophilia A against prophylactic treatment (infusions of 25 IU/kg body weight of factor VIII every other day) with respect to the prevention of joint disease. When the boys reached 6 years of age, 93% of those in the prophylaxis group and 55% of those in the episodic-therapy group had a normal index-joint structure on MRI. Preventive treatment, however, resulted in average costs of $300,000 per year. The author of an editorial published in the same issue of the "NEJM" supports the idea that prophylactic treatment is not only more effective than on-demand treatment, but also suggests that starting after the first serious joint-related haemorrhage may be more cost-effective than waiting until a fixed age to begin. Most haemophiliacs in third-world countries have limited or no access to commercial blood clotting factor products.
Desmopressin (DDAVP) may be used in those with mild haemophilia A. Tranexamic acid or epsilon aminocaproic acid may be given along with clotting factors to prevent breakdown of clots.
Pain medicines, steroids, and physical therapy may be used to reduce pain and swelling in an affected joint. In those with severe haemophilia A already receiving FVIII, emicizumab may provide some benefit. Those with an acquired form of haemophilia receive additional treatments beyond the normal clotting factors. Often the most effective treatment is corticosteroids, which eliminate the auto-antibodies in half of affected people. As a second-line treatment, cyclophosphamide and cyclosporine are used and have proven effective for those who did not respond to steroid treatment. In rare cases a third line of treatment is used: high-dose intravenous immunoglobulin or immunoadsorption, which helps control bleeding rather than eliminating the auto-antibodies.
Anticoagulants such as heparin and warfarin are contraindicated for people with haemophilia as these can aggravate clotting difficulties. Also contraindicated are those drugs which have "blood thinning" side effects. For instance, medicines which contain aspirin, ibuprofen, or naproxen sodium should not be taken because they are well known to have the side effect of prolonged bleeding.
Also contraindicated are activities with a high likelihood of trauma, such as motorcycling and skateboarding. Popular sports with very high rates of physical contact and injuries such as American football, hockey, boxing, wrestling, and rugby should be avoided by people with haemophilia. Other active sports like soccer, baseball, and basketball also have a high rate of injuries, but have overall less contact and should be undertaken cautiously and only in consultation with a doctor.
Like most aspects of the disorder, life expectancy varies with severity and adequate treatment. People with severe haemophilia who don't receive adequate, modern treatment have greatly shortened lifespans and often do not reach maturity. Prior to the 1960s when effective treatment became available, average life expectancy was only 11 years. By the 1980s the life span of the average haemophiliac receiving appropriate treatment was 50–60 years. Today with appropriate treatment, males with haemophilia typically have a near normal quality of life with an average lifespan approximately 10 years shorter than an unaffected male.
Since the 1980s the leading cause of death of people with severe haemophilia has shifted from haemorrhage to HIV/AIDS acquired through treatment with contaminated blood products. The second leading cause of death related to severe haemophilia complications is intracranial haemorrhage, which today accounts for one third of all deaths of people with haemophilia. Two other major causes of death include hepatitis infections causing cirrhosis, and obstruction of air or blood flow due to soft tissue haemorrhage.
Haemophilia is rare, with only about 1 instance in every 10,000 births (or 1 in 5,000 male births) for haemophilia A and 1 in 50,000 births for haemophilia B. About 18,000 people in the United States have haemophilia. Each year in the US, about 400 babies are born with the disorder. Haemophilia usually occurs in males and less often in females. It is estimated that about 2,500 Canadians have haemophilia A, and about 500 Canadians have haemophilia B.
The excessive bleeding was known to ancient people. The Talmud instructs that a boy must not be circumcised if he had two brothers who died due to complications arising from their circumcisions, and Maimonides says that this excluded paternal half-brothers. This may have been due to a concern about haemophilia. The first medical professional to describe the disease was the Arab surgeon Al-Zahrawi, also known as Abulcasis. In the tenth century he described families whose males died of bleeding after only minor traumas. While many other such descriptive and practical references to the disease appear throughout historical writings, scientific analysis did not begin until the start of the nineteenth century.
In 1803, John Conrad Otto, a Philadelphia physician, wrote an account of "a hemorrhagic disposition existing in certain families", in which he called the affected males "bleeders". He recognised that the disorder was hereditary, that it affected mostly males, and that it was passed down by healthy females. His paper was the second to describe important characteristics of an X-linked genetic disorder (the first being John Dalton's description of colour blindness, based on a study of his own family). Otto was able to trace the disease back to a woman who settled near Plymouth, New Hampshire, in 1720. The idea that affected males could pass the trait on to their unaffected daughters was not described until 1813, when John F. Hay published an account in The New England Journal of Medicine.
In 1924, a Finnish doctor discovered a hereditary bleeding disorder similar to haemophilia localised in the Åland Islands, southwest of Finland. This bleeding disorder is called "Von Willebrand Disease".
The term "haemophilia" is derived from the term "haemorrhaphilia" which was used in a description of the condition written by Friedrich Hopff in 1828, while he was a student at the University of Zurich. In 1937, Patek and Taylor, two doctors from Harvard, discovered anti-haemophilic globulin. In 1947, Pavlosky, a doctor from Buenos Aires, found haemophilia A and haemophilia B to be separate diseases by doing a lab test. This test was done by transferring the blood of one haemophiliac to another haemophiliac. The fact that this corrected the clotting problem showed that there was more than one form of haemophilia.
Haemophilia has featured prominently in European royalty and thus is sometimes known as 'the royal disease'. Queen Victoria passed the mutation for haemophilia B to her son Leopold and, through two of her daughters, Alice and Beatrice, to various royals across the continent, including the royal families of Spain, Germany, and Russia. In Russia, Tsarevich Alexei, the son and heir of Tsar Nicholas II, famously suffered from haemophilia, which he had inherited from his mother, Empress Alexandra, one of Queen Victoria's granddaughters. The haemophilia of Alexei would result in the rise to prominence of the Russian mystic Grigori Rasputin, at the imperial court.
It was claimed that Rasputin was successful at treating Tsarevich Alexei's haemophilia. At the time, a common treatment administered by professional doctors was to use aspirin, which worsened rather than lessened the problem. It is believed that, by simply advising against the medical treatment, Rasputin could bring visible and significant improvement to the condition of Tsarevich Alexei.
In Spain, Queen Victoria's youngest daughter, Princess Beatrice, had a daughter, Victoria Eugenie of Battenberg, who later became Queen of Spain. Two of her sons were haemophiliacs and both died after minor car accidents. Her eldest son, Prince Alfonso of Spain, Prince of Asturias, died at the age of 31 from internal bleeding after his car hit a telephone booth. Her youngest son, Infante Gonzalo, died at age 19 from abdominal bleeding following a minor car accident in which he and his sister hit a wall while avoiding a cyclist. Neither appeared injured nor sought immediate medical care, and Gonzalo died two days later from internal bleeding.
The method for the production of an antihaemophilic factor was discovered by Judith Graham Pool from Stanford University in 1964, and approved for commercial use in 1971 in the United States under the name "Cryoprecipitated AHF". Together with the development of a system for transportation and storage of human plasma in 1965, this was the first time an efficient treatment for haemophilia became available.
Until late 1985 many people with haemophilia received clotting factor products that posed a risk of HIV and hepatitis C infection. The plasma used to create these products was not screened or tested, nor had most of the products been subjected to any form of viral inactivation.
Tens of thousands worldwide were infected as a result of contaminated factor products including more than 10,000 people in the United States, 3,500 British, 1,400 Japanese, 700 Canadians, 250 Irish, and 115 Iraqis.
Infection via the tainted factor products had mostly stopped by 1986 by which time viral inactivation methods had largely been put into place, although some products were shown to still be dangerous in 1987.
In those with severe haemophilia, gene therapy may reduce symptoms to those that a mild or moderate person with haemophilia might have. The best results have been found in haemophilia B. In 2016 early stage human research was ongoing with a few sites recruiting participants. In 2017 a gene therapy trial on nine people with haemophilia A reported that high doses did better than low doses. It is not currently an accepted treatment for haemophilia.
Hemicellulose
A hemicellulose (also known as polyose) is one of a number of heteropolymers (matrix polysaccharides), such as arabinoxylans, present along with cellulose in almost all terrestrial plant cell walls. While cellulose is crystalline, strong, and resistant to hydrolysis, hemicelluloses have a random, amorphous structure with little strength. They are easily hydrolyzed by dilute acid or base, as well as by a myriad of hemicellulase enzymes.
Diverse kinds of hemicelluloses are known. Important examples include xylan, glucuronoxylan, arabinoxylan, glucomannan, and xyloglucan.
Hemicelluloses are polysaccharides often associated with cellulose, but cellulose and hemicellulose have distinct compositions and structures. Diverse sugars comprise hemicellulose, whereas cellulose is derived exclusively from glucose. For instance, besides glucose, sugar monomers in hemicelluloses can include the five-carbon sugars xylose and arabinose, the six-carbon sugars mannose and galactose, and the six-carbon deoxy sugar rhamnose. Hemicelluloses contain most of the D-pentose sugars, and occasionally small amounts of L-sugars as well. Xylose is in most cases the sugar monomer present in the largest amount, although in softwoods mannose can be the most abundant sugar. Not only regular sugars can be found in hemicellulose, but also their acidified form, for instance glucuronic acid and galacturonic acid can be present.
Unlike cellulose, hemicelluloses consist of shorter chains of 500–3,000 sugar units, whereas each cellulose polymer comprises 7,000–15,000 glucose molecules. In addition, hemicelluloses may be branched polymers, while cellulose is unbranched. Hemicelluloses are embedded in the cell walls of plants, sometimes in chains that form a 'ground'; together with pectin they bind to cellulose, forming a network of cross-linked fibres.
Based on structural differences, such as backbone linkages and side groups, as well as other factors, such as abundance and distribution in plants, hemicelluloses can be categorized into four groups: 1) xylans; 2) mannans; 3) mixed-linkage β-glucans; 4) xyloglucans.
Xylans
Xylans usually consist of a backbone of β-(1→4)-linked xylose residues, and can be further divided into homoxylans and heteroxylans. Homoxylans have a backbone of D-xylopyranose residues linked by β(1→3) or mixed β(1→3, 1→4)-glycosidic linkages, and mainly serve structural functions. Heteroxylans, such as glucuronoxylans, glucuronoarabinoxylans, and complex heteroxylans, have a backbone of D-xylopyranose with short carbohydrate branches. For example, glucuronoxylan is substituted with α-(1→2)-linked glucuronosyl and 4-O-methyl glucuronosyl residues, while arabinoxylans and glucuronoarabinoxylans contain arabinose residues attached to the backbone.
Mannans
Mannan-type hemicelluloses can be classified into two types based on their main chains: galactomannans and glucomannans. Galactomannans have only β-(1→4)-linked D-mannopyranose residues in linear chains. Glucomannans consist of both β-(1→4)-linked D-mannopyranose and β-(1→4)-linked D-glucopyranose residues in the main chains. As for the side chains, single D-galactopyranose residues tend to be 6-linked to both types in varying amounts.
Mixed linkage β-glucans
The conformation of mixed-linkage glucan chains usually contains blocks of β-(1→4) D-glucopyranose separated by single β-(1→3) D-glucopyranose residues; the proportions of β-(1→4) and β-(1→3) linkages are about 70% and 30%, respectively. These glucans primarily consist of cellotriosyl (C18H32O16) and cellotetraosyl (C24H42O21) segments in random order. Some studies report molar ratios of cellotriosyl to cellotetraosyl segments of 2.1–2.4 for oat, 2.8–3.3 for barley, and 4.2–4.5 for wheat.
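The quoted ~70/30 linkage split follows from the segment ratios (a sketch assuming each cellotriosyl unit contributes two β-(1→4) bonds, each cellotetraosyl unit three, and one β-(1→3) bond joins each segment):

```python
def beta14_fraction(tri_to_tetra_ratio):
    """Fraction of beta-(1->4) linkages for a given cellotriosyl:cellotetraosyl ratio."""
    r = tri_to_tetra_ratio
    beta14 = 2 * r + 3    # beta-(1->4) bonds in r triosyl + 1 tetraosyl segments
    beta13 = r + 1        # one beta-(1->3) bond per segment
    return beta14 / (beta14 + beta13)

# Mid-range ratios for each grain all land near the ~70% figure quoted above:
for grain, ratio in [("oat", 2.25), ("barley", 3.0), ("wheat", 4.35)]:
    print(grain, round(beta14_fraction(ratio), 2))
```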
Xyloglucans
Xyloglucans have a backbone similar to cellulose, with α-D-xylopyranose residues attached at position 6. To describe the different side chains, a single-letter code is used for each side-chain type: G, an unbranched Glc residue; X, α-d-Xyl-(1→6)-Glc; L, β-Gal; S, α-l-Araf; F, α-l-Fuc. These are the most common side chains.
The two most common types of xyloglucans in plant cell walls are identified as XXXG and XXGG.
Hemicelluloses are synthesised from sugar nucleotides in the cell's Golgi apparatus. Two models explain their synthesis: 1) a '2 component model' where modification occurs at two transmembrane proteins, and 2) a '1 component model' where modification occurs only at one transmembrane protein. After synthesis, hemicelluloses are transported to the plasma membrane via Golgi vesicles.
Each kind of hemicellulose is biosynthesized by specialized enzymes.
Mannan chain backbones are synthesized by cellulose synthase-like protein family A (CSLA) and possibly enzymes in cellulose synthase-like protein family D (CSLD). Mannan synthase, a particular enzyme in CSLA, is responsible for the addition of mannose units to the backbone. The galactose side-chains of some mannans are added by galactomannan galactosyltransferase. Acetylation of mannans is mediated by a mannan O-acetyltransferase, however, this enzyme has not been definitively identified.
Xyloglucan backbone synthesis is mediated by cellulose synthase-like protein family C (CSLC), particularly glucan synthase, which adds glucose units to the chain. Backbone synthesis of xyloglucan is also mediated in some way by xylosyltransferase, but this mechanism is separate from its transferase function and remains unclear. Xylosyltransferase in its transferase function is, however, utilized for the addition of xylose to the side-chain. Other enzymes utilized for side-chain synthesis of xyloglucan include galactosyltransferase (which is responsible for the addition of galactose and of which two different forms are utilized), fucosyltransferase (which is responsible for the addition of fucose), and acetyltransferase (which is responsible for acetylation).
Xylan backbone synthesis, unlike that of the other hemicelluloses, is not mediated by any cellulose synthase-like proteins. Instead, xylan synthase is responsible for backbone synthesis, facilitating the addition of xylose. Several genes for xylan synthases have been identified. Several other enzymes are utilized for the addition and modification of the side-chain units of xylan, including glucuronosyltransferase (which adds glucuronic acid units), xylosyltransferase (which adds additional xylose units), arabinosyltransferase (which adds arabinose), methyltransferase (responsible for methylation), and acetyltransferase (responsible for acetylation).

Given that mixed-linkage glucan is a non-branched homopolymer of glucose, there is no side-chain synthesis, only the addition of glucose to the backbone in two linkages, β-(1→3) and β-(1→4). Backbone synthesis is mediated by enzymes in cellulose synthase-like protein families F and H (CSLF and CSLH), specifically glucan synthase. Several forms of glucan synthase from CSLF and CSLH have been identified. All of them are responsible for the addition of glucose to the backbone, and all are capable of producing both β-(1→3) and β-(1→4) linkages; however, it is unknown how much each specific enzyme contributes to the distribution of the two linkages.
In the sulfite pulping process, the hemicellulose is largely hydrolyzed by the acid pulping liquor and ends up in the brown liquor, where the fermentable hexose sugars (around 2%) can be used to produce ethanol. This process was primarily applied to calcium sulfite brown liquors.
Arabinogalactans can be used as emulsifiers, stabilizers, and binders under the Federal Food, Drug, and Cosmetic Act. They can also be used as bonding agents in products such as sweeteners.
The films based on xylan show low oxygen permeability and thus are of potential interest as packaging for oxygen-sensitive products.
Agar is used in making jellies and puddings. Combined with other nutrients, it also serves as a growth medium for microorganisms.
Curdlan can be used as a fat replacer to produce diet foods that retain the taste and mouthfeel of their full-fat counterparts.
β-Glucans play an important role as food supplements and are also promising in health-related applications, especially immune reactions and the treatment of cancer.
Xanthan, together with other polysaccharides, can form gels of high viscosity, which are used in the oil industry to thicken drilling mud. In the food industry, xanthan is used in products such as dressings and sauces.
Alginate plays an important role in the development of antimicrobial textiles owing to its environmental friendliness and its high level of industrialization as a sustainable biopolymer.
Hemicellulose in Plant Cells
There are many ways to obtain hemicellulose, all of which rely on extraction from hardwood or softwood milled into smaller samples. In hardwoods the main hemicellulose extract is glucuronoxylan (acetylated xylan), while galactoglucomannan is found in softwoods. Prior to extraction, the wood typically must be milled into chips of various sizes, depending on the reactor used. A hot water extraction process, also known as autohydrolysis or hydrothermal treatment, is then applied; acids or bases may be added to change the yield and properties substantially. The main advantage of hot water extraction is that water is the only chemical required, making the method environmentally friendly and cheap.
The goal of the hot water treatment is to remove as much hemicellulose from the wood as possible. This is done through hydrolysis of the hemicellulose into smaller oligomers and the monosaccharide xylose; when dehydrated, xylose becomes furfural. When xylose and furfural are the desired products, acid catalysts such as formic acid are added to accelerate the conversion of polysaccharides to monosaccharides. Such a catalyst has also been shown to exert a solvent effect that aids the reaction.
One pretreatment method is to soak the wood in diluted acids (at concentrations around 4%), which hydrolyzes the hemicellulose into monosaccharides. Pretreatment with bases (for instance, sodium or potassium hydroxide) instead destroys the structure of the inherent lignin, changing it from crystalline to amorphous. Another option is hydrothermal pretreatment, which offers several advantages: no toxic or corrosive solvents are needed, no special reactors are required, and there are no extra costs for disposing of hazardous chemicals.
The hot water extraction process is carried out in batch reactors, semi-continuous reactors, or continuous slurry reactors. For batch and semi-continuous reactors, wood samples can be used in forms such as chips or pellets, while a slurry reactor requires particles as small as 200 to 300 micrometers. As the particle size decreases, the yield decreases as well, which is attributed to the increasing proportion of cellulose.
The hot water process is operated at a temperature range of 160 to 240 degrees Celsius in order to maintain the liquid phase. Operating above the normal boiling point of water increases the solubilization of the hemicellulose and the depolymerization of the polysaccharides. The process can take from several minutes to several hours, depending on the temperature and pH of the system. Higher temperatures paired with longer extraction times lead to higher yields. A maximum yield is obtained at a pH of 3.5; below this, the extraction yield decreases exponentially. To control the pH, sodium bicarbonate is generally added; it inhibits the autohydrolysis of acetyl groups as well as the cleavage of glycosidic bonds. Depending on the temperature and time, the hemicellulose can be further converted into oligomers, monomers, and lignin.
Hillbilly
"Hillbilly" is a term (often derogatory) for people who dwell in rural, mountainous areas in the United States, primarily in southern Appalachia and the Ozarks. The term was later used to refer to people from other rural and mountainous areas west of the Mississippi river too, particularly those of the Rocky Mountains and near the Rio Grande.
The first known instances of "hillbilly" in print were in "The Railroad Trainmen's Journal" (vol. ix, July 1892), an 1899 photograph of men and women in West Virginia labeled "Camp Hillbilly", and a 1900 "New York Journal" article containing the definition: "a Hill-Billie is a free and untrammeled white citizen of Alabama, who lives in the hills, has no means to speak of, dresses as he can, talks as he pleases, drinks whiskey when he gets it, and fires off his revolver as the fancy takes him". The stereotype is twofold in that it incorporates both positive and negative traits: "Hillbillies" are often considered independent and self-reliant individuals who resist the modernization of society, but at the same time they are also defined as backward and violent. Scholars argue this duality is reflective of the split ethnic identities in white America. The term's later usage extended beyond solely white communities, exemplified with the "Hispanic hillbillies of northern New Mexico," in reference to the Hispanos of New Mexico.
The Appalachian Mountains were settled in the 18th century by settlers primarily from England, lowland Scotland, and the province of Ulster in Ireland. The settlers from Ulster were mainly Protestants who migrated to Ireland, during the Plantation of Ulster in the 17th century, from Scotland and Northern England. Many further migrated to the American colonies beginning in the 1730s, and in America became known as the Scots-Irish.
Scholars argue that the term "hillbilly" originated from Scottish dialect. The term "hill-folk" referred to people who preferred isolation from the greater society, and "billy" meant "comrade" or "companion". It is suggested that "hill-folk" and "billie" were combined when the Cameronians fled to the hills of southern Scotland. There is also the belief that most of the settlers from Scotland and northern Ireland were followers of King William of Orange; 'Billy' is a diminutive of 'William' common across the British Isles. For settlers in the American hills who were Williamites, the term "hillbilly" thus connected both living in the hills and supporting King William of Orange's ideologies. In 17th-century Ireland, during the Williamite War, Protestant supporters of King William III ("King Billy") were often referred to as "Billy's Boys". However, some scholars disagree with this theory. Michael Montgomery's "From Ulster to America: The Scotch-Irish Heritage of American English" states, "In Ulster in recent years it has sometimes been supposed that it was coined to refer to followers of King William III and brought to America by early Ulster emigrants, but this derivation is almost certainly incorrect. ... In America "hillbilly" was first attested only in 1898, which suggests a later, independent development."
The term "hillbilly" spread in the years following the American Civil War. At this time, the country was developing both technologically and socially, but the Appalachian region was falling behind. Before the war, Appalachia was not distinctively different from other rural areas of the country. Post-war, although the frontier pushed farther west, the region maintained frontier characteristics. Appalachians themselves were perceived as backward, quick to violence and inbred in their isolation. Fueled by news stories of mountain feuds such as that in the 1880s between the Hatfields and McCoys, the hillbilly stereotype developed in the late 19th to early 20th century.
The "classic" hillbilly stereotype reached its current characterization during the years of the Great Depression. The period of Appalachian out-migration, roughly from the 1930s through the 1950s, saw many mountain residents moving north to the Midwestern industrial cities of Chicago, Cleveland, Akron, and Detroit.
This movement to Northern society, which became known as the "Hillbilly Highway", brought these previously isolated communities into mainstream United States culture. In response, poor white mountaineers became central characters in newspapers, pamphlets, and eventually, motion pictures. Authors at the time were inspired by historical figures such as Davy Crockett and Daniel Boone. The mountaineer image transferred over to the 20th century where the "hillbilly" stereotype emerged.
Pop culture has perpetuated the "hillbilly" stereotype. Scholarly works suggest that the media has exploited both the Appalachian region and people by classifying them as "hillbillies". These generalizations do not match the cultural experiences of Appalachians. Appalachians, like many other groups, do not subscribe to a single identity. One of the issues associated with stereotyping is that it is profitable. When "hillbilly" became a widely used term, entrepreneurs saw a window for potential revenue. They "recycled" the image and brought it to life through various forms of media.
The comics portrayed hillbilly stereotypes, notably in two strips, "Li'l Abner" and "Snuffy Smith". Both characters were introduced in 1934. Television and film have portrayed "hillbillies" in both derogatory and sympathetic terms. Films such as "Sergeant York" or the Ma and Pa Kettle series portrayed the "hillbilly" as wild but good-natured. Television programs of the 1960s such as "The Real McCoys", "The Andy Griffith Show", and especially "The Beverly Hillbillies", portrayed the "hillbilly" as backwards but with enough wisdom to outwit more sophisticated city folk. "Gunsmoke"'s Festus Haggen was portrayed as intelligent and quick-witted (but lacking "education"). The popular 1970s television variety show "Hee Haw" regularly lampooned the stereotypical "hillbilly" lifestyle. A darker image of the hillbilly was introduced to another generation in the film "Deliverance" (1972), based on a novel of the same name by James Dickey, which depicted some "hillbillies" as genetically deficient, inbred, and murderous.
"Hillbillies" were at the center of reality television in the 21st century. Network television shows such as "New Beverly Hillbillies", "High Life", and "The Simple Life" displayed the "hillbilly" lifestyle for viewers in the United States. This sparked protests across the country with rural-minded individuals gathering to fight the stereotype. The Center for Rural Strategies started a nationwide campaign stating the stereotype was "politically incorrect". The Kentucky-based organization engaged political figures in the movement such as Robert Byrd and Mike Huckabee. Both protestors argued that the discrimination of any other group in United States would not be tolerated, so neither should the discrimination against rural U.S. citizens. A 2003 piece published by "The Cincinnati Enquirer" read, "In this day of hypersensitivity to diversity and political correctness, Appalachians have been a group that it is still socially acceptable to demean and joke about. ... But rural folks have spoken up and said 'enough' to the Hollywood mockers."
"" (2016) is a memoir by J. D. Vance about the Appalachian values of his upbringing and their relationship to the social problems of his hometown, Middletown, Ohio. The book topped "The New York Times" Best Seller list in August 2016.
A family of "Hill People", who are employed as migrant workers on a farm in 1952 Arkansas, have a major role in John Grisham's book "A Painted House", with Grisham trying to avoid stereotypes.
"Hillbilly music" was at one time considered an acceptable label for what is now known as country music. The label, coined in 1925 by country pianist Al Hopkins, persisted until the 1950s.
The "hillbilly music" categorization covers a wide variety of musical genres including bluegrass, country, western, and gospel. Appalachian folk song existed long before the "hillbilly" label. When the commercial industry was combined with "traditional Appalachian folksong", "hillbilly music" was formed. Some argue this is a "High Culture" issue where sophisticated individuals may see something considered "unsophisticated" as "trash".
In the early-20th century, artists began to utilize the "hillbilly" label. The term gained momentum due to Ralph Peer, the recording director of OKeh Records, who heard it being used among Southerners when he went down to Virginia to record the music and thereafter applied the label to all Southern country music. The York Brothers entitled one of their songs "Hillbilly Rose" and the Delmore Brothers followed with their song "Hillbilly Boogie". In 1927, the Gennett studios in Richmond, Indiana, made a recording of black fiddler Jim Booker. The recordings were labeled "made for Hillbilly" in the Gennett files and were marketed to a white audience. Columbia Records had much success with the "Hill Billies" featuring Al Hopkins and Fiddlin' Charlie Bowman.
By the late-1940s, radio stations started to use the "hillbilly music" label. Originally, "hillbilly" was used to describe fiddlers and string bands, but now it was used to describe traditional Appalachian music. Appalachians had never used this term to describe their own music. Popular songs whose style bore characteristics of both hillbilly and African American music were referred to as "hillbilly boogie" and "rockabilly". Elvis Presley was a prominent player of rockabilly and was known early in his career as the "Hillbilly Cat".
When the Country Music Association was founded in 1958, the term "hillbilly music" gradually fell out of use. The music industry merged hillbilly music, Western swing, and Cowboy music to form the current category of C&W (Country and Western).
Some artists (notably Hank Williams) and fans were offended by the "hillbilly music" label. While the term is not used as frequently today, it is still used on occasion to refer to old-time music or bluegrass. For example, WHRB broadcasts a popular weekly radio show entitled "Hillbilly at Harvard". The show is devoted to playing a mix of old-time music, bluegrass, and traditional country and western.
The hillbilly stereotype is considered to have had a traumatizing effect on some in the Appalachian region. Feelings of shame, self-hatred, and detachment are cited as a result of "culturally transmitted traumatic stress syndrome". Appalachian scholars say that the large-scale stereotyping has rewritten Appalachian history, making Appalachians feel particularly vulnerable. "Hillbilly" has now become part of Appalachian identity and some Appalachians feel they are constantly defending themselves against this image.
The stereotyping also has political implications for the region. There is a sense of "perceived history" that prevents many political issues from receiving adequate attention. Appalachians are often blamed for economic struggles. "Moonshiners, welfare cheats, and coal miners" are stereotypes stemming from the greater hillbilly stereotype in the region. This prejudice has been said to serve as a barrier for addressing some serious issues such as the economy and the environment.
Despite the political and social difficulties associated with stereotyping, Appalachians have organized to enact change. The War on Poverty is sometimes considered to be an example of one effort that allowed for Appalachian community organization. Grassroots movements, protests, and strikes are common in the area, though not always successful.
The Springfield, Missouri Chamber of Commerce once presented dignitaries visiting the city with an "Ozark Hillbilly Medallion" and a certificate proclaiming the honoree a "hillbilly of the Ozarks". On June 7, 1952, President Harry S. Truman received the medallion after a breakfast speech at the Shrine Mosque for the 35th Division Association. Other recipients included US Army generals Omar Bradley and Matthew Ridgway, J. C. Penney, Johnny Olson, and Ralph Story.
Hillbilly Days is an annual festival held in mid-April in Pikeville, Kentucky celebrating the best of Appalachian culture. The event was begun by local Shriners as a fundraiser to support the Shriners Children's Hospital. It has grown since its beginning in 1976 and now is the second largest festival held in the state of Kentucky. Artists and craftspeople showcase their talents and sell their works on display. Nationally renowned musicians as well as the best of the regional mountain musicians share six different stages located throughout the downtown area of Pikeville. Aspiring hillbillies from across the nation compete to come up with the wildest hillbilly outfit. The event has earned its name as the Mardi Gras of the Mountains. Fans of "mountain music" come from around the United States to hear this annual concentrated gathering of talent. Some refer to this event as the equivalent of a "Woodstock" for mountain music.
The term "Hillbilly" is used with pride by a number of people within the region as well as famous persons, such as singer Dolly Parton, chef Sean Brock, and actress Minnie Pearl. Positive self-identification with the term generally includes identification with a set of "hillbilly values" including love and respect for nature, strong work ethic, generosity toward neighbors and those in need, family ties, self-reliance, resiliency, and a simple lifestyle.
Hernán Cortés
Hernán Cortés de Monroy y Pizarro Altamirano, 1st Marquess of the Valley of Oaxaca (1485 – December 2, 1547) was a Spanish "Conquistador" who led an expedition that caused the fall of the Aztec Empire and brought large portions of what is now mainland Mexico under the rule of the King of Castile in the early 16th century. Cortés was part of the generation of Spanish explorers and conquistadors who began the first phase of the Spanish colonization of the Americas.
Born in Medellín, Spain, to a family of lesser nobility, Cortés chose to pursue adventure and riches in the New World. He went to Hispaniola and later to Cuba, where he received an "encomienda" (the right to the labor of certain subjects). For a short time, he served as "alcalde" (magistrate) of the second Spanish town founded on the island. In 1519, he was elected captain of the third expedition to the mainland, which he partly funded. His enmity with the Governor of Cuba, Diego Velázquez de Cuéllar, resulted in the recall of the expedition at the last moment, an order which Cortés ignored.
Arriving on the continent, Cortés executed a successful strategy of allying with some indigenous people against others. He also used a native woman, Doña Marina, as an interpreter. She later bore his first son. When the Governor of Cuba sent emissaries to arrest Cortés, he fought them and won, using the extra troops as reinforcements. Cortés wrote letters directly to the king asking to be acknowledged for his successes instead of being punished for mutiny. After he overthrew the Aztec Empire, Cortés was awarded the title of "Marqués del Valle de Oaxaca", while the more prestigious title of Viceroy was given to a high-ranking nobleman, Antonio de Mendoza. In 1541 Cortés returned to Spain, where he died six years later of natural causes but embittered.
Because of the controversial undertakings of Cortés and the scarcity of reliable sources of information about him, it is difficult to describe his personality or motivations. Early lionizing of the conquistadores did not encourage deep examination of Cortés. Modern reconsideration has done little to enlarge understanding regarding him. As a result of these historical trends, descriptions of Cortés tend to be simplistic, and either damning or idealizing.
Cortés himself used the form "Hernando" or "Fernando" for his first name, as seen in his signature and the title of an early portrait. William Hickling Prescott's "Conquest of Mexico" (1843) also refers to him as Hernando Cortés. At some point writers began using the shortened form of "Hernán" more generally.
Cortés was born in 1485 in the town of Medellín, then a village in the Kingdom of Castile, now a municipality of the modern-day province of Badajoz in Extremadura, Spain. His father, Martín Cortés de Monroy, born in 1449 to Rodrigo or Ruy Fernández de Monroy and his wife María Cortés, was an infantry captain of distinguished ancestry but slender means. Hernán's mother was Catalína Pizarro Altamirano.
Through his mother, Hernán was second cousin once removed of Francisco Pizarro, who later conquered the Inca Empire of modern-day Peru, and not to be confused with another Francisco Pizarro, who joined Cortés to conquer the Aztecs. (His maternal grandmother, Leonor Sánchez Pizarro Altamirano, was first cousin of Pizarro's father Gonzalo Pizarro y Rodriguez.) Through his father, Hernán was related to Nicolás de Ovando, the third Governor of Hispaniola. His paternal great-grandfather was Rodrigo de Monroy y Almaraz, 5th Lord of Monroy.
According to his biographer, chaplain, and friend Francisco López de Gómara, Cortés was pale and sickly as a child. At the age of 14, he was sent to study Latin under an uncle in Salamanca. Modern historians have misconstrued this personal tutoring as time enrolled at the University of Salamanca.
After two years, Cortés returned home to Medellín, much to the irritation of his parents, who had hoped to see him equipped for a profitable legal career. However, those two years in Salamanca, plus his long period of training and experience as a notary, first in Valladolid and later in Hispaniola, gave him knowledge of the legal codes of Castile that he applied to help justify his unauthorized conquest of Mexico.
At this point in his life, Cortés was described by Gómara as ruthless, haughty, and mischievous. The 16-year-old youth had returned home only to feel constrained by life in his small provincial town. By this time, news of the exciting discoveries of Christopher Columbus in the New World was streaming back to Spain.
Plans were made for Cortés to sail to the Americas with a family acquaintance and distant relative, Nicolás de Ovando, the newly appointed Governor of Hispaniola. (This island is now divided between Haiti and the Dominican Republic). Cortés suffered an injury and was prevented from traveling. He spent the next year wandering the country, probably spending most of his time in Spain's southern ports of Cadiz, Palos, Sanlucar, and Seville. He finally left for Hispaniola in 1504 and became a colonist.
Cortés reached Hispaniola in a ship commanded by Alonso Quintero, who tried to deceive his superiors and reach the New World before them in order to secure personal advantages. Quintero's mutinous conduct may have served as a model for Cortés in his subsequent career. The history of the conquistadores is rife with accounts of rivalry, jockeying for positions, mutiny, and betrayal.
Upon his arrival in 1504 in Santo Domingo, the capital of Hispaniola, the 18-year-old Cortés registered as a citizen; this entitled him to a building plot and land to farm. Soon afterward, Governor Nicolás de Ovando granted him an "encomienda" and appointed him as a notary of the town of Azua de Compostela. His next five years seemed to help establish him in the colony; in 1506, Cortés took part in the conquest of Hispaniola and Cuba. The expedition leader awarded him a large estate of land and Indian slaves for his efforts.
In 1511, Cortés accompanied Diego Velázquez de Cuéllar, an aide of the Governor of Hispaniola, in his expedition to conquer Cuba. Velázquez was appointed Governor of Cuba. At the age of 26, Cortés was made clerk to the treasurer with the responsibility of ensuring that the Crown received the "quinto", or customary one fifth of the profits from the expedition.
Velázquez was so impressed with Cortés that he secured a high political position for him in the colony. He became secretary for Governor Velázquez. Cortés was twice appointed municipal magistrate ("alcalde") of Santiago. In Cuba, Cortés became a man of substance with an "encomienda" to provide Indian labor for his mines and cattle. This new position of power also made him the new source of leadership, which opposing forces in the colony could then turn to. In 1514, Cortés led a group which demanded that more Indians be assigned to the settlers.
As time went on, relations between Cortés and Governor Velázquez became strained. This began once news reached Velázquez that Juan de Grijalva had established a colony on the mainland where there was a bonanza of silver and gold, and Velázquez decided to send him help. Cortés was appointed Captain-General of this new expedition in October 1518, but was advised to move fast before Velázquez changed his mind.
With his experience as an administrator, knowledge gained from many failed expeditions, and impeccable rhetoric, Cortés was able to gather six ships and 300 men within a month. Velázquez's jealousy exploded and he decided to put the expedition in other hands. However, Cortés quickly gathered more men and ships in other Cuban ports.
Cortés also found time to become romantically involved with Catalina Xuárez (or Juárez), the sister-in-law of Governor Velázquez. Part of Velázquez's displeasure seems to have been based on a belief that Cortés was trifling with Catalina's affections. Cortés was temporarily distracted by one of Catalina's sisters but finally married Catalina, reluctantly, under pressure from Governor Velázquez. However, by doing so, he hoped to secure the good will of both her family and that of Velázquez.
It was not until he had been almost 15 years in the Indies that Cortés began to look beyond his substantial status as mayor of the capital of Cuba and as a man of affairs in the thriving colony. He missed the first two expeditions, under the orders of Francisco Hernández de Córdoba and then Juan de Grijalva, sent by Diego Velázquez to Mexico in 1518.
There are differing perceptions about what happened to Hernán Cortés's ships. Some think that he burned the vessels, and others believe he beached them. The notion that he burned his ships did not become accepted until 250 years later. Bernal Díaz del Castillo, while attending an expedition with Cortés, gives reason to believe that Cortés ran them ashore. In a letter to King Charles, Cortés states that his ships were incapable of sailing, telling his men the reason was shipworm. After the town of Vera Cruz was established, five Aztec emissaries arrived, making Cortés eager to visit Tenochtitlán. He therefore destroyed all of his ships but one, which he sent back to Spain for King Charles. The fear that his men would return to Cuba, rather than embark on the journey to the Aztec Empire, made him decide to destroy his ships; they no longer had an option but to accompany him on the journey. This decision had severe consequences, because Cortés had trapped himself in Mexico.
Cortés created strife between his men and the Aztecs by taking over the city. Worried his men would revolt, Montezuma decided to convince them to delay the attack. Cortés now had time to build more ships, but he had to stay on guard against the Aztecs because he was unable to leave Mexico. Furthermore, Velázquez was sending forces to arrest Cortés, which meant that his life and the lives of his men were in jeopardy. Still without ships, Cortés could not escape, which resulted in his fighting the battle at Cempoala. In addition to being trapped in Mexico, Cortés also suffered financially: he had to repay Velázquez for the ships he had destroyed.
Throughout his journey, Cortés had written five letters to King Charles. He wanted to relieve himself from Velázquez's authority; therefore, Cortés bribed the King through sending him treasures. Velázquez and his men convinced King Charles to revoke Cortés's position as governor. While King Charles forgave Cortés, he did receive punishment. Cortés was promoted to captain-general and given the title of Marquess, but he was not allowed to reclaim his governorship or return to New Spain.
In 1518, Velázquez put Cortés in command of an expedition to explore and secure the interior of Mexico for colonization. At the last minute, due to the old argument between the two, Velázquez changed his mind and revoked Cortés's charter. He ignored the orders and, in an act of open mutiny, went anyway in February 1519. He stopped in Trinidad, Cuba, to hire more soldiers and obtain more horses. Accompanied by about 11 ships, 500 men (including seasoned slaves), 13 horses, and a small number of cannons, Cortés landed on the Yucatán Peninsula in Mayan territory. There he encountered Geronimo de Aguilar, a Spanish Franciscan priest who had survived a shipwreck followed by a period in captivity with the Maya, before escaping. Aguilar had learned the Chontal Maya language and was able to translate for Cortés.
In March 1519, Cortés formally claimed the land for the Spanish crown. Then he proceeded to Tabasco, where he met with resistance and won a battle against the natives. He received twenty young indigenous women from the vanquished natives, and he converted them all to Christianity.
Among these women was La Malinche, his future mistress and mother of his son Martín. Malinche knew both the Nahuatl language and Chontal Maya, thus enabling Cortés to communicate with the Aztecs through Aguilar. At San Juan de Ulúa on Easter Sunday 1519, Cortés met with Moctezuma II's Aztec Empire governors Tendile and Pitalpitoque.
In July 1519, his men took over Veracruz. By this act, Cortés dismissed the authority of the Governor of Cuba to place himself directly under the orders of King Charles. In order to eliminate any ideas of retreat, Cortés scuttled his ships.
In Veracruz, he met some of the tributaries of the Aztecs and asked them to arrange a meeting with Moctezuma II, the "tlatoani" (ruler) of the Aztec Empire. Moctezuma repeatedly turned down the meeting, but Cortés was determined. Leaving a hundred men in Veracruz, Cortés marched on Tenochtitlán in mid-August 1519, along with 600 soldiers, 15 horsemen, 15 cannons, and hundreds of indigenous carriers and warriors.
On the way to Tenochtitlán, Cortés made alliances with indigenous peoples such as the Totonacs of Cempoala and the Nahuas of Tlaxcala. The Otomis initially, and then the Tlaxcalans fought with the Spanish in a series of three battles from 2 to 5 September 1519, and at one point, Diaz remarked, "they surrounded us on every side". After Cortés continued to release prisoners with messages of peace, and realizing the Spanish were enemies of Moctezuma, Xicotencatl the Elder and Maxixcatzin persuaded the Tlaxcalan warleader, Xicotencatl the Younger, that it would be better to ally with the newcomers than to kill them.
In October 1519, Cortés and his men, accompanied by about 1,000 Tlaxcalteca, marched to Cholula, the second largest city in central Mexico. Cortés, either in a pre-meditated effort to instill fear upon the Aztecs waiting for him at Tenochtitlan or (as he later claimed, when he was being investigated) wishing to make an example when he feared native treachery, massacred thousands of unarmed members of the nobility gathered at the central plaza, then partially burned the city.
By the time he arrived in Tenochtitlán, the Spaniards had a large army. On November 8, 1519, they were peacefully received by Moctezuma II. Moctezuma deliberately let Cortés enter the Aztec capital, the island city of Tenochtitlán, hoping to get to know the Spaniards' weaknesses better and to crush them later.
Moctezuma gave lavish gifts of gold to the Spaniards which, rather than placating them, excited their ambitions for plunder. In his letters to King Charles, Cortés claimed to have learned at this point that he was considered by the Aztecs to be either an emissary of the feathered serpent god Quetzalcoatl or Quetzalcoatl himself – a belief which has been contested by a few modern historians. But quickly Cortés learned that several Spaniards on the coast had been killed by Aztecs while supporting the Totonacs, and decided to take Moctezuma as a hostage in his own palace, indirectly ruling Tenochtitlán through him.
Meanwhile, Velázquez sent another expedition, led by Pánfilo de Narváez, to oppose Cortés, arriving in Mexico in April 1520 with 1,100 men. Cortés left 200 men in Tenochtitlán and took the rest to confront Narváez. He overcame Narváez, despite his numerical inferiority, and convinced the rest of Narváez's men to join him. In Mexico, one of Cortés's lieutenants, Pedro de Alvarado, committed the "massacre in the Great Temple", triggering a local rebellion.
Cortés speedily returned to Tenochtitlán. On July 1, 1520 Moctezuma was killed (the Spaniards claimed he was stoned to death by his own people; others claim he was murdered by the Spanish once they realized his inability to placate the locals). Faced with a hostile population, Cortés decided to flee for Tlaxcala. During the "Noche Triste" (June 30 – July 1, 1520), the Spaniards managed a narrow escape from Tenochtitlán across the Tlacopan causeway, while their rearguard was being massacred. Much of the treasure looted by Cortés was lost (as well as his artillery) during this panicked escape from Tenochtitlán.
After a battle in Otumba, they managed to reach Tlaxcala, having lost 870 men. With the assistance of their allies, Cortés's men finally prevailed with reinforcements arriving from Cuba. Cortés began a policy of attrition towards Tenochtitlán, cutting off supplies and subduing the Aztecs' allied cities. The siege of Tenochtitlán ended with Spanish victory and the destruction of the city.
In January 1521, Cortés countered a conspiracy against him, headed by Antonio de Villafana, who was hanged for the offense. Finally, with the capture of Cuauhtémoc, the "tlatoani" (ruler) of Tenochtitlán, on August 13, 1521, the Aztec Empire fell, and Cortés was able to claim it for Spain, renaming the city Mexico City. From 1521 to 1524, Cortés personally governed Mexico.
Many historical sources have conveyed an impression that Cortés was unjustly treated by the Spanish Crown, and that he received nothing but ingratitude for his role in establishing New Spain. This picture is the one Cortés presents in his letters and in the later biography written by Francisco López de Gómara. However, there may be more to the picture than this. Cortés's own sense of accomplishment, entitlement, and vanity may have played a part in his deteriorating position with the king:
King Charles appointed Cortés as governor, captain general and chief justice of the newly conquered territory, dubbed "New Spain of the Ocean Sea". But also, much to the dismay of Cortés, four royal officials were appointed at the same time to assist him in his governing – in effect, submitting him to close observation and administration. Cortés initiated the construction of Mexico City, destroying Aztec temples and buildings and then rebuilding on the Aztec ruins what soon became the most important European city in the Americas.
Cortés managed the founding of new cities and appointed men to extend Spanish rule to all of New Spain, imposing the "encomienda" system in 1524. He reserved many encomiendas for himself and for his retinue, which they considered just rewards for their accomplishment in conquering central Mexico. However, later arrivals and members of factions antipathetic to Cortés complained of the favoritism that excluded them.
In 1523, the Crown (possibly influenced by Cortés's enemy, Bishop Fonseca) sent a military force under the command of Francisco de Garay to conquer and settle the northern part of Mexico, the region of Pánuco. This was another setback for Cortés, who mentioned it in his fourth letter to the King, in which he describes himself as the victim of a conspiracy by his archenemies Diego Velázquez de Cuéllar, Diego Columbus and Bishop Fonseca, as well as Francisco Garay. The influence of Garay was effectively stopped by this appeal to the King, who sent out a decree forbidding Garay to interfere in the politics of New Spain, causing him to give up without a fight.
Although Cortés had flouted the authority of Diego Velázquez in sailing to the mainland and then leading an expedition of conquest, Cortés's spectacular success was rewarded by the crown with a coat of arms, a mark of high honor, following the conqueror's request. The document granting the coat of arms summarizes Cortés's accomplishments in the conquest of Mexico. The proclamation of the king says in part:
We, respecting the many labors, dangers, and adventures which you underwent as stated above, and so that there might remain a perpetual memorial of you and your services and that you and your descendants might be more fully honored ... it is our will that besides your coat of arms of your lineage, which you have, you may have and bear as your coat of arms, known and recognized, a shield ...
The grant specifies the iconography of the coat of arms, the central portion divided into quadrants. In the upper portion, there is a "black eagle with two heads on a white field, which are the arms of the empire". Below that is a "golden lion on a red field, in memory of the fact that you, the said Hernando Cortés, by your industry and effort brought matters to the state described above" (i.e., the conquest). The specificity of the other two quadrants is linked directly to Mexico, with one quadrant showing three crowns representing the three Aztec emperors of the conquest era, Moctezuma, Cuitlahuac, and Cuauhtemoc and the other showing the Aztec capital of Tenochtitlan. Encircling the central shield are symbols of the seven city-states around the lake and their lords that Cortés defeated, with the lords "to be shown as prisoners bound with a chain which shall be closed with a lock beneath the shield".
Cortés's wife Catalina Súarez arrived in New Spain around summer 1522, along with her sister and brother. His marriage to Catalina was at this point extremely awkward, since she was a kinswoman of the governor of Cuba, Diego Velázquez, whose authority Cortés had thrown off and who was therefore now his enemy. Catalina lacked the noble title of "doña," so at this point his marriage with her no longer raised his status. Their marriage had been childless. Since Cortés had sired children with a variety of indigenous women, including a son around 1522 by his cultural translator, Doña Marina, Cortés knew he was capable of fathering children. Cortés's only male heir at this point was illegitimate, but nonetheless named after Cortés's father, Martín Cortés. This son Martín Cortés was sometimes called "El Mestizo".
Catalina Suárez died under mysterious circumstances the night of November 1–2, 1522. There were accusations at the time that Cortés had murdered his wife. There was an investigation into her death, interviewing a variety of household residents and others. The documentation of the investigation was published in the nineteenth century in Mexico and these archival documents were uncovered in the twentieth century. The death of Catalina Suárez produced a scandal and investigation, but Cortés was now free to marry someone of high status more appropriate to his wealth and power. In 1526, he built an imposing residence for himself, the Palace of Cortés in Cuernavaca, in a region close to the capital where he had extensive encomienda holdings. In 1529 he had been accorded the noble designation of "don", but more importantly was given the noble title of Marquess of the Valley of Oaxaca and married the Spanish noblewoman Doña Juana de Zúñiga. The marriage produced three children, including another son, who was also named Martín. As the first-born legitimate son, Don Martín Cortés y Zúñiga was now Cortés's heir and succeeded him as holder of the title and estate of the Marquessate of the Valley of Oaxaca. Cortés's legitimate daughters were Doña Maria, Doña Catalina, and Doña Juana.
Since the conversion to Christianity of indigenous peoples was an essential and integral part of the extension of Spanish power, making formal provisions for that conversion once the military conquest was completed was an important task for Cortés. During the Age of Discovery, the Catholic Church had seen early attempts at conversion in the Caribbean islands by Spanish friars, particularly the mendicant orders. Cortés made a request to the Spanish monarch to send Franciscan and Dominican friars to Mexico to convert the vast indigenous populations to Christianity. In his fourth letter to the king, Cortés pleaded for friars rather than diocesan or secular priests because those clerics were in his view a serious danger to the Indians' conversion.
If these people [Indians] were now to see the affairs of the Church and the service of God in the hands of canons or other dignitaries, and saw them indulge in the vices and profanities now common in Spain, knowing that such men were the ministers of God, it would bring our Faith into much harm that I believe any further preaching would be of no avail.
He wished the mendicants to be the main evangelists. Mendicant friars did not usually have full priestly powers to perform all the sacraments needed for conversion of the Indians and growth of the neophytes in the Christian faith, so Cortés laid out a solution to this to the king.
Your Majesty should likewise beseech His Holiness [the pope] to grant these powers to the two principal persons in the religious orders that are to come here, and that they should be his delegates, one from the Order of St. Francis and the other from the Order of St. Dominic. They should bring the most extensive powers Your Majesty is able to obtain, for, because these lands are so far from the Church of Rome, and we, the Christians who now reside here and shall do so in the future, are so far from the proper remedies of our consciences and, as we are human, so subject to sin, it is essential that His Holiness should be generous with us and grant to these persons most extensive powers, to be handed down to persons actually in residence here whether it be given to the general of each order or to his provincials.
The Franciscans arrived in May 1524, a symbolically powerful group of twelve known as the Twelve Apostles of Mexico, led by Fray Martín de Valencia. Franciscan Geronimo de Mendieta claimed that Cortés's most important deed was the way he met this first group of Franciscans. The conqueror himself was said to have met the friars as they approached the capital, kneeling at the feet of the friars who had walked from the coast. This story was told by Franciscans to demonstrate Cortés's piety and humility and was a powerful message to all, including the Indians, that Cortés's earthly power was subordinate to the spiritual power of the friars. However, one of the first twelve Franciscans, Fray Toribio de Benavente Motolinia, does not mention it in his history. Cortés and the Franciscans had a particularly strong alliance in Mexico, with Franciscans seeing him as "the new Moses" for conquering Mexico and opening it to Christian evangelization. In Motolinia's 1555 response to Dominican Bartolomé de Las Casas, he praises Cortés.
And as to those who murmur against the Marqués del Valle [Cortés], God rest him, and who try to blacken and obscure his deeds, I believe that before God their deeds are not as acceptable as those of the Marqués. Although as a human he was a sinner, he had faith and works of a good Christian, and a great desire to employ his life and property in widening and augmenting the faith of Jesus Christ, and dying for the conversion of these gentiles ... Who has loved and defended the Indians of this new world like Cortés? ... Through this captain, God opened the door for us to preach his holy gospel and it was he who caused the Indians to revere the holy sacraments and respect the ministers of the church.
In Fray Bernardino de Sahagún's 1585 revision of the conquest narrative first codified as Book XII of the Florentine Codex, there are laudatory references to Cortés that do not appear in the earlier text from the indigenous perspective. Whereas Book XII of the Florentine Codex concludes with an account of Spaniards' search for gold, in Sahagún's 1585 revised account, he ends with praise of Cortés for requesting the Franciscans be sent to Mexico to convert the Indians.
From 1524 to 1526, Cortés headed an expedition to Honduras where he defeated Cristóbal de Olid, who had claimed Honduras as his own under the influence of the Governor of Cuba Diego Velázquez. Fearing that Cuauhtémoc might head an insurrection in Mexico, he brought him with him to Honduras. In a controversial move, Cuauhtémoc was executed during the journey. Raging over Olid's treason, Cortés issued a decree to arrest Velázquez, whom he was sure was behind Olid's treason. This, however, only served to further estrange the Crown of Castile and the Council of Indies, both of which were already beginning to feel anxious about Cortés's rising power.
Cortés's fifth letter to King Charles attempts to justify his conduct and concludes with a bitter attack on "various and powerful rivals and enemies" who have "obscured the eyes of your Majesty". Charles, who was also Holy Roman Emperor, had little time for distant colonies (much of Charles's reign was taken up with wars with France, the German Protestants and the expanding Ottoman Empire), except insofar as they contributed to financing his wars. In 1521, the year of the Conquest, Charles was attending to matters in his German domains and Bishop Adrian of Utrecht functioned as regent in Spain.
Velázquez and Fonseca persuaded the regent to appoint a commissioner (a "Juez de residencia", Luis Ponce de León) with powers to investigate Cortés's conduct and even arrest him. Cortés was once quoted as saying that it was "more difficult to contend against [his] own countrymen than against the Aztecs." Governor Diego Velázquez continued to be a thorn in his side, teaming up with Bishop Juan Rodríguez de Fonseca, chief of the Spanish colonial department, to undermine him in the Council of the Indies.
A few days after Cortés's return from his expedition, Ponce de León suspended Cortés from his office of governor of New Spain. The Licentiate then fell ill and died shortly after his arrival, appointing Marcos de Aguilar as "alcalde mayor". The aged Aguilar also became sick and appointed Alonso de Estrada governor, who was confirmed in his functions by a royal decree in August 1527. Cortés, suspected of poisoning them, refrained from taking over the government.
Estrada sent Diego de Figueroa to the south. De Figueroa raided graveyards and extorted contributions, meeting his end when the ship carrying these treasures sank. Albornoz persuaded Alonso de Estrada to release Gonzalo de Salazar and Chirinos. When Cortés complained angrily after one of his adherents' hands was cut off, Estrada ordered him exiled. Cortés sailed for Spain in 1528 to appeal to King Charles.
In 1528, Cortés returned to Spain to appeal to the justice of his master, Charles V. Juan Altamirano and Alonso Valiente stayed in Mexico and acted as Cortés' representatives during his absence. Cortés presented himself with great splendor before Charles V's court. By this time Charles had returned and Cortés forthrightly responded to his enemy's charges. Denying he had held back on gold due the crown, he showed that he had contributed more than the quinto (one-fifth) required. Indeed, he had spent lavishly to build the new capital of Mexico City on the ruins of the Aztec capital of Tenochtitlán, leveled during the siege that brought down the Aztec empire.
He was received by Charles with every distinction, and decorated with the order of Santiago. In return for his efforts in expanding the still young Spanish Empire, Cortés was rewarded in 1529 by being accorded the noble title of "don" but, more importantly, named the "Marqués del Valle de Oaxaca" (Marquess of the Valley of Oaxaca), and married the Spanish noblewoman Doña Juana Zúñiga, after the 1522 death of his much less distinguished first wife, Catalina Suárez. The noble title and señorial estate of the Marquesado were passed down to his descendants until 1811. The Oaxaca Valley was one of the wealthiest regions of New Spain, and Cortés had 23,000 vassals in 23 named encomiendas in perpetuity.
Although confirmed in his land holdings and vassals, he was not reinstated as governor and was never again given any important office in the administration of New Spain. During his travel to Spain, his property was mismanaged by abusive colonial administrators. He sided with local natives in a lawsuit. The natives documented the abuses in the Huexotzinco Codex.
The entailed estate and title passed to his legitimate son Don Martín Cortés upon Cortés's death in 1547, who became the Second Marquess. Don Martín's association with the so-called Encomenderos' Conspiracy endangered the entailed holdings, but they were restored and remained the continuing reward for Hernán Cortés's family through the generations.
Cortés returned to Mexico in 1530 with new titles and honors, but with diminished power. Although Cortés still retained military authority and permission to continue his conquests, viceroy Don Antonio de Mendoza was appointed in 1535 to administer New Spain's civil affairs. This division of power led to continual dissension, and caused the failure of several enterprises in which Cortés was engaged. On returning to Mexico, Cortés found the country in a state of anarchy. There was a strong suspicion in court circles of an intended rebellion by Cortés.
After reasserting his position and reestablishing some sort of order, Cortés retired to his estates at Cuernavaca, about 30 miles (48 km) south of Mexico City. There he concentrated on the building of his palace and on Pacific exploration. Remaining in Mexico between 1530 and 1541, Cortés quarreled with Nuño Beltrán de Guzmán and disputed the right to explore the territory that is today California with Antonio de Mendoza, the first viceroy.
Cortés acquired several silver mines in Zumpango del Rio in 1534. By the early 1540s, he owned 20 silver mines in Sultepec, 12 in Taxco, and 3 in Zacualpan. Earlier, Cortés had claimed the silver in the Tamazula area.
In 1536, Cortés explored the northwestern part of Mexico and discovered the Baja California Peninsula. Cortés also spent time exploring the Pacific coast of Mexico. The Gulf of California was originally named the "Sea of Cortés" by its discoverer Francisco de Ulloa in 1539. This was the last major expedition by Cortés.
After his exploration of Baja California, Cortés returned to Spain in 1541, hoping to confound his angry civilians, who had brought many lawsuits against him (for debts, abuse of power, etc.).
On his return he went through a crowd to speak to the emperor, who demanded of him who he was. "I am a man," replied Cortés, "who has given you more provinces than your ancestors left you cities."
The emperor finally permitted Cortés to join him and his fleet commanded by Andrea Doria at the great expedition against Algiers in the Barbary Coast in 1541, which was then part of the Ottoman Empire and was used as a base by Hayreddin Barbarossa, a famous Turkish corsair and Admiral-in-Chief of the Ottoman Fleet. During this campaign, Cortés was almost drowned in a storm that hit his fleet while he was pursuing Barbarossa.
Having spent a great deal of his own money to finance expeditions, he was now heavily in debt. In February 1544 he made a claim on the royal treasury, but was ignored for the next three years. Disgusted, he decided to return to Mexico in 1547. When he reached Seville, he was stricken with dysentery. He died in Castilleja de la Cuesta, Seville province, on December 2, 1547, from a case of pleurisy at the age of 62.
He left his many mestizo and white children well cared for in his will, along with every one of their mothers. He requested in his will that his remains eventually be buried in Mexico. Before he died he had the Pope remove the "natural" status of four of his children (legitimizing them in the eyes of the church), including Martin, the son he had with Doña Marina (also known as La Malinche), said to be his favourite. His daughter, Doña Catalina, however, died shortly after her father's death.
After his death, his body was moved more than eight times for several reasons. On December 4, 1547 he was buried in the mausoleum of the Duke of Medina in the church of San Isidoro del Campo, Sevilla. Three years later (1550) due to the space being required by the duke, his body was moved to the altar of Santa Catarina in the same church. In his testament, Cortés asked for his body to be buried in the monastery he had ordered to be built in Coyoacan in México, ten years after his death, but the monastery was never built. So in 1566, his body was sent to New Spain and buried in the church of San Francisco de Texcoco, where his mother and one of his sisters were buried.
In 1629, Don Pedro Cortés, fourth "Marqués del Valle" and his last male descendant, died, so the viceroy decided to move the bones of Cortés along with those of his descendant to the Franciscan church in México. This was delayed for nine years, while his body stayed in the main room of the palace of the viceroy. Eventually it was moved to the Sagrario of the Franciscan church, where it stayed for 87 years. In 1716, it was moved to another place in the same church. In 1794, his bones were moved to the "Hospital de Jesus" (founded by Cortés), where a statue by Tolsá and a mausoleum were made. There was a public ceremony and all the churches in the city rang their bells.
In 1823, after the independence of México, it seemed imminent that his body would be desecrated, so the mausoleum was removed, the statue and the coat of arms were sent to Palermo, Sicily, to be protected by the Duke of Terranova. The bones were hidden, and everyone thought that they had been sent out of México. In 1836, his bones were moved to another place in the same building.
It was not until November 24, 1946 that they were rediscovered, thanks to the discovery of a secret document by Lucas Alamán. His bones were put in the charge of the Instituto Nacional de Antropología e Historia (INAH). The remains were authenticated by INAH. They were then restored to the same place, this time with a bronze inscription and his coat of arms. When the bones were first rediscovered, the supporters of the Hispanic tradition in Mexico were excited, but one supporter of an indigenist vision of Mexico "proposed that the remains be publicly burned in front of the statue of Cuauhtemoc, and the ashes flung into the air". Following the discovery and authentication of Cortés's remains, there was a discovery of what were described as the bones of Cuauhtémoc, resulting in a "battle of the bones".
Cortés is commemorated in the scientific name of a subspecies of Mexican lizard, "Phrynosoma orbiculare cortezii".
There are relatively few sources on the early life of Cortés; his fame arose from his participation in the conquest of Mexico, and it was only after this that people became interested in reading and writing about him.
Probably the best source is his letters to the king, which he wrote during the campaign in Mexico, but they were written with the specific purpose of putting his efforts in a favourable light and so must be read critically. Another main source is the biography written by Cortés's private chaplain Lopez de Gómara, which was written in Spain several years after the conquest. Gómara never set foot in the Americas and knew only what Cortés had told him, and he had an affinity for knightly romantic stories which he incorporated richly in the biography. The third major source was written as a reaction to what its author calls "the lies of Gomara": the eyewitness account by the conquistador Bernal Díaz del Castillo does not paint Cortés as a romantic hero but rather tries to emphasize that Cortés's men should also be remembered as important participants in the undertakings in Mexico.
In the years following the conquest more critical accounts of the Spanish arrival in Mexico were written. The Dominican friar Bartolomé de Las Casas wrote his "A Short Account of the Destruction of the Indies" which raises strong accusations of brutality and heinous violence towards the Indians; accusations against both the conquistadors in general and Cortés in particular. The accounts of the conquest given in the Florentine Codex by the Franciscan Bernardino de Sahagún and his native informants are also less than flattering towards Cortés. The scarcity of these sources has led to a sharp division in the description of Cortés's personality and a tendency to describe him as either a vicious and ruthless person or a noble and honorable cavalier.
In México there are few representations of Cortés. However, many landmarks still bear his name, from the castle Palacio de Cortés in the city of Cuernavaca to some street names throughout the republic.
The pass between the volcanoes Iztaccíhuatl and Popocatépetl, through which Cortés led his soldiers on their march to Mexico City, is known as the Paso de Cortés.
The muralist Diego Rivera painted several representations of him, but the most famous depicts him as a powerful and ominous figure alongside Malinche in a mural in the National Palace in Mexico City.
In 1981, President Lopez Portillo tried to bring Cortés to public recognition. First, he made public a copy of the bust of Cortés made by Manuel Tolsá in the Hospital de Jesús Nazareno with an official ceremony, but soon a nationalist group tried to destroy it, so it had to be removed from public display. Today the copy of the bust is in the "Hospital de Jesús Nazareno" while the original is in Naples, Italy, in the Villa Pignatelli.
Later, another monument, known as "Monumento al Mestizaje" by Julián Martínez y M. Maldonado (1982) was commissioned by Mexican president José López Portillo to be put in the "Zócalo" (Main square) of Coyoacan, near the place of his country house, but it had to be removed to a little known park, the Jardín Xicoténcatl, Barrio de San Diego Churubusco, to quell protests. The statue depicts Cortés, Malinche and their son Martín.
Another statue, by Sebastián Aparicio, in Cuernavaca, stood at the hotel "El casino de la selva". Cortés is barely recognizable in it, so it sparked little interest. The hotel was closed to make way for a commercial center, and the statue was taken out of public display by Costco, the builder of the commercial center.
Hernán Cortés is a character in the opera "La Conquista" (2005) by Italian composer Lorenzo Ferrero, which depicts the major episodes of the Spanish conquest of the Aztec Empire in 1521.
Cortés' personal account of the conquest of Mexico is narrated in his five letters addressed to Charles V. These five letters, the "cartas de relación", are Cortés' only surviving writings. See "Letters and Dispatches of Cortés", translated by George Folsom (New York, 1843); Prescott's "Conquest of Mexico" (Boston, 1843); and Sir Arthur Helps's "Life of Hernando Cortes" (London, 1871).
His first letter is considered lost, and the one from the municipality of Veracruz takes its place. It was published for the first time in volume IV of "Documentos para la Historia de España", and subsequently reprinted.
The "Segunda Carta de Relacion", bearing the date of October 30, 1520, appeared in print at Seville in 1522. The third letter, dated May 15, 1522, appeared at Seville in 1523. The fourth, October 20, 1524, was printed at Toledo in 1525. The fifth, on the Honduras expedition, is contained in volume IV of the "Documentos para la Historia de España".
Natural children of Don Hernán Cortés
He married twice: firstly in Cuba to Catalina Suárez Marcaida, who died at Coyoacán in 1522 without issue, and secondly in 1529 to "doña" Juana Ramírez de Arellano de Zúñiga, daughter of "don" Carlos Ramírez de Arellano, 2nd Count of Aguilar and wife the Countess "doña" Juana de Zúñiga, and had:
Herstory
Herstory is a term for history written from a feminist perspective, emphasizing the role of women, or told from a woman's point of view. It originated as an alteration of the word "history", as part of a feminist critique of conventional historiography, which in this view is traditionally written as "his story", i.e., from the masculine point of view. (The word "history"—from the Ancient Greek ἱστορία, or "historia", meaning "knowledge obtained by inquiry"—is etymologically unrelated to the possessive pronoun "his".)
The "Oxford English Dictionary" credits Robin Morgan with first using the term "herstory" in print in her 1970 anthology "Sisterhood is Powerful". Concerning the feminist organization W.I.T.C.H., Morgan wrote:
During the 1970s and 1980s, second-wave feminists saw the study of history as a male-dominated intellectual enterprise and presented "herstory" as a means of compensation. The term, intended to be both serious and comic, became a rallying cry used on T-shirts and buttons as well as in academia.
In 2017, Hridith Sudev, an inventor, environmentalist and social activist associated with various youth movements, launched 'The Herstory Movement'; an online platform to "celebrate lesser known great persons; female, queer or otherwise marginalized, who helped shape the modern World History". It is intended as an academic platform to feature stories of female historic persons and thus help facilitate more widespread knowledge about 'Great Women' History.
Non-profit organizations Global G.L.O.W and LitWorld created a joint initiative called the "HerStory Campaign." This campaign works with 25 other countries to share girls' lives and stories. They encourage others to join the campaign and to 'raise our voices on behalf of all the world's girls.'
The herstory movement has spawned women-centered presses, such as Virago Press in 1973, which publishes fiction and non-fiction by noted women authors like Janet Frame and Sarah Dunant.
This movement has led to an increase in activity in other female-centric disciplines such as "femistry" and "galgebra".
Christina Hoff Sommers has been a vocal critic of the concept of herstory, and presented her argument against the movement in her 1994 book, "Who Stole Feminism?". Sommers defined herstory as an attempt to infuse education with ideology, at the expense of knowledge. The "gender feminists", as she called them, were the group of feminists responsible for the movement, which she felt amounted to negationism. She regarded most attempts to make historical studies more female-inclusive as being artificial in nature, and an impediment to progress.
Professor and author Devoney Looser has criticized the concept of herstory for overlooking the contributions that some women made as historians before the twentieth century.
The Global Language Monitor, a nonprofit group that analyzes and tracks trends in language, named "herstory" the third most "politically incorrect" word of 2006—rivaled only by ""macaca"" and ""Global Warming Denier"."
Books published on the topic include:
House of Cards (British TV series)
House of Cards is a 1990 British political thriller television serial in four episodes, set after the end of Margaret Thatcher's tenure as Prime Minister of the United Kingdom. It was televised by the BBC from 18 November to 9 December 1990, to critical and popular acclaim.
Andrew Davies adapted the story from the 1989 novel of the same name by Michael Dobbs, a former Chief of Staff at Conservative Party headquarters. Neville Teller also dramatised Dobbs's novel for BBC World Service in 1996, and it had two television sequels ("To Play the King" and "The Final Cut"). The opening and closing theme music for this TV series is entitled "Francis Urquhart's March".
"House of Cards" was ranked 84th in the British Film Institute list of the 100 Greatest British Television Programmes in 2000. In 2013, the serial and the Dobbs novel were the basis for a US adaptation set in Washington, D.C., commissioned and released by Netflix.
The antihero of "House of Cards" is Francis Urquhart, a fictional Chief Whip of the Conservative Party, played by Ian Richardson. The plot follows his amoral and manipulative scheme to become leader of the governing party and, thus, Prime Minister of the United Kingdom.
Michael Dobbs did not envision writing the second and third books, as Urquhart dies at the end of the first novel. The screenplay of the BBC's dramatisation of "House of Cards" differs from the book, and hence allows future series. Dobbs subsequently wrote two further books, "To Play the King" and "The Final Cut", which were televised in 1993 and 1995, respectively.
"House of Cards" was said to draw from Shakespeare's plays "Macbeth" and "Richard III", both of which feature main characters who are corrupted by power and ambition. Richardson has a Shakespearean background and said he based his characterisation of Urquhart on Shakespeare's portrayal of Richard III.
Urquhart frequently talks through the camera to the audience, breaking the fourth wall.
After Margaret Thatcher's resignation, the ruling Conservative Party is about to elect a new leader. Francis Urquhart (Ian Richardson), an MP and the Government Chief Whip in the House of Commons, introduces viewers to the contestants, from which Henry "Hal" Collingridge (David Lyon) emerges victorious. Urquhart is secretly contemptuous of the well-meaning but weak Collingridge, but expects a promotion to a senior position in the Cabinet. After the general election, which the party wins by a reduced majority, Urquhart submits his suggestions for a reshuffle that includes his desired promotion. However, Collingridge – citing Harold Macmillan's political demise after the 1962 Night of the Long Knives – effects no changes at all. Urquhart resolves to oust Collingridge, with encouragement from his wife, Elizabeth (Diane Fletcher).
At the same time, with Elizabeth's blessing, Urquhart begins an affair with Mattie Storin (Susannah Harker), a junior political reporter at a Conservative-leaning tabloid newspaper called "The Chronicle". The affair allows Urquhart to manipulate Mattie and indirectly skew her coverage of the Conservative leadership contest in his favour. Mattie has an apparent Electra complex; she finds appeal in Urquhart's much older age and later refers to him as "Daddy". Another unwitting pawn is Roger O'Neill (Miles Anderson), the party's cocaine-addicted public relations consultant.
Urquhart blackmails O'Neill into leaking information on budget cuts that humiliates Collingridge during Prime Minister's Questions. Later, he blames party chairman Lord "Teddy" Billsborough (Nicholas Selby) for leaking an internal poll showing a drop in Tory numbers, leading Collingridge to sack him. As Collingridge's image suffers, Urquhart encourages ultraconservative Foreign Secretary Patrick Woolton (Malcolm Tierney) and "Chronicle" owner Benjamin Landless (Kenny Ireland) to support his removal. Urquhart also poses as Collingridge's alcoholic brother Charles (James Villiers) to trade shares in a chemical company about to benefit from confidential government information. Consequently, Collingridge is falsely accused of insider trading and forced to resign.
In the ensuing leadership race, Urquhart initially feigns unwillingness to stand before announcing his candidacy. With the help of his underling, Tim Stamper (Colin Jeavons), Urquhart goes about making sure his competitors drop out of the race: Health Secretary Peter MacKenzie (Christopher Owen) accidentally runs his car over a disabled protester at a demonstration staged by Urquhart and is forced by the public outcry to withdraw, while Education Secretary Harold Earle (Kenneth Gilbert) is blackmailed into withdrawing when Urquhart anonymously sends pictures of him in the company of a rent boy whom Earle had paid for sex.
The first ballot leaves Urquhart to face Woolton and Michael Samuels, the moderate Environment Secretary supported by Billsborough. Urquhart eliminates Woolton by a prolonged scheme: at the party conference, he pressures O'Neill into persuading his personal assistant and lover, Penny Guy (Alphonsia Emmanuel), to have a one-night stand with Woolton in his suite, which Urquhart records via a bugged ministerial red box. When the tape is sent to Woolton, he is led to assume that Samuels is behind the scheme and backs Urquhart in the contest. Urquhart also receives support from Collingridge, who is unaware of Urquhart's role in his own downfall. Samuels is forced out of the running when the tabloids reveal that he backed leftist causes as a student at University of Cambridge.
Stumbling across contradictions in the allegations against Collingridge and his brother, Mattie begins to dig deeper. On Urquhart's orders, O'Neill arranges for her car and flat to be vandalised in a show of intimidation. However, O'Neill becomes increasingly uneasy with what he is being asked to do, and his cocaine addiction adds to his instability. Urquhart mixes O'Neill's cocaine with rat poison, causing him to kill himself when taking the cocaine in a motorway lavatory. Though initially blind to the truth of matters thanks to her relations with Urquhart, Mattie eventually deduces that Urquhart is responsible for O'Neill's death and is behind the unfortunate downfalls of Collingridge and all of Urquhart's rivals.
Mattie looks for Urquhart at the point when it seems his victory is certain. She eventually finds him on the roof garden of the Houses of Parliament, where she confronts him. He admits to O'Neill's murder and everything else he has done. He then asks whether he can trust Mattie, and, though she answers in the affirmative, he does not believe her and throws her off the roof onto a van parked below. An unseen person picks up Mattie's tape recorder, which she had been using to secretly record her conversations with Urquhart. The series ends with Urquhart defeating Samuels in the second leadership ballot and being driven to Buckingham Palace to be invited to form a government by Elizabeth II.
In the first novel, but not in the television series:
Before the series was reissued in 2013 to coincide with the release of the US version of "House of Cards", Dobbs rewrote portions of the novel to bring the series in line with the television mini-series and restore continuity among the three novels. In the 2013 version:
The first instalment of the TV series coincidentally aired two days before the Conservative Party leadership election. During a time of "disillusionment with politics", the series "caught the nation's mood".
Ian Richardson won a Best Actor BAFTA in 1991 for his role as Urquhart, and Andrew Davies won an Emmy for outstanding writing in a miniseries.
The series ranked 84th in the British Film Institute list of the 100 Greatest British Television Programmes.
The Urquhart trilogy has been adapted in the United States as "House of Cards". The show stars Kevin Spacey as Francis "Frank" Underwood, the Majority Whip of the Democratic Party, who schemes and murders his way to becoming President of the United States. It is produced by David Fincher and Spacey's Trigger Street Productions, with the initial episodes directed by Fincher.
The series, produced and financed by independent studio Media Rights Capital, is one of Netflix's first forays into original programming. Series one was made available online on 1 February 2013. The series is filmed in Baltimore, Maryland. The first series was critically acclaimed and earned four Golden Globe nominations, including Best Drama, actor, actress and supporting actor, with Robin Wright winning best actress. It also earned nine Primetime Emmy Award nominations, winning three, and was the first series broadcast solely via an internet streaming service to earn Emmy nominations.
The drama introduced and popularised the phrase "You might very well think that; I couldn't possibly comment" — a formula of confirmation without confirmation, used by Urquhart whenever he could not be seen to agree with a leading statement, with the emphasis on either the "I" or the "possibly", depending on the situation. The phrase was even used in the House of Commons, House of Lords and parliamentary committees following the series.
A variation on the phrase was written into the TV adaptation of Terry Pratchett's "Hogfather" for the character Death, as an in-joke on the fact that he was voiced by Richardson.
During the first Gulf War, a British reporter speaking from Baghdad, conscious of the possibility of censorship, used the code phrase "You might very well think that; I couldn't possibly comment" to answer a BBC presenter's question.
A further variation was used by Nicola Murray, a fictional government minister, in the third series finale of "The Thick of It".
In the U.S. adaptation, the phrase is used by Frank Underwood in the first episode during his initial meeting with Zoe Barnes, the US counterpart of Mattie Storin.
Helen Gandy
Helen W. Gandy (April 8, 1897 – July 7, 1988) was an American civil servant. For 54 years, she was the secretary to Federal Bureau of Investigation director J. Edgar Hoover, who called her "indispensable". She exercised great behind-the-scenes influence on Hoover and the workings of the Bureau. Following Hoover's death in 1972, she spent weeks destroying his "Personal File", thought to be where the most incriminating material he used to manipulate and control the most powerful figures in Washington was kept.
Helen Gandy was born in Rockville, New Jersey, one of three children (two daughters and a son) born to Franklin Dallas and Annie (née Williams) Gandy. She grew up in New Jersey in Fairton or the Port Norris section of Commercial Township (sources differ) and graduated from Bridgeton High School in Bridgeton, New Jersey. In 1918, aged 21, she moved to Washington, D.C., where she later took classes at Strayer Business College and George Washington University Law School.
Gandy briefly worked in a department store in Washington before finding a job as a file clerk at the Justice Department in 1918. Within weeks, she went to work as a typist for Hoover, effective March 25, 1918, having told Hoover in her interview she had "no immediate plans to marry." She, like Hoover, would never marry; both were completely devoted to the Bureau.
When Hoover went to the Bureau of Investigation (its original title; it became the F.B.I. in 1935) as its assistant director on August 22, 1921, he specifically requested Gandy return from vacation to help him in the new post. Hoover became director of the Bureau in 1924, and Gandy continued in his service. She was promoted to "office assistant" on August 23, 1937 and "executive assistant" on October 1, 1939. Though she would receive promotions in her civil service grade subsequently, she retained her title as executive assistant until her retirement on May 2, 1972, the day Hoover died. Hoover said of her: "if there is anyone in this Bureau whose services are indispensable, I consider Miss Gandy to be that person." Despite this, Curt Gentry wrote:
Theirs was a rigidly formal relationship. He'd always called her 'Miss Gandy' (when angry, barking it out as one word). In all those fifty-four years he had never once called her by her first name.
Hoover biographers Theoharis and Cox would say "her stern face recalled Cerberus at the gate," a view echoed by Anthony Summers in his life of Hoover, who also pictured Gandy as Hoover's first line of defense against the outside world. When Attorney General Robert F. Kennedy, Hoover's superior, had a direct telephone line installed between their offices, Hoover refused to answer the phone. "Put that damn thing on Miss Gandy's desk where it belongs," Hoover would declare.
Gentry described Gandy's influence:
Her genteel manner and pleasant voice contrasted sharply with this domineering presence. Yet behind the politeness was a resolute firmness not unlike his, and no small amount of influence. Many a career in the Bureau had been quietly manipulated by her. Even those who disliked him, praised her, most often commenting on her remarkable ability to get along with all kinds of people. That she had held her position for fifty-four years was the best evidence of this, for it was a Bureau tradition that the closer you were to him, the more demanding he was.
William C. Sullivan, an agent with the Bureau for three decades, reported in his memoir when he worked in the public relations section answering mail from the public, he gave a correspondent the wrong measurements for Hoover's personal popover recipe, relying on memory rather than the files. Gandy, ever protective of her boss, caught the error and brought it to Hoover's attention. The director then placed an official letter of reprimand in Sullivan's file for the lapse. Mark Felt, deputy associate director of the Bureau, wrote in his memoir that Gandy "was bright and alert and quick-tempered—and completely dedicated to her boss."
Hoover died during the night of May 1–2, 1972. According to Curt Gentry, who wrote the 1991 book "J. Edgar Hoover: The Man and the Secrets", Hoover's body was not discovered by his live-in cook and general housekeeper, Annie Fields; rather, it was discovered by James Crawford, who had been Hoover's chauffeur for 37 years. Crawford then yelled out to Fields and Tom Moton (Hoover's new chauffeur after Crawford had retired in January 1972). Ms. Fields first called Hoover's personal physician, Dr. Robert Choisser, then used another phone to call Clyde Tolson's private number. Tolson then called Helen Gandy's private number with the news of Hoover's death along with orders to begin destroying the files. Within an hour, the "D List" ("d" standing for destruction) was being distributed, and the destruction of files began. However, "The New York Times" quoted an anonymous F.B.I. source in spring 1975, who said: "Gandy had begun almost a year before Mr. Hoover's death and was instructed to purge the files that were then in his office."
Anthony Summers reported that G. Gordon Liddy had said of his sources in the F.B.I.: "by the time Gray went in to get the files, Miss Gandy had already got rid of them." The day after Hoover died, Gray, who had been named acting director by President Richard Nixon upon Tolson's resignation from that position, went to Hoover's office. Gandy paused from her work to give Gray a tour. He found file cabinets open and packing boxes being filled with papers. She informed him the boxes contained personal papers of Hoover's. Gandy stated Gray flipped through a few files and approved her work, but Gray was to deny he looked at any papers. Gandy also told Gray it would be a week before she could clear Hoover's effects out so Gray could move into the suite.
Gray reported to Nixon that he had secured Hoover's office and its contents. However, he had sealed only Hoover's personal inner office, where no files were stored, not the entire suite of offices. Since 1957, Hoover's "Official/Confidential" files, containing material too sensitive to include in the Bureau's central files, had been kept in the outer office, where Gandy sat. Gentry reported that Gray would not have known where to look in Gandy's office for the files, as her office was lined floor to ceiling with filing cabinets; moreover, without her index to the files, he would not have been able to locate incriminating material, for files were deliberately mislabeled, e.g., President Nixon's file was labeled "Obscene Matters".
On May 4, Gandy turned over 12 boxes labelled "Official/Confidential", containing 167 files and 17,750 pages, to Mark Felt. Many of them contained derogatory information. Gray told the press that afternoon that "there are no dossiers or secret files. There are just general files and I took steps to preserve their integrity." Gandy retained the "Personal File".
Gandy worked on going through Hoover's "Personal File" in the office until May 12. She then transferred at least 32 file drawers of material to the basement rec room of Hoover's Washington home at 4936 Thirtieth Place, NW, where she continued her work from May 13 to July 17. Gandy later testified nothing official had been removed from the Bureau's offices, "not even his badge." At Hoover's residence the destruction was overseen by John P. Mohr, the number three man in the Bureau after Hoover and Tolson. They were aided by James Jesus Angleton, the Central Intelligence Agency's counterintelligence chief, whom Hoover's neighbors saw removing boxes from Hoover's home. Mohr would claim the boxes Angleton removed were cases of spoiled wine.
When the House Committee on Government Oversight investigated the F.B.I.'s spying on and harassment of Martin Luther King, Jr. and others in 1975, Gandy was called to testify. "I tore them up, put them in boxes, and they were taken away to be shredded," she told the congressmen about the papers. The Bureau's Washington field office had F.B.I. drivers transport the material to Hoover's home, then once Gandy had gone through the material, the drivers transported it back to the field office in the Old Post Office Building on Pennsylvania Avenue, where it was shredded and burned.
Gandy stated that Hoover had left standing instructions to destroy his personal papers upon his death, and that this instruction was confirmed by Tolson and Gray. Gandy stated that she destroyed no official papers, that everything was personal papers of Hoover's. The staff of the subcommittee did not believe her, but she told the committee: "I have no reason to lie." Representative Andrew Maguire (D-New Jersey), a freshman member of the 94th Congress, said "I find your testimony very difficult to believe." Gandy held her ground: "That is your privilege."
"I can give you my word. I know what there was—letters to and from friends, personal friends, a lot of letters," she testified. Gandy also said the files she took to his home also included his financial papers, such as tax returns and investment statements, the deed to his home, and papers relating to his dogs' pedigrees.
Curt Gentry wrote:
In "J. Edgar Hoover: The Man and the Secrets", Gentry describes the nature of the files: "... their contents included blackmail material on the patriarch of an American political dynasty, his sons, their wives, and other women; allegations of two homosexual arrests which Hoover leaked to help defeat a witty, urbane Democratic presidential candidate; the surveillance reports on one of America's best-known first ladies and her alleged lovers, both male and female, white and black; the child molestation documentation the director used to control and manipulate one of the Red-baiting proteges; a list of the Bureau's spies in the White House during the eight administrations when Hoover was FBI director; the forbidden fruit of hundreds of illegal wiretaps and bugs, containing, for example, evidence that an attorney general, Tom C. Clark, who later became Supreme Court justice, had received payoffs from the Chicago syndicate; as well as celebrity files, with all the unsavory gossip Hoover could amass on some of the biggest names in show business."
While Gandy officially retired the day Hoover died, she spent the next few weeks destroying his papers (as described and referenced above). Hoover left her $5,000 in his will.
In 1961, Gandy and her sister, Lucy G. Rodman, donated a portrait of their mother by Thomas Eakins to the Smithsonian American Art Museum. Gandy lived in Washington until 1986, when she moved to DeLand, Florida, in Volusia County, where a niece lived. Gandy was an avid trout fisherman.
Gandy died of a heart attack on July 7, 1988, either in DeLand (as indicated by her "New York Times" obituary) or in nearby Orange City, Florida (as stated in her "Post" obituary).
Gandy was portrayed by actresses Lee Kessler in the television film "J. Edgar Hoover" (1987) and Naomi Watts in the cinematic release "J. Edgar" (2011).
Horsepower
Horsepower (hp) is a unit of measurement of power, or the rate at which work is done, usually in reference to the output of engines or motors. There are many different standards and types of horsepower. The two most common definitions in use today are the mechanical horsepower (or imperial horsepower), which is about 745.7 watts, and the metric horsepower, which is approximately 735.5 watts.
The term was adopted in the late 18th century by Scottish engineer James Watt to compare the output of steam engines with the power of draft horses. It was later expanded to include the output power of other types of piston engines, as well as turbines, electric motors and other machinery. The definition of the unit varied among geographical regions. Most countries now use the SI unit watt for measurement of power. With the implementation of the EU Directive 80/181/EEC on 1 January 2010, the use of horsepower in the EU is permitted only as a supplementary unit.
The development of the steam engine provided a reason to compare the output of horses with that of the engines that could replace them. In 1702, Thomas Savery wrote in "The Miner's Friend":
So that an engine which will raise as much water as two horses, working together at one time in such a work, can do, and for which there must be constantly kept ten or twelve horses for doing the same. Then I say, such an engine may be made large enough to do the work required in employing eight, ten, fifteen, or twenty horses to be constantly maintained and kept for doing such a work…
The idea was later used by James Watt to help market his improved steam engine. He had previously agreed to take royalties of one third of the savings in coal from the older Newcomen steam engines. This royalty scheme did not work with customers who did not have existing steam engines but used horses instead.
Watt determined that a horse could turn a mill wheel 144 times in an hour (or 2.4 times a minute). The wheel was 12 feet (3.7 m) in radius; therefore, the horse travelled 2.4 × 2π × 12 ≈ 181 feet in one minute. Watt judged that the horse could pull with a force of 180 pounds-force (800 N). So:
Watt defined and calculated the horsepower as 32,572 ft⋅lbf/min, which was rounded to an even 33,000 ft⋅lbf/min.
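As a sanity check, Watt's arithmetic can be reproduced directly (using the commonly cited figures of a 12-foot wheel radius and a 180 lbf pull):

```python
import math

# Watt's figures: 144 wheel turns per hour, 12 ft wheel radius, 180 lbf pull
turns_per_minute = 144 / 60                                # 2.4 turns/min
distance_ft_per_min = turns_per_minute * 2 * math.pi * 12  # ~181 ft travelled per minute
power_ft_lbf_per_min = distance_ft_per_min * 180           # force x distance per minute

print(round(power_ft_lbf_per_min))  # 32572, which Watt rounded up to 33,000
```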
Watt determined that a pony could lift an average of 220 lbf 100 ft per minute over a four-hour working shift. Watt then judged a horse was 50% more powerful than a pony and thus arrived at the 33,000 ft⋅lbf/min figure. "Engineering in History" recounts that John Smeaton initially estimated that a horse could produce 22,916 foot-pounds per minute. John Desaguliers had previously suggested 27,500 foot-pounds per minute, and Tredgold suggested 26,000 foot-pounds per minute. "Watt found by experiment in 1782 that a 'brewery horse' could produce 32,400 foot-pounds per minute." James Watt and Matthew Boulton standardized that figure at 33,000 foot-pounds per minute the next year.
A common legend states that the unit was created when one of Watt's first customers, a brewer, specifically demanded an engine that would match a horse, and chose the strongest horse he had, driving it to the limit. Watt, while aware of the trick, accepted the challenge and built a machine that was actually even stronger than the figure achieved by the brewer, and it was the output of that machine which became the horsepower.
In 1993, R. D. Stevenson and R. J. Wassersug published correspondence in "Nature" summarizing measurements and calculations of peak and sustained work rates of a horse. Citing measurements made at the 1926 Iowa State Fair, they reported that the peak power over a few seconds has been measured to be as high as 14.9 hp (11.1 kW) and also observed that for sustained activity, a work rate of about 1 hp (0.75 kW) per horse is consistent with agricultural advice from both the 19th and 20th centuries and also consistent with a work rate of about four times the basal rate expended by other vertebrates for sustained activity.
When considering human-powered equipment, a healthy human can produce about 1.2 hp (0.89 kW) briefly (see orders of magnitude) and sustain about 0.1 hp (0.075 kW) indefinitely; trained athletes can manage up to about 2.5 hp (1.9 kW) briefly and 0.35 hp (0.26 kW) for a period of several hours. The Jamaican sprinter Usain Bolt produced a maximum of 3.5 hp (2.6 kW) 0.89 seconds into his 9.58-second 100-metre dash world record in 2009.
If torque and rotational speed are expressed in coherent SI units, the power is calculated as

P = τ ⋅ ω

where P is power in watts, τ is torque in newton-metres, and ω is angular speed in radians per second. When using other units, or if the speed is in revolutions per unit time rather than radians, a conversion factor has to be included.
When torque τ is in pound-feet and rotational speed N is in revolutions per minute (rpm), the resulting power in horsepower is

P = τ × N / 5,252

The constant 5,252 is the rounded value of (33,000 ft⋅lbf/min)/(2π rad/rev).
When torque τ is in inch-pounds,

P = τ × N / 63,025

The constant 63,025 is the approximation of (33,000 ft⋅lbf/min × 12 in/ft)/(2π rad/rev).
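These two conversions can be written as a short sketch; the 5,252 crossover point (where a torque curve in lb⋅ft and a power curve in hp intersect) makes a convenient check:

```python
def hp_from_torque_rpm(torque_lb_ft, rpm):
    # 5,252 is the rounded value of 33,000 / (2 * pi)
    return torque_lb_ft * rpm / 5252

def hp_from_torque_inlb_rpm(torque_in_lb, rpm):
    # 63,025 approximates 33,000 * 12 / (2 * pi) for torque in inch-pounds
    return torque_in_lb * rpm / 63025

# At 5,252 rpm, horsepower equals torque in lb-ft: 300 lb-ft -> 300 hp
print(hp_from_torque_rpm(300, 5252))  # 300.0
```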
The following definitions have been or are widely used:
In certain situations it is necessary to distinguish between the various definitions of horsepower and thus a suffix is added: hp(I) for mechanical (or imperial) horsepower, hp(M) for metric horsepower, hp(S) for boiler (or steam) horsepower and hp(E) for electrical horsepower.
Assuming the third CGPM (1901, CR 70) definition of standard gravity, g = 9.80665 m/s², is used to define the pound-force as well as the kilogram-force, and the international avoirdupois pound (1959), one mechanical horsepower is:
Or given that 1 hp = 550 ft⋅lbf/s, 1 ft = 0.3048 m, 1 lbf ≈ 4.448 N, 1 J = 1 N⋅m, 1 W = 1 J/s: 1 hp ≈ 746 W
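The exact conversion follows mechanically from the definitions of the foot and the pound-force:

```python
# Exact-by-definition constants: 1 lb = 0.45359237 kg, g = 9.80665 m/s^2, 1 ft = 0.3048 m
LBF_TO_N = 0.45359237 * 9.80665   # one pound-force in newtons (~4.448 N)
FT_TO_M = 0.3048

# 1 mechanical horsepower = 550 ft.lbf/s, converted to watts
HP_MECHANICAL_W = 550 * FT_TO_M * LBF_TO_N
print(round(HP_MECHANICAL_W, 4))  # 745.6999
```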
The various units used to indicate this definition ("PS", "cv", "hk", "pk", "ks" and "ch") all translate to "horse power" in English. British manufacturers often intermix metric horsepower and mechanical horsepower depending on the origin of the engine in question. Sometimes the metric horsepower rating of an engine is conservative enough so that the same figure can be used for both 80/1269/EEC with metric hp and SAE J1349 with imperial hp.
DIN 66036 defines one metric horsepower as the power to raise a mass of 75 kilograms against the Earth's gravitational force over a distance of one metre in one second: 75 kgf⋅m/s = 1 PS. This is equivalent to 735.499 W, or 98.6% of an imperial mechanical horsepower.
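The DIN definition reduces to a one-line calculation using standard gravity:

```python
G = 9.80665            # standard gravity in m/s^2 (CGPM 1901 definition)
PS_W = 75 * G * 1 / 1  # 75 kg raised 1 m in 1 s -> watts

print(round(PS_W, 5))  # 735.49875
```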
In 1972, the PS was rendered obsolete by EEC directives, when it was replaced by the kilowatt as the official power-measuring unit. It is still in use for commercial and advertising purposes, in addition to the kilowatt rating, as many customers are still not familiar with the use of kilowatts for engines.
Other names for the metric horsepower are the Italian "cavallo vapore" (cv), Dutch "paardenkracht" (pk), the French "cheval-vapeur" (ch), the Spanish "caballo de vapor" and Portuguese "cavalo-vapor" (cv), the Russian "лошадиная сила" (л.с.), the Swedish "hästkraft" (hk), the Finnish "hevosvoima" (hv), the Estonian "hobujõud" (hj), the Norwegian and Danish "hestekraft" (hk), the Hungarian "lóerő" (LE), the Czech "koňská síla" and Slovak "konská sila" (k or ks), the Bosnian/Croatian/Serbian "konjska snaga" (KS), the Bulgarian "конска сила", the Macedonian "коњска сила" (KS), the Polish "koń mechaniczny" (KM), Slovenian "konjska moč" (KM), the Ukrainian "кінська сила" (к.с.) and the Romanian "cal-putere" (CP), which all equal the German "Pferdestärke" (PS).
In the 19th century, the French had their own unit, which they used instead of the CV or horsepower. It was called the poncelet and was abbreviated "p".
Tax horsepower is a non-linear rating of a motor vehicle for tax purposes. In the French system, the fiscal power is CV = (U/45) + (P/40)^1.6, where "P" is the maximum power in kilowatts and "U" is the amount of carbon dioxide (CO2) emitted in grams per kilometre. The term for CO2 measurements has been included in the definition only since 1998, so older ratings in CV are not directly comparable. The fiscal power has found its way into the naming of automobile models, such as the popular Citroën deux-chevaux. The "cheval-vapeur" (ch) unit should not be confused with the French "cheval fiscal" (CV).
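A minimal sketch of the post-1998 French fiscal-power formula as commonly cited (CV = U/45 + (P/40)^1.6; the sample values of 100 kW and 150 g/km are illustrative, not from the text):

```python
def french_fiscal_power(p_kw, co2_g_per_km):
    """Post-1998 French fiscal horsepower (CV), as commonly cited."""
    return co2_g_per_km / 45 + (p_kw / 40) ** 1.6

# A hypothetical 100 kW car emitting 150 g CO2/km
print(round(french_fiscal_power(100, 150), 1))
```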
Nameplates on electrical motors show their power output, not the power input (the power delivered at the shaft, not the power consumed to drive the motor). This power output is ordinarily stated in watts or kilowatts. In the United States, the power output is stated in horsepower, which for this purpose is defined as exactly 746 W.
Hydraulic horsepower can represent the power available within hydraulic machinery, power through the down-hole nozzle of a drilling rig, or can be used to estimate the mechanical power needed to generate a known hydraulic flow rate.
It may be calculated as

hydraulic horsepower = (pressure × flow rate) / 1,714

where pressure is in pounds per square inch (psi) and flow rate is in US gallons per minute (gpm).
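The 1,714 constant follows from 1 US gallon = 231 cubic inches; a short sketch:

```python
def hydraulic_hp(pressure_psi, flow_gpm):
    # 1,714 ~= 33,000 ft.lbf/min / (231 in^3/gal / 12 in/ft)
    return pressure_psi * flow_gpm / 1714

# e.g. 1,000 psi at 10 gpm
print(round(hydraulic_hp(1000, 10), 2))  # 5.83
```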
Drilling rigs are powered mechanically by rotating the drill pipe from above. Hydraulic power is still needed though, as between 2 and 7 hp are required to push mud through the drill bit to clear waste rock. Additional hydraulic power may also be used to drive a down-hole mud motor to power directional drilling.
Boiler horsepower is a boiler's capacity to deliver steam to a steam engine and is not the same unit of power as the 550 ft-lb/s definition. One boiler horsepower is equal to the thermal energy rate required to evaporate 34.5 lb of fresh water at 212 °F in one hour. In the early days of steam use, the boiler horsepower was roughly comparable to the horsepower of engines fed by the boiler.
The term "boiler horsepower" was originally developed at the Philadelphia Centennial Exhibition in 1876, where the best steam engines of that period were tested. The average steam consumption of those engines (per output horsepower) was determined to be the evaporation of 30 pounds of water per hour, based on feed water at 100 °F, and saturated steam generated at 70 psi. This original definition is equivalent to a boiler heat output of 33,485 Btu/h. Years later in 1884, the ASME re-defined the boiler horsepower as the thermal output equal to the evaporation of 34.5 pounds per hour of water "from and at" 212 °F. This considerably simplified boiler testing, and provided more accurate comparisons of the boilers at that time. This revised definition is equivalent to a boiler heat output of 33,469 Btu/h. Present industrial practice is to define "boiler horsepower" as a boiler thermal output equal to 33,475 Btu/h, which is very close to the original and revised definitions.
Boiler horsepower is still used to measure boiler output in industrial boiler engineering in Australia, the US, and New Zealand. Boiler horsepower is abbreviated BHP, not to be confused with brake horsepower, below, which is also abbreviated BHP.
Drawbar horsepower (dbhp) is the power a railway locomotive has available to haul a train or an agricultural tractor to pull an implement. This is a measured figure rather than a calculated one. A special railway car called a dynamometer car coupled behind the locomotive keeps a continuous record of the drawbar pull exerted, and the speed. From these, the power generated can be calculated. To determine the maximum power available, a controllable load is required; it is normally a second locomotive with its brakes applied, in addition to a static load.
If the drawbar force ("F") is measured in pounds-force (lbf) and speed ("v") is measured in miles per hour (mph), then the drawbar power ("P") in horsepower (hp) is
P = F × v / 375.
Example: How much power is needed to pull a drawbar load of 2,025 pounds-force at 5 miles per hour?
P = 2,025 × 5 / 375 = 27 hp.
The constant 375 is because 1 hp = 375 lbf⋅mph. If other units are used, the constant is different. When using coherent SI units (watts, newtons, and metres per second), no constant is needed, and the formula becomes P = Fv.
This formula may also be used to calculate the horsepower of a jet engine, using the speed of the jet and the thrust required to maintain that speed.
Example: How much power is generated with a thrust of 4,000 pounds at 400 miles per hour? P = 4,000 × 400 / 375 ≈ 4,267 hp.
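Both worked examples follow from the same hp = F × v / 375 relation; a sketch (names are illustrative):

```python
def drawbar_hp(force_lbf: float, speed_mph: float) -> float:
    """Drawbar (or thrust) power: 1 hp = 375 lbf*mph,
    so hp = force (lbf) * speed (mph) / 375."""
    return force_lbf * speed_mph / 375

# Worked examples from the text:
locomotive_hp = drawbar_hp(2025, 5)   # 2,025 lbf at 5 mph -> 27 hp
jet_hp = drawbar_hp(4000, 400)        # 4,000 lbf thrust at 400 mph -> ~4,267 hp
```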
This measure was instituted by the Royal Automobile Club in Britain and was used to denote the power of early 20th-century British cars. (An identical measure, known as ALAM horsepower or NACC horsepower, was used for early U.S. automobiles.) Many cars took their names from this figure (hence the Austin Seven and Riley Nine), while others had names such as "40/50 hp", which indicated the RAC figure followed by the true measured power.
Taxable horsepower does not reflect developed horsepower; rather, it is a calculated figure based on the engine's bore size, number of cylinders, and a (now archaic) presumption of engine efficiency. As new engines were designed with ever-increasing efficiency, it was no longer a useful measure, but was kept in use by UK regulations which used the rating for tax purposes.
The rating was calculated as
RAC hp = D² × n / 2.5,
where "D" is the cylinder bore in inches and "n" is the number of cylinders.
This is equal to the engine displacement in cubic inches divided by 0.625π, then divided again by the stroke in inches.
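The bore-based rating and its displacement-based equivalent can be cross-checked numerically; a sketch assuming the D²n/2.5 form (the bore, stroke, and cylinder-count values below are arbitrary illustrations):

```python
import math

def rac_taxable_hp(bore_in: float, n_cylinders: int) -> float:
    """RAC (British tax) horsepower: bore^2 * cylinders / 2.5, bore in inches."""
    return bore_in ** 2 * n_cylinders / 2.5

def rac_from_displacement(disp_cu_in: float, stroke_in: float) -> float:
    """Equivalent form: displacement / (0.625 * pi) / stroke."""
    return disp_cu_in / (0.625 * math.pi) / stroke_in

# The two forms agree: e.g. a 3 in bore, 4 in stroke, 4-cylinder engine
bore, stroke, n = 3.0, 4.0, 4
disp = math.pi / 4 * bore ** 2 * stroke * n          # cylindrical displacement
assert math.isclose(rac_taxable_hp(bore, n), rac_from_displacement(disp, stroke))
```

Note that the stroke cancels out of the final rating, which is exactly why designers shrank the bore and lengthened the stroke to minimise tax.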
Since taxable horsepower was computed based on bore and number of cylinders, not based on actual displacement, it gave rise to engines with "undersquare" dimensions (bore smaller than stroke), which tended to impose an artificially low limit on rotational speed, hampering the potential power output and efficiency of the engine.
The situation persisted for several generations of four- and six-cylinder British engines: for example, Jaguar's 3.4-litre XK engine of the 1950s had six cylinders with a bore smaller than its stroke, whereas most American automakers had long since moved to oversquare (large bore, short stroke) V8 engines (see, for example, the early Chrysler Hemi).
The power of an engine may be measured or estimated at several points in the transmission of the power from its generation to its application. A number of names are used for the power developed at various stages in this process, but none is a clear indicator of either the measurement system or definition used.
In the case of an engine dynamometer, power is measured at the engine's flywheel.
In general:
All the above assumes that no power inflation factors have been applied to any of the readings.
Engine designers use expressions other than horsepower to denote objective targets or performance, such as brake mean effective pressure (BMEP). This is a coefficient of theoretical brake horsepower and cylinder pressures during combustion.
Nominal horsepower (nhp) is an early 19th-century rule of thumb used to estimate the power of steam engines. It assumed a mean steam pressure of 7 psi.
nhp = 7 × area of piston in square inches × equivalent piston speed in feet per minute/33,000
For paddle ships, the Admiralty rule was that the piston speed in feet per minute was taken as 129.7 × (stroke)^(1/3.38). For screw steamers, the intended piston speed was used.
The stroke (or length of stroke) was the distance moved by the piston measured in feet.
For the nominal horsepower to equal the actual power it would be necessary for the mean steam pressure in the cylinder during the stroke to be 7 psi and for the piston speed to be that generated by the assumed relationship for paddle ships.
The French Navy used the same definition of nominal horse power as the Royal Navy.
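The nominal-horsepower rule of thumb above can be sketched as follows (the exponent in the Admiralty paddle rule is read as 1/3.38, as printed above; names are illustrative):

```python
def nominal_hp(piston_area_sq_in: float, piston_speed_fpm: float) -> float:
    """Nominal horsepower: assumes a mean steam pressure of 7 psi.
    nhp = 7 * piston area (in^2) * piston speed (ft/min) / 33,000."""
    return 7 * piston_area_sq_in * piston_speed_fpm / 33000

def admiralty_paddle_speed(stroke_ft: float) -> float:
    """Admiralty rule for paddle ships: equivalent piston speed in ft/min,
    taken as 129.7 * stroke^(1/3.38), stroke in feet."""
    return 129.7 * stroke_ft ** (1 / 3.38)

# Example: a paddle engine with a 4 ft stroke and 1,000 sq in piston area
speed = admiralty_paddle_speed(4.0)
nhp = nominal_hp(1000, speed)
```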
Indicated horsepower (ihp) is the theoretical power of a reciprocating engine if it is completely frictionless in converting the expanding gas energy (piston pressure × displacement) in the cylinders. It is calculated from the pressures developed in the cylinders, measured by a device called an "engine indicator" – hence indicated horsepower. As the piston advances throughout its stroke, the pressure against the piston generally decreases, and the indicator device usually generates a graph of pressure vs stroke within the working cylinder. From this graph the amount of work performed during the piston stroke may be calculated.
Indicated horsepower was a better measure of engine power than nominal horsepower (nhp) because it took account of steam pressure. But unlike later measures such as shaft horsepower (shp) and brake horsepower (bhp), it did not take into account power losses due to the machinery's internal friction, such as a piston sliding within the cylinder, plus bearing, transmission and gearbox friction.
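Indicated horsepower is conventionally reduced from the indicator diagram via the classic "PLAN" formula; the formula itself is not spelled out in the text, but it is the standard reduction, sketched here:

```python
def indicated_hp(mep_psi: float, stroke_ft: float, area_sq_in: float,
                 power_strokes_per_min: float) -> float:
    """Classic 'PLAN' formula for indicated horsepower:
    ihp = P * L * A * N / 33,000, where P is the mean effective pressure
    (psi, the average height of the indicator diagram), L the stroke (ft),
    A the piston area (in^2), and N the power strokes per minute."""
    return mep_psi * stroke_ft * area_sq_in * power_strokes_per_min / 33000

# Example: 60 psi MEP, 2 ft stroke, 200 sq in piston, 120 strokes/min
ihp = indicated_hp(60, 2, 200, 120)
```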
Brake horsepower (bhp) is the power measured using a brake-type (load) dynamometer at a specified location, such as the crankshaft, the output shaft of the transmission, the rear axle or the rear wheels. Because bhp is derived from a load dynamometer, it should not be confused with the inflated power figures produced using an inertia-type (non-load) dynamometer.
In Europe, the DIN 70020 standard tests the engine fitted with all ancillaries and exhaust system as used in the car. The older American standard (SAE gross horsepower, referred to as bhp) used an engine without alternator, water pump, and other auxiliary components such as power steering pump, muffled exhaust system, etc., so the figures were higher than the European figures for the same engine. The newer American standard (referred to as SAE net horsepower) tests an engine with all the auxiliary components (see "Engine power test standards" below).
Brake refers to the device used to apply a braking force (load) that balances the engine's output and holds it at a desired rotational speed. During testing, the output torque and rotational speed are measured to determine the brake horsepower. Horsepower was originally measured and calculated by use of the "indicator diagram" (a James Watt invention of the late 18th century), and later by means of a Prony brake connected to the engine's output shaft. Modern dynamometers use any of several braking methods to measure the engine's brake horsepower, the actual output of the engine itself, before losses to the drivetrain.
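Given the measured torque and rotational speed, brake horsepower follows directly from the definition of the mechanical horsepower (33,000 ft·lbf/min); a sketch:

```python
import math

def brake_hp(torque_lb_ft: float, rpm: float) -> float:
    """Brake horsepower from dynamometer readings:
    bhp = 2*pi * rpm * torque / 33,000 (equivalently torque * rpm / ~5,252),
    with torque in lb*ft and speed in revolutions per minute."""
    return 2 * math.pi * rpm * torque_lb_ft / 33000

# Example: 300 lb*ft of torque held at 5,000 rpm
bhp = brake_hp(300, 5000)
```

The familiar shortcut "torque and horsepower curves cross at 5,252 rpm" is just this constant: 33,000 / (2π) ≈ 5,252.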
Shaft horsepower (shp) is the power delivered to a propeller shaft, a turbine shaft, or to an output shaft of an automotive transmission. Shaft horsepower is a common rating for turboshaft and turboprop engines, industrial turbines, and some marine applications.
Equivalent shaft horsepower (eshp) is sometimes used to rate turboprop engines. It includes the equivalent power derived from residual jet thrust from the turbine exhaust.
There exist a number of different standards determining how the power and torque of an automobile engine are measured and corrected. Correction factors are used to adjust power and torque measurements to standard atmospheric conditions, to provide a more accurate comparison between engines as they are affected by the pressure, humidity, and temperature of ambient air. Some standards are described below.
In the early twentieth century, a so-called "SAE horsepower" was sometimes quoted for U.S. automobiles. This long predates the Society of Automotive Engineers (SAE) horsepower measurement standards and was really just another term for the widely used ALAM or NACC horsepower figure, which was the same as the British RAC horsepower, used for tax purposes.
Prior to the 1972 model year, American automakers rated and advertised their engines in brake horsepower, "bhp", which was a version of brake horsepower called SAE gross horsepower because it was measured according to Society of Automotive Engineers (SAE) standards (J245 and J1995) that call for a stock test engine without accessories (such as dynamo/alternator, radiator fan, water pump), and sometimes fitted with long tube test headers in lieu of the OEM exhaust manifolds. This contrasts with both SAE net power and DIN 70020 standards, which account for engine accessories (but not transmission losses). The atmospheric correction standards for barometric pressure, humidity and temperature for SAE gross power testing were relatively idealistic.
In the United States, the term "bhp" fell into disuse in 1971–1972, as automakers began to quote power in terms of SAE net horsepower in accord with SAE standard J1349. Like SAE gross and other brake horsepower protocols, SAE net hp is measured at the engine's crankshaft, and so does not account for transmission losses. However, similar to the DIN 70020 standard, SAE net power testing protocol calls for standard production-type belt-driven accessories, air cleaner, emission controls, exhaust system, and other power-consuming accessories. This produces ratings in closer alignment with the power produced by the engine as it is actually configured and sold.
In 2005, the SAE introduced "SAE Certified Power" with SAE J2723. This test is voluntary and is not in itself a separate engine test code but a certification of either J1349 or J1995, after which the manufacturer is allowed to advertise "Certified to SAE J1349" or "Certified to SAE J1995" depending on which test standard has been followed. To attain certification, the test must follow the SAE standard in question, take place in an ISO 9000/9002 certified facility, and be witnessed by an SAE-approved third party.
A few manufacturers such as Honda and Toyota switched to the new ratings immediately. The ratings for Toyota's Camry 3.0 L "1MZ-FE" V6, and for the company's Lexus ES 330 and Camry SE V6 (3.3 L V6), all fell under the new procedure. The first engine certified under the new program was the 7.0 L LS7 used in the 2006 Chevrolet Corvette Z06, whose certified power rose slightly.
While Toyota and Honda retested their entire vehicle lineups, other automakers generally retested only those with updated powertrains. For example, the 2006 Ford Five Hundred is rated at 203 horsepower, the same as the 2005 model. However, the 2006 rating does not reflect the new SAE testing procedure, as Ford did not incur the extra expense of retesting its existing engines. Over time, most automakers were expected to comply with the new guidelines.
SAE tightened its horsepower rules to eliminate the opportunity for engine manufacturers to manipulate factors affecting performance, such as how much oil was in the crankcase, engine control system calibration, and whether an engine was tested with high-octane fuel. In some cases, such factors can add up to a measurable change in horsepower ratings.
DIN 70020 is a German DIN standard for measuring road vehicle horsepower. DIN hp is measured at the engine's output shaft as a form of metric horsepower rather than mechanical horsepower. Similar to the SAE net power rating, and unlike SAE gross power, DIN testing measures the engine as installed in the vehicle, with cooling system, charging system and stock exhaust system all connected. DIN horsepower figures are often quoted in "PS", derived from the German word for horsepower, "Pferdestärke".
A test standard by Italian CUNA ("Commissione Tecnica per l'Unificazione nell'Automobile", Technical Commission for Automobile Unification), a federated entity of standards organisation UNI, was formerly used in Italy.
CUNA prescribed that the engine be tested with all accessories necessary to its running fitted (such as the water pump), while all others—such as alternator/dynamo, radiator fan, and exhaust manifold—could be omitted. All calibration and accessories had to be as on production engines.
ECE R24 is a UN standard for the approval of compression ignition engine emissions, installation and measurement of engine power. It is similar to DIN 70020 standard, but with different requirements for connecting an engine's fan during testing causing it to absorb less power from the engine.
ECE R85 is a UN standard for the approval of internal combustion engines with regard to the measurement of the net power.
80/1269/EEC of 16 December 1980 is a European Union standard for road vehicle engine power.
The International Organization for Standardization (ISO) publishes several standards for measuring engine horsepower.
JIS D 1001 is a Japanese net, and gross, engine power test code for automobiles or trucks having a spark ignition, diesel engine, or fuel injection engine. | https://en.wikipedia.org/wiki?curid=14019 |
History of London
The history of London, the capital city of England and the United Kingdom, extends over 2000 years. In that time, it has become one of the world's most significant financial and cultural capital cities. It has withstood plague, devastating fire, civil war, aerial bombardment, terrorist attacks, and riots.
The City of London is the historic core of the Greater London metropolis, and is today its primary financial district, though it represents only a small part of the wider metropolis.
According to "Historia Regum Britanniae", by Geoffrey of Monmouth, London was founded by Brutus of Troy about 1000–1100 B.C.E. after he defeated the native giant Gogmagog; the settlement was known as "Troia Nova" (Latin for New Troy), which, according to a pseudo-etymology, was corrupted to "Trinovantum". The Trinovantes were the Iron Age tribe who inhabited the area prior to the Romans. Geoffrey provides prehistoric London with a rich array of legendary kings, such as Lud (see also Lludd, from Welsh mythology) who, he claims, renamed the town "Caer Ludein", from which London was derived, and was buried at Ludgate.
Some recent discoveries indicate probable very early settlements near the Thames in the London area. In 1993, the remains of a Bronze Age bridge were found on the Thames's south foreshore, upstream of Vauxhall Bridge. This bridge either crossed the Thames, or went to a now lost island in the river. Dendrochronology dated the timbers to between 1750 B.C.E. and 1285 B.C.E. In 2001, a further dig found that the timbers were driven vertically into the ground on the south bank of the Thames west of Vauxhall Bridge. In 2010, the foundations of a large timber structure, dated to between 4,800 B.C.E. and 4,500 B.C.E., were found, again on the foreshore south of Vauxhall Bridge. The function of the mesolithic structure is not known. All these structures are on the south bank at a natural crossing point where the River Effra flows into the Thames.
"Londinium" was established as a civilian town by the Romans about four years after the invasion of AD 43. London, like Rome, was founded on the point of the river where it was narrow enough to bridge and the strategic location of the city provided easy access to much of Europe. Early Roman London occupied a relatively small area, roughly equivalent to the size of Hyde Park. In around AD 60, it was destroyed by the Iceni led by their queen Boudica. The city was quickly rebuilt as a planned Roman town and recovered after perhaps 10 years; the city grew rapidly over the following decades.
During the 2nd century "Londinium" was at its height and replaced Colchester as the capital of Roman Britain (Britannia). Its population was around 60,000 inhabitants. It boasted major public buildings, including the largest basilica north of the Alps, temples, bath houses, an amphitheatre and a large fort for the city garrison. Political instability and recession from the 3rd century onwards led to a slow decline.
At some time between AD 180 and AD 225, the Romans built the defensive London Wall around the landward side of the city. The wall would survive for another 1,600 years and define the City of London's perimeters for centuries to come. The perimeters of the present City are roughly defined by the line of the ancient wall.
Londinium was an ethnically diverse city with inhabitants from across the Roman Empire, including natives of Britannia, continental Europe, the Middle East, and North Africa.
In the late 3rd century, Londinium was raided on several occasions by Saxon pirates. This led, from around 255 onwards, to the construction of an additional riverside wall. Six of the traditional seven city gates of London are of Roman origin, namely: Ludgate, Newgate, Aldersgate, Cripplegate, Bishopsgate and Aldgate (Moorgate is the exception, being of medieval origin).
By the 5th century, the Roman Empire was in rapid decline and in AD 410, the Roman occupation of Britannia came to an end. Following this, the Roman city also went into rapid decline and by the end of the 5th century was practically abandoned.
Until recently it was believed that Anglo-Saxon settlement initially avoided the area immediately around Londinium. However, the discovery in 2008 of an Anglo-Saxon cemetery at Covent Garden indicates that the incomers had begun to settle there at least as early as the 6th century and possibly in the 5th. The main focus of this settlement was outside the Roman walls, clustering a short distance to the west along what is now the Strand, between the Aldwych and Trafalgar Square. It was known as "Lundenwic", the "-wic" suffix here denoting a trading settlement. Recent excavations have also highlighted the population density and relatively sophisticated urban organisation of this earlier Anglo-Saxon London, which was laid out on a grid pattern and grew to house a likely population of 10–12,000.
Early Anglo-Saxon London belonged to a people known as the Middle Saxons, from whom the name of the county of Middlesex is derived, but who probably also occupied the approximate area of modern Hertfordshire and Surrey. However, by the early 7th century the London area had been incorporated into the kingdom of the East Saxons. In 604 King Saeberht of Essex converted to Christianity and London received Mellitus, its first post-Roman bishop.
At this time Essex was under the overlordship of King Æthelberht of Kent, and it was under Æthelberht's patronage that Mellitus founded the first St. Paul's Cathedral, traditionally said to be on the site of an old Roman Temple of Diana (although Christopher Wren found no evidence of this). It would have only been a modest church at first and may well have been destroyed after he was expelled from the city by Saeberht's pagan successors.
The permanent establishment of Christianity in the East Saxon kingdom took place in the reign of King Sigeberht II in the 650s. During the 8th century, the kingdom of Mercia extended its dominance over south-eastern England, initially through overlordship which at times developed into outright annexation. London seems to have come under direct Mercian control in the 730s.
Viking attacks dominated most of the 9th century, becoming increasingly common from around 830 onwards. London was sacked in 842 and again in 851. The Danish "Great Heathen Army", which had rampaged across England since 865, wintered in London in 871. The city remained in Danish hands until 886, when it was captured by the forces of King Alfred the Great of Wessex and reincorporated into Mercia, then governed under Alfred's sovereignty by his son-in-law Ealdorman Æthelred.
Around this time the focus of settlement moved within the old Roman walls for the sake of defence, and the city became known as "Lundenburh". The Roman walls were repaired and the defensive ditch re-cut, while the bridge was probably rebuilt at this time. A second fortified borough was established on the south bank at Southwark, the "Suthringa Geworc" (defensive work of the men of Surrey). The old settlement of "Lundenwic" became known as the "ealdwic" or "old settlement", a name which survives today as Aldwych.
From this point, the City of London began to develop its own unique local government. Following Æthelred's death in 911 it was transferred to Wessex, preceding the absorption of the rest of Mercia in 918. Although it faced competition for political pre-eminence in the united Kingdom of England from the traditional West Saxon centre of Winchester, London's size and commercial wealth brought it a steadily increasing importance as a focus of governmental activity. King Athelstan held many meetings of the "witan" in London and issued laws from there, while King Æthelred the Unready issued the Laws of London there in 978.
Following the resumption of Viking attacks in the reign of Æthelred, London was unsuccessfully attacked in 994 by an army under King Sweyn Forkbeard of Denmark. As English resistance to the sustained and escalating Danish onslaught finally collapsed in 1013, London repulsed an attack by the Danes and was the last place to hold out while the rest of the country submitted to Sweyn, but by the end of the year it too capitulated and Æthelred fled abroad. Sweyn died just five weeks after having been proclaimed king and Æthelred was restored to the throne, but Sweyn's son Cnut returned to the attack in 1015.
After Æthelred's death at London in 1016 his son Edmund Ironside was proclaimed king there by the "witangemot" and left to gather forces in Wessex. London was then subjected to a systematic siege by Cnut but was relieved by King Edmund's army; when Edmund again left to recruit reinforcements in Wessex the Danes resumed the siege but were again unsuccessful. However, following his defeat at the Battle of Assandun Edmund ceded to Cnut all of England north of the Thames, including London, and his death a few weeks later left Cnut in control of the whole country.
A Norse saga tells of a battle when King Æthelred returned to attack Danish-occupied London. According to the saga, the Danes lined London Bridge and showered the attackers with spears. Undaunted, the attackers pulled the roofs off nearby houses and held them over their heads in the boats. Thus protected, they were able to get close enough to the bridge to attach ropes to the piers and pull the bridge down, thus ending the Viking occupation of London. This story presumably relates to Æthelred's return to power after Sweyn's death in 1014, but there is no strong evidence of any such struggle for control of London on that occasion.
Following the extinction of Cnut's dynasty in 1042 English rule was restored under Edward the Confessor. He was responsible for the foundation of Westminster Abbey and spent much of his time at Westminster, which from this time steadily supplanted the City itself as the centre of government. Edward's death at Westminster in 1066 without a clear heir led to a succession dispute and the Norman conquest of England. Earl Harold Godwinson was elected king by the "witangemot" and crowned in Westminster Abbey but was defeated and killed by William the Bastard, Duke of Normandy at the Battle of Hastings. The surviving members of the "witan" met in London and elected King Edward's young nephew Edgar the Ætheling as king.
The Normans advanced to the south bank of the Thames opposite London, where they defeated an English attack and burned Southwark but were unable to storm the bridge. They moved upstream and crossed the river at Wallingford before advancing on London from the north-west. The resolve of the English leadership to resist collapsed and the chief citizens of London went out together with the leading members of the Church and aristocracy to submit to William at Berkhamsted, although according to some accounts there was a subsequent violent clash when the Normans reached the city. Having occupied London, William was crowned king in Westminster Abbey.
The new Norman regime established new fortresses within the city to dominate the native population. By far the most important of these was the Tower of London at the eastern end of the city, where the initial timber fortification was rapidly replaced by the construction of the first stone castle in England. The smaller forts of Baynard's Castle and Montfichet's Castle were also established along the waterfront. King William also granted a charter in 1067 confirming the city's existing rights, privileges and laws. London was a centre of England's nascent Jewish population, the first of whom arrived in about 1070. Its growing self-government was consolidated by the election rights granted by King John in 1199 and 1215.
In 1097, William Rufus, the son of William the Conqueror began the construction of 'Westminster Hall', which became the focus of the Palace of Westminster.
In 1176, construction began of the most famous incarnation of London Bridge (completed in 1209) which was built on the site of several earlier timber bridges. This bridge would last for 600 years, and remained the only bridge across the River Thames until 1739.
Violence against Jews took place in 1190, after it was rumoured that the new King had ordered their massacre after they had presented themselves at his coronation.
In 1216, during the First Barons' War London was occupied by Prince Louis of France, who had been called in by the baronial rebels against King John and was acclaimed as King of England in St Paul's Cathedral. However, following John's death in 1217 Louis's supporters reverted to their Plantagenet allegiance, rallying round John's son Henry III, and Louis was forced to withdraw from England.
In 1224, after an accusation of ritual murder, the Jewish community was subjected to a steep punitive levy. Then in 1232, Henry III confiscated the principal synagogue of the London Jewish community because he claimed their chanting was audible in a neighboring church. In 1264, during the Second Barons' War, Simon de Montfort's rebels occupied London and killed 500 Jews while attempting to seize records of debts.
London's Jewish community was forced to leave England by the expulsion by Edward I in 1290. They left for France, Holland and further afield; their property was seized, and many suffered robbery and murder as they departed.
Over the following centuries, London would shake off the heavy French cultural and linguistic influence which had been there since the times of the Norman conquest. The city would figure heavily in the development of Early Modern English.
During the Peasants' Revolt of 1381, London was invaded by rebels led by Wat Tyler. A group of peasants stormed the Tower of London and executed the Lord Chancellor, Archbishop Simon Sudbury, and the Lord Treasurer. The peasants looted the city and set fire to numerous buildings. Tyler was stabbed to death by the Lord Mayor William Walworth in a confrontation at Smithfield and the revolt collapsed.
Trade increased steadily during the Middle Ages, and London grew rapidly as a result. In 1100, London's population was somewhat more than 15,000. By 1300, it had grown to roughly 80,000. London lost at least half of its population during the Black Death in the mid-14th century, but its economic and political importance stimulated a rapid recovery despite further epidemics. Trade in London was organised into various guilds, which effectively controlled the city, and elected the Lord Mayor of the City of London.
Medieval London was made up of narrow and twisting streets, and most of the buildings were made from combustible materials such as timber and straw, which made fire a constant threat, while sanitation in cities was of low-quality.
In 1475, the Hanseatic League set up its main English trading base ("kontor") in London, called "Stalhof" or "Steelyard". It existed until 1853, when the Hanseatic cities of Lübeck, Bremen and Hamburg sold the property to South Eastern Railway. Woollen cloth was shipped undyed and undressed from 14th/15th century London to the nearby shores of the Low Countries, where it was considered indispensable.
During the Reformation, London was the principal early centre of Protestantism in England. Its close commercial connections with the Protestant heartlands in northern continental Europe, large foreign mercantile communities, disproportionately large number of literate inhabitants and role as the centre of the English print trade all contributed to the spread of the new ideas of religious reform. Before the Reformation, more than half of the area of London was the property of monasteries, nunneries and other religious houses.
Henry VIII's "Dissolution of the Monasteries" had a profound effect on the city as nearly all of this property changed hands. The process started in the mid 1530s, and by 1538 most of the larger monastic houses had been abolished. Holy Trinity Aldgate went to Lord Audley, and the Marquess of Winchester built himself a house in part of its precincts. The Charterhouse went to Lord North, Blackfriars to , the leper hospital of St Giles to Lord Dudley, while the king took for himself the leper hospital of St James, which was rebuilt as St James's Palace.
The period saw London rapidly rising in importance among Europe's commercial centres. Trade expanded beyond Western Europe to Russia, the Levant, and the Americas. This was the period of mercantilism and monopoly trading companies such as the Muscovy Company (1555) and the British East India Company (1600) were established in London by Royal Charter. The latter, which ultimately came to rule India, was one of the key institutions in London, and in Britain as a whole, for two and a half centuries. Immigrants arrived in London not just from all over England and Wales, but from abroad as well, for example Huguenots from France; the population rose from an estimated 50,000 in 1530 to about 225,000 in 1605. The growth of the population and wealth of London was fuelled by a vast expansion in the use of coastal shipping.
The late 16th and early 17th century saw the great flourishing of drama in London whose preeminent figure was William Shakespeare. During the mostly calm later years of Elizabeth's reign, some of her courtiers and some of the wealthier citizens of London built themselves country residences in Middlesex, Essex and Surrey. This was an early stirring of the villa movement, the taste for residences which were neither of the city nor on an agricultural estate, but at the time of Elizabeth's death in 1603, London was still very compact.
Xenophobia was rampant in London, and increased after the 1580s. Many immigrants became disillusioned by routine threats of violence and molestation, attempts at expulsion of foreigners, and the great difficulty in acquiring English citizenship. Dutch cities proved more hospitable, and many left London permanently. Foreigners are estimated to have made up 4,000 of the 100,000 residents of London by 1600, many being Dutch and German workers and traders.
London's expansion beyond the boundaries of the City was decisively established in the 17th century. In the opening years of that century the immediate environs of the City, with the principal exception of the aristocratic residences in the direction of Westminster, were still considered not conducive to health. Immediately to the north was Moorfields, which had recently been drained and laid out in walks, but it was frequented by beggars and travellers, who crossed it in order to get into London. Adjoining Moorfields were Finsbury Fields, a favourite practising ground for archers, and Mile End, then a common on the Great Eastern Road and famous as a rendezvous for troops.
The celebrations marking the accession of King James I were interrupted by a severe plague epidemic, which may have killed over thirty thousand people. The Lord Mayor's Show, which had been discontinued for some years, was revived by order of the king in 1609. The dissolved monastery of the Charterhouse, which had been bought and sold by courtiers several times, was purchased by Thomas Sutton for £13,000. The new hospital, chapel, and schoolhouse were begun in 1611. Charterhouse School was to be one of the principal public schools in London until it moved to Surrey in Victorian times, and the site is still used as a medical school.
The general meeting-place of Londoners in the day-time was the nave of Old St. Paul's Cathedral. Merchants conducted business in the aisles, and used the font as a counter upon which to make their payments; lawyers received clients at their particular pillars; and the unemployed looked for work. St Paul's Churchyard was the centre of the book trade and Fleet Street was a centre of public entertainment. Under James I the theatre, which established itself so firmly in the latter years of Elizabeth, grew further in popularity. The performances at the public theatres were complemented by elaborate masques at the royal court and at the inns of court.
Charles I acceded to the throne in 1625. During his reign, aristocrats began to inhabit the West End in large numbers. In addition to those who had specific business at court, increasing numbers of country landowners and their families lived in London for part of the year simply for the social life. This was the beginning of the "London season". Lincoln's Inn Fields was built about 1629. The piazza of Covent Garden, designed by England's first classically trained architect Inigo Jones followed in about 1632. The neighbouring streets were built shortly afterwards, and the names of Henrietta, Charles, James, King and York Streets were given after members of the royal family.
In January 1642 five members of parliament whom the King wished to arrest were granted refuge in the City. In August of the same year the King raised his banner at Nottingham, and during the English Civil War London took the side of parliament. Initially the king had the upper hand in military terms, and in November he won the Battle of Brentford a few miles to the west of London. The City organised a new makeshift army; Charles hesitated and retreated. Subsequently, an extensive system of fortifications was built to protect London from a renewed attack by the Royalists. This comprised a strong earthen rampart, enhanced with bastions and redoubts. It lay well beyond the City walls and encompassed the whole urban area, including Westminster and Southwark. London was not seriously threatened by the royalists again, and the financial resources of the City made an important contribution to the parliamentarians' victory in the war.
The unsanitary and overcrowded City of London had suffered numerous outbreaks of plague over the centuries, but in Britain it is the last major outbreak which is remembered as the "Great Plague". It occurred in 1665 and 1666 and killed around 60,000 people, about one fifth of the population. Samuel Pepys chronicled the epidemic in his diary. On 4 September 1665 he wrote "I have stayed in the city till above 7400 died in one week, and of them about 6000 of the plague, and little noise heard day or night but tolling of bells."
The Great Plague was immediately followed by another catastrophe, albeit one which helped to put an end to the plague. On Sunday, 2 September 1666 the Great Fire of London broke out at one o'clock in the morning at a bakery in Pudding Lane in the southern part of the City. Fanned by an east wind the fire spread, and efforts to arrest it by pulling down houses to make firebreaks were disorganised to begin with. On Tuesday night the wind fell somewhat, and on Wednesday the fire slackened. On Thursday it was extinguished, but on the evening of that day the flames again burst forth at the Temple. Some houses were at once blown up by gunpowder, and thus the fire was finally mastered. The Monument was built to commemorate the fire: for over a century and a half it bore an inscription attributing the conflagration to a "popish frenzy".
The fire destroyed about 60% of the City, including Old St Paul's Cathedral, 87 parish churches, 44 livery company halls and the Royal Exchange. However, the number of lives lost was surprisingly small; it is believed to have been 16 at most. Within a few days of the fire, three plans were presented to the king for the rebuilding of the city, by Christopher Wren, John Evelyn and Robert Hooke.
Wren proposed to build main thoroughfares north and south, and east and west, to insulate all the churches in conspicuous positions, to form the most public places into large piazzas, to unite the halls of the 12 chief livery companies into one regular square annexed to the Guildhall, and to make a fine quay on the bank of the river from Blackfriars to the Tower of London. Wren wished to build the new streets straight and in three standard widths of thirty, sixty and ninety feet. Evelyn's plan differed from Wren's chiefly in proposing a street from the church of St Dunstan's in the East to St Paul's, and in having no quay or terrace along the river. These plans were not implemented, and the rebuilt city generally followed the street plan of the old one, most of which has survived into the 21st century.
Nonetheless, the new City was different from the old one. Many aristocratic residents never returned, preferring to take new houses in the West End, where fashionable new districts such as St James's were built close to the main royal residence, which was Whitehall Palace until it was destroyed by fire in the 1690s, and thereafter St James's Palace. The rural lane of Piccadilly sprouted courtiers' mansions such as Burlington House. Thus the separation between the middle-class mercantile City of London and the aristocratic world of the court in Westminster became complete.
In the City itself there was a move from wooden buildings to stone and brick construction to reduce the risk of fire. Parliament's Rebuilding of London Act 1666 stated "building with brick [is] not only more comely and durable, but also more safe against future perils of fire". From then on only doorcases, window-frames and shop fronts were allowed to be made of wood.
Christopher Wren's plan for a new model London came to nothing, but he was appointed to rebuild the ruined parish churches and to replace St Paul's Cathedral. His domed baroque cathedral was the primary symbol of London for at least a century and a half. As city surveyor, Robert Hooke oversaw the reconstruction of the City's houses. The East End, that is the area immediately to the east of the city walls, also became heavily populated in the decades after the Great Fire. London's docks began to extend downstream, attracting many working people who worked on the docks themselves and in the processing and distributive trades. These people lived in Whitechapel, Wapping, Stepney and Limehouse, generally in slum conditions.
In the winter of 1683–1684, a frost fair was held on the Thames. The frost, which began about seven weeks before Christmas and continued for six weeks after, was the greatest on record. The Revocation of the Edict of Nantes in 1685 led to a large migration of Huguenots to London. They established a silk industry at Spitalfields.
At this time the Bank of England was founded, and the British East India Company was expanding its influence. Lloyd's of London also began to operate in the late 17th century. In 1700, London handled 80% of England's imports, 69% of its exports and 86% of its re-exports. Many of the goods were luxuries from the Americas and Asia such as silk, sugar, tea and tobacco. The last figure emphasises London's role as an entrepot: while it had many craftsmen in the 17th century, and would later acquire some large factories, its economic prominence was never based primarily on industry. Instead it was a great trading and redistribution centre. Goods were brought to London by England's increasingly dominant merchant navy, not only to satisfy domestic demand, but also for re-export throughout Europe and beyond.
William III, a Dutchman, cared little for London, the smoke of which gave him asthma, and after the first fire at Whitehall Palace (1691) he purchased Nottingham House and transformed it into Kensington Palace. Kensington was then an insignificant village, but the arrival of the court soon caused it to grow in importance. The palace was rarely favoured by future monarchs, but its construction was another step in the expansion of the bounds of London. During the same reign Greenwich Hospital, then well outside the boundary of London, but now comfortably inside it, was begun; it was the naval complement to the Chelsea Hospital for former soldiers, which had been founded in 1681. During the reign of Queen Anne an act was passed authorising the building of 50 new churches to serve the greatly increased population living outside the boundaries of the City of London.
The 18th century was a period of rapid growth for London, reflecting an increasing national population, the early stirrings of the Industrial Revolution, and London's role at the centre of the evolving British Empire.
In 1707, an Act of Union was passed merging the Scottish and the English Parliaments, thus establishing the Kingdom of Great Britain. A year later, in 1708, Christopher Wren's masterpiece, St Paul's Cathedral, was completed on his birthday, although the first service had been held on 2 December 1697, more than ten years earlier. The cathedral replaced the original St Paul's, which had been completely destroyed in the Great Fire of London, and is considered one of the finest buildings in Britain and a fine example of Baroque architecture.
Tradesmen from many countries came to London to trade goods and merchandise, while immigration and the steady arrival of people seeking work and business opportunities swelled the population, making London an altogether bigger and busier city. Britain's victory in the Seven Years' War increased the country's international standing and opened large new markets to British trade, further boosting London's prosperity.
During the Georgian period London spread beyond its traditional limits at an accelerating pace. This is shown in a series of detailed maps, particularly John Rocque's 1741–45 map and his 1746 Map of London. New districts such as Mayfair were built for the rich in the West End, new bridges over the Thames encouraged an acceleration of development in South London, and in the East End the Port of London expanded downstream from the City. This period also saw the revolt of the American colonies. In 1780, the Tower of London held its only American prisoner, the former President of the Continental Congress, Henry Laurens. In 1779, he had been appointed the Congress's envoy to Holland, securing that country's support for the Revolution. The Royal Navy captured him at sea and charged him with treason after finding papers on him that Britain treated as a cause for war against the Netherlands. He was released from the Tower on 21 December 1781 in exchange for General Lord Cornwallis.
In 1762, George III acquired Buckingham Palace (then called Buckingham House) from the Duke of Buckingham. It was enlarged over the next 75 years by architects such as John Nash.
A phenomenon of the era was the coffeehouse, which became a popular place to debate ideas. Growing literacy and the development of the printing press meant that news became widely available. Fleet Street became the centre of the embryonic national press during the century.
18th-century London was dogged by crime. The Bow Street Runners were established in 1750 as a professional police force. Penalties for crime were harsh, with the death penalty being applied for fairly minor crimes. Public hangings were common in London, and were popular public events.
In 1780, London was rocked by the Gordon Riots, an uprising by Protestants against Roman Catholic emancipation led by Lord George Gordon. Severe damage was caused to Catholic churches and homes, and 285 rioters were killed.
In the year 1787, freed slaves from London, America, and many of Britain's colonies founded Freetown in modern-day Sierra Leone.
Up until 1750, London Bridge was the only crossing over the Thames, but in that year Westminster Bridge was opened, giving London Bridge its first rival. In 1798, Frankfurt banker Nathan Mayer Rothschild arrived in London and set up a banking house in the city, with a large sum of money given to him by his father, Amschel Mayer Rothschild. The Rothschilds also had banks in Paris and Vienna. The bank financed numerous large-scale projects, especially railways around the world and the Suez Canal.
The 18th century saw the breakaway of the American colonies and other upheavals for London, but also great change and the influence of the Enlightenment, setting the stage for the modern era of the 19th century.
During the 19th century, London was transformed into the world's largest city and capital of the British Empire. Its population expanded from 1 million in 1800 to 6.7 million a century later. During this period, London became a global political, financial, and trading capital. In this position, it was largely unrivalled until the latter part of the century, when Paris and New York began to threaten its dominance.
While the city grew wealthy as Britain's holdings expanded, 19th-century London was also a city of poverty, where millions lived in overcrowded and unsanitary slums. Life for the poor was immortalised by Charles Dickens in such novels as Oliver Twist. In 1810, after the deaths of Sir Francis Baring and Abraham Goldsmid, Rothschild emerged as the major banker in London.
In 1829, the then Home Secretary (and future prime minister) Robert Peel established the Metropolitan Police as a police force covering the entire urban area. The force gained the nickname of "bobbies" or "peelers" named after Robert Peel.
19th-century London was transformed by the coming of the railways. A new network of metropolitan railways allowed for the development of suburbs in neighbouring counties from which middle-class and wealthy people could commute to the centre. While this spurred the massive outward growth of the city, the growth of greater London also exacerbated the class divide, as the wealthier classes emigrated to the suburbs, leaving the poor to inhabit the inner city areas.
The first railway to be built in London was a line from London Bridge to Greenwich, which opened in 1836. This was soon followed by the opening of great rail termini which eventually linked London to every corner of Great Britain, including Euston station (1837), Paddington station (1838), Fenchurch Street station (1841), Waterloo station (1848), King's Cross station (1850), and St Pancras station (1863). From 1863, the first lines of the London Underground were constructed.
The urbanised area continued to grow rapidly, spreading into Islington, Paddington, Belgravia, Holborn, Finsbury, Shoreditch, Southwark and Lambeth. Towards the middle of the century, London's antiquated local government system, consisting of ancient parishes and vestries, struggled to cope with the rapid growth in population. In 1855, the Metropolitan Board of Works (MBW) was created to provide London with adequate infrastructure to cope with its growth. One of its first tasks was addressing London's sanitation problems. At the time, raw sewage was pumped straight into the River Thames. This culminated in The Great Stink of 1858. Parliament finally gave consent for the MBW to construct a large system of sewers. The engineer put in charge of building the new system was Joseph Bazalgette. In what was one of the largest civil engineering projects of the 19th century, he oversaw construction of over 2100 km of tunnels and pipes under London to take away sewage and provide clean drinking water. When the London sewerage system was completed, the death toll in London dropped dramatically, and epidemics of cholera and other diseases were curtailed. Bazalgette's system is still in use today.
One of the most famous events of 19th-century London was the Great Exhibition of 1851. Held at The Crystal Palace, the fair attracted 6 million visitors from across the world and displayed Britain at the height of its Imperial dominance.
As the capital of a massive empire, London became a magnet for immigrants from the colonies and poorer parts of Europe. A large Irish population settled in the city during the Victorian period, with many of the newcomers refugees from the Great Famine (1845–1849). At one point, Catholic Irish made up about 20% of London's population; they typically lived in overcrowded slums. London also became home to a sizable Jewish community, which was notable for its entrepreneurship in the clothing trade and merchandising.
In 1888, the new County of London was established, administered by the London County Council. This was the first elected London-wide administrative body, replacing the earlier Metropolitan Board of Works, which had been made up of appointees. The County of London covered broadly what was then the full extent of the London conurbation, although the conurbation later outgrew the boundaries of the county. In 1900, the county was sub-divided into 28 metropolitan boroughs, which formed a more local tier of administration than the county council.
Many famous buildings and landmarks of London were constructed during the 19th century including:
London entered the 20th century at the height of its influence as the capital of one of the largest empires in history, but the new century was to bring many challenges.
London's population continued to grow rapidly in the early decades of the century, and public transport was greatly expanded. A large tram network was constructed by the London County Council, through the LCC Tramways; the first motorbus service began in the 1900s. Improvements to London's overground and underground rail network, including large scale electrification were progressively carried out.
During World War I, London experienced its first bombing raids carried out by German zeppelin airships; these killed around 700 people and caused great terror, but were merely a foretaste of what was to come. The city of London would experience many more terrors as a result of both World Wars. The largest explosion in London occurred during World War I: the Silvertown explosion, when a munitions factory containing 50 tons of TNT exploded, killing 73 and injuring 400.
The period between the two World Wars saw London's geographical extent growing more quickly than ever before or since. A preference for lower density suburban housing, typically semi-detached, by Londoners seeking a more "rural" lifestyle, superseded Londoners' old predilection for terraced houses. This was facilitated not only by a continuing expansion of the rail network, including trams and the Underground, but also by slowly widening car ownership. London's suburbs expanded outside the boundaries of the County of London, into the neighbouring counties of Essex, Hertfordshire, Kent, Middlesex and Surrey.
Like the rest of the country, London suffered severe unemployment during the Great Depression of the 1930s. In the East End during the 1930s, politically extreme parties of both right and left flourished. The Communist Party of Great Britain and the British Union of Fascists both gained serious support. Clashes between right and left culminated in the Battle of Cable Street in 1936. The population of London reached an all-time peak of 8.6 million in 1939.
Large numbers of Jewish immigrants fleeing from Nazi Germany settled in London during the 1930s, mostly in the East End.
Labour Party politician Herbert Morrison was a dominant figure in local government in the 1920s and 1930s. He became Mayor of Hackney and a member of the London County Council in 1922, and for a while was Minister of Transport in Ramsay MacDonald's cabinet. He unified the bus, tram and trolleybus services with the Underground through the creation of the London Passenger Transport Board (known as London Transport) in 1933. After Labour gained power in London in 1934, he led the effort to finance and build the new Waterloo Bridge, designed the Metropolitan Green Belt around the suburbs, and worked to clear slums, build schools, and reform public assistance.
During World War II, London, like many other British cities, suffered severe damage, being bombed extensively by the "Luftwaffe" as a part of The Blitz. Prior to the bombing, hundreds of thousands of London's children were evacuated to the countryside to escape the air raids. Civilians took shelter from the raids in Underground stations.
The heaviest bombing took place during The Blitz between 7 September 1940 and 10 May 1941. During this period, London was subjected to 71 separate raids receiving over 18,000 tonnes of high explosive. One raid in December 1940, which became known as the Second Great Fire of London, saw a firestorm engulf much of the City of London and destroy many historic buildings. St Paul's Cathedral, however, remained unscathed; a photograph showing the Cathedral shrouded in smoke became a famous image of the war.
Having failed to defeat Britain, Hitler turned his attention to the Eastern front and regular bombing raids ceased. They began again, but on a smaller scale with the "Little Blitz" in early 1944. Towards the end of the war, during 1944/45 London again came under heavy attack by pilotless V-1 flying bombs and V-2 rockets, which were fired from Nazi occupied Europe. These attacks only came to an end when their launch sites were captured by advancing Allied forces.
London suffered severe damage and heavy casualties, the worst hit part being the Docklands area. By the war's end, just under 30,000 Londoners had been killed by the bombing, and over 50,000 seriously injured, tens of thousands of buildings were destroyed, and hundreds of thousands of people were made homeless.
Three years after the war, the 1948 Summer Olympics were held at the original Wembley Stadium, at a time when the city had barely recovered from the war. London's rebuilding was slow to begin. However, in 1951 the Festival of Britain was held, which marked an increasing mood of optimism and forward looking.
In the immediate postwar years housing was a major issue in London, due to the large amount of housing which had been destroyed in the war. The authorities decided upon high-rise blocks of flats as the answer to housing shortages. During the 1950s and 1960s the skyline of London altered dramatically as tower blocks were erected, although these later proved unpopular. In a bid to reduce the number of people living in overcrowded housing, a policy was introduced of encouraging people to move to the new towns being built around London.
Through the 19th century and the first half of the 20th, Londoners used coal for heating their homes, which produced large amounts of smoke. In combination with climatic conditions this often caused a characteristic smog, and London became known for its typical "London fog", also known as "pea soupers". London was sometimes referred to as "The Smoke" because of this. In 1952, this culminated in the disastrous Great Smog of 1952, which lasted for five days and killed over 4,000 people. In response, the Clean Air Act 1956 was passed, mandating the creation of "smokeless zones" where the use of "smokeless" fuels was required (this was at a time when most households still used open fires); the Act was effective.
Starting in the mid-1960s, and partly as a result of the success of such UK musicians as the Beatles and The Rolling Stones, London became a centre for the worldwide youth culture, exemplified by the Swinging London subculture which made Carnaby Street a household name of youth fashion around the world. London's role as a trendsetter for youth fashion continued strongly in the 1980s during the new wave and punk eras and into the mid-1990s with the emergence of the Britpop era.
From the 1950s onwards London became home to a large number of immigrants, largely from Commonwealth countries such as Jamaica, India, Bangladesh, Pakistan, which dramatically changed the face of London, turning it into one of the most diverse cities in Europe. However, the integration of the new immigrants was not always easy. Racial tensions emerged in events such as the Brixton Riots in the early 1980s.
From the beginning of "The Troubles" in Northern Ireland in the early 1970s until the mid-1990s, London was subjected to repeated terrorist attacks by the Provisional IRA.
The outward expansion of London was slowed by the war and the introduction of the Metropolitan Green Belt. But the earlier expansion meant that the old County of London by now covered only part of the London conurbation, so in 1965 the county and the London County Council were abolished, and the much larger area of Greater London was established, with a new Greater London Council (GLC) to administer it, along with 32 new London boroughs.
Greater London's population declined steadily in the decades after World War II, from an estimated peak of 8.6 million in 1939 to around 6.8 million in the 1980s. However, it then began to increase again in the late 1980s, encouraged by strong economic performance and an increasingly positive image.
London's traditional status as a major port declined dramatically in the post-war decades as the old Docklands could not accommodate large modern container ships. The principal ports for London moved downstream to the ports of Felixstowe and Tilbury. The docklands area had become largely derelict by the 1980s, but was redeveloped into flats and offices from the mid-1980s onwards. The Thames Barrier was completed in the 1980s to protect London against tidal surges from the North Sea.
In the early 1980s political disputes between the GLC, run by Ken Livingstone, and the Conservative government of Margaret Thatcher led to the GLC's abolition in 1986, with most of its powers devolved to the London boroughs. This left London as the only large metropolis in the world without a central administration.
In 2000, London-wide government was restored, with the creation of the Greater London Authority (GLA) by Tony Blair's government, covering the same area of Greater London. The new authority had similar powers to the old GLC, but was made up of a directly elected Mayor and a London Assembly. The first election took place on 4 May 2000, with Ken Livingstone comfortably regaining his previous post. London was recognised as one of the nine regions of England. In global perspective, it was emerging as a world city widely compared to New York and Tokyo.
Around the start of the 21st century, London hosted the much derided Millennium Dome at Greenwich, to mark the new century. Other Millennium projects were more successful. One was the largest observation wheel in the world, the "Millennium Wheel", or the London Eye, which was erected as a temporary structure, but soon became a fixture, and draws four million visitors a year. The National Lottery also released a flood of funds for major enhancements to existing attractions, for example the roofing of the Great Court at the British Museum.
The London Plan, published by the Mayor of London in 2004, estimated that the population would reach 8.1 million by 2016, and continue to rise thereafter. This was reflected in a move towards denser, more urban styles of building, including a greatly increased number of tall buildings, and proposals for major enhancements to the public transport network. However, funding for projects such as Crossrail remained a struggle.
On 6 July 2005 London won the right to host the 2012 Olympics and Paralympics making it the first city to host the modern games three times. However, celebrations were cut short the following day when the city was rocked by a series of terrorist attacks. More than 50 were killed and 750 injured in three bombings on London Underground trains and a fourth on a double decker bus near King's Cross.
London was the starting point for countrywide riots which occurred in August 2011, when thousands of people rioted in several city boroughs and in towns across England. In 2011, the population grew to over 8 million people for the first time in decades, and White British residents formed less than half of the population for the first time.
Public sentiment was ambivalent in the lead-up to the 2012 Olympics, but it shifted strongly in the games' favour following a successful opening ceremony, and when the anticipated organisational and transport problems never materialised.