Beauty
Beauty is the ascription of a property or characteristic to an animal, idea, object, person or place that provides a perceptual experience of pleasure or satisfaction. Beauty is studied as part of aesthetics, culture, social psychology and sociology. An "ideal beauty" is an entity which is admired, or possesses features widely attributed to beauty in a particular culture, as a standard of perfection. Ugliness is the opposite of beauty.
The experience of "beauty" often involves an interpretation of some entity as being in balance and harmony with nature, which may lead to feelings of attraction and emotional well-being. Because this can be a subjective experience, it is often said that "beauty is in the eye of the beholder." However, because judgements of what is beautiful often converge across groups of observers, beauty has also been argued to have a partly objective basis, rather than being a wholly subjective matter of aesthetic judgement.
The classical Greek noun that best translates to the English-language words "beauty" or "beautiful" was κάλλος, "kallos", and the adjective was καλός, "kalos". However, "kalos" may also be, and often is, translated as "good" or "of fine quality" and thus has a broader meaning than mere physical or material beauty. Similarly, "kallos" was used differently from the English word beauty in that it first and foremost applied to humans and bears an erotic connotation.
The Koine Greek word for beautiful was ὡραῖος, "hōraios", an adjective etymologically coming from the word ὥρα, "hōra", meaning "hour". In Koine Greek, beauty was thus associated with "being of one's hour". Thus, a ripe fruit (of its time) was considered beautiful, whereas a young woman trying to appear older or an older woman trying to appear younger would not be considered beautiful. In Attic Greek, "hōraios" had many meanings, including "youthful" and "ripe old age".
The earliest Western theory of beauty can be found in the works of early Greek philosophers from the pre-Socratic period, such as Pythagoras. The Pythagorean school saw a strong connection between mathematics and beauty. In particular, they noted that objects proportioned according to the golden ratio seemed more attractive. Ancient Greek architecture is based on this view of symmetry and proportion.
Plato considered beauty to be the Idea (Form) above all other Ideas. Aristotle saw a relationship between the beautiful ("to kalon") and virtue, arguing that "Virtue aims at the beautiful."
Classical philosophy and sculptures of men and women produced according to the Greek philosophers' tenets of ideal human beauty were rediscovered in Renaissance Europe, leading to a re-adoption of what became known as a "classical ideal". In terms of female human beauty, a woman whose appearance conforms to these tenets is still called a "classical beauty" or said to possess a "classical beauty", whilst the foundations laid by Greek and Roman artists have also supplied the standard for male beauty and female beauty in western civilization as seen, for example, in the "Winged Victory of Samothrace". During the Gothic era, the classical aesthetical canon of beauty was rejected as sinful. Later, Renaissance and Humanist thinkers rejected this view, and considered beauty to be the product of rational order and harmonious proportions. Renaissance artists and architects (such as Giorgio Vasari in his "Lives of Artists") criticised the Gothic period as irrational and barbarian. This point of view of Gothic art lasted until Romanticism, in the 19th century.
In the Middle Ages, Catholic philosophers like Thomas Aquinas included beauty among the transcendental attributes of being. In his Summa Theologica, Aquinas described the three conditions of beauty as: integritas (wholeness), consonantia (harmony), and claritas (radiance of form).
In the Gothic architecture of the High and Late Middle Ages, light was considered the most beautiful revelation of God, and this was heralded in design. Examples are the stained glass of Gothic cathedrals, including Notre-Dame de Paris and Chartres Cathedral.
The Age of Reason saw a rise in an interest in beauty as a philosophical subject. For example, Scottish philosopher Francis Hutcheson argued that beauty is "unity in variety and variety in unity". The Romantic poets, too, became highly concerned with the nature of beauty, with John Keats arguing in "Ode on a Grecian Urn" that: "Beauty is truth, truth beauty, – that is all / Ye know on earth, and all ye need to know."
In the Romantic period, Edmund Burke postulated a difference between beauty in its classical meaning and the sublime. The concept of the sublime, as explicated by Burke and Kant, suggested viewing Gothic art and architecture, though not in accordance with the classical standard of beauty, as sublime.
The 20th century saw an increasing rejection of beauty by artists and philosophers alike, culminating in postmodernism's anti-aesthetics. This is despite beauty being a central concern of one of postmodernism's main influences, Friedrich Nietzsche, who argued that the Will to Power was the Will to Beauty.
In the aftermath of postmodernism's rejection of beauty, thinkers have returned to beauty as an important value. American analytic philosopher Guy Sircello proposed his New Theory of Beauty as an effort to reaffirm the status of beauty as an important philosophical concept. Elaine Scarry also argues that beauty is related to justice.
Beauty is also studied by psychologists and neuroscientists in the fields of experimental aesthetics and neuroesthetics, respectively. Psychological theories see beauty as a form of pleasure. Correlational findings support the view that more beautiful objects are also more pleasing. Some studies suggest that higher experienced beauty is associated with activity in the medial orbitofrontal cortex. This approach of localizing the processing of beauty in one brain region has received criticism within the field.
The characterization of a person as “beautiful”, whether on an individual basis or by community consensus, is often based on some combination of "inner beauty", which includes psychological factors such as personality, intelligence, grace, politeness, charisma, integrity, congruence and elegance, and "outer beauty" (i.e. physical attractiveness) which includes physical attributes which are valued on an aesthetic basis.
Standards of beauty have changed over time, based on changing cultural values. Historically, paintings show a wide range of different standards for beauty. However, humans who are relatively young, with smooth skin, well-proportioned bodies, and regular features, have traditionally been considered the most beautiful throughout history.
A strong indicator of physical beauty is "averageness". When images of human faces are averaged together to form a composite image, they become progressively closer to the "ideal" image and are perceived as more attractive. This was first noticed in 1883, when Francis Galton overlaid photographic composite images of the faces of vegetarians and criminals to see if there was a typical facial appearance for each. When doing this, he noticed that the composite images were more attractive compared to any of the individual images. Researchers have replicated the result under more controlled conditions and found that the computer-generated, mathematical average of a series of faces is rated more favorably than individual faces. It is argued that it is evolutionarily advantageous that sexual creatures are attracted to mates who possess predominantly common or average features, because it suggests the absence of genetic or acquired defects. There is also evidence that a preference for beautiful faces emerges early in infancy, and is probably innate, and that the rules by which attractiveness is established are similar across different genders and cultures.
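The compositing step itself is simple to express in code. The sketch below is an illustration, not Galton's photographic procedure: it assumes a hypothetical set of grayscale face images that have already been aligned to common landmark positions (the alignment step, which is the hard part, is omitted) and averages them pixel by pixel:

```python
# A minimal sketch of facial "averageness" compositing, assuming a set of
# grayscale face images already aligned to common landmark positions.
import numpy as np

def composite_face(faces: list[np.ndarray]) -> np.ndarray:
    """Average a list of aligned, same-sized grayscale face images
    (2-D uint8 arrays) into a single composite image."""
    stack = np.stack([f.astype(np.float64) for f in faces])
    return stack.mean(axis=0).astype(np.uint8)  # pixel-wise mean

# Example with synthetic stand-in "faces" (random noise, purely illustrative):
faces = [np.random.randint(0, 256, (128, 128), dtype=np.uint8) for _ in range(40)]
avg = composite_face(faces)
```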
A feature of beautiful women that has been explored by researchers is a waist–hip ratio of approximately 0.70. Physiologists have shown that women with hourglass figures are more fertile than other women due to higher levels of certain female hormones, a fact that may subconsciously condition males choosing mates. However, other commentators have suggested that this preference may not be universal. For instance, in some non-Western cultures in which women have to do work such as finding food, men tend to have preferences for higher waist-hip ratios.
Beauty standards are rooted in cultural norms crafted by societies and media over centuries. Globally, it is argued that the predominance of white women featured in movies and advertising leads to a Eurocentric concept of beauty, breeding cultures that assign inferiority to women of color. Thus, societies and cultures across the globe struggle to diminish the longstanding internalized racism. The black is beautiful cultural movement sought to dispel this notion in the 1960s.
Exposure to the thin ideal in mass media, such as fashion magazines, directly correlates with body dissatisfaction, low self-esteem, and the development of eating disorders among female viewers. Further, the widening gap between individual body sizes and societal ideals continues to breed anxiety among young girls as they grow, highlighting the dangerous nature of beauty standards in society.
The concept of beauty in men is known as 'bishōnen' in Japan. Bishōnen refers to males with distinctly feminine features, physical characteristics establishing the standard of beauty in Japan and typically exhibited in their pop culture idols. A multibillion-dollar industry of Japanese Aesthetic Salons exists for this reason. However, different nations have varying male beauty ideals; Eurocentric standards for men include tallness, leanness, and muscularity; thus, these features are idolized through American media, such as in Hollywood films and magazine covers.
The prevailing eurocentric concept of beauty has varying effects on different cultures. Primarily, adherence to this standard among African American women has bred a lack of positive reification of African beauty, and philosopher Cornel West elaborates that, "much of black self-hatred and self-contempt has to do with the refusal of many black Americans to love their own black bodies-especially their black noses, hips, lips, and hair." These insecurities can be traced back to global idealization of women with light skin, green or blue eyes, and long straight or wavy hair in magazines and media that starkly contrast with the natural features of African women.
In East Asian cultures, familial pressures and cultural norms shape beauty ideals; professor and scholar Stephanie Wong's experimental study concluded that the expectation that men in Asian culture dislike women who look "fragile" influenced the lifestyle, eating, and appearance choices made by Asian American women. In addition to the male gaze, media portrayals of Asian women as petite and the portrayal of beautiful women in American media as fair complexioned and slim-figured induce anxiety and depressive symptoms among Asian American women who don't fit either of these beauty ideals. Further, the high status associated with fairer skin can be attributed to Asian societal history; upper-class people hired workers to perform outdoor, manual labor, cultivating a visual divide over time between lighter complexioned, wealthier families and sun-tanned, darker laborers. This along with the Eurocentric beauty ideals embedded in Asian culture has made skin lightening creams, rhinoplasty, and blepharoplasty (an eyelid surgery meant to give Asians a more European, "double-eyelid" appearance) commonplace among Asian women, illuminating the insecurity that results from cultural beauty standards.
Much criticism has been directed at models of beauty which depend solely upon Western ideals of beauty as seen for example in the Barbie model franchise. Criticisms of Barbie are often centered around concerns that children consider Barbie a role model of beauty and will attempt to emulate her. One of the most common criticisms of Barbie is that she promotes an unrealistic idea of body image for a young woman, leading to a risk that girls who attempt to emulate her will become anorexic.
These criticisms have led to a constructive dialogue to enhance the presence of non-exclusive models of Western ideals in body type and beauty. Complaints also point to a lack of diversity in such franchises as the Barbie model of beauty in Western culture. Mattel responded to these criticisms. Starting in 1980, it produced Hispanic dolls, and later came models from across the globe. For example, in 2007, it introduced "Cinco de Mayo Barbie" wearing a ruffled red, white, and green dress (echoing the Mexican flag). "Hispanic" magazine reports that:
Researchers have found that good-looking students get higher grades from their teachers than students with an ordinary appearance. Some studies using mock criminal trials have shown that physically attractive "defendants" are less likely to be convicted—and if convicted are likely to receive lighter sentences—than less attractive ones (although the opposite effect was observed when the alleged crime was swindling, perhaps because jurors perceived the defendant's attractiveness as facilitating the crime). Studies among teens and young adults, such as those of psychiatrist and self-help author Eva Ritvo show that skin conditions have a profound effect on social behavior and opportunity.
How much money a person earns may also be influenced by physical beauty. One study found that people low in physical attractiveness earn 5 to 10 percent less than ordinary-looking people, who in turn earn 3 to 8 percent less than those who are considered good-looking. In the market for loans, the least attractive people are less likely to get approvals, although they are less likely to default. In the marriage market, women's looks are at a premium, but men's looks do not matter much.
Conversely, being very unattractive increases the individual's propensity for criminal activity for a number of crimes ranging from burglary to theft to selling illicit drugs.
Discrimination against others based on their appearance is known as lookism.
St. Augustine said of beauty: "Beauty is indeed a good gift of God; but that the good may not think it a great good, God dispenses it even to the wicked."
Philosopher and novelist Umberto Eco wrote "History of Beauty" (2004) and "On Ugliness" (2007). A character in his novel "The Name of the Rose" declares: "three things concur in creating beauty: first of all integrity or perfection, and for this reason we consider ugly all incomplete things; then proper proportion or consonance; and finally clarity and light", before going on to say "the sight of the beautiful implies peace".
Brownian motion
Brownian motion, or pedesis (from "leaping"), is the random motion of particles suspended in a fluid (a liquid or a gas) resulting from their collision with the fast-moving molecules in the fluid.
This pattern of motion typically consists of random fluctuations in a particle's position inside a fluid sub-domain, followed by a relocation to another sub-domain. Each relocation is followed by more fluctuations within the new closed volume. This pattern describes a fluid at thermal equilibrium, defined by a given temperature. Within such a fluid, there exists no preferential direction of flow (as in transport phenomena). More specifically, the fluid's overall linear and angular momenta remain null over time. The kinetic energies of the molecular Brownian motions, together with those of molecular rotations and vibrations, sum up to the caloric component of a fluid's internal energy (the Equipartition theorem).
This motion is named after the botanist Robert Brown, who first described the phenomenon in 1827, while looking through a microscope at pollen of the plant "Clarkia pulchella" immersed in water. In 1905, almost eighty years later, theoretical physicist Albert Einstein published a paper where he modeled the motion of the pollen as being moved by individual water molecules, making one of his first major scientific contributions. This explanation of Brownian motion served as convincing evidence that atoms and molecules exist and was further verified experimentally by Jean Perrin in 1908. Perrin was awarded the Nobel Prize in Physics in 1926 "for his work on the discontinuous structure of matter". The direction of the force of atomic bombardment is constantly changing, and at different times the particle is hit more on one side than another, leading to the seemingly random nature of the motion.
The many-body interactions that yield the Brownian pattern cannot be solved by a model accounting for every involved molecule. In consequence, only probabilistic models applied to molecular populations can be employed to describe it. Two such models of statistical mechanics, due to Einstein and Smoluchowski, are presented below. Another, purely probabilistic, class of models is the class of stochastic process models. There exist sequences of both simpler and more complicated stochastic processes which converge (in the limit) to Brownian motion (see random walk and Donsker's theorem).
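As a minimal illustration of the stochastic-process viewpoint, the sketch below (a toy construction with illustrative parameters, not any specific model from the literature) rescales a simple symmetric random walk in the manner of Donsker's theorem so that it approximates Brownian motion:

```python
# Toy illustration: a scaled symmetric random walk approximating Brownian
# motion (Donsker's theorem). All parameter choices are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def walk_to_brownian(n_steps: int, t_max: float = 1.0) -> np.ndarray:
    """Return a discrete approximation of B_t on [0, t_max], built from
    n_steps fair +/-1 coin flips scaled by sqrt(t_max / n_steps)."""
    steps = rng.choice([-1.0, 1.0], size=n_steps)
    return np.cumsum(steps) * np.sqrt(t_max / n_steps)

path = walk_to_brownian(100_000)
print(path[-1])  # approximately N(0, 1)-distributed for t_max = 1
```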
The Roman philosopher Lucretius' scientific poem "On the Nature of Things" (c. 60 BC) has a remarkable description of Brownian motion of dust particles in verses 113–140 from Book II. He uses this as a proof of the existence of atoms:
Although the mingling motion of dust particles is caused largely by air currents, the glittering, tumbling motion of small dust particles is, indeed, caused chiefly by true Brownian dynamics.
While Jan Ingenhousz described the irregular motion of coal dust particles on the surface of alcohol in 1785, the discovery of this phenomenon is often credited to the botanist Robert Brown in 1827. Brown was studying pollen grains of the plant "Clarkia pulchella" suspended in water under a microscope when he observed minute particles, ejected by the pollen grains, executing a jittery motion. By repeating the experiment with particles of inorganic matter he was able to rule out that the motion was life-related, although its origin was yet to be explained.
The first person to describe the mathematics behind Brownian motion was Thorvald N. Thiele in a paper on the method of least squares published in 1880. This was followed independently by Louis Bachelier in 1900 in his PhD thesis "The theory of speculation", in which he presented a stochastic analysis of the stock and option markets. The Brownian motion model of the stock market is often cited, but Benoit Mandelbrot rejected its applicability to stock price movements in part because these are discontinuous.
Albert Einstein (in one of his 1905 papers) and Marian Smoluchowski (1906) brought the solution of the problem to the attention of physicists, and presented it as a way to indirectly confirm the existence of atoms and molecules. Their equations describing Brownian motion were subsequently verified by the experimental work of Jean Baptiste Perrin in 1908.
There are two parts to Einstein's theory: the first part consists in the formulation of a diffusion equation for Brownian particles, in which the diffusion coefficient is related to the mean squared displacement of a Brownian particle, while the second part consists in relating the diffusion coefficient to measurable physical quantities. In this way Einstein was able to determine the size of atoms, and how many atoms there are in a mole, or the molecular weight in grams, of a gas. In accordance with Avogadro's law, this volume (the molar volume) is the same for all ideal gases, which is 22.414 liters at standard temperature and pressure. The number of atoms contained in this volume is referred to as the Avogadro number, and the determination of this number is tantamount to the knowledge of the mass of an atom, since the latter is obtained by dividing the mass of a mole of the gas by the Avogadro constant.
The first part of Einstein's argument was to determine how far a Brownian particle travels in a given time interval. Classical mechanics is unable to determine this distance because of the enormous number of bombardments a Brownian particle will undergo, roughly of the order of 10¹⁴ collisions per second. Thus Einstein was led to consider the collective motion of Brownian particles.
He regarded the increment of particle positions in time $\tau$ in a one-dimensional ("x") space (with the coordinates chosen so that the origin lies at the initial position of the particle) as a random variable ($\Delta$) with some probability density function $\varphi(\Delta)$. Further, assuming conservation of particle number, he expanded the number density (number of particles per unit volume) at time $t + \tau$ in a Taylor series,

$$\rho(x, t+\tau) = \rho(x,t) + \tau \frac{\partial \rho(x,t)}{\partial t} + \cdots = \int_{-\infty}^{\infty} \rho(x - \Delta, t)\, \varphi(\Delta)\, d\Delta$$
$$= \rho(x,t) \int_{-\infty}^{\infty} \varphi(\Delta)\, d\Delta - \frac{\partial \rho}{\partial x} \int_{-\infty}^{\infty} \Delta\, \varphi(\Delta)\, d\Delta + \frac{\partial^2 \rho}{\partial x^2} \int_{-\infty}^{\infty} \frac{\Delta^2}{2}\, \varphi(\Delta)\, d\Delta - \cdots$$
where the second equality in the first line is by definition of $\varphi$. The integral in the first term is equal to one by the definition of probability, and the second and other even terms (i.e. first and other odd moments) vanish because of space symmetry. What is left gives rise to the following relation:

$$\frac{\partial \rho}{\partial t} = \left( \int_{-\infty}^{\infty} \frac{\Delta^2}{2\tau}\, \varphi(\Delta)\, d\Delta \right) \frac{\partial^2 \rho}{\partial x^2},$$
where the coefficient multiplying the Laplacian, the second moment of probability of displacement $\Delta$, is interpreted as the mass diffusivity "D":

$$D = \int_{-\infty}^{\infty} \frac{\Delta^2}{2\tau}\, \varphi(\Delta)\, d\Delta.$$
Then the density of Brownian particles "ρ" at point "x" at time "t" satisfies the diffusion equation:

$$\frac{\partial \rho}{\partial t} = D \frac{\partial^2 \rho}{\partial x^2}.$$
Assuming that "N" particles start from the origin at the initial time "t" = 0, the diffusion equation has the solution
This expression (which is a normal distribution with mean $\mu = 0$ and variance $\sigma^2 = 2Dt$, usually called Brownian motion $B_t$) allowed Einstein to calculate the moments directly. The first moment is seen to vanish, meaning that the Brownian particle is equally likely to move to the left as it is to move to the right. The second moment is, however, non-vanishing, being given by

$$\overline{x^2} = 2 D t.$$
This equation expresses the mean squared displacement in terms of the time elapsed and the diffusivity. From this expression Einstein argued that the displacement of a Brownian particle is not proportional to the elapsed time, but rather to its square root. His argument is based on a conceptual switch from the "ensemble" of Brownian particles to the "single" Brownian particle: we can speak of the relative number of particles at a single instant just as well as of the time it takes a Brownian particle to reach a given point.
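This √t law is easy to verify numerically. The following sketch (with illustrative values of D, the time step, and ensemble size) simulates independent Brownian particles and compares the measured mean squared displacement with 2Dt:

```python
# Verify <x^2> = 2 D t for an ensemble of simulated Brownian particles.
# D and the time grid are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(42)
D, dt, n_steps, n_particles = 1.0, 1e-3, 1000, 10_000

# Each step is Gaussian with variance 2*D*dt (one-dimensional diffusion).
steps = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_particles, n_steps))
x = np.cumsum(steps, axis=1)        # particle positions over time
msd = (x ** 2).mean(axis=0)         # ensemble mean squared displacement

t = dt * np.arange(1, n_steps + 1)
print(msd[-1], 2 * D * t[-1])       # the two values should nearly agree
```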
The second part of Einstein's theory relates the diffusion constant to physically measurable quantities, such as the mean squared displacement of a particle in a given time interval. This result enables the experimental determination of Avogadro's number and therefore the size of molecules. Einstein analyzed a dynamic equilibrium being established between opposing forces. The beauty of his argument is that the final result does not depend upon which forces are involved in setting up the dynamic equilibrium.
In his original treatment, Einstein considered an osmotic pressure experiment, but the same conclusion can be reached in other ways.
Consider, for instance, particles suspended in a viscous fluid in a gravitational field. Gravity tends to make the particles settle, whereas diffusion acts to homogenize them, driving them into regions of smaller concentration. Under the action of gravity, a particle acquires a downward speed of "v" = "μmg", where "m" is the mass of the particle, "g" is the acceleration due to gravity, and "μ" is the particle's mobility in the fluid. George Stokes had shown that the mobility for a spherical particle with radius "r" is $\mu = \frac{1}{6 \pi \eta r}$, where "η" is the dynamic viscosity of the fluid. In a state of dynamic equilibrium, and under the hypothesis of isothermal fluid, the particles are distributed according to the barometric distribution

$$\rho = \rho_0\, e^{-\frac{m g h}{k_B T}},$$
where "ρ" − "ρ"0 is the difference in density of particles separated by a height difference of "h", "k"B is the Boltzmann constant (the ratio of the universal gas constant, "R", to the Avogadro constant, "N"), and "T" is the absolute temperature.
Dynamic equilibrium is established because the more that particles are pulled down by gravity, the greater the tendency for the particles to migrate to regions of lower concentration. The flux is given by Fick's law,

$$J = -D \frac{d\rho}{dh},$$
where "J" = "ρv". Introducing the formula for "ρ", we find that
In a state of dynamical equilibrium, this speed must also be equal to "v" = "μmg". Both expressions for "v" are proportional to "mg", reflecting that the derivation is independent of the type of forces considered. Similarly, one can derive an equivalent formula for identical charged particles of charge "q" in a uniform electric field of magnitude "E", where "mg" is replaced with the electrostatic force "qE". Equating these two expressions yields a formula for the diffusivity, independent of "mg" or "qE" or other such forces:

$$\frac{\overline{x^2}}{2t} = D = \mu k_B T = \frac{\mu R T}{N} = \frac{R T}{6 \pi \eta r N}.$$
Here the first equality follows from the first part of Einstein's theory, the third equality follows from the definition of Boltzmann's constant as "k"B = "R" / "N", and the fourth equality follows from Stokes's formula for the mobility. By measuring the mean squared displacement over a time interval along with the universal gas constant "R", the temperature "T", the viscosity "η", and the particle radius "r", the Avogadro constant "N" can be determined.
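To make the chain of equalities concrete, the sketch below evaluates the Stokes–Einstein diffusivity for an illustrative micron-sized sphere in water, then inverts the formula to recover Avogadro's number from a hypothetical measured mean squared displacement (both the particle parameters and the "measurement" are assumptions for illustration):

```python
# Stokes-Einstein relation D = R*T / (6*pi*eta*r*N), evaluated for an
# illustrative 1-micrometre-radius sphere in water at room temperature.
import math

R = 8.314          # universal gas constant, J/(mol K)
N_A = 6.022e23     # Avogadro constant, 1/mol
T = 293.0          # temperature, K
eta = 1.0e-3       # dynamic viscosity of water, Pa s
r = 1.0e-6         # particle radius, m

D = R * T / (6 * math.pi * eta * r * N_A)
print(f"D = {D:.3e} m^2/s")          # about 2e-13 m^2/s

# Perrin-style inversion: infer N from a (hypothetical) measured MSD.
t = 60.0                              # observation time, s
msd_measured = 2 * D * t              # stand-in for an experimental value
N_inferred = 2 * R * T * t / (6 * math.pi * eta * r * msd_measured)
print(f"N = {N_inferred:.3e} 1/mol")  # recovers ~6.0e23 by construction
```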
The type of dynamical equilibrium proposed by Einstein was not new. It had been pointed out previously by J. J. Thomson in his series of lectures at Yale University in May 1903 that the dynamic equilibrium between the velocity generated by a concentration gradient given by Fick's law and the velocity due to the variation of the partial pressure caused when ions are set in motion "gives us a method of determining Avogadro's Constant which is independent of any hypothesis as to the shape or size of molecules, or of the way in which they act upon each other".
An identical expression to Einstein's formula for the diffusion coefficient was also found by Walther Nernst in 1888, in which he expressed the diffusion coefficient as the ratio of the osmotic pressure to the ratio of the frictional force and the velocity to which it gives rise. The former was equated to the law of van 't Hoff while the latter was given by Stokes's law. He writes $k' = p_0 / k$ for the diffusion coefficient "k′", where $p_0$ is the osmotic pressure and "k" is the ratio of the frictional force to the molecular viscosity which he assumes is given by Stokes's formula for the viscosity. Introducing the ideal gas law per unit volume for the osmotic pressure, the formula becomes identical to that of Einstein's. The use of Stokes's law in Nernst's case, as well as in Einstein and Smoluchowski, is not strictly applicable since it does not apply to the case where the radius of the sphere is small in comparison with the mean free path.
At first, the predictions of Einstein's formula were seemingly refuted by a series of experiments by Svedberg in 1906 and 1907, which gave displacements of the particles as 4 to 6 times the predicted value, and by Henri in 1908 who found displacements 3 times greater than Einstein's formula predicted. But Einstein's predictions were finally confirmed in a series of experiments carried out by Chaudesaigues in 1908 and Perrin in 1909. The confirmation of Einstein's theory constituted empirical progress for the kinetic theory of heat. In essence, Einstein showed that the motion can be predicted directly from the kinetic model of thermal equilibrium. The importance of the theory lay in the fact that it confirmed the kinetic theory's account of the second law of thermodynamics as being an essentially statistical law.
Smoluchowski's theory of Brownian motion starts from the same premise as that of Einstein and derives the same probability distribution "ρ"("x", "t") for the displacement of a Brownian particle along the "x"-axis in time "t". He therefore gets the same expression for the mean squared displacement: $\overline{(\Delta x)^2} = 2Dt$. However, when he relates it to a particle of mass "m" moving at a velocity "u" which is the result of a frictional force governed by Stokes's law, he finds

$$\overline{(\Delta x)^2} = \frac{32}{81}\, \frac{m u^2}{\pi \mu a}\, t,$$
where "μ" is the viscosity coefficient, and formula_26 is the radius of the particle. Associating the kinetic energy formula_27 with the thermal energy "RT"/"N", the expression for the mean squared displacement is 64/27 times that found by Einstein. The fraction 27/64 was commented on by Arnold Sommerfeld in his necrology on Smoluchowski: "The numerical coefficient of Einstein, which differs from Smoluchowski by 27/64 can only be put in doubt."
Smoluchowski attempts to answer the question of why a Brownian particle should be displaced by bombardments of smaller particles when the probabilities for striking it in the forward and rear directions are equal.
If the probability of "m" gains and "n" − "m" losses follows a binomial distribution,
with equal "a priori" probabilities of 1/2, the mean total gain is
If "n" is large enough so that Stirling's approximation can be used in the form
then the expected total gain will be
showing that it increases as the square root of the total population.
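A quick numerical check of this √(2n/π) growth (with illustrative sample sizes) can be run as follows:

```python
# Check that the mean absolute excess of heads over tails in n fair coin
# flips grows like sqrt(2n/pi). Sample sizes are illustrative.
import numpy as np

rng = np.random.default_rng(1)
for n in (100, 10_000, 1_000_000):
    m = rng.binomial(n, 0.5, size=100_000)       # heads counts in 100,000 trials
    mean_gain = np.abs(2 * m - n).mean()         # empirical E|2m - n|
    print(n, mean_gain, np.sqrt(2 * n / np.pi))  # the two columns should agree
```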
Suppose that a Brownian particle of mass "M" is surrounded by lighter particles of mass "m" which are traveling at a speed "u". Then, reasons Smoluchowski, in any collision between a surrounding particle and the Brownian particle, the velocity transmitted to the latter will be "mu"/"M". This ratio is of the order of 10⁻⁷ cm/s. But we also have to take into consideration that in a gas there will be more than 10¹⁶ collisions in a second, and even more in a liquid, where we expect that there will be 10²⁰ collisions in one second. Some of these collisions will tend to accelerate the Brownian particle; others will tend to decelerate it. If the mean excess of one kind of collision over the other is of the order of 10⁸ to 10¹⁰ collisions in one second, then the velocity of the Brownian particle may be anywhere between 10 and 1000 cm/s. Thus, even though there are equal probabilities for forward and backward collisions, there will be a net tendency to keep the Brownian particle in motion, just as the ballot theorem predicts.
These orders of magnitude are not exact because they do not take into consideration the velocity of the Brownian particle, "U", which depends on the collisions that tend to accelerate and decelerate it. The larger "U" is, the greater will be the collisions that will retard it, so that the velocity of a Brownian particle can never increase without limit. Could such a process occur, it would be tantamount to a perpetual motion of the second type. And since equipartition of energy applies, the kinetic energy of the Brownian particle, $\tfrac{1}{2} M U^2$, will be equal, on the average, to the kinetic energy of the surrounding fluid particle, $\tfrac{1}{2} m u^2$.
In 1906 Smoluchowski published a one-dimensional model to describe a particle undergoing Brownian motion. The model assumes collisions with "M" ≫ "m" where "M" is the test particle's mass and "m" the mass of one of the individual particles composing the fluid. It is assumed that the particle collisions are confined to one dimension and that it is equally probable for the test particle to be hit from the left as from the right. It is also assumed that every collision always imparts the same magnitude of Δ"V". If "N"R is the number of collisions from the right and "N"L the number of collisions from the left, then after "N" collisions the particle's velocity will have changed by Δ"V"(2"N"R − "N"). The multiplicity is then simply given by:

$$\binom{N}{N_R} = \frac{N!}{N_R!\, (N - N_R)!}$$
and the total number of possible states is given by 2"N". Therefore, the probability of the particle being hit from the right "N"R times is:

$$P(N_R) = \frac{N!}{2^N\, N_R!\, (N - N_R)!}$$
As a result of its simplicity, Smoluchowski's 1D model can only qualitatively describe Brownian motion. For a realistic particle undergoing Brownian motion in a fluid, many of the assumptions don't apply. For example, the assumption that on average occurs an equal number of collisions from the right as from the left falls apart once the particle is in motion. Also, there would be a distribution of different possible Δ"V"s instead of always just one in a realistic situation.
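The model is also easy to simulate directly. The sketch below (step size, collision count, and trial count are all illustrative assumptions) samples the velocity distribution the model implies:

```python
# Simulate Smoluchowski's 1-D toy model: N collisions, each equally likely
# from the left or right, each changing the velocity by +/- dV.
import numpy as np

rng = np.random.default_rng(7)
dV, N, trials = 1.0, 10_000, 50_000          # illustrative values

N_R = rng.binomial(N, 0.5, size=trials)      # collisions from the right
v = dV * (2 * N_R - N)                       # final velocity after N collisions

# The velocity is distributed around 0 with standard deviation dV*sqrt(N).
print(v.mean(), v.std(), dV * np.sqrt(N))
```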
The diffusion equation yields an approximation of the time evolution of the probability density function associated with the position of the particle undergoing a Brownian movement under the physical definition. The approximation is valid on timescales long compared with the momentum relaxation time of the particle.
The time evolution of the position of the Brownian particle itself is best described using the Langevin equation, an equation which involves a random force field representing the effect of the thermal fluctuations of the solvent on the particle.
The displacement of a particle undergoing Brownian motion is obtained by solving the diffusion equation under appropriate boundary conditions and finding the rms of the solution. This shows that the displacement varies as the square root of the time (not linearly), which explains why previous experimental results concerning the velocity of Brownian particles gave nonsensical results. A linear time dependence was incorrectly assumed.
At very short time scales, however, the motion of a particle is dominated by its inertia and its displacement will be linearly dependent on time: Δ"x" = "v"Δ"t". So the instantaneous velocity of the Brownian motion can be measured as "v" = Δ"x"/Δ"t", when Δ"t" ≪ "τ", where "τ" is the momentum relaxation time. In 2010, the instantaneous velocity of a Brownian particle (a glass microsphere trapped in air with optical tweezers) was measured successfully. The velocity data verified the Maxwell–Boltzmann velocity distribution, and the equipartition theorem for a Brownian particle.
In stellar dynamics, a massive body (star, black hole, etc.) can experience Brownian motion as it responds to gravitational forces from surrounding stars. The rms velocity "V" of the massive object, of mass "M", is related to the rms velocity $v_\star$ of the background stars by

$$M V^2 \approx m v_\star^2,$$

where $m$ is the mass of the background stars. The gravitational force from the massive object causes nearby stars to move faster than they otherwise would, increasing both $v_\star$ and "V". The Brownian velocity of Sgr A*, the supermassive black hole at the center of the Milky Way galaxy, is predicted from this formula to be less than 1 km s⁻¹.
In mathematics, Brownian motion is described by the Wiener process, a continuous-time stochastic process named in honor of Norbert Wiener. It is one of the best known Lévy processes (càdlàg stochastic processes with stationary independent increments) and occurs frequently in pure and applied mathematics, economics and physics.
The Wiener process "W"t is characterized by four facts:

1. "W"0 = 0;
2. "W"t is almost surely continuous;
3. "W"t has independent increments;
4. $W_t - W_s \sim \mathcal{N}(0, t - s)$ for $0 \le s \le t$.
$\mathcal{N}(\mu, \sigma^2)$ denotes the normal distribution with expected value "μ" and variance σ². The condition that it has independent increments means that if $0 \le s_1 < t_1 \le s_2 < t_2$ then $W_{t_1} - W_{s_1}$ and $W_{t_2} - W_{s_2}$ are independent random variables.
An alternative characterisation of the Wiener process is the so-called "Lévy characterisation" that says that the Wiener process is an almost surely continuous martingale with "W"0 = 0 and quadratic variation $[W_t, W_t] = t$.
A third characterisation is that the Wiener process has a spectral representation as a sine series whose coefficients are independent $\mathcal{N}(0, 1)$ random variables. This representation can be obtained using the Karhunen–Loève theorem.
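As a sketch of this spectral construction (truncated at K terms, with all sizes illustrative), the standard Karhunen–Loève sine series for Brownian motion on [0, 1] can be sampled as follows:

```python
# Construct an approximate Brownian path on [0, 1] from its Karhunen-Loeve
# sine series with independent N(0, 1) coefficients (truncated at K terms).
import numpy as np

rng = np.random.default_rng(3)
K, t = 1000, np.linspace(0.0, 1.0, 501)      # truncation order, time grid

k = np.arange(1, K + 1)
freqs = (k - 0.5) * np.pi                    # eigenfrequencies (k - 1/2) * pi
Z = rng.standard_normal(K)                   # independent N(0, 1) coefficients

# W_t = sqrt(2) * sum_k Z_k * sin(freqs_k * t) / freqs_k
W = np.sqrt(2) * (np.sin(np.outer(t, freqs)) / freqs) @ Z
print(W[-1])                                 # W_1 is approximately N(0, 1)
```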
The Wiener process can be constructed as the scaling limit of a random walk, or other discrete-time stochastic processes with stationary independent increments. This is known as Donsker's theorem. Like the random walk, the Wiener process is recurrent in one or two dimensions (meaning that it returns almost surely to any fixed neighborhood of the origin infinitely often) whereas it is not recurrent in dimensions three and higher. Unlike the random walk, it is scale invariant.
The time evolution of the position of the Brownian particle itself can be described approximately by a Langevin equation, an equation which involves a random force field representing the effect of the thermal fluctuations of the solvent on the Brownian particle. On long timescales, the mathematical Brownian motion is well described by a Langevin equation. On small timescales, inertial effects are prevalent in the Langevin equation; the mathematical "Brownian motion", however, is free of such inertial effects. Inertial effects have to be retained in the Langevin equation, otherwise the equation becomes singular: simply removing the inertia term would not yield an exact description, but rather a singular behavior in which the particle does not move at all.
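A minimal numerical sketch of such a Langevin integrator is given below; all constants (mass, friction, thermal energy, step size) are illustrative assumptions, and the Euler–Maruyama scheme is one simple choice among many:

```python
# Euler-Maruyama integration of the underdamped Langevin equation
#   m dv = -gamma * v dt + sqrt(2 * gamma * kB * T) dW,
# with the inertia term retained. All constants are illustrative.
import numpy as np

rng = np.random.default_rng(5)
m, gamma, kBT = 1.0, 10.0, 1.0           # mass, friction, thermal energy
dt, n_steps = 1e-4, 50_000

x = v = 0.0
xs = np.empty(n_steps)
for i in range(n_steps):
    noise = rng.normal(0.0, np.sqrt(2 * gamma * kBT * dt))
    v += (-gamma * v * dt + noise) / m   # velocity update (inertia retained)
    x += v * dt                          # position update
    xs[i] = x

# On timescales >> m/gamma the trajectory diffuses with D = kBT / gamma.
print(xs[-1])
```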
Brownian motion can be modeled by a random walk. Random walks in porous media or on fractals are anomalous.
In the general case, Brownian motion is a non-Markov random process and is described by stochastic integral equations.
The French mathematician Paul Lévy proved the following theorem, which gives a necessary and sufficient condition for a continuous $\mathbb{R}^n$-valued stochastic process "X" to actually be "n"-dimensional Brownian motion. Hence, Lévy's condition can actually be used as an alternative definition of Brownian motion.
Let "X" = ("X"1, ..., "X""n") be a continuous stochastic process on a probability space (Ω, Σ, P) taking values in R"n". Then the following are equivalent:
The spectral content of a stochastic process $X_t$ can be found from the power spectral density, formally defined as

$$S(\omega) = \lim_{T \to \infty} \frac{1}{T} \mathbb{E} \left\{ \left| \int_0^T e^{i \omega t} X_t \, dt \right|^2 \right\},$$

where $\mathbb{E}$ stands for the expected value. The power spectral density of Brownian motion is found to be

$$S_{BM}(\omega) = \frac{4 D}{\omega^2},$$
where $D$ is the diffusion coefficient of $X_t$. For naturally occurring signals, the spectral content can be found from the power spectral density of a single realization, with finite available time, i.e.,

$$S^{(j)}(\omega, T) = \frac{1}{T} \left| \int_0^T e^{i \omega t} X_t^{(j)} \, dt \right|^2,$$
which, for an individual realization of a Brownian motion trajectory, is found to have expected value

$$\mu_{BM}(\omega, T) = \frac{4 D}{\omega^2} \left[ 1 - \frac{\sin(\omega T)}{\omega T} \right]$$

and a non-vanishing variance $\sigma^2_{BM}(\omega, T)$.
For sufficiently long realization times, the expected value of the power spectrum of a single trajectory converges to the formally defined power spectral density $S_{BM}(\omega)$, but its coefficient of variation $\gamma = \sigma / \mu$ tends to $\sqrt{5}/2$. This implies the distribution of $S^{(j)}(\omega, T)$ is broad even in the infinite time limit.
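The 1/ω² law is easy to reproduce numerically. The sketch below (with illustrative D, time step, and trajectory length) computes a single-realization periodogram of a simulated trajectory and compares it with 4D/ω²; as noted above, the single-trajectory estimate fluctuates strongly around the expected value:

```python
# Estimate the power spectral density of one simulated Brownian trajectory
# and compare with the 4*D/omega^2 law. D, dt, and n are illustrative.
import numpy as np

rng = np.random.default_rng(11)
D, dt, n = 1.0, 1e-3, 2 ** 18

x = np.cumsum(rng.normal(0.0, np.sqrt(2 * D * dt), size=n))  # trajectory
X = np.fft.rfft(x) * dt                # finite-time Fourier transform
T = n * dt
S = np.abs(X) ** 2 / T                 # single-realization periodogram

omega = 2 * np.pi * np.fft.rfftfreq(n, d=dt)
k = len(omega) // 100                  # pick a mid-range frequency (skip 0)
print(S[k], 4 * D / omega[k] ** 2)     # same order; S fluctuates strongly
```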
The infinitesimal generator (and hence characteristic operator) of a Brownian motion on $\mathbb{R}^n$ is easily calculated to be $\tfrac{1}{2}\Delta$, where Δ denotes the Laplace operator. In image processing and computer vision, the Laplacian operator has been used for various tasks such as blob and edge detection. This observation is useful in defining Brownian motion on an "m"-dimensional Riemannian manifold ("M", "g"): a Brownian motion on "M" is defined to be a diffusion on "M" whose characteristic operator in local coordinates "x""i", 1 ≤ "i" ≤ "m", is given by $\tfrac{1}{2}\Delta_{\mathrm{LB}}$, where $\Delta_{\mathrm{LB}}$ is the Laplace–Beltrami operator given in local coordinates by

$$\Delta_{\mathrm{LB}} f = \frac{1}{\sqrt{\det g}} \sum_{i,j=1}^{m} \frac{\partial}{\partial x^i} \left( \sqrt{\det g}\; g^{ij} \frac{\partial f}{\partial x^j} \right),$$

where $[g^{ij}] = [g_{ij}]^{-1}$ in the sense of the inverse of a square matrix.
The narrow escape problem is a ubiquitous problem in biology, biophysics and cellular biology which has the following formulation: a Brownian particle (ion, molecule, or protein) is confined to a bounded domain (a compartment or a cell) by a reflecting boundary, except for a small window through which it can escape. The narrow escape problem is that of calculating the mean escape time. This time diverges as the window shrinks, thus rendering the calculation a singular perturbation problem.
Barcelona
Barcelona is a city on the coast of northeastern Spain. It is the capital and largest city of the autonomous community of Catalonia, as well as the second most populous municipality of Spain. With a population of 1.6 million within city limits, its urban area extends to numerous neighbouring municipalities within the Province of Barcelona and is home to around 4.8 million people, making it the fifth most populous urban area in the European Union after Paris, the Ruhr area, Madrid, and Milan. It is one of the largest metropolises on the Mediterranean Sea, located on the coast between the mouths of the rivers Llobregat and Besòs, and bounded to the west by the Serra de Collserola mountain range, the tallest peak of which, Tibidabo, is 512 metres (1,680 ft) high.
Founded as a Roman city, in the Middle Ages Barcelona became the capital of the County of Barcelona. After merging with the Kingdom of Aragon, Barcelona continued to be an important city in the Crown of Aragon as an economic and administrative centre of this Crown and the capital of the Principality of Catalonia. Barcelona has a rich cultural heritage and is today an important cultural centre and a major tourist destination. Particularly renowned are the architectural works of Antoni Gaudí and Lluís Domènech i Montaner, which have been designated UNESCO World Heritage Sites. It has been home to the University of Barcelona since 1450. The headquarters of the Union for the Mediterranean are located in Barcelona. The city is known for hosting the 1992 Summer Olympics as well as world-class conferences and expositions and also many international sport tournaments.
Barcelona is a major cultural, economic, and financial centre in southwestern Europe, as well as the main biotech hub in Spain. As a leading world city, Barcelona's influence in global socio-economic affairs qualifies it for global city status.
Barcelona is a transport hub, with the Port of Barcelona being one of Europe's principal seaports and busiest European passenger port, an international airport, Barcelona–El Prat Airport, which handles over 50 million passengers per year, an extensive motorway network, and a high-speed rail line with a link to France and the rest of Europe.
The name "Barcelona" comes from the ancient Iberian "Barkeno", attested in an ancient coin inscription found on the right side of the coin in Iberian script as , in ancient Greek sources as , "Barkinṓn"; and in Latin as "Barcino", "Barcilonum" and "Barcenona".
Some older sources suggest that the city may have been named after the Carthaginian general Hamilcar Barca, who was supposed to have founded the city in the 3rd century BC, but there is no evidence that Barcelona was ever a Carthaginian settlement, or that its name in antiquity, "Barcino", had any connection with the Barcid family of Hamilcar.
During the Middle Ages, the city was variously known as "Barchinona", "Barçalona", "Barchelonaa", and "Barchenona".
Internationally, Barcelona's name is wrongly abbreviated to 'Barça'. However, this name refers only to FC Barcelona, the football club. The common abbreviated form used by locals is "Barna".
Another common abbreviation is 'BCN', which is also the IATA airport code of the Barcelona-El Prat Airport.
The city is also referred to as the "Ciutat Comtal" in Catalan, and "Ciudad Condal" in Spanish (i.e. Comital City or City of Counts), owing to its past as the seat of the Count of Barcelona.
The origin of the earliest settlement at the site of present-day Barcelona is unclear. The ruins of an early settlement have been found, including different tombs and dwellings dating to earlier than 5000 BC. The founding of Barcelona is the subject of two different legends. The first attributes the founding of the city to the mythological Hercules. The second legend attributes the foundation of the city directly to the historical Carthaginian general, Hamilcar Barca, father of Hannibal, who supposedly named the city "Barcino" after his family in the 3rd century BC, but there is no historical or linguistic evidence that this is true.
In about 15 BC, the Romans redrew the town as a "castrum" (Roman military camp) centred on the ""Mons Taber"", a little hill near the contemporary city hall (Plaça de Sant Jaume). Under the Romans, it was a colony with the surname of "Faventia", or, in full, "Colonia Faventia Julia Augusta Pia Barcino" or "Colonia Julia Augusta Faventia Paterna Barcino". Pomponius Mela mentions it among the small towns of the district, probably as it was eclipsed by its neighbour "Tarraco" (modern Tarragona), but it may be gathered from later writers that it gradually grew in wealth and consequence, favoured as it was with a beautiful situation and an excellent harbour. It enjoyed immunity from imperial burdens. The city minted its own coins; some from the era of Galba survive.
Important Roman vestiges are displayed in Plaça del Rei underground, as a part of the Barcelona City History Museum (MUHBA); the typically Roman grid plan is still visible today in the layout of the historical centre, the "Barri Gòtic" (Gothic Quarter). Some remaining fragments of the Roman walls have been incorporated into the cathedral. The cathedral, known formally by the long name of "Catedral Basílica Metropolitana de Barcelona", is also sometimes called "La Seu", which in Catalan simply means cathedral (and, among other things, see, as in an episcopal see). It is said to have been founded in 343.
The city was conquered by the Visigoths in the early 5th century, becoming for a few years the capital of all Hispania. After being conquered by the Arabs in the early 8th century, it was conquered in 801 by Charlemagne's son Louis, who made Barcelona the seat of the Carolingian "Hispanic March" ("Marca Hispanica"), a buffer zone ruled by the Count of Barcelona.
The Counts of Barcelona became increasingly independent and expanded their territory to include all of Catalonia, although on 6 July 985, Barcelona was sacked by the army of Almanzor. The sack was so traumatic that most of Barcelona's population was either killed or enslaved. In 1137, Aragon and the County of Barcelona merged in dynastic union by the marriage of Ramon Berenguer IV and Petronilla of Aragon, their titles finally borne by only one person when their son Alfonso II of Aragon ascended to the throne in 1162. His territories were later to be known as the Crown of Aragon, which conquered many overseas possessions and ruled the western Mediterranean Sea with outlying territories in Naples and Sicily and as far as Athens in the 13th century. The forging of a dynastic link between the Crowns of Aragon and Castile marked the beginning of Barcelona's decline. The Bank of Barcelona ("Taula de canvi"), probably the oldest public bank in Europe, was established by the city magistrates in 1401. It originated from necessities of the state, as did the Bank of Venice (1402) and the Bank of Genoa (1407).
The marriage of Ferdinand II of Aragon and Isabella I of Castile in 1469 united the two royal lines. Madrid became the centre of political power whilst the colonisation of the Americas reduced the financial importance (at least in relative terms) of Mediterranean trade. Barcelona was a centre of Catalan separatism, including the Catalan Revolt (1640–52) against Philip IV of Spain. The great plague of 1650–1654 halved the city's population.
In the 18th century, a fortress was built at Montjuïc that overlooked the harbour. In 1794, this fortress was used by the French astronomer Pierre François André Méchain for observations relating to a survey stretching to Dunkirk that provided the official basis of the measurement of a metre. The definitive metre bar, manufactured from platinum, was presented to the French legislative assembly on 22 June 1799. Much of Barcelona was negatively affected by the Napoleonic wars, but the start of industrialisation saw the fortunes of the province improve.
During the Spanish Civil War, the city, and Catalonia in general, were resolutely Republican. Many enterprises and public services were collectivised by the CNT and UGT unions. As the power of the Republican government and the Generalitat diminished, much of the city was under the effective control of anarchist groups. The anarchists lost control of the city to their own allies, the Communists and official government troops, after the street fighting of the Barcelona May Days. The fall of the city on 26 January 1939, caused a mass exodus of civilians who fled to the French border. The resistance of Barcelona to Franco's coup d'état was to have lasting effects after the defeat of the Republican government. The autonomous institutions of Catalonia were abolished, and the use of the Catalan language in public life was suppressed. Barcelona remained the second largest city in Spain, at the heart of a region which was relatively industrialised and prosperous, despite the devastation of the civil war. The result was a large-scale immigration from poorer regions of Spain (particularly Andalusia, Murcia and Galicia), which in turn led to rapid urbanisation.
The death of Franco in 1975 brought on a period of democratisation throughout Spain. Pressure for change was particularly strong in Barcelona, which considered (with some justification) that it had been punished during nearly forty years of Francoism for its support of the Republican government. Massive, but peaceful, demonstrations on 11 September 1977 assembled over a million people in the streets of Barcelona to call for the restoration of Catalan autonomy. It was granted less than a month later.
The development of Barcelona was promoted by two events in 1986: Spanish accession to the European Community, and particularly Barcelona's designation as host city of the 1992 Summer Olympics. The process of urban regeneration has been rapid, and accompanied by a greatly increased international reputation of the city as a tourist destination. The increased cost of housing has led to a slight decline (−16.6%) in the population over the last two decades of the 20th century as many families move out into the suburbs. This decline has been reversed since 2001, as a new wave of immigration (particularly from Latin America and from Morocco) has gathered pace.
In 1992, Barcelona hosted the Summer Olympics. The after-effects of this are credited with driving major changes in what had, up until then, been a largely industrial city. As part of the preparation for the games, industrial buildings along the sea-front were demolished and two miles of beach were created. New construction increased the road capacity of the city by 17%, the sewage handling capacity by 27% and the amount of new green areas and beaches by 78%. Between 1990 and 2004, the number of hotel rooms in the city doubled. Perhaps more importantly, the outside perception of the city was changed, making Barcelona, by 2012, the 12th most popular city destination in the world and the 5th amongst European cities.
In 1987, an ETA car bombing at Hipercor killed 21 people. On 17 August 2017, a van was driven into pedestrians on La Rambla in the city, killing 14 and injuring at least 100, one of whom later died. Other attacks took place elsewhere in Catalonia. The Prime Minister of Spain, Mariano Rajoy, called the attack in Barcelona a jihadist attack. Amaq News Agency attributed indirect responsibility for the attack to the Islamic State of Iraq and the Levant (ISIL).
Barcelona is located on the northeast coast of the Iberian Peninsula, facing the Mediterranean Sea, on a plain approximately 5 km (3 mi) wide limited by the mountain range of Collserola, the Llobregat river to the southwest and the Besòs river to the north. This plain covers an area of 170 km2 (66 sq mi), of which 101 km2 (39 sq mi) are occupied by the city itself. It is 120 km (75 mi) south of the Pyrenees and the Catalan border with France.
Tibidabo, 512 m (1,680 ft) high, offers striking views over the city and is topped by the Torre de Collserola, a telecommunications tower that is visible from most of the city. Barcelona is peppered with small hills, most of them urbanised, that gave their name to the neighbourhoods built upon them, such as Carmel, Putxet and Rovira. The escarpment of Montjuïc, situated to the southeast, overlooks the harbour and is topped by Montjuïc Castle, a fortress built in the 17th–18th centuries to control the city as a replacement for the Ciutadella. Today, the fortress is a museum and Montjuïc is home to several sporting and cultural venues, as well as Barcelona's biggest park and gardens.
The city borders on the municipalities of Santa Coloma de Gramenet and Sant Adrià de Besòs to the north; the Mediterranean Sea to the east; El Prat de Llobregat and L'Hospitalet de Llobregat to the south; and Sant Feliu de Llobregat, Sant Just Desvern, Esplugues de Llobregat, Sant Cugat del Vallès, and Montcada i Reixac to the west. The municipality includes two small sparsely-inhabited exclaves to the north-west.
According to the Köppen climate classification, Barcelona has a maritime Mediterranean climate ("Csa"), with mild winters and warm to hot summers, while the rainiest seasons are autumn and spring. The rainfall pattern is characterised by a short (3 months) dry season in summer, as well as less winter rainfall than in a typical Mediterranean climate. This subtype, labelled as "Portuguese" by the French geographer George Viers after the climate classification of Emmanuel de Martonne and found in the NW Mediterranean area (e.g. Marseille), can be seen as transitional to the humid subtropical climate ("Cfa") found in inland areas such as the Po Valley (e.g. Milan), whose rainfall is greater in summer, a feature of continental climates.
Its average annual temperature is about 21 °C (70 °F) during the day and about 15 °C (59 °F) at night, and the average annual temperature of the sea is about 20 °C (68 °F). January is the coldest month and August the warmest. Generally, the summer or "holiday" season lasts about six months, from May to October; two months – April and November – are transitional. December, January and February are the coldest months. Large fluctuations in temperature are rare, particularly in the summer months. Because of the proximity to the warm sea plus the urban heat island, frosts are very rare in the city of Barcelona. Snow is also very infrequent.
Barcelona averages 78 rainy days per year (≥ 1 mm), and annual average relative humidity is 72%, ranging from 69% in July to 75% in October. Rainfall totals are highest in late summer and autumn (September–November) and lowest in early and mid-summer (June–August), with a secondary winter minimum (February–March). Sunshine duration is 2,524 hours per year, from 138 (average 4.5 hours of sunshine a day) in December to 310 (average 10 hours of sunshine a day) in July.
According to Barcelona's City Council, Barcelona's population was 1,608,746 people, on a land area of about 101 km2 (39 sq mi). It is the main component of an administrative area of Greater Barcelona, with a population of 3,218,071 in an area of 636 km2 (246 sq mi) (density 5,060 inhabitants/km2). The population of the urban area was 4,840,000. It is the central nucleus of the Barcelona metropolitan area, which has a population of 5,474,482.
Spanish is the most spoken language in Barcelona (according to the linguistic census held by the Government of Catalonia in 2013) and it is understood almost universally. Catalan is also very commonly spoken in the city: it is understood by 95% of the population, while 72.3% can speak it, 79% can read it, and 53% can write it. Knowledge of Catalan has increased significantly in recent decades thanks to a language immersion educational system.
In 1900, Barcelona had a population of 533,000 people, which grew steadily but slowly until 1950, when it started absorbing a high number of people from other less-industrialised parts of Spain. Barcelona's population peaked in 1979 with 1,906,998 people, and fell throughout the 1980s and 1990s as more people sought a higher quality of life in outlying cities in the Barcelona Metropolitan Area. After bottoming out in 2000 with 1,496,266 people, the city's population began to rise again as younger people started to return, causing a great increase in housing prices.
"Note: This text is entirely based on the municipal statistical database provided by the city council."
Barcelona is one of the most densely populated cities in Europe. For the year 2008 the city council calculated the population to be 1,621,090, living in the 102.2 km2 municipality, giving the city an average population density of 15,926 inhabitants per square kilometre, with Eixample being the most populated district.
In the case of Barcelona, though, the land distribution is extremely uneven. Half of the municipality, or 50.2 km2, all of it located on the municipal edge, is made up of the ten least densely populated neighbourhoods containing less than 10% of the city's population, the uninhabited Zona Franca industrial area and Montjuïc forest park. This leaves the remaining 90%, or slightly below 1.5 million inhabitants, living on the remaining 52 km2 at an average density close to 28,500 inhabitants per square kilometre.
Of the 73 neighbourhoods in the city, 45 had a population density above 20,000 inhabitants per square kilometre, with a combined population of 1,313,424 inhabitants living on 38.6 km2 at an average density of 33,987 inhabitants per square kilometre. The 30 most densely populated neighbourhoods accounted for 57.5% of the city population while occupying only 22.7% of the municipality; in other words, 936,406 people living at an average density of 40,322 inhabitants per square kilometre. The city's highest density is found in and around the neighbourhood of la Sagrada Família, where four of the city's most densely populated neighbourhoods are located side by side, all with a population density above 50,000 inhabitants per square kilometre.
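The density figures above are simple population-to-area ratios. The following minimal sketch (illustrative only, not from any official source) reproduces the arithmetic from the cited figures; the small deviations from the quoted densities are attributable to rounding of the underlying areas.

```python
# Illustrative sketch: population density as population divided by land area.
def density(population: int, area_km2: float) -> float:
    """Return inhabitants per square kilometre."""
    return population / area_km2

# Whole municipality (2008 city-council figures).
print(round(density(1_621_090, 102.2)))  # ~15,862 inhabitants/km2

# The 45 neighbourhoods above 20,000 inhabitants/km2.
print(round(density(1_313_424, 38.6)))   # ~34,027 inhabitants/km2

# The 30 most densely populated neighbourhoods (22.7% of 102.2 km2).
print(round(density(936_406, 0.227 * 102.2)))  # ~40,363 inhabitants/km2
```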
In 1900, almost a third (28.9 percent) of the population were children aged under 14; in 2017 this age group constituted only 12.7 percent. In 2017, those aged between 15 and 24 made up 9 percent, and those aged between 25 and 44 made up 30.6 percent; overall, those aged between 25 and 64 formed 56.9 percent of all Barcelonans. While in 1900 those aged 65 and older were just 6.5 percent, by 2017 they had reached 21.5 percent.
In 2016 about 59% of the inhabitants of the city had been born in Catalonia and 18.5% in the rest of Spain. In addition, 22.5% of the population had been born outside of Spain, a proportion that has more than doubled since 2001 and more than quintupled since 1996, when the figures were 8.6% and 3.9% respectively.
The most important region of origin of migrants is Europe, with many coming from Italy (26,676) or France (13,506). Many migrants also come from Latin American nations such as Bolivia, Ecuador and Colombia. Since the 1990s, as with other migrant groups, many Latin Americans have settled in northern parts of the city.
There is a relatively large Pakistani community in Barcelona, numbering up to twenty thousand nationals and consisting of significantly more men than women. Many of the Pakistanis live in Ciutat Vella. The first Pakistani migrants arrived in the 1970s, with numbers increasing in the 1990s.
Other significant migrant groups come from Asia, such as from China and the Philippines. There is a Japanese community clustered in Bonanova, Les Tres Torres, Pedralbes and other northern neighbourhoods, and a Japanese international school serves that community.
Most of the inhabitants state they are Roman Catholic; the city has 208 churches. In a 2011 survey conducted by InfoCatólica, 49.5% of Barcelona residents of all ages identified themselves as Catholic; this was the first time that more than half of respondents did not identify themselves as Catholic Christians. The numbers reflect a broader trend in Spain whereby the number of self-identified Catholics has declined. In 2019, a survey by the Centro de Investigaciones Sociológicas showed that 53.2% of residents in Barcelona identified themselves as Catholic (9.9% practising Catholics, 43.3% non-practising Catholics).
The province has the largest Muslim community in Spain: in 2014, 322,698 of the 5.5 million people in the province of Barcelona identified themselves as Muslim, or 5.6% of the total population. A considerable number of Muslims live in the city of Barcelona due to immigration, and the city has 169 Muslim places of worship, mostly attended by Moroccans in Spain.
The city also has the largest Jewish community in Spain, with an estimated 3,500 Jews living in the city. There are also a number of other groups, including Evangelicals (71 locations, mostly attended by Roma), Jehovah's Witnesses (21 Kingdom Halls), Buddhists (13 locations), and Eastern Orthodox Christians.
The Barcelona metropolitan area is home to over 66% of the people of Catalonia, one of the richest regions in Europe and the fourth richest region per capita in Spain, with a GDP per capita amounting to €28,400 (16% more than the EU average). The greater Barcelona metropolitan area had a GDP amounting to $177 billion (equivalent to $34,821 per capita, 44% more than the EU average), making it the fourth most economically powerful city by gross GDP in the European Union, and 35th in the world, in 2009. Barcelona city had a very high GDP of €80,894 per head in 2004, according to Eurostat. Furthermore, Barcelona was Europe's fourth best business city and fastest-improving European city, with growth improving by 17% per year.
Barcelona was the 24th most "livable city" in the world in 2015 according to lifestyle magazine "Monocle." Similarly, according to Innovation Analysts 2thinknow, Barcelona occupies 13th place in the world on "Innovation Cities™ Global Index".
Barcelona has a long-standing mercantile tradition. Less well known is that the city industrialised early, taking off in 1833, when Catalonia's already sophisticated textile industry began to use steam power. It became the first and most important industrial city in the Mediterranean basin. Since then, manufacturing has played a large role in its history.
Borsa de Barcelona (Barcelona Stock Exchange) is the main stock exchange in the northeastern part of the Iberian Peninsula.
Barcelona was recognised as the Southern European City of the Future for 2014/15, based on its economic potential, by "FDi Magazine" in its biennial rankings.
Drawing upon its tradition of creative art and craftsmanship, Barcelona is known for its award-winning industrial design. It also has several congress halls, notably Fira de Barcelona, the second-largest trade fair and exhibition centre in Europe, which hosts a quickly growing number of national and international events each year (at present above 50). The total exhibition floor space of Fira de Barcelona's venues is , not counting the Gran Via centre on the Plaza de Europa. However, the Eurozone crisis and deep cuts in business travel affected the Council's positioning of the city as a convention centre.
An important business centre, the World Trade Center Barcelona, is located in Barcelona's Port Vell harbour.
The city is known for hosting world-class conferences and expositions, including the 1888 "Exposición Universal de Barcelona", the 1929 Barcelona International Exposition (Expo 1929), the 2004 Universal Forum of Cultures and the 2004 World Urban Forum.
Barcelona was the 20th-most-visited city in the world by international visitors and the fifth most visited city in Europe after London, Paris, Istanbul and Rome, with 5.5 million international visitors in 2011. By 2015, both Prague and Milan had more international visitors. With its Rambles, Barcelona is ranked the most popular city to visit in Spain.
Barcelona is an internationally renowned tourist destination, with numerous recreational areas, one of the best beaches in the world, a mild and warm climate, historical monuments (including eight UNESCO World Heritage Sites), 519 hotels (35 of them five-star) and a developed tourist infrastructure.
Due to its large influx of tourists each year, Barcelona, like many other tourism capitals, has to deal with pickpockets, with wallets and passports being commonly stolen items. For this reason, most travel guides recommend that visitors take precautions to ensure their possessions' safety, especially inside the metro premises. Despite its moderate pickpocket rate, Barcelona is considered one of the safest cities in terms of health security and personal safety, mainly because of a sophisticated policing strategy that has dropped crime by 32% in just over three years and has led it to be considered the 15th safest city in the world by Business Insider.
While tourism produces economic benefits, the city is "overrun" ... by "hordes of tourists", according to one report. In early 2017, over 150,000 protesters warned that tourism is destabilising the city. Slogans included "Tourists go home", "Barcelona is not for sale" and "We will not be driven out". By then, the number of visitors had increased from 1.7 million in 1990 to 32 million in a city with a population of 1.62 million, increasing the cost of rental housing for residents and overcrowding public places. While tourists spent an estimated €30 billion in 2017, they are viewed by some as a threat to Barcelona's identity.
A May 2017 article in England's The Telegraph newspaper included Barcelona among the "Eight Places That Hate Tourists the Most" and included a comment from Mayor Ada Colau: "We don't want the city to become a cheap souvenir shop [like Venice]". To mitigate the problem, the city has stopped issuing licences for new hotels and holiday apartments; it also fined Airbnb €30,000. The mayor has suggested an additional tourist tax and setting a limit on the number of visitors. One industry insider, Justin Francis, founder of the Responsible Travel agency, stated that steps must be taken to limit the number of visitors that are causing an "overtourism crisis" in several major European cities. "Ultimately, residents must be prioritised over tourists for housing, infrastructure and access to services because they have a long-term stake in the city's success," he said. "Managing tourism more responsibly can help," Francis later told a journalist, "but some destinations may just have too many tourists, and Barcelona may be a case of that".
Industry generates 21% of the total gross domestic product (GDP) of the region, with the energy, chemical and metallurgy industries accounting for 47% of industrial production. The Barcelona metropolitan area had 67% of the total number of industrial establishments in Catalonia as of 1997.
Barcelona has long been an important European automobile manufacturing centre. Formerly there were automobile factories of AFA, Abadal, Actividades Industriales, Alvarez, America, Artés de Arcos, Balandrás, Baradat-Esteve, Biscúter, J. Castro, Clúa, David, Delfín, Díaz y Grilló, Ebro trucks, Elizalde, Automóviles España, Eucort, Fenix, Fábrica Hispano, Auto Academia Garriga, Fábrica Española de Automóviles Hebe, Hispano-Suiza, Huracán Motors, Talleres Hereter, Junior SL, Kapi, La Cuadra, M.A., Automóviles Matas, Motores y Motos, Nacional Custals, National Pescara, Nacional RG, Nacional Rubi, Nacional Sitjes, Automóviles Nike, Orix, Otro Ford, Partia, Pegaso, PTV, Ricart, Ricart-España, Industrias Salvador, Siata Española, Stevenson, Romagosa y Compañía, Garaje Storm, Trimak, Automóviles Victoria and Manufacturas Mecánicas Aleu.
Today, the headquarters and a large factory of SEAT (the largest Spanish automobile manufacturer) are in one of its suburbs. There is also a Nissan factory in the logistics and industrial area of the city. The factory of Derbi, a large manufacturer of motorcycles, scooters and mopeds, also lies near the city.
As in other modern cities, the manufacturing sector has long since been overtaken by the services sector, though it remains very important. The region's leading industries are textiles, chemicals, pharmaceuticals, motor vehicles, electronics, printing, logistics, publishing, telecommunications and culture (notably the Mobile World Congress), and information technology services.
The traditional importance of textiles is reflected in Barcelona's drive to become a major fashion centre. There have been many attempts to launch Barcelona as a fashion capital, notably "Gaudi Home".
Beginning in the summer of 2000, the city hosted the Bread & Butter urban fashion fair, until its organisers announced in 2009 that it would be returning to Berlin. This was a hard blow for the city, as the fair had brought €100 million to the city in just three days.
Since 2009, "The Brandery", an urban fashion show, has been held in Barcelona twice a year until 2012. According to the Global Language Monitor's annual ranking of the world's top fifty fashion capitals Barcelona was named as the seventh most important fashion capital of the world right after Milano and before Berlin in 2015.
As the capital of the autonomous community of Catalonia, Barcelona is the seat of the Catalan government, known as the "Generalitat de Catalunya"; of particular note are the executive branch, the parliament, and the High Court of Justice of Catalonia. The city is also the capital of the Province of Barcelona and the Barcelonès comarca (district).
Barcelona is governed by a city council formed by 41 city councillors, elected for a four-year term by universal suffrage. As one of the two biggest cities in Spain, Barcelona is subject to a special law articulated through the "Carta Municipal" (Municipal Law). A first version of this law was passed in 1960 and amended later, but the current version was approved in March 2006. According to this law, Barcelona's city council is organised at two levels: a political one, with elected city councillors, and an executive one, which administers the programmes and executes the decisions taken at the political level. This law also gives the local government a special relationship with the central government, and it gives the mayor wider prerogatives by means of municipal executive commissions. It expands the powers of the city council in areas like telecommunications, city traffic, road safety and public safety. It also gives the city's treasury a special economic regime, and it gives the council a veto in matters that are decided by the central government but require a favourable report from the council.
The "Comissió de Govern" (Government Commission) is the executive branch, formed by 24 councillors, led by the Mayor, with 5 lieutenant-mayors and 17 city councillors, each in charge of an area of government, and 5 non-elected councillors. The plenary, formed by the 41 city councillors, has advisory, planning, regulatory, and fiscal executive functions. The six "Commissions del Consell Municipal" (City council commissions) have executive and controlling functions in the field of their jurisdiction. They are composed by a number of councillors proportional to the number of councillors each political party has in the plenary. The city council has jurisdiction in the fields of city planning, transportation, municipal taxes, public highways security through the "Guàrdia Urbana" (the municipal police), city maintenance, gardens, parks and environment, facilities (like schools, nurseries, sports centres, libraries, and so on), culture, sports, youth and social welfare. Some of these competencies are not exclusive, but shared with the Generalitat de Catalunya or the central Spanish government. In some fields with shared responsibility (such as public health, education or social services), there is a shared Agency or Consortium between the city and the Generalitat to plan and manage services.
The executive branch is led by a Chief Municipal Executive Officer who answers to the Mayor. It is made up of departments which are legally part of the city council, and of separate legal entities of two types: autonomous public departments and public enterprises.
The seat of the city council is on the Plaça de Sant Jaume, opposite the seat of the Generalitat de Catalunya. Since the return of democracy in Spain, Barcelona had been governed by the PSC, first with an absolute majority and later in coalition with ERC and ICV. After the May 2007 election, ERC did not renew the coalition agreement and the PSC governed in a minority coalition with ICV as the junior partner.
After 32 years, on 22 May 2011, CiU won a plurality of seats at the municipal election, taking 15 seats to the PSC's 11. The PP held 8 seats, ICV 5 and ERC 2.
Since 1987, the city has been divided into 10 administrative districts ("districtes" in Catalan, "distritos" in Spanish).
The districts are based mostly on historical divisions, and several are former towns annexed by the city of Barcelona in the 19th and 20th centuries that still maintain their own distinct character. Each district has its own council led by a city councillor. The composition of each district council depends on the number of votes each political party received in that district, so a district can be led by a councillor from a different party than the executive council.
Barcelona has a well-developed higher education system of public universities. Most prominent among these is the University of Barcelona (established in 1450), a world-renowned research and teaching institution with campuses around the city. Barcelona is also home to the Polytechnic University of Catalonia and the newer Pompeu Fabra University. In the private sector, the EADA Business School, founded in 1957, was the first Barcelona institution to run management training programmes for the business community. Other private institutions include the IESE Business School and the largest private educational institution, Ramon Llull University, which encompasses schools and institutes such as the ESADE Business School. The Autonomous University of Barcelona, another public university, is located in Bellaterra, a town in the metropolitan area. Toulouse Business School and the Open University of Catalonia (a private Internet-centred open university) are also based in Barcelona.
The city has a network of public schools, from nurseries to high schools, under the responsibility of a consortium led by the city council (though the curriculum is the responsibility of the Generalitat de Catalunya). There are also many private schools, some of them Roman Catholic. Most such schools receive a public subsidy on a per-student basis, are subject to inspection by the public authorities, and are required to follow the same curricular guidelines as public schools, though they charge tuition. Known as "escoles concertades", they are distinct from schools whose funding is entirely private ("escoles privades").
The language of instruction at public schools and "escoles concertades" is Catalan, as stipulated by the 2009 Catalan Education Act. Spanish may be used as a language of instruction by teachers of Spanish literature or language, and foreign languages by teachers of those languages. An experimental partial immersion programme adopted by some schools allows for the teaching of a foreign language (English, generally) across the curriculum, though this is limited to a maximum of 30% of the school day. No public school or "escola concertada" in Barcelona may offer 50% or full immersion programmes in a foreign language, nor does any public school or "escola concertada" offer International Baccalaureate programmes.
Barcelona's cultural roots go back 2000 years. Since the arrival of democracy, the Catalan language (very much repressed during the dictatorship of Franco) has been promoted, both by recovering works from the past and by stimulating the creation of new works. Barcelona is designated as a world-class city by the Globalization and World Cities Study Group and Network. It has also been part of the UNESCO Creative Cities Network as a City of Literature since 2015.
Barcelona has many venues for live music and theatre, including the world-renowned Gran Teatre del Liceu opera house, the Teatre Nacional de Catalunya, the Teatre Lliure and the Palau de la Música Catalana concert hall. Barcelona is also home to the Barcelona Symphony and Catalonia National Orchestra (Orquestra Simfònica de Barcelona i Nacional de Catalunya, usually known as OBC), the largest symphonic orchestra in Catalonia. In 1999, the OBC inaugurated its new venue, the brand-new Auditorium (L'Auditori). It performs around 75 concerts per season, and its current director is Eiji Oue. The city is also home to the Barcelona Guitar Orchestra, directed by Sergi Vicente.
The major thoroughfare of La Rambla is home to mime artists and street performers.
Yearly, two major pop music festivals take place in the city, the Sónar Festival and the Primavera Sound Festival. The city also has a thriving alternative music scene, with groups such as The Pinker Tones receiving international attention.
"El Periódico de Catalunya", "La Vanguardia" and "Ara" are Barcelona's three major daily newspapers (the first two with Catalan and Spanish editions, "Ara" only in Catalan) while "Sport" and "El Mundo Deportivo" (both in Spanish) are the city's two major sports daily newspapers, published by the same companies. The city is also served by a number of smaller publications such as "Ara" and "El Punt Avui" (in Catalan), by nationwide newspapers with special Barcelona editions like "El Pais" (in Spanish, with an online version in Catalan) and "El Mundo" (in Spanish), and by several free newspapers like "20 minutos" and "Què" (all bilingual).
Barcelona's main online newspaper, "VilaWeb" (with Catalan and English editions), is also the oldest in Europe.
Major FM stations include Catalunya Ràdio, RAC 1, RAC 105 and Cadena SER. Barcelona also has a local TV station, BTV, owned by the city council. The headquarters of Televisió de Catalunya, Catalonia's public network, are located in Sant Joan Despí, in Barcelona's metropolitan area.
Barcelona has a long sporting tradition and hosted the highly successful 1992 Summer Olympics as well as several matches during the 1982 FIFA World Cup (at the two stadiums). It has hosted about 30 sports events of international significance.
FC Barcelona is a sports club best known worldwide for its football team, one of the largest and the second richest in the world. It has won 74 national trophies (finishing 46 times as runners-up) and 17 continental prizes (finishing as runners-up 11 times), including five UEFA Champions League trophies from eight finals and three FIFA Club World Cup wins from four finals. It is the only male football team in the world to have won six trophies in a calendar year (in 2009). FC Barcelona also has professional teams in other sports, such as FC Barcelona Regal (basketball), FC Barcelona Handbol (handball), FC Barcelona Hoquei (roller hockey), FC Barcelona Ice Hockey (ice hockey), FC Barcelona Futsal (futsal) and FC Barcelona Rugby (rugby union), all at one point winners of the highest national and/or European competitions. The club's museum is the second most visited in Catalonia. The matches against cross-town rivals RCD Espanyol are of particular interest, but there are other Barcelonan football clubs in lower categories, such as CE Europa and UE Sant Andreu. FC Barcelona's basketball team has a noted rivalry in the Liga ACB with nearby Joventut Badalona.
Barcelona has three UEFA elite stadiums: FC Barcelona's Camp Nou, the largest stadium in Europe, with a capacity of 99,354; the publicly owned Estadi Olímpic Lluís Companys, with a capacity of 55,926, used for the 1992 Olympics; and Estadi Cornellà-El Prat, with a capacity of 40,500. Furthermore, the city has several smaller stadiums, such as the Mini Estadi (also owned by FC Barcelona) with a capacity of 15,000, Camp Municipal Narcís Sala with a capacity of 6,563 and Nou Sardenya with a capacity of 7,000. The city has a further three multifunctional venues for sports and concerts: the Palau Sant Jordi, with a capacity of 12,000 to 24,000 (depending on use), the Palau Blaugrana, with a capacity of 7,500, and the Palau dels Esports de Barcelona, with a capacity of 3,500.
Barcelona was the host city for the 2013 World Aquatics Championships, which were held at the Palau Sant Jordi.
Several road running competitions are organised year-round in Barcelona: the Barcelona Marathon every March (with over 10,000 participants in 2010), the Cursa de Bombers in April, the Cursa de El Corte Inglés in May (with about 60,000 participants each year), the Cursa de la Mercè, the Cursa Jean Bouin, the Milla Sagrada Família and the San Silvestre. There is also the Ultratrail Collserola, which passes through the Collserola forest. The Open Seat Godó, a 50-year-old ATP World Tour 500 Series tennis tournament, is held annually in the facilities of the Real Club de Tenis Barcelona. Each Christmas, a swimming race across the port is organised. Near Barcelona, in Montmeló, the 107,000-capacity Circuit de Barcelona-Catalunya racetrack hosts the Formula One Spanish Grand Prix, the Catalan motorcycle Grand Prix, the Spanish GT Championship and races in the GP2 Series. Skateboarding and cycling are also very popular in Barcelona; in and around the city there are dozens of kilometres of bicycle paths.
Barcelona is also home to numerous social centres and illegal squats that effectively form a shadow society mainly made up of the unemployed, immigrants, dropouts, anarchists, anti-authoritarians and autonomists. Peter Gelderloos estimates that there are around 200 squatted buildings and 40 social centres across the city, with thousands of inhabitants, making it one of the largest squatter movements in the world. He notes that they pirate electricity, internet and water, allowing them to live on less than one euro a day. He argues that these squats embrace an anarcho-communist and anti-work philosophy, often freely fixing up new houses, cleaning, patching roofs, and installing windows, toilets, showers, lights and kitchens. In the wake of austerity, the squats have provided a number of social services to the surrounding residents, including bicycle repair workshops, carpentry workshops, self-defence classes, free libraries, community gardens, free meals, computer labs, language classes, theatre groups, free medical care and legal support services. The squats help elderly residents avoid eviction and organise various protests throughout Barcelona. Notable squats include Can Vies and Can Masdeu. Police have repeatedly tried to shut down the squatter movement with waves of evictions and raids, but the movement is still going strong.
Barcelona is served by Barcelona-El Prat Airport, about from the centre of Barcelona. It is the second-largest airport in Spain and the largest on the Mediterranean coast; it handled more than 50.17 million passengers in 2018, showing an annual upward trend. It is a main hub for Vueling Airlines and Ryanair, and also a focus for Iberia and Air Europa. The airport mainly serves domestic and European destinations, although some airlines offer destinations in Latin America, Asia and the United States. The airport is connected to the city by highway, metro (Airport T1 and Airport T2 stations), commuter train (Barcelona Airport railway station) and scheduled bus service. A new terminal (T1) was built and entered service on 17 June 2009.
Some low-cost airlines also use Girona-Costa Brava Airport, about to the north of the city, Reus Airport, to the south, or Lleida-Alguaire Airport, about to the west. Sabadell Airport is a smaller airport in the nearby town of Sabadell, devoted to pilot training, aerotaxi and private flights.
The Port of Barcelona has a 2,000-year-old history and great contemporary commercial importance. It is Europe's ninth-largest container port, with a trade volume of 1.72 million TEUs in 2013. The port is managed by the Port Authority of Barcelona and is divided into three zones: Port Vell (the old port), the commercial port and the logistics port (Barcelona Free Port). The port is undergoing an enlargement that will double its size, thanks to the diversion of the mouth of the Llobregat river to the south.
Barcelona's harbour is the leading European cruise port and a most important Mediterranean turnaround base. In 2013, 3.6 million cruise passengers used the services of the Port of Barcelona.
The Port Vell area also houses the Maremagnum (a commercial mall), a multiplex cinema, the IMAX Port Vell and one of Europe's largest aquariums, the Aquarium Barcelona, home to 8,000 fish and 11 sharks in 22 basins filled with 4 million litres of sea water. The Maremagnum, being situated within the confines of the port, is the only commercial mall in the city that can open on Sundays and public holidays.
Barcelona is a major hub for RENFE, the Spanish state railway network. The city's main inter-city rail station is Barcelona Sants railway station, whilst the Estació de França terminus serves a secondary role, handling suburban, regional and medium-distance services. Freight services operate to local industries and to the Port of Barcelona.
RENFE's AVE high-speed rail system, which is designed for speeds of , was extended from Madrid to Barcelona in 2008 in the form of the Madrid–Barcelona high-speed rail line. A shared RENFE-SNCF high-speed rail service connecting Barcelona and France (Paris, Marseilles and Toulouse, through the Perpignan–Barcelona high-speed rail line) was launched in 2013. Both of these lines serve Barcelona Sants terminal station.
Barcelona is served by an extensive local public transport network that includes a metro system, a bus network, a regional railway system, trams, funiculars, rack railways, a gondola lift and aerial cable cars. These networks and lines are run by a number of different operators, but they are integrated into a coordinated fare system administered by the Autoritat del Transport Metropolità (ATM). The system is divided into fare zones (1 to 6) and various integrated travel cards are available.
The Barcelona Metro network comprises twelve lines, identified by an "L" followed by the line number as well as by individual colours. The Metro largely runs underground; eight Metro lines are operated on dedicated track by the Transports Metropolitans de Barcelona (TMB), whilst four lines are operated by the Ferrocarrils de la Generalitat de Catalunya (FGC) and some of them share tracks with RENFE commuter lines.
In addition to the city Metro, several regional rail lines operated by RENFE's Rodalies de Catalunya run across the city, providing connections to outlying towns in the surrounding region.
The city's two modern tram systems, Trambaix and Trambesòs, are operated by TRAMMET. A heritage tram line, the Tramvia Blau, also operates between the metro Line 7 and the Funicular del Tibidabo.
Barcelona's metro and rail system is supplemented by several aerial cable cars, funiculars and rack railways that provide connections to mountain-top stations. FGC operates the Funicular de Tibidabo up the hill of Tibidabo and the Funicular de Vallvidrera, while TMB runs the Funicular de Montjuïc up Montjuïc. The city has two aerial cable cars: the Montjuïc Cable Car, which serves Montjuïc castle, and the Port Vell Aerial Tramway, which runs via Torre Jaume I and Torre Sant Sebastià over the port.
Buses in Barcelona are a major form of public transport, with extensive local, interurban and night bus networks. Most local services are operated by the TMB, although some other services are operated by a number of private companies, albeit still within the ATM fare structure. A separate private bus line, known as Aerobús, links the airport with the city centre, with its own fare structure.
The Estació del Nord (Northern Station), a former railway station which was renovated for the 1992 Olympic Games, now serves as the terminus for long-distance and regional bus services.
Barcelona has a metered taxi fleet governed by the Institut Metropolità del Taxi (Metropolitan Taxi Institute), composed of more than 10,000 cars. Most of the licences are in the hands of self-employed drivers. With their black and yellow livery, Barcelona's taxis are easily spotted, and can be caught at one of the many taxi ranks, hailed on the street, or called by telephone or app.
On 22 March 2007, Barcelona's City Council started the Bicing service, a bicycle-sharing service conceived as a form of public transport. Once users have their user card, they can take a bicycle from any of the more than 400 stations spread around the city, use it anywhere in the urban area, and then leave it at another station. The service has been a success, attracting 50,000 subscribed users in three months.
Barcelona lies on three international routes, including European route E15 that follows the Mediterranean coast, European route E90 to Madrid and Lisbon, and European route E09 to Paris. It is also served by a comprehensive network of motorways and highways throughout the metropolitan area, including A-2, A-7/AP-7, C-16, C-17, C-31, C-32, C-33, C-60.
The city is circled by three half ring roads or bypasses: the Ronda de Dalt (B-20) (on the mountain side), the Ronda del Litoral (B-10) (along the coast) and the Ronda del Mig (separated into two parts: the Travessera de Dalt in the north and the Gran Via de Carles III), the first two being partially covered fast highways with several exits that bypass the city.
The city's main arteries include Diagonal Avenue, which crosses it diagonally, Meridiana Avenue which leads to Glòries and connects with Diagonal Avenue and Gran Via de les Corts Catalanes, which crosses the city from east to west, passing through its centre. The famous boulevard of La Rambla, whilst no longer an important vehicular route, remains an important pedestrian route.
The "Barri Gòtic" (Catalan for "Gothic Quarter") is the centre of the old city of Barcelona. Many of the buildings date from medieval times, some from as far back as the Roman settlement of Barcelona. Catalan "modernista" architecture (related to the movement known as Art Nouveau in the rest of Europe) developed between 1885 and 1950 and left an important legacy in Barcelona. Several of these buildings are World Heritage Sites. Especially remarkable is the work of architect Antoni Gaudí, which can be seen throughout the city. His best-known work is the immense but still unfinished church of the Sagrada Família, which has been under construction since 1882 and is still financed by private donations. , completion is planned for 2026.
Barcelona was also home to Mies van der Rohe's Barcelona Pavilion. Designed in 1929 as Germany's pavilion for the International Exposition, it was an iconic building that came to symbolise modern architecture as the embodiment of van der Rohe's aphorisms "less is more" and "God is in the details". The Barcelona Pavilion was intended as a temporary structure and was torn down in 1930, less than a year after it was constructed. A modern re-creation by Spanish architects, constructed in 1986, now stands in Barcelona.
Barcelona won the 1999 RIBA Royal Gold Medal for its architecture, the first (and so far only) time the winner has been a city rather than an individual architect.
Barcelona is home to many points of interest declared World Heritage Sites by UNESCO.
Barcelona has a great number of museums, which cover different areas and eras. The National Museum of Art of Catalonia possesses a well-known collection of Romanesque art, while the Barcelona Museum of Contemporary Art focuses on post-1945 Catalan and Spanish art. The Fundació Joan Miró, the Picasso Museum and the Fundació Antoni Tàpies hold important collections of these world-renowned artists, as does the Can Framis Museum, which focuses on post-1960 Catalan art and is owned by the Fundació Vila Casas.
Several museums cover the fields of history and archaeology, such as the Barcelona City History Museum (MUHBA), the Museum of the History of Catalonia, the Archaeology Museum of Catalonia, the Barcelona Maritime Museum, the Music Museum of Barcelona and the privately owned Egyptian Museum. The Erotic Museum of Barcelona is among the most peculiar, while CosmoCaixa is a science museum that received the European Museum of the Year Award in 2006.
The Museum of Natural Sciences of Barcelona was founded in 1882 under the name "Museo Martorell de Arqueología y Ciencias Naturales" (Spanish for "Martorell Museum of Archaeology and Natural Sciences"). In 2011 the Museum of Natural Sciences became a merger of five institutions: the Museum of Natural Sciences of Barcelona (the main site, at the Forum Building), the Martorell Museum (the historical seat of the museum, open to the public from 1924 to 2010 as a geology museum), the "Laboratori de Natura" at the Castle of the Three Dragons (the Zoology Museum from 1920 to 2010), the Historical Botanical Garden of Barcelona, founded in 1930, and the Botanical Garden of Barcelona, founded in 1999. The two gardens are also part of the Botanical Institute of Barcelona.
The FC Barcelona Museum has been the most visited museum in the city of Barcelona, with 1,506,022 visitors in 2013.
Barcelona contains sixty municipal parks, twelve of which are historic, five thematic (botanical), forty-five urban and six forest. They range from vest-pocket parks to large recreation areas. The urban parks alone cover 10% of the city. The total park surface grows about per year, with a proportion of park area per inhabitant.
Of Barcelona's parks, Montjuïc is the largest, with 203 ha located on the mountain of the same name. It is followed by Parc de la Ciutadella (which occupies the site of the old military citadel and houses the Parliament building, the Barcelona Zoo and several museums), the Guinardó Park, Park Güell (designed by Antoni Gaudí), Oreneta Castle Park, Diagonal Mar Park (inaugurated in 2002), Nou Barris Central Park, Can Dragó Sports Park, Poblenou Park and the Labyrinth Park, named after the garden maze it contains. There are also several smaller parks, for example the Parc de Les Aigües. A part of the Collserola Park is also within the city limits. PortAventura World, one of the largest resorts in Europe, with 5,837,509 visitors per year, is located one hour's drive from Barcelona. Within the city also lies Tibidabo Amusement Park, a smaller amusement park in Plaza del Tibidabo, featuring the Muntanya Russa amusement ride.
Barcelona's beach was listed as number one in a list of the top ten city beaches in the world by "National Geographic" and "Discovery Channel". Barcelona contains seven beaches, totalling of coastline. Sant Sebastià, Barceloneta and Somorrostro, each in length, are the largest, oldest and most-frequented beaches in Barcelona.
The Olympic Harbour separates them from the other city beaches: Nova Icària, Bogatell, Mar Bella, Nova Mar Bella and Llevant. These beaches, varying in length, were opened as a result of the city's restructuring to host the 1992 Summer Olympics, when a great number of industrial buildings were demolished. At present, the beach sand is artificially replenished, given that storms regularly remove large quantities of material. The 2004 Universal Forum of Cultures left the city a large concrete bathing zone on the easternmost part of the city's coastline. Most recently, Llevant became the first beach to allow dogs access during the summer season.
Barcelona is twinned with a number of cities worldwide.
Other forms of co-operation and city friendship, similar to the twin city programmes, exist with many cities worldwide.
Bandy
Bandy is a team winter sport played on ice, in which skaters use sticks to direct a ball into the opposing team's goal.
The sport is considered a form of hockey and has a common background with ice hockey and field hockey. Bandy has also been influenced by the rules of association football: both games are normally played in halves of 45 minutes, there are 11 players on each team, and the fields in both games are about the same size. Bandy is played, like ice hockey, on ice but players use bowed sticks and a small ball, as in field hockey.
A variant of bandy, rink bandy, is played to the same rules but on a field the size of an ice hockey rink, with ice hockey goal cages and with six players on each team, or five in the USA Rink Bandy League. Traditional eleven-a-side bandy and rink bandy are recognized by the International Olympic Committee. More informal varieties also exist, like seven-a-side bandy with normally sized goal cages but without corner strokes; those rules were applied at the Davos Cup in 2016.
Rink bandy has in turn led to the creation of the sport of rinkball. Bandy is also the predecessor of floorball, which was invented when people started playing with plastic bandy-shaped sticks and lightweight balls on the floors of indoor gym halls.
Based on the number of participating athletes, bandy is the world's second-most participated winter sport after ice hockey. Bandy is also ranked as the number two winter sport in terms of tickets sold per day of competitions at the sport's world championship.
However, compared with the seven Winter Olympic sports, bandy's global popularity is considered by the International Olympic Committee to represent a "gap between popularity and participation and global audiences". This is held to constitute a roadblock to future Olympic inclusion.
The earliest origin of the sport is debated. Though many Russians see their old countrymen as the creators of the sport – reflected by the unofficial title for bandy, "Russian hockey" (русский хоккей) – Russia, England and Holland each had sports or pastimes which can be seen as forerunners of the present sport.
English bandy developed as a winter sport in the Fens of East Anglia. Large expanses of ice would form on the flooded meadows or shallow washes in cold winters, and skating has been a tradition. Members of the Bury Fen Bandy Club published rules of the game in 1882, and introduced it into other countries. The first international match took place in 1891 between Bury Fen and the then Haarlemsche Hockey & Bandy Club from the Netherlands (a club which after a couple of club fusions now is named HC Bloemendaal). The same year, the National Bandy Association was started in England.
The match later dubbed "the original bandy match", was actually held in 1875 at The Crystal Palace in London. However, at the time, the game was called "hockey on the ice", probably as it was considered an ice variant of field hockey.
The first national bandy league was started in Sweden in 1902. Bandy was played at the Nordic Games in Stockholm and Kristiania (present day Oslo) in 1901, 1903, and 1905 and between Swedish, Finnish and Russian teams at similar games in Helsinki in 1907. A European championship was held in 1913 with eight countries participating.
In modern times, Russia has held a top position in bandy, both as a founding nation of the international federation in 1955 and as the home of the most successful team in the World Championships (counting the previous Soviet Union team and Russia together).
The highest altitude at which bandy has been played is in Khorugh, the capital of the Tajik autonomous province of Gorno-Badakhshan.
As a precursor to ice hockey, bandy has influenced that sport's development and history, mainly in European and former Soviet countries. While modern ice hockey was created in Canada, a game more similar to bandy was played there initially, after British soldiers introduced the game in the late 19th century. At the same time as modern ice hockey rules were formalized in British North America (present-day Canada), bandy rules were formulated in Europe. A cross between English and Russian bandy rules eventually developed, with the football-inspired English rules dominant, together with the Russian low border along most of the two sidelines, and this has been the basis of the present sport since the 1950s.
Before Canadians introduced ice hockey into Europe in the early 20th century, "hockey" was another name for bandy, and still is in parts of Russia and Kazakhstan.
With football and bandy being dominant sports in parts of Europe, it was common for sports clubs to have bandy and football sections, with athletes playing both sports at different times of the year. Some examples are English Nottingham Forest Football and Bandy Club (today known just as Nottingham Forest F.C.) and Norwegian Strømsgodset IF and Mjøndalen IF, with the latter still having an active bandy section. In Sweden, most football clubs which were active during the first half of the 20th century also played bandy. Later, as the season for each sport increased in time, it was not as easy for the players to engage in both sports, so some clubs came to concentrate on one or the other. Many old clubs still have both sports on their program.
Both bandy and ice hockey were played in Europe during the 20th century, especially in Sweden, Finland, and Norway. Ice hockey became more popular than bandy in most of Europe mostly because it had become an Olympic sport, while bandy had not. Athletes in Europe who had played bandy switched to ice hockey in the 1920s to compete in the Olympics. The smaller ice fields needed for ice hockey also made its rinks easier to maintain, especially in countries with short winters. On the other hand, ice hockey was not played in the Soviet Union until the 1950s when the USSR wanted to compete internationally. The typical European style of ice hockey, with flowing, less physical play, represents a heritage of bandy.
The sport's English name comes from the verb "to bandy", from the Middle French "bander" ("to strike back and forth"), and originally referred to a 17th-century Irish game similar to field hockey. The curved stick was also called a "bandy". The etymological connection to the similarly named Welsh hockey game of bando is not clear.
An old name for bandy is "hockey on the ice"; in the first rule books from England at the turn of the Century 1900, the sport is literally called "bandy or hockey on the ice". Since the mid-20th century the term bandy is usually preferred to prevent confusion with ice hockey.
The sport is known as bandy in many languages, though there are a few notable exceptions. In Russian, bandy is called "Russian hockey" (русский хоккей) or, more frequently and officially, "hockey with a ball" (xоккей с мячом), while ice hockey is called "hockey with a puck" (xоккей с шайбой) or, more frequently, just "hockey". If the context makes it clear that bandy is the subject, it too can be called just "hockey". In Belarusian, Ukrainian and Bulgarian it is also called "hockey with a ball" (хакей з мячoм, хокей з м'ячем and хокей с топка respectively). In Slovak the name is "bandy hockey" (bandyhokej). In Armenian, Kazakh, Kyrgyz, Mongolian and Uzbek, bandy is known as "ball hockey" (գնդակով հոկեյ, допты хоккей, топтуу хоккей, бөмбөгтэй хоккей and koptokli xokkey respectively). In Finnish the two sports are distinguished as "ice ball" (jääpallo) and "ice puck" (jääkiekko), as they are in Hungarian (jéglabda; jégkorong), although in Hungarian the sport is nowadays more often called "bandy". In Estonian bandy is also called "ice ball" (jääpall). In Mandarin Chinese it is "bandy ball" (班迪球). In Scottish Gaelic the name is "ice shinty" (camanachd-deighe). In older times, shinty or shinney were also sometimes used in English for bandy.
Bandy is played on ice, using a single round ball. Two teams of 11 players each compete to get the ball into the other team's goal using sticks, thereby scoring a goal.
The game is designed to be played on a rectangle of ice the same size as a football field. Bandy also has other rules that are similar to football. Each team has 11 players, one of whom is a goalkeeper. The offside rule is also employed. A goal cannot be scored from a goal throw, but unlike football, a goal can be scored from a stroke-in or a corner stroke. All free strokes are "direct" and allow a goal to be scored without another player touching the ball.
The team that has scored more goals at the end of the game is the winner. If both teams have scored an equal number of goals, then, with some exceptions, the game is a draw.
The primary rule is that the players (other than the goalkeepers) may not intentionally touch the ball with their heads, hands or arms during play. Although players usually use their sticks to move the ball around, they may use any part of their bodies other than their heads, hands or arms and may use their skates in a limited manner. Heading the ball results in a five-minute penalty.
In typical game play, players attempt to propel the ball toward their opponents' goal through individual control of the ball, such as by dribbling, passing the ball to a teammate, and taking shots at the goal, which is guarded by the opposing goalkeeper. Opposing players may try to regain control of the ball by intercepting a pass or through tackling the opponent who controls the ball. However, physical contact between opponents is limited. Bandy is generally a free-flowing game, with play stopping only when the ball has left the field of play, or when play is stopped by the referee. After a stoppage, play can recommence with a free stroke, a penalty shot or a corner stroke. If the ball has left the field along the sidelines, the referee must decide which team touched the ball last, and award a restart stroke to the opposing team, just like football's throw-in.
The rules do not specify any player positions other than goalkeeper, but a number of player specialisations have evolved. Broadly, these include three main categories: forwards, whose main task is to score goals; defenders, who specialise in preventing their opponents from scoring; and midfielders, who take the ball from the opposition and pass it to the forwards. Players in these positions are referred to as outfield players, to discern them from the single goalkeeper. These positions are further differentiated by which side of the field the player spends most time in. For example, there are central defenders, and left and right midfielders. The ten outfield players may be arranged in these positions in any combination (for example, there may be three defenders, five midfielders, and two forwards), and the number of players in each position determines the style of the team's play; more forwards and fewer defenders would create a more aggressive and offensive-minded game, while the reverse would create a slower, more defensive style of play. While players may spend most of the game in a specific position, there are few restrictions on player movement, and players can switch positions at any time. The layout of the players on the pitch is called the team's "formation", and defining the team's formation and tactics is usually the prerogative of the team's manager(s).
There are eighteen rules in official play, designed to apply to all levels of bandy, although certain modifications for groups such as juniors, veterans or women are permitted. The rules are often framed in broad terms, which allow flexibility in their application depending on the nature of the game.
The Bandy Playing Rules can be found on the official website of the Federation of International Bandy, and are overseen by the Rules and Referee Committee.
Each team consists of a maximum of 11 players (excluding substitutes), one of whom must be the goalkeeper. A team of fewer than eight players may not start a game. Goalkeepers are the only players allowed to play the ball with their hands or arms, and they are only allowed to do so within the penalty area in front of their own goal.
Though there are a variety of positions in which the outfield (non-goalkeeper) players are strategically placed by a coach, these positions are not defined or required by the rules of the game.
The positions and formations of the players in bandy are virtually the same as the common association football positions, and the same terms are used for the different positions. A team usually consists of defenders, midfielders and forwards. The defenders can play as centre-backs, full-backs and sometimes wing-backs; the midfielders play in the centre, attacking or defensive; and the forwards play as centre forwards, second strikers and sometimes wingers. Sometimes one player also takes up the role of a libero.
Any number of players may be replaced by substitutes during the course of the game. Substitutions can be performed without notifying the referee and can be performed while the ball is in play. However, if the substitute enters the ice before his teammate has left it, this results in a five-minute ban. A team can bring at most four substitutes to the game, and one of these is likely to be an extra goalkeeper.
A game is officiated by a referee, the authority and enforcer of the rules, whose decisions are final. The referee may have one or two assistant referees. A secretary outside of the field often takes care of the match protocol.
The basic equipment players are required to wear includes a pair of bandy skates, a helmet, a mouth guard and, in the case of the goalkeeper, a face guard.
The teams must wear uniforms that make it easy to distinguish the two teams. The goalkeeper wears distinct colours to stand out from his or her teammates, just as in football. The skates, sticks and any tape on the stick must be of a different colour from the bandy ball, which shall be orange or cerise.
In addition to the aforementioned equipment, various protectors are used to protect the knees, elbows, genitals and throat. The pants and gloves may contain padding.
The stick used in bandy is an essential part of the sport. It should be made of an approved material, such as wood or a similar material, and should not contain any metal or sharp parts which could hurt the surrounding players. Sticks are crooked and are available in five angles, where 1 has the smallest bend and 5 has the biggest. Bend 4 is the most common in professional bandy. The bandy stick should not have a colour similar to the ball, such as orange or pink; it should be no longer than , and no wider than .
A bandy field is by , a total of , or about the same size as a football pitch and considerably larger than an ice hockey rink. Along the sidelines, a high border (vant, sarg, wand, wall) is placed to prevent the ball from leaving the ice. It is not attached to the ice, so that it can glide upon collisions, and it ends away from the corners.
Centered at each short line is a wide and high goal cage, and in front of the cage is a half-circular penalty area with a radius. A penalty spot is located in front of the goal, and there are two free-stroke spots on the penalty area line, each surrounded by a circle.
A centre spot with a surrounding circle of radius denotes the centre of the field. A centre line is drawn through the centre spot, parallel with the short lines.
At each of the corners, a radius quarter-circle is drawn, and a dotted line is painted parallel to the short line and away from it, without extending into the penalty area. The dotted line can be replaced with a long line starting at the edge of the penalty area and extending towards the sideline, from the short line.
A standard adult bandy match consists of two periods of 45 minutes each, known as halves. Each half runs continuously, meaning the clock is not stopped when the ball is out of play; the referee can, however, make allowance for time lost through significant stoppages as described below. There is usually a 15-minute half-time break. The end of the match is known as full-time.
The referee is the official timekeeper for the match, and may make an allowance for time lost through substitutions, injured players requiring attention, or other stoppages. This added time is commonly referred to as "stoppage time" or "injury time", and must be reported to the match secretary and the two captains. The referee alone signals the end of the match.
If it is very cold or if it is snowing, the match can be broken into thirds of 30 minutes each. At the extremely cold 1999 World Championship some matches were played in four periods of 15 minutes each and with extra long breaks in between. In the World Championships the two halves can be 30 minutes each for the nations in the B division.
In league competitions games may end in a draw, but in some knockout competitions, if a game is tied at the end of regulation time, it may go into extra time, which consists of two further 15-minute periods. If the score is still tied after extra time, the game will be replayed. As an alternative, the two extra 15-minute periods may be played as "golden goal", which means the first team that scores during extra time wins the game. If both extra periods are played without a goal scored, a penalty shootout settles the game. The teams shoot five penalties each, and if this does not settle the game, the teams shoot one more penalty each until one of them misses and the other scores.
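As a minimal sketch of the tie-breaking logic just described (illustrative only; the per-shot scoring probabilities are assumed, not taken from any official source):

```python
import random

def shootout(p_a: float = 0.8, p_b: float = 0.8) -> str:
    """Resolve a tied bandy game per the rules above: five penalties
    per team, then one-per-team sudden-death rounds until the scores
    differ. p_a and p_b are assumed per-shot scoring probabilities."""
    score_a = sum(random.random() < p_a for _ in range(5))
    score_b = sum(random.random() < p_b for _ in range(5))
    # Sudden death: one more penalty each until one team leads.
    while score_a == score_b:
        score_a += random.random() < p_a
        score_b += random.random() < p_b
    return "Team A" if score_a > score_b else "Team B"

print(shootout())
```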
Under the rules, the two basic states of play during a game are "ball in play" and "ball out of play". From the beginning of each playing period with a stroke-off (a set strike from the centre spot by one team) until the end of the playing period, the ball is in play at all times, except when the ball leaves the field of play or play is stopped by the referee. When the ball becomes out of play, play is restarted by one of six restart methods, depending on how it went out of play.
If the time runs out while a team is preparing for a free-stroke or penalty, the strike should still be made but it must go into the goal by one shot to count as a goal. Similarly, a goal made via a corner stroke should be allowed, but it must be executed using only one shot in addition to the strike needed to put the ball in play.
Free-strokes can be awarded to a team if a player of the opposing team breaks a rule, for example by hitting the opponent's stick or skates with the stick. Free-strokes can also be awarded for incorrect execution of corner strokes, free-strokes, goal-throws and so on, or for the use of incorrect equipment, such as a broken stick.
Rather than stopping play, the referee may allow play to continue when its continuation will benefit the team against which an offence has been committed. This is known as "playing an advantage". The referee may "call back" play and penalise the original offence if the anticipated advantage does not ensue within a short period of time, typically taken to be four to five seconds. Even if an offence is not penalised because the referee plays an advantage, the offender may still be sanctioned (see below) for any associated misconduct at the next stoppage of play.
If a defender violently attacks an opponent within the penalty area, a penalty shot is awarded. Certain other offences, when carried out within the penalty area, result in a penalty shot provided there is a goal situation. These include a defender holding or hooking an attacker, or blocking a goal situation with a lifted skate, thrown stick or glove, and so on. The defenders (with the exception of the goalkeeper) are also not allowed to kneel or lie on the ice. The final offences that can mandate a penalty shot are hitting or blocking an opponent's stick, or touching the ball with the hands, arms, stick or head. If any of these actions is carried out in a non-goal situation, a free-stroke is awarded instead, taken from one of the free-stroke spots at the penalty area line. A penalty shot is always accompanied by a five- or ten-minute penalty (see below). If the penalty results in a goal, the time penalty is considered personal, meaning that a substitute can be sent in for the penalised player. This does not apply in the event of a red card (see below).
A ten-minute penalty is indicated through the use of a blue card and can be given for protesting or behaving incorrectly, attacking an opponent violently, or stopping the ball incorrectly to gain an advantage.
The third time a player receives a penalty, it will be a personal penalty, meaning he or she will miss the remainder of the match; a substitute can enter the field after five or ten minutes. A full-game penalty can be given for using abusive language or for directly attacking an opponent, and means that the player can neither play nor be substituted for the remainder of the game. Such a match penalty is indicated through the use of a red card.
The offside rule effectively limits the ability of attacking players to remain forward (i.e. closer to the opponent's goal-line) of the ball, the second-to-last defending player (which can include the goalkeeper), and the half-way line. The rule operates much like the offside rule in soccer.
The Federation of International Bandy (FIB) has had at most 33 members; it currently has 27. Formed in 1955 as the "International Bandy Federation", it changed its name in 2001 after the International Olympic Committee approved it as a so-called "recognized sport", because the abbreviation "IBF" was already used by another recognized sports federation. In 2004, the FIB was fully accepted by the IOC.
The FIB is now a member of the Association of IOC Recognised International Sports Federations.
The Bandy World Championship for men is arranged by the FIB and was first held in 1957. It was held every two years starting in 1961, and every year since 2003. The current record number of countries participating in the World Championship is twenty (2019). Since the number of countries playing bandy is not large, every country that can field a team is welcome to take part in the World Championship. The quality of the teams varies, however; only six nations have won medals: Sweden, the Soviet Union, Russia, Finland, Norway, and Kazakhstan (allowing for the fact that Russia's team took over from the Soviet Union in 1993). Finland won the 2004 world championship in Västerås, Sweden, while all other championships have been won by Sweden, the Soviet Union or Russia.
In February 2004, Sweden won the first World Championship for women, hosted in Finland, without conceding a goal. Russia won the 2014 women's World Championship, toppling the Swedes for the first time; in 2016 Sweden took the title back. In 2018 the tournament was held in an entirely Asian country for the first time, when Chengde in China hosted it.
The same became true of the men's tournament when Harbin hosted the 2018 Division B tournament (earlier host Kazakhstan is a transcontinental country, since the area north and west of the Ural River lies in Europe).
There are also Youth Bandy World Championships in different age groups for boys and young men and in one age group for girls. The oldest group is the under 23 championship, Bandy World Championship Y-23.
Bandy is recognized by the International Olympic Committee, and was played as a demonstration sport at the 1952 Winter Olympics in Oslo. However, it has yet to officially be played at the Olympics.
FIB president Boris Skrynnik lobbied for bandy to be included in the 2014 Winter Olympics in Sochi, given Russia's prominence in the sport. Members of the Chinese Olympic Committee were present at the 2017 world championships to meet with Skrynnik about the possibility of considering the sport for the 2022 Winter Olympics in Beijing. However, in 2018 it was announced that no new sports would be added for 2022.
Compared with the seven Winter Olympic sports, bandy is considered by the International Olympic Committee to have a "gap between popularity and participation and global audiences", which is a roadblock to future Olympic inclusion.
At the 2011 Asian Winter Games, open to members of the Olympic Council of Asia, men's bandy was included for the first time. Three teams contested the inaugural competition, and Kazakhstan won the gold medal. President Nursultan Nazarbayev attended the final.
There was no bandy competition at the 2017 Asian Winter Games in Japan.
Bandy made its debut at the Winter Universiade during the 2019 Games. Originally a six-team tournament for men and a four-team tournament for women were planned. However, China later withdrew from the men's tournament and was supposed to be replaced by Belarus. Since that did not happen either, the participating women's teams were Russia, Sweden, Norway and the USA, while the men's teams were Russia, Sweden, Norway, Finland and Kazakhstan.
There is a chance of participation in 2023 as well; indeed, the International University Sports Federation expects it to happen.
The World Championships should not be confused with the World Cup in Ljusdal, Sweden, which has been played annually since the 1970s and is the biggest bandy tournament for elite-level club teams. Since 2009 it has been played indoors in Sandviken, because Ljusdal has no indoor arena; it is expected to return to Ljusdal once an indoor arena has been built there. World Cup matches are played day and night, and the tournament takes place over four days in late October. The participating teams are mostly, and some years exclusively, from Sweden and Russia, which have the two best leagues in the world.
Since 2007, there has also been a Bandy World Cup Women for women's teams.
Rink bandy is a variant played on an ice hockey-sized rink. It was on the programme of the 2012 European Company Sports Games.
Some FIB countries do not have a large ice surface and only play rink bandy at home; this includes most of the participants in the World Championships Group B.
The China Bandy Federation was set up in 2014, and China has since participated in a number of world championship tournaments with men's, women's and youth teams. Chinese bandy is mainly financed by private resources. The development of the sport in China is supported by the Harbin Sport University.
The first recorded games of bandy on ice took place in The Fens during the great frost of 1813–1814, although it is probable that the game had been played there in the previous century. Bury Fen Bandy Club from Bluntisham-cum-Earith, near St Ives, was the most successful team, remaining unbeaten until the winter of 1890–1891. Charles G Tebbutt of the Bury Fen bandy club was responsible for the first published rules of bandy in 1882, and also for introducing the game into the Netherlands and Sweden, as well as elsewhere in England where it became popular with cricket, rowing and hockey clubs. Tebbutt's home-made bandy stick can be seen in the Norris Museum in St Ives.
The first Ice Hockey Varsity Matches between Oxford University and Cambridge University were played to bandy rules, even if it was called hockey on ice at the time.
England won the European Bandy Championships in 1913, but that turned out to be the grand finale, and bandy is now virtually unknown in England. In March 2004, Norwegian ex-player Edgar Malman invited two big clubs to play a rink bandy exhibition game in Streatham, London: Russian champions and World Cup winners Vodnik met Swedish champions Edsbyns IF in a match that ended 10–10. In 2010 England became a Federation of International Bandy member. The federation is based in Cambridgeshire, the sport's historical heartland.
The England Bandy Federation, now the Great Britain Bandy Federation, was set up on 2 January 2017 at a meeting held in a historic old skaters' public house, the Lamb and Flag in Welney, Norfolk, England, replacing the Bandy Federation of England, which was founded in 2010. The president is Rev Lyn Gibb-de Swarte of Littleport, a past resident of Streatham in south-west London, where she chaired the Streatham ice speed club, the ice hockey club and the association of ice clubs. The vice-presidents are Simon Seager and Les Mead; the chair is Andrew Hutchinson; the treasurer is Tammy Nichol Twallin; the general secretary, fixtures and minutes secretary is Cathy Gibb-de Swarte; the participation officer is Anders Gidrup; and Oscar Gillingham Aukner handles recruitment in the UK. All are promoting the sport and plan to institute rink bandy around the country. The president is the project director of the Littleport Ice Stadium Project, and plans have already been drawn for a 400-metre indoor speed skating oval with an inner ice pad forming a 100 × 60 metre bandy pitch. In September 2017, the federation decided to widen its territory to all of the United Kingdom and changed its name to the Great Britain Bandy Federation. Great Britain entered a national team in the 2019 World Championship Group B in January; undefeated up to the final, the team won the silver medal in its final match against Estonia.
Bandy was played in Estonia from the 1910s to the 1930s, and the country had a national championship for some years. The national team played friendlies against Finland in the 1920s and '30s. The sport was played sporadically during the Soviet occupation of 1944–1991. It has since become more organised again, partly through exchanges with Finnish clubs and enthusiasts. As of 2018, Estonia takes part in both the men's and the women's Bandy World Championships.
Bandy was introduced to Finland from Russia in the 1890s. Finland has been playing bandy friendlies against Sweden and Estonia since its independence in 1917.
The first Finnish national championship was held in 1908; it was the first national championship held in any team sport in Finland. National champions have been named every year except for three years in the first half of the 20th century when Finland was at war. The top national league is called Bandyliiga and is semi-professional. The best players often turn fully professional by being recruited by clubs in Sweden or Russia.
Finland was an original member of the Federation of International Bandy and is the only country beside Russia/Soviet Union and Sweden to have won a Bandy World Championship, which it did in 2004.
Bandy was played in Germany in the early 20th century, including by Crown Prince Wilhelm, but the interest died out in favour of ice hockey. Leipziger Sportclub had the best team and was also last to give bandy up. The sport was reintroduced in the 2010s, with the German Bandy Association being founded in 2013.
Bandy has a long history in many parts of Kazakhstan and used to be one of the most popular sports in Soviet times. After independence, however, it suffered a rapid decline in popularity and only remained in Oral (often called by its Russian name, Uralsk), where the country's only professional club, Akzhaiyk, is located. The club competes in the Russian second-tier division, the Supreme League. Recently bandy has started to gain popularity again outside Oral, most notably in Petropavl and Khromtau; these were, for example, the three Kazakh cities with players in the squad at the 2016 Youth-17 World Championship. The capital Nur-Sultan has hosted national youth championships in rink bandy as well as championships in traditional eleven-a-side bandy. The former capital Almaty has in recent years hosted both the Asian Winter Games (with bandy on the programme) and the Bandy World Championship, in which Kazakhstan finished third. Plans have been made to reinvigorate the bandy section of the club Dynamo Almaty, which won the Soviet championship in 1977 and 1990 as well as the European Cup in 1978. The Asian Bandy Federation also has its headquarters in Almaty. For the past few years the state has supported bandy. Medeu in Almaty is the only arena with artificial ice; a second arena in Almaty was built for the 2012 World Championship but was taken down afterwards. Stadion Yunost in Oral was supposed to get artificial ice for the 2017–18 season; the installation was delayed, but in 2018 it was officially ready for use.
The Mongolian national team took a silver medal at the 2011 Asian Winter Games, which led to its being chosen as the best Mongolian sports team of 2011. Mongolia was proud to win the bronze medal of the B division at the 2017 Bandy World Championship, after which the then President of Mongolia, Tsakhiagiin Elbegdorj, held a reception for the team.
Bandy was introduced to the Netherlands in the 1890s by Pim Mulier and the sport became popular. In the 1920s interest turned to ice hockey, but in contrast to other countries in central and western Europe, the sport has been played continuously in the Netherlands; since the 1970s the country has been a member of the FIB and games have become more formalised again. The national team started to compete at the World Championships in 1991. However, without a proper venue, only rink bandy is played within the country. The national governing body is the Bandy Bond Nederland.
Bandy was introduced to Norway in the 1910s. Swedes contributed greatly to its spread, and clubs sprang up around the capital, Oslo. The Oslo region, including neighbouring towns, is still the part of Norway where bandy enjoys the most popularity.
In 1912 the Norwegians played their first national championship, which was held annually up to 1940. During WWII, illegal bandy was played in hidden places in forests and on ponds and lakes, and in 1943, 1944 and 1945 illegal championships were held. In 1946 legal play resumed and still goes on. After WWII the number of teams rose, as did attendance, which was regularly in the thousands, but mild winters in the 1970s and '80s shrank the league, and in 2003 only five clubs contested the first division, with low attendance numbers and little media coverage.
In recent years, the number of artificially frozen pitches has increased in Norway, and more sports clubs have reinvigorated their bandy sections with new men's and youth teams. Because of this, as well as an increase in Swedish players in Norway, the competitiveness of the game has risen, especially in the first division, Eliteserien. The adult men's game in Norway today consists of Eliteserien, with eight teams, and three lower divisions. Bandy in Norway has also started to spread geographically, but some clubs in outlying locations in the 3rd division only have access to ice hockey rinks and therefore play rink bandy for home games. Compared to the past, attendance is still fairly low, but important Eliteserien matches can attract around 1,000 spectators.
In Russia bandy is known as hockey with a ball or simply Russian hockey. A similar game became popular among the Russian nobility in the early 1700s, with the imperial court of Peter the Great playing a predecessor of modern bandy on Saint Petersburg's frozen Neva river. Russians played this game in ordinary footwear, with sticks made out of juniper wood; only later were skates introduced. It was only in the second half of the 19th century that bandy became popular among the masses throughout the Russian Empire. Traditionally the Russians used a longer skate blade than other nations, giving them the advantage of skating faster, though they would find it more difficult to turn quickly. A bandy skate has a longer blade than a hockey skate, and the "Russian skate" is even longer.
When the Federation of International Bandy was formed in 1955, with the Soviet Union as one of its founding members, the Russians largely adopted the international rules of the game developed in England in the 19th century, with one notable exception going the other way: the border along the sidelines, which the other countries adopted from the Russian game.
Bandy is considered a national sport in Russia and is the only discipline to have official support of the Russian Orthodox Church.
The Russian Bandy Super League is played every year, and the winner of the final becomes Russian champion. The Russian Cup has been played annually (with a few exceptions) since 1937.
After the victory in the 2016 World Championship, the fourth in a row, President Putin received in the Kremlin four players of the national team; the head coach and vice-president of the Russian Bandy Federation, Sergey Myaus; and the president of both the Russian Bandy Federation and the Federation of International Bandy, Boris Skrynnik. He talked, among other things, about the need to give more support to Russian bandy. It was the first time a head of state had accepted a meeting to talk about Russian bandy. Also attending the meeting were the Minister of Sport, Tourism and Youth Policy, Vitaly Mutko, and presidential adviser Igor Levitin. The month after, Levitin held a follow-up meeting.
Bandy was introduced to Sweden in 1895. The Swedish royal family, noblemen and diplomats were the first players. Swedish championships for men have been played annually since 1907. In the 1920s students played the game and it became a largely middle class sport. After Slottsbrons IF won the Swedish championship in 1934 it became popular amongst workers in the smaller industrial towns and villages. Bandy remains the main sport in many of these places.
Bandy in Sweden is famous for its "culture": both playing bandy and being a spectator requires great fortitude and dedication. A "bandyportfölj" (bandy briefcase) is the classic spectator accessory; it is typically made of brown leather, well worn, and contains a warm drink in a thermos and/or a bottle of liquor.
Bandy is most often played at outdoor arenas in winter, so spectators naturally carry flasks or thermoses of "warming" drinks such as glögg.
A notable tradition is "annandagsbandy", bandy games played on Saint Stephen's Day, which for many Swedes is an important Christmas season tradition and always draws bigger crowds than usual. Games traditionally begin at 1:15 pm.
The final match for the Swedish Championship is played every year on the third Saturday of March. From 1991 to 2012, it was played at Studenternas Idrottsplats in Uppsala, often drawing crowds in excess of 20,000. The final was set in Uppsala partly because of IFK Uppsala's success at the beginning of the 20th century: IFK Uppsala won 11 Swedish Championship titles between 1907 and 1920, making them the most successful bandy club in the country at the time, although the record is now held by Västerås SK. A contributing factor was also the poor quality of the ice at Söderstadion, where the finals were held from 1967 to 1989.
In 2013 and 2014 the final was played indoors at Friends Arena, the national stadium for football in Solna, Stockholm, which has a retractable roof and a capacity of 50,000. The first final at Friends Arena, in 2013, drew a record crowd of 38,474 when Hammarby IF Bandy, after finishing second in six finals during the 2000s, won their second title. Due to declining attendance since then, Tele2 Arena in southern Stockholm was chosen as the venue for 2015 through 2017. However, the new indoor venue failed to attract much more than half of its total capacity, and in May 2017 it was announced that the finals would again be held at Studenternas IP in Uppsala from 2018 to 2021.
In the late 19th and early 20th century, Switzerland became a popular place for winter vacations, and people went there from all over Europe. Winter sports like skiing, sledding and bandy were played in Geneva and other towns. Students from Oxford and Cambridge went to Switzerland to play each other; the predecessor of the recurring Ice Hockey Varsity Match was a bandy match played in St. Moritz in 1885. This popularity of Swiss winter sport venues may be one reason the 1913 European Championship was held there.
Bandy has mainly been played as a recreational sport in Switzerland in the last decade, but a Swiss national team took part in the 2018 Women's Bandy World Championship.
Bandy was played in Ukraine when it was part of the Soviet Union. After independence in 1991, it took some years before organised bandy formed again, but Ukrainian champions have been named annually since 2012.
Bandy has been played in the United States since around 1980. The sport is centered in Minnesota, with very few teams based elsewhere. The United States national bandy team has participated in the Bandy World Championships since 1985 and also regularly plays friendly matches against Canada.
United States bandy championships have been played annually since the early 1980s, but the sport is not widely covered by American sports media. | https://en.wikipedia.org/wiki?curid=4444 |
Bob Frankston
Robert M. Frankston (born June 14, 1949) is an American software engineer and businessman who co-created, with Dan Bricklin, the VisiCalc spreadsheet program. Frankston is also the co-founder of Software Arts.
Frankston was born and raised in Brooklyn, New York. He graduated from Stuyvesant High School in New York City in 1966. He earned an S.B. degree in computer science and mathematics from the Massachusetts Institute of Technology, followed by a Master of Engineering degree in computer science, also from MIT.
Following his work with Dan Bricklin, Frankston later worked at Lotus Software and Microsoft.
Frankston became an outspoken advocate for reducing the role of telecommunications companies in the evolution of the Internet, particularly with respect to broadband and mobile communications. He coined the term "Regulatorium" to describe what he considers collusion between telecommunication companies and their regulators that prevents change. | https://en.wikipedia.org/wiki?curid=4445 |
Booker Prize
The Booker Prize for Fiction, formerly known as the Booker–McConnell Prize (1969–2001) and the Man Booker Prize (2002–2019), is a literary prize awarded each year for the best original novel written in the English language and published in the United Kingdom. The winner of the Booker Prize is generally assured international renown and success; therefore, the prize is of great significance for the book trade. From its inception, only novels written by Commonwealth, Irish, and South African (and later Zimbabwean) citizens were eligible to receive the prize; in 2014 it was widened to any English-language novel—a change that proved controversial.
A high-profile literary award in British culture, the Booker Prize is greeted with anticipation and fanfare. It is also a mark of distinction for authors to be selected for inclusion in the shortlist or even to be nominated for the "longlist".
The prize was originally established as the Booker–McConnell Prize, after the company Booker, McConnell Ltd began sponsoring the event in 1969; it became commonly known as the "Booker Prize" or simply "the Booker".
When administration of the prize was transferred to the Booker Prize Foundation in 2002, the title sponsor became the investment company Man Group, which opted to retain "Booker" as part of the official title of the prize. The foundation is an independent registered charity funded by the entire profits of Booker Prize Trading Ltd, of which it is the sole shareholder. The prize money awarded with the Booker Prize was originally £21,000, and was subsequently raised to £50,000 in 2002 under the sponsorship of the Man Group, making it one of the world's richest literary prizes.
In 1970, Bernice Rubens became the first woman to win the Booker Prize, for "The Elected Member". The rules of the Booker changed in 1971; previously, it had been awarded retrospectively to books published prior to the year in which the award was given. In 1971 the year of eligibility was changed to the same as the year of the award; in effect, this meant that books published in 1970 were not considered for the Booker in either year. The Booker Prize Foundation announced in January 2010 the creation of a special award called the "Lost Man Booker Prize", with the winner chosen from a longlist of 22 novels published in 1970.
Alice Munro's "The Beggar Maid" was shortlisted in 1980, and remains the only short story collection to be shortlisted.
John Sutherland, who was a judge for the 1999 prize, has said:
In 1972, winning writer John Berger, known for his Marxist worldview, protested during his acceptance speech against Booker McConnell. He blamed Booker's 130 years of sugar production in the Caribbean for the region's modern poverty. Berger donated half of his £5,000 prize to the British Black Panther movement, because it had a socialist and revolutionary perspective in agreement with his own.
In 1980, Anthony Burgess, writer of "Earthly Powers", refused to attend the ceremony unless it was confirmed to him in advance whether he had won. His was one of two books considered likely to win, the other being "Rites of Passage" by William Golding. The judges decided only 30 minutes before the ceremony, giving the prize to Golding. Both novels had been seen as favourites to win leading up to the prize, and the dramatic "literary battle" between two senior writers made front-page news.
In 1981, nominee John Banville wrote a letter to "The Guardian" requesting that the prize be given to him so that he could use the money to buy every copy of the longlisted books in Ireland and donate them to libraries, "thus ensuring that the books not only are bought but also read — surely a unique occurrence."
Judging for the 1983 award produced a draw between J. M. Coetzee's "Life & Times of Michael K" and Salman Rushdie's "Shame", leaving chair of judges Fay Weldon to choose between the two. According to Stephen Moss in "The Guardian", "Her arm was bent and she chose Rushdie," only to change her mind as the result was being phoned through.
In 1992, the jury split the prize between Michael Ondaatje's "The English Patient" and Barry Unsworth's "Sacred Hunger". This prompted the foundation to draw up a rule that made it mandatory for the appointed jury to make the award to just a single author/book.
In 1993, two of the judges threatened to walk out when "Trainspotting" appeared on the longlist; Irvine Welsh's novel was pulled from the shortlist to satisfy them. The novel would later receive critical acclaim, and is now considered Welsh's masterpiece.
The choice of James Kelman's book "How Late It Was, How Late" as 1994 Booker Prize winner proved to be one of the most controversial in the award's history. Rabbi Julia Neuberger, one of the judges, declared it "a disgrace" and left the event, later deeming the book to be "crap"; WHSmith's marketing manager called the award "an embarrassment to the whole book trade"; Waterstone's in Glasgow sold a mere 13 copies of Kelman's book the following week. In 1994, the literary editor of "The Guardian", Richard Gott, citing the lack of objective criteria and the exclusion of American authors, described the prize as "a significant and dangerous iceberg in the sea of British culture that serves as a symbol of its current malaise."
In 1997, the decision to award Arundhati Roy's "The God of Small Things" proved controversial. Carmen Callil, chair of the previous year's Booker judges, called it an "execrable" book and said on television that it should not even have been on the shortlist. Booker Prize chairman Martyn Goff said Roy won because nobody objected, following the judges' rejection of Bernard MacLaverty's shortlisted book on the grounds that he was "a wonderful short-story writer" and that "Grace Notes" was "three short stories strung together".
Before 2001, each year's longlist of nominees was not publicly revealed. In 2001, A. L. Kennedy, who was a judge in 1996, called the prize "a pile of crooked nonsense" with the winner determined by "who knows who, who's sleeping with who, who's selling drugs to who, who's married to who, whose turn it is".
The Booker Prize created a permanent home for its archives, from 1968 to the present, at Oxford Brookes University Library. The archive, which encompasses the administrative history of the prize from 1968 to date, brings together a diverse range of material, including correspondence, publicity material, copies of both the longlists and the shortlists, minutes of meetings, photographs and material relating to the awards dinner (letters of invitation, guest lists, seating plans). Embargoes of ten or twenty years apply to certain categories of material; examples include all material relating to the judging process and the longlist prior to 2002.
Between 2005 and 2008, the Booker Prize alternated between writers from Ireland and India. "Outsider" John Banville began this trend in 2005 when his novel "The Sea" was selected as a surprise winner: Boyd Tonkin, literary editor of "The Independent", famously condemned it as "possibly the most perverse decision in the history of the award" and rival novelist Tibor Fischer poured scorn on Banville's victory. Kiran Desai of India won in 2006. Anne Enright's 2007 victory came about due to a jury badly split over Ian McEwan's novel "On Chesil Beach". The following year it was India's turn again, with Aravind Adiga narrowly defeating Enright's fellow Irishman Sebastian Barry.
Historically, the winner of the Booker Prize had been required to be a citizen of the Commonwealth of Nations, the Republic of Ireland, or Zimbabwe. It was announced on 18 September 2013 that future Booker Prize awards would consider authors from anywhere in the world, so long as their work was in English and published in the UK. This change proved controversial in literary circles. Former winner A. S. Byatt and former judge John Mullan said the prize risked diluting its identity, whereas former judge A. L. Kennedy welcomed the change. Following this expansion, the first winner not from the Commonwealth, Ireland, or Zimbabwe was American Paul Beatty in 2016. Another American, George Saunders, won the following year. In 2018, publishers sought to reverse the change, arguing that the inclusion of American writers would lead to homogenisation, reducing diversity and opportunities everywhere, including in America, to learn about "great books that haven't already been widely heralded."
Man Group announced in early 2019 that the year's prize would be the last of eighteen under their sponsorship. A new sponsor, Crankstart – a charitable foundation run by Sir Michael Moritz and his wife, Harriet Heyman – then announced it would sponsor the award for five years, with the option to renew for another five years. The award title was changed to simply "The Booker Prize".
In 2019, despite having been unequivocally warned against doing so, the foundation's jury, under the chair Peter Florence, split the prize, awarding it to two authors in breach of a rule established in 1993. Florence justified the decision, saying: "We came down to a discussion with the director of the Booker Prize about the rules. And we were told quite firmly that the rules state that you can only have one winner...and as we have managed the jury all the way through on the principle of consensus, our consensus was that it was our decision to flout the rules and divide this year's prize to celebrate two winners." The two were British writer Bernardine Evaristo, for her novel "Girl, Woman, Other", and Canadian writer Margaret Atwood, for "The Testaments". Evaristo's win marked the first time the Booker had been awarded to a black woman, while Atwood, at 79, became its oldest winner.
The selection process for the winner of the prize commences with the formation of an advisory committee, which includes a writer, two publishers, a literary agent, a bookseller, a librarian, and a chairperson appointed by the Booker Prize Foundation. The advisory committee then selects the judging panel, the membership of which changes each year, although on rare occasions a judge may be selected a second time. Judges are selected from amongst leading literary critics, writers, academics and leading public figures.
The Booker judging process and the very concept of a "best book" being chosen by a small number of literary insiders is controversial for many. "The Guardian" introduced the "Not the Booker Prize" voted for by readers partly as a reaction to this.
Author Amit Chaudhuri wrote: "The idea that a 'book of the year' can be assessed annually by a bunch of people – judges who have to read almost a book a day – is absurd, as is the idea that this is any way of honouring a writer."
The winner is usually announced at a ceremony in London's Guildhall in early October.
In 1993, to mark the prize's 25th anniversary, a "Booker of Bookers" Prize was given. Three previous judges of the award, Malcolm Bradbury, David Holloway and W. L. Webb, met and chose Salman Rushdie's "Midnight's Children", the 1981 winner, as "the best novel out of all the winners".
In 2006, the Man Booker Prize set up a "Best of Beryl" prize, for the author Beryl Bainbridge, who had been nominated five times and yet failed to win once. The prize is said to count as a Booker Prize. The nominees were "An Awfully Big Adventure", "Every Man for Himself", "The Bottle Factory Outing", "The Dressmaker" and "Master Georgie", which won.
Similarly, The Best of the Booker was awarded in 2008 to celebrate the prize's 40th anniversary. A shortlist of six winners was chosen and the decision was left to a public vote; the winner was again "Midnight's Children".
In 2018, to celebrate the 50th anniversary, the Golden Man Booker was awarded. One book from each decade was selected by a panel of judges: Naipaul's "In a Free State" (the 1971 winner), Lively's "Moon Tiger" (1987), Ondaatje's "The English Patient" (1992), Mantel's "Wolf Hall" (2009) and Saunders' "Lincoln in the Bardo" (2017). The winner, by popular vote, was "The English Patient".
Since 2014, each publisher's imprint may submit a number of titles based on their longlisting history (previously they could submit two). Non-longlisted publishers can submit one title, publishers with one or two longlisted books in the previous five years can submit two, publishers with three or four longlisted books are allowed three submissions, and publishers with five or more longlisted books can have four submissions.
In addition, previous winners of the prize are automatically considered if they enter new titles. Books may also be called in: publishers can make written representations to the judges to consider titles in addition to those already entered. In the 21st century the average number of books considered by the judges has been approximately 130.
A separate prize for which any living writer in the world may qualify, the Man Booker International Prize was inaugurated in 2005. Until 2015, it was given every two years to a living author of any nationality for a body of work published in English or generally available in English translation. In 2016, the award was significantly reconfigured, and is now given annually to a single book in English translation, with a £50,000 prize for the winning title, shared equally between author and translator.
A Russian version of the Booker Prize was created in 1992 called the Booker-Open Russia Literary Prize, also known as the Russian Booker Prize. In 2007, Man Group plc established the Man Asian Literary Prize, an annual literary award given to the best novel by an Asian writer, either written in English or translated into English, and published in the previous calendar year.
As part of "The Times"' Literature Festival in Cheltenham, a Booker event is held on the last Saturday of the festival. Four guest speakers/judges debate a shortlist of four books from a given year from before the introduction of the Booker prize, and a winner is chosen. Unlike the real Man Booker (1969 through 2014), writers from outside the Commonwealth are also considered. In 2008, the winner for 1948 was Alan Paton's "Cry, the Beloved Country", beating Norman Mailer's "The Naked and the Dead", Graham Greene's "The Heart of the Matter" and Evelyn Waugh's "The Loved One". In 2015, the winner for 1915 was Ford Madox Ford's "The Good Soldier", beating "The Thirty-Nine Steps" (John Buchan), "Of Human Bondage" (W. Somerset Maugham), "Psmith, Journalist" (P. G. Wodehouse) and "The Voyage Out" (Virginia Woolf). | https://en.wikipedia.org/wiki?curid=4446 |
Book of Joel
The Book of Joel is part of the Hebrew Bible and Christian Old Testament, one of twelve prophetic books known as the Twelve Minor Prophets. (The term indicates the short length of the text in relation to longer prophetic texts known as the Major Prophets.)
After a superscription ascribing the prophecy to Joel (son of Pethuel), the book may be broken down into the following sections:
The Book of Joel's division into chapters and verses differs widely between editions of the Bible; some editions have three chapters, others four. Translations with four chapters include:
In the 1611 King James Bible, the Book of Joel is formed by three chapters: the second one has 32 verses, and it is equivalent to the union of the chapter 2 (with 26 verses) and chapter 3 (with 5 verses) of other editions of the Bible.
The differences in the division are as follows:
As there are no explicit references in the book to datable persons or events, scholars have assigned a wide range of dates to the book. The main positions are:
Evidence produced for these positions includes allusions in the book to the wider world, similarities with other prophets, and linguistic details. Some commentators, such as John Calvin, attach no great importance to the precise dating.
Joel 1 and 2 are preserved in the Dead Sea Scrolls, in fragmentary manuscripts 4Q78, 4Q82, and the scroll from Wadi Murabba'at.
The Masoretic text places Joel between Hosea and Amos (the order inherited by the Tanakh and Old Testament), while the Septuagint order is Hosea–Amos–Micah–Joel–Obadiah–Jonah. The Hebrew text of Joel seems to have suffered little from scribal transmission, but it is at a few points supplemented by the Septuagint, Syriac, and Vulgate versions, or by conjectural emendation. While the book purports to describe a plague of locusts, some ancient Jewish opinion saw the locusts as an allegory for Israel's enemies. This allegorical interpretation was applied to the church by many church fathers. Calvin took a literal interpretation of chapter 1 but an allegorical view of chapter 2, a position echoed by some modern interpreters. Most modern interpreters, however, see Joel as speaking of a literal locust plague given a prophetic/apocalyptic interpretation.
The traditional ascription of the whole book to the prophet Joel was challenged in the late nineteenth and early twentieth centuries by a theory of a three-stage process of composition: 1:1–2:27 were from the hand of Joel and dealt with a contemporary issue, while 2:28–3:21/3:1–4:21 were ascribed to a continuator with an apocalyptic outlook. Mentions of the day of the Lord in the first half of the book were also ascribed to this continuator. 3:4–8/4:4–8 could be seen as even later. Details of exact ascriptions differed between scholars.
This splitting of the book's composition began to be challenged in the mid-twentieth century, with scholars defending the unity of the book, the plausibility of the prophet combining a contemporary and apocalyptic outlook, and later additions by the prophet. The authenticity of 3:4–8 has presented more challenges, although a number of scholars still defend it.
There are many parallels of language between Joel and other Old Testament prophets. They may represent Joel's literary use of other prophets, or vice versa.
In the New Testament, his prophecy of the outpouring of God's Holy Spirit upon all people was notably quoted by Saint Peter in his Pentecost sermon.
Joel 3:10/4:10 contains a rare reversal of the "swords to plowshares" image.
The table below represents some of the more explicit quotes and allusions between specific passages in Joel and passages from the Old and New Testaments.
Plange quasi virgo ("Lament like a virgin"), the third responsory for Holy Saturday, is loosely based on verses from the Book of Joel: the title comes from Joel 1:8.
See also works on the Minor Prophets as a whole. | https://en.wikipedia.org/wiki?curid=4447 |
Book of Hosea
The Book of Hosea (Hebrew: סֵפֶר הוֹשֵׁעַ, romanized: "Sefer Hōšēaʿ") is one of the books of the Hebrew Bible. According to the traditional order of most Hebrew Bibles, it is the first of the twelve Minor Prophets.
Set around the fall of the Northern Kingdom of Israel, the Book of Hosea denounces the worship of gods other than Yahweh, metaphorically comparing Israel's abandonment of Yahweh to a woman being unfaithful to her husband. According to the book's narrative, the relationship between Hosea and his unfaithful wife Gomer is comparable to the relationship between Yahweh and his unfaithful people Israel. The eventual reconciliation of Hosea and Gomer is treated as a hopeful metaphor for the eventual reconciliation between Yahweh and Israel.
Dated to c. 760–720 BCE, it is one of the oldest books of the Hebrew Bible, predating most of the Torah (Pentateuch). Hosea is the source of the phrase "reap the whirlwind", which has passed into common usage in English and other languages.
Hosea prophesied during a dark and melancholic era of Israel's history, the period of the Northern Kingdom's decline and fall in the 8th century BC. According to the book, the apostasy of the people was rampant, having turned away from God in order to serve both the calves of Jeroboam and Baal, a Canaanite god.
The Book of Hosea says that, during Hosea's lifetime, the kings of the Northern Kingdom, their aristocratic supporters, and the priests had led the people away from the Law of God, as given in the Pentateuch. It says that they forsook the worship of God; they worshiped other gods, especially Baal, the Canaanite storm god, and Asherah, a Canaanite fertility goddess. Other sins followed, says the Book, including homicide, perjury, theft, and sexual sin. Hosea declares that unless they repent of these sins, God will allow their nation to be destroyed, and the people will be taken into captivity by Assyria, the greatest nation of the time.
The prophecy of Hosea centers around God's unending love towards a sinful Israel. In this text, God's agony is expressed over His betrayal by Israel. Stephen Cook asserts that the prophetic efforts of this book can be summed up in this passage "I have been the Lord your God ever since the land of Egypt; you know no God but me, and besides me there is no savior" (). Hosea's job was to speak these words during a time when they had been essentially forgotten.
The Book of Hosea contains a number of YHWH prophecies and messages for both Judah and Northern Israel (Samaria). These are delivered by the prophet Hosea.
A brief outline of the concepts presented in the Book of Hosea exists below:
No further breakdown of ideas is clear in 4–14:9/14:10.
Following this, the prophecy is made that someday this will all be changed, that God will indeed have pity on Israel.
Chapter two describes a divorce. This divorce seems to be the end of the covenant between God and the Northern Kingdom. However, it is probable that this was again a symbolic act, in which Hosea divorced Gomer for infidelity, and used the occasion to preach the message of God's rejection of the Northern Kingdom. He ends this prophecy with the declaration that God will one day renew the covenant, and will take Israel back in love.
In Chapter three, at God's command, Hosea seeks out Gomer once more. Either she has sold herself into slavery for debt, or she is with a lover who demands money in order to give her up, because Hosea has to buy her back. He takes her home, but refrains from sexual intimacy with her for many days, to symbolize the fact that Israel will be without a king for many years, but that God will take Israel back, even at a cost to Himself.
Chapters 4–14 spell out the allegory at length. Chapters 1–3 speak of Hosea's family and the issues with Gomer. Chapters 4–10 contain a series of oracles, or prophetic sermons, showing exactly why God is rejecting the Northern Kingdom (what the grounds are for the divorce). Chapter 11 is God's lament over the necessity of giving up the Northern Kingdom, which is a large part of the people of Israel, whom God loves. God promises not to give them up entirely. Then, in Chapter 12, the prophet pleads for Israel's repentance. Chapter 13 foretells the destruction of the kingdom at the hands of Assyria, because there has been no repentance. In Chapter 14, the prophet urges Israel to seek forgiveness, and promises its restoration, while urging the utmost fidelity to God.
Matthew 2:13 cites Hosea's prophecy that God would call His Son out of Egypt as foretelling the flight into Egypt and return to Israel of Joseph, Mary, and the infant Jesus.
In Luke 23:30, Jesus referenced Hosea 10:8 when he said, "Then they will begin to say to the mountains, 'Cover us,' and to the hills, 'Fall on us.'" (NRSV) The quote is also echoed in Revelation 6:16.
The capital of the Northern Kingdom fell in 722 BC. All the members of the upper classes and many of the ordinary people were taken captive and carried off to live as prisoners of war.
First, Hosea was directed by God to marry a promiscuous woman of ill-repute, and he did so. Marriage here is symbolic of the covenantal relationship between God and Israel. However, Israel has been unfaithful to God by following other gods and breaking the commandments which are the terms of the covenant, hence Israel is symbolized by a harlot who violates the obligations of marriage to her husband.
Second, Hosea and his wife, Gomer, have a son. God commands that the son be named Jezreel. This name refers to a valley in which much blood had been shed in Israel's history, especially by the kings of the Northern Kingdom. (See I Kings 21 and II Kings 9:21–35). The naming of this son was to stand as a prophecy against the reigning house of the Northern Kingdom, that they would pay for that bloodshed. Jezreel's name means God Sows.
Third, the couple have a daughter. God commands that she be named Lo-ruhamah ("not pitied" or "unloved") to show Israel that, although God will still have pity on the Southern Kingdom, God will no longer have pity on the Northern Kingdom; its destruction is imminent. In the NIV translation, the omission of the word "him" leads to speculation as to whether Lo-Ruhamah was the daughter of Hosea or of one of Gomer's lovers. James Mays, however, says that the failure to mention Hosea's paternity is "hardly an implication" of Gomer's adultery.
Fourth, a son is born to Gomer. It is questionable whether this child was Hosea's, for God commands that his name be Lo-ammi, meaning 'not my people.' The child bore this name of shame to show that the Northern Kingdom would also be shamed, for its people would no longer be known as God's People. In other words, the Northern Kingdom had been rejected by God.
In Hosea 2, the woman in the marriage metaphor could be Hosea's wife Gomer, or could be referring to the nation of Israel, invoking the metaphor of Israel as God's bride. The woman is not portrayed in a positive light. This is reflected throughout the beginning of Hosea 2: "I will strip her naked and expose her as in the day she was born" (Hosea 2:3); "Upon her children I will have no pity, because they are children of whoredom" (Hosea 2:4); "For she said, I will go after my lovers..." (Hosea 2:5).
Biblical scholar Ehud Ben Zvi reminds readers of the socio-historical context in which Hosea was composed. In his article "Observations on the marital metaphor of YHWH and Israel in its ancient Israelite context: general considerations and particular images in Hosea 1.2", Ben Zvi describes the role of Gomer in the marriage metaphor as one of the "central attributes of the ideological image of a human marriage that was shared by the male authorship and the primary and intended male readership as building blocks for their imagining of the relationship".
Tristanne J. Connolly makes a similar observation, stating that the husband-wife motif reflects marriage as it was understood at the time. Connolly also suggests that in context the marriage metaphor was necessary in that it truly exemplified the unequal interaction between Yahweh and the people Israel. Biblical scholar Michael D. Coogan describes the importance of understanding the covenant in relation to interpreting Hosea. According to Coogan, Hosea falls under a unique genre called "covenant lawsuit", where God accuses Israel of breaking their previously made agreement. God's disappointment towards Israel is therefore expressed through the broken marriage covenant made between husband and wife.
Brad E. Kelle notes that many scholars take the references to cultic sexual practices in the worship of Baal in Hosea 2 as evidence of a historical situation in which Israelites were either giving up Yahweh worship for Baal or blending the two, with Hosea's references to sexual acts serving as metaphors for Israelite "apostasy".
Hosea 13:1–3 describes how the Israelites are abandoning Yahweh for the worship of Baal, and accuses them of making or using molten images for 'idol' worship. Chief among these was the image of the bull at the northern shrine of Bethel, which by the time of Hosea was being worshipped as an image of Baal.
Hosea is a prophet whom God uses to portray a message of repentance to God's people. Through Hosea's marriage to Gomer, God, also known as Yahweh, shows his great love for his people, comparing himself to a husband whose wife has committed adultery. It is a metaphor for the covenant between God and Israel, and Hosea influenced later prophets such as Jeremiah. He is among the first writing prophets, and the last chapter of Hosea has a format similar to wisdom literature.
Like Amos, Hosea elevated the religion of Israel to the altitude of ethical monotheism, being the first to emphasize the moral side of God's nature. Israel's faithlessness, which resisted all warnings, compelled Him to punish the people because of His own holiness. Hosea considers infidelity as the chief sin, of which Israel, the adulterous wife, has been guilty against her loving husband, God. Against this he sets the unquenchable love of God, who, in spite of this infidelity, does not cast Israel away forever, but will take His people unto Himself again after the judgment. | https://en.wikipedia.org/wiki?curid=4449 |
Book of Micah
The Book of Micah is the sixth of the twelve minor prophets in the Hebrew Bible/Old Testament. Ostensibly, it records the sayings of Micah, whose name is "Mikayahu", meaning "Who is like Yahweh?", an 8th-century BC prophet from the village of Moresheth in Judah (Hebrew name from the opening verse: מיכה המרשתי).
The book has three major divisions, chapters 1–2, 3–5 and 6–7, each introduced by the word "Hear," with a pattern of alternating announcements of doom and expressions of hope within each division. Micah reproaches unjust leaders, defends the rights of the poor against the rich and powerful; while looking forward to a world at peace centered on Zion under the leadership of a new Davidic monarch.
While the book is relatively short, it includes lament (1.8–16; 7.8–10), theophany (1.3–4), hymnic prayer of petition and confidence (7.14–20), and the "covenant lawsuit" (6.1–8), a distinct genre in which Yahweh (God) sues Israel for breach of contract of the Mosaic covenant.
Chapter 1:1 identifies the prophet as "Micah of Moresheth" (a town in southern Judah), and states that he lived during the reigns of Jotham, Ahaz and Hezekiah, roughly 750–700 BC.
This corresponds to the period when, after a long period of peace, Israel, Judah, and the other nations of the region came under increasing pressure from the aggressive and rapidly expanding Neo-Assyrian empire. Between 734 and 727 Tiglath-Pileser III of Assyria conducted almost annual campaigns in Palestine, reducing the Kingdom of Israel, the Kingdom of Judah and the Philistine cities to vassalage, receiving tribute from Ammon, Moab and Edom, and absorbing Damascus (the Kingdom of Aram) into the Empire. On Tiglath-Pileser's death Israel rebelled, resulting in an Assyrian counter-attack and the destruction of the capital, Samaria, in 721 after a three-year siege. Micah 1:2–7 draws on this event: Samaria, says the prophet, has been destroyed by God because of its crimes of idolatry, oppression of the poor, and misuse of power. The Assyrian attacks on Israel (the northern kingdom) led to an influx of refugees into Judah, which would have increased social stresses, while at the same time the authorities in Jerusalem had to invest huge amounts in tribute and defense.
When the Assyrians attacked Judah in 701 they did so via the Philistine coast and the Shephelah, the border region which included Micah's village of Moresheth, as well as Lachish, Judah's second largest city. This in turn forms the background to verses 1:8–16, in which Micah warns the towns of the coming disaster (Lachish is singled out for special mention, accused of the corrupt practices of both Samaria and Jerusalem). In verses 2:1–5 he denounces the appropriation of land and houses, which might simply be the greed of the wealthy and powerful, or possibly the result of the militarising of the area in preparation for the Assyrian attack.
Some, but not all, scholars accept that only chapters 1–3 contain material from the late 8th century prophet Micah. The latest material comes from the post-Exilic period after the Temple was rebuilt in 515 BC, so that the early 5th century BC seems to be the period when the book was completed. The first stage was the collection and arrangement of some spoken sayings of the historical Micah (the material in chapters 1–3), in which the prophet attacks those who build estates through oppression and depicts the Assyrian invasion of Judah as Yahweh's punishment on the kingdom's corrupt rulers, including a prophecy that the Temple will be destroyed.
The prophecy was not fulfilled in Micah's time, but a hundred years later when Judah was facing a similar crisis with the Neo-Babylonian Empire, Micah's prophecies were reworked and expanded to reflect the new situation. Still later, after Jerusalem did fall to the Neo-Babylonian Empire, the book was revised and expanded further to reflect the circumstances of the late exilic and post-exilic community.
At the broadest level, Micah can be divided into three roughly equal parts:
Within this broad three-part structure is a series of alternating oracles of judgment and promises of restoration:
Micah addresses the future of Judah/Israel after the Babylonian exile. Like Isaiah, the book has a vision of the punishment of Israel and creation of a "remnant", followed by world peace centered on Zion under the leadership of a new Davidic monarch; the people should do justice, turn to Yahweh, and await the end of their punishment. However, whereas Isaiah sees Jacob/Israel joining "the nations" under Yahweh's rule, Micah looks forward to Israel ruling over the nations. Insofar as Micah appears to draw on and rework parts of Isaiah, it seems designed at least partly to provide a counterpoint to that book.
In the New Testament, the Book of Matthew quotes from the Book of Micah in relation to Jesus being born in Bethlehem:
Jesus quotes Micah when he warns that families will be divided by the gospel: | https://en.wikipedia.org/wiki?curid=4452 |
Book of Malachi
Malachi (or Malachias) is the last book of the Nevi'im contained in the Tanakh, canonically the last of the Twelve Minor Prophets. In the Christian ordering, the grouping of the Prophetic Books is the last section of the Old Testament, making Malachi the last book before the New Testament.
The book is commonly attributed to a prophet by the name of "Malachi," as its title has frequently been understood as a proper name, although its Hebrew meaning is simply "my messenger" (the Septuagint reads "his messenger") and may not be the author's name at all. The name occurs in the superscription at 1:1 and in 3:1, although it is highly unlikely that the word refers to the same character in both of these references. Thus, there is substantial debate regarding the identity of the book's author. One of the Targums identifies Ezra (or Esdras) as the author of Malachi. The priest and historian Jerome suggests that this may be because Ezra is seen as an intermediary between the prophets and the "great synagogue." There is, however, no historical evidence yet to support this claim.
Some scholars note affinities between Zechariah 9–14 and the Book of Malachi. Zechariah 9, Zechariah 12, and Malachi 1 are all introduced as "the word of Elohim". Some scholars argue that this collection originally consisted of three independent and anonymous prophecies, two of which were subsequently appended to the Book of Zechariah as what they refer to as Deutero-Zechariah, with the third becoming the Book of Malachi. As a result, most scholars consider the Book of Malachi to be the work of a single author who may or may not have been identified by the title Malachi. The present division of the oracles results in a total of 12 books of minor prophets, a number paralleling the sons of Jacob who became the heads of the 12 Israelite tribes. The "Catholic Encyclopedia" asserts, "We are no doubt in presence of an abbreviation of the name Mál'akhîyah, that is Messenger of Elohim."
Little is known of the biography of the author of the Book of Malachi, although it has been suggested that he may have been Levitical. The books of Zechariah and Haggai were written during the lifetime of Ezra (see Ezra 5:1), which may explain the similarities in style.
According to the editors of the 1897 Easton's Bible Dictionary, some scholars believe the name "Malachi" is not a proper noun but rather an abbreviation of "messenger of YHWH". This reading could be based on Malachi 3:1, "Behold, I will send "my messenger"...", if "my messenger" is taken literally as the name "Malachi". Several scholars consider the book to be anonymous, regarding verse 1:1 as a later addition. However, other scholars, including the editors of the "Catholic Encyclopedia", argue that the grammatical evidence leads us to conclude that Malachi is in fact a name.
Another interpretation of the authorship comes from the Septuagint superscription, ἐν χειρὶ ἀγγέλου αὐτοῦ, which can be read as either "by the hand of his messenger" or as "by the hand of his angel". The "angel" reading found an echo among the ancient Church Fathers and ecclesiastical writers, and even gave rise to the "strangest fancies", especially among the disciples of Origen of Alexandria.
There are very few historical details in the Book of Malachi. The greatest clue as to its dating may lie in the fact that the Persian-era term for governor (pehâ) is used in 1:8. This points to a post-exilic (that is, after 538 BC) date of composition both because of the use of the Persian-period term and because Judah had a king before the exile. Since, in the same verse, the temple has been rebuilt, the book must also be later than 515 BC. Malachi was apparently known to the author of Ecclesiasticus early in the 2nd century BC. Because of the development of themes in the book of Malachi, most scholars assign it to a position after Haggai and Zechariah, close to the time when Ezra and Nehemiah came to Jerusalem in 445 BC.
The Book of Malachi was written to correct the lax religious and social behaviour of the Israelites – particularly the priests – in post-exilic Jerusalem. Although the prophets urged the people of Judah and Israel to see their exile as punishment for failing to uphold their covenant with God, it was not long after they had been restored to the land and to Temple worship that the people's commitment to their God began, once again, to wane. It was in this context that the prophet commonly referred to as Malachi delivered his prophecy.
In 1:2, Malachi has the people of Israel question God's love for them. This introduction to the book illustrates the severity of the situation which Malachi addresses. The graveness of the situation is also indicated by the dialectical style with which Malachi confronts his audience. Malachi proceeds to accuse his audience of failing to respect God as God deserves. One way in which this disrespect is made manifest is through the substandard sacrifices which Malachi claims are being offered by the priests. While God demands animals that are "without blemish" (Leviticus 1:3, NRSV), the priests, who were "to determine whether the animal was acceptable" (Mason 143), were offering blind, lame and sick animals for sacrifice because they thought nobody would notice.
In 2:1, Malachi states Yahweh Sabaoth is sending a curse on the priests who have not honored him with appropriate animal sacrifices: "Now, watch how I am going to paralyze your arm and throw dung in your face--the dung from your very solemnities--and sweep you away with it. Then you shall learn that it is I who have given you this warning of my intention to abolish my covenant with Levi, says Yahweh Sabaoth."
In 2:10, Malachi addresses the issue of divorce. On this topic, Malachi deals with divorce both as a social problem ("Why then are we faithless to one another ... ?" 2:10) and as a religious problem ("Judah ... has married the daughter of a foreign god" 2:11). In contrast to the book of Ezra, Malachi urges each to remain steadfast to the wife of his youth.
Malachi also criticizes his audience for questioning God's justice. He reminds them that God is just, exhorting them to be faithful as they await that justice. Malachi quickly goes on to point out that the people have not been faithful. In fact, the people are not giving God all that God deserves. Just as the priests have been offering unacceptable sacrifices, so the people have been neglecting to offer their full tithe to God. The result of these shortcomings is that the people come to believe that no good comes out of serving God.
Malachi assures the faithful among his audience that in the eschaton, the differences between those who served God faithfully and those who did not will become clear. The book concludes by calling upon the teachings of Moses and by promising that Elijah will return prior to the Day of Yahweh.
The book of Malachi is divided into three chapters in the Hebrew Bible and the Greek Septuagint and four chapters in the Latin Vulgate. The fourth chapter in the Vulgate consists of the remainder of the third chapter starting at verse 3:19.
The New Revised Standard Version of the Bible supplies headings for the book as follows:
The majority of scholars consider the book to be made up of six distinct oracles. According to this scheme, the book of Malachi consists of a series of disputes between Yahweh and the various groups within the Israelite community. In the course of the book's three or four chapters, Yahweh is vindicated while those who do not adhere to the law of Moses are condemned. Some scholars have suggested that the book, as a whole, is structured along the lines of a judicial trial, a suzerain treaty or a covenant—one of the major themes throughout the Hebrew Scriptures. Implicit in the prophet's condemnation of Israel's religious practices is a call to keep Yahweh's statutes.
The Book of Malachi draws upon various themes found in other books of the Bible. Malachi appeals to the rivalry between Jacob and Esau and to Yahweh's preference for Jacob contained in Genesis 25–28. Malachi reminds his audience that, as descendants of Jacob (Israel), they have been and continue to be favoured by God as God's chosen people. In the second dispute, Malachi draws upon the Levitical Code (e.g. Leviticus 1:3) in condemning the priests for offering unacceptable sacrifices.
In the third dispute (concerning divorce), the author of the Book of Malachi likely intends his argument to be understood on two levels. Malachi appears to be attacking either the practice of divorcing Jewish wives in favour of foreign ones (a practice which Ezra vehemently condemns) or, alternatively, Malachi could be condemning the practice of divorcing foreign wives in favour of Jewish wives (a practice which Ezra promoted). Malachi appears adamant that nationality is not a valid reason to terminate a marriage, "For I hate divorce, says the Lord . . ." (2:16).
In many places throughout the Hebrew Scriptures – particularly the Book of Hosea – Israel is figured as Yahweh's wife or bride. Malachi's discussion of divorce may also be understood to conform to this metaphor. Malachi could very well be urging his audience not to break faith with Yahweh (the God of Israel) by adopting new gods or idols. It is quite likely that, since the people of Judah were questioning Yahweh's love and justice (1:2, 2:17), they might be tempted to adopt foreign gods. William LaSor suggests that, because the restoration to the land of Judah had not resulted in anything like the prophesied splendor of the messianic age, the people were becoming quite disillusioned with their religion.
Indeed, the fourth dispute asserts that judgment is coming in the form of a messenger who "is like refiner's fire and like fullers' soap . . ." (3:2). Following this, the prophet provides another example of wrongdoing in the fifth dispute – that is, failing to offer full tithes. In this discussion, Malachi has Yahweh request the people to "Bring the full tithe . . . [and] see if I will not open the windows of heaven for you and pour down on you an overflowing blessing" (3:10). This request offers the opportunity for the people to amend their ways. It also stresses that keeping the Lord's statutes will not only allow the people to avoid God's wrath, but will also lead to God's blessing. In the sixth dispute, the people of Israel illustrate the extent of their disillusionment. Malachi has them say "'It is vain to serve God . . . Now we count the arrogant happy; evildoers not only prosper, but when they put God to the test they escape'" (3:14–15). Once again, Malachi has Yahweh assure the people that the wicked will be punished and the faithful will be rewarded.
In the light of what Malachi understands to be an imminent judgment, he exhorts his audience to "Remember the teaching of my servant Moses, the statutes and ordinances that I commanded him at Horeb for all Israel" (4:4; 3:22, MT). Before the Day of the Lord, Malachi declares that Elijah (who "ascended in a whirlwind into heaven", 2 Kings 2:11) will return to earth in order that people might follow in God's ways.
Primarily because of its messianic promise, the Book of Malachi is frequently referred to in the Christian New Testament. What follows is a brief comparison between the Book of Malachi and the New Testament texts which refer to it (as suggested in Hill 84–88).
Although many Christians believe that the messianic prophecies of the Book of Malachi have been fulfilled in the life, ministry, transfiguration, death and resurrection of Jesus of Nazareth, most Jews continue to await the coming of the prophet Elijah who will prepare the way for the Lord. | https://en.wikipedia.org/wiki?curid=4455 |
Book of Zechariah
The Book of Zechariah, attributed to the Hebrew prophet Zechariah, is included in the Twelve Minor Prophets in the Hebrew Bible.
Zechariah's prophecies took place during the reign of Darius the Great, and were contemporary with Haggai in a post-exilic world after the fall of Jerusalem in 587/6 BC. Ezekiel and Jeremiah wrote before the fall of Jerusalem, while continuing to prophesy in the early exile period. Scholars believe Ezekiel, with his blending of ceremony and vision, heavily influenced the visionary works of Zechariah 1–8. Zechariah is specific about dating his writing (520–518 BC).
During the Exile many Judahites and Benjamites were taken to Babylon, where the prophets told them to make their homes, suggesting they would spend a long period of time there. Eventually freedom did come to many Israelites, when Cyrus the Great conquered the Babylonians in 539 BC. In 538 BC, the famous Edict of Cyrus was released, and the first return took place under Sheshbazzar. After the death of Cyrus in 530 BC, Darius consolidated power and took office in 522 BC. His system divided the provinces of the empire into easily manageable districts overseen by governors. Zerubbabel comes into the story, appointed by Darius as governor over the district of Yehud Medinata.
Under the reign of Darius, Zechariah also emerged, centering on the rebuilding of the Temple. Unlike the Babylonians, the Persian Empire went to great lengths to keep "cordial relations" between vassal and lord. The rebuilding of the Temple was encouraged by the leaders of the empire in hopes that it would strengthen the authorities in local contexts. This policy was good politics on the part of the Persians, and the Jews viewed it as a blessing from God.
The name "Zechariah" means "God remembered." Not much is known about Zechariah's life other than what may be inferred from the book. It has been speculated that his grandfather Iddo was the head of a priestly family who returned with Zerubbabel, and that Zechariah may himself have been a priest as well as a prophet. This is supported by Zechariah's interest in the Temple and the priesthood, and from Iddo's preaching in the Books of Chronicles.
Most modern scholars believe the Book of Zechariah was written by at least two different people. Zechariah 1–8, sometimes referred to as First Zechariah, was written in the 6th century BC. Zechariah 9–14, often called Second Zechariah, contains within the text no datable references to specific events or individuals but most scholars give the text a date in the fifth century BC. Second Zechariah, in the opinion of some scholars, appears to make use of the books of Isaiah, Jeremiah, and Ezekiel, the Deuteronomistic History, and the themes from First Zechariah. This has led some to believe that the writer(s) or editor(s) of Second Zechariah may have been a disciple of the prophet Zechariah. There are some scholars who go even further and divide Second Zechariah into Second Zechariah (9–11) and Third Zechariah (12–14) since each begins with a heading oracle.
The return from exile is the theological premise of the prophet's visions in chapters 1–6. Chapters 7–8 address the quality of life God wants his renewed people to enjoy, containing many encouraging promises to them. Chapters 9–14 comprise two "oracles" of the future.
The book begins with a preface, which recalls the nation's history, for the purpose of presenting a solemn warning to the present generation. Then follows a series of eight visions, succeeding one another in one night, which may be regarded as a symbolical history of Israel, intended to furnish consolation to the returned exiles and stir up hope in their minds. The symbolic action, the crowning of Joshua, describes how the kingdoms of the world become the kingdom of God's Messiah.
Chapters Zechariah 7 and Zechariah 8, delivered two years later, are an answer to the question whether the days of mourning for the destruction of the city should be kept any longer, and an encouraging address to the people, assuring them of God's presence and blessing.
This section consists of two "oracles" or "burdens":
The purpose of this book is not strictly historical but theological and pastoral. The main emphasis is that God is at work and all His good deeds, including the construction of the Second Temple, are accomplished "not by might nor by power, but by My Spirit" (Zechariah 4:6). Ultimately, YHWH plans to live again with His people in Jerusalem. He will save them from their enemies and cleanse them from sin. However, God requires repentance, a turning away from sin towards faith in Him (Zechariah 1:3).
Zechariah's concern for purity is apparent in the temple, priesthood and all areas of life as the prophecy gradually eliminates the influence of the governor in favour of the high priest, and the sanctuary becomes ever more clearly the centre of messianic fulfillment. The prominence of prophecy is quite apparent in Zechariah, but it is also true that Zechariah (along with Haggai) allows prophecy to yield to the priesthood; this is particularly apparent in comparing Zechariah to "Third Isaiah" (chapters 55–66 of the Book of Isaiah), whose author was active sometime after the first return from exile.
Most Christian commentators read the series of predictions in chapters 7 to 14 as Messianic prophecies, either directly or indirectly. These chapters helped the writers of the Gospels understand Jesus' suffering, death and resurrection, which they quoted as they wrote of Jesus' final days. Much of the Book of Revelation, which narrates the denouement of history, is also colored by images in Zechariah.
Chapters 9–14 of the Book of Zechariah are an early example of apocalyptic literature. Although not as fully developed as the apocalyptic visions described in the Book of Daniel, the "oracles", as they are titled in Zechariah 9–14, contain apocalyptic elements. One theme these oracles contain is descriptions of the Day of the Lord, when "the Lord will go forth and fight against those nations as when he fights on a day of battle" (Zechariah 14:3). These chapters also contain "pessimism about the present, but optimism for the future based on the expectation of an ultimate divine victory and the subsequent transformation of the cosmos".
The final word in Zechariah proclaims that on the Day of the Lord "There will be no Canaanite in the house of the Lord of hosts on that day" (14:21), proclaiming the need for purity in the Temple, which would come when God judges at the end of time. The Revised Standard Version has this: "There will be no trader in the house of the Lord of hosts on that day." In the Masoretic Text it is: "and in that day there shall be no more a trafficker in the house of the Lord of hosts." | https://en.wikipedia.org/wiki?curid=4456 |
Book of Zephaniah
The Book of Zephaniah ("Tsfanya") is the ninth of the Twelve Minor Prophets, preceded by the Book of Habakkuk and followed by the Book of Haggai. Zephaniah means "Yahweh has hidden/protected," or "Yahweh hides".
The book's superscription attributes its authorship to "Zephaniah son of Cushi son of Gedaliah son of Amariah son of Hezekiah, in the days of King Josiah son of Amon of Judah" (1:1, NRSV). All that is known of Zephaniah comes from the text.
The name "Cushi," Zephaniah's father, means "Cushite" or "Ethiopian," and the text of Zephaniah mentions the sin and restoration of Ethiopians. While some have concluded from this that Zephaniah was a black Jew, Ehud Ben Zvi maintains that, based on the context, "Cushi" must be understood as a personal name rather than an indicator of nationality. Abraham ibn Ezra interpreted the name Hezekiah in the superscription as King Hezekiah of Judah, though that is not a claim advanced in the text of Zephaniah.
As with many of the other prophets, there is no external evidence to directly associate composition of the book with a prophet by the name of Zephaniah. Some scholars, such as Kent Harold Richards and Jason DeRouchie, consider the words in Zephaniah to reflect a time early in the reign of King Josiah (640–609 BC) before his reforms of 622 BC took full effect, in which case the prophet may have been born during the reign of Manasseh (698/687–642 BC). Others agree that some portion of the book is postmonarchic, that is, dating to later than 586 BC when the Kingdom of Judah fell in the Siege of Jerusalem. Some who consider the book to have largely been written by a historical Zephaniah have suggested that he may have been a disciple of Isaiah because of the two books' similar focus on rampant corruption and injustice in Judah.
If Zephaniah was largely composed during the monarchic period, then its composition was occasioned by Judah's refusal to obey its covenant obligations toward Yahweh despite having seen Israel's exile a generation or two previously—an exile that the Judahite literary tradition attributed to Yahweh's anger against Israel's disobedience to his covenant. In this historical context, Zephaniah urges Judah to obedience to Yahweh, saying that "perhaps" he will forgive them if they do.
"The HarperCollins Study Bible" supplies headings for the book as follows:
More consistently than any other prophetic book, Zephaniah focuses on "the day of the Lord," developing this tradition from its first appearance in Amos. The day of the Lord tradition also appears in Isaiah, Ezekiel, Obadiah, Joel, and Malachi.
The book begins by describing Yahweh's judgement. The threefold repetition of "I will sweep away" in Zephaniah 1:2–3 emphasizes the totality of the destruction, as the number three often signifies complete perfection in the Bible. The order of creatures in 1:3 ("humans and animals ... the birds ... the fish") is the opposite of the creation order in Genesis 1, signifying an undoing of creation. This is also signified by the way that "from the face of the earth" forms an "inclusio" around the passage, hearkening back to how the phrase is used in the Genesis flood narrative, where it also connotes an undoing of creation.
As is common in prophetic literature in the Bible, a "remnant" survives Yahweh's judgement in Zephaniah by humbly seeking refuge in Yahweh. The book concludes in an announcement of hope and joy, as Yahweh "bursts forth in joyful divine celebration" over his people.
Because of its hopeful tone concerning the gathering and restoration of the exiles, the book's concluding passage has been included in Jewish liturgy.
Zephaniah served as a major inspiration for the medieval Catholic hymn "Dies Irae," whose title and opening words are from the Vulgate translation of Zephaniah 1:15. | https://en.wikipedia.org/wiki?curid=4457 |
Backward compatibility
Backward compatibility (sometimes backwards compatibility) is a property of a system, product, or technology that allows for interoperability with an older legacy system, or with input designed for such a system, especially in telecommunications and computing. Backward compatibility is sometimes also called downward compatibility.
Modifying a system in a way that does not allow backward compatibility is sometimes called "breaking" backward compatibility.
A complementary concept is forward compatibility. A design that is forward-compatible usually has a roadmap for compatibility with future standards and products.
In programming jargon, the concept is sometimes referred to as hysterical reasons or hysterical raisins, puns on "historical reasons".
There are several incentives for a company to implement backward compatibility. Backward compatibility can be used to preserve older software that would have otherwise been lost when a manufacturer decides to stop supporting older hardware. Classic video games are a common example used when discussing the value of supporting older software. The cultural impact of video games is a large part of their continued success, and some believe ignoring backward compatibility would cause these titles to disappear. Backward compatibility also acts as an additional selling point for new hardware, as an existing player base can more affordably upgrade to subsequent generations of a console. This also helps to make up for a lack of content in the early launch of new systems, as users can pull from the previous console's large library of games while developers slowly transition to the new hardware.
One example of this is the Sony PlayStation 2 (PS2), which was backward compatible with games for its predecessor, the PlayStation (PS1). While the selection of PS2 games available at launch was small, sales of the console were nonetheless strong in 2000–2001 thanks to the large library of games for the preceding PS1. This bought time for the PS2 to grow a large installed base and for developers to release more quality PS2 games for the crucial 2001 holiday season.
Additionally, despite not being included at launch, Microsoft slowly incorporated backward compatibility for select titles on the Xbox One several years into its product life cycle. Players have racked up over a billion hours with backward-compatible games on Xbox, and it is anticipated that next-generation consoles such as the PlayStation 5 and Xbox Series X will also support this feature. A large part of the success and implementation of this feature is that the hardware within newer-generation consoles is both powerful and similar enough to legacy systems that older titles can be broken down and re-configured to run on the Xbox One. The backward compatibility program not only supports the previous-generation Xbox 360, but also titles from the original Xbox system. Some titles are even given slight visual improvements and additional levels at no cost to the user. This program has proven incredibly popular with Xbox players and goes against the recent trend of studio-made remasters of classic titles, creating what some believe to be an important shift in console makers' strategies.
The literal costs of supporting old software are considered a large drawback to the usage of backward compatibility. The associated costs of backward compatibility are a higher bill of materials if hardware is required to support the legacy systems; increased complexity of the product that may lead to longer time to market, technological hindrances, and slower innovation; and increased expectations from users in terms of compatibility. Because of this, several gaming consoles chose to phase out backward compatibility toward the end of the console generation in order to reduce cost and briefly re-invigorate sales before the arrival of newer hardware.
A notable example is the Sony PlayStation 3: the first PS3 iteration was expensive to manufacture partly because it included the Emotion Engine from the preceding PS2 in order to run PS2 games, since the PS3 architecture was completely different from the PS2's. Subsequent PS3 hardware revisions eliminated the Emotion Engine, which saved production costs but removed the ability to run PS2 titles, as Sony concluded that backward compatibility was not a major selling point for the PS3 in contrast to the PS2. The PS3's chief competitor, the Microsoft Xbox 360, took a different approach to backward compatibility by using software emulation to run games from the first Xbox, rather than including legacy hardware from the original Xbox, which was quite different from the Xbox 360. However, Microsoft stopped releasing emulation profiles after 2007.
However, with the current decline in physical game sales and the rise of digital storefronts and downloads, some believe backwards compatibility will soon be as obsolete as the phased-out consoles it supports. Many game studios are re-mastering and re-releasing their most popular titles by improving the quality of graphics and adding new content. These remasters have found success by appealing both to nostalgic players who remember enjoying the original versions when they were younger, and to newcomers who may not have had the original system it was released on. For most consumers, digital remasters are more appealing than hanging on to bulky cartridges and obsolete hardware. For the manufacturers of consoles, digital re-releases of classic titles are a large benefit. It not only removes the financial drawbacks of supporting older hardware, but also shifts all of the costs of updating software to the developers. The manufacturer gets a new addition to their system with strong name recognition, and the studio does not have to completely develop a game from the ground up.
A simple example of both backward and forward compatibility is the introduction of FM radio in stereo. FM radio was initially mono, with only one audio channel represented by one signal. With the introduction of two-channel stereo FM radio, many listeners had only mono FM receivers. Forward compatibility for mono receivers with stereo signals was achieved through sending the sum of both left and right audio channels in one signal and the difference in another signal. That allows mono FM receivers to receive and decode the sum signal while ignoring the difference signal, which is necessary only for separating the audio channels. Stereo FM receivers can receive a mono signal and decode it without the need for a second signal, and they can separate a sum signal to left and right channels if both sum and difference signals are received. Without the requirement for backward compatibility, a simpler method could have been chosen.
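To make the sum-and-difference scheme above concrete, here is a minimal Python sketch. The function names and the plain list-based "signals" are invented for illustration; real FM stereo modulates the difference signal onto a subcarrier rather than transmitting two baseband lists:

```python
# Minimal sketch of the FM mono/stereo sum-difference scheme described above.
# The names and lossless list "signals" are illustrative assumptions,
# not a broadcast-accurate model.

def encode_stereo(left, right):
    """Encode L/R channels as (sum, difference) signals."""
    sum_signal = [l + r for l, r in zip(left, right)]
    diff_signal = [l - r for l, r in zip(left, right)]
    return sum_signal, diff_signal

def decode_mono(sum_signal, diff_signal):
    """A mono receiver decodes only the sum (L + R), ignoring the difference."""
    return sum_signal

def decode_stereo(sum_signal, diff_signal):
    """A stereo receiver recovers L = (sum + diff) / 2 and R = (sum - diff) / 2."""
    left = [(s + d) / 2 for s, d in zip(sum_signal, diff_signal)]
    right = [(s - d) / 2 for s, d in zip(sum_signal, diff_signal)]
    return left, right

left, right = [2, 4], [1, 3]
s, d = encode_stereo(left, right)
assert decode_stereo(s, d) == (left, right)  # stereo receivers get both channels
print(decode_mono(s, d))                     # mono receivers still get L + R
```

A mono receiver never touches the difference signal, which is why older receivers keep working with stereo broadcasts, while stereo receivers can still decode a plain mono signal.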
Full backward compatibility is particularly important in computer instruction set architectures, one of the most successful being the x86 family of microprocessors. Their full backward compatibility spans back to the 16-bit Intel 8086/8088 processors introduced in 1978. (The 8086/8088, in turn, were designed with easy machine-translatability of programs written for their predecessor in mind, although they were not instruction-set compatible with the 8-bit Intel 8080 processor of 1974. The Zilog Z80, however, was fully backwards compatible with the Intel 8080.)
Fully backwards compatible processors can process the same binary executable software instructions as their predecessors, allowing the use of a newer processor without having to acquire new applications or operating systems. Similarly, the success of the Wi-Fi digital communication standard is attributed to its broad forward and backward compatibility; it became more popular than other standards that were not backward compatible.
Compiler backward compatibility may refer to the ability of a compiler of a newer version of the language to accept programs or data that worked under the previous version.
A data format is said to be backward compatible with its predecessor if every message or file that is valid under the old format is still valid, retaining its meaning under the new format. | https://en.wikipedia.org/wiki?curid=4459 |
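To illustrate the data-format definition in the final paragraph above, the following hypothetical Python sketch shows a "v2" parser that stays backward compatible with a "v1" JSON message format. The format, field names, and default value are invented for this example and do not come from the article:

```python
import json

# Hypothetical message format, invented for illustration:
#   v1 messages carry "id" and "body"; v2 adds an optional "priority" field.

def parse_message_v2(raw: str) -> dict:
    """v2 parser that accepts every valid v1 message unchanged."""
    msg = json.loads(raw)
    return {
        "id": msg["id"],
        "body": msg["body"],
        # Defaulting the new field preserves the meaning of old messages.
        "priority": msg.get("priority", "normal"),
    }

old_message = '{"id": 1, "body": "hello"}'                   # valid v1 message
new_message = '{"id": 2, "body": "hi", "priority": "high"}'  # valid v2 message

print(parse_message_v2(old_message))  # old data still parses, meaning intact
print(parse_message_v2(new_message))
```

The compatibility-preserving design choice here is that the new field is optional and defaulted, so every message that was valid under the old format remains valid, and retains its meaning, under the new one.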
Bacterial conjugation
Bacterial conjugation is the transfer of genetic material between bacterial cells by direct cell-to-cell contact or by a bridge-like connection between two cells. This takes place through a pilus.
It is a mechanism of horizontal gene transfer, as are transformation and transduction, although these two other mechanisms do not involve cell-to-cell contact.
Classical "E. coli" bacterial conjugation is often regarded as the bacterial equivalent of sexual reproduction or mating since it involves the exchange of genetic material. However, it is not sexual reproduction, since no exchange of gamete occurs, and indeed no generation of a new organism: instead an existing organism is transformed. During classical "E. coli" conjugation the "donor" cell provides a conjugative or mobilizable genetic element that is most often a plasmid or transposon. Most conjugative plasmids have systems ensuring that the "recipient" cell does not already contain a similar element.
The genetic information transferred is often beneficial to the recipient. Benefits may include antibiotic resistance, xenobiotic tolerance or the ability to use new metabolites. Such beneficial plasmids may be considered bacterial endosymbionts. Other elements, however, may be viewed as bacterial parasites and conjugation as a mechanism evolved by them to allow for their spread.
Conjugation in "Escherichia coli" by spontaneous zygogenesis and in "Mycobacterium smegmatis" by distributive conjugal transfer differ from the more well studied classical "E. coli" conjugation in that these cases involve substantial blending of the parental genomes.
The process was discovered by Joshua Lederberg and Edward Tatum in 1946.
Conjugation diagram
The F-plasmid is an episome (a plasmid that can integrate itself into the bacterial chromosome by homologous recombination) with a length of about 100 kb. It carries its own origin of replication, the "oriV", and an origin of transfer, or "oriT". There can only be one copy of the F-plasmid in a given bacterium, either free or integrated, and bacteria that possess a copy are called "F-positive" or "F-plus" (denoted F+). Cells that lack F plasmids are called "F-negative" or "F-minus" (F−) and as such can function as recipient cells.
Among other genetic information, the F-plasmid carries a "tra" and "trb" locus, which together are about 33 kb long and consist of about 40 genes. The "tra" locus includes the "pilin" gene and regulatory genes, which together form pili on the cell surface. The locus also includes the genes for the proteins that attach themselves to the surface of F− bacteria and initiate conjugation. Though there is some debate on the exact mechanism of conjugation, it seems that the pili are not the structures through which DNA exchange occurs. This has been shown in experiments where the pili are allowed to make contact but are then denatured with SDS, and yet DNA transfer still proceeds. Several proteins coded for in the "tra" or "trb" locus seem to open a channel between the bacteria, and it is thought that the traD enzyme, located at the base of the pilus, initiates membrane fusion.
When conjugation is initiated by a signal, the relaxase enzyme creates a nick in one of the strands of the conjugative plasmid at the "oriT". Relaxase may work alone or in a complex of over a dozen proteins known collectively as a relaxosome. In the F-plasmid system the relaxase enzyme is called TraI and the relaxosome consists of TraI, TraY, TraM and the integrated host factor IHF. The nicked strand, or "T-strand", is then unwound from the unbroken strand and transferred to the recipient cell in a 5'-terminus to 3'-terminus direction. The remaining strand is replicated either independently of conjugative action (vegetative replication beginning at the "oriV") or in concert with conjugation (conjugative replication similar to the rolling circle replication of lambda phage). Conjugative replication may require a second nick before successful transfer can occur. A recent report claims to have inhibited conjugation with chemicals that mimic an intermediate step of this second nicking event.
If the F-plasmid that is transferred has previously been integrated into the donor's genome (producing an Hfr strain ["High Frequency of Recombination"]) some of the donor's chromosomal DNA may also be transferred with the plasmid DNA. The amount of chromosomal DNA that is transferred depends on how long the two conjugating bacteria remain in contact. In common laboratory strains of "E. coli" the transfer of the entire bacterial chromosome takes about 100 minutes. The transferred DNA can then be integrated into the recipient genome via homologous recombination.
A cell culture that contains in its population cells with non-integrated F-plasmids usually also contains a few cells that have accidentally integrated their plasmids. It is these cells that are responsible for the low-frequency chromosomal gene transfers that occur in such cultures. Some strains of bacteria with an integrated F-plasmid can be isolated and grown in pure culture. Because such strains transfer chromosomal genes very efficiently they are called Hfr (high frequency of recombination). The "E. coli" genome was originally mapped by interrupted mating experiments in which various Hfr cells in the process of conjugation were sheared from recipients after less than 100 minutes (initially using a Waring blender). The genes that were transferred were then investigated.
Since integration of the F-plasmid into the "E. coli" chromosome is a rare spontaneous occurrence, and since the numerous genes promoting DNA transfer are in the plasmid genome rather than in the bacterial genome, it has been argued that conjugative bacterial gene transfer, as it occurs in the "E. coli" Hfr system, is not an evolutionary adaptation of the bacterial host, nor is it likely ancestral to eukaryotic sex.
Spontaneous zygogenesis in "E. coli"
In addition to classical bacterial conjugation described above for "E. coli", a form of conjugation referred to as spontaneous zygogenesis (Z-mating for short) is observed in certain strains of "E. coli". In Z-mating there is complete genetic mixing, and unstable diploids are formed that throw off phenotypically haploid cells, of which some show a parental phenotype and some are true recombinants.
Conjugation in "Mycobacteria smegmatis", like conjugation in "E. coli", requires stable and extended contact between a donor and a recipient strain, is DNase resistant, and the transferred DNA is incorporated into the recipient chromosome by homologous recombination. However, unlike "E. coli" Hfr conjugation, mycobacterial conjugation is chromosome rather than plasmid based. Furthermore, in contrast to "E. coli" Hfr conjugation, in "M. smegmatis" all regions of the chromosome are transferred with comparable efficiencies. The lengths of the donor segments vary widely, but have an average length of 44.2kb. Since a mean of 13 tracts are transferred, the average total of transferred DNA per genome is 575kb. This process is referred to as "Distributive conjugal transfer." Gray et al. found substantial blending of the parental genomes as a result of conjugation and regarded this blending as reminiscent of that seen in the meiotic products of sexual reproduction.
Bacteria related to the nitrogen fixing "Rhizobia" are an interesting case of inter-kingdom conjugation. For example, the tumor-inducing (Ti) plasmid of "Agrobacterium" and the root-tumor inducing (Ri) plasmid of "A. rhizogenes" contain genes that are capable of transferring to plant cells. The expression of these genes effectively transforms the plant cells into opine-producing factories. Opines are used by the bacteria as sources of nitrogen and energy. Infected cells form crown gall or root tumors. The Ti and Ri plasmids are thus endosymbionts of the bacteria, which are in turn endosymbionts (or parasites) of the infected plant.
The Ti and Ri plasmids can also be transferred between bacteria using a system (the "tra", or transfer, operon) that is different and independent of the system used for inter-kingdom transfer (the "vir", or virulence, operon). Such transfers create virulent strains from previously avirulent strains.
Conjugation is a convenient means for transferring genetic material to a variety of targets. In laboratories, successful transfers have been reported from bacteria to yeast, plants, mammalian cells, diatoms and isolated mammalian mitochondria. Conjugation has advantages over other forms of genetic transfer including minimal disruption of the target's cellular envelope and the ability to transfer relatively large amounts of genetic material (see the above discussion of "E. coli" chromosome transfer). In plant engineering, "Agrobacterium"-like conjugation complements other standard vehicles such as tobacco mosaic virus (TMV). While TMV is capable of infecting many plant families these are primarily herbaceous dicots. "Agrobacterium"-like conjugation is also primarily used for dicots, but monocot recipients are not uncommon. | https://en.wikipedia.org/wiki?curid=4460 |
Babrak Karmal
Babrak Karmal (Russian: Бабра́к Карма́ль; born Sultan Hussein; 6 January 1929 – 1 or 3 December 1996) was an Afghan politician who was installed as President of Afghanistan by the Soviet Union when it intervened in 1979. Karmal was born in Kamari and educated at Kabul University. When the People's Democratic Party of Afghanistan (PDPA) was formed, Karmal became one of its leading members, having been introduced to Marxism by Mir Akbar Khyber during his imprisonment for activities deemed too radical by the government. He eventually became the leader of the Parcham faction when the PDPA split in 1967, with their ideological nemesis being the Khalq faction. Under Karmal's leadership, the Parchamite PDPA participated in Mohammad Daoud Khan's rise to power in 1973, and his subsequent regime. While relations were good at the beginning, Daoud began a major purge of leftist influence in the mid-1970s. This in turn led to the reformation of the PDPA in 1977, and Karmal played a major role in the 1978 Saur Revolution when the PDPA took power, though in later years he denounced it.
Karmal was appointed Deputy Chairman of the Revolutionary Council, synonymous with vice head of state, in the communist government. The Parchamite faction found itself under significant pressure from the Khalqists soon after taking power. In June 1978, a PDPA Central Committee meeting voted in favor of giving the Khalqist faction exclusive control over PDPA policy. This decision was followed by a failed Parchamite coup, after which Hafizullah Amin, a Khalqist, initiated a purge against the Parchamites. Karmal survived this purge but was exiled to Prague and eventually dismissed from his post. Fearing for his life, instead of returning to Kabul he lived with his family in the forests under the protection of the Czechoslovak secret police, the StB. The Afghan secret police, KHAD, had allegedly sent members to Czechoslovakia to assassinate Karmal. In late 1979 he was brought to Moscow by the KGB and eventually, in December 1979, the Soviet Union intervened in Afghanistan (with the consent of Amin's government) to stabilize the country. The Soviet troops staged a coup and assassinated Amin.
Karmal was promoted to Chairman of the Revolutionary Council and Chairman of the Council of Ministers on 27 December 1979. He remained Chairman of the Council of Ministers until 1981, when he was succeeded in that post by Sultan Ali Keshtmand. Throughout his term, Karmal worked to establish a support base for the PDPA by introducing several reforms. Among these were the "Fundamental Principles of the Democratic Republic of Afghanistan" and a general amnesty for people imprisoned during Nur Mohammad Taraki's and Amin's rule. He also replaced the red Khalqist flag with a more traditional one. These policies failed to increase the PDPA's legitimacy in the eyes of the Afghan people and the mujahideen rebels. Karmal had always been highly critical of the ultra-left radicalism of his Khalqist predecessors, Taraki and Amin.
These policy failures, and the stalemate that ensued after the Soviet intervention, led the Soviet leadership to become highly critical of Karmal's leadership. Under Mikhail Gorbachev, the Soviet Union deposed Karmal in 1986 and replaced him with Mohammad Najibullah. Following his loss of power, he was again exiled, this time to Moscow. It was Anahita Ratebzad who persuaded Najibullah to allow Babrak Karmal to return to Afghanistan in 1991, where Karmal became an associate of Abdul Rashid Dostum and possibly helped remove the Najibullah government from power in 1992. He eventually left Afghanistan again for Moscow. Not long after, in 1996, Karmal died from liver cancer.
Karmal was born Sultan Hussein on 6 January 1929, the second of five children of Muhammad Hussein Hashem, a Major General in the Afghan Army and former governor of the province of Paktia. His family was one of the wealthier families in Kabul. His ethnic background was Tajik (Kabuli) on his father's side and Ghilzai Pashtun on his mother's side.
Karmal was born in Kamari, a village close to Kabul. He attended Nejat High School, a German-speaking school, graduating in 1948, and applied to enter the Faculty of Law and Political Science of Kabul University. His application was initially denied because of his student political activism and his openly leftist views. He was always a charismatic speaker and became involved in the student union and the Wikh-i-Zalmayan (Awakened Youth Movement), a progressive and leftist organization. He studied at the College of Law and Political Science at Kabul University from 1951 to 1953. In 1953 Karmal was arrested because of his student union activities, and was released three years later, in 1956, in an amnesty by Muhammad Daoud Khan. Shortly after, in 1957, Karmal found work as an English and German translator before quitting and leaving for military training. Karmal graduated from the College of Law and Political Science in 1960, and in 1961 he found work as an employee in the Compilation and Translation Department of the Ministry of Education. From 1961 to 1963 he worked in the Ministry of Planning. When his mother died, Karmal left with his maternal aunt to live elsewhere. His father disowned him because of his leftist views. Karmal was involved in much debauchery, which was controversial in the mostly conservative Afghan society.
Imprisoned from 1953 to 1956, Karmal befriended fellow inmate Mir Akbar Khyber, who introduced him to Marxism. Karmal changed his name from Sultan Hussein to Babrak Karmal, which means "comrade of the workers" in Pashto, to disassociate himself from his bourgeois background. When he was released from prison, he continued his activities in the student union, and began to promote Marxism. Karmal spent the rest of the 1950s and the early 1960s becoming involved with Marxist organizations, of which there were at least four in Afghanistan at the time; two of the four were established by Karmal. When the 1964 Afghan Provisional Constitution, which legalised the establishment of new political entities, was introduced, several prominent Marxists agreed to establish a communist political party. The People's Democratic Party of Afghanistan (PDPA, the Communist Party) was established in January 1965 in Nur Muhammad Taraki's home. Factionalism within the PDPA quickly became a problem; the party split into the Khalq, led by Taraki alongside Hafizullah Amin, and the Parcham, led by Karmal.
During the 1965 parliamentary election Karmal was one of four PDPA members elected to the lower house of parliament; the three others were Anahita Ratebzad, Nur Ahmed Nur and Fezanul Haq Fezan. No Khalqists were elected; Amin fell 50 votes short of a seat. The Parchamite victory may be explained by the simple fact that Karmal could contribute financially to the PDPA electoral campaign. Karmal became a leading figure within the student movement in the 1960s; after a student demonstration he had called for ended with three deaths under the incumbent government, Mohammad Hashim Maiwandwal was made Prime Minister.
In 1967, the PDPA unofficially split into two separate parties, one Khalqist and one Parchamist. The dissolution of the PDPA was initiated by the closing down of the Khalqist newspaper, "Khalq". Karmal criticised the "Khalq" for being too communist, and believed that its leadership should have hidden its Marxist orientation instead of promoting it. According to the official version of events, the majority of the PDPA Central Committee rejected Karmal's criticism. The vote was a close one, and it is reported that Taraki expanded the Central Committee to win it; the plan backfired, with eight of the new members remaining politically unaligned and one switching to the Parchamite side. Karmal and half the PDPA Central Committee left the PDPA to establish a Parchamite-led PDPA. Officially the split was caused by ideological differences, but the party may in fact have divided over the different leadership styles and plans of Taraki and Karmal. Taraki wanted to model the party after Leninist norms, while Karmal wanted to establish a democratic front. Other differences were socioeconomic: the majority of Khalqists came from rural areas, hence were poorer, and were of Pashtun origin, while the Parchamites were urban, richer, and more often spoke Dari. The Khalqists accused the Parchamites of having a connection with the monarchy, and because of it referred to the Parchamite PDPA as the "Royal Communist Party". Both Karmal and Amin retained their seats in the lower house of parliament in the 1969 parliamentary election.
Mohammad Daoud Khan, in collaboration with the Parchamite PDPA and radical military officers, overthrew the monarchy and instituted the Afghan Republic in 1973. After Daoud's seizure of power, an American embassy cable stated that the new government had established a Soviet-style Central Committee, in which Karmal and Mir Akbar Khyber were given leading positions. Most ministries were given to Parchamites; Hassan Sharq became Deputy Prime Minister, Major Faiz Mohammad became Minister of Internal Affairs and Nematullah Pazhwak became Minister of Education. The Parchamites took control over the ministries of finance, agriculture, communications and border affairs. The new government quickly suppressed the opposition, and secured their power base. At first, the National Front government between Daoud and the Parchamites seemed to work. By 1975, Daoud had strengthened his position by enhancing the executive, legislative and judicial powers of the Presidency. To the dismay of the Parchamites, all parties other than the National Revolutionary Party (NRP, established by Daoud) were made illegal.
Shortly after the ban on opposition to the NRP, Daoud began a massive purge of Parchamites in government. Mohammad lost his position as interior minister, Abdul Qadir was demoted, and Karmal was put under government surveillance. To counter Daoud's suddenly anti-communist course, the Soviet Union encouraged the reunification of the PDPA; Taraki was elected its General Secretary and Karmal, Second Secretary. While the Saur Revolution (literally the April Revolution) was planned for August, the assassination of Khyber led to a chain of events which ended with the communists seizing power. Karmal, on taking power in 1979, accused Amin of ordering the assassination of Khyber.
Taraki was appointed Chairman of the Presidium of the Revolutionary Council and Chairman of the Council of Ministers, retaining his post as PDPA general secretary. Taraki initially formed a government which consisted of both Khalqists and Parchamites; Karmal became Deputy Chairman of the Revolutionary Council, Amin became Minister of Foreign Affairs and Deputy Chairman of the Council of Ministers, and Mohammad Aslam Watanjar became another Deputy Chairman of the Council of Ministers. The two Parchamites, Abdul Qadir and Mohammad Rafi, became Minister of Defence and Minister of Public Works, respectively. The appointment of Amin, Karmal and Watanjar led to splits within the Council of Ministers: the Khalqists answered to Amin; Karmal led the civilian Parchamites; and the military officers (who were Parchamites) were answerable to Watanjar (a Khalqist). The first conflict arose when the Khalqists wanted to give PDPA Central Committee membership to military officers who had participated in the Saur Revolution; Karmal opposed such a move but was overruled, and a PDPA Politburo meeting voted in favour of giving Central Committee membership to the officers.
On 27 June 1978, two months after the Saur Revolution, Amin outmaneuvered the Parchamites at a Central Committee meeting, giving the Khalqists the exclusive right to formulate and decide policy. A purge against the Parchamites was initiated by Amin and supported by Taraki on 1 July 1978. Karmal, fearing for his safety, went into hiding in the home of one of his Soviet friends. Karmal tried to contact Alexander Puzanov, the Soviet ambassador to Afghanistan, to talk about the situation. Puzanov refused, and revealed Karmal's location to Amin. The Soviets probably saved Karmal's life by sending him to the Czechoslovak Socialist Republic. In exile, Karmal established a network with the remaining Parchamites in government. A coup to overthrow Amin was planned for 4 September 1978. Its leading members in Afghanistan were Qadir and the Army Chief of Staff, General Shahpur Ahmedzai. The coup was planned for the Festival of Eid, in anticipation of relaxed military vigilance. The conspiracy failed when the Afghan ambassador to India told the Afghan leadership about the plan. Another purge was initiated, and Parchamite ambassadors were recalled. Few returned to Afghanistan; Karmal and Mohammad Najibullah stayed in their respective countries. The Soviets decided that Amin should be removed to make way for a Karmal-Taraki coalition government. However, Amin managed to order the arrest and later the murder of Taraki.
Amin was informed of the Soviet decision to intervene in Afghanistan and was initially supportive of it, but he was assassinated. Under the command of the Soviets, Karmal ascended to power. On 27 December 1979, Karmal's pre-recorded speech to the Afghan people was broadcast via Radio Kabul from Tashkent in the Uzbek SSR (the radio wavelength was changed to that of Kabul), saying: "Today the torture machine of Amin has been smashed, his accomplices – the primitive executioners, usurpers and murderers of tens of thousands of our fellow countrymen – fathers, mothers, sisters, brothers, sons and daughters, children and old people ..." Karmal was not in Kabul when the speech was broadcast; he was in Bagram, protected by the KGB.
That evening Yuri Andropov, the Chairman of the KGB, congratulated Karmal on his rise to the Chairmanship of the Presidium of the Revolutionary Council, some time before Karmal received an official appointment. Karmal returned to Kabul on 28 December. He travelled alongside a Soviet military column. For the next few days Karmal lived in a villa on the outskirts of Kabul under the protection of the KGB. On 1 January 1980 Leonid Brezhnev, the General Secretary of the Central Committee of the Communist Party of the Soviet Union, and Alexei Kosygin, the Soviet Chairman of the Council of Ministers, congratulated Karmal on his "election" as leader.
When he came to power, Karmal promised an end to executions, the establishment of democratic institutions and free elections, the creation of a constitution, and the legalization of alternative political parties. Prisoners incarcerated under the two previous governments would be freed in a general amnesty (which occurred on 6 January). He promised the creation of a coalition government which would not espouse socialism. At the same time, he told the Afghan people that he had negotiated with the Soviet Union for economic, military and political assistance. The mistrust most Afghans felt towards the government was a problem for Karmal. Many still remembered that he had said in 1978 that he would protect private capital, a promise later proven to be a lie.
Karmal's three most important promises were the general amnesty of prisoners, the promulgation of the Fundamental Principles of the Democratic Republic of Afghanistan and the adoption of a new flag containing the traditional black, red and green (the flag of Taraki and Amin had been red). His government granted concessions to religious leaders and restored confiscated property; some property confiscated during earlier land reforms was also partially restored. All these measures, with the exception of the general amnesty, were introduced gradually. Of the 2,700 prisoners, 2,600 were released; 600 of these were Parchamites. The general amnesty was greatly publicized by the government. While it was hailed with enthusiasm by some, many others greeted it with disdain, since their loved ones or associates had died during earlier purges. Amin had planned to introduce a general amnesty on 1 January 1980, to coincide with the PDPA's sixteenth anniversary.
Work on the Fundamental Principles had started under Amin: it guaranteed democratic rights such as freedom of speech, the right to security and life, the right to peaceful association, the right to demonstrate and the right that "no one would be accused of crime but in accord with the provisions of law", and the accused had the right to a fair trial. The Fundamental Principles envisaged a democratic state led by the PDPA, the only party then permitted by law. The Revolutionary Council, the organ of supreme power, would convene twice every year. The Revolutionary Council in turn elected a Presidium which would take decisions on behalf of the Revolutionary Council when it was not in session. The Presidium consisted mostly of PDPA Politburo members. The state would safeguard three kinds of property: state, cooperative and private property. The Fundamental Principles said that the state had the right to change the Afghan economy from an economy where man was exploited to an economy where man was free. Another clause stated that the state had the right to take "families, both parents and children, under its supervision." While it looked democratic at the outset, the Fundamental Principles document was based on contradictions.
The Fundamental Principles led to the establishment of two important state organs: the Special Revolutionary Court, a specialized court for crimes against national security and territorial integrity, and the Institute for Legal and Scientific Research and Legislative Affairs, the supreme legislative organ of state. The latter body could amend and draft laws, and introduce regulations and decrees on behalf of the government. The introduction of more Soviet-style institutions led the Afghan people to distrust the communist government even more.
The Fundamental Principles came into force as a constitution on 22 April 1980.
With Karmal's ascension to power, Parchamites began to "settle old scores". Revolutionary Troikas were created to arrest, sentence and execute people. Amin's guards were the first victims of the terror which ensued. Commanders who had stayed loyal to Amin were arrested, filling the prisons. The Soviets protested, and Karmal replied, "As long as you keep my hands bound and do not let me deal with the Khalq faction there will be no unity in the PDPA and the government cannot become strong ... They tortured and killed us. They still hate us! They are the enemies of the party ..." Amin's daughter, along with her baby, was imprisoned for twelve years, until Mohammad Najibullah, then leader of the PDPA, released her. When Karmal took power, leading posts in the party and government bureaucracy were taken over by Parchamites. The Khalq faction was removed from power, and only technocrats, opportunists and individuals whom the Soviets trusted were appointed to the higher echelons of government. Khalqists retained the Ministry of Interior and the Ministry of Communications, while Parchamites were given control over KHAD, the secret police, as well as the Ministry of Foreign Affairs and the Ministry of Defence. The two factions controlled equal shares of the military, and two of Karmal's three Council of Ministers deputy chairmen were Khalqists. In addition to the changes in government, the Parchamites held a clear majority in the PDPA Central Committee. Only one Khalqist, Saleh Mohammad Zeary, was a member of the PDPA Secretariat during Karmal's rule.
On 14 and 15 March 1982 the PDPA held a party "conference" at the Kabul Polytechnic Institute instead of a party "congress", since a party congress would have given the Khalq faction a majority and could have led to a Khalqist takeover of the PDPA. The rules for holding a party conference were different, and the Parchamites had a three-fifths majority. This infuriated several Khalqists, and the threat of expulsion did not lessen their anger. The conference was not successful, but it was portrayed as such by the official media. Because of the inter-party struggle for power between the Khalqists and the Parchamites, the conference broke up after one and a half days of a three-day program. A "program of action" was introduced, and party rules were given minor changes. To explain the party's low membership, the official media also portrayed party membership as difficult to attain.
When Karmal took power, he began expanding the support base of the PDPA. He tried to persuade groups which had been labelled class enemies of the revolution during Taraki's and Amin's rule to support the PDPA, and appointed several non-communists to top positions. Between March and May 1980, 78 of the 191 people appointed to government posts were not members of the PDPA. Karmal reintroduced the old Afghan custom of opening every government proclamation with an Islamic invocation. In his first live speech to the Afghan people, Karmal called for the establishment of the National Fatherland Front (NFF); the NFF's founding congress was held in June 1981. Unfortunately for Karmal, these policies did not lead to a notable increase in support for his regime, and it did not help that most Afghans saw the Soviet intervention as an invasion.
By 1981, the government had given up on political solutions to the conflict. At the fifth PDPA Central Committee plenum in June, Karmal resigned from his Council of Ministers chairmanship and was replaced by Sultan Ali Keshtmand, while Nur Ahmad Nur was given a bigger role in the Revolutionary Council. This was seen as "base broadening". By June 1981 the weight previously given to non-PDPA members in top positions had ceased to be an important matter in the media; this was significant, considering that up to five members of the Revolutionary Council were non-PDPA members. By the end of 1981, the previous contenders, who had been heavily featured in the media, were all gone: two were given ambassadorships, two ceased to be active in politics, one continued as an advisor to the government, and the other three changed sides and began to work for the opposition.
The national policy of reconciliation continued: in January 1984 the land reform introduced by Taraki and Amin was drastically modified and the limits on landholdings were increased to win the support of middle-class peasants, the literacy programme was continued, and concessions were made to women. In 1985 the Loya Jirga was reconvened, followed by a tribal jirga in September. In 1986 Abdul Rahim Hatef, a non-PDPA member, was elected to the NFF chairmanship. During the 1985–86 elections it was said that 60 percent of the elected officials were non-PDPA members, and by the end of Karmal's rule several non-PDPA members held high-level government positions.
In March 1979 the military budget was 6.4 million US$, which was 8.3 percent of the government budget but only 2.2 percent of gross national product. After the Soviet intervention, the defence budget increased to 208 million US$ in 1980, and to 325 million US$ by 1981. In 1982 it was reported that defence consumed around 22 percent of total government expenditure.
When the political solution failed (see "PDPA base" section), the Afghan government and the Soviet military decided to solve the conflict militarily. The change from a political to a military solution did not come suddenly. It began in January 1981: Karmal doubled wages for military personnel, issued several promotions, and decorated one general and thirteen colonels. The draft age was lowered, the obligatory length of military service was extended, and the age limit for reservists was raised to thirty-five. In June 1981 Assadullah Sarwari lost his seat in the PDPA Politburo and was replaced by Mohammad Aslam Watanjar, a former tank commander and Minister of Communications; Major General Mohammad Rafi was made Minister of Defence, and Mohammad Najibullah was appointed KHAD Chairman.
These measures were introduced because of the collapse of the army during the Soviet intervention: before the invasion the army could field 100,000 troops, after it only 25,000. Desertion was rampant, and the recruitment campaigns aimed at young people often drove them to the opposition. To better organize the military, seven military zones were established, each with its own Defence Council. The Defence Councils were established at the national, provincial and district levels to empower the local PDPA. It is estimated that the Afghan government spent as much as 40 percent of its revenue on defence.
During the civil war and the ensuing Soviet–Afghan War, most of the country's infrastructure was destroyed and normal patterns of economic activity were disrupted. Gross national product (GNP) fell substantially during Karmal's rule because of the conflict; trade and transport were disrupted, along with the loss of labour and capital. In 1981 Afghan GDP stood at 154.3 billion afghanis, a drop from 159.7 billion in 1978. GNP per capita decreased from 7,370 afghanis in 1978 to 6,852 in 1981. Agriculture remained the dominant form of economic activity, accounting for 63 percent of gross domestic product (GDP) in 1981 and employing 56 percent of the labour force in 1982. Industry accounted for 21 percent of GDP in 1982 and employed 10 percent of the labour force; all industrial enterprises were government-owned. The service sector, the smallest of the three, accounted for 10 percent of GDP in 1981 and employed an estimated one-third of the labour force. The balance of payments, which had improved under the pre-communist administration of Muhammad Daoud Khan, deteriorated, turning negative by 1982 at 70.3 million US$. The only economic activity that grew substantially during Karmal's rule was foreign trade.
Karmal observed in early 1983 that without Soviet intervention, "It is unknown what the destiny of the Afghan Revolution would be ... We are realists and we clearly realize that in store for us yet lie trials and deprivations, losses and difficulties." Two weeks before this statement, Sultan Ali Keshtmand, the Chairman of the Council of Ministers, had lamented that half the country's schools and three-quarters of its communications had been destroyed since 1979. The Soviet Union rejected several Western-made peace plans, such as the Carrington Plan, since they did not take the PDPA government into consideration; most Western peace plans had been drawn up in collaboration with the Afghan opposition forces. At the 26th Congress of the Communist Party of the Soviet Union (CPSU), Leonid Brezhnev, the General Secretary of the CPSU Central Committee, stated:
The stance of the Pakistani government was clear: it demanded complete Soviet withdrawal from Afghanistan and the establishment of a non-PDPA government. Karmal, summarizing his discussions with Iran and Pakistan, said, "Iran and Pakistan have so far not opted for concrete and constructive positions." During Karmal's rule Afghan–Pakistani relations remained hostile, with the Soviet intervention in Afghanistan the catalyst for the hostility. The increasing numbers of Afghan refugees in Pakistan also challenged the PDPA's legitimacy to rule.
The Soviet Union threatened in 1985 that it would support the Baloch separatist movement in Pakistan if the Pakistani government continued to aid the mujahideen in Afghanistan. Karmal, problematically for the Soviets, did not want a Soviet withdrawal, and he hampered attempts to improve relations with Pakistan since the Pakistani government had refused to recognise the PDPA government.
Mikhail Gorbachev, then General Secretary of the Central Committee of the Communist Party of the Soviet Union, said, "The main reason that there has been no national consolidation so far is that Comrade Karmal is hoping to continue sitting in Kabul with our help." Karmal's position became less secure when the Soviet leadership began blaming him for the failures in Afghanistan. Gorbachev, worried over the situation, told the Soviet Politburo, "If we don't change approaches [to evacuate Afghanistan], we will be fighting there for another 20 or 30 years." It is not clear when the Soviet leadership began to campaign for Karmal's dismissal, but Andrei Gromyko discussed the possibility of Karmal's resignation with Javier Pérez de Cuéllar, the Secretary-General of the United Nations, in 1982. While it was Gorbachev who would dismiss Karmal, there may have been a consensus within the Soviet leadership as early as 1983 that Karmal should resign. Gorbachev's own plan was to replace Karmal with Mohammad Najibullah, who had joined the PDPA at its creation. Najibullah was thought highly of by Yuri Andropov, Boris Ponomarev and Dmitriy Ustinov, and negotiations for his succession may have started in 1983. Najibullah was not the Soviet leadership's only choice for Karmal's succession; a GRU report noted that the majority of the PDPA leadership would support Assadullah Sarwari's ascension to the leadership. According to the GRU, Sarwari was the better candidate because he could balance the interests of the Pashtuns, Tajiks and Uzbeks, whereas Najibullah was a Pashtun nationalist. Another viable candidate was Abdul Qadir, who had been a participant in the Saur Revolution.
Najibullah was appointed to the PDPA Secretariat in November 1985. During Karmal's March 1986 visit to the Soviet Union, the Soviets tried to persuade him that he was too ill to govern and should resign. This backfired when a Soviet doctor attending to Karmal told him he was in good health. Karmal asked to return home to Kabul, saying that he understood and would heed the Soviet recommendations, and before leaving he promised to step down as PDPA General Secretary. The Soviets did not trust him and sent Vladimir Kryuchkov, the head of the KGB's foreign intelligence arm, the First Chief Directorate (FCD), to Afghanistan. At a meeting in Kabul, Karmal professed his undying love for the Soviet Union, comparing his ardor to his Muslim faith. Kryuchkov, concluding that he could not persuade Karmal to resign, left the meeting. After Kryuchkov left the room, the Afghan defence minister and the state security minister visited Karmal's office and told him that he had to resign from one of his posts. Understanding that his Soviet support had evaporated, Karmal resigned from the office of General Secretary at the 18th PDPA Central Committee plenum. He was succeeded in the post by Najibullah.
Karmal still had support within the party, and used this base to curb Najibullah's powers, spreading rumors that he would be reappointed General Secretary. Najibullah's power base was in KHAD, the Afghan equivalent of the KGB, not in the party. Because the Soviet Union had supported Karmal for over six years, the Soviet leadership wanted to ease him out of power gradually, and Yuli Vorontsov, the Soviet ambassador to Afghanistan, told Najibullah to begin undermining Karmal's position slowly. Najibullah complained to the Soviet leadership that Karmal used most of his spare time looking for errors and "speaking against the National Reconciliation [programme]". At a meeting of the Soviet Politburo on 13 November 1986 it was decided that Najibullah should remove Karmal; this motion was supported by Gromyko, Vorontsov, Eduard Shevardnadze, Anatoly Dobrynin and Viktor Chebrikov. A PDPA meeting in November relieved Karmal of his Revolutionary Council chairmanship and exiled him to Moscow, where he was given a state-owned apartment and a dacha. Karmal was succeeded as Revolutionary Council chairman by Haji Mohammad Chamkani, who was not a member of the PDPA.
Many years after the end of his presidency, Karmal denounced the Saur Revolution of 1978, in which he had taken part, and took aim at the Khalq governments of Taraki and Amin. He told a Russian reporter:
It was the greatest crime against the people of Afghanistan. Parcham's leaders were against armed actions because the country was not ready for a revolution... I knew that people would not support us if we decided to keep power without such support.
For unknown reasons, Karmal was invited back to Kabul by Najibullah, and "for equally obscure reasons Karmal accepted", returning on 20 June 1991. (This may have been on the recommendation of Anahita Ratebzad, who was very close to Karmal and was respected by Najibullah and by much of the left movement in Afghanistan.) If Najibullah's plan was to strengthen his position within the Watan Party (the renamed PDPA) by appeasing the pro-Karmal Parchamites, he failed: Karmal's apartment became a center for opposition to Najibullah's government. When Najibullah was toppled in 1992, Karmal became the most powerful politician in Kabul through his leadership of the Parcham faction. However, his negotiations with the rebels collapsed quickly, and on 16 April 1992 the rebels, led by Gulbuddin Hekmatyar, took Kabul. After the fall of Najibullah's government, Karmal was based in Hairatan, where, it is alleged, he spent most of his time either trying to establish a new party or advising people to join the secular National Islamic Movement ("Junbish-i-Milli"). Abdul Rashid Dostum, the leader of Junbish-i-Milli, had been a supporter of Karmal during his rule. It is unknown how much control Karmal had over Dostum, and there is little evidence that Karmal was in any commanding position; his influence over Dostum appears to have been indirect, exercised through former associates who supported Dostum. Those who spoke with Karmal during this period noted his lack of interest in politics. In June 1992 it was reported that he had died in a plane crash along with Dostum, although these reports later proved false.
In early December 1996, Karmal died in Moscow's Central Clinical Hospital from liver cancer. The date of his death was reported by some sources as 1 December and by others as 3 December. The Taliban summed up his rule as follows:
[he] committed all kinds of crimes during his illegitimate rule ... God inflicted on him various kinds of hardship and pain. Eventually he died of cancer in a hospital belonging to his paymasters, the Russians.
Buddhist philosophy
Buddhist philosophy refers to the philosophical investigations and systems of inquiry that developed among various Buddhist schools in India following the parinirvana (i.e. death) of the Buddha and later spread throughout Asia. The Buddhist path combines both philosophical reasoning and meditation. The Buddhist traditions present a multitude of Buddhist paths to liberation, and Buddhist thinkers in India and subsequently in East Asia have covered topics as varied as phenomenology, ethics, ontology, epistemology, logic and philosophy of time in their analysis of these paths.
Early Buddhism was based on empirical evidence gained by the sense organs ("ayatana") and the Buddha seems to have retained a skeptical distance from certain metaphysical questions, refusing to answer them because they were not conducive to liberation but led instead to further speculation. A recurrent theme in Buddhist philosophy has been the reification of concepts, and the subsequent return to the Buddhist Middle Way.
Particular points of Buddhist philosophy have often been the subject of disputes between different schools of Buddhism. These elaborations and disputes gave rise to various schools in early Buddhism of Abhidharma, and to the Mahayana traditions such as Prajnaparamita, Madhyamaka, Buddha-nature and Yogācāra.
Edward Conze splits the development of Indian Buddhist philosophy into three phases: an early phase concerned with the original doctrines of the Buddha, a scholastic phase in which the Abhidharma schools systematized and elaborated those doctrines, and a Mahāyāna phase marked by new metaphysical developments.
Various elements of these three phases are incorporated and/or further developed in the philosophy and world view of the various sects of Buddhism that then emerged.
Philosophy in India was aimed mainly at spiritual liberation and had soteriological goals. In his study of Mādhyamaka Buddhist philosophy in India, Peter Della Santina writes:
For the Indian Buddhist philosophers, the teachings of the Buddha were not meant to be taken on faith alone, but to be confirmed by logical analysis ("pramana") of the world. The early Buddhist texts mention that a person becomes a follower of the Buddha's teachings after having pondered over them with wisdom, and the gradual training also requires that a disciple "investigate" ("upaparikkhati") and "scrutinize" ("tuleti") the teachings. The Buddha also expected his disciples to approach him as a teacher in a critical fashion and scrutinize his actions and words, as shown in the "Vīmaṃsaka Sutta".
Scholarly opinion varies as to whether the Buddha himself was engaged in philosophical inquiry. The Buddha (c. 5th century BCE) was a north Indian sramana (wandering ascetic) from Magadha. He cultivated various yogic techniques and ascetic practices and taught throughout north India, where his teachings took hold. These teachings are preserved in the Pali Nikayas and in the Agamas as well as in other surviving fragmentary textual collections (collectively known as the Early Buddhist Texts). Dating these texts is difficult, and there is disagreement on how much of this material goes back to a single religious founder. While the focus of the Buddha's teachings is on attaining the highest good of nirvana, they also contain an analysis of the source of human suffering, the nature of personal identity, and the process of acquiring knowledge about the world.
The Buddha defined his teaching as "the middle way" (Pali: "Majjhimāpaṭipadā"). In the "Dhammacakkappavattana Sutta", this is used to refer to the fact that his teachings steer a middle course between the extremes of asceticism and bodily denial (as practiced by the Jains and other ascetic groups) and sensual hedonism or indulgence. Many sramanas of the Buddha's time placed much emphasis on a denial of the body, using practices such as fasting, to liberate the mind from the body. The Buddha, however, realized that the mind was embodied and causally dependent on the body, and therefore that a malnourished body did not allow the mind to be trained and developed. Thus, Buddhism's main concern is not with luxury or poverty, but instead with the human response to circumstances.
Certain basic teachings appear in many places throughout these early texts, so older studies by various scholars conclude that the Buddha must at least have taught some of these key teachings: the three marks of existence, the five aggregates, dependent origination, karma and rebirth, the four noble truths, the noble eightfold path, and nirvana.
According to N. Ross Reat, all of these doctrines are shared by the Theravada Pali texts and the Mahasamghika school's "Śālistamba Sūtra". A recent study by Bhikkhu Analayo concludes that the Theravada "Majjhima Nikaya" and Sarvastivada "Madhyama Agama" contain mostly the same major doctrines. Richard Salomon, in his study of the Gandharan texts (which are the earliest manuscripts containing early discourses), has confirmed that their teachings are "consistent with non-Mahayana Buddhism, which survives today in the Theravada school of Sri Lanka and Southeast Asia, but which in ancient times was represented by eighteen separate schools."
However, some scholars such as Schmithausen, Vetter, and Bronkhorst argue that critical analysis reveals discrepancies among these various doctrines. They present alternative possibilities for what was taught in early Buddhism and question the authenticity of certain teachings and doctrines.
For example, some scholars think that karma was not central to the teaching of the historical Buddha, while others disagree with this position. Likewise, there is scholarly disagreement on whether insight was seen as liberating in early Buddhism or whether it was a later addition to the practice of the four "dhyāna". According to Vetter and Bronkhorst, "dhyāna" constituted the original "liberating practice", while discriminating insight into transiency as a separate path to liberation was a later development. Scholars such as Bronkhorst and Carol Anderson also think that the four noble truths may not have been formulated in earliest Buddhism, but, as Anderson writes, "emerged as a central teaching in a slightly later period that still preceded the final redactions of the various Buddhist canons."
According to some scholars, the philosophical outlook of earliest Buddhism was primarily negative, in the sense that it focused on what doctrines to "reject" more than on what doctrines to "accept". Only knowledge that is useful in achieving enlightenment is valued. According to this theory, the cycle of philosophical upheavals that in part drove the diversification of Buddhism into its many schools and sects only began once Buddhists began attempting to make explicit the implicit philosophy of the Buddha and the early texts.
The four noble truths, or "truths of the noble one", are a central feature of the teachings and are put forth in the "Dhammacakkappavattana Sutta". The first truth, dukkha, often translated as "suffering", is the inherent unsatisfactoriness of life. This unpleasantness is said to be not just physical pain, but also a kind of existential unease caused by the inevitable facts of our mortality and ultimately by the impermanence of all phenomena. It also arises from contact with unpleasant events, and from not getting what one desires. The second truth is that this unease arises out of conditions, mainly 'craving' (tanha) and ignorance (avidya). The third truth is that if one lets go of craving and removes ignorance through knowledge, dukkha ceases (nirodha). The fourth is the eightfold path, eight practices that end suffering: right view, right intention, right speech, right action, right livelihood, right effort, right mindfulness and right samadhi (mental unification, meditation). The goal taught by the Buddha, nirvana, literally means 'extinguishing' and signifies "the complete extinguishing of greed, hatred, and delusion (i.e. ignorance)", the forces which power "samsara". Nirvana also means that after an enlightened being's death, there is no further rebirth.

In early Buddhism, the concept of dependent origination was most likely limited to processes of mental conditioning and not to all physical phenomena. The Buddha understood the world in procedural terms, not in terms of things or substances. His theory posits a flux of events arising under certain conditions which are interconnected and dependent, such that the processes in question are at no time considered to be static or independent. Craving, for example, is always dependent on, and caused by, sensations; sensations are always dependent on contact with our surroundings. The Buddha's causal theory is simply descriptive: "This existing, that exists; this arising, that arises; this not existing, that does not exist; this ceasing, that ceases." This understanding of causation as "impersonal lawlike causal ordering" is important because it shows both how the processes that give rise to suffering work and how they can be reversed.
The removal of suffering, then, requires a deep understanding of the nature of reality (prajña). While philosophical analysis of arguments and concepts is clearly necessary to develop this understanding, it is not enough to remove our unskillful mental habits and deeply ingrained prejudices, which require meditation, paired with understanding. According to the Buddha of the early texts, we need to train the mind in meditation to be able to truly see the nature of reality, which is said to have the marks of suffering, impermanence and not-self. Understanding and meditation are said to work together to 'clearly see' (vipassana) the nature of human experience and this is said to lead to liberation.
The Buddha argued that compounded entities lack essence, and correspondingly that the self is without essence. This means there is no part of a person which is unchanging and essential for continuity; there is no individual "part of the person that accounts for the identity of that person over time". This is in opposition to the Upanishadic concept of an unchanging ultimate self (Atman) and to any view of an eternal soul. The Buddha held that attachment to the appearance of a permanent self in this world of change is the cause of suffering, and the main obstacle to liberation.
The most widely used argument the Buddha employed against the idea of an unchanging ego is an empiricist one, based on the observation of the five aggregates that make up a person and the fact that these are always changing. The argument can be put in this way:
1. If there were a self, it would be permanent.
2. None of the five kinds of psychophysical element (the aggregates) is permanent.
3. Therefore, there is no self.
This argument requires the implied premise that the five aggregates are an exhaustive account of what makes up a person, or else the self could exist outside of these aggregates. This premise is affirmed in other suttas, such as SN 22.47 which states: "whatever ascetics and brahmins regard various kinds of things as self, all regard the five grasping aggregates, or one of them."
This argument is famously expounded in the "Anattalakkhana Sutta". According to this text, the apparently fixed self is merely the result of identification with the temporary aggregates, the changing processes making up an individual human being. In this view a 'person' is only a convenient nominal designation for a certain grouping of processes and characteristics, and an 'individual' is a conceptual construction overlaid on a stream of experiences, just as a chariot is merely a conventional designation for the parts of a chariot and the way they are put together. The foundation of this argument is empiricist, for it is based on the fact that all we observe is subject to change, especially everything observed when looking inwardly in meditation.
Another argument for 'non-self', the 'argument from lack of control', is based on the fact that we often seek to change certain parts of ourselves: the 'executive function' of the mind is that which finds certain things unsatisfactory and attempts to alter them. It also draws on the Indian 'anti-reflexivity principle', which states that an entity cannot operate on or control itself (a knife can cut other things but not itself, a finger can point at other things but not at itself, etc.). This means, then, that the self could never desire to change itself and could not do so (another reason for this is that in most Indian traditions besides Buddhism, the true self or Atman is perfectly blissful and does not suffer). The Buddha uses this idea to attack the concept of self. The argument could be structured thus:
1. If the self existed, it would be the part of the person that performs the executive function, the "controller".
2. By the anti-reflexivity principle, the self could not operate on, or seek to change, itself.
3. But each of the five aggregates is something we can find unsatisfactory and seek to change.
4. Therefore, none of the aggregates is the self.
This argument then denies that there is one permanent "controller" in the person. Instead it views the person as a set of constantly changing processes which include volitional events seeking change and an awareness of that desire for change. According to Mark Siderits: "What the Buddhist has in mind is that on one occasion one part of the person might perform the executive function, on another occasion another part might do so. This would make it possible for every part to be subject to control without there being any part that always fills the role of controller (and so is the self). On some occasions a given part might fall on the controller side, while on other occasions it might fall on the side of the controlled. This would explain how it's possible for us to seek to change any of the skandhas while there is nothing more to us than just those skandhas."

As noted by K.R. Norman and Richard Gombrich, the Buddha extended his anatta critique to the Brahmanical belief, expounded in the "Brihadaranyaka Upanishad", that the Self (Atman) was indeed the whole world, or Brahman. This is shown by the "Alagaddupama Sutta", in which the Buddha argues that an individual cannot experience the suffering of the entire world. He used the example of someone carrying off and burning grass and sticks from the Jeta grove, and how a monk would not sense or consider himself harmed by that action. In this example the Buddha is arguing that we do not have direct experience of the entire world, and hence the Self cannot be the whole world. In this sutta (as well as in the "Soattā Sutta") the Buddha outlines six wrong views about Self:
"There are six wrong views: An unwise, untrained person may think of the body, 'This is mine, this is me, this is my self'; he may think that of feelings; of perceptions; of volitions; or of what has been seen, heard, thought, cognized, reached, sought or considered by the mind. The sixth is to identify the world and self, to believe: 'At death I shall become permanent, eternal, unchanging, and so remain forever the same; and that is mine, that is me, that is my self.' A wise and well-trained person sees that all these positions are wrong, and so he is not worried about something that does not exist."
Furthermore, the Buddha argues that the world can be observed to be a cause of suffering (Brahman was held to be ultimately blissful) and that since we cannot control the world as we wish, the world cannot be the Self. The idea that "this cosmos is the self" is one of the views rejected by the Buddha along with the related Monistic theory that held that "everything is a Oneness" (SN 12.48 "Lokayatika Sutta"). The Buddha also held that understanding and seeing the truth of not-self led to un-attachment, and hence to the cessation of suffering, while ignorance about the true nature of personality led to further suffering.
All schools of Indian philosophy recognize various sets of valid justifications for knowledge, or "pramana", and many see the Vedas as providing access to truth. The Buddha denied the authority of the Vedas, though, like his contemporaries, he affirmed the soteriological importance of having a proper understanding of reality (right view). However, this understanding was not conceived primarily as metaphysical or cosmological knowledge, but as knowledge of the arising and cessation of suffering in human experience. Therefore, the Buddha's epistemic project is different from that of modern philosophy; it is primarily a solution to the fundamental human spiritual/existential problem.
The Buddha's epistemology has been compared to empiricism, in the sense that it was based on experience of the world through the senses. The Buddha taught that empirical observation through the six sense fields (ayatanas) was the proper way of verifying any knowledge claims. Some suttas go further, stating that "the All", or everything that exists ("sabbam"), is just these six sense spheres (SN 35.23, Sabba Sutta), and that anyone who attempts to describe another "All" will be unable to do so because "it lies beyond range". This sutta seems to indicate that for the Buddha, things in themselves, or noumena, are beyond our epistemological reach ("avisaya").
Furthermore, in the Kalama Sutta the Buddha tells a group of confused villagers that the only proper reason for one's beliefs is verification in one's own personal experience (and the experience of the wise) and denies any verification which stems from personal authority, sacred tradition ("anussava") or any kind of rationalism which constructs metaphysical theories ("takka"). In the Tevijja Sutta (DN 13), the Buddha rejects the personal authority of Brahmins because none of them can prove they have had personal experience of Brahman. The Buddha also stressed that experience is the only criterion for verification of the truth in this passage from the Majjhima Nikaya (MN.I.265):
Furthermore, the Buddha's standard for personal verification was pragmatic and salvific: for the Buddha, a belief counts as truth only if it leads to successful Buddhist practice (and hence, to the destruction of craving). In the "Discourse to Prince Abhaya" (MN.I.392–4) the Buddha states this pragmatic maxim by saying that a belief should only be accepted if it leads to wholesome consequences. This tendency of the Buddha to see what is true as what is useful or 'what works' has been called a form of Pragmatism by scholars such as Mrs Rhys Davids and Vallée-Poussin. However, K. N. Jayatilleke argues that the Buddha's epistemology can also be taken as a form of correspondence theory (as per the 'Apannaka Sutta') with elements of coherentism, and that for the Buddha, it is causally impossible for something which is false to lead to the cessation of suffering and evil.
The Buddha discouraged his followers from indulging in intellectual disputation for its own sake, which is fruitless, and distracts one from the goal of awakening. Only philosophy and discussion which has pragmatic value for liberation from suffering is seen as important. According to the scriptures, during his lifetime the Buddha remained silent when asked several metaphysical questions which he regarded as the basis for "unwise reflection". These 'unanswered questions' (avyākata) regarded issues such as whether the universe is eternal or non-eternal (or whether it is finite or infinite), the unity or separation of the body and the self, the complete inexistence of a person after Nirvana and death, and others. The Buddha stated that thinking about these imponderable (Acinteyya) issues led to "a thicket of views, a wilderness of views, a contortion of views, a writhing of views, a fetter of views" (Aggi-Vacchagotta Sutta).
One explanation for this pragmatic suspension of judgment, or epistemic epoché, is that such questions distract from the activity that is practical to realizing enlightenment, and bring the danger of substituting conceptual understanding of the doctrine, or religious faith, for the experience of liberation. According to the Buddha, the Dharma is not an ultimate end in itself or an explanation of all metaphysical reality, but a pragmatic set of teachings. The Buddha used two parables to clarify this point, the 'Parable of the raft' and the 'Parable of the Poisoned Arrow'. The Dharma is like a raft in the sense that it is only a pragmatic tool for attaining nirvana ("for the purpose of crossing over, not for the purpose of holding onto", MN 22); once one has crossed over, one can discard the raft. It is also like medicine, in that the particulars of how one was injured by a poisoned arrow (i.e. metaphysics, etc.) do not matter in the act of removing the arrow and curing the wound (removing suffering). In this sense, the Buddha was often called 'the great physician', because his goal was first and foremost to cure the human condition of suffering, not to speculate about metaphysics.
Having said this, it is still clear that resisting (even refuting) a false or slanted doctrine can be useful for extricating the interlocutor, or oneself, from error, and hence for advancing on the way to liberation. Witness the Buddha's confutation of several doctrines of Nigantha Nataputta and other purported sages, who sometimes had large followings (e.g., Kula Sutta, Sankha Sutta, Brahmana Sutta). This shows that a virtuous and appropriate use of dialectics can take place; by implication, reasoning and argument should not be disparaged by Buddhists.
After the Buddha's death, some Buddhists such as Dharmakirti went on to use the sayings of the Buddha as sound evidence equal to perception and inference.
Another possible reason why the Buddha refused to engage in metaphysics is that he saw ultimate reality and nirvana as devoid of sensory mediation and conception and therefore language itself is "a priori" inadequate to explain it. Thus, the Buddha's silence does not indicate misology or disdain for philosophy. Rather, it indicates that he viewed the answers to these questions as not understandable by the unenlightened. Dependent arising provides a framework for analysis of reality that is not based on metaphysical assumptions regarding existence or non-existence, but instead on direct cognition of phenomena as they are presented to the mind in meditation.
The Buddha of the earliest Buddhists texts describes Dharma (in the sense of "truth") as "beyond reasoning" or "transcending logic", in the sense that reasoning is a subjectively introduced aspect of the way unenlightened humans perceive things, and the conceptual framework which underpins their cognitive process, rather than a feature of things as they really are. Going "beyond reasoning" means in this context penetrating the nature of reasoning from the inside, and removing the causes for experiencing any future stress as a result of it, rather than functioning outside the system as a whole.
The Buddha's ethics are based on the soteriological need to eliminate suffering and on the premise of the law of karma. Buddhist ethics have been termed eudaimonic (with their goal being well-being) and also compared to virtue ethics (this approach began with Damien Keown). Keown writes that Buddhist Nirvana is analogous to the Aristotelian Eudaimonia, and that Buddhist moral acts and virtues derive their value from how they lead us to or act as an aspect of the nirvanic life.
The Buddha outlined five precepts (no killing, stealing, sexual misconduct, lying, or drinking alcohol) which were to be followed by his disciples, lay and monastic. There are various reasons the Buddha gave as to why someone should be ethical.
First, the universe is structured in such a way that if someone intentionally commits a misdeed, a bad karmic fruit will be the result (and vice versa). Hence, from a pragmatic point of view, it is best to abstain from these negative actions which bring forth negative results. However, the important word here is "intentionally": for the Buddha, karma is nothing else but intention/volition, and hence unintentionally harming someone does not create bad karmic results. Unlike the Jains who believed that karma was a quasi-physical element, for the Buddha karma was a volitional mental event, what Richard Gombrich calls 'an ethicised consciousness'.
This idea leads into the second moral justification of the Buddha: intentionally performing negative actions reinforces and propagates mental defilements which keep persons bound to the cycle of rebirth and interfere with the process of liberation, and hence intentionally performing good karmic actions is participating in mental purification which leads to nirvana, the highest happiness. This perspective sees immoral acts as unskillful ("akusala") in our quest for happiness, and hence it is pragmatic to do good.
The third meta-ethical consideration takes the view of not-self and our natural desire to end our suffering to its logical conclusion. Since there is no self, there is no reason to prefer our own welfare over that of others, because there is no ultimate grounding for the differentiation of "my" suffering and someone else's. Instead, an enlightened person would simply work to end suffering "tout court", without reference to the conventional concept of persons. According to this argument, anyone who acts selfishly does so out of ignorance of the true nature of personal identity, and hence irrationally.
The main Indian Buddhist philosophical schools practiced a form of analysis termed "Abhidharma" which sought to systematize the teachings of the early Buddhist discourses (sutras). Abhidharma analysis broke down human experience into momentary phenomenal events or occurrences called "dharmas". Dharmas are impermanent and dependent on other causal factors; they arise and pass as part of a web of other interconnected dharmas, and are never found alone. The Abhidharma schools held that the teachings of the Buddha in the sutras were merely conventional, while the Abhidharma analysis was ultimate truth (paramattha sacca), the way things really are when seen by an enlightened being. The Abhidharmic project has been likened to a form of phenomenology or process philosophy. Abhidharma philosophers not only outlined what they believed to be an exhaustive listing of "dharmas", or phenomenal events, but also the causal relations between them. In the Abhidharmic analysis, the only thing which is ultimately real is the interplay of dharmas in a causal stream; everything else is merely conceptual ("paññatti") and nominal.
This view has been termed "mereological reductionism" by Mark Siderits because it holds that only impartite entities are real, not wholes. Abhidharmikas such as Vasubandhu argued that conventional things (tables, persons, etc.) "disappear under analysis", and that this analysis reveals only a causal stream of phenomenal events and their relations. The mainstream Abhidharmikas defended this view against their main Hindu rivals, the Nyaya school, who were substance theorists and posited the existence of universals. Some Abhidharmikas, such as the Prajñaptivāda, were also strict nominalists, and held that all things, even dharmas, were merely conceptual.
An important Abhidhamma work from the Theravāda school is the Kathāvatthu ("Points of controversy"), attributed to the Indian scholar-monk Moggaliputta-Tissa (c. 327–247 BCE). This text is important because it attempts to refute several philosophical views which had developed after the death of the Buddha, especially the theory that 'all exists' ("sarvāstivāda"), the theory of momentariness ("khāṇavāda") and the personalist view ("pudgalavada"). These were the major philosophical theories which divided the Buddhist Abhidharma schools in India. After being brought to Sri Lanka in the first century BCE, the Theravada Pali-language Abhidhamma tradition was heavily influenced by the works of Buddhaghosa (4th–5th century CE), the most important philosopher and commentator of the Theravada school. The Theravada philosophical enterprise was mostly carried out in the genre of Atthakatha, commentaries (as well as sub-commentaries) on the Pali Abhidhamma, but also included short summaries and compendiums.
The Sarvāstivāda was one of the major Buddhist philosophical schools in India, and they were so named because of their belief that dharmas exist in all three times: past, present and future. Though the Sarvāstivāda Abhidharma system began as a mere categorization of mental events, their philosophers and exegetes such as Dharmatrata and Katyāyāniputra (the compiler of the Mahavibhasa, a central text of the school) eventually refined this system into a robust realism, which also included a type of essentialism. This realism was based on a quality of dharmas, which was called svabhava or 'intrinsic existence'. Svabhava is a sort of essence, though it is not a completely independent essence, since all dharmas were said to be causally dependent. The Sarvāstivāda system extended this realism across time, effectively positing a type of eternalism with regards to time; hence, the name of their school means "the view that everything exists".
Other Buddhist schools such as the Prajñaptivadins ('nominalists'), the Purvasailas and the Vainasikas refused to accept the concept of svabhava. The main topic of the Tattvasiddhi Śāstra by Harivarman (3rd–4th century CE), an influential Abhidharma text, is the emptiness (shunyata) of dharmas.
The Theravādins and other schools such as the Sautrāntikas attacked the realism of the Sarvāstivādins, especially their theory of time. A major figure in this debate was the scholar Vasubandhu, an ex-Sarvāstivādin, who critiqued the theory that 'all exists' and argued for philosophical presentism in his comprehensive treatise, the Abhidharmakosa. This work is the major Abhidharma text used in Tibetan and East Asian Buddhism today. The Theravāda also holds that dharmas only exist in the present, and is thus likewise presentist. The Theravādin presentation of Abhidharma is not as concerned with ontology as the Sarvāstivādin view, but is more of a phenomenology; hence the concept of svabhava for the Theravādins is more a certain characteristic or dependent feature of a dharma than any sort of essence or metaphysical grounding. According to Y. Karunadasa:
In the Pali tradition it is only for the sake of definition and description that each dhamma is postulated as if it were a separate entity; but in reality it is by no means a solitary phenomenon having an existence of its own...If this Abhidhammic view of existence, as seen from its doctrine of dhammas, cannot be interpreted as a radical pluralism, neither can it be interpreted as an out-and-out monism. For what are called dhammas -- the component factors of the universe, both within us and outside us -- are not fractions of an absolute unity but a multiplicity of co-ordinate factors. They are not reducible to, nor do they emerge from, a single reality, the fundamental postulate of monistic metaphysics. If they are to be interpreted as phenomena, this should be done with the proviso that they are phenomena with no corresponding noumena, no hidden underlying ground. For they are not manifestations of some mysterious metaphysical substratum, but processes taking place due to the interplay of a multitude of conditions.
An important theory held by some Sarvāstivādins, Theravādins and Sautrāntikas was the theory of "momentariness" (Skt. kṣāṇavāda, Pali khāṇavāda). This theory held that dhammas only last for a minute moment ("ksana") after they arise. The Sarvāstivādins saw these 'moments' in an atomistic way, as the smallest length of time possible (they also developed a material atomism). Reconciling this theory with their eternalism regarding time was a major philosophical project of the Sarvāstivāda. The Theravādins initially rejected this theory, as evidenced by the Khaṇikakathā of the Kathavatthu, which attempts to refute the doctrine that "all phenomena (dhamma) are as momentary as a single mental entity." However, momentariness with regard to mental dhammas (but not physical or rūpa dhammas) was later adopted by the Sri Lankan Theravādins, and it is possible that it was first introduced there by the scholar Buddhaghosa.
All Abhidharma schools also developed complex theories of causation and conditionality to explain how dharmas interacted with each other. Another major philosophical project of the Abhidharma schools was the explanation of perception. Some schools such as the Sarvastivadins explained perception as a type of phenomenalist realism while others such as the Sautrantikas preferred representationalism and held that we only perceive objects indirectly. The major argument used for this view by the Sautrāntikas was the "time lag argument." According to Mark Siderits: "The basic idea behind the argument is that since there is always a tiny gap between when the sense comes in contact with the external object and when there is sensory awareness, what we are aware of can't be the external object that the senses were in contact with, since it no longer exists." This is related to the theory of extreme momentariness.
One major philosophical view which was rejected by all the schools mentioned above was the view held by the Pudgalavadin or 'personalist' schools. They seem to have held that there was a sort of 'personhood' in some ultimately real sense which was not reducible to the five aggregates. This controversial claim was in contrast to the view of the other Buddhists of the time, who held that a personality was a mere conceptual construction (prajñapti) and only conventionally real.
From about the 1st century BCE, a new textual tradition began to arise in Indian Buddhist thought called Mahāyāna (Great Vehicle), which would slowly come to dominate Indian Buddhist philosophy. Buddhist philosophy thrived in large monastery-university complexes such as Nalanda and Vikramasila, which became centres of learning in North India. Mahāyāna philosophers continued the philosophical projects of Abhidharma while at the same time critiquing them and introducing new concepts and ideas. Since the Mahāyāna held to the pragmatic concept of truth which states that doctrines are regarded as conditionally "true" in the sense of being spiritually beneficial, the new theories and practices were seen as 'skillful means' (Upaya). The Mahayana also promoted the Bodhisattva ideal, which included an attitude of compassion for all sentient beings. The Bodhisattva is someone who chooses to remain in "samsara" (the cycle of birth and death) to benefit all other beings who are suffering.
Major Mahayana philosophical schools and traditions include the Prajnaparamita, Madhyamaka, Tathagatagarbha, the Epistemological school of Dignaga, Yogācāra, Huayan, Tiantai and the Chan/Zen schools.
The earliest Prajñāpāramitā-sutras ("perfection of insight" sutras) (circa 1st century BCE) emphasize the shunyata (emptiness) of phenomena and dharmas. The Prajñāpāramitā is said to be true knowledge of the nature of ultimate reality, which is illusory and empty of essence.
The "Diamond Sutra" states that:
The "Heart Sutra" famously affirms the shunyata of phenomena:
"Oh, Sariputra, form does not differ from shunyata,and shunyata does not differ from form.
Form is shunyata and shunyata is form;
the same is true for feelings,
perceptions, volitions and consciousness".
The Prajñāpāramitā teachings are associated with the work of the Buddhist philosopher Nāgārjuna (c. 150 – c. 250 CE) and the Madhyamaka (Middle way) school. Nāgārjuna was one of the most influential Indian Buddhist thinkers; he gave the classical arguments for the empty nature of phenomena and attacked the Sarvāstivāda and Pudgalavada schools' essentialism in his magnum opus, "The Fundamental Verses on the Middle Way" ("Mūlamadhyamakakārikā"). In the "Mūlamadhyamakakārikā", Nagarjuna relies on reductio ad absurdum arguments to refute various theories which assume svabhava (an inherent essence or "own being"). In this work, he covers topics such as causation, motion, and the sense faculties.
Nagarjuna asserted a direct connection between, even identity of, dependent origination, non-self ("anatta"), and emptiness ("śūnyatā"). He pointed out that implicit in the early Buddhist concept of dependent origination is the lack of anatta (substantial being) underlying the participants in origination, so that they have no independent existence, a state identified as śūnyatā (i.e., emptiness of a nature or essence ("svabhāva sunyam").
Later philosophers of the Madhyamaka school built upon Nagarjuna's analysis and defended Madhyamaka against their opponents. These included Āryadeva (3rd century CE), Nāgārjuna's pupil; Candrakīrti (600–c. 650), who wrote an important commentary on the Mūlamadhyamakakārikā; and Shantideva (8th century). Buddhapālita (470–550) has been understood as the originator of the 'prāsaṅgika' approach which is based on critiquing essentialism only through "reductio ad absurdum" arguments. He was criticized by Bhāvaviveka (c. 500 – c. 578), who argued for the use of syllogisms "to set one's own doctrinal stance". These two approaches were later termed the Prāsaṅgika and the Svātantrika approaches to Madhyamaka by Tibetan philosophers and commentators.
Influenced by the work of Dignaga, Bhāvaviveka's Madhyamika philosophy makes use of Buddhist epistemology. Candrakīrti, on the other hand, critiqued Bhāvaviveka's adoption of the epistemological ("pramana") tradition on the grounds that it contained a subtle essentialism. He cites Nagarjuna's famous statement in the "Vigrahavyavartani", "I have no thesis", in support of his rejection of positive epistemic Madhyamaka statements. Candrakīrti held that a true Madhyamika could only use "consequence" ("prasanga"), in which one points out the inconsistencies of the opponent's position without asserting an "autonomous inference" ("svatantra"), for no such inference can be ultimately true from the point of view of Madhyamaka.
In China, the Madhyamaka school (known as Sānlùn) was founded by Kumārajīva (344–413 CE), who translated the works of Nagarjuna into Chinese. Other Chinese Madhyamakas include Kumārajīva's pupil Sengzhao; Jizang (549–623), who wrote over 50 works on Madhyamaka; and Hyegwan, a Korean monk who brought Madhyamaka teachings to Japan.
The Yogācāra school ("Yoga practice") was a Buddhist philosophical tradition which arose in between the 2nd century CE and the 4th century CE and is associated with the philosophers Asanga and Vasubandhu and with various sutras such as the Sandhinirmocana Sutra and the Lankavatara Sutra. The central feature of Yogācāra thought is the concept of "Vijñapti-mātra", often translated as "impressions only" or "appearance only" and this has been interpreted as a form of Idealism or as a form of Phenomenology. Other names for the Yogacara school are 'Vijñanavada' (the doctrine of consciousness) and 'Cittamatra' (mind-only).
Yogacara thinkers like Vasubandhu argued against the existence of external objects by pointing out that we only ever have access to our own mental impressions, and hence that our inference of the existence of external objects is based on faulty logic. Vasubandhu's "Vijnaptimatratasiddhi", or "The Proof that There Are Only Impressions" (20 verses), begins thus:

"I. This [world] is nothing but impressions, since it manifests itself as an unreal object, just like the case of those with cataracts seeing unreal hairs in the moon and the like."

According to Vasubandhu, then, all our experiences are like seeing hairs on the moon when we have cataracts; that is, we project our mental images onto something "out there" when there are no such things. Vasubandhu then goes on to use the dream argument to argue that mental impressions do not require external objects in order to (1) seem to be spatio-temporally located, (2) seem to have an inter-subjective quality, and (3) seem to operate by causal laws. The fact that purely mental events can have causal efficacy and be intersubjective is proved by the event of a wet dream and by the mass or shared hallucinations created by the karma of certain types of beings.
After having argued that impressions-only is a theory which can explain our everyday experience, Vasubandhu appeals to parsimony: since we do not need the concept of external objects to explain reality, we can do away with those superfluous concepts altogether, as they are most likely just superimposed on our concepts of reality by the mind. Inter-subjective reality for Vasubandhu is then the causal interaction between various mental streams and their karma, and does not include any external physical objects. The soteriological importance of this theory is that, by removing the concept of an external world, it also weakens the 'internal' sense of a self as observer which is supposed to be separate from the external world. To dissolve the dualism of inner and outer is also to dissolve the sense of self and other. The later Yogacara commentator Sthiramati explains this thus:

"There is a grasper if there is something to be grasped, but not in the absence of what is to be grasped. Where there is no thing to be grasped, the absence of a grasper also follows; there is not just the absence of the thing to be grasped. Thus there arises the extra-mundane non-conceptual cognition that is alike without object and without cognizer."

Vasubandhu also attacked the realist theories of Buddhist atomism and the Abhidharma theory of svabhava. He argued that atoms as conceived by the atomists (indivisible entities) would not be able to come together to form larger aggregate entities, and hence were illogical concepts.
Later Yogacara thinkers include Dharmapala of Nalanda, Sthiramati, Chandragomin (who debated Candrakirti), and Śīlabhadra. Yogacarins such as Paramartha and Guṇabhadra brought the school to China and translated Yogacara works there, where it is known as Wéishí-zōng or Fǎxiàng-zōng. An important contribution to East Asian Yogācāra is Xuanzang's "Cheng Weishi Lun", or "Discourse on the Establishment of Consciousness Only".
Jñānagarbha (8th century) and his student Śāntarakṣita (725–788) brought together Yogacara, Madhyamaka and the Dignaga school of epistemology into a philosophical synthesis known as the "Yogācāra-Svatantrika-Mādhyamika". Śāntarakṣita was also instrumental in the introduction of Buddhism and the Sarvastivadin monastic ordination lineage to Tibet, which was conducted at Samye. Śāntarakṣita's disciples included Haribhadra and Kamalaśīla. This philosophical tradition is influential in Tibetan Buddhist thought.
The "tathāgathagarbha sutras", in a departure from mainstream Buddhist language, insist that the potential for awakening is inherent to every sentient being. They marked a shift from a largely apophatic (negative) philosophical trend within Buddhism to a decidedly more cataphatic (positive) modus.
Prior to the period of these scriptures, Mahāyāna metaphysics had been dominated by teachings on emptiness in the form of Madhyamaka philosophy. The language used by this approach is primarily negative, and the "tathāgatagarbha" genre of sutras can be seen as an attempt to state orthodox Buddhist teachings of dependent origination using positive language instead, to prevent people from being turned away from Buddhism by a false impression of nihilism.
In these sutras, the perfection of the wisdom of not-self is stated to be the true self; the ultimate goal of the path is then characterized using a range of positive language that had been used previously in Indian philosophy by essentialist philosophers, but which was now transmuted into a new Buddhist vocabulary to describe a being who has successfully completed the Buddhist path.
The word "self" ("atman") is used in a way idiosyncratic to these sutras; the "true self" is described as the perfection of the wisdom of not-self in the "Buddha-Nature Treatise", for example. Language that had previously been used by essentialist non-Buddhist philosophers was now adopted, with new definitions, by Buddhists to promote orthodox teachings.
The "tathāgatagarbha" does not, according to some scholars, represent a substantial self; rather, it is a positive language expression of emptiness and represents the potentiality to realize Buddhahood through Buddhist practices. In this interpretation, the intention of the teaching of "tathāgatagarbha" is soteriological rather than theoretical.
The "tathāgathagarbha", the Theravāda doctrine of "bhavaṅga", and the Yogācāra store consciousness were all identified at some point with the luminous mind of the Nikāyas.
In the Mahayana "Mahaparinirvana Sutra", the Buddha insists that while pondering upon Dharma is vital, one must then relinquish fixation on words and letters, as these are utterly divorced from liberation and the Buddha-nature.
Dignāga (c. 480–540) and Dharmakīrti (c. 6th–7th century) were Buddhist philosophers who developed a system of epistemology (pramana) and logic in their debates with Brahminical philosophers in order to defend Buddhist doctrine. This tradition is called "those who follow reasoning" (Tibetan: "rigs pa rjes su 'brang ba"); in modern literature it is sometimes known by the Sanskrit "pramāṇavāda", or "the Epistemological School". They were associated with the Yogacara and Sautrantika schools, and defended theories held by both. Dignaga's influence was profound and led to an "epistemological turn" among all Buddhist and also all Sanskrit-language philosophers in India after his death. In the centuries following Dignaga's work, Sanskrit philosophers became much more focused on defending all of their propositions with fully developed theories of knowledge.
The "School of Dignāga" includes later philosophers and commentators like Santabhadra, Dharmottara (8th century), Jñanasrimitra (975–1025), Ratnakīrti (11th century) and Samkarananda. The epistemology they developed defends the view that there are only two 'instruments of knowledge' or 'valid cognitions' ("pramana"): "perception" (pratyaksa) and "inference" (anumāṇa). Perception is a non-conceptual awareness of particulars which is bound by causality, while inference is reasonable, linguistic and conceptual.
These Buddhist philosophers argued in favor of the theory of momentariness, the Yogacara "awareness only" view, the reality of particulars (svalakṣaṇa), atomism, nominalism and the self-reflexive nature of consciousness (svasaṃvedana). They attacked Hindu theories of God (Isvara), universals, the authority of the Vedas, and the existence of a permanent soul ("atman").
The Vajrayana tradition, associated with a group of texts known as the Buddhist Tantras, developed by the eighth century in North India. By this time Tantra was a key feature of Indian Buddhism, and Indian Tantric scholars developed philosophical defenses, hermeneutics and explanations of the Buddhist tantric systems, especially through commentaries on key tantras such as the Guhyasamāja Tantra and the Guhyagarbha Tantra.
While the view of the Vajrayana was based on Madhyamaka, Yogacara and Buddha-nature theories, it saw itself as being a faster vehicle to liberation containing many skillful methods (upaya) of tantric ritual. The need for an explication and defense of the Tantras arose out of the unusual nature of the rituals associated with them, which included the use of secret mantras, alcohol, sexual yoga, complex visualizations of mandalas filled with wrathful deities and other practices and injunctions which were discordant with or at least novel in comparison to traditional Buddhist thought. The Guhyasamāja Tantra, for example, states: "you should kill living beings, speak lying words, take things that are not given and have sex with many women". Other features of tantra included a focus on the physical body as the means to liberation and a reaffirmation of feminine elements, feminine deities and sexuality.
The defense of these practices is based on the theory of transformation which states that negative mental factors and physical actions can be cultivated and transformed in a ritual setting. The Hevajra tantra states:
Those things by which evil men are bound, others turn into means and gain thereby release from the bonds of existence. By passion the world is bound, by passion too it is released, but by heretical Buddhists this practice of reversals is not known.
Another hermeneutic of Buddhist Tantric commentaries, such as the Vimalaprabha of Pundarika (a commentary on the Kalacakra Tantra), is to interpret taboo or unethical statements in the Tantras as metaphorical statements about tantric practice. For example, in the Vimalaprabha, "killing living beings" refers to stopping the prana at the top of the head. In the "Pradipoddyotana", the Tantric Candrakirti's commentary on the Guhyasamaja Tantra, killing living beings is glossed as "making them void" by means of a "special samadhi", which according to Bu-ston is associated with completion stage tantric practice.
Douglas Duckworth notes that the Vajrayana philosophical outlook is one of embodiment, which sees the physical and cosmological body as already containing wisdom and divinity. Liberation (nirvana) and Buddhahood are not seen as something external or as an event in the future, but as imminently present and accessible right now through unique tantric practices like deity yoga; hence Vajrayana is also called the "resultant vehicle". Duckworth characterizes the philosophical view of Vajrayana as a form of pantheism, by which he means the belief that every existing entity is in some sense divine and that all things express some form of unity.
Major Indian Tantric Buddhist philosophers such as Buddhaguhya, Padmavajra (author of the "Guhyasiddhi"), Nagarjuna (7th-century disciple of Saraha), Indrabhuti (author of the "Jñānasiddhi"), Anangavajra, Dombiheruka, Durjayacandra, Ratnākaraśānti and Abhayakaragupta wrote tantric texts and commentaries systematizing the tradition. Others such as Vajrabodhi and Śubhakarasiṃha brought Tantra to Tang China (716 to 720), and tantric philosophy continued to be developed in Chinese and Japanese by thinkers such as Yi Xing and Kūkai. In Tibet, philosophers such as Sakya Pandita (1182–1251), Longchenpa (1308–1364) and Tsongkhapa (1357–1419) continued the tradition of Buddhist Tantric philosophy in Tibetan.
Tibetan Buddhist philosophy is mainly a continuation and refinement of the Indian traditions of Madhyamaka, Yogacara and the Dignaga-Dharmakīrti school of epistemology or "reliable cognition" (Sanskrit: "pramana", Tib. "tshad ma"). The initial efforts of Śāntarakṣita and Kamalaśīla brought their eclectic scholarly tradition to Tibet. Other influences include Buddhist Tantras and the Buddha nature texts.
The initial work of early Tibetan Buddhist philosophers was the translation of classical Indian philosophical treatises and the writing of commentaries; this initial period ran from the 8th to the 10th century. Early Tibetan commentator-philosophers were heavily influenced by the work of Dharmakirti; these include Ngok Lo-dza-wa (1059–1109) and Cha-ba (1109–1169), whose works are now lost. The 12th and 13th centuries saw the translation of the works of Chandrakirti, the promulgation of his views in Tibet by scholars such as Patsab Nyima Drakpa, Kanakavarman and Jayananda (12th century), and the development of the Tibetan debate between the prasangika and svatantrika views, which continues to this day among Tibetan Buddhist schools. The main disagreement between these views concerns the use of reasoned argument. For Śāntarakṣita, Kamalaśīla and their defenders, reason is useful in establishing arguments that lead one to a correct understanding of emptiness; then, through the use of meditation, one can reach non-conceptual gnosis that does not rely on reason. For Chandrakirti, however, this is wrong, because meditation on emptiness cannot possibly involve any object. Reason's role here is to negate any essence or essentialist views, and then eventually to negate itself along with any conceptual proliferation.
There are various Tibetan Buddhist schools or monastic orders. According to Georges B.J. Dreyfus, within Tibetan thought, the Sakya school holds a mostly anti-realist philosophical position, while the Gelug school tends to defend a form of realism. The Kagyu and Nyingma schools also tend to follow Sakya anti-realism (with some differences).
The 14th century saw increasing interest in the Buddha nature texts and doctrines. This can be seen in the work of the third Kagyu Karmapa, Rangjung Dorje (1284–1339), especially his treatise "Profound Inner Meaning". This treatise describes ultimate nature or suchness as Buddha nature, which is the basis for nirvana and samsara, radiant in nature and empty in essence, surpassing thought.
Dolpopa ("Dol-bo-ba", 1292–1361), founder of the Jonang school, developed a view called shentong (Wylie: gzhan ) (other empty), which is closely tied to Yogacara and Buddha-nature theories. This view holds that the qualities of Buddhahood or Buddha nature are already present in the mind, and that it is empty of all conventional reality which occludes its own nature as Buddhahood or Dharmakaya. According to Dolpopa, all beings are said to have Buddha nature, which is real, unchanging, permanent, non-conditioned, eternal, blissful and compassionate. Dolpopa's shentong view taught that ultimate reality was truly a "Great Self" or "Supreme Self" referring to works such as the "Mahāyāna Mahāparinirvāṇa Sūtra", the "Aṅgulimālīya Sūtra" and the "Śrīmālādevī Siṃhanāda Sūtra." This view had an influence on philosophers of other schools, such as Nyingma and Kagyu thinkers, and was also widely criticized in some circles as being similar to the Hindu notions of Atman. The Shentong philosophy was also expounded in Tibet and Mongolia by the later Jonang scholar Tāranātha (1575–1634).
In the late 17th century, the Jonang order and its teachings came under attack by the 5th Dalai Lama, who converted the majority of their monasteries in Tibet to the Gelug order, although several survived in secret.
Je Tsongkhapa (Dzong-ka-ba) (1357–1419) founded the Gelug school of Tibetan Buddhism, which came to dominate the country through the office of the Dalai Lama and is the major defender of the Prasaṅgika Madhyamaka view. His work is influenced by the philosophy of Candrakirti and Dharmakirti. Tsongkhapa's magnum opus is "The Ocean of Reasoning", a commentary on Nagarjuna's "Mulamadhyamakakarika". Gelug philosophy is based upon the study of Madhyamaka texts and Tsongkhapa's works, as well as formal debate ("rtsod pa").
Tsongkhapa defended Prasangika Madhyamaka as the highest view and critiqued the Svatantrika. Tsongkhapa argued that, because Svatantrikas conventionally establish things by their own characteristics, they fail to completely understand the emptiness of phenomena and hence do not achieve the same realization. Drawing on Chandrakirti, Tsongkhapa rejected the Yogacara teachings, even as a provisional stepping stone to the Madhyamaka view. Tsongkhapa was also critical of the Shentong view of Dolpopa, which he saw as dangerously absolutist and hence outside the middle way. Tsongkhapa identified two major flaws in interpretations of Madhyamaka: under-negation (of svabhava, or own-essence), which could lead to absolutism, and over-negation, which could lead to nihilism. Tsongkhapa's solution to this dilemma was to promote the use of inferential reasoning only within the conventional realm of the two truths framework, allowing the use of reason for ethics and conventional monastic rules and promoting a conventional epistemic realism, while holding that, from the view of ultimate truth ("paramarthika satya"), all things (including Buddha nature and Nirvana) are empty of inherent existence (svabhava), and that true enlightenment is the realization of this emptiness.
Sakya scholars such as Rongtön and Gorampa disagreed with Tsongkhapa and argued that the prasangika–svatantrika distinction was merely pedagogical. Gorampa also critiqued Tsongkhapa's realism, arguing that the structures which allow an empty object to be presented as conventionally real eventually dissolve under analysis, and that reality is thus unstructured and non-conceptual ("spros bral"). Tsongkhapa's students Gyel-tsap, Kay-drup, and Ge-dun-drup set forth an epistemological realism against the Sakya scholars' anti-realism.
Sakya Pandita (1182–1251) was a 13th-century head of the Sakya school and ruler of Tibet. He was also one of the most important Buddhist philosophers in the Tibetan tradition, writing works on logic and epistemology and promoting Dharmakirti's "Pramanavarttika" (Commentary on Valid Cognition) as central to scholastic study. Sakya Pandita's 'Treasury of Logic on Valid Cognition' ("Tshad ma rigs pa'i gter") set forth the classic Sakya epistemic anti-realist position, arguing that concepts such as universals are not known through valid cognition and hence are not real objects of knowledge. Sakya Pandita was also critical of theories of sudden enlightenment, which were held by some teachers of the "Chinese Great Perfection" in Tibet.
Later Sakyas such as Gorampa (1429–1489) and Sakya Chokden (1428–1507) would develop and defend Sakya anti-realism, and they are seen as the major interpreters and critics of Sakya Pandita's philosophy. Sakya Chokden also critiqued Tsongkhapa's interpretation of Madhyamaka and Dolpopa's Shentong. In his "Definite Ascertainment of the Middle Way", Chokden criticized Tsongkhapa's view as being too logo-centric and still caught up in conceptualization about the ultimate reality, which is beyond language. Sakya Chokden's philosophy attempted to reconcile the views of the Yogacara and Madhyamaka, seeing them both as valid and complementary perspectives on ultimate truth. Madhyamaka is seen by Chokden as removing the fault of taking the unreal as real, and Yogacara as removing the fault of denying reality. Likewise, the Shentong and Rangtong views are seen as complementary by Sakya Chokden: Rangtong negation is effective in cutting through all clinging to wrong views and conceptual reification, while Shentong is more amenable to describing and enhancing meditative experience and realization. Therefore, for Sakya Chokden, the same realization of ultimate reality can be accessed and described in two different but compatible ways.
The Nyingma school is strongly influenced by the view of Dzogchen (Great Perfection) and the Dzogchen Tantric literature. Longchenpa (1308–1364) was a major philosopher of the Nyingma school who wrote extensively on the Tibetan practice of Dzogchen and on Buddhist Tantra. These works include the "Seven Treasures", the "Trilogy of Natural Ease", and his "Trilogy of Dispelling Darkness". Longchenpa's works provide a philosophical understanding of Dzogchen, a defense of Dzogchen in light of the sutras, as well as practical instructions. For Longchenpa, the ground of reality is luminous clarity, rigpa, or Buddha nature, and this ground is also the bridge between sutra and tantra. Longchenpa's philosophy sought to establish the positive aspects of Buddha-nature thought against the wholly negative theology of Madhyamaka, without straying into the absolutism of Dolpopa. For Longchenpa, the basis for Dzogchen and Tantric practice in Vajrayana is the "Ground" ("gzhi"), the immanent Buddha nature, "the primordially luminous reality that is unconditioned and spontaneously present" which is "free from all elaborated extremes".
The 19th century saw the rise of the Rimé movement (non-sectarian, unbiased), which sought to push back against the politically dominant Gelug school's criticisms of the Sakya, Kagyu, Nyingma and Bon philosophical views and to develop a more eclectic or universal system of textual study. Jamyang Khyentse Wangpo (1820–1892) and Jamgön Kongtrül (1813–1899) were the founders of Rimé. The Rimé movement came to prominence at a point in Tibetan history when the religious climate had become partisan. The aim of the movement was "a push towards a middle ground where the various views and styles of the different traditions were appreciated for their individual contributions rather than being refuted, marginalized, or banned." Philosophically, Jamgön Kongtrül defended Shentong as being compatible with Madhyamaka, while another Rimé scholar, Jamgon Ju Mipham Gyatso (1846–1912), criticized Tsongkhapa from a Nyingma perspective. Mipham argued that the view of the middle way is unity ("zung 'jug"), meaning that from the ultimate perspective the duality of sentient beings and Buddhas is also dissolved. Mipham also affirmed the view of "rangtong" (self-emptiness). The later Nyingma scholar Botrul (1894–1959) classified the major Tibetan Madhyamaka positions as shentong (other-emptiness), Nyingma rangtong (self-emptiness) and Gelug bdentong (emptiness of true existence). The main difference between them is their "object of negation": shentong states that inauthentic experience is empty, rangtong negates any conceptual reference, and bdentong negates any true existence.
The 14th Dalai Lama was also influenced by this eclectic approach. Having studied under teachers from all major Tibetan Buddhist schools, his philosophical position tends to be that the different perspectives on emptiness are complementary:
There is a tradition of making a distinction between two different perspectives on the nature of emptiness: one is when emptiness is presented within a philosophical analysis of the ultimate reality of things, in which case it ought to be understood in terms of a non-affirming negative phenomena. On the other hand, when it is discussed from the point of view of experience, it should be understood more in terms of an affirming negation – 14th Dalai Lama
The schools of Buddhism that had existed in China prior to the emergence of the Tiantai school are generally believed to represent direct transplantations from India, with little modification to their basic doctrines and methods. The Tiantai school, founded by Zhiyi (538–597), was the first truly unique Chinese Buddhist philosophical school. The doctrine of Tiantai was based on the ekayana or "one vehicle" doctrine taught in the Lotus Sutra and sought to bring together all Buddhist teachings and texts into a comprehensively inclusive hierarchical system, which placed the Lotus Sutra at the top of this hierarchy.
Tiantai's metaphysics is an immanent holism, which sees every phenomenon, moment or event as conditioned and manifested by the whole of reality. Every instant of experience is a reflection of every other, and hence suffering and nirvana, good and bad, Buddhahood and evildoing, are all "inherently entailed" within each other. Each moment of consciousness is simply the Absolute itself, infinitely immanent and self-reflecting.
This metaphysics is entailed in the Tiantai teaching of the "three truths", which is an extension of the Mādhyamaka two truths doctrine. The three truths are: the conventional truth of appearance, the truth of emptiness ("shunyata"), and the third truth of 'the exclusive Center' (但中 "danzhong") or middle way, which is beyond conventional truth and emptiness. This third truth is the Absolute and is expressed by the claim that nothing is "Neither-Same-Nor-Different" from anything else; rather, each 'thing' is the absolute totality of all things manifesting as a particular, and everything is mutually contained within each thing. Everything is a reflection of 'The Ultimate Reality of All Appearances' (諸法實相 "zhufashixiang"), and each thought "contains three thousand worlds". This perspective allows the Tiantai school to state such seemingly paradoxical things as "evil is ineradicable from the highest good, Buddhahood." Moreover, in Tiantai, nirvana and samsara are ultimately the same; as Zhiyi writes, "A single, unalloyed reality is all there is – no entities whatever exist outside of it."
Though Zhiyi did write "One thought contains three thousand worlds", this does not entail idealism. According to Zhiyi, "The objects of the [true] aspects of reality are not something produced by Buddhas, gods, or men. They exist inherently on their own and have no beginning" (The Esoteric Meaning, 210). This is then a form of realism, which sees the mind as real as the world, interconnected with and inseparable from it. In Tiantai thought, ultimate reality is simply the phenomenal world of interconnected events or dharmas.
Other key figures of Tiantai thought are Zhanran (711–782) and Siming Zhili (960–1028). Zhanran developed the idea that non-sentient beings have Buddha nature, since they are also a reflection of the Absolute. In Japan, this school was known as Tendai and was first brought to the country by Saichō.
The Huayan school developed the doctrine of "interpenetration" or "coalescence" (Wylie: "zung-'jug"; Sanskrit: "yuganaddha"), based on the "Avataṃsaka Sūtra" (Flower Garland Sutra), a Mahāyāna scripture. Huayan holds that all phenomena (Sanskrit: "dharmas") are deeply interconnected and mutually arising, and that every phenomenon contains all other phenomena. Various metaphors and images are used to illustrate this idea. The first is known as Indra's net. The net is set with jewels which have the extraordinary property that each reflects all of the other jewels, while the reflections also contain every other reflection, ad infinitum. The second image is that of the world text. This image portrays the world as consisting of an enormous text which is as large as the universe itself. The words of the text are composed of the phenomena that make up the world. However, every atom of the world contains the whole text within it. It is the work of a Buddha to bring forth this text so that beings can be liberated from suffering.
Fazang (Fa-tsang, 643–712), one of the most important Huayan thinkers, wrote the 'Essay on the Golden Lion' and the 'Treatise on the Five Teachings', which contain other metaphors for the interpenetration of reality. He also used the metaphor of a house of mirrors. Fazang introduced the distinction between "the Realm of Principle" and "the Realm of Things". This theory was further developed by Cheng-guan (738–839) into the major Huayan thesis of "the fourfold Dharmadhatu" (dharma realm): the Realm of Principle, the Realm of Things, the Realm of the Noninterference between Principle and Things, and the Realm of the Noninterference of All Things. The first two are the universal and the particular, the third is the interpenetration of universal and particular, and the fourth is the interpenetration of all particulars. The third realm was explained by the metaphor of a golden lion: the gold is the universal and the particular is the shape and features of the lion.
While both Tiantai and Huayan hold to the interpenetration and interconnection of all things, their metaphysics have some differences. Huayan metaphysics is influenced by Yogacara thought and is closer to idealism. The Avatamsaka sutra compares the phenomenal world to a dream, an illusion, and a magician's conjuring. The sutra states nothing has true reality, location, beginning and end, or substantial nature. The Avatamsaka also states that "The triple world is illusory – it is only made by one mind", and Fazang echoes this by writing, "outside of mind there is not a single thing that can be apprehended." Furthermore, according to Huayan thought, each mind creates its own world "according to their mental patterns", and "these worlds are infinite in kind" and constantly arising and passing away. However, in Huayan, mind is not real either, but also empty. The true reality in Huayan, the noumenon, or "Principle", is likened to a mirror, while phenomena are compared to reflections in the mirror. It is also compared to the ocean, and phenomena to waves.
In Korea, this school was known as Hwaeom and is represented in the work of Wonhyo (617–686), who also wrote about the idea of essence-function, a central theme in Korean Buddhist thought. In Japan, Huayan is known as Kegon and one of its major proponents was Myōe, who also introduced Tantric practices.
The philosophy of Chinese Chan Buddhism and Japanese Zen is based on various sources; these include Chinese Madhyamaka ("Sānlùn"), Yogacara ("Wéishí"), the Laṅkāvatāra Sūtra, and the Buddha nature texts. An important issue in Chan is that of subitism or "sudden enlightenment", the idea that enlightenment happens all at once in a flash of insight. This view was promoted by Shenhui and is a central issue discussed in the Platform Sutra, a key Chan scripture composed in China.
Huayan philosophy also had an influence on Chan. The theory of the fourfold Dharmadhatu influenced the Five Ranks of Dongshan Liangjie (806–869), the founder of the Caodong Chan lineage. Guifeng Zongmi, who was also a patriarch of Huayan Buddhism, wrote extensively on the philosophy of Chan and on the Avatamsaka sutra.
Japanese Buddhism during the 6th and 7th centuries saw a proliferation of new schools and forms of thought, a period known as the six schools of Nara ("Nanto Rokushū"). The Kamakura period (1185–1333) also saw a flurry of intellectual activity. During this period, the influential figure of Nichiren (1222–1282) made the practice and universal message of the Lotus Sutra more readily available to the population. He is of particular importance in the history of thought and religion, as his teachings constitute a separate sect of Buddhism, one of the only major sects to have originated in Japan.
Also during the Kamakura period, the founder of Soto Zen, Dogen (1200–1253), wrote many works on the philosophy of Zen, and the "Shobogenzo" is his magnum opus. In Korea, Chinul was an important exponent of Seon Buddhism at around the same time.
Tantric Buddhism arrived in China in the 7th century, during the Tang Dynasty. In China, this form of Buddhism is known as Mìzōng (密宗), the "Esoteric School", or "Zhenyan" ("true word", Sanskrit: Mantrayana). Kūkai (774–835 CE) is a major Japanese Buddhist philosopher and the founder of the Tantric Shingon ("true word") school in Japan. He wrote on a wide variety of topics such as public policy, language, the arts, literature, music and religion. After studying in China under Huiguo, Kūkai brought together various elements into the cohesive philosophical system of Shingon.
Kūkai's philosophy is based on the Mahavairocana Tantra and the Vajrasekhara Sutra (both from the seventh century). His "Benkenmitsu nikkyôron" (Treatise on the Differences Between Esoteric and Exoteric Teachings) outlines the difference between exoteric, mainstream Mahayana Buddhism (kengyô) and esoteric Tantric Buddhism (mikkyô). Kūkai provided the theoretical framework for the esoteric Buddhist practices of Mantrayana, bridging the gap between the doctrine of the sutras and tantric practices. At the foundation of Kūkai's thought is the Trikaya doctrine, which holds there are three "bodies of the Buddha".
According to Kūkai, esoteric Buddhism has the Dharmakaya (Jpn: "hosshin", embodiment of truth) as its source, which is associated with Vairocana Buddha (Dainichi). Hosshin is embodied absolute reality and truth. Hosshin is mostly ineffable but can be experienced through esoteric practices such as mudras and mantras. While Mahayana is taught by the historical Buddha (nirmanakaya), it does not have ultimate reality as its source or the practices to experience the esoteric truth. For Shingon, from an enlightened perspective, the whole phenomenal world itself is also the teaching of Vairocana. The body of the world, its sounds and movements, is the body of truth (dharma), and furthermore it is also identical with the personal body of the cosmic Buddha. For Kūkai, world, actions, persons and Buddhas are all part of the cosmic monologue of Vairocana; they are the truth being preached to its own self-manifestations. This is "hosshin seppô" (literally, "the dharmakāya's expounding of the Dharma"), which can be accessed through mantra, the cosmic language of Vairocana emanating through cosmic vibration concentrated in sound. In a broad sense, the universe itself is a huge text expressing ultimate truth (Dharma) which must be "read".
Dainichi means "Great Sun" and Kūkai uses this as a metaphor for the great primordial Buddha, whose teaching and presence illuminates and pervades all, like the light of the sun. This immanent presence also means that every being already has access to enlightenment (hongaku) and Buddha nature, and that, because of this, there is the possibility of "becoming Buddha in this very embodied existence" ("sokushinjôbutsu"). This is achieved because of the non-dual relationship between the macrocosm of Hosshin and the microcosm of the Shingon practitioner.
Kūkai's exposition of what has been called Shingon's "metaphysics" is based on the three aspects of the cosmic truth or Hosshin: body, appearance and function. The body comprises the physical and mental elements, which are the body and mind of the cosmic Buddha and which are also empty ("shunyata"). For Shingon, the physical universe contains the interconnected mental and physical events. The appearance aspect is the form of the world, which appears as mandalas of interconnected realms and is depicted in mandala art such as the Womb Realm mandala. The function is the movement and change which happens in the world, including change in forms, sounds and thought. These forms, sounds and thoughts are expressed by the Shingon practitioner in various rituals and tantric practices which allow them to connect and inter-resonate with Dainichi and hence reach enlightenment here and now.
In Sri Lanka, Buddhist modernists such as Anagarika Dharmapala (1864–1933) and the American convert Henry Steel Olcott sought to show that Buddhism was rational and compatible with modern scientific ideas such as the theory of evolution. Dharmapala also argued that Buddhism included a strong social element, interpreting it as liberal, altruistic and democratic. K. N. Jayatilleke wrote the classic modern account of Buddhist epistemology ("Early Buddhist Theory of Knowledge", 1963) and his student David Kalupahana wrote on the history of Buddhist thought and psychology. Other important Sri Lankan Buddhist thinkers include Ven. Ñāṇananda ("Concept and Reality"), Walpola Rahula, Hammalawa Saddhatissa ("Buddhist Ethics", 1987), Gunapala Dharmasiri ("A Buddhist Critique of the Christian Concept of God", 1988), P. D. Premasiri and R. G. de S. Wettimuny.
In 20th-century China, the modernist Taixu (1890–1947) advocated a reform and revival of Buddhism. He promoted an idea of a Buddhist Pure Land, not as a metaphysical place in Buddhist cosmology but as something possible to create here and now in this very world, achievable through a "Buddhism for Human Life" free of supernatural beliefs. Taixu also wrote on the connections between modern science and Buddhism, ultimately holding that "scientific methods can only corroborate the Buddhist doctrine, they can never advance beyond it". Like Taixu, Yin Shun (1906–2005) advocated a form of Humanistic Buddhism grounded in concern for humanitarian issues, and his students and followers have been influential in promoting Humanistic Buddhism in Taiwan. This period also saw a revival of the study of Weishi (Yogachara) by Yang Renshan (1837–1911), Ouyang Jingwu (1871–1943) and Liang Shuming (1893–1988).
One of Tibetan Buddhism's most influential modernist thinkers is Gendün Chöphel (1903–1951), who, according to Donald S. Lopez Jr., "was arguably the most important Tibetan intellectual of the twentieth century." Gendün Chöphel traveled throughout India with the Indian Buddhist Rahul Sankrityayan and wrote a wide variety of material, including works promoting the importance of modern science to his Tibetan countrymen and also Buddhist philosophical texts such as "Adornment for Nagarjuna’s Thought". Another very influential Tibetan Buddhist modernist was Chögyam Trungpa, whose Shambhala Training was meant to be more suitable to modern Western sensitivities by offering a vision of "secular enlightenment".
In Southeast Asia, thinkers such as Buddhadasa, Thích Nhất Hạnh, Sulak Sivaraksa and Aung San Suu Kyi have promoted a philosophy of socially Engaged Buddhism and have written on the socio-political application of Buddhism. Likewise, Buddhist approaches to economic ethics (Buddhist economics) have been explored in the works of E. F. Schumacher, Prayudh Payutto, Neville Karunatilake and Padmasiri de Silva. The study of the Pali Abhidhamma tradition continued to be influential in Myanmar, where it was developed by monks such as Ledi Sayadaw and Mahasi Sayadaw.
Japanese Buddhist philosophy was heavily influenced by the work of the Kyoto School, which included Kitaro Nishida, Keiji Nishitani, Hajime Tanabe and Masao Abe. These thinkers brought Buddhist ideas into dialogue with Western philosophy, especially European phenomenologists and existentialists. The most important trend in Japanese Buddhist thought after the formation of the Kyoto School is Critical Buddhism, which argues against several Mahayana concepts such as Buddha nature and original enlightenment. In Nichiren Buddhism, the work of Daisaku Ikeda has also been popular.
The Japanese Zen Buddhist D.T. Suzuki (1870–1966) was instrumental in bringing Zen Buddhism to the West, and his Buddhist modernist works were very influential in the United States. Suzuki's worldview was a Zen Buddhism influenced by Romanticism and Transcendentalism, which promoted spiritual freedom as "a spontaneous, emancipatory consciousness that transcends rational intellect and social convention." This idea of Buddhism influenced the Beat writers, and a contemporary representative of Western Buddhist Romanticism is Gary Snyder. The American Theravada Buddhist monk Thanissaro Bhikkhu has critiqued 'Buddhist Romanticism' in his writings.
Western Buddhist monastics and priests such as Nanavira Thera, Bhikkhu Bodhi, Nyanaponika Thera, Robert Aitken, Taigen Dan Leighton, and Matthieu Ricard have written texts on Buddhist philosophy. A feature of Buddhist thought in the West has been a desire for dialogue and integration with modern science and psychology, and various modern Buddhists such as Alan Wallace, James H. Austin, Mark Epstein and the 14th Dalai Lama have worked and written on this issue. Another area of convergence has been Buddhism and environmentalism, which is explored in the work of Joanna Macy. Another Western Buddhist philosophical trend has been the project to secularize Buddhism, as seen in the works of Stephen Batchelor.
In the West, comparative philosophy between Buddhist and Western thought began with the work of Charles A. Moore, who founded the journal "Philosophy East and West". Contemporary Western academics such as Mark Siderits, Jan Westerhoff, Jonardon Ganeri, Miri Albahari, Owen Flanagan, Damien Keown, Tom Tillemans, David Loy, Evan Thompson and Jay Garfield have written various works which interpret Buddhist ideas through Western philosophy.
Scholars such as Thomas McEvilley, Christopher I. Beckwith, and Adrian Kuzminski have identified cross-influences between ancient Buddhism and the ancient Greek philosophy of Pyrrhonism. The Greek philosopher Pyrrho spent 18 months in India as part of Alexander the Great's court during Alexander's conquest of western India, where ancient biographers say his contact with the gymnosophists led him to create his philosophy. Because of the high degree of similarity between Nāgārjuna's philosophy and Pyrrhonism, particularly the surviving works of Sextus Empiricus, Thomas McEvilley suspects that Nāgārjuna was influenced by Greek Pyrrhonist texts imported into India.
Baruch Spinoza, though he argued for the existence of a permanent reality, asserted that all phenomenal existence is transitory. In his opinion, sorrow is conquered "by finding an object of knowledge which is not transient, not ephemeral, but is immutable, permanent, everlasting." The Buddha taught that the only thing which is eternal is Nirvana. David Hume, after a relentless analysis of the mind, concluded that consciousness consists of fleeting mental states. Hume's bundle theory is a very similar concept to the Buddhist "skandhas", though his skepticism about causation led him to opposite conclusions in other areas. Arthur Schopenhauer's philosophy parallels Buddhism in his affirmation of asceticism and renunciation as a response to suffering and desire.
Ludwig Wittgenstein's concept of the "language-game" closely parallels the warning that intellectual speculation, or "papañca", is an impediment to understanding, as found in the Buddhist "Parable of the Poison Arrow". Friedrich Nietzsche, although himself dismissive of Buddhism as yet another nihilism, had a similarly impermanent view of the self. Heidegger's ideas on being and nothingness have been held by some to be similar to Buddhist ideas.
An alternative approach to the comparison of Buddhist thought with Western philosophy is to use the concept of the Middle Way in Buddhism as a critical tool for the assessment of Western philosophies. In this way, Western philosophies can be classified in Buddhist terms as eternalist or nihilist. In a Buddhist view, all philosophies are considered non-essential views ("ditthis") and not to be clung to.
Billy Bob Thornton
Billy Bob Thornton (born August 4, 1955) is an American actor, writer, director, and musician.
Thornton had his first break when he co-wrote and starred in the 1992 thriller "One False Move", and received international attention after writing, directing, and starring in the independent drama film "Sling Blade" (1996), for which he won an Academy Award for Best Adapted Screenplay and was nominated for an Academy Award for Best Actor. He appeared in several major film roles in the 1990s following "Sling Blade", including Oliver Stone's neo-noir "U Turn" (1997), political drama "Primary Colors" (1998), science fiction disaster film "Armageddon" (1998), the highest-grossing film of that year, and the crime drama "A Simple Plan" (1998), which earned him his third Oscar nomination.
In the 2000s, Thornton achieved further success starring in the dramas "Monster's Ball" (2001), "The Man Who Wasn't There" (2001), and "Friday Night Lights" (2004); the comedies "Bandits" (2001), "Intolerable Cruelty" (2003), and "Bad Santa" (2003); and the action films "Eagle Eye" (2008) and "Faster" (2010). In 2014, Thornton starred as Lorne Malvo in the first season of the anthology series "Fargo", earning a nomination for Outstanding Lead Actor in a Miniseries or TV Movie at the Emmy Awards and winning Best Actor in a Miniseries or TV Film at the 72nd Golden Globe Awards. In 2016, he starred in an Amazon original series, "Goliath", which earned him a Golden Globe Award for Best Actor – Television Series Drama.
Thornton has been vocal about his distaste for celebrity culture, choosing to keep his life out of the public eye. However, the attention of the media has proven unavoidable in certain cases, his marriage to Angelina Jolie being a notable example. Thornton has written a variety of films, usually set in the Southern United States and mainly co-written with Tom Epperson, including "A Family Thing" (1996) and "The Gift" (2000). After "Sling Blade", he directed several other films, including "Daddy and Them" (2001), "All the Pretty Horses" (2000), and "Jayne Mansfield's Car" (2012).
Thornton has received the President's Award from the Academy of Science Fiction, Fantasy & Horror Films, a Special Achievement Award from the National Board of Review, and a star on the Hollywood Walk of Fame. He has also been nominated for an Emmy Award, four Golden Globes, and three Screen Actors Guild Awards. In addition to film work, Thornton began a career as a singer-songwriter. He has released four solo albums and is the vocalist of the blues rock band The Boxmasters.
Billy Bob Thornton was born on August 4, 1955, in Hot Springs, Arkansas, the son of Virginia Roberta (née Faulkner; died July 29, 2017), a self-proclaimed psychic, and William Raymond "Billy Ray" Thornton (November 1929 – August 1974), a high school history teacher and basketball coach. His brother Jimmy Don (April 1958 – October 1988) wrote a number of songs; Thornton recorded two of them ("Island Avenue" and "Emily") on his solo albums. He is of part Irish descent. He has another brother, John David Thornton.
Thornton lived in numerous places in Arkansas during his childhood, including Alpine, Malvern, and Mount Holly. He was raised Methodist in an extended family in a shack that had no electricity or plumbing. He graduated from Malvern High School in 1973. A good high school baseball player, he tried out for the Kansas City Royals, but was released after an injury. After a short period laying asphalt for the Arkansas State Transportation Department, he attended Henderson State University to pursue studies in psychology but dropped out after two semesters.
In the mid-1980s, Thornton settled in Los Angeles to pursue his career as an actor, with future writing partner Tom Epperson. He had a difficult time succeeding as an actor and worked in telemarketing, offshore wind farming, and fast food management between auditioning for acting jobs. He also played the drums and sang with South African rock band Jack Hammer. While working as a waiter for an industry event, he served film director and screenwriter Billy Wilder. He struck up a conversation with Wilder, who advised Thornton to consider a career as a screenwriter.
Thornton's first screen role was in the 1980s film "South of Reno", in which he played a small role as a counter man in a restaurant. He also made an appearance as a pawn store clerk in the 1987 "Matlock" episode "The Photographer". Other early screen roles include a part as a cast member on the CBS sitcom "Hearts Afire", and in 1989 he appeared as an angry heckler in Adam Sandler's debut film "Going Overboard". His role as the villain in 1992's "One False Move", which he also co-wrote, brought him to the attention of critics. He also had small roles in the 1990s films "Indecent Proposal", "On Deadly Ground", "Bound by Honor", and "Tombstone". He went on to write, direct, and star in the 1996 independent film "Sling Blade". The film, an expansion of the short film "Some Folks Call It a Sling Blade", introduced the story of a mentally handicapped man imprisoned for a gruesome and seemingly inexplicable murder.
"Sling Blade" garnered international acclaim. Thornton's screenplay earned him an Academy Award for Best Adapted Screenplay, a Writers Guild of America Award, and an Edgar Award, while his performance received Oscar and Screen Actors Guild nominations for Best Actor. In 1998, Thornton portrayed the James Carville-like Richard Jemmons in "Primary Colors". He adapted the book "All the Pretty Horses" into a 2000 film of the same name. The negative experience (he was forced to cut more than an hour of footage) led to his decision to never direct another film; a subsequent release, "Daddy and Them", had been filmed earlier. Also in 2000, an early script which he and Tom Epperson wrote together was made into "The Gift".
In 2000, Thornton appeared in Travis Tritt's music video for the song "Modern Day Bonnie and Clyde". His screen persona has been described by the press as that of a "tattooed, hirsute man's man". He appeared in several major film roles following the success of "Sling Blade", including 1998's "Armageddon" and "A Simple Plan". In 2001, he directed "Daddy and Them" while securing starring roles in three Hollywood films: "Monster's Ball", "Bandits", and "The Man Who Wasn't There", for which he received many awards.
Thornton played a malicious mall Santa in 2003's "Bad Santa", a black comedy that performed well at the box office and established him as a leading comic actor, and in the same year, portrayed a womanizing President of the United States in the British romantic comedy film "Love Actually". He stated that, following the success of "Bad Santa", audiences "like to watch him play that kind of guy" and that "casting directors call him up when they need an asshole". He referred to this when he said that "it's kinda that simple... you know how narrow the imagination in this business can be".
In 2004, Thornton played David Crockett in "The Alamo". Later that year, on October 7, he received a star on the Hollywood Walk of Fame. He appeared in the 2006 comic film "School for Scoundrels", playing a self-help doctor, a role written specifically for him. More recent films include the 2007 drama "The Astronaut Farmer" and the comedy "Mr. Woodcock", in which he played a sadistic gym teacher. In September 2008, he starred in the action film "Eagle Eye". He has also expressed an interest in directing another film, possibly a period piece about cave explorer Floyd Collins, based on the book "Trapped! The Story of Floyd Collins".
In 2014, Thornton starred as sociopathic hitman Lorne Malvo in the FX miniseries "Fargo", inspired by the 1996 film of the same name, for which he won a Golden Globe for Best Actor in a Mini-Series.
Thornton made a guest appearance on "The Big Bang Theory" in 2014, where he played a middle-aged urologist who gets excited about every woman who touches him.
"Goliath", a television series by Amazon Studios, features Thornton as a formerly brilliant and personable lawyer, who is now washed up and alcoholic. It premiered on October 13, 2016, on Amazon Video. On February 15, 2017, Amazon announced the series had been renewed for a second season.
In 2017, Thornton starred in the music video for "Stand Down" by Kario Salem (who records as K.O.). The video received the Best Music Video award from the Toronto Shorts International Film Festival and has received more than 13 million views on Facebook.
From the time he was 10 years old, Thornton has been in bands. His first performance was on drums at a school PTA meeting where his band played "The Ballad of The Green Berets" instrumentally. Several bands followed, with Thornton's first recording experience coming at Widget Sound in Muscle Shoals, Alabama in 1974. Later in the 1970s, Thornton was the drummer of a blues rock band named "Tres Hombres". Guitarist Billy Gibbons referred to the band as "the best little cover band in Texas", and Thornton bears a tattoo with the band's name on it.
In 1985, Thornton joined Piet Botha in the South African rock band Jack Hammer, while Botha worked in Los Angeles. Thornton recorded one studio album with Jack Hammer, "Death of a Gypsy", which was released in 1986.
In 2001, Thornton released an album titled "Private Radio" on Lost Highway Records. Subsequent albums include "The Edge of the World" (2003), "Hobo" (2005) and "Beautiful Door" (2007). He performed the Warren Zevon song "The Wind" on a tribute album to Zevon. Thornton recorded a cover of the Johnny Cash classic "Ring of Fire" with Earl Scruggs for the "Oxford American" magazine's Southern Music CD in 2001. The song also appeared on Scruggs' 2001 album "Earl Scruggs and Friends".
In 2007, Thornton formed The Boxmasters with J.D. Andrew.
Thornton has been married six times. The first five marriages ended in divorce, and he has four children by three women.
From 1978 to 1980, he was married to Melissa Lee Gatlin, who in her divorce petition cited "incompatibility and adultery on his part". They had a daughter, Amanda (Brumfield), who in 2008 was sentenced to 20 years in prison for the death of her friend's 1-year-old daughter.
Thornton married actress Toni Lawrence in 1986; they separated the following year and divorced in 1988.
From 1990 to 1992, he was married to actress Cynda Williams, whom he cast in his writing debut, "One False Move" (1992).
In 1993, Thornton married "Playboy" model Pietra Dawn Cherniak, with whom he had two sons, Harry James and William. The marriage ended in 1997, with Cherniak accusing Thornton of spousal abuse, sometimes in front of his children.
Thornton was engaged to be married to actress Laura Dern, whom he dated from 1997 to 1999, but in 2000, he married actress Angelina Jolie, with whom he starred in "Pushing Tin" (1999) and who was 20 years his junior. The marriage became known for the couple's eccentric displays of affection, which reportedly included wearing vials of each other's blood around their necks; Thornton later clarified that the "vials" were actually two small lockets, each containing only a single drop of blood. Thornton and Jolie announced the adoption of a child from Cambodia in March 2002, but it was later revealed that Jolie had adopted the child as a single parent. They separated in June 2002 and divorced the following year.
In 2003, Thornton began a relationship with makeup effects crew member Connie Angland, with whom he has a daughter named Bella. They reside in Los Angeles, California. Although he once said that he likely would not marry again, saying that he believes marriage "doesn't work" for him, his representatives confirmed that he and Angland were married on October 22, 2014, in Los Angeles.
During his early years in Los Angeles, Thornton was admitted to a hospital and diagnosed with myocarditis. He has since said that he follows a vegan diet and is "extremely healthy", eating no junk food as he is allergic to wheat and dairy.
Thornton suffers from obsessive–compulsive disorder (OCD). Various idiosyncratic behaviors have been well documented in interviews with Thornton; among these is a phobia of antique furniture, which is shared by Dwight Yoakam's character Doyle Hargraves in the Thornton-penned "Sling Blade" and by Thornton's own character in the 2001 film "Bandits". Additionally, he has stated that he has a fear of certain types of silverware, a trait assumed by his character in 2001's "Monster's Ball", in which Grotowski insists on a plastic spoon for his daily bowl of ice cream.
In a 2004 interview with "The Independent", Thornton explained, "It's just that I won't use real silver. You know, like the big, old, heavy-ass forks and knives, I can't do that. It's the same thing as the antique furniture. I just don't like old stuff. I'm creeped out by it, and I have no explanation why ... I don't have a phobia about American antiques, it's mostly French—you know, like the big, old, gold-carved chairs with the velvet cushions. The Louis XIV type. That's what creeps me out. I can spot the imitation antiques a mile off. They have a different vibe. Not as much dust."
Thornton is a baseball fan; his favorite team is the St. Louis Cardinals, and he has said that his childhood dream was to play for them. He narrated "The 2006 World Series Film", the year-end retrospective DVD chronicling the Cardinals' championship season. He is also a professed fan of the Indianapolis Colts football team.
The Big O
The television series was designed as a tribute to Japanese and Western shows from the 1960s and 1970s. The series is presented in the style of "film noir" and combines themes of detective fiction and mecha anime. The setpieces are reminiscent of "tokusatsu" productions of the 1950s and 1960s, particularly Toho's "kaiju" movies, and the score is an eclectic mix of styles and musical homages.
"The Big O" aired on WOWOW satellite television from October 13, 1999, and January 19, 2000. The English-language version premiered on Cartoon Network on April 2, 2001, and ended on April 18, 2001. Originally planned as a 26-episode series, low viewership in Japan reduced production to the first 13. Positive international reception resulted in a second season consisting of the remaining 13 episodes; co-produced by Cartoon Network, Sunrise, and Bandai Visual. Season two premiered on Japan's SUN-TV on January 2, 2003, and the American premiere took place seven months later. Following the closure of Bandai Entertainment by parent company Bandai (owned by Bandai Namco Holdings) in 2012, Sunrise announced at Otakon 2013 that Sentai Filmworks rescued both seasons of "The Big O".
"The Big O" is set in the fictional city-state of . The city is located on a seacoast and is surrounded by a vast desert wasteland. The partially domed city is wholly controlled by the monopolistic Paradigm Corporation, resulting in a corporate police state. Paradigm is known as because of forty years prior to the story, " destroyed the world outside the city and left the survivors without any prior memories. The city is characterized by severe class inequity; the higher-income population resides inside the more pleasant domes, with the remainder left in tenements outside. Residents of the city believe that they are the last survivors of the world and no other nations exist outside the city. Androids and giant robots known as "Megadeus" coexist with the residents of Paradigm City and do not find them unusual.
After failing to negotiate with terrorists at the cost of his client's life, Roger Smith, a professional negotiator and the pilot of the megadeus Big O, is obligated to care for Dorothy Wayneright, a young female android. Over the course of the series, Roger continues to accept negotiation work from the residents of Paradigm City; this work often leads him to uncover the nature and mysteries of Paradigm City and to encounter megadeuses and other giant enemies that he must confront with Big O. Supporting characters are Angel, a mysterious woman in search of memories; Dan Dastun, chief of the military police of Paradigm City and an old friend of Roger Smith; and Norman Burg, Roger Smith's butler and the mechanic of Big O.
The main antagonist is Alex Rosewater, chairman of the Paradigm Corporation, whose goal is to revive the megadeus "Big Fau" in an attempt to become the god of Paradigm City. Other recurring antagonists are Jason Beck, a criminal and con artist attempting to humiliate Roger Smith; Schwarzwald, an ex-reporter obsessed with finding the truth of Paradigm City and the pilot of the megadeus "Big Duo"; Vera Ronstadt, leader of a group of foreigners known as the Union, searching for memories and revenge against Paradigm City; and Alan Gabriel, a cyborg assassin working for Alex Rosewater and the Union.
The series ends with the awakening of a new megadeus and the revelation that the world is a simulated reality. A climactic battle ensues between Big O and Big Fau, after which reality is systematically erased by the new megadeus, an incarnation of Angel, recognized as "Big Venus" by Dorothy. Roger implores Angel to "let go of the past", regardless of its existential reality, and to focus only on the present and the future. In an isolated control room, the real Angel observes Roger and her past encounters with him on a series of television monitors. On the control panel lies "Metropolis", a book featured prominently since the thirteenth episode, whose cover features an illustration of angel wings and gives the author's name as "Angel Rosewater". Big Venus and Big O physically merge, causing the virtual reality to reset. The final scene shows Roger Smith driving through a restored Paradigm City, with Dorothy and Angel observing him from the side of the road.
Development of the retro-styled series began in 1996. Keiichi Sato came up with the concept of "The Big O": a giant city-smashing robot, piloted by a man in black, in a Gotham-like environment. He later met up with Kazuyoshi Katayama, who had just finished directing "Those Who Hunt Elves", and started work on the layouts and character designs. But when things "were about to really start moving," production on Katayama's "Sentimental Journey" began, putting plans on hold. Meanwhile, Sato was heavily involved with his work on "City Hunter".
Sato admits it all started as "a gimmick for a toy", but the representatives at Bandai's Hobby Division did not see the same potential. From then on, the dealings would be with Bandai Visual, but Sunrise still needed some safeguards and requested that more robots be designed to increase prospective toy sales. In 1999, with the designs complete, Chiaki J. Konaka was brought on as head writer. Among other things, Konaka came up with the idea of "a town without memory", and his writing staff put together the outline for a 26-episode series.
"The Big O" premiered on 13 October 1999 on WOWOW. When the production staff was informed the series would be shortened to 13 episodes, the writers decided to end it with a cliffhanger, hoping the next 13 episodes would be picked up. In April 2001, "The Big O" premiered on Cartoon Network's Toonami lineup.
The series garnered a positive fan response internationally, which resulted in a second season co-produced by Cartoon Network and Sunrise. Season two premiered on Japan's SUN-TV on January 2, 2003, with the American premiere taking place seven months later as an Adult Swim exclusive. The second season would not be seen on Toonami until July 27, 2013, ten years after it began airing on Adult Swim.
The second season was scripted by Chiaki J. Konaka with input from the American producers. Cartoon Network made two requests for the second season: more action, and answers to the mysteries raised in the first season; Kazuyoshi Katayama admitted that he had never intended to reveal them, preferring to make an anthology of adventures set in the same universe. Along with the 13 episodes of season two, Cartoon Network had an option for 26 additional episodes to be written by Konaka, but according to Jason DeMarco, executive producer for season two, middling ratings and DVD sales in the United States and Japan made any further episodes impossible to produce.
Following the closure of Bandai Entertainment by parent company Bandai in 2012, Sunrise announced at Otakon 2013 that Sentai Filmworks had rescued both seasons of "The Big O". On June 20, 2017, Sentai Filmworks released both seasons on Blu-ray.
"The Big O" was scored by "Geidai" alumnus Toshihiko Sahashi. His composition is richly symphonic and classical, with a number of pieces delving into electronica and jazz. Chosen because of his "frightening amount of musical knowledge about TV dramas overseas," Sahashi integrates musical homages into the soundtrack. The background music draws from "film noir", spy films and sci-fi television series like "The Twilight Zone". The battle themes are reminiscent of Akira Ifukube's compositions for the "Godzilla" series.
The first opening theme is the Queen-influenced "Big-O!". Composed, arranged and performed by Rui Nagai, the song resembles the theme to the "Flash Gordon" film. The second opening theme is "Respect", composed by Sahashi; the track is an homage to Barry Gray's music for Gerry Anderson's "UFO". In 2007, Rui Nagai composed "Big-O! Show Must Go On", a 1960s-style hard rock piece, for Animax's reruns of the show. The closing theme is the slow love ballad "And Forever...", written by Chie and composed by Ken Shima; the duet is performed by Robbie Danzie and Naoki Takao.
Along with Sahashi's original compositions, the soundtrack features Chopin's Prelude No. 15 and a jazz saxophone rendition of "Jingle Bells." The complete score was released in two volumes by Victor Entertainment.
"The Big O" is the brainchild of Keiichi Sato and Kazuyoshi Katayama, an homage to the shows they grew up with. The show references the works of "tokusatsu" produced by the Toei Company and Tsuburaya Productions, as well as shows such as "Super Robot Red Baron" and "Super Robot Mach Baron" and "old school" super robot anime. The series is done in the style of "film noir" and pulp fiction and combines the feel of a detective show with the giant robot genre.
"The Big O" shares much of its themes, diction, archetypes and visual iconography with "film noirs" of the 1940s like "The Big Sleep" (1946). The series incorporates the use of long dark shadows in the tradition of "chiaroscuro" and tenebrism. "Film noir" is also known for its use of odd angles, such as Roger's low shot introduction in the first episode. "Noir" cinematographers favoured this angle because it made characters almost rise from the ground, giving them dramatic girth and symbolic overtones. Other disorientating devices like dutch angles, mirror reflection and distorting shots are employed throughout the series.
The characters of "The Big O" fit the "noir" and pulp fiction archetypes. Roger Smith is a protagonist in the mold of Chandler's Philip Marlowe or Hammett's Sam Spade: canny and cynical, a disillusioned cop-turned-negotiator whose job has more in common with detective work than negotiating. Big Ear is Roger's street informant and Dan Dastun is his friend on the police force. The recurring Beck is the imaginative thug compelled by delusions of grandeur, while Angel fills the role of the "femme fatale". Minor characters include crooked cops, corrupt businessmen and deranged scientists.
The dialogue in the series is recognized for its witty, wry sense of humor. The characters come off as charming and exchange banter not often heard in anime series, where dialogue tends to be straightforward. The plot is moved along by Roger's voice-over narration, a device used in "film noir" to place viewers in the mind of the protagonist so that they can intimately experience the character's angst and partly identify with the narrator.
The tall buildings and giant domes create a sense of claustrophobia and paranoia characteristic of the style. The rural landscape of Ailesberry Farm contrasts with Paradigm City; "noir" protagonists often look for sanctuary in such settings, but these places are just as likely to end up as killing grounds. The series' score is representative of its setting: while no classic "noir" possesses a jazz score, such music could be heard in the nightclubs within the films. Roger's recurring theme, a lone saxophone accompanying the protagonist's narration, best exemplifies the "noir" stylings of the series.
Amnesia is a common plot device in "film noir". Because most of these stories focused on a character proving his innocence, authors upped the ante by making him an amnesiac, unable to prove his innocence even to himself.
Before "The Big O", Sunrise was a subcontractor for Warner Bros. Animation's "", one of the series' influences. Cartoon Network, under the Toonami flag advertised the series as "One part Bond. One part Bruce Wayne. One part City Smashing Robot."
Roger Smith is a pastiche of the Bruce Wayne persona and of Batman. The character design resembles Wayne, complete with slicked-back hair and a double-breasted business suit. Like Bruce, Roger prides himself on being a rich playboy, to the extent that one of his household rules is that only women may be let into his mansion without his permission. Like Batman, Roger maintains a no-gun policy, albeit a more flexible one; unlike Batman's personal motives, Roger enforces the rule because "it's all part of being a gentleman". Among Roger's gadgetry are the Griffon, a large black hi-tech sedan comparable to the Batmobile; a grappling cable that shoots out of his wristwatch; and the giant robot that Angel calls "Roger's alter ego".
"The Big O"'s cast of supporting characters includes Norman, Roger's faithful mechanically-inclined butler who fills the role of Alfred Pennyworth; R. Dorothy Wayneright, who plays the role of the sidekick; and Dan Dastun, a good honest cop who, like Jim Gordon, is both a friend to the hero and greatly respected by his comrades.
The other major influence is Mitsuteru Yokoyama's "Giant Robo". Before working on "The Big O", Kazuyoshi Katayama and other animators worked with Yasuhiro Imagawa on "Giant Robo: The Day the Earth Stood Still". The feature, a "retro chic" homage to Yokoyama's career, took seven years to produce and suffered low sales and high running costs. Frustrated by the experience, Katayama and his staff put all their efforts into making good with "The Big O".
Like "Giant Robo", the megadeuses of "The Big O" are metal behemoths. The designs are strange and "more macho than practical", sporting big stovepipe arms and exposed rivets. Unlike the giants of other mecha series, the megadeuses exhibit neither ninja-like speed nor grace; instead, the robots are armed with "old school" weaponry such as missiles, piston-powered punches, machine guns and laser cannons.
Katayama also cited "Super Robot Red Baron" and "Super Robot Mach Baron" as influences on "The Big O". Because "Red Baron" had such a low budget, its big fights always happened outside of a city setting; Katayama wanted "Big O" to be the show he felt "Red Baron" could have been with a bigger budget. He also spoke of how he came up with the robot designs first, as if pitching them to toy companies, rather than how "Gundam" was created, with a toy company wanting an anime to represent its new product. Big O's large pumping-piston "Sudden Impact" arms, for example, he felt would make cool gimmicks in a toy.
"The Big O" was conceived as a media franchise. To this effect, Sunrise requested a manga be produced along with the animated series. "The Big O" manga started serialization in Kodansha's "Magazine Z" on July 1999, three months before the anime premiere. Authored by Hitoshi Ariga, the manga uses Keiichi Sato's concept designs in an all-new story. The series ended on October 2001. The issues were later collected in six volumes. The English version of the manga is published by Viz Media.
In anticipation of the broadcast of the second season, a new manga series, "The Big O: Lost Memory", authored by Hitoshi Ariga, was published. "Lost Memory" takes place between volumes five and six of the original manga. The issues were serialized in "Magazine Z" from November 2002 to September 2003 and were collected in two volumes. A tie-in novel by Yuki Taniguchi was released on July 16, 2003, by Tokuma Shoten.
"The Big O Visual: The official companion to the TV series" () was published by Futabasha in 2003. The book contains full-color artwork, character bios and concept art, mecha sketches, video/LD/DVD jacket illustrations, history on the making of The Big O, staff interviews, "Roger's Monologues" comic strip and the original script for the final episode of the series.
"Walking Together On The Yellow Brick Road" was released by Victor Entertainment on 21 September 2000. The drama CD was written by series head writer Chiaki J. Konaka and featured the series' voice cast. An English translation, written by English dub translator David Fleming, was posted on Konaka's website.
The first season of "The Big O" is featured in "Super Robot Wars D" for the Game Boy Advance in 2003. The series, including its second season, is also featured in "Super Robot Wars Z", released in 2008. "The Big O" became a mainstay of the "Z" games, appearing in each entry of the subseries.
Bandai released a non-scale model kit of Big O in 2000. Though it was an easy snap-together kit, it required painting, as all of the parts (except the clear orange crown and canopy) were molded in dark gray. The kit included springs that enabled the slide-action Side Piles on the forearms to simulate Big O's Sudden Impact maneuver. Also included was an unpainted Roger Smith figure.
PVC figures of Big O and Big Duo (Schwarzwald's Megadeus) were sold by Bandai America. Each came with non-poseable figures of Roger, Dorothy and Angel. Mini-figure sets were sold in Japan and America during the run of the second season. The characters included Big O (standard and attack modes), Roger, Dorothy & Norman, Griffon (Roger's car), Dorothy-1 (Big O's first opponent), Schwarzwald and Big Duo.
In 2009, Bandai released a plastic/diecast figure of the Big O under their Soul of Chogokin line. The figure has the same features as the model kit, but with added detail and accessories. Its design was closely supervised by original designer Keiichi Sato.
In 2011, Max Factory released action figures of Roger and Dorothy through their Figma toyline. Like most Figmas, they are highly detailed and articulated, and come with accessories and interchangeable faces. The same year, Max Factory also released a 12-inch diecast figure of Big O under their Max Gokin line. The figure contains most of the same accessories as the Soul of Chogokin figure, but adds some that had to be bought separately for the SOC release, such as the Mobydick (hip) Anchors and Roger Smith's car, the Griffon. Like the Soul of Chogokin figure, its design was supervised by Keiichi Sato. Also that year, Max Factory released soft vinyl figures of Big Duo and Big Fau, in scale with the Max Gokin Big O. These figures are highly detailed but limited in articulation, with only the arms and legs able to move. To date, the latter is the only figure of Big Fau.
"The Big O" premiered on 13 October 1999. The show was not a hit in its native Japan, rather it was reduced from an outlined 26 episodes to 13 episodes. Western audiences were more receptive and the series achieved the success its creators were looking for. In an interview with AnimePlay, Keiichi Sato said "This is exactly as we had planned", referring to the success overseas.
Several words appear constantly in the English-language reviews: adjectives like "hip", "sleek", "stylish", "classy" and, above all, "cool" serve to describe the artwork, the concept, and the series itself. Reviewers have pointed out references and homages to various works of fiction, namely "Batman", "Giant Robo", the works of Isaac Asimov, Fritz Lang's "Metropolis", James Bond, and "Cowboy Bebop". But "while saying that may cause one to think the show is completely derivative", reads an article at Anime on DVD, ""The Big O" still manages to stand out as something original amongst the other numerous cookie-cutter anime shows." One reviewer cites the extensive homages as one of the series' problems and points to unoriginality on the creators' part.
The first season's reception was positive. Anime on DVD recommends it as an essential series, and the site's Chris Beveridge gave an A− to Vols. 1 and 2 and a B+ to Vols. 3 and 4. Mike Toole of Anime Jump gave it 4.5 out of a possible 5 stars, while the review at the Anime Academy gave it a grade of 83, listing the series' high points as its "unique" premise, "interesting" characters and "nice" action. Reviewers and fans alike agree that the season's downfall was its ending, or lack thereof; the dangling plot threads frustrated viewers and prompted Cartoon Network's involvement in the production of further episodes.
The look and feel of the show received a big enhancement in the second season: the animation is "near OVA quality" and the artwork "far more lush and detailed". Also amplified, however, were the troubles of the first season. The giant robot battles still seem out of place to some, while others praise the "over-the-top-ness" of their execution.
For some reviewers, the second season "doesn't quite match the first", pointing to "something" missing in these episodes. Andy Patrizio of IGN notes changes in Roger Smith's character, who "lost some of his cool and his very funny side in the second season." As with season one, this season's ending is considered its downfall. Chris Beveridge of Anime on DVD wonders if this was head writer "Konaka's attempt to throw his hat into the ring for creating one of the most confusing and oblique endings of any series." Patrizio states that "the creators watched "The Truman Show" and "The Matrix" a few times too many."
The series continues to have a strong cult following into the 2010s. In 2014, BuzzFeed writer Ryan Broderick ranked "The Big O" as one of the best anime series to binge-watch. Dan Casey, host of The Nerdist's "Dan Cave", stated that "The Big O" was the anime series he was most eager to see rebooted or remade, along with "Trigun" and "Soul Eater". In 2017, Ollie Barder of Forbes wrote, "From the classic and retro styled mecha design of Keiichi Sato to the overall film noir visual tone of the series, The Big O was a fascinating and visually very different kind of show. It also had a fantastic voice cast, with probably the most notable of these being Akiko Yajima as the voice of Roger's disapproving android Dorothy." In 2019, Crunchyroll writer Thomas Zoth ranked "The Big O" among his top 10 anime since the 1990s. | https://en.wikipedia.org/wiki?curid=4472 |
BIOS
BIOS (an acronym for Basic Input/Output System; also known as the System BIOS, ROM BIOS or PC BIOS) is firmware used to perform hardware initialization during the booting process (power-on startup), and to provide runtime services for operating systems and programs. The BIOS firmware comes pre-installed on a personal computer's system board, and it is the first software to run when the machine is powered on. The name originates from the Basic Input/Output System used in the CP/M operating system in 1975. The BIOS, originally proprietary to the IBM PC, has been reverse-engineered by companies looking to create compatible systems; the interface of that original system serves as a "de facto" standard.
The BIOS in modern PCs initializes and tests the system hardware components, and loads a boot loader from a mass storage device which then initializes an operating system. In the era of DOS, the BIOS provided a hardware abstraction layer for the keyboard, display, and other input/output (I/O) devices that standardized an interface to application programs and the operating system. More recent operating systems do not use the BIOS after loading, instead accessing the hardware components directly.
Most BIOS implementations are specifically designed to work with a particular computer or motherboard model, by interfacing with various devices that make up the complementary system chipset. Originally, BIOS firmware was stored in a ROM chip on the PC motherboard. In modern computer systems, the BIOS contents are stored on flash memory so it can be rewritten without removing the chip from the motherboard. This allows easy, end-user updates to the BIOS firmware so new features can be added or bugs can be fixed, but it also creates a possibility for the computer to become infected with BIOS rootkits. Furthermore, a BIOS upgrade that fails can brick the motherboard permanently, unless the system includes some form of backup for this case.
Unified Extensible Firmware Interface (UEFI) is a successor to the legacy PC BIOS, aiming to address its technical shortcomings.
The term BIOS (Basic Input/Output System) was created by Gary Kildall and first appeared in the CP/M operating system in 1975, describing the machine-specific part of CP/M loaded during boot time that interfaces directly with the hardware. (A CP/M machine usually has only a simple boot loader in its ROM.)
Versions of MS-DOS, PC DOS or DR-DOS contain a file called variously "IO.SYS", "IBMBIO.COM", "IBMBIO.SYS", or "DRBIOS.SYS"; this file is known as the "DOS BIOS" (also known as the "DOS I/O System") and contains the lower-level hardware-specific part of the operating system. Together with the underlying hardware-specific but operating system-independent "System BIOS", which resides in ROM, it represents the analogue to the "CP/M BIOS".
With the introduction of PS/2 machines, IBM divided the System BIOS into real- and protected-mode portions. The real-mode portion was meant to provide backward compatibility with existing operating systems such as DOS, and therefore was named "CBIOS" (for "Compatibility BIOS"), whereas the "ABIOS" (for "Advanced BIOS") provided new interfaces specifically suited for multitasking operating systems such as OS/2.
The BIOS of the original IBM PC and XT had no interactive user interface. Error codes or messages were displayed on the screen, or coded series of sounds were generated to signal errors when the power-on self-test (POST) had not proceeded to the point of successfully initializing a video display adapter. Options on the IBM PC and XT were set by switches and jumpers on the main board and on expansion cards. Starting around the mid-1990s, it became typical for the BIOS ROM to include a "BIOS configuration utility" (BCU) or "BIOS setup utility", accessed at system power-up by a particular key sequence. This program allowed the user to set system configuration options, of the type formerly set using DIP switches, through an interactive menu system controlled through the keyboard. In the interim period, IBM-compatible PCs, including the IBM AT, held configuration settings in battery-backed RAM and used a bootable configuration program on disk, not in the ROM, to set the configuration options contained in this memory. The disk was supplied with the computer, and if it was lost the system settings could not be changed. The same applied in general to computers with an EISA bus, for which the configuration program was called an EISA Configuration Utility (ECU).
A modern Wintel-compatible computer provides a setup routine essentially unchanged in nature from the ROM-resident BIOS setup utilities of the late 1990s; the user can configure hardware options using the keyboard and video display. Also, when errors occur at boot time, a modern BIOS usually displays user-friendly error messages, often presented as pop-up boxes in a TUI style, and offers to enter the BIOS setup utility or to ignore the error and proceed if possible. Instead of battery-backed RAM, the modern Wintel machine may store the BIOS configuration settings in flash ROM, perhaps the same flash ROM that holds the BIOS itself.
Early Intel processors started at physical address 000FFFF0h. Systems with later processors provide logic to start running the BIOS from the system ROM.
If the system has just been powered up or the reset button was pressed ("cold boot"), the full power-on self-test (POST) is run. If Ctrl+Alt+Delete was pressed ("warm boot"), a special flag value stored in nonvolatile BIOS memory ("CMOS") tested by the BIOS allows bypass of the lengthy POST and memory detection.
The POST identifies and initializes system devices such as the CPU, RAM, interrupt and DMA controllers and other parts of the chipset, video display card, keyboard, hard disk drive, optical disc drive and other basic hardware.
Early IBM PCs had a routine in the POST that would download a program into RAM through the keyboard port and run it. This feature was intended for factory test or diagnostic purposes.
After the option ROM scan is completed and all detected ROM modules with valid checksums have been called, or immediately after POST in a BIOS version that does not scan for option ROMs, the BIOS calls INT 19h to start boot processing. Post-boot, programs loaded can also call INT 19h to reboot the system, but they must be careful to disable interrupts and other asynchronous hardware processes that may interfere with the BIOS rebooting process, or else the system may hang or crash while it is rebooting.
When INT 19h is called, the BIOS attempts to locate boot loader software on a "boot device", such as a hard disk, a floppy disk, CD, or DVD. It loads and executes the first boot software it finds, giving it control of the PC.
The BIOS uses the boot devices set in EEPROM, CMOS RAM or, in the earliest PCs, DIP switches. The BIOS checks each device in order to see if it is bootable by attempting to load the first sector (boot sector). If the sector cannot be read, the BIOS proceeds to the next device. If the sector is read successfully, some BIOSes will also check for the boot sector signature 0x55 0xAA in the last two bytes of the sector (which is 512 bytes long), before accepting a boot sector and considering the device bootable.
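This signature test can be sketched in C; the function name and flat-array interface below are illustrative stand-ins rather than code from any actual BIOS:

    #include <stdint.h>

    #define SECTOR_SIZE 512

    /* Accept a freshly loaded first sector as a boot sector only if its
       last two bytes are the signature 0x55, 0xAA. */
    static int sector_is_bootable(const uint8_t sector[SECTOR_SIZE])
    {
        return sector[SECTOR_SIZE - 2] == 0x55 &&
               sector[SECTOR_SIZE - 1] == 0xAA;
    }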
When a bootable device is found, the BIOS transfers control to the loaded sector. The BIOS does not interpret the contents of the boot sector other than to possibly check for the boot sector signature in the last two bytes. Interpretation of data structures like partition tables and BIOS Parameter Blocks is done by the boot program in the boot sector itself or by other programs loaded through the boot process.
A non-disk device such as a network adapter attempts booting by a procedure that is defined by its option ROM or the equivalent integrated into the motherboard BIOS ROM. As such, option ROMs may also influence or supplant the boot process defined by the motherboard BIOS ROM.
The user can select the boot priority implemented by the BIOS. For example, most computers have a hard disk that is bootable, but usually there is a removable-media drive that has higher boot priority, so the user can cause a removable disk to be booted.
In most modern BIOSes, the boot priority order can be configured by the user. In older BIOSes, limited boot priority options are selectable; in the earliest BIOSes, a fixed priority scheme was implemented, with floppy disk drives first, fixed disks (i.e. hard disks) second, and typically no other boot devices supported, subject to modification of these rules by installed option ROMs. The BIOS in an early PC also usually would only boot from the first floppy disk drive or the first hard disk drive, even if there were two drives installed.
With the El Torito optical media boot standard, the optical drive actually emulates a 3.5" high-density floppy disk to the BIOS for boot purposes. Reading the "first sector" of a CD-ROM or DVD-ROM is not a simply defined operation like it is on a floppy disk or a hard disk. Furthermore, the complexity of the medium makes it difficult to write a useful boot program in one sector. The bootable virtual floppy disk can contain software that provides access to the optical medium in its native format.
On the original IBM PC and XT, if no bootable disk was found, ROM BASIC was started by calling INT 18h. Since few programs used BASIC in ROM, clone PC makers left it out; then a computer that failed to boot from a disk would display "No ROM BASIC" and halt (in response to INT 18h).
Later computers would display a message like "No bootable disk found"; some would prompt for a disk to be inserted and a key to be pressed to retry the boot process. A modern BIOS may display nothing or may automatically enter the BIOS configuration utility when the boot process fails.
The environment for the boot program is very simple: the CPU is in real mode and the general-purpose and segment registers are undefined, except SS, SP, CS, and DL. CS:IP always points to physical address 0x07C00. What values CS and IP actually have is not well defined. Some BIOSes use a CS:IP of 0000h:7C00h while others may use 07C0h:0000h. Because boot programs are always loaded at this fixed address, there is no need for a boot program to be relocatable. DL may contain the drive number, as used with INT 13h, of the boot device. SS:SP points to a valid stack that is presumably large enough to support hardware interrupts, but otherwise SS and SP are undefined. (A stack must already be set up in order for interrupts to be serviced, and interrupts must be enabled in order for the system timer-tick interrupt, which BIOS always uses at least to maintain the time-of-day count and which it initializes during POST, to be active and for the keyboard to work. The keyboard works even if the BIOS keyboard service is not called; keystrokes are received and placed in the 15-character type-ahead buffer maintained by BIOS.) The boot program must set up its own stack, because the size of the stack set up by BIOS is unknown and its location is likewise variable; although the boot program can investigate the default stack by examining SS:SP, it is easier and shorter to just unconditionally set up a new stack.
At boot time, all BIOS services are available, and the memory below address 0x00400 contains the interrupt vector table. BIOS POST has initialized the system timers, interrupt controller(s), DMA controller(s), and other motherboard/chipset hardware as necessary to bring all BIOS services to ready status. DRAM refresh for all system DRAM in conventional memory and extended memory, but not necessarily expanded memory, has been set up and is running. The interrupt vectors corresponding to the BIOS interrupts have been set to point at the appropriate entry points in the BIOS, hardware interrupt vectors for devices initialized by the BIOS have been set to point to the BIOS-provided ISRs, and some other interrupts, including ones that BIOS generates for programs to hook, have been set to a default dummy ISR that immediately returns. The BIOS maintains a reserved block of system RAM at addresses 0x00400–0x004FF with various parameters initialized during the POST. All memory at and above address 0x00500 can be used by the boot program; it may even overwrite itself.
Peripheral cards such as some hard disk drive controllers and some video display adapters have their own BIOS extension option ROMs, which provide additional functionality to BIOS. Code in these extensions runs before the BIOS boots the system from mass storage. These ROMs typically test and initialize hardware, add new BIOS services, and augment or replace existing BIOS services with their own versions of those services. For example, a SCSI controller usually has a BIOS extension ROM that adds support for hard drives connected through that controller. Some video cards have extension ROMs that replace the video services of the motherboard BIOS with their own video services. BIOS extension ROMs gain total control of the machine, so they can in fact do anything, and they may never return control to the BIOS that invoked them. An extension ROM could in principle contain an entire operating system or an application program, or it could implement an entirely different boot process such as booting from a network. Operation of an IBM-compatible computer system can be completely changed by removing or inserting an adapter card (or a ROM chip) that contains a BIOS extension ROM.
The motherboard BIOS typically contains code to access hardware components necessary for bootstrapping the system, such as the keyboard, display, and storage. In addition, plug-in adapter cards such as SCSI, RAID, network interface cards, and video boards often include their own BIOS (e.g. Video BIOS), complementing or replacing the system BIOS code for the given component. Even devices built into the motherboard can behave in this way; their option ROMs can be stored as separate code on the main BIOS flash chip, and upgraded either in tandem with, or separately from, the main BIOS.
An add-in card requires an option ROM if the card is not supported by the main BIOS and the card needs to be initialized or made accessible through BIOS services before the operating system can be loaded (usually this means it is required in the bootstrapping process). Even when it is not required, an option ROM can allow an adapter card to be used without loading driver software from a storage device after booting begins: with an option ROM, no time is taken to load the driver, the driver does not take up space in RAM nor on the hard disk, and the driver software on the ROM always stays with the device, so the two cannot be accidentally separated. Also, if the ROM is on the card, both the peripheral hardware and the driver software provided by the ROM are installed together with no extra effort to install the software. An additional advantage of ROM on some early PC systems (notably including the IBM PCjr) was that ROM was faster than main system RAM. (On modern systems, the case is very much the reverse of this, and BIOS ROM code is usually copied ("shadowed") into RAM so it will run faster.)
There are many methods and utilities for examining the contents of various motherboard BIOS and expansion ROMs, such as Microsoft DEBUG or the Unix dd command.
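On a legacy Unix-like system, for example, the BIOS region of the address space can be read through /dev/mem. The following C sketch dumps the 64 KiB system BIOS region at 0xF0000–0xFFFFF to a file; it requires root privileges, and modern kernels often restrict /dev/mem, so it is illustrative rather than portable:

    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/types.h>
    #include <unistd.h>

    /* Dump the motherboard BIOS region (0xF0000-0xFFFFF) from /dev/mem,
       the C analogue of the dd approach mentioned above. */
    int main(void)
    {
        unsigned char buf[4096];
        int mem = open("/dev/mem", O_RDONLY);
        FILE *out = fopen("biosrom.bin", "wb");
        if (mem < 0 || out == NULL) {
            perror("open");
            return EXIT_FAILURE;
        }
        for (off_t off = 0xF0000; off < 0x100000; off += (off_t)sizeof buf) {
            if (pread(mem, buf, sizeof buf, off) != (ssize_t)sizeof buf ||
                fwrite(buf, 1, sizeof buf, out) != sizeof buf) {
                perror("dump");
                return EXIT_FAILURE;
            }
        }
        fclose(out);
        close(mem);
        return EXIT_SUCCESS;
    }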
If an expansion ROM wishes to change the way the system boots (such as from a network device or a SCSI adapter for which the BIOS has no driver code) in a cooperative way, it can use the "BIOS Boot Specification" (BBS) API to register its ability to do so. Once the expansion ROMs have registered using the BBS APIs, the user can select among the available boot options from within the BIOS's user interface. This is why most BBS-compliant PC BIOS implementations will not allow the user to enter the BIOS's user interface until the expansion ROMs have finished executing and registering themselves with the BBS API. The specification can be downloaded from the "ACPI" (Advanced Configuration and Power Interface) "Component Architecture" website. The official title is BIOS Boot Specification (Version 1.01, 11 January 1996).
Also, if an expansion ROM wishes to change the way the system boots unilaterally, it can simply hook INT 19h or other interrupts normally called from interrupt 19h, such as INT 13h, the BIOS disk service, to intercept the BIOS boot process. Then it can replace the BIOS boot process with one of its own, or it can merely modify the boot sequence by inserting its own boot actions into it, by preventing the BIOS from detecting certain devices as bootable, or both. Before the BIOS Boot Specification was promulgated, this was the only way for expansion ROMs to implement boot capability for devices not supported for booting by the native BIOS of the motherboard.
After the motherboard BIOS completes its POST, most BIOS versions search for option ROM modules, also called BIOS extension ROMs, and execute them. The motherboard BIOS scans for extension ROMs in a portion of the "upper memory area" (the part of the x86 real-mode address space at and above address 0xA0000) and runs each ROM found, in order. To discover memory-mapped ISA option ROMs, a BIOS implementation scans the real-mode address space from 0x0C0000 to 0x0F0000 on 2 KiB boundaries, looking for a two-byte ROM "signature": 0x55 followed by 0xAA. In a valid expansion ROM, this signature is followed by a single byte indicating the number of 512-byte blocks the expansion ROM occupies in real memory, and the ROM's entry point (also known as its "entry offset") begins at the next byte, offset 3. A checksum of the specified number of 512-byte blocks is calculated, and if the ROM has a valid checksum, the BIOS transfers control to the entry address, which in a normal BIOS extension ROM should be the beginning of the extension's initialization routine.
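In outline, the scan described above amounts to the following C sketch, in which "mem" and call_rom_entry() are hypothetical stand-ins for a view of the real-mode address space and the far call a real BIOS would make:

    #include <stddef.h>
    #include <stdint.h>

    /* Stand-in for the far call a real BIOS would make into the ROM. */
    static void call_rom_entry(uint32_t entry_addr)
    {
        (void)entry_addr;
    }

    /* Scan for option ROMs on 2 KiB boundaries. */
    static void scan_option_roms(const uint8_t *mem)
    {
        for (uint32_t addr = 0x0C0000; addr < 0x0F0000; addr += 2048) {
            const uint8_t *rom = mem + addr;
            if (rom[0] != 0x55 || rom[1] != 0xAA)
                continue;                       /* no ROM signature here      */
            size_t len = (size_t)rom[2] * 512;  /* size byte: 512-byte blocks */
            uint8_t sum = 0;
            for (size_t i = 0; i < len; i++)    /* valid ROMs sum to 0 mod 256 */
                sum += rom[i];
            if (sum == 0)
                call_rom_entry(addr + 3);       /* entry point at offset 3    */
        }
    }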
At this point, the extension ROM code takes over, typically testing and initializing the hardware it controls and registering interrupt vectors for use by post-boot applications. It may use BIOS services (including those provided by previously initialized option ROMs) to provide a user configuration interface, to display diagnostic information, or to do anything else that it requires. It is possible that an option ROM will not return to BIOS, pre-empting the BIOS's boot sequence altogether.
An option ROM should normally return to the BIOS after completing its initialization process. Once (and if) an option ROM returns, the BIOS continues searching for more option ROMs, calling each as it is found, until the entire option ROM area in the memory space has been scanned.
Option ROMs normally reside on adapter cards. However, the original PC, and perhaps also the PC XT, have a spare ROM socket on the motherboard (the "system board" in IBM's terms) into which an option ROM can be inserted, and the four ROMs that contain the BASIC interpreter can also be removed and replaced with custom ROMs which can be option ROMs. The IBM PCjr is unique among PCs in having two ROM cartridge slots on the front. Cartridges in these slots map into the same region of the upper memory area used for option ROMs, and the cartridges can contain option ROM modules that the BIOS would recognize. The cartridges can also contain other types of ROM modules, such as BASIC programs, that are handled differently. One PCjr cartridge can contain several ROM modules of different types, possibly stored together in one ROM chip.
The BIOS ROM is customized to the particular manufacturer's hardware, allowing low-level services (such as reading a keystroke or writing a sector of data to diskette) to be provided in a standardized way to programs, including operating systems. For example, an IBM PC might have either a monochrome or a color display adapter (using different display memory addresses and hardware), but a single, standard, BIOS system call may be invoked to display a character at a specified position on the screen in text mode or graphics mode.
The BIOS provides a small library of basic input/output functions to operate peripherals (such as the keyboard, rudimentary text and graphics display functions and so forth). When using MS-DOS, BIOS services could be accessed by an application program (or by MS-DOS) by executing an INT 13h interrupt instruction to access disk functions, or by executing one of a number of other documented BIOS interrupt calls to access video display, keyboard, cassette, and other device functions.
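As an illustration, under a 16-bit DOS compiler such as Turbo C, a program could invoke the INT 13h disk service through the int86x() library routine; the function name and the minimal error handling below are illustrative only:

    #include <dos.h>   /* int86x(), segread(), FP_SEG, FP_OFF */

    /* Read one 512-byte sector (cylinder 0, head 0, sector 1) from the
       first floppy drive via BIOS disk service INT 13h, function AH=02h.
       Returns 0 on success, -1 on error (the BIOS sets the carry flag). */
    int read_first_sector(void far *buffer)
    {
        union REGS r;
        struct SREGS s;

        segread(&s);
        r.h.ah = 0x02;              /* function 02h: read sectors      */
        r.h.al = 1;                 /* number of sectors to read       */
        r.x.cx = 0x0001;            /* CH = cylinder 0, CL = sector 1  */
        r.h.dh = 0;                 /* head 0                          */
        r.h.dl = 0x00;              /* drive 00h: first floppy         */
        s.es   = FP_SEG(buffer);    /* ES:BX points at the data buffer */
        r.x.bx = FP_OFF(buffer);
        int86x(0x13, &r, &r, &s);
        return r.x.cflag ? -1 : 0;
    }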
Operating systems and executive software that are designed to supersede this basic firmware functionality provide replacement software interfaces to application software. Applications can also provide these services to themselves. This began even in the 1980s under MS-DOS, when programmers observed that using the BIOS video services for graphics display was very slow. To increase the speed of screen output, many programs bypassed the BIOS and programmed the video display hardware directly. Other graphics programmers, particularly but not exclusively in the demoscene, observed that there were technical capabilities of the PC display adapters that were not supported by the IBM BIOS and could not be taken advantage of without circumventing it. Since the AT-compatible BIOS ran in Intel real mode, operating systems that ran in protected mode on 286 and later processors required hardware device drivers compatible with protected mode operation to replace BIOS services.
In modern personal computers running modern operating systems the BIOS is used only during booting and initial loading of system software. Before the operating system's first graphical screen is displayed, input and output are typically handled through BIOS. A boot menu such as the textual menu of Windows, which allows users to choose an operating system to boot, to boot into the safe mode, or to use the last known good configuration, is displayed through BIOS and receives keyboard input through BIOS.
Most modern PCs can still boot and run legacy operating systems such as MS-DOS or DR-DOS that rely heavily on BIOS for their console and disk I/O, providing that the system has a BIOS or BIOS-compatible firmware, which is not necessarily the case with UEFI-based PCs.
Intel processors have had reprogrammable microcode since the P6 microarchitecture. The BIOS may contain patches to the processor microcode that fix errors in the initial processor microcode; reprogramming is not persistent, so loading of microcode updates is performed each time the system is powered up. Without reprogrammable microcode, an expensive processor swap would be required; for example, the Pentium FDIV bug became an expensive fiasco for Intel as it required a product recall because the original Pentium processor's defective microcode could not be reprogrammed.
Some BIOSes contain a software licensing description table (SLIC), a digital signature placed inside the BIOS by the original equipment manufacturer (OEM), for example Dell. The SLIC is inserted into the ACPI data table and contains no active code.
Computer manufacturers that distribute OEM versions of Microsoft Windows and Microsoft application software can use the SLIC to authenticate licensing to the OEM Windows Installation disk and system recovery disc containing Windows software. Systems with a SLIC can be preactivated with an OEM product key, and they verify an XML formatted OEM certificate against the SLIC in the BIOS as a means of self-activating (see System Locked Preinstallation, SLP). If a user performs a fresh install of Windows, they will need to have possession of both the OEM key (either SLP or COA) and the digital certificate for their SLIC in order to bypass activation. This can be achieved if the user performs a restore using a pre-customised image provided by the OEM. Power users can copy the necessary certificate files from the OEM image, decode the SLP product key, then perform SLP activation manually. Cracks for non-genuine Windows distributions usually edit the SLIC or emulate it in order to bypass Windows activation.
Some BIOS implementations allow overclocking, an action in which the CPU is adjusted to a higher clock rate than its manufacturer rating for guaranteed capability. Overclocking may, however, seriously compromise system reliability in insufficiently cooled computers and generally shorten component lifespan. Overclocking, when incorrectly performed, may also cause components to overheat so quickly that they mechanically destroy themselves.
Some older operating systems, for example MS-DOS, rely on the BIOS to carry out most input/output tasks within the PC.
Because the BIOS still runs in 16-bit real mode, calling BIOS services directly is inefficient for protected-mode operating systems. BIOS services are not used by modern multitasking operating systems after they initially load, so the importance of the primary part of BIOS is greatly reduced from what it was initially.
Later BIOS implementations took on more complex functions, by including interfaces such as Advanced Configuration and Power Interface (ACPI). Functions of ACPI include power management, interrupt management, hot swapping, and thermal management. After operating systems load, the System Management Mode code is still running in SMRAM. Since 2010, BIOS technology is in a transitional process toward UEFI.
Historically, the BIOS in the IBM PC and XT had no built-in user interface. The BIOS versions in earlier PCs (XT-class) were not software configurable; instead, users set the options via DIP switches on the motherboard. Later computers, including all IBM-compatibles with 80286 CPUs, had a battery-backed nonvolatile BIOS memory (CMOS RAM chip) that held BIOS settings. These settings, such as video-adapter type, memory size, and hard-disk parameters, could only be configured by running a configuration program from a disk, not built into the ROM. A special "reference diskette" was inserted in an IBM AT to configure settings such as memory size.
Early BIOS versions did not have passwords or boot-device selection options. The BIOS was hard-coded to boot from the first floppy drive, or, if that failed, the first hard disk. Access control in early AT-class machines was by a physical keylock switch (which was not hard to defeat if the computer case could be opened). Anyone who could switch on the computer could boot it.
Later, 386-class computers started integrating the BIOS setup utility in the ROM itself, alongside the BIOS code; these computers usually boot into the BIOS setup utility if a certain key or key combination is pressed, otherwise the BIOS POST and boot process are executed.
A modern BIOS setup utility has a menu-based user interface (UI) accessed by pressing a certain key on the keyboard when the PC starts. Usually, the key is advertised for a short time during the early startup, for example "Press F1 to enter CMOS setup". The actual key depends on the specific hardware. Features present in the BIOS setup utility typically include setting the boot order, configuring the system clock, enabling or disabling built-in hardware, and setting passwords.
A modern BIOS setup screen often features a PC Health Status or a Hardware Monitoring tab, which directly interfaces with a Hardware Monitor chip of the mainboard. This makes it possible to monitor CPU and chassis temperature, the voltage provided by the power supply unit, as well as monitor and control the speed of the fans connected to the motherboard.
Once the system is booted, hardware monitoring and computer fan control is normally done directly by the Hardware Monitor chip itself, which can be a separate chip, interfaced through I²C or SMBus, or come as a part of a Super I/O solution, interfaced through Low Pin Count (LPC). Some operating systems, like NetBSD with envsys and OpenBSD with sysctl hw.sensors, feature integrated interfacing with hardware monitors, which is normally done without any interaction with the BIOS.
However, in certain circumstances, the BIOS vendor also provides the underlying information about hardware monitoring through ACPI, in which case, the operating system may be using ACPI to perform hardware monitoring; this is done, for example, on some ASUSTeK motherboards with the AI Booster feature.
In modern PCs the BIOS is stored in rewritable memory, allowing the contents to be replaced and modified. This rewriting of the contents is sometimes termed "flashing", based on the common use of a kind of EEPROM known technically as "flash EEPROM" and colloquially as "flash memory". It can be done by a special program, usually provided by the system's manufacturer, or at POST, with a BIOS image in a hard drive or USB flash drive. A file containing such contents is sometimes termed "a BIOS image". A BIOS might be reflashed in order to upgrade to a newer version to fix bugs or provide improved performance or to support newer hardware, or a reflashing operation might be needed to fix a damaged BIOS.
The original IBM PC BIOS (and cassette BASIC) was stored on mask-programmed read-only memory (ROM) chips in sockets on the motherboard. ROMs could be replaced, but not altered, by users. To allow for updates, many compatible computers used re-programmable memory devices such as EPROM and later flash memory devices. According to Robert Braver, the president of the BIOS manufacturer Micro Firmware, Flash BIOS chips became common around 1995 because the electrically erasable PROM (EEPROM) chips are cheaper and easier to program than standard ultraviolet erasable PROM (EPROM) chips. Flash chips are programmed (and re-programmed) in-circuit, while EPROM chips need to be removed from the motherboard for re-programming. BIOS versions are upgraded to take advantage of newer versions of hardware and to correct bugs in previous revisions of BIOSes.
Beginning with the IBM AT, PCs supported a hardware clock settable through BIOS. It had a century bit which allowed for manually changing the century when the year 2000 happened. Most BIOS revisions created in 1995 and nearly all BIOS revisions in 1997 supported the year 2000 by setting the century bit automatically when the clock rolled past midnight, December 31, 1999.
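The rollover logic can be illustrated with a small, self-contained C simulation. CMOS registers are reached through the index/data port pair 70h/71h; register 09h holds the BCD year and, on many but not all boards, register 32h holds the BCD century. The array-backed port helpers below are stand-ins for real hardware I/O, so the sketch runs anywhere:

    #include <stdint.h>

    /* Simulated CMOS; year 00 models the clock just after the
       1999 -> 2000 rollover, century byte still reading 0x19. */
    static uint8_t cmos[128] = { [0x09] = 0x00, [0x32] = 0x19 };
    static uint8_t cmos_index;

    static void outb(uint16_t port, uint8_t v)
    {
        if (port == 0x70) cmos_index = v & 0x7F;
        else              cmos[cmos_index] = v;
    }

    static uint8_t inb(uint16_t port)
    {
        (void)port;
        return cmos[cmos_index];
    }

    static uint8_t bcd_increment(uint8_t bcd)  /* e.g. 0x19 -> 0x20 */
    {
        return (uint8_t)((bcd & 0x0F) == 0x09 ? bcd + 0x07 : bcd + 1);
    }

    /* If the year has wrapped to 00, advance the century byte. */
    static void fix_century_after_rollover(void)
    {
        outb(0x70, 0x09);              /* select year register    */
        if (inb(0x71) == 0x00) {
            outb(0x70, 0x32);          /* select century register */
            uint8_t century = bcd_increment(inb(0x71));
            outb(0x70, 0x32);
            outb(0x71, century);       /* 19xx becomes 20xx       */
        }
    }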
The first flash chips were attached to the ISA bus. Starting in 1997, the BIOS flash moved to the LPC bus, a functional replacement for ISA, following a new standard implementation known as "firmware hub" (FWH). In 2006, the first systems supporting a Serial Peripheral Interface (SPI) appeared, and the BIOS flash memory moved again.
The size of the BIOS, and the capacity of the ROM, EEPROM, or other media it may be stored on, has increased over time as new features have been added to the code; BIOS versions now exist with sizes up to 16 megabytes. For contrast, the original IBM PC BIOS was contained in an 8 KiB mask ROM. Some modern motherboards are including even bigger NAND flash memory ICs on board which are capable of storing whole compact operating systems, such as some Linux distributions. For example, some ASUS motherboards included Splashtop OS embedded into their NAND flash memory ICs. However, the idea of including an operating system along with BIOS in the ROM of a PC is not new; in the 1980s, Microsoft offered a ROM option for MS-DOS, and it was included in the ROMs of some PC clones such as the Tandy 1000 HX.
Another type of firmware chip was found on the IBM PC AT and early compatibles. In the AT, the keyboard interface was controlled by a microcontroller with its own programmable memory; on the IBM AT, this was a 40-pin socketed device, while some manufacturers used an EPROM version of this chip. This controller was also assigned the A20 gate function to manage memory above the one-megabyte range; occasionally an upgrade of this "keyboard BIOS" was necessary to take advantage of software that could use upper memory.
The BIOS may contain components such as the Memory Reference Code (MRC), which is responsible for handling memory timings and related hardware settings.
IBM published the entire listings of the BIOS for its original PC, PC XT, PC AT, and other contemporary PC models, in an appendix of the "IBM PC Technical Reference Manual" for each machine type. The effect of the publication of the BIOS listings is that anyone can see exactly what a definitive BIOS does and how it does it.
In May 1984, Phoenix Software Associates released its first ROM-BIOS, which enabled OEMs to build essentially fully compatible clones without having to reverse-engineer the IBM PC BIOS themselves, as Compaq had done for the Portable, helping fuel the growth in the PC-compatibles industry and sales of non-IBM versions of DOS. The first American Megatrends (AMI) BIOS was released in 1986.
New standards grafted onto the BIOS are usually without complete public documentation or any BIOS listings. As a result, it is not as easy to learn the intimate details about the many non-IBM additions to BIOS as about the core BIOS services.
Most PC motherboard suppliers license a BIOS "core" and toolkit from a commercial third party, known as an "independent BIOS vendor", or IBV. The motherboard manufacturer then customizes this BIOS to suit its own hardware. For this reason, updated BIOSes are normally obtained directly from the motherboard manufacturer. Major BIOS vendors include American Megatrends (AMI), Insyde Software, Phoenix Technologies and Byosoft. Former vendors include Award Software and Microid Research that were acquired by Phoenix Technologies in 1998; Phoenix later phased out the Award brand name. General Software, which was also acquired by Phoenix in 2007, sold BIOS for embedded systems based on Intel processors.
The open-source community has increased its efforts to develop a replacement for proprietary BIOSes and their future incarnations with open-sourced counterparts through the libreboot, coreboot and OpenBIOS/Open Firmware projects. AMD provided product specifications for some chipsets, and Google is sponsoring the project. Motherboard manufacturer Tyan offers coreboot next to the standard BIOS with their Opteron line of motherboards. MSI and Gigabyte Technology have followed suit with the MSI K9ND MS-9282 and MSI K9SD MS-9185 boards and the M57SLI-S4 model, respectively.
EEPROM chips are advantageous because they can be easily updated by the user; it is customary for hardware manufacturers to issue BIOS updates to upgrade their products, improve compatibility and remove bugs. However, this advantage carries the risk that an improperly executed or aborted BIOS update could render the computer or device unusable. To avoid these situations, more recent BIOSes use a "boot block", a portion of the BIOS which runs first and must be updated separately. This code verifies that the rest of the BIOS is intact (using hash checksums or other methods) before transferring control to it. If the boot block detects any corruption in the main BIOS, it will typically warn the user that a recovery process must be initiated by booting from removable media (floppy, CD or USB flash drive) so the user can try flashing the BIOS again. Some motherboards have a "backup" BIOS (sometimes referred to as DualBIOS boards) to recover from BIOS corruptions.
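A minimal sketch of such an integrity test follows; the simple additive checksum is an assumption for illustration, since vendors' actual checks vary and may use CRCs or cryptographic hashes instead:

    #include <stddef.h>
    #include <stdint.h>

    /* Return nonzero if the main BIOS image looks intact, assuming the
       common convention that a valid image's bytes sum to 0 mod 256. */
    static int main_bios_intact(const uint8_t *image, size_t len)
    {
        uint8_t sum = 0;
        for (size_t i = 0; i < len; i++)
            sum += image[i];
        return sum == 0;
    }

A boot block would run a check of this kind before jumping into the main BIOS, branching to the recovery path on failure.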
There are at least five known BIOS attack viruses, two of which were for demonstration purposes. The first one found in the wild was "Mebromi", targeting Chinese users.
The first BIOS virus was BIOS Meningitis, which infected BIOS chips rather than erasing them. BIOS Meningitis was relatively harmless compared to a virus like CIH.
The second BIOS virus was CIH, also known as the "Chernobyl Virus", which was able to erase flash ROM BIOS content on compatible chipsets. CIH appeared in mid-1998 and became active in April 1999. Often, infected computers could no longer boot, and people had to remove the flash ROM IC from the motherboard and reprogram it. CIH targeted the then-widespread Intel i430TX motherboard chipset and took advantage of the fact that the Windows 9x operating systems, also widespread at the time, allowed direct hardware access to all programs.
Modern systems are not vulnerable to CIH because of the variety of chipsets in use, which are incompatible with the Intel i430TX chipset, and the variety of other flash ROM IC types. There is also extra protection from accidental BIOS rewrites in the form of boot blocks which are protected from accidental overwrite, or dual and quad BIOS equipped systems which may, in the event of a crash, use a backup BIOS. Also, all modern operating systems such as FreeBSD, Linux, macOS, and Windows NT-based Windows OSes like Windows 2000, Windows XP and newer do not allow user-mode programs to have direct hardware access.
As a result, as of 2008, CIH has become essentially harmless, at worst causing annoyance by infecting executable files and triggering antivirus software. Other BIOS viruses remain possible, however; since most Windows home users without Windows Vista/7's UAC run all applications with administrative privileges, a modern CIH-like virus could in principle still gain access to hardware without first using an exploit. The operating system OpenBSD prevents all users from having this access, and the grsecurity patch for the Linux kernel also prevents this direct hardware access by default, so an attacker would require a much more difficult kernel-level exploit or a reboot of the machine.
The first of the demonstration attacks was a technique presented by John Heasman, principal security consultant for UK-based Next-Generation Security Software. In 2006, at the Black Hat Security Conference, he showed how to elevate privileges and read physical memory using malicious procedures that replaced normal ACPI functions stored in flash memory.
The second demonstration was a technique called "Persistent BIOS infection". It appeared in 2009 at the CanSecWest Security Conference in Vancouver and at the SyScan Security Conference in Singapore. Researchers Anibal Sacco and Alfredo Ortega, from Core Security Technologies, demonstrated how to insert malicious code into the decompression routines in the BIOS, allowing for nearly full control of the PC at start-up, even before the operating system is booted. The proof-of-concept does not exploit a flaw in the BIOS implementation, but only involves the normal BIOS flashing procedures; thus, it requires physical access to the machine, or for the user to be root. Despite these requirements, Ortega underlined the profound implications of his and Sacco's discovery: "We can patch a driver to drop a fully working rootkit. We even have a little code that can remove or disable antivirus."
Mebromi is a trojan which targets computers with AwardBIOS, Microsoft Windows, and antivirus software from two Chinese companies: Rising Antivirus and Jiangmin KV Antivirus. Mebromi installs a rootkit which infects the master boot record.
In a December 2013 interview with "60 Minutes", Deborah Plunkett, Information Assurance Director for the US National Security Agency, claimed the NSA had uncovered and thwarted a possible BIOS attack by a foreign nation state, targeting the US financial system. The program cited anonymous sources alleging it was a Chinese plot. However, follow-up articles in "The Guardian", "The Atlantic", "Wired" and "The Register" refuted the NSA's claims.
The legacy PC BIOS is being replaced by the more complex Extensible Firmware Interface (EFI) in many new machines. EFI is a specification which replaces the runtime interface of the legacy BIOS. Initially written for the Intel Itanium architecture, EFI is now available for x86 and x86-64 platforms; the specification development is driven by the Unified EFI Forum, an industry Special Interest Group. EFI booting has been supported only in Microsoft Windows versions supporting GPT, the Linux kernel 2.6.1 and later, and macOS on Intel-based Macs. New PC hardware now predominantly ships with UEFI firmware. The architecture of the rootkit safeguard can also prevent the system from running the user's own software changes, which makes UEFI controversial as a legacy BIOS replacement in the open hardware community.
Other alternatives to the functionality of the "Legacy BIOS" in the x86 world include coreboot and libreboot.
Some servers and workstations use a platform-independent Open Firmware (IEEE-1275) based on the Forth programming language; it is included with Sun's SPARC computers, IBM's RS/6000 line, and other PowerPC systems such as the CHRP motherboards, along with the x86-based OLPC XO-1.
As of at least 2015, Apple has removed legacy BIOS support from MacBook Pro computers. As such, the BIOS utility no longer supports the legacy switch, and prints "Legacy mode not supported on this system". In 2017, Intel announced plans to remove legacy BIOS support by 2020.
Since 2019, new Intel platform OEM PCs no longer support legacy boot. | https://en.wikipedia.org/wiki?curid=4473 |
Bose–Einstein condensate
A Bose–Einstein condensate (BEC) is a state of matter (also called the fifth state of matter) which is typically formed when a gas of bosons at low densities is cooled to temperatures very close to absolute zero (-273.15 °C). Under such conditions, a large fraction of bosons occupy the lowest quantum state, at which point microscopic quantum phenomena, particularly wavefunction interference, become apparent macroscopically. A BEC is formed by cooling a gas of extremely low density, about one-hundred-thousandth (1/100,000) the density of normal air, to ultra-low temperatures.
This state was first predicted, generally, in 1924–1925 by Albert Einstein following and crediting a pioneering paper by Satyendra Nath Bose on the new field now known as quantum statistics.
Satyendra Nath Bose first sent a paper to Einstein on the quantum statistics of light quanta (now called photons), in which he derived Planck's quantum radiation law without any reference to classical physics. Einstein was impressed, translated the paper himself from English to German and submitted it for Bose to the "Zeitschrift für Physik", which published it in 1924. (The Einstein manuscript, once believed to be lost, was found in a library at Leiden University in 2005.) Einstein then extended Bose's ideas to matter in two other papers. The result of their efforts is the concept of a Bose gas, governed by Bose–Einstein statistics, which describes the statistical distribution of identical particles with integer spin, now called bosons. Bosons, a class of particles that includes the photon as well as atoms such as helium-4, are allowed to share a quantum state. Einstein proposed that cooling bosonic atoms to a very low temperature would cause them to fall (or "condense") into the lowest accessible quantum state, resulting in a new form of matter.
In 1938, Fritz London proposed the BEC as a mechanism for superfluidity in liquid helium-4 and for superconductivity.
On 5 June 1995, the first gaseous condensate was produced by Eric Cornell and Carl Wieman at the University of Colorado at Boulder NIST–JILA lab, in a gas of rubidium atoms cooled to 170 nanokelvins (nK). Shortly thereafter, Wolfgang Ketterle at MIT realized a BEC in a gas of sodium atoms. For their achievements Cornell, Wieman, and Ketterle received the 2001 Nobel Prize in Physics. These early studies founded the field of ultracold atoms, and hundreds of research groups around the world now routinely produce BECs of dilute atomic vapors in their labs.
Since 1995, many other atomic species have been condensed, and BECs have also been realized using molecules, quasi-particles, and photons.
This transition to BEC occurs below a critical temperature, which for a uniform three-dimensional gas consisting of non-interacting particles with no apparent internal degrees of freedom is given by

$$T_c = \frac{2\pi\hbar^2}{m k_B}\left(\frac{n}{\zeta(3/2)}\right)^{2/3},$$

where $T_c$ is the critical temperature, $n$ the particle density, $m$ the mass per boson, $\hbar$ the reduced Planck constant, $k_B$ the Boltzmann constant, and $\zeta$ the Riemann zeta function ($\zeta(3/2) \approx 2.6124$).

Interactions shift the value, and the corrections can be calculated by mean-field theory.

This formula is derived from finding the gas degeneracy in the Bose gas using Bose–Einstein statistics.
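To get a feel for the scale this formula implies, it can be evaluated numerically. The following Python sketch plugs in representative numbers; the mass and density values are illustrative assumptions for a dilute rubidium-87 vapor, not figures taken from the text.

import math

hbar = 1.054571817e-34          # reduced Planck constant, J*s
k_B = 1.380649e-23              # Boltzmann constant, J/K
zeta_3_2 = 2.612375348685488    # Riemann zeta(3/2)

def bec_critical_temperature(n, m):
    """Critical temperature of a uniform, non-interacting 3D Bose gas.
    n: particle density in m^-3; m: mass per boson in kg."""
    return (2 * math.pi * hbar**2) / (m * k_B) * (n / zeta_3_2) ** (2 / 3)

# Illustrative (assumed) numbers for a dilute rubidium-87 vapor:
m_rb87 = 86.909 * 1.66053906660e-27   # atomic mass in kg
n = 1e20                              # density in m^-3
print(bec_critical_temperature(n, m_rb87))   # on the order of 4e-7 K, i.e. ~400 nK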
For an ideal Bose gas we have the equation of state

$$\frac{1}{v} = \frac{1}{\lambda^3} g_{3/2}(f) + \frac{1}{V}\frac{f}{1-f},$$

where $v = V/N$ is the per-particle volume, $\lambda$ the thermal wavelength, $f$ the fugacity, and $g_\alpha(f) = \sum_{k=1}^{\infty} f^k/k^\alpha$ the Bose (polylogarithm) function. It is noticeable that $g_{3/2}$ is a monotonically growing function of $f$ on $f \in [0, 1]$, the only values for which the series converges.

Recognizing that the second term on the right-hand side contains the expression for the average occupation number of the fundamental state $\langle n_0\rangle$, the equation of state can be rewritten as

$$\frac{1}{v} = \frac{1}{\lambda^3} g_{3/2}(f) + \frac{\langle n_0\rangle}{V}.$$

Because the left term of the second equation must always be positive, $\frac{1}{v} > \frac{1}{\lambda^3} g_{3/2}(f)$, and because $g_{3/2}(f) \le g_{3/2}(1) = \zeta(3/2)$, a stronger condition is

$$\frac{1}{v} > \frac{\zeta(3/2)}{\lambda^3},$$

which defines a transition between a gas phase and a condensed phase. In the critical region it is possible to define a critical temperature and thermal wavelength:

$$\lambda_c^3 = \zeta(3/2)\, v,$$
$$T_c = \frac{2\pi\hbar^2}{m k_B \lambda_c^2},$$

recovering the value indicated in the previous section. The critical values are such that if $T < T_c$ or $\lambda > \lambda_c$ we are in the presence of a Bose–Einstein condensate.
Understanding what happens with the fraction of particles in the fundamental level is crucial. To that end, write the equation of state for $T < T_c$ (where $f \to 1$), obtaining

$$\frac{N_0}{N} = 1 - \frac{\lambda_c^3}{\lambda^3} \quad\text{and equivalently}\quad \frac{N_0}{N} = 1 - \left(\frac{T}{T_c}\right)^{3/2}.$$

So, if $T \ll T_c$ the fraction $N_0/N \to 1$, and if $T \to T_c$ the fraction $N_0/N \to 0$. At temperatures near absolute zero, particles tend to condense into the fundamental state (the state with momentum $p = 0$).
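A minimal numerical sketch of this condensate fraction (Python; it assumes only the ideal-gas result above):

def condensate_fraction(T, T_c):
    """Ideal-gas condensate fraction N0/N; zero at and above the critical temperature."""
    return max(0.0, 1.0 - (T / T_c) ** 1.5)

for t in (0.0, 0.25, 0.5, 0.75, 1.0, 1.5):
    print(t, condensate_fraction(t, 1.0))
# The fraction falls from 1 at T = 0 to 0 at T = T_c and stays 0 above it.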
Consider a collection of "N" non-interacting particles, which can each be in one of two quantum states, $|0\rangle$ and $|1\rangle$. If the two states are equal in energy, each different configuration is equally likely.
If we can tell which particle is which, there are $2^N$ different configurations, since each particle can be in $|0\rangle$ or $|1\rangle$ independently. In almost all of the configurations, about half the particles are in $|0\rangle$ and the other half in $|1\rangle$. The balance is a statistical effect: the number of configurations is largest when the particles are divided equally.
If the particles are indistinguishable, however, there are only "N"+1 different configurations. If there are "K" particles in state $|1\rangle$, there are "N" − "K" particles in state $|0\rangle$. Whether any particular particle is in state $|0\rangle$ or in state $|1\rangle$ cannot be determined, so each value of "K" determines a unique quantum state for the whole system.
Suppose now that the energy of state $|1\rangle$ is slightly greater than the energy of state $|0\rangle$ by an amount "E". At temperature "T", a particle will have a lesser probability to be in state $|1\rangle$ by a factor $e^{-E/kT}$. In the distinguishable case, the particle distribution will be biased slightly towards state $|0\rangle$. But in the indistinguishable case, since there is no statistical pressure toward equal numbers, the most-likely outcome is that most of the particles will collapse into state $|0\rangle$.
In the distinguishable case, for large "N", the fraction in state $|0\rangle$ can be computed. It is the same as flipping a coin with probability proportional to "p" = exp(−"E"/"T") to land tails.
In the indistinguishable case, each value of "K" is a single state, which has its own separate Boltzmann probability. So the probability distribution is exponential:

$$P(K) = C p^K.$$

For large "N", the normalization constant "C" is $(1 - p)$. The expected total number of particles not in the lowest energy state, in the limit $N \to \infty$, is equal to $\sum_{K>0} C K p^K = p/(1-p)$. It does not grow when "N" is large; it just approaches a constant. This will be a negligible fraction of the total number of particles. So a collection of enough Bose particles in thermal equilibrium will mostly be in the ground state, with only a few in any excited state, no matter how small the energy difference.
Consider now a gas of particles, which can be in different momentum states labeled formula_48. If the number of particles is less than the number of thermally accessible states, for high temperatures and low densities, the particles will all be in different states. In this limit, the gas is classical. As the density increases or the temperature decreases, the number of accessible states per particle becomes smaller, and at some point, more particles will be forced into a single state than the maximum allowed for that state by statistical weighting. From this point on, any extra particle added will go into the ground state.
To calculate the transition temperature at any density, integrate, over all momentum states, the expression for the maximum number of excited particles, $p/(1-p)$:

$$N = V \int \frac{d^3 k}{(2\pi)^3} \frac{p(k)}{1 - p(k)}, \qquad p(k) = e^{-\hbar^2 k^2 / 2m k_B T}.$$

When the integral (also known as the Bose–Einstein integral) is evaluated with factors of $k_B$ and $\hbar$ restored by dimensional analysis, it gives the critical temperature formula of the preceding section. Therefore, this integral defines the critical temperature and particle number corresponding to the conditions of negligible chemical potential $\mu$. In the Bose–Einstein statistics distribution, $\mu$ is actually still nonzero for BECs; however, $\mu$ is less than the ground state energy. Except when specifically talking about the ground state, $\mu$ can be approximated for most energy or momentum states as $\mu \approx 0$.
Nikolay Bogoliubov considered perturbations on the limit of dilute gas, finding a finite pressure at zero temperature and positive chemical potential. This leads to corrections for the ground state. The Bogoliubov state has pressure ("T" = 0): $P = g n^2/2$, where $g = 4\pi\hbar^2 a_s/m$ is the contact-interaction coupling constant and $a_s$ the s-wave scattering length.
The original interacting system can be converted to a system of non-interacting particles with a dispersion law.
In some of the simplest cases, the state of condensed particles can be described with a nonlinear Schrödinger equation, also known as the Gross–Pitaevskii or Ginzburg–Landau equation. The validity of this approach is actually limited to the case of ultracold temperatures, which fits well the most common alkali-atom experiments.
This approach originates from the assumption that the state of the BEC can be described by the unique wavefunction of the condensate $\psi(\vec r)$. For a system of this nature, $|\psi(\vec r)|^2$ is interpreted as the particle density, so the total number of atoms is $N = \int |\psi(\vec r)|^2 \, d^3 r$.
Provided essentially all atoms are in the condensate (that is, have condensed to the ground state), and treating the bosons using mean-field theory, the energy (E) associated with the state $\psi(\vec r)$ is:

$$E = \int d^3 r \left[ \frac{\hbar^2}{2m} |\nabla\psi(\vec r)|^2 + V(\vec r)\, |\psi(\vec r)|^2 + \frac{1}{2} U_0\, |\psi(\vec r)|^4 \right].$$

Minimizing this energy with respect to infinitesimal variations in $\psi(\vec r)$, and holding the number of atoms constant, yields the Gross–Pitaevskii equation (GPE) (also a non-linear Schrödinger equation):

$$i\hbar \frac{\partial\psi(\vec r)}{\partial t} = \left( -\frac{\hbar^2\nabla^2}{2m} + V(\vec r) + U_0\, |\psi(\vec r)|^2 \right) \psi(\vec r),$$

where $m$ is the mass of the bosons, $V(\vec r)$ the external potential, and $U_0 = 4\pi\hbar^2 a_s/m$ represents the inter-particle interactions, with $a_s$ the s-wave scattering length.

In the case of zero external potential, the dispersion law of interacting Bose–Einstein-condensed particles is given by the so-called Bogoliubov spectrum (for $T = 0$):

$$\epsilon_p = \sqrt{\frac{p^2}{2m}\left(\frac{p^2}{2m} + 2 U_0 n_0\right)}.$$
The Gross–Pitaevskii equation (GPE) provides a relatively good description of the behavior of atomic BECs. However, the GPE does not take into account the temperature dependence of dynamical variables, and is therefore valid only for $T \approx 0$.
It is not applicable, for example, for the condensates of excitons, magnons and photons, where the critical temperature is comparable to room temperature.
The Gross–Pitaevskii equation is a partial differential equation in space and time variables. Usually it does not have an analytic solution, and different numerical methods, such as split-step Crank–Nicolson and Fourier spectral methods, are used for its solution. There are different Fortran and C programs for its solution for contact interaction and long-range dipolar interaction which can be freely used.
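As an illustration of the split-step Fourier approach mentioned above, here is a minimal Python/NumPy sketch for the 1D GPE in dimensionless units (ℏ = m = 1). The grid size, interaction strength, and harmonic trap are arbitrary illustrative choices, not parameters from any of the programs referenced in the text.

import numpy as np

# i dpsi/dt = (-(1/2) d^2/dx^2 + V(x) + g |psi|^2) psi, dimensionless units.
L, n_grid = 20.0, 256
x = np.linspace(-L / 2, L / 2, n_grid, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(n_grid, d=L / n_grid)
V = 0.5 * x**2        # harmonic trap (assumed)
g = 1.0               # repulsive contact interaction (assumed)
dt = 1e-3
dx = L / n_grid

psi = np.exp(-x**2 / 2).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)    # normalize to one particle

for _ in range(5000):
    psi *= np.exp(-0.5j * dt * (V + g * np.abs(psi)**2))              # half nonlinear step
    psi = np.fft.ifft(np.exp(-0.5j * dt * k**2) * np.fft.fft(psi))    # full kinetic step
    psi *= np.exp(-0.5j * dt * (V + g * np.abs(psi)**2))              # half nonlinear step

print(np.sum(np.abs(psi)**2) * dx)   # the norm is conserved, stays ~1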
The Gross–Pitaevskii model of BEC is a physical approximation valid for certain classes of BECs. By construction, the GPE uses the following simplifications: it assumes that interactions between condensate particles are of the contact two-body type and also neglects anomalous contributions to self-energy. These assumptions are suitable mostly for dilute three-dimensional condensates. If one relaxes any of these assumptions, the equation for the condensate wavefunction acquires terms containing higher-order powers of the wavefunction. Moreover, for some physical systems the number of such terms turns out to be infinite, so the equation becomes essentially non-polynomial. Examples where this could happen are the Bose–Fermi composite condensates, effectively lower-dimensional condensates, and dense condensates, superfluid clusters, and droplets.
However, it is clear that in a general case the behaviour of a Bose–Einstein condensate can be described by coupled evolution equations for condensate density, superfluid velocity and the distribution function of elementary excitations. This problem was solved in 1977 by Peletminskii et al. using a microscopical approach. The Peletminskii equations are valid for any finite temperatures below the critical point. Years later, in 1985, Kirkpatrick and Dorfman obtained similar equations using another microscopical approach. The Peletminskii equations also reproduce the Khalatnikov hydrodynamical equations for superfluid as a limiting case.
The phenomena of superfluidity of a Bose gas and superconductivity of a strongly-correlated Fermi gas (a gas of Cooper pairs) are tightly connected to Bose–Einstein condensation. Under corresponding conditions, below the temperature of the phase transition, these phenomena were observed in helium-4 and different classes of superconductors. In this sense, superconductivity is often called the superfluidity of a Fermi gas. In the simplest form, the origin of superfluidity can be seen from the weakly interacting boson model.
In 1938, Pyotr Kapitsa, John Allen and Don Misener discovered that helium-4 became a new kind of fluid, now known as a superfluid, at temperatures less than 2.17 K (the lambda point). Superfluid helium has many unusual properties, including zero viscosity (the ability to flow without dissipating energy) and the existence of quantized vortices. It was quickly believed that the superfluidity was due to partial Bose–Einstein condensation of the liquid. In fact, many properties of superfluid helium also appear in gaseous condensates created by Cornell, Wieman and Ketterle (see below). Superfluid helium-4 is a liquid rather than a gas, which means that the interactions between the atoms are relatively strong; the original theory of Bose–Einstein condensation must be heavily modified in order to describe it. Bose–Einstein condensation remains, however, fundamental to the superfluid properties of helium-4. Note that helium-3, a fermion, also enters a superfluid phase (at a much lower temperature) which can be explained by the formation of bosonic Cooper pairs of two atoms (see also fermionic condensate).
The first "pure" Bose–Einstein condensate was created by Eric Cornell, Carl Wieman, and co-workers at JILA on 5 June 1995. They cooled a dilute vapor of approximately two thousand rubidium-87 atoms to below 170 nK using a combination of laser cooling (a technique that won its inventors Steven Chu, Claude Cohen-Tannoudji, and William D. Phillips the 1997 Nobel Prize in Physics) and magnetic evaporative cooling. About four months later, an independent effort led by Wolfgang Ketterle at MIT condensed sodium-23. Ketterle's condensate had a hundred times more atoms, allowing important results such as the observation of quantum mechanical interference between two different condensates. Cornell, Wieman and Ketterle won the 2001 Nobel Prize in Physics for their achievements.
A group led by Randall Hulet at Rice University announced a condensate of lithium atoms only one month following the JILA work. Lithium has attractive interactions, causing the condensate to be unstable and collapse for all but a few atoms. Hulet's team subsequently showed the condensate could be stabilized by confinement quantum pressure for up to about 1000 atoms. Various isotopes have since been condensed.
In the image accompanying this article, the velocity-distribution data indicates the formation of a Bose–Einstein condensate out of a gas of rubidium atoms. The false colors indicate the number of atoms at each velocity, with red being the fewest and white being the most. The areas appearing white and light blue are at the lowest velocities. The peak is not infinitely narrow because of the Heisenberg uncertainty principle: spatially confined atoms have a minimum width velocity distribution. This width is given by the curvature of the magnetic potential in the given direction. More tightly confined directions have bigger widths in the ballistic velocity distribution. This anisotropy of the peak on the right is a purely quantum-mechanical effect and does not exist in the thermal distribution on the left. This graph served as the cover design for the 1999 textbook "Thermal Physics" by Ralph Baierlein.
Bose–Einstein condensation also applies to quasiparticles in solids. Magnons, Excitons, and Polaritons have integer spin which means they are bosons that can form condensates.
Magnons, electron spin waves, can be controlled by a magnetic field. Densities from the limit of a dilute gas to a strongly interacting Bose liquid are possible. Magnetic ordering is the analog of superfluidity. In 1999 condensation was demonstrated in antiferromagnetic TlCuCl3, at temperatures as great as 14 K. The high transition temperature (relative to atomic gases) is due to the magnons' small mass (near that of an electron) and greater achievable density. In 2006, condensation in a ferromagnetic yttrium-iron-garnet thin film was seen even at room temperature, with optical pumping.
Excitons, electron-hole pairs, were predicted to condense at low temperature and high density by Boer et al. in 1961. Bilayer system experiments first demonstrated condensation in 2003, shown by the disappearance of the Hall voltage. Fast optical exciton creation was used to form condensates at sub-kelvin temperatures from 2005 on.
Polariton condensation was first detected for exciton-polaritons in a quantum well microcavity kept at 5 K.
In June 2020, the Cold Atom Laboratory experiment on board the International Space Station successfully created a BEC. Although initially just a proof of function, early results showed that, in the microgravity of the ISS, about half of the atoms formed into a halo-like cloud around the main body of the BEC.
As in many other systems, vortices can exist in BECs. These can be created, for example, by "stirring" the condensate with lasers, or rotating the confining trap. The vortex created will be a quantum vortex. These phenomena are allowed for by the non-linear $|\psi|^2$ term in the GPE. As the vortices must have quantized angular momentum, the wavefunction may have the form $\psi(\vec r) = \phi(\rho, z)\, e^{i\ell\theta}$, where $\rho$, $z$ and $\theta$ are as in the cylindrical coordinate system, and $\ell$ is the angular quantum number (a.k.a. the "charge" of the vortex). Since the energy of a vortex is proportional to the square of its angular momentum, in trivial topology only $\ell = 1$ vortices can exist in the steady state; higher-charge vortices will have a tendency to split into $\ell = 1$ vortices, if allowed by the topology of the geometry.
An axially symmetric (for instance, harmonic) confining potential is commonly used for the study of vortices in BEC. To determine $\phi(\rho, z)$, the energy of $\psi(\vec r)$ must be minimized, according to the constraint $\psi(\vec r) = \phi(\rho, z)\, e^{i\ell\theta}$. This is usually done computationally; however, in a uniform medium, the following analytic form demonstrates the correct behavior, and is a good approximation:

$$\phi = \frac{\sqrt{n}\, x}{\sqrt{x^2 + 2}}.$$

Here, $n$ is the density far from the vortex and $x = \rho/\xi$, where $\xi$ is the healing length of the condensate.
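A small numerical sketch of this approximate core profile (Python; purely illustrative, in units where the far-field density n and the healing length ξ are both 1):

import math

def vortex_profile(rho, xi=1.0, n=1.0):
    """Approximate vortex core profile phi = sqrt(n) * x / sqrt(x^2 + 2), with x = rho/xi."""
    x = rho / xi
    return math.sqrt(n) * x / math.sqrt(x * x + 2.0)

for rho in (0.0, 0.5, 1.0, 2.0, 5.0, 20.0):
    print(rho, vortex_profile(rho))
# The density |phi|^2 vanishes on the vortex axis and approaches n far from it.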
A singly charged vortex ($\ell = 1$) is in the ground state, with its energy $\epsilon_v$ given by

$$\epsilon_v = \pi n \frac{\hbar^2}{m} \ln\left(1.464\, \frac{b}{\xi}\right),$$

where $b$ is the farthest distance from the vortices considered. (To obtain an energy which is well defined it is necessary to include this boundary $b$.)

For multiply charged vortices ($\ell > 1$) the energy is approximated by

$$\epsilon_v \approx \ell^2 \pi n \frac{\hbar^2}{m} \ln\left(\frac{b}{\xi}\right),$$

which is greater than that of $\ell$ singly charged vortices, indicating that these multiply charged vortices are unstable to decay. Research has, however, indicated that they are metastable states, so they may have relatively long lifetimes.
Closely related to the creation of vortices in BECs is the generation of so-called dark solitons in one-dimensional BECs. These topological objects feature a phase gradient across their nodal plane, which stabilizes their shape even in propagation and interaction. Although solitons carry no charge and are thus prone to decay, relatively long-lived dark solitons have been produced and studied extensively.
Experiments led by Randall Hulet at Rice University from 1995 through 2000 showed that lithium condensates with attractive interactions could stably exist up to a critical atom number. Quench cooling the gas, they observed the condensate to grow, then subsequently collapse as the attraction overwhelmed the zero-point energy of the confining potential, in a burst reminiscent of a supernova, with an explosion preceded by an implosion.
Further work on attractive condensates was performed in 2000 by the JILA team, of Cornell, Wieman and coworkers. Their instrumentation now had better control so they used naturally "attracting" atoms of rubidium-85 (having negative atom–atom scattering length). Through Feshbach resonance involving a sweep of the magnetic field causing spin flip collisions, they lowered the characteristic, discrete energies at which rubidium bonds, making their Rb-85 atoms repulsive and creating a stable condensate. The reversible flip from attraction to repulsion stems from quantum interference among wave-like condensate atoms.
When the JILA team raised the magnetic field strength further, the condensate suddenly reverted to attraction, imploded and shrank beyond detection, then exploded, expelling about two-thirds of its 10,000 atoms. About half of the atoms in the condensate seemed to have disappeared from the experiment altogether, not seen in the cold remnant or expanding gas cloud. Carl Wieman explained that under current atomic theory this characteristic of Bose–Einstein condensate could not be explained because the energy state of an atom near absolute zero should not be enough to cause an implosion; however, subsequent mean field theories have been proposed to explain it. Most likely they formed molecules of two rubidium atoms; energy gained by this bond imparts velocity sufficient to leave the trap without being detected.
The process of creation of molecular Bose condensate during the sweep of the magnetic field throughout the Feshbach resonance, as well as the reverse process, are described by the exactly solvable model that can explain many experimental observations.
Compared to more commonly encountered states of matter, Bose–Einstein condensates are extremely fragile. The slightest interaction with the external environment can be enough to warm them past the condensation threshold, eliminating their interesting properties and forming a normal gas.
Nevertheless, they have proven useful in exploring a wide range of questions in fundamental physics, and the years since the initial discoveries by the JILA and MIT groups have seen an increase in experimental and theoretical activity. Examples include experiments that have demonstrated interference between condensates due to wave–particle duality, the study of superfluidity and quantized vortices, the creation of bright matter wave solitons from Bose condensates confined to one dimension, and the slowing of light pulses to very low speeds using electromagnetically induced transparency. Vortices in Bose–Einstein condensates are also currently the subject of analogue gravity research, studying the possibility of modeling black holes and their related phenomena in such environments in the laboratory. Experimenters have also realized "optical lattices", where the interference pattern from overlapping lasers provides a periodic potential. These have been used to explore the transition between a superfluid and a Mott insulator, and may be useful in studying Bose–Einstein condensation in fewer than three dimensions, for example the Tonks–Girardeau gas. Further, the sensitivity of the pinning transition of strongly interacting bosons confined in a shallow one-dimensional optical lattice, originally observed by Haller, has been explored via a tweaking of the primary optical lattice by a secondary weaker one. Thus for a resulting weak bichromatic optical lattice, it has been found that the pinning transition is robust against the introduction of the weaker secondary optical lattice. Studies of vortices in nonuniform Bose–Einstein condensates, as well as excitations of these systems by the application of moving repulsive or attractive obstacles, have also been undertaken. Within this context, the conditions for order and chaos in the dynamics of a trapped Bose–Einstein condensate have been explored by the application of moving blue- and red-detuned laser beams via the time-dependent Gross–Pitaevskii equation.
Bose–Einstein condensates composed of a wide range of isotopes have been produced.
Cooling fermions to extremely low temperatures has created degenerate gases, subject to the Pauli exclusion principle. To exhibit Bose–Einstein condensation, the fermions must "pair up" to form bosonic compound particles (e.g. molecules or Cooper pairs). The first molecular condensates were created in November 2003 by the groups of Rudolf Grimm at the University of Innsbruck, Deborah S. Jin at the University of Colorado at Boulder and Wolfgang Ketterle at MIT. Jin quickly went on to create the first fermionic condensate, working with the same system but outside the molecular regime.
In 1999, Danish physicist Lene Hau led a team from Harvard University which slowed a beam of light to about 17 meters per second using a superfluid. Hau and her associates have since made a group of condensate atoms recoil from a light pulse such that they recorded the light's phase and amplitude, recovered by a second nearby condensate, in what they term "slow-light-mediated atomic matter-wave amplification" using Bose–Einstein condensates: details are discussed in "Nature".
Another current research interest is the creation of Bose–Einstein condensates in microgravity in order to use its properties for high precision atom interferometry. The first demonstration of a BEC in weightlessness was achieved in 2008 at a drop tower in Bremen, Germany by a consortium of researchers led by Ernst M. Rasel from Leibniz University Hannover. The same team demonstrated in 2017 the first creation of a Bose–Einstein condensate in space and it is also the subject of two upcoming experiments on the International Space Station.
Researchers in the new field of atomtronics use the properties of Bose–Einstein condensates when manipulating groups of identical cold atoms using lasers.
In 1970, BECs were proposed by Emmanuel David Tannenbaum for anti-stealth technology.
P. Sikivie and Q. Yang showed that cold dark matter axions would form a Bose–Einstein condensate by thermalisation because of gravitational self-interactions. Axions have not yet been confirmed to exist. However, the search for them has been greatly enhanced with the completion of upgrades to the Axion Dark Matter Experiment (ADMX) at the University of Washington in early 2018.
In 2014 a potential dibaryon was detected at the Jülich Research Center at about 2380 MeV. The center claimed that the measurements confirmed results from 2011, via a more replicable method. The particle existed for 10^−23 seconds and was named d*(2380). This particle is hypothesized to consist of three up and three down quarks. It is theorized that groups of d-stars could form Bose–Einstein condensates due to prevailing low temperatures in the early universe, and that BECs made of such hexaquarks with trapped electrons could behave like dark matter.
The effect has mainly been observed on alkali atoms, which have nuclear properties particularly suitable for working with traps. As of 2012, using ultra-low temperatures of 10^−7 K or below, Bose–Einstein condensates had been obtained for a multitude of isotopes, mainly of alkali metal, alkaline earth metal, and lanthanide atoms. Research was finally successful in hydrogen with the aid of the newly developed method of 'evaporative cooling'. In contrast, the superfluid state of helium-4 below 2.17 K is not a good example, because the interaction between the atoms is too strong. Only 8% of the atoms are in the ground state near absolute zero, rather than the 100% of a true condensate.
The bosonic behavior of some of these alkali gases appears odd at first sight, because their nuclei have half-integer total spin. It arises from a subtle interplay of electronic and nuclear spins: at ultra-low temperatures and corresponding excitation energies, the half-integer total spin of the electronic shell and the half-integer total spin of the nucleus are coupled by a very weak hyperfine interaction. The total spin of the atom, arising from this coupling, is an integer value. The chemistry of systems at room temperature is determined by the electronic properties, which are essentially fermionic, since room-temperature thermal excitations have typical energies much higher than the hyperfine values. | https://en.wikipedia.org/wiki?curid=4474 |
B (programming language)
B is a programming language developed at Bell Labs circa 1969. It is the work of Ken Thompson with Dennis Ritchie.
B was derived from BCPL, and its name may be a contraction of BCPL. Thompson's coworker Dennis Ritchie speculated that the name might be based on Bon, an earlier, but unrelated, programming language that Thompson designed for use on Multics.
B was designed for recursive, non-numeric, machine-independent applications, such as system and language software. It was a typeless language, with the only data type being the underlying machine's natural memory word format, whatever that might be. Depending on the context, the word was treated either as an integer or a memory address.
As machines with ASCII processing became common, notably the DEC PDP-11 that arrived at Bell, support for character data stuffed in memory words became important. The typeless nature of the language was seen as a disadvantage, which led Thompson and Ritchie to develop an expanded version of the language supporting new internal and user-defined types, which became the C programming language.
Circa 1969, Ken Thompson and later Dennis Ritchie developed B, basing it mainly on the BCPL language Thompson had used in the Multics project. B was essentially the BCPL system stripped of any component Thompson felt he could do without in order to make it fit within the memory capacity of the minicomputers of the time. The BCPL-to-B transition also included changes made to suit Thompson's preferences (mostly along the lines of reducing the number of non-whitespace characters in a typical program). Much of the typical ALGOL-like syntax of BCPL was rather heavily changed in this process. The assignment operator := changed to = and the equality operator = was replaced by ==.
Thompson added "two-address assignment operators" using codice_5 syntax to add y to x (in C the operator is written codice_6). This syntax came from Douglas McIlroy's implementation of TMG, in which B's compiler was first implemented (and it came to TMG from ALGOL 68's codice_7 syntax). Thompson went further by inventing the increment and decrement operators (codice_8 and codice_9). Their prefix or postfix position determines whether the value is taken before or after alteration of the operand. This innovation was not in the earliest versions of B. According to Dennis Ritchie, people often assumed that they were created for the auto-increment and auto-decrement address modes of the DEC PDP-11, but this is historically impossible as the machine didn't exist when B was first developed.
B is typeless, or more precisely has one data type: the computer word. Most operators (e.g. +, -, *, /) treated this as an integer, but others treated it as a memory address to be dereferenced. In many other ways it looked a lot like an early version of C. There are a few library functions, including some that vaguely resemble functions from the standard I/O library in C.
Early implementations were for the DEC PDP-7 and PDP-11 minicomputers using early Unix, and Honeywell 36-bit mainframes running the operating system GCOS. The earliest PDP-7 implementations compiled to threaded code, and Ritchie wrote a compiler using TMG which produced machine code. In 1970 a PDP-11 was acquired and threaded code was used for the port; an assembler, dc, and the B language itself were written in B to bootstrap the computer. An early version of yacc was produced with this PDP-11 configuration. Ritchie took over maintenance during this period.
The typeless nature of B made sense on the Honeywell, PDP-7 and many older computers, but was a problem on the PDP-11 because it was difficult to elegantly access the character data type that the PDP-11 and most modern computers fully support. Starting in 1971 Ritchie made changes to the language while converting its compiler to produce machine code, most notably adding data typing for variables. During 1971 and 1972 B evolved into "New B" (NB) and then C.
B is almost extinct, having been superseded by the C language. However, it continues to see use on GCOS mainframes and on certain embedded systems for a variety of reasons: limited hardware in small systems, extensive libraries, tooling, licensing cost issues, and simply being good enough for the job. The highly influential AberMUD was originally written in B.
The following examples are from the "Users' Reference to B" by Ken Thompson:
/* The following function will print a non-negative number, n, to
   the base b, where 2<=b<=10. This routine uses the fact that
   in the ASCII character set, the digits 0 to 9 have sequential
   code values. */
printn(n, b) {
    extrn putchar;
    auto a;

    if (a = n / b)        /* assignment, not test for equality */
        printn(a, b);     /* recursive */
    putchar(n % b + '0');
}
/* The following program will calculate the constant e-2 to about
   4000 decimal digits, and print it 50 characters to the line in
   groups of 5 characters. */
main() {
    extrn putchar, n, v;
    auto i, c, col, a;

    i = col = 0;
    while (i < n)
        v[i++] = 1;          /* e-2 is 0.1111... in the factorial base */
    while (col < 2*n) {
        a = n + 1;
        c = i = 0;
        while (i < n) {      /* multiply the fraction by 10, propagating carries */
            c =+ v[i] * 10;
            v[i++] = c % a;
            c =/ a--;
        }
        col++;
        if (!(col % 5))
            putchar(col % 50 ? ' ' : '*n');  /* '*n' is B's newline escape */
        while (putchar(c + '0'), c =/ 10);   /* print the digit(s) produced */
    }
}

v[2000];   /* external vector of 2000 words */
n 2000;    /* external scalar initialized to 2000 */ | https://en.wikipedia.org/wiki?curid=4475 |
Beer–Lambert law
The Beer–Lambert law, also known as Beer's law, the Lambert–Beer law, or the Beer–Lambert–Bouguer law relates the attenuation of light to the properties of the material through which the light is travelling. The law is commonly applied to chemical analysis measurements and used in understanding attenuation in physical optics, for photons, neutrons, or rarefied gases. In mathematical physics, this law arises as a solution of the BGK equation.
The law was discovered by Pierre Bouguer before 1729, while looking at red wine, during a brief vacation in Alentejo, Portugal. It is often attributed to Johann Heinrich Lambert, who cited Bouguer's "Essai d'optique sur la gradation de la lumière" (Claude Jombert, Paris, 1729)—and even quoted from it—in his "Photometria" in 1760. Lambert's law stated that the loss of light intensity when it propagates in a medium is directly proportional to intensity and path length. Much later, August Beer discovered another attenuation relation in 1852. Beer's law stated that the transmittance of a solution remains constant if the product of concentration and path length stays constant. The modern derivation of the Beer–Lambert law combines the two laws and correlates the absorbance, which is the negative decadic logarithm of the transmittance, to both the concentrations of the attenuating species and the thickness of the material sample.
A common and practical expression of the Beer–Lambert law relates the optical attenuation of a physical material containing a single attenuating species of uniform concentration to the optical path length through the sample and the absorptivity of the species. This expression is

$$A = \varepsilon \ell c,$$

where $A$ is the absorbance, $\varepsilon$ is the molar attenuation coefficient or absorptivity of the attenuating species, $\ell$ is the optical path length, and $c$ is the amount concentration of the attenuating species.
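As a sketch of how this expression is used in practice, the Python snippet below solves A = εℓc for an unknown concentration; the numerical values are illustrative assumptions, not data from the text.

def concentration_from_absorbance(A, epsilon, path_length):
    """Solve the Beer-Lambert relation A = epsilon * l * c for c.
    epsilon in L mol^-1 cm^-1, path_length in cm; returns mol/L."""
    return A / (epsilon * path_length)

# Assumed example: A = 0.35 in a 1 cm cuvette with epsilon = 5.5e4 L/(mol*cm)
print(concentration_from_absorbance(0.35, 5.5e4, 1.0))   # ~6.4e-6 mol/L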
A more general form of the Beer–Lambert law states that, for $N$ attenuating species in the material sample,

$$T = \exp\left(-\sum_{i=1}^{N} \sigma_i \int_0^\ell n_i(z)\, dz\right),$$

or equivalently that

$$T = 10^{-\sum_{i=1}^{N} \varepsilon_i \int_0^\ell c_i(z)\, dz},$$

where $\sigma_i$ is the attenuation cross section of the attenuating species $i$, $n_i(z)$ its number density at position $z$, $\varepsilon_i$ its molar attenuation coefficient, $c_i(z)$ its amount concentration at position $z$, and $\ell$ the path length of the beam of light through the material sample.

In the above equations, the transmittance $T$ of the material sample is related to its optical depth $\tau$ and to its absorbance $A$ by the following definition

$$T = \frac{\Phi_e^t}{\Phi_e^i} = e^{-\tau} = 10^{-A},$$

where $\Phi_e^t$ is the radiant flux transmitted by the material sample and $\Phi_e^i$ is the radiant flux received by the material sample.
Attenuation cross section and molar attenuation coefficient are related by

$$\varepsilon_i = \frac{N_A}{\ln 10}\, \sigma_i,$$

and number density and amount concentration by

$$n_i = N_A\, c_i,$$

where $N_A$ is the Avogadro constant.
In case of "uniform" attenuation, these relations become
or equivalently
Cases of "non-uniform" attenuation occur in atmospheric science applications and radiation shielding theory for instance.
The law tends to break down at very high concentrations, especially if the material is highly scattering. Absorbance within the range of 0.2 to 0.5 is ideal to maintain linearity in the Beer–Lambert law. If the radiation is especially intense, nonlinear optical processes can also cause variances. The main reason, however, is that the concentration dependence is in general non-linear and Beer's law is valid only under certain conditions, as shown by the derivation below. For strong oscillators and at high concentrations the deviations are stronger. If the molecules are closer to each other, interactions can set in. These interactions can be roughly divided into physical and chemical interactions. Physical interactions do not alter the polarizability of the molecules as long as the interaction is not so strong that light and molecular quantum states intermix (strong coupling), but they cause the attenuation cross sections to be non-additive via electromagnetic coupling. Chemical interactions, in contrast, change the polarizability and thus absorption.
The Beer–Lambert law can be expressed in terms of attenuation coefficient, but in this case it is better called Lambert's law, since amount concentration, from Beer's law, is hidden inside the attenuation coefficient. The (Napierian) attenuation coefficient $\mu$ and the decadic attenuation coefficient $\mu_{10}$ of a material sample are related to its number densities and amount concentrations as

$$\mu(z) = \sum_{i=1}^{N} \sigma_i n_i(z), \qquad \mu_{10}(z) = \sum_{i=1}^{N} \varepsilon_i c_i(z),$$

respectively, by definition of attenuation cross section and molar attenuation coefficient. Then the Beer–Lambert law becomes

$$T = \exp\left(-\int_0^\ell \mu(z)\, dz\right),$$

and

$$T = 10^{-\int_0^\ell \mu_{10}(z)\, dz}.$$
In case of "uniform" attenuation, these relations become
or equivalently
In many cases, the attenuation coefficient does not vary with $z$, in which case one does not have to perform an integral and can express the law as

$$I(z) = I_0\, e^{-\mu z},$$

where the attenuation is usually the sum of an absorption coefficient $\alpha$ (creation of electron-hole pairs) and a scattering coefficient (for example Rayleigh scattering if the scattering centers are much smaller than the incident wavelength). Also note that for some systems we can put $1/\lambda_{\text{mfp}}$ (1 over the inelastic mean free path) in place of $\mu$.
Assume that a beam of light enters a material sample. Define $z$ as an axis parallel to the direction of the beam. Divide the material sample into thin slices, perpendicular to the beam of light, with thickness $dz$ sufficiently small that one particle in a slice cannot obscure another particle in the same slice when viewed along the $z$ direction. The radiant flux of the light that emerges from a slice is reduced, compared to that of the light that entered, by $d\Phi_e(z) = -\mu(z)\Phi_e(z)\, dz$, where $\mu$ is the (Napierian) attenuation coefficient, which yields the following first-order linear ODE:

$$\frac{d\Phi_e(z)}{dz} = -\mu(z)\Phi_e(z).$$

The attenuation is caused by the photons that did not make it to the other side of the slice because of scattering or absorption. The solution to this differential equation is obtained by multiplying the integrating factor

$$e^{\int_0^z \mu(z')\, dz'}$$

throughout to obtain

$$\frac{d\Phi_e(z)}{dz}\, e^{\int_0^z \mu(z')\, dz'} + \mu(z)\Phi_e(z)\, e^{\int_0^z \mu(z')\, dz'} = 0,$$

which simplifies due to the product rule (applied backwards) to

$$\frac{d}{dz}\left[\Phi_e(z)\, e^{\int_0^z \mu(z')\, dz'}\right] = 0.$$

Integrating both sides and solving for $\Phi_e$ for a material of real thickness $\ell$, with the incident radiant flux upon the slice $\Phi_e^i = \Phi_e(0)$ and the transmitted radiant flux $\Phi_e^t = \Phi_e(\ell)$, gives

$$\Phi_e^t = \Phi_e^i\, e^{-\int_0^\ell \mu(z)\, dz},$$

and finally

$$T = \frac{\Phi_e^t}{\Phi_e^i} = e^{-\int_0^\ell \mu(z)\, dz}.$$

Since the decadic attenuation coefficient $\mu_{10}$ is related to the (Napierian) attenuation coefficient by $\mu_{10} = \mu/\ln 10$, one also has

$$T = 10^{-\int_0^\ell \mu_{10}(z)\, dz}.$$
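As a quick numerical check of this solution (Python; the attenuation coefficient, thickness, and step count are arbitrary illustrative choices), one can integrate the differential equation directly and compare with the closed-form transmittance:

import math

mu, ell, steps = 2.3, 1.0, 100_000   # assumed values, arbitrary units
dz = ell / steps

phi = 1.0   # incident radiant flux, normalized
for _ in range(steps):
    phi -= mu * phi * dz   # explicit Euler step of d(phi)/dz = -mu * phi

print(phi)                  # ~0.1003
print(math.exp(-mu * ell))  # exact: exp(-2.3) = 0.1003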
To describe the attenuation coefficient in a way independent of the number densities $n_i$ of the $N$ attenuating species of the material sample, one introduces the attenuation cross section $\sigma_i = \mu_i(z)/n_i(z)$. $\sigma_i$ has the dimension of an area; it expresses the likelihood of interaction between the particles of the beam and the particles of the species $i$ in the material sample:

$$\mu(z) = \sum_{i=1}^{N} \sigma_i n_i(z).$$

One can also use the molar attenuation coefficients $\varepsilon_i = (N_A/\ln 10)\, \sigma_i$, where $N_A$ is the Avogadro constant, to describe the attenuation coefficient in a way independent of the amount concentrations $c_i(z)$ of the attenuating species of the material sample:

$$\mu(z) = \ln(10) \sum_{i=1}^{N} \varepsilon_i c_i(z).$$
The above assumption that the attenuation cross sections are additive is generally incorrect since electromagnetic coupling occurs if the distances between the absorbing entities is small.
The derivation of the concentration dependence of the absorbance is based on electromagnetic theory. Accordingly, the macroscopic polarization of a medium $P$ derives from the microscopic dipole moments $\mu_d$ in the absence of interaction according to

$$P = \mu_d N',$$

where $\mu_d$ is the dipole moment and $N'$ the number of absorbing entities per unit volume. On the other hand, macroscopic polarization is given by:

$$P = (\varepsilon_r - 1)\, \varepsilon_0 E.$$

Here $\varepsilon_r$ represents the relative dielectric function, $\varepsilon_0$ the vacuum permittivity and $E$ the electric field. After equating and solving for the relative dielectric function the result is:

$$\varepsilon_r = 1 + \frac{\mu_d N'}{\varepsilon_0 E}.$$

If we take into account that the polarizability $\alpha$ is defined by $\mu_d = \alpha\, \varepsilon_0 E$ and that for the number of absorbers per unit volume $N' = c\, N_A$ holds, it follows that:

$$\varepsilon_r = 1 + \alpha\, c\, N_A.$$

According to Maxwell's wave equation the following relation between the complex dielectric function and the complex index of refraction function holds, $\hat n^2 = \hat\varepsilon_r$, for isotropic and homogeneous media. Therefore:

$$\hat n^2 = (n + i\kappa)^2 = 1 + \alpha\, c\, N_A.$$

The imaginary part of the complex index of refraction is the index of absorption $\kappa$. Employing the imaginary part of the polarizability $\alpha''$ and the approximation $\hat n \approx 1$, it follows that:

$$\kappa \approx \frac{\alpha'' c\, N_A}{2}.$$

Taking into account the relation between $\kappa$ and the molar attenuation coefficient, $\varepsilon = \frac{4\pi\kappa}{\ln(10)\, \lambda\, c}$, it eventually follows that

$$\varepsilon = \frac{2\pi N_A \alpha''}{\ln(10)\, \lambda}.$$

As a consequence, the linear relation between concentration and absorbance is generally an approximation, and holds in particular only for small polarizabilities and weak absorptions, i.e. small oscillator strengths. If we do not introduce the approximation $\hat n \approx 1$, and employ instead the relation between the imaginary part of the relative dielectric function and the indices of refraction and absorption, $\text{Im}(\hat\varepsilon_r) = 2 n \kappa$, it can be seen that the molar attenuation coefficient depends on the index of refraction (which is itself concentration dependent):

$$\varepsilon = \frac{2\pi N_A \alpha''}{\ln(10)\, \lambda\, n}.$$
Under certain conditions the Beer–Lambert law fails to maintain a linear relationship between attenuation and concentration of analyte. These deviations are classified into three categories: real deviations, due to the limitations of the law itself; chemical deviations, observed due to specific chemical species of the sample being analyzed; and instrumental deviations, which occur due to how the attenuation measurements are made.
There are at least six conditions that need to be fulfilled in order for the Beer–Lambert law to be valid. These are: the attenuators must act independently of each other; the attenuating medium must be homogeneous in the interaction volume; the attenuating medium must not scatter the radiation (no turbidity), unless this is accounted for; the incident radiation must consist of parallel rays, each traversing the same length in the absorbing medium; the incident radiation should preferably be monochromatic, or have at least a width that is narrower than that of the attenuating transition; and the incident flux must not influence the atoms or molecules, acting only as a non-invasive probe of the species under study (in particular, the light should not cause optical saturation or optical pumping).
If any of these conditions are not fulfilled, there will be deviations from Beer–Lambert law.
The Beer–Lambert law is not compatible with Maxwell's equations. Strictly speaking, the law does not describe the transmittance through a medium, but the propagation within that medium. It can be made compatible with Maxwell's equations if the transmittance of a sample with solute is ratioed against the transmittance of the pure solvent, which explains why it works so well in spectrophotometry. As this is not possible for pure media, the uncritical employment of the Beer–Lambert law can easily generate errors of the order of 100% or more. In such cases it is necessary to apply the transfer-matrix method.
Recently it has also been demonstrated that Beer's law is a limiting law, since the absorbance is only approximately linearly depending on concentration. The reason is that the attenuation coefficient also depends on concentration and density, even in the absence of any interactions. These changes are, however, usually negligible except for high concentrations and large oscillator strength. For high concentrations and/or oscillator strengths, it is the integrated absorbance which is linearly depending on concentration, at least as long as there are no local field effects. If there are local field effects, they can be approximately taken into account by applying the Lorentz-Lorenz relation. In fact, Beer's law, i.e. the concentration dependence of absorbance, can be derived directly from the Lorentz-Lorenz relation (or, equivalently, the Clausius-Mossotti relation). Correspondingly, it can be demonstrated that there is a twin law according to which the change of the refractive index is approximately linear to the molar concentration for diluted solutions. This twin law can also be derived from the Lorentz-Lorenz relation.
The Beer–Lambert law can be applied to the analysis of a mixture by spectrophotometry, without the need for extensive pre-processing of the sample. An example is the determination of bilirubin in blood plasma samples. The spectrum of pure bilirubin is known, so the molar attenuation coefficient $\varepsilon$ is known. Measurements of the decadic attenuation coefficient $\mu_{10}$ are made at one wavelength $\lambda$ that is nearly unique for bilirubin and at a second wavelength in order to correct for possible interferences. The amount concentration $c$ is then given by

$$c = \frac{\mu_{10}(\lambda)}{\varepsilon(\lambda)}.$$
For a more complicated example, consider a mixture in solution containing two species at amount concentrations $c_1$ and $c_2$. The decadic attenuation coefficient at any wavelength $\lambda$ is given by

$$\mu_{10}(\lambda) = \varepsilon_1(\lambda)\, c_1 + \varepsilon_2(\lambda)\, c_2.$$
Therefore, measurements at two wavelengths yield two equations in two unknowns and will suffice to determine the amount concentrations $c_1$ and $c_2$, as long as the molar attenuation coefficients of the two components, $\varepsilon_1$ and $\varepsilon_2$, are known at both wavelengths. This system of two equations can be solved using Cramer's rule. In practice it is better to use linear least squares to determine the two amount concentrations from measurements made at more than two wavelengths. Mixtures containing more than two components can be analyzed in the same way, using a minimum of $N$ wavelengths for a mixture containing $N$ components.
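A minimal sketch of that least-squares approach (Python with NumPy; the attenuation coefficients and absorbance readings below are made-up illustrative values, not data from the text):

import numpy as np

# Rows: wavelengths; columns: species. Entries are assumed molar
# attenuation coefficients epsilon_i(lambda) in L mol^-1 cm^-1.
E = np.array([[5.0e4, 1.0e4],
              [2.0e4, 4.0e4],
              [1.0e4, 2.5e4]])
path = 1.0                        # path length in cm
A = np.array([0.30, 0.28, 0.15])  # assumed absorbance measurements

# Solve A = path * E @ c for the concentrations in the least-squares sense.
c, *_ = np.linalg.lstsq(path * E, A, rcond=None)
print(c)   # estimated concentrations c1, c2 in mol/L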
The law is used widely in infra-red spectroscopy and near-infrared spectroscopy for analysis of polymer degradation and oxidation (also in biological tissue) as well as to measure the concentration of various compounds in different food samples. The carbonyl group attenuation at about 6 micrometres can be detected quite easily, and degree of oxidation of the polymer calculated.
This law is also applied to describe the attenuation of solar or stellar radiation as it travels through the atmosphere. In this case, there is scattering of radiation as well as absorption. The optical depth for a slant path is $\tau' = m\tau$, where $\tau$ refers to a vertical path, $m$ is called the relative airmass, and for a plane-parallel atmosphere it is determined as $m = \sec\theta$, where $\theta$ is the zenith angle corresponding to the given path. The Beer–Lambert law for the atmosphere is usually written

$$T = \exp\left(-m\left(\tau_a + \tau_g + \tau_{\text{RS}} + \tau_{\text{NO}_2} + \tau_w + \tau_{\text{O}_3} + \tau_r\right)\right),$$

where each $\tau_x$ is the optical depth whose subscript identifies the source of the absorption or scattering it describes: $a$ stands for aerosols (which absorb and scatter), $g$ for uniformly mixed gases (mainly carbon dioxide and molecular oxygen, which only absorb), $\text{NO}_2$ for nitrogen dioxide (mainly due to urban pollution, absorption only), RS for effects due to Raman scattering in the atmosphere, $w$ for water vapour absorption, $\text{O}_3$ for ozone (absorption only), and $r$ for Rayleigh scattering from molecular oxygen and nitrogen (responsible for the blue color of the sky).
"m" is the "optical mass" or "airmass factor", a term approximately equal (for small and moderate values of "θ") to 1/cos "θ", where "θ" is the observed object's zenith angle (the angle measured from the direction perpendicular to the Earth's surface at the observation site). This equation can be used to retrieve "τ"a, the aerosol optical thickness, which is necessary for the correction of satellite images and also important in accounting for the role of aerosols in climate. | https://en.wikipedia.org/wiki?curid=4476 |
The Beach Boys
The Beach Boys are an American rock band formed in Hawthorne, California in 1961. The group's original lineup consisted of brothers Brian, Dennis, and Carl Wilson, their cousin Mike Love, and their friend Al Jardine. Distinguished by their vocal harmonies and early surf songs, they are one of the most influential acts of the rock era. The band drew on the music of jazz-based vocal groups, 1950s rock and roll, and black R&B to create their unique sound, and with Brian as composer, arranger, producer, and de facto leader, they often incorporated classical elements and unconventional recording techniques in innovative ways.
The Beach Boys began as a garage band led by Brian and managed by the Wilsons' father Murry. In 1963, the band gained national prominence with a string of top-ten singles reflecting a southern California youth culture of surfing, cars, and romance, dubbed the "California sound". From 1965, they abandoned beachgoing themes for more personal lyrics and ambitious orchestrations. In 1966, the "Pet Sounds" album and "Good Vibrations" single raised the group's prestige as rock innovators and established the band as symbols of the nascent counterculture era. After scrapping the unfinished album "Smile" in 1967, Brian's contributions diminished due to his mental health issues. The group's commercial momentum faltered, and despite efforts to maintain an experimental sound, they were effectively blacklisted by the early rock music press.
Carl took over as the band's musical leader until the late 1970s, during which they rebounded with successful worldwide concert tours. Personal struggles, creative disagreements, and the overshadowing success of the band's greatest hits albums precipitated their transition into an oldies act. Dennis drowned in 1983 and Brian soon became estranged from the group. Between the 1990s and 2000s, the members filed numerous lawsuits over royalties, defamation, songwriting credits, and use of the band's name. Following Carl's death from lung cancer in 1998, the group and its corporation (Brother Records Inc.) granted Love legal rights to tour as "the Beach Boys". Brian and Jardine do not currently perform with Love's Beach Boys, but remain official members of the band.
The Beach Boys are one of the most critically acclaimed, commercially successful, and influential bands of all time. They were one of the earliest self-contained rock bands and one of the few US bands who maintained their success before, during, and after the 1964 British Invasion. Between the 1960s and 2010s, they had over 80 songs chart worldwide, 36 of them in the US Top 40 (the most by a US rock band), and four topping the "Billboard" Hot 100. They have sold over 100 million records worldwide, making them one of the world's best-selling bands of all time, and are ranked number 12 on "Rolling Stone" magazine's 2004 list of the "100 Greatest Artists of All Time". Their influence spans musical genres and movements such as psychedelia, power pop, progressive rock, punk, alternative, and lo-fi. The core quintet of the three Wilsons, Love, and Jardine was inducted into the Rock and Roll Hall of Fame in 1988.
At the time of his sixteenth birthday on June 20, 1958, Brian Wilson shared a bedroom with his brothers, Dennis and Carl – aged thirteen and eleven, respectively – in their family home in Hawthorne. He had watched his father, Murry Wilson, play piano, and had listened intently to the harmonies of vocal groups such as the Four Freshmen. After dissecting songs such as "Ivory Tower" and "Good News", Brian would teach family members how to sing the background harmonies. For his birthday that year, Brian received a reel-to-reel tape recorder. He learned how to overdub, using his vocals and those of Carl and their mother. Brian played piano with Carl and David Marks, an eleven-year-old longtime neighbor, playing guitars they had each received as Christmas presents.
Soon Brian and Carl were avidly listening to Johnny Otis' KFOX radio show. Inspired by the simple structure and vocals of the rhythm and blues songs he heard, Brian changed his piano-playing style and started writing songs. Family gatherings brought the Wilsons in contact with cousin Mike Love. Brian taught Love's sister Maureen and a friend harmonies. Later, Brian, Love and two friends performed at Hawthorne High School. Brian also knew Al Jardine, a high school classmate. Brian suggested to Jardine that they team up with his cousin and brother Carl. Love gave the fledgling band its name: "The Pendletones", a pun on "Pendleton", a style of woolen shirt popular at the time. Dennis was the only avid surfer in the group, and he suggested that the group write songs that celebrated the sport and the lifestyle that it had inspired in Southern California. Brian finished the song, titled "Surfin'", and with Mike Love, wrote "Surfin' Safari". Murry recalled, "They had written a song called 'Surfin' ', which I never did like and still don't like, it was so rude and crude."
Murry Wilson, who was a sometime songwriter, arranged for the Pendletones to meet his publisher Hite Morgan. He said: "Finally, [Hite] agreed to hear it, and Mrs. Morgan said 'Drop everything, we're going to record your song. I think it's good.' And she's the one responsible." On September 15, 1961, the band recorded a demo of "Surfin'" with the Morgans. A more professional recording was made on October 3, at World Pacific Studio in Hollywood. David Marks was not present at the session as he was in school that day. Murry brought the demos to Herb Newman, owner of Candix Records and Era Records, and he signed the group on December 8. When the single was released a few weeks later, the band found that they had been renamed "the Beach Boys". Candix wanted to name the group the Surfers until Russ Regan, a young promoter with Era Records, noted that there already existed a group by that name. He suggested calling them the Beach Boys. "Surfin'" was a regional success for the West Coast, and reached number 75 on the national "Billboard" Hot 100 chart. It was so successful that the number of unpaid orders for the single bankrupted Candix.
Murry, by this time the de facto manager of the Beach Boys, landed the group's first paying gig (for which they earned $300) on New Year's Eve, 1961, at the Ritchie Valens Memorial Dance in Long Beach. In their earliest public appearances, the band wore heavy wool jacket-like shirts that local surfers favored before switching to their trademark striped shirts and white pants. In early 1962, Morgan requested that some of the members add vocals to a couple of instrumental tracks that he had recorded with other musicians. This led to the creation of the short-lived group Kenny & the Cadets, which Brian led under the pseudonym "Kenny". The other members were Carl, Jardine, and the Wilsons' mother Audree. In February, Jardine left the Beach Boys to study dentistry and was replaced by David Marks. Murry remembered that after "Surfin'", the group had a difficult time being picked up by another label; "they [all] thought [the group was] a one-shot record."
After being turned down by Dot and Liberty, the Beach Boys signed a seven-year contract with Capitol Records. This was at the urging of Capitol executive and staff producer Nick Venet who signed the group, seeing them as the "teenage gold" he had been scouting for. On June 4, 1962, the Beach Boys debuted on Capitol with their second single, "Surfin' Safari" backed with "409". The release prompted national coverage in the June 9 issue of "Billboard", which praised Love's lead vocal and said the song had potential. "Surfin' Safari" rose to number 14 and found airplay in New York and Phoenix, a surprise for the label.
The Beach Boys completed their first album, "Surfin' Safari", with production credited to Nick Venet. Carl later denied that Venet had any significant role in the group's early music, saying that Venet "would be in the booth, and he would call the take number, and that was about it. I wouldn't call him a musical heavy by any ... Brian didn't want anything to do with Venet." "Surfin' Safari", released in October 1962, was different from other rock albums of the time in that it consisted almost entirely of original songs, primarily written by Brian with Mike Love and friend Gary Usher. Another unusual feature of the Beach Boys was that, although they were marketed as "surf music", their repertoire bore little resemblance to the music of other surf bands, which was mainly instrumental and incorporated heavy use of spring reverb. For this reason, some of the Beach Boys' early local performances had young audience members throwing vegetables at the band, believing that the group were poseurs.
In January 1963, the Beach Boys recorded their first top-ten single, "Surfin' U.S.A.", which began their long run of highly successful recording efforts. It was during the sessions for this single that Brian made the production decision from that point on to use double tracking on the group's vocals, resulting in a deeper and more resonant sound. The album of the same name followed in March and reached number 2 on the "Billboard" charts. Its success propelled the group into a nationwide spotlight, and was vital to launching surf music as a national craze, albeit the Beach Boys' vocal approach to the genre, not the original instrumental style pioneered by Dick Dale. Biographer Luis Sanchez highlights the "Surfin' U.S.A." single as a turning point for the band, "creat[ing] a direct passage to California life for a wide teenage audience ... [and] a distinct Southern California sensibility that exceeded its conception as such to advance right to the front of American consciousness."
Throughout 1963, and for the next few years, Brian produced a variety of singles for outside artists. Among these were the Honeys, a surfer trio comprising sisters Diane and Marilyn Rovell and their cousin Ginger Blake. Brian was convinced that they could be a successful female counterpart to the Beach Boys, and he produced a number of singles for them, although they never replicated the Beach Boys' popularity. He also attended some of Phil Spector's sessions at Gold Star Studios. His creative and songwriting interests were reinvigorated upon hearing the Ronettes' 1963 song "Be My Baby", which was produced by Spector. He first heard the song while driving and was so overwhelmed that he had to pull over to the side of the road to analyze the chorus. Later, he reflected: "I was unable to really think as a producer up until the time where I really got familiar with Phil Spector's work. That was when I started to design the experience to be a record rather than just a song."
The surf music craze, along with the careers of nearly all surf acts, was slowly replaced by the British Invasion. Following a successful Australasian tour in January and February 1964, the Beach Boys returned home to face their new competition, the Beatles. Both groups shared the same record label in the US, and Capitol's support for the Beach Boys immediately began waning. This caused Murry to fight for the band at the label more than before, often visiting their offices without warning to "twist executive arms". Carl said that Phil Spector "was Brian's favorite kind of rock; he liked [him] better than the early Beatles stuff. He loved the Beatles' later music when they evolved and started making intelligent, masterful music, but before that Phil was it." According to Mike Love, Carl followed the Beatles more closely than anyone else in the band, while Brian was the most "rattled" by the Beatles and felt tremendous pressure to "keep pace" with them. For Brian, the Beatles ultimately "eclipsed a lot [of what] we'd worked for ... [they] eclipsed the whole music world."
Brian wrote his last surf song in April 1964. That month, during recording of the single "I Get Around", Murry was relieved of his duties as manager, though he remained in close contact with the group and attempted to continue advising on their career decisions. Released in May, "I Get Around" climbed to number one, the band's first single to do so, proving that the Beach Boys could compete with contemporary British pop groups. In July, the album on which the song appeared, "All Summer Long", reached No. 4 in the US. "All Summer Long" introduced exotic textures to the Beach Boys' sound, exemplified by the piccolos and xylophones of its title track. The album was a swan song to the surf and car music on which the Beach Boys had built their commercial standing; later albums took a different stylistic and lyrical path. Before that shift, a live album, "Beach Boys Concert", was released in October to a four-week chart stay at number one, containing a set list of previously recorded songs and covers that they had not yet recorded.
In June 1964, Brian recorded the bulk of "The Beach Boys' Christmas Album" with a forty-one-piece studio orchestra in collaboration with Four Freshmen arranger Dick Reynolds. The album was a response to Phil Spector's "A Christmas Gift for You" (1963). Released in December, the Beach Boys' album was divided between five new, original Christmas-themed songs, and seven reinterpretations of traditional Christmas songs. It came to be regarded as one of the finest holiday albums of the rock era. One single from the album, "The Man with All the Toys", was released, peaking at No. 6 on the US "Billboard" Christmas chart. On October 29, the Beach Boys performed for "The T.A.M.I. Show", a concert film intended to bring together a wide range of musicians for a one-off performance. The result was released to movie theaters one month later.
By the end of 1964, the stress of road travel, writing, and producing became too much for Brian. On December 23, while on a flight from Los Angeles to Houston, he suffered a panic attack hours after performing with the Beach Boys on the musical variety series "Shindig!". In January 1965, he announced his withdrawal from touring to concentrate entirely on songwriting and record production. For the rest of 1964 and into 1965, session musician Glen Campbell served as Brian's temporary replacement in concert. Carl took over as the band's musical director onstage.
Now a full-time studio artist, Brian wanted to move the Beach Boys beyond their surf aesthetic, believing that their image was antiquated and distracting the public from his talents as a producer and songwriter. In the period following his resignation from touring, Brian put more distance between himself and his bandmates, and began expanding his social circle to include a mix of worldly-minded friends, musicians, mystics, and business advisers. He also took an increasing interest in the developing Los Angeles "hip" scene and in recreational drugs (particularly marijuana, LSD, and Desbutal). Musically, he said he began to "take the things I learned from Phil Spector and use more instruments whenever I could. I doubled up on basses and tripled up on keyboards, which made everything sound bigger and deeper."
Released in March 1965, "The Beach Boys Today!" marked the first time the group experimented with the "album-as-art" form. The tracks on side one feature an uptempo sound that contrasts with side two, which consists mostly of emotional ballads. Music writer Scott Schinder cited its "suite-like structure" as an early example of the rock album format being used to make a cohesive artistic statement. Brian also established his new lyrical approach toward the autobiographical; journalist Nick Kent wrote that the subjects of Brian's songs "were suddenly no longer simple happy souls harmonizing their sun-kissed innocence and dying devotion to each other over a honey-coated backdrop of surf and sand. Instead, they'd become highly vulnerable, slightly neurotic and riddled with telling insecurities." In the book "Yeah Yeah Yeah: The Story of Modern Pop", Bob Stanley remarked that "Brian was aiming for Johnny Mercer but coming up proto-indie." In 2012, the album was voted number 271 on "Rolling Stone" magazine's list of the 500 Greatest Albums of All Time.
In April 1965, Campbell's own career success pulled him from touring with the group. Columbia Records staff producer Bruce Johnston was asked to locate a replacement; having failed to find one, Johnston himself became a full-time member of the band on May 19, 1965, first replacing Brian on the road and later contributing in the studio. His studio work began with the June 4 vocal sessions for "California Girls", which appeared on the band's next album, "Summer Days (And Summer Nights!!)", and charted at number three in the US while the album reached number two. The album also included a reworked arrangement of "Help Me, Rhonda", which became the band's second number-one single in the spring of 1965.
To appease Capitol's demands for a Beach Boys LP for the 1965 Christmas season, Brian conceived "Beach Boys' Party!", a live-in-the-studio album consisting mostly of acoustic covers of 1950s rock and R&B songs, in addition to covers of three Beatles songs, Bob Dylan's "The Times They Are a-Changin'", and idiosyncratic rerecordings of the group's earlier songs. The album was an early precursor of the "unplugged" trend. It included a cover of the Regents' song "Barbara Ann", which unexpectedly reached number two when released as a single several weeks later. In November, the group released "The Little Girl I Once Knew", considered the band's most experimental statement thus far. The single continued Brian's ambitions for daring arrangements, featuring unexpected tempo changes and numerous false endings, but it peaked at number 20, the band's second single to miss the top ten since their 1962 breakthrough. According to Luis Sanchez, in 1965, Bob Dylan was "rewriting the rules for pop success" with his music and image, and it was at this juncture that Wilson "led The Beach Boys into a transitional phase in an effort to win the pop terrain that had been thrown up for grabs."
In January 1966, Wilson commenced recording sessions for the Beach Boys' forthcoming album "Pet Sounds", which was largely a collaboration with jingle writer Tony Asher. The album was a refinement of the themes and ideas that were introduced in "Today!". In some ways, the music was a jarring departure from their earlier style. When the other Beach Boys returned from a three-week tour of Japan and Hawaii, they were presented with a substantial portion of the new album, and various reports suggest that they fought over the new direction. Musicologist Daniel Harrison wrote, "In terms of the structure of the songs themselves, there is comparatively little advance from what Brian had already accomplished." In "The Journal on the Art of Record Production", Marshall Heiser writes that "Pet Sounds" "diverges from previous Beach Boys' efforts in several ways: its sound field has a greater sense of depth and 'warmth;' the songs employ even more inventive use of harmony and chord voicings; the prominent use of percussion is a key feature (as opposed to driving drum backbeats); whilst the orchestrations, at times, echo the quirkiness of 'exotica' bandleader Les Baxter, or the 'cool' of Burt Bacharach, more so than Spector's teen fanfares." Tony Asher recalled witnessing "tense" recording sessions in which all of Brian's bandmates complained that the music "'isn't our kind of shit!'".
For "Pet Sounds", Brian desired to make "a complete statement", similar to what he believed the Beatles had done with their newest album "Rubber Soul", released in December 1965. Brian was immediately enamored with the album, given the impression that it had no filler tracks, a feature that was mostly unheard of at a time when 45 rpm singles were considered more noteworthy than full-length LPs. He later said: "It didn't make me want to copy them but to be as good as them. I didn't want to do the same kind of music, but on the same level." Thanks to mutual connections, Brian was introduced to the Beatles' former press officer Derek Taylor, who was subsequently employed as the Beach Boys' publicist. Responding to Brian's request to reinvent the band's image, Taylor devised a promotion campaign with the tagline "Brian Wilson is a genius", a belief which Taylor sincerely held. Taylor's prestige was crucial in offering a credible perspective to those on the outside, and his efforts are widely recognized as instrumental in the album's success in Britain.
Released on May 16, 1966, "Pet Sounds" was widely influential and raised the band's prestige as an innovative rock group. Early reviews for the album in the US ranged from negative to tentatively positive, and its sales numbered approximately 500,000 units, a drop-off from the run of albums that immediately preceded it. Capitol reportedly considered "Pet Sounds" a risk, believing it appealed more to an older demographic than to the younger, female audience upon which the Beach Boys had built their commercial standing. Within two months, the label capitulated by releasing the group's first greatest-hits compilation, "Best of the Beach Boys", which was quickly certified gold by the RIAA. By contrast, "Pet Sounds" met a highly favorable critical response in Britain, where it reached number 2 and remained among the top-ten positions for six months. Responding to the hype, "Melody Maker" ran a feature in which many pop musicians were asked whether they believed that the album was truly revolutionary and progressive, or "as sickly as peanut butter". The author concluded that "the record's impact on artists and the men behind the artists has been considerable."
In its evaluation of "Pet Sounds", the book "101 Albums that Changed Popular Music" (2009) calls it "one of the most innovative recordings in rock", and states that it "elevated Brian Wilson from talented bandleader to studio genius". In 1995, a panel of numerous musicians, songwriters and producers assembled by "Mojo" voted "Pet Sounds" as the greatest record ever made. Paul McCartney frequently spoke of his affinity with the album, citing "God Only Knows" as his favorite song of all time, and crediting it with furthering his interest in devising melodic bass lines. He said that "Pet Sounds" was the primary impetus for the Beatles' 1967 album "Sgt. Pepper's Lonely Hearts Club Band". According to author Carys Wyn Jones, the interplay between these two groups during the "Pet Sounds" era remains one of the most noteworthy episodes in rock history. In 2003, when "Rolling Stone" magazine created its list of the "500 Greatest Albums of All Time", the publication placed "Pet Sounds" second to honor its influence on the highest ranked album, "Sgt. Pepper".
Throughout the summer of 1966, Brian concentrated on finishing the group's next single, "Good Vibrations". During the making of "Pet Sounds", Wilson had started changing his writing process. Rather than going to the studio with a completed song, he would record a track containing a series of chord changes he liked, take an acetate disc home, and then compose the song's melody and write its lyrics. With "Good Vibrations", Wilson said, "I had a lot of unfinished ideas, fragments of music I called 'feels.' Each feel represented a mood or an emotion I'd felt, and I planned to fit them together like a mosaic." Most of the song's structure and arrangement was written as it was recorded. Instead of working on whole songs with clear large-scale syntactical structures, Brian limited himself to recording short interchangeable fragments (or "modules"). Through the method of tape splicing, each fragment could then be assembled into a linear sequence, allowing any number of larger structures and divergent moods to be produced at a later time. Coming at a time when pop singles were usually recorded in under two hours, it was one of the most complex pop productions ever undertaken, with sessions for the song stretching over several months in four major Hollywood studios. It was also the most expensive single ever recorded to that point, with production costs estimated in the tens of thousands of dollars.
While in the midst of "Good Vibrations" sessions, Wilson invited session musician and songwriter Van Dyke Parks to collaborate as lyricist for the Beach Boys' next album project, soon titled "Smile", to which Parks agreed. Wilson and Parks intended "Smile" to be a continuous suite of songs that were linked both thematically and musically, with the main songs being linked together by small vocal pieces and instrumental segments that elaborated upon the musical themes of the major songs. It was explicitly American in style and subject, a conscious reaction to the overwhelming British dominance of popular music at the time. Some of the music incorporated chanting, cowboy songs, explorations in Indian and Hawaiian music, jazz, classical tone poems, cartoon sound effects, "musique concrète", and yodeling. "Saturday Evening Post" writer Jules Siegel recalled that, during one evening in October, Brian announced to his wife and friends that he was "writing a teenage symphony to God". Brian told "Melody Maker": "Our new album will be better than "Pet Sounds". It will be as much an improvement over "Sounds" as that was over "Summer Days"." Derek Taylor continued to write articles in the music press, sometimes anonymously, in an effort to further speculation about the album.
Recording for "Smile" lasted about a year, from mid 1966 to mid 1967, and followed the same modular production approach as "Good Vibrations". Concurrently, Wilson planned many different multimedia side-projects, such as a sound effects collage, a comedy album, and a "health food" album. Capitol did not support some of these ideas, which led to the Beach Boys' desire to form their own label, Brother Records. According to biographer Steven Gaines, Love was "the most receptive" to the proposal, wanting the Beach Boys to have more creative control over their work, and supported Wilson's decision to employ his newfound "best friend" David Anderle as the head of the label, even though it was against the wishes of band manager Nick Grillo. In a press release, Anderle stated that Brother Records was to give "entirely new concepts to the recording industry, and to give the Beach Boys total creative and promotional control over their product". The group established a short-lived film production company, called Home Movies, to create live action film and television properties starring the Beach Boys. The company completed only one production, a promotional clip for "Good Vibrations".
Released on October 10, 1966, "Good Vibrations" was the Beach Boys' third US number-one single, reaching the top of the "Billboard" Hot 100 in December, and became their first number one in Britain. That month, the record was their first single certified gold by the RIAA. It came to be widely acclaimed as one of the greatest masterpieces of rock music. In December 1966, the Beach Boys were voted the top band in the world in the "NME"s annual readers' poll, ahead of the Beatles, the Walker Brothers, the Rolling Stones, and the Four Tops. "Billboard" said that this result was probably influenced by the success of "Good Vibrations" when the votes were cast, together with the band's recent UK tour, whereas the Beatles had neither released a recent single nor toured the UK during 1966. The reporter nevertheless added that "The sensational success of the Beach Boys ... is being taken as a portent that the popularity of the top British groups of the last three years is past its peak."
Throughout 1966, EMI had flooded the UK market with previously unreleased albums by the band, including "Beach Boys' Party!", "The Beach Boys Today!" and "Summer Days (and Summer Nights!!)", and "Best of the Beach Boys" was number two there for several weeks at the end of the year.
Over the final quarter of 1966, the Beach Boys were the highest-selling album act in the UK, where for the first time in three years American artists broke the chart dominance of British acts. In 1971, a writer in "Cue" magazine said that, from mid 1966 to late 1967, the Beach Boys "were among the vanguard in practically every aspect of the counter culture". Biographer David Leaf wrote that the success of "Good Vibrations" "bought Brian some time [and] shut up everybody who said that Brian's new ways wouldn't sell ... his inability to "quickly" follow up [the single was what] became a snowballing problem." Sanchez writes that as time passed, the hype for "Smile" turned into "expectation", "doubt", and finally, "bemusement".
By December 1966, Wilson had completed much of the "Smile" backing tracks. When the Beach Boys returned from a month-long tour of Europe, they were confused by the new music he had recorded and the new coterie of interlopers that surrounded him. Gaines wrote that David Anderle now appeared to them as the leader of "a whole group of strangers [that] had infiltrated and taken over the Beach Boys". Throughout the first half of 1967, the album's release date was repeatedly postponed as Brian tinkered with the recordings, experimenting with different takes and mixes, unable or unwilling to supply a completed version of the album. Meanwhile, he suffered from delusions and paranoia, believing on one occasion that the album track "The Elements: Fire" (also known as "Mrs. O'Leary's Cow") had caused a building to burn down. On January 3, 1967, Carl Wilson refused to be drafted for military service, leading to an indictment and criminal prosecution, which he challenged as a conscientious objector. He was arrested by the FBI in April, and it would take several years in the courts before the matter was resolved.
After months of recording and media hype, the original "Smile" project was shelved due to the numerous personal, technical, and legal issues which surrounded its making. A February 1967 lawsuit seeking $255,000 was launched against Capitol Records over neglected royalty payments. Within the lawsuit, there was also an attempt to terminate the band's contract with Capitol before its November 1969 expiry. Since the group's future at Capitol was in limbo, an immediate release of "Smile" would have been unlikely, regardless of whether the album was completed. Band quarrels led Parks to leave the project in April 1967, with Anderle following suit weeks later. Brian later said: "Time can be spent in the studio to the point where you get so next to it, you don't know where you are with it, you decide to just chuck it for a while." He discussed breaking up the Beach Boys "on many occasions", according to Anderle, "But it was easier, I think to get rid of the outsiders like myself than it was to break up the brothers. You can't break up brothers."
In the decades following "Smile"s non-release, it became the subject of intense speculation and mystique and gained status as the most legendary unreleased album in the history of popular music. Many of the album's advocates believe that had it been released, it would have altered the group's direction and established them at the vanguard of rock innovators. In October 1967, "Cheetah" magazine published "Goodbye Surfing, Hello God!", a memoir by Jules Siegel that chronicled his time with Brian during the "Smile" sessions. The article propelled the mythology of "Smile" and the Beach Boys and attributed the album's collapse to "an obsessive cycle of creation and destruction that threatened not only his career and his fortune but also his marriage, his friendships, his relationships with the Beach Boys and, some of his closest friends worried, his mind". Carl blamed the article and "a lot of that stuff that went around before" for "really turn[ing Brian] off". Some of the original "Smile" tracks continued to trickle out on later releases, often as filler tracks to offset Brian's unwillingness to contribute. In 2011, "Uncut" magazine staff voted "Smile" the "greatest bootleg recording of all time".
In May 1967, the Beach Boys attempted to tour Europe with four extra musicians brought from the US, but were stopped by the British musicians' union. The tour went on without the extra support, and critics described their performances as "amateurish" and "floundering". Days after announcing that "Smile" was "scrapped", Derek Taylor terminated his employment with the group to focus his attention on organizing the Monterey Pop Festival, an event held in June that the Beach Boys declined to headline at the last minute. According to David Leaf: "Monterey was a gathering place for the 'far out' sounds of the 'new' rock, and the Beach Boys in concert really had no exotic sounds (excepting "Good Vibrations") to display. The net result of all this internal and external turmoil was that the Beach Boys didn't go to Monterey, and it is thought that this non-appearance was what really turned the 'underground' tide against them." Fan magazines speculated that the group were on the verge of breaking up.
Publicly, the band said that they could not play Monterey because of Carl's military draft, but many of the people involved with the festival thought that the group was simply too scared to compete with the "new music". Love later said that "Carl was to appear in federal court the Tuesday after the concert, but for all we knew, they were going to arrest him again if he performed onstage. ... None of us were afraid to perform at Monterey." Steven Gaines wrote that the decision ultimately "had a snowballing effect" that came to represent "a damning admission that [the Beach Boys] were washed up". A controversy involving whether the band was to be taken as a serious rock group developed among critics and fans. Detractors referred to the band as the "Bleach Boys" and "the California Hypes" as the media focus shifted from Los Angeles to the happenings in San Francisco. On December 14, 1967, "Rolling Stone" co-founder and editor Jann Wenner printed an influential article that denounced the Beach Boys as "just one prominent example of a group that has gotten hung up on trying to catch The Beatles. It's a pointless pursuit." The article had the effect of excluding the group among serious rock fans.
The Beach Boys were still under pressure and a contractual obligation to record and present an album to Capitol. Carl remembered: "Brian just said, 'I can't do this. We're going to make a homespun version of ["Smile"] instead. We're just going to take it easy. I'll get in the pool and sing. Or let's go in the gym and do our parts.' That was "Smiley Smile"." Sessions for the new album lasted from June to July 1967 at Brian's new makeshift home studio. Most of the album featured the Beach Boys playing their own instruments, rather than the session musicians employed in much of their previous work. It was the first album for which production was credited to the entire group, instead of Brian alone. When asked if Brian was "still the producer of "Smiley Smile"", Carl answered, "Most definitely."
In July, lead single "Heroes and Villains" was issued, arriving after months of public anticipation, and reached number 12 in the US. It was met with general confusion among underwhelming reviews, and in the "NME", Jimi Hendrix famously dismissed the single as a "psychedelic barbershop quartet". By then, the group's lawsuit with Capitol had been resolved, and it was agreed that "Smile" would not be the band's next album. In August, the group embarked on a two-date tour of Hawaii. Bruce Johnston, who was absent for most of the "Smiley Smile" recording, did not accompany the group, although Brian did. Their performances were filmed and recorded with the intention of releasing a live album, "Lei'd in Hawaii", which was also left unfinished and unreleased. In an interview that month, Brian stated: "I think rock n' roll–the pop scene–is happening. It's great. But I think basically, the Beach Boys are squares. We're not happening."
"Smiley Smile" was released on September 18, 1967, and peaked at number 41 in the US, making it their worst-selling album to that date. It began a string of under-performing Beach Boys albums that would last until 1974. When released in the UK in November, it performed better, reaching number 9. Critics and fans were generally underwhelmed by the album. According to Scott Schinder, the album was released to "general incomprehension. While "Smile" may have divided the Beach Boys' fans had it been released, "Smiley Smile" merely baffled them." Over the years, the album gathered a reputation as one of the best "chill-out" albums to listen to during an LSD comedown. In 1974, the writing staff of "NME" voted it as the 64th greatest album of all time.
The Beach Boys immediately recorded a new album, "Wild Honey", which was an excursion into soul music. Carl described it as "music for Brian to cool out by. He was still very spaced." The album was a self-conscious attempt by the Beach Boys to "regroup" themselves as a rock band in opposition to their more orchestral affairs of the past. Its music differs in many ways from previous Beach Boys records: it contains very little group singing compared to previous albums, and mainly features Brian singing at his piano. Again, the Beach Boys recorded mostly at his home studio. Love reflected that "Wild Honey" was "completely out of the mainstream for what was going on at that time, which was all hard rock/psychedelic music. It just didn't have anything to do with what was going on, and that was the idea."
"Wild Honey" was released on December 18, 1967, in competition with the Beatles' "Magical Mystery Tour" and the Rolling Stones' "Their Satanic Majesties Request". It had a lower chart placing than "Smiley Smile" and remained on the charts for only 15 weeks. As with "Smiley Smile", contemporary critics viewed it as inconsequential, and it alienated fans whose expectations had been raised by "Smile". That month, Mike Love told a British journalist: "Brian has been re-thinking our recording program and in any case we all have a much greater say nowadays in what we turn out in the studio." "Wild Honey" remained the last Beach Boys album to feature Brian as a primary composer until 1977. Over the coming months, its non-conforming approach would be echoed in albums released by Bob Dylan ("John Wesley Harding"), the Kinks ("Village Green Preservation Society"), and the Byrds ("The Notorious Byrd Brothers").
The Beach Boys were at their lowest popularity in the late 1960s, and their cultural standing was especially worsened by their public image, which remained incongruous with the "heavier" music of their peers. Capitol continued to bill the Beach Boys as "America's Top Surfin' Group!" and expected Brian to write more beachgoing songs for the yearly summer markets. From 1968 onward, his songwriting output declined substantially, but the public narrative of "Brian-as-leader" continued. The group also stopped wearing their longtime striped-shirt stage uniforms in favor of matching white polyester suits in the style of a Las Vegas show band.
After meeting Maharishi Mahesh Yogi at a UNICEF Variety Gala in Paris, Love, along with other high-profile celebrities such as the Beatles and Donovan, traveled to Rishikesh in India during February and March 1968. The following Beach Boys album, "Friends", had songs influenced by the Transcendental Meditation taught by the Maharishi. In support of the album, Love arranged for the Beach Boys to tour with the Maharishi in the US. Starting on May 3, 1968, the tour lasted five shows and was canceled when the Maharishi had to withdraw to fulfill film contracts. Because of disappointing audience numbers and the Maharishi's withdrawal, twenty-four tour dates were subsequently canceled at an estimated cost of $250,000 to the band. "Friends", released on June 24, peaked at number 126 in the US. A collection of Beach Boys backing tracks, "Stack-o-Tracks", followed from Capitol in August; it became the first Beach Boys LP that failed to chart in either the US or the UK.
In June 1968, Dennis befriended Charles Manson, an aspiring singer-songwriter, and their relationship lasted for several months. Dennis paid for Manson's recording time at Brian's home studio, where sessions were attempted while Brian stayed in his room. Dennis then proposed that Manson be signed to Brother Records. Brian reportedly disliked Manson, and so a deal was never made. In July 1968, the group released a standalone single, "Do It Again", which was written in the style of their earlier songs. Around this time, Brian admitted himself to a psychiatric hospital. His bandmates wrote and produced material in his absence. To complete their contract with Capitol, they produced one more album, "20/20", released in January 1969. It consisted mostly of outtakes and leftovers from recent albums; Brian produced virtually none of the newer recordings. In 1976, Dennis called it "the only letdown of the Beach Boys' career that embarrassed me through and through ... we had to find things that Brian worked on and try and piece it together. That's when [he had] no involvement at all."
The Beach Boys recorded one song penned by Manson without his involvement: "Cease to Exist", rewritten as "Never Learn Not to Love", which was included on "20/20", but first released as the B-side of a single one month earlier. Manson was enthused by the idea of the group recording one of his songs; however, after accruing a large monetary debt to the group, Dennis deliberately omitted Manson's credit on its release while also altering the song's arrangement and lyrics, which angered Manson. As his cult of followers took over Dennis' home, Dennis gradually distanced himself from Manson. According to Leaf, "The entire Wilson family reportedly feared for their lives." In November 1969, three months after the Tate–LaBianca murders, Manson was apprehended by the police, and his connections with the Beach Boys were the subject of media attention. He was later convicted on several counts of murder and conspiracy to murder. In 1976, Dennis commented that "I don't talk about Manson. I think he's a sick fuck. I think of Roman [Polanski] and all those wonderful people who had a beautiful family and they fucking had their tits cut off. I want to benefit from that?"
In April 1969, the band revisited their 1967 lawsuit against Capitol Records after an audit, they alleged, revealed that the band was owed over $2 million for unpaid royalties and production duties. In May, Brian told the music press that the group's funds were depleted to the point that they were considering filing for bankruptcy at the end of the year, which "Disc & Music Echo" called "stunning news" and a "tremendous shock on the American pop scene". Brian hoped that the success of a forthcoming single, "Break Away", would ease their financial troubles. The song, which was written and produced by Brian and Murry, reached number 63 in the US and number 6 in the UK, and Brian's remarks to the press ultimately thwarted long-simmering contract negotiations with Deutsche Grammophon. The group's Capitol contract expired two weeks later with one more album still due, after which the label deleted the Beach Boys' catalog from print, effectively cutting off their royalty flow. The lawsuit was later settled in their favor, and they acquired the rights to their post-1965 catalog.
In August, Sea of Tunes, the company holding the Beach Boys' song catalog, was sold to Irving Almo Music for $700,000. Brian, according to his wife Marilyn Wilson, was devastated by the sale. Over the years, the catalog would generate more than $100 million in publishing royalties, none of which Murry or the band members ever received.
The group signed with Reprise Records in 1970. Scott Schinder described the label as "probably the hippest and most artist-friendly major label of the time." The deal was brokered by Van Dyke Parks, who was then employed as a multimedia executive at Warner Music Group, and the contract stipulated Brian's proactive involvement with the band on all albums. By the time the Beach Boys' tenure with Capitol ended in 1969, they had sold 65 million records worldwide, closing the decade as the most commercially successful American group in popular music.
After recording over 30 different songs and going through several album titles, their first LP for Reprise, "Sunflower", was released on August 31, 1970. "Sunflower" featured a strong group presence with significant writing contributions from all band members. Brian was active during this period, writing or co-writing seven of the twelve songs on "Sunflower" and performing at half of the band's domestic concerts in 1970. The album received critical acclaim in both the US and the UK. This was offset by the album reaching only number 151 on the US record charts during a four-week stay, becoming the worst-selling Beach Boys album to that point. In his review for "Rolling Stone", critic Jim Miller praised the album as "without doubt the best Beach Boys album in recent memory, a stylistically coherent "tour de force"", but mused: "It makes one wonder though whether anyone still listens to their music, or could give a shit about it." In the UK, the album reached number 29. Fans generally regard the LP as the Beach Boys' finest post-"Pet Sounds" album. In 2003, it placed at number 380 on "Rolling Stone"s "Greatest Albums of All Time" list.
In 1969, Brian opened a short-lived health food store called the Radiant Radish. While working there, he met journalist and radio presenter Jack Rieley. Rieley spoke with Brian for a radio interview, with the subject eventually turning to the unreleased song "Surf's Up", a track which had taken on notoriety since the demise of the "Smile" album three years earlier. Brian did not feel it should be released. In August 1970, Rieley offered a six-page memo ruminating on how to stimulate "increased record sales and popularity for The Beach Boys." Within the next few months, the Beach Boys hired Rieley as their manager. One of his initiatives was to encourage the band to record songs featuring more socially conscious lyrics. He also requested the completion of "Surf's Up" and arranged a guest appearance at a Grateful Dead concert at Bill Graham's Fillmore East in April 1971 to foreground the Beach Boys' transition into the counterculture. During this time, the group ceased wearing matching uniforms on stage.
In July 1971, the Beach Boys filmed a concert for ABC-TV in Central Park. It aired as "Good Vibrations from Central Park" on August 19, 1971. The concert also featured performances by Boz Scaggs, Kate Taylor, Carly Simon, and Ike & Tina Turner.
On August 30, 1971, the band released "Surf's Up", which included the title track. The album was moderately successful, reaching the US top 30, a marked improvement over their recent releases. While the record charted, the Beach Boys added to their renewed fame by performing a near-sellout set at Carnegie Hall; their live shows during this era included reworked arrangements of many of the band's previous songs, and a large portion of their set lists was culled from "Pet Sounds" and "Smile". Music writer Domenic Priore noted, "They basically played what they could have played at the Monterey Pop Festival in the summer of 1967." Dennis injured his hand during the "Surf's Up" sessions, leaving him temporarily unable to play the drums.
Reprise, however, felt that the group's next album, recorded largely in the Netherlands, lacked a strong single. This prompted "Sail On, Sailor", a collaboration between Brian Wilson, Tandyn Almer, Ray Kennedy, Jack Rieley, and Van Dyke Parks featuring a soulful lead vocal by Blondie Chaplin, who, with drummer Ricky Fataar, had joined the band in 1972. Reprise subsequently approved the record, and the resulting album, "Holland", was released in January 1973, peaking at number 37. Brian's musical children's story, "Mount Vernon and Fairway (A Fairy Tale)", was included as a bonus EP.
In August 1973, the 41-song soundtrack to "American Graffiti" was released, including the band's early songs "Surfin' Safari" and "All Summer Long". The album was a catalyst in creating a wave of nostalgia that reintroduced the Beach Boys into contemporary American consciousness. Chaplin left the band in late 1973 after an argument with Steve Love, the band's business manager (and Mike's brother). In June 1974, Capitol issued "Endless Summer", the band's first major pre-"Pet Sounds" greatest-hits package. The compilation rose to the top of the "Billboard" album charts and remained on the charts for two years, the longest of any Beach Boys release. Capitol followed with a second compilation, "Spirit of America", which also sold well. With these compilations, the Beach Boys became one of the most popular acts in rock, propelling themselves from opening for Crosby, Stills, Nash and Young to headlining sold-out basketball arenas in a matter of weeks. "Rolling Stone" named the Beach Boys the "Band of the Year" for 1974.
Fataar remained with the band until 1974, when he was offered a chance to join a new group led by future Eagles member Joe Walsh. Chaplin's replacement, James William Guercio, started offering the group career advice that resulted in his becoming their new manager. A new album was attempted, with sessions being held both at Guercio's Caribou Ranch recording studio in Colorado and at the band's own Brother Studios in L.A. Only a scattering of material from these sessions saw eventual release. The impetus had shifted from recording new material to large-venue touring, and under Guercio, the Beach Boys staged a successful 1975 joint concert tour with Chicago, with each group performing some of the other's songs, including their previous year's collaboration on Chicago's single "Wishing You Were Here". While their concerts continuously sold out, the stage act slowly changed from a contemporary presentation followed by oldies encores to an entire show made up of mostly pre-1967 music.
Brian spent the majority of two years secluded in the chauffeur's quarters of his home, abusing alcohol, taking drugs (including heroin), overeating, and exhibiting other self-destructive behavior. Although increasingly reclusive during the day, Wilson spent many nights at singer Danny Hutton's house, fraternizing with colleagues such as Alice Cooper and Iggy Pop. In 1975, Brian attempted to join California Music, a Los Angeles collective that included Gary Usher, Curt Boettcher, and Bruce Johnston. The Beach Boys' recent "Endless Summer" compilation was selling well, and the band was touring non-stop, making them the biggest live draw in the US. Guercio was then fired by the group and replaced by Steve Love, who urged the group to encourage Brian to return to the production helm. According to Steve: "We were under contract with Warner Bros., and we couldn't have him going on a tangent. If he was going to be productive, it's gotta be for the Beach Boys." Brian, who had already grown tired of working with the Beach Boys, was then legally ousted from California Music in order to focus his undivided attention on the band. In October, Marilyn persuaded Brian to admit himself under the care of psychotherapist Eugene Landy, who was successful in keeping Brian from indulging in substance abuse with constant supervision.
At the end of January 1976, the Beach Boys returned to the studio with an apprehensive Brian producing once again. At the time, he felt: "It was a little scary because [the Beach Boys and I] weren't as close. We had drifted apart, personality-wise. A lot of the guys had developed new personalities through meditation. ... But we went into the studio with the attitude that we had to get it done." Group meetings were supervised by Landy, and discussions over each song for the record were reported to last for up to eight hours. Brian decided the band should do an album of rock and roll and doo wop standards. Carl and Dennis disagreed, feeling that an album of originals was preferable, while Love and Jardine wanted the album out as quickly as possible. Brian's production role was undermined as group members overdubbed and remixed tracks, without his knowledge, to fight against his desire for a rough, unfinished sound. He later attributed his hoarse voice on the album to a bout of laryngitis.
Released on July 5, 1976, "15 Big Ones" was generally disliked by fans and critics upon release. Its lead single, a cover of Chuck Berry's "Rock and Roll Music", peaked at number five. Carl and Dennis disparaged the album to the press. Dennis said: "It was a great mistake to put Brian in full control. He was always the absolute producer, but little did he know that in his absence, people grew up, people became as sensitive as the next guy. Why do I relinquish my rights as an artist? The whole process was a little bruising." Brian said that "the new album is nothing too deep", but remained hopeful that their next release would be on par with the group's "Good Vibrations". An August 1976 NBC-TV special, titled "The Beach Boys", was produced by "Saturday Night Live" (SNL) creator Lorne Michaels, and featured appearances by "SNL" cast members John Belushi and Dan Aykroyd. In December, Brian was released from Landy's program due to disputes over the doctor's monthly fee.
From late 1976 to early 1977, Brian spent his time making sporadic public appearances and producing the band's next album, "The Beach Boys Love You", a collection of 14 songs that he mostly wrote, arranged, and produced alone. He later called "Love You" one of his favorite Beach Boys releases, saying that "That's when it all happened for me. That's where my heart lies."
The album's engineer, Earle Mankey, compared it to the surrealist film "Eraserhead", and said that while it was "lighthearted" on the surface, it was intended to be a "serious, autobiographical work". Writing for "Pitchfork", D. Erik Kempke said the album "stands in sharp contrast to the albums that preceded and followed it, because it was a product of genuine inspiration on Brian Wilson's part, with little outside interference." Al Jardine credited Carl and Dennis with having "the most to do with that album ... [they were] paying tribute to their brother."
Released on April 11, 1977, "Love You" peaked at number 53 in the US and number 28 in the UK. It divided fans and critics: some saw the album as a work of "eccentric genius", whereas others dismissed it as "childish and trivial". In a review for "Circus", Lester Bangs called the Beach Boys "a diseased bunch of motherfuckers if ever there was one ... But the miracle is that the Beach Boys have made that disease sound like the literal babyflesh pink of health." The album was released weeks after the band signed a new record deal with CBS. Gaines hypothesized that the lack of promotion Reprise put into "Love You" was a byproduct of the falling-out between artist and label.
After "Love You" was released, Brian assembled "Adult/Child", an unreleased effort largely consisting of songs written by Brian from 1976 and 1977 with select big band arrangements by Dick Reynolds. Although publicized as the Beach Boys' next release, "Adult/Child" caused tension within the group and was ultimately shelved. Following this period, his concert appearances with the band gradually diminished and their performances were occasionally erratic. The internal wrangling came to a head after a show at Central Park on September 1, 1977, when the band effectively split into two camps; Dennis and Carl Wilson on one side, Mike Love and Al Jardine on the other with Brian remaining neutral. Following a confrontation on an airport tarmac, Dennis declared to "Rolling Stone" on September 3 that he had left the band: "It was Al Jardine who really knifed me in the heart when he said they didn't need me. That was the clincher. And all I told him was that he couldn't play more than four chords. They kept telling me I had my solo album now ["Pacific Ocean Blue"], like I should go off in a corner and leave the Beach Boys to them. The album really bothers them. They don't like to admit it's doing so well; they never even acknowledge it in interviews."
The band broke up for two and a half weeks, until a meeting on September 17 at Brian's house. In light of a potential new Caribou Records contract, the parties negotiated a settlement resulting in Love gaining control of Brian's vote in the group, allowing Love and Jardine to outvote Carl and Dennis Wilson on any matter. Dennis started to withdraw from the group to focus on his second solo album, "Bambu". The album was shelved just as alcoholism and marital problems overcame all three Wilson brothers. Carl appeared intoxicated during concerts (especially at appearances on their 1978 Australia tour) and Brian gradually slid back into addiction and an unhealthy lifestyle.
Their last album for Reprise, "M.I.U. Album" (1978), was recorded at Maharishi International University in Iowa at the suggestion of Love. Dennis and Carl made limited contributions; the album was mostly produced by Jardine and Ron Altbach, with Brian credited as "executive producer". "M.I.U." was largely a contractual obligation to finish out their association with Reprise, who likewise did not promote the result.
In an April 1980 interview, Carl reflected that "the last two years have been the most important and difficult time of our career. We were at the ultimate crossroads. We had to decide whether what we had been involved in since we were teenagers had lost its meaning. We asked ourselves and each other the difficult questions we'd often avoided in the past." The following year, he left the touring group because of unhappiness with the band's nostalgia format and lackluster live performances, subsequently pursuing a solo career. He stated: "I haven't quit the Beach Boys but I do not plan on touring with them until they decide that 1981 means as much to them as 1961."
Carl returned in May 1982, after approximately 14 months away, on the condition that the group reconsider their rehearsal and touring policies and refrain from "Las Vegas-type" engagements. Later that year, Brian overdosed on a combination of alcohol, cocaine, and other psychoactive drugs. His former therapist Eugene Landy was once more employed, and a more radical program was undertaken to try to restore Brian to health. This involved removing him from the group on November 5, 1982, at the behest of Carl, Love, and Jardine, in addition to putting him on a rigorous diet and health regimen. Coupled with long, extreme counseling sessions, this therapy was successful in bringing Brian back to physical health.
From 1980 through 1982, the Beach Boys and the Grass Roots performed Independence Day concerts at the National Mall in Washington, D.C., attracting large crowds. However, in April 1983, James G. Watt, President Ronald Reagan's Secretary of the Interior, banned Independence Day concerts on the Mall by such groups. Watt said that "rock bands" that had performed on the Mall on Independence Day in 1981 and 1982 had encouraged drug use and alcoholism and had attracted "the wrong element", who would steal from attendees. During the ensuing uproar, which included over 40,000 complaints to the Department of the Interior, the Beach Boys stated that the Soviet Union, which had invited them to perform in Leningrad in 1978, "...obviously ... did not feel that the group attracted the wrong element." Vice President George H. W. Bush said of the Beach Boys, "They're my friends and I like their music". Watt later apologized to the band after learning that President Reagan and First Lady Nancy Reagan were fans. White House staff presented Watt with a plaster foot with a hole in it, showing that he had "shot himself in the foot".
In 1983, tensions between Dennis and Love escalated to the point that each obtained a restraining order against the other. With the rest of the band fearing that he would end up like Brian, Dennis was given an ultimatum after his last performance in November 1983: check into rehab for his alcohol problems or be banned from performing live with them. Dennis checked into rehab, but on December 28, 1983, he drowned at the age of 39 in Marina del Rey while diving from a friend's boat, trying to recover items that he had previously thrown overboard in fits of rage.
Between 1983 and 1986, Landy charged Brian about $430,000 annually. When Landy requested more money, Carl was obliged to give away a quarter of Brian's publishing royalties. As Brian's recovery consolidated, he stopped working with the Beach Boys on a regular basis. Commenting on his relationship to the band in 1988, Brian said that he avoided his family at Landy's suggestion, and that "Although we stay together as a group, as people we're a far cry from friends." In the mid-1980s, Landy stated, "I influence all of [Brian]'s thinking. I'm practically a member of the band ... [We're] partners in life." Brian later responded to such allegations with, "People say that Dr. Landy runs my life, but the truth is, I'm in charge." Mike Love denied Landy's accusation that he and the band were keeping Brian from participating with the group, and later wrote that Landy's "goal ... was to destroy us ... [and become] the sole custodian of Brian's career and legacy."
The Beach Boys spent the next several years touring, often playing in front of large audiences, and recording songs for film soundtracks and various-artists compilations. In 1988, they unexpectedly claimed their first US number-one single in 22 years with "Kokomo", which topped the chart for one week and appeared in the film "Cocktail". They followed it with the album "Still Cruisin'", which went platinum in the US.
Love filed a defamation lawsuit against Brian over his portrayal in Brian's 1991 memoir "Wouldn't It Be Nice: My Own Story". Its publisher, HarperCollins, settled the suit for $1.5 million. Love said that the suit allowed his lawyer "to gain access to the transcripts of Brian's interviews with his [book] collaborator, Todd Gold. Those interviews affirmed—according to Brian—that I had been the inspiration of the group and that I had written many of the songs that [would soon be] in dispute." Other defamation lawsuits were filed by Carl, Brother Records, and the Wilsons' mother Audree.
The day after California courts issued a restraining order between Brian and Landy, Brian phoned Sire Records staff producer Andy Paley to collaborate on new material tentatively for the Beach Boys. After losing the songwriting credits lawsuit with Love, Brian told "MOJO" in February 1995: "Mike and I are just cool. There's a lot of shit Andy and I got written for him. I just had to get through that goddamn trial!" In April, it was unclear whether the project would turn into a Wilson solo album, a Beach Boys album, or a combination of the two. The project ultimately disintegrated. Instead, Brian and his bandmates recorded "Stars and Stripes Vol. 1", an album of country music stars covering Beach Boys songs, with co-production helmed by River North Records owner Joe Thomas. Afterward, the group discussed finishing the album "Smile", but Carl rejected the idea, fearing that it would cause Brian another nervous breakdown.
In early 1997, Carl was diagnosed with lung and brain cancer after years of heavy smoking. Despite his terminal condition, Carl continued to perform with the band on its 1997 summer tour (a double-bill with the band Chicago) while undergoing chemotherapy. During performances, he sat on a stool and needed oxygen after every song. Carl died on February 6, 1998, at the age of 51, two months after the death of the Wilsons' mother, Audree.
Following Carl's death, Jardine left the band and began to tour regularly with his own group, "Beach Boys: Family & Friends", until he ran into legal issues for using the Beach Boys name without a license. Jardine sued Love, claiming that he had been excluded from the band's concerts; Brother Records (BRI), through its longtime attorney, Ed McPherson, sued Jardine in federal court, and Jardine, in turn, counter-claimed against BRI for wrongful termination. BRI ultimately prevailed. Love and Johnston continued to tour as "The Beach Boys" with supporting musicians. Marks left the touring band in 1999 because of his health.
In 2000, ABC-TV premiered a two-part television miniseries, "The Beach Boys: An American Family", that dramatized the Beach Boys' story. It was produced by John Stamos and was criticized for historical inaccuracies. Brian Wilson said that he "didn't like the second part. It wasn't really true to the way things were. I'd like to see another movie if it was done right."
In 2004, Wilson recorded and released his solo album "Brian Wilson Presents Smile", a reinterpretation of the unfinished "Smile" project. That September, Wilson issued a free CD through the "Mail On Sunday" that included Beach Boys songs he had recently rerecorded, five of which he co-authored with Love. The 10-track compilation had 2.6 million copies distributed and prompted Love to file a lawsuit in November 2005; he claimed the promotion hurt the sales of the original recordings. Love's suit was dismissed in 2007 when a judge determined that there were no triable issues.
On October 31, 2011, Capitol released a compilation and box set dedicated to "Smile" in the form of "The Smile Sessions". The album garnered universal critical acclaim and charted in the top 30 in both the US and the UK. It went on to win Best Historical Album at the 2013 Grammy Awards.
On December 16, 2011, it was announced that Wilson, Love, Jardine, Johnston, and David Marks would reunite for a new album and 50th-anniversary tour. On February 12, 2012, the Beach Boys performed at the 2012 Grammy Awards in what was billed as a "special performance" by organizers. It marked the group's first live performance to include Wilson since 1996, Jardine since 1998, and Marks since 1999. Released on June 5, "That's Why God Made the Radio" debuted at number 3 on the US charts, expanding the group's span of "Billboard" 200 top-ten albums to 49 years and one week, surpassing the Beatles' 47-year span. Critics generally regarded the album as an "uneven" collection, with most of the praise centered on its closing musical suite.
On June 1, 2012, Love received an e-mail from Brian's wife and manager, Melinda Ledbetter, stating "no more shows for Wilson". Love, who is obligated by his license of the Beach Boys name to maintain revenue flow to Brother Records, then began accepting bookings for dates after the reunion tour ended. On June 25, Ledbetter sent another e-mail asking him to disregard her earlier message, but by then, Love says, "it was too late. We had booked other concerts, and promoters had begun selling tickets." The next day, Love announced additional touring dates that would not feature Wilson. Wilson then denied knowledge of these new dates. Love later wrote: "I had wanted to send out a joint press release, between Brian and me, formally announcing the end of the reunion tour on September 28. But I couldn't get Brian's management team on board (Brian himself doesn't make those kinds of decisions)."
In late September, news outlets began reporting that Love had dismissed Wilson from the Beach Boys. On October 5, Love responded in a self-written press release to the "LA Times" stating he "did not fire Brian Wilson from the Beach Boys. I cannot fire Brian Wilson from the Beach Boys ... I do not have such authority. And even if I did, I would never fire Brian Wilson from the Beach Boys." He explained that nobody in the band "wanted to do a 50th anniversary tour that lasted 10 years" and that its limited run "was long agreed upon". Four days later, Wilson and Jardine responded in writing to the rumors, with Wilson stating: "I was completely blindsided by his press release ... We hadn't even discussed as a band what we were going to do with all the offers that were coming in for more 50th shows." Love said that Wilson's statements in this press release had been fabricated by his agents.
Love and Johnston continued to perform under the Beach Boys brand name, while Wilson, Jardine, and Marks continued to tour as a trio, and a subsequent tour with guitarist Jeff Beck also included Blondie Chaplin at select dates. Reflecting upon the band's reunion in 2013, Love stated: "I had a wonderful experience being in the studio together. Brian has lost none of his ability to structure those melodies and chord progressions ... Touring was more for the fans. ... It was a great experience, it had a term to it, and now everyone's going on with their ways of doing things." Jardine, Marks, Johnston and Love appeared together at the 2014 Ella Awards Ceremony, where Love was honored for his work as a singer.
In 2015, "Soundstage" aired an episode featuring Wilson performing with Jardine and former Beach Boys Blondie Chaplin and Ricky Fataar at The Venetian in Las Vegas. In April 2015, when asked if he was interested in making music with Love again, Wilson replied: "I don't think so, no," later adding in July that he "doesn't talk to the Beach Boys [or] Mike Love."
In 2016, Love and Wilson published memoirs, "Good Vibrations: My Life as a Beach Boy" and "I Am Brian Wilson", respectively. Love was asked about negative comments that Wilson made about him in the book and said: "He's not in charge of his life, like I am mine. His every move is orchestrated and a lot of things he's purported to say, there's no tape of it. But, I don't like to put undue pressure on him, either, because I know he has a lot of issues. Out of compassion, I don't respond to everything that is purportedly said by him." In an interview with "Rolling Stone" conducted in June 2016, Wilson stated that he would like to try to repair his relationship with Love and collaborate with him again. In January 2017, Love stated "If it were possible to make it just Brian and I, and have it under control and done better than what happened in 2012, then yeah, I'd be open to something."
In July 2018, Wilson, Jardine, Love, Johnston, and Marks reunited for a one-off Q&A session moderated by director Rob Reiner at the Capitol Records Tower in Los Angeles. It was the first time the band appeared together in public since their 2012 tour. That December, Love described his new holiday album, "Reason for the Season", as a "message to Brian" and said that he "would love nothing more than to get together with Brian and do some music."
In February 2020, Wilson's official social media pages encouraged fans to boycott the band's music after it was announced that Love's Beach Boys would perform at the Safari Club International Convention in Reno, Nevada. The concert proceeded despite online protests, as Love issued a statement that said his group has always supported "freedom of thought and expression as a fundamental tenet of our rights as Americans." In March, Jardine was asked about a possible reunion and responded that the band would reunite for a string of live performances in 2021, although he believed a new album was unlikely. In response to reunion rumors, Love said in May that he was open to a 60th anniversary tour, although Wilson had "some serious health issues", while Wilson's manager Jean Sievers commented that nobody had spoken to Wilson about such a tour.
In "Understanding Rock: Essays in Musical Analysis", musicologist Daniel Harrison writes:
The Beach Boys began as a garage band playing 1950s-style rock and roll, reworking styles such as surf music to incorporate vocal jazz harmony, which created their distinctive sound. In addition, they introduced their signature approach to common genres such as the pop ballad by applying harmonic or formal twists not native to rock and roll. Among the distinct elements of the Beach Boys' style were the nasal quality of their singing voices, their use of a falsetto harmony over a driving, locomotive-like melody, and the sudden chiming in of the whole group on a key line. Brian Wilson handled most stages of the group's recording process from the beginning, even though he was not properly credited on most of the early recordings.
Early on, Mike Love sang lead vocals on the rock-oriented songs, while Carl contributed guitar lines on the group's ballads. Jim Miller commented: "On straight rockers they sang tight harmonies behind Love's lead ... on ballads, Brian played his falsetto off against lush, jazz-tinged voicings, often using (for rock) unorthodox harmonic structures." Harrison adds that "even the least distinguished of the Beach Boys' early uptempo rock 'n' roll songs show traces of structural complexity at some level; Brian was simply too curious and experimental to leave convention alone." Although Brian was often dubbed a perfectionist, he was an inexperienced musician, and his understanding of music was mostly self-taught. At the lyric stage, he usually worked with Love, whose assertive persona provided a youthful swagger that contrasted with Brian's explorations in romanticism and sensitivity. Luis Sanchez noted a pattern in which Brian would forgo surfing imagery when working with collaborators outside of his band's circle, as in "Lonely Sea" and "In My Room".
Brian's bandmates resented the notion that he was the sole creative force in the group. In a 1966 article that asked if "the Beach Boys rely too much on sound genius Brian", Carl said that although Brian was the most responsible for their music, every member of the group contributed ideas. Mike Love wrote, "As far as I was concerned, Brian "was" a genius, deserving of that recognition. But the rest of us were seen as nameless components in Brian's music machine ... It didn't feel to us as if we were just riding on Brian's coattails." Conversely, Dennis defended Brian's stature in the band, stating: "Brian Wilson "is" the Beach Boys. He is the band. We're his fucking messengers. He is all of it. Period. We're nothing. He's everything."
The band's earliest influences came primarily from the work of Chuck Berry and the Four Freshmen. Performed by the Four Freshmen, "Their Hearts Were Full of Spring" (1961) was a particular favorite of the group. By analyzing their arrangements of pop standards, Brian educated himself on jazz harmony. Bearing this in mind, Philip Lambert noted, "If Bob Flanigan helped teach Brian how to sing, then Gershwin, Kern, Porter, and the other members of this pantheon helped him learn how to craft a song." Other general influences on the group included the Hi-Los, the Penguins, the Robins, Bill Haley & His Comets, Otis Williams, the Cadets, the Everly Brothers, the Shirelles, the Regents, and the Crystals.
The eclectic mix of white and black vocal group influences – from the rock and roll of Berry and the jazz harmonies of the Four Freshmen to the pop of the Four Preps, the folk of the Kingston Trio, the R&B of groups like the Coasters and the Five Satins, and the doo wop of Dion and the Belmonts – helped contribute to the Beach Boys' uniqueness in American popular music. Carl remembered: "Most of [Mike's] classmates were black. He was the only white guy on his track team. He was really immersed in doo-wop and that music and I think he influenced Brian to listen to it. The black artists were so much better in terms of rock records in those days that the white records almost sounded like put-ons." On Jimi Hendrix and "heavy" music, Brian said he felt no pressure to go in that direction: "We never got into the heavy musical level trip. We never needed to. It's already been done."
Another significant influence on Brian's work was Burt Bacharach. He said in the 1960s: "Burt Bacharach and Hal David are more like me. They're also the best pop team – per se – today. As a producer, Bacharach has a very fresh, new approach." Regarding surf rock pioneer Dick Dale, Brian said that his influence on the group was limited to Carl and his style of guitar playing. Carl credited Chuck Berry, the Ventures, and John Walker with shaping his guitar style, and said that the Beach Boys had learned to play all of the Ventures' songs by ear early in their career.
In 1967, Lou Reed wrote in "Aspen" that the Beach Boys created a "hybrid sound" out of old rock and the Four Freshmen, explaining that such songs as "Let Him Run Wild", "Don't Worry Baby", "I Get Around", and "Fun, Fun, Fun" were not unlike "Peppermint Stick" by the Elchords. Similarly, John Sebastian of the Lovin' Spoonful noted, "Brian had control of this vocal palette of which we had no idea. We had never paid attention to the Four Freshmen or doo-wop combos like the Crew Cuts. Look what gold he mined out of that."
Brian identified each member individually for their vocal range, once detailing the ranges for Carl, Dennis, Jardine ("[they] progress upwards through G, A, and B"), Love ("can go from bass to the E above middle C"), and himself ("I can take the second D in the treble clef"). He declared in 1966 that his greatest interest was to expand modern vocal harmony, crediting his fascination with voice to the Four Freshmen, whose singing he considered a "groovy sectional sound." He added, "The harmonies that we are able to produce give us a uniqueness which is really the only important thing you can put into records – some quality that no one else has got. I love peaks in a song – and enhancing them on the control panel. Most of all, I love the human voice for its own sake." For a period, Brian avoided singing falsetto for the group, saying "I thought people thought I was a fairy ... the band told me, 'If that's the way you sing, don't worry about it.'"
From lowest interval to highest, the group's vocal harmony stack usually began with Love or Dennis, followed by Jardine or Carl, and finally Brian on top, according to Jardine, while Carl said that the blend was Love on the bottom, Carl above him, followed by Dennis or Jardine, and then Brian on top. Jardine explains, "We always sang the same vocal intervals. ... As soon as we heard the chords on the piano we'd figure it out pretty easily. If there was a vocal move [Brian] envisioned, he'd show that particular singer that move. We had somewhat photographic memory as far as the vocal parts were concerned so that [was] never a problem for us." Striving for perfection, Brian ensured that his intricate vocal arrangements showcased the group's calculated blend of intonation, attack, phrasing, and expression. Sometimes, he would sing each vocal harmony part alone through multi-track tape.
On the group's blend, Carl said: "[Love] has a beautifully rich, very full-sounding bass voice. Yet his lead singing is real nasal, real punk. [Jardine]'s voice has a bright timbre to it; it really cuts. My voice has a kind of calm sound. We're big oooh-ers; we love to oooh. It's a big, full sound, that's very pleasing to us; it opens up the heart." Rock critic Erik Davis wrote, "The 'purity' of tone and genetic proximity that smoothed their voices was almost creepy, pseudo-castrato, [and] a 'barbershop' sound." Jimmy Webb said, "They used very little vibrato and sing in very straight tones. The voices all lie down beside each other very easily – there's no bumping between them because the pitch is very precise." According to Brian: "Jack Good once told us, 'You sing like eunuchs in a Sistine Chapel,' which was a pretty good quote." Writer Richard Goldstein reported that, according to a fellow journalist who asked Brian about the black roots of his music, Brian's response was: "We're white and we sing white." Goldstein added that when he asked Wilson where his approach to vocal harmonies derived from, Wilson answered: "Barbershop".
Nine months after forming, the group achieved national success, and demand for their personal appearances skyrocketed. Biographer James Murphy said, "By most contemporary accounts, they were not a very good live band when they started. ... The Beach Boys learned to play as a band in front of live audiences", eventually becoming "one of the best and enduring live bands". For the recording of the Beach Boys' instrumental tracks, Brian arranged many of his compositions for a conglomerate of session musicians later known as "the Wrecking Crew". Their assistance was needed because of the increasingly complicated nature of the material. Thereafter, the band members played the instrumental tracks on only some of the recordings. Richie Unterberger believes that, "Before session musicians took over most of the parts, the Beach Boys could play respectably gutsy surf rock as a self-contained unit."
Carl was an exception among the group in that he played alongside these musicians whenever he was available to attend sessions. In archivist Craig Slowinski's view, "One should not sell short Carl's own contributions; the youngest Wilson had developed as a musician sufficiently to play alongside the horde of high-dollar session pros that big brother was now bringing into the studio. Carl's guitar playing [was] a key ingredient."
A common misconception is that Dennis' drumming on the Beach Boys' recordings was handled exclusively by studio musicians. His drumming is documented on a number of the group's singles, including "I Get Around", "Fun, Fun, Fun", and "Don't Worry Baby".
The band members often reflected on the spiritual nature of their music (and of music in general), particularly during the recording of "Pet Sounds" and "Smile". Even though the Wilsons did not grow up in a particularly religious household, Carl was described as "the most truly religious person I know" by Brian, and Carl was forthcoming about the group's spiritual beliefs, stating: "We believe in God as a kind of universal consciousness. God is love. God is you. God is me. God is everything right here in this room. It's a spiritual concept which inspires a great deal of our music." Carl told "Rave" magazine in 1967 that the group's influences were of a "religious nature", but not of any specific religion, only "an idea based upon that of Universal Consciousness. ... The spiritual concept of happiness and doing good to others is extremely important to the lyric of our songs, and the religious element of some of the better church music is also contained within some of our new work."
Brian is quoted during the "Smile" era: "I'm very religious. Not in the sense of churches, going to church; but like the essence of "all" religion." During the recording of "Pet Sounds", Brian held prayer meetings, later reflecting that "God was with us the whole time we were doing this record ... I could feel that feeling in my brain." In 1966, he explained that he wanted to move into a white spiritual sound, and predicted that the rest of the music industry would follow suit. In 2011, Brian maintained that spirituality was important to his music, and that he did not follow any particular religion.
Carl said that "Smile" was chosen as an album title because of its connection to the group's spiritual beliefs. Brian referred to "Smile" as his "teenage symphony to God", composing a hymn, "Our Prayer", as the album's opening spiritual invocation. Experimentation with psychotropic substances also proved pivotal to the group's development as artists. Brian spoke of his LSD trips as a "religious experience", and during a session for "Our Prayer", he can be heard asking the other Beach Boys: "Do you guys feel any acid yet?" In 1968, the group's interest in transcendental meditation led them to record the original song "Transcendental Meditation".
The Beach Boys are one of the most critically acclaimed, commercially successful, and influential bands of all time. They have sold over 100 million records worldwide. The group's early songs made them major pop stars in the US, the UK, Australia and other countries, scoring seven top-10 singles between April 1963 and November 1964. They were one of the first American groups to exhibit the definitive traits of a self-contained rock band, playing their own instruments and writing their own songs, and they were one of the few American bands formed prior to the 1964 British Invasion to continue their success afterward. Among artists of the 1960s, they remain one of the central figures in the history of rock.
Brian Wilson's artistic control over the Beach Boys' records was unprecedented for the time. Carl Wilson elaborated: "Record companies were used to having absolute control over their artists. It was especially nervy, because Brian was a 21-year-old kid with just two albums. It was unheard of. But what could they say? Brian made good records." This made the Beach Boys one of the first rock groups to exert studio control. Music producers after the mid-1960s would draw on Brian's influence, setting a precedent that allowed bands and artists to enter a recording studio and act as producers, either autonomously or in conjunction with other like minds.
The band routinely appears in the upper reaches of ranked lists such as "The Top 1000 Albums of All Time." Many of the group's songs and albums, including "The Beach Boys Today!", "Smiley Smile", "Sunflower", and "Surf's Up"—and especially "Pet Sounds" and "Good Vibrations"—are featured in numerous lists devoted to the greatest albums or singles of all time. The latter two frequently appear in the number-one spot. On Acclaimed Music, which aggregates the rankings of decades of critics' lists, "Pet Sounds" is ranked as the greatest album of all time, while "Good Vibrations" is the third-greatest song of all time ("God Only Knows" is also ranked at number 21). The group itself is ranked number 11 on the site's list of the 1,000 most recommended artists of all time.
In 1988, the core quintet of the Wilson brothers, Love, and Jardine were inducted into the Rock and Roll Hall of Fame. Ten years later, they were selected for the Vocal Group Hall of Fame. In 2004, "Pet Sounds" was preserved in the National Recording Registry by the Library of Congress for being "culturally, historically, and aesthetically significant." Their recordings of "In My Room", "Good Vibrations", "California Girls" and the entire "Pet Sounds" album have been inducted into the Grammy Hall of Fame.
In 2017, a study of AllMusic's catalog identified the Beach Boys as the 6th most frequently cited artist influence in its database. For the 50th anniversary of "Pet Sounds", 26 artists contributed to a "Pitchfork" retrospective on its influence, which included comments from members of Talking Heads, Yo La Tengo, Chairlift, and Deftones. The editor noted that the "wide swath of artists assembled for this feature represent but a modicum of the album's vast measure of influence. Its scope transcends just about all lines of age, race, and gender. Its impact continues to broaden with each passing generation."
Professor of cultural studies James M. Curtis wrote in 1987, "We can say that the Beach Boys represent the outlook and values of white Protestant Anglo-Saxon teenagers in the early sixties. Having said that, we immediately realize that they must mean much more than this. Their stability, their staying power, and their ability to attract new fans prove as much." Cultural historian Kevin Starr explains that the group first connected with young Americans specifically for their lyrical interpretation of a mythologized landscape: "Cars and the beach, surfing, the California Girl, all this fused in the alembic of youth: Here was a way of life, an iconography, already half-released into the chords and multiple tracks of a new sound." In Robert Christgau's opinion, "the Beach Boys were a touchstone for real rock and rollers, all of whom understood that the music had its most essential roots in an innocently hedonistic materialism."
The group's "California sound" grew to national prominence through the success of their 1963 album "Surfin' U.S.A.", which helped turn the surfing subculture into a mainstream youth-targeted advertising image widely exploited by the film, television, and food industry. The group's surf music was not entirely of their own invention, being preceded by artists such as Dick Dale. However, previous surf musicians did not project a world view as the Beach Boys did. The band's earlier surf music helped raise the profile of the state of California, creating its first major regional style with national significance, and establishing a musical identity for Southern California, as opposed to Hollywood. California ultimately supplanted New York as the center of popular music thanks to the success of Brian's productions.
A 1966 article discussing new trends in rock music noted that the Beach Boys popularized a type of drum beat heard in Jan and Dean's "Surf City", which sounds like "a locomotive getting up speed", in addition to the method of "suddenly stopping in between the chorus and verse". Pete Townshend of the Who is credited with coining the term "power pop", which he defined as "what we play—what the Small Faces used to play, and the kind of pop the Beach Boys played in the days of 'Fun, Fun, Fun' which I preferred."
The California sound gradually evolved to reflect a more musically ambitious and mature world view, having less to do with surfing and cars and more with social consciousness and political awareness. Between 1964 and 1969, it fueled innovation and transition, inspiring artists to tackle largely unmentioned themes such as sexual freedom, black pride, drugs, oppositional politics, other countercultural motifs, and war. Soft pop (later known as "sunshine pop") derived in part from this movement. Sunshine pop producers widely imitated the orchestral style of "Pet Sounds"; however, the Beach Boys themselves were rarely representative of the genre, which was rooted in easy listening and advertising jingles.
By the end of the 1960s, the California sound declined due to a combination of the West Coast's cultural shifts, Wilson's professional and psychological downturn, and the Manson murders, with David Howard calling it the "sunset of the original California Sunshine Sound ... [the] sweetness advocated by the California Myth had led to chilling darkness and unsightly rot". Drawing from the Beach Boys' associations with Charles Manson and former California governor Ronald Reagan, Erik Davis remarked, "The Beach Boys may be the only bridge between those deranged poles. There is a wider range of political and aesthetic sentiments in their records than in any other band in those heady times—like the state [of California], they expand and bloat and contradict themselves."
During the 1970s, advertising jingles and imagery were predominantly based on the Beach Boys' early music and image. The group also inspired the development of the West Coast style later dubbed "yacht rock". According to "Jacobin"s Dan O'Sullivan, the band's aesthetic was the first to be "scavenged" by yacht rock acts like Rupert Holmes. O'Sullivan also cites the Beach Boys' recording of "Sloop John B" as the origin of yacht rock's preoccupation with the "sailors and beachgoers" aesthetic that was "lifted by everyone, from Christopher Cross to Eric Carmen, from 'Buffalo Springfield' folksters like Jim Messina to 'Philly Sound' rockers like Hall & Oates."
"Pet Sounds" came to inform the developments of genres such as pop, rock, jazz, electronic, experimental, punk, and hip hop. Similar to subsequent experimental rock LPs by Frank Zappa, the Beatles, and the Who, "Pet Sounds" featured countertextural aspects that called attention to the very recordedness of the album. Professor of American history John Robert Greene stated that the album broke new ground and took rock music away from its casual lyrics and melodic structures into what was then uncharted territory. He furthermore called it one factor which spawned the majority of trends in post-1965 rock music, the only others being "Rubber Soul", the Beatles' "Revolver", and the contemporary folk movement. The album was the first piece in popular music to incorporate the Electro-Theremin, an easier-to-play version of the theremin, as well as the first in rock music to feature a theremin-like instrument. With "Pet Sounds", they were also the first group to make an entire album that departed from the usual small-ensemble electric rock band format.
According to David Leaf in 1978, "Pet Sounds" and "Good Vibrations" "established the group as the leaders of a new type of pop music, Art Rock." Academic Bill Martin states that the band opened a path in rock music "that went from "Sgt. Pepper's" to "Close to the Edge" and beyond". He argues that the advancing technologies of multitrack recording and mixing boards were more influential to experimental rock than electronic instruments such as the synthesizer, allowing the Beatles and the Beach Boys to become the first crop of non-classically trained musicians to create extended and complex compositions.
The making of "Good Vibrations", according to Domenic Priore, was "unlike anything previous in the realms of classical, jazz, international, soundtrack, or any other kind of recording", while biographer Peter Ames Carlin wrote that it "sounded like nothing that had ever been played on the radio before." It contained previously untried mixes of instruments, and was the first successful pop song to have cellos in a juddering rhythm. Musicologist Charlie Gillett called it "one of the first records to flaunt studio production as a quality in its own right, rather than as a means of presenting a performance". Again, Brian employed the use of Electro-Theremin for the track. Upon release, the single prompted an unexpected revival in theremins while increasing awareness of analog synthesizers, leading Moog Music to produce their own brand of ribbon-controlled instruments. In a 1968 editorial for "Jazz & Pop", Gene Sculatti predicted that the song "may yet prove to be the most significantly revolutionary piece of the current rock renaissance ... In no minor way, 'Good Vibrations' is a primary influential piece for all producing rock artists; everyone has felt its import to some degree".
Discussing "Smiley Smile", Daniel Harrison argues that the album could "almost" be considered art music in the Western classical tradition, and that the group's innovations in the musical language of rock can be compared to those that introduced atonal and other nontraditional techniques into that classical tradition. He explains, "The spirit of experimentation is just as palpable ... as it is in, say, Schoenberg's op. 11 piano pieces." However, such notions were not widely acknowledged by rock audiences nor by the classically minded at the time. Harrison concludes: "What influences could these innovations then have? The short answer is, not much. "Smiley Smile", "Wild Honey", "Friends", and "20/20" sound like few other rock albums; they are "sui generis". … It must be remembered that the commercial failure of the Beach Boys' experiments was hardly motivation for imitation." Musicologist David Toop, who included the "Smiley Smile" track "Fall Breaks and Back to Winter" on a companion CD for his book "Ocean of Sound", placed the Beach Boys' effect on sound pioneering in league with Les Baxter, Aphex Twin, Herbie Hancock, King Tubby, and My Bloody Valentine.
"Sunflower" marked an end to the experimental songwriting and production phase initiated by "Smiley Smile". After "Surf's Up", Harrison wrote, their albums "contain a mixture of middle-of-the-road music entirely consonant with pop style during the early 1970s with a few oddities that proved that the desire to push beyond conventional boundaries was not dead," until 1974, "the year in which the Beach Boys ceased to be a rock 'n' roll act and became an oldies act."
In the 1970s, the Beach Boys served as a "totemic influence" on punk rock that later gave way to indie rock. Brad Shoup of Stereogum surmised that, thanks to the Ramones' praise for the group, many punk, pop punk, or "punk-adjacent" artists showed the Beach Boys' influence, noting cover versions of the band's songs recorded by Slickee Boys, Agent Orange, Bad Religion, Shonen Knife, the Queers, Hi-Standard, the Descendents, the Donnas, M.O.D., and the Vandals. "The Beach Boys Love You" is sometimes considered the group's "punk album", and "Pet Sounds" is sometimes advanced as the first emo album.
In the 1990s, the Beach Boys experienced a resurgence of popularity with the alternative rock generation. According to Sean O'Hagan, leader of the High Llamas and former member of Stereolab, a younger generation of record-buyers "stopped listening to indie records" in favor of the Beach Boys. Bands who advocated for the Beach Boys included founding members of the Elephant 6 Collective (Neutral Milk Hotel, the Olivia Tremor Control, the Apples in Stereo, and of Montreal). United by a shared love of the group's music, they named their recording studio Pet Sounds Studio in honor of the band. "Rolling Stone" writer Barry Walters wrote in 2000 that albums such as "Surf's Up" and "Love You" "are becoming sonic blueprints, akin to what early Velvet Underground LPs meant to the previous indie peer group." The High Llamas, Eric Matthews and St. Etienne are among the "alt heroes" who contributed cover versions of "unreleased, overlooked or underappreciated Wilson/Beach Boys obscurities" on the tribute album "Caroline Now!" (2000).
"Smile" became a touchstone for many bands who were labelled "chamber pop", a term used for artists influenced by the lush orchestrations of Brian Wilson, Lee Hazlewood, and Burt Bacharach. "Pitchfork" writer Mark Richardson cited "Smiley Smile" as the origin point of "the kind of lo-fi bedroom pop that would later propel Sebadoh, Animal Collective, and other characters." The "Sunflower" track "All I Wanna Do" is also cited as one of the earliest precursors to chillwave, a microgenre that developed in the 2000s.
Between 1965 and 1967, the Beach Boys developed a musical and lyrical sophistication that set their work of this period apart from what came before and after. This divide was further solidified by the difference in sound between their albums and their stage performances. As the band's studio recordings grew more complex, they were unable to reproduce them effectively in their live show. Starting in 1966, band publicist Derek Taylor was instrumental in promoting the idea of Brian Wilson as a "genius" to members of the burgeoning rock press, painting him as a mastermind who stayed at home composing while the rest of the band toured. All of these elements combined to create a split fanbase corresponding to two distinct musical markets. One group is the conservative audience who enjoys the band's early singles as a wholesome representation of American popular culture from before the political and social movements of the mid-1960s. The other group also appreciates the early songs for their energy and complexity, but not as much as the band's more ambitious work from the formative psychedelic era.
Initially, rock music journalists valued the Beach Boys' early records over their experimental work. Real surfers were critical of the band for not being true adherents of the sport. As authenticity became a higher concern among critics, the group's legitimacy in rock music became an oft-repeated criticism, especially since their early songs appeared to celebrate a politically unconscious youth culture. Music critic Kenneth Partridge blamed the lack of "edginess" in the group's early records for the fact that they are "rarely talked about in the same breath as the Beatles and the Rolling Stones, and when they are, it's really only because of two albums". The "particular appeal" of Wilson's genius, according to music critic Barney Hoskyns, lay in "the fact that the Beach Boys were the very obverse of hip – the unlikeliness of these songs growing out of disposable surf pop – and in the singular naivety and ingenuousness of his personality." Luis Sanchez argued that despite the immaturity of their early songs, "what matters is that it captured a lack of self-consciousness—a "genuineness"—that set them apart from their peers. And it was this quality that came to define Brian's oeuvre as he moved beyond and into bigger pop productions that would culminate in "Smile"."
Generally, the record-buying public came to view the music made after "Smile" as the point marking the group's artistic decline. After "Smiley Smile", the group was virtually blacklisted by the music press, to the extent that reviews of their records were either withheld from publication or published long after the release dates. Mike Love said that, unlike Brian, he was never concerned about being taken seriously by critics, and dismissed complaints about the "simplicity" of their early songs as "elitism at its worst: because so many people loved our music, there must be something wrong with it." In a review of "The Smile Sessions" for "NewMusicBox", Frank Oteri argued that the popular caricature of the Beach Boys as a "light-hearted party band" ensured that they would never earn themselves "the same pride of place in American music history held by other great innovators". Peter Ames Carlin summarized the group's various phases: "Once surfin' pin-ups, they remade themselves as avant-garde pop artists, then psychedelic oracles. After that they were down-home hippies, then retro-hip icons. Eventually they devolved into none of the above: a kind of perpetual-motion nostalgia machine."
Since the 1990s, there has been an increasing tendency to recontextualize the Beach Boys outside of their typical iconography, with academic Kirk Curnutt citing such examples as the use of "Sloop John B" as Vietnam allegory in the film "Forrest Gump" (1994) and "I Just Wasn't Made for These Times" as an LSD-inspired underscore for one episode of the television drama "Mad Men" (2012).
The Wilsons' California house, where the Wilson brothers grew up and the group began, was demolished in 1986 to make way for Interstate 105, the Century Freeway. A Beach Boys Historic Landmark (California Landmark No. 1041 at 3701 West 119th Street), dedicated on May 20, 2005, marks the location. On December 30, 1980, the Beach Boys were awarded a star on the Hollywood Walk of Fame, located at 1500 Vine Street. On September 2, 1977, the group performed before an audience of 40,000 at Narragansett Park in Pawtucket, Rhode Island, which remains the largest concert audience in Rhode Island history. In 2017, the street where the concert stage formerly stood was officially renamed "Beach Boys Way".
Notable supporting musicians for both the Beach Boys' live performances and studio recordings included guitarist Glen Campbell, keyboardists Daryl Dragon and Toni Tennille (Captain & Tennille), and saxophonist Charles Lloyd.
Beatrix Potter
Helen Beatrix Potter (28 July 1866 – 22 December 1943) was an English writer, illustrator, natural scientist, and conservationist best known for her children's books featuring animals, such as those in "The Tale of Peter Rabbit".
Born into an upper-middle-class household, Potter was educated by governesses and grew up isolated from other children. She had numerous pets and spent holidays in Scotland and the Lake District, developing a love of landscape, flora, and fauna, all of which she closely observed and painted.
Potter's study and watercolours of fungi led to her being widely respected in the field of mycology. In her thirties, Potter self-published the highly successful children's book "The Tale of Peter Rabbit". Following this, Potter began writing and illustrating children's books full-time.
In all, Potter wrote thirty books; the best known being her twenty-three children's tales. With the proceeds from the books and a legacy from an aunt, in 1905 Potter bought Hill Top Farm in Near Sawrey, a village in the Lake District which at that time was in Lancashire. Over the following decades, she purchased additional farms to preserve the unique hill country landscape. In 1913, at the age of 47, she married William Heelis, a respected local solicitor from Hawkshead. Potter was also a prize-winning breeder of Herdwick sheep and a prosperous farmer keenly interested in land preservation. She continued to write and illustrate, and to design spin-off merchandise based on her children's books for British publisher Warne until the duties of land management and her diminishing eyesight made it difficult to continue.
Potter died of pneumonia and heart disease on 22 December 1943 at her home in Near Sawrey at the age of 77, leaving almost all her property to the National Trust. She is credited with preserving much of the land that now constitutes the Lake District National Park. Potter's books continue to sell throughout the world in many languages with her stories being retold in songs, films, ballet, and animations, and her life depicted in a feature film and television film.
Potter's paternal grandfather, Edmund Potter, from Glossop in Derbyshire, owned what was then the largest calico printing works in England, and later served as a Member of Parliament.
Beatrix's father, Rupert William Potter (1832–1914), was educated at Manchester College by the Unitarian philosopher James Martineau. He then trained as a barrister in London. Rupert practised law, specialising in equity law and conveyancing. He married Helen Leech (1839–1932) on 8 August 1863 at Hyde Unitarian Chapel, Gee Cross. Helen was the daughter of Jane Ashton (1806–1884) and John Leech, a wealthy cotton merchant and shipbuilder from Stalybridge. Helen's first cousins included Harriet Lupton ("née" Ashton), sister of Thomas Ashton, 1st Baron Ashton of Hyde. It was reported in July 2014 that Beatrix had personally given a number of her own original hand-painted illustrations to the two daughters of Arthur and Harriet Lupton, who were cousins to both Beatrix and Catherine, Duchess of Cambridge.
Beatrix's parents lived comfortably at 2 Bolton Gardens, West Brompton, where Helen Beatrix was born on 28 July 1866 and her brother Walter Bertram on 14 March 1872. Beatrix lived in the house until her marriage in 1913. The house was destroyed in the Blitz. Bousfield Primary School now stands where the house once was. A blue plaque on the school building testifies to the former site of the Potter home.
Both parents were artistically talented, and Rupert was an adept amateur photographer. Rupert had invested in the stock market, and by the early 1890s, he was extremely wealthy.
Potter's family on both sides were from the Manchester area. They were English Unitarians, associated with dissenting Protestant congregations influential in 19th-century England, which affirmed the oneness of God and rejected the doctrine of the Trinity.
Beatrix was educated by three able governesses, the last of whom was Annie Moore ("née" Carter), just three years older than Beatrix, who tutored Beatrix in German as well as acting as lady's companion. She and Beatrix remained friends throughout their lives, and Annie's eight children were the recipients of many of Potter's delightful picture letters. It was Annie who later suggested that these letters might make good children's books.
She and her younger brother Walter Bertram (1872–1918) grew up with few friends outside their large extended family. Her parents were artistic, interested in nature, and enjoyed the countryside. As children, Beatrix and Bertram had numerous small animals as pets which they observed closely and drew endlessly. In their schoolroom, they kept mice, rabbits, a hedgehog and some bats, along with collections of butterflies and other insects which they drew and studied. Beatrix was devoted to the care of her small animals, often taking them with her on long holidays. For most of the first fifteen years of her life, Beatrix spent summer holidays at Dalguise, an estate on the River Tay in Perthshire, Scotland. There she sketched and explored an area that nourished her imagination and her observation. Beatrix and her brother were allowed great freedom in the country, and both children became adept students of natural history. In 1882, when Dalguise was no longer available, the Potters took their first summer holiday in the Lake District, at Wray Castle near Lake Windermere. Here Beatrix met Hardwicke Rawnsley, vicar of Wray and later the founding secretary of the National Trust, whose interest in the countryside and country life inspired the same in Beatrix and who was to have a lasting impact on her life.
At about the age of 14, Beatrix began to keep a diary. It was written in a code of her own devising which was a simple letter for letter substitution. Her "Journal" was important to the development of her creativity, serving as both sketchbook and literary experiment: in tiny handwriting, she reported on society, recorded her impressions of art and artists, recounted stories and observed life around her. The "Journal", decoded and transcribed by Leslie Linder in 1958, does not provide an intimate record of her personal life, but it is an invaluable source for understanding a vibrant part of British society in the late 19th century. It describes Potter's maturing artistic and intellectual interests, her often amusing insights on the places she visited, and her unusual ability to observe nature and to describe it. Started in 1881, her journal ends in 1897 when her artistic and intellectual energies were absorbed in scientific study and in efforts to publish her drawings. Precocious but reserved and often bored, she was searching for more independent activities and wished to earn some money of her own while dutifully taking care of her parents, dealing with her especially demanding mother, and managing their various households.
Beatrix Potter's parents did not discourage higher education. As was common in the Victorian era, women of her class were privately educated and rarely went to university.
Beatrix Potter was interested in every branch of natural science save astronomy. Botany was a passion for most Victorians and nature study was a popular enthusiasm. Potter was eclectic in her tastes: collecting fossils, studying archaeological artefacts from London excavations, and taking an interest in entomology. In all these areas, she drew and painted her specimens with increasing skill. By the 1890s, her scientific interests centred on mycology. She was first drawn to fungi because of their colours and evanescence in nature and her delight in painting them; her interest deepened after meeting Charles McIntosh, a revered naturalist and amateur mycologist, during a summer holiday in Dunkeld in Perthshire in 1892. He helped improve the accuracy of her illustrations, taught her taxonomy, and supplied her with live specimens to paint during the winter. Curious as to how fungi reproduced, Potter began microscopic drawings of fungus spores (the agarics) and in 1895 developed a theory of their germination. Through the connections of her uncle Sir Henry Enfield Roscoe, a chemist and vice-chancellor of the University of London, she consulted with botanists at Kew Gardens, convincing George Massee of her ability to germinate spores and her theory of hybridisation. Contrary to what was previously thought, she did not subscribe to the theory of symbiosis proposed by the German mycologist Simon Schwendener; instead, she proposed a more independent process of reproduction.
Rebuffed by William Thiselton-Dyer, the Director at Kew, because of her sex and her amateur status, Beatrix wrote up her conclusions and submitted a paper, "On the Germination of the Spores of the Agaricineae", to the Linnean Society in 1897. It was introduced by Massee because, as a female, Potter could not attend proceedings or read her paper. She subsequently withdrew it, realising that some of her samples were contaminated, but continued her microscopic studies for several more years. Her paper has only recently been rediscovered, along with the rich, artistic illustrations and drawings that accompanied it. Her work is only now being properly evaluated. Potter later gave her other mycological and scientific drawings to the Armitt Museum and Library in Ambleside, where mycologists still refer to them to identify fungi. There is also a collection of her fungus paintings at the Perth Museum and Art Gallery in Perth, Scotland, donated by Charles McIntosh. In 1967, the mycologist W.P.K. Findlay included many of Potter's beautifully accurate fungus drawings in his "Wayside & Woodland Fungi", thereby fulfilling her desire to one day have her fungus drawings published in a book. In 1997, the Linnean Society issued a posthumous apology to Potter for the sexism displayed in its handling of her research.
Potter's artistic and literary interests were deeply influenced by fairies, fairy tales and fantasy. She was a student of the classic fairy tales of Western Europe. As well as stories from the Old Testament, John Bunyan's "The Pilgrim's Progress" and Harriet Beecher Stowe's "Uncle Tom's Cabin", she grew up with "Aesop's Fables", the fairy tales of the Brothers Grimm and Hans Christian Andersen, Charles Kingsley's "The Water Babies", the folk tales and mythology of Scotland, the German Romantics, Shakespeare, and the romances of Sir Walter Scott. As a young child, before the age of eight, Edward Lear's "Book of Nonsense", including the much loved "The Owl and the Pussycat", and Lewis Carroll's "Alice in Wonderland" had made their impression, although she later said of "Alice" that she was more interested in Tenniel's illustrations than what they were about. The "Brer Rabbit" stories of Joel Chandler Harris had been family favourites, and she later studied his "Uncle Remus" stories and illustrated them. She studied book illustration from a young age and developed her own tastes, but the work of the picture book triumvirate Walter Crane, Kate Greenaway and Randolph Caldecott, the last an illustrator whose work was later collected by her father, was a great influence. When she started to illustrate, she chose first the traditional rhymes and stories, "Cinderella", "Sleeping Beauty", "Ali Baba and the Forty Thieves", "Puss-in-boots", and "Red Riding Hood". However, most often her illustrations were fantasies featuring her own pets: mice, rabbits, kittens, and guinea pigs.
In her teenage years, Potter was a regular visitor to the art galleries of London, particularly enjoying the summer and winter exhibitions at the Royal Academy in London. Her "Journal" reveals her growing sophistication as a critic as well as the influence of her father's friend, the artist Sir John Everett Millais, who recognised Beatrix's talent of observation. Although Potter was aware of art and artistic trends, her drawing and her prose style were uniquely her own.
As a way to earn money in the 1890s, Beatrix and her brother began to print Christmas cards of their own design, as well as cards for special occasions. Mice and rabbits were the most frequent subject of her fantasy paintings. In 1890, the firm of Hildesheimer and Faulkner bought several of the drawings of her rabbit Benjamin Bunny to illustrate verses by Frederic Weatherly titled "A Happy Pair". In 1893, the same printer bought several more drawings for Weatherly's "Our Dear Relations", another book of rhymes, and the following year Potter sold a series of frog illustrations and verses for "Changing Pictures", a popular annual offered by the art publisher Ernest Nister. Potter was pleased by this success and determined to publish her own illustrated stories.
Whenever Potter went on holiday to the Lake District or Scotland, she sent letters to young friends, illustrating them with quick sketches. Many of these letters were written to the children of her former governess Annie Carter Moore, particularly to Moore's eldest son Noel who was often ill. In September 1893, Potter was on holiday at Eastwood in Dunkeld, Perthshire. She had run out of things to say to Noel, and so she told him a story about "four little rabbits whose names were Flopsy, Mopsy, Cottontail and Peter". It became one of the most famous children's letters ever written and the basis of Potter's future career as a writer-artist-storyteller.
In 1900, Potter revised her tale about the four little rabbits and fashioned a dummy book of it, in imitation (it has been suggested) of Helen Bannerman's 1899 bestseller "The Story of Little Black Sambo". Unable to find a buyer for the work, she published it for family and friends at her own expense in December 1901. It was drawn in black and white with a coloured frontispiece. Rawnsley had great faith in Potter's tale, recast it in didactic verse, and made the rounds of the London publishing houses. Frederick Warne & Co had previously rejected the tale but, eager to compete in the booming small-format children's book market, reconsidered and accepted the "bunny book" (as the firm called it) following the recommendation of their prominent children's book artist L. Leslie Brooke. The firm declined Rawnsley's verse in favour of Potter's original prose, and Potter agreed to colour her pen and ink illustrations, choosing the then-new Hentschel three-colour process to reproduce her watercolours.
On 2 October 1902, "The Tale of Peter Rabbit" was published, and was an immediate success. It was followed the next year by "The Tale of Squirrel Nutkin" and "The Tailor of Gloucester", which had also first been written as picture letters to the Moore children. Working with Norman Warne as her editor, Potter published two or three little books each year: 23 books in all. The last book in this format was "Cecily Parsley's Nursery Rhymes" in 1922, a collection of favourite rhymes. Although "The Tale of Little Pig Robinson" was not published until 1930, it had been written much earlier. Potter continued creating her little books until after the First World War when her energies were increasingly directed toward her farming, sheep-breeding and land conservation.
The immense popularity of Potter's books was based on the lively quality of her illustrations, the non-didactic nature of her stories, the depiction of the rural countryside, and the imaginative qualities she lent to her animal characters.
Potter was also a canny businesswoman. As early as 1903, she made and patented a Peter Rabbit doll. It was followed by other "spin-off" merchandise over the years, including painting books, board games, wall-paper, figurines, baby blankets and china tea-sets. All were licensed by Frederick Warne & Co and earned Potter an independent income, as well as immense profits for her publisher.
In 1905, Potter and Norman Warne became unofficially engaged. Potter's parents objected to the match because Warne was "in trade" and thus not socially suitable. The engagement lasted only one month until Warne died of pernicious anaemia at age 37. That same year, Potter used some of her income and a small inheritance from an aunt to buy Hill Top Farm in Near Sawrey in the English Lake District near Windermere. Potter and Warne may have hoped that Hill Top Farm would be their holiday home, but after Warne's death, Potter went ahead with its purchase as she had always wanted to own that farm, and live in "that charming village".
The tenant farmer John Cannon and his family agreed to stay on to manage the farm for her while she made physical improvements and learned the techniques of fell farming and of raising livestock, including pigs, cows and chickens; the following year she added sheep. Realising she needed to protect her boundaries, she sought advice from W.H. Heelis & Son, a local firm of solicitors with offices in nearby Hawkshead. With William Heelis acting for her, she bought contiguous pasture, and in 1909 the Castle Farm across the road from Hill Top Farm. She visited Hill Top at every opportunity, and her books written during this period (such as "The Tale of Ginger and Pickles", about the local shop in Near Sawrey and "The Tale of Mrs. Tittlemouse", a wood mouse) reflect her increasing participation in village life and her delight in country living.
Owning and managing these working farms required routine collaboration with the widely respected William Heelis. By the summer of 1912, Heelis had proposed marriage and Beatrix had accepted, although she did not immediately tell her parents, who once again disapproved because Heelis was only a country solicitor. Potter and Heelis were married on 15 October 1913 in London at St Mary Abbots in Kensington. The couple moved immediately to Near Sawrey, residing at Castle Cottage, the renovated farmhouse on the 34-acre Castle Farm. Hill Top remained a working farm but was now remodelled to allow for the tenant family and Potter's private studio and workshop. At last her own woman, Potter settled into the partnerships that shaped the rest of her life: her country solicitor husband and his large family, her farms, the Sawrey community and the predictable rounds of country life. "The Tale of Jemima Puddle-Duck" and "The Tale of Tom Kitten" are representative of Hill Top Farm and her farming life and reflect her happiness with her country life.
Rupert Potter died in 1914 and, with the outbreak of World War I, Potter, now a wealthy woman, persuaded her mother to move to the Lake District and found a property for her to rent in Sawrey. Finding life in Sawrey dull, Helen Potter soon moved to Lindeth Howe (now a 34-bedroom hotel), a large house the Potters had previously rented for the summer in Bowness, on the other side of Lake Windermere. Potter continued to write stories for Frederick Warne & Co and fully participated in country life. She established a Nursing Trust for local villages and served on various committees and councils responsible for footpaths and other rural issues.
Soon after acquiring Hill Top Farm, Potter became keenly interested in the breeding and raising of Herdwick sheep, the indigenous fell sheep. In 1923 she bought a large sheep farm in the Troutbeck Valley called Troutbeck Park Farm, formerly a deer park, restoring its land with thousands of Herdwick sheep. This established her as one of the major Herdwick sheep farmers in the county. She was admired by her shepherds and farm managers for her willingness to experiment with the latest biological remedies for the common diseases of sheep, and for her employment of the best shepherds, sheep breeders, and farm managers.
By the late 1920s, Potter and her Hill Top farm manager Tom Storey had made a name for their prize-winning Herdwick flock, which took many prizes at the local agricultural shows, where Potter was often asked to serve as a judge. In 1942 she became President-elect of the Herdwick Sheepbreeders' Association, the first time a woman had been elected, but she died before taking office.
Potter had been a disciple of the land conservation and preservation ideals of her long-time friend and mentor, Canon Hardwicke Rawnsley, the first secretary and founding member of the National Trust for Places of Historic Interest or Natural Beauty. She supported the efforts of the National Trust to preserve not just the places of extraordinary beauty but also those heads of valleys and low grazing lands that would be irreparably ruined by development. Potter was also an authority on the traditional Lakeland crafts, period furniture and stonework. She restored and preserved the farms that she bought or managed, making sure that each farm house had in it a piece of antique Lakeland furniture. Potter was interested in preserving not only the Herdwick sheep but also the way of life of fell farming. In 1930 the Heelises became partners with the National Trust in buying and managing the fell farms included in the large Monk Coniston Estate. The estate was composed of many farms spread over a wide area of north-western Lancashire, including the Tarn Hows. Potter was the "de facto" estate manager for the Trust for seven years until the National Trust could afford to repurchase most of the property from her. Potter's stewardship of these farms earned her high regard, but she was not without critics, not least among them contemporaries who felt she used her wealth and her husband's position to acquire properties in advance of their being made public. She was notable in observing the problems of afforestation, preserving the intake grazing lands, and husbanding the quarries and timber on these farms. All her farms were stocked with Herdwick sheep and frequently with Galloway cattle.
Potter continued to write stories and to draw, although mostly for her own pleasure. Her books in the late 1920s included the semi-autobiographical "The Fairy Caravan", a fanciful tale set in her beloved Troutbeck fells. It was published only in the US during Potter's lifetime, and not until 1952 in the UK. "Sister Anne", Potter's version of the story of Bluebeard, was written for her American readers, but illustrated by Katharine Sturges. A final folktale, "Wag by Wall", was published posthumously by "The Horn Book Magazine" in 1944. Potter was a generous patron of the Girl Guides, whose troops she allowed to make their summer encampments on her land, and whose company she enjoyed as an older woman.
Potter and William Heelis enjoyed a happy marriage of thirty years, continuing their farming and preservation efforts throughout the hard days of World War II. Although they were childless, Potter played an important role in William's large family, particularly enjoying her relationship with several nieces whom she helped educate, and giving comfort and aid to her husband's brothers and sisters.
Potter died of complications from pneumonia and heart disease on 22 December 1943 at Castle Cottage, and her remains were cremated at Carleton Crematorium. She left nearly all her property to the National Trust, including over 4,000 acres (16 km²) of land, sixteen farms, cottages and herds of cattle and Herdwick sheep. Hers was the largest gift at that time to the National Trust, and it enabled the preservation of the land now included in the Lake District National Park and the continuation of fell farming. The central office of the National Trust in Swindon was named "Heelis" in 2005 in her memory. William Heelis continued his stewardship of their properties and of her literary and artistic work for the twenty months he survived her. When he died in August 1945, he left the remainder to the National Trust.
Potter left almost all the original illustrations for her books to the National Trust. The copyright to her stories and merchandise was then given to her publisher Frederick Warne & Co, now a division of the Penguin Group. On 1 January 2014, the copyright expired in the UK and other countries with a life-plus-70-years copyright term. Hill Top Farm was opened to the public by the National Trust in 1946; her artwork was displayed there until 1985, when it was moved to William Heelis's former law offices in Hawkshead, also owned by the National Trust as the Beatrix Potter Gallery.
Potter gave her folios of mycological drawings to the Armitt Library and Museum in Ambleside before her death. The original artwork for "The Tale of Peter Rabbit" is owned by Frederick Warne and Company, that for "The Tailor of Gloucester" by the Tate Gallery, and that for "The Tale of the Flopsy Bunnies" by the British Museum.
The largest public collection of her letters and drawings is the Leslie Linder Bequest and Leslie Linder Collection at the Victoria and Albert Museum in London. In the United States, the largest public collections are those in the Rare Book Department of the Free Library of Philadelphia, and the Cotsen Children's Library at Princeton University.
In 2015 a manuscript for an unpublished book was discovered by Jo Hanks, a publisher at Penguin Random House Children's Books, in the Victoria and Albert Museum archive. The book, "The Tale of Kitty-in-Boots", with illustrations by Quentin Blake, was published on 1 September 2016 to mark the 150th anniversary of Potter's birth.
In 2017, "The Art of Beatrix Potter: Sketches, Paintings, and Illustrations" by Emily Zach was published after San Francisco publisher Chronicle Books decided to mark the 150th anniversary of Beatrix Potter's birth by showing that she was "far more than a 19th-century weekend painter. She was an artist of astonishing range."
In December 2017, the asteroid 13975 Beatrixpotter, discovered by Belgian astronomer Eric Elst in 1992, was named in her memory.
There are many interpretations of Potter's literary work, the sources of her art, and her life and times. These include critical evaluations of her corpus of children's literature and Modernist interpretations by Humphrey Carpenter and Katherine Chandler. Judy Taylor's "That Naughty Rabbit: Beatrix Potter and Peter Rabbit" (rev. 2002) tells the story of the first publication and of its many editions.
Potter's country life and her farming have been discussed in the work of Susan Denyer and other authors in the publications of The National Trust, such as "Beatrix Potter at Home in the Lake District" (2004).
Potter's work as a scientific illustrator and her work in mycology are discussed in Linda Lear's books "Beatrix Potter: A Life in Nature" (2006) and "Beatrix Potter: The Extraordinary Life of a Victorian Genius" (2008).
In 1971, a ballet film was released, "The Tales of Beatrix Potter", directed by Reginald Mills, set to music by John Lanchbery with choreography by Frederick Ashton, and performed in character costume by members of the Royal Ballet, accompanied by the Royal Opera House orchestra. The ballet of the same name has been performed by other dance companies around the world.
In 1992, Potter's famous children's book "The Tale of Benjamin Bunny" was featured in the film "Lorenzo's Oil".
Potter is also featured in Susan Wittig Albert's series of light mysteries called The Cottage Tales of Beatrix Potter. The first of the eight-book series is "Tale of Hill Top Farm" (2004), which deals with Potter's life in the Lake District and the village of Near Sawrey between 1905 and 1913.
In 1982, the BBC produced "The Tale of Beatrix Potter". This dramatization of her life was written by John Hawkesworth, directed by Bill Hayes, and starred Holly Aird and Penelope Wilton as the young and adult Beatrix, respectively. "The World of Peter Rabbit and Friends", a TV series based on her stories, starred Niamh Cusack as Beatrix Potter.
In 2006, Chris Noonan directed "Miss Potter", a biographical film of Potter's life focusing on her early career and romance with her editor Norman Warne. The film stars Renée Zellweger, Ewan McGregor and Emily Watson.
On 9 February 2018, Columbia Pictures released "Peter Rabbit", directed by Will Gluck, based on the work by Potter.
Liberal Party (UK)
The Liberal Party was one of the two major political parties in the United Kingdom, along with the rival Conservative Party, in the 19th and early 20th centuries. The party arose from an alliance of Whigs and free trade-supporting Peelites and the reformist Radicals in the 1850s. By the end of the 19th century, it had formed four governments under William Gladstone. Despite being divided over the issue of Irish Home Rule, the party returned to government in 1905 and then won a landslide victory in the following year's general election.
Under prime ministers Henry Campbell-Bannerman (1905–1908) and H. H. Asquith (1908–1916), the Liberal Party passed the welfare reforms that created a basic British welfare state. Although Asquith was the party's leader, its dominant figure was David Lloyd George. Asquith was overwhelmed by the wartime role of coalition prime minister and Lloyd George replaced him as prime minister in late 1916, but Asquith remained as Liberal Party leader. The pair fought for years over control of the party, badly weakening it in the process. In "The Oxford Companion to British History", historian Martin Pugh argues:
The government of Lloyd George was dominated by the Conservative Party, which finally deposed him in 1922. By the end of the 1920s, the Labour Party had replaced the Liberals as the Conservatives' main rival. The Liberal Party went into decline after 1918 and by the 1950s won no more than six seats at general elections. Apart from notable by-election victories, its fortunes did not improve significantly until it formed the SDP–Liberal Alliance with the newly formed Social Democratic Party (SDP) in 1981. At the 1983 general election, the Alliance won over a quarter of the vote, but only 23 of the Commons' 650 seats. At the 1987 general election, its share of the vote fell below 23% and the Liberals and the Social Democratic Party merged in 1988 to form the Liberal Democrats. A splinter group reconstituted the Liberal Party in 1989.
Prominent intellectuals associated with the Liberal Party include the philosopher John Stuart Mill, the economist John Maynard Keynes and social planner William Beveridge.
The Liberal Party grew out of the Whigs, who had their origins in an aristocratic faction in the reign of Charles II, and out of the early 19th-century Radicals. The Whigs were in favour of reducing the power of the Crown and increasing the power of Parliament. Although their motives in this were originally to gain more power for themselves, the more idealistic Whigs gradually came to support an expansion of democracy for its own sake. The great figures of reformist Whiggery were Charles James Fox (died 1806) and his disciple and successor Earl Grey. After decades in opposition, the Whigs returned to power under Grey in 1830 and carried the First Reform Act in 1832.
The Reform Act was the climax of Whiggism, but it also brought about the Whigs' demise. The admission of the middle classes to the franchise and to the House of Commons led eventually to the development of a systematic middle class liberalism and the end of Whiggery, although for many years reforming aristocrats held senior positions in the party. In the years after Grey's retirement, the party was led first by Lord Melbourne, a fairly traditional Whig, and then by Lord John Russell, the son of a Duke but a crusading radical, and by Lord Palmerston, a renegade Irish Tory and essentially a conservative, although capable of radical gestures.
As early as 1839, Russell had adopted the name of "Liberals", but in reality his party was a loose coalition of Whigs in the House of Lords and Radicals in the Commons. The leading Radicals were John Bright and Richard Cobden, who represented the manufacturing towns which had gained representation under the Reform Act. They favoured social reform, personal liberty, reducing the powers of the Crown and the Church of England (many Liberals were Nonconformists), avoidance of war and foreign alliances (which were bad for business) and above all free trade. For a century, free trade remained the one cause which could unite all Liberals.
In 1841, the Liberals lost office to the Conservatives under Sir Robert Peel, but their period in opposition was short because the Conservatives split over the repeal of the Corn Laws, a free trade issue, and a faction known as the Peelites (though not Peel himself, who died soon after) defected to the Liberal side. This allowed ministries led by Russell, Palmerston and the Peelite Lord Aberdeen to hold office for most of the 1850s and 1860s. A leading Peelite was William Ewart Gladstone, who was a reforming Chancellor of the Exchequer in most of these governments. The formal foundation of the Liberal Party is traditionally traced to 1859 and the formation of Palmerston's second government.
However, the Whig-Radical amalgam could not become a true modern political party while it was dominated by aristocrats and it was not until the departure of the "Two Terrible Old Men", Russell and Palmerston, that Gladstone could become the first leader of the modern Liberal Party. This was brought about by Palmerston's death in 1865 and Russell's retirement in 1868. After a brief Conservative government (during which the Second Reform Act was passed by agreement between the parties), Gladstone won a huge victory at the 1868 election and formed the first Liberal government. The establishment of the party as a national membership organisation came with the foundation of the National Liberal Federation in 1877. The philosopher John Stuart Mill was also a Liberal MP from 1865 to 1868.
For the next thirty years Gladstone and Liberalism were synonymous. William Ewart Gladstone served as prime minister four times (1868–74, 1880–85, 1886, and 1892–94). His financial policies, based on the notion of balanced budgets, low taxes and "laissez-faire", were suited to a developing capitalist society, but they could not respond effectively as economic and social conditions changed. Called the "Grand Old Man" later in life, Gladstone was always a dynamic popular orator who appealed strongly to the working class and to the lower middle class. Deeply religious, Gladstone brought a new moral tone to politics, with his evangelical sensibility and his opposition to aristocracy. His moralism often angered his upper-class opponents (including Queen Victoria), and his heavy-handed control split the Liberal Party.
In foreign policy, Gladstone was in general against foreign entanglements, but he did not resist the realities of imperialism. For example, he ordered the occupation of Egypt by British forces in 1882. His goal was to create a European order based on co-operation rather than conflict and on mutual trust instead of rivalry and suspicion; the rule of law was to supplant the reign of force and self-interest. This Gladstonian concept of a harmonious Concert of Europe was opposed to and ultimately defeated by a Bismarckian system of manipulated alliances and antagonisms.
As prime minister from 1868 to 1874, Gladstone headed a Liberal Party which was a coalition of Peelites like himself, Whigs and Radicals. He was now a spokesman for "peace, economy and reform". One major achievement was the Elementary Education Act of 1870, which provided England with an adequate system of elementary schools for the first time. He also secured the abolition of the purchase of commissions in the army and of religious tests for admission to Oxford and Cambridge; the introduction of the secret ballot in elections; the legalization of trade unions; and the reorganization of the judiciary in the Judicature Act.
Regarding Ireland, the major Liberal achievements were land reform, where he ended centuries of landlord oppression, and the disestablishment of the (Anglican) Church of Ireland through the Irish Church Act 1869.
In the 1874 general election Gladstone was defeated by the Conservatives under Benjamin Disraeli during a sharp economic recession. He formally resigned as Liberal leader and was succeeded by the Marquess of Hartington, but he soon changed his mind and returned to active politics. He strongly disagreed with Disraeli's pro-Ottoman foreign policy and in 1880 he conducted the first outdoor mass-election campaign in Britain, known as the Midlothian campaign. The Liberals won a large majority in the 1880 election. Hartington ceded his place and Gladstone resumed office.
Among the consequences of the Third Reform Act (1884) was the extension of the vote to many Catholics in Ireland. In the 1885 general election the Irish Parliamentary Party held the balance of power in the House of Commons, and demanded Irish Home Rule as the price of support for a continued Gladstone ministry. Gladstone personally supported Home Rule, but a strong Liberal Unionist faction led by Joseph Chamberlain, along with the last of the Whigs, Hartington, opposed it. The Irish Home Rule bill proposed to offer all owners of Irish land a chance to sell to the state at a price equal to 20 years' purchase of the rents, and to allow tenants to purchase the land. Irish nationalist reaction was mixed, Unionist opinion was hostile, and the election addresses during the 1886 election revealed English radicals to be against the bill also. Among the Liberal rank and file, several Gladstonian candidates disowned the bill, reflecting fears at the constituency level that the interests of the working people were being sacrificed to finance a costly rescue operation for the landed élite. Further, Home Rule had not been promised in the Liberals' election manifesto, and so the impression was given that Gladstone was buying Irish support in a rather desperate manner to hold on to power.
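To make the valuation rule concrete: "twenty years' purchase" prices land at twenty times its annual rent. A minimal worked example (the £50 rent figure is purely illustrative, not taken from the bill itself):

```latex
\text{price} = 20 \times \text{annual rent}
\qquad\Rightarrow\qquad
\text{a holding rented at } \pounds 50 \text{ a year would be bought out at } 20 \times \pounds 50 = \pounds 1{,}000.
```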
The result was a catastrophic split in the Liberal Party, and heavy defeat in the 1886 election at the hands of Lord Salisbury, who was supported by the breakaway Liberal Unionist Party. There was a final weak Gladstone ministry in 1892, but it also was dependent on Irish support and failed to get Irish Home Rule through the House of Lords.
Historically, the aristocracy was divided between Conservatives and Liberals. However, when Gladstone committed to Home Rule for Ireland, Britain's upper classes largely abandoned the Liberal Party, giving the Conservatives a large permanent majority in the House of Lords. Following the Queen's lead, high society in London largely ostracized Home Rulers, and Liberal clubs were badly split. Joseph Chamberlain took a major element of upper-class supporters out of the party and into a third party, the Liberal Unionists, over the Irish issue. It collaborated with and eventually merged into the Conservative Party. The Gladstonian Liberals in 1891 adopted the Newcastle Programme, which included Home Rule for Ireland, disestablishment of the Church of England in Wales, tighter controls on the sale of liquor, major extension of factory regulation and various democratic political reforms. The Programme had a strong appeal to the nonconformist middle-class Liberal element, which felt liberated by the departure of the aristocracy.
A major long-term consequence of the Third Reform Act was the rise of Lib-Lab candidates, in the absence of any committed Labour Party. The Act split all county constituencies (which were represented by multiple MPs) into single-member constituencies, roughly corresponding to population patterns. In areas with working class majorities, in particular coal-mining areas, Lib-Lab candidates were popular, and they received sponsorship and endorsement from trade unions. In the first election after the Act was passed (1885), thirteen were elected, up from two in 1874. The Third Reform Act also facilitated the demise of the Whig old guard: in two-member constituencies, it was common to pair a Whig and a radical under the Liberal banner. After the Third Reform Act, fewer Whigs were selected as candidates.
A broad range of interventionist reforms were introduced by the 1892–1895 Liberal government. Amongst other measures, standards of accommodation and of teaching in schools were improved, factory inspection was made more stringent, and ministers used their powers to increase the wages and reduce the working hours of large numbers of male workers employed by the state.
Historian Walter L. Arnstein concludes:
Gladstone finally retired in 1894. Gladstone's support for Home Rule deeply divided the party, and it lost its upper- and upper-middle-class base, while keeping support among Protestant nonconformists and the Celtic fringe. Historian R. C. K. Ensor reports that after 1886, the main Liberal Party was deserted by practically the entire Whig peerage and the great majority of the upper-class and upper-middle-class members. High-prestige London clubs that had a Liberal base were deeply split. Ensor notes that, "London society, following the known views of the Queen, practically ostracized home rulers."
The new Liberal leader was the ineffectual Lord Rosebery. He led the party to a heavy defeat in the 1895 general election.
The Liberal Party lacked a unified ideological base in 1906. It contained numerous contradictory and hostile factions, such as imperialists and supporters of the Boers; near-socialists and laissez-faire classical liberals; suffragettes and opponents of women's suffrage; antiwar elements and supporters of the military alliance with France. Nonconformists – Protestants outside the Anglican fold – were a powerful element, dedicated to opposing the established church in terms of education and taxation. However, the Nonconformists were losing support in society at large and played a lesser role in party affairs after 1900. The party also included Irish Catholics and secularists from the labour movement. Many Conservatives (including Winston Churchill) had recently protested against high-tariff moves by their party by switching to the anti-tariff Liberal camp, but it was unclear how many old Conservative traits they brought along, especially on military and naval issues.
The middle-class business, professional and intellectual communities were generally Liberal strongholds, although some old aristocratic families played important roles as well. The working-class element was moving rapidly toward the newly emerging Labour Party. One uniting element was widespread agreement on the use of politics and Parliament as a device to upgrade and improve society and to reform politics. All Liberals were outraged when Conservatives used their majority in the House of Lords to block reform legislation.
The late nineteenth century saw the emergence of New Liberalism within the Liberal Party, which advocated state intervention as a means of guaranteeing freedom and removing obstacles to it such as poverty and unemployment. The policies of the New Liberalism are now known as social liberalism.
The New Liberals included intellectuals like L. T. Hobhouse, and John A. Hobson. They saw individual liberty as something achievable only under favourable social and economic circumstances. In their view, the poverty, squalor, and ignorance in which many people lived made it impossible for freedom and individuality to flourish. New Liberals believed that these conditions could be ameliorated only through collective action coordinated by a strong, welfare-oriented, and interventionist state.
After the historic 1906 victory, the Liberal Party introduced multiple reforms on a range of issues, including health insurance, unemployment insurance, and pensions for elderly workers, thereby laying the groundwork for the future British welfare state. Some proposals failed, such as licensing fewer pubs, or rolling back Conservative educational policies. The People's Budget of 1909, championed by David Lloyd George and fellow Liberal Winston Churchill, introduced unprecedented taxes on the wealthy in Britain and radical social welfare programmes to the country's policies. It was the first budget with the expressed intent of redistributing wealth among the public. It imposed increased taxes on luxuries, liquor, tobacco, high incomes, and land – taxation that fell heavily on the rich. The new money was to be made available for new welfare programmes as well as new battleships. In 1911 Lloyd George succeeded in putting through Parliament his National Insurance Act, making provision for sickness and invalidism, and this was followed by his Unemployment Insurance Act.
Historian Peter Weiler argues:
Contrasting Old Liberalism with New Liberalism, David Lloyd George noted in a 1908 speech the following:
The Liberals languished in opposition for a decade while the coalition of Salisbury and Chamberlain held power. The 1890s were marred by infighting among the three principal successors to Gladstone: party leader William Harcourt, former prime minister Lord Rosebery, and Gladstone's personal secretary, John Morley. This intrigue finally led Harcourt and Morley to resign their positions in 1898 as they continued to be at loggerheads with Rosebery over Irish home rule and issues relating to imperialism. Replacing Harcourt as party leader was Sir Henry Campbell-Bannerman. Harcourt's resignation briefly muted the turmoil in the party, but the beginning of the Second Boer War soon nearly broke the party apart, with Rosebery and a circle of supporters, including the important future Liberal figures H. H. Asquith, Edward Grey and Richard Burdon Haldane, forming a clique dubbed the Liberal Imperialists that supported the government in the prosecution of the war. On the other side, more radical members of the party formed a Pro-Boer faction that denounced the conflict and called for an immediate end to hostilities. Quickly rising to prominence among the Pro-Boers was David Lloyd George, a relatively new MP and a master of rhetoric, who took advantage of having a national stage to speak out on a controversial issue to make his name in the party. Harcourt and Morley also sided with this group, though with slightly different aims. Campbell-Bannerman tried to keep these forces together at the head of a moderate Liberal rump, but in 1901 he delivered a speech on the government's "methods of barbarism" in South Africa that pulled him further to the left and nearly tore the party in two. The party was saved after Salisbury's retirement in 1902 when his successor, Arthur Balfour, pushed a series of unpopular initiatives such as the Education Act 1902, and Joseph Chamberlain called for a new system of protectionist tariffs.
Campbell-Bannerman was able to rally the party around the traditional liberal platform of free trade and land reform and led them to the greatest election victory in their history. This would prove the last time the Liberals won a majority in their own right. Although he presided over a large majority, Sir Henry Campbell-Bannerman was overshadowed by his ministers, most notably H. H. Asquith at the Exchequer, Edward Grey at the Foreign Office, Richard Burdon Haldane at the War Office and David Lloyd George at the Board of Trade. Campbell-Bannerman retired in 1908 and died soon after. He was succeeded by Asquith, who stepped up the government's radicalism. Lloyd George succeeded Asquith at the Exchequer, and was in turn succeeded at the Board of Trade by Winston Churchill, a recent defector from the Conservatives.
The 1906 general election also represented a shift to the left by the Liberal Party. According to Rosemary Rees, almost half of the Liberal MPs elected in 1906 were supportive of the "New Liberalism" (which advocated government action to improve people's lives), while claims were made that "five-sixths of the Liberal party are left wing." Other historians, however, have questioned the extent to which the Liberal Party experienced a leftward shift; according to Robert C. Self, only between 50 and 60 Liberal MPs out of the 400 in the parliamentary party after 1906 were Social Radicals, with a core of 20 to 30. Nevertheless, important junior offices were held in the cabinet by what Duncan Tanner has termed "genuine New Liberals, Centrist reformers, and Fabian collectivists," and much legislation was pushed through by the Liberals in government. This included the regulation of working hours, National Insurance and welfare.
A political battle erupted over the People's Budget and resulted in the passage of an act ending the power of the House of Lords to block legislation. The cost was high, however, as the government was required by the king to call two general elections in 1910 to validate its position and ended up frittering away most of its large majority, being left once again dependent on the Irish Nationalists.
As a result, Asquith was forced to introduce a new third Home Rule bill in 1912. Since the House of Lords no longer had the power to block the bill, the Unionists' Ulster Volunteers, led by Sir Edward Carson, launched a campaign of opposition that included the threat of armed resistance in Ulster and the threat of mass resignation of their commissions by army officers in Ireland in 1914 ("see Curragh Incident"). In their resistance to Home Rule the Ulster Protestants had the full support of the Conservatives, whose leader, Bonar Law, was of Ulster-Scots descent. The country seemed to be on the brink of civil war when the First World War broke out in August 1914. Historian George Dangerfield has argued that the multiplicity of crises in 1910 to 1914, before the war broke out, so weakened the Liberal coalition that it marked the "Strange Death of Liberal England". However, most historians date the collapse to the crisis of the First World War.
The Liberal Party might have survived a short war, but the totality of the Great War called for measures that the Party had long rejected. The result was the permanent destruction of the ability of the Liberal Party to lead a government. Historian Robert Blake explains the dilemma:
Blake further notes that it was the Liberals, not the Conservatives who needed the moral outrage of Belgium to justify going to war, while the Conservatives called for intervention from the start of the crisis on the grounds of "realpolitik" and the balance of power. However, Lloyd George and Churchill were zealous supporters of the war, and gradually forced the old peace-orientated Liberals out.
Asquith was blamed for the poor British performance in the first year. Since the Liberals ran the war without consulting the Conservatives, there were heavy partisan attacks. However, even Liberal commentators were dismayed by the lack of energy at the top. At the time, public opinion, both in the media and in the street, was intensely hostile to any young man in civilian garb, who risked being labelled a slacker. The leading Liberal newspaper, the "Manchester Guardian", complained:
Asquith's Liberal government was brought down in May 1915, due in particular to a crisis in inadequate artillery shell production and the protest resignation of Admiral Fisher over the disastrous Gallipoli Campaign against Turkey. Reluctant to face doom in an election, Asquith formed a new coalition government on 25 May, with the majority of the new cabinet coming from his own Liberal party and the Unionist (Conservative) party, along with a token Labour representation. The new government lasted a year and a half, and was the last time Liberals controlled the government. The analysis of historian A. J. P. Taylor is that the British people were deeply divided over numerous issues, but on all sides there was growing distrust of the Asquith government. There was no agreement whatsoever on wartime issues. The leaders of the two parties realized that embittered debates in Parliament would further undermine popular morale and so the House of Commons did not once discuss the war before May 1915. Taylor argues:
The 1915 coalition fell apart at the end of 1916, when the Conservatives withdrew their support from Asquith and gave it instead to Lloyd George, who became prime minister at the head of a new coalition largely made up of Conservatives. Asquith and his followers moved to the opposition benches in Parliament and the Liberal Party was deeply split once again.
Lloyd George remained a Liberal all his life, but he abandoned many standard Liberal principles in his crusade to win the war at all costs. He insisted on strong government controls over business as opposed to the "laissez-faire" attitudes of traditional Liberals. He insisted on conscription of young men into the Army, a position that deeply troubled his old colleagues. That brought him and a few like-minded Liberals into the new coalition on the ground long occupied by Conservatives. There was no more planning for world peace or liberal treatment of Germany, nor discomfort with aggressive and authoritarian measures of state power. More deadly to the future of the party, says historian Trevor Wilson, was its repudiation by ideological Liberals, who decided sadly that it no longer represented their principles. Finally, the presence of the vigorous new Labour Party on the left gave a new home to voters disenchanted with the Liberal performance.
In the 1918 general election, Lloyd George, hailed as "the Man Who Won the War", led his coalition into a khaki election. Lloyd George and the Conservative leader Bonar Law wrote a joint letter of support to candidates to indicate they were considered the official Coalition candidates – this "coupon", as it became known, was issued against many sitting Liberal MPs, often to devastating effect, though not against Asquith himself. The coalition won a massive victory as the Asquithian Liberals and Labour were decimated. Those remaining Liberal MPs who were opposed to the Coalition Government went into opposition under the parliamentary leadership of Sir Donald MacLean, who also became Leader of the Opposition. Asquith, who had appointed MacLean, remained as overall Leader of the Liberal Party even though he lost his seat in 1918. Asquith returned to Parliament in 1920 and resumed the leadership. Between 1919 and 1923, the anti-Lloyd George Liberals were known as Asquithian Liberals, Wee Free Liberals or Independent Liberals.
Lloyd George was increasingly under the influence of the rejuvenated Conservative party who numerically dominated the coalition. In 1922, the Conservative backbenchers rebelled against the continuation of the coalition, citing, in particular, Lloyd George's plan for war with Turkey in the Chanak Crisis, and his corrupt sale of honours. He resigned as prime minister and was succeeded by Bonar Law.
At the 1922 and 1923 elections the Liberals won barely a third of the vote and only a quarter of the seats in the House of Commons, as many radical voters abandoned the divided Liberals and went over to Labour. In 1922, Labour became the official opposition. A reunion of the two warring factions took place in 1923 when the new Conservative prime minister Stanley Baldwin committed his party to protective tariffs, causing the Liberals to reunite in support of free trade. The party gained ground in the 1923 general election but made most of its gains from Conservatives whilst losing ground to Labour – a sign of the party's direction for many years to come. The party remained the third largest in the House of Commons, but the Conservatives had lost their majority. There was much speculation and fear about the prospect of a Labour government and comparatively little about a Liberal government, even though the Liberals could have plausibly presented an experienced team of ministers compared to Labour's almost complete lack of experience, as well as offering a middle ground that could obtain support from both Conservatives and Labour in crucial Commons divisions. However, rather than trying to force the opportunity to form a Liberal government, Asquith decided to allow Labour the chance of office, in the belief that they would prove incompetent and this would set the stage for a revival of Liberal fortunes at Labour's expense. It was a fatal error.
Labour was determined to destroy the Liberals and become the sole party of the left. Ramsay MacDonald was forced into a snap election in 1924 and, although his government was defeated, he achieved his objective of virtually wiping the Liberals out, as many more radical voters moved to Labour whilst moderate middle-class Liberal voters concerned about socialism moved to the Conservatives. The Liberals were reduced to a mere forty seats in Parliament, only seven of which had been won against candidates from both parties, and none of these formed a coherent area of Liberal survival. The party seemed finished, and during this period some Liberals, such as Churchill, went over to the Conservatives while others went over to Labour. Several Labour ministers of later generations, such as Michael Foot and Tony Benn, were the sons of Liberal MPs.
Asquith died in 1928 and the enigmatic figure of Lloyd George returned to the leadership, beginning a drive to produce coherent policies on many key issues of the day. In the 1929 general election, he made a final bid to return the Liberals to the political mainstream, with an ambitious programme of state stimulation of the economy called "We Can Conquer Unemployment!", largely written for him by the Liberal economist John Maynard Keynes. The Liberal Party stood in Northern Ireland for the first and only time in the 1929 general election, gaining 17% of the vote there but winning no seats. The Liberals gained ground, but once again it was at the Conservatives' expense whilst they also lost seats to Labour. Indeed, the urban areas of the country suffering heavily from unemployment, which might have been expected to respond the most to the radical economic policies of the Liberals, instead gave the party its worst results. By contrast, most of the party's seats were won either due to the absence of a candidate from one of the other parties or in rural areas on the Celtic fringe, where local evidence suggests that economic ideas were at best peripheral to the electorate's concerns. The Liberals now found themselves with 59 members, holding the balance of power in a Parliament where Labour was the largest party but lacked an overall majority. Lloyd George offered a degree of support to the Labour government in the hope of winning concessions, including a degree of electoral reform to introduce the alternative vote, but this support was to prove bitterly divisive as the Liberals increasingly divided between those seeking to gain what Liberal goals they could, those who preferred a Conservative government to a Labour one, and vice versa.
The last majority Liberal Government in Britain was elected in 1906. The years preceding the First World War were marked by worker strikes and civil unrest and saw many violent confrontations between civilians and the police and armed forces. Other issues of the period included women's suffrage and the Irish Home Rule movement. After the carnage of 1914–1918, the democratic reforms of the Representation of the People Act 1918 instantly tripled the number of people entitled to vote in Britain from seven to twenty-one million. The Labour Party benefited most from this huge change in the electorate, forming its first minority government in 1924.
In 1931 MacDonald's government fell apart in response to the Great Depression, and the Liberals agreed to join his National Government, dominated by the Conservatives. Lloyd George himself was ill and did not actually join. Soon, however, the Liberals faced another divisive crisis when a National Government was proposed to fight the 1931 general election with a mandate for tariffs. From the outside, Lloyd George called for the party to abandon the government completely in defence of free trade, but only a few MPs and candidates followed. Another group under Sir John Simon then emerged, who were prepared to continue their support for the government and take the Liberal places in the Cabinet if there were resignations. The third group under Sir Herbert Samuel pressed for the parties in government to fight the election on separate platforms. In doing so the bulk of Liberals remained supporting the government, but two distinct Liberal groups had emerged within this bulk – the Liberal Nationals (officially the "National Liberals" after 1947) led by Simon, also known as "Simonites", and the "Samuelites" or "official Liberals", led by Samuel who remained as the official party. Both groups secured about 34 MPs but proceeded to diverge even further after the election, with the Liberal Nationals remaining supporters of the government throughout its life. There were to be a succession of discussions about them rejoining the Liberals, but these usually foundered on the issues of free trade and continued support for the National Government. The one significant reunification came in 1946 when the Liberal and Liberal National party organisations in London merged.
The official Liberals found themselves a tiny minority within a government committed to protectionism. Slowly they found this issue to be one they could not support. In early 1932 it was agreed to suspend the principle of collective responsibility to allow the Liberals to oppose the introduction of tariffs. Later in 1932 the Liberals resigned their ministerial posts over the introduction of the Ottawa Agreement on Imperial Preference. However, they remained sitting on the government benches supporting it in Parliament, though in the country local Liberal activists bitterly opposed the government. Finally in late 1933 the Liberals crossed the floor of the House of Commons and went into complete opposition. By this point their number of MPs was severely depleted. In the 1935 general election, just 17 Liberal MPs were elected, along with Lloyd George and three followers as independent Liberals. Immediately after the election the two groups reunited, though Lloyd George declined to play much of a formal role in his old party. Over the next ten years there would be further defections as MPs deserted to either the Liberal Nationals or Labour. Yet there were a few recruits, such as Clement Davies, who had deserted to the National Liberals in 1931 but now returned to the party during World War II and who would lead it after the war.
Samuel had lost his seat in the 1935 election and the leadership of the party fell to Sir Archibald Sinclair. With many traditional domestic Liberal policies now regarded as irrelevant, he focused the party on opposition to both the rise of Fascism in Europe and the appeasement foreign policy of the British government, arguing that intervention was needed, in contrast to the Labour calls for pacifism. Despite the party's weaknesses, Sinclair gained a high profile as he sought to recall the Midlothian Campaign and once more revitalise the Liberals as the party of a strong foreign policy.
In 1940, the Liberals joined Churchill's wartime coalition government, with Sinclair serving as Secretary of State for Air – the last British Liberal to hold Cabinet-rank office for the next seventy years. However, it was a sign of the party's lack of importance that they were not included in the War Cabinet; some leading party members founded Radical Action, a group which called for Liberal candidates to break the war-time electoral pact. At the 1945 general election, Sinclair and many of his colleagues lost their seats to both Conservatives and Labour, and the party returned just 12 MPs to Westminster, but this was just the beginning of the decline. In 1950, the general election saw the Liberals return just nine MPs. Another general election was called in 1951 and the Liberals were left with just six MPs; all but one of them were aided by the fact that the Conservatives refrained from fielding candidates in those constituencies.
In 1957, this total fell to five when one of the Liberal MPs died and the subsequent by-election was lost to the Labour Party, which selected the former Liberal Deputy Leader Megan Lloyd George as its own candidate. The Liberal Party seemed close to extinction. During this low period, it was often joked that Liberal MPs could hold meetings in the back of one taxi.
Through the 1950s and into the 1960s the Liberals survived only because a handful of constituencies in rural Scotland and Wales clung to their Liberal traditions, whilst in two English towns, Bolton and Huddersfield, local Liberals and Conservatives agreed to each contest only one of the town's two seats. Jo Grimond, for example, who became Liberal leader in 1956, was MP for the remote Orkney and Shetland islands. Under his leadership a Liberal revival began, marked by the Orpington by-election of March 1962, which was won by Eric Lubbock. There, the Liberals won a seat in the London suburbs for the first time since 1935.
The Liberals became the first of the major British political parties to advocate British membership of the European Economic Community. Grimond also sought an intellectual revival of the party, seeking to position it as a non-socialist radical alternative to the Conservative government of the day. In particular he canvassed the support of the young post-war university students and recent graduates, appealing to younger voters in a way that many of his recent predecessors had not, and asserting a new strand of Liberalism for the post-war world.
The new middle-class suburban generation began to find the Liberals' policies attractive again. Under Grimond (who retired in 1967) and his successor, Jeremy Thorpe, the Liberals regained the status of a serious third force in British politics, polling up to 20% of the vote but unable to break the duopoly of Labour and Conservative and win more than fourteen seats in the Commons. An additional problem was competition in the Liberal heartlands of Scotland and Wales from the Scottish National Party and Plaid Cymru, who both grew as electoral forces from the 1960s onwards. Although Emlyn Hooson held on to the seat of Montgomeryshire upon Clement Davies's death in 1962, the party lost five Welsh seats between 1950 and 1966. In September 1966, the Welsh Liberal Party formed its own state party, moving the Liberal Party into a fully federal structure.
In local elections, Liverpool remained a Liberal stronghold, with the party taking a plurality of seats in the elections to the new Liverpool Metropolitan Borough Council in 1973. In the February 1974 general election, the Conservative government of Edward Heath won a plurality of votes cast, but the Labour Party gained a plurality of seats. The Conservatives were unable to form a government, since the Ulster Unionist MPs refused to support the Conservatives after the Northern Ireland Sunningdale Agreement. The Liberals now held the balance of power in the Commons. The Conservatives offered Thorpe the Home Office if he would join a coalition government with Heath. Thorpe was personally in favour of it, but the party insisted on a clear government commitment to introducing proportional representation and a change of prime minister. The former was unacceptable to Heath's cabinet and the latter to Heath personally, so the talks collapsed. Instead, a minority Labour government was formed under Harold Wilson but with no formal support from Thorpe. In the October 1974 general election, the Liberals slipped back slightly and the Labour government won a wafer-thin majority.
Thorpe was subsequently forced to resign after allegations that he attempted to have his homosexual lover murdered by a hitman. The party's new leader, David Steel, negotiated the Lib-Lab pact with Wilson's successor as prime minister, James Callaghan. According to this pact, the Liberals would support the government in crucial votes in exchange for some influence over policy. The agreement lasted from 1977 to 1978, but proved mostly fruitless, for two reasons: the Liberals' key demand of proportional representation was rejected by most Labour MPs, whilst the contacts between Liberal spokespersons and Labour ministers often proved detrimental, such as between finance spokesperson John Pardoe and Chancellor of the Exchequer Denis Healey, who were mutually antagonistic.
The Conservative Party under the leadership of Margaret Thatcher won the 1979 general election, placing the Labour Party back in opposition, which served to push the Liberals back into the margins.
In 1981, defectors from a moderate faction of the Labour Party, led by former Cabinet ministers Roy Jenkins, David Owen and Shirley Williams, founded the Social Democratic Party (SDP). The new party and the Liberals quickly formed the SDP–Liberal Alliance, which for a while polled as high as 50% in the opinion polls and appeared capable of winning the next general election. Indeed, Steel was so confident of an Alliance victory that he told the 1981 Liberal conference, "Go back to your constituencies, and prepare for government!".
However, the Alliance was overtaken in the polls by the Tories in the aftermath of the Falklands War, and at the 1983 general election the Conservatives were re-elected by a landslide, with Labour once again forming the opposition. While the SDP–Liberal Alliance came close to Labour in terms of votes (a share of more than 25%), it won only 23 MPs compared to Labour's 209. The Alliance's support was spread out across the country and was not concentrated in enough areas to translate into seats.
In the 1987 general election, the Alliance's share of the votes fell slightly and it now had 22 MPs. In the election's aftermath Steel proposed a merger of the two parties. Most SDP members voted in favour of the merger, but SDP leader David Owen objected and continued to lead a "rump" SDP.
In March 1988, the Liberal Party and Social Democratic Party merged to create the Social and Liberal Democrats, renamed the Liberal Democrats in October 1989. Over two-thirds of Liberal members joined the merged party, along with all sitting MPs. Steel and SDP leader Robert Maclennan served briefly as interim leaders of the merged party.
A group of Liberal opponents of the merger with the Social Democrats, including Michael Meadowcroft (the former Liberal MP for Leeds West) and Paul Wiggin (who served on Peterborough City Council as a Liberal), continued with a new party organisation under the name of the 'Liberal Party'. Meadowcroft joined the Liberal Democrats in 2007, but the Liberal Party as reconstituted in 1989 continues to hold council seats and field candidates in Westminster Parliamentary elections.
During the 19th century, the Liberal Party was broadly in favour of what would today be called classical liberalism, supporting "laissez-faire" economic policies such as free trade and minimal government interference in the economy (this doctrine was usually termed Gladstonian liberalism after the Victorian era Liberal prime minister William Ewart Gladstone). The Liberal Party favoured social reform, personal liberty, reducing the powers of the Crown and the Church of England (many of them were nonconformists) and an extension of the electoral franchise. Sir William Harcourt, a prominent Liberal politician in the Victorian era, said this about liberalism in 1872: "If there be any party which is more pledged than another to resist a policy of restrictive legislation, having for its object social coercion, that party is the Liberal party. (Cheers.) But liberty does not consist in making others do what you think right. (Hear, hear.) The difference between a free Government and a Government which is not free is principally this – that a Government which is not free interferes with everything it can, and a free Government interferes with nothing except what it must. A despotic Government tries to make everybody do what it wishes; a Liberal Government tries, as far as the safety of society will permit, to allow everybody to do as he wishes. It has been the tradition of the Liberal party consistently to maintain the doctrine of individual liberty. It is because they have done so that England is the place where people can do more what they please than in any other country in the world. [...] It is this practice of allowing one set of people to dictate to another set of people what they shall do, what they shall think, what they shall drink, when they shall go to bed, what they shall buy, and where they shall buy it, what wages they shall get and how they shall spend them, against which the Liberal party have always protested."
The political terms "modern", "progressive" or "new" Liberalism began to appear in the mid-to-late 1880s and became increasingly common to denote the tendency in the Liberal Party to favour an increased role for the state, rather than the classical liberal stress on self-help and freedom of choice.
By the early 20th century, the Liberals' stance began to shift towards "New Liberalism", what would today be called social liberalism: a belief in personal liberty combined with support for government intervention to provide minimum levels of welfare. This shift was best exemplified by the Liberal government of H. H. Asquith and his Chancellor David Lloyd George, whose Liberal reforms in the early 1900s created a basic welfare state.
David Lloyd George adopted a programme at the 1929 general election entitled "We Can Conquer Unemployment!", although by this stage the Liberals had declined to third-party status. The Liberals, as expressed in the "Liberal Yellow Book", now regarded opposition to state intervention as a characteristic of right-wing extremists.
After nearly becoming extinct in the 1940s and the 1950s, the Liberal Party revived its fortunes somewhat under the leadership of Jo Grimond in the 1960s by positioning itself as a radical centrist, non-socialist alternative to the Conservative and Labour Party governments of the time.
Since 1660, nonconformist Protestants have played a major role in English politics. Relatively few MPs were Dissenters. However the Dissenters were a major voting bloc in many areas, such as the East Midlands. They were very well organised and highly motivated and largely won over the Whigs and Liberals to their cause. Down to the 1830s, Dissenters demanded removal of political and civil disabilities that applied to them (especially those in the Test and Corporation Acts). The Anglican establishment strongly resisted until 1828. Numerous reforms of voting rights, especially that of 1832, increased the political power of Dissenters. They demanded an end to compulsory church rates, in which local taxes went only to Anglican churches. They finally achieved the end of religious tests for university degrees in 1905. Gladstone brought the majority of Dissenters around to support for Home Rule for Ireland, putting the dissenting Protestants in league with the Irish Roman Catholics in an otherwise unlikely alliance. The Dissenters gave significant support to moralistic issues, such as temperance and sabbath enforcement. The nonconformist conscience, as it was called, was repeatedly called upon by Gladstone for support for his moralistic foreign policy. In election after election, Protestant ministers rallied their congregations to the Liberal ticket. In Scotland, the Presbyterians played a similar role to the Nonconformist Methodists, Baptists and other groups in England and Wales.
By the 1820s, the different Nonconformists, including Wesleyan Methodists, Baptists, Congregationalists and Unitarians, had formed the Committee of Dissenting Deputies and agitated for repeal of the highly restrictive Test and Corporation Acts. These Acts excluded Nonconformists from holding civil or military office or attending Oxford or Cambridge, compelling them to set up their own Dissenting Academies privately. The Tories tended to be in favour of these Acts and so the Nonconformist cause was linked closely to the Whigs, who advocated civil and religious liberty. After the Test and Corporation Acts were repealed in 1828, all the Nonconformists elected to Parliament were Liberals. Nonconformists were angered by the Education Act 1902, which integrated Church of England denominational schools into the state system and provided for their support from taxes. John Clifford formed the National Passive Resistance Committee and by 1906 over 170 Nonconformists had gone to prison for refusing to pay school taxes. They included 60 Primitive Methodists, 48 Baptists, 40 Congregationalists and 15 Wesleyan Methodists.
The political strength of Dissent faded sharply after 1920 with the secularisation of British society in the 20th century. The rise of the Labour Party reduced the Liberal Party strongholds into the nonconformist and remote "Celtic Fringe", where the party survived by an emphasis on localism and historic religious identity, thereby neutralising much of the class pressure on behalf of the Labour movement. Meanwhile, the Anglican church was a bastion of strength for the Conservative party. On the Irish issue, the Anglicans strongly supported unionism. Increasingly after 1850, the Roman Catholic element in England and Scotland was composed of recent immigrants from Ireland. They voted largely for the Irish Parliamentary Party until its collapse in 1918.
Bank of England
The Bank of England is the central bank of the United Kingdom and the model on which most modern central banks have been based. Established in 1694 to act as the English Government's banker, and still one of the bankers for the Government of the United Kingdom, it is the world's eighth-oldest bank. It was privately owned by stockholders from its foundation in 1694 until it was nationalised in 1946.
The Bank became an independent public organisation in 1998, wholly owned by the Treasury Solicitor on behalf of the government, but with independence in setting monetary policy.
The Bank is one of eight banks authorised to issue banknotes in the United Kingdom, has a monopoly on the issue of banknotes in England and Wales and regulates the issue of banknotes by commercial banks in Scotland and Northern Ireland.
The Bank's Monetary Policy Committee has a devolved responsibility for managing monetary policy. The Treasury has reserve powers to give orders to the committee "if they are required in the public interest and by extreme economic circumstances", but such orders must be endorsed by Parliament within 28 days. The Bank's Financial Policy Committee held its first meeting in June 2011 as a macroprudential regulator to oversee regulation of the UK's financial sector.
The Bank's headquarters have been in London's main financial district, the City of London, on Threadneedle Street, since 1734. It is sometimes known as The Old Lady of Threadneedle Street, a name taken from a satirical cartoon by James Gillray in 1797. The road junction outside is known as Bank junction.
As a regulator and central bank, the Bank of England has not offered consumer banking services for many years, but it still manages some public-facing services, such as exchanging superseded bank notes. Until 2016, the bank provided personal banking services as a privilege for employees.
England's crushing defeat by France, the dominant naval power, in naval engagements culminating in the 1690 Battle of Beachy Head, became the catalyst for England to rebuild itself as a global power. William III's government wanted to build a naval fleet that would rival that of France; however, the ability to construct this fleet was hampered both by a lack of available public funds and the low credit of the English government in London. This lack of credit made it impossible for the English government to borrow the £1,200,000 (at 8% per annum) that it wanted for the construction of the fleet.
To induce subscription to the loan, the subscribers were to be incorporated by the name of the Governor and Company of the Bank of England. The Bank was given exclusive possession of the government's balances, and was the only limited-liability corporation allowed to issue bank notes. The lenders would give the government cash (bullion) and issue notes against the government bonds, which could be lent again. The £1.2 million was raised in 12 days; half of this was used to rebuild the navy.
As a side effect, the huge industrial effort needed, including establishing ironworks to make more nails and advances in agriculture feeding the quadrupled strength of the navy, started to transform the economy. This helped the new Kingdom of Great Britain – England and Scotland were formally united in 1707 – to become powerful. The power of the navy made Britain the dominant world power in the late 18th and early 19th centuries.
The establishment of the bank was devised by Charles Montagu, 1st Earl of Halifax, in 1694. The plan of 1691, which had been proposed by William Paterson three years before, had not then been acted upon. Fifty-eight years earlier, in 1636, the financier to the king, Philip Burlamachi, had proposed exactly the same idea in a letter addressed to Sir Francis Windebank. Under the plan, subscribers would lend £1.2 million to the government; in return they would be incorporated as The Governor and Company of the Bank of England with long-term banking privileges including the issue of notes. The royal charter was granted on 27 July through the passage of the Tonnage Act 1694. Public finances were in such dire condition at the time that the terms of the loan were that it was to be serviced at a rate of 8% per annum, and there was also a service charge of £4,000 per annum for the management of the loan. The first governor was Sir John Houblon, who is depicted on the £50 note issued in 1994. The charter was renewed in 1742, 1764, and 1781.
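Taken together, the quoted terms imply a fixed annual cost to the government; the following minimal sketch (assuming simple annual interest on the full principal, with illustrative variable names) checks the arithmetic:

```python
# Annual cost of servicing the 1694 loan under the charter terms:
# 8% per annum on the principal plus the fixed management charge.
principal = 1_200_000   # pounds raised from subscribers
interest_rate = 0.08    # 8% per annum
management_fee = 4_000  # pounds per annum service charge

annual_cost = principal * interest_rate + management_fee
print(f"Annual cost to the government: £{annual_cost:,.0f}")
# Annual cost to the government: £100,000
```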
The Bank's original home was in Walbrook, a street in the City of London, where during reconstruction in 1954 archaeologists found the remains of a Roman temple of Mithras (Mithras is – rather fittingly – said to have been worshipped as, amongst other things, the God of Contracts); the Mithraeum ruins are perhaps the most famous of all 20th-century Roman discoveries in the City of London and can be viewed by the public.
The Bank moved to its current location in Threadneedle Street in 1734, and thereafter slowly acquired neighbouring land to create the site necessary for erecting the Bank's original home at this location, under the direction of its chief architect Sir John Soane, between 1790 and 1827. (Sir Herbert Baker's rebuilding of the Bank in the first half of the 20th century, demolishing most of Soane's masterpiece, was described by architectural historian Nikolaus Pevsner as "the greatest architectural crime, in the City of London, of the twentieth century".)
When the idea and reality of the national debt came about during the 18th century, this was also managed by the Bank. During the American War of Independence, business for the Bank was so good that George Washington remained a shareholder throughout the period. By the charter renewal in 1781 it was also the bankers' bank – keeping enough gold to pay its notes on demand until 26 February 1797, when war had so diminished gold reserves that – following an invasion scare caused by the Battle of Fishguard days earlier – the government prohibited the Bank from paying out in gold by passing the Bank Restriction Act 1797. This prohibition lasted until 1821.
The 1844 Bank Charter Act tied the issue of notes to the gold reserves and gave the Bank sole rights with regard to the issue of banknotes. Private banks that had previously had that right retained it, provided that their headquarters were outside London and that they deposited security against the notes that they issued. A few English banks continued to issue their own notes until the last of them was taken over in the 1930s. Scottish and Northern Irish private banks still have that right.
The bank acted as lender of last resort for the first time in the Panic of 1866.
The last private bank in England to issue its own notes was Thomas Fox's Fox, Fowler and Company bank in Wellington, which rapidly expanded until it merged with Lloyds Bank in 1927. Its notes were legal tender until 1964. There are nine notes left in circulation; one is housed at Tone Dale House, Wellington.
Britain was on the gold standard until 1931, when the Bank of England unilaterally and abruptly took Britain off the gold standard.
During the governorship of Montagu Norman, from 1920 to 1944, the Bank made deliberate efforts to move away from commercial banking and become a central bank. In 1946, shortly after the end of Norman's tenure, the bank was nationalised by the Labour government.
The Bank pursued the multiple goals of Keynesian economics after 1945, especially "easy money" and low interest rates to support aggregate demand. It tried to keep a fixed exchange rate, and attempted to deal with inflation and sterling weakness by credit and exchange controls.
In 1977, the Bank set up a wholly owned subsidiary called Bank of England Nominees Limited (BOEN), a private limited company, with two of its hundred £1 shares issued. According to its Memorandum & Articles of Association, its objectives are: "To act as Nominee or agent or attorney either solely or jointly with others, for any person or persons, partnership, company, corporation, government, state, organisation, sovereign, province, authority, or public body, or any group or association of them..." Bank of England Nominees Limited was granted an exemption by Edmund Dell, Secretary of State for Trade, from the disclosure requirements under Section 27(9) of the Companies Act 1976, because "it was considered undesirable that the disclosure requirements should apply to certain categories of shareholders." The Bank of England is also protected by its royal charter status and the Official Secrets Act. BOEN is a vehicle for governments and heads of state to invest in UK companies (subject to approval from the Secretary of State), providing they undertake "not to influence the affairs of the company". BOEN is no longer exempt from company law disclosure requirements. Although BOEN is a dormant company, dormancy does not preclude a company from actively operating as a nominee shareholder. BOEN has two shareholders: the Bank of England, and the Secretary of the Bank of England.
The reserve requirement for banks to hold a minimum fixed proportion of their deposits as reserves at the Bank of England was abolished in 1981: see reserve requirement for more details. The contemporary transition from Keynesian economics to Chicago economics was analysed by Nicholas Kaldor in "The Scourge of Monetarism".
On 6 May 1997, following the 1997 general election that brought a Labour government to power for the first time since 1979, it was announced by the Chancellor of the Exchequer, Gordon Brown, that the Bank would be granted operational independence over monetary policy. Under the terms of the Bank of England Act 1998 (which came into force on 1 June 1998), the Bank's Monetary Policy Committee was given sole responsibility for setting interest rates to meet the Government's Retail Prices Index (RPI) inflation target of 2.5%. The target has changed to 2% since the Consumer Price Index (CPI) replaced the Retail Prices Index as the Treasury's inflation index. If inflation overshoots or undershoots the target by more than 1%, the Governor has to write a letter to the Chancellor of the Exchequer explaining why, and how he will remedy the situation.
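The open-letter requirement amounts to a simple band check around the target; a minimal sketch of that rule, with a hypothetical function name and illustrative inflation figures:

```python
def governor_letter_required(inflation: float,
                             target: float = 2.0,
                             band: float = 1.0) -> bool:
    """Return True if inflation misses the target by more than the
    permitted band, triggering an open letter to the Chancellor."""
    return abs(inflation - target) > band

# Illustrative values only:
print(governor_letter_required(3.2))  # True: overshoot of 1.2 points
print(governor_letter_required(1.4))  # False: within the 1-point band
```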
The success of inflation targeting in the United Kingdom has been attributed to the Bank's focus on transparency. The Bank of England has been a leader in producing innovative ways of communicating information to the public, especially through its Inflation Report, an approach that has been emulated by many other central banks.
Independent central banks that adopt an inflation target are known as Friedmanite central banks. Inflation targets combined with central bank independence have been characterised as a "starve the beast" strategy creating a lack of money in the public sector. This change in Labour's politics was described by Skidelsky in "The Return of the Master" as a mistake and as an adoption of the Rational Expectations Hypothesis as promulgated by Walters.
The handing over of monetary policy to the Bank had been a key plank of the Liberal Democrats' economic policy since the 1992 general election. Conservative MP Nicholas Budgen had also proposed this as a private member's bill in 1996, but the bill failed as it had the support of neither the government nor the opposition.
Mark Carney assumed the post of Governor of the Bank of England on 1 July 2013. He succeeded Mervyn King, who took over on 30 June 2003. Carney, a Canadian, will serve an initial five-year term rather than the typical eight. He became the first Governor not to be a UK citizen, but has since been granted citizenship. At Government request, his term was extended to 2019, then again to 2020. As of January 2014, the Bank also has four Deputy Governors.
BOEN was dissolved, following liquidation, in July 2017.
There are two main areas which are tackled by the Bank to ensure it carries out these functions efficiently: monetary stability and financial stability.
Note that "monetary" and "financial" stability are related but distinct objectives.
Stable prices and confidence in the currency are the two main criteria for monetary stability. Stable prices are maintained by seeking to ensure that price increases meet the Government's inflation target. The Bank aims to meet this target by adjusting the base interest rate, which is decided by the Monetary Policy Committee, and through its communications strategy, such as publishing yield curves.
The Bank works together with other institutions to secure both monetary and financial stability, including HM Treasury and the Financial Services Authority (FSA).
The 1997 memorandum of understanding describes the terms under which the Bank, the Treasury and the FSA work toward the common aim of increased financial stability. In 2010, the incoming Chancellor announced his intention to merge the FSA back into the Bank. As of 2012, the director for financial stability is Andy Haldane.
The Bank acts as the government's banker, and it maintains the government's Consolidated Fund account. It also manages the country's foreign exchange and gold reserves. The Bank also acts as the bankers' bank, especially in its capacity as a lender of last resort.
The Bank has a monopoly on the issue of banknotes in England and Wales. Scottish and Northern Irish banks retain the right to issue their own banknotes, but they must be backed one for one with deposits at the Bank, excepting a few million pounds representing the value of notes they had in circulation in 1845. The Bank decided to sell its banknote printing operations to De La Rue in December 2002, under the advice of Close Brothers Corporate Finance Ltd.
Since 1998, the Monetary Policy Committee (MPC) has had the responsibility for setting the official interest rate. However, with the decision to grant the Bank operational independence, responsibility for government debt management was transferred in 1998 to the new Debt Management Office, which also took over government cash management in 2000. Computershare took over as the registrar for UK Government bonds (gilt-edged securities or "gilts") from the Bank at the end of 2004.
The Bank used to be responsible for the regulation and supervision of the banking and insurance industries. This responsibility was transferred to the Financial Services Authority in June 1998, but after the financial crisis of 2008, new banking legislation transferred the responsibility for regulation and supervision of the banking and insurance industries back to the Bank.
In 2011, the interim Financial Policy Committee (FPC) was created as a mirror committee to the MPC to spearhead the Bank's new mandate on financial stability. The FPC is responsible for the macroprudential regulation of all UK banks and insurance companies.
To help maintain economic stability, the Bank attempts to broaden understanding of its role, both through regular speeches and publications by senior Bank figures, a semiannual Financial Stability Report, and through a wider education strategy aimed at the general public. It maintains a free museum and ran the Target Two Point Zero competition for A-level students, which closed in 2017.
The Bank has operated, since January 2009, an Asset Purchase Facility (APF) to buy "high-quality assets financed by the issue of Treasury bills and the DMO's cash management operations" and thereby improve liquidity in the credit markets. It has, since March 2009, also provided the mechanism by which the Bank's policy of quantitative easing (QE) is achieved, under the auspices of the MPC. Along with managing the £200 billion of QE funds, the APF continues to operate its corporate facilities. Both are undertaken by a subsidiary company of the Bank of England, the Bank of England Asset Purchase Facility Fund Limited (BEAPFF).
The Bank has issued banknotes since 1694. Notes were originally hand-written; although they were partially printed from 1725 onwards, cashiers still had to sign each note and make them payable to someone. Notes were fully printed from 1855. Until 1928 all notes were "White Notes", printed in black and with a blank reverse. In the 18th and 19th centuries White Notes were issued in £1 and £2 denominations. During the 20th century White Notes were issued in denominations between £5 and £1000.
Until the mid-19th century, commercial banks were allowed to issue their own banknotes, and notes issued by provincial banking companies were commonly in circulation. The Bank Charter Act 1844 began the process of restricting note issue to the Bank; new banks were prohibited from issuing their own banknotes and existing note-issuing banks were not permitted to expand their issue. As provincial banking companies merged to form larger banks, they lost their right to issue notes, and the English private banknote eventually disappeared, leaving the Bank with a monopoly of note issue in England and Wales. The last private bank to issue its own banknotes in England and Wales was Fox, Fowler and Company in 1921. However, the limitations of the 1844 Act only affected banks in England and Wales, and today three commercial banks in Scotland and four in Northern Ireland continue to issue their own banknotes, regulated by the Bank.
At the start of the First World War, the Currency and Bank Notes Act 1914 was passed, which granted temporary powers to HM Treasury for issuing banknotes to the values of £1 and 10/- (ten shillings). Treasury notes had full legal tender status and were not convertible into gold through the Bank; they replaced the gold coin in circulation to prevent a run on sterling and to enable raw material purchases for armament production. These notes featured an image of King George V (Bank of England notes did not begin to display an image of the monarch until 1960).
Treasury notes were issued until 1928, when the Currency and Bank Notes Act 1928 returned note-issuing powers to the banks. The Bank of England issued notes for ten shillings and one pound for the first time on 22 November 1928.
During the Second World War the German Operation Bernhard attempted to counterfeit denominations between £5 and £50, producing 500,000 notes each month in 1943. The original plan was to parachute the money into the UK in an attempt to destabilise the British economy, but it was found more useful to use the notes to pay German agents operating throughout Europe. Although most fell into Allied hands at the end of the war, forgeries frequently appeared for years afterwards, which led banknote denominations above £5 to be removed from circulation.
In 2006, over £53 million in banknotes belonging to the Bank was stolen from a depot in Tonbridge, Kent.
Modern banknotes are printed by contract with De La Rue Currency in Loughton, Essex.
The bank is custodian to the official gold reserves of the United Kingdom and around 30 other countries. The bank holds around 400,000 bars of gold, which were estimated in August 2018 to have a market value of approximately £200 billion. These estimates suggest the vault could hold as much as 3% of the gold mined throughout human history.
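These figures allow a rough cross-check. The sketch below assumes standard 400-troy-ounce "Good Delivery" bars of roughly 12.4 kg each, an assumption not stated in the article:

```python
# Rough cross-check of the vault figures quoted above.
bars = 400_000            # approximate number of bars held
kg_per_bar = 12.4         # assumed: standard Good Delivery bar weight
total_value_gbp = 200e9   # August 2018 estimate, £200 billion

tonnes = bars * kg_per_bar / 1000
value_per_bar = total_value_gbp / bars
print(f"Implied holdings: about {tonnes:,.0f} tonnes of gold")
print(f"Implied value per bar: about £{value_per_bar:,.0f}")
# Implied holdings: about 4,960 tonnes of gold
# Implied value per bar: about £500,000
```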
Following is a list of the Governors of the Bank of England since the beginning of the 20th century:
The Court of Directors is a unitary board that is responsible for setting the organisation's strategy and budget and taking key decisions on resourcing and appointments. It consists of five executive members from the Bank plus up to nine non-executive members, all of whom are appointed by the Crown. The Chancellor selects the Chairman of the Court from among the non-executive members. The Court is required to meet at least seven times a year.
The Governor serves for a period of eight years, the Deputy Governors for five years, and the non-executive members for up to four years.
Since 2013, the Bank has had a chief operating officer (COO), a post held by Joanna Place.
The Bank's chief economist is Andrew Haldane.
Chuck D
Carlton Douglas Ridenhour (born August 1, 1960), known professionally as Chuck D, is an American rapper, author, and producer. As the leader of the rap group Public Enemy, which he co-founded in 1985 with Flavor Flav, Chuck D helped create politically and socially conscious hip hop music in the mid-1980s. "The Source" ranked him at No. 12 on their list of the Top 50 Hip-Hop Lyricists of All Time.
Ridenhour was born in Queens, New York. He began writing rhymes after the New York City blackout of 1977. After attending W. Tresper Clarke High School, he went to Adelphi University on Long Island to study graphic design, where he met William Drayton (Flavor Flav). He received a B.F.A. from Adelphi in 1984 and later received an honorary doctorate from Adelphi in 2013.
While at Adelphi, Ridenhour co-hosted hip hop radio show the "Super Spectrum Mix Hour" as Chuck D on Saturday nights at Long Island rock radio station WLIR, designed flyers for local hip-hop events, and drew a cartoon called "Tales of the Skind" for Adelphi student newspaper "The Delphian".
Ridenhour (using the nickname Chuck D) formed Public Enemy in 1985 with Flavor Flav. Upon hearing Ridenhour's demo track "Public Enemy Number One", fledgling producer and up-and-coming music mogul Rick Rubin insisted on signing him to his Def Jam label. Their major label releases were "Yo! Bum Rush the Show" (1987), "It Takes a Nation of Millions to Hold Us Back" (1988), "Fear of a Black Planet" (1990), "Apocalypse 91... The Enemy Strikes Black" (1991), the compilation album "Greatest Misses" (1992), and "Muse Sick-n-Hour Mess Age" (1994). They also released a full-length album soundtrack for the film "He Got Game" in 1998. Since Flavor Flav's firing from the group in March 2020, Chuck D has been the only original member left in Public Enemy.
Ridenhour also contributed (as Chuck D) to several episodes of the PBS documentary series "The Blues". He has appeared as a featured artist on many other songs and albums, having collaborated with artists such as Janet Jackson, Kool Moe Dee, The Dope Poet Society, Run–D.M.C., Ice Cube, Boom Boom Satellites, Rage Against the Machine, Anthrax, John Mellencamp and many others. In 1990, he appeared on "Kool Thing", a song by the alternative rock band Sonic Youth, and along with Flavor Flav, he sang on George Clinton's song "Tweakin'", which appears on his 1989 album "The Cinderella Theory". In 1993, he executive produced "Got 'Em Running Scared", an album by Ichiban Records group Chief Groovy Loo and the Chosen Tribe.
In 1996, Ridenhour released "Autobiography of Mistachuck" on Mercury Records. Chuck D made a rare appearance at the 1998 MTV Video Music Awards, presenting the Video Vanguard Award to the Beastie Boys, whilst commending their musicianship. In November 1998, he settled out of court with Christopher "The Notorious B.I.G." Wallace's estate over the latter's sampling of his voice in the song "Ten Crack Commandments". The specific sampling is Ridenhour counting off the numbers one to nine on the track "Shut 'Em Down". He later described the decision to sue as "stupid".
In September 1999, he launched Rapstation.com, a multi-format "supersite". The site includes a TV and radio station with original programming, prominent hip hop DJs, celebrity interviews, free MP3 downloads (the first was contributed by multi-platinum rapper Coolio), downloadable ringtones by ToneThis, social commentary, current events, and regular features on turning rap careers into a viable living. Since 2000, he has been one of the most vocal supporters of peer-to-peer file sharing in the music industry.
He loaned his voice to "Grand Theft Auto: San Andreas" as DJ Forth Right MC for the radio station Playback FM. In 2000, he collaborated with Public Enemy's Gary G-Whiz and MC Lyte on the theme music to the television show "Dark Angel". He appeared with Henry Rollins in a cover of Black Flag's "Rise Above" for the album "Rise Above: 24 Black Flag Songs to Benefit the West Memphis Three". In 2003, he was featured in "Godfathers and Sons", an episode of the PBS documentary series "The Blues", in which he recorded a version of Muddy Waters' song "Mannish Boy" with Common, Electrik Mud Cats, and Kyle Jason. He was also featured on Z-Trip's album "Shifting Gears" on a track called "Shock and Awe"; a 12-inch of the track was released featuring artwork by Shepard Fairey. In 2008 he contributed a chapter to "Sound Unbound: Sampling Digital Music and Culture" (The MIT Press, 2008) edited by Paul D. Miller a.k.a. DJ Spooky, and also turned up on The Go! Team's album "Proof of Youth" on the track "Flashlight Fight." He also fulfilled his childhood dream of being a sports announcer by performing the play-by-play commentary in the video game "NBA Ballers: Chosen One" on Xbox 360 and PlayStation 3.
In 2009, Ridenhour wrote the foreword to the book "The Love Ethic: The Reason Why You Can't Find and Keep Beautiful Black Love" by Kamau and Akilah Butler. He also appeared on Brother Ali's album, "Us".
In March 2011, Chuck D re-recorded vocals with The Dillinger Escape Plan for a cover of "Fight the Power".
Chuck D duetted with rock singer Meat Loaf on his 2011 album "Hell in a Handbasket" on the song "Mad Mad World/The Good God Is a Woman and She Don't Like Ugly".
In 2016 Chuck D joined the band Prophets of Rage along with B-Real and former members of Rage Against the Machine.
In July 2019, Ridenhour sued Terrordome Music Publishing and Reach Music Publishing for $1 million for withholding royalties.
Chuck D is known for his powerful rapping. "How to Rap" says he "has a powerful, resonant voice that is often acclaimed as one of the most distinct and impressive in hip-hop". Chuck says this was based on listening to Melle Mel and sportscasters such as Marv Albert.
Chuck often comes up with a title for a song first. He writes on paper, though sometimes edits using a computer. He prefers to not punch in or overdub vocals.
Chuck listed his favourite rap albums in "Hip Hop Connection":
10. N.W.A, "Straight Outta Compton"
9. Boogie Down Productions, "Criminal Minded"
8. Run-DMC, "Tougher Than Leather"
7. Big Daddy Kane, "Looks Like a Job For..."
6. Stetsasonic, "In Full Gear"
5. Ice Cube, "AmeriKKKa's Most Wanted"
4. Dr. Dre, "The Chronic"
3. De La Soul, "3 Feet High and Rising"
2. Eric B. & Rakim, "Follow the Leader"
1. Run-DMC, "Raising Hell" ("It was the first record that made me realise this was an album-oriented genre")
Chuck D identifies as Black, as opposed to African or African-American. In a 1993 issue of DIRT Magazine covering a taping of In the Mix hosted by Alimi Ballard at the Apollo, Dan Field writes: "At one point, Chuck bristles a bit at the term 'African-American.' He thinks of himself as Black and sees nothing wrong with the term. Besides, he says, having been born in the United States and lived his whole life here, he doesn't consider himself African. Being in Public Enemy has given him the chance to travel around the world, an experience that really opened his eyes and his mind. He says visiting Africa and experiencing life on a continent where the majority of people are Black gave him a new perspective and helped him get in touch with his own history. He also credits a trip to the ancient Egyptian pyramids at Giza with helping him appreciate the relative smallness of man." Ridenhour is politically active; he co-hosted "Unfiltered" on Air America Radio, testified before Congress in support of peer-to-peer MP3 sharing, and was involved in a 2004 rap political convention. He has continued to be an activist, publisher, lecturer, and producer.
Addressing the negative views associated with rap music, he co-wrote the essay book "Fight the Power: Rap, Race, and Reality" with Yusuf Jah. He argues that "music and art and culture is escapism, and escapism sometimes is healthy for people to get away from reality", but sometimes the distinction is blurred and that's when "things could lead a young mind in a direction." He also founded the record company Slam Jamz and acted as narrator in Kareem Adouard's short film "Bling: Consequences and Repercussions", which examines the role of conflict diamonds in bling fashion. Despite Chuck D and Public Enemy's success, Chuck D claims that popularity or public approval was never a driving motivation behind their work. He is admittedly skeptical of celebrity status, revealing in a 1999 interview with "BOMB Magazine" that, "The key for the record companies is to just keep making more and more stars, and make the ones who actually challenge our way of life irrelevant. The creation of celebrity has clouded the minds of most people in America, Europe and Asia. It gets people off the path they need to be on as individuals."
In an interview with "Le Monde" published January 29, 2008, Chuck D stated that rap is devolving so much into a commercial enterprise, that the relationship between the rapper and the record label is that of slave to a master. He believes that nothing has changed for African-Americans since the debut of Public Enemy and, although he thinks that an Obama-Clinton alliance is great, he does not feel that the establishment will allow anything of substance to be accomplished. He stated that French President Nicolas Sarkozy is like any other European elite: he has profited through the murder, rape, and pillaging of those less fortunate and he refuses to allow equal opportunity for those men and women from Africa. In this article, he defended a comment made by Professor Griff in the past that he says was taken out of context by the media. The real statement was a critique of the Israeli government and its treatment of the Palestinian people. Chuck D stated that it is Public Enemy's belief that all human beings are equal.
In an interview with the magazine "N'Digo" published in June 2008, he spoke of today's mainstream urban music seemingly relishing the addictive euphoria of materialism and sexism, perhaps being the primary cause of many people harboring resentment towards the genre and its future. However, he has expressed hope for its resurrection, saying "It's only going to be dead if it doesn't talk about the messages of life as much as the messages of death and non-movement", citing artists such as NYOil, M.I.A. and The Roots as socially conscious artists who push the envelope creatively. "A lot of cats are out there doing it, on the Web and all over. They're just not placing their career in the hands of some major corporation."
In 2010, Chuck D released a track, "Tear Down That Wall". He said, "I talked about the wall not only just dividing the U.S. and Mexico but the states of California, New Mexico and Texas. But Arizona, it's like, come on. Now they're going to enforce a law that talks about basically racial profiling."
He is on the board of the TransAfrica Forum, a Pan African organization that is focused on African, Caribbean and Latin American issues.
Chuck D lives in California, and lost his home in the Thomas Fire of December 2017-January 2018. He is a pescetarian.
Cutaway (filmmaking)
In film and video, a cutaway is the interruption of a continuously filmed action by inserting a view of something else. It is usually followed by a cut back to the first shot. A cutaway scene is the interruption of a scene with the insertion of another scene, generally unrelated or only peripherally related to the original scene. The interruption is usually quick, and is usually, although not always, ended by a return to the original scene. The effect is of commentary to the original scene, frequently comic in nature.
The most common use of cutaway shots in dramatic films is to adjust the pace of the main action, to conceal the deletion of some unwanted part of the main shot, or to allow the joining of parts of two versions of that shot. For example, a scene may be improved by cutting a few frames out of an actor's pause; a brief view of a listener can help conceal the break. Or the actor may fumble some of his lines in a group shot; rather than discarding a good version of the shot, the director may just have the actor repeat the lines for a new shot, and cut to that alternate view when necessary.
Cutaways are also used often in older horror films in place of special effects. For example, a shot of a zombie getting its head cut off may start with a view of an axe being swung through the air, followed by a close-up of the actor swinging it, then followed by a cut back to the now severed head. George A. Romero, creator of the "Dead" series, and Tom Savini pioneered effects that removed the need for cutaways in horror films. "30 Rock" often used cutaway scenes to create visual humor; the Werewolf Bar Mitzvah scene, for instance, took three days to create for only five seconds of screen time. The animated television show "Family Guy" is best known for its use of cutaway gags as humor.
In news broadcasting and documentary work, the cutaway is used much as it would be in fiction. On location, there is usually just one camera to film an interview, and it is usually trained on the interviewee; often there is also only one microphone. After the interview, the interviewer will usually repeat his questions while he himself is being filmed, pausing as if listening to the answers. These shots can be used as cutaways. Cutaways to the interviewer, called noddies, can also be used to cover cuts.
The cutaway does not necessarily contribute any dramatic content of its own, but is used to help the editor assemble a longer sequence. For this reason, editors choose cutaways related to the main action, such as another action or object in the same location. For example, if the main shot is of a man walking down an alley, possible cutaways may include a shot of a cat on a nearby dumpster or a shot of a person watching from a window overhead.
Coma
A coma is a deep state of prolonged unconsciousness in which a person cannot be awakened; fails to respond normally to painful stimuli, light, or sound; lacks a normal wake-sleep cycle; and does not initiate voluntary actions. Coma patients exhibit a complete absence of wakefulness and are unable to consciously feel, speak or move. Comas can result from natural causes or can be medically induced.
Clinically, a coma can be defined as the inability to consistently follow a one-step command. It can also be defined as a score of ≤ 8 on the Glasgow Coma Scale (GCS) lasting ≥ 6 hours. For a patient to maintain consciousness, the components of "wakefulness" and "awareness" must be maintained. Wakefulness describes the quantitative degree of consciousness, whereas awareness relates to the qualitative aspects of the functions mediated by the cortex, including cognitive abilities such as attention, sensory perception, explicit memory, language, the execution of tasks, temporal and spatial orientation and reality judgment. From a neurological perspective, consciousness is maintained by the activation of the cerebral cortex (the gray matter that forms the outer layer of the brain) and by the reticular activating system (RAS), a structure located within the brainstem.
The term ‘coma’, from the Greek "koma", meaning deep sleep, had already been used in the Hippocratic corpus ("Epidemica") and later by Galen (second century AD). Subsequently, it was hardly used in the known literature up to the middle of the 17th century. The term is found again in Thomas Willis’ (1621–1675) influential "De anima brutorum" (1672), where lethargy (pathological sleep), ‘coma’ (heavy sleeping), "carus" (deprivation of the senses) and apoplexy (into which "carus" could turn and which he localized in the white matter) are mentioned. The term "carus" is also derived from Greek, where it can be found in the roots of several words meaning soporific or sleepy. It can still be found in the root of the term ‘carotid’. Thomas Sydenham (1624–89) mentioned the term ‘coma’ in several cases of fever (Sydenham, 1685).
General symptoms of a person in a comatose state include an inability to be awakened, a failure to respond normally to painful stimuli, light, or sound, the absence of a normal sleep-wake cycle, and a lack of voluntary actions.
Many types of problems can cause a coma. Forty percent of comatose states result from drug poisoning. Certain drugs, used under certain conditions, can damage or weaken the synaptic functioning in the ascending reticular activating system (ARAS) and keep the system from properly functioning to arouse the brain. Secondary effects of drugs, which include abnormal heart rate and blood pressure, as well as abnormal breathing and sweating, may also indirectly harm the functioning of the ARAS and lead to a coma. Given that drug poisoning is the cause for a large portion of patients in a coma, hospitals first test all comatose patients by observing pupil size and eye movement through the vestibulo-ocular reflex (see "Diagnosis" below).
The second most common cause of coma, which makes up about 25% of cases, is lack of oxygen, generally resulting from cardiac arrest. The Central Nervous System (CNS) requires a great deal of oxygen for its neurons. Oxygen deprivation in the brain, also known as hypoxia, causes sodium and calcium from outside of the neurons to decrease and intracellular calcium to increase, which harms neuron communication. Lack of oxygen in the brain also causes ATP exhaustion and cellular breakdown from cytoskeleton damage and nitric oxide production.
Twenty percent of comatose states result from the side effects of a stroke. During a stroke, blood flow to part of the brain is restricted or blocked. An ischemic stroke, brain hemorrhage, or tumor may cause restriction of blood flow. Lack of blood to cells in the brain prevents oxygen from getting to the neurons, and consequently causes cells to become disrupted and die. As brain cells die, brain tissue continues to deteriorate, which may affect the functioning of the ARAS.
The remaining 15% of comatose cases result from trauma, excessive blood loss, malnutrition, hypothermia, hyperthermia, abnormal glucose levels, and many other biological disorders. Furthermore, studies show that 1 out of 8 patients with traumatic brain injury experience a comatose state.
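The figures quoted across the preceding paragraphs form a complete breakdown of causes; a quick arithmetic check (the category labels paraphrase the text):

```python
# The stated causes of comatose states, in percent of cases.
causes = {
    "drug poisoning": 40,
    "lack of oxygen (e.g. cardiac arrest)": 25,
    "stroke": 20,
    "trauma, blood loss and other disorders": 15,
}
assert sum(causes.values()) == 100  # the breakdown accounts for all cases
```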
Injury to the cerebral cortex, the reticular activating system (RAS), or both is sufficient to cause a person to enter a coma.
The cerebral cortex is the outer layer of neural tissue of the cerebrum of the brain. The cerebral cortex is composed of gray matter, which consists of the nuclei of neurons, whereas the inner portion of the cerebrum is composed of white matter, made up of the axons of neurons. White matter is responsible for perception, relay of the sensory input via the thalamic pathway, and many other neurological functions, including complex thinking.
The RAS, on the other hand, is a more primitive structure in the brainstem which includes the reticular formation (RF). The RAS has two tracts, ascending and descending. The ascending tract, or ascending reticular activating system (ARAS), is made up of a system of acetylcholine-producing neurons, and works to arouse and wake up the brain. Arousal of the brain begins from the RF, through the thalamus, and then finally to the cerebral cortex. Any impairment in ARAS functioning, a neuronal dysfunction along the arousal pathway described above, prevents the body from being aware of its surroundings. Without the arousal and consciousness centers, the body cannot awaken, remaining in a comatose state.
The severity and mode of onset of coma depend on the underlying cause. There are two main subdivisions of a coma: structural and diffuse neuronal. A structural cause, for example, is brought upon by a mechanical force that brings about cellular damage, such as physical pressure or a blockage in neural transmission. A diffuse cause, by contrast, is limited to aberrations of cellular function and falls under a metabolic or toxic subgroup. Toxin-induced comas are caused by extrinsic substances, whereas metabolic-induced comas are caused by intrinsic processes, such as body thermoregulation or ionic imbalances (e.g. sodium). For instance, severe hypoglycemia (low blood sugar) or hypercapnia (increased carbon dioxide levels in the blood) are examples of a metabolic diffuse neuronal dysfunction. Hypoglycemia or hypercapnia initially cause mild agitation and confusion, but progress to obtundation, stupor, and finally, complete unconsciousness. In contrast, coma resulting from a severe traumatic brain injury or subarachnoid hemorrhage can be instantaneous. The mode of onset may therefore be indicative of the underlying cause.
Structural and diffuse causes of coma are not isolated from one another, as one can lead to the other in some situations. For instance, coma induced by a diffuse metabolic process, such as hypoglycemia, can result in a structural coma if it is not resolved. Another example is if cerebral edema, a diffuse dysfunction, leads to ischemia of the brainstem, a structural issue, due to the blockage of the circulation in the brain.
Although diagnosis of coma is simple, investigating the underlying cause of onset can be rather challenging. As such, after stabilizing the patient's airway, breathing and circulation (the basic ABCs), various diagnostic tests, such as physical examinations and imaging tools (CT scan, MRI, etc.), are employed to assess the underlying cause of the coma.
When an unconscious person enters a hospital, the hospital utilizes a series of diagnostic steps to identify the cause of unconsciousness. According to Young, the steps described below should be taken when dealing with a patient possibly in a coma.
In the initial assessment of coma, it is common to gauge the level of consciousness on the AVPU (alert, vocal stimuli, painful stimuli, unresponsive) scale by observing the patient's spontaneous actions and assessing their response to vocal and painful stimuli. More elaborate scales, such as the Glasgow Coma Scale, quantify an individual's reactions, such as eye opening, movement and verbal response, in order to indicate the extent of brain injury. The patient's score can vary from 3 (indicating severe brain injury) to 15 (indicating mild or no brain injury).
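These scales reduce to simple threshold checks. The sketch below combines the 3–15 range given here with the coma criterion (GCS ≤ 8) stated earlier in the article; the intermediate "moderate" band follows common clinical usage and is an assumption, not something the article specifies:

```python
def classify_gcs(score: int) -> str:
    """Map a Glasgow Coma Scale score (3-15) to a broad severity band.
    The <= 8 coma threshold is from the article; the 9-12 moderate
    band is an assumption based on common clinical usage."""
    if not 3 <= score <= 15:
        raise ValueError("GCS scores range from 3 to 15")
    if score <= 8:
        return "severe impairment (consistent with coma)"
    if score <= 12:
        return "moderate impairment"
    return "mild or no brain injury"

print(classify_gcs(6))   # severe impairment (consistent with coma)
print(classify_gcs(14))  # mild or no brain injury
```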
In those with deep unconsciousness, there is a risk of asphyxiation as the control over the muscles in the face and throat is diminished. As a result, those presenting to a hospital with coma are typically assessed for this risk ("airway management"). If the risk of asphyxiation is deemed high, doctors may use various devices (such as an oropharyngeal airway, nasopharyngeal airway or endotracheal tube) to safeguard the airway.
Imaging encompasses computed tomography (CAT or CT) scans of the brain and MRI, for example, and is performed to identify specific causes of the coma, such as hemorrhage in the brain or herniation of the brain structures. Special tests such as an EEG can also reveal much about the activity level of the cortex, such as semantic processing and the presence of seizures, and are important available tools not only for the assessment of cortical activity but also for predicting the likelihood of the patient's awakening. Autonomic responses such as the skin conductance response may also provide further insight into the patient's emotional processing.
In the treatment of traumatic brain injury (TBI), four examination methods have proved useful: skull x-ray, angiography, computed tomography (CT), and magnetic resonance imaging (MRI). The skull x-ray can detect linear fractures, impression fractures (expression fractures) and burst fractures. Angiography is used on rare occasions for TBIs, i.e. when there is suspicion of an aneurysm, carotid sinus fistula, traumatic vascular occlusion, or vascular dissection. A CT can detect changes in density between the brain tissue and hemorrhages, like subdural and intracerebral hemorrhages. MRIs are not the first choice in emergencies because of the long scanning times and because fractures cannot be detected as well as with CT. MRIs are used for the imaging of soft tissues and of lesions in the posterior fossa which cannot be found with the use of CT.
The brainstem and cortical function are assessed through special reflex tests such as the oculocephalic reflex test (doll's eyes test), the oculovestibular reflex test (cold caloric test), the corneal reflex, and the gag reflex. Reflexes are a good indicator of which cranial nerves are still intact and functioning, and they are an important part of the physical exam. Due to the unconscious status of the patient, only a limited number of the nerves can be assessed. These include cranial nerve number 2 (CN II), number 3 (CN III), number 5 (CN V), number 7 (CN VII), and cranial nerves 9 and 10 (CN IX, CN X).
Assessment of posture and physique is the next step. It involves general observation about the patient's positioning. There are often two stereotypical postures seen in comatose patients. Decorticate posturing is a stereotypical posturing in which the patient has arms flexed at the elbow, and arms adducted toward the body, with both legs extended. Decerebrate posturing is a stereotypical posturing in which the legs are similarly extended (stretched), but the arms are also stretched (extended at the elbow). The posturing is critical since it indicates where the damage is in the central nervous system. A decorticate posturing indicates a lesion (a point of damage) at or above the red nucleus, whereas a decerebrate posturing indicates a lesion at or below the red nucleus. In other words, a decorticate lesion is closer to the cortex, as opposed to a decerebrate posturing which indicates that the lesion is closer to the brainstem.
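The localization rule described above reduces to a simple lookup; a purely illustrative sketch:

```python
# Lesion localization implied by the two stereotypical postures,
# relative to the red nucleus, as described in the text.
POSTURE_TO_LESION = {
    "decorticate": "at or above the red nucleus (closer to the cortex)",
    "decerebrate": "at or below the red nucleus (closer to the brainstem)",
}

def localize(posture: str) -> str:
    """Return the lesion localization implied by a posture."""
    return POSTURE_TO_LESION[posture.lower()]

print(localize("decorticate"))  # at or above the red nucleus (...)
```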
Pupil assessment is often a critical portion of a comatose examination, as it can give information as to the cause of the coma.
A coma can be classified as (1) supratentorial (above the tentorium cerebelli), (2) infratentorial (below the tentorium cerebelli), (3) metabolic or (4) diffuse. This classification is merely dependent on the position of the original damage that caused the coma and does not correlate with severity or prognosis. The severity of coma impairment, however, is categorized into several levels. Patients may or may not progress through these levels. In the first level, brain responsiveness lessens, normal reflexes are lost, and the patient no longer responds to pain and cannot hear.
The Rancho Los Amigos Scale is a complex scale that has eight separate levels, and is often used in the first few weeks or months of coma while the patient is under closer observation, and when shifts between levels are more frequent.
Treatment for people in a coma will depend on the severity and cause of the comatose state. Upon admittance to an emergency department, coma patients will usually be placed in an Intensive Care Unit (ICU) immediately, where maintenance of the patient's respiration and circulation become a first priority. Stability of their respiration and circulation is sustained through the use of intubation, ventilation, administration of intravenous fluids or blood and other supportive care as needed.
Once a patient is stable and no longer in immediate danger, there may be a shift of priority from stabilizing the patient to maintaining the state of their physical wellbeing. Moving patients every 2–3 hours by turning them side to side is crucial to avoiding bed sores as a result of being confined to a bed. Moving patients through the use of physical therapy also aids in preventing atelectasis, contractures or other orthopedic deformities which would interfere with a coma patient's recovery.
Pneumonia is also common in coma patients due to their inability to swallow, which can then lead to aspiration. A coma patient's lack of a gag reflex and use of a feeding tube can result in food, drink or other solid organic matter being lodged within the lower respiratory tract (from the trachea to the lungs). This trapping of matter in the lower respiratory tract can ultimately lead to infection, resulting in aspiration pneumonia.
Coma patients may also experience restlessness or seizures. As such, soft cloth restraints may be used to prevent them from pulling on tubes or dressings, and side rails on the bed should be kept up to prevent patients from falling.
Coma elicits a wide variety of emotional reactions from the family members of affected patients, as well as from the primary caregivers taking care of the patients. Research has shown that the severity of the injury causing the coma has no significant impact on these reactions compared with how much time has passed since the injury occurred. Common reactions, such as desperation, anger, frustration, and denial are possible. The focus of patient care should be on creating an amicable relationship with the family members or dependents of a comatose patient as well as creating a rapport with the medical staff. Although the primary caregiver carries most of the responsibility, secondary caregivers can play a supporting role and temporarily relieve the primary caregiver's burden of tasks.
Comas can last from several days to several weeks. In more severe cases a coma may last for over five weeks, while some have lasted as long as several years. After this time, some patients gradually come out of the coma, some progress to a vegetative state, and others die. Some patients who have entered a vegetative state go on to regain a degree of awareness and in some cases, may remain in vegetative state for years or even decades (the longest recorded period being 42 years).
Predicted chances of recovery will differ depending on which techniques were used to measure the patient's severity of neurological damage. Predictions of recovery are based on statistical rates, expressed as the level of chance the person has of recovering. Time is the best general predictor of a chance of recovery. For example, after four months of coma caused by brain damage, the chance of partial recovery is less than 15%, and the chance of full recovery is very low.
The outcome for coma and vegetative state depends on the cause, location, severity and extent of neurological damage. A deeper coma alone does not necessarily mean a slimmer chance of recovery; similarly, milder comas do not ensure higher chances of recovery. The most common cause of death for a person in a vegetative state is secondary infection such as pneumonia, which can occur in patients who lie still for extended periods.
People may emerge from a coma with a combination of physical, intellectual, and psychological difficulties that need special attention. It is common for coma patients to awaken in a profound state of confusion and suffer from dysarthria, the inability to articulate any speech. Recovery usually occurs gradually. In the first days, patients may only awaken for a few minutes, with increased duration of wakefulness as their recovery progresses and may eventually recover full awareness. That said, some patients may never progress beyond very basic responses.
There are reports of people coming out of a coma after long periods of time. After 19 years in a minimally conscious state, Terry Wallis spontaneously began speaking and regained awareness of his surroundings.
A brain-damaged man, trapped in a coma-like state for six years, was brought back to consciousness in 2003 by doctors who planted electrodes deep inside his brain. The method, called deep brain stimulation (DBS) successfully roused communication, complex movement and eating ability in the 38-year-old American man who suffered a traumatic brain injury. His injuries left him in a minimally conscious state (MCS), a condition akin to a coma but characterized by occasional, but brief, evidence of environmental and self-awareness that coma patients lack.
Research by Dr. Eelco Wijdicks on the depiction of comas in movies was published in Neurology in May 2006. Dr. Wijdicks studied 30 films (made between 1970 and 2004) that portrayed actors in prolonged comas, and he concluded that only two films accurately depicted the state of a coma victim and the agony of waiting for a patient to awaken: "Reversal of Fortune" (1990) and "The Dreamlife of Angels" (1998). The remaining 28 were criticized for portraying miraculous awakenings with no lasting side effects, unrealistic depictions of treatments and equipment required, and comatose patients remaining muscular and tanned.
A person in a coma is said to be in an unconscious state. Perspectives on personhood, identity and consciousness come into play when discussing the metaphysical and bioethical views on comas.
It has been argued that unawareness should be just as ethically relevant and important as a state of awareness and that there should be metaphysical support of unawareness as a state.
In the ethical discussions about disorders of consciousness (DOCs), two abilities are usually considered as central: "experiencing well-being and having interests." Well-being can broadly be understood as the positive effect related to what makes life good (according to specific standards) for the individual in question. The only condition for well-being broadly considered is the ability to experience its ‘positiveness’. That said, because experiencing positiveness is a basic emotional process with phylogenetic roots, it is likely to occur at a completely unaware level and therefore introduces the idea of an unconscious well-being. The ability to have interests is the second of the two central abilities in which those in comas are deficient. Having an interest in a certain domain can be understood as having a stake in something that can affect what makes our life good in that domain. An interest is what directly and immediately improves life from a certain point of view or within a particular domain, or greatly increases the likelihood of life improvement enabling the subject to realize some good. That said, sensitivity to reward signals is a fundamental element in the learning process, both consciously and unconsciously. Moreover, the unconscious brain is able to interact with its surroundings in a meaningful way and to produce meaningful information processing of stimuli coming from the external environment, including other people.
According to Hawkins, "1. A life is good if the subject is able to value, or more basically if the subject is able to care. Importantly, Hawkins stresses that caring has no need for cognitive commitment, i.e. for high-level cognitive activities: it requires being able to distinguish something, track it for a while, recognize it over time, and have certain emotional dispositions "vis-à-vis" something. 2. A life is good if the subject has the capacity for relationship with others, i.e. for meaningfully interacting with other people." This suggests that unawareness may (at least partly) fulfill both conditions identified by Hawkins for life to be good for a subject, thus making the unconscious ethically relevant.
Call of Cthulhu (role-playing game)
Call of Cthulhu is a horror fiction role-playing game based on H. P. Lovecraft's story of the same name and the associated Cthulhu Mythos. The game, often abbreviated as "CoC", is published by Chaosium; it was first released in 1981 and is currently in its seventh edition, with many different versions released. It makes use of Chaosium's Basic Role-Playing (BRP) system, with special rules for Sanity.
The setting of "Call of Cthulhu" is a darker version of our world, based on H. P. Lovecraft's observation (from his essay, "Supernatural Horror in Literature") that "The oldest and strongest emotion of mankind is fear, and the strongest kind of fear is fear of the unknown." The original game, first published in 1981, uses mechanics from Basic Role-Playing, and is set in the 1920s, the setting of many of Lovecraft's stories. Additional settings were developed in the "Cthulhu by Gaslight" supplement, a blend of occult and Holmesian mystery and mostly set in England during the 1890s, and modern/1980s conspiracy with "Cthulhu Now" and "Delta Green". More recent additions include 1000 AD ("Cthulhu: Dark Ages"), 23rd century ("Cthulhu Rising") and Ancient Roman times ("Cthulhu Invictus"). The protagonists may also travel to places that are not of this earth, represented in the Dreamlands (which can be accessed through dreams as well as being physically connected to the earth), to other planets, or into the voids of space. In keeping with the Lovecraftian theme, the gamemaster is called the Keeper of Arcane Lore, or simply the keeper, while player characters are called "investigators".
"CoC" uses the Basic Role-Playing system first developed for "RuneQuest" and used in other Chaosium games. It is skill-based, with player characters getting better with their skills by succeeding at using them for as long as they stay functionally healthy and sane. They do not, however, gain hit points and do not become significantly harder to kill. The game does not use levels.
"CoC" uses percentile dice (with a results ranging from 1 to 100) to determine success or failure. Every player statistic is intended to be compatible with the notion that there is a probability of success for a particular action given what the player is capable of doing. For example, an artist may have a 75% chance of being able to draw something (represented by having 75 in Art skill), and thus rolling a number under 75 would yield a success. Rolling or less of the skill level (1-15 in the example) would be a "special success" (or an "impale" for combat skills) and would yield some extra bonus to be determined by the keeper. For example, the artist character might draw especially well or especially fast, or catch some unapparent detail in the drawing.
The players take the roles of ordinary people drawn into the realm of the mysterious: detectives, criminals, scholars, artists, war veterans, etc. Often, happenings begin innocently enough, until more and more of the workings behind the scenes are revealed. As the characters learn more of the true horrors of the world and the irrelevance of humanity, their sanity (represented by "Sanity Points", abbreviated SAN) inevitably withers away. The game includes a mechanism for determining how damaged a character's sanity is at any given point; encountering the horrific beings usually triggers a loss of SAN points. To gain the tools they need to defeat the horrors – mystic knowledge and magic – the characters may end up losing some of their sanity, though other means such as pure firepower or simply outsmarting one's opponents also exist. "CoC" has a reputation as a game in which it is quite common for a player character to die in gruesome circumstances or end up in a mental institution. Eventual triumph of the players is not assumed.
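The Sanity mechanic can be sketched in the same style. The roll-against-current-SAN structure below follows the general shape of the rules as described, but the loss amounts are illustrative placeholders, not published game statistics:

```python
import random

def sanity_check(current_san: int) -> int:
    """Percentile roll against current SAN on witnessing a horror:
    success costs little sanity, failure costs more. Loss amounts
    here are purely illustrative."""
    roll = random.randint(1, 100)
    if roll <= current_san:
        loss = 1                     # placeholder minor loss
    else:
        loss = random.randint(1, 6)  # placeholder larger loss
    return max(current_san - loss, 0)

san = 65
san = sanity_check(san)  # one horrific encounter
```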
The original conception of "Call of Cthulhu" was "Dark Worlds", a game commissioned by the publisher Chaosium but never published. Sandy Petersen contacted them regarding writing a supplement for their popular fantasy game "RuneQuest" set in Lovecraft's Dreamlands. He took over the writing of "Call of Cthulhu", and the game was released in 1981.
Since Petersen's departure from Chaosium, continuing development of "Call of Cthulhu" has passed to Lynn Willis, credited as co-author in the fifth and sixth editions, and more recently to Paul Fricker and Mike Mason. The game system underwent only minor rules changes in its first six editions (between 1981 and 2011); the current seventh edition, released in 2014, includes more significant rules alterations than any previous release.
For those grounded in the RPG tradition, the very first release of "Call of Cthulhu" created a brand new framework for table-top gaming. Rather than the traditional format established by "Dungeons & Dragons", which often involved the characters wandering through caves or tunnels and fighting different types of monsters, Sandy Petersen introduced the concept of the "Onion Skin": interlocking layers of information and nested clues that lead the player characters from seemingly minor investigations, such as a missing person, to the discovery of mind-numbingly awful global conspiracies to destroy the world. Unlike its predecessor games, "CoC" assumed that most investigators would not make it through alive and sane, and that the only safe way to deal with the vast majority of nasty things described in the rule books was to run away. A well-run "CoC" campaign should engender a sense of foreboding and inevitable doom in its players. The style and setting of the game, in a relatively modern time period, created an emphasis on real-life settings, character research, and thinking one's way around trouble.
The first book of "Call of Cthulhu" adventures was "Shadows of Yog-Sothoth". In this work, the characters come upon a secret society's foul plot to destroy mankind, and pursue it first near to home and then in a series of exotic locations. This template was to be followed in many subsequent campaigns, including "Fungi from Yuggoth" (later known as "Curse of Cthulhu" and "Day of the Beast"), "Spawn of Azathoth", and possibly the most highly acclaimed, "Masks of Nyarlathotep".
"Shadows of Yog-Sothoth" is important not only because it represents the first published addition to the boxed first edition of "Call of Cthulhu", but because its format defined a new way of approaching a campaign of linked RPG scenarios involving actual clues for the would-be detectives amongst the players to follow and link in order to uncover the dastardly plots afoot. Its format has been used by every other campaign-length "Call of Cthulhu" publication. The standard of "CoC" scenarios was well received by independent reviewers. "The Asylum and Other Tales", a series of stand alone articles released in 1983, rated an overall 9/10 in Issue 47 of "White Dwarf" magazine.
The standard of the included 'clue' material varies from scenario to scenario, but reached its zenith in the original boxed versions of the "Masks of Nyarlathotep" and "Horror on the Orient Express" campaigns. Inside these one could find matchbooks and business cards apparently defaced by non-player characters, newspaper cuttings and (in the case of "Orient Express") period passports to which players could attach their photographs, increasing the sense of immersion. Indeed, during the period that these supplements were produced, third party campaign publishers strove to emulate the quality of the additional materials, often offering separately-priced 'deluxe' clue packages for their campaigns.
Additional milieux were provided by Chaosium with the release of "Dreamlands", a boxed supplement containing additional rules needed for playing within the Lovecraft Dreamlands, a large map and a scenario booklet, and "Cthulhu By Gaslight", another boxed set which moved the action from the 1920s to the 1890s.
In 1987, Chaosium issued the supplement titled "Cthulhu Now", a collection of rules, supplemental source materials and scenarios for playing "Call of Cthulhu" in the present day. This proved to be a very popular alternative milieu, so much so that much of the supplemental material is now included in the core rule book.
Pagan Publishing released "Delta Green", a series of supplements originally set in the 1990s, although later supplements add support for playing closer to the present day. In these, player characters are agents of a secret agency known as Delta Green, which fights against creatures from the Mythos and conspiracies related to them. Arc Dream Publishing released a new version of "Delta Green" in 2016 as a standalone game, partially using the mechanics from "Call of Cthulhu".
"Lovecraft Country" was a line of supplements for "Call of Cthulhu" released in 1990. These supplements were overseen by Keith Herber and provided backgrounds and adventures set in Lovecraft's fictional towns of Arkham, Kingsport, Innsmouth, Dunwich, and their environs. The intent was to give investigators a common base, as well as to center the action on well-drawn characters with clear motivations.
In 1987, "Terror Australis: Call of Cthulhu in the Land Down Under" was published. In 2018, a revised and updated version of the 1987 game was reissued, with about triple the content and two new games. It requires the "Call of Cthulhu Keeper's Rulebook" (7th Edition) and is usable with "Pulp Cthulhu".
Owing to the works' "national significance relating to Australia and the Australian people", and the way the game "blends the Australian environment and climate, as well as Indigenous culture and legends, into the fabric of the game", electronic versions have been lodged via National edeposit with the National Library of Australia. A physical copy of the 1987 book, which is out of print, is still being sought.
In the years since the collapse of the "Mythos" collectible card game (production ceased in 1997), the release of "CoC" books has been very sporadic, with up to a year between releases. Chaosium struggled with near bankruptcy for many years before finally starting their upward climb again.
2005 was Chaosium's busiest year in a long time, with 10 releases for the game. Chaosium took to marketing "monographs" (short books by individual writers, with editing and layout provided out-of-house) directly to the consumer, allowing the company to gauge market response to possible new works. The range of times and places in which the horrors of the Mythos can be encountered was also expanded from late 2005 onward with the addition of "Cthulhu Dark Ages" by Stéphane Gesbert, which gives a framework for playing games set in 11th century Europe, "Secrets of Japan" by Michael Dziesinski for gaming in modern-day Japan, and "Secrets of Kenya" by David Conyers for gaming in interwar period Africa.
In July 2011, Chaosium announced it would re-release a 30th anniversary edition of the "CoC" 6th edition role-playing game. This 320-page book features thick (3 mm) leatherette hardcovers with the front cover and spine stamped with gold foil. The interior pages are printed in black ink, on 90 gsm matte art paper. The binding is thread sewn, square backed. Chaosium offered a one-time printing of this Collector's Edition.
On May 28, 2013, a crowdfunding campaign on Kickstarter for the 7th edition of "Call of Cthulhu" was launched with a goal of $40,000; it ended on June 29 of the same year having collected $561,836. It included many more major revisions than any previous edition, and also split the core rules into two books, a Player's Guide and Keeper's Guide. Problems and delays fulfilling the Kickstarters for the 7th edition of "Call of Cthulhu" led Greg Stafford and Sandy Petersen (who had both left in 1998) to return to an active role at Chaosium in June 2015.
Chaosium has licensed other publishers to create supplements using their rule system, notably including "Delta Green" by Pagan Publishing. Other licensees have included Infogrames, Miskatonic River Press, Theater of the Mind Enterprises, Triad Entertainment, Games Workshop, Fantasy Flight Games, RAFM, Goodman Games, Grenadier Models Inc. and Yog-Sothoth.com. These supplements may be set in different time frames or even different game universes from the original game.
"Shadow of the Comet" (later repackaged as "Call of Cthulhu: Shadow of the Comet") is an adventure game developed and released by Infogrames in 1993. The game is based on H. P. Lovecraft's Cthulhu Mythos and uses many elements from Lovecraft's "The Dunwich Horror" and "The Shadow Over Innsmouth". A follow-up game, "Prisoner of Ice", is not a direct sequel.
"Prisoner of Ice" (also "Call of Cthulhu: Prisoner of Ice") is an adventure game developed and released by Infogrames for the PC and Macintosh computers in 1995 in America and Europe. It is based on H. P. Lovecraft's Cthulhu Mythos, particularly "At the Mountains of Madness", and is a follow-up to Infogrames' earlier "Shadow of the Comet". In 1997, the game was ported to the Sega Saturn and PlayStation exclusively in Japan.
In 2001, a stand-alone version of "Call of Cthulhu" was released by Wizards of the Coast for the d20 system. Intended to preserve the feeling of the original game, the d20 conversion of the game rules was supposed to make the game more accessible to the large "D&D" player base. The d20 system also made it possible to use "Dungeons & Dragons" characters in "Call of Cthulhu", as well as to introduce the Cthulhu Mythos into "Dungeons & Dragons" games. The d20 version of the game is no longer supported by Wizards as per their contract with Chaosium. Chaosium included d20 stats as an appendix in three releases (see Lovecraft Country), but have since dropped the "dual stat" idea.
"Call of Cthulhu: Dark Corners of the Earth" is a licensed first-person shooter adventure game by Headfirst Productions, based on the "Call of Cthulhu" campaign "Escape from Innsmouth" and released by Bethesda Softworks in 2005/2006 for the PC and Xbox.
In February 2008, Pelgrane Press published "Trail of Cthulhu", a stand-alone game created by Kenneth Hite using the GUMSHOE System developed by Robin Laws. The "Trail of Cthulhu" system is more mystery-oriented and focuses mostly on interpreting clues.
In September 2008, Reality Deviant Publications published "Shadows of Cthulhu", a supplement that brings Lovecraftian gaming to Green Ronin's True20 system.
In October 2009, Reality Blurs published "Realms of Cthulhu", a supplement for Pinnacle Entertainment's Savage Worlds system.
In 2010, Cubicle 7 published an official role-playing game, "The Laundry", and a number of supplements, all based on Charles Stross's "Laundry Files" series.
In April 2011, Chaosium and new developer Red Wasp Design announced a joint project to produce a mobile video game based on the "Call of Cthulhu" RPG, entitled "Call of Cthulhu: The Wasted Land". The game was released on January 30, 2012.
In 2018, Metarcade produced "Cthulhu Chronicles", a game for iOS with a campaign of nine mobile interactive fiction stories set in 1920s England based on "Call of Cthulhu". The first five stories were released on July 10, 2018.
"Mythos" was a collectible card game (CCG) based on the Cthulhu Mythos that Chaosium produced and marketed during the mid-1990s. While generally praised for its fast gameplay and unique mechanics, it ultimately failed to gain a very large market presence. It bears mention because its eventual failure brought the company to hard times that affected its ability to produce material for "Call of Cthulhu". "Call of Cthulhu: The Card Game" is a second collectible card game, produced by Fantasy Flight Games.
The first licensed "Call of Cthulhu" gaming miniatures were sculpted by Andrew Chernack and released by Grenadier Models in boxed sets and blister packs in 1983. The license was later transferred to RAFM. As of 2011, RAFM still produce licensed C"all of Cthulhu" models sculpted by Bob Murch. Both lines include investigator player character models and the iconic monsters of the Cthulhu mythos.
In July 2015, Reaper Miniatures launched its third "Bones" Kickstarter, intended to help the company migrate some miniatures from metal to plastic and to introduce some new ones. Among the stretch goals was the second $50 expansion, devoted to the Mythos, with miniatures such as Cultists, Deep Ones, Mi-Go, and an extra $15 Shub-Niggurath "miniature" (measuring at least 6x4 squares). Those miniatures are expected to remain in the Reaper Miniatures catalogue after the Kickstarter project finishes. In 2020, Chaosium announced a license agreement with Ardacious for "Call of Cthulhu" virtual miniatures to be released on their augmented reality app Ardent Roleplay.
Call of Cthulhu: The Official Video Game is a survival horror role-playing video game developed by Cyanide and published by Focus Home Interactive for PlayStation 4, Xbox One and Windows. The game features a semi-open world environment and incorporates themes of Lovecraftian and psychological horror into a story which includes elements of investigation and stealth. It is inspired by H. P. Lovecraft's short story "The Call of Cthulhu".
Several reviews of various editions appeared in "Space Gamer/Fantasy Gamer".
Several reviews of various editions appeared in "White Dwarf".
Several reviews of various editions and supplements also appeared in "Dragon".
In a 1996 reader poll by "Arcane" magazine to determine the 50 most popular roleplaying games of all time, "Call of Cthulhu" was ranked 1st. Editor Paul Pettengale commented: ""Call of Cthulhu" is fully deserved of the title as the most popular roleplaying system ever - it's a game that doesn't age, is eminently playable, and which hangs together perfectly. The system, even though it's over ten years old, is still one of the very best you'll find in any roleplaying game. Also, there's not a referee in the land who could say they've read every Lovecraft inspired book or story going, so there's a pretty-well endless supply of scenario ideas. It's simply marvellous."
The game has won several major awards. | https://en.wikipedia.org/wiki?curid=5722 |
Cape Breton Island
Cape Breton Island (formerly "Île Royale" to the French; known in Mi'kmaq as "Unama'ki"; or simply "Cape Breton") is an island on the Atlantic coast of North America and part of the province of Nova Scotia, Canada.
The island accounts for 18.7% of Nova Scotia's total area. Although the island is physically separated from the Nova Scotia peninsula by the Strait of Canso, the long rock-fill Canso Causeway connects it to mainland Nova Scotia. The island is east-northeast of the mainland, with its northern and western coasts fronting on the Gulf of Saint Lawrence; its western coast also forms the eastern limits of the Northumberland Strait. The eastern and southern coasts front the Atlantic Ocean; its eastern coast also forms the western limits of the Cabot Strait. Its landmass slopes upward from south to north, culminating in the highlands of its northern cape. One of the world's larger salt water lakes, Bras d'Or ("Arm of Gold" in French), dominates the island's centre.
The island is divided into four of Nova Scotia's eighteen counties: Cape Breton, Inverness, Richmond, and Victoria. Their total population at the 2016 census numbered 132,010 Cape Bretoners; this is approximately 15% of the provincial population. Cape Breton Island has experienced a decline in population of approximately 2.9% since the 2011 census. Approximately 75% of the island's population is in the Cape Breton Regional Municipality (CBRM) which includes all of Cape Breton County and is often referred to as Industrial Cape Breton, given the history of coal mining and steel manufacturing in this area, which was Nova Scotia's industrial heartland throughout the 20th century.
The island has five reserves of the Mi'kmaq Nation, among them Chapel Island; Eskasoni is the largest in both population and land area.
Its name may derive from Capbreton near Bayonne or, more probably, from "Cape" and "Breton", the French demonym for Brittany, the French historical region. William Francis Ganong, however, rejects a French origin for the name and offers instead that the earliest form of the name appeared on Portuguese maps as "bertomes", and, he argues, "that word meant at the time the English and not the French Bretons, and referred to the region which John Cabot and his Bristol Englishmen discovered on the voyage of 1497...therefore our Cape Breton would mean 'Cape of the English'."
Cape Breton Island's first residents were likely Archaic maritime natives, ancestors of the Mi'kmaq. These peoples and their progeny inhabited the island (known as Unama'ki) for several thousand years and continue to live there to this day. Their traditional lifestyle centred around hunting and fishing because of the unfavourable agricultural conditions of their maritime home. This ocean-centric lifestyle did, however, make them among the first indigenous peoples to discover European explorers and sailors fishing in the St Lawrence Estuary. John Cabot reportedly visited the island in 1497. However, European histories and maps of the period are of too poor quality to be sure whether Cabot first visited Newfoundland or Cape Breton Island. This discovery is commemorated by Cape Breton's Cabot Trail, and by the Cabot's Landing Historic Site & Provincial Park, near the village of Dingwall.
The local Mi'kmaq peoples began trading with European fishermen when the fishermen began landing in their territories as early as the 1520s. In about 1521–22, the Portuguese under João Álvares Fagundes established a fishing colony on the island. As many as two hundred settlers lived in a village, the name of which is not known, located according to some historians at what is now Ingonish on the island's northeastern peninsula. These fishermen traded with the local population but did not maintain a permanent settlement. This Portuguese colony's fate is unknown, but it is mentioned as late as 1570.
During the Anglo-French War of 1627 to 1629, under Charles I, the Kirkes took Quebec City; Sir James Stewart of Killeith, Lord Ochiltree, planted a colony on Unama'ki at Baleine, Nova Scotia; and William Alexander, son of the 1st Earl of Stirling, established the first incarnation of "New Scotland" at Port Royal. These claims, and the larger European ideals of conquest behind them, marked the first time the island was incorporated as European territory, though treaties would not actually be signed until several decades later (no copies of these treaties exist).
These Scottish triumphs, which left Cape Sable as the only major French holding in North America, did not last. Charles I's haste to make peace with France on the terms most beneficial to him meant the new North American gains would be bargained away in the Treaty of Saint-Germain-en-Laye (1632), which established which European power had claim over the territories, but did not in fact establish that Europeans had any claim to begin with.
The French quickly defeated the Scots at Baleine, and established the first European settlements on Île Royale: present day Englishtown (1629) and St. Peter's (1630). These settlements lasted only one generation, until Nicolas Denys left in 1659. The island did not have any European settlers for another fifty years before those communities along with Louisbourg were re-established in 1713, after which point European settlement was permanently established on the island.
Known as ""Île Royale"" ("Royal Island") to the French, the island also saw active settlement by France. After the French ceded their claims to Newfoundland and the Acadian mainland to the British by the Treaty of Utrecht in 1713, the French relocated the population of Plaisance, Newfoundland, to Île Royale and the French garrison was established in the central eastern part at Sainte Anne. As the harbour at Sainte Anne experienced icing problems, it was decided to build a much larger fortification at Louisbourg to improve defences at the entrance to the Gulf of Saint Lawrence and to defend France's fishing fleet on the Grand Banks. The French also built the Louisbourg Lighthouse in 1734, the first lighthouse in Canada and one of the first in North America. In addition to Cape Breton Island, the French colony of Île Royale also included Île Saint-Jean, today called Prince Edward Island, and Les Îles-de-la-Madeleine.
Louisbourg itself was one of the most important commercial and military centres in New France. Louisbourg was captured by New Englanders with British naval assistance in 1745 and by British forces in 1758. The French population of Île Royale was deported to France after each siege. While French settlers returned to their homes in Île Royale after the Treaty of Aix-la-Chapelle was signed in 1748, the fortress was demolished after the second siege. Île Royale remained formally part of New France until it was ceded to Great Britain by the Treaty of Paris in 1763. It was then merged with the adjacent, British colony of Nova Scotia (present day peninsular Nova Scotia and New Brunswick). Acadians who had been expelled from Nova Scotia and Île Royale were permitted to settle in Cape Breton beginning in 1764, and established communities in north-western Cape Breton, near Cheticamp, and southern Cape Breton, on and near Isle Madame.
Some of the first British-sanctioned settlers on the island following the Seven Years' War were Irish, although upon settlement they merged with local French communities to form a culture rich in music and tradition. From 1763 to 1784, the island was administratively part of the colony of Nova Scotia and was governed from Halifax.
The first permanently settled Scottish community on Cape Breton Island was Judique, settled in 1775 by Michael Mor MacDonald. He spent his first winter using his upside-down boat for shelter, which is reflected in the architecture of the village's Community Centre. He composed a song about the area called "O 's àlainn an t-àite", or "O, Fair is the Place."
During the American Revolution, on 1 November 1776, John Paul Jones – the father of the American Navy – set sail in command of "Alfred" to free hundreds of American prisoners working in the area's coal mines. Although winter conditions prevented the freeing of the prisoners, the mission did result in the capture of "Mellish", a vessel carrying a vital supply of winter clothing intended for John Burgoyne's troops in Canada.
Major Timothy Hierlihy and his regiment on board HMS "Hope" worked the coal mines at Sydney, Cape Breton, and protected them from privateer attacks. Sydney provided a vital supply of coal for Halifax throughout the war. The British began developing the mining site at Sydney Mines in 1777. On 14 May 1778, Major Hierlihy arrived at Cape Breton. While there, Hierlihy reported that he "beat off many piratical attacks, killed some and took other prisoners."
A few years into the war there was also a naval engagement between French ships and a British convoy off Sydney, Nova Scotia, near Spanish River, Cape Breton, in 1781. The French ships (fighting on the side of the Americans) were re-coaling and defeated the British convoy. Six French sailors were killed and 17 British, with many more wounded.
In 1784, Britain split the colony of Nova Scotia into three separate colonies: New Brunswick, Cape Breton Island, and present-day peninsular Nova Scotia, in addition to the adjacent colonies of St. John's Island (renamed Prince Edward Island in 1798) and Newfoundland. The colony of Cape Breton Island had its capital at Sydney on its namesake harbour fronting on Spanish Bay and the Cabot Strait. Its first Lieutenant-Governor was Joseph Frederick Wallet DesBarres (1784–1787) and his successor was William Macarmick (1787).
A number of United Empire Loyalists emigrated to the Canadian colonies, including Cape Breton. David Mathews, the former Mayor of New York City during the American Revolution, emigrated with his family to Cape Breton in 1783. He succeeded Macarmick as head of the colony and served from 1795 to 1798.
From 1799 to 1807, the military commandant was John Despard, brother of Edward.
An order forbidding the granting of land in Cape Breton, issued in 1763, was removed in 1784. The mineral rights to the island were given over to the Duke of York by an order-in-council. The British government had intended that the Crown take over the operation of the mines when Cape Breton was made a colony, but this was never done, probably because of the rehabilitation cost of the mines. The mines were in a neglected state, caused by careless operations dating back at least to the time of the final fall of Louisbourg in 1758.
Large-scale shipbuilding began in the 1790s, beginning with schooners for local trade moving in the 1820s to larger brigs and brigantines, mostly built for British shipowners. Shipbuilding peaked in the 1850s, marked in 1851 by the full-rigged ship "Lord Clarendon", the largest wooden ship ever built in Cape Breton.
In 1820, the colony of Cape Breton Island was merged for the second time with Nova Scotia. This development is one of the factors which led to large-scale industrial development in the Sydney Coal Field of eastern Cape Breton County. By the late 19th century, as a result of faster shipping, an expanding fishery, and industrialization of the island, exchanges of people between the island of Newfoundland and Cape Breton increased, beginning a cultural exchange that continues to this day.
The 1920s were some of the most violent times in Cape Breton, marked by several severe labour disputes. The famous murder of William Davis by strike breakers, and the seizing of the New Waterford power plant by striking miners, led to a strong union sentiment that persists to this day in some circles. William Davis Miners' Memorial Day is celebrated in coal mining towns to commemorate the deaths of miners at the hands of the coal companies.
The turn of the 20th century saw Cape Breton Island at the forefront of scientific achievement with the now-famous activities launched by inventors Alexander Graham Bell and Guglielmo Marconi.
Following his successful invention of the telephone and being relatively wealthy, Bell acquired land near Baddeck in 1885, largely due to surroundings reminiscent of his early years in Scotland. He established a summer estate complete with research laboratories, working with deaf people—including Helen Keller—and continued to invent. Baddeck would be the site of his experiments with hydrofoil technologies as well as the Aerial Experiment Association, financed by his wife, which saw the first powered flight in Canada when the AEA "Silver Dart" took off from the ice-covered waters of Bras d'Or Lake. Bell also built the forerunner to the iron lung and experimented with breeding sheep.
Marconi's contributions to Cape Breton Island were also quite significant, as he used the island's geography to his advantage in transmitting the first North American trans-Atlantic radio message from a station constructed at Table Head in Glace Bay to a receiving station at Poldhu in Cornwall, England. Marconi's pioneering work in Cape Breton marked the beginning of modern radio technology. Marconi's station at Marconi Towers, on the outskirts of Glace Bay, became the chief communication centre for the Royal Canadian Navy in World War I through to the early years of World War II.
Promotions for tourism beginning in the 1950s recognized the importance of the Scottish culture to the province, and the provincial government started encouraging the use of Gaelic once again. The establishment of funding for the Gaelic College of Celtic Arts and Crafts and formal Gaelic language courses in public schools are intended to address the near-loss of this culture to English assimilation.
In the 1960s, the Fortress of Louisbourg was partially reconstructed by Parks Canada. Since 2009, this National Historic Site of Canada has attracted an average of 90,000 visitors per year.
Gaelic speakers in Cape Breton, as elsewhere in Nova Scotia, made up a large proportion of the local population from the eighteenth century on. They brought with them a common culture of poetry, traditional songs and tales, music and dance, and used this to develop distinctive local traditions.
Most Gaelic settlement in Nova Scotia happened between 1770 and 1840, with probably over 50,000 Gaelic speakers emigrating from the Scottish Highlands and the Hebrides to Nova Scotia and Prince Edward Island. Such emigration was facilitated by changes in Gaelic society and the economy, with sharp increases in rents, confiscation of land, and disruption of local customs and rights. In Nova Scotia, poetry and song in Gaelic flourished. George Emmerson argues that an "ancient and rich" tradition of storytelling, song, and Gaelic poetry emerged during the eighteenth century and was transplanted from the Highlands of Scotland to Nova Scotia, where the language similarly took root. The majority of those settling in Nova Scotia from the end of the eighteenth century through to the middle of the next century were from the Scottish Highlands rather than the Lowlands, making the Highland tradition's impact on the region more profound. Gaelic settlement in Cape Breton began in earnest in the early nineteenth century.
The Gaelic language became dominant from Colchester County in the west of Nova Scotia into Cape Breton County in the east. It was reinforced in Cape Breton in the first half of the 19th century with an influx of Highland Scots numbering approximately 50,000 as a result of the Highland Clearances.
Gaelic speakers, however, tended to be poor; they were largely illiterate and had little access to education. This situation still obtained in the early twentieth century. In 1921 Gaelic was approved as an optional subject in the curriculum of Nova Scotia, but few teachers could be found and children were discouraged from using the language in schools. By 1931 the number of Gaelic speakers in Nova Scotia had fallen to approximately 25,000, mostly in discrete pockets. In Cape Breton it was still a majority language, but the proportion was falling. Children were no longer being raised with Gaelic.
From 1939 on, attempts were made to strengthen its position in the public school system in Nova Scotia, but funding, official commitment and the availability of teachers continued to be a problem. By the 1950s the number of speakers was less than 7,000. The advent of multiculturalism in Canada in the 1960s meant that new educational opportunities became available, with a gradual strengthening of the language at secondary and tertiary level. At present several schools in Cape Breton offer Gaelic Studies and Gaelic language programs, and the language is taught at University College of Cape Breton.
The 2016 Canadian Census shows that there are only 40 reported speakers of Gaelic as a mother tongue in Cape Breton. On the other hand, there are families and individuals who have recommenced intergenerational transmission. They include fluent speakers from Gaelic-speaking areas of Scotland and speakers who became fluent in Nova Scotia and who in some cases studied in Scotland. Other revitalization activities include adult education, community cultural events and publishing.
The island is the 77th largest island in the world and Canada's 18th largest island. Cape Breton Island is composed mainly of rocky shores, rolling farmland, glacial valleys, barren headlands, mountains, woods, and plateaus. Geological evidence suggests that at least part of the island was once joined with present-day Scotland and Norway, now separated by millions of years of plate tectonics.
Cape Breton Island's northern portion is dominated by the Cape Breton Highlands, commonly shortened to simply the "Highlands", which are an extension of the Appalachian mountain chain. The Highlands comprise the northern portions of Inverness and Victoria counties. In 1936, the federal government established the Cape Breton Highlands National Park, covering the northern third of the Highlands. The Cabot Trail scenic highway also encircles the plateau's coastal perimeter.
Cape Breton Island's hydrological features include the Bras d'Or Lake system, a salt-water fjord at the heart of the island, and freshwater features including Lake Ainslie, the Margaree River system, and the Mira River. Innumerable smaller rivers and streams drain into the Bras d'Or Lake estuary and on to the Gulf of St. Lawrence and Atlantic coasts.
Cape Breton Island is joined to the mainland by the Canso Causeway, which was completed in 1955, enabling direct road and rail traffic to and from the island, but requiring marine traffic to pass through the Canso Canal at the eastern end of the causeway.
Cape Breton Island is divided into four counties: Cape Breton, Inverness, Richmond, and Victoria.
The climate is one of mild, often pleasantly warm summers and cold winters, although the proximity to the Atlantic Ocean and Gulf Stream moderates the extreme winter cold found on the mainland, especially on the east side that faces the Atlantic. Precipitation is abundant year round, with annual totals up to 60 inches on the eastern side facing the Atlantic storms. Considerable snowfall occurs in winter, especially in the highlands.
The island's residents can be grouped into five main cultures: Scottish, Mi'kmaq, Acadian, Irish, and English, with respective languages Scottish Gaelic, Mi'kmaq, French, and English. English is now the primary language, including a locally distinctive Cape Breton accent, while Mi'kmaq, Scottish Gaelic, and Acadian French are still spoken in some communities.
Later migrations of Black Loyalists, Italians, and Eastern Europeans mostly settled in the island's eastern part, around the industrial Cape Breton region. Cape Breton Island's population has been in decline for two decades, with an increasing exodus in recent years due to economic conditions.
According to the Census of Canada, the population of Cape Breton [Economic region] in 2016 / 2011 / 2006 / 1996 was 132,010 / 135,974 / 142,298 / 158,260.
Statistics Canada in 2001 reported a "religion" total of 145,525 for Cape Breton, including 5,245 with "no religious affiliation." Major categories included:
Much of the recent economic history of Cape Breton Island can be tied to the coal industry.
The island has two major coal deposits:
Sydney has traditionally been the main port, with facilities in a large, sheltered, natural harbour. It is the island's largest commercial centre and home to the "Cape Breton Post" daily newspaper, as well as one television station, CJCB-TV (CTV), and several radio stations. The Marine Atlantic terminal at North Sydney is the terminal for large ferries traveling to Channel-Port aux Basques and seasonally to Argentia, both on the island of Newfoundland.
Point Edward on the west side of Sydney Harbour is the location of Sydport, a former navy base now converted to commercial use. The Canadian Coast Guard College is nearby at Westmount. Petroleum, bulk coal, and cruise ship facilities are also in Sydney Harbour.
Glace Bay, the second largest urban community in population, was the island's main coal mining centre until its last mine closed in the 1980s. Glace Bay was the hub of the Sydney & Louisburg Railway and a major fishing port. At one time, Glace Bay was known as the largest town in Nova Scotia, based on population.
Port Hawkesbury has risen to prominence since the completion of the Canso Causeway and Canso Canal created an artificial deep-water port, allowing extensive petrochemical, pulp and paper, and gypsum handling facilities to be established. The Strait of Canso is completely navigable to Seawaymax vessels, and Port Hawkesbury is open to the deepest-draught vessels on the world's oceans. Large marine vessels may also enter Bras d'Or Lake through the Great Bras d'Or channel, and small craft can use the Little Bras d'Or channel or St. Peters Canal. While commercial shipping no longer uses the St. Peters Canal, it remains an important waterway for recreational vessels.
The industrial Cape Breton area faced several challenges with the closure of the Cape Breton Development Corporation's (DEVCO) coal mines and the Sydney Steel Corporation's (SYSCO) steel mill. In recent years, the Island's residents have tried to diversify the area economy by investing in tourism developments, call centres, and small businesses, as well as manufacturing ventures in fields such as auto parts, pharmaceuticals, and window glazings.
While the Cape Breton Regional Municipality is in transition from an industrial to a service-based economy, the rest of Cape Breton Island outside the industrial area surrounding Sydney-Glace Bay has been more stable, with a mixture of fishing, forestry, small-scale agriculture, and tourism.
Tourism in particular has grown throughout the post-Second World War era, especially with the growth of vehicle-based touring, which was furthered by the creation of the Cabot Trail scenic drive. The scenery of the island is rivalled in northeastern North America only by Newfoundland; Cape Breton Island tourism marketing places a heavy emphasis on its Scottish Gaelic heritage through events such as the Celtic Colours Festival, held each October, as well as promotions through the Gaelic College of Celtic Arts and Crafts.
Whale-watching is a popular attraction for tourists. Whale-watching cruises are operated by vendors from Baddeck to Cheticamp. The most popular species of whale found in Cape Breton's waters is the pilot whale.
The island's primary east-west road is Highway 105, the Trans-Canada Highway, although Trunk 4 is also heavily used. Highway 125 is an important arterial route around Sydney Harbour in the Cape Breton Regional Municipality. The Cabot Trail, circling the Cape Breton Highlands, and Trunk 19, along the island's western coast, are important secondary roads. The Cape Breton and Central Nova Scotia Railway maintains a railway connection between the port of Sydney and the Canadian National Railway in Truro.
The Cabot Trail is a scenic road circuit around and over the Cape Breton Highlands with spectacular coastal vistas; over 400,000 visitors drive the Cabot Trail each summer and fall. Coupled with the Fortress of Louisbourg, it has driven the growth of the tourism industry on the island in recent decades. The "Condé Nast" travel guide has rated Cape Breton Island as one of the world's best island destinations.
Cape Breton is well known for its traditional fiddle music, which was brought to North America by Scottish immigrants during the Highland Clearances. The traditional style has been well preserved in Cape Breton, and céilidhs have become a popular attraction for tourists. Inverness County in particular has a heavy concentration of musical activity, with regular performances in communities such as Mabou and Judique. Judique is recognized as 'Baile nam Fonn' (literally: Village of Tunes), or the 'Home of Celtic Music', featuring the Celtic Music Interpretive Centre. Performers who have received significant recognition outside of Cape Breton include Angus Chisholm; Buddy MacMaster; Joseph Cormier, the first Cape Breton fiddler to record an album released in Europe (1974); Lee Cremo; Bruce Guthro; Natalie MacMaster; Ashley MacIsaac; The Rankin Family; Aselin Debison; Gordie Sampson; Dawn and Margie Beaton, also known as "The Beaton Sisters"; and the Barra MacNeils. The Margaree area of Cape Breton is also a large contributor to the island's celebrated fiddle music. This traditional fiddle music of Cape Breton is studied by musicians around the world, and its global recognition continues to rise.
The Men of the Deeps are a male choral group of current and former miners from the industrial Cape Breton area.
Cape Breton artists who have been recognized with major national or international awards include actor Harold Russell of North Sydney, who won an Academy Award in 1946 for his portrayal of Homer Parrish in "The Best Years of Our Lives", and Lynn Coady and Linden MacIntyre of Inverness County, who are both past winners of the Giller Prize for Canadian literature. The Rankin Family and Rita MacNeil have recorded multiple albums certified as Double Platinum by Music Canada.
People from Cape Breton have also achieved a number of firsts in Canadian politics and governance. These include Mayann Francis of Whitney Pier, the first Black Lieutenant Governor of Nova Scotia, Isaac Phills of Sydney, Nova Scotia, the first person of African descent to be awarded the Order of Canada, and Elizabeth May of Margaree Harbour, the first member of the Green Party of Canada elected to the House of Commons of Canada.
Cape Breton Island is also home to YouTube weather forecaster Frankie MacDonald, who has over 200,000 subscribers. He accurately predicted a magnitude 7 earthquake in New Zealand in November 2016.
American artists such as sculptor Richard Serra, composer Philip Glass, and abstract painter John Beardman have spent part of the year on Cape Breton Island.
Steve Arbuckle is a Canadian actor born on Cape Breton Island.
Director Ashley McKenzie's 2016 film "Werewolf" is set on the island and features local actors; McKenzie grew up on the island.
Bruce Guthro, lead singer and guitarist of the Scottish Celtic band Runrig (which disbanded in 2018 after 46 years), resides in Hammonds Plains, Nova Scotia.
Dylan Guthro is a musician and the son of Bruce Guthro. | https://en.wikipedia.org/wiki?curid=5724 |
Cthulhu Mythos
The Cthulhu Mythos is a shared fictional universe, originating in the works of American horror writer H. P. Lovecraft. The term was coined by August Derleth, a contemporary correspondent and protégé of Lovecraft, to identify the settings, tropes, and lore that were employed by Lovecraft and his literary successors. The name "Cthulhu" derives from the central creature in Lovecraft's seminal short story, "The Call of Cthulhu", first published in the pulp magazine "Weird Tales" in 1928.
Richard L. Tierney, a writer who also wrote Mythos tales, later applied the term "Derleth Mythos" to distinguish Lovecraft's works from Derleth's later stories, which modify key tenets of the Mythos. Authors of Lovecraftian horror in particular frequently use elements of the Cthulhu Mythos.
In his essay "H. P. Lovecraft and the Cthulhu Mythos", Robert M. Price described two stages in the development of the Cthulhu Mythos. Price called the first stage the "Cthulhu Mythos proper." This stage was formulated during Lovecraft's lifetime and was subject to his guidance. The second stage was guided by August Derleth who, in addition to publishing Lovecraft's stories after his death, attempted to categorize and expand the Mythos.
An ongoing theme in Lovecraft's work is the complete irrelevance of mankind in the face of the cosmic horrors that apparently exist in the universe. Lovecraft made frequent references to the "Great Old Ones", a loose pantheon of ancient, powerful deities from space who once ruled the Earth and have since fallen into a deathlike sleep. While these monstrous deities were present in almost all of Lovecraft's published work (his second short story "Dagon", published in 1919, is considered the start of the mythos), the first story to really expand the pantheon of Great Old Ones and its themes is "The Call of Cthulhu", which was published in 1928.
Lovecraft broke with other pulp writers of the time by having his main characters' minds deteriorate when afforded a glimpse of what exists outside their perceived reality. He emphasized the point by stating in the opening sentence of the story that "The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents."
Writer Dirk W. Mosig notes that Lovecraft was a "mechanistic materialist" who embraced the philosophy of "cosmic indifference" (Cosmicism). Lovecraft believed in a purposeless, mechanical, and uncaring universe. Human beings, with their limited faculties, can never fully understand this universe, and the cognitive dissonance caused by this revelation leads to insanity, in his view.
There have been attempts at categorizing this fictional group of beings. Phillip A. Schreffler argues that by carefully scrutinizing Lovecraft's writings, a workable framework emerges that outlines the entire "pantheon", from the unreachable "Outer Ones" (e.g., Azathoth, who occupies the centre of the universe) and "Great Old Ones" (e.g., Cthulhu, imprisoned on Earth in the sunken city of R'lyeh) to the lesser castes (the lowly slave shoggoths and the Mi-go).
David E. Schultz, however, believes that Lovecraft never meant to create a canonical Mythos but rather intended his imaginary pantheon to serve merely as a background element. Lovecraft himself humorously referred to his Mythos as "Yog Sothothery" (Dirk W. Mosig coincidentally suggested the term "Yog-Sothoth Cycle of Myth" be substituted for "Cthulhu Mythos"). At times, Lovecraft even had to remind his readers that his Mythos creations were entirely fictional.
The view that there was no rigid structure is expounded upon by S. T. Joshi, who said
Price, however, believed that Lovecraft's writings could at least be divided into categories, and identified three distinct themes: the "Dunsanian" (written in a style similar to that of Lord Dunsany), "Arkham" (occurring in Lovecraft's fictionalized New England setting), and "Cthulhu" (the cosmic tales) cycles. Writer Will Murray noted that while Lovecraft often used his fictional pantheon in the stories he ghostwrote for other authors, he reserved Arkham and its environs exclusively for those tales he wrote under his own name.
Although the Mythos was not formalized or acknowledged between them, Lovecraft did correspond and share story elements with other contemporary writers, including Clark Ashton Smith, Robert E. Howard, Robert Bloch, Frank Belknap Long, Henry Kuttner, Henry S. Whitehead, and Fritz Leiber, a group referred to as the "Lovecraft Circle."
For example, Robert E. Howard's character Friedrich Von Junzt reads Lovecraft's "Necronomicon" in the short story "The Children of the Night" (1931), and in turn Lovecraft mentions Howard's "Unaussprechlichen Kulten" in the stories "Out of the Aeons" (1935) and "The Shadow Out of Time" (1936). Many of Howard's original unedited "Conan" stories also involve parts of the Cthulhu Mythos.
Price denotes the second stage's commencement with August Derleth, with the principal difference between Lovecraft and Derleth being Derleth's use of hope and development of the idea that the Cthulhu mythos essentially represented a struggle between good and evil. Derleth is credited with creating the "Elder Gods." He stated:
Price believes that the basis for Derleth's system is found in Lovecraft: "Was Derleth's use of the rubric 'Elder Gods' so alien to Lovecraft's in "At the Mountains of Madness"? Perhaps not. In fact, this very story, along with some hints from "The Shadow over Innsmouth", provides the key to the origin of the 'Derleth Mythos'. For in "At the Mountains of Madness" is shown the history of a conflict between interstellar races, first among them the Elder Ones and the Cthulhu-spawn."
Derleth himself believed that Lovecraft wished for other authors to actively write about the Mythos as opposed to it being a discrete plot device within Lovecraft's own stories. Derleth expanded the boundaries of the Mythos by including any passing reference to another author's story elements by Lovecraft as part of the genre. Just as Lovecraft made passing reference to Clark Ashton Smith's "Book of Eibon", Derleth in turn added Smith's Ubbo-Sathla to the Mythos.
Derleth also attempted to connect the deities of the Mythos to the four elements ("air", "earth", "fire", and "water"), creating new beings representative of certain elements in order to legitimize his system of classification. Derleth created "Cthugha" as a sort of fire elemental when a fan, Francis Towner Laney, complained that he had neglected to include the element in his schema. Laney, the editor of "The Acolyte", had categorized the Mythos in an essay that first appeared in the Winter 1942 issue of the magazine.
Impressed by the glossary, Derleth asked Laney to rewrite it for publication in the Arkham House collection "Beyond the Wall of Sleep" (1943). Laney's essay ("The Cthulhu Mythos") was later republished in "Crypt of Cthulhu #32" (1985). In applying the elemental theory to beings that function on a cosmic scale (e.g., Yog-Sothoth) some authors created a fifth element that they termed "aethyr". | https://en.wikipedia.org/wiki?curid=5725 |
Crane shot
In filmmaking and video production, a crane shot is a shot taken by a camera on a moving crane or jib. Most cranes accommodate both the camera and an operator, but some can be moved by remote control. Camera cranes go back to the dawn of movie-making, and were frequently used in silent films to enhance the epic nature of large sets and massive crowds. Another use is to move up and away from the actors, a common way of ending a movie. Crane shots are often found in what are supposed to be emotional or suspenseful scenes. One example of this technique is the shots taken by remote cranes in the car-chase sequence of the 1985 film "To Live and Die in L.A.". Some filmmakers place the camera on a boom arm simply to make it easier to move around between ordinary set-ups.
The major supplier of cranes in the cinema of the United States throughout the 1940s, 1950s, and 1960s was the Chapman Company (later Chapman-Leonard of North Hollywood), supplanted by dozens of similar manufacturers around the world. The traditional design provided seats for both the director and the camera operator, and sometimes a third seat for the cinematographer as well. Large weights on the back of the crane compensate for the weight of the people riding the crane and must be adjusted carefully to avoid the possibility of accidents. During the 1960s, the tallest crane was the Chapman Titan crane, a massive design over 20 feet high that won an Academy Scientific & Engineering award. Most such cranes were manually operated, requiring an experienced boom operator who knew how to vertically raise, lower, and "crab" the camera alongside actors while the crane platform rolled on separate tracks. The crane operator and camera operator had to precisely coordinate their moves so that focus, pan, and camera position all started and stopped at the same time, requiring great skill and rehearsal.
Camera cranes may be small, medium, or large, depending on the load capacity and length of the loading arm. Historically, the first camera cranes lifted the camera together with the operator, and sometimes an assistant. The range of motion of the boom was restricted because of the high load capacity and the need to ensure operator safety. In recent years a camera crane boom tripod with a remote control has become popular. It carries on the boom only a movie or television camera, without an operator, and allows shooting from difficult positions, as the small load capacity makes it possible to achieve a long reach of the crane boom and relative freedom of movement. The operator controls the camera from the ground through a motorized panoramic head, using remote control and video surveillance by watching the image on a monitor. A separate category consists of telescopic camera cranes. These devices allow setting an arbitrary trajectory for the camera, eliminating the characteristic radial displacement of a jib crane that comes with traditional spanning shots.
Large camera cranes are almost indistinguishable from ordinary boom-type cranes, with the exception of special equipment for smoothly moving the boom and controlling noise. Small camera cranes and crane-trucks have a lightweight construction, often without a mechanical drive. The boom is balanced manually against a load-specific counterweight, facilitating manipulation. To improve usability and the repeatability of crane movements across different takes, the boom's axes of rotation are fitted with graduated scales and a pointer. In some cases, the camera crane is mounted on a dolly for even greater camera mobility; such devices are called crane trolleys. In modern films, robotic cranes with multiple actuators allow high-accuracy repeated movement of the camera in trick photography; some sources use the term motion control for these devices.
During the last few years, camera cranes have been miniaturized and costs have dropped so dramatically that most aspiring film makers have access to these tools. What was once a "Hollywood" effect is now available for under $400. Major manufacturers of camera cranes include ABC-Products, Cambo, Filmotechnic, Polecam, Panther, and Matthews Studio Equipment. | https://en.wikipedia.org/wiki?curid=5726 |
Chariots of Fire
Chariots of Fire is a 1981 British historical drama film. It is based on the true story of two athletes in the 1924 Olympics: Eric Liddell, a devout Scottish Christian who runs for the glory of God, and Harold Abrahams, an English Jew who runs to overcome prejudice.
The film was conceived and produced by David Puttnam, written by Colin Welland, and directed by Hugh Hudson. Ben Cross and Ian Charleson starred as Abrahams and Liddell, alongside Nigel Havers, Ian Holm, Lindsay Anderson, John Gielgud, Cheryl Campbell, and Alice Krige in supporting roles. It was nominated for seven Academy Awards and won four, including Best Picture and Best Original Screenplay. It is ranked 19th in the British Film Institute's list of Top 100 British films. The film is also notable for its memorable electronic theme tune by Vangelis, who won the Academy Award for Best Original Score.
The film's title was inspired by the line, "Bring me my Chariot of fire!", from the William Blake poem adapted into the British hymn "Jerusalem"; the hymn is heard at the end of the film. The original phrase "chariot(s) of fire" is from the Book of 2 Kings in the Bible.
In 1919, Harold Abrahams (Ben Cross) enters the University of Cambridge, where he experiences anti-Semitism from the staff, but enjoys participating in the Gilbert and Sullivan club. He becomes the first person to ever complete the Trinity Great Court Run, running around the college courtyard in the time it takes for the clock to strike 12, and achieves an undefeated string of victories in various national running competitions. Although focused on his running, he falls in love with Sybil (Alice Krige), a leading Gilbert and Sullivan soprano.
Eric Liddell (Ian Charleson), born in China of Scottish missionary parents, is in Scotland. His devout sister Jennie (Cheryl Campbell) disapproves of Liddell's plans to pursue competitive running, but Liddell sees running as a way of glorifying God before returning to China to work as a missionary.
When they first race against each other, Liddell beats Abrahams. Abrahams takes it poorly, but Sam Mussabini (Ian Holm), a professional trainer whom he had approached earlier, offers to take him on to improve his technique. This attracts criticism from the Cambridge college masters (John Gielgud and Lindsay Anderson), who allege it is not gentlemanly for an amateur to "play the tradesman" by employing a professional coach. Abrahams dismisses this concern, interpreting it as cover for anti-Semitic and class-based prejudice.
When Eric Liddell accidentally misses a church prayer meeting because of his running, his sister Jennie upbraids him and accuses him of no longer caring about God. Eric tells her that though he intends to return eventually to the China mission, he feels divinely inspired when running, and that not to run would be to dishonour God, saying "I believe that God made me for a purpose. But He also made me fast, and when I run, I feel His pleasure."
The two athletes, after years of training and racing, are accepted to represent Great Britain in the 1924 Olympics in Paris. Also accepted are Abrahams' Cambridge friends, Lord Andrew Lindsay (Nigel Havers), Aubrey Montague (Nicholas Farrell), and Henry Stallard (Daniel Gerroll).
While boarding the boat to France for the Olympics, Liddell discovers the heats for his 100-metre race will be on a Sunday. He refuses to run the race, despite strong pressure from the Prince of Wales and the British Olympic Committee, because his Christian convictions prevent him from running on the Sabbath.
A solution is found thanks to Liddell's teammate Lindsay, who, having already won a silver medal in the 400 metres hurdles, offers to give his place in the 400-metre race on the following Thursday to Liddell, who gratefully agrees. Liddell's religious convictions in the face of national athletic pride make headlines around the world.
Liddell delivers a sermon at the Paris Church of Scotland that Sunday, quoting from Isaiah 40 and ending with "But they that wait upon the Lord shall renew their strength; they shall mount up with wings as eagles; they shall run, and not be weary; and they shall walk, and not faint."
Abrahams is badly beaten by the heavily favoured United States runners in the 200-metre race. He knows his last chance for a medal will be the 100 metres. He competes in the race and wins. His coach Sam Mussabini is overcome that the years of dedication and training have paid off with an Olympic gold medal. Now Abrahams can get on with his life and reunite with his girlfriend Sybil, whom he had neglected for the sake of running.
Before Liddell's race, the American coach remarks dismissively to his runners that Liddell has little chance of doing well in his new, far longer 400-metre race. But one of the American runners, Jackson Scholz, hands Liddell a note of support quoting 1 Samuel 2:30: "He that honors Me I will honor". Liddell defeats the American favourites and wins the gold medal.
The British team returns home triumphant. As the film ends, onscreen text explains that Abrahams married Sybil and became the elder statesman of British athletics. Liddell went on to missionary work in China. All of Scotland mourned his death in 1945 in Japanese-occupied China.
The film depicts Abrahams as attending Gonville and Caius College, Cambridge with three other Olympic athletes: Henry Stallard, Aubrey Montague, and Lord Andrew Lindsay. Abrahams and Stallard were in fact students there and competed in the 1924 Olympics. Montague also competed in the Olympics as depicted, but he attended Oxford, not Cambridge. Aubrey Montague sent daily letters to his mother about his time at Oxford and the Olympics; these letters were the basis of Montague's narration in the film.
The character of Lindsay was based partially on Lord Burghley, a significant figure in the history of British athletics. Although Burghley did attend Cambridge, he was not a contemporary of Harold Abrahams, as Abrahams was an undergraduate from 1919 to 1923 and Burghley was at Cambridge from 1923 to 1927. One scene in the film depicts the Burghley-based "Lindsay" as practising hurdles on his estate with full champagne glasses placed on each hurdle – this was something the wealthy Burghley did, although he used matchboxes instead of champagne glasses. The fictional character of Lindsay was created when Douglas Lowe, who was Britain's third athletics gold medallist in the 1924 Olympics, was not willing to be involved with the film.
Another scene in the film recreates the Great Court Run, in which the runners attempt to run around the perimeter of the Great Court at Trinity College, Cambridge in the time it takes the clock to strike 12 at midday. The film shows Abrahams performing the feat for the first time in history. In fact, Abrahams never attempted this race, and at the time of filming the only person on record known to have succeeded was Lord Burghley, in 1927. In "Chariots of Fire", Lindsay, who is based on Lord Burghley, runs the Great Court Run with Abrahams in order to spur him on, and crosses the finish line just a moment too late. Since the film's release, the Great Court Run has also been successfully run by Trinity undergraduate Sam Dobin, in October 2007.
In the film, Eric Liddell is tripped up by a Frenchman in the 400-metre event of a Scotland–France international athletic meeting. He recovers, makes up a 20-metre deficit, and wins. This was based on fact, though the actual race was the 440 yards at a Triangular Contest meet between Scotland, England, and Ireland at Stoke-on-Trent in England in July 1923. His achievement was remarkable as he had already won the 100- and 220-yard events that day. The film also omits that it was Liddell who introduced Abrahams to Sam Mussabini; it alludes to this by having Abrahams first encounter Mussabini while watching Liddell race, but it suggests that Abrahams himself sought out Mussabini's assistance.
Abrahams and Liddell did race against each other once, but not quite as depicted in the film, which shows Liddell winning the final of the 100 yards against a shattered Abrahams at the 1923 AAA Championship at Stamford Bridge. In fact, they raced only in a heat of the 220 yards, which Liddell won, five yards ahead of Abrahams, who did not progress to the final. In the 100 yards, Abrahams was eliminated in the heats and never raced against Liddell, who won the finals of both races the next day.
Abrahams' fiancée is misidentified as Sybil Gordon, a soprano at the D'Oyly Carte Opera Company. In fact, in 1936, Abrahams married Sybil Evers, who sang at the D'Oyly Carte, but they did not meet until 1934. Also, in the film, Sybil is depicted as singing the role of Yum-Yum in "The Mikado", but neither Sybil Gordon nor Sybil Evers ever sang that role with D'Oyly Carte, although Evers was known for her charm in singing Peep-Bo, one of the two other "little maids from school". Harold Abrahams' love of and heavy involvement with Gilbert and Sullivan, as depicted in the film, is factual.
Liddell's sister was several years younger than she was portrayed in the film. Her disapproval of Liddell's track career was creative licence; she actually fully supported his sporting work. Jenny Liddell Somerville cooperated fully with the making of the film and has a brief cameo in the Paris Church of Scotland during Liddell's sermon.
At the memorial service for Harold Abrahams, which opens the film, Lord Lindsay mentions that he and Aubrey Montague are the only members of the 1924 Olympic team still alive. However, Montague died in 1948, 30 years before Abrahams' death.
The film takes some liberties with the events at the 1924 Olympics, including the events surrounding Liddell's refusal to race on a Sunday. In the film, he does not learn that the 100-metre heat is to be held on the Christian Sabbath until he is boarding the boat to Paris. In fact, the schedule was made public several months in advance. Liddell did, however, face immense pressure to run on that Sunday and to compete in the 100 metres; he was called before the British Olympic Committee, the Prince of Wales, and other grandees for a grilling, and his refusal to run made headlines around the world.
The decision to change races was nonetheless made well before embarking for Paris, and Liddell spent the intervening months training for the 400 metres, an event in which he had previously excelled. It is true, however, that Liddell's success in the Olympic 400 metres was largely unexpected.
The film depicts Lindsay, having already won a medal in the 400-metre hurdles, giving up his place in the 400-metre race for Liddell. In fact Burghley, on whom Lindsay is loosely based, was eliminated in the heats of the 110-metre hurdles (he would go on to win a gold medal in the 400-metre hurdles at the 1928 Olympics), and was not entered for the 400 metres.
The film reverses the order of Abrahams' 100-metre and 200-metre races at the Olympics. In reality, after winning the 100 metres, Abrahams ran the 200 metres but finished last, with Jackson Scholz taking the gold medal. In the film, before his triumph in the 100 metres, Abrahams is shown losing the 200 metres and being scolded by Mussabini. During the following scene, in which Abrahams speaks with his friend Montague while receiving a massage from Mussabini, a French newspaper clipping shows Scholz and Charley Paddock under a headline declaring the 200 metres a triumph for the United States, and Abrahams laments getting "beaten out of sight" in the 200. The film thus has Abrahams overcoming the disappointment of losing the 200 metres by going on to win the 100.
Eric Liddell actually also ran in the 200m race, and finished third, behind Paddock and Scholz. This was the only time in reality that Liddell and Abrahams competed in the same race. While their meeting in the 1923 AAA Championship in the film was fictitious, Liddell's record win in that race did spur Abrahams to train even harder.
Abrahams also won a silver medal as an opening runner for the 4 x 100 metres relay team, not shown in the film, and Aubrey Montague placed sixth in the steeplechase, as depicted.
In the film, the 100m bronze medallist is a character called "Tom Watson"; the real medallist was Arthur Porritt of New Zealand, who refused permission for his name to be used in the film, allegedly out of modesty, and his wish was accepted by the film's producers, even though his permission was not necessary. However, the brief back-story given for Watson, who is called up to the New Zealand team from the University of Oxford, substantially matches Porritt's history. With the exception of Porritt, all the runners in the 100m final are identified correctly when they line up for inspection by the Prince of Wales.
Jackson Scholz is depicted as handing Liddell an inspirational Bible-quotation message before the 400 metres final: "It says in the Old Book, 'He that honors me, I will honor.' Good luck." In reality, the note was from members of the British team, and was handed to Liddell before the race by his attending masseur at the team's Paris hotel. For dramatic purposes, screenwriter Welland asked Scholz if he could be depicted handing the note, and Scholz readily agreed, saying "Yes, great, as long as it makes me look good."
Producer David Puttnam was looking for a story in the mold of "A Man for All Seasons" (1966), about someone who follows his conscience, and felt that sport provided clear-cut situations of this kind. He discovered Eric Liddell's story by accident in 1977, when he happened upon a reference book on the Olympics while housebound with the flu in a rented house in Los Angeles.
Screenwriter Colin Welland, commissioned by Puttnam, did an enormous amount of research for his Academy Award-winning script. Among other things, he took out advertisements in London newspapers seeking memories of the 1924 Olympics, went to the National Film Archives for pictures and footage of the 1924 Olympics, and interviewed everyone involved who was still alive. Welland just missed Abrahams, who died 14 January 1978, but he did attend Abrahams' February 1978 memorial service, which inspired the present-day framing device of the film. Aubrey Montague's son saw Welland's newspaper ad and sent him copies of the letters his father had sent home – which gave Welland something to use as a narrative bridge in the film. Except for changes in the greetings of the letters from "Darling Mummy" to "Dear Mum" and the change from Oxford to Cambridge, all of the readings from Montague's letters are from the originals.
Welland's original script also featured, in addition to Eric Liddell and Harold Abrahams, a third protagonist, 1924 Olympic gold medallist Douglas Lowe, who was presented as a privileged aristocratic athlete. However, Lowe refused to have anything to do with the film, and his character was written out and replaced by the fictional character of Lord Andrew Lindsay.
Initial financing towards development costs was provided by Goldcrest Films, who then sold the project to Allied, but kept a percentage of the profits.
Ian Charleson wrote Eric Liddell's speech to the post-race workingmen's crowd at the Scotland v. Ireland races. Charleson, who had studied the Bible intensively in preparation for the role, told director Hugh Hudson that he didn't feel the portentous and sanctimonious scripted speech was either authentic or inspiring. Hudson and Welland allowed him to write words he personally found inspirational instead.
The film was slightly altered for the U.S. audience. A brief scene depicting a pre-Olympics cricket game between Abrahams, Liddell, Montague, and the rest of the British track team, which appears shortly after the beginning of the original film, was deleted for the American release. In the U.S., to avoid the initial G rating – which had been strongly associated with children's films and might have hindered box-office sales – a different scene was used, one depicting Abrahams and Montague arriving at a Cambridge railway station and encountering two World War I veterans who use an obscenity, so that the film would be given a PG rating.
Puttnam chose Hugh Hudson, a multiple award-winning advertising and documentary filmmaker who had never helmed a feature film, to direct "Chariots of Fire". Hudson and Puttnam had known each other since the 1960s, when Puttnam was an advertising executive and Hudson was making films for ad agencies. In 1977, Hudson had also been second-unit director on the Puttnam-produced film "Midnight Express".
Director Hugh Hudson was determined to cast young, unknown actors in all the major roles of the film, and to back them up by using veterans like John Gielgud, Lindsay Anderson, and Ian Holm as their supporting cast. Hudson and producer David Puttnam did months of fruitless searching for the perfect actor to play Eric Liddell. They then saw Scottish stage actor Ian Charleson performing the role of Pierre in the Royal Shakespeare Company's production of "Piaf", and knew immediately they had found their man. Unbeknownst to them, Charleson had heard about the film from his father, and desperately wanted to play the part, feeling it would "fit like a kid glove".
Ben Cross, who plays Harold Abrahams, was discovered while playing Billy Flynn in "Chicago". In addition to having a natural pugnaciousness, he had the desired ability to sing and play the piano. Cross was thrilled to be cast, and said he was moved to tears by the film's script.
20th Century Fox, which put up half of the production budget in exchange for distribution rights outside North America, insisted on having a couple of notable American names in the cast. Thus the small parts of the two American champion runners, Jackson Scholz and Charley Paddock, were cast with recent headliners: Brad Davis had recently starred in "Midnight Express" (also produced by Puttnam), and Dennis Christopher had recently starred, as a young bicycle racer, in the popular indie film "Breaking Away".
All of the actors portraying runners underwent a gruelling three-month training programme with renowned running coach Tom McNab. The training and the isolation of the actors also created a strong bond and sense of camaraderie among them.
Although the film is a period piece, set in the 1920s, the Academy Award-winning original soundtrack composed by Vangelis uses a modern 1980s electronic sound, with a strong use of synthesizer and piano among other instruments. This was a bold and significant departure from earlier period films, which employed sweeping orchestral instrumentals. The title theme of the film has become iconic, and has been used in subsequent films and television shows during slow-motion segments.
Vangelis, a Greek-born electronic composer who moved to Paris in the late 1960s, had been living in London since 1974. Director Hugh Hudson had collaborated with him on documentaries and commercials, and was particularly impressed with his 1979 albums "Opera Sauvage" and "China". David Puttnam also greatly admired Vangelis's body of work, having originally selected his compositions for his previous film "Midnight Express". Hudson settled on Vangelis and a modern score: "I knew we needed a piece which was anachronistic to the period to give it a feel of modernity. It was a risky idea but we went with it rather than have a period symphonic score." The soundtrack had a personal significance to Vangelis: after composing the iconic theme tune, he told Puttnam, "My father is a runner, and this is an anthem to him."
Hudson originally wanted Vangelis's 1977 tune "L'Enfant", from his "Opera Sauvage" album, to be the title theme of the film, and the beach running sequence was actually filmed with "L'Enfant" playing on loudspeakers for the runners to pace to. Vangelis eventually convinced Hudson he could create a new and better piece for the film's main theme – and when he played the now-iconic "Chariots of Fire" theme for Hudson, it was agreed the new tune was unquestionably better. The "L'Enfant" melody still made it into the film: when the athletes reach Paris and enter the stadium, a brass band marches through the field playing a modified, acoustic arrangement of the piece. Vangelis's electronic "L'Enfant" track was eventually used prominently in the 1982 film "The Year of Living Dangerously".
Some pieces of Vangelis's music in the film did not end up on the film's soundtrack album. One of them is the background music to the race Eric Liddell runs in the Scottish highlands. This piece is a version of "Hymne", the original version of which appears on Vangelis's 1979 album "Opéra sauvage". Various versions also appear on Vangelis's compilation albums, including "Themes" and "Portraits", though none of these include the version used in the film.
Five lively Gilbert and Sullivan tunes also appear in the soundtrack, and serve as jaunty period music which counterpoints Vangelis's modern electronic score. These are: "He is an Englishman" from "H.M.S. Pinafore", "Three Little Maids from School Are We" from "The Mikado", "With Catlike Tread" from "The Pirates of Penzance", "The Soldiers of Our Queen" from "Patience", and "There Lived a King" from "The Gondoliers".
The film also incorporates a major traditional work: "Jerusalem", sung by a British choir at the 1978 funeral of Harold Abrahams. The words, written by William Blake in 1804–08, were set to music by Hubert Parry in 1916 as a celebration of England. The hymn, which has been described as "England's unofficial national anthem", concludes the film and inspired its title. A handful of other traditional anthems and hymns and period-appropriate instrumental ballroom-dance music round out the film's soundtrack.
The beach scenes associated with the theme tune were filmed at West Sands, St Andrews. A plaque commemorating the filming can be found there today. The very last scene of the opening titles, of the athletes running to the Carlton Hotel in Broadstairs, Kent, was filmed at the 18th hole of the Old Course at St Andrews Links.
All of the Cambridge scenes were actually filmed at Hugh Hudson's alma mater Eton College, because Cambridge refused filming rights, fearing depictions of anti-Semitism. The Cambridge administration greatly regretted the decision after the film's enormous success.
Liverpool Town Hall was the setting for the scenes depicting the British Embassy in Paris. The Colombes Olympic Stadium in Paris was represented by the Oval Sports Centre, Bebington, Merseyside. The nearby Woodside ferry terminal was used to represent the embarkation scenes set in Dover. The railway station scenes were filmed in York, using locomotives from the National Railway Museum. The scene depicting a performance of "The Mikado" was filmed in the Royal Court Theatre, Liverpool with members of the D'Oyly Carte Opera Company who were on tour.
"Chariots of Fire" became a recurring theme in promotions for the 2012 Summer Olympics in London. The film's theme tune was featured at the opening of the 2012 London New Years fireworks celebrating the Olympics, and the film's iconic beach-running scene and theme tune were used in "The Sun"'s "Let's Make It Great, Britain" Olympic ads. The runners who first tested the new Olympic Park were spurred on by the "Chariots of Fire" theme tune, and the iconic music was also used to fanfare the carriers of the Olympic flame on parts of its route through the UK. The beach-running sequence was also recreated at St. Andrews and filmed as part of the Olympic torch relay.
The film's theme was also performed by the London Symphony Orchestra, conducted by Simon Rattle, during the Opening Ceremony of the games; the performance was accompanied by a comedy skit by Rowan Atkinson (as Mr Bean) which included the opening beach-running footage from the film. The film's theme tune was also played during each medal ceremony of the 2012 Olympics.
A stage adaptation of "Chariots of Fire" was mounted in honour of the 2012 Olympics. The play, "Chariots of Fire", which was adapted by playwright Mike Bartlett and included the iconic Vangelis score, ran from 9 May to 16 June 2012 at London's Hampstead Theatre, and transferred to the Gielgud Theatre in the West End on 23 June, where it ran until 5 January 2013. It starred Jack Lowden as Eric Liddell and James McArdle as Harold Abrahams, and Edward Hall directed. Stage designer Miriam Buether transformed each theatre into an Olympic stadium, and composer Jason Carr wrote additional music. Vangelis also created several new pieces of music for the production. The stage version for the London Olympic year was the idea of the film's director, Hugh Hudson, who co-produced the play; he stated, "Issues of faith, of refusal to compromise, standing up for one's beliefs, achieving something for the sake of it, with passion, and not just for fame or financial gain, are even more vital today."
Another play, "Running for Glory", written by Philip Dart, based on the 1924 Olympics, and focusing on Abrahams and Liddell, toured parts of Britain from 25 February to 1 April 2012. It starred Nicholas Jacobs as Harold Abrahams, and Tom Micklem as Eric Liddell.
As an official part of the London 2012 Festival celebrations, a new digitally re-mastered version of the film screened in 150 cinemas throughout the UK. The re-release began 13 July 2012, two weeks before the opening ceremony of the London Olympics.
A Blu-ray of the film was released on 10 July 2012 in North America, and was released 16 July 2012 in the UK. The release includes nearly an hour of special features, a CD sampler, and a 32-page "digibook".
Since its release, "Chariots of Fire" has received generally positive reviews from critics. The film holds an 85% "Certified Fresh" rating on the review aggregator website Rotten Tomatoes, based on 71 reviews, with a weighted average of 7.64/10. The site's consensus reads: "Decidedly slower and less limber than the Olympic runners at the center of its story, the film nevertheless manages to make effectively stirring use of its spiritual and patriotic themes."
For its 2012 re-release, Kate Muir of "The Times" gave the film five stars, writing: "In a time when drug tests and synthetic fibres have replaced gumption and moral fibre, the tale of two runners competing against each other in the 1924 Olympics has a simple, undiminished power. From the opening scene of pale young men racing barefoot along the beach, full of hope and elation, backed by Vangelis's now famous anthem, the film is utterly compelling."
"Chariots of Fire" was very successful at the 54th Academy Awards, winning four of seven nominations. When accepting his Oscar for Best Original Screenplay, Colin Welland famously announced "The British are coming". At the 1981 Cannes Film Festival the film won two awards and competed for the Palme d'Or.
| https://en.wikipedia.org/wiki?curid=5729 |
Consequentialism
Consequentialism is the class of normative ethical theories holding that the consequences of one's conduct are the ultimate basis for any judgment about the rightness or wrongness of that conduct. Thus, from a consequentialist standpoint, a morally right act (or omission from acting) is one that will produce a good outcome, or consequence.
Consequentialism is primarily non-prescriptive: the moral worth of an action is determined by its consequences, not by whether it follows a set of written edicts or laws. For example, a consequentialist might judge it morally right to lie under oath, despite the threat of government punishment, in order to save an innocent person's life.
Consequentialism is usually contrasted with deontological ethics (or "deontology"), in which rules and moral duty are central: deontology derives the rightness or wrongness of one's conduct from the character of the behaviour itself rather than from its outcomes. Consequentialism is also contrasted with virtue ethics, which focuses on the character of the agent rather than on the nature or consequences of the act (or omission) itself, and with pragmatic ethics, which treats morality like science: advancing socially over the course of many lifetimes, such that any moral criterion is subject to revision.
Consequentialist theories differ in how they define moral goods. Some argue that consequentialist and deontological theories are not necessarily mutually exclusive. For example, T. M. Scanlon advances the idea that human rights, which are commonly considered a "deontological" concept, can only be justified with reference to the consequences of having those rights. Similarly, Robert Nozick argued for a theory that is mostly consequentialist but incorporates inviolable "side-constraints" restricting the sorts of actions agents are permitted to perform. Derek Parfit argued that, properly understood, rule consequentialism, Kantian deontology, and contractualism would all end up prescribing the same behaviour.
State consequentialism, also known as Mohist consequentialism, is an ethical theory which evaluates the moral worth of an action based on how much it contributes to the welfare of a state. According to the "Stanford Encyclopedia of Philosophy", Mohist consequentialism, dating back to the 5th century BCE, is the "world's earliest form of consequentialism, a remarkably sophisticated version based on a plurality of intrinsic goods taken as constitutive of human welfare".
Unlike utilitarianism, which views utility as the sole moral good, "the basic goods in Mohist consequentialist thinking are... order, material wealth, and increase in population". During Mozi's era, war and famines were common, and population growth was seen as a moral necessity for a harmonious society. The "material wealth" of Mohist consequentialism refers to basic needs like shelter and clothing, and the "order" of Mohist consequentialism refers to Mozi's stance against warfare and violence, which he viewed as pointless and a threat to social stability. Stanford sinologist David Shepherd Nivison, in "The Cambridge History of Ancient China", writes that the moral goods of Mohism "are interrelated: more basic wealth, then more reproduction; more people, then more production and wealth... if people have plenty, they would be good, filial, kind, and so on unproblematically".
The Mohists believed that morality is based on "promoting the benefit of all under heaven and eliminating harm to all under heaven". In contrast to Jeremy Bentham's views, state consequentialism is not utilitarian because it is not hedonistic or individualistic: the importance of outcomes that are good for the community outweighs the importance of individual pleasure and pain. The term state consequentialism has also been applied to the political philosophy of the Confucian philosopher Xunzi.
On the other hand, the "Legalist" Han Fei "is motivated almost totally from the ruler's point of view".
Jeremy Bentham held that people are driven by their interests and their fears, but that their interests take precedence over their fears, and that their interests are pursued in accordance with how people view the consequences that might be involved with those interests. "Happiness" on this account is defined as the maximization of pleasure and the minimization of pain. It can be argued that the existence of phenomenal consciousness and "qualia" is required for the experience of pleasure or pain to have ethical significance. Historically, hedonistic utilitarianism is the paradigmatic example of a consequentialist moral theory. This form of utilitarianism holds that what matters is the aggregate happiness: the happiness of everyone, and not the happiness of any particular person. John Stuart Mill, in his exposition of hedonistic utilitarianism, proposed a hierarchy of pleasures, meaning that the pursuit of certain kinds of pleasure is more highly valued than the pursuit of others. However, some contemporary utilitarians, such as Peter Singer, are concerned with maximizing the satisfaction of preferences, hence "preference utilitarianism". Other contemporary forms of utilitarianism mirror the forms of consequentialism outlined below.
Ethical egoism can be understood as a consequentialist theory according to which the consequences for the individual agent are taken to matter more than any other result. Thus, egoism will prescribe actions that may be beneficial, detrimental, or neutral to the welfare of others. Some, like Henry Sidgwick, argue that a certain degree of egoism "promotes" the general welfare of society for two reasons: because individuals know how to please themselves best, and because if everyone were an austere altruist then general welfare would inevitably decrease.
Ethical altruism can be seen as a consequentialist ethic which prescribes that an individual take actions that have the best consequences for everyone except for himself. This was advocated by Auguste Comte, who coined the term "altruism," and whose ethics can be summed up in the phrase "Live for others".
In general, consequentialist theories focus on actions. However, this need not be the case. Rule consequentialism is a theory that is sometimes seen as an attempt to reconcile deontology and consequentialism—and in some cases, this is stated as a criticism of rule consequentialism. Like deontology, rule consequentialism holds that moral behavior involves following certain rules. However, rule consequentialism chooses rules based on the consequences that the selection of those rules has. Rule consequentialism exists in the forms of rule utilitarianism and rule egoism.
Various theorists are split as to whether the rules are the only determinant of moral behavior. For example, Robert Nozick held that a certain set of minimal rules, which he calls "side-constraints", are necessary to ensure appropriate actions. There are also differences as to how absolute these moral rules are. Thus, while Nozick's side-constraints are absolute restrictions on behavior, Amartya Sen proposes a theory that recognizes the importance of certain rules without treating them as absolute: they may be violated if strict adherence would lead to much more undesirable consequences.
One of the most common objections to rule-consequentialism is that it is incoherent, because it is based on the consequentialist principle that what we should be concerned with is maximizing the good, but then it tells us not to act to maximize the good, but to follow rules (even in cases where we know that breaking the rule could produce better results).
Brad Hooker avoided this objection by not basing his form of rule-consequentialism on the ideal of maximizing the good. He writes:
…the best argument for rule-consequentialism is not that it derives from an overarching commitment to maximise the good. The best argument for rule-consequentialism is that it does a better job than its rivals of matching and tying together our moral convictions, as well as offering us help with our moral disagreements and uncertainties.
Derek Parfit described Brad Hooker's book on rule-consequentialism "Ideal Code, Real World" as the "best statement and defence, so far, of one of the most important moral theories".
Rule-consequentialism may offer a means to reconcile pure consequentialism with deontological, or rules-based ethics.
The two-level approach involves engaging in critical reasoning and considering all the possible ramifications of one's actions before making an ethical decision, but reverting to generally reliable moral rules when one is not in a position to stand back and examine the dilemma as a whole. In practice, this equates to adhering to rule consequentialism when one can only reason on an intuitive level, and to act consequentialism when in a position to stand back and reason on a more critical level.
This position can be described as a reconciliation between act consequentialism – in which the morality of an action is determined by that action's effects – and rule consequentialism – in which moral behavior is derived from following rules that lead to positive outcomes.
The two-level approach to consequentialism is most often associated with R. M. Hare and Peter Singer.
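As a purely illustrative aside, the two-level procedure can be read as a simple decision rule: assess consequences directly when careful deliberation is possible, and fall back on generally reliable intuitive rules when it is not. The minimal Python sketch below is a hypothetical toy model, not a formalization owed to Hare or Singer; the rule table, the scoring function, and the "can_deliberate" flag are all invented assumptions.

```python
# Toy model of two-level consequentialism: judge acts by their assessed
# consequences when deliberation is possible ("critical level"), otherwise
# follow generally reliable rules ("intuitive level").
# The rule table and scores are invented for illustration only.

INTUITIVE_RULES = {"break_promise": False, "keep_promise": True}  # permitted?

def consequences_score(action: str) -> float:
    """Hypothetical stand-in for a careful assessment of outcomes.
    Here, breaking the promise happens to produce the better outcome."""
    return {"break_promise": 4.0, "keep_promise": -1.0}[action]

def permissible(action: str, can_deliberate: bool) -> bool:
    if can_deliberate:
        # Critical level: judge the act directly by its consequences.
        return consequences_score(action) > 0
    # Intuitive level: fall back on the generally reliable rule.
    return INTUITIVE_RULES[action]

print(permissible("break_promise", can_deliberate=True))   # True: by consequences
print(permissible("break_promise", can_deliberate=False))  # False: by the rule
```

The divergence between the two calls mirrors the point of the two-level view: the intuitive and critical levels can disagree about a particular act, and which verdict applies depends on the agent's position to deliberate.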
Another version is motive consequentialism, which looks at whether the state of affairs resulting from the motive behind choosing an action is better than, or at least as good as, each of the alternative states of affairs that would have resulted from alternative actions. This version gives relevance to the motive of an act and links it to its consequences. An act can therefore not be wrong if the decision to act was based on a right motive. A possible inference is that one cannot be blamed for mistaken judgements if the motivation was to do good.
Most consequentialist theories focus on "promoting" some sort of good consequences. However, negative utilitarianism lays out a consequentialist theory that focuses solely on minimizing bad consequences.
One major difference between these two approaches is the agent's responsibility. Positive consequentialism demands that we bring about good states of affairs, whereas negative consequentialism requires that we avoid bad ones. Stronger versions of negative consequentialism require active intervention to prevent bad outcomes and ameliorate existing harm. In weaker versions, simple forbearance from acts tending to harm others is sufficient. An example of this is the slippery slope argument, which encourages others to avoid a specified act on the grounds that it may ultimately lead to undesirable consequences.
Often "negative" consequentialist theories assert that reducing suffering is more important than increasing pleasure. Karl Popper, for example, claimed "…from the moral point of view, pain cannot be outweighed by pleasure...". (While Popper is not a consequentialist per se, this is taken as a classic statement of negative utilitarianism.) When considering a theory of justice, negative consequentialists may use a statewide or global-reaching principle: the reduction of suffering (for the disadvantaged) is more valuable than increased pleasure (for the affluent or luxurious).
Teleological ethics (Greek "telos", "end"; "logos", "science") is an ethical theory that holds that the ends or consequences of an act determine whether it is good or evil. Teleological theories are often discussed in opposition to deontological ethical theories, which hold that acts themselves are "inherently" good or evil, regardless of their consequences. The saying "the end justifies the means" captures this outlook: if a goal is morally important enough, any method of achieving it is acceptable.
Teleological theories differ on the nature of the end that actions ought to promote. Eudaemonist theories (Greek eudaimonia, "happiness") hold that the goal of ethics consists in some function or activity appropriate to man as a human being, and thus tend to emphasize the cultivation of virtue or excellence in the agent as the end of all action. These could be the classical virtues—courage, temperance, justice, and wisdom—that promoted the Greek ideal of man as the "rational animal", or the theological virtues—faith, hope, and love—that distinguished the Christian ideal of man as a being created in the image of God.
Utilitarian-type theories hold that the end consists in an experience or feeling produced by the action. Hedonism, for example, teaches that this feeling is pleasure—either one's own, as in egoism (the 17th-century English philosopher Thomas Hobbes), or everyone's, as in universalistic hedonism, or utilitarianism (the 19th-century English philosophers Jeremy Bentham, John Stuart Mill, and Henry Sidgwick), with its formula of the "greatest happiness of the greatest number".
Other utilitarian-type views include the claims that the end of action is survival and growth, as in evolutionary ethics (the 19th-century English philosopher Herbert Spencer); the experience of power, as in despotism; satisfaction and adjustment, as in pragmatism (20th-century American philosophers Ralph Barton Perry and John Dewey); and freedom, as in existentialism (the 20th-century French philosopher Jean-Paul Sartre).
The chief problem for eudaemonist theories is to show that leading a life of virtue will also be attended by happiness, by the winning of the goods regarded as the chief end of action. That Job should suffer, and Socrates and Jesus die, while the wicked prosper would then seem unjust. Eudaemonists generally reply that the universe is moral and that, in Socrates' words, "No evil can happen to a good man, either in life or after death," or, in Jesus' words, "But he who endures to the end will be saved." (Matt 10:22).
Utilitarian theories, on the other hand, must answer the charge that ends do not justify the means. The problem arises in these theories because they tend to separate the achieved ends from the action by which these ends were produced. One implication of utilitarianism is that one's intention in performing an act may include all of its foreseen consequences. The goodness of the intention then reflects the balance of the good and evil of these consequences, with no limits imposed upon it by the nature of the act itself—even if it be, say, the breaking of a promise or the execution of an innocent man. Utilitarianism, in answering this charge, must show either that what is apparently immoral is not really so or that, if it really is so, then closer examination of the consequences will bring this fact to light. Ideal utilitarianism (G.E. Moore and Hastings Rashdall) tries to meet the difficulty by advocating a plurality of ends and including among them the attainment of virtue itself, which, as John Stuart Mill affirmed, "may be felt a good in itself, and desired as such with as great intensity as any other good".
Since pure consequentialism holds that an action is to be judged solely by its result, most consequentialist theories hold that a deliberate action is no different from a deliberate decision not to act. This contrasts with the "acts and omissions doctrine", which is upheld by some medical ethicists and some religions: it asserts there is a significant moral distinction between acts and deliberate non-actions which lead to the same outcome. This contrast is brought out in issues such as voluntary euthanasia.
One important characteristic of many normative moral theories such as consequentialism is the ability to produce practical moral judgements. At the very least, any moral theory needs to define the standpoint from which the goodness of the consequences are to be determined. What is primarily at stake here is the "responsibility" of the agent.
One common tactic among consequentialists, particularly those committed to an altruistic (selfless) account of consequentialism, is to employ an ideal, neutral observer from which moral judgements can be made. John Rawls, a critic of utilitarianism, argues that utilitarianism, in common with other forms of consequentialism, relies on the perspective of such an ideal observer. The particular characteristics of this ideal observer can vary from an omniscient observer, who would grasp all the consequences of any action, to an ideally informed observer, who knows as much as could reasonably be expected, but not necessarily all the circumstances or all the possible consequences. Consequentialist theories that adopt this paradigm hold that right action is the action that will bring about the best consequences from this ideal observer's perspective.
In practice, it is very difficult, and at times arguably impossible, to adopt the point of view of an ideal observer. Individual moral agents do not know everything about their particular situations, and thus do not know all the possible consequences of their potential actions. For this reason, some theorists have argued that consequentialist theories can only require agents to choose the best action in line with what they know about the situation. However, if this approach is naïvely adopted, then moral agents who, for example, recklessly fail to reflect on their situation, and act in a way that brings about terrible results, could be said to be acting in a morally justifiable way. Acting in a situation without first informing oneself of the circumstances of the situation can lead to even the most well-intended actions yielding miserable consequences. As a result, it could be argued that there is a moral imperative for an agent to inform himself as much as possible about a situation before judging the appropriate course of action. This imperative, of course, is derived from consequential thinking: a better-informed agent is able to bring about better consequences.
Moral action always has consequences for certain people or things. Varieties of consequentialism can be differentiated by the beneficiary of the good consequences. That is, one might ask "Consequences for whom?"
A fundamental distinction can be drawn between theories which require that agents act for ends perhaps disconnected from their own interests and drives, and theories which permit that agents act for ends in which they have some personal interest or motivation. These are called "agent-neutral" and "agent-focused" theories respectively.
Agent-neutral consequentialism ignores the specific value a state of affairs has for any particular agent. Thus, in an agent-neutral theory, an actor's personal goals do not count any more than anyone else's goals in evaluating what action the actor should take. Agent-focused consequentialism, on the other hand, focuses on the particular needs of the moral agent. Thus, in an agent-focused account, such as one that Peter Railton outlines, the agent might be concerned with the general welfare, but the agent is "more" concerned with the immediate welfare of herself and her friends and family.
These two approaches could be reconciled by acknowledging the tension between an agent's interests as an individual and as a member of various groups, and seeking to somehow optimize among all of these interests. For example, it may be meaningful to speak of an action as being good for someone as an individual, but bad for them as a citizen of their town.
Many consequentialist theories may seem primarily concerned with human beings and their relationships with other human beings. However, some philosophers argue that we should not limit our ethical consideration to the interests of human beings alone. Jeremy Bentham, who is regarded as the founder of utilitarianism, argues that animals can experience pleasure and pain, thus demanding that 'non-human animals' should be a serious object of moral concern. More recently, Peter Singer has argued that it is unreasonable that we do not give equal consideration to the interests of animals as to those of human beings when we choose the way we are to treat them. Such equal consideration does not necessarily imply identical treatment of humans and non-humans, any more than it necessarily implies identical treatment of all humans.
One way to divide various consequentialisms is by the types of consequences that are taken to matter most, that is, which consequences count as good states of affairs. According to utilitarianism, a good action is one that results in an increase in pleasure, and the best action is one that results in the most pleasure for the greatest number. Closely related is eudaimonic consequentialism, according to which a full, flourishing life, which may or may not be the same as enjoying a great deal of pleasure, is the ultimate aim. Similarly, one might adopt an aesthetic consequentialism, in which the ultimate aim is to produce beauty. However, one might fix on non-psychological goods as the relevant effect. Thus, one might pursue an increase in material equality or political liberty instead of something like the more ephemeral "pleasure". Other theories adopt a package of several goods, all to be promoted equally. Because the consequentialist approach assumes that the outcomes of a moral decision can be quantified in terms of "goodness" or "badness", or at least ranked in order of preference, it is especially well suited to a probabilistic, decision-theoretic treatment.
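To make that decision-theoretic reading concrete, here is a minimal Python sketch, offered only as an illustration of the general idea rather than any particular philosopher's theory; the actions, outcome probabilities, and "goodness" scores are invented assumptions.

```python
# Illustrative decision-theoretic reading of act consequentialism:
# each action has possible outcomes with probabilities and "goodness"
# scores, and the right action is the one maximizing expected goodness.
# All numbers are invented for illustration only.

actions = {
    "keep_promise":  [(0.9, 5.0), (0.1, -1.0)],  # (probability, goodness)
    "break_promise": [(0.5, 8.0), (0.5, -6.0)],
}

def expected_goodness(outcomes):
    """Expected value of an action's outcome goodness."""
    return sum(p * g for p, g in outcomes)

for name, outcomes in actions.items():
    print(f"{name}: expected goodness = {expected_goodness(outcomes):+.2f}")

best = max(actions, key=lambda a: expected_goodness(actions[a]))
print("Consequentialist choice:", best)
```

A negative consequentialist would weight the negative entries more heavily before aggregating, and a rule consequentialist would apply the same arithmetic to candidate rules rather than to individual acts.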
Consequentialism can also be contrasted with aretaic moral theories such as virtue ethics. Whereas consequentialist theories posit that consequences of action should be the primary focus of our thinking about ethics, virtue ethics insists that it is the character rather than the consequences of actions that should be the focal point. Some virtue ethicists hold that consequentialist theories totally disregard the development and importance of moral character. For example, Philippa Foot argues that consequences in themselves have no ethical content, unless it has been provided by a virtue such as benevolence.
However, consequentialism and virtue ethics need not be entirely antagonistic. Iain King has developed an approach that reconciles the two schools. Other consequentialists consider effects on the character of people involved in an action when assessing consequence. Similarly, a consequentialist theory may aim at the maximization of a particular virtue or set of virtues. Finally, following Foot's lead, one might adopt a sort of consequentialism that argues that virtuous activity ultimately produces the best consequences.
The "ultimate end" is a concept in the moral philosophy of Max Weber, in which individuals act in a faithful, rather than rational, manner.
The term "consequentialism" was coined by G. E. M. Anscombe in her essay "Modern Moral Philosophy" in 1958, to describe what she saw as the central error of certain moral theories, such as those propounded by Mill and Sidgwick.
The phrase and concept of "the end justifies the means" are at least as old as the first century BC. Ovid wrote in his "Heroides" that "Exitus acta probat" ("The result justifies the deed").
G. E. M. Anscombe objects to the consequentialism of Sidgwick on the grounds that the moral worth of an action is premised on the predictive capabilities of the individual, relieving them of the responsibility for the "badness" of an act should they "make out a case for not having foreseen" negative consequences.
The future amplification of the effects of small decisions is an important factor that makes it more difficult to predict the ethical value of consequences, even though most would agree that only predictable consequences carry moral responsibility.
Bernard Williams has argued that consequentialism is alienating because it requires moral agents to put too much distance between themselves and their own projects and commitments. Williams argues that consequentialism requires moral agents to take a strictly impersonal view of all actions, since it is only the consequences, and not who produces them, that are said to matter. Williams argues that this demands too much of moral agents—since (he claims) consequentialism demands that they be willing to sacrifice any and all personal projects and commitments in any given circumstance in order to pursue the most beneficent course of action possible. He argues further that consequentialism fails to make sense of intuitions that it can matter whether or not someone is personally the author of a particular consequence. For example, that participating in a crime can matter, even if the crime would have been committed anyway, or would even have been worse, without the agent's participation.
Some consequentialists—most notably Peter Railton—have attempted to develop a form of consequentialism that acknowledges and avoids the objections raised by Williams. Railton argues that Williams's criticisms can be avoided by adopting a form of consequentialism in which moral decisions are to be determined by the sort of life that they express. On his account, the agent should choose the sort of life that will, on the whole, produce the best overall effects.
A much longer list of utilitarian consequentialists may be found in the list of utilitarians. | https://en.wikipedia.org/wiki?curid=5734 |
Conscription
Conscription, sometimes called the draft, is the compulsory enlistment of people in a national service, most often a military service. Conscription dates back to antiquity and it continues in some countries to the present day under various names. The modern system of near-universal national conscription for young men dates to the French Revolution in the 1790s, where it became the basis of a very large and powerful military. Most European nations later copied the system in peacetime, so that men at a certain age would serve 1–8 years on active duty and then transfer to the reserve force.
Conscription is controversial for a range of reasons, including conscientious objection to military engagements on religious or philosophical grounds; political objection, for example to service for a disliked government or unpopular war; and ideological objection, for example, to a perceived violation of individual rights. Those conscripted may evade service, sometimes by leaving the country, and seeking asylum in another country. Some selection systems accommodate these attitudes by providing alternative service outside combat-operations roles or even outside the military, such as "Siviilipalvelus" (alternative civil service) in Finland, "Zivildienst" (compulsory community service) in Austria and Switzerland. Several countries conscript male soldiers not only for armed forces, but also for paramilitary agencies, which are dedicated to police-like "domestic only" service like internal troops, border guards or "non-combat" rescue duties like civil defence.
As of the early 21st century, many states no longer conscript soldiers, relying instead upon professional militaries with volunteers. The ability to rely on such an arrangement, however, presupposes some degree of predictability with regard to both war-fighting requirements and the scope of hostilities. Many states that have abolished conscription still, therefore, reserve the power to resume conscription during wartime or times of crisis. States involved in wars or interstate rivalries are most likely to implement conscription, and democracies are less likely than autocracies to implement conscription. Former British colonies are less likely to have conscription, as they are influenced by British anti-conscription norms that can be traced back to the English Civil War.
Around the reign of Hammurabi (1791–1750 BC), the Babylonian Empire used a system of conscription called "Ilkum". Under that system those eligible were required to serve in the royal army in time of war. During times of peace they were instead required to provide labour for other activities of the state. In return for this service, people subject to it gained the right to hold land. It is possible that this right was not to hold land "per se" but to hold specific land supplied by the state.
Various forms of avoiding military service are recorded. While it was outlawed by the Code of Hammurabi, the hiring of substitutes appears to have been practiced both before and after the creation of the code. Later records show that Ilkum commitments could become regularly traded. In other places, people simply left their towns to avoid their Ilkum service. Another option was to sell Ilkum lands and the commitments along with them. With the exception of a few exempted classes, this was forbidden by the Code of Hammurabi.
In medieval Scandinavia the "leiðangr" (Old Norse), "leidang" (Norwegian), "leding", (Danish), "ledung" (Swedish), "lichting" (Dutch), "expeditio" (Latin) or sometimes "leþing" (Old English), was a levy of free farmers conscripted into coastal fleets for seasonal excursions and in defence of the realm.
The bulk of the Anglo-Saxon English army, called the "fyrd", was composed of part-time English soldiers drawn from the freemen of each county. In the 690s Laws of Ine, three levels of fines are imposed on different social classes for neglecting military service. Some modern writers claim military service was restricted to the landowning minor nobility. These thegns were the land-holding aristocracy of the time and were required to serve with their own armour and weapons for a certain number of days each year. The historian David Sturdy has cautioned against regarding the "fyrd" as a precursor to a modern national army composed of all ranks of society, describing that view as a "ridiculous fantasy": "The persistent old belief that peasants and small farmers gathered to form a national army or "fyrd" is a strange delusion dreamt up by antiquarians in the late eighteenth or early nineteenth centuries to justify universal military conscription."
The system of military slaves was widely used in the Middle East, beginning with the creation of the corps of Turkish slave-soldiers ("ghulams" or "mamluks") by the Abbasid caliph al-Mu'tasim in the 820s and 830s. The Turkish troops soon came to dominate the government, establishing a pattern throughout the Islamic world of a ruling military class, often separated by ethnicity, culture and even religion by the mass of the population, a paradigm that found its apogee in the Mamluks of Egypt and the Janissary corps of the Ottoman Empire, institutions that survived until the early 19th century.
In the middle of the 14th century, Ottoman Sultan Murad I developed personal troops to be loyal to him, with a slave army called the "Kapıkulu". The new force was built by taking Christian children from newly conquered lands, especially from the far areas of his empire, in a system known as the "devşirme" (translated "gathering" or "converting"). The captive children were forced to convert to Islam. The Sultans had the young boys trained over several years. Those who showed special promise in fighting skills were trained in advanced warrior skills, put into the sultan's personal service, and turned into the Janissaries, the elite branch of the "Kapıkulu". A number of distinguished military commanders of the Ottomans, and most of the imperial administrators and upper-level officials of the Empire, such as Pargalı İbrahim Pasha and Sokollu Mehmet Paşa, were recruited in this way. By 1609, the Sultan's "Kapıkulu" forces increased to about 100,000.
In later years, Sultans turned to the Barbary pirates to supply their Janissary corps. The pirates' attacks on ships off the coast of Africa or in the Mediterranean, and their subsequent capture of able-bodied men for ransom or sale, provided some captives for the Sultan's system. Starting in the 17th century, Christian families living under Ottoman rule began to submit their sons into the Kapıkulu system willingly, seeing it as a potentially invaluable career opportunity for their children. Eventually the Sultans turned to foreign volunteers from the warrior clans of Circassians in southern Russia to fill the Janissary armies. As a whole the system began to break down; the loyalty of the Janissaries became increasingly suspect. Mahmud II forcibly disbanded the Janissary corps in 1826.
Similar to the Janissaries in origin and means of development were the Mamluks of Egypt in the Middle Ages. The Mamluks were usually captive non-Muslim Iranian and Turkish children who had been kidnapped or bought as slaves from the Barbary coasts. The Egyptians assimilated and trained the boys and young men to become Islamic soldiers who served the Muslim caliphs and the Ayyubid sultans during the Middle Ages. The first mamluks served the Abbasid caliphs in 9th-century Baghdad. Over time they became a powerful military caste. On more than one occasion, they seized power, for example, ruling Egypt from 1250 to 1517.
From 1250 Egypt had been ruled by the Bahri dynasty of Kipchak origin. Slaves from the Caucasus served in the army and formed an elite corps of troops, who eventually revolted in Egypt to form the Burji dynasty. The Mamluks' excellent fighting abilities, massed Islamic armies, and overwhelming numbers succeeded in overcoming the Christian Crusader fortresses in the Holy Land. The Mamluks also mounted the most successful defense against the Mongol Ilkhanate of Persia and Iraq, preventing it from entering Egypt.
On the western coast of Africa, Berber Muslims captured non-Muslims to put to work as laborers. They generally converted the younger people to Islam, and many became quite assimilated. In Morocco, the Berbers looked south rather than north. The Moroccan Sultan Moulay Ismail, called "the Bloodthirsty" (1672–1727), employed a corps of 150,000 black slaves, called his Black Guard, whom he used to coerce the country into submission.
Modern conscription, the massed military enlistment of national citizens, was devised during the French Revolution, to enable the Republic to defend itself from the attacks of European monarchies. Deputy Jean-Baptiste Jourdan gave his name to the Act of 5 September 1798, whose first article stated: "Any Frenchman is a soldier and owes himself to the defense of the nation." It enabled the creation of the "Grande Armée", what Napoleon Bonaparte called "the nation in arms", which overwhelmed European professional armies that often numbered only into the low tens of thousands. More than 2.6 million men were inducted into the French military in this way between the years 1800 and 1813.
The defeat of the Prussian Army in particular shocked the Prussian establishment, which had believed it was invincible after the victories of Frederick the Great. The Prussians were used to relying on superior organization and tactical factors such as order of battle to focus superior troops against inferior ones. Given approximately equivalent forces, as was generally the case with professional armies, these factors were of considerable importance. However, they became considerably less important when the Prussian armies faced forces that outnumbered their own, in some cases by more than ten to one. Scharnhorst advocated adopting the "levée en masse", the military conscription used by France. The "Krümpersystem" was the beginning of short-term compulsory service in Prussia, as opposed to the long-term conscription previously used.
In the Russian Empire, the military service time "owed" by serfs was 25 years at the beginning of the 19th century. In 1834 it was decreased to 20 years. Recruits were to be no younger than 17 and no older than 35. In 1874 Russia introduced universal conscription in the modern pattern, an innovation made possible only by the abolition of serfdom in 1861. The new military law decreed that all male Russian subjects, upon reaching the age of 20, were eligible to serve in the military for six years.
In the decades prior to World War I universal conscription along broadly Prussian lines became the norm for European armies, and those modeled on them. By 1914 the only substantial armies still completely dependent on voluntary enlistment were those of Britain and the United States. Some colonial powers such as France reserved their conscript armies for home service while maintaining professional units for overseas duties.
The range of eligible ages for conscripting was expanded to meet national demand during the World Wars.
In the United States, the Selective Service System drafted men for World War I initially in an age range from 21 to 30 but expanded its eligibility in 1918 to an age range of 18 to 45. In the case of a widespread mobilization of forces where service includes homefront defense, ages of conscripts may range much higher, with the oldest conscripts serving in roles requiring lesser mobility.
Expanded-age conscription was common during the Second World War: in Britain, it was commonly known as "call-up" and extended to age 51. Nazi Germany termed it "Volkssturm" ("People's Storm") and included children as young as 16 and men as old as 60. During the Second World War, both Britain and the Soviet Union conscripted women. The United States was on the verge of drafting women into the Nurse Corps because it anticipated it would need the extra personnel for its planned invasion of Japan. However, the Japanese surrendered and the idea was abandoned.
Feminists and opponents of discrimination against men have criticized military conscription, or compulsory military service, as sexist. The National Coalition for Men, a men's rights group, sued the US Selective Service System, and a US district court consequently declared male-only registration unconstitutional.
Feminists have argued that military conscription is sexist because wars serve the interests of what they view as the patriarchy, the military is a sexist institution, conscripts are therefore indoctrinated in sexism, and conscription of men normalizes violence by men as socially acceptable. Feminists have been organizers and participants in resistance to conscription in several countries.
Conscription has also been criticized on the ground that, historically, only men have been subjected to it. Men who opt out or are deemed unfit for military service must often perform alternative service, such as Zivildienst in Austria and Switzerland, or pay extra taxes, whereas women do not have these obligations. Men in the US who do not register with the Selective Service may be denied eligibility for citizenship, financial aid, admission to public colleges or universities, federal grants and loans, federal employment, and, in some states, driver's licenses.
American libertarians oppose conscription and call for the abolition of the Selective Service System, believing that impressment of individuals into the armed forces is "involuntary servitude." Ron Paul, a former presidential nominee of the U.S. Libertarian Party, has said that conscription "is wrongly associated with patriotism, when it really represents slavery and involuntary servitude." The philosopher Ayn Rand opposed conscription, suggesting that "of all the statist violations of individual rights in a mixed economy, the military draft is the worst. It is an abrogation of rights. It negates man's fundamental right—the right to life—and establishes the fundamental principle of statism: that a man's life belongs to the state, and the state may claim it by compelling him to sacrifice it in battle."
In 1917, a number of radicals and anarchists, including Emma Goldman, challenged the new draft law in federal court, arguing that it was a direct violation of the Thirteenth Amendment's prohibition against slavery and involuntary servitude. However, the Supreme Court unanimously upheld the constitutionality of the draft act in the case of "Arver v. United States" on 7 January 1918. The decision said the Constitution gave Congress the power to declare war and to raise and support armies. The Court emphasized the principle of the reciprocal rights and duties of citizens.
It can be argued that in a cost-to-benefit ratio, conscription during peacetime is not worthwhile. Months or years of service performed by the most fit and capable subtract from the productivity of the economy; add to this the cost of training them, and in some countries paying them. Compared to these extensive costs, some would argue there is very little benefit; if there ever was a war then conscription and basic training could be completed quickly, and in any case there is little threat of a war in most countries with conscription. In the United States, every male resident is required by law to register with the Selective Service System within 30 days following his 18th birthday and be available for a draft; this is often accomplished automatically by a motor vehicle department during licensing or by voter registration.
The cost of conscription can be related to the parable of the broken window in anti-draft arguments. The cost of the work, military service, does not disappear even if no salary is paid. The work effort of the conscripts is effectively wasted, as an unwilling workforce is extremely inefficient. The impact is especially severe in wartime, when civilian professionals are forced to fight as amateur soldiers. Not only is the work effort of the conscripts wasted and productivity lost, but professionally skilled conscripts are also difficult to replace in the civilian workforce. Every soldier conscripted into the army is taken away from his civilian work, and away from contributing to the economy which funds the military. This may be less of a problem in an agrarian or pre-industrialized state, where the level of education is generally low and a worker is easily replaced by another. However, it is potentially more costly in a post-industrial society, where educational levels are high, the workforce is sophisticated, and a replacement for a conscripted specialist is difficult to find. Even more dire economic consequences result if the professional conscripted as an amateur soldier is killed or maimed for life; his work effort and productivity are lost.
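To make the opportunity-cost reasoning above concrete, the following minimal sketch computes the total economic cost of a conscription cohort; every figure in it (cohort size, wage, training cost, stipend) is a hypothetical placeholder chosen for illustration, not data from any actual country or study.

```python
# Hypothetical illustration of the opportunity-cost argument.
# All figures are invented placeholders; real values vary widely by country.

conscripts = 20_000        # size of one annual cohort (hypothetical)
service_years = 1.0        # length of compulsory service in years
civilian_wage = 30_000     # annual output forgone per conscript (hypothetical)
training_cost = 5_000      # one-time training cost per conscript (hypothetical)
stipend = 2_000            # annual stipend paid to each conscript (hypothetical)

forgone_output = conscripts * service_years * civilian_wage   # the "broken window"
direct_cost = conscripts * (training_cost + stipend * service_years)

print(f"Forgone civilian output: {forgone_output:,.0f}")
print(f"Direct budget cost:      {direct_cost:,.0f}")
print(f"Total economic cost:     {forgone_output + direct_cost:,.0f}")
```

On these made-up numbers the forgone output (600 million) dwarfs the visible budget cost (140 million), which is the point of the broken-window comparison: most of the cost never appears in the military budget.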
Jean Jacques Rousseau argued vehemently against professional armies, believing that it was the right and privilege of every citizen to participate in the defense of the whole society, and a mark of moral decline to leave this business to professionals. He based this belief upon the development of the Roman republic, which came to an end at the same time as the Roman army changed from a conscript to a professional force. Similarly, Aristotle linked the division of armed service among the populace intimately with the political order of the state. Niccolò Machiavelli argued strongly for conscription, seeing professional armies as the cause of the failure of societal unity in Italy.
Other proponents, such as William James, consider both mandatory military and national service as ways of instilling maturity in young adults. Some proponents, such as Jonathan Alter and Mickey Kaus, support a draft in order to reinforce social equality, create social consciousness, break down class divisions and for young adults to immerse themselves in public enterprise. Charles Rangel called for the reinstatement of the draft during the Iraq war, not because he seriously expected it to be adopted, but to stress how the socioeconomic restratification meant that very few children of upper-class Americans served in the all-volunteer American armed forces.
It is estimated by the British military that in a professional military, a company deployed for active duty in peacekeeping corresponds to three inactive companies at home. Salaries for each are paid from the military budget. In contrast, volunteers from a trained reserve are in their civilian jobs when they are not deployed.
For less-educated young Portuguese men born in 1967, it was more financially beneficial to participate in conscription than to compete in the highly competitive job market with men of the same age who had continued on to higher education.
Traditionally conscription has been limited to the male population of a given body. Women and disabled men have been exempt from conscription. Many societies have considered, and continue to consider, military service as a test of manhood and a rite of passage from boyhood into manhood.
Only a few nations actively draft women into military service: Bolivia, Chad, Eritrea, Israel, Mozambique, Norway, North Korea and Sweden.
Norway introduced female conscription in 2015, making it the first NATO member to have a legally compulsory national service for both men and women. In practice only motivated volunteers are selected to join the army in Norway.
Sweden introduced female conscription in 2010, but it was not activated until 2017. This made Sweden the second nation in Europe to draft women, and the second in the world to draft women on exactly the same formal terms as men.
Israel has universal female conscription, although in practice women can avoid service by claiming a religious exemption and over a third of Israeli women do so.
Sudanese law allows for conscription of women, but this is not implemented in practice.
In the United Kingdom during World War II, beginning in 1941, women were brought into the scope of conscription but, as all women with dependent children were exempt and many women were informally left in occupations such as nursing or teaching, the number conscripted was relatively few.
In the USSR, there was no systematic conscription of women for the armed forces, but the severe disruption of normal life and the high proportion of civilians affected by World War II after the German invasion attracted many volunteers for what was termed "The Great Patriotic War". Medical doctors of both sexes could and would be conscripted (as officers). Also, the Soviet university education system required Department of Chemistry students of both sexes to complete an ROTC course in NBC defense, and such female reservist officers could be conscripted in times of war. The United States came close to drafting women into the Nurse Corps in preparation for a planned invasion of Japan.
In 1981 in the United States, several men filed a lawsuit in the case "Rostker v. Goldberg", alleging that the Selective Service Act of 1948 violated the Due Process Clause of the Fifth Amendment by requiring that only men register with the Selective Service System (SSS). The Supreme Court eventually upheld the Act, stating that "the argument for registering women was based on considerations of equity, but Congress was entitled, in the exercise of its constitutional powers, to focus on the question of military need, rather than 'equity.'"
On October 1, 1999 in Taiwan, the Judicial Yuan of the Republic of China in its Interpretation 490 considered that the physical differences between males and females and the derived role differentiation in their respective social functions and lives would not make drafting only males a violation of the Constitution of the Republic of China. Though women are not conscripted in Taiwan, transsexual persons are exempt.
A conscientious objector is an individual whose personal beliefs are incompatible with military service, or, more often, with any role in the armed forces. In some countries, conscientious objectors have special legal status, under which their conscription duties are replaced with alternative duties. For example, Sweden used to allow (and, with the re-introduction of conscription, once again allows) conscientious objectors to choose service in a "weapons-free" branch, such as work as an airport firefighter, nurse, or telecommunications technician.
The reasons for refusing to serve in the military are varied. Some people are conscientious objectors for religious reasons. In particular, the members of the historic peace churches are pacifist by doctrine, and Jehovah's Witnesses, while not strictly pacifists, refuse to participate in the armed forces on the ground that they believe that Christians should be neutral in international conflicts.
Every male citizen of the Republic of Austria up to the age of 35 can be drafted for six months of basic military training in the Bundesheer. For men refusing to undergo this training, a nine-month community service is mandatory.
Belgium abolished conscription in 1994, and the last conscripts left active service in February 1995. To this day (2019), a small minority of Belgian citizens supports the idea of reintroducing military conscription, for both men and women.
Bulgaria had mandatory military service for males above 18 until conscription was ended in 2008. Due to a shortfall of some 5,500 soldiers in the army, parts of the current ruling coalition have expressed their support for the return of mandatory military service, most notably Krasimir Karakachanov. Opposition to this idea from the main coalition partner, GERB, led to a compromise in 2018 under which, instead of mandatory military service, Bulgaria could introduce a voluntary military service by 2019, in which young citizens could volunteer for a period of 6 to 9 months while receiving a basic wage. However, this has not gone forward.
Universal conscription in China dates back to the State of Qin, which eventually became the Qin Empire of 221 BC. Following unification, historical records show that a total of 300,000 conscript soldiers and 500,000 conscript labourers constructed the Great Wall of China.
In the following dynasties, universal conscription was abolished and reintroduced on numerous occasions.
Universal military conscription is theoretically mandatory in the People's Republic of China and reinforced by law. However, due to China's large population and the large pool of candidates available for recruitment, the People's Liberation Army has always had sufficient volunteers, so conscription has not been required in practice at all.
Military service in Cyprus has a deep-rooted history entangled with the Cyprus problem. Military service in the Cypriot National Guard is mandatory for all male citizens of the Republic of Cyprus, as well as any male non-citizens born of a parent of Greek Cypriot descent, lasting from January 1 of the year in which they turn 18 to December 31 of the year in which they turn 50 (Efthymiou, 2016). All male residents of Cyprus who are of military age (16 and over) are required to obtain an exit visa from the Ministry of Defense. Currently, military conscription in Cyprus lasts 14 months.
Conscription has existed in Denmark since the Viking Age, when one man out of every ten had to serve the king. Frederick IV of Denmark changed the law in 1710 so that every fourth man was liable. The men were chosen by the landowner, and selection was seen as a penalty.
Since 12 February 1849, every physically fit man has been required to do military service. According to §81 of the Constitution of Denmark, promulgated in 1849: "Every male person able to carry arms shall be liable with his person to contribute to the defence of his country under such rules as are laid down by Statute." The legislation on compulsory military service is articulated in the Danish Law of Conscription. National service takes 4–12 months. It is possible to postpone the duty while still in full-time education. Every male turning 18 is drafted to the 'Day of Defence', where he is introduced to the Danish military and his health is tested. Physically unfit persons are not required to do military service. It is compulsory only for men, while women are free to choose to join the Danish army. Almost all of the men have been volunteers in recent years; 96.9% of the total number of recruits were volunteers in the 2015 draft.
After the lottery, one can become a conscientious objector. Total objection (refusal of the alternative civilian service as well) results in up to four months of jail time under the law. However, in 2014 a Danish man who signed up for the service and objected later received only 14 days of house arrest. In many countries the act of desertion (objection after signing up) is punished more severely than objection to the compulsory service itself.
Conscription in Finland is part of a general compulsion for national military service for all adult males, defined in section 127 of the Constitution of Finland.
Conscription can take the form of military or of civilian service. According to 2011 Finnish Defence Forces data, slightly under 80% of Finnish males who had turned 30 had entered and finished military service. The number of female volunteers entering armed service annually has stabilised at approximately 300. The service period is 165, 255 or 347 days for rank-and-file conscripts and 347 days for conscripts trained as NCOs or reserve officers. The length of civilian service is always twelve months. Those electing to serve unarmed, in duties where unarmed service is possible, serve either nine or twelve months, depending on their training.
Any Finnish male citizen who refuses to perform both military and civilian service faces a penalty of 173 days in prison, minus any days already served. Such sentences are usually served fully in prison, with no parole. Jehovah's Witnesses are no longer exempted from service as of February 27, 2019. The inhabitants of the demilitarized Åland Islands are exempt from military service. Under the Conscription Act of 1951, they are instead required to serve a period at a local institution, such as the coast guard; however, until such service has been arranged, they are freed from the service obligation. The non-military service of the Åland Islands has not been arranged since the introduction of the act, and there are no plans to institute it. The inhabitants of the Åland Islands can also volunteer for military service on the mainland. Since 1995, women have been permitted to serve on a voluntary basis and to pursue careers in the military after their initial voluntary military service.
The military service takes place in Finnish Defence Forces or in the Finnish Border Guard. All services of the Finnish Defence Forces train conscripts. However, the Border Guard trains conscripts only in land-based units, not in coast guard detachments or in the Border Guard Air Wing. Civilian service may take place in the Civilian Service Center in Lapinjärvi or in an accepted non-profit organization of educational, social or medical nature.
Between 1956 and 2011, conscription into the German federal armed forces (German: "Bundeswehr") was mandatory for all male citizens, as was service in the Federal Border Guard (German: "Bundesgrenzschutz") in the 1970s (see Border Guard Service). With the end of the Cold War the German government drastically reduced the size of its armed forces. The low demand for conscripts led to the suspension of compulsory conscription in 2011. Since then, only volunteer professionals serve in the "Bundeswehr".
Greece has enforced mandatory military service since 1914; it currently lasts 9 months for men between the ages of 19 and 45. Citizens discharged from active service are normally placed in the reserve and are subject to periodic recalls of 1–10 days at irregular intervals.
Universal conscription was introduced in Greece during the military reforms of 1909, although various forms of selective conscription had been in place earlier. In more recent years, conscription was associated with the state of general mobilisation declared on July 20, 1974 due to the crisis in Cyprus (the mobilisation was formally ended on December 18, 2002).
The period of time that a conscript is required to serve has varied historically between 12 and 36 months, depending on various factors particular to the conscript and the political situation. Although women are employed by the Greek army as officers and petty officers, they are not required to enlist as men are. Soldiers receive no health insurance, but they are provided with medical support during their army service, including hospitalization costs.
Since 2009, Greece has had mandatory military service of 9 months for male citizens between the ages of 19 and 45. However, as the armed forces had been gearing towards a completely professional army, the government had announced that the mandatory military service period would be cut to 6 months by 2008, or even abolished completely. This timetable was under reconsideration as of April 2006, due to severe manpower shortages. These had been caused by a combination of financial difficulties, meaning that professional soldiers could not be hired at the projected rate, and widespread abuse of the deferment process, which resulted in two-thirds of conscripts deferring service in 2005. In August 2009, the mandatory service period was reduced to 9 months for the army, but remained at 12 months for the navy and the air force. The number of conscripts assigned to the latter two has been greatly reduced, with an aim towards full professionalisation.
There is a mandatory military service for all men and women in Israel who are fit and 18 years old. Men must serve 32 months while women serve 24 months, with the vast majority of conscripts being Jewish.
Some Israeli citizens are exempt from mandatory service.
All of those exempted above are eligible to volunteer for the Israel Defense Forces (IDF), as long as they declare their intention to do so.
Male Druze and male Circassian Israeli citizens are liable for conscription, in accordance with an agreement set by their community leaders; the community leaders, however, signed a clause under which all Druze and Circassian women are exempt from service.
A few male Bedouin Israeli citizens choose to enlist in the Israeli military in every draft, despite their Muslim-Arab background, which exempts them from conscription.
Lithuania abolished conscription in 2008. In May 2015, the Lithuanian parliament voted to reintroduce conscription, and the conscripts started their training in August 2015. In practice, compulsory call-ups have not been needed, since all recruits have been volunteers.
Luxembourg practiced military conscription from 1948 until 1967.
Moldova, which currently has male conscription, has announced plans to abolish the practice. Moldova's Defense Ministry announced that a plan which stipulates the gradual elimination of military conscription will be implemented starting from the autumn of 2018.
Conscription, which was called "Service Duty" in the Netherlands, was first employed in 1810 by the French occupying forces. Napoleon's brother Louis Bonaparte, who was King of Holland from 1806 to 1810, had tried unsuccessfully to introduce conscription a few years earlier. Every man aged 20 years or older had to enlist, and lots were drawn to decide who had to undertake service in the French army. It was possible to arrange a substitute against payment.
Later on, conscription was used for all men over the age of 18. Postponement was possible, due to study, for example. Conscientious objectors could perform an alternative civilian service instead of military service. For various reasons, this forced military service was criticized at the end of the twentieth century. Since the Cold War was over, so was the direct threat of a war. Instead, the Dutch army was employed in more and more peacekeeping operations. The complexity and danger of these missions made the use of conscripts controversial. Furthermore, the conscription system was thought to be unfair as only men were drafted.
In the European part of the Netherlands, compulsory military service has been officially suspended since 1 May 1997. Between 1991 and 1996, the Dutch armed forces phased out their conscript personnel and converted to an all-professional force. The last conscript troops were inducted in 1995 and demobilized in 1996. The suspension means that citizens are no longer forced to serve in the armed forces as long as it is not required for the safety of the country. Since then, the Dutch army has become an all-professional force. However, to this day, every male and, from January 2020 onward, female citizen aged 17 receives a letter informing them that they have been registered but do not have to present themselves for service.
Norway currently employs a weak form of mandatory military service for men and women. In practice recruits are not forced to serve; instead, only those who are motivated are selected. About 60,000 Norwegians are available for conscription every year, but only 8,000 to 10,000 are conscripted. Since 1985, women have been able to enlist for voluntary service as regular recruits. On 14 June 2013 the Norwegian Parliament voted to extend conscription to women, making Norway the first NATO member and first European country to make national service compulsory for both sexes. Until at least the early 2000s, all men aged 19–44 were subject to mandatory service, with good reasons required to avoid being drafted. There is a right of conscientious objection.
In addition to military service, the Norwegian government drafts a total of 8,000 men and women between 18 and 55 for non-military civil defence duty (not to be confused with alternative civilian service). Former service in the military does not exclude anyone from later being drafted to the civil defence, but an upper limit of 19 months of total service applies. Ignoring mobilisation orders for training exercises or actual incidents may result in fines.
Serbia no longer practises mandatory military service. Prior to its suspension, mandatory military service lasted 6 months for men. Conscientious objectors could, however, opt for 9 months of civil service instead.
On 15 December 2010, the Parliament of Serbia voted to suspend mandatory military service. The decision fully came into force on January 1, 2011.
Sweden had conscription for men between 1901 and 2010. During the last few decades it was selective. Since 1980, women have been allowed to sign up by choice and, if they pass the tests, do military training together with male conscripts. Since 1989 women have been allowed to serve in all military positions and units, including combat.
In 2010, conscription was made gender-neutral, meaning both women and men would be conscripted on equal terms. The conscription system was simultaneously deactivated in peacetime. Seven years later, citing an increased military threat, the Swedish Government reactivated military conscription. Since 2018, both men and women have been conscripted.
The United Kingdom introduced conscription to full-time military service for the first time in January 1916 (the eighteenth month of World War I) and abolished it in 1920. Ireland, then part of the United Kingdom, was exempted from the original 1916 military service legislation, and although further legislation in 1918 gave power for an extension of conscription to Ireland, the power was never put into effect.
Conscription was reintroduced in 1939, in the lead up to World War II, and continued in force until 1963. Northern Ireland was exempted from conscription legislation throughout the whole period.
In all, eight million men were conscripted during both World Wars, as well as several hundred thousand younger single women. The introduction of conscription in May 1939, before the war began, was partly due to pressure from the French, who emphasized the need for a large British army to oppose the Germans. From early 1942 unmarried women age 19–30 were conscripted. Most were sent to the factories, but they could volunteer for the Auxiliary Territorial Service (ATS) and other women's services. Some women served in the Women's Land Army: initially volunteers but later conscription was introduced. However, women who were already working in a skilled job considered helpful to the war effort, such as a General Post Office telephonist, were told to continue working as before. None was assigned to combat roles unless she volunteered. By 1943 women were liable to some form of directed labour up to age 51. During the Second World War, 1.4 million British men volunteered for service and 3.2 million were conscripted. Conscripts comprised 50% of the Royal Air Force, 60% of the Royal Navy and 80% of the British Army.
The abolition of conscription in Britain was announced on 4 April 1957, by new prime minister Harold Macmillan, with the last conscripts being recruited three years later.
Conscription in the United States ended in 1973, but males aged between 18 and 25 are required to register with the Selective Service System to enable a reintroduction of conscription if necessary. President Gerald Ford had suspended mandatory draft registration in 1975, but President Jimmy Carter reinstated that requirement when the Soviet Union intervened in Afghanistan five years later. Consequently, Selective Service registration is still required of almost all young men. There have been no prosecutions for violations of the draft registration law since 1986. Males between the ages of 17 and 45, and female members of the US National Guard, may be conscripted for federal militia service pursuant to 10 U.S. Code § 246 and the Militia Clauses of the United States Constitution.
In February 2019, the United States District Court for the Southern District of Texas ruled that male-only conscription breached the equal protection component of the Fifth Amendment's Due Process Clause. In "National Coalition for Men v. Selective Service System", a case brought by the non-profit men's rights organisation the National Coalition for Men against the U.S. Selective Service System, judge Gray H. Miller issued a declaratory judgment that the male-only registration requirement is unconstitutional, though he did not specify what action the government should take. | https://en.wikipedia.org/wiki?curid=5735 |
Catherine Coleman
Catherine Grace "Cady" Coleman (born December 14, 1960) is an American chemist, a former United States Air Force officer, and a retired NASA astronaut. She is a veteran of two Space Shuttle missions, and departed the International Space Station on May 23, 2011, as a crew member of Expedition 27 after logging 159 days in space.
Coleman graduated from Wilbert Tucker Woodson High School, Fairfax, Virginia, in 1978; in 1978–1979, she was an exchange student at Røyken upper secondary school in Norway with the AFS Intercultural Programs. She received a B.S. degree in chemistry from the Massachusetts Institute of Technology in 1983 and was commissioned as a graduate of Air Force ROTC. She then received a Ph.D. degree in polymer science and engineering from the University of Massachusetts Amherst in 1991, advised by Professor Thomas J. McCarthy. As an undergraduate she was a member of the intercollegiate crew team and a resident of Baker House.
Coleman continued to pursue her PhD at the University of Massachusetts Amherst as a Second Lieutenant. In 1988 she entered active duty at Wright-Patterson Air Force Base as a research chemist. During her work she participated as a surface analysis consultant on the NASA Long Duration Exposure Facility experiment. In 1991, she received her doctorate in polymer science and engineering. She retired from the Air Force in November 2009 as a colonel.
Coleman was selected by NASA in 1992 to join the NASA Astronaut Corps. In 1995, she was a member of the STS-73 crew on the scientific mission USML-2 with experiments including biotechnology, combustion science, and the physics of fluids. During the flight, she reported to Houston Mission Control that she had spotted an unidentified flying object. She also trained for the mission STS-83 to be the backup for Donald A. Thomas; however, as he recovered on time, she did not fly that mission. STS-93 was Coleman's second space flight in 1999. She was mission specialist in charge of deploying the Chandra X-ray Observatory and its Inertial Upper Stage out of the shuttle's cargo bay.
Coleman served as Chief of Robotics for the Astronaut Office, a role that included robotic arm operations and training for all Space Shuttle and International Space Station missions. In October 2004, Coleman served as an aquanaut during a mission aboard the Aquarius underwater laboratory, living and working underwater for eleven days.
Coleman was assigned as a backup U.S. crew member for Expeditions 19, 20 and 21 and served as a backup crew member for Expeditions 24 and 25 as part of her training for Expedition 26.
Coleman launched on December 15, 2010 (December 16 Baikonur time), aboard Soyuz TMA-20 to join the Expedition 26 mission aboard the International Space Station. She retired from NASA on December 1, 2016.
STS-73 on Space Shuttle "Columbia" (October 20 to November 5, 1995) was the second United States Microgravity Laboratory mission. The mission focused on materials science, biotechnology, combustion science, the physics of fluids, and numerous scientific experiments housed in the pressurized Spacelab module. In completing her first space flight, Coleman orbited the Earth 256 times, traveled over 6 million miles, and logged a total of 15 days, 21 hours, 52 minutes and 21 seconds in space.
STS-93 on "Columbia" (July 22 to 27, 1999) was a five-day mission during which Coleman was the lead mission specialist for the deployment of the Chandra X-ray Observatory. Designed to conduct comprehensive studies of the universe, the telescope enables scientists to study exotic phenomena such as exploding stars, quasars, and black holes. Mission duration was 118 hours and 50 minutes.
Soyuz TMA-20 / Expedition 26/27 (December 15, 2010, to May 23, 2011) was an extended duration mission to the International Space Station.
Coleman is married to glass artist Josh Simpson who lives in Massachusetts. They have one son. She is part of the band Bandella, which also includes fellow NASA astronaut Steven Robinson, Canadian astronaut Chris Hadfield, and Micki Pettit (astronaut Don Pettit's wife). Coleman is a flute player and has taken several flutes with her to the ISS, including a pennywhistle from Paddy Moloney of the Chieftains, an old Irish flute from Matt Molloy of the Chieftains, and a flute from Ian Anderson of Jethro Tull. On February 15, 2011, she played one of the instruments live from orbit on National Public Radio. On April 12, 2011, she played live via video link for the audience of Jethro Tull's show in Russia in honour of the 50th anniversary of Yuri Gagarin's flight, playing in orbit while Anderson played on the ground. On May 13 of that year, Coleman delivered a taped commencement address to the class of 2011 at the University of Massachusetts Amherst.
As do many other astronauts, Coleman holds an amateur radio license (callsign: KC5ZTH).
As of 2015, she also works as a guest speaker at the Baylor College of Medicine for the children's program 'Saturday Morning Science'.
In 2018 she gave a graduation address to Carter Lynch, the sole graduate of Cuttyhunk Elementary School, on Cuttyhunk Island, Massachusetts. | https://en.wikipedia.org/wiki?curid=5736 |
Cervix
The cervix or cervix uteri (Latin, 'neck of the uterus') is the lower part of the uterus in the human female reproductive system. The cervix is usually 2 to 3 cm long (~1 inch) and roughly cylindrical in shape, which changes during pregnancy. The narrow, central cervical canal runs along its entire length, connecting the uterine cavity and the lumen of the vagina. The opening into the uterus is called the internal os, and the opening into the vagina is called the external os. The lower part of the cervix, known as the vaginal portion of the cervix (or ectocervix), bulges into the top of the vagina. The cervix has been documented anatomically since at least the time of Hippocrates, over 2,000 years ago.
The cervical canal is a passage through which sperm must travel to fertilize an egg cell after sexual intercourse. Several methods of contraception, including cervical caps and cervical diaphragms, aim to block or prevent the passage of sperm through the cervical canal. Cervical mucus is used in several methods of fertility awareness, such as the Creighton model and Billings method, due to its changes in consistency throughout the menstrual cycle. During vaginal childbirth, the cervix must flatten and dilate to allow the fetus to progress along the birth canal. Midwives and doctors use the extent of the dilation of the cervix to assist decision-making during childbirth.
The cervical canal is lined with a single layer of column-shaped cells, while the ectocervix is covered with multiple layers of cells topped with flat cells. The two types of epithelia meet at the squamocolumnar junction. Infection with the human papillomavirus (HPV) can cause changes in the epithelium, which can lead to cancer of the cervix. Cervical cytology tests can often detect cervical cancer and its precursors, and enable early successful treatment. Ways to avoid HPV include avoiding sex, using condoms, and HPV vaccination. HPV vaccines, developed in the early 21st century, reduce the risk of cervical cancer by preventing infections from the main cancer-causing strains of HPV.
The cervix is part of the female reproductive system. It is the lower, narrower part of the uterus, continuous above with the broader upper part, or body, of the uterus. The lower end of the cervix bulges through the anterior wall of the vagina and is referred to as the vaginal portion of the cervix (or ectocervix), while the rest of the cervix above the vagina is called the supravaginal portion of the cervix. A central canal, known as the cervical canal, runs along its length and connects the cavity of the body of the uterus with the lumen of the vagina. The openings are known as the internal os and the external orifice of the uterus (or external os), respectively. The mucosa lining the cervical canal is known as the endocervix, and the mucosa covering the ectocervix is known as the exocervix. The cervix has an inner mucosal layer, a thick layer of smooth muscle, and, posteriorly, the supravaginal portion has a serosal covering consisting of connective tissue and overlying peritoneum.
In front of the upper part of the cervix lies the bladder, separated from it by cellular connective tissue known as parametrium, which also extends over the sides of the cervix. To the rear, the supravaginal cervix is covered by peritoneum, which runs onto the back of the vaginal wall and then turns upwards and onto the rectum, forming the recto-uterine pouch. The cervix is more tightly connected to surrounding structures than the rest of the uterus.
The cervical canal varies greatly in length and width between women or over the course of a woman's life, and it can measure 8 mm (0.3 inch) at its widest diameter in premenopausal adults. It is wider in the middle and narrower at each end. The anterior and posterior walls of the canal each have a vertical fold, from which ridges run diagonally upwards and laterally. These are known as "palmate folds", due to their resemblance to a palm leaf. The anterior and posterior ridges are arranged in such a way that they interlock with each other and close the canal. They are often effaced after pregnancy.
The ectocervix (also known as the vaginal portion of the cervix) has a convex, elliptical shape and projects into the vagina between the anterior and posterior vaginal fornices. On the rounded part of the ectocervix is a small, depressed external opening, connecting the cervix with the vagina. The size and shape of the ectocervix and the external opening (external os) can vary according to age, hormonal state, and whether vaginal childbirth has taken place. In women who have not had a vaginal delivery, the external opening is small and circular; in women who have had a vaginal delivery, it is slit-like. On average, the ectocervix is about 3 cm long and 2.5 cm wide.
Blood is supplied to the cervix by the descending branch of the uterine artery and drains into the uterine vein. The pelvic splanchnic nerves, emerging from S2–S3, transmit the sensation of pain from the cervix to the brain. These nerves travel along the uterosacral ligaments, which pass from the uterus to the anterior sacrum.
Three channels facilitate lymphatic drainage from the cervix. The anterior and lateral cervix drains to nodes along the uterine arteries, travelling along the cardinal ligaments at the base of the broad ligament to the external iliac lymph nodes and ultimately the paraaortic lymph nodes. The posterior and lateral cervix drains along the uterine arteries to the internal iliac lymph nodes and ultimately the paraaortic lymph nodes, and the posterior section of the cervix drains to the obturator and presacral lymph nodes. However, there are variations as lymphatic drainage from the cervix travels to different sets of pelvic nodes in some people. This has implications in scanning nodes for involvement in cervical cancer.
After menstruation and directly under the influence of estrogen, the cervix undergoes a series of changes in position and texture. During most of the menstrual cycle, the cervix remains firm, and is positioned low and closed. However, as ovulation approaches, the cervix becomes softer and rises to open in response to the higher levels of estrogen present. These changes are also accompanied by changes in cervical mucus, described below.
As a component of the female reproductive system, the cervix is derived from the two paramesonephric ducts (also called Müllerian ducts), which develop around the sixth week of embryogenesis. During development, the outer parts of the two ducts fuse, forming a single urogenital canal that will become the vagina, cervix and uterus. The cervix grows at a slower rate than the body of the uterus, so its relative size decreases over time: from much larger than the body of the uterus in fetal life, to twice as large during childhood, to smaller than the uterus in adulthood, after puberty. Previously it was thought that during fetal development the original squamous epithelium of the cervix was derived from the urogenital sinus and the original columnar epithelium from the paramesonephric duct, with the point at which these two original epithelia meet called the original squamocolumnar junction. New studies show, however, that all of the cervical epithelium, as well as a large part of the vaginal epithelium, is derived from Müllerian duct tissue, and that phenotypic differences might be due to other causes.
The endocervical mucosa is lined with a single layer of columnar mucous cells. It contains numerous tubular mucous glands, which empty viscous alkaline mucus into the lumen. In contrast, the ectocervix is covered with nonkeratinized stratified squamous epithelium, which resembles the squamous epithelium lining the vagina. The junction between these two types of epithelia is called the squamocolumnar junction. Underlying both types of epithelium is a tough layer of collagen. The mucosa of the endocervix is not shed during menstruation. The cervix has more fibrous tissue, including collagen and elastin, than the rest of the uterus.
In prepubertal girls, the functional squamocolumnar junction is present just within the cervical canal. Upon entering puberty, due to hormonal influence, and during pregnancy, the columnar epithelium extends outward over the ectocervix as the cervix everts. Hence, this also causes the squamocolumnar junction to move outwards onto the vaginal portion of the cervix, where it is exposed to the acidic vaginal environment. The exposed columnar epithelium can undergo physiological metaplasia and change to tougher metaplastic squamous epithelium in days or weeks, which is very similar to the original squamous epithelium when mature. The new squamocolumnar junction is therefore internal to the original squamocolumnar junction, and the zone of unstable epithelium between the two junctions is called the "transformation zone" of the cervix. Histologically, the transformation zone is generally defined as surface squamous epithelium with surface columnar epithelium or stromal glands/crypts, or both.
After menopause, the uterine structures involute and the functional squamocolumnar junction moves into the cervical canal.
Nabothian cysts (or Nabothian follicles) form in the transformation zone where the lining of metaplastic epithelium has replaced mucous epithelium and caused a strangulation of the outlet of some of the mucous glands. A buildup of mucus in the glands forms Nabothian cysts, usually small, which are considered physiological rather than pathological. Both gland openings and Nabothian cysts are helpful in identifying the transformation zone.
The cervical canal is a pathway through which sperm enter the uterus after sexual intercourse, and in some forms of artificial insemination. Some sperm remain in cervical crypts, infoldings of the endocervix, which act as a reservoir, releasing sperm over several hours and maximising the chances of fertilisation. One theory states that cervical and uterine contractions during orgasm draw semen into the uterus. Although this "upsuck theory" was generally accepted for some years, it has been disputed due to lack of evidence, small sample sizes, and methodological errors.
Some methods of fertility awareness, such as the Creighton model and the Billings method involve estimating a woman's periods of fertility and infertility by observing physiological changes in her body. Among these changes are several involving the quality of her cervical mucus: the sensation it causes at the vulva, its elasticity ("Spinnbarkeit"), its transparency, and the presence of ferning.
Several hundred glands in the endocervix produce 20–60 mg of cervical mucus a day, increasing to 600 mg around the time of ovulation. It is viscous because it contains large proteins known as mucins. The viscosity and water content vary during the menstrual cycle; mucus is composed of around 93% water, reaching 98% at midcycle. These changes allow it to function either as a barrier or a transport medium to spermatozoa. It contains electrolytes such as calcium, sodium, and potassium; organic components such as glucose, amino acids, and soluble proteins; trace elements including zinc, copper, iron, manganese, and selenium; free fatty acids; enzymes such as amylase; and prostaglandins. Its consistency is determined by the influence of the hormones estrogen and progesterone. At midcycle, around the time of ovulation (a period of high estrogen levels), the mucus is thin and serous to allow sperm to enter the uterus, and is more alkaline and hence more hospitable to sperm. It is also higher in electrolytes, which results in the "ferning" pattern that can be observed in drying mucus under low magnification; as the mucus dries, the salts crystallize, resembling the leaves of a fern. The mucus has a stretchy character, described as "Spinnbarkeit", most prominent around the time of ovulation.
At other times in the cycle, the mucus is thick and more acidic due to the effects of progesterone. This "infertile" mucus acts as a barrier to keep sperm from entering the uterus. Women taking an oral contraceptive pill also have thick mucus from the effects of progesterone. Thick mucus also prevents pathogens from interfering with a nascent pregnancy.
A cervical mucus plug, called the operculum, forms inside the cervical canal during pregnancy. This provides a protective seal for the uterus against the entry of pathogens and against leakage of uterine fluids. The mucus plug is also known to have antibacterial properties. This plug is released as the cervix dilates, either during the first stage of childbirth or shortly before. It is visible as a blood-tinged mucous discharge.
The cervix plays a major role in childbirth. As the fetus descends within the uterus in preparation for birth, the presenting part, usually the head, rests on and is supported by the cervix. As labour progresses, the cervix becomes softer and shorter, begins to dilate, and rotates to face anteriorly. The support the cervix provides to the fetal head starts to give way when the uterus begins its contractions. During childbirth, the cervix must dilate to a diameter of more than 10 cm to accommodate the head of the fetus as it descends from the uterus to the vagina. In becoming wider, the cervix also becomes shorter, a phenomenon known as effacement.
Along with other factors, midwives and doctors use the extent of cervical dilation to assist decision making during childbirth. | https://en.wikipedia.org/wiki?curid=5738 |
Compiler
A compiler is a computer program that translates computer code written in one programming language (the source language) into another language (the target language). The name "compiler" is primarily used for programs that translate source code from a high-level programming language to a lower level language (e.g., assembly language, object code, or machine code) to create an executable program.
However, there are many different types of compilers. If the compiled program can run on a computer whose CPU or operating system is different from the one on which the compiler runs, the compiler is a cross-compiler. A bootstrap compiler is written in the language that it intends to compile. A program that translates from a low-level language to a higher level one is a decompiler. A program that translates between high-level languages is usually called a source-to-source compiler or transcompiler. A language rewriter is usually a program that translates the form of expressions without a change of language. The term compiler-compiler refers to tools used to create parsers that perform syntax analysis.
A compiler is likely to perform many or all of the following operations: preprocessing, lexical analysis, parsing, semantic analysis (syntax-directed translation), conversion of input programs to an intermediate representation, code optimization and code generation. Compilers implement these operations in phases that promote efficient design and correct transformations of source input to target output. Program faults caused by incorrect compiler behavior can be very difficult to track down and work around; therefore, compiler implementers invest significant effort to ensure compiler correctness.
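As a minimal illustration of the lexical-analysis phase mentioned above, the sketch below turns a string of arithmetic source text into a token stream; the token names and the tiny expression language are invented for the example and are not drawn from any particular compiler.

```python
import re

# A toy lexical analyzer: converts a character stream into tokens.
# Token names (NUM, OP, LPAREN, RPAREN) are invented for this example.
TOKEN_SPEC = [
    ("NUM",    r"\d+"),
    ("OP",     r"[+\-*/]"),
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Yield (kind, text) pairs; raise on characters no rule matches."""
    pos = 0
    while pos < len(source):
        m = MASTER.match(source, pos)
        if not m:
            raise SyntaxError(f"unexpected character {source[pos]!r} at {pos}")
        pos = m.end()
        if m.lastgroup != "SKIP":        # whitespace carries no meaning here
            yield m.lastgroup, m.group()

print(list(tokenize("12 + 3 * (4 - 1)")))
# [('NUM', '12'), ('OP', '+'), ('NUM', '3'), ('OP', '*'), ('LPAREN', '('), ...]
```

A real front end would feed this token stream to a parser, which builds the intermediate representation that the later optimization and code-generation phases operate on.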
Compilers are not the only language processors used to transform source programs. An interpreter is computer software that transforms and then directly executes the indicated operations. The translation process influences the design of computer languages, which leads to a preference for compilation or interpretation. In practice, an interpreter can be implemented for compiled languages and compilers can be implemented for interpreted languages.
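CPython is a convenient example of this interplay: source code is first compiled to bytecode, which an interpreter loop then executes. The short snippet below uses Python's built-in compile() and eval() functions to make the two stages visible.

```python
import dis

# Compilation stage: translate a source string into a bytecode object.
code = compile("1 + 2 * 3", "<expr>", "eval")

# Interpretation stage: execute the compiled bytecode.
print(eval(code))  # 7

# The compiled form is inspectable; dis prints the stack-machine bytecode.
dis.dis(code)
```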
Theoretical computing concepts developed by scientists, mathematicians, and engineers formed the basis of modern digital computing during World War II. Primitive binary languages evolved because digital devices only understand ones and zeros and the circuit patterns in the underlying machine architecture. In the late 1940s, assembly languages were created to offer a more workable abstraction of the computer architectures. The limited memory capacity of early computers led to substantial technical challenges when the first compilers were designed; therefore, the compilation process needed to be divided into several small programs. The front end programs produce the analysis products used by the back end programs to generate target code. As computer technology provided more resources, compiler designs could align better with the compilation process.
It is usually more productive for a programmer to use a high-level language, so the development of high-level languages followed naturally from the capabilities offered by digital computers. High-level languages are formal languages that are strictly defined by their syntax and semantics, which form the high-level language architecture. Elements of these formal languages include an alphabet of symbols, a grammar defining which sequences of symbols form valid sentences, and semantics assigning meaning to those sentences.
The sentences in a language may be defined by a set of rules called a grammar.
Backus–Naur form (BNF) describes the syntax of "sentences" of a language and was used for the syntax of Algol 60 by John Backus. The ideas derive from the context-free grammar concepts by Noam Chomsky, a linguist. "BNF and its extensions have become standard tools for describing the syntax of programming notations, and in many cases parts of compilers are generated automatically from a BNF description."
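To suggest how a BNF description maps onto compiler code, here is a sketch of a hand-written recursive-descent parser in which each nonterminal of a small, made-up expression grammar becomes one function; parser generators automate exactly this kind of translation from a grammar.

```python
# Grammar for this sketch, in BNF with the common {...} repetition extension
# (the grammar itself is invented for illustration):
#   <expr>   ::= <term>   { ("+" | "-") <term> }
#   <term>   ::= <factor> { ("*" | "/") <factor> }
#   <factor> ::= NUMBER | "(" <expr> ")"

def parse(tokens):
    """Recursive-descent parser: one function per nonterminal.
    `tokens` is a list of strings; returns a nested-tuple parse tree."""
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def expr():
        nonlocal pos
        node = term()
        while peek() in ("+", "-"):
            op = tokens[pos]
            pos += 1
            node = (op, node, term())
        return node

    def term():
        nonlocal pos
        node = factor()
        while peek() in ("*", "/"):
            op = tokens[pos]
            pos += 1
            node = (op, node, factor())
        return node

    def factor():
        nonlocal pos
        tok = peek()
        if tok is None:
            raise SyntaxError("unexpected end of input")
        if tok == "(":
            pos += 1                      # consume "("
            node = expr()
            if peek() != ")":
                raise SyntaxError("missing closing parenthesis")
            pos += 1                      # consume ")"
            return node
        pos += 1
        return int(tok)                   # NUMBER

    tree = expr()
    if pos != len(tokens):
        raise SyntaxError("unexpected trailing input")
    return tree

# Tokens are pre-split here for simplicity; a real front end would use a lexer.
print(parse("12 + 3 * ( 4 - 1 )".split()))
# ('+', 12, ('*', 3, ('-', 4, 1)))
```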
In the 1940s, Konrad Zuse designed an algorithmic programming language called Plankalkül ("Plan Calculus"). While no actual implementation occurred until the 1970s, it presented concepts later seen in APL designed by Ken Iverson in the late 1950s. APL is a language for mathematical computations.
High-level language design during the formative years of digital computing provided useful programming tools for a variety of applications: FORTRAN (Formula Translation) for science and engineering, COBOL (Common Business-Oriented Language) for business use, and LISP (List Processor) for symbolic computation.
Compiler technology evolved from the need for a strictly defined transformation of the high-level source program into a low-level target program for the digital computer. The compiler could be viewed as a front end to deal with the analysis of the source code and a back end to synthesize the analysis into the target code. Optimization between the front end and back end could produce more efficient target code.
Some early milestones in the development of compiler technology include Grace Hopper's A-0 system of 1952, often cited as the first compiler, although the A-0 compiler functioned more as a loader or linker than the modern notion of a full compiler.
Early operating systems and software were written in assembly language. In the 1960s and early 1970s, the use of high-level languages for system programming was still controversial due to resource limitations. However, several research and industry efforts began the shift toward high-level systems programming languages, for example, BCPL, BLISS, B, and C.
BCPL (Basic Combined Programming Language), designed in 1966 by Martin Richards at the University of Cambridge, was originally developed as a compiler writing tool. Several compilers have been implemented; Richards' book provides insights into the language and its compiler. BCPL was not only an influential systems programming language that is still used in research but also provided a basis for the design of the B and C languages.
BLISS (Basic Language for Implementation of System Software) was developed for a Digital Equipment Corporation (DEC) PDP-10 computer by W.A. Wulf's Carnegie Mellon University (CMU) research team. The CMU team went on to develop the BLISS-11 compiler one year later in 1970.
Multics (Multiplexed Information and Computing Service), a time-sharing operating system project, involved MIT, Bell Labs, and General Electric (later Honeywell), and was led by Fernando Corbató of MIT. Multics was written in the PL/I language developed by IBM and the IBM User Group. IBM's goal was to satisfy business, scientific, and systems programming requirements. There were other languages that could have been considered, but PL/I offered the most complete solution even though it had not been implemented. For the first few years of the Multics project, a subset of the language could be compiled to assembly language with the Early PL/I (EPL) compiler by Doug McIlroy and Bob Morris from Bell Labs. EPL supported the project until a boot-strapping compiler for the full PL/I could be developed.
Bell Labs left the Multics project in 1969: "Over time, hope was replaced by frustration as the group effort initially failed to produce an economically useful system." Continued participation would drive up project support costs. So researchers turned to other development efforts. A system programming language B based on BCPL concepts was written by Dennis Ritchie and Ken Thompson. Ritchie created a boot-strapping compiler for B and wrote the Unics (Uniplexed Information and Computing Service) operating system for a PDP-7 in B. The name Unics was eventually respelled Unix.
Bell Labs started development and expansion of C based on B and BCPL. The BCPL compiler had been transported to Multics by Bell Labs, and BCPL was a preferred language at Bell Labs. Initially, a front-end program to Bell Labs' B compiler was used while a C compiler was developed. In 1971, a new PDP-11 provided the resources to define extensions to B and rewrite the compiler. By 1973 the design of the C language was essentially complete, and the Unix kernel for a PDP-11 was rewritten in C. Steve Johnson started development of the Portable C Compiler (PCC) to support retargeting of C compilers to new machines.
Object-oriented programming (OOP) offered some interesting possibilities for application development and maintenance. OOP concepts go back further, having appeared earlier in the LISP and Simula language traditions. At Bell Labs, interest in OOP motivated the development of C++. C++ was first used in 1980 for systems programming. The initial design leveraged C language systems programming capabilities with Simula concepts. Object-oriented facilities were added in 1983. The Cfront program implemented a C++ front end for the C84 language compiler. In subsequent years several C++ compilers were developed as C++ popularity grew.
In many application domains, the idea of using a higher-level language quickly caught on. Because of the expanding functionality supported by newer programming languages and the increasing complexity of computer architectures, compilers became more complex.
DARPA (Defense Advanced Research Projects Agency) sponsored a compiler project with Wulf's CMU research team in 1970. The Production Quality Compiler-Compiler (PQCC) design would produce a Production Quality Compiler (PQC) from formal definitions of the source language and the target. PQCC tried to extend the term compiler-compiler beyond the traditional meaning as a parser generator (e.g., Yacc) without much success. PQCC might more properly be referred to as a compiler generator.
PQCC research into the code generation process sought to build a truly automatic compiler-writing system. The effort discovered and designed the phase structure of the PQC. The BLISS-11 compiler provided the initial structure. The phases included analysis (front end), intermediate translation to a virtual machine (middle end), and translation to the target (back end). TCOL was developed for the PQCC research to handle language-specific constructs in the intermediate representation. Variations of TCOL supported various languages. The PQCC project investigated techniques of automated compiler construction. The design concepts proved useful in optimizing compilers and compilers for the object-oriented programming language Ada.
The Ada Stoneman Document formalized the program support environment (APSE) along with the kernel (KAPSE) and minimal (MAPSE). The Ada interpreter NYU/ED supported development and standardization efforts with the American National Standards Institute (ANSI) and the International Organization for Standardization (ISO). Initial Ada compiler development by the U.S. Military Services included the compilers in a complete integrated design environment along the lines of the Stoneman Document. The Army and Navy worked on the Ada Language System (ALS) project targeted to DEC/VAX architecture, while the Air Force started on the Ada Integrated Environment (AIE) targeted to the IBM 370 series. While the projects did not provide the desired results, they did contribute to the overall effort on Ada development.
Other Ada compiler efforts got underway in Britain at the University of York and in Germany at the University of Karlsruhe. In the U.S., Verdix (later acquired by Rational) delivered the Verdix Ada Development System (VADS) to the Army. VADS provided a set of development tools including a compiler. Unix/VADS could be hosted on a variety of Unix platforms such as DEC Ultrix and the Sun 3/60 Solaris targeted to Motorola 68020 in an Army CECOM evaluation. There were soon many Ada compilers available that passed the Ada Validation tests. The Free Software Foundation GNU project developed the GNU Compiler Collection (GCC), which provides a core capability to support multiple languages and targets. The Ada version GNAT is one of the most widely used Ada compilers. GNAT is free, but there is also commercial support; for example, AdaCore was founded in 1994 to provide commercial software solutions for Ada. GNAT Pro includes the GNU GCC based GNAT with a tool suite to provide an integrated development environment.
High-level languages continued to drive compiler research and development. Focus areas included optimization and automatic code generation. Trends in programming languages and development environments influenced compiler technology. More compilers became included in language distributions (Perl, Java Development Kit) and as a component of an IDE (VADS, Eclipse, Ada Pro). The interrelationship and interdependence of technologies grew. The advent of web services promoted growth of web languages and scripting languages. Scripts trace back to the early days of Command Line Interfaces (CLI), where the user could enter commands to be executed by the system. User Shell concepts developed with languages to write shell programs. Early Windows designs offered a simple batch programming capability. The conventional transformation of these languages used an interpreter. While not widely used, Bash and Batch compilers have been written. More recently, sophisticated interpreted languages became part of the developer's toolkit. Modern scripting languages include PHP, Python, Ruby and Lua. (Lua is widely used in game development.) All of these have interpreter and compiler support.
"When the field of compiling began in the late 50s, its focus was limited to the translation of high-level language programs into machine code ... The compiler field is increasingly intertwined with other disciplines including computer architecture, programming languages, formal methods, software engineering, and computer security." The "Compiler Research: The Next 50 Years" article noted the importance of object-oriented languages and Java. Security and parallel computing were cited among the future research targets.
A compiler implements a formal transformation from a high-level source program to a low-level target program. Compiler design can define an end to end solution or tackle a defined subset that interfaces with other compilation tools e.g. preprocessors, assemblers, linkers. Design requirements include rigorously defined interfaces both internally between compiler components and externally between supporting toolsets.
In the early days, the approach taken to compiler design was directly affected by the complexity of the computer language to be processed, the experience of the person(s) designing it, and the resources available. Resource limitations led to the need to pass through the source code more than once.
A compiler for a relatively simple language written by one person might be a single, monolithic piece of software. However, as the source language grows in complexity the design may be split into a number of interdependent phases. Separate phases provide design improvements that focus development on the functions in the compilation process.
Classifying compilers by number of passes has its background in the hardware resource limitations of computers. Compiling involves performing much work and early computers did not have enough memory to contain one program that did all of this work. So compilers were split up into smaller programs which each made a pass over the source (or some representation of it) performing some of the required analysis and translations.
The ability to compile in a single pass has classically been seen as a benefit because it simplifies the job of writing a compiler and one-pass compilers generally perform compilations faster than multi-pass compilers. Thus, partly driven by the resource limitations of early systems, many early languages were specifically designed so that they could be compiled in a single pass (e.g., Pascal).
In some cases the design of a language feature may require a compiler to perform more than one pass over the source. For instance, consider a declaration appearing on line 20 of the source which affects the translation of a statement appearing on line 10. In this case, the first pass needs to gather information about declarations appearing after statements that they affect, with the actual translation happening during a subsequent pass.
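A minimal sketch of that situation, using an invented two-instruction toy language ("decl" and "use") purely for illustration: a use may precede its declaration, so the first pass only collects declarations and the second pass performs the translation.

```python
# Toy program: the first line uses a name that is only declared on the second.
program = [
    "use x",        # needs information that appears later
    "decl x int",   # the declaration that "use x" depends on
]

# Pass 1: gather every declaration into a symbol table.
symbols = {}
for line in program:
    parts = line.split()
    if parts[0] == "decl":
        symbols[parts[1]] = parts[2]

# Pass 2: translate the uses, now that all declarations are known.
for line in program:
    parts = line.split()
    if parts[0] == "use":
        print(f"{parts[1]} has type {symbols[parts[1]]}")  # -> x has type int
```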
The disadvantage of compiling in a single pass is that it is not possible to perform many of the sophisticated optimizations needed to generate high quality code. It can be difficult to count exactly how many passes an optimizing compiler makes. For instance, different phases of optimization may analyse one expression many times but only analyse another expression once.
Splitting a compiler up into small programs is a technique used by researchers interested in producing provably correct compilers. Proving the correctness of a set of small programs often requires less effort than proving the correctness of a larger, single, equivalent program.
Regardless of the exact number of phases in the compiler design, the phases can be assigned to one of three stages. The stages include a front end, a middle end, and a back end.
This front/middle/back-end approach makes it possible to combine front ends for different languages with back ends for different CPUs while sharing the optimizations of the middle end. Practical examples of this approach are the GNU Compiler Collection, Clang (LLVM-based C/C++ compiler), and the Amsterdam Compiler Kit, which have multiple front-ends, shared optimizations and multiple back-ends.
The front end analyzes the source code to build an internal representation of the program, called the intermediate representation (IR). It also manages the symbol table, a data structure mapping each symbol in the source code to associated information such as location, type and scope.
While the frontend can be a single monolithic function or program, as in a scannerless parser, it is more commonly implemented and analyzed as several phases, which may execute sequentially or concurrently. This method is favored due to its modularity and separation of concerns. Most commonly today, the frontend is broken into three phases: lexical analysis (also known as lexing or scanning), syntax analysis (also known as parsing), and semantic analysis. Lexing and parsing comprise the syntactic analysis (word syntax and phrase syntax, respectively), and in simple cases these modules (the lexer and parser) can be automatically generated from a grammar for the language, though in more complex cases these require manual modification. The lexical grammar and phrase grammar are usually context-free grammars, which simplifies analysis significantly, with context-sensitivity handled at the semantic analysis phase. The semantic analysis phase is generally more complex and written by hand, but can be partially or fully automated using attribute grammars. These phases themselves can be further broken down: lexing as scanning and evaluating, and parsing as building a concrete syntax tree (CST, parse tree) and then transforming it into an abstract syntax tree (AST, syntax tree). In some cases additional phases are used, notably "line reconstruction" and "preprocessing," but these are rare.
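As a concrete illustration of the lexing phase, the sketch below turns a flat string into a stream of classified tokens using a regular-expression table. The token names and rules are invented for this example; a production lexer would also track source locations and report errors.

```python
import re

# Invented token rules for a tiny expression language.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),        # whitespace is recognized but discarded
]

def lex(source: str):
    pattern = "|".join(f"(?P<{name}>{rx})" for name, rx in TOKEN_SPEC)
    for match in re.finditer(pattern, source):
        if match.lastgroup != "SKIP":
            yield match.lastgroup, match.group()

print(list(lex("total = price * 2")))
# [('IDENT', 'total'), ('OP', '='), ('IDENT', 'price'), ('OP', '*'), ('NUMBER', '2')]
```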
The main phases of the front end include the following:
The middle end, also known as "optimizer," performs optimizations on the intermediate representation in order to improve the performance and the quality of the produced machine code. The middle end contains those optimizations that are independent of the CPU architecture being targeted.
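One classic machine-independent optimization of this kind is constant folding: any subexpression whose operands are already known at compile time is evaluated then, rather than at run time. The sketch below works on an invented toy AST of nested tuples and handles only "+" and "*", purely for illustration.

```python
def fold(node):
    """Constant-fold a toy AST of the form (op, left, right) or a leaf."""
    if not isinstance(node, tuple):
        return node                        # leaf: an int constant or a variable name
    op, left, right = node
    left, right = fold(left), fold(right)  # fold the children first
    if isinstance(left, int) and isinstance(right, int):
        return left + right if op == "+" else left * right
    return (op, left, right)               # cannot fold: keep the operation

print(fold(("+", ("*", 2, 3), "x")))  # -> ('+', 6, 'x'): 2*3 folded to 6
```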
The main phases of the middle end include the following:
Compiler analysis is the prerequisite for any compiler optimization, and the two work tightly together. For example, dependence analysis is crucial for loop transformation.
The scope of compiler analyses and optimizations varies greatly; their scope may range from operating within a basic block, to whole procedures, or even the whole program. There is a trade-off between the granularity of the optimizations and the cost of compilation. For example, peephole optimizations are fast to perform during compilation but only affect a small local fragment of the code, and can be performed independently of the context in which the code fragment appears. In contrast, interprocedural optimization requires more compilation time and memory space, but enables optimizations that are only possible by considering the behavior of multiple functions simultaneously.
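To make the contrast concrete, here is a minimal peephole pass over an invented toy instruction list: it only ever inspects two adjacent instructions, which is exactly why such optimizations are cheap and independent of the surrounding context.

```python
def peephole(instructions):
    """Delete adjacent push/pop pairs on the same register (a no-op pair)."""
    out = []
    for ins in instructions:
        if out and ins[0] == "pop" and out[-1] == ("push", ins[1]):
            out.pop()                      # "push r" followed by "pop r" cancels
        else:
            out.append(ins)
    return out

code = [("load", "r1"), ("push", "r1"), ("pop", "r1"), ("add", "r1")]
print(peephole(code))  # -> [('load', 'r1'), ('add', 'r1')]
```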
Interprocedural analysis and optimizations are common in modern commercial compilers from HP, IBM, SGI, Intel, Microsoft, and Sun Microsystems. The free software GCC was criticized for a long time for lacking powerful interprocedural optimizations, but it is changing in this respect. Another open source compiler with full analysis and optimization infrastructure is Open64, which is used by many organizations for research and commercial purposes.
Due to the extra time and space needed for compiler analysis and optimizations, some compilers skip them by default. Users have to use compilation options to explicitly tell the compiler which optimizations should be enabled.
The back end is responsible for the CPU architecture-specific optimizations and for code generation.
The main phases of the back end include the following:
Compiler correctness is the branch of software engineering that deals with trying to show that a compiler behaves according to its language specification. Techniques include developing the compiler using formal methods and using rigorous testing (often called compiler validation) on an existing compiler.
Higher-level programming languages usually appear with a type of translation in mind: either designed as compiled language or interpreted language. However, in practice there is rarely anything about a language that "requires" it to be exclusively compiled or exclusively interpreted, although it is possible to design languages that rely on re-interpretation at run time. The categorization usually reflects the most popular or widespread implementations of a language — for instance, BASIC is sometimes called an interpreted language, and C a compiled one, despite the existence of BASIC compilers and C interpreters.
Interpretation does not replace compilation completely. It only hides it from the user and makes it gradual. Even though an interpreter can itself be interpreted, a directly executed program is needed somewhere at the bottom of the stack (see machine language).
Further, compilers can contain interpreters for optimization reasons. For example, where an expression can be executed during compilation and the results inserted into the output program, this prevents it from having to be recalculated each time the program runs, which can greatly speed up the final program. Modern trends toward just-in-time compilation and bytecode interpretation at times blur the traditional categorizations of compilers and interpreters even further.
Some language specifications spell out that implementations "must" include a compilation facility; for example, Common Lisp. However, there is nothing inherent in the definition of Common Lisp that stops it from being interpreted. Other languages have features that are very easy to implement in an interpreter, but make writing a compiler much harder; for example, APL, SNOBOL4, and many scripting languages allow programs to construct arbitrary source code at runtime with regular string operations, and then execute that code by passing it to a special evaluation function. To implement these features in a compiled language, programs must usually be shipped with a runtime library that includes a version of the compiler itself.
One classification of compilers is by the platform on which their generated code executes. This is known as the "target platform."
A "native" or "hosted" compiler is one whose output is intended to directly run on the same type of computer and operating system that the compiler itself runs on. The output of a cross compiler is designed to run on a different platform. Cross compilers are often used when developing software for embedded systems that are not intended to support a software development environment.
The output of a compiler that produces code for a virtual machine (VM) may or may not be executed on the same platform as the compiler that produced it. For this reason such compilers are not usually classified as native or cross compilers.
The lower level language that is the target of a compiler may itself be a high-level programming language. C, viewed by some as a sort of portable assembly language, is frequently the target language of such compilers. For example, Cfront, the original compiler for C++, used C as its target language. The C code generated by such a compiler is usually not intended to be readable and maintained by humans, so indent style and creating pretty C intermediate code are ignored. Some of the features of C that make it a good target language include the "#line" directive, which can be generated by the compiler to support debugging of the original source, and the wide platform support available with C compilers.
While a common compiler type outputs machine code, there are many other types:
Castrato
A castrato (Italian, plural: "castrati") is a type of classical male singing voice equivalent to that of a soprano, mezzo-soprano, or contralto. The voice is produced by castration of the singer before puberty, or it occurs in one who, due to an endocrinological condition, never reaches sexual maturity.
Castration before puberty (or in its early stages) prevents a boy's larynx from being transformed by the normal physiological events of puberty. As a result, the vocal range of prepubescence (shared by both sexes) is largely retained, and the voice develops into adulthood in a unique way. Prepubescent castration for this purpose diminished greatly in the late 18th century and was made illegal in the Papal States, the last to prohibit it, in 1870.
As the castrato's body grew, his lack of testosterone meant that his epiphyses (bone-joints) did not harden in the normal manner. Thus the limbs of the castrati often grew unusually long, as did their ribs. This, combined with intensive training, gave them unrivalled lung-power and breath capacity. Operating through small, child-sized vocal cords, their voices were also extraordinarily flexible, and quite different from the equivalent adult female voice. Their vocal range was higher than that of the uncastrated adult male. Listening to the only surviving recordings of a castrato (see below), one can hear that the lower part of the voice sounds like a "super-high" tenor, with a more falsetto-like upper register above that.
Castrati were rarely referred to as such: in the 18th century, the euphemism "musico" (pl "musici") was much more generally used, although it usually carried derogatory implications; another synonym was "evirato," literally meaning "emasculated". Eunuch is a more general term since, historically, many eunuchs were castrated after puberty and thus the castration had no impact on their voices.
Castration as a means of subjugation, enslavement or other punishment has a very long history, dating back to ancient Sumer. In a Western context, eunuch singers are known to have existed from the early Byzantine Empire. In Constantinople around 400 AD, the empress Aelia Eudoxia had a eunuch choir-master, Brison, who may have established the use of castrati in Byzantine choirs, though whether Brison himself was a singer and whether he had colleagues who were eunuch singers is not certain. By the 9th century, eunuch singers were well-known (not least in the choir of Hagia Sophia) and remained so until the sack of Constantinople by the Western forces of the Fourth Crusade in 1204. Their fate from then until their reappearance in Italy more than three hundred years later is not clear. It seems likely that the Spanish tradition of soprano falsettists may have hidden castrati. Much of Spain was under Muslim rulers during the Middle Ages, and castration had a history going back to the ancient Near East. Stereotypically, eunuchs served as harem guards, but they were also valued as high-level political appointees since they could not start a dynasty which would threaten the ruler.
Castrati first appeared in Italy in the mid-16th century, though at first the terms describing them were not always clear. The phrase "soprano maschio" (male soprano), which could also mean falsettist, occurs in the "Due Dialoghi della Musica" (Two dialogues upon music) of Luigi Dentice, an Oratorian priest, published in Rome in 1553. On 9 November 1555 Cardinal Ippolito II d'Este (famed as the builder of the Villa d'Este at Tivoli) wrote to Guglielmo Gonzaga, Duke of Mantua (1538–1587), that he had heard that the Duke was interested in his "cantoretti" (little singers) and offered to send him two, so that he could choose one for his own service. This is a rare term but probably does equate to "castrato". The Cardinal's nephew, Alfonso II d'Este, Duke of Ferrara, was another early enthusiast, enquiring about castrati in 1556. There were certainly castrati in the Sistine Chapel choir in 1558, although not described as such: on 27 April of that year, Hernando Bustamante, a Spaniard from Palencia, was admitted (the first castrati so termed who joined the Sistine choir were Pietro Paolo Folignato and Girolamo Rossini, admitted in 1599). Surprisingly, considering the later French distaste for castrati, they certainly existed in France at this time also, being known of in Paris, Orléans, Picardy and Normandy, though they were not abundant: the King of France himself had difficulty in obtaining them. By 1574, there were castrati in the Ducal court chapel at Munich, where the Kapellmeister (music director) was the famous Orlando di Lasso. In 1589, by the bull "Cum pro nostro pastorali munere", Pope Sixtus V re-organised the choir of St Peter's, Rome specifically to include castrati. Thus the castrati came to supplant both boys (whose voices broke after only a few years) and falsettists (whose voices were weaker and less reliable) from the top line in such choirs. Women were banned by the Pauline dictum "mulieres in ecclesiis taceant" ("let women keep silent in the churches"; see I Corinthians, ch. 14, v. 34).
Although the castrato (or musico) predates opera, there is some evidence that castrati had parts in the earliest operas. In the first performance of Monteverdi's "Orfeo" (1607), for example, they played subsidiary roles, including Speranza and (possibly) that of Euridice. Although female roles were performed by castrati in some of the papal states, this was increasingly rare; by 1680, they had supplanted "normal" male voices in lead roles, and retained their position as "primo uomo" for about a hundred years; an Italian opera not featuring at least one renowned castrato in a lead part would be doomed to fail. Because of the popularity of Italian opera throughout 18th-century Europe (except France), singers such as Ferri, Farinelli, Senesino and Pacchierotti became the first operatic superstars, earning enormous fees and hysterical public adulation. The strictly hierarchical organisation of "opera seria" favoured their high voices as symbols of heroic virtue, though they were frequently mocked for their strange appearance and bad acting. In his 1755 "Reflections upon theatrical expression in tragedy", Roger Pickering wrote:
Farinelli drew every Body to the Haymarket. What a Pipe! What Modulation! What Extasy to the Ear! But, Heavens! What Clumsiness! What Stupidity! What Offence to the Eye! Reader, if of the City, thou mayest probably have seen in the Fields of Islington or Mile-End or, If thou art in the environs of St James', thou must have observed in the Park with what Ease and Agility a cow, heavy with calf, has rose up at the command of the Milk-woman's foot: thus from the mossy bank sprang the DIVINE FARINELLI.
The means by which future singers were prepared could lead to premature death. To prevent the child from experiencing the intense pain of castration, many were inadvertently administered lethal doses of opium or some other narcotic, or were killed by overlong compression of the carotid artery in the neck (intended to render them unconscious during the castration procedure).
During the 18th century itself, the music historian Charles Burney was sent from pillar to post in search of places where "the operation" was carried out:
I enquired throughout Italy at what place boys were chiefly qualified for singing by castration, but could get no certain intelligence. I was told at Milan that it was at Venice; at Venice that it was at Bologna; but at Bologna the fact was denied, and I was referred to Florence; from Florence to Rome, and from Rome I was sent to Naples ... it is said that there are shops in Naples with this inscription: 'QUI SI CASTRANO RAGAZZI' ("Here boys are castrated"); but I was utterly unable to see or hear of any such shops during my residence in that city.
The training of the boys was rigorous. The regimen of one singing school in Rome (c. 1700) consisted of one hour of singing difficult and awkward pieces, one hour practising trills, one hour practising ornamented passaggi, one hour of singing exercises in their teacher's presence and in front of a mirror so as to avoid unnecessary movement of the body or facial grimaces, and one hour of literary study; all this, moreover, before lunch. After, half an hour would be devoted to musical theory, another to writing counterpoint, an hour copying down the same from dictation, and another hour of literary study. During the remainder of the day, the young castrati had to find time to practice their harpsichord playing, and to compose vocal music, either sacred or secular depending on their inclination. This demanding schedule meant that, if sufficiently talented, they were able to make a debut in their mid-teens with a perfect technique and a voice of a flexibility and power no woman or ordinary male singer could match.
In the 1720s and 1730s, at the height of the craze for these voices, it has been estimated that upwards of 4,000 boys were castrated annually in the service of art. Many came from poor homes and were castrated by their parents in the hope that their child might be successful and lift them from poverty (this was the case with Senesino). There are, though, records of some young boys asking to be operated on to preserve their voices (e.g. Caffarelli, who was from a wealthy family: his grandmother gave him the income from two vineyards to pay for his studies). Caffarelli was also typical of many castrati in being famous for tantrums on and off-stage, and for amorous adventures with noble ladies. Some, as described by Casanova, preferred gentlemen (noble or otherwise). Only a small percentage of boys castrated to preserve their voices had successful careers on the operatic stage; the better "also-rans" sang in cathedral or church choirs, but because of their marked appearance and the ban on their marrying, there was little room for them in society outside a musical context.
The castrati came in for a great amount of scurrilous and unkind abuse, and as their fame increased, so did the hatred of them. They were often castigated as malign creatures who lured men into homosexuality. There were homosexual castrati, as Casanova's accounts of 18th-century Italy bear witness. He mentions meeting an abbé whom he took for a girl in disguise, only later discovering that "she" was a famous castrato. In Rome in 1762 he attended a performance at which the prima donna was a castrato, "the favourite pathic" of Cardinal Borghese, who dined every evening with his protector. From his behaviour on stage "it was obvious that he hoped to inspire the love of those who liked him as a man, and probably would not have done so as a woman".
By the late 18th century, changes in operatic taste and social attitudes spelled the end for castrati. They lingered on past the end of the "ancien régime" (which their style of opera parallels), and two of their number, Pacchierotti and Crescentini, even entranced the iconoclastic Napoleon. The last great operatic castrato was Giovanni Battista Velluti (1781–1861), who performed the last operatic castrato role ever written: Armando in "Il crociato in Egitto" by Meyerbeer (Venice, 1824). Soon after this they were replaced definitively as the first men of the operatic stage by a new breed of heroic tenor, as first incarnated by the Frenchman Gilbert-Louis Duprez, the earliest so-called "king of the high Cs". His successors have included such singers as Enrico Tamberlik, Jean de Reszke, Francesco Tamagno, Enrico Caruso, Giovanni Martinelli, Beniamino Gigli, Jussi Björling, Franco Corelli and Luciano Pavarotti, among others.
After the unification of Italy in 1861, castration for musical purposes was officially made illegal (the new Italian state had adopted a French legal code which expressly forbade the practice). In 1878, Pope Leo XIII prohibited the hiring of new castrati by the church: only in the Sistine Chapel and in other papal basilicas in Rome did a few castrati linger. A group photo of the Sistine Choir taken in 1898 shows that by then only six remained (plus the "Direttore Perpetuo", the fine soprano castrato Domenico Mustafà), and in 1902 a ruling was extracted from Pope Leo that no further castrati should be admitted. The official end to the castrati came on St. Cecilia's Day, 22 November 1903, when the new pope, Pius X, issued his "motu proprio", "Tra le Sollecitudini" ('Amongst the Cares'), which contained this instruction: "Whenever ... it is desirable to employ the high voices of sopranos and contraltos, these parts must be taken by boys, according to the most ancient usage of the Church."
The last Sistine castrato to survive was Alessandro Moreschi, the only castrato to have made solo recordings. While an interesting historical record, these discs of his give us only a glimpse of the castrato voice – although he had been renowned as "The Angel of Rome" at the beginning of his career, some would say he was past his prime when the recordings were made in 1902 and 1904 and he never attempted to sing opera. The recording technology of the day was not of modern high quality. He retired officially in March 1913, and died in 1922.
The Catholic Church's involvement in the castrato phenomenon has long been controversial, and there have recently been calls for it to issue an official apology for its role. As early as 1748, Pope Benedict XIV tried to ban castrati from churches, but such was their popularity at the time that he realised that doing so might result in a drastic decline in church attendance.
The rumours of another castrato sequestered in the Vatican for the personal delectation of the Pontiff until as recently as 1959 have been proven false. The singer in question was a pupil of Moreschi's, Domenico Mancini, such a successful imitator of his teacher's voice that even Lorenzo Perosi, Direttore Perpetuo of the Sistine Choir from 1898 to 1956 and a strenuous opponent of the practice of castrato singers, thought he was a castrato. Mancini was in fact a moderately skilful falsettist and professional double bass player.
So-called "natural" or "endocrinological castrati" are born with hormonal anomalies, such as Klinefelter's syndrome and Kallmann's syndrome, or have undergone unusual physical or medical events during their early lives that reproduce the vocal effects of castration without being castrated. Basically, a male can retain his child voice if it never changes during puberty. The retained voice can be the treble voice shared by both sexes in childhood and is the same as the boy soprano voice. But as evidence shows, many castrati, such as Senesino and Caffarelli, were actually altos (mezzo-sopranos) – not sopranos.
Jimmy Scott, Robert Crowe and Radu Marian are examples of this type of high male voice. Michael Maniaci is somewhat different, in that he has no hormonal or other anomalies, but for some unknown reason, his voice did not "break" in the usual manner, leaving him still able to sing in the soprano register. Other uncastrated male adults sing soprano, generally using some form of falsetto but in a much higher range than most countertenors. Examples are Aris Christofellis, Jörg Waschinski, and Ghio Nannini.
However, it is believed the castrati possessed more of a tenorial chest register (the aria "Navigante che non spera" in Leonardo Vinci's opera "Il Medo", written for Farinelli, requires notes down to C3, 131 Hz). Similar low-voiced singing can be heard from the jazz vocalist Jimmy Scott, whose range matches approximately that used by female blues singers. High-pitched singer Jordan Smith has demonstrated having more of a tenorial chest register.
Actor Chris Colfer has a soprano voice. Colfer has stated in interviews that when his voice began to change at puberty he sang in a high voice "constantly" in an effort to retain his range. Actor and singer Alex Newell has a soprano range. Voice actor Walter Tetley may or may not have been a "castrato"; Bill Scott, a co-worker of Tetley's during their later work in television, once half-jokingly quipped that Tetley's mother "had him fixed" to protect the child star's voice-acting career. Tetley never personally divulged the exact reason for his condition, which left him with the voice of a preteen boy for his entire adult life.
Turkish popular singer Cem Adrian has the ability to sing from bass to soprano, his vocal folds having been reported to be three times the average length.
Counting-out game
A counting-out game or counting-out rhyme is a simple method of 'randomly' selecting a person from a group, often used by children for the purpose of playing another game. It usually requires no materials, and is achieved with spoken words or hand gestures. The historian Henry Carrington Bolton suggested in his 1888 book "Counting Out Rhymes of Children" that the custom of counting out originated in the "superstitious practice of divination by lot."
Many such methods involve one person pointing at each participant in a circle of players while reciting a rhyme. A new person is pointed at as each word is said. The player who is selected at the conclusion of the rhyme is "it" or "out". In an alternate version, the circle of players may each put two feet in and at the conclusion of the rhyme, that player removes one foot and the rhyme starts over with the next person. In this case, the first player that has both feet removed is "it" or "out". In theory a counting rhyme is determined entirely by the starting selection (and would result in a modulo operation), but in practice they are often accepted as random selections because the number of words has not been calculated beforehand, so the result is unknown until someone is selected.
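A minimal sketch of that modulo observation, with invented names: given a number of players in a circle and a rhyme of a fixed number of words, the outcome depends only on the starting position.

```python
def counted_out(players: int, words: int, start: int = 0) -> int:
    """Index of the player on whom the rhyme's last word lands."""
    return (start + words - 1) % players

# An 8-word rhyme over 5 players starting at player 0 always picks player 2.
print(counted_out(players=5, words=8))  # -> 2
```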
A variant of the counting-out game, known as the Josephus problem, represents a famous theoretical problem in mathematics and computer science.
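In the Josephus problem, "n" people stand in a circle and every "k"-th person is eliminated until one remains; the position of the survivor follows a well-known linear-time recurrence. The sketch below is one conventional formulation of it, shown here for illustration.

```python
def josephus(n: int, k: int) -> int:
    """0-indexed position of the survivor with n people, eliminating every k-th."""
    survivor = 0                       # base case: with 1 person, position 0 survives
    for m in range(2, n + 1):          # rebuild the circle from 2 up to n people
        survivor = (survivor + k) % m  # recurrence: J(m) = (J(m-1) + k) mod m
    return survivor

print(josephus(7, 3))  # -> 3, i.e. the 4th person in the circle survives
```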
Several simple games can be played to select one person from a group, either as a straightforward winner, or as someone who is eliminated. Rock, Paper, Scissors, Odd or Even and Blue Shoe require no materials and are played using hand gestures, although with the former it is possible for a player to win or lose through skill rather than luck. Coin flipping and drawing straws are fair methods of randomly determining a player. Fizz Buzz is a spoken word game where if a player slips up and speaks a word out of sequence, they are eliminated.
A scene in the Marx Brothers movie "Duck Soup" plays on the fact that counting-out games are not really random. Faced with selecting someone to go on a dangerous mission, the character Chicolini (Chico Marx) chants:
only to stop as he realizes he is about to select himself. He then says, "I did it wrong. Wait, wait, I start here", and repeats the chant—with the same result. After that, he says, "That's no good too. I got it!" and reduces the chant to
And with this version he finally manages to "randomly" select someone else.
Key size
In cryptography, key size or key length is the number of bits in a key used by a cryptographic algorithm (such as a cipher).
Key length defines the upper-bound on an algorithm's security (i.e. a logarithmic measure of the fastest known attack against an algorithm), since the security of all algorithms can be violated by brute-force attacks. Ideally, the lower-bound on an algorithm's security is by design equal to the key length (that is, the security is determined entirely by the key length, or in other words, the algorithm's design doesn't detract from the degree of security inherent in the key length). Indeed, most symmetric-key algorithms are designed to have security equal to their key length. However, after design, a new attack might be discovered. For instance, Triple DES was designed to have a 168-bit key, but an attack of complexity 2^112 is now known (i.e. Triple DES now only has 112 bits of security, and of the 168 bits in the key the attack has rendered 56 'ineffective' towards security). Nevertheless, as long as the security (understood as 'the amount of effort it would take to gain access') is sufficient for a particular application, then it doesn't matter if key length and security coincide. This is important for asymmetric-key algorithms, because no such algorithm is known to satisfy this property; elliptic curve cryptography comes the closest with an effective security of roughly half its key length.
Keys are used to control the operation of a cipher so that only the correct key can convert encrypted text (ciphertext) to plaintext. Many ciphers are actually based on publicly known algorithms or are open source and so it is only the difficulty of obtaining the key that determines security of the system, provided that there is no analytic attack (i.e. a "structural weakness" in the algorithms or protocols used), and assuming that the key is not otherwise available (such as via theft, extortion, or compromise of computer systems). The widely accepted notion that the security of the system should depend on the key alone has been explicitly formulated by Auguste Kerckhoffs (in the 1880s) and Claude Shannon (in the 1940s); the statements are known as Kerckhoffs' principle and Shannon's Maxim respectively.
A key should, therefore, be large enough that a brute-force attack (possible against any encryption algorithm) is infeasible – i.e. would take too long to execute. Shannon's work on information theory showed that to achieve so-called "perfect secrecy", the key length must be at least as large as the message and only used once (this algorithm is called the one-time pad). In light of this, and the practical difficulty of managing such long keys, modern cryptographic practice has discarded the notion of perfect secrecy as a requirement for encryption, and instead focuses on "computational security", under which the computational requirements of breaking an encrypted text must be infeasible for an attacker.
Encryption systems are often grouped into families. Common families include symmetric systems (e.g. AES) and asymmetric systems (e.g. RSA); they may alternatively be grouped according to the central algorithm used (e.g. elliptic curve cryptography).
As each of these is of a different level of cryptographic complexity, it is usual to have different key sizes for the same level of security, depending upon the algorithm used. For example, the security available with a 1024-bit key using asymmetric RSA is considered approximately equal in security to an 80-bit key in a symmetric algorithm.
The actual degree of security achieved over time varies, as more computational power and more powerful mathematical analytic methods become available. For this reason, cryptologists tend to look at indicators that an algorithm or key length shows signs of potential vulnerability, to move to longer key sizes or more difficult algorithms. For example, in 2007, a 1039-bit integer was factored with the special number field sieve using 400 computers over 11 months. The factored number was of a special form; the special number field sieve cannot be used on RSA keys. The computation is roughly equivalent to breaking a 700-bit RSA key. However, this might be an advance warning that 1024-bit RSA keys used in secure online commerce should be deprecated, since they may become breakable in the near future. Cryptography professor Arjen Lenstra observed that "Last time, it took nine years for us to generalize from a special to a nonspecial, hard-to-factor number" and when asked whether 1024-bit RSA keys are dead, said: "The answer to that question is an unqualified yes."
The 2015 Logjam attack revealed additional dangers in using Diffie-Helman key exchange when only one or a few common 1024-bit or smaller prime moduli are in use. This common practice allows large amounts of communications to be compromised at the expense of attacking a small number of primes.
Even if a symmetric cipher is currently unbreakable by exploiting structural weaknesses in its algorithm, it is possible to run through the entire space of keys in what is known as a "brute-force attack". Since longer symmetric keys require exponentially more work to brute force search, a sufficiently long symmetric key makes this line of attack impractical.
With a key of length "n" bits, there are 2^n possible keys. This number grows very rapidly as "n" increases. The large number of operations (2^128) required to try all possible 128-bit keys is widely considered out of reach for conventional digital computing techniques for the foreseeable future. However, experts anticipate alternative computing technologies that may have processing power superior to current computer technology. If a suitably sized quantum computer capable of running Grover's algorithm reliably becomes available, it would reduce a 128-bit key down to 64-bit security, roughly a DES equivalent. This is one of the reasons why AES supports a 256-bit key length. See the discussion on the relationship between key lengths and quantum computing attacks at the bottom of this page for more information.
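As a back-of-the-envelope illustration of this growth, and of the halving effect attributed to Grover's algorithm above, the snippet below simply prints the exponents involved for the key lengths discussed in this article.

```python
# Worst-case brute-force trial counts, classically and under Grover's algorithm.
for n in (40, 56, 80, 112, 128, 256):
    print(f"{n:>3}-bit key: ~2^{n} classical trials, ~2^{n // 2} with Grover")
# e.g. a 128-bit key needs ~2^128 classical trials but only ~2^64 with Grover,
# which is why 256-bit keys are suggested for 128-bit post-quantum security.
```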
US Government export policy has long restricted the "strength" of cryptography that can be sent out of the country. For many years the limit was 40 bits. Today, a key length of 40 bits offers little protection against even a casual attacker with a single PC. In response, by the year 2000, most of the major US restrictions on the use of strong encryption were relaxed. However, not all regulations have been removed, and encryption registration with the U.S. Bureau of Industry and Security is still required to export "mass market encryption commodities, software and components with encryption exceeding 64 bits".
IBM's Lucifer cipher was selected in 1974 as the base for what would become the Data Encryption Standard. Lucifer's key length was reduced from 128 bits to 56 bits, which the NSA and NIST argued was sufficient. The NSA has major computing resources and a large budget; some cryptographers including Whitfield Diffie and Martin Hellman complained that this made the cipher so weak that NSA computers would be able to break a DES key in a day through brute force parallel computing. The NSA disputed this, claiming that brute-forcing DES would take them something like 91 years. However, by the late 90s, it became clear that DES could be cracked in a few days' time-frame with custom-built hardware such as could be purchased by a large corporation or government. The book "Cracking DES" (O'Reilly and Associates) tells of the successful attempt in 1998 to break 56-bit DES by a brute-force attack mounted by a cyber civil rights group with limited resources; see EFF DES cracker. Even before that demonstration, 56 bits was considered insufficient length for symmetric algorithm keys; DES has been replaced in many applications by Triple DES, which has 112 bits of security when used with 168-bit keys (triple key). In 2002, Distributed.net and its volunteers broke a 64-bit RC5 key after several years' effort, using about seventy thousand (mostly home) computers.
The Advanced Encryption Standard published in 2001 uses key sizes of 128, 192 or 256 bits. Many observers consider 128 bits sufficient for the foreseeable future for symmetric algorithms of AES's quality until quantum computers become available. However, as of 2015, the U.S. National Security Agency has issued guidance that it plans to switch to quantum computing resistant algorithms and now requires 256-bit AES keys for data classified up to Top Secret.
In 2003, the U.S. National Institute of Standards and Technology (NIST) proposed phasing out 80-bit keys by 2015. As of 2005, 80-bit keys were allowed only until 2010.
Since 2015, NIST guidance says that "the use of keys that provide less than 112 bits of security strength for key agreement is now disallowed." NIST approved symmetric encryption algorithms include three-key Triple DES, and AES. Approvals for two-key Triple DES and Skipjack were withdrawn in 2015; the NSA's Skipjack algorithm used in its Fortezza program employs 80-bit keys.
The effectiveness of public key cryptosystems depends on the intractability (computational and theoretical) of certain mathematical problems such as integer factorization. These problems are time-consuming to solve, but usually faster than trying all possible keys by brute force. Thus, asymmetric algorithm keys must be longer for equivalent resistance to attack than symmetric algorithm keys. As of 2002, an asymmetric key length of 1024 bits was generally considered by cryptology experts to be the minimum necessary for the RSA encryption algorithm.
The Finite Field Diffie-Hellman algorithm has roughly the same key strength as RSA for the same key sizes. The work factor for breaking Diffie-Hellman is based on the discrete logarithm problem, which is related to the integer factorization problem on which RSA's strength is based. Thus, a 3072-bit Diffie-Hellman key has about the same strength as a 3072-bit RSA key.
One of the asymmetric algorithm types, elliptic-curve cryptography, or ECC, appears to be secure with shorter keys than other asymmetric key algorithms require. NIST guidelines state that ECC keys should be twice the length of equivalent strength symmetric key algorithms. So, for example, a 224-bit ECC key would have roughly the same strength as a 112-bit symmetric key. These estimates assume no major breakthroughs in solving the underlying mathematical problems that ECC is based on. A message encrypted with an elliptic key algorithm using a 109-bit long key has been broken by brute force.
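The comparable-strength pairings discussed in the preceding paragraphs are commonly tabulated, for example in NIST SP 800-57 Part 1. The sketch below records those widely cited figures as a simple lookup table; the dictionary and helper function are ours for illustration, and the numbers should be checked against the current revision of the standard.

```python
# Commonly cited comparable key strengths (after NIST SP 800-57 Part 1):
# symmetric bits -> (RSA/Diffie-Hellman modulus bits, ECC key bits)
COMPARABLE_STRENGTHS = {
    80:  (1024, 160),
    112: (2048, 224),
    128: (3072, 256),
    192: (7680, 384),
    256: (15360, 512),
}

def rsa_equivalent(symmetric_bits: int) -> int:
    """RSA modulus size with roughly the given symmetric security strength."""
    return COMPARABLE_STRENGTHS[symmetric_bits][0]

print(rsa_equivalent(128))  # -> 3072, matching the 3072-bit figure above
```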
The NSA previously specified that "Elliptic Curve Public Key Cryptography using the 256-bit prime modulus elliptic curve as specified in FIPS-186-2 and SHA-256 are appropriate for protecting classified information up to the SECRET level. Use of the 384-bit prime modulus elliptic curve and SHA-384 are necessary for the protection of TOP SECRET information." In 2015 the NSA announced that it plans to transition from elliptic-curve cryptography to new algorithms that are resistant to attack by future quantum computers. In the interim it recommends the larger 384-bit curve for all classified information.
The two best known quantum computing attacks are based on Shor's algorithm and Grover's algorithm. Of the two, Shor's offers the greater risk to current security systems.
Derivatives of Shor's algorithm are widely conjectured to be effective against all mainstream public-key algorithms including RSA, Diffie-Hellman and elliptic curve cryptography. According to Professor Gilles Brassard, an expert in quantum computing: "The time needed to factor an RSA integer is the same order as the time needed to use that same integer as modulus for a single RSA encryption. In other words, it takes no more time to break RSA on a quantum computer (up to a multiplicative constant) than to use it legitimately on a classical computer." The general consensus is that these public key algorithms are insecure at any key size if sufficiently large quantum computers capable of running Shor's algorithm become available. The implication of this attack is that all data encrypted using current standards based security systems such as the ubiquitous SSL used to protect e-commerce and Internet banking and SSH used to protect access to sensitive computing systems is at risk. Encrypted data protected using public-key algorithms can be archived and may be broken at a later time.
Mainstream symmetric ciphers (such as AES or Twofish) and collision resistant hash functions (such as SHA) are widely conjectured to offer greater security against known quantum computing attacks. They are widely thought most vulnerable to Grover's algorithm. Bennett, Bernstein, Brassard, and Vazirani proved in 1996 that a brute-force key search on a quantum computer cannot be faster than roughly 2^(n/2) invocations of the underlying cryptographic algorithm, compared with roughly 2^n in the classical case. Thus in the presence of large quantum computers an "n"-bit key can provide at least "n"/2 bits of security. Quantum brute force is easily defeated by doubling the key length, which has little extra computational cost in ordinary use. This implies that at least a 256-bit symmetric key is required to achieve a 128-bit security rating against a quantum computer. As mentioned above, the NSA announced in 2015 that it plans to transition to quantum-resistant algorithms.
According to NSA:
The NSA's Commercial National Security Algorithm Suite includes:
Cognitive behavioral therapy
Cognitive behavioral therapy (CBT) is a psycho-social intervention that aims to improve mental health. CBT focuses on challenging and changing unhelpful cognitive distortions (e.g. thoughts, beliefs, and attitudes) and behaviors, improving emotional regulation, and the development of personal coping strategies that target solving current problems. Originally, it was designed to treat depression, but its uses have been expanded to include treatment of a number of mental health conditions, including anxiety. CBT includes a number of cognitive or behavior psychotherapies that treat defined psychopathologies using evidence-based techniques and strategies.
CBT is based on the combination of the basic principles from behavioral and cognitive psychology. It is different from historical approaches to psychotherapy, such as the psychoanalytic approach where the therapist looks for the unconscious meaning behind the behaviors and then formulates a diagnosis. Instead, CBT is a "problem-focused" and "action-oriented" form of therapy, meaning it is used to treat specific problems related to a diagnosed mental disorder. The therapist's role is to assist the client in finding and practicing effective strategies to address the identified goals and decrease symptoms of the disorder. CBT is based on the belief that thought distortions and maladaptive behaviors play a role in the development and maintenance of psychological disorders, and that symptoms and associated distress can be reduced by teaching new information-processing skills and coping mechanisms.
When compared to psychoactive medications, review studies have found CBT alone to be as effective for treating less severe forms of depression, anxiety, post traumatic stress disorder (PTSD), tics, substance abuse, eating disorders and borderline personality disorder. Some research suggests that CBT is most effective when combined with medication for treating mental disorders such as Major Depressive Disorder. In addition, CBT is recommended as the first line of treatment for the majority of psychological disorders in children and adolescents, including aggression and conduct disorder. Researchers have found that other "bona fide" therapeutic interventions were equally effective for treating certain conditions in adults. Along with interpersonal psychotherapy (IPT), CBT is recommended in treatment guidelines as a psychosocial treatment of choice, and CBT and IPT are the only psychosocial interventions that psychiatry residents in the United States are mandated to be trained in.
Mainstream cognitive behavioral therapy assumes that changing maladaptive thinking leads to change in behavior and affect, but recent variants emphasize changes in one's relationship to maladaptive thinking rather than changes in thinking itself. The goal of cognitive behavioral therapy is not to diagnose a person with a particular disease, but to look at the person as a whole and decide what can be altered.
Therapists or computer-based programs use CBT techniques to help people challenge their patterns and beliefs and replace errors in thinking, known as cognitive distortions, such as "overgeneralizing, magnifying negatives, minimizing positives and catastrophizing" with "more realistic and effective thoughts, thus decreasing emotional distress and self-defeating behavior". Cognitive distortions can be either a pseudo-discrimination belief or an over-generalization of something. CBT techniques may also be used to help individuals take a more open, mindful, and aware posture toward cognitive distortions so as to diminish their impact.
Mainstream CBT helps individuals replace "maladaptive... coping skills, cognitions, emotions and behaviors with more adaptive ones", by challenging an individual's way of thinking and the way that they react to certain habits or behaviors, but there is still controversy about the degree to which these traditional cognitive elements account for the effects seen with CBT over and above the earlier behavioral elements such as exposure and skills training.
CBT can be seen as having six phases:
These steps are based on a system created by Kanfer and Saslow. After identifying the behaviors that need changing, whether they be in excess or deficit, and treatment has occurred, the psychologist must identify whether or not the intervention succeeded. For example, "If the goal was to decrease the behavior, then there should be a decrease relative to the baseline. If the critical behavior remains at or above the baseline, then the intervention has failed."
The steps in the assessment phase include:
The re-conceptualization phase makes up much of the "cognitive" portion of CBT. A summary of modern CBT approaches is given by Hofmann.
There are different protocols for delivering cognitive behavioral therapy, with important similarities among them. Use of the term "CBT" may refer to different interventions, including "self-instructions (e.g. distraction, imagery, motivational self-talk), relaxation and/or biofeedback, development of adaptive coping strategies (e.g. minimizing negative or self-defeating thoughts), changing maladaptive beliefs about pain, and goal setting". Treatment is sometimes manualized, with brief, direct, and time-limited treatments for individual psychological disorders that are driven by specific techniques. CBT is used in both individual and group settings, and the techniques are often adapted for self-help applications. Some clinicians and researchers are cognitively oriented (e.g. cognitive restructuring), while others are more behaviorally oriented (e.g. "in vivo" exposure therapy). Interventions such as imaginal exposure therapy combine both approaches.
CBT may be delivered in conjunction with a variety of diverse but related techniques such as exposure therapy, stress inoculation, cognitive processing therapy, cognitive therapy, relaxation training, dialectical behavior therapy, and acceptance and commitment therapy. Some practitioners promote a form of mindful cognitive therapy which includes a greater emphasis on self-awareness as part of the therapeutic process.
In adults, CBT has been shown to have effectiveness and a role in the treatment plans for anxiety disorders, body dysmorphic disorder, depression, eating disorders, chronic low back pain, personality disorders, psychosis, schizophrenia, substance use disorders, in the adjustment, depression, and anxiety associated with fibromyalgia, and with post-spinal cord injuries.
In children or adolescents, CBT is an effective part of treatment plans for anxiety disorders, body dysmorphic disorder, depression and suicidality, eating disorders and obesity, obsessive–compulsive disorder (OCD), and posttraumatic stress disorder, as well as tic disorders, trichotillomania, and other repetitive behavior disorders. CBT-SP, an adaptation of CBT for suicide prevention (SP), was specifically designed for treating youths who are severely depressed and who have attempted suicide within the past 90 days, and was found to be effective, feasible, and acceptable. CBT has also been shown to be effective for posttraumatic stress disorder in very young children (3 to 6 years of age). Reviews found "low quality" evidence that CBT may be more effective than other psychotherapies in reducing symptoms of posttraumatic stress disorder in children and adolescents. CBT has also been applied to a variety of childhood disorders, including depressive disorders and various anxiety disorders.
CBT combined with hypnosis and distraction reduces self-reported pain in children.
Cochrane reviews have found no evidence that CBT is effective for tinnitus, although there appears to be an effect on management of associated depression and quality of life in this condition. Other recent Cochrane reviews found no convincing evidence that CBT training helps foster care providers manage difficult behaviors in the youths under their care, nor was it helpful in treating people who abuse their intimate partners.
According to a 2004 review by INSERM of three methods, cognitive behavioral therapy was either "proven" or "presumed" to be an effective therapy for several specific mental disorders. According to the study, CBT was effective at treating schizophrenia, depression, bipolar disorder, panic disorder, post-traumatic stress disorder, anxiety disorders, bulimia, anorexia, personality disorders and alcohol dependency.
Some meta-analyses find CBT more effective than psychodynamic therapy and equal to other therapies in treating anxiety and depression.
Computerized CBT (CCBT) has been proven effective by randomized controlled and other trials in treating depression and anxiety disorders, including in children, as well as insomnia. Some research has found similar effectiveness to an intervention of informational websites and weekly telephone calls. CCBT was found to be as effective as face-to-face CBT for adolescent anxiety and insomnia.
Criticism of CBT sometimes focuses on implementations (such as the UK IAPT) which may initially result in low-quality therapy being offered by poorly trained practitioners. However, evidence supports the effectiveness of CBT for anxiety and depression. Acceptance and commitment therapy (ACT) is a specialist branch of CBT (sometimes referred to as contextual CBT). ACT uses mindfulness and acceptance interventions and has been found to produce longer-lasting therapeutic outcomes. In a study of anxiety, CBT and ACT improved similarly across all outcomes from pre- to post-treatment. However, during a 12-month follow-up, ACT proved more effective, suggesting that it is a highly viable, lasting treatment model for anxiety disorders.
Evidence suggests that the addition of hypnotherapy as an adjunct to CBT improves treatment efficacy for a variety of clinical issues.
CBT has been applied in both clinical and non-clinical environments to treat disorders such as personality disorders and behavioral problems. A systematic review of CBT in depression and anxiety disorders concluded that "CBT delivered in primary care, especially including computer- or Internet-based self-help programs, is potentially more effective than usual care and could be delivered effectively by primary care therapists."
Emerging evidence suggests a possible role for CBT in the treatment of attention deficit hyperactivity disorder (ADHD); hypochondriasis; coping with the impact of multiple sclerosis; sleep disturbances related to aging; dysmenorrhea; and bipolar disorder, but more study is needed and results should be interpreted with caution. CBT can have a therapeutic effect on easing symptoms of anxiety and depression in people with Alzheimer's disease. CBT has been studied as an aid in the treatment of anxiety associated with stuttering. Initial studies have shown CBT to be effective in reducing social anxiety in adults who stutter, but not in reducing stuttering frequency.
In people with metastatic breast cancer, data are limited, but CBT and other psychosocial interventions might help with psychological outcomes and pain management.
There is some evidence that CBT is superior in the long-term to benzodiazepines and the nonbenzodiazepines in the treatment and management of insomnia. CBT has been shown to be moderately effective for treating chronic fatigue syndrome.
In the United Kingdom, the National Institute for Health and Care Excellence (NICE) recommends CBT in the treatment plans for a number of mental health difficulties, including posttraumatic stress disorder, obsessive–compulsive disorder (OCD), bulimia nervosa, and clinical depression.
Cognitive behavioral therapy has been shown as an effective treatment for clinical depression. The American Psychiatric Association Practice Guidelines (April 2000) indicated that, among psychotherapeutic approaches, cognitive behavioral therapy and interpersonal psychotherapy had the best-documented efficacy for treatment of major depressive disorder. One etiological theory of depression is Aaron T. Beck's cognitive theory of depression. His theory states that depressed people think the way they do because their thinking is biased towards negative interpretations. According to this theory, depressed people acquire a negative schema of the world in childhood and adolescence as an effect of stressful life events, and the negative schema is activated later in life when the person encounters similar situations.
Beck also described a negative cognitive triad. The cognitive triad is made up of the depressed individual's negative evaluations of themselves, the world, and the future. Beck suggested that these negative evaluations derive from the negative schemata and cognitive biases of the person. According to this theory, depressed people have views such as "I never do a good job", "It is impossible to have a good day", and "things will never get better". A negative schema helps give rise to the cognitive bias, and the cognitive bias helps fuel the negative schema. Beck further proposed that depressed people often have the following cognitive biases: arbitrary inference, selective abstraction, over-generalization, magnification, and minimization. These cognitive biases lead depressed people to make quick, negative, generalized, and personal inferences about the self, thus fueling the negative schema.
A 2001 meta-analysis comparing CBT and psychodynamic psychotherapy suggested the approaches were equally effective in the short term.
In contrast, meta-analyses of larger trials of different psychotherapeutic treatments have found that CBT, interpersonal therapy, and problem-solving therapy for depression outperform psychodynamic psychotherapy and behavioral activation in terms of robustness of effects.
CBT has been shown to be effective in the treatment of adults with anxiety disorders.
A basic concept in some CBT treatments used in anxiety disorders is "in vivo" exposure. The term refers to the direct confrontation of feared objects, activities, or situations by a patient. For example, a woman with PTSD who fears the location where she was assaulted may be assisted by her therapist in going to that location and directly confronting those fears. Likewise, a person with social anxiety disorder who fears public speaking may be instructed to directly confront those fears by giving a speech. Exposure treatments rest on a "two-factor" model of fear, often credited to O. Hobart Mowrer, in which fear is acquired through classical conditioning and maintained through avoidance. Through exposure to the stimulus, this harmful conditioning can be "unlearned" (referred to as extinction and habituation). Studies of animals and humans have provided evidence that glucocorticoids may lead to more successful extinction learning during exposure therapy. For instance, glucocorticoids can prevent aversive learning episodes from being retrieved and can heighten the reinforcement of new memory traces, creating a non-fearful reaction in feared situations. A combination of glucocorticoids and exposure therapy may therefore improve treatment for patients with anxiety disorders.
A 2015 Cochrane review also found that CBT for symptomatic management of non-specific chest pain is probably effective in the short term. However, the findings were limited by small trials and the evidence was considered of questionable quality.
There is limited evidence of effectiveness for CBT in bipolar disorder and severe depression.
In long-term psychoses, CBT is used to complement medication and is adapted to meet individual needs. Interventions particularly related to these conditions include exploring reality testing, changing delusions and hallucinations, examining factors which precipitate relapse, and managing relapses.
Several meta-analyses have suggested that CBT is effective in schizophrenia, but Cochrane reviews reported that CBT had "no effect on long‐term risk of relapse" and found no evidence that CBT had an additional effect above standard care.
A 2015 systematic review investigated the effects of CBT compared with other psychosocial therapies for people with schizophrenia and determined that there is no clear advantage over other, often less expensive, interventions, but acknowledged that better quality evidence is needed before firm conclusions can be drawn.
CBT is used to help people of all ages, but the therapy should be adjusted to the age of the patient. Older individuals in particular have certain characteristics that need to be acknowledged, and the therapy altered, to account for these age-related differences. The small number of studies examining CBT for the management of depression in older people currently provides no strong support.
For anxiety disorders, use of CBT with people at risk has significantly reduced the number of episodes of generalized anxiety disorder and other anxiety symptoms, and has also yielded significant improvements in explanatory style, hopelessness, and dysfunctional attitudes. In another study, 3% of the group receiving the CBT intervention developed generalized anxiety disorder by 12 months postintervention compared with 14% in the control group. Subthreshold panic disorder sufferers were found to significantly benefit from use of CBT. Use of CBT was found to significantly reduce social anxiety prevalence.
For depressive disorders, a stepped-care intervention (watchful waiting, CBT and medication if appropriate) achieved a 50% lower incidence rate in a patient group aged 75 or older. Another depression study found a neutral effect compared to personal, social, and health education, and usual school provision, and included a comment on the potential for increased depression scores in people who have received CBT, due to greater self-recognition and acknowledgement of existing symptoms of depression and negative thinking styles. A further study also saw a neutral result. A meta-study of the Coping with Depression course, a cognitive behavioral intervention delivered by a psychoeducational method, saw a 38% reduction in risk of major depression.
For people at risk of psychosis, in 2014 the UK National Institute for Health and Care Excellence (NICE) recommended preventive CBT.
CBT is also used for pathological and problem gambling. An estimated 1–3% of people around the world are problem gamblers. Cognitive behavioral therapy develops skills for relapse prevention, and a person can learn to control their mind and manage high-risk situations. There is evidence that CBT is effective for treating pathological and problem gambling at immediate follow-up; however, its longer-term efficacy is currently unknown.
CBT looks at the habit of smoking cigarettes as a learned behavior, which later evolves into a coping strategy to handle daily stressors. Because smoking is often easily accessible, and quickly allows the user to feel good, it can take precedence over other coping strategies, and eventually work its way into everyday life during non-stressful events as well. CBT aims to target the function of the behavior, as it can vary between individuals, and works to inject other coping mechanisms in place of smoking. CBT also aims to support individuals suffering from strong cravings, which are a major reported reason for relapse during treatment.
A 2008 controlled study out of Stanford University School of Medicine suggested that CBT may be an effective tool to help maintain abstinence. The results of 304 randomized adult participants were tracked over the course of one year. During this program, some participants were provided medication, CBT, 24-hour phone support, or some combination of the three methods. At 20 weeks, the participants who received CBT had a 45% abstinence rate, versus non-CBT participants, who had a 29% abstinence rate. Overall, the study concluded that emphasizing cognitive and behavioral strategies to support smoking cessation can help individuals build tools for long-term smoking abstinence.
Mental health history can affect the outcomes of treatment. Individuals with a history of depressive disorders had a lower rate of success when using CBT alone to combat smoking addiction.
A Cochrane review was unable to find evidence of any difference between CBT and hypnosis for smoking cessation. While this may be evidence of no effect, further research may uncover an effect of CBT for smoking cessation.
Studies have shown CBT to be an effective treatment for substance abuse. For individuals with substance abuse disorders, CBT aims to replace maladaptive thoughts, such as denial, minimizing, and catastrophizing thought patterns, with healthier narratives. Specific techniques include identifying potential triggers and developing coping mechanisms to manage high-risk situations. Research has shown CBT to be particularly effective when combined with other therapy-based treatments or medication.
Though many forms of treatment can support individuals with eating disorders, CBT has been shown to be more effective than medication or interpersonal psychotherapy alone. CBT aims to combat major causes of distress such as negative cognitions surrounding body weight, shape and size. CBT therapists also work with individuals to regulate the strong emotions and thoughts that lead to dangerous compensatory behaviors. CBT is the first line of treatment for bulimia nervosa and non-specific eating disorder. While there is evidence to support the efficacy of CBT for bulimia nervosa and binge eating, the evidence is somewhat variable and limited by small study sizes.
Research has identified Internet addiction as a new clinical disorder that causes relational, occupational, and social problems. Cognitive behavioral therapy (CBT) has been suggested as the treatment of choice for Internet addiction, and addiction recovery in general has used CBT as part of treatment planning.
A Cochrane review of interventions aimed at preventing psychological stress in healthcare workers found that CBT was more effective than no intervention but no more effective than alternative stress-reduction interventions.
Precursors of certain fundamental aspects of CBT have been identified in various ancient philosophical traditions, particularly Stoicism. Stoic philosophers, particularly Epictetus, believed logic could be used to identify and discard false beliefs that lead to destructive emotions, which has influenced the way modern cognitive-behavioral therapists identify cognitive distortions that contribute to depression and anxiety. For example, Aaron T. Beck's original treatment manual for depression states, "The philosophical origins of cognitive therapy can be traced back to the Stoic philosophers". Another example of Stoic influence on cognitive theorists is the influence of Epictetus on Albert Ellis. A key philosophical figure who also influenced the development of CBT was John Stuart Mill.
The modern roots of CBT can be traced to the development of behavior therapy in the early 20th century, the development of cognitive therapy in the 1960s, and the subsequent merging of the two. The groundbreaking work of behaviorism began with John B. Watson and Rosalie Rayner's studies of conditioning in 1920. Behaviorally-centered therapeutic approaches appeared as early as 1924 with Mary Cover Jones' work dedicated to the unlearning of fears in children. These were the antecedents of the development of Joseph Wolpe's behavioral therapy in the 1950s. It was the work of Wolpe and Watson, which was based on Ivan Pavlov's work on learning and conditioning, that influenced Hans Eysenck and Arnold Lazarus to develop new behavioral therapy techniques based on classical conditioning.
During the 1950s and 1960s, behavioral therapy became widely utilized by researchers in the United States, the United Kingdom, and South Africa, who were inspired by the behaviorist learning theory of Ivan Pavlov, John B. Watson, and Clark L. Hull. In Britain, Joseph Wolpe applied the findings of animal experiments to his method of systematic desensitization, bringing behavioral research to the treatment of neurotic disorders. Wolpe's therapeutic efforts were precursors to today's fear reduction techniques. British psychologist Hans Eysenck presented behavior therapy as a constructive alternative.
At the same time as Eysenck's work, B. F. Skinner and his associates were beginning to have an impact with their work on operant conditioning. Skinner's work was referred to as radical behaviorism and avoided anything related to cognition. However, Julian Rotter in 1954 and Albert Bandura in 1969 contributed to behavior therapy with their respective work on social learning theory, demonstrating the effects of cognition on learning and behavior modification. The work of the Australian Claire Weekes dealing with anxiety disorders in the 1960s was also seen as a prototype of behavior therapy.
The emphasis on behavioral factors constituted the "first wave" of CBT.
One of the first therapists to address cognition in psychotherapy was Alfred Adler with his notion of basic mistakes and how they contributed to creation of unhealthy or useless behavioral and life goals. Adler's work influenced the work of Albert Ellis, who developed the earliest cognitive-based psychotherapy, known today as rational emotive behavior therapy, or REBT. Ellis also credits Abraham Low as a founder of cognitive behavioral therapy.
Around the same time that rational emotive therapy, as it was known then, was being developed, Aaron T. Beck was conducting free association sessions in his psychoanalytic practice. During these sessions, Beck noticed that thoughts were not as unconscious as Freud had previously theorized, and that certain types of thinking may be the culprits of emotional distress. It was from this hypothesis that Beck developed cognitive therapy, and called these thoughts "automatic thoughts". Beck has been referred to as "the father of cognitive behavioral therapy."
It was these two therapies, rational emotive therapy and cognitive therapy, that started the "second wave" of CBT, which emphasized cognitive factors.
Although the early behavioral approaches were successful in many of the neurotic disorders, they had little success in treating depression. Behaviorism was also losing popularity due to the so-called "cognitive revolution". The therapeutic approaches of Albert Ellis and Aaron T. Beck gained popularity among behavior therapists, despite the earlier behaviorist rejection of "mentalistic" concepts like thoughts and cognitions. Both of these systems included behavioral elements and interventions and primarily concentrated on problems in the present.
In initial studies, cognitive therapy was often contrasted with behavioral treatments to see which was most effective. During the 1980s and 1990s, cognitive and behavioral techniques were merged into cognitive behavioral therapy. Pivotal to this merging was the successful development of treatments for panic disorder by David M. Clark in the UK and David H. Barlow in the US.
Over time, cognitive behavior therapy came to be known not only as a therapy, but as an umbrella term for all cognitive-based psychotherapies. These therapies include, but are not limited to, rational emotive behavior therapy (REBT), cognitive therapy, acceptance and commitment therapy, dialectical behavior therapy, reality therapy/choice theory, cognitive processing therapy, EMDR, and multimodal therapy. All of these therapies are a blending of cognitive- and behavior-based elements.
This blending of theoretical and technical foundations from both behavior and cognitive therapies constituted the "third wave" of CBT. The most prominent therapies of this third wave are dialectical behavior therapy and acceptance and commitment therapy.
Despite the increasing popularity of "third-wave" treatment approaches, reviews of studies reveal there may be no difference in effectiveness compared with "non-third wave" CBT for the treatment of depression.
A typical CBT programme would consist of face-to-face sessions between patient and therapist, made up of 6–18 sessions of around an hour each with a gap of 1–3 weeks between sessions. This initial programme might be followed by some booster sessions, for instance after one month and three months. CBT has also been found to be effective if patient and therapist type in real time to each other over computer links.
Cognitive behavioral therapy is most closely allied with the scientist–practitioner model, in which clinical practice and research are informed by a scientific perspective, clear operationalization of the problem, and an emphasis on measurement, including measuring changes in cognition and behavior and in the attainment of goals. These aims are often met through "homework" assignments in which the patient and the therapist work together to craft an assignment to complete before the next session. The completion of these assignments – which can be as simple as a person suffering from depression attending some kind of social event – indicates a dedication to treatment compliance and a desire to change. The therapist can then logically gauge the next step of treatment based on how thoroughly the patient completes the assignment. Effective cognitive behavioral therapy is dependent on a therapeutic alliance between the healthcare practitioner and the person seeking assistance. Unlike many other forms of psychotherapy, the patient is very involved in CBT. For example, an anxious patient may be asked to talk to a stranger as a homework assignment, but if that is too difficult, he or she can work out an easier assignment first. The therapist needs to be flexible and willing to listen to the patient rather than acting as an authority figure.
Computerized cognitive behavioral therapy (CCBT) has been described by NICE as a "generic term for delivering CBT via an interactive computer interface delivered by a personal computer, internet, or interactive voice response system", instead of face-to-face with a human therapist. It is also known as internet-delivered cognitive behavioral therapy or ICBT. CCBT has potential to improve access to evidence-based therapies, and to overcome the prohibitive costs and lack of availability sometimes associated with retaining a human therapist. In this context, it is important not to confuse CBT with 'computer-based training', which nowadays is more commonly referred to as e-Learning.
CCBT has been found in meta-studies to be cost-effective and often cheaper than usual care, including for anxiety. Studies have shown that individuals with social anxiety and depression experienced improvement with online CBT-based methods. A review of current CCBT research in the treatment of OCD in children found this interface to hold great potential for future treatment of OCD in youths and adolescent populations. Additionally, most internet interventions for posttraumatic stress disorder use CCBT. CCBT is also well suited to treating mood disorders among non-heterosexual populations, who may avoid face-to-face therapy from fear of stigma. However, at present, CCBT programs seldom cater to these populations.
A key issue in CCBT use is low uptake and completion rates, even when it has been clearly made available and explained. CCBT completion rates and treatment efficacy have been found in some studies to be higher when use of CCBT is supported personally, with supporters not limited only to therapists, than when use is in a self-help form alone. Another approach to improving uptake and completion rate, as well as treatment outcome, is to design software that supports the formation of a strong therapeutic alliance between the user and the technology.
In February 2006 NICE recommended that CCBT be made available for use within the NHS across England and Wales for patients presenting with mild-to-moderate depression, rather than immediately opting for antidepressant medication, and CCBT is made available by some health systems. The 2009 NICE guideline recognized that there are likely to be a number of computerized CBT products that are useful to patients, but removed endorsement of any specific product.
A relatively new avenue of research is the combination of artificial intelligence and CCBT. It has been proposed to use modern technology to create CCBT that simulates face-to-face therapy. This might be achieved in cognitive behavior therapy for a specific disorder using the comprehensive domain knowledge of CBT. One area where this has been attempted is the specific domain area of social anxiety in those who stutter.
Another new method of access is the use of mobile app or smartphone applications to deliver self-help or guided CBT. Technology companies are developing mobile-based artificial intelligence chatbot applications to deliver CBT as an early intervention to support mental health, build psychological resilience, and promote emotional well-being. Artificial intelligence (AI) text-based conversational applications delivered securely and privately over smartphone devices have the ability to scale globally and offer contextual, always-available support. Active research is underway, including real-world data studies that measure the effectiveness and engagement of text-based smartphone chatbot apps for delivery of CBT using a text-based conversational interface.
Enabling patients to read self-help CBT guides has been shown to be effective by some studies. However, one study found a negative effect in patients who tended to ruminate, and another meta-analysis found that the benefit was only significant when the self-help was guided (e.g. by a medical professional).
Patient participation in group courses has been shown to be effective. In a meta-analysis reviewing evidence-based treatment of OCD in children, individual CBT was found to be more efficacious than group CBT.
Brief cognitive behavioral therapy (BCBT) is a form of CBT developed for situations in which there are time constraints on the therapy sessions. BCBT takes place over a small number of sessions that, by design, can total up to 12 accumulated hours. The technique was first implemented and developed by David M. Rudd for soldiers on active duty overseas, to prevent suicide.
Cognitive emotional behavioral therapy (CEBT) is a form of CBT developed initially for individuals with eating disorders but now used with a range of problems including anxiety, depression, obsessive compulsive disorder (OCD), post-traumatic stress disorder (PTSD) and anger problems. It combines aspects of CBT and dialectical behavioral therapy and aims to improve understanding and tolerance of emotions in order to facilitate the therapeutic process. It is frequently used as a "pretreatment" to prepare and better equip individuals for longer-term therapy.
Structured cognitive behavioral training (SCBT) is a cognitive-based process with core philosophies that draw heavily from CBT. Like CBT, SCBT asserts that behavior is inextricably related to beliefs, thoughts and emotions. SCBT also builds on core CBT philosophy by incorporating other well-known modalities in the fields of behavioral health and psychology: most notably, Albert Ellis's rational emotive behavior therapy. SCBT differs from CBT in two distinct ways. First, SCBT is delivered in a highly regimented format. Second, SCBT is a predetermined and finite training process that becomes personalized by the input of the participant. SCBT is designed with the intention to bring a participant to a specific result in a specific period of time. SCBT has been used to challenge addictive behavior, particularly with substances such as tobacco, alcohol and food, and to manage diabetes and subdue stress and anxiety. SCBT has also been used in the field of criminal psychology in the effort to reduce recidivism.
Moral reconation therapy, a type of CBT used to help felons overcome antisocial personality disorder (ASPD), slightly decreases the risk of further offending. It is generally implemented in a group format, because of the risk that one-on-one therapy may reinforce narcissistic behavioral characteristics in offenders with ASPD, and can be used in correctional or outpatient settings. Groups usually meet weekly for two to six months.
This type of therapy, known as stress inoculation training, uses a blend of cognitive, behavioral and some humanistic training techniques to target the stressors of the client. It is usually used to help clients better cope with their stress or anxiety after stressful events. It is a three-phase process that trains the client to use skills that they already have to better adapt to their current stressors. The first phase is an interview phase that includes psychological testing, client self-monitoring, and a variety of reading materials. This allows the therapist to individually tailor the training process to the client. Clients learn how to categorize problems as emotion-focused or problem-focused, so that they can better treat their negative situations. This phase ultimately prepares the client to eventually confront and reflect upon their current reactions to stressors, before looking at ways to change their reactions and emotions in relation to those stressors. The focus of this first phase is conceptualization.
The second phase emphasizes the aspect of skills acquisition and rehearsal that continues from the earlier phase of conceptualization. The client is taught skills that help them cope with their stressors. These skills are then practised in the space of therapy. These skills involve self-regulation, problem-solving, interpersonal communication skills, etc.
The third and final phase is the application and following through of the skills learned in the training process. This gives the client opportunities to apply their learned skills to a wide range of stressors. Activities include role-playing, imagery, modeling, etc. In the end, the client will have been trained on a preventive basis to inoculate themselves against personal, chronic, and future stressors, by breaking their stressors down into problems they will address in long-term, short-term, and intermediate coping goals.
A newly developed group therapy model based on cognitive behavioral therapy integrates knitting into the therapeutic process and has been reported to yield reliable and promising results. The foundation for this novel approach to CBT is the frequently emphasized notion that therapy success depends on how well the therapy method is embedded in the patients' natural routine. Similar to standard group-based cognitive behavioural therapy, patients meet once a week in a group of 10 to 15 patients and knit together under the instruction of a trained psychologist or mental health professional. Central to the therapy is the patient's imaginative ability to assign each part of the wool to a certain thought. During the therapy, the wool is carefully knitted, creating a knitted piece of any form. This therapeutic process teaches the patient to meaningfully align thoughts by (physically) creating a coherent knitted piece. Moreover, since CBT emphasizes behavior as a result of cognition, the knitting illustrates how thoughts (imaginatively tied to the wool) materialize into the reality surrounding us.
Mindfulness-based cognitive behavioral hypnotherapy (MCBH) is a form of CBT that focuses on awareness through a reflective approach and addresses subconscious tendencies. It is a process containing three basic phases that are used to achieve desired goals.
The Unified Protocol for Transdiagnostic Treatment of Emotional Disorders (UP) is a form of CBT, developed by David H. Barlow and researchers at Boston University, that can be applied to a range of depression and anxiety disorders. The rationale is that anxiety and depression disorders often occur together due to common underlying causes and can efficiently be treated together.
The UP includes a common set of components:
The UP has been shown to produce equivalent results to single-diagnosis protocols for specific disorders, such as OCD and social anxiety disorder.
Several studies have shown that the UP is easier to disseminate as compared to single-diagnosis protocols.
The research conducted for CBT has been a topic of sustained controversy. While some researchers write that CBT is more effective than other treatments, many other researchers and practitioners have questioned the validity of such claims. For example, one study determined CBT to be superior to other treatments in treating anxiety and depression. However, researchers responding directly to that study conducted a re-analysis and found no evidence of CBT being superior to other bona fide treatments, and conducted an analysis of thirteen other CBT clinical trials and determined that they failed to provide evidence of CBT superiority. In cases where CBT has been reported to be statistically better than other psychological interventions in terms of primary outcome measures, the effect sizes were small and suggested that the differences were clinically insignificant. Moreover, on secondary outcomes (i.e., measures of general functioning), no significant differences have typically been found between CBT and other treatments.
A major criticism has been that clinical studies of CBT efficacy (or any psychotherapy) are not double-blind (i.e., either the subjects or the therapists in psychotherapy studies are not blind to the type of treatment). They may be single-blinded, i.e. the rater may not know the treatment the patient received, but neither the patients nor the therapists are blinded to the type of therapy given; thus two of the three parties involved in the trial, that is, all of those involved in the treatment itself, are unblinded. The patient is an active participant in correcting negative distorted thoughts, and is thus quite aware of the treatment group they are in.
The importance of double-blinding was shown in a meta-analysis that examined the effectiveness of CBT when placebo control and blinding were factored in. Pooled data from published trials of CBT in schizophrenia, major depressive disorder (MDD), and bipolar disorder that used controls for non-specific effects of intervention were analyzed. This study concluded that CBT is no better than non-specific control interventions in the treatment of schizophrenia and does not reduce relapse rates; treatment effects are small in treatment studies of MDD, and it is not an effective treatment strategy for prevention of relapse in bipolar disorder. For MDD, the authors note that the pooled effect size was very low. Nevertheless, the methodological processes used to select the studies in the previously mentioned meta-analysis and the worth of its findings have been called into question.
Additionally, a 2015 meta-analysis revealed that the positive effects of CBT on depression have been declining since 1977. The overall results showed two different declines in effect sizes: 1) an overall decline between 1977 and 2014, and 2) a steeper decline between 1995 and 2014. Additional sub-analysis revealed that CBT studies where therapists in the test group were instructed to adhere to the Beck CBT manual had a steeper decline in effect sizes since 1977 than studies where therapists in the test group were instructed to use CBT without a manual. The authors reported that they were unsure why the effects were declining but did list inadequate therapist training, failure to adhere to a manual, lack of therapist experience, and patients' hope and faith in its efficacy waning as potential reasons. The authors did mention that the current study was limited to depressive disorders only.
Furthermore, other researchers write that CBT studies have high drop-out rates compared to other treatments. Drop-out rates for CBT were found to be 17% higher than those of other therapies in one meta-analysis. This high drop-out rate is also evident in the treatment of several disorders, particularly the eating disorder anorexia nervosa, which is commonly treated with CBT. Those treated with CBT have a high chance of dropping out of therapy before completion and reverting to their anorexia behaviors.
Other researchers conducting an analysis of treatments for youths who self-injure found similar drop-out rates in CBT and DBT groups. In this study, the researchers analyzed several clinical trials that measured the efficacy of CBT administered to youths who self-injure. The researchers concluded that none of them were found to be efficacious.
The methods employed in CBT research have not been the only criticisms; some individuals have called its theory and therapy into question.
Slife and Williams write that one of the hidden assumptions in CBT is that of determinism, or the absence of free will. They argue that CBT holds that external stimuli from the environment enter the mind, causing different thoughts that cause emotional states: nowhere in CBT theory is agency, or free will, accounted for.
Another criticism of CBT theory, especially as applied to major depressive disorder (MDD), is that it confounds the symptoms of the disorder with its causes.
CBT is generally seen as having very few, if any, side effects. Nevertheless, calls have been made for more appraisal of possible side effects of CBT.
The writer and group analyst Farhad Dalal questions the socio-political assumptions behind the introduction of CBT. According to one reviewer, Dalal connects the rise of CBT with "the parallel rise of neoliberalism, with its focus on marketization, efficiency, quantification and managerialism", and he questions the scientific basis of CBT, suggesting that "the 'science' of psychological treatment is often less a scientific than a political contest". In his book, Dalal also questions the ethical basis of CBT.
The UK's National Health Service announced in 2008 that more therapists would be trained to provide CBT at government expense as part of an initiative called Improving Access to Psychological Therapies (IAPT). NICE said that CBT would become the mainstay of treatment for non-severe depression, with medication used only in cases where CBT had failed. Therapists complained that the data does not fully support the attention and funding CBT receives. Psychotherapist and professor Andrew Samuels stated that this constitutes "a coup, a power play by a community that has suddenly found itself on the brink of corralling an enormous amount of money ... Everyone has been seduced by CBT's apparent cheapness." The UK Council for Psychotherapy issued a press release in 2012 saying that the IAPT's policies were undermining traditional psychotherapy and criticized proposals that would limit some approved therapies to CBT, claiming that they restricted patients to "a watered down version of cognitive behavioural therapy (CBT), often delivered by very lightly trained staff".
The NICE also recommends offering CBT to people suffering from schizophrenia, as well as those at risk of suffering from a psychotic episode. | https://en.wikipedia.org/wiki?curid=5750 |
Chinese language
Chinese ("Hànyǔ", or, especially though not exclusively for written Chinese, "Zhōngwén") is a family of East Asian analytic languages that form the Sinitic branch of the Sino-Tibetan languages. Chinese languages are spoken by the ethnic Han Chinese majority and many minority ethnic groups in China. About 1.2 billion people (around 16% of the world's population) speak some form of Chinese as their first language.
The varieties of Chinese are usually considered by native speakers to be regional variants of ethnic Chinese speech, without consideration of whether they are mutually intelligible. Due to their lack of mutual intelligibility, however, they are generally described as distinct languages (perhaps hundreds) by linguists, who sometimes note that they are more varied than the Romance languages. Investigation of the historical relationships among the Sinitic languages is just getting started. Currently, most classifications posit 7 to 13 main regional groups, based on often superficial phonetic developments, of which the most populous by far is Mandarin (about 800 million speakers, e.g. Southwestern Mandarin), followed by Min (75 million, e.g. Southern Min), Wu (74 million, e.g. Shanghainese) and Yue (68 million, e.g. Cantonese). These groups are unintelligible to each other, and many of their subgroups are mutually unintelligible as well (e.g., not only is Min Chinese a family of mutually unintelligible languages, but Southern Min itself is not a single language). There are, however, several transitional areas, where languages and dialects from different branches share enough features for some limited intelligibility between neighboring areas. Examples are New Xiang and Southwest Mandarin, Xuanzhou Wu and Lower Yangtze Mandarin, Jin and Central Plains Mandarin, and certain divergent dialects of Hakka with Gan (though these are unintelligible with mainstream Hakka). All varieties of Chinese are tonal to at least some degree and largely analytic.
Standard Chinese ("Pǔtōnghuà"/"Guóyǔ"/"Huáyǔ") is a standardized form of spoken Chinese based on the Beijing dialect of Mandarin. It is an official language of China, one of the national languages of Taiwan (as Taiwanese Mandarin), and one of the four official languages of Singapore. It is one of the six official languages of the United Nations. The written form of the standard language ("Zhōngwén"), based on the logograms known as Chinese characters ("Hànzì"), is shared by literate speakers of otherwise unintelligible dialects.
The earliest Chinese written records are Shang dynasty-era oracle inscriptions, which can be traced back to 1250 BCE. The phonetic categories of Archaic Chinese can be reconstructed from the rhymes of ancient poetry. During the Northern and Southern dynasties period, Middle Chinese went through several sound changes and split into several varieties following prolonged geographic and political separation. The "Qieyun", a rime dictionary, recorded a compromise between the pronunciations of different regions. The royal courts of the Ming and early Qing dynasties operated using a koiné language (Guanhua) based on the Nanjing dialect of Lower Yangtze Mandarin. Standard Chinese was adopted in the 1930s and is now an official language of both the People's Republic of China and the Republic of China on Taiwan.
Linguists classify all varieties of Chinese as part of the Sino-Tibetan language family, together with Burmese, Tibetan and many other languages spoken in the Himalayas and the Southeast Asian Massif. Although the relationship was first proposed in the early 19th century and is now broadly accepted, reconstruction of Sino-Tibetan is much less developed than that of families such as Indo-European or Austroasiatic. Difficulties have included the great diversity of the languages, the lack of inflection in many of them, and the effects of language contact. In addition, many of the smaller languages are spoken in mountainous areas that are difficult to reach and are often also sensitive border zones. Without a secure reconstruction of proto-Sino-Tibetan, the higher-level structure of the family remains unclear. A top-level branching into Chinese and Tibeto-Burman languages is often assumed, but has not been convincingly demonstrated.
The first written records appeared over 3,000 years ago during the Shang dynasty. As the language evolved over this period, the various local varieties became mutually unintelligible. In reaction, central governments have repeatedly sought to promulgate a unified standard.
The earliest examples of Chinese are divinatory inscriptions on oracle bones from around 1250 BCE in the late Shang dynasty. Old Chinese was the language of the Western Zhou period (1046–771 BCE), recorded in inscriptions on bronze artifacts, the "Classic of Poetry" and portions of the "Book of Documents" and "I Ching". Scholars have attempted to reconstruct the phonology of Old Chinese by comparing later varieties of Chinese with the rhyming practice of the "Classic of Poetry" and the phonetic elements found in the majority of Chinese characters. Although many of the finer details remain unclear, most scholars agree that Old Chinese differs from Middle Chinese in lacking retroflex and palatal obstruents but having initial consonant clusters of some sort, and in having voiceless nasals and liquids. Most recent reconstructions also describe an atonal language with consonant clusters at the end of the syllable, developing into tone distinctions in Middle Chinese. Several derivational affixes have also been identified, but the language lacked inflection, and indicated grammatical relationships using word order and grammatical particles.
Middle Chinese was the language used during Northern and Southern dynasties and the Sui, Tang, and Song dynasties (6th through 10th centuries CE). It can be divided into an early period, reflected by the "Qieyun" rime book (601 CE), and a late period in the 10th century, reflected by rhyme tables such as the "Yunjing" constructed by ancient Chinese philologists as a guide to the "Qieyun" system. These works define phonological categories, but with little hint of what sounds they represent. Linguists have identified these sounds by comparing the categories with pronunciations in modern varieties of Chinese, borrowed Chinese words in Japanese, Vietnamese, and Korean, and transcription evidence. The resulting system is very complex, with a large number of consonants and vowels, but they are probably not all distinguished in any single dialect. Most linguists now believe it represents a diasystem encompassing 6th-century northern and southern standards for reading the classics.
The relationship between spoken and written Chinese is rather complex. Its spoken varieties have evolved at different rates, while written Chinese itself has changed much less. Classical Chinese literature began in the Spring and Autumn period.
After the fall of the Northern Song dynasty, and during the reign of the Jin (Jurchen) and Yuan (Mongol) dynasties in northern China, a common speech (now called Old Mandarin) developed based on the dialects of the North China Plain around the capital.
The "Zhongyuan Yinyun" (1324) was a dictionary that codified the rhyming conventions of new "sanqu" verse form in this language.
Together with the slightly later "Menggu Ziyun", this dictionary describes a language with many of the features characteristic of modern Mandarin dialects.
Up to the early 20th century, most of the people in China spoke only their local variety.
As a practical measure, officials of the Ming and Qing dynasties carried out the administration of the empire using a common language based on Mandarin varieties, known as "Guānhuà" (literally "language of officials").
For most of this period, this language was a koiné based on dialects spoken in the Nanjing area, though not identical to any single dialect.
By the middle of the 19th century, the Beijing dialect had become dominant and was essential for any business with the imperial court.
In the 1930s a standard national language, "Guóyǔ" ("national language"), was adopted.
After much dispute between proponents of northern and southern dialects and an abortive attempt at an artificial pronunciation, the National Language Unification Commission finally settled on the Beijing dialect in 1932. The People's Republic founded in 1949 retained this standard, calling it "pǔtōnghuà" ("common speech"). The national language is now used in education, the media, and formal situations in both Mainland China and Taiwan. In Hong Kong and Macau, because of their colonial and linguistic history, the language used in education, the media, formal speech, and everyday life remains the local Cantonese, although the standard language has become very influential and is being taught in schools.
The Chinese language has spread to neighbouring countries through a variety of means. Northern Vietnam was incorporated into the Han empire in 111 BCE, marking the beginning of a period of Chinese control that ran almost continuously for a millennium. The Four Commanderies were established in northern Korea in the first century BCE, but disintegrated in the following centuries. Chinese Buddhism spread over East Asia between the 2nd and 5th centuries CE, and with it the study of scriptures and literature in Literary Chinese. Later Korea, Japan, and Vietnam developed strong central governments modeled on Chinese institutions, with Literary Chinese as the language of administration and scholarship, a position it would retain until the late 19th century in Korea and (to a lesser extent) Japan, and the early 20th century in Vietnam. Scholars from different lands could communicate, albeit only in writing, using Literary Chinese.
Although they used Chinese solely for written communication, each country had its own tradition of reading texts aloud, the so-called Sino-Xenic pronunciations. Chinese words with these pronunciations were also extensively imported into the Korean, Japanese and Vietnamese languages, and today comprise over half of their vocabularies. This massive influx led to changes in the phonological structure of the languages, contributing to the development of moraic structure in Japanese and the disruption of vowel harmony in Korean.
Borrowed Chinese morphemes have been used extensively in all these languages to coin compound words for new concepts, in a similar way to the use of Latin and Ancient Greek roots in European languages. Many new compounds, or new meanings for old phrases, were created in the late 19th and early 20th centuries to name Western concepts and artifacts. These coinages, written in shared Chinese characters, have then been borrowed freely between languages. They have even been accepted into Chinese, a language usually resistant to loanwords, because their foreign origin was hidden by their written form. Often different compounds for the same concept were in circulation for some time before a winner emerged, and sometimes the final choice differed between countries. The proportion of vocabulary of Chinese origin thus tends to be greater in technical, abstract, or formal language. For example, in Japan, Sino-Japanese words account for about 35% of the words in entertainment magazines, over half the words in newspapers, and 60% of the words in science magazines.
Vietnam, Korea, and Japan each developed writing systems for their own languages, initially based on Chinese characters, but later replaced with the "Hangul" alphabet for Korean and supplemented with "kana" syllabaries for Japanese, while Vietnamese continued to be written with the complex "Chữ nôm" script. However, these were limited to popular literature until the late 19th century. Today Japanese is written with a composite script using both Chinese characters ("Kanji") and kana. Korean is written exclusively with Hangul in North Korea, and supplementary Chinese characters ("Hanja") are increasingly rarely used in South Korea. Vietnamese is written with a Latin-based alphabet.
Examples of loan words in English include "tea", from Hokkien (Min Nan) "tê"; "dim sum", from Cantonese "dim2 sam1"; and "kumquat", from Cantonese "gam1gwat1".
Jerry Norman estimated that there are hundreds of mutually unintelligible varieties of Chinese. These varieties form a dialect continuum, in which differences in speech generally become more pronounced as distances increase, though the rate of change varies immensely. Generally, mountainous South China exhibits more linguistic diversity than the North China Plain. In parts of South China, a major city's dialect may be only marginally intelligible to close neighbors. For instance, Wuzhou lies upstream from Guangzhou, but the Yue variety spoken there is more like that of Guangzhou than is that of Taishan, southwest of Guangzhou and separated from it by several rivers. In parts of Fujian the speech of neighboring counties or even villages may be mutually unintelligible.
Until the late 20th century, Chinese emigrants to Southeast Asia and North America came from southeast coastal areas, where Min, Hakka, and Yue dialects are spoken.
The vast majority of Chinese immigrants to North America spoke the Taishan dialect, from a small coastal area southwest of Guangzhou.
Local varieties of Chinese are conventionally classified into seven dialect groups, largely on the basis of the different evolution of Middle Chinese voiced initials: Mandarin, Wu, Gan, Xiang, Min, Hakka, and Yue.
The classification of Li Rong, which is used in the "Language Atlas of China" (1987), distinguishes three further groups: Jin, Huizhou, and Pinghua.
Some varieties remain unclassified, including Danzhou dialect (spoken in Danzhou, on Hainan Island), Waxianghua (spoken in western Hunan) and Shaozhou Tuhua (spoken in northern Guangdong).
Standard Chinese, often called Mandarin, is the official standard language of China and Taiwan, and one of the four official languages of Singapore (where it is called "Huáyŭ" or simply Chinese). Standard Chinese is based on the Beijing dialect, the dialect of Mandarin as spoken in Beijing. The governments of both China and Taiwan intend for speakers of all Chinese speech varieties to use it as a common language of communication. Therefore, it is used in government agencies, in the media, and as a language of instruction in schools.
In mainland China and Taiwan, diglossia has been a common feature. For example, in addition to Standard Chinese, a resident of Shanghai might speak Shanghainese; and, if he or she grew up elsewhere, then he or she is also likely to be fluent in the particular dialect of that local area. A native of Guangzhou may speak both Cantonese and Standard Chinese. In addition to Mandarin, most Taiwanese also speak Minnan, Hakka, or an Austronesian language. A Taiwanese may commonly mix pronunciations, phrases, and words from Mandarin and other Taiwanese languages, and this mixture is considered normal in daily or informal speech.
The official Chinese designation for the major branches of Chinese is "fāngyán" (literally "regional speech"), whereas the more closely related varieties within these are called "dìdiǎn fāngyán" ("local speech"). Conventional English-language usage in Chinese linguistics is to use "dialect" for the speech of a particular place (regardless of status) and "dialect group" for a regional grouping such as Mandarin or Wu. Because varieties from different groups are not mutually intelligible, some scholars prefer to describe Wu and others as separate languages. Jerry Norman called this practice misleading, pointing out that Wu, which itself contains many mutually unintelligible varieties, could not be properly called a single language under the same criterion, and that the same is true for each of the other groups.
Mutual intelligibility is considered by some linguists to be the main criterion for determining whether varieties are separate languages or dialects of a single language, although others do not regard it as decisive, particularly when cultural factors interfere as they do with Chinese. Linguists often ignore mutual intelligibility when varieties share intelligibility with a central variety (i.e. a prestige variety, such as Standard Mandarin), as the issue requires some careful handling when mutual intelligibility is inconsistent with language identity. John DeFrancis argues that it is inappropriate to refer to Mandarin, Wu and so on as "dialects" because the mutual unintelligibility between them is too great. On the other hand, he also objects to considering them as separate languages, as it incorrectly implies a set of disruptive "religious, economic, political, and other differences" between speakers that exist, for example, between French Catholics and English Protestants in Canada, but not between speakers of Cantonese and Mandarin in China, owing to China's near-uninterrupted history of centralized government.
Because of the difficulties involved in determining the difference between language and dialect, other terms have been proposed. These include "vernacular", "lect", "regionalect", "topolect", and "variety".
Most Chinese people consider the spoken varieties as one single language because speakers share a common culture and history, as well as a shared national identity and a common written form. To Chinese nationalists, the idea of Chinese as a language family may suggest that the Chinese identity is far more fragmented and disunited than they believe it to be, and the idea is therefore often looked upon as culturally and politically provocative.
The phonological structure of each syllable consists of a nucleus that has a vowel (which can be a monophthong, diphthong, or even a triphthong in certain varieties), preceded by an onset (a single consonant, or consonant+glide; zero onset is also possible), and followed (optionally) by a coda consonant; a syllable also carries a tone. There are some instances where a vowel is not used as a nucleus. An example is Cantonese, where the nasal sonorant consonants /m/ and /ŋ/ can stand alone as their own syllables.
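This onset–nucleus–coda template lends itself to simple mechanical analysis. The following minimal Python sketch splits toneless Mandarin pinyin syllables into onset and rime; the INITIALS list and the split_syllable helper are illustrative assumptions (the glide spellings "y" and "w" are treated as onsets for simplicity, and no full pinyin validation is attempted):

# Minimal sketch: split a toneless pinyin syllable into onset + rime.
# INITIALS is the standard pinyin initial inventory, ordered longest
# first so digraphs like "zh", "ch", "sh" match before "z", "c", "s".
INITIALS = sorted(
    ["b", "p", "m", "f", "d", "t", "n", "l", "g", "k", "h",
     "j", "q", "x", "zh", "ch", "sh", "r", "z", "c", "s", "y", "w"],
    key=len, reverse=True)

def split_syllable(syllable: str):
    """Return (onset, rime); the onset is '' for zero-onset syllables."""
    for initial in INITIALS:
        if syllable.startswith(initial):
            return initial, syllable[len(initial):]
    return "", syllable  # zero onset, e.g. "an" or "er"

print(split_syllable("zhong"))  # ('zh', 'ong')
print(split_syllable("ma"))     # ('m', 'a')
print(split_syllable("er"))     # ('', 'er')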
In Mandarin much more than in other spoken varieties, most syllables tend to be open syllables, meaning they have no coda (assuming that a final glide is not analyzed as a coda), but syllables that do have codas are restricted to the nasals /m/, /n/, /ŋ/, the retroflex approximant /ɻ/, and the voiceless stops /p/, /t/, /k/ or /ʔ/. Some varieties allow most of these codas, whereas others, such as Standard Chinese, are limited to only /n/, /ŋ/ and /ɻ/.
The number of sounds in the different spoken dialects varies, but in general there has been a tendency to a reduction in sounds from Middle Chinese. The Mandarin dialects in particular have experienced a dramatic decrease in sounds and so have far more multisyllabic words than most other spoken varieties. The total number of syllables in some varieties is therefore only about a thousand, including tonal variation, which is only about an eighth as many as English.
All varieties of spoken Chinese use tones to distinguish words. A few dialects of north China may have as few as three tones, while some dialects in south China have up to six or twelve tones, depending on how one counts. One exception is Shanghainese, which has reduced the set of tones to a two-toned pitch accent system, much like modern Japanese.
A very common example used to illustrate the use of tones in Chinese is the application of the four tones of Standard Chinese (along with the neutral tone) to the syllable "ma". The tones are exemplified by the following five Chinese words: "mā" 妈 "mother", "má" 麻 "hemp", "mǎ" 马 "horse", "mà" 骂 "scold", and "ma" 吗 (a sentence-final question particle).
Standard Cantonese, in contrast, has six tones in open syllables and three tones in syllables ending with stops.
Chinese is often described as a "monosyllabic" language. However, this is only partially correct. It is largely accurate when describing Classical Chinese and Middle Chinese; in Classical Chinese, for example, perhaps 90% of words correspond to a single syllable and a single character. In the modern varieties, it is usually the case that a morpheme (unit of meaning) is a single syllable; in contrast, English has plenty of multi-syllable morphemes, both bound and free, such as "seven", "elephant", "para-" and "-able".
Some of the conservative southern varieties of modern Chinese have largely monosyllabic words, especially among the more basic vocabulary. In modern Mandarin, however, most nouns, adjectives and verbs are largely disyllabic. A significant cause of this is phonological attrition. Sound change over time has steadily reduced the number of possible syllables. In modern Mandarin, there are now only about 1,200 possible syllables, including tonal distinctions, compared with about 5,000 in Vietnamese (still largely monosyllabic) and over 8,000 in English.
This phonological collapse has led to a corresponding increase in the number of homophones. As an example, the small Langenscheidt Pocket Chinese Dictionary lists six words that are commonly pronounced as "shí" (tone 2): "ten"; "real, actual"; "know (a person), recognize"; "stone"; "time"; and "food, eat". These were all pronounced differently in Early Middle Chinese; in William H. Baxter's transcription they were "dzyip", "zyit", "syik", "dzyek", "dzyi" and "zyik" respectively. They are still pronounced differently in today's Cantonese; in Cantonese romanization they are "sap9", "sat9", "sik7", "sek9", "si4" and "sik9". In modern spoken Mandarin, however, tremendous ambiguity would result if all of these words could be used as-is; Yuen Ren Chao's modern poem "Lion-Eating Poet in the Stone Den" exploits this, consisting of 92 characters all pronounced "shi". As such, most of these words have been replaced (in speech, if not in writing) with longer, less ambiguous compounds. Only the first one, "ten", normally appears as such when spoken; the rest are normally replaced with, respectively, "shíjì" (lit. "actual-connection"), "rènshi" (lit. "recognize-know"), "shítou" (lit. "stone-head"), "shíjiān" (lit. "time-interval"), and "shíwù" (lit. "food-thing"). In each case, the homophone was disambiguated by adding another morpheme, typically either a synonym or a generic word of some sort (for example, "head", "thing"), the purpose of which is simply to indicate which of the possible meanings of the other, homophonic syllable should be selected.
However, when one of the above words forms part of a compound, the disambiguating syllable is generally dropped and the resulting word is still disyllabic. For example, "shí" alone, not "shítou", appears in compounds meaning "stone-", for example, "shígāo" "plaster" (lit. "stone cream"), "shíhuī" "lime" (lit. "stone dust"), "shíkū" "grotto" (lit. "stone cave"), "shíyīng" "quartz" (lit. "stone flower"), and "shíyóu" "petroleum" (lit. "stone oil").
Most modern varieties of Chinese have the tendency to form new words through disyllabic, trisyllabic and tetra-character compounds. In some cases, monosyllabic words have become disyllabic without compounding, as in "kūlong" from "kǒng" 孔; this is especially common in Jin.
Chinese morphology is strictly bound to a set number of syllables with a fairly rigid construction. Although many of these single-syllable morphemes ("zì" 字) can stand alone as individual words, they more often than not form multi-syllabic compounds, known as "cí" (词/詞), which more closely resemble the traditional Western notion of a word. A Chinese "cí" ("word") can consist of more than one character-morpheme, usually two, but there can be three or more.
For example, "diànnǎo" ("computer") combines the morphemes "diàn" ("electric") and "nǎo" ("brain"); more such compounds are discussed in the loanword section below.
All varieties of modern Chinese are analytic languages, in that they depend on syntax (word order and sentence structure) rather than morphology—i.e., changes in form of a word—to indicate the word's function in a sentence. In other words, Chinese has very few grammatical inflections—it possesses no tenses, no voices, no numbers (singular, plural; though there are plural markers, for example for personal pronouns), and only a few articles (i.e., equivalents to "the, a, an" in English).
They make heavy use of grammatical particles to indicate aspect and mood. In Mandarin Chinese, this involves the use of particles like "le" (perfective), "hái" ("still"), "yǐjīng" ("already"), and so on.
Chinese has a subject–verb–object word order, and like many other languages of East Asia, makes frequent use of the topic–comment construction to form sentences. Chinese also has an extensive system of classifiers and measure words, another trait shared with neighboring languages like Japanese and Korean. Other notable grammatical features common to all the spoken varieties of Chinese include the use of serial verb construction, pronoun dropping and the related subject dropping.
Although the grammars of the spoken varieties share many traits, they do possess differences.
The entire Chinese character corpus since antiquity comprises well over 20,000 characters, of which only roughly 10,000 are now commonly in use. However, Chinese characters should not be confused with Chinese words. Because most Chinese words are made up of two or more characters, there are many more Chinese words than characters. A more accurate equivalent for a Chinese character is the morpheme, as characters represent the smallest grammatical units with individual meanings in the Chinese language.
Estimates of the total number of Chinese words and lexicalized phrases vary greatly. The "Hanyu Da Zidian", a compendium of Chinese characters, includes 54,678 head entries for characters, including oracle bone versions. The "Zhonghua Zihai" (1994) contains 85,568 head entries for character definitions, and is the largest reference work based purely on characters and their literary variants. The CC-CEDICT project (2010) contains 97,404 contemporary entries including idioms, technology terms and names of political figures, businesses and products. The 2009 version of the Webster's Digital Chinese Dictionary (WDCD), based on CC-CEDICT, contains over 84,000 entries.
The most comprehensive pure linguistic Chinese-language dictionary, the 12-volume "Hanyu Da Cidian", records more than 23,000 head Chinese characters and gives over 370,000 definitions. The 1999 revised "Cihai", a multi-volume encyclopedic dictionary reference work, gives 122,836 vocabulary entry definitions under 19,485 Chinese characters, including proper names, phrases and common zoological, geographical, sociological, scientific and technical terms.
The 7th (2016) edition of "Xiandai Hanyu Cidian", an authoritative one-volume dictionary on modern standard Chinese language as used in mainland China, has 13,000 head characters and defines 70,000 words.
Like any other language, Chinese has absorbed a sizable number of loanwords from other cultures. Most Chinese words are formed out of native Chinese morphemes, including words describing imported objects and ideas. However, direct phonetic borrowing of foreign words has gone on since ancient times.
Some early Indo-European loanwords in Chinese have been proposed, notably "mì" "honey" and "shī" "lion", and perhaps also "mǎ" "horse", "zhū" "pig", "quǎn" "dog", and "é" "goose".
Ancient words borrowed from along the Silk Road since Old Chinese include "pútáo" "grape", "shíliu"/"shíliú" "pomegranate" and "shīzi" "lion". Some words were borrowed from Buddhist scriptures, including "Fó" "Buddha" and "Púsà" "bodhisattva". Other words came from nomadic peoples to the north, such as "hútòng" "hutong". Words borrowed from the peoples along the Silk Road, such as "grape", generally have Persian etymologies. Buddhist terminology is generally derived from Sanskrit or Pāli, the liturgical languages of North India. Words borrowed from the nomadic tribes of the Gobi, Mongolian or northeast regions generally have Altaic etymologies, such as "pípá", the Chinese lute, or "lào"/"luò" "cheese" or "yoghurt", but from exactly which source is not always clear.
Modern neologisms are primarily translated into Chinese in one of three ways: free translation ("calque", or by meaning), phonetic translation (by sound), or a combination of the two. Today, it is much more common to use existing Chinese morphemes to coin new words in order to represent imported concepts, such as technical expressions and international scientific vocabulary. Any Latin or Greek etymologies are dropped and converted into the corresponding Chinese characters (for example, "anti-" typically becomes "fǎn", literally "opposite"), making them more comprehensible for Chinese speakers but introducing more difficulties in understanding foreign texts. For example, the word "telephone" was loaned phonetically as "délǜfēng" (Shanghainese "télífon") during the 1920s and widely used in Shanghai, but later "diànhuà" (lit. "electric speech"), built out of native Chinese morphemes, became prevalent ("diànhuà" is in fact from the Japanese "denwa"; see below for more Japanese loans). Other examples include "diànshì" (lit. "electric vision") for television, "diànnǎo" (lit. "electric brain") for computer, "shǒujī" (lit. "hand machine") for mobile phone, "lányá" (lit. "blue tooth") for Bluetooth, and "wǎngzhì" (lit. "internet logbook") for blog in Hong Kong and Macau Cantonese. Occasionally half-transliteration, half-translation compromises are accepted, such as "hànbǎobāo" ("hànbǎo" "Hamburg" + "bāo" "bun") for "hamburger". Sometimes translations are designed so that they sound like the original while incorporating Chinese morphemes (phono-semantic matching), such as "tuōlājī" "tractor" (lit. "dragging-pulling machine"), or "Mǎlì'ào" for the video game character Mario. This is often done for commercial purposes, for example "bēnténg" (lit. "dashing-leaping") for Pentium and "Sàibǎiwèi" (lit. "better-than hundred tastes") for Subway restaurants.
Foreign words, mainly proper nouns, continue to enter the Chinese language by transcription according to their pronunciations. This is done by employing Chinese characters with similar pronunciations. For example, "Israel" becomes "Yǐsèliè" and "Paris" becomes "Bālí". A rather small number of direct transliterations have survived as common words, including "shāfā" "sofa", "mǎdá" "motor", "yōumò" "humor", "luóji"/"luójí" "logic", "shímáo" "smart, fashionable", and "xiēsīdǐlǐ" "hysterics". The bulk of these words were originally coined in the Shanghai dialect during the early 20th century and were later loaned into Mandarin, so their pronunciations in Mandarin may differ considerably from the English originals. For example, "shāfā" "sofa" and "mǎdá" "motor" in Shanghainese sound more like their English counterparts. Cantonese differs from Mandarin in some transliterations, such as "so1 faa3*2" "sofa" and "mo1 daa2" "motor".
Western foreign words representing Western concepts have influenced Chinese since the 20th century through transcription. From French came "bālěi" "ballet" and "xiāngbīn" "champagne"; from Italian, "kāfēi" "caffè". English influence is particularly pronounced. From early 20th-century Shanghainese, many English words were borrowed, such as "gāoěrfū" "golf" and the above-mentioned "shāfā" "sofa". Later, American soft influence gave rise to "dísikē"/"dísīkē" "disco", "kělè" "cola", and "mínǐ" "mini [skirt]". Contemporary colloquial Cantonese has distinct loanwords from English, such as "kaa1 tung1" "cartoon", "gei1 lou2" "gay people", "dik1 si6*2" "taxi", and "baa1 si6*2" "bus". With the rising popularity of the Internet, there is a current vogue in China for coining English transliterations, for example, "fěnsī" "fans", "hēikè" "hacker" (lit. "black guest"), and "bókè" "blog". In Taiwan, some of these transliterations are different, such as "hàikè" for "hacker" and "bùluògé" for "blog" (lit. "interconnected tribes").
Another result of the English influence on Chinese is the appearance in Modern Chinese texts of so-called "zìmǔcí" (lit. "lettered words") spelled with letters from the English alphabet. These have appeared in magazines, newspapers, on web sites, and on TV: "3G shǒujī" "3rd generation cell phones" ("sān" "three" + G "generation" + "shǒujī" "mobile phones"), "IT jiè" "IT circles" (IT "information technology" + "jiè" "industry"), HSK ("Hànyǔ Shuǐpíng Kǎoshì"), GB ("Guóbiāo"), "CIF jià" (CIF "Cost, Insurance, Freight" + "jià" "price"), "e jiātíng" "e-home" (e "electronic" + "jiātíng" "home"), "W shídài" "wireless era" (W "wireless" + "shídài" "era"), "TV zú" "TV watchers" (TV "television" + "zú" "social group; clan"), "hòu-PC shídài" "post-PC era" ("hòu" "after/post-" + PC "personal computer" + "shídài" "era"), and so on.
Since the 20th century, another source of words has been Japanese, using existing kanji (Chinese characters used in Japanese). Japanese re-molded European concepts and inventions into "wasei-kango" (Japanese-made Chinese vocabulary), and many of these words have been re-loaned into modern Chinese. Other terms were coined by the Japanese by giving new senses to existing Chinese terms or by referring to expressions used in classical Chinese literature. For example, "jīngjì" ("keizai" in Japanese), which in the original Chinese meant "the workings of the state", was narrowed to "economy" in Japanese; this narrowed definition was then re-imported into Chinese. As a result, these terms are virtually indistinguishable from native Chinese words: indeed, there is some dispute over some of these terms as to whether the Japanese or the Chinese coined them first. As a result of this loaning, Chinese, Korean, Japanese, and Vietnamese share a corpus of linguistic terms describing modern terminology, paralleling a similar corpus of terms built from Greco-Latin roots and shared among European languages.
Chinese orthography centers on Chinese characters, which are written within imaginary square blocks, traditionally arranged in vertical columns, read from top to bottom down a column and right to left across columns, although an alternative arrangement, with rows of characters running from left to right and rows read from top to bottom (as in English and other Western writing systems), has become more popular since the 20th century. Chinese characters denote morphemes independent of phonetic variation in different languages. Thus the character 一 ("one") is uttered "yī" in Standard Chinese, "yat" in Cantonese and "it" in Hokkien (a form of Min).
Most written Chinese documents in modern times, especially formal ones, are created using the grammar and syntax of Standard Chinese, regardless of the dialectal background of the author or the targeted audience. This replaced the old written standard of Literary Chinese used before the 20th century. However, vocabularies from different Chinese-speaking areas have diverged, and the divergence can be observed in written Chinese.
Meanwhile, colloquial forms of various Chinese language varieties have also been written down by their users, especially in less formal settings. The most prominent example of this is the written colloquial form of Cantonese, which has become quite popular in tabloids, instant messaging applications, and on the internet amongst Hong Kongers and Cantonese speakers elsewhere.
Because some Chinese varieties have diverged and developed a number of unique morphemes that are not found in Standard Mandarin (alongside the many morphemes they share with it), unique characters rarely used in Standard Chinese have also been created or inherited from the archaic literary standard to represent these unique morphemes. For example, characters like 冇 and 係 are actively used in Cantonese and Hakka, while being considered archaic or unused in standard written Chinese.
Chinese had no uniform phonetic transcription system for most of its speakers until the mid-20th century, although enunciation patterns were recorded in early rime books and dictionaries. Early Indian translators, working in Sanskrit and Pali, were the first to attempt to describe the sounds and enunciation patterns of Chinese in a foreign language. After the 15th century, the efforts of Jesuits and Western court missionaries resulted in some Latin-character transcription and writing systems, based on various varieties of Chinese. Some of these Latin-character-based systems are still being used to write various Chinese varieties in the modern era.
In Hunan, women in certain areas write their local Chinese language variant in Nü Shu, a syllabary derived from Chinese characters. The Dungan language, considered by many a dialect of Mandarin, is nowadays written in Cyrillic, and was previously written in the Arabic script. The Dungan people are primarily Muslim and live mainly in Kazakhstan, Kyrgyzstan, and Russia; some of the related Hui people also speak the language and live mainly in China.
Each Chinese character represents a monosyllabic Chinese word or morpheme. In 100 CE, the famed Han dynasty scholar Xu Shen classified characters into six categories, namely pictographs, simple ideographs, compound ideographs, phonetic loans, phonetic compounds and derivative characters. Of these, only 4% were categorized as pictographs, including many of the simplest characters, such as 人 "rén" (human), 日 "rì" (sun), 山 "shān" (mountain; hill) and 水 "shuǐ" (water). Between 80% and 90% were classified as phonetic compounds such as 沖 "chōng" (pour), combining a phonetic component 中 "zhōng" (middle) with a semantic radical 氵 (water). Almost all characters created since have been made using this format. The 18th-century Kangxi Dictionary recognized 214 radicals.
Modern characters are styled after the regular script. Various other written styles are also used in Chinese calligraphy, including seal script, cursive script and clerical script. Calligraphy artists can write in traditional and simplified characters, but they tend to use traditional characters for traditional art.
There are currently two systems for Chinese characters. The traditional system, used in Hong Kong, Taiwan, Macau and Chinese-speaking communities outside mainland China (except Singapore and Malaysia), takes its form from standardized character forms dating back to the late Han dynasty. The Simplified Chinese character system, introduced by the People's Republic of China in 1954 to promote mass literacy, simplifies most complex traditional glyphs to fewer strokes, many to common cursive shorthand variants. Singapore, which has a large Chinese community, was the second nation to officially adopt simplified characters, and they have also become the "de facto" standard for younger ethnic Chinese in Malaysia.
The Internet provides a platform to practice reading these alternative systems, be it traditional or simplified. Most Chinese users in the modern era are capable of, although not necessarily comfortable with, reading (but not writing) the alternative system, through experience and guesswork.
A well-educated Chinese reader today recognizes approximately 4,000 to 6,000 characters; approximately 3,000 characters are required to read a Mainland newspaper. The PRC government defines literacy amongst workers as a knowledge of 2,000 characters, though this would be only functional literacy. School-children typically learn around 2,000 characters whereas scholars may memorize up to 10,000. A large unabridged dictionary, like the Kangxi Dictionary, contains over 40,000 characters, including obscure, variant, rare, and archaic characters; fewer than a quarter of these characters are now commonly used.
Romanization is the process of transcribing a language into the Latin script. There are many systems of romanization for the Chinese varieties, due to the lack of a native phonetic transcription until modern times. Chinese is first known to have been written in Latin characters by Western Christian missionaries in the 16th century.
Today the most common romanization standard for Standard Chinese is "Hanyu Pinyin", often known simply as pinyin, introduced in 1956 by the People's Republic of China, and later adopted by Singapore and Taiwan. Pinyin is almost universally employed now for teaching standard spoken Chinese in schools and universities across America, Australia and Europe. Chinese parents also use Pinyin to teach their children the sounds and tones of new words. In school books that teach Chinese, the Pinyin romanization is often shown below a picture of the thing the word represents, with the Chinese character alongside.
The second-most common romanization system, Wade–Giles, was invented by Thomas Wade in 1859 and modified by Herbert Giles in 1892. As this system approximates the phonology of Mandarin Chinese into English consonants and vowels (i.e. it is an anglicization), it may be particularly helpful for beginner Chinese speakers from an English-speaking background. Wade–Giles was found in academic use in the United States, particularly before the 1980s, and until 2009 was widely used in Taiwan.
When used within European texts, the tone transcriptions in both pinyin and Wade–Giles are often left out for simplicity; Wade–Giles' extensive use of apostrophes is also usually omitted. Thus, most Western readers will be much more familiar with "Beijing" than with "Běijīng" (pinyin), and with "Taipei" than with "T'ai²-pei³" (Wade–Giles). This simplification presents syllables as homophones that really are not, and therefore exaggerates the number of homophones almost by a factor of four.
A few examples of "Hanyu Pinyin" and Wade–Giles, for comparison: pinyin "Běijīng" corresponds to Wade–Giles "Pei-ching", "Máo Zédōng" to "Mao Tse-tung", and "Dèng Xiǎopíng" to "Teng Hsiao-p'ing".
Other systems of romanization for Chinese include Gwoyeu Romatzyh, the French EFEO, the Yale system (invented during WWII for U.S. troops), as well as separate systems for Cantonese, Min Nan, Hakka, and other Chinese varieties.
Chinese varieties have been phonetically transcribed into many other writing systems over the centuries. The 'Phags-pa script, for example, has been very helpful in reconstructing the pronunciations of premodern forms of Chinese.
Zhuyin (colloquially "bopomofo"), a semi-syllabary, is still widely used in Taiwan's elementary schools to aid standard pronunciation. Although zhuyin characters are reminiscent of katakana script, there is no source to substantiate the claim that katakana was the basis for the zhuyin system. A comparison table of zhuyin to pinyin exists in the zhuyin article, and the syllables of the two systems can be compared against each other directly.
There are also at least two systems of cyrillization for Chinese. The most widespread is the Palladius system.
With the growing importance and influence of China's economy globally, Mandarin instruction is gaining popularity in schools in the United States, and has become an increasingly popular subject of study amongst the young in the Western world, as in the UK.
In 1991 there were 2,000 foreign learners taking China's official Chinese Proficiency Test (also known as HSK, comparable to the English Cambridge Certificate), while in 2005, the number of candidates had risen sharply to 117,660. By 2010, 750,000 people had taken the Chinese Proficiency Test. By 2017, 6.5 million candidates had taken the Chinese Proficiency Test of various kinds.
According to the Modern Language Association, there were 550 elementary, junior high and senior high schools providing Chinese programs in the United States in 2015, which represented a 100% increase in two years. At the same time, enrollment in Chinese language classes at college level had an increase of 51% from 2002 to 2015. On the other hand, figures from the American Council on the Teaching of Foreign Languages suggested that 30,000–50,000 students were studying Chinese in 2015.
In 2016, more than half a million Chinese students pursued post-secondary education overseas, whereas 400,000 international students came to China for higher education. Tsinghua University hosted 35,000 students from 116 countries in the same year.
With the increase in demand for Chinese as a second language, there are 330 institutions teaching Chinese language globally according to the Chinese Ministry of Education. The establishment of Confucius Institutes, which are the public institutions affiliated with the Ministry of Education of China, aims at promoting Chinese language and culture as well as supporting Chinese teaching overseas. There were more than 480 Confucius Institutes worldwide as of 2014.
Complex analysis
Complex analysis, traditionally known as the theory of functions of a complex variable, is the branch of mathematical analysis that investigates functions of complex numbers. It is useful in many branches of mathematics, including algebraic geometry, number theory, analytic combinatorics, applied mathematics; as well as in physics, including the branches of hydrodynamics, thermodynamics, and particularly quantum mechanics. By extension, use of complex analysis also has applications in engineering fields such as nuclear, aerospace, mechanical and electrical engineering.
As a differentiable function of a complex variable is equal to the sum of its Taylor series (that is, it is analytic), complex analysis is particularly concerned with analytic functions of a complex variable (that is, holomorphic functions).
Complex analysis is one of the classical branches in mathematics, with roots in the 18th century and just prior. Important mathematicians associated with complex numbers include Euler, Gauss, Riemann, Cauchy, Weierstrass, and many more in the 20th century. Complex analysis, in particular the theory of conformal mappings, has many physical applications and is also used throughout analytic number theory. In modern times, it has become very popular through a new boost from complex dynamics and the pictures of fractals produced by iterating holomorphic functions. Another important application of complex analysis is in string theory which studies conformal invariants in quantum field theory.
A complex function is a function from complex numbers to complex numbers. In other words, it is a function that has a subset of the complex numbers as a domain and the complex numbers as a codomain. Complex functions are generally supposed to have a domain that contains a nonempty open subset of the complex plane.
For any complex function, the values $z$ from the domain and their images $f(z)$ in the range may be separated into real and imaginary parts:

$$z = x + iy, \qquad f(z) = u(x, y) + i\,v(x, y),$$

where $x$, $y$, $u(x, y)$ and $v(x, y)$ are all real-valued.

In other words, a complex function $f$ may be decomposed into

$$u(x, y) + i\,v(x, y),$$

i.e., into two real-valued functions ($u$, $v$) of two real variables ($x$, $y$).

Similarly, any complex-valued function on an arbitrary set $X$ can be considered as an ordered pair of two real-valued functions or, alternatively, as a vector-valued function from $X$ into $\mathbb{R}^2$.
Some properties of complex-valued functions (such as continuity) are nothing more than the corresponding properties of vector-valued functions of two real variables. Other concepts of complex analysis, such as differentiability, are direct generalizations of the similar concepts for real functions, but may have very different properties. In particular, every differentiable complex function is analytic (see next section), and two differentiable functions that are equal in a neighborhood of a point are equal on the intersection of their domains (if the domains are connected). The latter property is the basis of the principle of analytic continuation, which allows the extension of every real analytic function, in a unique way, to a complex analytic function whose domain is the whole complex plane with a finite number of curve arcs removed. Many basic and special complex functions are defined in this way, including exponential functions, logarithmic functions, and trigonometric functions.
Complex functions that are differentiable at every point of an open subset $\Omega$ of the complex plane are said to be "holomorphic" on $\Omega$. In the context of complex analysis, the derivative of $f$ at $z_0$ is defined to be

$$f'(z_0) = \lim_{z \to z_0} \frac{f(z) - f(z_0)}{z - z_0}.$$
Superficially, this definition is formally analogous to that of the derivative of a real function. However, complex derivatives and differentiable functions behave in significantly different ways compared to their real counterparts. In particular, for this limit to exist, the value of the difference quotient must approach the same complex number, regardless of the manner in which we approach $z_0$ in the complex plane. Consequently, complex differentiability has much stronger implications than real differentiability. For instance, holomorphic functions are infinitely differentiable, whereas the existence of the "n"th derivative need not imply the existence of the ("n" + 1)th derivative for real functions. Furthermore, all holomorphic functions satisfy the stronger condition of analyticity, meaning that the function is, at every point in its domain, locally given by a convergent power series. In essence, this means that functions holomorphic on $\Omega$ can be approximated arbitrarily well by polynomials in some neighborhood of every point in $\Omega$. This stands in sharp contrast to differentiable real functions; there are infinitely differentiable real functions that are "nowhere" analytic.
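A standard example of this direction-dependence is the conjugation map $f(z) = \bar{z}$: its difference quotient at the origin is

$$\frac{f(z) - f(0)}{z - 0} = \frac{\bar{z}}{z} = \begin{cases} 1 & \text{along the real axis } (z = x), \\ -1 & \text{along the imaginary axis } (z = iy), \end{cases}$$

so the limit does not exist and $\bar{z}$ is nowhere complex-differentiable, even though, viewed as the real map $(x, y) \mapsto (x, -y)$, it is infinitely differentiable.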
Most elementary functions, including the exponential function, the trigonometric functions, and all polynomial functions, extended appropriately to complex arguments as functions $f : \mathbb{C} \to \mathbb{C}$, are holomorphic over the entire complex plane, making them "entire functions", while rational functions $p/q$, where $p$ and $q$ are polynomials, are holomorphic on domains that exclude points where $q$ is zero. Such functions that are holomorphic everywhere except a set of isolated points are known as "meromorphic functions". On the other hand, the functions $\bar{z}$, $\operatorname{Re}(z)$, and $|z|$ are not holomorphic anywhere on the complex plane, as can be shown by their failure to satisfy the Cauchy–Riemann conditions (see below).
An important property of holomorphic functions is the relationship between the partial derivatives of their real and imaginary components, known as the Cauchy–Riemann conditions. If $f(z) = u(x, y) + i\,v(x, y)$, where $z = x + iy$, is holomorphic on a region $\Omega$, then

$$\frac{\partial f}{\partial \bar{z}} = 0$$

must hold for all $z \in \Omega$. Here, the differential operator $\partial/\partial \bar{z}$ is defined as

$$\frac{\partial}{\partial \bar{z}} = \frac{1}{2}\left(\frac{\partial}{\partial x} + i\,\frac{\partial}{\partial y}\right).$$

In terms of the real and imaginary parts of the function, $u$ and $v$, this is equivalent to the pair of equations $u_x = v_y$ and $u_y = -v_x$, where the subscripts indicate partial differentiation. However, the Cauchy–Riemann conditions do not characterize holomorphic functions without additional continuity conditions (see the Looman–Menchoff theorem).
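As a quick worked check, take the entire function $f(z) = z^2$. Expanding in real and imaginary parts,

$$z^2 = (x + iy)^2 = (x^2 - y^2) + i\,(2xy), \qquad\text{so}\quad u = x^2 - y^2, \quad v = 2xy,$$

and indeed $u_x = 2x = v_y$ and $u_y = -2y = -v_x$, so the Cauchy–Riemann equations hold at every point of the plane.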
Holomorphic functions exhibit some remarkable features. For instance, Picard's theorem asserts that the range of an entire function can take only three possible forms: $\mathbb{C}$, $\mathbb{C} \setminus \{z_0\}$, or $\{z_0\}$ for some $z_0 \in \mathbb{C}$. In other words, if two distinct complex numbers $z$ and $w$ are not in the range of an entire function $f$, then $f$ is a constant function. Moreover, given a holomorphic function $f$ defined on an open set $U$, the analytic continuation of $f$ to a larger open set $V \supset U$ is unique. As a result, the value of a holomorphic function over an arbitrarily small region in fact determines the value of the function everywhere to which it can be extended as a holomorphic function.
"See also": analytic function, coherent sheaf and vector bundles.
One of the central tools in complex analysis is the line integral. The line integral around a closed path of a function that is holomorphic everywhere inside the area bounded by the closed path is always zero, as is stated by the Cauchy integral theorem. The values of such a holomorphic function inside a disk can be computed by a path integral on the disk's boundary (as shown in Cauchy's integral formula). Path integrals in the complex plane are often used to determine complicated real integrals, and here the theory of residues, among others, is applicable (see methods of contour integration). A "pole" (or isolated singularity) of a function is a point where the function's value becomes unbounded, or "blows up". If a function has such a pole, then one can compute the function's residue there, which can be used to compute path integrals involving the function; this is the content of the powerful residue theorem. The remarkable behavior of holomorphic functions near essential singularities is described by Picard's theorem. Functions that have only poles but no essential singularities are called meromorphic. Laurent series are the complex-valued equivalent to Taylor series, but can be used to study the behavior of functions near singularities through infinite sums of better-understood functions, such as polynomials.
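As a standard illustration of the residue method, consider the real integral $\int_{-\infty}^{\infty} \frac{dx}{1 + x^2}$. The integrand extends to the meromorphic function $f(z) = 1/(1 + z^2)$, with simple poles at $z = \pm i$. Closing the contour with a large semicircle in the upper half-plane (whose contribution vanishes, since $|f(z)|$ decays like $1/|z|^2$) encloses only the pole at $z = i$, where

$$\operatorname{Res}_{z = i} \frac{1}{1 + z^2} = \lim_{z \to i}\,(z - i)\,\frac{1}{(z - i)(z + i)} = \frac{1}{2i},$$

so the residue theorem gives

$$\int_{-\infty}^{\infty} \frac{dx}{1 + x^2} = 2\pi i \cdot \frac{1}{2i} = \pi,$$

in agreement with the elementary antiderivative $\arctan x$.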
A bounded function that is holomorphic in the entire complex plane must be constant; this is Liouville's theorem. It can be used to provide a natural and short proof for the fundamental theorem of algebra which states that the field of complex numbers is algebraically closed.
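The deduction of the fundamental theorem of algebra from Liouville's theorem is short. Suppose a polynomial $p$ has no zero in $\mathbb{C}$; then $g(z) = 1/p(z)$ is entire, and since $|p(z)| \to \infty$ as $|z| \to \infty$, $g$ is bounded on the whole plane. By Liouville's theorem $g$ is constant, hence $p$ is constant. Therefore every non-constant polynomial has a complex root.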
If a function is holomorphic throughout a connected domain then its values are fully determined by its values on any smaller subdomain. The function on the larger domain is said to be analytically continued from its values on the smaller domain. This allows the extension of the definition of functions, such as the Riemann zeta function, which are initially defined in terms of infinite sums that converge only on limited domains, to almost the entire complex plane. Sometimes, as in the case of the natural logarithm, it is impossible to analytically continue a holomorphic function to a non-simply connected domain in the complex plane, but it is possible to extend it to a holomorphic function on a closely related surface known as a Riemann surface.
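For instance, the Riemann zeta function is initially defined by the series

$$\zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^s},$$

which converges only for $\operatorname{Re}(s) > 1$; analytic continuation extends $\zeta$ to a holomorphic function on all of $\mathbb{C} \setminus \{1\}$, with a simple pole at $s = 1$.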
All this refers to complex analysis in one variable. There is also a very rich theory of complex analysis in more than one complex dimension in which the analytic properties such as power series expansion carry over whereas most of the geometric properties of holomorphic functions in one complex dimension (such as conformality) do not carry over. The Riemann mapping theorem about the conformal relationship of certain domains in the complex plane, which may be the most important result in the one-dimensional theory, fails dramatically in higher dimensions.
A major use of certain complex spaces is in quantum mechanics as wave functions.
History of China
The earliest known written records of the history of China date from as early as 1250 BC, from the Shang dynasty (c. 1600–1046 BC), during the reign of king Wu Ding, whom those same records mention as the twenty-first Shang king. Ancient historical texts such as the "Book of Documents" (early chapters, 11th century BC), the "Records of the Grand Historian" (c. 100 BC) and the "Bamboo Annals" (296 BC) mention and describe a Xia dynasty (c. 2070–1600 BC) before the Shang, but no writing is known from the period, and Shang writings do not indicate the existence of the Xia. The Shang ruled in the Yellow River valley, which is commonly held to be the cradle of Chinese civilization. However, Neolithic civilizations originated at various cultural centers along both the Yellow River and the Yangtze River. These Yellow River and Yangtze civilizations arose millennia before the Shang. With thousands of years of continuous history, China is one of the world's oldest civilizations, and is regarded as one of the cradles of civilization.
The Zhou dynasty (1046–256 BC) supplanted the Shang, and introduced the concept of the Mandate of Heaven to justify their rule. The central Zhou government began to weaken due to external and internal pressures in the 8th century BC, and the country eventually splintered into smaller states during the Spring and Autumn period. These states became independent and warred with one another in the following Warring States period. Much of traditional Chinese culture, literature and philosophy first developed during those troubled times.
In 221 BC, Qin Shi Huang conquered the various warring states and created for himself the title of "Huangdi" or "emperor" of the Qin, marking the beginning of imperial China. However, the oppressive government fell soon after his death, and was supplanted by the longer-lived Han dynasty (206 BC – 220 AD). Successive dynasties developed bureaucratic systems that enabled the emperor to control vast territories directly. In the 21 centuries from 206 BC until AD 1912, routine administrative tasks were handled by a special elite of "scholar-officials". Young men, well-versed in calligraphy, history, literature, and philosophy, were carefully selected through difficult government examinations. China's last dynasty was the Qing (1644–1912), which was replaced by the Republic of China in 1912, and then in the mainland by the People's Republic of China in 1949.
Chinese history has alternated between periods of political unity and peace, and periods of war and failed statehood—the most recent being the Chinese Civil War (1927–1949). China was occasionally dominated by steppe peoples, most of whom were eventually assimilated into the Han Chinese culture and population. Between eras of multiple kingdoms and warlordism, Chinese dynasties have ruled parts or all of China; in some eras control stretched as far as Xinjiang and Tibet, as at present. Traditional culture, and influences from other parts of Asia and the Western world (carried by waves of immigration, cultural assimilation, expansion, and foreign contact), form the basis of the modern culture of China.
What is now China was inhabited by "Homo erectus" more than a million years ago. Recent studies show that the stone tools found at the Xiaochangliang site are magnetostratigraphically dated to 1.36 million years ago. The archaeological site of Xihoudu in Shanxi Province has evidence of use of fire by "Homo erectus", dated to 1.27 million years ago, and "Homo erectus" fossils in China include the Yuanmou Man, the Lantian Man and the Peking Man. Fossilised teeth of "Homo sapiens" dating to 125,000–80,000 BC have been discovered in Fuyan Cave in Dao County in Hunan. Evidence of Middle Palaeolithic Levallois technology has been found in the lithic assemblage of the Guanyindong Cave site in southwest China, dated to approximately 170,000–80,000 years ago.
The Neolithic age in China can be traced back to about 10,000 BC. The earliest evidence of cultivated rice, found by the Yangtze River, is carbon-dated to 8,000 years ago. Early evidence for proto-Chinese millet agriculture is radiocarbon-dated to about 7000 BC. Farming gave rise to the Jiahu culture (7000 to 5800 BC). At Damaidi in Ningxia, 3,172 cliff carvings dating to 6000–5000 BC have been discovered, "featuring 8,453 individual characters such as the sun, moon, stars, gods and scenes of hunting or grazing". These pictographs are reputed to be similar to the earliest characters confirmed to be written Chinese. Chinese proto-writing existed in Jiahu around 7000 BC, Dadiwan from 5800 BC to 5400 BC, Damaidi around 6000 BC and Banpo dating from the 5th millennium BC. Some scholars have suggested that Jiahu symbols (7th millennium BC) were the earliest Chinese writing system. Excavation of a Peiligang culture site in Xinzheng county, Henan, found a community that flourished in 5,500 to 4,900 BC, with evidence of agriculture, constructed buildings, pottery, and burial of the dead. With agriculture came increased population, the ability to store and redistribute crops, and the potential to support specialist craftsmen and administrators. In late Neolithic times, the Yellow River valley began to establish itself as a center of Yangshao culture (5000 BC to 3000 BC), and the first villages were founded; the most archaeologically significant of these was found at Banpo, Xi'an. Later, Yangshao culture was superseded by the Longshan culture, which was also centered on the Yellow River from about 3000 BC to 2000 BC.
Bronze artifacts have been found at the Majiayao culture site (between 3100 and 2700 BC). The Bronze Age is also represented at the Lower Xiajiadian culture (2200–1600 BC) site in northeast China. Sanxingdui located in what is now Sichuan province is believed to be the site of a major ancient city, of a previously unknown Bronze Age culture (between 2000 and 1200 BC). The site was first discovered in 1929 and then re-discovered in 1986. Chinese archaeologists have identified the Sanxingdui culture to be part of the ancient kingdom of Shu, linking the artifacts found at the site to its early legendary kings.
Ferrous metallurgy began to appear in the late 6th century BC in the Yangzi Valley.
A bronze tomahawk with a blade of meteoric iron excavated near the city of Gaocheng in Shijiazhuang (now Hebei province) has been dated to the 14th century BC.
For this reason, authors such as Liana Chua and Mark Elliott have used the term "Iron Age" by convention for the transitional period of c. 500 BC to 100 BC, roughly corresponding to the Warring States period of Chinese historiography.
An Iron Age culture of the Tibetan Plateau has tentatively been associated with the Zhang Zhung culture described in early Tibetan writings.
The Xia dynasty of China (from c. 2070 to c. 1600 BC) is the first dynasty to be described in ancient historical records such as Sima Qian's "Records of the Grand Historian" and "Bamboo Annals". The dynasty was considered mythical by historians until scientific excavations found early Bronze Age sites at Erlitou, Henan in 1959. With few clear records matching the Shang oracle bones, it remains unclear whether these sites are the remains of the Xia dynasty or of another culture from the same period. Excavations that overlap the alleged time period of the Xia indicate a type of culturally similar groupings of chiefdoms. Early markings from this period found on pottery and shells are thought to be ancestral to modern Chinese characters.
According to ancient records, the dynasty ended around 1600 BC as a consequence of the Battle of Mingtiao.
Archaeological findings providing evidence for the existence of the Shang dynasty, c. 1600–1046 BC, are divided into two sets. The first set, from the earlier Shang period, comes from sources at Erligang, Zhengzhou, and Shangcheng. The second set, from the later Shang or Yin (殷) period, is at Anyang, in modern-day Henan, which has been confirmed as the last of the Shang's nine capitals (c. 1300–1046 BC). The findings at Anyang include the earliest written record of the Chinese so far discovered: inscriptions of divination records in ancient Chinese writing on the bones or shells of animals—the "oracle bones", dating from around 1250 BC.
A series of thirty-one kings reigned over the Shang dynasty. During their reign, according to the "Records of the Grand Historian", the capital city was moved six times. The final (and most important) move was to Yin in around 1300 BC which led to the dynasty's golden age. The term Yin dynasty has been synonymous with the Shang dynasty in history, although it has lately been used to refer specifically to the latter half of the Shang dynasty.
Chinese historians in later periods were accustomed to the notion of one dynasty succeeding another, but the political situation in early China was much more complicated. Hence, as some scholars of China suggest, the Xia and the Shang can refer to political entities that existed concurrently, just as the early Zhou existed at the same time as the Shang.
Although written records found at Anyang confirm the existence of the Shang dynasty, Western scholars are often hesitant to associate settlements that are contemporaneous with the Anyang settlement with the Shang dynasty. For example, archaeological findings at Sanxingdui suggest a technologically advanced civilization culturally unlike Anyang. The evidence is inconclusive in proving how far the Shang realm extended from Anyang. The leading hypothesis is that Anyang, ruled by the Shang of the official histories, coexisted and traded with numerous other culturally diverse settlements in the area that is now referred to as China proper.
The Zhou dynasty (1046 BC to approximately 256 BC) is the longest-lasting dynasty in Chinese history. By the end of the 2nd millennium BC, the Zhou dynasty began to emerge in the Yellow River valley, overrunning the territory of the Shang. The Zhou appear to have begun their rule under a semi-feudal system. The Zhou lived west of the Shang, and the Zhou leader was appointed Western Protector by the Shang. The ruler of the Zhou, King Wu, with the assistance of his brother, the Duke of Zhou, as regent, managed to defeat the Shang at the Battle of Muye.
The king of Zhou at this time invoked the concept of the Mandate of Heaven to legitimize his rule, a concept that was influential for almost every succeeding dynasty. Like Shangdi, Heaven ("tian") ruled over all the other gods, and it decided who would rule China. It was believed that a ruler lost the Mandate of Heaven when natural disasters occurred in great number, and when, more realistically, the sovereign had apparently lost his concern for the people. In response, the royal house would be overthrown, and a new house would rule, having been granted the Mandate of Heaven.
The Zhou initially moved their capital west to an area near modern Xi'an, on the Wei River, a tributary of the Yellow River, but they would preside over a series of expansions into the Yangtze River valley. This would be the first of many population migrations from north to south in Chinese history.
In the 8th century BC, power became decentralized during the Spring and Autumn period, named after the influential "Spring and Autumn Annals". In this period, local military leaders used by the Zhou began to assert their power and vie for hegemony. The situation was aggravated by the invasion of other peoples from the northwest, such as the Qin, forcing the Zhou to move their capital east to Luoyang. This marks the second major phase of the Zhou dynasty: the Eastern Zhou. The Spring and Autumn period is marked by a falling apart of the central Zhou power. In each of the hundreds of states that eventually arose, local strongmen held most of the political power and continued their subservience to the Zhou kings in name only. Some local leaders even started using royal titles for themselves. China now consisted of hundreds of states, some of them only as large as a village with a fort.
As the era continued, larger and more powerful states annexed or claimed suzerainty over smaller ones. By the 6th century BC most small states had disappeared by being annexed and just a few large and powerful principalities dominated China. Some southern states, such as Chu and Wu, claimed independence from the Zhou, who undertook wars against some of them (Wu and Yue). Many new cities were established in this period and Chinese culture was slowly shaped.
Once all these powerful rulers had firmly established themselves within their respective dominions, the bloodshed focused more fully on interstate conflict in the Warring States period, which began when the three remaining élite families in the Jin state—Zhao, Wei and Han—partitioned the state. Many famous individuals such as Laozi, Confucius and Sun Tzu lived during this chaotic period.
The Hundred Schools of Thought of Chinese philosophy blossomed during this period, and such influential intellectual movements as Confucianism, Taoism, Legalism and Mohism were founded, partly in response to the changing political world. The first two of these philosophies would have an enormous influence on Chinese culture.
After further political consolidation, seven prominent states remained by the end of the 5th century BC, and the years in which these few states battled each other are known as the Warring States period. Though there remained a nominal Zhou king until 256 BC, he was largely a figurehead and held little real power.
Numerous developments were made during this period in culture and mathematics. Examples include an important literary achievement, the "Zuo Zhuan" commentary on the "Spring and Autumn Annals", which summarizes the preceding Spring and Autumn period, and the bundle of 21 bamboo slips from the Tsinghua collection, dated to 305 BC, which is the world's earliest known example of a two-digit decimal multiplication table, indicating that sophisticated commercial arithmetic was already established during this period.
As neighboring territories of these warring states, including areas of modern Sichuan and Liaoning, were annexed, they were governed under the new local administrative system of commandery and prefecture. This system had been in use since the Spring and Autumn period, and parts can still be seen in the modern system of Sheng and Xian (province and county).
The final expansion in this period began during the reign of Ying Zheng, the king of Qin. His unification of the other six powers, and further annexations in the modern regions of Zhejiang, Fujian, Guangdong and Guangxi in 214 BC, enabled him to proclaim himself the First Emperor (Qin Shi Huang).
The Imperial China Period can be divided into three sub-periods: Early, Middle, and Late.
Major events in the Early sub-period include the Qin unification of China and their replacement by the Han, the First Split followed by the Jin unification, and the loss of north China. The Middle sub-period was marked by the Sui unification and their succession by the Tang, the Second Split, and the Song unification. The Late sub-period included the Yuan, Ming, and Qing dynasties.
Historians often refer to the period from the Qin dynasty to the end of the Qing dynasty as Imperial China. Though the unified reign of the First Qin Emperor lasted only 12 years, he managed to subdue great parts of what constitutes the core of the Han Chinese homeland and to unite them under a tightly centralized Legalist government seated at Xianyang (close to modern Xi'an). The doctrine of Legalism that guided the Qin emphasized strict adherence to a legal code and the absolute power of the emperor. This philosophy, while effective for expanding the empire in a military fashion, proved unworkable for governing it in peacetime. The Qin Emperor presided over the brutal silencing of political opposition, including the event known as the burning of books and burying of scholars. This would be the impetus behind the later Han synthesis incorporating the more moderate schools of political governance.
Major contributions of the Qin include the concept of a centralized government, and the unification and development of the legal code, the written language, measurement, and currency of China after the tribulations of the Spring and Autumn and Warring States periods. Even something as basic as the length of axles for carts—which need to match ruts in the roads—had to be made uniform to ensure a viable trading system throughout the empire. Also as part of its centralization, the Qin connected the northern border walls of the states it defeated, making the first Great Wall of China.
The tribes of the north, collectively called the Wu Hu by the Qin, were free from Chinese rule during the majority of the dynasty. Prohibited from trading with Qin dynasty peasants, the Xiongnu tribe living in the Ordos region in northwest China often raided them instead, prompting the Qin to retaliate. After a military campaign led by General Meng Tian, the region was conquered in 215 BC and agriculture was established; the peasants, however, were discontented and later revolted. The succeeding Han dynasty also expanded into the Ordos due to overpopulation, but depleted their resources in the process. Indeed, this was true of the dynasty's borders in multiple directions; modern Inner Mongolia, Xinjiang, Tibet, Manchuria, and regions to the southeast were foreign to the Qin, and even areas over which they had military control were culturally distinct.
As early as Confucius and up until 1912, there was reliance upon a trained intellectual elite, the "scholar-official" ("scholar-gentlemen"). They were civil servants appointed by the Emperor to handle daily governance. Talented young men were selected through an elaborate process of imperial examination. They had to demonstrate skill at calligraphy, and had to know Confucian philosophy. Historian Wing-Tsit Chan concludes that:
After Emperor Qin Shi Huang's unnatural death due to the consumption of mercury pills, the Qin government drastically deteriorated and eventually capitulated in 207 BC after the Qin capital was captured and sacked by rebels, which would ultimately lead to the establishment of a new dynasty of a unified China. Despite the short 15-year duration of the Qin dynasty, it was immensely influential on China and the structure of future Chinese dynasties.
The Han dynasty was founded by Liu Bang, who emerged victorious in the Chu–Han Contention that followed the fall of the Qin dynasty. A golden age in Chinese history, the Han dynasty's long period of stability and prosperity consolidated the foundation of China as a unified state under a central imperial bureaucracy, which was to last intermittently for most of the next two millennia. During the Han dynasty, the territory of China was extended to most of China proper and to areas far west. Confucianism was officially elevated to orthodox status and was to shape the subsequent Chinese civilization. Art, culture and science all advanced to unprecedented heights. With the profound and lasting impacts of this period of Chinese history, the dynasty name "Han" has been taken as the name of the Chinese people, now the dominant ethnic group in modern China, and has been commonly used to refer to the Chinese language and written characters. The Han dynasty also saw many mathematical innovations, such as the method of Gaussian elimination, which appeared in Chapter Eight, "Rectangular Arrays", of the Chinese mathematical text "The Nine Chapters on the Mathematical Art". Its use is illustrated in eighteen problems, with two to five equations. The first reference to the book by this title is dated to 179 AD, but parts of it were written as early as approximately 150 BC, more than 1500 years before Europeans arrived at the method in the 18th century.
After the initial laissez-faire policies of Emperors Wen and Jing, the ambitious Emperor Wu brought the empire to its zenith. To consolidate his power, he gave exclusive patronage to Confucianism, which emphasizes stability and order in a well-structured society, as the guiding philosophy and moral code of the empire. Imperial Universities were established to support its study and further development, while other schools of thought were discouraged.
Major military campaigns were launched to weaken the nomadic Xiongnu Empire, limiting their influence north of the Great Wall. Together with the diplomatic efforts of Zhang Qian, these campaigns extended the Han Empire's sphere of influence to the states in the Tarim Basin and opened up the Silk Road, which connected China to the west and stimulated bilateral trade and cultural exchange. To the south, various small kingdoms far beyond the Yangtze River Valley were formally incorporated into the empire.
Emperor Wu also dispatched a series of military campaigns against the Baiyue tribes. The Han annexed Minyue in 135 BC and 111 BC, Nanyue in 111 BC, and Dian in 109 BC. Migration and military expeditions led to the cultural assimilation of the south. It also brought the Han into contact with kingdoms in Southeast Asia, introducing diplomacy and trade.
After Emperor Wu, the empire slipped into gradual stagnation and decline. Economically, the state treasury was strained by excessive campaigns and projects, while land acquisitions by elite families gradually drained the tax base. Various consort clans exerted increasing control over a string of incompetent emperors, and eventually the dynasty was briefly interrupted by the usurpation of Wang Mang.
In AD 9, the usurper Wang Mang claimed that the Mandate of Heaven called for the end of the Han dynasty and the rise of his own, and he founded the short-lived Xin ("New") dynasty. Wang Mang started an extensive program of land and other economic reforms, including the outlawing of slavery and land nationalization and redistribution. These programs, however, were never supported by the landholding families, because the reforms favored the peasants. The instability of power brought about chaos, uprisings, and loss of territories. This was compounded by mass flooding of the Yellow River; silt buildup caused it to split into two channels and displaced large numbers of farmers. Wang Mang was eventually killed in Weiyang Palace by an enraged peasant mob in AD 23.
Emperor Guangwu reinstated the Han dynasty with the support of landholding and merchant families at Luoyang, "east" of the former capital Xi'an. Thus, this new era is termed the Eastern Han dynasty. With the capable administrations of Emperors Ming and Zhang, the former glories of the dynasty were reclaimed, with brilliant military and cultural achievements. The Xiongnu Empire was decisively defeated. The diplomat and general Ban Chao further expanded the conquests across the Pamirs to the shores of the Caspian Sea, thus reopening the Silk Road and bringing trade and foreign cultures, along with the arrival of Buddhism. With extensive connections with the west, the first of several Roman embassies to China was recorded in Chinese sources, arriving by the sea route in AD 166, with a second in AD 284.
The Eastern Han dynasty was one of the most prolific eras of science and technology in ancient China, notable for the historic invention of papermaking by Cai Lun and the numerous scientific and mathematical contributions of the famous polymath Zhang Heng.
By the 2nd century, the empire declined amidst land acquisitions, invasions, and feuding between consort clans and eunuchs. The Yellow Turban Rebellion broke out in AD 184, ushering in an era of warlords. In the ensuing turmoil, three states tried to gain predominance in the period of the Three Kingdoms. This time period has been greatly romanticized in works such as "Romance of the Three Kingdoms".
After Cao Cao reunified the north in 208, his son proclaimed the Wei dynasty in 220. Soon, Wei's rivals Shu and Wu proclaimed their independence, leading China into the Three Kingdoms period. This period was characterized by a gradual decentralization of the state that had existed during the Qin and Han dynasties, and an increase in the power of great families.
In 266, the Jin dynasty overthrew the Wei and later unified the country in 280, but this union was short-lived.
The Jin dynasty was severely weakened by internecine fighting among imperial princes and lost control of northern China after non-Han Chinese settlers rebelled and captured Luoyang and Chang'an. In 317, a Jin prince in modern-day Nanjing became emperor and continued the dynasty, now known as the Eastern Jin, which held southern China for another century. Prior to this move, historians refer to the Jin dynasty as the Western Jin.
Northern China fragmented into a series of independent kingdoms, most of which were founded by Xiongnu, Xianbei, Jie, Di and Qiang rulers. These non-Han peoples were ancestors of the Turks, Mongols, and Tibetans. Many had, to some extent, been "sinicized" long before their ascent to power. In fact, some of them, notably the Qiang and the Xiongnu, had already been allowed to live in the frontier regions within the Great Wall since late Han times. During the period of the Sixteen Kingdoms, warfare ravaged the north and prompted large-scale Han Chinese migration south to the Yangtze River Basin and Delta.
In the early 5th century, China entered a period known as the Northern and Southern dynasties, in which parallel regimes ruled the northern and southern halves of the country. In the south, the Eastern Jin gave way to the Liu Song, Southern Qi, Liang and finally Chen. Each of these Southern dynasties was led by Han Chinese ruling families and used Jiankang (modern Nanjing) as the capital. They held off attacks from the north and preserved many aspects of Chinese civilization, while northern barbarian regimes began to sinicize.
In the north, the last of the Sixteen Kingdoms was extinguished in 439 by the Northern Wei, a kingdom founded by the Xianbei, a nomadic people who unified northern China. The Northern Wei eventually split into the Eastern and Western Wei, which then became the Northern Qi and Northern Zhou. These regimes were dominated by Xianbei or Han Chinese who had married into Xianbei families. During this period most Xianbei people adopted Han surnames, eventually leading to complete assimilation into the Han.
Despite the division of the country, Buddhism spread throughout the land. In southern China, fierce debates about whether Buddhism should be allowed were held frequently by the royal court and nobles. By the end of the era, Buddhists and Taoists had become much more tolerant of each other.
The short-lived Sui dynasty was a pivotal period in Chinese history. Founded by Emperor Wen in 581 in succession to the Northern Zhou, the Sui went on to conquer the Southern Chen in 589 to reunify China, ending three centuries of political division. The Sui pioneered many new institutions, including the government system of Three Departments and Six Ministries and the imperial examinations for selecting officials from commoners, while improving on the fubing system of army conscription and the equal-field system of land distribution. These policies, which were adopted by later dynasties, brought enormous population growth and amassed enormous wealth for the state. Standardized coinage was enforced throughout the unified empire. Buddhism took root as a prominent religion and was supported officially. Sui China was known for its numerous mega-construction projects. Intended for grain shipment and transporting troops, the Grand Canal was constructed, linking the capitals Daxing (Chang'an) and Luoyang to the wealthy southeast region and, by another route, to the northeast border. The Great Wall was also expanded, while a series of military conquests and diplomatic maneuvers further pacified the borders. However, the massive invasions of the Korean Peninsula during the Goguryeo–Sui War failed disastrously, triggering widespread revolts that led to the fall of the dynasty.
The Tang dynasty was founded by Emperor Gaozu on 18 June 618. It was a golden age of Chinese civilization, considered the most prosperous period of China, with significant developments in culture, art, literature (particularly poetry) and technology. Buddhism became the predominant religion for the common people. Chang'an (modern Xi'an), the national capital, was the largest city in the world during its time.
The second emperor, Taizong, is widely regarded as one of the greatest emperors in Chinese history, having laid the foundation for the dynasty to flourish for centuries beyond his reign. Combined military conquests and diplomatic maneuvers were implemented to eliminate threats from nomadic tribes, extend the border, and bring neighboring states into a tributary system. Military victories in the Tarim Basin kept the Silk Road open, connecting Chang'an to Central Asia and areas far to the west. In the south, lucrative maritime trade routes began from port cities such as Guangzhou. There was extensive trade with distant foreign countries, and many foreign merchants settled in China, encouraging a cosmopolitan culture. The Tang culture and social systems were observed and imitated by neighboring countries, most notably Japan. Internally the Grand Canal linked the political heartland in Chang'an to the agricultural and economic centers in the eastern and southern parts of the empire. Xuanzang, a Chinese Buddhist monk, scholar, traveller, and translator, travelled to India on his own and returned with "over six hundred Mahayana and Hinayana texts, seven statues of the Buddha and more than a hundred sarira relics."
Underlying the prosperity of the early Tang dynasty was a strong centralized bureaucracy with efficient policies. The government was organized as "Three Departments and Six Ministries" to separately draft, review, and implement policies. These departments were run by royal family members as well as scholar officials who were selected by imperial examinations. These practices, which matured in the Tang dynasty, were continued by the later dynasties, with some modifications.
Under the Tang "equal-field system" all land was owned by the Emperor and granted to people according to household size. Men granted land were conscripted for military service for a fixed period each year, a military policy known as the "Fubing system". These policies stimulated rapid growth in productivity and supported a significant army without much burden on the state treasury. By the dynasty's midpoint, however, standing armies had replaced conscription, and land was continuously falling into the hands of private owners.
The dynasty continued to flourish under the rule of Empress Wu Zetian, the only empress regnant in Chinese history, and reached its zenith during the long reign of Emperor Xuanzong, who oversaw an empire that stretched from the Pacific to the Aral Sea with at least 50 million people. There were vibrant artistic and cultural creations, including the works of the greatest Chinese poets, Li Bai and Du Fu.
At the zenith of the empire's prosperity, the An Lushan Rebellion of 755 to 763 was a watershed event that devastated the population and drastically weakened the central imperial government. Upon suppression of the rebellion, regional military governors, known as Jiedushi, gained increasingly autonomous status. With the loss of revenue from the land tax, the central imperial government relied heavily on the salt monopoly. Externally, formerly submissive states raided the empire, and vast border territories were irreversibly lost for subsequent centuries. Nevertheless, civil society recovered and thrived amidst the weakened imperial bureaucracy.
In the late Tang period, the empire was worn out by recurring revolts of regional warlords, while internally, as scholar-officials engaged in fierce factional strife, corrupt eunuchs amassed immense power. Catastrophically, the Huang Chao Rebellion, from 874 to 884, devastated the entire empire for a decade. The sack of the southern port Guangzhou in 879 was followed by the massacre of most of its inhabitants, along with the large foreign merchant enclaves. By 881, both capitals, Luoyang and Chang'an, had fallen successively. The reliance on ethnic Han and Turkic warlords to suppress the rebellion increased their power and influence. Consequently, the fall of the dynasty following Zhu Wen's usurpation led to an era of division.
The period of political disunity between the Tang and the Song, known as the Five Dynasties and Ten Kingdoms period, lasted from 907 to 960. During this half-century, China was in all respects a multi-state system. Five regimes, namely the (Later) Liang, Tang, Jin, Han and Zhou, rapidly succeeded one another in control of the traditional imperial heartland in northern China. Among these regimes, the rulers of the (Later) Tang, Jin and Han were sinicized Shatuo Turks, who ruled over an ethnic majority of Han Chinese. More stable and smaller regimes, mostly under ethnic Han rulers, coexisted in southern and western China over the period, cumulatively constituting the "Ten Kingdoms".
Amidst political chaos in the north, the strategic Sixteen Prefectures (the region along today's Great Wall) were ceded to the emerging Khitan Liao dynasty, which drastically weakened the defense of China proper against northern nomadic empires. To the south, Vietnam gained lasting independence after being a Chinese prefecture for many centuries. With warfare dominating northern China, there were mass southward migrations of the population, which further enhanced the southward shift of China's cultural and economic centers. The era ended with the coup of the Later Zhou general Zhao Kuangyin and the establishment of the Song dynasty in 960, which eventually annihilated the remains of the "Ten Kingdoms" and reunified China.
In 960, the Song dynasty was founded by Emperor Taizu, with its capital established in Kaifeng (also known as Bianjing). In 979, the Song dynasty reunified most of China proper, while large swaths of the outer territories were occupied by sinicized nomadic empires. The Khitan Liao dynasty, which lasted from 907 to 1125, ruled over Manchuria, Mongolia, and parts of northern China. Meanwhile, in what are now the north-western Chinese provinces of Gansu, Shaanxi, and Ningxia, the Tangut tribes founded the Western Xia dynasty, which lasted from 1032 to 1227.
Aiming to recover the strategic Sixteen Prefectures lost under the previous dynasty, the Song launched campaigns against the Liao dynasty in the early Song period, all of which ended in failure. Then in 1004, the Liao cavalry swept over the exposed North China Plain and reached the outskirts of Kaifeng, forcing the Song to submit and then agree to the Chanyuan Treaty, which imposed heavy annual tributes from the Song treasury. The treaty was a significant reversal of Chinese dominance of the traditional tributary system. Yet the annual outflow of Song silver to the Liao was paid back through the Liao's purchase of Chinese goods and products, which expanded the Song economy and replenished its treasury. This dampened the incentive for the Song to campaign further against the Liao. Meanwhile, this cross-border trade and contact induced further sinicization within the Liao Empire, at the expense of its military might, which was derived from its nomadic lifestyle. Similar treaties and social-economic consequences occurred in the Song's relations with the Jin dynasty.
Within the Liao Empire, the Jurchen tribes revolted against their overlords to establish the Jin dynasty in 1115. In 1125, the devastating Jin cataphracts annihilated the Liao dynasty, while remnants of the Liao court fled to Central Asia to found the Qara Khitai Empire (Western Liao dynasty). The Jin invasion of the Song dynasty followed swiftly. In 1127, Kaifeng was sacked, a massive catastrophe known as the Jingkang Incident, ending the Northern Song dynasty. The entire north of China was later conquered. The surviving members of the Song court regrouped in the new capital city of Hangzhou and initiated the Southern Song dynasty, which ruled territories south of the Huai River. In the ensuing years, the territory and population of China were divided between the Song dynasty, the Jin dynasty and the Western Xia dynasty. The era ended with the Mongol conquest, as Western Xia fell in 1227, the Jin dynasty in 1234, and finally the Southern Song dynasty in 1279.
Despite its military weakness, the Song dynasty is widely considered to be the high point of classical Chinese civilization. The Song economy, facilitated by technological advancement, reached a level of sophistication probably unseen in world history before its time. The population soared to over 100 million, and the living standards of common people improved tremendously due to improvements in rice cultivation and the wide availability of coal for production. The capital cities of Kaifeng and subsequently Hangzhou were both the most populous cities in the world for their time, and encouraged vibrant civil societies unmatched by previous Chinese dynasties. Although land trading routes to the far west were blocked by nomadic empires, there was extensive maritime trade with neighboring states, which facilitated the use of Song coinage as the de facto currency of exchange. Giant wooden vessels equipped with compasses traveled throughout the China Seas and the northern Indian Ocean. The concept of insurance was practised by merchants to hedge the risks of such long-haul maritime shipments. With prosperous economic activities, the first known use of paper currency emerged in the western city of Chengdu, as a supplement to the existing copper coins.
The Song dynasty was considered the golden age of great advancements in science and technology in China, thanks to innovative scholar-officials such as Su Song (1020–1101) and Shen Kuo (1031–1095). Innovations such as the hydro-mechanical astronomical clock, the first continuous and endless power-transmitting chain, woodblock printing and paper money all originated during the Song dynasty.
There was court intrigue between the political reformers and conservatives, led by the chancellors Wang Anshi and Sima Guang, respectively. By the mid-to-late 13th century, the Chinese had adopted the dogma of Neo-Confucian philosophy formulated by Zhu Xi. Enormous literary works were compiled during the Song dynasty, such as the historical work, the "Zizhi Tongjian" ("Comprehensive Mirror to Aid in Government"). The invention of movable-type printing further facilitated the spread of knowledge. Culture and the arts flourished, with grandiose artworks such as "Along the River During the Qingming Festival" and "Eighteen Songs of a Nomad Flute", along with great Buddhist painters such as the prolific Lin Tinggui.
The Song dynasty was also a period of major innovation in the history of warfare. Gunpowder, while invented in the Tang dynasty, was first put to use on battlefields by the Song army, inspiring a succession of new firearm and siege engine designs. During the Southern Song dynasty, as its survival hinged decisively on guarding the Yangtze and Huai Rivers against the cavalry forces from the north, the first standing navy in China was assembled in 1132, with its admiral's headquarters established at Dinghai. Paddle-wheel warships equipped with trebuchets could launch incendiary bombs made of gunpowder and lime, as recorded in the Song's victory over the invading Jin forces at the Battle of Tangdao in the East China Sea, and the Battle of Caishi on the Yangtze River in 1161.
The advances in civilization during the Song dynasty came to an abrupt end following the devastating Mongol conquest, during which the population sharply dwindled, with a marked contraction in the economy. Despite fiercely halting the Mongol advance for more than three decades, the Southern Song capital Hangzhou fell in 1276, followed by the final annihilation of the Song standing navy at the Battle of Yamen in 1279.
The Yuan dynasty was formally proclaimed in 1271, when the Great Khan of the Mongols, Kublai Khan, one of the grandsons of Genghis Khan, assumed the additional title of Emperor of China and considered his inherited part of the Mongol Empire a Chinese dynasty. In the preceding decades, the Mongols had conquered the Jin dynasty in northern China, and the Southern Song dynasty fell in 1279 after a protracted and bloody war. The Mongol Yuan dynasty became the first conquest dynasty in Chinese history to rule the entire China proper and its population as an ethnic minority. The dynasty also directly controlled the Mongolian heartland and other regions, inheriting the largest share of territory of the divided Mongol Empire, which roughly coincided with the modern area of China and nearby regions in East Asia. Further expansion of the empire was halted after defeats in the invasions of Japan and Vietnam. Following the previous Jin dynasty, the capital of the Yuan dynasty was established at Khanbaliq (also known as Dadu, modern-day Beijing). The Grand Canal was reconstructed to connect the remote capital city to economic hubs in the southern part of China, setting the precedent and foundation whereby Beijing would largely remain the capital of the successive regimes that unified mainland China.
After the peace treaty in 1304 that ended a series of Mongol civil wars, the emperors of the Yuan dynasty were upheld as the nominal Great Khan (Khagan) of the greater Mongol Empire over the other Mongol Khanates, which nonetheless remained de facto autonomous. The era was known as "Pax Mongolica", when much of the Asian continent was ruled by the Mongols. For the first and only time in history, the Silk Road was controlled entirely by a single state, facilitating the flow of people, trade, and cultural exchange. A network of roads and a postal system were established to connect the vast empire. Lucrative maritime trade, developed from the previous Song dynasty, continued to flourish, with Quanzhou and Hangzhou emerging as the largest ports in the world. Adventurous travelers from the far west, most notably the Venetian Marco Polo, settled in China for decades. Upon his return, Marco Polo's detailed travel record inspired generations of medieval Europeans with the splendors of the Far East. The Yuan dynasty was the first economy in which paper currency, known at the time as Jiaochao, was used as the predominant medium of exchange. Its unrestricted issuance in the late Yuan dynasty caused hyperinflation, which eventually brought about the downfall of the dynasty.
While the Mongol rulers of the Yuan dynasty adapted substantially to Chinese culture, their sinicization was of a lesser extent than that of earlier conquest dynasties in Chinese history. To preserve their superiority as the conquering and ruling class, traditional nomadic customs and heritage from the Mongolian steppe were held in high regard. On the other hand, the Mongol rulers also adapted flexibly to a variety of cultures from the many advanced civilizations within the vast empire. Traditional social structure and culture in China underwent immense transformation during the Mongol dominance. Large groups of foreign migrants settled in China, who enjoyed elevated social status over the majority Han Chinese, while enriching Chinese culture with foreign elements. The class of scholar-officials and intellectuals, traditional bearers of elite Chinese culture, lost substantial social status. This stimulated the development of the culture of the common folk. There were prolific works in zaju variety shows and literary songs (sanqu), which were written in a distinctive poetry style known as qu. Novels in the vernacular style gained unprecedented status and popularity.
Before the Mongol invasion, Chinese dynasties reported approximately 120 million inhabitants; after the conquest had been completed in 1279, the 1300 census reported roughly 60 million people. This major decline is not necessarily due only to Mongol killings. Scholars such as Frederick W. Mote argue that the wide drop in numbers reflects an administrative failure to record rather than an actual decrease; others such as Timothy Brook argue that the Mongols created a system of enserfment among a huge portion of the Chinese populace, causing many to disappear from the census altogether; other historians including William McNeill and David Morgan consider that plague was the main factor behind the demographic decline during this period. In the 14th century China suffered additional depredations from epidemics of plague, estimated to have killed 25 million people, 30% of the population of China.
Throughout the Yuan dynasty, there was some general sentiment among the populace against Mongol dominance. Yet it was mainly a string of natural disasters and incompetent governance, rather than nationalist causes, that triggered widespread peasant uprisings from the 1340s. After the massive naval engagement at Lake Poyang, Zhu Yuanzhang prevailed over the other rebel forces in the south. He proclaimed himself emperor and founded the Ming dynasty in 1368. The same year his northern expedition army captured the capital Khanbaliq. The Yuan remnants fled back to Mongolia and sustained the regime there. Other Mongol Khanates in Central Asia continued to exist after the fall of the Yuan dynasty in China.
The Ming dynasty was founded by Zhu Yuanzhang in 1368, who proclaimed himself the Hongwu Emperor. The capital was initially set at Nanjing, and was later moved to Beijing from the Yongle Emperor's reign onward.
Urbanization increased as the population grew and as the division of labor grew more complex. Large urban centers, such as Nanjing and Beijing, also contributed to the growth of private industry. In particular, small-scale industries grew up, often specializing in paper, silk, cotton, and porcelain goods. For the most part, however, relatively small urban centers with markets proliferated around the country. Town markets mainly traded food, with some necessary manufactures such as pins or oil.
Despite the xenophobia and intellectual introspection characteristic of the increasingly popular new school of neo-Confucianism, China under the early Ming dynasty was not isolated. Foreign trade and other contacts with the outside world, particularly Japan, increased considerably. Chinese merchants explored all of the Indian Ocean, reaching East Africa with the voyages of Zheng He.
The Hongwu Emperor, one of the few founders of a Chinese dynasty who was of peasant origin, laid the foundation of a state that relied fundamentally on agriculture. Commerce and trade, which had flourished in the previous Song and Yuan dynasties, were less emphasized. Neo-feudal landholdings of the Song and Mongol periods were expropriated by the Ming rulers. Land estates were confiscated by the government, fragmented, and rented out. Private slavery was forbidden. Consequently, after the death of the Yongle Emperor, independent peasant landholders predominated in Chinese agriculture. These laws may have paved the way to removing the worst of the poverty of the previous regimes. Towards the later era of the Ming dynasty, with declining government control, commerce, trade and private industries revived.
The dynasty had a strong and complex central government that unified and controlled the empire. The emperor's role became more autocratic, although the Hongwu Emperor necessarily continued to use what he called the "Grand Secretariat" to assist with the immense paperwork of the bureaucracy, including memorials (petitions and recommendations to the throne), imperial edicts in reply, reports of various kinds, and tax records. It was this same bureaucracy that later prevented the Ming government from being able to adapt to changes in society and eventually led to its decline.
The Yongle Emperor strenuously tried to extend China's influence beyond its borders by demanding other rulers send ambassadors to China to present tribute. A large navy was built, including four-masted ships displacing 1,500 tons. A standing army of 1 million troops was created. The Chinese armies conquered and occupied Vietnam for around 20 years, while the Chinese fleet sailed the China seas and the Indian Ocean, cruising as far as the east coast of Africa. The Chinese gained influence in eastern Moghulistan. Several maritime Asian nations sent envoys with tribute for the Chinese emperor. Domestically, the Grand Canal was expanded and became a stimulus to domestic trade. Over 100,000 tons of iron per year were produced. Many books were printed using movable type. The imperial palace in Beijing's Forbidden City reached its current splendor. It was also during these centuries that the potential of south China came to be fully exploited. New crops were widely cultivated and industries such as those producing porcelain and textiles flourished.
In 1449, Esen Tayisi led an Oirat Mongol invasion of northern China which culminated in the capture of the Zhengtong Emperor at Tumu. From then on, the Ming were on the defensive on the northern frontier, which led to the building of the Ming Great Wall. Most of what remains of the Great Wall of China today was either built or repaired by the Ming. The brick and granite work was enlarged, the watchtowers were redesigned, and cannons were placed along its length.
At sea, the Ming became increasingly isolationist after the death of the Yongle Emperor. The treasure voyages which had sailed the Indian Ocean were discontinued, and maritime prohibition laws were set in place banning the Chinese from sailing abroad. European traders who reached China in the midst of the Age of Discovery were repeatedly rebuked in their requests for trade, with the Portuguese being repulsed by the Ming navy at Tuen Mun in 1521 and again in 1522. Domestic and foreign demands for overseas trade, deemed illegal by the state, led to widespread "wokou" piracy attacking the southeastern coastline during the rule of the Jiajing Emperor (1507–1567), which only subsided after the opening of ports in Guangdong and Fujian and much military suppression. The Portuguese were allowed to settle in Macau in 1557 for trade, which remained in Portuguese hands until 1999. The Dutch entry into the Chinese seas was also met with fierce resistance, with the Dutch being chased off the Penghu islands in the Sino-Dutch conflicts of 1622–1624 and forced to settle in Taiwan instead. The Dutch in Taiwan fought the Ming in the Battle of Liaoluo Bay in 1633 and lost, and eventually surrendered to the Ming loyalist Koxinga in 1662, after the fall of the Ming dynasty.
In 1556, during the rule of the Jiajing Emperor, the Shaanxi earthquake killed about 830,000 people, the deadliest earthquake of all time.
The Ming dynasty intervened deeply in the Japanese invasions of Korea (1592–98), which ended with the withdrawal of all invading Japanese forces from Korea and the restoration of the Joseon dynasty, its traditional ally and tributary state. The regional hegemony of the Ming dynasty was preserved, but at a toll on its resources. Meanwhile, with the Ming's control of Manchuria in decline, the Manchu (Jurchen) tribes, under their chieftain Nurhaci, broke away from Ming rule and emerged as a powerful, unified state, which was later proclaimed as the Qing dynasty. It went on to subdue the much-weakened Korea as its tributary, conquered Mongolia, and expanded its territory to the outskirts of the Great Wall. The most elite army of the Ming dynasty was stationed at the Shanhai Pass to guard the last stronghold against the Manchus, which weakened its suppression of internal peasant uprisings.
The Qing dynasty (1644–1911) was the last imperial dynasty in China. Founded by the Manchus, it was the second conquest dynasty to rule the territory of China proper, and it roughly doubled the territory controlled by the Ming. The Manchus were formerly known as Jurchens, residing in the northeastern part of the Ming territory outside the Great Wall. They emerged as the major threat to the late Ming dynasty after Nurhaci united the Jurchen tribes, and his son Hong Taiji declared the founding of the Qing dynasty in 1636. The Qing dynasty set up the Eight Banners system that provided the basic framework for the Qing military conquest. Li Zicheng's peasant rebellion captured Beijing in 1644, and the Chongzhen Emperor, the last Ming emperor, committed suicide. The Manchus allied with the Ming general Wu Sangui to seize Beijing, which was made the capital of the Qing dynasty, and then proceeded to subdue the Ming remnants in the south. The decades of Manchu conquest caused enormous loss of life, and the economic scale of China shrank drastically. In total, the Qing conquest of the Ming (1618–1683) cost as many as 25 million lives. The early Manchu emperors combined traditions of Central Asian rule with Confucian norms of traditional Chinese government and were considered a Chinese dynasty.
The Manchus enforced a "queue order", forcing Han Chinese men to adopt the Manchu queue hairstyle. Officials were required to wear Manchu-style clothing such as the "changshan" (bannermen dress and "tangzhuang"), but ordinary Han civilians were allowed to wear traditional Han clothing. Bannermen could not undertake trade or manual labor; they had to petition to be removed from banner status. They were considered a form of nobility and were given annual pensions, land, and allotments of cloth. The Kangxi Emperor ordered the creation of the "Kangxi Dictionary", the most complete dictionary of Chinese characters that had been compiled.
Over the next half-century, all areas previously under the Ming dynasty were consolidated under the Qing. Conquests in Central Asia in the eighteenth century extended territorial control. Between 1673 and 1681, the Kangxi Emperor suppressed the Revolt of the Three Feudatories, an uprising of three generals in Southern China who had been denied hereditary rule of large fiefdoms granted by the previous emperor. In 1683, the Qing staged an amphibious assault on southern Taiwan, bringing down the rebel Kingdom of Tungning, which was founded by the Ming loyalist Koxinga (Zheng Chenggong) in 1662 after the fall of the Southern Ming, and had served as a base for continued Ming resistance in Southern China. The Qing defeated the Russians at Albazin, resulting in the Treaty of Nerchinsk.
By the end of the Qianlong Emperor's long reign in 1796, the Qing Empire was at its zenith. The Qing ruled more than one-third of the world's population and had the largest economy in the world. By area, it was one of the largest empires ever.
In the 19th century, the empire was internally restive and externally threatened by western powers. The defeat by the British Empire in the First Opium War (1839–1842) led to the Treaty of Nanking (1842), under which Hong Kong was ceded to Britain and the importation of opium (produced by British Empire territories) was allowed. Opium usage continued to grow in China, adversely affecting societal stability. Subsequent military defeats and unequal treaties with other western powers continued even after the fall of the Qing dynasty.
Internally, the Taiping Rebellion (1851–1864), a Christian religious movement led by the "Heavenly King" Hong Xiuquan, swept from the south to establish the Taiping Heavenly Kingdom and controlled roughly a third of China proper for over a decade. The court in desperation empowered Han Chinese officials such as Zeng Guofan to raise local armies. After initial defeats, Zeng crushed the rebels in the Third Battle of Nanking in 1864. This was one of the largest wars of the 19th century in terms of troop involvement; there was massive loss of life, with a death toll of about 20 million. A string of civil disturbances followed, including the Punti–Hakka Clan Wars, the Nian Rebellion, the Dungan Revolt, and the Panthay Rebellion. All the rebellions were ultimately put down, but at enormous cost and with millions dead, seriously weakening the central imperial authority. China never rebuilt a strong central army, and many local officials used their military power to effectively rule independently in their provinces.
Yet the dynasty appeared to recover in the Tongzhi Restoration (1860–1872), led by Manchu royal family reformers and Han Chinese officials such as Zeng Guofan and his protégés Li Hongzhang and Zuo Zongtang. Their Self-Strengthening Movement made effective institutional reforms and imported Western factories and communications technology, with prime emphasis on strengthening the military. However, the reform was undermined by official rivalries, cynicism, and quarrels within the imperial family. The defeat of the modernized Beiyang Fleet in the First Sino-Japanese War (1894–1895) led to the formation of the New Army under Yuan Shikai. The Guangxu Emperor, advised by Kang Youwei, then launched a comprehensive reform effort, the Hundred Days' Reform (1898). Empress Dowager Cixi, however, feared that precipitous change would lead to bureaucratic opposition and foreign intervention, and quickly suppressed it.
In the summer of 1900, the Boxer Uprising opposed foreign influence and murdered Chinese Christians and foreign missionaries. When Boxers entered Beijing, the Qing government ordered all foreigners to leave, but they and many Chinese Christians were besieged in the foreign legations quarter. An Eight-Nation Alliance sent the Seymour Expedition of Japanese, Russian, British, Italian, German, French, American, and Austrian troops to relieve the siege, but they were forced to retreat by Boxer and Qing troops at the Battle of Langfang. After the Alliance's attack on the Dagu Forts, the court declared war on the Alliance and authorized the Boxers to join with imperial armies. After fierce fighting at Tientsin, the Alliance formed the second, much larger Gaselee Expedition and finally reached Beijing; the Empress Dowager evacuated to Xi'an. The Boxer Protocol ended the war, exacting a tremendous indemnity.
The Qing court then instituted "New Policies" of administrative and legal reform, including the abolition of the examination system. But young officials, military officers, and students debated reform, perhaps a constitutional monarchy, or the overthrow of the dynasty and the creation of a republic. They were inspired by an emerging public opinion formed by intellectuals such as Liang Qichao and the revolutionary ideas of Sun Yat-sen. A localised military uprising, the Wuchang Uprising, began on 10 October 1911 in Wuchang (today part of Wuhan) and soon spread. The Republic of China was proclaimed on 1 January 1912, ending 2,000 years of dynastic rule.
The provisional government of the Republic of China was formed in Nanking on 12 March 1912. Sun Yat-sen became President of the Republic of China, but he turned power over to Yuan Shikai, who commanded the New Army. Over the next few years, Yuan proceeded to abolish the national and provincial assemblies and declared himself Emperor of the Empire of China in late 1915. Yuan's imperial ambitions were fiercely opposed by his subordinates; faced with the prospect of rebellion, he abdicated in March 1916 and died of natural causes in June.
Yuan's death in 1916 left a power vacuum; the republican government was all but shattered. This opened the way for the Warlord Era, during which much of China was ruled by shifting coalitions of competing provincial military leaders and the Beiyang government. Intellectuals, disappointed in the failure of the Republic, launched the New Culture Movement.
In 1919, the May Fourth Movement began as a response to the pro-Japanese terms imposed on China by the Treaty of Versailles following World War I. It quickly became a nationwide protest movement. The protests were a moral success, as the cabinet fell and China refused to sign the Treaty of Versailles, which had awarded German holdings in Shandong to Japan. Political and intellectual ferment waxed strong throughout the 1920s and 1930s.
In the 1920s, Sun Yat-sen established a revolutionary base in Guangzhou and set out to unite the fragmented nation. He welcomed assistance from the Soviet Union (itself fresh from Lenin's takeover) and entered into an alliance with the fledgling Communist Party of China. After Sun's death from cancer in 1925, one of his protégés, Chiang Kai-shek, seized control of the Nationalist Party (KMT) and succeeded in bringing most of south and central China under its rule in the Northern Expedition (1926–1927). Having defeated the warlords in south and central China by military force, Chiang was able to secure the nominal allegiance of the warlords in the north and establish the Nationalist government in Nanking. In 1927, Chiang turned on the CPC and relentlessly purged CPC armies and leaders from the National Revolutionary Army and the KMT. In 1934, driven from their mountain bases such as the Chinese Soviet Republic, the CPC forces embarked on the Long March across China's most desolate terrain to the northwest, where they established a guerrilla base at Yan'an in Shaanxi Province. During the Long March, the communists reorganized under a new leader, Mao Zedong (Mao Tse-tung).
The bitter Chinese Civil War between the Nationalists and the Communists continued, openly or clandestinely, through the 14-year-long Japanese occupation of various parts of the country (1931–1945). The two Chinese parties nominally formed a United Front to oppose the Japanese in 1937, during the Second Sino-Japanese War (1937–1945), which became a part of World War II. Japanese forces committed numerous war atrocities against the civilian population, including biological warfare (see Unit 731) and the Three Alls Policy ("Sankō Sakusen"), the three alls being "Kill All, Burn All and Loot All".
Following the defeat of Japan in 1945, the war between the Nationalist government forces and the CPC resumed, after failed attempts at reconciliation and a negotiated settlement. By 1949, the CPC had established control over most of the country. Odd Arne Westad says the Communists won the Civil War because they made fewer military mistakes than Chiang, and because in his search for a powerful centralized government, Chiang antagonized too many interest groups in China. Furthermore, his party was weakened in the war against the Japanese. Meanwhile, the Communists told different groups, such as the peasants, exactly what they wanted to hear, and cloaked themselves in the cover of Chinese nationalism. During the civil war both the Nationalists and Communists carried out mass atrocities, with millions of non-combatants killed by both sides, including deaths from forced conscription and massacres. When the Nationalist government forces were defeated by CPC forces in mainland China in 1949, the Nationalist government retreated to Taiwan with its remaining forces, along with Chiang and a large number of supporters; the Nationalist government had taken effective control of Taiwan at the end of WWII as part of the overall Japanese surrender, when Japanese troops in Taiwan surrendered to Republic of China troops.
Until the early 1970s, the ROC was recognized as the sole legitimate government of China by the United Nations, the United States and most Western nations, which refused to recognize the PRC on account of the Cold War. This changed in 1971 when the PRC was seated in the United Nations, replacing the ROC. The KMT ruled Taiwan under martial law until 1987, with the stated goal of remaining vigilant against Communist infiltration and preparing to retake mainland China. Therefore, political dissent was not tolerated during that period.
In the 1990s, the ROC underwent a major democratic reform, beginning with the 1991 resignation of the members of the Legislative Yuan and National Assembly elected in 1947. These groups were originally created to represent mainland China constituencies. Also lifted were the restrictions on the use of Taiwanese languages in the broadcast media and in schools. This culminated in the first direct presidential election in 1996, in which the incumbent Lee Teng-hui won against the Democratic Progressive Party (DPP) candidate and former dissident Peng Ming-min. In 2000, the KMT's status as the ruling party ended when the DPP took power, only for the KMT to regain that status in the 2008 election won by Ma Ying-jeou.
Due to the controversial nature of Taiwan's political status, the ROC is currently recognized by 14 UN member states and the Holy See as the legitimate government of "China".
Major combat in the Chinese Civil War ended in 1949 with the Kuomintang (KMT) pulling out of the mainland, the government relocating to Taipei and maintaining control only over a few islands. The Communist Party of China was left in control of mainland China. On 1 October 1949, Mao Zedong proclaimed the People's Republic of China. "Communist China" and "Red China" were two common names for the PRC.
The PRC was shaped by a series of campaigns and five-year plans. The economic and social plan known as the Great Leap Forward caused an estimated 45 million deaths. Mao's government carried out mass executions of landowners, instituted collectivisation and implemented the Laogai camp system. Execution, deaths from forced labor and other atrocities resulted in millions of deaths under Mao. In 1966 Mao and his allies launched the Cultural Revolution, which continued until Mao's death a decade later. The Cultural Revolution, motivated by power struggles within the Party and a fear of the Soviet Union, led to a major upheaval in Chinese society.
In 1972, at the peak of the Sino-Soviet split, Mao and Zhou Enlai met US president Richard Nixon in Beijing to establish relations with the United States. The previous year, the PRC had been admitted to the United Nations in place of the Republic of China, with a permanent seat on the Security Council.
A power struggle followed Mao's death in 1976. The Gang of Four were arrested and blamed for the excesses of the Cultural Revolution, marking the end of a turbulent political era in China. Deng Xiaoping outmaneuvered Mao's anointed successor, Chairman Hua Guofeng, and gradually emerged as the "de facto" leader over the next few years.
Deng Xiaoping was the Paramount Leader of China from 1978 to 1992; although he never became the head of the party or state, his influence within the Party led the country to significant economic reforms. The Communist Party subsequently loosened governmental control over citizens' personal lives, and the communes were disbanded, with many peasants receiving land leases, which greatly increased incentives and agricultural production. In addition, many free-market areas were opened; the most successful was Shenzhen, located in Guangdong, whose special economic zone still exists today. This turn of events marked China's transition from a planned economy to a mixed economy with an increasingly open market environment, a system termed by some as "market socialism", and officially by the Communist Party of China as "Socialism with Chinese characteristics". The PRC adopted its current constitution on 4 December 1982.
In 1989 the death of former general secretary Hu Yaobang helped to spark the Tiananmen Square protests of that year, during which students and others campaigned for several months, speaking out against corruption and in favour of greater political reform, including democratic rights and freedom of speech. However, they were eventually put down on 4 June when PLA troops and vehicles entered and forcibly cleared the square, with many fatalities. This event was widely reported, and brought worldwide condemnation and sanctions against the government. A filmed incident involving the "tank man" was seen worldwide.
CPC general secretary and PRC President Jiang Zemin and PRC Premier Zhu Rongji, both former mayors of Shanghai, led the post-Tiananmen PRC in the 1990s. Under Jiang and Zhu's ten years of administration, the PRC's economic performance pulled an estimated 150 million peasants out of poverty and sustained an average annual gross domestic product growth rate of 11.2%. The country formally joined the World Trade Organization in 2001. In 1997 and 1999, the former European colonies of Hong Kong and Macau respectively became special administrative regions of China.
Although the PRC needs economic growth to spur its development, the government began to worry that rapid economic growth was degrading the country's resources and environment. Another concern is that certain sectors of society are not sufficiently benefiting from the PRC's economic development; one example of this is the wide gap between urban and rural areas. As a result, under former CPC general secretary and President Hu Jintao and Premier Wen Jiabao, the PRC initiated policies to address these issues of equitable distribution of resources, but the outcome was not known. More than 40 million farmers were displaced from their land, usually for economic development, contributing to 87,000 demonstrations and riots across China in 2005. For much of the PRC's population, living standards improved very substantially and freedom increased, but political controls remained tight and rural areas poor. | https://en.wikipedia.org/wiki?curid=5760 |
Civil engineering
Civil engineering is a professional engineering discipline that deals with the design, construction, and maintenance of the physical and naturally built environment, including public works such as roads, bridges, canals, dams, airports, sewerage systems, pipelines, structural components of buildings, and railways.
Civil engineering is traditionally broken into a number of sub-disciplines. It is considered the second-oldest engineering discipline after military engineering, and it is defined to distinguish non-military engineering from military engineering. Civil engineering takes place in the public sector from municipal through to national governments, and in the private sector from individual homeowners through to international companies.
Civil engineering is the application of physical and scientific principles for solving the problems of society, and its history is intricately linked to advances in the understanding of physics and mathematics throughout history. Because civil engineering is a wide-ranging profession, including several specialized sub-disciplines, its history is linked to knowledge of structures, materials science, geography, geology, soils, hydrology, environment, mechanics and other fields.
Throughout ancient and medieval history most architectural design and construction was carried out by artisans, such as stonemasons and carpenters, rising to the role of master builder. Knowledge was retained in guilds and seldom supplanted by advances. Structures, roads, and infrastructure that existed were repetitive, and increases in scale were incremental.
One of the earliest examples of a scientific approach to physical and mathematical problems applicable to civil engineering is the work of Archimedes in the 3rd century BC, including Archimedes' principle, which underpins our understanding of buoyancy, and practical solutions such as the Archimedes screw. Brahmagupta, an Indian mathematician, used arithmetic in the 7th century AD, based on Hindu–Arabic numerals, for excavation (volume) computations.
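For illustration, in modern notation (a later formalization, not Archimedes' own wording), the principle states that the buoyant force on an immersed body equals the weight of the fluid it displaces:

```latex
F_b = \rho_{\text{fluid}} \; g \; V_{\text{displaced}}
```

so a body floats when its average density is less than that of the surrounding fluid, the relation that underpins, for example, the design of floating structures and ships.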
Engineering has been an aspect of life since the beginnings of human existence. The earliest practice of civil engineering may have commenced between 4000 and 2000 BC in ancient Egypt, the Indus Valley Civilization, and Mesopotamia (ancient Iraq) when humans started to abandon a nomadic existence, creating a need for the construction of shelter. During this time, transportation became increasingly important leading to the development of the wheel and sailing.
Until modern times there was no clear distinction between civil engineering and architecture, and the terms "engineer" and "architect" were mainly geographical variations referring to the same occupation, often used interchangeably. The construction of the pyramids in Egypt (circa 2700–2500 BC) was among the first instances of large structure construction. Other ancient historic civil engineering constructions include the Qanat water management system (the oldest of which is more than 3,000 years old and longer than 71 km), the Parthenon by Iktinos in Ancient Greece (447–438 BC), the Appian Way by Roman engineers (c. 312 BC), the Great Wall of China by General Meng T'ien under orders from Ch'in Emperor Shih Huang Ti (c. 220 BC), and the stupas constructed in ancient Sri Lanka, such as the Jetavanaramaya, along with the extensive irrigation works in Anuradhapura. The Romans developed civil structures throughout their empire, including especially aqueducts, insulae, harbors, bridges, dams and roads.
In the 18th century, the term civil engineering was coined to incorporate all things civilian as opposed to military engineering. In 1747, the first institution for the teaching of civil engineering, the École Nationale des Ponts et Chaussées, was established in France, and more examples followed in other European countries, like Spain. The first self-proclaimed civil engineer was John Smeaton, who constructed the Eddystone Lighthouse. In 1771, Smeaton and some of his colleagues formed the Smeatonian Society of Civil Engineers, a group of leaders of the profession who met informally over dinner. Though there was evidence of some technical meetings, it was little more than a social society. In 1818 the Institution of Civil Engineers was founded in London, and in 1820 the eminent engineer Thomas Telford became its first president. The institution received a Royal Charter in 1828, formally recognising civil engineering as a profession.
The first private college to teach civil engineering in the United States was Norwich University, founded in 1819 by Captain Alden Partridge. The first degree in civil engineering in the United States was awarded by Rensselaer Polytechnic Institute in 1835. The first such degree to be awarded to a woman was granted by Cornell University to Nora Stanton Blatch in 1905.
In the UK during the early 19th century, the division between civil engineering and military engineering (served by the Royal Military Academy, Woolwich), coupled with the demands of the Industrial Revolution, spawned new engineering education initiatives: the Class of Civil Engineering and Mining was founded at King's College London in 1838, mainly as a response to the growth of the railway system and the need for more qualified engineers, the private College for Civil Engineers in Putney was established in 1839, and the UK's first Chair of Engineering was established at the University of Glasgow in 1840.
"Civil engineers" typically possess an academic degree in civil engineering. The length of study is three to five years, and the completed degree is designated as a bachelor of technology, or a bachelor of engineering. The curriculum generally includes classes in physics, mathematics, project management, design and specific topics in civil engineering. After taking basic courses in most sub-disciplines of civil engineering, they move on to specialize in one or more sub-disciplines at advanced levels. While an undergraduate degree (BEng/BSc) normally provides successful students with industry-accredited qualification, some academic institutions offer post-graduate degrees (MEng/MSc), which allow students to further specialize in their particular area of interest.
In most countries, a bachelor's degree in engineering represents the first step towards professional certification, and a professional body certifies the degree program. After completing a certified degree program, the engineer must satisfy a range of requirements including work experience and exam requirements before being certified. Once certified, the engineer is designated as a professional engineer (in the United States, Canada and South Africa), a chartered engineer (in most Commonwealth countries), a chartered professional engineer (in Australia and New Zealand), or a European engineer (in most countries of the European Union). There are international agreements between relevant professional bodies to allow engineers to practice across national borders.
The benefits of certification vary depending upon location. For example, in the United States and Canada, "only a licensed professional engineer may prepare, sign and seal, and submit engineering plans and drawings to a public authority for approval, or seal engineering work for public and private clients." This requirement is enforced under provincial law such as the Engineers Act in Quebec. No such legislation has been enacted in other countries including the United Kingdom. In Australia, state licensing of engineers is limited to the state of Queensland. Almost all certifying bodies maintain a code of ethics which all members must abide by.
Engineers must obey contract law in their contractual relationships with other parties. In cases where an engineer's work fails, they may be subject to the law of tort of negligence, and in extreme cases, criminal charges. An engineer's work must also comply with numerous other rules and regulations such as building codes and environmental law.
There are a number of sub-disciplines within the broad field of civil engineering. General civil engineers work closely with surveyors and specialized civil engineers to design grading, drainage, pavement, water supply, sewer service, dams, electric and communications supply. General civil engineering is also referred to as site engineering, a branch of civil engineering that primarily focuses on converting a tract of land from one usage to another. Site engineers spend time visiting project sites, meeting with stakeholders, and preparing construction plans. Civil engineers apply the principles of geotechnical engineering, structural engineering, environmental engineering, transportation engineering and construction engineering to residential, commercial, industrial and public works projects of all sizes and levels of construction.
"Coastal engineering" is concerned with managing coastal areas. In some jurisdictions, the terms sea defense and coastal protection mean defense against flooding and erosion, respectively. The term coastal defense is the more traditional term, but coastal management has become more popular as the field has expanded to techniques that allow erosion to claim land.
"Construction engineering" involves planning and execution, transportation of materials, site development based on hydraulic, environmental, structural and geotechnical engineering. As construction firms tend to have higher business risk than other types of civil engineering firms do, construction engineers often engage in more business-like transactions, for example, drafting and reviewing contracts, evaluating logistical operations, and monitoring prices of supplies.
"Earthquake engineering" involves designing structures to withstand hazardous earthquake exposures. Earthquake engineering is a sub-discipline of structural engineering. The main objectives of earthquake engineering are to understand interaction of structures on the shaky ground; foresee the consequences of possible earthquakes; and design, construct and maintain structures to perform at earthquake in compliance with building codes.
"Environmental engineering" is the contemporary term for sanitary engineering, though sanitary engineering traditionally had not included much of the hazardous waste management and environmental remediation work covered by environmental engineering. Public health engineering and environmental health engineering are other terms being used.
Environmental engineering deals with treatment of chemical, biological, or thermal wastes, purification of water and air, and remediation of contaminated sites after waste disposal or accidental contamination. Among the topics covered by environmental engineering are pollutant transport, water purification, waste water treatment, air pollution, solid waste treatment, recycling, and hazardous waste management. Environmental engineers administer pollution reduction, green engineering, and industrial ecology. Environmental engineers also compile information on environmental consequences of proposed actions.
"Forensic engineering" is the investigation of materials, products, structures or components that fail or do not operate or function as intended, causing personal injury or damage to property. The consequences of failure are dealt with by the law of product liability. The field also deals with retracing processes and procedures leading to accidents in operation of vehicles or machinery. The subject is applied most commonly in civil law cases, although it may be of use in criminal law cases. Generally the purpose of a Forensic engineering investigation is to locate cause or causes of failure with a view to improve performance or life of a component, or to assist a court in determining the facts of an accident. It can also involve investigation of intellectual property claims, especially patents.
"Geotechnical engineering" studies rock and soil supporting civil engineering systems. Knowledge from the field of soil science, materials science, mechanics, and hydraulics is applied to safely and economically design foundations, retaining walls, and other structures. Environmental efforts to protect groundwater and safely maintain landfills have spawned a new area of research called geoenvironmental engineering.
Identification of soil properties presents challenges to geotechnical engineers. Boundary conditions are often well defined in other branches of civil engineering, but unlike steel or concrete, the material properties and behavior of soil are difficult to predict due to its variability and the limitations of site investigation. Furthermore, soil exhibits nonlinear (stress-dependent) strength, stiffness, and dilatancy (volume change associated with the application of shear stress), making the study of soil mechanics all the more difficult. Geotechnical engineers frequently work with professional geologists and soil scientists.
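To make this nonlinearity concrete, here is a small Python sketch of a hyperbolic (Duncan–Chang-style) stress–strain relation, one classic simplified soil model; the parameter values are invented for illustration, and real geotechnical practice calibrates such models against laboratory and field tests.

```python
def hyperbolic_stress(strain: float, E_i: float, q_ult: float) -> float:
    """Hyperbolic (Duncan-Chang-style) soil model: deviatoric stress
    q = strain / (1/E_i + strain/q_ult). The secant stiffness q/strain
    falls as strain grows, i.e. the response is nonlinear."""
    return strain / (1.0 / E_i + strain / q_ult)

# Hypothetical parameters: initial modulus 50 MPa, ultimate strength 200 kPa.
E_i, q_ult = 50_000.0, 200.0  # both in kPa
for eps in (0.001, 0.005, 0.02):
    q = hyperbolic_stress(eps, E_i, q_ult)
    print(f"strain={eps:.3f}  q={q:6.1f} kPa  secant E={q / eps:8.0f} kPa")
```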
"Materials science" is closely related to civil engineering. It studies fundamental characteristics of materials, and deals with ceramics such as concrete and mix asphalt concrete, strong metals such as aluminum and steel, and thermosetting polymers including polymethylmethacrylate (PMMA) and carbon fibers.
"Materials engineering" involves protection and prevention (paints and finishes). Alloying combines two types of metals to produce another metal with desired properties. It incorporates elements of applied physics and chemistry. With recent media attention on nanoscience and nanotechnology, materials engineering has been at the forefront of academic research. It is also an important part of forensic engineering and failure analysis.
"Structural engineering" is concerned with the structural design and structural analysis of buildings, bridges, towers, flyovers (overpasses), tunnels, off shore structures like oil and gas fields in the sea, aerostructure and other structures. This involves identifying the loads which act upon a structure and the forces and stresses which arise within that structure due to those loads, and then designing the structure to successfully support and resist those loads. The loads can be self weight of the structures, other dead load, live loads, moving (wheel) load, wind load, earthquake load, load from temperature change etc. The structural engineer must design structures to be safe for their users and to successfully fulfill the function they are designed for (to be "serviceable"). Due to the nature of some loading conditions, sub-disciplines within structural engineering have emerged, including wind engineering and earthquake engineering.
Design considerations include strength, stiffness, and stability of the structure when subjected to loads which may be static, such as furniture or self-weight; dynamic, such as wind, seismic, crowd or vehicle loads; or transitory, such as temporary construction loads or impact. Other considerations include cost, constructability, safety, aesthetics and sustainability.
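As a hedged sketch of how such load considerations enter routine design, the Python below combines dead, live and wind loads under two simplified LRFD-style factored combinations; the factors loosely follow ASCE 7-style conventions but are illustrative only, not a substitute for the governing code.

```python
def factored_load(dead_kn: float, live_kn: float, wind_kn: float = 0.0) -> float:
    """Governing factored design load under two simplified LRFD-style
    combinations (illustrative factors only):
        1.2*D + 1.6*L          gravity-dominated case
        1.2*D + 1.0*W + 1.0*L  wind-dominated case
    """
    return max(1.2 * dead_kn + 1.6 * live_kn,
               1.2 * dead_kn + 1.0 * wind_kn + 1.0 * live_kn)

# Hypothetical beam reaction: 100 kN dead, 80 kN live, 40 kN wind.
print(f"Design load: {factored_load(100, 80, 40):.0f} kN")  # 248 kN
```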
"Surveying" is the process by which a surveyor measures certain dimensions that occur on or near the surface of the Earth. Surveying equipment such as levels and theodolites are used for accurate measurement of angular deviation, horizontal, vertical and slope distances. With computerisation, electronic distance measurement (EDM), total stations, GPS surveying and laser scanning have to a large extent supplanted traditional instruments. Data collected by survey measurement is converted into a graphical representation of the Earth's surface in the form of a map. This information is then used by civil engineers, contractors and realtors to design from, build on, and trade, respectively. Elements of a structure must be sized and positioned in relation to each other and to site boundaries and adjacent structures.
Although surveying is a distinct profession with separate qualifications and licensing arrangements, civil engineers are trained in the basics of surveying and mapping, as well as geographic information systems. Surveyors also lay out the routes of railways, tramway tracks, highways, roads, pipelines and streets as well as position other infrastructure, such as harbors, before construction.
In the United States, Canada, the United Kingdom and most Commonwealth countries, land surveying is considered to be a separate and distinct profession. Land surveyors are not considered to be engineers, and have their own professional associations and licensing requirements. The services of a licensed land surveyor are generally required for boundary surveys (to establish the boundaries of a parcel using its legal description) and subdivision plans (a plot or map based on a survey of a parcel of land, with boundary lines drawn inside the larger parcel to indicate the creation of new boundary lines and roads), both of which are generally referred to as cadastral surveying.
Construction surveying is generally performed by specialised technicians. Unlike plans produced by land surveyors, the resulting plans do not have legal status. Construction surveyors set out the reference points and markers that guide the construction of new structures, and verify the position of works as construction proceeds.
"Transportation engineering" is concerned with moving people and goods efficiently, safely, and in a manner conducive to a vibrant community. This involves specifying, designing, constructing, and maintaining transportation infrastructure which includes streets, canals, highways, rail systems, airports, ports, and mass transit. It includes areas such as transportation design, transportation planning, traffic engineering, some aspects of urban engineering, queueing theory, pavement engineering, Intelligent Transportation System (ITS), and infrastructure management.
"Municipal engineering" is concerned with municipal infrastructure. This involves specifying, designing, constructing, and maintaining streets, sidewalks, water supply networks, sewers, street lighting, municipal solid waste management and disposal, storage depots for various bulk materials used for maintenance and public works (salt, sand, etc.), public parks and cycling infrastructure. In the case of underground utility networks, it may also include the civil portion (conduits and access chambers) of the local distribution networks of electrical and telecommunications services. It can also include the optimizing of waste collection and bus service networks. Some of these disciplines overlap with other civil engineering specialties, however municipal engineering focuses on the coordination of these infrastructure networks and services, as they are often built simultaneously, and managed by the same municipal authority. Municipal engineers may also design the site civil works for large buildings, industrial plants or campuses (i.e. access roads, parking lots, potable water supply, treatment or pretreatment of waste water, site drainage, etc.)
"Water resources engineering" is concerned with the collection and management of water (as a natural resource). As a discipline it therefore combines elements of hydrology, environmental science, meteorology, conservation, and resource management. This area of civil engineering relates to the prediction and management of both the quality and the quantity of water in both underground (aquifers) and above ground (lakes, rivers, and streams) resources. Water resource engineers analyze and model very small to very large areas of the earth to predict the amount and content of water as it flows into, through, or out of a facility. Although the actual design of the facility may be left to other engineers.
"Hydraulic engineering" is concerned with the flow and conveyance of fluids, principally water. This area of civil engineering is intimately related to the design of pipelines, water supply network, drainage facilities (including bridges, dams, channels, culverts, levees, storm sewers), and canals. Hydraulic engineers design these facilities using the concepts of fluid pressure, fluid statics, fluid dynamics, and hydraulics, among others.
Civil engineering systems is a discipline that promotes the use of systems thinking to manage complexity and change in civil engineering within its wider public context. It posits that the proper development of civil engineering infrastructure requires a holistic, coherent understanding of the relationships between all of the important factors that contribute to successful projects while at the same time emphasising the importance of attention to technical detail. Its purpose is to help integrate the entire civil engineering project life cycle from conception, through planning, designing, making, operating to decommissioning.
Çatalhöyük
Çatalhöyük (also "Çatal Höyük" and "Çatal Hüyük"; from Turkish "çatal" "fork" + "höyük" "tumulus") was a very large Neolithic and Chalcolithic proto-city settlement in southern Anatolia, which existed from approximately 7100 BC to 5700 BC and flourished around 7000 BC. In July 2012, it was inscribed as a UNESCO World Heritage Site.
Çatalhöyük is located overlooking the Konya Plain, southeast of the present-day city of Konya (ancient Iconium) in Turkey, approximately 140 km (87 mi) from the twin-coned volcano of Mount Hasan. The eastern settlement forms a mound which would have risen about 20 m (66 ft) above the plain at the time of the latest Neolithic occupation. There is also a smaller settlement mound to the west and a Byzantine settlement a few hundred meters to the east. The prehistoric mound settlements were abandoned before the Bronze Age. A channel of the Çarşamba River once flowed between the two mounds, and the settlement was built on alluvial clay which may have been favorable for early agriculture.
The site was first excavated by James Mellaart in 1958. He later led a team which further excavated there for four seasons between 1961 and 1965. These excavations revealed this section of Anatolia as a centre of advanced culture in the Neolithic period. Excavation revealed 18 successive layers of buildings, signifying various stages of the settlement and eras of history. The bottom layer of buildings can be dated as early as 7100 BC, while the top layer dates to 5600 BC.
Mellaart was banned from Turkey for his involvement in the Dorak affair in which he published drawings of supposedly important Bronze Age artifacts that later went missing. After this scandal, the site lay idle until 1993, when investigations began under the leadership of Ian Hodder, then at the University of Cambridge. These investigations are among the most ambitious excavation projects currently in progress according to archaeologist Colin Renfrew, among others. In addition to extensive use of archaeological science, psychological and artistic interpretations of the symbolism of the wall paintings have been employed. Hodder, a former student of Mellaart, chose the site as the first "real world" test of his then-controversial theory of post-processual archaeology. The site has always had a strong research emphasis upon engagement with digital methodologies, driven by the project's experimental and reflexive methodological framework. Sponsors and collaborators of the current dig include Yapi Kredi, Boeing, University of York, Selçuk University, British Institute at Ankara, Cardiff University, Stanford University, Turkish Cultural Foundation, and University at Buffalo.
Çatalhöyük was composed entirely of domestic buildings, with no obvious public buildings. While some of the larger ones have rather ornate murals, the purpose of some rooms remains unclear.
The population of the eastern mound has been estimated to be, at maximum, 10,000 people, but the population likely varied over the community's history. An average population of between 5,000 and 7,000 is a reasonable estimate. The sites were set up as large numbers of buildings clustered together. Households looked to their neighbors for help, trade, and possible marriage for their children. The inhabitants lived in mudbrick houses that were crammed together in an aggregate structure. No footpaths or streets were used between the dwellings, which were clustered in a honeycomb-like maze. Most were accessed by holes in the ceiling and doors on the side of the houses, with doors reached by ladders and stairs. The rooftops were effectively streets. The ceiling openings also served as the only source of ventilation, allowing smoke from the houses' open hearths and ovens to escape.
Houses had plaster interiors characterized by squared-off timber ladders or steep stairs. These were usually on the south wall of the room, as were cooking hearths and ovens. The main rooms contained raised platforms that may have been used for a range of domestic activities. Typical houses contained two rooms for everyday activity, such as cooking and crafting. All interior walls and platforms were plastered to a smooth finish. Ancillary rooms were used as storage, and were accessed through low openings from main rooms.
All rooms were kept scrupulously clean. Archaeologists identified very little rubbish in the buildings, finding middens outside the ruins, with sewage and food waste, as well as significant amounts of ash from burning wood, reeds and animal dung. In good weather, many daily activities may also have taken place on the rooftops, which may have formed a plaza. In later periods, large communal ovens appear to have been built on these rooftops. Over time, houses were renewed by partial demolition and rebuilding on a foundation of rubble, which was how the mound was gradually built up. As many as eighteen levels of settlement have been uncovered.
As a part of ritual life, the people of Çatalhöyük buried their dead within the village. Human remains have been found in pits beneath the floors and, especially, beneath hearths, the platforms within the main rooms, and under beds. Bodies were tightly flexed before burial and were often placed in baskets or wound and wrapped in reed mats. Disarticulated bones in some graves suggest that bodies may have been exposed in the open air for a time before the bones were gathered and buried. In some cases, graves were disturbed, and the individual's head removed from the skeleton. These heads may have been used in rituals, as some were found in other areas of the community. In a woman's grave spinning whorls were recovered and in a man's grave, stone axes. Some skulls were plastered and painted with ochre to recreate faces, a custom more characteristic of Neolithic sites in Syria and at Neolithic Jericho than at sites closer by.
Vivid murals and figurines are found throughout the settlement, on interior and exterior walls. Distinctive clay figurines of women, notably the Seated Woman of Çatalhöyük, have been found in the upper levels of the site. Although no identifiable temples have been found, the graves, murals, and figurines suggest that the people of Çatalhöyük had a religion rich in symbols. Rooms with concentrations of these items may have been shrines or public meeting areas. Predominant images include men with erect phalluses, hunting scenes, red images of the now extinct aurochs (wild cattle) and stags, and vultures swooping down on headless figures. Relief figures are carved on walls, such as of lionesses facing one another.
Heads of animals, especially of cattle, were mounted on walls. A painting of the village, with the twin mountain peaks of Hasan Dağ in the background, is frequently cited as the world's oldest map, and the first landscape painting. However, some archaeologists question this interpretation. Stephanie Meece, for example, argues that it is more likely a painting of a leopard skin instead of a volcano, and a decorative geometric design instead of a map.
A striking feature of Çatalhöyük is its female figurines. Mellaart, the original excavator, argued that these well-formed, carefully made figurines, carved and molded from marble, blue and brown limestone, schist, calcite, basalt, alabaster, and clay, represented a female deity. Although a male deity existed as well, "statues of a female deity far outnumber those of the male deity, who moreover, does not appear to be represented at all after Level VI". To date, eighteen levels have been identified. These artfully-hewn figurines were found primarily in areas Mellaart believed to be shrines. The stately goddess seated on a throne flanked by two lionesses was found in a grain bin, which Mellaart suggests might have been a means of ensuring the harvest or protecting the food supply.
Whereas Mellaart excavated nearly two hundred buildings in four seasons, the current excavator, Ian Hodder, spent an entire season excavating one building alone. Hodder and his team, in 2004 and 2005, began to believe that the patterns suggested by Mellaart were false. They found one similar figurine, but the vast majority did not imitate the Mother Goddess style that Mellaart suggested. Instead of a Mother Goddess culture, Hodder points out that the site gives little indication of a matriarchy or patriarchy.
In an article in the "Turkish Daily News", Hodder is reported as denying that Çatalhöyük was a matriarchal society and quoted as saying "When we look at what they eat and drink and at their social status, we see that men and women had the same social status. There was a balance of power. Another example is the skulls found. If one's social status was of high importance in Çatalhöyük, the body and head were separated after death. The number of female and male skulls found during the excavations is almost equal." In another article in the "Hurriyet Daily News" Hodder is reported to say "We have learned that men and women were equally approached".
In a report in September 2009 on the discovery of around 2000 figurines Hodder is quoted as saying:
Çatalhöyük was excavated in the 1960s in a methodical way, but not using the full range of natural science techniques that are available to us today. Sir James Mellaart who excavated the site in the 1960s came up with all sorts of ideas about the way the site was organized and how it was lived in and so on ... We’ve now started working there since the mid 1990s and come up with very different ideas about the site. One of the most obvious examples of that is that Çatalhöyük is perhaps best known for the idea of the mother goddess. But our work more recently has tended to show that in fact there is very little evidence of a mother goddess and very little evidence of some sort of female-based matriarchy. That's just one of the many myths that the modern scientific work is undermining.
Professor Lynn Meskell explained that while the original excavations had found only 200 figurines, the new excavations had uncovered 2,000 figurines, most of which depicted animals, with fewer than 5% depicting women.
Estonian folklorist Uku Masing suggested as early as 1976 that the religion of Çatalhöyük was probably rooted in hunting and gathering, and that the Mother Goddess figurine did not represent a female deity. He implied that perhaps a longer period of time was needed to develop symbols for agricultural rites. His theory was developed in the paper "Some remarks on the mythology of the people of Catal Hüyük".
Çatalhöyük has strong evidence of an egalitarian society, as no houses with distinctive features (belonging to royalty or religious hierarchy, for example) have been found so far. The most recent investigations also reveal little social distinction based on gender, with men and women receiving equivalent nutrition and seeming to have equal social status, as typically found in Paleolithic cultures. Children learned by observing adults in domestic areas: they learned how to perform rituals and how to build or repair houses by watching the adults make statues, beads and other objects.
Çatalhöyük's spatial layout may be due to the close kin relations exhibited amongst the people. It can be seen, in the layout, that the people were "divided into two groups who lived on opposite sides of the town, separated by a gully." Furthermore, because no nearby towns were found from which marriage partners could be drawn, "this spatial separation must have marked two intermarrying kinship groups." This would help explain how a settlement so early on would become so large.
In upper levels of the site, it becomes apparent that the people of Çatalhöyük were gaining skills in agriculture and the domestication of animals. Female figurines have been found within bins used for storage of cereals, such as wheat and barley, and the figurines are presumed to be of a deity protecting the grain. Peas were also grown, and almonds, pistachios, and fruit were harvested from trees in the surrounding hills. Sheep were domesticated and evidence suggests the beginning of cattle domestication as well. However, hunting continued to be a major source of food for the community. Pottery and obsidian tools appear to have been major industries; obsidian tools were probably both used and also traded for items such as Mediterranean sea shells and flint from Syria. There is also evidence that the settlement was the first place in the world to mine and smelt metal in the form of lead. Noting the lack of hierarchy and economic inequality, historian Murray Bookchin has argued that Çatalhöyük was an early example of anarcho-communism.
Conversely, a 2014 paper argues that the picture of Çatalhöyük is more complex and that while there seemed to have been an egalitarian distribution of cooking tools and some stone tools, unbroken quern-stones and storage units were more unevenly distributed, indicating social inequality. Private property existed but shared tools also existed. It was also suggested that Çatalhöyük was slowly becoming less egalitarian, with greater inter-generational wealth transmission, though there may have been efforts to try to stop this.
Clement Attlee
Clement Richard Attlee, 1st Earl Attlee (3 January 1883 – 8 October 1967) was a British politician who served as Prime Minister of the United Kingdom from 1945 to 1951 and Leader of the Labour Party from 1935 to 1955. He was three times Leader of the Opposition (1935–1940, 1945, 1951–1955).
The son of a London solicitor, Attlee was born into a middle-class family. After attending the public school Haileybury College and the University of Oxford, he practised as a barrister. The volunteer work he carried out in London's East End exposed him to poverty, and his political views shifted leftwards thereafter. He joined the Independent Labour Party, gave up his legal career, and began lecturing at the London School of Economics. His work was interrupted by service as an officer in the First World War. In 1919, he became mayor of Stepney and in 1922 was elected Member of Parliament for Limehouse. Attlee served in the first Labour minority government led by Ramsay MacDonald in 1924, and then joined the Cabinet during MacDonald's second minority government (1929–1931). After retaining his seat in Labour's landslide defeat of 1931, he became the party's Deputy Leader. Elected Leader of the Labour Party in 1935, and at first advocating pacifism and opposing rearmament, he became a critic of Neville Chamberlain's appeasement of Hitler and Mussolini in the lead-up to the Second World War. Attlee took Labour into the wartime coalition government in 1940 and served under Winston Churchill, initially as Lord Privy Seal and then as Deputy Prime Minister from 1942.
After the end of the war, the coalition was dissolved and Attlee led Labour to a landslide victory at the 1945 general election, forming the first Labour majority government. His government's Keynesian approach to economic management aimed to maintain full employment, a mixed economy and a greatly enlarged system of social services provided by the state. To this end, it undertook the nationalisation of public utilities and major industries, and implemented wide-ranging social reforms, including the passing of the National Insurance Act 1946 and National Assistance Act, the foundation of the National Health Service (1948) and the enlargement of public subsidies for council house building. His government also reformed trade union legislation, working practices and children's services; it created the National Parks system, passed the New Towns Act 1946 and established the town and country planning system.
In foreign policy, Attlee delegated to Ernest Bevin, but oversaw the partition of India (1947), the independence of Burma and Ceylon, and the dissolution of the British mandates of Palestine and Transjordan. He and Bevin encouraged the United States to take a vigorous role in the Cold War; unable to afford military intervention in Greece, he called on Washington to counter Communists there, establishing the Truman Doctrine. He supported the Marshall Plan to rebuild Western Europe with American money and, in 1949, promoted the NATO military alliance against the Soviet bloc. After leading Labour to a narrow victory at the 1950 general election, he sent British troops to fight in the Korean War.
Attlee had inherited a country close to bankruptcy after the Second World War and beset by food, housing and resource shortages; despite his social reforms and economic programme, these problems persisted throughout his premiership, alongside recurrent currency crises and dependence on US aid. His party was narrowly defeated by the Conservatives in the 1951 general election, despite winning the most votes. He continued as Labour leader but retired after losing the 1955 election and was elevated to the House of Lords; he died in 1967. In public, he was modest and unassuming, but behind the scenes his depth of knowledge, quiet demeanour, objectivity and pragmatism proved decisive. Often rated as one of the greatest British prime ministers, Attlee's reputation among scholars has grown, thanks to his creation of the modern welfare state and involvement in building the coalition against Stalin in the Cold War. He remains the longest-serving Labour leader in British history.
Attlee was born on 3 January 1883 in Putney, Surrey (now part of London), into a middle-class family, the seventh of eight children. His father was Henry Attlee (1841–1908), a solicitor, and his mother was Ellen Bravery Watson (1847–1920), daughter of Thomas Simons Watson, secretary for the Art Union of London. He was educated at Northaw School, a boys' preparatory school near Pluckley in Kent; Haileybury College; and University College, Oxford, where in 1904 he graduated as a Bachelor of Arts with second-class honours in modern history.
Attlee then trained as a barrister at the Inner Temple and was called to the bar in March 1906. He worked for a time at his father's law firm Druces and Attlee but did not enjoy the work, and had no particular ambition to succeed in the legal profession. He also played football for non-League club Fleet.
In 1906, he became a volunteer at Haileybury House, a charitable club for working-class boys in Stepney in the East End of London run by his old school, and from 1907 to 1909 he served as the club's manager. Until then, his political views had been more conservative. However, after his shock at the poverty and deprivation he saw while working with the slum children, he came to the view that private charity would never be sufficient to alleviate poverty and that only direct action and income redistribution by the state would have any serious effect. This sparked a process that caused him to convert to socialism. He subsequently joined the Independent Labour Party (ILP) in 1908 and became active in local politics. In 1909, he stood unsuccessfully at his first election, as an ILP candidate for Stepney Borough Council.
He also worked briefly as a secretary for Beatrice Webb in 1909, before becoming a secretary for Toynbee Hall. In 1911, he was employed by the UK Government as an "official explainer"—touring the country to explain Chancellor of the Exchequer David Lloyd George's National Insurance Act. He spent the summer of that year touring Essex and Somerset on a bicycle, explaining the act at public meetings. A year later, he became a lecturer at the London School of Economics.
Following the outbreak of the First World War in August 1914, Attlee applied to join the British Army. Initially his application was turned down, as at the age of 31 he was seen as being too old; however, he was finally allowed to join in September, and was commissioned in the rank of Captain with the 6th (Service) Battalion, South Lancashire Regiment, part of the 38th Brigade of the 13th (Western) Division, and was sent to fight in the Gallipoli Campaign in Turkey. His decision to fight caused a rift between him and his older brother Tom, who, as a conscientious objector, spent much of the war in prison.
After a period fighting in Gallipoli, he collapsed after falling ill with dysentery and was put on a ship bound for England to recover. When he woke up he wanted to get back to action as soon as possible, and asked to be let off the ship in Malta where he stayed in hospital to recover. His hospitalisation coincided with the Battle of Sari Bair, which saw a large number of his comrades killed. Upon returning to action, he was informed that his company had been chosen to hold the final lines during the evacuation of Suvla. As such, he was the penultimate man to be evacuated from Suvla Bay, the last being General Stanley Maude.
The Gallipoli Campaign had been engineered by the First Lord of the Admiralty, Winston Churchill. Although it was unsuccessful, Attlee believed that it was a bold strategy, which could have been a success if it had been better implemented on the ground. This gave him an admiration for Churchill as a military strategist, which would make their working relationship in later years productive.
He later served in the Mesopotamian Campaign in what is now Iraq, where in April 1916 he was badly wounded, being hit in the leg by shrapnel while storming an enemy trench during the Battle of Hanna. He was sent firstly to India, and then back to the UK to recover. In February 1917, he was promoted to the rank of Major, leading him to be known as "Major Attlee" for much of the inter-war period. He would spend most of 1917 training soldiers at various locations in England. From 2 to 9 July 1917, he was the temporary commanding officer (CO) of the newly formed L (later 10th) Battalion, the Tank Corps at Bovington Camp, Dorset. From 9 July, he assumed command of 30th Company of the same battalion; however, he did not deploy to France with it in December 1917.
After fully recovering from his injuries, he was sent to France in June 1918 to serve on the Western Front for the final months of the war. After being discharged from the Army in January 1919, he returned to Stepney and resumed his old job lecturing part-time at the London School of Economics.
Attlee met Violet Millar while on a long trip with friends to Italy in 1921. They fell in love and were soon engaged, marrying at Christ Church, Hampstead, on 10 January 1922. It would come to be a devoted marriage, with Attlee providing protection and Violet providing a home that was an escape for Attlee from political turmoil. She died in 1964. They had four children.
Attlee returned to local politics in the immediate post-war period, becoming mayor of the Metropolitan Borough of Stepney, one of London's most deprived inner-city boroughs, in 1919. During his time as mayor, the council undertook action to tackle slum landlords who charged high rents but refused to spend money on keeping their property in habitable condition. The council served and enforced legal orders on homeowners to repair their property. It also appointed health visitors and sanitary inspectors, reducing the infant mortality rate, and took action to find work for returning unemployed ex-servicemen.
In 1920, while mayor, he wrote his first book, "The Social Worker", which set out many of the principles that informed his political philosophy and that were to underpin the actions of his government in later years. He wrote on page 30: "In a civilised community, although it may be composed of self-reliant individuals, there will be some persons who will be unable at some period of their lives to look after themselves, and the question of what is to happen to them may be solved in three ways – they may be neglected, they may be cared for by the organised community as of right, or they may be left to the goodwill of individuals in the community."
and went on to say at page 75: "Charity is only possible without loss of dignity between equals. A right established by law, such as that to an old age pension, is less galling than an allowance made by a rich man to a poor one, dependent on his view of the recipient's character, and terminable at his caprice."
In 1921, George Lansbury, the Labour mayor of the neighbouring borough of Poplar, and future Labour Party leader, launched the Poplar Rates Rebellion; a campaign of disobedience seeking to equalise the poor relief burden across all the London boroughs. Attlee, who was a personal friend of Lansbury, strongly supported this. However, Herbert Morrison, the Labour mayor of nearby Hackney, and one of the main figures in the London Labour Party, strongly denounced Lansbury and the rebellion. During this period, Attlee developed a lifelong dislike of Morrison.
At the 1922 general election, Attlee became the Member of Parliament (MP) for the constituency of Limehouse in Stepney. At the time, he admired Ramsay MacDonald and helped him get elected as Labour Party leader at the 1922 leadership election. He served as MacDonald's Parliamentary Private Secretary for the brief 1922 parliament. His first taste of ministerial office came in 1924, when he served as Under-Secretary of State for War in the short-lived first Labour government, led by MacDonald.
Attlee opposed the 1926 General Strike, believing that strike action should not be used as a political weapon. However, when it happened, he did not attempt to undermine it. At the time of the strike, he was chairman of the Stepney Borough Electricity Committee. He negotiated a deal with the Electrical Trade Union so that they would continue to supply power to hospitals, but would end supplies to factories. One firm, Scammell and Nephew Ltd, took a civil action against Attlee and the other Labour members of the committee (although not against the Conservative members who had also supported this). The court found against Attlee and his fellow councillors and they were ordered to pay £300 damages. The decision was later reversed on appeal, but the financial problems caused by the episode almost forced Attlee out of politics.
In 1927, he was appointed a member of the multi-party Simon Commission, a royal commission set up to examine the possibility of granting self-rule to India. Due to the time he needed to devote to the commission, and contrary to a promise MacDonald made to Attlee to induce him to serve on the commission, he was not initially offered a ministerial post in the Second Labour Government, which entered office after the 1929 general election. Attlee's service on the Commission equipped him with a thorough exposure to India and many of its political leaders. By 1933 he argued that British rule was alien to India and was unable to make the social and economic reforms necessary for India's progress. He became the British leader most sympathetic to Indian independence (as a dominion), preparing him for his role in deciding on independence in 1947.
In May 1930, Labour MP Oswald Mosley left the party after its rejection of his proposals for solving the unemployment problem, and Attlee was given Mosley's post of Chancellor of the Duchy of Lancaster. In March 1931, he became Postmaster General, a post he held for five months until August, when the Labour government fell, after failing to agree on how to tackle the financial crisis of the Great Depression. That month MacDonald and a few of his allies formed a National Government with the Conservatives and Liberals, leading them to be expelled from Labour. MacDonald offered Attlee a job in the National Government, but he turned down the offer and opted to stay loyal to the main Labour party.
After Ramsay MacDonald formed the National Government, Labour was deeply divided. Attlee had long been close to MacDonald and now felt betrayed—as did most Labour politicians. During the course of the second Labour government, Attlee had become increasingly disillusioned with MacDonald, whom he came to regard as vain and incompetent, and of whom he later wrote scathingly in his autobiography. He would write:
In the old days I had looked up to MacDonald as a great leader. He had a fine presence and great oratorical power. The unpopular line which he took during the First World War seemed to mark him as a man of character. Despite his mishandling of the Red Letter episode, I had not appreciated his defects until he took office a second time. I then realised his reluctance to take positive action and noted with dismay his increasing vanity and snobbery, while his habit of telling me, a junior Minister, the poor opinion he had of all his Cabinet colleagues made an unpleasant impression. I had not, however, expected that he would perpetrate the greatest betrayal in the political history of this country... The shock to the Party was very great, especially to the loyal workers of the rank-and-file who had made great sacrifices for these men.
The 1931 general election held later that year was a disaster for the Labour Party, which lost over 200 seats, returning only 52 MPs to Parliament. The vast majority of the party's senior figures, including the Leader Arthur Henderson, lost their seats. Attlee, however, narrowly retained his Limehouse seat, with his majority being slashed from 7,288 to just 551. He was one of only three Labour MPs who had experience of government to retain their seats, along with George Lansbury and Stafford Cripps. Accordingly, Lansbury was elected Leader unopposed with Attlee as his deputy.
Most of the remaining Labour MPs after 1931 were elderly trade union officials who could not contribute much to debates; Lansbury was in his 70s; and Stafford Cripps, another main figure of the Labour front bench, had only entered Parliament in 1931 and was inexperienced. As one of the most capable and experienced of the remaining Labour MPs, Attlee therefore shouldered much of the burden of providing an opposition to the National Government in the years 1931–35. During this time he had to extend his knowledge of subjects which he had not studied in any depth before, such as finance and foreign affairs, in order to provide an effective opposition to the government.
Attlee effectively served as acting leader for nine months from December 1933, after Lansbury fractured his thigh in an accident, which raised Attlee's public profile considerably. It was during this period, however, that personal financial problems almost forced Attlee to quit politics altogether. His wife had become ill, and at that time there was no separate salary for the Leader of the Opposition. On the verge of resigning from Parliament, he was persuaded to stay by Stafford Cripps, a wealthy socialist, who agreed to make a donation to party funds to pay him an additional salary until Lansbury could take over again.
During 1932–33 Attlee flirted with, and then drew back from, radicalism. Influenced by Stafford Cripps, who was then on the radical wing of the party, he was briefly a member of the Socialist League, which had been formed by former Independent Labour Party (ILP) members who opposed the ILP's disaffiliation from the main Labour Party in 1932. At one point he agreed with the proposition put forward by Cripps that gradual reform was inadequate and that a socialist government would have to pass an emergency powers act, allowing it to rule by decree to overcome any opposition by vested interests until it was safe to restore democracy. He admired Oliver Cromwell's strong-armed rule and use of major generals to control England. After looking more closely at Hitler, Mussolini, Stalin, and even his former colleague Oswald Mosley, leader of the new blackshirt fascist movement in Britain, Attlee retreated from his radicalism, distanced himself from the League, and argued instead that the Labour Party must adhere to constitutional methods and stand forthright for democracy and against totalitarianism of either the left or right. He always supported the crown, and as Prime Minister was close to King George VI.
George Lansbury, a committed pacifist, resigned as the Leader of the Labour Party at the 1935 Party Conference on 8 October, after delegates voted in favour of sanctions against Italy for its aggression against Abyssinia. Lansbury had strongly opposed the policy, and felt unable to continue leading the party. Taking advantage of the disarray in the Labour Party, the Prime Minister Stanley Baldwin announced on 19 October that a general election would be held on 14 November. With no time for a leadership contest, the party agreed that Attlee should serve as interim leader, on the understanding that a leadership election would be held after the general election. Attlee therefore led Labour through the 1935 election, which saw the party stage a partial comeback from its disastrous 1931 performance, winning 38 per cent of the vote, the highest share Labour had won up to that point, and gaining over one hundred seats.
Attlee stood in the subsequent leadership election, held soon after, where he was opposed by Herbert Morrison, who had just re-entered parliament in the recent election, and Arthur Greenwood. Morrison was seen as the favourite, but was distrusted by many sections of the party, especially the left wing. Greenwood, meanwhile, was a popular figure in the party; however, his leadership bid was severely hampered by his alcohol problem. Attlee was able to come across as a competent and unifying figure, particularly having already led the party through a general election. He went on to come first in both the first and second ballots, formally being elected Leader of the Labour Party on 3 December 1935.
Throughout the 1920s and most of the 1930s, the Labour Party's official policy had been to oppose rearmament, instead supporting internationalism and collective security under the League of Nations. At the 1934 Labour Party Conference, Attlee declared that, "We have absolutely abandoned any idea of nationalist loyalty. We are deliberately putting a world order before our loyalty to our own country. We say we want to see put on the statute book something which will make our people citizens of the world before they are citizens of this country". During a defence debate in the Commons a year later, Attlee said "We are told (in the White Paper) that there is danger against which we have to guard ourselves. We do not think you can do it by national defence. We think you can only do it by moving forward to a new world. A world of law, the abolition of national armaments with a world force and a world economic system. I shall be told that that is quite impossible". Shortly after those comments, Adolf Hitler proclaimed that German rearmament offered no threat to world peace. Attlee responded the next day, noting that Hitler's speech, although containing unfavourable references to the Soviet Union, created "A chance to call a halt in the armaments race ... We do not think that our answer to Herr Hitler should be just rearmament. We are in an age of rearmaments, but we on this side cannot accept that position".
In April 1936, the Chancellor of the Exchequer, Neville Chamberlain, introduced a Budget which increased the amount spent on the armed forces. Attlee made a radio broadcast in opposition to it.
In June 1936, the Conservative MP Duff Cooper called for an Anglo-French alliance against possible German aggression and called for all parties to support one. Attlee condemned this: "We say that any suggestion of an alliance of this kind—an alliance in which one country is bound to another, right or wrong, by some overwhelming necessity—is contrary to the spirit of the League of Nations, is contrary to the Covenant, is contrary to Locarno, is contrary to the obligations which this country has undertaken, and is contrary to the professed policy of this Government". At the Labour Party conference at Edinburgh in October, Attlee reiterated that "There can be no question of our supporting the Government in its rearmament policy".
However, with the rising threat from Nazi Germany, and the ineffectiveness of the League of Nations, this policy eventually lost credibility. By 1937, Labour had jettisoned its pacifist position and came to support rearmament and oppose Neville Chamberlain's policy of appeasement.
In 1938, Attlee opposed the Munich Agreement, in which Chamberlain negotiated with Hitler to give Germany the German-speaking parts of Czechoslovakia, the Sudetenland: We all feel relief that war has not come this time. Every one of us has been passing through days of anxiety; we cannot, however, feel that peace has been established, but that we have nothing but an armistice in a state of war. We have been unable to go in for care-free rejoicing. We have felt that we are in the midst of a tragedy. We have felt humiliation. This has not been a victory for reason and humanity. It has been a victory for brute force. At every stage of the proceedings there have been time limits laid down by the owner and ruler of armed force. The terms have not been terms negotiated; they have been terms laid down as ultimata. We have seen to-day a gallant, civilised and democratic people betrayed and handed over to a ruthless despotism. We have seen something more. We have seen the cause of democracy, which is, in our view, the cause of civilisation and humanity, receive a terrible defeat. ... The events of these last few days constitute one of the greatest diplomatic defeats that this country and France have ever sustained. There can be no doubt that it is a tremendous victory for Herr Hitler. Without firing a shot, by the mere display of military force, he has achieved a dominating position in Europe which Germany failed to win after four years of war. He has overturned the balance of power in Europe. He has destroyed the last fortress of democracy in Eastern Europe which stood in the way of his ambition. He has opened his way to the food, the oil and the resources which he requires in order to consolidate his military power, and he has successfully defeated and reduced to impotence the forces that might have stood against the rule of violence.
At the end of 1937, Attlee and a party of three Labour MPs visited Spain, where they visited the British Battalion of the International Brigades fighting in the Spanish Civil War. One of its companies was named the "Major Attlee Company" in his honour.
In 1937, Attlee wrote a book entitled "The Labour Party in Perspective" that sold fairly well in which he set out some of his views. He argued that there was no point in Labour compromising on its socialist principles in the belief that this would achieve electoral success. He wrote: "I find that the proposition often reduces itself to this – that if the Labour Party would drop its socialism and adopt a Liberal platform, many Liberals would be pleased to support it. I have heard it said more than once that if Labour would only drop its policy of nationalisation everyone would be pleased, and it would soon obtain a majority. I am convinced it would be fatal for the Labour Party." He also wrote that there was no point in "watering down Labour's socialist creed in order to attract new adherents who cannot accept the full socialist faith. On the contrary, I believe that it is only a clear and bold policy that will attract this support".
In the late 1930s, Attlee sponsored a Jewish mother and her two children, enabling them to leave Germany in 1939 and move to the UK. On arriving in Britain, Attlee invited one of the children into his home in Stanmore, north-west London, where he stayed for several months.
Attlee remained as Leader of the Opposition when the Second World War broke out in September 1939. The ensuing disastrous Norwegian Campaign would result in a motion of no confidence in Neville Chamberlain. Although Chamberlain survived this, the reputation of his administration was so badly and publicly damaged that it became clear a coalition government would be necessary. Even if Attlee had personally been prepared to serve under Chamberlain in an emergency coalition government, he would never have been able to carry Labour with him. Consequently, Chamberlain tendered his resignation, and Labour and the Conservatives entered a coalition government led by Winston Churchill on 10 May 1940.
Attlee and Churchill quickly agreed that the War Cabinet would consist of three Conservatives (initially Churchill, Chamberlain and Lord Halifax) and two Labour members (initially himself and Arthur Greenwood) and that Labour should have slightly more than one third of the posts in the coalition government. Attlee and Greenwood played a vital role in supporting Churchill during a series of War Cabinet debates over whether or not to negotiate peace terms with Hitler following the Fall of France in May 1940; both supported Churchill and gave him the majority he needed in the War Cabinet to continue Britain's resistance.
Only Attlee and Churchill remained in the War Cabinet from the formation of the Government of National Unity in May 1940 through to the election in May 1945. Attlee was initially the Lord Privy Seal, before becoming Britain's first ever Deputy Prime Minister in 1942, as well as becoming the Dominions Secretary and the Lord President of the Council.
Attlee himself played a generally low-key but vital role in the wartime government, working behind the scenes and in committees to ensure the smooth operation of government. In the coalition government, three inter-connected committees effectively ran the country. Churchill chaired the first two, the War Cabinet and the Defence Committee, with Attlee deputising for him in these, and answering for the government in Parliament when Churchill was absent. Attlee himself instituted, and later chaired, the third body, the Lord President's Committee, which was responsible for overseeing domestic affairs. As Churchill was most concerned with overseeing the war effort, this arrangement suited both men. Attlee himself had largely been responsible for creating these arrangements with Churchill's backing, streamlining the machinery of government and abolishing many committees. He also acted as a conciliator in the government, smoothing over tensions which frequently arose between Labour and Conservative Ministers.
Many Labour activists were baffled by the top leadership role for a man they regarded as having little charisma; Beatrice Webb recorded her doubts about him in her diary in early 1940.
Following the defeat of Nazi Germany and the end of the War in Europe in May 1945, Attlee and Churchill favoured the coalition government remaining in place until Japan had been defeated. However, Herbert Morrison made it clear that the Labour Party would not be willing to accept this, and Churchill was forced to tender his resignation as Prime Minister and call an immediate election.
The war had set in motion profound social changes within Britain, and had ultimately led to a widespread popular desire for social reform. This mood was epitomised in the Beveridge Report of 1942, by the Liberal economist William Beveridge. The "Report" assumed that the maintenance of full employment would be the aim of post-war governments, and that this would provide the basis for the welfare state. Immediately on its release, it sold hundreds of thousands of copies. All major parties committed themselves to fulfilling this aim, but most historians say that Attlee's Labour Party were seen by the electorate as the party most likely to follow it through.
Labour campaigned on the theme of "Let Us Face the Future", positioning themselves as the party best placed to rebuild Britain after the war, and were widely viewed as having run a strong and positive campaign, while the Conservative campaign centred entirely around Churchill. Despite opinion polls indicating a strong Labour lead, opinion polls were then viewed as a novelty which had not proven their worth, and most commentators expected that Churchill's prestige and status as a "war hero" would ensure a comfortable Conservative victory. Before polling day, "The Manchester Guardian" surmised that "the chances of Labour sweeping the country and obtaining a clear majority ... are pretty remote". The "News of the World" predicted a working Conservative majority, while in Glasgow a pundit forecast the result as Conservatives 360, Labour 220, Others 60. Churchill, however, made some costly errors during the campaign. In particular, his suggestion during one radio broadcast that a future Labour Government would require "some form of a gestapo" to implement their policies was widely regarded as being in very bad taste, and massively backfired.
When the results of the election were announced on 26 July, they came as a surprise to most, including Attlee himself. Labour had won power by a huge landslide, winning 47.7 per cent of the vote to the Conservatives' 36 per cent. This gave them 393 seats in the House of Commons, a working majority of 146. This was the first time in history that the Labour Party had won a majority in Parliament. When Attlee went to see King George VI at Buckingham Palace to be appointed Prime Minister, the notoriously laconic Attlee and the famously tongue-tied King stood in silence; Attlee finally volunteered the remark, "I've won the election". The King replied "I know. I heard it on the Six O'Clock News".
As Prime Minister, Attlee appointed Hugh Dalton as Chancellor of the Exchequer, Ernest Bevin as Foreign Secretary, and Herbert Morrison as Deputy Prime Minister, with overall responsibility for nationalisation. Additionally, Stafford Cripps was made President of the Board of Trade, Aneurin Bevan became Minister of Health, and Ellen Wilkinson, the only woman to serve in Attlee's cabinet, was appointed Minister of Education. The Attlee government proved to be radical and reforming: from 1945 to 1948, over 200 public Acts of Parliament were passed, with eight major pieces of legislation placed on the statute book in 1946 alone.
Francis (1995) argues there was consensus both in Labour's National Executive Committee and at party conferences on a definition of socialism that stressed moral as well as material improvement. The Attlee government was committed to rebuilding British society as an ethical commonwealth, using public ownership and controls to abolish extremes of wealth and poverty. Labour's ideology contrasted sharply with the contemporary Conservative Party's defence of individualism, inherited privileges, and income inequality. On 5 July 1948, Attlee replied to a letter dated 22 June from James Murray and ten other MPs who raised concerns about the West Indians who had arrived on board the "Empire Windrush". As for the prime minister himself, he was not much focused on economic policy, and let others handle the issues.
Attlee's Health Minister, Aneurin Bevan, fought hard against the general disapproval of the medical establishment, including the British Medical Association, to create the National Health Service (NHS) in 1948: a publicly funded healthcare system which offered treatment free of charge for all at the point of use. Reflecting long pent-up demand for medical services, the NHS treated some 8.5 million dental patients and dispensed more than 5 million pairs of spectacles during its first year of operation.
The government set about implementing the wartime plans of Liberal William Beveridge for the creation of a "cradle to grave" welfare state. It set in place an entirely new system of social security. Among the most important pieces of legislation was the National Insurance Act 1946, in which people in work were required to pay a flat rate of national insurance. In return, they (and the wives of male contributors) were eligible for a wide range of benefits, including pensions, sickness benefit, unemployment benefit, and funeral benefit. Various other pieces of legislation provided for child benefit and support for people with no other source of income. In 1949, unemployment, sickness and maternity benefits were exempted from tax.
The New Towns Act of 1946 set up development corporations to construct new towns, while the Town and Country Planning Act of 1947 instructed county councils to prepare development plans and also provided them with compulsory purchase powers. The Attlee government also extended the powers of local authorities to requisition houses and parts of houses, and made the acquisition of land less difficult than before. The Housing (Scotland) Act of 1949 provided grants, payable by the Treasury to local authorities, of 75 per cent (87.5 per cent in the Highlands and Islands) towards modernisation costs.
In 1949, local authorities were empowered to provide people suffering from poor health with public housing at subsidised rents.
To assist home ownership, the limit on the amount of money that people could borrow from their local authority to purchase or build a home was raised from £800 to £1,500 in 1945, and to £5,000 in 1949. Under the National Assistance Act of 1948, local authorities had a duty "to provide emergency temporary accommodation for families which become homeless through no fault of their own".
A large house-building programme was carried out with the intention of providing millions of people with high-quality homes. A housing bill passed in 1946 increased Treasury subsidies for the construction of local authority housing in England and Wales. Four out of five houses constructed under Labour were council properties built to more generous specifications than before the Second World War, and subsidies kept down council rents. Altogether, these policies gave public-sector housing its biggest boost up to that point, and low-wage earners particularly benefited. Although the Attlee government failed to meet its targets, primarily because of economic constraints, over a million new homes were built between 1945 and 1951 (a significant achievement under the circumstances), ensuring that decent, affordable housing was available to many low-income families for the first time.
A number of reforms were embarked upon to improve conditions for women and children. In 1946, universal family allowances were introduced to provide financial support to households for raising children. These benefits had been legislated for the previous year under Churchill's Family Allowances Act 1945, and their introduction was the first measure pushed through Parliament by Attlee's government. Conservatives would later criticise Labour for having been "too hasty" in introducing family allowances.
A Married Women (Restraint Upon Anticipation) Act was passed in 1949 "to equalise, to render inoperative any restrictions upon anticipation or alienation attached to the enjoyment of property by a woman", while the Married Women (Maintenance) Act of 1949 was enacted with the intention of improving the adequacy and duration of financial benefits for married women.
The Criminal Law (Amendment) Act of 1950 amended an Act of 1885 to bring prostitutes within the law and safeguard them from abduction and abuse. The Criminal Justice Act of 1948 restricted imprisonment for juveniles and brought improvements to the probation and remand centres systems, while the passage of the Justices of the Peace Act of 1949 led to extensive reforms of magistrates' courts. The Attlee government also abolished the marriage bar in the Civil Service, thereby enabling married women to work in that institution.
In 1946, the government set up a National Institute of Houseworkers as a means of providing a social democratic variety of domestic service.
By late 1946, agreed standards of training were established, followed by the opening of a training headquarters and of a further nine training centres in Wales, Scotland, and then throughout Great Britain. The National Health Service Act of 1946 indicated that domestic help should be provided for households where that help is required "owing to the presence of any person who is ill, lying-in, an expectant mother, mentally defective, aged or a child not over compulsory school age". 'Home help' therefore included the provision of home-helps for nursing and expectant mothers and for mothers with children under the age of five, and by 1952 some 20,000 women were engaged in this service.
Development rights were nationalised, while the government attempted to take all development profits for the State. Strong planning authorities were set up to control land use, and issued manuals of guidance which stressed the importance of safeguarding agricultural land. A chain of regional offices was set up within the planning ministry to provide a strong lead in regional development policies.
Comprehensive Development Areas (CDAs), a designation under the Town and Country Planning Act of 1947, allowed local authorities to acquire property in the designated areas using powers of compulsory purchase, in order to re-plan and develop urban areas suffering from blight or war damage.
Various measures were carried out to improve conditions in the workplace. Entitlement to sick leave was greatly extended, and sick pay schemes were introduced for local authority administrative, professional and technical workers in 1946 and for various categories of manual workers in 1948. Workers' compensation was also significantly improved.
The Fair Wages Resolution of 1946 required any contractor working on a public project to at least match the pay rates and other employment conditions set in the appropriate collective agreement. In 1946, purchase tax was removed completely from kitchen fittings and crockery, while the rate was reduced on various gardening items.
The Fire Services Act 1947 introduced a new pension scheme for fire-fighters, while the Electricity Act 1947 introduced better retirement benefits for workers in that industry. A Workers' Compensation (Supplementation) Act was passed in 1948, introducing benefits for workers with certain asbestos-related diseases which had occurred before 1948. The Merchant Shipping Act of 1948 and the Merchant Shipping (Safety Convention) Act of 1949 were passed to improve conditions for seamen. The Shops Act of 1950 consolidated previous legislation which provided that no one could be employed in a shop for more than six hours without a break of at least 20 minutes. The legislation also required a lunch break of at least 45 minutes for anyone who worked between 11:30 am and 2:30 pm, and a half-hour tea break for anyone working between 4 pm and 7 pm. The government also strengthened the Fair Wages Resolution, with a clause requiring all employers receiving government contracts to recognise the right of their workers to join trade unions.
The Trades Disputes Act 1927 was repealed, and a Dock Labour Scheme was introduced in 1947 to put an end to the casual system of hiring labour in the docks. All dockers were registered under the scheme, giving them a legal right to minimum work, decent conditions, holidays and sick pay. Through the National Dock Labour Board (on which trade unions and employers had equal representation) the unions acquired control over recruitment and dismissal, and registered dockers laid off by employers within the scheme had the right either to be taken on by another employer or to generous compensation.
Wages for members of the police force were significantly increased. The introduction of a Miners' Charter in 1946 instituted a five-day working week for miners and a standardised day-wage structure, and in 1948 a Colliery Workers Supplementary Scheme was approved, providing supplementary allowances to disabled coal-workers and their dependants. In 1948, a pension scheme was set up to provide pension benefits for employees of the new NHS, as well as their dependants. Under the Coal Industry Nationalisation (Superannuation) Regulations of 1950, a pension scheme for mineworkers was established. Improvements were also made in farmworkers' wages, and the Agricultural Wages Board in 1948 not only safeguarded wage levels, but also ensured that workers were provided with accommodation.
A number of regulations aimed at safeguarding the health and safety of people at work were also introduced during Attlee's time in office. Regulations issued in February 1946 applied to factories involved with "manufacturing briquettes or blocks of fuel consisting of coal, coal dust, coke or slurry with pitch as a binding substance", and concerned "dust and ventilation, washing facilities and clothing accommodation, medical supervision and examination, skin and eye protection and messrooms".
Attlee's government also carried out their manifesto commitment for nationalisation of basic industries and public utilities. The Bank of England and civil aviation were nationalised in 1946. Coal mining, the railways, road haulage, canals and Cable and Wireless were nationalised in 1947, and electricity and gas followed in 1948. The steel industry was nationalised in 1951. By 1951 about 20 per cent of the British economy had been taken into public ownership.
Nationalisation failed to give workers a greater say in the running of the industries in which they worked. It did, however, bring about significant material gains for workers in the form of higher wages, reduced working hours, and improvements in working conditions, especially with regard to safety. As historian Eric Shaw noted of the years following nationalisation, the electricity and gas supply companies became "impressive models of public enterprise" in terms of efficiency, while the National Coal Board was not only profitable, but working conditions for miners improved significantly as well.
Within a few years of nationalisation, a number of progressive measures had been carried out which did much to improve conditions in the mines, including better pay, a five-day working week, a national safety scheme (with proper standards at all the collieries), a ban on boys under the age of 16 going underground, the introduction of training for newcomers before going down to the coalface, and the making of pithead baths into a standard facility.
The newly established National Coal Board offered sick pay and holiday pay to miners. As noted by Martin Francis:
Union leaders saw nationalisation as a means to pursue a more advantageous position within a framework of continued conflict, rather than as an opportunity to replace the old adversarial form of industrial relations. Moreover, most workers in nationalised industries exhibited an essentially instrumentalist attitude, favouring public ownership because it secured job security and improved wages rather than because it promised the creation of a new set of socialist relationships in the workplace.
The Attlee government placed strong emphasis on improving the quality of life in rural areas, benefiting both farmers and other consumers. Security of tenure for farmers was introduced, while consumers were protected by food subsidies and the redistributive effects of deficiency payments. Between 1945 and 1951, the quality of rural life was improved by improvements in gas, electricity, and water services, as well as in leisure and public amenities. In addition, the 1947 Transport Act improved provision of rural bus services, while the Agriculture Act 1947 established a more generous subsidy system for farmers. Legislation was also passed in 1947 and 1948 which established a permanent Agricultural Wages Board to fix minimum wages for agricultural workers.
Attlee's government made it possible for farm workers to borrow up to 90 per cent of the cost of building their own houses and to receive a subsidy of £15 a year for 40 years towards that cost. Grants were also made to meet up to half the cost of supplying water to farm buildings and fields, the government met half the cost of bracken eradication and lime spreading, and grants were paid for bringing into use hill farming land that had previously been considered unfit for farming purposes.
In 1946, the National Agricultural Advisory Service was set up to supply agricultural advice and information. The Hill Farming Act of 1946 introduced for upland areas a system of grants for buildings, land improvement, and infrastructural improvements such as roads and electrification, and continued a system of headage payments for hill sheep and cattle that had been introduced during the war. The Agricultural Holdings Act of 1948 enabled (in effect) tenant farmers to hold lifelong tenancies and made provision for compensation in the event of the termination of tenancies. In addition, the Livestock Rearing Act of March 1951 extended the provisions of the 1946 Hill Farming Act to the upland store cattle and sheep sector.
At a time of world food shortages, it was vital that farmers produced the maximum possible quantities. The government encouraged farmers via subsidies for modernisation, while the National Agricultural Advisory Service provided expertise and price guarantees. As a result of the Attlee government's initiatives in agriculture, output increased by 20 per cent between 1947 and 1952, while Britain developed one of the most mechanised and efficient farming industries in the world.
The Attlee government ensured that the provisions of the Education Act 1944 were fully implemented, with free secondary education becoming a right for the first time. Fees in state grammar schools were eliminated, while new, modern secondary schools were constructed.
The school leaving age was raised to 15 in 1947, an accomplishment brought to fruition with the help of initiatives such as the H.O.R.S.A. ("Huts Operation for Raising the School-leaving Age") scheme and the S.F.O.R.S.A. (furniture) scheme. University scholarships were introduced to ensure that no one who was qualified "should be deprived of a university education for financial reasons", while a large school building programme was organised. A rapid increase in the number of trained teachers took place, and the number of new school places was increased.
Increased Treasury funds were made available for education, particularly for upgrading school buildings suffering from years of neglect and war damage. Prefabricated classrooms were built and 928 new primary schools were constructed between 1945 and 1950. The provision of free school meals was expanded, and opportunities for university entrants were increased. State scholarships to universities were increased, and the government adopted a policy of supplementing university scholarships awards to a level sufficient to cover fees plus maintenance.
Many thousands of ex-servicemen who could never have contemplated university before the war were assisted through college. Free milk was also made available to all schoolchildren for the first time. In addition, spending on technical education rose, and the number of nursery schools was increased. Salaries for teachers were also improved, and funds were allocated towards improving existing schools.
In 1947, the Arts Council of Great Britain was set up to encourage the arts.
A Ministry of Education was established, and free County Colleges were set up for the compulsory part-time instruction of teenagers between the ages of 15 and 18 who were not in full-time education. An Emergency Training Scheme was also introduced, which turned out an extra 25,000 teachers between 1945 and 1951. In 1947, Regional Advisory Councils were set up to bring together industry and education to find out the needs of young workers, "and advise on the provision required, and to secure reasonable economy of provision". That same year, thirteen Area Training Organisations were set up in England and one in Wales to coordinate teacher training.
Attlee's government, however, failed to introduce the comprehensive education for which many socialists had hoped. This reform was eventually carried out by Harold Wilson's government. During its time in office, the Attlee government increased spending on education by over 50 per cent, from £6.5 billion to £10 billion.
The most significant problem facing Attlee and his ministers remained the economy, as the war effort had left Britain nearly bankrupt. The war had cost Britain about a quarter of her national wealth, and overseas investments had been used up to pay for it. The transition to a peacetime economy and the maintenance of strategic military commitments abroad led to continuous and severe problems with the balance of trade. As a result, strict rationing of food and other essential goods continued in the post-war period, forcing a reduction in consumption in an effort to limit imports, boost exports, and stabilise the pound sterling so that Britain could trade its way out of its financial difficulties.
The abrupt end of the American Lend-Lease programme in August 1945 almost caused a crisis. Some relief was provided by the Anglo-American loan, negotiated in December 1945. The conditions attached to the loan included making the pound fully convertible to the US dollar; when convertibility was introduced in July 1947, it led to a currency crisis and had to be suspended after just five weeks. The UK benefited from the American Marshall Aid programme in 1948, and the economic situation improved significantly. Another balance of payments crisis in 1949, however, forced the Chancellor of the Exchequer, Stafford Cripps, to devalue the pound.
Despite these problems, one of the main achievements of Attlee's government was the maintenance of near full employment. The government retained most of the wartime controls over the economy, including control over the allocation of materials and manpower, and unemployment rarely rose above 500,000, or 3 per cent of the total workforce; indeed, labour shortages proved a more frequent problem, and there was no hard core of long-term unemployed. The inflation rate was also kept low during his term. Both production and productivity rose as a result of new equipment, while the average working week was shortened.
The government was less successful in housing, which was the responsibility of Aneurin Bevan. The government had a target to build 400,000 new houses a year to replace those which had been destroyed in the war, but shortages of materials and manpower meant that less than half this number were built. Nevertheless, millions of people were rehoused as a result of the Attlee government's housing policies. Between August 1945 and December 1951, 1,016,349 new homes were completed in England, Scotland, and Wales.
When the Attlee government was voted out of office in 1951, the economy had improved compared to 1945. The period from 1946 to 1951 saw continuous full employment and steadily rising living standards, which increased by about 10 per cent each year. During that same period, the economy grew by 3 per cent a year, and by 1951 the UK had "the best economic performance in Europe, while output per person was increasing faster than in the United States". Careful planning after 1945 also ensured that demobilisation was carried out without a negative impact on economic recovery, and that unemployment stayed at very low levels. In addition, the number of motor cars on the roads rose from 3 million to 5 million between 1945 and 1951, and seaside holidays were taken by far more people than ever before. A Monopolies and Restrictive Practices (Inquiry and Control) Act was passed in 1948, allowing for investigations of restrictive practices and monopolies.
1947 proved a particularly difficult year for the government: an exceptionally cold winter caused coal mines to freeze and cease production, creating widespread power cuts and food shortages. The Minister of Fuel and Power, Emanuel Shinwell, was widely blamed for failing to ensure adequate coal stocks, and soon resigned from his post. The Conservatives capitalised on the crisis with the slogan 'Starve with Strachey and shiver with Shinwell' (referring to the Minister of Food, John Strachey).
The crisis led to an unsuccessful plot by Hugh Dalton to replace Attlee as Prime Minister with Ernest Bevin, and later that year Stafford Cripps also tried to persuade Attlee to stand aside for Bevin. These plots petered out after Bevin refused to cooperate. In November 1947, Dalton resigned as Chancellor after inadvertently leaking details of the budget to a journalist; he was replaced by Cripps.
In foreign affairs, the Attlee government was concerned with four main issues: post-war Europe, the onset of the Cold War, the establishment of the United Nations, and decolonisation. The first two were closely related, and Attlee was assisted in them by Foreign Secretary Ernest Bevin. Attlee also attended the later stages of the Potsdam Conference, where he negotiated with President Harry S. Truman and Joseph Stalin.
In the immediate aftermath of the war, the Government faced the challenge of managing relations with Britain's former wartime ally, Stalin's Soviet Union. Ernest Bevin was a passionate anti-communist, based largely on his experience of fighting communist influence in the trade union movement, and his initial approach to the USSR as Foreign Secretary was "wary and suspicious, but not automatically hostile". Attlee himself sought warm relations with Stalin: he put his trust in the United Nations, rejected notions that the Soviet Union was bent on world conquest, and warned that treating Moscow as an enemy would turn it into one. This put Attlee at odds with his Foreign Secretary, the Foreign Office, and the military, who all saw the Soviets as a growing threat to Britain's role in the Middle East. In January 1947, however, Attlee abruptly reversed his position and agreed with Bevin on a hard-line anti-Soviet policy.
In an early "good-will" gesture that was later heavily criticised, the Attlee government allowed the Soviets to purchase, under the terms of a 1946 UK–USSR Trade Agreement, a total of 25 Rolls-Royce Nene jet engines in September 1947 and March 1948. The deal included an undertaking not to use the engines for military purposes, and the price was fixed under a commercial contract; in all, 55 jet engines were sold to the USSR in 1947. However, the Cold War intensified during this period, and the Soviets, who at the time were well behind the West in jet technology, reverse-engineered the Nene and installed their own version in the MiG-15 interceptor. This was used to good effect against US–UK forces in the subsequent Korean War, as well as in several later MiG models.
After Stalin took political control of most of Eastern Europe and began to subvert other governments in the Balkans, Attlee's and Bevin's worst fears of Soviet intentions were realised. The Attlee government then became instrumental in the creation of the successful NATO defence alliance to protect Western Europe against any Soviet expansion. In a crucial contribution to the economic stability of post-war Europe, Attlee's Cabinet was also instrumental in promoting the American Marshall Plan for the economic recovery of Europe, which Attlee called one of the "most bold, enlightened and good-natured acts in the history of nations".
A group of Labour MPs, organised under the banner of "Keep Left", urged the government to steer a middle way between the two emerging superpowers, and advocated the creation of a "third force" of European powers to stand between the US and the USSR. However, deteriorating relations between Britain and the USSR, as well as Britain's economic reliance on America following the Marshall Plan, steered policy towards supporting the US. In January 1947, fear of both Soviet and American nuclear intentions led to a secret meeting of the Cabinet, where the decision was made to press ahead with the development of Britain's independent nuclear deterrent, an issue which later caused a split in the Labour Party. Britain's first successful nuclear test, however, did not occur until 1952, a year after Attlee had left office.
The London dock strike of July 1949, led by Communists, was suppressed when the Attlee government sent in 13,000 Army troops and passed special legislation to end the strike promptly. The response revealed Attlee's growing conviction that Soviet expansionism, supported by the British Communist Party, was a genuine threat to national security, and that the docks were highly vulnerable to sabotage ordered by Moscow. He noted that the strike had been caused not by local grievances, but by a desire to help communist unions that were on strike in Canada. Attlee agreed with MI5 that he faced "a very present menace".
Decolonisation was never a major election issue, but Attlee gave the matter a great deal of attention and took the lead in planning and achieving the decolonisation of the British Empire, starting in Asia.
In August 1948, the Chinese Communists' victories caused Attlee to begin preparing for a Communist takeover of China. His government kept open consulates in Communist-controlled areas and rejected the Chinese Nationalists' requests that British citizens assist in the defence of Shanghai. By December, the government had concluded that although British property in China would likely be nationalised, British traders would benefit in the long run from a stable, industrialising Communist China. Retaining Hong Kong was especially important; although the Chinese Communists promised not to interfere with its rule, Britain reinforced the Hong Kong garrison during 1949. When the victorious Chinese Communist government declared on 1 October 1949 that it would exchange diplomats with any country that ended relations with the Chinese Nationalists, Britain became the first western country to formally recognise the People's Republic of China, in January 1950.
In 1954, a Labour Party delegation including Attlee visited China at the invitation of then Foreign Minister Zhou Enlai. Attlee became the first high-ranking western politician to meet Mao Zedong.
Attlee orchestrated the granting of independence to India and Pakistan in 1947. From 1928 to 1934 he had been a member of the Indian Statutory Commission (otherwise known as the Simon Commission). He became the Labour Party's expert on India, and by 1934 was committed to granting India the same independent dominion status that Canada, Australia, New Zealand and South Africa had recently been given. He faced strong resistance from the die-hard Conservative imperialists, led by Churchill, who opposed both independence and the efforts led by Prime Minister Stanley Baldwin to set up a system of limited local control by Indians themselves. Attlee and the Labour leadership were sympathetic to the Congress movement led by Mahatma Gandhi and Jawaharlal Nehru. During the Second World War, Attlee was in charge of Indian affairs. He set up the Cripps Mission in 1942, which tried and failed to bring the factions together. When the Congress called for passive resistance in the "Quit India" movement of 1942–1945, it was Attlee who ordered the arrest and internment, for the duration of the war, of tens of thousands of Congress leaders, and who crushed the revolt.
Labour's election manifesto in 1945 called for "the advancement of India to responsible self-government", but did not mention independence. In 1942 the British Raj had tried to enlist all major political parties in support of the war effort. Congress, led by Nehru and Gandhi, demanded immediate independence and full control by Congress of all of India. That demand was rejected by the British, and Congress opposed the war effort with its "Quit India" campaign. The Raj responded by imprisoning the major national, regional and local Congress leaders for the duration; Attlee did not object. By contrast, the Muslim League, led by Muhammad Ali Jinnah, and also the Sikh community strongly supported the war effort. They greatly enlarged their membership and won favour from London for their decision. Attlee retained a fondness for Congress and, until 1946, accepted its thesis that it was a non-religious party that accepted Hindus, Muslims, Sikhs, and everyone else.
The Muslim League insisted that it was the only true representative of all of the Muslims of India, and by 1946 Attlee had come to agree with it. With violence escalating in India after the war, but with British financial power at a low ebb, large-scale military involvement was impossible. Viceroy Wavell said he needed a further seven army divisions to prevent communal violence if independence negotiations failed. No divisions were available; independence was the only option. Given the demands of the Muslim League, independence implied a partition that separated heavily Muslim Pakistan from the main portion of India.
The Labour government gave independence to India and Pakistan in an unexpectedly quick move in 1947. Historian Andrew Roberts says the independence of India was a "national humiliation", but one necessitated by urgent financial, administrative, strategic and political needs. Churchill in 1940–1945 had tightened the hold on India and imprisoned the Congress leadership, with Attlee's approval. Labour had looked forward to making India a fully independent dominion like Canada or Australia. Many of the Congress leaders had studied in England and were highly regarded as fellow idealistic socialists by Labour leaders. Attlee was the Labour expert on India and took special charge of decolonisation. He found that Churchill's viceroy, Field Marshal Wavell, was too imperialistic, too keen on military solutions and too neglectful of Indian political alignments, and replaced him with Lord Mountbatten, the dashing war hero and a cousin of the King. Drawing the boundary between the newly created states of Pakistan and India involved the widespread resettlement of millions of Muslims and Hindus (and many Sikhs), and extreme violence ensued when the Punjab and Bengal provinces were split. Historian Yasmin Khan estimates that between half a million and a million men, women and children were killed. Gandhi himself was assassinated by a Hindu activist in January 1948.
The final result was two nations consisting of a Hindu-majority India and a Muslim-majority Pakistan (which incorporated East Pakistan, now Bangladesh). Both joined the Commonwealth.
Attlee also sponsored the peaceful transition to independence in 1948 of Burma (Myanmar) and Ceylon (Sri Lanka).
One of the most urgent problems concerned the future of the Palestine Mandate, which had become too troublesome and much too expensive to handle. British policies there were perceived by the Zionist movement and the Truman administration as pro-Arab and anti-Jewish, and in the face of an armed revolt by Jewish militant groups and increasing violence from the local Arab population, Britain found itself unable to control events. The Mandate was a very unpopular commitment, and the evacuation of British troops and the subsequent handing over of the issue to the United Nations were widely supported by the British public.
The government's policies with regard to the other colonies, particularly those in Africa, focused on keeping them as strategic Cold War assets while modernising their economies. The Labour Party had long attracted aspiring leaders from Africa and had developed elaborate plans before the war. Implementing them overnight with an empty treasury proved too challenging. A major military base was built in Kenya, and the African colonies came under an unprecedented degree of direct control from London. Development schemes were implemented to help solve Britain's post-war balance of payments crisis and raise African living standards. This "new colonialism" worked slowly and had failures such as the Tanganyika groundnut scheme.
The 1950 election gave Labour a massively reduced majority of five seats, compared with the triple-digit majority of 1945. Although re-elected, Attlee regarded the result as very disappointing; it was widely attributed to the effects of post-war austerity denting Labour's appeal to middle-class voters. With so slim a majority leaving him dependent on a handful of MPs to govern, Attlee's second term was much tamer than his first. Some major reforms were nevertheless passed, particularly regarding industry in urban areas and regulations to limit air and water pollution.
By 1951, the Attlee government was exhausted, with several of its most senior ministers ailing or ageing, and with a lack of new ideas. Attlee's record of settling internal differences in the Labour Party came to an end in April 1951, when a damaging split opened over an austerity Budget brought in by the Chancellor, Hugh Gaitskell, to pay for the cost of Britain's participation in the Korean War. Aneurin Bevan resigned in protest against the new charges for "teeth and spectacles" in the National Health Service introduced by that Budget, and was joined by several senior ministers, including the future Prime Minister Harold Wilson, then the President of the Board of Trade. This escalated a battle between the left and right wings of the party that would continue for decades.
Finding it increasingly difficult to govern, Attlee called a snap election in October 1951, hoping to achieve a more workable majority and to regain authority. The gamble failed: Labour narrowly lost to the Conservative Party, despite winning considerably more votes (the largest Labour vote in electoral history). Attlee tendered his resignation as Prime Minister the following day, after six years and three months in office.
Following the defeat in 1951, Attlee continued to lead the party as Leader of the Opposition. His last four years as leader were, however, widely seen as one of the Labour Party's weaker periods.
The period was dominated by infighting between the Labour Party's right wing, led by Hugh Gaitskell, and its left, led by Aneurin Bevan. Many Labour MPs felt that Attlee should have retired after the 1951 election and allowed a younger man to lead the party. Bevan openly called for him to stand down in the summer of 1954. One of his main reasons for staying on as leader was to frustrate the leadership ambitions of Herbert Morrison, whom Attlee disliked for both political and personal reasons. At one time, Attlee had favoured Aneurin Bevan to succeed him as leader, but this became problematic after Bevan almost irrevocably split the party.
In an interview with the "News Chronicle" columnist Percy Cudlipp in mid-September 1955, Attlee made clear his own thinking on the leadership succession and his preference for a successor.
Attlee, now aged 72, contested the 1955 general election against Anthony Eden, which saw Labour lose 18 seats and the Conservatives increase their majority. He retired as Leader of the Labour Party on 7 December 1955, having led the party for twenty years, and on 14 December Hugh Gaitskell was elected as his replacement.
He subsequently retired from the House of Commons and was elevated to the peerage, taking his seat in the House of Lords as Earl Attlee and Viscount Prestwood on 16 December 1955. He believed Eden had been forced into taking a strong stand on the Suez Crisis by his backbenchers. In 1958, he joined numerous other notables in establishing the Homosexual Law Reform Society, which campaigned for the decriminalisation of homosexual acts in private between consenting adults, a reform voted through Parliament nine years later.
In 1962, he spoke twice in the House of Lords against the British government's application for the UK to join the European Economic Community ("Common Market"). In his second speech, delivered in November, Attlee claimed that Britain had a separate parliamentary tradition from the Continental countries that composed the EEC, and that if Britain were a member, EEC rules would prevent the British government from planning the economy; Britain's traditional policy, he argued, had been outward-looking rather than Continental.
He attended Winston Churchill's funeral in January 1965. He was elderly and frail by that time, and had to remain seated in the freezing cold as the coffin was carried, having tired himself out by standing at the rehearsal the previous day. He lived to see the Labour Party return to power under Harold Wilson in 1964, but also to see his old constituency of Walthamstow West fall to the Conservatives in a by-election in September 1967.
Attlee died peacefully in his sleep of pneumonia, at the age of 84 at Westminster Hospital on 8 October 1967. Two thousand people attended his funeral in November, including the then-Prime Minister Harold Wilson and the Duke of Kent, representing the Queen. He was cremated and his ashes were buried at Westminster Abbey.
Upon his death, the title passed to his son Martin Richard Attlee, 2nd Earl Attlee (1927–1991). It is now held by Clement Attlee's grandson John Richard Attlee, 3rd Earl Attlee. The third earl (a member of the Conservative Party) retained his seat in the Lords as one of the hereditary peers elected to remain under an amendment to Labour's House of Lords Act 1999.
Attlee's estate was sworn for probate purposes at a value of £7,295, a relatively modest sum for so prominent a figure, and only a fraction of the £75,394 in his father's estate when he died in 1908.
The quotation about Attlee, "A modest man, but then he has so much to be modest about", is commonly ascribed to Churchill, though Churchill denied saying it and respected Attlee's service in the War Cabinet. Attlee's modesty and quiet manner hid a great deal that has only come to light with historical reappraisal. Attlee himself is said to have responded to critics with a limerick: "There were few who thought him a starter, / Many who thought themselves smarter; / But he ended PM, / CH and OM, / An Earl and a Knight of the Garter."
The journalist and broadcaster Anthony Howard called him "the greatest Prime Minister of the 20th century".
His leadership style of consensual government, acting as a chairman rather than a president, won him much praise from historians and politicians alike. Christopher Soames, the British Ambassador to France during the Conservative government of Edward Heath and cabinet minister under Margaret Thatcher, remarked that "Mrs Thatcher was not really running a team. Every time you have a Prime Minister who wants to make all the decisions, it mainly leads to bad results. Attlee didn't. That's why he was so damn good".
Thatcher herself wrote in her 1995 memoirs, which charted her life from her beginnings in Grantham to her victory at the 1979 general election, that she admired Attlee: "Of Clement Attlee, however, I was an admirer. He was a serious man and a patriot. Quite contrary to the general tendency of politicians in the 1990s, he was all substance and no show".
Attlee's government presided over the successful transition from a wartime to a peacetime economy, tackling problems of demobilisation, shortages of foreign currency, and deficits in the balance of trade and in government expenditure. Further domestic policies that he brought about included the creation of the National Health Service and the post-war welfare state, which became key to the reconstruction of post-war Britain. Attlee and his ministers did much to transform the UK into a more prosperous and egalitarian society during their time in office, with reductions in poverty and a rise in the general economic security of the population.
In foreign affairs, his government did much to assist with the post-war economic recovery of Europe, and proved a loyal ally of the US at the onset of the Cold War. In keeping with Attlee's style of leadership, however, it was Ernest Bevin rather than Attlee himself who masterminded foreign policy. It was Attlee's government that decided Britain should have an independent nuclear weapons programme, and work on it began in 1947.
Bevin, Attlee's Foreign Secretary, famously stated that "We've got to have it and it's got to have a bloody Union Jack on it". The first operational British atomic bomb was not detonated until October 1952, about a year after Attlee had left office. Independent British atomic research was prompted partly by the US McMahon Act, which nullified wartime expectations of post-war US–UK collaboration in nuclear research and prohibited Americans from communicating nuclear technology even to allied countries. British atomic bomb research was kept secret even from some members of Attlee's own cabinet, whose loyalty or discretion seemed uncertain.
Although a socialist, Attlee still believed in the British Empire of his youth. He thought of it as an institution that was a power for good in the world. Nevertheless, he saw that a large part of it needed to be self-governing. Using the Dominions of Canada, Australia, and New Zealand as a model, he continued the transformation of the empire into the modern-day British Commonwealth.
Perhaps his greatest achievement was the establishment of a political and economic consensus about the governance of Britain that all three major parties subscribed to for three decades, fixing the arena of political discourse until the late 1970s. In 2004, he was voted the most successful British Prime Minister of the 20th century in a poll of 139 academics organised by Ipsos MORI.
A blue plaque unveiled in 1979 commemorates Attlee at 17 Monkhams Avenue, in Woodford Green in the London borough of Redbridge.
Attlee was elected a Fellow of the Royal Society in 1947, and was awarded an Honorary Fellowship of Queen Mary College on 15 December 1948.
On 30 November 1988, a bronze statue of Clement Attlee was unveiled by Harold Wilson (the next Labour Prime Minister after Attlee) outside Limehouse Library in Attlee's former constituency. By then Wilson was the last surviving member of Attlee's cabinet, and the unveiling of the statue would be one of the last public appearances by Wilson, who was by that point in the early stages of Alzheimer's disease; he died at the age of 79 in May 1995.
Limehouse Library was closed in 2003, after which the statue was vandalised. The council surrounded it with protective hoarding for four years, before eventually removing it for repair and recasting in 2009. The restored statue was unveiled by Peter Mandelson in April 2011, in its new position less than a mile away at the Queen Mary University of London's Mile End campus.
There is also a statue of Clement Attlee in the Houses of Parliament that was erected, instead of a bust, by parliamentary vote in 1979. The sculptor was Ivor Roberts-Jones.
Although one of his brothers became a clergyman and one of his sisters a missionary, Attlee himself is usually regarded as an agnostic. In an interview he described himself as "incapable of religious feeling", saying that he believed in "the ethics of Christianity" but not "the mumbo-jumbo". When asked whether he was an agnostic, Attlee replied "I don't know".
Catullus
Gaius Valerius Catullus (c. 84 – c. 54 BC) was a Latin poet of the late Roman Republic who wrote chiefly in the neoteric style of poetry, which concerns personal life rather than classical heroes. His surviving works are still read widely and continue to influence poetry and other forms of art.
Catullus's poems were widely appreciated by contemporary poets, and significantly influenced Ovid and Virgil, among others. After his rediscovery in the Late Middle Ages, Catullus again found admirers such as Petrarch. The explicit sexual imagery which he uses in some of his poems has shocked many readers; yet, at many levels of instruction, Catullus is considered a valuable resource for teachers of Latin. His sometimes profane body of work is still frequently read from secondary school to higher education programs across the world, with his 64th poem often considered his greatest.
Gāius Valerius Catullus was born to a leading equestrian family of Verona, in Cisalpine Gaul. The social prominence of the Catullus family allowed the father of Gaius Valerius to entertain Julius Caesar when Caesar was promagistrate (proconsul) of both Gallic provinces. In a poem, Catullus describes his happy homecoming to the family villa at Sirmio, on Lake Garda, near Verona; he also owned a villa near the resort of Tibur (Tivoli).
Catullus appears to have spent most of his young adult years in Rome. His friends there included the poets Licinius Calvus and Helvius Cinna, Quintus Hortensius (son of the orator and rival of Cicero), and the biographer Cornelius Nepos, to whom Catullus dedicated a "libellus" of poems, the relation of which to the extant collection remains a matter of debate. He appears to have been acquainted with the poet Marcus Furius Bibaculus. A number of prominent contemporaries appear in his poetry, including Cicero, Caesar and Pompey. According to an anecdote preserved by Suetonius, Caesar did not deny that Catullus's lampoons left an indelible stain on his reputation, but when Catullus apologized, Caesar invited the poet to dinner that very same day.
It was probably in Rome that Catullus fell deeply in love with the "Lesbia" of his poems, who is usually identified with Clodia Metelli, a sophisticated woman from the patrician house of the Claudii Pulchri, sister of the infamous Publius Clodius Pulcher, and wife of the proconsul Quintus Caecilius Metellus Celer. In his poems Catullus describes several stages of their relationship: initial euphoria, doubts, separation, and his wrenching feelings of loss. Clodia had several other partners; "From the poems one can adduce no fewer than five lovers in addition to Catullus: Egnatius (poem 37), Gellius (poem 91), Quintius (poem 82), Rufus (poem 77), and Lesbius (poem 79)." There is also some question surrounding her husband's mysterious death in 59 BC, with some critics believing he was poisoned in his own household. Yet the sensitive and passionate Catullus could not relinquish his flame for Clodia, regardless of her obvious indifference to his desire for a deep and permanent relationship. In his poems, Catullus wavers between devoted, burning love and bitter, scornful insults directed at her blatant infidelity (as demonstrated in poems 11 and 58). His passion for her was unrelenting, yet it is unclear when exactly the couple split up for good. Catullus's poems about the relationship display striking depth and psychological insight.
He spent the year of the provincial command, from summer 57 to summer 56 BC, in Bithynia on the staff of the commander Gaius Memmius. While in the East, he traveled to the Troad to perform rites at his brother's tomb, an event recorded in a moving poem.
There survives no ancient biography of Catullus: his life has to be pieced together from scattered references to him in other ancient authors and from his poems. Thus it is uncertain when he was born and when he died. St. Jerome says that he died in his 30th year, and was born in 87 BC. But the poems include references to events of 55 and 54 BC. Since the Roman consular fasti make it somewhat easy to confuse 87–57 BC with 84–54 BC, many scholars accept the dates 84 BC–54 BC, supposing that his latest poems and the publication of his "libellus" coincided with the year of his death. Other authors suggest 52 or 51 BC as the year of the poet's death. Though upon his elder brother's death Catullus lamented that their "whole house was buried along" with the deceased, the existence (and prominence) of "Valerii Catulli" is attested in the following centuries. T.P. Wiseman argues that after the brother's death Catullus could have married, and that, in this case, the later "Valerii Catulli" may have been his descendants.
Catullus's poems have been preserved in an anthology of 116 "carmina" (the actual number of poems may vary slightly between editions), which can be divided into three parts according to their form: sixty short poems in varying meters, called "polymetra", eight longer poems, and forty-eight epigrams.
There is no scholarly consensus on whether Catullus himself arranged the order of the poems. The longer poems differ from the "polymetra" and the epigrams not only in length but also in their subjects: there are seven hymns and one mini-epic, or epyllion, the most highly prized form for the "new poets".
The "polymetra" and the epigrams can be divided into four major thematic groups (ignoring a rather large number of poems that elude such categorization):
All these poems describe the lifestyle of Catullus and his friends, who, despite Catullus's temporary political post in Bithynia, lived their lives withdrawn from politics. They were interested mainly in poetry and love. Above all other qualities, Catullus seems to have valued "venustas", or charm, in his acquaintances, a theme which he explores in a number of his poems. The ancient Roman concept of "virtus" (i.e. of virtue that had to be proved by a political or military career), which Cicero suggested as the solution to the societal problems of the late Republic, meant little to them.
However, Catullus does not reject traditional notions, but rather their particular application to the "vita activa" of politics and war. Indeed, he tries to reinvent these notions from a personal point of view and to introduce them into human relationships. For example, he applies the word "fides", which traditionally meant faithfulness towards one's political allies, to his relationship with Lesbia, reinterpreting it as unconditional faithfulness in love. So, despite the seeming frivolity of his lifestyle, Catullus measured himself and his friends by quite ambitious standards.
Catullus's poetry was influenced by the innovative poetry of the Hellenistic Age, and especially by Callimachus and the Alexandrian school, which had propagated a new style of poetry that deliberately turned away from the classical epic poetry in the tradition of Homer. Cicero called these local innovators "neoteroi" (νεώτεροι) or 'moderns' (in Latin "poetae novi" or 'new poets'), in that they cast off the heroic model handed down from Ennius in order to strike new ground and ring a contemporary note. Catullus and Callimachus did not describe the feats of ancient heroes and gods (except perhaps in re-evaluating and predominantly artistic circumstances, e.g. poems 63 and 64), focusing instead on small-scale personal themes. Although these poems sometimes seem quite superficial and their subjects often are mere everyday concerns, they are accomplished works of art. Catullus described his work as "expolitum", or polished, to show that the language he used was very carefully and artistically composed.
Catullus was also an admirer of Sappho, a female poet of the seventh century BC. Catullus 51 partly translates, partly imitates and transforms Sappho 31; some hypothesize that poems 61 and 62 were inspired by lost works of Sappho, but this is purely speculative. Both of the latter are "epithalamia", a form of laudatory or erotic wedding-poetry for which Sappho was famous. Catullus twice used a meter that Sappho developed, the Sapphic strophe, in poems 11 and 51, perhaps prompting his successor Horace's interest in the form.
Catullus, as was common to his era, was greatly influenced by stories from Greek and Roman myth. His longer poems—such as 63, 64, 65, 66, and 68—allude to mythology in various ways. Some stories he refers to are the wedding of Peleus and Thetis, the departure of the Argonauts, Theseus and the Minotaur, Ariadne's abandonment, Tereus and Procne, as well as Protesilaus and Laodamia.
Catullus wrote in many different meters, including hendecasyllabic verse and elegiac couplets (common in love poetry). A great part of his poetry shows strong and occasionally wild emotions, especially in regard to Lesbia, whom he describes as having multiple suitors and often showing little affection towards him. His love poems are emotional and ardent, and they still resonate with readers today. He also demonstrates a great sense of humour, as in Catullus 13.
"Catullus Dreams" (2011) is a song cycle by David Glaser set to texts of Catullus. The cycle is scored for soprano and seven instruments. It was premiered at Symphony Space in New York by soprano Linda Larson and Sequitur Ensemble.
"Catulli Carmina" is a cantata by Carl Orff to the texts of Catullus.
"Carmina Catulli" is a song cycle arranged from 17 of Catullus' poems by American composer Michael Linton. The cycle was recorded in December 2013 and premiered at Carnegie Hall's Weill Recital Hall in March 2014 by French baritone Edwin Crossley-Mercer and pianist Jason Paul Peterson.
Catullus 5, the love poem "Vivamus mea Lesbia atque amemus", in the translation by Ben Jonson, was set to music (lute accompanied song) by Alfonso Ferrabosco the younger. Thomas Campion also wrote a lute-song using his own translation of the first six lines of Catullus 5 followed by two verses of his own. The translation by Richard Crashaw was set to music in a four-part glee by Samuel Webbe Jr. It was also set to music in a three-part glee by John Stafford Smith.
The Hungarian-born British composer Mátyás Seiber set poem 31 as "Sirmio" for unaccompanied mixed chorus in 1957.
Finnish jazz singer Reine Rimón has recorded poems of Catullus set to standard jazz tunes.
The American composer Ned Rorem set Catullus 101 to music for voice and piano. The song, "Catullus: On the Burial of His Brother", was originally published in 1969.
The Icelandic composer Jóhann Jóhannsson set Catullus 85 to music. The poem is sung through a vocoder. The music is played by a string quartet and piano. Titled "Odi Et Amo", the song is found on Jóhannsson's album "Englabörn". | https://en.wikipedia.org/wiki?curid=5768 |
C. S. Forester
Cecil Louis Troughton Smith (27 August 1899 – 2 April 1966), known by his pen name Cecil Scott "C. S." Forester, was an English novelist known for writing tales of naval warfare, such as the 12-book Horatio Hornblower series depicting a Royal Navy officer during the Napoleonic wars. The Hornblower novels "A Ship of the Line" and "Flying Colours" were jointly awarded the James Tait Black Memorial Prize for fiction in 1938. His other works include "The African Queen" (1935; turned into a 1951 film by John Huston) and "The Good Shepherd" (1955; turned into a 2020 film starring Tom Hanks).
Forester was born in Cairo; after a family breakup at an early age, he moved with his mother to London, where he was educated at Alleyn's School and Dulwich College. He began to study medicine at Guy's Hospital but left without completing his degree. Although of good height and somewhat athletic, he was of slender physique and wore glasses; he failed his Army physical and was told that there was no chance he would be accepted. He began writing seriously around 1921, using his pen name.
Forester moved to the United States during the Second World War, where he worked for the British Ministry of Information and wrote propaganda to encourage the US to join the Allies. He eventually settled in Berkeley, California. He met Roald Dahl in 1942 while living in Washington, D.C., and Forester encouraged him to write about his experiences in the RAF. According to Dahl's autobiography "Lucky Break", Forester asked him about his experiences as a fighter pilot, and this prompted Dahl to write his first story "A Piece of Cake".
Forester wrote many novels, but he is best known for the 12-book Horatio Hornblower series depicting a Royal Navy officer during the Napoleonic Wars. The first novel he wrote, published in 1937, introduced Hornblower already fairly high in rank, but high demand for more stories led him to fill in Hornblower's life story, and he wrote novels detailing the character's rise from the rank of midshipman. The last completed novel was published in 1962. Hornblower's fictional feats were based on real events, but Forester wrote the body of the works carefully to avoid entanglements with real-world history, so that Hornblower is always off on another mission when a great naval battle occurs during the Napoleonic Wars.
Forester's other novels include "The African Queen" (1935) and "The General" (1936); the Peninsular War novels "Death to the French" (published in the United States as "Rifleman Dodd") and "The Gun" (filmed as "The Pride and the Passion" in 1957); and seafaring stories that did not involve Hornblower, such as "Brown on Resolution" (1929), "The Captain from Connecticut" (1941), "The Ship" (1943), and "Hunting the Bismarck" (1959), which was used as the basis of the screenplay for the film "Sink the Bismarck!" (1960). Several of his works were filmed, including "The African Queen" (1951), directed by John Huston. Forester is also credited as story writer for several movies not based on his published fiction, including "Commandos Strike at Dawn" (1942).
He wrote several volumes of short stories set during the Second World War. Those in "The Nightmare" (1954) were based on events in Nazi Germany, ending at the Nuremberg trials. Stories in "The Man in the Yellow Raft" (1969) followed the career of the destroyer USS "Boon", while many of those in "Gold from Crete" (1971) followed the destroyer HMS "Apache". The last of the stories in "Gold from Crete" was "If Hitler had invaded England", which offers an imagined sequence of events starting with Hitler's attempt to implement Operation Sea Lion, and culminating in the early military defeat of Nazi Germany in the summer of 1941. His non-fiction seafaring works include "The Age of Fighting Sail" (1956), an account of the sea battles between Great Britain and the United States in the War of 1812.
Forester also published the crime novels "Payment Deferred" (1926) and "Plain Murder" (1930), as well as two children's books. "Poo-Poo and the Dragons" (1942) was created as a series of stories told to his son George to encourage him to finish his meals. George had mild food allergies which kept him feeling unwell, and he needed encouragement to eat. "The Barbary Pirates" (1953) is a children's history of early 19th-century pirates.
Forester appeared as a contestant on the television quiz program "You Bet Your Life" hosted by Groucho Marx, in an episode broadcast on 1 November 1956. A previously unknown novel of Forester's entitled "The Pursued" was discovered in 2003 and published by Penguin Classics on 3 November 2011.
He married Kathleen Belcher in 1926 and they had two sons, John and George Forester. The couple divorced in 1945. In 1947, he married Dorothy Foster.
John Forester wrote a two-volume biography of his father, including many elements of Forester's life which only became clear to his son after his death. | https://en.wikipedia.org/wiki?curid=5769 |
List of country calling codes
Country calling codes or country dial-in codes are telephone number prefixes for reaching telephone subscribers in the networks of the member countries or regions of the International Telecommunication Union (ITU). The codes are defined by the ITU-T in standards E.123 and E.164. The prefixes enable international direct dialing (IDD), and are also referred to as "international subscriber dialing" (ISD) codes.
Country codes are a component of the international telephone numbering plan, and are necessary only when dialing a telephone number to establish a call to another country. Country codes are dialed before the national telephone number. By convention, international telephone numbers are represented by prefixing the country code with a plus sign (+), which also indicates to the subscriber that the local international call prefix must first be dialed. For example, the international call prefix in all countries belonging to the North American Numbering Plan is 011, while it is 00 in most European, Asian and African countries. On GSM (cellular) networks, the prefix may automatically be inserted when the user prefixes a dialed number with the plus sign.
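The plus-sign convention means that a number stored in international format can be converted mechanically into the digits a subscriber actually dials, once the caller's international call prefix is known. The following Python sketch illustrates the substitution using the two example prefixes cited above; the function name and the two-entry prefix table are illustrative assumptions, not part of any standard library:

```python
# Convert an international number (e.g. "+44 20 7946 0958") into the
# string a subscriber would dial from a given country, by replacing the
# leading "+" with the caller's international call prefix (IDD).
# Only the two prefixes mentioned in the text are included here.

IDD_PREFIXES = {
    "NANP": "011",  # North American Numbering Plan countries
    "EUR": "00",    # most European, Asian and African countries
}

def to_dialable(number: str, caller_zone: str) -> str:
    digits = "".join(ch for ch in number if ch.isdigit() or ch == "+")
    if not digits.startswith("+"):
        raise ValueError("expected a number in international (+) format")
    return IDD_PREFIXES[caller_zone] + digits[1:]

print(to_dialable("+44 20 7946 0958", "NANP"))  # 011442079460958
print(to_dialable("+44 20 7946 0958", "EUR"))   # 00442079460958
```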
Country calling codes are prefix codes and can be organized as a tree. In each row of the table, the country codes given in the left-most column share the same first digit; then subsequent columns give the second digit in ascending order.
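Because no assigned country code is a prefix of another, a dialed digit string can be decoded greedily, and the tree organization described above corresponds directly to a trie. A minimal Python sketch, using a handful of real assignments for illustration:

```python
# Build a digit trie from a few sample country codes and split a raw
# digit string into (country code, national number, assignee).

SAMPLE_CODES = {"1": "NANP", "7": "Russia/Kazakhstan", "33": "France",
                "44": "United Kingdom", "354": "Iceland"}

def build_trie(codes):
    root = {}
    for code, name in codes.items():
        node = root
        for digit in code:
            node = node.setdefault(digit, {})
        node["name"] = name  # marks the end of a complete code
    return root

def split_number(digits, trie):
    node, code = trie, ""
    for digit in digits:
        if digit not in node:
            break
        node = node[digit]
        code += digit
        if "name" in node:  # prefix property: the first match is the code
            return code, digits[len(code):], node["name"]
    raise ValueError("no known country code")

print(split_number("3541234567", build_trie(SAMPLE_CODES)))
# ('354', '1234567', 'Iceland')
```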
While there is a general geographic grouping to the zones, some exceptions exist for political and historical reasons. Thus, the geographical indicators below are approximations only.
Member countries of the North American Numbering Plan (NANP) are assigned three-digit area codes under the common country prefix "1", shown in the format "+1 XXX".
The North American Numbering Plan includes:
Originally, larger countries such as Spain, the United Kingdom and France, were assigned two-digit codes to compensate for their usually longer domestic numbers. Small countries, such as Iceland, were assigned three-digit codes. Since the 1980s, all new assignments have been three-digit regardless of countries' populations.
In Antarctica, dialing is dependent on the parent country of each base:
Other places with no country codes in use, although a code may be reserved: | https://en.wikipedia.org/wiki?curid=5770 |
Christopher Marlowe
Christopher Marlowe, also known as Kit Marlowe (baptised 26 February 1564 – 30 May 1593), was an English playwright, poet and translator of the Elizabethan era. Modern scholars count Marlowe among the most famous of the Elizabethan playwrights and, based upon the "many imitations" of his play "Tamburlaine", consider him to have been the foremost dramatist in London in the years just before his mysterious early death. Some scholars also believe that he greatly influenced William Shakespeare, who was baptised in the same year as Marlowe and later became the pre-eminent Elizabethan playwright. Marlowe's plays are among the first to use blank verse, which became the standard for the era, and are distinguished by their overreaching protagonists. Themes found within Marlowe's literary works have been noted as humanistic with realistic emotions, which some scholars find difficult to reconcile with Marlowe's "anti-intellectualism" and his catering to the taste of his Elizabethan audiences for generous displays of extreme physical violence, cruelty, and bloodshed.
Events in Marlowe's life were sometimes as extreme as those found in his dramas. Reports of Marlowe's death in 1593 were particularly infamous in his day and are contested by scholars today due to a lack of good documentation. Traditionally, the playwright's death has been blamed on a long list of conjectures, including a barroom fight, a church libel, homosexual intrigue, betrayal by another playwright, and espionage from the highest level: Elizabeth I of England's Privy Council. The official coroner's account of Marlowe's death was only revealed in 1925, but it did little to persuade all scholars that it told the whole story, nor did it eliminate the uncertainties present in his biography.
Christopher Marlowe was born to Canterbury shoemaker John Marlowe and his wife Katherine, daughter of William Arthur of Dover. He was baptised on 26 February 1564 at St. George's Church, Canterbury. Marlowe's birth was likely to have been a few days before, making him about two months older than William Shakespeare, who was baptised on 26 April 1564 in Stratford-upon-Avon.
Marlowe attended The King's School, Canterbury, on scholarship from age 14 and, two years later, Corpus Christi College, Cambridge, where he also studied on scholarship and received his Bachelor of Arts degree in 1584. In 1587, the university hesitated to award his Master of Arts degree because of a rumour that he intended to go to the English seminary at Rheims in northern France, presumably to prepare for ordination as a Roman Catholic priest. If true, such an action on his part would have been a direct violation of a royal edict issued by Queen Elizabeth I in 1585 criminalizing any attempt by an English citizen to be ordained in the Roman Catholic Church.
Large-scale violence between Protestants and Catholics on the European continent has been cited by scholars as the impetus for the Protestant English Queen's defensive anti-Catholic laws issued from 1581 until her death in 1603. Despite the dire implications for Marlowe, his degree was awarded on schedule when the Privy Council intervened on his behalf, commending him for his "faithful dealing" and "good service" to the Queen. The nature of Marlowe's service was not specified by the Council, but its letter to the Cambridge authorities has provoked much speculation by modern scholars, notably the theory that Marlowe was operating as a secret agent for Privy Council member Sir Francis Walsingham. The only surviving evidence of the Privy Council's correspondence is found in their minutes, the letter being lost. There is no mention of espionage in the minutes, but their summation of the lost letter is vague in meaning, stating that "it was not Her Majesties pleasure" that persons employed as Marlowe had been "in matters touching the benefit of his country should be defamed by those who are ignorant in th'affaires he went about." Scholars agree the vague wording was typically used to protect government agents, but they continue to debate what the "matters touching the benefit of his country" actually were in Marlowe's case and how they affected the 23-year-old writer as he launched his literary career in 1587.
Six dramas have been attributed to the authorship of Christopher Marlowe, either alone or in collaboration with other writers, with varying degrees of evidence. The writing sequence or chronology of these plays is mostly unknown and is offered here with whatever dates and evidence are known. From the little information available, "Dido" is believed to be the first Marlowe play performed, while "Tamburlaine", in 1587, was the first to be performed on a regular commercial stage in London. Believed by many scholars to be Marlowe's greatest success, "Tamburlaine" was the first English play to bring blank verse to the popular stage and, with Thomas Kyd's "The Spanish Tragedy", is generally considered the beginning of the mature phase of the Elizabethan theatre.
Works
(The dates of composition are approximate.)
The play "Lust's Dominion" was attributed to Marlowe upon its initial publication in 1657, though scholars and critics have almost unanimously rejected the attribution. He may also have written or co-written "Arden of Faversham".
Publication and responses to the poetry and translations credited to Marlowe primarily occurred posthumously, including:
Modern scholars still look for evidence of collaborations between Marlowe and other writers. In 2016, one publisher became the first to endorse the scholarly claim of a collaboration between Marlowe and the playwright William Shakespeare: the New Oxford Shakespeare credited Marlowe as co-author of Shakespeare's three "Henry VI" plays.
Marlowe's plays were enormously successful, possibly due to the imposing stage presence of his lead actor, Edward Alleyn. Alleyn was unusually tall for the time and the haughty roles of Tamburlaine, Faustus and Barabas were probably written for him. Marlowe's plays were the foundation of the repertoire of Alleyn's company, the Admiral's Men, throughout the 1590s. One of Marlowe's poetry translations did not fare as well. In 1599, Marlowe's translation of Ovid was banned and copies were publicly burned as part of Archbishop Whitgift's crackdown on offensive material.
There are at least two major modern scholarly editions of the collected works of Christopher Marlowe:
There are also notable scholarly collections of essays concerning the collected works of Christopher Marlowe, including:
This is a possible chronology of composition for the dramatic works of Christopher Marlowe based upon dates previously cited. The dates of composition are approximate. There are other chronologies for Marlowe, including one based upon dates of printing, as was used in the 2004 "Cambridge Companion to Christopher Marlowe", edited by Patrick Cheney.
As with other Elizabethans, little is known about Marlowe's adult life. All available evidence, other than what can be deduced from his literary works, is found in legal records and other official documents. This has not stopped writers of fiction and non-fiction from speculating about his professional activities, private life and character. Marlowe has often been described as a spy, a brawler and a heretic, as well as a "magician", "duellist", "tobacco-user", "counterfeiter" and "rakehell". While J. A. Downie and Constance Kuriyama have argued against the more lurid speculation, it is the usually circumspect J. B. Steane who remarked, "it seems absurd to dismiss all of these Elizabethan rumours and accusations as 'the Marlowe myth'". Much has been written about his brief adult life, from 1587 to 1593, including speculation about his involvement in royally sanctioned espionage, his vocal declaration of atheism, his private (and possibly same-gender) sexual interests, and the puzzling circumstances surrounding his death.
Marlowe is alleged to have been a government spy. The authors Park Honan and Charles Nicholl speculate this was the case and suggest that Marlowe's recruitment took place when he was at Cambridge. In 1587, when the Privy Council ordered the University of Cambridge to award Marlowe his degree as Master of Arts, they denied rumours that he intended to go to the English Catholic college in Rheims, saying instead that he had been engaged in unspecified "affaires" on "matters touching the benefit of his country". Surviving college records from the period also indicate that in the academic year 1584–1585, Marlowe had a series of unusually lengthy absences from the university which violated university regulations. Surviving college buttery accounts, which record student purchases of personal provisions, show Marlowe began spending lavishly on food and drink during the periods he was in attendance; the amount was more than he could have afforded on his known scholarship income.
It has been speculated that Marlowe was the "Morley" who was tutor to Arbella Stuart in 1589. This possibility was first raised in a "Times Literary Supplement" letter by E. St John Brooks in 1937; in a letter to "Notes and Queries", John Baker has added that only Marlowe could have been Arbella's tutor, given the absence of any other known "Morley" from the period with an MA who was not otherwise occupied. If Marlowe was Arbella's tutor, it might indicate that he was there as a spy, since Arbella, niece of Mary, Queen of Scots, and cousin of James VI of Scotland, later James I of England, was at the time a strong candidate for the succession to Elizabeth's throne. Frederick S. Boas dismisses the possibility of this identification, based on surviving legal records which document Marlowe's "residence in London between September and December 1589". Marlowe had been party to a fatal quarrel involving his neighbours and the poet Thomas Watson in Norton Folgate, and he was held in Newgate Prison for a fortnight. In fact, the quarrel and his arrest were on 18 September; he was released on bail on 1 October and had to attend court, where he was acquitted on 3 December, but there is no record of where he was for the intervening two months.
In 1592 Marlowe was arrested in the English garrison town of Flushing (Vlissingen) in the Netherlands for his alleged involvement in the counterfeiting of coins, presumably related to the activities of seditious Catholics. He was sent to the Lord Treasurer (Burghley), but no charge or imprisonment resulted. This arrest may have disrupted another of Marlowe's spying missions, perhaps one in which the resulting coinage was to be passed to the Catholic cause: he was to infiltrate the followers of the active Catholic plotter William Stanley and report back to Burghley.
Marlowe was reputed to be an atheist, which held the dangerous implication of being an enemy of God and the state, by association. With the rise of public fears concerning The School of Night, or "School of Atheism" in the late 16th century, accusations of atheism were closely associated with disloyalty to the Protestant monarchy of England.
Some modern historians consider that Marlowe's professed atheism, as with his supposed Catholicism, may have been no more than a sham to further his work as a government spy. Contemporary evidence comes from Marlowe's accuser in Flushing, an informer called Richard Baines. The governor of Flushing had reported that each of the men had "of malice" accused the other of instigating the counterfeiting and of intending to go over to the Catholic "enemy"; such an action was considered atheistic by the Church of England. Following Marlowe's arrest in 1593, Baines submitted to the authorities a "note containing the opinion of one Christopher Marly concerning his damnable judgment of religion, and scorn of God's word". Baines attributes to Marlowe a total of eighteen items which "scoff at the pretensions of the Old and New Testament" such as, "Christ was a bastard and his mother dishonest [unchaste]", "the woman of Samaria and her sister were whores and that Christ knew them dishonestly", "St John the Evangelist was bedfellow to Christ and leaned always in his bosom" (cf. John 13:23–25) and "that he used him as the sinners of Sodom". He also implied that Marlowe had Catholic sympathies. Other passages are merely sceptical in tone: "he persuades men to atheism, willing them not to be afraid of bugbears and hobgoblins". The final paragraph of Baines's document reads:
Similar examples of Marlowe's statements were given by Thomas Kyd after his imprisonment and possible torture (see above); Kyd and Baines connect Marlowe with the mathematician Thomas Harriot's and Sir Walter Raleigh's circle. Another document claimed about that time that "one Marlowe is able to show more sound reasons for Atheism than any divine in England is able to give to prove divinity, and that ... he hath read the Atheist lecture to Sir Walter Raleigh and others".
Some critics believe that Marlowe sought to disseminate these views in his work and that he identified with his rebellious and iconoclastic protagonists. Plays had to be approved by the Master of the Revels before they could be performed and the censorship of publications was under the control of the Archbishop of Canterbury. Presumably these authorities did not consider any of Marlowe's works to be unacceptable other than the "Amores".
Marlowe is believed to have been homosexual. Some scholars argue that the identification of an Elizabethan as gay or homosexual in a modern sense is "anachronistic," claiming that for the Elizabethans the terms were more likely to have been applied to sexual acts rather than to what we understand to be exclusive sexual orientations and identities. Other scholars argue that the evidence is inconclusive and that the reports of Marlowe's homosexuality may be rumours produced after his death. Richard Baines reported Marlowe as saying: "all they that love not Tobacco & Boies were fools". David Bevington and Eric Rasmussen describe Baines's evidence as "unreliable testimony" and "These and other testimonials need to be discounted for their exaggeration and for their having been produced under legal circumstances we would regard as a witch-hunt".
J. B. Steane remarked that he considered there to be "no evidence for Marlowe's homosexuality at all". Other scholars point to homosexual themes in Marlowe's writing: in "Hero and Leander", Marlowe writes of the male youth Leander: "in his looks were all that men desire..." "Edward the Second" contains the following passage enumerating homosexual relationships:
Marlowe wrote the only play about the life of Edward II up to his time, taking the humanist literary discussion of male sexuality much further than his contemporaries. The play was extremely bold, dealing with a star-crossed love story between Edward II and Piers Gaveston. Though it was a common practice at the time to reveal characters as gay to give audiences reason to suspect them as culprits in a crime, Christopher Marlowe's Edward II is portrayed as a sympathetic character.
In early May 1593, several bills were posted about London threatening Protestant refugees from France and the Netherlands who had settled in the city. One of these, the "Dutch church libel", written in rhymed iambic pentameter, contained allusions to several of Marlowe's plays and was signed, "Tamburlaine". On 11 May the Privy Council ordered the arrest of those responsible for the libels. The next day, Marlowe's colleague Thomas Kyd was arrested, his lodgings were searched and a three-page fragment of a heretical tract was found. In a letter to Sir John Puckering, Kyd asserted that it had belonged to Marlowe, with whom he had been writing "in one chamber" some two years earlier. In a second letter, Kyd described Marlowe as blasphemous, disorderly, holding treasonous opinions, being an irreligious reprobate and "intemperate & of a cruel hart". They had both been working for an aristocratic patron, probably Ferdinando Stanley, Lord Strange. A warrant for Marlowe's arrest was issued on 18 May, when the Privy Council apparently knew that he might be found staying with Thomas Walsingham, whose father was a first cousin of the late Sir Francis Walsingham, Elizabeth's principal secretary in the 1580s and a man more deeply involved in state espionage than any other member of the Privy Council. Marlowe duly presented himself on 20 May but, there apparently being no Privy Council meeting on that day, was instructed to "give his daily attendance on their Lordships, until he shall be licensed to the contrary". On Wednesday, 30 May, Marlowe was killed.
Various accounts of Marlowe's death were current over the next few years. In his "Palladis Tamia", published in 1598, Francis Meres says Marlowe was "stabbed to death by a bawdy serving-man, a rival of his in his lewd love" as punishment for his "epicurism and atheism". In 1917, in the "Dictionary of National Biography", Sir Sidney Lee wrote that Marlowe was killed in a drunken fight, and this is still often stated as fact today. The official account came to light only in 1925, when the scholar Leslie Hotson discovered the coroner's report of the inquest on Marlowe's death, held two days later on Friday 1 June 1593 by the Coroner of the Queen's Household, William Danby. Marlowe had spent all day in a house in Deptford owned by the widow Eleanor Bull, together with three men: Ingram Frizer, Nicholas Skeres and Robert Poley. All three had been employed by one or other of the Walsinghams. Skeres and Poley had helped snare the conspirators in the Babington plot, and Frizer would later describe Thomas Walsingham as his "master" at that time, although his role was probably more that of a financial or business agent, as he was for Walsingham's wife Audrey a few years later. These witnesses testified that Frizer and Marlowe had argued over payment of the bill (now famously known as the 'Reckoning'), exchanging "divers malicious words" while Frizer was sitting at a table between the other two and Marlowe was lying behind him on a couch. Marlowe snatched Frizer's dagger and wounded him on the head. In the ensuing struggle, according to the coroner's report, Marlowe was stabbed above the right eye, killing him instantly. The jury concluded that Frizer acted in self-defence, and within a month he was pardoned. Marlowe was buried in an unmarked grave in the churchyard of St. Nicholas, Deptford, immediately after the inquest, on 1 June 1593.
The complete text of the inquest report was published by Leslie Hotson in his book, "The Death of Christopher Marlowe", in the introduction to which Prof. George Kittredge said "The mystery of Marlowe's death, heretofore involved in a cloud of contradictory gossip and irresponsible guess-work, is now cleared up for good and all on the authority of public records of complete authenticity and gratifying fullness", but this confidence proved fairly short-lived. Hotson had considered the possibility that the witnesses had "concocted a lying account of Marlowe's behaviour, to which they swore at the inquest, and with which they deceived the jury" but came down against that scenario. Others began to suspect that this was indeed the case. Writing to the "TLS" shortly after the book's publication, Eugénie de Kalb disputed that the struggle and outcome as described were even possible, and Samuel A. Tannenbaum insisted the following year that such a wound could not possibly have resulted in instant death, as had been claimed. Even Marlowe's biographer John Bakeless acknowledged that "some scholars have been inclined to question the truthfulness of the coroner's report. There is something queer about the whole episode", and said that Hotson's discovery "raises almost as many questions as it answers". It has also been discovered more recently that the apparent absence of a local county coroner to accompany the Coroner of the Queen's Household would, if noticed, have made the inquest null and void.
One of the main reasons for doubting the truth of the inquest concerns the reliability of Marlowe's companions as witnesses. As an "agent provocateur" for the late Sir Francis Walsingham, Robert Poley was a consummate liar, the "very genius of the Elizabethan underworld" and is on record as saying "I will swear and forswear myself, rather than I will accuse myself to do me any harm". The other witness, Nicholas Skeres, had for many years acted as a confidence trickster, drawing young men into the clutches of people in the money-lending racket, including Marlowe's apparent killer, Ingram Frizer, with whom he was engaged in such a swindle. Despite their being referred to as "generosi" (gentlemen) in the inquest report, the witnesses were professional liars. Some biographers, such as Kuriyama and Downie, take the inquest to be a true account of what occurred but in trying to explain what really happened if the account was "not" true, others have come up with a variety of murder theories.
Since there are only written documents on which to base any conclusions and since it is probable that the most crucial information about his death was never committed to paper, it is unlikely that the full circumstances of Marlowe's death will ever be known.
For his contemporaries in the literary world, Marlowe was above all an admired and influential artist. Within weeks of his death, George Peele remembered him as "Marley, the Muses' darling"; Michael Drayton noted that he "Had in him those brave translunary things / That the first poets had" and Ben Jonson wrote of "Marlowe's mighty line". Thomas Nashe wrote warmly of his friend, "poor deceased Kit Marlowe," as did the publisher Edward Blount in his dedication of "Hero and Leander" to Sir Thomas Walsingham. Among the few contemporary dramatists to say anything negative about Marlowe was the anonymous author of the Cambridge University play "The Return from Parnassus" (1598) who wrote, "Pity it is that wit so ill should dwell, / Wit lent from heaven, but vices sent from hell".
The most famous tribute to Marlowe was paid by Shakespeare in "As You Like It", where he not only quotes a line from "Hero and Leander" ("Dead Shepherd, now I find thy saw of might, / 'Who ever lov'd that lov'd not at first sight?'") but also gives the clown Touchstone the words "When a man's verses cannot be understood, nor a man's good wit seconded with the forward child, understanding, it strikes a man more dead than a great reckoning in a little room". This appears to be a reference to Marlowe's murder, which involved a fight over the "reckoning", the bill, as well as to a line in Marlowe's "Jew of Malta": "Infinite riches in a little room".
Shakespeare was much influenced by Marlowe in his work, as can be seen in the use of Marlovian themes in "Antony and Cleopatra", "The Merchant of Venice", "Richard II" and "Macbeth" ("Dido", "Jew of Malta", "Edward II" and "Doctor Faustus", respectively). In "Hamlet", after meeting with the travelling actors, Hamlet requests the Player perform a speech about the Trojan War, which at 2.2.429–32 has an echo of Marlowe's "Dido, Queen of Carthage". In "Love's Labour's Lost" Shakespeare brings on a character "Marcade" (three syllables) in conscious acknowledgement of Marlowe's character "Mercury", also attending the King of Navarre, in "Massacre at Paris". The significance, to those of Shakespeare's audience who were familiar with "Hero and Leander", was Marlowe's identification of himself with the god Mercury.
It has been argued that Marlowe may have faked his death and then continued to write under the assumed name of William Shakespeare. However, orthodox academic consensus rejects alternative candidates for the authorship of Shakespeare's plays and sonnets, including Marlowe.
A Marlowe Memorial in the form of a bronze sculpture of "The Muse of Poetry" by Edward Onslow Ford was erected by subscription in Buttermarket, Canterbury in 1891. In July 2002, a memorial window to Marlowe, a gift of the Marlowe Society, was unveiled in Poets' Corner in Westminster Abbey. Controversially, a question mark was added to the generally accepted date of death. On 25 October 2011 a letter from Paul Edmondson and Stanley Wells was published by "The Times" newspaper, in which they called on the Dean and Chapter to remove the question mark on the grounds that it "flew in the face of a mass of unimpugnable evidence". In 2012, they renewed this call in their e-book "Shakespeare Bites Back", adding that it "denies history" and again the following year in their book "Shakespeare Beyond Doubt".
The Marlowe Theatre in Canterbury, Kent, UK, was named in 1949 after the town's most famous resident, the playwright Christopher Marlowe. Originally housed in a former 1920s cinema on St. Margaret's Street, the Marlowe Theatre later moved to a newly converted 1930s-era Odeon Cinema in the city. Since reopening in 2011 with a newly enhanced, state-of-the-art facility, the Marlowe has hosted some of the country's finest touring companies, including Glyndebourne Opera, the Royal Shakespeare Company and the Royal National Theatre, as well as many major West End musicals.
Marlowe has been used as a character in books, theatre, film, television and radio.
Modern productions of the plays of Christopher Marlowe have increased in frequency throughout the twentieth and twenty-first centuries, including the following notable productions: | https://en.wikipedia.org/wiki?curid=5771 |
Caving
Caving – also known as spelunking in the United States and Canada and potholing in the United Kingdom and Ireland – is the recreational pastime of exploring wild cave systems (as distinguished from show caves). In contrast, speleology is the scientific study of caves and the cave environment.
The challenges involved in caving vary according to the cave being visited; in addition to the total absence of light beyond the entrance, negotiating pitches, squeezes, and water hazards can be difficult. Cave diving is a distinct, and more hazardous, sub-speciality undertaken by a small minority of technically proficient cavers. In an area of overlap between recreational pursuit and scientific study, the most devoted and serious-minded cavers become accomplished at surveying and mapping caves and formally publishing their efforts. Such surveys are usually published freely and publicly, especially in the UK and other European countries, although in the US they are generally kept private.
Sometimes categorized as an "extreme sport", it is not commonly considered as such by longtime enthusiasts, who may dislike the term for its connotation of disregard for safety.
Many caving skills overlap with those involved in canyoning and mine and urban exploration.
Caving is often undertaken for the enjoyment of the outdoor activity or for physical exercise, as well as original exploration, similar to mountaineering or diving. Physical or biological science is also an important goal for some cavers, while others are engaged in cave photography. Virgin cave systems comprise some of the last unexplored regions on Earth and much effort is put into trying to locate, enter and survey them. In well-explored regions (such as most developed nations), the most accessible caves have already been explored, and gaining access to new caves often requires cave digging or cave diving.
Caving, in certain areas, has also been utilized as a form of eco- and adventure tourism, for example in New Zealand. Tour companies have established an industry of leading and guiding tours into and through caves. Depending on the type of cave and the type of tour, the experience can be adventure-based or ecologically based. Some tours are led through lava tubes by a guiding service, for example at Lava River Cave and on the oceanic islands of Tenerife, Iceland and Hawaii.
Caving has also been described by some as an "individualist's team sport", as cavers can often make a trip without direct physical assistance from others but will generally go in a group for companionship or to provide emergency help if needed. Some, however, consider the assistance cavers give each other to be typical team sport activity.
The base term "caving" comes from the Latin "cavea" or "caverna", meaning simply, a cave.
"Potholing" refers to the act of exploring "potholes", a word originating in the north of England for predominantly vertical caves.
Clay Perry, an American caver of the 1940s, wrote about a group of men and boys who explored and studied caves throughout New England. This group referred to themselves as "spelunkers", a term derived from the Latin "spēlunca" ("cave, cavern, den"), itself from the Greek "spēlynks" ("cave"). This is regarded as the first use of the word in the Americas. Throughout the 1950s, "spelunking" was the general term used for exploring caves in US English. It was used freely, without any positive or negative connotations, although only rarely outside the US.
In the 1960s, the terms "spelunking" and "spelunker" began to be considered déclassé among experienced enthusiasts. In 1985, Steve Knutson – editor of the National Speleological Society (NSS) publication "American Caving Accidents" – made the following distinction:
This sentiment is exemplified by bumper stickers and T-shirts displayed by some cavers: "Cavers rescue spelunkers". Nevertheless, outside the caving community, "spelunking" and "spelunkers" predominately remain neutral terms referring to the practice and practitioners, without any respect to skill level.
In the mid-nineteenth century, John Birkbeck explored potholes in England, notably Gaping Gill in 1842 and Alum Pot in 1847–8, returning there in the 1870s. In the mid-1880s, Herbert E. Balch began exploring Wookey Hole Caves and in the 1890s, Balch was introduced to the caves of the Mendip Hills. One of the oldest established caving clubs, Yorkshire Ramblers' Club, was founded in 1892.
Caving as a specialized pursuit was pioneered by Édouard-Alfred Martel (1859–1938), who first achieved the descent and exploration of the Gouffre de Padirac, in France, as early as 1889 and the first complete descent of a 110-metre wet vertical shaft at Gaping Gill in 1895. He developed his own techniques based on ropes and metallic ladders. Martel visited Kentucky and notably Mammoth Cave National Park in October 1912. In the 1920s famous US caver Floyd Collins made important explorations in the area and in the 1930s, as caving became increasingly popular, small exploration teams both in the Alps and in the karstic high plateaus of southwest France (Causses and Pyrenees) transformed cave exploration into both a scientific and recreational activity. Robert de Joly, Guy de Lavaur and Norbert Casteret were prominent figures of that time, surveying mostly caves in Southwest France. During World War II, an alpine team composed of Pierre Chevalier, Fernand Petzl, Charles Petit-Didier and others explored the Dent de Crolles cave system near Grenoble, which became the deepest explored system in the world (-658m) at that time. The lack of available equipment during the war forced Pierre Chevalier and the rest of the team to develop their own equipment, leading to technical innovation. The scaling-pole (1940), nylon ropes (1942), use of explosives in caves (1947) and mechanical rope-ascenders (Henri Brenot's "monkeys", first used by Chevalier and Brenot in a cave in 1934) can be directly associated to the exploration of the Dent de Crolles cave system.
In 1941, American cavers organized themselves into the National Speleological Society (NSS) to advance the exploration, conservation, study and understanding of caves in the United States. American caver Bill Cuddington, known as "Vertical Bill", further developed the single-rope technique (SRT) in the late 1950s. In 1958, two Swiss alpinists, Juesi and Marti teamed together, creating the first rope ascender known as the Jumar. In 1968 Bruno Dressler asked Fernand Petzl, who worked as a metals machinist, to build a rope-ascending tool, today known as the Petzl Croll, that he had developed by adapting the Jumar to vertical caving. Pursuing these developments, Petzl started in the 1970s a caving equipment manufacturing company named Petzl. The development of the rappel rack and the evolution of mechanical ascension systems extended the practice and safety of vertical exploration to a wider range of cavers.
Hard hats are worn to protect the head from bumps and falling rocks. The caver's primary light source is usually mounted on the helmet in order to keep the hands free. Electric LED lights are most common. Many cavers carry two or more sources of light – one as primary and the others as backup in case the first fails. More often than not, a second light will be mounted to the helmet for quick transition if the primary fails. Carbide lamp systems are an older form of illumination, inspired by miner's equipment, and are still used by some cavers, particularly on remote expeditions where electric charging facilities are not available.
The type of clothes worn underground varies according to the environment of the cave being explored, and the local culture. In cold caves, the caver may wear a warm base layer that retains its insulating properties when wet, such as a fleece ("furry") suit or polypropylene underwear, and an oversuit of hard-wearing (e.g., cordura) or waterproof (e.g., PVC) material. Lighter clothing may be worn in warm caves, particularly if the cave is dry, and in tropical caves thin polypropylene clothing is used, to provide some abrasion protection while remaining as cool as possible. Wetsuits may be worn if the cave is particularly wet or involves stream passages. On the feet boots are worn – hiking-style boots in drier caves, or rubber boots (such as wellies) often with neoprene socks ("wetsocks") in wetter caves. Knee-pads (and sometimes elbow-pads) are popular for protecting joints during crawls. Depending on the nature of the cave, gloves are sometimes worn to protect the hands against abrasion or cold. In pristine areas and for restoration, clean oversuits and powder-free, non-latex surgical gloves are used to protect the cave itself from contaminants.
Ropes are used for descending or ascending pitches (single rope technique or SRT) or for protection. Knots commonly used in caving are the figure-of-eight- (or figure-of-nine-) loop, bowline, alpine butterfly, and Italian hitch. Ropes are usually rigged using bolts, slings, and carabiners. In some cases cavers may choose to bring and use a flexible metal ladder.
In addition to the equipment already described, cavers frequently carry packs containing first-aid kits, emergency equipment, and food. Containers for securely transporting urine are also commonly carried; on longer trips, containers for carrying feces out of the cave are carried as well.
During very long trips, it may be necessary to camp in the cave – some cavers have stayed underground for many days, or in particularly extreme cases, for weeks at a time. This is particularly the case when exploring or mapping very extended cave systems, where it would be impractical to retrace the route back to the surface regularly. Such long trips necessitate the cavers carrying provisions, sleeping and cooking equipment.
Caves can be dangerous places; hypothermia, falling, flooding, falling rocks and physical exhaustion are the main risks. Rescuing people from underground is difficult and time-consuming, and requires special skills, training, and equipment. Full-scale cave rescues often involve the efforts of dozens of rescue workers (often other long-time cavers who have participated in specialized courses, as normal rescue staff are not sufficiently experienced in cave environments), who may themselves be put in jeopardy in effecting the rescue. This said, caving is not necessarily a high-risk sport (especially if it does not involve difficult climbs or diving). As in all physical sports, knowing one's limitations is key.
Caving in warmer climates carries the risk of contracting histoplasmosis, a fungal infection that is contracted from bird or bat droppings. It can cause pneumonia and can disseminate in the body to cause continued infections.
In many parts of the world, leptospirosis, a type of bacterial infection spread by animals including rats, is a distinct threat due to the presence of rat urine in rainwater or other precipitation that enters the cave's water system. Complications are uncommon but can be serious.
Safety risks while caving can be minimized by using a number of techniques:
Many cave environments are very fragile. Many speleothems can be damaged by even the slightest touch and some by impacts as slight as a breath. Research suggests that increased carbon dioxide levels can lead to "a higher equilibrium concentration of calcium within the drip waters feeding the speleothems, and hence causes dissolution of existing features." In 2008, researchers found evidence that respiration from cave visitors may generate elevated carbon dioxide concentrations in caves, leading to increased temperatures of up to 3 °C and a dissolution of existing features.
Pollution is also of concern. Since water that flows through a cave eventually comes out in streams and rivers, any pollution may ultimately end up in someone's drinking water, and can even seriously affect the surface environment, as well. Even minor pollution such as dropping organic material can have a dramatic effect on the cave biota.
Cave-dwelling species are also very fragile, and often, a particular species found in a cave may live within that cave alone, and be found nowhere else in the world, such as Alabama cave shrimp. Cave-dwelling species are accustomed to a near-constant climate of temperature and humidity, and any disturbance can be disruptive to the species' life cycles. Though cave wildlife may not always be immediately visible, it is typically nonetheless present in most caves.
Bats are one such fragile species of cave-dwelling animal. Bats which hibernate are most vulnerable during the winter season, when no food supply exists on the surface to replenish the bat's store of energy should it be awakened from hibernation. Bats which migrate are most sensitive during the summer months, when they are raising their young. For these reasons, visiting caves inhabited by hibernating bats is discouraged during cold months, and visiting caves inhabited by migratory bats is discouraged during the warmer months when they are most sensitive and vulnerable. Due to white nose syndrome (WNS), an affliction affecting bats in the northeastern US, the US Fish & Wildlife Service has called for a moratorium, effective March 26, 2009, on caving activity in states known to have hibernacula (MD, NY, VT, NH, MA, CT, NJ, PA, VA, and WV) affected by WNS, as well as in adjoining states.
Some cave passages may be marked with flagging tape or other indicators to show biologically, aesthetically, or archaeologically sensitive areas. Marked paths may show ways around notably fragile areas such as a pristine floor of sand or silt which may be thousands of years old, dating from the last time water flowed through the cave. Such deposits may easily be spoiled forever by a single misplaced step. Active formations such as flowstone can be similarly marred with a muddy footprint or handprint, and ancient human artifacts, such as fiber products, may even crumble to dust under all but the most gentle touch.
In 1988, concerned that cave resources were becoming increasingly damaged through unregulated use, Congress enacted the Federal Cave Resources Protection Act, giving land management agencies in the United States expanded authority to manage cave conservation on public land.
In Europe there have been some panoramic 360° records and VR projects as a means of sharing interesting caves or quarries:
Cavers in many countries have created organizations for the administration and oversight of caving activities within their nations. The oldest of these is the French Federation of Speleology (originally Société de spéléologie) founded by Édouard-Alfred Martel in 1895, which produced the first periodical journal in speleology, "Spelunca". The first University-based speleological institute in the world was founded in 1920 in Cluj-Napoca, Romania, by Emil Racovita, a Romanian biologist, zoologist, speleologist and explorer of Antarctica.
The British Speleological Association was established in 1935 and the National Speleological Society in the US was founded in 1941 (originally formed as the Speleological Society of the District of Columbia on May 6, 1939).
An international speleological congress was proposed at a meeting in Valence-sur-Rhone, France in 1949 and first held in 1953 in Paris. The International Union of Speleology (UIS) was founded in 1965. | https://en.wikipedia.org/wiki?curid=5776 |
Cave
A cave or cavern is a natural void in the ground, specifically a space large enough for a human to enter. Caves often form by the weathering of rock and often extend deep underground. The word "cave" can also refer to much smaller openings such as sea caves, rock shelters, and grottos, though strictly speaking a cave is exogene, meaning it is deeper than its opening is wide, and a rock shelter is endogene.
Speleology is the science of exploration and study of all aspects of caves and the cave environment. Visiting or exploring caves for recreation may be called "caving", "potholing", or "spelunking".
The formation and development of caves is known as "speleogenesis"; it can occur over the course of millions of years. Caves can range widely in size, and are formed by various geological processes. These may involve a combination of chemical processes, erosion by water, tectonic forces, microorganisms, pressure, and atmospheric influences. Isotopic dating techniques can be applied to cave sediments, to determine the timescale of the geological events which formed and shaped present-day caves.
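The isotopic techniques referred to here rest on exponential radioactive decay. For a parent isotope with decay constant λ, and assuming no initial daughter product, the age follows from the measured daughter-to-parent ratio; the specific systems applied to cave deposits, such as uranium–thorium dating of speleothems, add corrections to this generic form:

```latex
N(t) = N_0 e^{-\lambda t}, \qquad t_{1/2} = \frac{\ln 2}{\lambda}, \qquad
t = \frac{1}{\lambda} \ln\!\left(1 + \frac{D}{N}\right)
```

where N is the number of parent atoms remaining and D the number of daughter atoms produced.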
It is estimated that pressure from overlying rock places a limit on how deep beneath the surface an open cave can exist. This does not, however, impose a maximum depth for a cave as measured from its highest entrance to its lowest point, since the amount of rock above the lowest point depends on the topography of the landscape above it. For karst caves the maximum depth is determined by the lower limit of karst-forming processes, coinciding with the base of the soluble carbonate rocks. Most caves are formed in limestone by dissolution.
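The depth limit just mentioned follows from lithostatic pressure, which grows linearly with depth. As a rough illustrative check (the 3 km depth and the rock density of 2,700 kg/m³ are assumed typical values, not figures from this article):

```latex
P = \rho g h \approx 2700~\mathrm{kg/m^3} \times 9.8~\mathrm{m/s^2} \times 3000~\mathrm{m}
  \approx 7.9 \times 10^{7}~\mathrm{Pa} \approx 79~\mathrm{MPa}
```

A pressure of this order approaches the unconfined compressive strength of many limestones, so at such depths open voids tend to close by rock failure or slow deformation.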
Caves can be classified in various other ways as well, including a contrast between active and relict: active caves have water flowing through them; relict caves do not, though water may be retained in them. Types of active caves include inflow caves ("into which a stream sinks"), outflow caves ("from which a stream emerges"), and through caves ("traversed by a stream").
Solutional caves or karst caves are the most frequently occurring caves. Such caves form in rock that is soluble; most occur in limestone, but they can also form in other rocks including chalk, dolomite, marble, salt, and gypsum. Rock is dissolved by natural acid in groundwater that seeps through bedding planes, faults, joints, and comparable features. Over time cracks enlarge to become caves and cave systems.
The largest and most abundant solutional caves are located in limestone. Limestone dissolves under the action of rainwater and groundwater charged with H2CO3 (carbonic acid) and naturally occurring organic acids. The dissolution process produces a distinctive landform known as "karst", characterized by sinkholes and underground drainage. Limestone caves are often adorned with calcium carbonate formations produced through slow precipitation. These include flowstones, stalactites, stalagmites, helictites, soda straws and columns. These secondary mineral deposits in caves are called "speleothems".
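The chemistry summarized above is the standard carbonate dissolution equilibrium: dissolved carbon dioxide forms carbonic acid, which converts sparingly soluble calcite into soluble calcium bicarbonate, and speleothems grow where the reaction runs in reverse as CO2 degasses from drip water:

```latex
\mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3}
\qquad
\mathrm{CaCO_3 + H_2CO_3 \rightleftharpoons Ca^{2+} + 2\,HCO_3^-}
```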
The portions of a solutional cave that are below the water table or the local level of the groundwater will be flooded.
Lechuguilla Cave in New Mexico and nearby Carlsbad Cavern are now believed to be examples of another type of solutional cave. They were formed by H2S (hydrogen sulfide) gas rising from below, where reservoirs of oil give off sulfurous fumes. This gas mixes with groundwater and forms H2SO4 (sulfuric acid). The acid then dissolves the limestone from below, rather than from above as happens when acidic water percolates down from the surface.
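The sulfuric-acid pathway described here can be written as two steps that are well established for caves of this type: hydrogen sulfide oxidizes in groundwater to sulfuric acid, which then attacks the calcite, commonly leaving gypsum as a by-product:

```latex
\mathrm{H_2S + 2\,O_2 \rightarrow H_2SO_4}
\qquad
\mathrm{CaCO_3 + H_2SO_4 + 2\,H_2O \rightarrow CaSO_4\!\cdot\!2\,H_2O + H_2CO_3}
```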
Caves formed at the same time as the surrounding rock are called primary caves.
Lava tubes are formed through volcanic activity and are the most common primary caves. As lava flows downhill, its surface cools and solidifies. Hot liquid lava continues to flow under that crust, and if most of it flows out, a hollow tube remains. Such caves can be found in the Canary Islands, Jeju-do, the basaltic plains of Eastern Idaho, and in other places. Kazumura Cave near Hilo, Hawaii, is a remarkably long and deep lava tube.
Lava caves include but are not limited to lava tubes. Other caves formed through volcanic activity include rifts, lava molds, open vertical conduits, inflationary caves, and blisters, among others.
Sea caves are found along coasts around the world. A special case is littoral caves, which are formed by wave action in zones of weakness in sea cliffs. Often these weaknesses are faults, but they may also be dykes or bedding-plane contacts. Some wave-cut caves are now above sea level because of later uplift. Elsewhere, in places such as Thailand's Phang Nga Bay, solutional caves have been flooded by the sea and are now subject to littoral erosion. Most sea caves are relatively short, but the longest extend for hundreds of metres.
Corrasional or erosional caves are those that form entirely by erosion by flowing streams carrying rocks and other sediments. These can form in any type of rock, including hard rocks such as granite. Generally there must be some zone of weakness to guide the water, such as a fault or joint. A subtype of the erosional cave is the wind or aeolian cave, carved by wind-born sediments. Many caves formed initially by solutional processes often undergo a subsequent phase of erosional or vadose enlargement where active streams or rivers pass through them.
Glacier caves are formed by melting ice and flowing water within and under glaciers. The cavities are influenced by the very slow flow of the ice, which tends to collapse the caves again. Glacier caves are sometimes misidentified as "ice caves", though this latter term is properly reserved for bedrock caves that contain year-round ice formations.
Fracture caves are formed when layers of more soluble minerals, such as gypsum, dissolve out from between layers of less soluble rock. These rocks fracture and collapse in blocks of stone.
Talus caves are formed by the openings among large boulders that have fallen down into a random heap, often at the bases of cliffs. These unstable deposits are called talus or scree, and may be subject to frequent rockfalls and landslides.
Anchialine caves are caves, usually coastal, containing a mixture of freshwater and saline water (usually sea water). They occur in many parts of the world, and often contain highly specialized and endemic fauna.
Caves are found throughout the world, but only a small portion of them have been explored and documented by cavers. The distribution of documented cave systems is widely skewed toward countries where caving has been popular for many years, such as France, Italy, Australia, the UK and the United States. As a result, explored caves are found widely in Europe, Asia, North America and Oceania, but are sparse in South America, Africa, and Antarctica.
This is a rough generalization, as large expanses of North America and Asia contain no documented caves, whereas areas such as the Madagascar dry deciduous forests and parts of Brazil contain many documented caves. As the world's expanses of soluble bedrock are researched by cavers, the distribution of documented caves is likely to shift. For example, China, despite containing around half the world's exposed limestone, has relatively few documented caves.
Cave-inhabiting animals are often categorized as troglobites (cave-limited species), troglophiles (species that can live their entire lives in caves, but also occur in other environments), trogloxenes (species that use caves, but cannot complete their life cycle fully in caves) and accidentals (animals not in one of the previous categories). Some authors use separate terminology for aquatic forms (for example, stygobites, stygophiles, and stygoxenes).
Of these animals, the troglobites are perhaps the most unusual organisms. Troglobitic species often show a number of characteristics, termed troglomorphic, associated with their adaptation to subterranean life. These characteristics may include a loss of pigment (often resulting in a pale or white coloration), a loss of eyes (or at least of optical functionality), an elongation of appendages, and an enhancement of other senses (such as the ability to sense vibrations in water). Aquatic troglobites (or stygobites), such as the endangered Alabama cave shrimp, live in bodies of water found in caves and get nutrients from detritus washed into their caves and from the feces of bats and other cave inhabitants. Other aquatic troglobites include cave fish, and cave salamanders such as the olm and the Texas blind salamander.
Cave insects such as Oligaphorura (formerly Archaphorura) schoetti are troglophiles. They are widely distributed and have been studied fairly extensively. Most specimens are female, but a male specimen was collected from St Cuthberts Swallet in 1969.
Bats, such as the gray bat and Mexican free-tailed bat, are trogloxenes and are often found in caves; they forage outside of the caves. Some species of cave crickets are classified as trogloxenes, because they roost in caves by day and forage above ground at night.
Because of the fragile nature of the cave ecosystem, and the fact that cave regions tend to be isolated from one another, caves harbor a number of endangered species, such as the Tooth cave spider, the Liphistius trapdoor spider, and the gray bat.
Caves are visited by many surface-living animals, including humans. These are usually relatively short-lived incursions, due to the lack of light and sustenance.
Cave entrances often have a characteristic flora. For instance, in the eastern temperate United States, cave entrances are most frequently (and often densely) populated by the bulblet fern, "Cystopteris bulbifera".
Throughout history, primitive peoples have made use of caves. The earliest human fossils found in caves come from a series of caves near Krugersdorp and Mokopane in South Africa. The cave sites of Sterkfontein, Swartkrans, Kromdraai B, Drimolen, Malapa, Cooper's D, Gladysvale, Gondolin and Makapansgat have yielded a range of early human species dating back to between three and one million years ago, including Australopithecus africanus, Australopithecus sediba and Paranthropus robustus. However, it is not generally thought that these early humans were living in the caves, but that they were brought into the caves by carnivores that had killed them.
The first early hominid ever found in Africa, the Taung Child in 1924, was also thought for many years to come from a cave, where it had been deposited after being preyed upon by an eagle. However, this is now debated (Hopley et al., 2013; Am. J. Phys. Anthrop.). Caves do form in the dolomite of the Ghaap Plateau, including the Early, Middle and Later Stone Age site of Wonderwerk Cave; however, the caves that form along the escarpment's edge, like that hypothesised for the Taung Child, are formed within a secondary limestone deposit called tufa. There is abundant evidence of other early human species inhabiting caves from at least one million years ago in different parts of the world, including Homo erectus in China at Zhoukoudian, Homo rhodesiensis in South Africa at the Cave of Hearths (Makapansgat), Homo neandertalensis and Homo heidelbergensis in Europe at the Archaeological Site of Atapuerca, Homo floresiensis in Indonesia, and the Denisovans in southern Siberia.
In southern Africa, early modern humans regularly used sea caves as shelter starting about 180,000 years ago, when they learned to exploit the sea for the first time (Marean et al., 2007; Nature). The oldest known site is PP13B at Pinnacle Point. This may have allowed the rapid expansion of humans out of Africa and the colonization of areas of the world such as Australia by 60–50,000 years ago. Throughout southern Africa, Australia, and Europe, early modern humans used caves and rock shelters as sites for rock art, such as those at Giants Castle. Caves such as the yaodong in China were used for shelter; other caves were used for burials (such as rock-cut tombs) or as religious sites. Among the known sacred caves are China's Cave of a Thousand Buddhas and the sacred caves of Crete.
The importance of sound in caves predates a modern understanding of acoustics. Archaeologists have uncovered relationships between paintings of dots and lines in specific areas of resonance within the caves of Spain and France, as well as instruments depicting paleolithic motifs, indicators of musical events and rituals. Clusters of paintings were often found in areas with notable acoustics, sometimes even replicating the sounds of the animals depicted on the walls. The human voice is also theorized to have been used as an echolocation device to navigate darker areas of the caves where torches were less useful. Dots of red ochre are often found in spaces with the highest resonance, where the production of paintings was too difficult. In these spaces, singing is thought to have been the most efficient way to explore the caves.
Caves continue to be used by modern-day explorers of acoustics. Today, Cumberland Caverns provides one of the best examples of modern musical uses of caves. Caves are utilized not only for their reverberations, but also for the dampening qualities of their irregular surfaces. The irregularities in the walls of Cumberland Caverns diffuse sounds bouncing off the walls and give the space an almost recording-studio-like quality. Caves are valued both for their echoes and for their silence. During the 20th century, musicians began to explore the possibility of using caves as locations for clubs and concert halls, including the likes of Dinah Shore, Roy Acuff, and Benny Goodman. Unlike today, these early performances were typically held in the mouths of caves, as the technology of the time made the depths of the interior inaccessible to musical equipment. In one case, stalactites hanging from the ceiling of a cave were converted into a functioning organ. Unlike a pipe organ, this instrument responds percussively like a piano, using mallets to strike the stalactites to produce different pitches. | https://en.wikipedia.org/wiki?curid=5778
Chinese numerals
Chinese numerals are words and characters used to denote numbers in Chinese.
Today, speakers of Chinese use three written numeral systems: the system of Arabic numerals used worldwide, and two indigenous systems. The more familiar indigenous system is based on Chinese characters that correspond to numerals in the spoken language. These are shared with other languages of the Chinese cultural sphere such as Japanese, Korean and Vietnamese. Most people and institutions in China and Taiwan primarily use the Arabic or mixed Arabic-Chinese systems for convenience, with traditional Chinese numerals used mainly in finance, for writing amounts on checks and banknotes, on some ceremonial occasions, on some boxes, and in advertisements.
The other indigenous system is the Suzhou numerals, or "huama", a positional system, the only surviving form of the rod numerals. These were once used by Chinese mathematicians, and later in Chinese markets, such as those in Hong Kong before the 1990s, but have been gradually supplanted by Arabic (and also Roman) numerals.
The Chinese character numeral system consists of the Chinese characters used by the Chinese written language to write spoken numerals. Similar to spelling-out numbers in English (e.g., "one thousand nine hundred forty-five"), it is not an independent system "per se". Since it reflects spoken language, it does not use the positional system as in Arabic numerals, in the same way that spelling out numbers in English does not.
There are characters representing the numbers zero through nine, and other characters representing larger numbers such as tens, hundreds, thousands and so on. There are two sets of characters for Chinese numerals: one for everyday writing, known as "xiǎoxiě" (), and one for use in commercial or financial contexts, known as "dàxiě" (). The latter arose because the characters used for writing numerals are geometrically simple, so simply using those numerals cannot prevent forgeries in the same way spelling numbers out in English would. A forger could easily change the everyday characters 三十 (30) to 五千 (5000) just by adding a few strokes. That would not be possible when writing using the financial characters 參拾 (30) and 伍仟 (5000). They are also referred to as "banker's numerals", "anti-fraud numerals", or "banker's anti-fraud numerals". For the same reason, rod numerals were never used in commercial records.
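As a rough sketch of how the anti-fraud substitution works, the everyday characters can be mapped one-for-one onto their financial counterparts. The Python below is illustrative only; the names XIAOXIE_TO_DAXIE and to_financial are ours, not a standard library's, and the table covers just the common traditional-form characters.

```python
# Minimal sketch: everyday (xiaoxie) digits mapped to their financial
# (daxie) counterparts, as used on checks to deter alteration.
XIAOXIE_TO_DAXIE = {
    "一": "壹", "二": "貳", "三": "參", "四": "肆", "五": "伍",
    "六": "陸", "七": "柒", "八": "捌", "九": "玖",
    "十": "拾", "百": "佰", "千": "仟",
}

def to_financial(everyday: str) -> str:
    """Rewrite an everyday numeral string using financial characters."""
    return "".join(XIAOXIE_TO_DAXIE.get(ch, ch) for ch in everyday)

print(to_financial("三十"))  # 參拾
print(to_financial("五千"))  # 伍仟
```

Because each everyday character has exactly one financial counterpart, a simple character-by-character substitution suffices; no re-parsing of the number is needed.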
In the People's Liberation Army of the People's Republic of China and the Republic of China Armed Forces of Taiwan, some numbers have altered names when used for clearer radio communications; for example, 1 is read as 幺 (yāo) and 0 as 洞 (dòng).
For numbers larger than 10,000, similarly to the long and short scales in the West, there have been four systems in ancient and modern usage. The original one, with unique names for all powers of ten up to the 14th, is ascribed to the Yellow Emperor in the 6th century book by Zhen Luan, "Wujing suanshu" (Arithmetic in Five Classics). In modern Chinese only the second system is used, in which the same ancient names are used, but each represents a number 10,000 (myriad, 萬 wàn) times the previous:
In practice, this situation does not lead to ambiguity, with the exception of 兆 (zhào), which means 10^12 according to the system in common usage throughout the Chinese communities as well as in Japan and Korea, but has also been used for 10^6 in recent years (especially in mainland China, for megabyte). To avoid problems arising from the ambiguity, the PRC government never uses this character in official documents, but uses 万亿 (wànyì) or 太 (tài, as the translation for "tera") instead. Partly due to this, combinations of 万 and 亿 are often used instead of the larger units of the traditional system as well, for example 亿亿 (yìyì) instead of 京. The ROC government in Taiwan uses 兆 (zhào) to mean 10^12 in official documents.
Numerals beyond 載 zǎi come from Buddhist texts in Sanskrit, but are mostly found in ancient texts. Some of the following words are still being used today, but may have transferred meanings.
The following are characters historically used to denote small orders of magnitude in Chinese. With the introduction of SI units, some of them have been incorporated as SI prefixes, while the rest have fallen into disuse.
In the People's Republic of China, the early translation for the SI prefixes in 1981 was different from those used today. The larger (兆, 京, 垓, 秭, 穰) and smaller Chinese numerals (微, 纖, 沙, 塵, 渺) were defined as translation for the SI prefixes as "mega", "giga", "tera", "peta", "exa", "micro", "nano", "pico", "femto", "atto", resulting in the creation of yet more values for each numeral.
The Republic of China (Taiwan) defined 百萬 as the translation for "mega" and 兆 as the translation for "tera". This translation is widely used in official documents, academic communities, informational industries, etc. However, the civil broadcasting industries sometimes use 兆赫 to represent "megahertz".
Today, the governments of both China and Taiwan use phonetic transliterations for the SI prefixes. However, the governments have each chosen different Chinese characters for certain prefixes.
Multiple-digit numbers are constructed using a multiplicative principle; first the digit itself (from 1 to 9), then the place (such as 10 or 100); then the next digit.
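A minimal sketch of this digit-then-place rule in Python (the function name is ours, not a standard library's). It also applies the interior-zero rule described further below, and for simplicity it keeps the leading 一 in forms such as 一十三, which speech usually drops for 11 through 19.

```python
DIGITS = "零一二三四五六七八九"
PLACES = ["", "十", "百", "千"]

def small_to_chinese(n: int) -> str:
    """Convert 0..9999 using the digit-then-place rule."""
    if n == 0:
        return DIGITS[0]
    out, pending_zero = [], False
    for power in (3, 2, 1, 0):
        d = (n // 10 ** power) % 10
        if d == 0:
            if out:               # interior (not leading) zero
                pending_zero = True
        else:
            if pending_zero:      # one 零 covers a whole run of zeroes
                out.append(DIGITS[0])
                pending_zero = False
            out.append(DIGITS[d] + PLACES[power])
    return "".join(out)

print(small_to_chinese(213))   # 二百一十三
print(small_to_chinese(1002))  # 一千零二  (interior zero spelt out)
print(small_to_chinese(1200))  # 一千二百  (trailing zeroes need no 零)
```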
In Mandarin, the multiplier 兩 ("liǎng") is often used rather than 二 ("èr") for all numbers 200 and greater with the "2" numeral (although, as noted earlier, this varies from dialect to dialect and person to person). Use of either 兩 ("liǎng") or 二 ("èr") is acceptable for the number 200. When writing in the Cantonese dialect, 二 ("yi6") is used to represent the "2" numeral for all numbers. In the southern Min dialect of Chaozhou (Teochew), 兩 ("no6") is used to represent the "2" numeral in all numbers from 200 onwards.
For the numbers 11 through 19, the leading "one" (一) is usually omitted. In some dialects, like Shanghainese, when there are only two significant digits in the number, the leading "one" and the trailing zeroes are omitted. Sometimes, the one before "ten" in the middle of a number, such as 213, is omitted.
In certain older texts like the Protestant Bible, or in poetic usage, numbers such as 114 may be "written" as [100] [10] [4] (百十四).
Outside of Taiwan, digits are sometimes grouped by myriads instead of thousands. Hence it is more convenient to think of numbers here in groups of four; thus 1,234,567,890 is regrouped as 12,3456,7890. Larger than a myriad, each named unit is therefore four zeroes longer than the one before it: 10,000 × 萬 (wàn) = 億 (yì). If one of the numbers is between 10 and 19, the leading "one" is omitted, as per the above point.
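A sketch of the myriad grouping, under the traditional system in which 兆 means 10^12 (all names are ours; interior zeroes between groups and the usual dropping of the leading 一 in 十二 are ignored for brevity):

```python
DIGITS = "零一二三四五六七八九"
PLACES = ["", "十", "百", "千"]
MYRIADS = ["", "萬", "億", "兆"]  # each unit is 10,000 times the previous

def under_10000(n: int) -> str:
    """Digit-then-place conversion for one base-10,000 group."""
    return "".join(
        DIGITS[(n // 10 ** p) % 10] + PLACES[p]
        for p in (3, 2, 1, 0)
        if (n // 10 ** p) % 10
    )

def by_myriads(n: int) -> str:
    """Split a positive integer into base-10,000 groups and name each."""
    groups = []
    while n:
        groups.append(n % 10000)
        n //= 10000
    return "".join(
        under_10000(g) + MYRIADS[i]
        for i, g in reversed(list(enumerate(groups)))
        if g
    )

# 1,234,567,890 regroups as 12,3456,7890:
print(by_myriads(1234567890))  # 一十二億三千四百五十六萬七千八百九十
```

In speech the output above would begin 十二億 rather than 一十二億, per the omission rule just described.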
In Taiwan, pure Arabic numerals are officially always and only grouped by thousands. Unofficially, they are often not grouped, particularly for numbers below 100,000. Mixed Arabic-Chinese numerals are often used to denote myriads; this is done both officially and unofficially, in a variety of styles.
Interior zeroes before the units position (as in 1002) must be spelt explicitly. The reason for this is that trailing zeroes (as in 1200) are often omitted as shorthand, so ambiguity would otherwise occur. One zero is sufficient to resolve the ambiguity. Where the zero is before a digit other than the units digit, the explicit zero is not ambiguous and is therefore optional, but preferred.
To construct a fraction, the denominator is written first, followed by 分之 ("parts of/dividing") and then the numerator. This is the opposite of how fractions are read in English, which is numerator first. Each half of the fraction is written the same as a whole number. Mixed numbers are written with the whole-number part first, followed by 又 ("and"), then the fractional part.
Percentages are constructed similarly, using 百 (100) as the denominator. The 一 (one) before 百 is omitted, much as English says "a hundred" rather than "one hundred" in this position.
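Both constructions reduce to straightforward string assembly. A minimal sketch, with a helper deliberately limited to numbers below 100 (all names are ours):

```python
DIGITS = "零一二三四五六七八九"

def under_100(n: int) -> str:
    """Enough of a converter for this demonstration (1..99)."""
    if n < 10:
        return DIGITS[n]
    tens, ones = divmod(n, 10)
    return (DIGITS[tens] if tens > 1 else "") + "十" + (DIGITS[ones] if ones else "")

def fraction(numerator: int, denominator: int) -> str:
    # Denominator first, then 分之, then numerator (the reverse of English).
    return under_100(denominator) + "分之" + under_100(numerator)

def percent(n: int) -> str:
    # 百 (100) as the denominator, with the leading 一 omitted.
    return "百分之" + under_100(n)

print(fraction(2, 3))  # 三分之二  ("two thirds")
print(percent(25))     # 百分之二十五  ("25 percent")
```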
Decimal numbers are constructed by first writing the whole number part, then inserting a point (點), and finally the decimal expression. The decimal expression is written using only the digits for 0 to 9, without multiplicative words.
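A sketch of the decimal rule, with the whole part restricted to a single digit for brevity; a full converter would reuse the multiplicative rules above:

```python
DIGITS = "零一二三四五六七八九"

def decimal(whole: int, frac: str) -> str:
    """Whole part as a numeral, then 點, then the digits one by one."""
    return DIGITS[whole] + "點" + "".join(DIGITS[int(d)] for d in frac)

print(decimal(3, "14"))  # 三點一四
print(decimal(0, "05"))  # 零點零五
```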
半 [half] functions as a number and therefore requires a measure word; for example, 半杯水 is "half a glass of water".
Ordinal numbers are formed by adding 第 ("sequence") before the number.
The Heavenly Stems are a traditional Chinese ordinal system.
Negative numbers are formed by adding fù (負) before the number.
Chinese grammar requires the use of classifiers (measure words) when a numeral is used together with a noun to express a quantity. For example, "three people" is expressed as 三個人, "three (classifier) person", where 個/个 "gè" is a classifier. There exist many different classifiers, for use with different sets of nouns, although 個/个 is the most common and may be used informally in place of other classifiers.
Chinese uses cardinal numbers in certain situations in which English would use ordinals. For example, 三樓 (literally "three storey") means "third floor" ("second floor" in British English). Likewise, 二十一世紀 (literally "twenty-one century") is used for "21st century".
Numbers of years are commonly spoken as a sequence of digits, as in 二〇〇一 ("two zero zero one") for the year 2001. Names of months and days (in the Western system) are also expressed using numbers: 一月 ("one month") for January, etc.; and 星期一 ("week one") for Monday, etc. The only exception is Sunday, which is 星期日, or informally 星期天, both literally "week day". When meaning "week", 星期 and 禮拜 are interchangeable; 禮拜日 or 禮拜天 means "day of worship". Chinese Catholics call Sunday 主日, "Lord's day".
Full dates are usually written in the format 2001年1月20日 for January 20, 2001 (using 年 "year", 月 "month", and 日 "day"); all the numbers are read as cardinals, not ordinals, with no leading zeroes, and the year is read as a sequence of digits. For brevity, the 年, 月 and 日 may be dropped to give a date composed of just numbers. For example, "6-4" in Chinese is "six-four", short for "month six, day four", i.e. June Fourth, a common Chinese shorthand for the 1989 Tiananmen Square protests (because of the violence that occurred on June 4). For another example, "67" in Chinese is "sixty-seven", short for "year nineteen sixty-seven", a common Chinese shorthand for the Hong Kong 1967 leftist riots.
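Since the full date format simply interleaves Arabic digits with 年, 月 and 日, it is easy to produce programmatically; a minimal Python sketch (the function name is ours):

```python
import datetime

def chinese_date(d: datetime.date) -> str:
    """Render a date in the 2001年1月20日 style: Arabic digits plus 年/月/日."""
    return f"{d.year}年{d.month}月{d.day}日"   # no leading zeroes

print(chinese_date(datetime.date(2001, 1, 20)))  # 2001年1月20日
```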
In the same way that Roman numerals were standard in ancient and medieval Europe for mathematics and commerce, the Chinese formerly used the rod numerals, which is a positional system. The Suzhou numerals () system is a variation of the Southern Song rod numerals. Nowadays, the "huāmǎ" system is only used for displaying prices in Chinese markets or on traditional handwritten invoices.
There is a common method of using one hand to signify the numbers one to ten. While the five digits on one hand can express the numbers one to five, six to ten have special signs that can be used in commerce or day-to-day communication.
Most Chinese numerals of later periods were descendants of the Shang dynasty oracle numerals of the 14th century BC. The oracle bone script numerals were found on tortoise shell and animal bones. One of the early civilizations, the Shang were able to express any number, however large, with only nine symbols and a counting board.
Some of the bronze script numerals such as 1, 2, 3, 4, 10, 11, 12, and 13 became part of the system of rod numerals.
In this system, horizontal rod numbers are used for the tens, thousands, hundred thousands, etc. It is written in the "Sunzi Suanjing" that "one is vertical, ten is horizontal".
The counting rod numerals system has place value and decimal numerals for computation, and was used widely by Chinese merchants, mathematicians and astronomers from the Han dynasty to the 16th century.
In 690 AD, Empress Wǔ promulgated Zetian characters, one of which was "〇". The character is now used as a synonym for the number zero.
Alexander Wylie, a Christian missionary to China, already in 1853 refuted the notion that "the Chinese numbers were written in words at length", and stated that in ancient China, calculation was carried out by means of counting rods, and "the written character is evidently a rude presentation of these". After being introduced to the rod numerals, he said: "Having thus obtained a simple but effective system of figures, we find the Chinese in actual use of a method of notation depending on the theory of local value [i.e. place-value], several centuries before such theory was understood in Europe, and while yet the science of numbers had scarcely dawned among the Arabs."
During the Ming and Qing dynasties (after Arabic numerals were introduced into China), some Chinese mathematicians used Chinese numeral characters as positional system digits. After the Qing period, both the Chinese numeral characters and the Suzhou numerals were replaced by Arabic numerals in mathematical writings.
Traditional Chinese numeric characters are also used in Japan and Korea and were used in Vietnam before the 20th century. In vertical text (that is, read top to bottom), using characters for numbers is the norm, while in horizontal text, Arabic numerals are most common. Chinese numeric characters are also used in much the same formal or decorative fashion that Roman numerals are in Western cultures. Chinese numerals may appear together with Arabic numbers on the same sign or document. | https://en.wikipedia.org/wiki?curid=5781 |
Computer program
A computer program is a collection of instructions that can be executed by a computer to perform a specific task.
A computer program is usually written by a computer programmer in a programming language. From the program in its human-readable form of source code, a compiler or assembler can derive machine code—a form consisting of instructions that the computer can directly execute. Alternatively, a computer program may be executed with the aid of an interpreter.
A collection of computer programs, libraries, and related data is referred to as software. Computer programs may be categorized along functional lines, such as application software and system software. The underlying method used for some calculation or manipulation is known as an algorithm.
Code-breaking algorithms have existed for centuries. In the 9th century, the Arab mathematician Al-Kindi described a cryptographic algorithm for deciphering encrypted code, in "A Manuscript On Deciphering Cryptographic Messages". He gave the first description of cryptanalysis by frequency analysis, the earliest code-breaking algorithm.
The earliest programmable machines preceded the invention of the digital computer. As early as the 9th century, a programmable music sequencer was invented by the Persian Banu Musa brothers, who described an automated mechanical flute player in the "Book of Ingenious Devices". In 1206, the Arab engineer Al-Jazari invented a programmable drum machine where musical mechanical automata could be made to play different rhythms and drum patterns. In 1801, Joseph-Marie Jacquard devised a loom that would weave a pattern by following a series of perforated cards. Patterns could be woven and repeated by arranging the cards.
In 1837, Charles Babbage was inspired by Jacquard's loom to attempt to build the Analytical Engine.
The names of the components of the calculating device were borrowed from the textile industry, where yarn was brought from the store to be milled. The device would have had a "store"—memory to hold 1,000 numbers of 40 decimal digits each. Numbers from the "store" would then have been transferred to the "mill" (analogous to the CPU of a modern machine) for processing, and a "thread" was the execution of programmed instructions by the device. It was programmed using two sets of perforated cards—one to direct the operation and the other for the input variables.
However, after more than 17,000 pounds of the British government's money had been spent, the thousands of cogged wheels and gears never fully worked together.
During a nine-month period in 1842–43, Ada Lovelace translated the memoir of Italian mathematician Luigi Menabrea. The memoir covered the Analytical Engine. The translation contained Note G which completely detailed a method for calculating Bernoulli numbers using the Analytical Engine. This note is recognized by some historians as the world's first written computer program.
In 1936, Alan Turing introduced the Universal Turing machine—a theoretical device that can model every computation that can be performed on a Turing complete computing machine.
It is a finite-state machine that has an infinitely long read/write tape. The machine can move the tape back and forth, changing its contents as it performs an algorithm. The machine starts in the initial state, goes through a sequence of steps, and halts when it encounters the halt state.
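To make the model concrete, here is a toy simulator in Python; the program format (a table mapping a state-symbol pair to a write symbol, a head move, and a next state) is one common convention, not the only one, and all names are ours:

```python
def run_turing(program, tape, state="start", halt="halt", blank="_", limit=10_000):
    """Simulate a Turing machine on a tape that grows on demand."""
    cells = dict(enumerate(tape))   # sparse tape, unbounded in both directions
    pos = 0
    for _ in range(limit):
        if state == halt:
            break
        symbol = cells.get(pos, blank)
        write, move, state = program[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Toy program: flip bits left to right until a blank is read, then halt.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing(flip, "1011_"))  # 0100_
```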
This machine is considered by some to be the origin of the stored-program computer—used by John von Neumann (1946) for the "Electronic Computing Instrument" that now bears the von Neumann architecture name.
The Z3 computer, invented by Konrad Zuse (1941) in Germany, was a digital and programmable computer. A digital computer uses electricity as the calculating component. The Z3 contained 2,400 relays to create the circuits. The circuits provided a binary, floating-point, nine-instruction computer. Programming the Z3 was through a specially designed keyboard and punched tape.
The Electronic Numerical Integrator And Computer (ENIAC; Fall 1945) was a Turing complete, general-purpose computer that used 17,468 vacuum tubes to create the circuits. At its core, it was a series of Pascalines wired together. Its 40 units weighed 30 tons, occupied about 1,800 square feet, and consumed $650 per hour (in 1940s currency) in electricity when idle. It had 20 base-10 accumulators. Programming the ENIAC took up to two months. Three function tables were on wheels and needed to be rolled to fixed function panels. Function tables were connected to function panels using heavy black cables. Each function table had 728 rotating knobs. Programming the ENIAC also involved setting some of the 3,000 switches. Debugging a program took a week. The programmers of the ENIAC were women who were known collectively as the "ENIAC girls" and included Jean Jennings Bartik, Betty Holberton, Marlyn Wescoff, Kathleen McNulty, Ruth Teitelbaum, and Frances Spence.
The ENIAC featured parallel operations. Different sets of accumulators could simultaneously work on different algorithms. It used punched card machines for input and output, and it was controlled with a clock signal. It ran for eight years, calculating hydrogen bomb parameters, predicting weather patterns, and producing firing tables to aim artillery guns.
The Manchester Baby (June 1948) was a stored-program computer. Programming transitioned away from moving cables and setting dials; instead, a computer program was stored in memory as numbers. Only three bits of memory were available to store each instruction, so it was limited to eight instructions. 32 switches were available for programming.
Computers manufactured until the 1970s had front-panel switches for programming. The computer program was written on paper for reference. An instruction was represented by a configuration of on/off settings. After setting the configuration, an execute button was pressed. This process was then repeated. Computer programs also were manually input via paper tape or punched cards. After the medium was loaded, the starting address was set via switches and the execute button pressed.
In 1961, the Burroughs B5000 was built specifically to be programmed in the ALGOL 60 language. The hardware featured circuits to ease the compile phase.
In 1964, the IBM System/360 was a line of six computers each having the same instruction set architecture. The Model 30 was the smallest and least expensive. Customers could upgrade and retain the same application software. Each System/360 model featured multiprogramming. With operating system support, multiple programs could be in memory at once. When one was waiting for input/output, another could compute. Each model also could emulate other computers. Customers could upgrade to the System/360 and retain their IBM 7094 or IBM 1401 application software.
Computer programming is the process of writing or editing source code. Editing source code involves testing, analyzing, refining, and sometimes coordinating with other programmers on a jointly developed program. A person who practices this skill is referred to as a computer programmer, software developer, and sometimes coder.
The sometimes lengthy process of computer programming is usually referred to as software development. The term software engineering is becoming popular as the process is seen as an engineering discipline.
Computer programs can be categorized by the programming language paradigm used to produce them. Two of the main paradigms are imperative and declarative.
"Imperative programming languages" specify a sequential algorithm using declarations, expressions, and statements:
One criticism of imperative languages is the side effect of an assignment statement on a class of variables called non-local variables.
"Declarative programming languages" describe "what" computation should be performed and not "how" to compute it. Declarative programs omit the control flow and are considered "sets" of instructions. Two broad categories of declarative languages are functional languages and logical languages. The principle behind functional languages (like Haskell) is to not allow side effects, which makes it easier to reason about programs like mathematical functions. The principle behind logical languages (like Prolog) is to define the problem to be solved – the goal – and leave the detailed solution to the Prolog system itself. The goal is defined by providing a list of subgoals. Then each subgoal is defined by further providing a list of its subgoals, etc. If a path of subgoals fails to find a solution, then that subgoal is backtracked and another path is systematically attempted.
A computer program in the form of a human-readable, computer programming language is called source code. Source code may be converted into an executable image by a compiler or assembler, or executed immediately with the aid of an interpreter.
Compilers are used to translate source code from a programming language into either object code or machine code. Object code needs further processing to become machine code, and machine code consists of the central processing unit's native instructions, ready for execution. Compiled computer programs are commonly referred to as executables, binary images, or simply as binaries – a reference to the binary file format used to store the executable code.
Some compiled and assembled object programs need to be combined as modules with a linker utility in order to produce an executable program.
Interpreters are used to execute source code from a programming language line-by-line. The interpreter decodes each statement and performs its behavior. One advantage of interpreters is that they can easily be extended to an interactive session. The programmer is presented with a prompt, and individual lines of code are typed in and performed immediately.
The main disadvantage of interpreters is that computer programs run more slowly than when compiled. Interpreting code is slower because the interpreter must decode each statement and then perform it. However, software development may be faster using an interpreter because testing is immediate when the compiling step is omitted. Another disadvantage of interpreters is that an interpreter must be present on the executing computer. By contrast, compiled computer programs need no compiler present during execution.
Just-in-time compilers compile computer programs just before execution. For example, the Java virtual machine HotSpot contains a just-in-time compiler which selectively compiles Java bytecode into machine code – but only code which HotSpot predicts is likely to be used many times.
Either compiled or interpreted programs might be executed in a batch process without human interaction.
Scripting languages are often used to create batch processes. One common scripting language is Unix shell, and its executing environment is called the command-line interface.
No properties of a programming language require it to be exclusively compiled or exclusively interpreted. The categorization usually reflects the most popular method of language execution. For example, Java is thought of as an interpreted language and C a compiled language, despite the existence of Java compilers and C interpreters.
Typically, computer programs are stored in non-volatile memory until requested either directly or indirectly to be executed by the computer user. Upon such a request, the program is loaded into random-access memory, by a computer program called an operating system, where it can be accessed directly by the central processor. The central processor then executes ("runs") the program, instruction by instruction, until termination. A program in execution is called a process. Termination is either by normal self-termination, by user intervention, or by error – software or hardware error.
Many operating systems support multitasking which enables many computer programs to appear to run simultaneously on one computer. Operating systems may run multiple programs through process scheduling – a software mechanism to switch the CPU among processes often so users can interact with each program while it runs. Within hardware, modern day multiprocessor computers or computers with multicore processors may run multiple programs.
A computer program in execution is normally treated as being different from the data the program operates on. However, in some cases, this distinction is blurred when a computer program modifies itself. The modified computer program is subsequently executed as part of the same program. Self-modifying code is possible for programs written in machine code, assembly language, Lisp, C, COBOL, PL/1, and Prolog.
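As a toy illustration of the idea (and not how production systems typically do it), a Python program can compile new source text at run time and rebind one of its own functions:

```python
def greet():
    return "hello"

print(greet())  # hello

# The program builds new source code for itself and executes it,
# rebinding the name `greet` to the freshly compiled function.
new_source = "def greet():\n    return 'goodbye'\n"
exec(new_source, globals())
print(greet())  # goodbye
```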
Computer programs may be categorized along functional lines. The main functional categories are application software and system software. System software includes the operating system, which couples computer hardware with application software. The purpose of the operating system is to provide an environment in which application software executes in a convenient and efficient manner. In addition to the operating system, system software includes embedded programs, boot programs, and micro programs. Application software designed for end users has a user interface. Application software not designed for the end user includes middleware, which couples one application with another. Application software also includes utility programs. The distinction between system software and application software is under debate.
There are many types of application software, including word processors, spreadsheets, web browsers, and media players.
Utility programs are application programs designed to aid system administrators and computer programmers.
An operating system is a computer program that acts as an intermediary between a user of a computer and the computer hardware.
In the 1950s, the programmer, who was also the operator, would write a program and run it. After the program finished executing, the output may have been printed, or it may have been punched onto paper tape or cards for later processing.
More often than not the program did not work. The programmer then looked at the console lights and fiddled with the console switches. If less fortunate, the programmer had a memory printout made for further study. In the 1960s, programmers reduced the amount of wasted time by automating the operator's job. A program called an "operating system" was kept in the computer at all times.
Originally, operating systems were programmed in assembly; however, modern operating systems are typically written in C.
A stored-program computer requires an initial computer program stored in its read-only memory to boot. The boot process is to identify and initialize all aspects of the system, from processor registers to device controllers to memory contents. Following the initialization process, this initial computer program loads the operating system and sets the program counter to begin normal operations.
Independent of the host computer, a hardware device might have embedded firmware to control its operation. Firmware is used when the computer program is rarely or never expected to change, or when the program must not be lost when the power is off.
Microcode programs control some central processing units and some other hardware. This code moves data between the registers, buses, arithmetic logic units, and other functional units in the CPU. Unlike conventional programs, microcode is not usually written by, or even visible to, the end users of systems, and is usually provided by the manufacturer, and is considered internal to the device. | https://en.wikipedia.org/wiki?curid=5783 |
Crime
In ordinary language, a crime is an unlawful act punishable by a state or other authority. The term "crime" does not, in modern criminal law, have any simple and universally accepted definition, though statutory definitions have been provided for certain purposes. The most popular view is that crime is a category created by law; in other words, something is a crime if declared as such by the relevant and applicable law. One proposed definition is that a crime or offence (or criminal offence) is an act harmful not only to some individual but also to a community, society, or the state ("a public wrong"). Such acts are forbidden and punishable by law.
The notion that acts such as murder, rape, and theft are to be prohibited exists worldwide. What precisely is a criminal offence is defined by criminal law of each country. While many have a catalogue of crimes called the criminal code, in some common law countries no such comprehensive statute exists.
The state (government) has the power to severely restrict one's liberty for committing a crime. In modern societies, there are procedures to which investigations and trials must adhere. If found guilty, an offender may be sentenced to a form of reparation such as a community sentence, or, depending on the nature of their offence, to undergo imprisonment, life imprisonment or, in some jurisdictions, execution.
Usually, to be classified as a crime, the "act of doing something criminal" ("actus reus") must, with certain exceptions, be accompanied by the "intention to do something criminal" ("mens rea").
While every crime violates the law, not every violation of the law counts as a crime. Breaches of private law (torts and breaches of contract) are not automatically punished by the state, but can be enforced through civil procedure.
When informal relationships prove insufficient to establish and maintain a desired social order, a government or a state may impose more formalized or stricter systems of social control. With institutional and legal machinery at their disposal, agents of the state can compel populations to conform to codes and can opt to punish or attempt to reform those who do not conform.
Authorities employ various mechanisms to regulate (encourage or discourage) certain behaviors in general. Governing or administering agencies may, for example, codify rules into laws, police citizens and visitors to ensure that they comply with those laws, and implement other policies and practices that legislators or administrators have prescribed with the aim of discouraging or preventing crime. In addition, authorities provide remedies and sanctions, and collectively these constitute a criminal justice system. Legal sanctions vary widely in their severity; they may include (for example) temporary incarceration aimed at reforming the convict. Some jurisdictions have penal codes written to inflict permanent harsh punishments: legal mutilation, capital punishment, or life without parole.
Usually, a natural person perpetrates a crime, but legal persons may also commit crimes. Historically, several premodern societies believed that non-human animals were capable of committing crimes, and prosecuted and punished them accordingly.
The sociologist Richard Quinney has written about the relationship between society and crime. When Quinney states "crime is a social phenomenon" he envisages both how individuals conceive crime and how populations perceive it, based on societal norms.
The word "crime" is derived from the Latin root , meaning "I decide, I give judgment". Originally the Latin word "crīmen" meant "charge" or "cry of distress." The Ancient Greek word , from which the Latin cognate derives, typically referred to an intellectual mistake or an offense against the community, rather than a private or moral wrong.
In 13th century English "crime" meant "sinfulness", according to the Online Etymology Dictionary. It was probably brought to England as Old French "crimne" (12th century form of Modern French "crime"), from Latin "crimen" (in the genitive case: "criminis"). In Latin, "crimen" could have signified any one of the following: "charge, indictment, accusation; crime, fault, offense".
The word may derive from the Latin "cernere" – "to decide, to sift" (see crisis, mapped on Kairos and Chronos). But Ernest Klein (citing Karl Brugmann) rejects this and suggests *cri-men, which originally would have meant "cry of distress". Thomas G. Tucker suggests a root in "cry" words and refers to English plaint, plaintiff, and so on. The meaning "offense punishable by law" dates from the late 14th century. The Latin word is glossed in Old English by "facen", also "deceit, fraud, treachery", [cf. fake]. "Crime wave" is first attested in 1893 in American English.
Whether a given act or omission constitutes a crime does not depend on the nature of that act or omission. It depends on the nature of the legal consequences that may follow it. An act or omission is a crime if it is capable of being followed by what are called criminal proceedings.
History
A definition of "crime" was provided by the Prevention of Crimes Act 1871, and applied for the purposes of section 10 of the Prevention of Crime Act 1908.
For the purpose of section 243 of the Trade Union and Labour Relations (Consolidation) Act 1992, a crime means an offence punishable on indictment, or an offence punishable on summary conviction, and for the commission of which the offender is liable under the statute making the offence punishable to be imprisoned either absolutely or at the discretion of the court as an alternative for some other punishment.
A normative definition views crime as deviant behavior that violates prevailing norms, that is, cultural standards prescribing how humans ought to behave normally. This approach considers the complex realities surrounding the concept of crime and seeks to understand how changing social, political, psychological, and economic conditions may affect changing definitions of crime and the form of the legal, law-enforcement, and penal responses made by society.
These structural realities remain fluid and often contentious. For example: as cultures change and the political environment shifts, societies may criminalise or decriminalise certain behaviours, which directly affects statistical crime rates, influences the allocation of resources for the enforcement of laws, and (re-)shapes general public opinion.
Similarly, changes in the collection and/or calculation of data on crime may affect the public perceptions of the extent of any given "crime problem". All such adjustments to crime statistics, allied with the experience of people in their everyday lives, shape attitudes on the extent to which the state should use law or social engineering to enforce or encourage any particular social norm. Behaviour can be controlled and influenced by a society in many ways without having to resort to the criminal justice system.
Indeed, in those cases where no clear consensus exists on a given norm, the drafting of criminal law by the group in power to prohibit the behaviour of another group may seem to some observers an improper limitation of the second group's freedom, and the ordinary members of society have less respect for the law or laws in general, whether the authorities actually enforce the disputed law or not.
Legislatures can pass laws (called "mala prohibita") that define crimes against social norms. These laws vary from time to time and from place to place: note variations in gambling laws, for example, and the prohibition or encouragement of duelling in history. Other crimes, called "mala in se", count as outlawed in almost all societies (murder, theft and rape, for example).
English criminal law and the related criminal law of Commonwealth countries can define offences that the courts alone have developed over the years, without any actual legislation: common law offences. The courts used the concept of "malum in se" to develop various common law offences.
One can view criminalization as a procedure deployed by society as a preemptive harm-reduction device, using the threat of punishment as a deterrent to anyone proposing to engage in the behavior causing harm. The state becomes involved because governing entities can become convinced that the costs of not criminalizing (through allowing the harms to continue unabated) outweigh the costs of criminalizing it (restricting individual liberty, for example, to minimize harm to others).
States control the process of criminalization for several reasons.
The label of "crime" and the accompanying social stigma normally confine their scope to those activities seen as injurious to the general population or to the state, including some that cause serious loss or damage to individuals. Those who apply the labels of "crime" or "criminal" intend to assert the hegemony of a dominant population, or to reflect a consensus of condemnation for the identified behavior and to justify any punishments prescribed by the state (in the event that standard processing tries and convicts an accused person of a crime).
Justifying the state's use of force to coerce compliance with its laws has proven a consistent theoretical problem. One of the earliest justifications involved the theory of natural law. This posits that the nature of the world or of human beings underlies the standards of morality or constructs them. Thomas Aquinas wrote in the 13th century: "the rule and measure of human acts is the reason, which is the first principle of human acts". He regarded people as by nature rational beings, concluding that it becomes morally appropriate that they should behave in a way that conforms to their rational nature. Thus, to be valid, any law must conform to natural law, and coercing people to conform to that law is morally acceptable. In the 1760s, William Blackstone described a similar thesis.
But John Austin (1790–1859), an early positivist, applied utilitarianism in accepting the calculating nature of human beings and the existence of an objective morality. He denied that the legal validity of a norm depends on whether its content conforms to morality. Thus in Austinian terms, a moral code can objectively determine what people ought to do, the law can embody whatever norms the legislature decrees to achieve social utility, but every individual remains free to choose what to do. Similarly, H.L.A. Hart saw the law as an aspect of sovereignty, with lawmakers able to adopt any law as a means to a moral end.
Thus the necessary and sufficient conditions for the truth of a proposition of law simply involved internal logic and consistency, and that the state's agents used state power with responsibility. Ronald Dworkin rejects Hart's theory and proposes that all individuals should expect the equal respect and concern of those who govern them as a fundamental political right. He offers a theory of compliance overlaid by a theory of deference (the citizen's duty to obey the law) and a theory of enforcement, which identifies the legitimate goals of enforcement and punishment. Legislation must conform to a theory of legitimacy, which describes the circumstances under which a particular person or group is entitled to make law, and a theory of legislative justice, which describes the law they are entitled or obliged to make.
There are natural-law theorists who have accepted the idea of enforcing the prevailing morality as a primary function of the law. This view entails the problem that it makes any moral criticism of the law impossible: if conformity with natural law forms a necessary condition for legal validity, all valid law must, by definition, count as morally just. Thus, on this line of reasoning, the legal validity of a norm necessarily entails its moral justice.
One can solve this problem by granting some degree of moral relativism and accepting that norms may evolve over time and, therefore, one can criticize the continued enforcement of old laws in the light of the current norms. People may find such law acceptable, but the use of state power to coerce citizens to comply with that law lacks moral justification. More recent conceptions of the theory characterise crime as the violation of individual rights.
Since society considers so many rights as natural (hence the term "right") rather than man-made, what constitutes a crime also counts as natural, in contrast to laws (seen as man-made). Adam Smith illustrates this view, saying that a smuggler would be an excellent citizen, "...had not the laws of his country made that a crime which nature never meant to be so."
Natural-law theory therefore distinguishes between "criminality" (which derives from human nature) and "illegality" (which originates with the interests of those in power). Lawyers sometimes express the two concepts with the phrases "malum in se" and "malum prohibitum" respectively. They regard a "crime "malum in se"" as inherently criminal; whereas a "crime "malum prohibitum"" (the argument goes) counts as criminal only because the law has decreed it so.
It follows from this view that one can perform an illegal act without committing a crime, while a criminal act could be perfectly legal. Many Enlightenment thinkers (such as Adam Smith and the American Founding Fathers) subscribed to this view to some extent, and it remains influential among so-called classical liberals and libertarians.
Some religious communities regard sin as a crime; some may even highlight the crime of sin very early in legendary or mythological accounts of origins; note the tale of Adam and Eve and the theory of original sin. What one group considers a crime may cause or ignite war or conflict. However, the earliest known civilizations had codes of law, containing both civil and penal rules mixed together, though not always in recorded form.
The Sumerians produced the earliest surviving written codes. Urukagina (reigned in the 24th century BCE, short chronology) had an early code that has not survived; a later king, Ur-Nammu, left the earliest extant written law system, the Code of Ur-Nammu (c. 2100–2050 BCE), which prescribed a formal system of penalties for specific cases in 57 articles. The Sumerians later issued other codes, including the "code of Lipit-Ishtar". This code, from the 20th century BCE, contains some fifty articles, and scholars have reconstructed it by comparing several sources.
Successive legal codes in Babylon, including the code of Hammurabi (c. 1754 BCE), reflected Mesopotamian society's belief that law derived from the will of the gods (see Babylonian law).
Many states at this time functioned as theocracies, with codes of conduct largely religious in origin or reference. In the Sanskrit texts of Dharmaśāstra, issues such as legal and religious duties, codes of conduct, penalties and remedies have been discussed; these texts form one of the earliest and most elaborate sources of legal code.
Sir Henry Maine studied the ancient codes available in his day, and failed to find any criminal law in the "modern" sense of the word. While modern systems distinguish between offences against the "state" or "community", and offences against the "individual", the so-called penal law of ancient communities did not deal with "crimes" (Latin: "crimina"), but with "wrongs" (Latin: "delicta"). Thus the Hellenic laws treated all forms of theft, assault, rape, and murder as private wrongs, and left action for enforcement up to the victims or their survivors. The earliest systems seem to have lacked formal courts.
The Romans systematized law and applied their system across the Roman Empire. Again, the initial rules of Roman law regarded assaults as a matter of private compensation. The most significant Roman law concept involved "dominion". The "pater familias" owned all the family and its property (including slaves); the "pater" enforced matters involving interference with any property. The "Commentaries" of Gaius (written between 130 and 180 AD) on the Twelve Tables treated "furtum" (in modern parlance: "theft") as a tort.
Similarly, assault and violent robbery involved trespass as to the "pater's" property (so, for example, the rape of a slave could become the subject of compensation to the "pater" as having trespassed on his "property"), and breach of such laws created a "vinculum juris" (an obligation of law) that only the payment of monetary compensation (modern "damages") could discharge. Similarly, the consolidated Teutonic laws of the Germanic tribes, included a complex system of monetary compensations for what courts would consider the complete range of criminal offences against the person, from murder down.
Even though Rome abandoned its Britannic provinces around 400 AD, the Germanic mercenaries, who had largely become instrumental in enforcing Roman rule in Britannia, acquired ownership of land there and continued to use a mixture of Roman and Teutonic Law, with much written down under the early Anglo-Saxon kings. But only when a more centralized English monarchy emerged following the Norman invasion, and when the kings of England attempted to assert power over the land and its peoples, did the modern concept emerge, namely of a crime not only as an offence against the "individual", but also as a wrong against the "state".
This idea came from common law, and the earliest conception of a criminal act involved events of such major significance that the "state" had to usurp the usual functions of the civil tribunals, and direct a special law or "privilegium" against the perpetrator. All the earliest English criminal trials involved wholly extraordinary and arbitrary courts without any settled law to apply, whereas the civil (delictual) law operated in a highly developed and consistent manner (except where a king wanted to raise money by selling a new form of writ). The development of the idea that the "state" dispenses justice in a court only emerges in parallel with or after the emergence of the concept of sovereignty.
In continental Europe, Roman law persisted, but with a stronger influence from the Christian Church.
Coupled with the more diffuse political structure based on smaller feudal units, various legal traditions emerged, remaining more strongly rooted in Roman jurisprudence, but modified to meet the prevailing political climate.
In Scandinavia the effect of Roman law did not become apparent until the 17th century, and the courts grew out of the "things", the assemblies of the people. The people decided the cases (usually with the largest freeholders dominating). This system later gradually developed into a system with a royal judge nominating a number of the most esteemed men of the parish as his board, fulfilling the function of "the people" of yore.
From the Hellenic system onwards, the policy rationale for requiring the payment of monetary compensation for wrongs committed has involved the avoidance of feuding between clans and families.
If compensation could mollify families' feelings, this would help to keep the peace. On the other hand, the institution of oaths also played down the threat of feudal warfare. Both in archaic Greece and in medieval Scandinavia, an accused person walked free if he could get a sufficient number of male relatives to swear him not guilty. (Compare the United Nations Security Council, in which the veto power of the permanent members ensures that the organization does not become involved in crises where it could not enforce its decisions.)
These means of restraining private feuds did not always work, and sometimes prevented the fulfillment of justice. But in the earliest times the "state" did not always provide an independent policing force. Thus criminal law grew out of what 21st-century lawyers would call torts; and, in real terms, many acts and omissions classified as crimes actually overlap with civil-law concepts.
The development of sociological thought from the 19th century onwards prompted some fresh views on crime and criminality, and fostered the beginnings of criminology as a study of crime in society. Nietzsche noted a link between crime and creativity: in "The Birth of Tragedy" he asserted, "The best and brightest that man can acquire he must obtain by crime". In the 20th century, Michel Foucault in "Discipline and Punish" made a study of criminalization as a coercive method of state control.
Several classes of offences are used, or have been used, as legal terms.
Researchers and commentators have also classified crimes into further categories, such as violent crime, property crime, and white-collar crime.
One can categorise crimes depending on the related punishment, with sentencing tariffs prescribed in line with the perceived seriousness of the offence. Thus fines and noncustodial sentences may address the crimes seen as least serious, with lengthy imprisonment or (in some jurisdictions) capital punishment reserved for the most serious.
Under the common law of England, crimes were classified as either treason, felony or misdemeanour, with treason sometimes being included with the felonies. This system was based on the perceived seriousness of the offence. It is still used in the United States, but the distinction between felony and misdemeanour has been abolished in England, Wales and Northern Ireland.
Other classes of offence are based on mode of trial, such as summary offences, offences triable either way, and indictable-only offences.
In common law countries, crimes may be categorised into common law offences and statutory offences. In the US, Australia and Canada (in particular), they are divided into federal crimes and state crimes.
In the United States since 1930, the FBI has tabulated Uniform Crime Reports (UCR) annually from crime data submitted by law enforcement agencies across the United States.
Officials compile this data at the city, county, and state levels into the UCR. They classify violations of laws based on common law as Part I (index) crimes in UCR data. These are further categorized as violent or property crimes. Part I violent crimes include murder and criminal homicide (voluntary manslaughter), forcible rape, aggravated assault, and robbery; Part I property crimes include burglary, arson, larceny/theft, and motor-vehicle theft. All other crimes come under Part II.
For convenience, such lists usually include infractions although, in the U.S., they may come into the sphere not of the criminal law, but rather of the civil law. Compare tortfeasance.
Booking an arrest requires detention for a period ranging from 1 to 24 hours.
There are several national and international organizations offering studies and statistics about global and local crime activity, such as the United Nations Office on Drugs and Crime, the United States Overseas Security Advisory Council (OSAC) safety reports, and national reports generated by the law-enforcement authorities of EU member states and submitted to Europol.
In England and Wales, as well as in Hong Kong, the term "offence" means the same thing as, and is interchangeable with, the term "crime". Offences are further split into:
Many different causes and correlates of crime have been proposed, with varying degrees of empirical support. They include socioeconomic, psychological, biological, and behavioral factors. Controversial topics include media violence research and the effects of gun politics.
Emotional states (both chronic and current) have a tremendous impact on individual thought processes and, as a result, can be linked to criminal activities. The positive psychology concept of Broaden and Build posits that cognitive functioning expands when an individual is in a good-feeling emotional state and contracts as emotional state declines. In positive emotional states an individual can consider more possible solutions to a problem; in lower emotional states, fewer solutions come to mind. Such narrowed thought-action repertoires can leave an individual perceiving only paths they would never choose if they saw an alternative; if they cannot conceive of less risky alternatives, they will choose one that they can see. On this view, even criminals who commit the most horrendous of crimes, such as mass murders, did not see another solution.
Crimes defined by treaty as crimes against international law include:
From the point of view of state-centric law, extraordinary procedures (international courts or national courts operating with universal jurisdiction) may prosecute such crimes. Note the role of the International Criminal Court at The Hague in the Netherlands.
Different religious traditions may promote distinct norms of behaviour, and these in turn may clash or harmonise with the perceived interests of a state. Socially accepted or imposed religious morality has influenced secular jurisdictions on issues that may otherwise concern only an individual's conscience. Activities sometimes criminalized on religious grounds include (for example) alcohol consumption (prohibition), abortion and stem-cell research. In various historical and present-day societies, institutionalized religions have established systems of earthly justice that punish crimes against the divine will and against specific devotional, organizational and other rules under specific codes, such as Roman Catholic canon law.
In the military sphere, authorities can prosecute both regular crimes and specific acts (such as mutiny or desertion) under martial-law codes that either supplant or extend civil codes in times of (for example) war.
Many constitutions contain provisions to curtail freedoms and criminalize otherwise tolerated behaviors under a state of emergency in the event of war, natural disaster or civil unrest. Undesired activities at such times may include assembly in the streets, violation of curfew, or possession of firearms.
Two common types of employee crime exist: embezzlement and wage theft.
The complexity and anonymity of computer systems may help criminal employees camouflage their operations. The victims of the most costly scams include banks, brokerage houses, insurance companies, and other large financial institutions.
In the United States, it is estimated that workers lose at least $19 billion every year in unpaid overtime, and that $40 billion to $60 billion are lost annually to all forms of wage theft. By comparison, national annual losses in 2012 were $340 million due to robbery, $4.1 billion due to burglary, $5.3 billion due to larceny, and $3.8 billion due to auto theft, roughly $13.5 billion combined, so estimated wage-theft losses are three to four times larger. In Singapore, as in the United States, wage theft was found to be widespread and severe. A 2014 survey found that as many as one-third of low-wage male foreign workers in Singapore, about 130,000 people, were affected by wage theft ranging from partial to full denial of pay. | https://en.wikipedia.org/wiki?curid=5785 |
California Institute of Technology
The California Institute of Technology (Caltech) is a private research university in Pasadena, California. It was founded as a preparatory and vocational school by Amos G. Throop in 1891 and began attracting influential scientists such as George Ellery Hale, Arthur Amos Noyes and Robert Andrews Millikan in the early 20th century. The vocational and preparatory schools were disbanded and spun off in 1910 and the college assumed its present name in 1920. In 1934, Caltech was elected to the Association of American Universities, and the antecedents of NASA's Jet Propulsion Laboratory, which Caltech continues to manage and operate, were established between 1936 and 1943 under Theodore von Kármán. The university is one of a small group of institutes of technology in the United States primarily devoted to the instruction of pure and applied sciences.
Caltech has six academic divisions with strong emphasis on science and engineering, managing $332 million in sponsored research in 2011. Its primary campus is located northeast of downtown Los Angeles. First-year students are required to live on campus, and 95% of undergraduates remain in the on-campus House System at Caltech. Although Caltech has a strong tradition of practical jokes and pranks, student life is governed by an honor code which allows faculty to assign take-home examinations. The Caltech Beavers compete in 13 intercollegiate sports in the NCAA Division III's Southern California Intercollegiate Athletic Conference.
Caltech alumni, faculty and researchers include 74 Nobel Laureates (chemist Linus Pauling being the only individual in history to win two unshared prizes), 4 Fields Medalists, and 6 Turing Award winners. In addition, 56 non-emeritus faculty members (as well as many emeritus faculty members) have been elected to one of the United States National Academies, 4 Chief Scientists of the U.S. Air Force have come from its ranks, and 71 alumni and faculty have won the United States National Medal of Science or Technology. Numerous faculty members are associated with the Howard Hughes Medical Institute as well as NASA. According to a 2015 Pomona College study, Caltech ranked number one in the U.S. for the percentage of its graduates who go on to earn a PhD.
Caltech started as a vocational school founded in Pasadena in 1891 by local businessman and politician Amos G. Throop. The school was known successively as Throop University, Throop Polytechnic Institute (and Manual Training School) and Throop College of Technology before acquiring its current name in 1920. The vocational school was disbanded and the preparatory program was split off to form the independent Polytechnic School in 1907.
At a time when scientific research in the United States was still in its infancy, George Ellery Hale, a solar astronomer from the University of Chicago, founded the Mount Wilson Observatory in 1904. He joined Throop's board of trustees in 1907, and soon began developing it and the whole of Pasadena into a major scientific and cultural destination. He engineered the appointment of James A. B. Scherer, a literary scholar untutored in science but a capable administrator and fund raiser, to Throop's presidency in 1908. Scherer persuaded retired businessman and trustee Charles W. Gates to donate $25,000 in seed money to build Gates Laboratory, the first science building on campus.
In 1910, Throop moved to its current site. Arthur Fleming donated the land for the permanent campus site. Theodore Roosevelt delivered an address at Throop Institute on March 21, 1911, and he declared:
I want to see institutions like Throop turn out perhaps ninety-nine of every hundred students as men who are to do given pieces of industrial work better than any one else can do them; I want to see those men do the kind of work that is now being done on the Panama Canal and on the great irrigation projects in the interior of this country—and the one-hundredth man I want to see with the kind of cultural scientific training that will make him and his fellows the matrix out of which you can occasionally develop a man like your great astronomer, George Ellery Hale.
In the same year, a bill was introduced in the California Legislature calling for the establishment of a publicly funded "California Institute of Technology", with an initial budget of a million dollars, ten times the budget of Throop at the time. The board of trustees offered to turn Throop over to the state, but the presidents of Stanford University and the University of California successfully lobbied to defeat the bill, which allowed Throop to develop as the only scientific research-oriented education institute in southern California, public or private, until the onset of World War II necessitated the broader development of research-based science education. The promise of Throop attracted physical chemist Arthur Amos Noyes from MIT to develop the institution and assist in establishing it as a center for science and technology.
With the onset of World War I, Hale organized the National Research Council to coordinate and support scientific work on military problems. While he supported the idea of federal appropriations for science, he took exception to a federal bill that would have funded engineering research at land-grant colleges, and instead sought to raise a $1 million national research fund entirely from private sources. To that end, as Hale wrote in "The New York Times":
Throop College of Technology, in Pasadena, California, has recently afforded a striking illustration of one way in which the Research Council can secure co-operation and advance scientific investigation. This institution, with its able investigators and excellent research laboratories, could be of great service in any broad scheme of cooperation. President Scherer, hearing of the formation of the council, immediately offered to take part in its work, and with this object, he secured within three days an additional research endowment of one hundred thousand dollars.
Through the National Research Council, Hale simultaneously lobbied for science to play a larger role in national affairs, and for Throop to play a national role in science. The new funds were designated for physics research, and ultimately led to the establishment of the Norman Bridge Laboratory, which attracted experimental physicist Robert Andrews Millikan from the University of Chicago in 1917. During the course of the war, Hale, Noyes and Millikan worked together in Washington on the NRC. Subsequently, they continued their partnership in developing Caltech.
Under the leadership of Hale, Noyes, and Millikan (aided by the booming economy of Southern California), Caltech grew to national prominence in the 1920s and concentrated on the development of Roosevelt's "Hundredth Man". On November 29, 1921, the trustees declared it to be the express policy of the Institute to pursue scientific research of the greatest importance and at the same time "to continue to conduct thorough courses in engineering and pure science, basing the work of these courses on exceptionally strong instruction in the fundamental sciences of mathematics, physics, and chemistry; broadening and enriching the curriculum by a liberal amount of instruction in such subjects as English, history, and economics; and vitalizing all the work of the Institute by the infusion in generous measure of the spirit of research". In 1923, Millikan was awarded the Nobel Prize in Physics. In 1925, the school established a department of geology and hired William Bennett Munro, then chairman of the division of History, Government, and Economics at Harvard University, to create a division of humanities and social sciences at Caltech. In 1928, a division of biology was established under the leadership of Thomas Hunt Morgan, the most distinguished biologist in the United States at the time, and discoverer of the role of genes and the chromosome in heredity. In 1930, Kerckhoff Marine Laboratory was established in Corona del Mar under the care of Professor George MacGinitie. In 1926, a graduate school of aeronautics was created, which eventually attracted Theodore von Kármán. Kármán later helped create the Jet Propulsion Laboratory, and played an integral part in establishing Caltech as one of the world's centers for rocket science. In 1928, construction of the Palomar Observatory began.
Millikan served as "Chairman of the Executive Council" (effectively Caltech's president) from 1921 to 1945, and his influence was such that the Institute was occasionally referred to as "Millikan's School." Millikan initiated a visiting-scholars program soon after joining Caltech. Scientists who accepted his invitation include luminaries such as Paul Dirac, Erwin Schrödinger, Werner Heisenberg, Hendrik Lorentz and Niels Bohr. Albert Einstein arrived on the Caltech campus for the first time in 1931 to polish up his Theory of General Relativity, and he returned to Caltech subsequently as a visiting professor in 1932 and 1933.
During World War II, Caltech was one of 131 colleges and universities nationally that took part in the V-12 Navy College Training Program which offered students a path to a Navy commission. The United States Navy also maintained a naval training school for aeronautical engineering, resident inspectors of ordnance and naval material, and a liaison officer to the National Defense Research Committee on campus.
From April to December 1951, Caltech was the host of a federal classified study, Project Vista. The selection of Caltech as host for the project was based on the university's expertise in rocketry and nuclear physics. In response to the war in Korea and the pressure from the Soviet Union, the project was Caltech's way of assisting the federal government in its effort to increase national security. The project was created to study new ways of improving the relationship between tactical air support and ground troops. The Army, Air Force, and Navy sponsored the project; however, it was under contract with the Army. The study was named after the Vista del Arroyo Hotel, which housed it. The study operated under a committee with the supervision of President Lee A. DuBridge. William A. Fowler, a professor at Caltech, was selected as research director. More than a fourth of Caltech's faculty and a group of outside scientists staffed the project; the number is higher still if visiting scientists, military liaisons, and secretarial and security staff are taken into account. In compensation for its participation, the university received about $750,000.
From the 1950s to 1970s, Caltech was the home of Murray Gell-Mann and Richard Feynman, whose work was central to the establishment of the Standard Model of particle physics. Feynman was also widely known outside the physics community as an exceptional teacher and colorful, unconventional character.
During Lee A. DuBridge's tenure as Caltech's president (1946–1969), Caltech's faculty doubled and the campus tripled in size. DuBridge, unlike his predecessors, welcomed federal funding of science. New research fields flourished, including chemical biology, planetary science, nuclear astrophysics, and geochemistry. A 200-inch telescope was dedicated on nearby Palomar Mountain in 1948 and remained the world's most powerful optical telescope for over forty years.
Caltech opened its doors to female undergraduates during the presidency of Harold Brown in 1970, and they made up 14% of the entering class. The portion of female undergraduates has been increasing since then.
Protests by Caltech students are rare. The earliest was a 1968 protest outside the NBC Burbank studios, in response to rumors that NBC was to cancel "Star Trek". In 1973, students from Dabney House protested a presidential visit with a sign on the library bearing the simple phrase "Impeach Nixon". The following week, Ross McCollum, president of the National Oil Company, wrote an open letter to Dabney House stating that in light of their actions he had decided not to donate one million dollars to Caltech. The Dabney family, being Republicans, disowned Dabney House after hearing of the protest.
Since 2000, the Einstein Papers Project has been located at Caltech. The project was established in 1986 to assemble, preserve, translate, and publish papers selected from the literary estate of Albert Einstein and from other collections.
In fall 2008, the freshman class was 42% female, a record for Caltech's undergraduate enrollment. In the same year, the Institute concluded a six-year-long fund-raising campaign. The campaign raised more than $1.4 billion from about 16,000 donors. Nearly half of the funds went into the support of Caltech programs and projects.
In 2010, Caltech, in partnership with Lawrence Berkeley National Laboratory and headed by Professor Nathan Lewis, established a DOE Energy Innovation Hub aimed at developing revolutionary methods to generate fuels directly from sunlight. This hub, the Joint Center for Artificial Photosynthesis, will receive up to $122 million in federal funding over five years.
In 2012, Caltech began to offer classes through massive open online courses (MOOCs) on Coursera and, from 2013, on edX.
Jean-Lou Chameau, the eighth president, announced on February 19, 2013, that he would be stepping down to accept the presidency at King Abdullah University of Science and Technology. Thomas F. Rosenbaum was announced to be the ninth president of Caltech on October 24, 2013, and his term began on July 1, 2014.
In 2019, Caltech received a gift of $750 million for sustainability research from the Resnick family of The Wonderful Company. The gift is the largest ever for environmental sustainability research and the second-largest private donation to a US academic institution (after Bloomberg's gift of $1.8 billion to Johns Hopkins University in 2018).
Caltech's primary campus is located in Pasadena, California, northeast of downtown Los Angeles. It is within walking distance of Old Town Pasadena and the Pasadena Playhouse District, and the two locations are therefore frequent getaways for Caltech students.
In 1917 Hale hired architect Bertram Goodhue to produce a master plan for the campus. Goodhue conceived the overall layout of the campus and designed the physics building, Dabney Hall, and several other structures, in which he sought to be consistent with the local climate, the character of the school, and Hale's educational philosophy. Goodhue's designs for Caltech were also influenced by the traditional Spanish mission architecture of Southern California.
During the 1960s, Caltech underwent considerable expansion, in part due to the philanthropy of alumnus Arnold O. Beckman. In 1953, Beckman was asked to join the Caltech Board of Trustees. In 1964, he became its chairman. Over the next few years, as Caltech's president emeritus David Baltimore describes it, Arnold Beckman and his wife Mabel "shaped the destiny of Caltech".
In 1971 a magnitude-6.6 earthquake in San Fernando caused some damage to the Caltech campus. Engineers who evaluated the damage found that two historic buildings dating from the early days of the Institute—Throop Hall and the Goodhue-designed Culbertson Auditorium—had cracked.
New additions to the campus include the Cahill Center for Astronomy and Astrophysics and the Walter and Leonore Annenberg Center for Information Science and Technology, which opened in 2009; the Warren and Katherine Schlinger Laboratory for Chemistry and Chemical Engineering followed in March 2010. The Institute also completed an upgrade of the south houses in 2006. In late 2010, Caltech completed a 1.3 MW solar array projected to produce approximately 1.6 GWh in 2011.
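For scale, the array's projected output implies a capacity factor typical of fixed photovoltaic installations. The following back-of-the-envelope estimate is derived purely from the two figures above; it is not a number reported by the Institute:
% Capacity factor = actual annual output / output at continuous full power.
% Assumes 8760 hours in a year; both inputs are the figures quoted above.
\[
\text{capacity factor} \approx \frac{1.6\ \text{GWh}}{1.3\ \text{MW} \times 8760\ \text{h}}
= \frac{1.6\ \text{GWh}}{11.4\ \text{GWh}} \approx 14\%
\]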
Caltech is incorporated as a non-profit corporation and is governed by a privately appointed 46-member board of trustees who serve five-year terms of office and retire at the age of 72. The current board is chaired by David L. Lee, co-founder of Global Crossing Ltd. The Trustees elect a President to serve as the chief executive officer of the Institute and administer the affairs of the Institute on behalf of the board, a Provost who serves as the chief academic officer of the Institute below the President, and ten other vice presidential and senior officers. Former Georgia Tech provost Jean-Lou Chameau became the eighth president of Caltech on September 1, 2006, replacing David Baltimore who had served since 1997. Chameau's compensation for 2008–2009 totaled $799,472. Chameau served until June 30, 2013. Thomas F. Rosenbaum was announced to be the ninth president of Caltech on October 24, 2013, and his term began on July 1, 2014. Caltech's endowment is governed by a permanent Trustee committee and administered by an Investment Office.
The Institute is organized into six primary academic divisions: Biology and Biological Engineering, Chemistry and Chemical Engineering, Engineering and Applied Science, Geological and Planetary Sciences, Humanities and Social Sciences, and Physics, Mathematics, and Astronomy. The voting faculty of Caltech include all professors, instructors, research associates and fellows, and the University Librarian. Faculty are responsible for establishing admission requirements, academic standards, and curricula. The Faculty Board is the faculty's representative body and consists of 18 elected faculty representatives as well as other senior administration officials. Full-time professors are expected to teach classes, conduct research, advise students, and perform administrative work such as serving on committees.
Founded in the 1930s, the Jet Propulsion Laboratory (JPL) is a federally funded research and development center (FFRDC) owned by NASA and operated as a division of Caltech through a contract between NASA and Caltech. In 2008, JPL spent over $1.6 billion on research and development and employed over 5,000 project-related and support employees. The JPL Director also serves as a Caltech Vice President and is responsible to the President of the Institute for the management of the laboratory.
Caltech is a small four-year, highly residential research university with slightly more students in graduate programs than undergraduate. The Institute has been accredited by the Western Association of Schools and Colleges since 1949. Caltech is on the quarter system: the fall term starts in late September and ends before Christmas, the second term starts after New Year's Day and ends in mid-March, and the third term starts in late March or early April and ends in early June.
For 2020, "U.S. News & World Report" ranked Caltech as tied for 12th in the United States among national universities overall, 8th for most innovative, and 11th for best value. "U.S. News & World Report" also ranked the graduate programs in chemistry and earth sciences first among national universities.
Caltech was ranked 1st internationally between 2011 and 2016 by the "Times Higher Education World University Rankings". Caltech was ranked as the best university in the world in two categories: Engineering & Technology and Physical Sciences. It was also found to have the highest faculty citation rate in the world.
For the Class of 2023 (enrolled Fall 2019), Caltech received 8,367 applications and accepted 6.4% of applicants; 235 enrolled. The class included 44% women and 56% men. 32% were of underrepresented ancestry, and 6% were foreign students.
Admission to Caltech is extremely rigorous, and enrolled students have among the highest test scores in the nation. The middle 50% ranges of SAT scores for freshmen enrolled in the class of 2023 were 740–780 for evidence-based reading and writing, 790–800 for math, and 1530–1570 total; the middle 50% range of ACT Composite scores was 35–36. The middle 50% ranges for the SAT Subject Tests were 800–800 for Math Level 2, 760–800 for Physics, 760–800 for Chemistry, and 760–800 for Biology.
Undergraduate tuition for the 2013–2014 school year was $39,990 and total annual costs were estimated to be $58,755. In 2012–2013, Caltech awarded $17.1 million in need-based aid, $438,000 in non-need-based aid, and $2.51 million in self-help support to enrolled undergraduate students. The average financial aid package of all students eligible for aid was $38,756 and students graduated with an average debt of $15,090.
The full-time, four-year undergraduate program emphasizes instruction in the arts and sciences and has high graduate coexistence. Caltech offers 24 majors (called "options") and six minors across all six academic divisions. Caltech also offers interdisciplinary programs in Applied Physics, Biochemistry, Bioengineering, Computation and Neural Systems, Control and Dynamical Systems, Environmental Science and Engineering, Geobiology and Astrobiology, Geochemistry, and Planetary Astronomy. The most popular options are Chemical Engineering, Computer Science, Electrical Engineering, Mechanical Engineering and Physics.
Prior to the entering class of 2013, Caltech required students to take a core curriculum of five terms of mathematics, five terms of physics, two terms of chemistry, one term of biology, two terms of lab courses, one term of scientific communication, three terms of physical education, and 12 terms of humanities and social science. Since 2013, only three terms each of mathematics and physics have been required by the Institute, with the remaining two terms each required by certain options.
A typical class is worth 9 academic units and given the extensive core curriculum requirements in addition to individual options' degree requirements, students need to take an average of 40.5 units per term (more than four classes) in order to graduate in four years. 36 units is the minimum full-time load, 48 units is considered a heavy load, and registrations above 51 units require an overload petition. Approximately 20 percent of students double-major. This is achievable since the humanities and social sciences majors have been designed to be done in conjunction with a science major. Although choosing two options in the same division is discouraged, it is still possible.
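As a rough consistency check on the quoted course load, the figures above can be tied together as follows (a sketch; the 486-unit total is the figure implied by the 40.5-unit average, not stated in the text):
% 4 years x 3 terms/year = 12 terms of enrollment.
% Total units implied: 40.5 units/term x 12 terms = 486 units.
\[
\frac{486\ \text{units}}{12\ \text{terms}} = 40.5\ \frac{\text{units}}{\text{term}},
\qquad
\frac{40.5\ \text{units/term}}{9\ \text{units/class}} = 4.5\ \text{classes per term}
\]
% 4.5 classes per term is consistent with "more than four classes" above,
% and sits between the 36-unit minimum and 48-unit heavy load noted earlier.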
First-year students are enrolled in first-term classes based upon results of placement exams in math, physics, chemistry, and writing and take all classes in their first two terms on a Pass/Fail basis. There is little competition; collaboration on homework is encouraged and the honor system encourages take-home tests and flexible homework schedules. Caltech offers co-operative programs with other schools, such as the Pasadena Art Center College of Design and Occidental College.
According to a PayScale study, Caltech graduates earn a median early career salary of $83,400 and $143,100 mid-career, placing them in the top 5 among graduates of US colleges and universities. The average net return on investment over a period of 20 years is $887,000, the tenth-highest among US colleges.
Caltech offers Army and Air Force ROTC in cooperation with the University of Southern California.
The graduate instructional programs emphasize doctoral studies and are dominated by science, technology, engineering, and mathematics fields. The Institute offers graduate degree programs for the Master of Science, Engineer's Degree, Doctor of Philosophy, BS/MS and MD/PhD, with the majority of students in the PhD program. The most popular options are Chemistry, Physics, Biology, Electrical Engineering and Chemical Engineering. Applicants for graduate studies are required to take the GRE. GRE Subject scores are either required or strongly recommended by several options. A joint program between Caltech and the Keck School of Medicine of the University of Southern California, and the UCLA David Geffen School of Medicine grants MD/PhD degrees. Students in this program do their preclinical and clinical work at USC or UCLA, and their PhD work with any member of the Caltech faculty, including the Biology, Chemistry, and Engineering and Applied Sciences Divisions. The MD degree would be from USC or UCLA and the PhD would be awarded by Caltech.
The research facilities at Caltech are available to graduate students, but there are also opportunities for students to work in facilities at other universities and research centers, as well as in private industry. The graduate student to faculty ratio is 4:1.
Approximately 99 percent of doctoral students have full financial support. Financial support for graduate students comes in the form of fellowships, research assistantships, teaching assistantships or a combination of fellowship and assistantship support.
Graduate students are bound by the honor code, as are the undergraduates, and the Graduate Honor Council oversees any violations of the code.
Caltech is classified among "R1: Doctoral Universities – Very High Research Activity". Caltech was elected to the Association of American Universities in 1934 and remains a research university with "very high" research activity, primarily in STEM fields. Caltech manages research expenditures of $270 million annually, 66th among all universities in the U.S. and 17th among private institutions without medical schools for 2008. The largest federal agencies contributing to research are NASA, National Science Foundation, Department of Health and Human Services, Department of Defense, and Department of Energy. Caltech received $144 million in federal funding for the physical sciences, $40.8 million for the life sciences, $33.5 million for engineering, $14.4 million for environmental sciences, $7.16 million for computer sciences, and $1.97 million for mathematical sciences in 2008.
The Institute was awarded an all-time high of $357 million in research funding in 2009. Active funding from the National Science Foundation Directorate of Mathematical and Physical Science (MPS) for Caltech stands at $343 million, the highest for any educational institution in the nation, and higher than the total funds allocated to any state except California and New York.
In addition to managing JPL, Caltech also operates the Palomar Observatory in San Diego County, the Owens Valley Radio Observatory in Bishop, California, the Submillimeter Observatory and W. M. Keck Observatory at the Mauna Kea Observatory, the Laser Interferometer Gravitational-Wave Observatory at Livingston, Louisiana and Richland, Washington, and Kerckhoff Marine Laboratory in Corona del Mar, California. The Institute launched the Kavli Nanoscience Institute at Caltech in 2006, the Keck Institute for Space Studies in 2008, and is also the current home for the Einstein Papers Project. The Spitzer Science Center (SSC), part of the Infrared Processing and Analysis Center located on the Caltech campus, is the data analysis and community support center for NASA's Spitzer Space Telescope.
Caltech partnered with UCLA to establish a Joint Center for Translational Medicine (UCLA-Caltech JCTM), which conducts experimental research into clinical applications, including the diagnosis and treatment of diseases such as cancer.
Caltech operates several TCCON stations as part of an international collaborative effort of measuring greenhouse gases globally. One station is on campus.
Undergraduates at Caltech are also encouraged to participate in research. About 80% of the class of 2010 did research through the annual Summer Undergraduate Research Fellowships (SURF) program at least once during their stay, and many continued during the school year. Students write and submit SURF proposals for research projects in collaboration with professors, and about 70 percent of applicants are awarded SURFs. The program is open to both Caltech and non-Caltech undergraduate students. It serves as preparation for graduate school and helps to explain why Caltech has the highest percentage of alumni who go on to receive a PhD of all the major universities.
The licensing and transferring of technology to the commercial sector is managed by the Office of Technology Transfer (OTT). OTT protects and manages the intellectual property developed by faculty members, students, other researchers, and JPL technologists. Caltech receives more invention disclosures per faculty member than any other university in the nation. Since 1969, 1,891 patents have been granted to Caltech researchers.
During the early 20th century, a Caltech committee visited several universities and decided to transform the undergraduate housing system from fraternities to a house system. Four south houses (or "hovses", as styled in the stone engravings) were built: Blacker House, Dabney House, Fleming House and Ricketts House. In the 1960s, three north houses were built: Lloyd House, Page House, and Ruddock House, and during the 1990s, Avery House. The four south houses closed for renovation in 2005 and reopened in 2006. The latest addition to residential life at Caltech is Bechtel Residence, which opened in 2018. It is not affiliated with the house system. All first- and second-year students live on campus in the house system or in the Bechtel Residence.
Caltech has athletic teams in baseball, men's and women's basketball, cross country, fencing, men's and women's soccer, swimming and diving, men's and women's tennis, track and field, women's volleyball, and men's and women's water polo. Caltech's mascot is the Beaver, a homage to nature's engineer. Its teams (with the exception of the fencing team) play in the Southern California Intercollegiate Athletic Conference, which Caltech co-founded in 1915. The fencing team competes in the NCAA's Division I, facing teams from UCLA, USC, UCSD, and Stanford, among others.
On January 6, 2007, the Beavers' men's basketball team snapped a 207-game losing streak to Division III schools, beating Bard College 81–52. It was their first Division III victory since 1996.
Until their win over Occidental on February 22, 2011, the team had not won a game in conference play since 1985. Ryan Elmquist's free throw with 3.3 seconds left in regulation gave the Beavers the victory. The documentary film "Quantum Hoops" concerns the events of the Beavers' 2005–06 season.
On January 13, 2007, the Caltech women's basketball team snapped a 50-game losing streak, defeating the Pomona–Pitzer Sagehens 55–53. The women's program, which entered the SCIAC in 2002, garnered their first conference win. On the bench as honorary coach for the evening was Dr. Robert Grubbs, 2005 Nobel laureate in Chemistry. The team went on to beat Whittier College on February 10, for its second SCIAC win, and placed its first member on the All Conference team. The 2006–2007 season is the most successful season in the history of the program.
In 2007, 2008, and 2009, the women's table tennis team (a club team) competed in nationals. The women's Ultimate club team, known as "Snatch", has also been very successful in recent years, ranking 44th of more than 200 college teams in the Ultimate Players Association.
On February 2, 2013, the Caltech baseball team ended a 228-game losing streak with the team's first win in nearly 10 years.
The track and field team plays at the South Athletic Field in Tournament Park, the site of the first Rose Bowl Game.
The school also sponsored a football team prior to 1976, which played part of its home schedule at the Rose Bowl, or, as Caltech students put it, "to the largest number of empty seats in the nation".
The Caltech/Occidental College Orchestra is a full seventy-piece orchestra composed of students, faculty, and staff at Caltech and nearby Occidental College. The orchestra gives three pairs of concerts annually, at both Caltech and Occidental College. There are also two Caltech Jazz Bands and a Concert Band, as well as an active chamber music program. For vocal music, Caltech has a mixed-voice Glee Club and the smaller Chamber Singers. The theater program at Caltech is known as TACIT, or Theater Arts at the California Institute of Technology. TACIT organizes two to three plays per year, and was involved in the production of "The PHD Movie", released in 2011.
Every Halloween, Dabney House conducts the infamous "Millikan pumpkin-drop experiment" from the top of Millikan Library, the highest point on campus. According to tradition, a claim was once made that the shattering of a pumpkin frozen in liquid nitrogen and dropped from a sufficient height would produce a triboluminescent spark. This yearly event involves a crowd of observers, who try to spot the elusive spark. The title of the event is an oblique reference to the famous Millikan oil-drop experiment, which measured "e", the elementary unit of electric charge.
On Ditch Day, the seniors ditch school, leaving behind elaborately designed tasks and traps at the doors of their rooms to prevent underclassmen from entering. Over the years this has evolved to the point where many seniors spend months designing mechanical, electrical, and software obstacles to confound the underclassmen. Each group of seniors designs a "stack" to be solved by a handful of underclassmen. The faculty have been drawn into the event as well, and cancel all classes on Ditch Day so the underclassmen can participate in what has become a highlight of the academic year.
Another long-standing tradition is the playing of Wagner's "Ride of the Valkyries" at 7:00 each morning during finals week with the largest, loudest speakers available. The playing of that piece is not allowed at any other time (except if one happens to be listening to the entire 14 hours and 5 minutes of "The Ring Cycle"), and any offender is dragged into the showers to be drenched in cold water fully dressed.
Caltech students have long been known for their many pranks (also known as "RFs").
The two most famous in recent history are the changing of the Hollywood Sign to read "Caltech", by judiciously covering up certain parts of the letters, and the changing of the scoreboard to read Caltech 38, MIT 9 during the 1984 Rose Bowl Game. But the most famous of all occurred during the 1961 Rose Bowl Game, where Caltech students altered the flip-cards that were raised by the stadium attendees to display "Caltech", and several other "unintended" messages. This event is now referred to as the Great Rose Bowl Hoax.
In recent years, pranking has been officially encouraged by Tom Mannion, Caltech's Assistant VP for Student Affairs and Campus Life. "The grand old days of pranking have gone away at Caltech, and that's what we are trying to bring back," reported the "Boston Globe".
In December 2011, Caltech students went to New York and pulled a prank in Manhattan's Greenwich Village. The prank involved making The Cube sculpture look like the Aperture Science Weighted Companion Cube from the video game "Portal".
Caltech pranks have been documented in three Legends of Caltech books, the most recent of which was edited by alumni Autumn Looijen '99 and Mason Porter '98 and published in May 2007.
In 2005, a group of Caltech students pulled a string of pranks during MIT's Campus Preview Weekend for admitted students. These included covering up the word Massachusetts in the "Massachusetts Institute of Technology" engraving on the main building façade with a banner so that it read "That Other Institute of Technology". A group of MIT hackers responded by altering the banner so that the inscription read "The Only Institute of Technology." Caltech students also passed out T-shirts to MIT's incoming freshman class that had MIT written on the front and "...because not everyone can go to Caltech" along with an image of a palm tree on the back.
MIT retaliated in April 2006, when students posing as the Howe & Ser (Howitzer) Moving Company stole the 130-year-old, 1.7-ton Fleming House cannon and moved it over 3000 miles to their campus in Cambridge, Massachusetts for their 2006 Campus Preview Weekend, repeating a similar prank performed by nearby Harvey Mudd College in 1986. Thirty members of Fleming House traveled to MIT and reclaimed their cannon on April 10, 2006.
On April 13, 2007 (Friday the 13th), a group of students from "The California Tech", Caltech's campus newspaper, arrived and distributed fake copies of "The Tech", MIT's campus newspaper, while prospective students were visiting for their Campus Preview Weekend. Articles included "MIT Invents the Interweb", "Architects Deem Campus 'Unfortunate'", and "Infinite Corridor Not Actually Infinite".
In December 2009, some Caltech students declared that MIT had been sold and had become the Caltech East campus. A "sold" banner was hung on front of the MIT dome building and a "Welcome to Caltech East: School of the Humanities" banner over the Massachusetts Avenue Entrance. Newspapers and T-shirts were distributed, and door labels and fliers in the infinite corridor were put up in accordance with the "curriculum change."
In September 2010, MIT students attempted to put a TARDIS, the time machine from the BBC's "Doctor Who", onto a roof. The prank was aborted when the students were caught in mid-act. In January 2011, Caltech students, in conjunction with MIT students, helped put the TARDIS on top of Baxter. Caltech students then moved the TARDIS to UC Berkeley and Stanford.
In April 2014, during MIT's Campus Preview Weekend, a group of Caltech students handed out mugs emblazoned with the MIT logo on the front and the words "The Institute of Technology" on the back. When heated, the mugs turn orange, display a palm tree, and read "Caltech The Hotter Institute of Technology." Identical mugs continue to be sold at the Caltech campus store.
Life in the Caltech community is governed by the honor code, which simply states: "No member of the Caltech community shall take unfair advantage of any other member of the Caltech community." This is enforced by a Board of Control, which consists of undergraduate students, and by a similar body at the graduate level, called the Graduate Honor Council.
The honor code aims at promoting an atmosphere of respect and trust that allows Caltech students to enjoy privileges that make for a more relaxed atmosphere. For example, the honor code allows professors to make the majority of exams as take-home, allowing students to take them on their own schedule and in their preferred environment.
Through the late 1990s, the only exception to the honor code, implemented earlier in the decade in response to changes in federal regulations, concerned the sexual harassment policy. Today, there are myriad exceptions to the honor code in the form of new Institute policies such as the fire policy and alcohol policy. Although both policies are presented in the Honor System Handbook given to new members of the Caltech community, some undergraduates regard them as a slight against the honor code and the implicit trust and respect it represents within the community. In recent years, the Student Affairs Office has also taken up pursuing investigations independently of the Board of Control and Conduct Review Committee, an implicit violation of both the honor code and written disciplinary policy that has contributed to further erosion of trust between some parts of the undergraduate community and the administration.
Caltech has 38 Nobel laureates to its name, awarded to 22 alumni, a count which includes 5 Caltech professors who are also alumni (Carl D. Anderson, Linus Pauling, William A. Fowler, Edward B. Lewis, and Kip Thorne), and 15 non-alumni professors. The total number of Nobel Prizes is 39 because Pauling received prizes in both Chemistry and Peace. The official Nobel Prize count is 48 affiliates in total when temporary academic staff such as visiting professors and postdoctoral scholars are included. Seven faculty and alumni have received a Crafoord Prize from the Royal Swedish Academy of Sciences, while 58 have been awarded the U.S. National Medal of Science, and 13 have received the National Medal of Technology. One alumnus, Stanislav Smirnov, won the Fields Medal in 2010. Other distinguished researchers have been affiliated with Caltech as postdoctoral scholars (for example, Barbara McClintock, James D. Watson, Sheldon Glashow and John Gurdon) or visiting professors (for example, Albert Einstein, Stephen Hawking and Edward Witten).
Caltech enrolled 938 undergraduate students and 1,299 graduate students for the 2019–2020 school year. Women made up 45% of the undergraduate and 31% of the graduate student body. The racial demographics of the school substantially differ from those of the nation as a whole.
The four-year graduation rate is 79% and the six-year rate is 92%, which is low compared to most leading U.S. universities, but substantially higher than it was in the 1960s and 1970s. Students majoring in STEM fields traditionally have graduation rates below 70%.
There are 22,930 total living alumni in the U.S. and around the world. As of October 2019, twenty-two alumni and 15 non-alumni faculty have won the Nobel Prize. The Turing Award, the "Nobel Prize of Computer Science", has been awarded to six alumni, and one has won the Fields Medal.
Many alumni have participated in scientific research. Some have concentrated their studies on the very small universe of atoms and molecules. Nobel laureate Carl D. Anderson (BS 1927, PhD 1930) proved the existence of positrons and muons, Nobel laureate Edwin McMillan (BS 1928, MS 1929) synthesized the first transuranium element, Nobel laureate Leo James Rainwater (BS 1939) investigated the non-spherical shapes of atomic nuclei, and Nobel laureate Douglas D. Osheroff (BS 1967) studied the superfluid nature of helium-3. Donald Knuth (PhD 1963), the "father" of the analysis of algorithms, wrote "The Art of Computer Programming" and created the TeX computer typesetting system, which is commonly used in the scientific community. Bruce Reznick (BS 1973) is a mathematician noted for his contributions to number theory and the combinatorial-algebraic-analytic investigations of polynomials. Narendra Karmarkar (MS 1979) is known for the interior point method, a polynomial algorithm for linear programming known as Karmarkar's algorithm.
Other alumni have turned their gaze to the universe. C. Gordon Fullerton (BS 1957, MS 1958) piloted the third Space Shuttle mission. Astronaut (and later, United States Senator) Harrison Schmitt (BS 1957) was the only geologist ever to have walked on the surface of the moon. Astronomer Eugene Merle Shoemaker (BS 1947, MS 1948) co-discovered Comet Shoemaker-Levy 9 (a comet which crashed into the planet Jupiter) and was the first person buried on the moon, his ashes having been carried aboard a spacecraft deliberately crashed into the lunar surface. Astronomer George O. Abell (BS 1951, MS 1952, PhD 1957), while a graduate student at Caltech, participated in the National Geographic Society-Palomar Sky Survey. This work ultimately resulted in the publication of the "Abell Catalogue of Clusters of Galaxies," the definitive work in the field.
Undergraduate alumni founded, or co-founded, companies such as LCD manufacturer Varitronix, Hotmail, Compaq, and MathWorks (which created Matlab), while graduate students founded, or co-founded, companies such as Intel, TRW, and the non-profit educational organization, the Exploratorium.
Arnold Beckman (PhD 1928) invented the pH meter and commercialized it with the founding of Beckman Instruments. His success with that company enabled him to provide seed funding for William Shockley (BS 1932), who had co-invented semiconductor transistors and wanted to commercialize them. Shockley became the founding Director of the Shockley Semiconductor Laboratory division of Beckman Instruments. Shockley had previously worked at Bell Labs, whose first president was another alumnus, Frank Jewett (BS 1898). Because his aging mother lived in Palo Alto, California, Shockley established his laboratory near her in Mountain View, California. Shockley was a co-recipient of the Nobel Prize in physics in 1956, but his aggressive management style and odd personality at the Shockley Lab became unbearable. In late 1957, eight of his researchers resigned and with support from Sherman Fairchild formed Fairchild Semiconductor. Among the "traitorous eight" was Gordon E. Moore (PhD 1954), who later left Fairchild to co-found Intel. Other offspring companies of Fairchild Semiconductor include National Semiconductor and Advanced Micro Devices, which in turn spawned more technology companies in the area. Shockley's decision to use silicon instead of germanium as the semiconductor material, coupled with the abundance of silicon semiconductor related companies in the area, gave rise to the term "Silicon Valley" to describe that geographic region surrounding Palo Alto.
Caltech alumni have also held public office: Mustafa A.G. Abushagur (PhD 1984) served as Deputy Prime Minister of Libya and Prime Minister-Elect of Libya, James Fletcher (PhD 1948) as the 4th and 7th Administrator of NASA, Steven Koonin (PhD 1972) as Undersecretary of Energy for Science, and Regina Dugan (PhD 1993) as the 19th director of DARPA. The 20th director of DARPA, Arati Prabhakar, is also a Caltech alumna (PhD 1984). Arvind Virmani is a former Chief Economic Adviser to the Government of India. In 2013, President Obama announced the nomination of France Cordova (PhD 1979) as the director of the National Science Foundation and Ellen Williams (PhD 1982) as the director of ARPA-E.
Richard Feynman was among the most well-known physicists associated with Caltech, having published the "Feynman Lectures on Physics", an undergraduate physics text, and popular science books such as "Six Easy Pieces" for the general audience. His promotion of physics made him a public figure of science, although his Nobel-winning work in quantum electrodynamics was already well established in the scientific community. Murray Gell-Mann, a Nobel-winning physicist, introduced a classification of hadrons and went on to postulate the existence of quarks, now accepted as part of the Standard Model. Long-time Caltech President Robert Andrews Millikan was the first to measure the charge of the electron, with his well-known oil-drop experiment, while Richard Chace Tolman is remembered for his contributions to cosmology and statistical mechanics. 2004 Nobel Prize in Physics winner H. David Politzer is a current professor at Caltech, as are astrophysicist and author Kip Thorne and eminent mathematician Barry Simon. Linus Pauling pioneered quantum chemistry and molecular biology, and published his landmark account of the nature of the chemical bond in 1939. Seismologist Charles Richter, also an alumnus, developed the Richter magnitude scale for measuring the power of earthquakes. One of the founders of the geochemistry department, Clair Patterson, was the first to accurately determine the age of the Earth, via lead-uranium ratios in meteorites. In engineering, Theodore von Kármán made many key advances in aerodynamics, notably his work on supersonic and hypersonic airflow characterization; a repeating pattern of swirling vortices, the von Kármán vortex street, is named after him. Participants in von Kármán's GALCIT project included Frank Malina, who helped develop the WAC Corporal, the first U.S. rocket to reach the edge of space; Jack Parsons, a pioneer in the development of liquid and solid rocket fuels who designed the first castable composite rocket motor; and Qian Xuesen, who was dubbed the "Father of Chinese Rocketry". More recently, Michael Brown, a professor of planetary astronomy, discovered many trans-Neptunian objects, most notably the dwarf planet Eris, which prompted the International Astronomical Union to redefine the term "planet".
David Baltimore, the Robert A. Millikan Professor of Biology, and Alice Huang, Senior Faculty Associate in Biology, served as the presidents of AAAS from 2007–2008 and 2010–2011, respectively.
33% of the faculty are members of the National Academy of Sciences or the National Academy of Engineering and/or fellows of the American Academy of Arts and Sciences. This is the highest percentage of any faculty in the country, with the exception of the graduate institution Rockefeller University.
The average salary for assistant professors at Caltech is $111,300, associate professors $121,300, and full professors $172,800. Caltech faculty are active in applied physics, astronomy and astrophysics, biology, biochemistry, biological engineering, chemical engineering, computer science, geology, mechanical engineering and physics.
Over the years Caltech has actively promoted the commercialization of technologies developed within its walls. Through its Office of Technology Transfer & Corporate Partnerships, scientific breakthroughs have led to the transfer of numerous technologies in a wide variety of science-related fields such as photovoltaics, radio-frequency identification (RFID), semiconductors, hyperspectral imaging, electronic devices, protein design, and solid-state amplifiers. Companies such as Contour Energy Systems, Impinj, Fulcrum Microsystems, Nanosys, Inc., Photon etc., Xencor, and Wavestream Wireless have emerged from Caltech.
Caltech has appeared in many works of popular culture, both as itself and in disguised form. On television, it plays a prominent role and is the workplace of all four male lead characters and one female lead character in the sitcom "The Big Bang Theory". Caltech is also the inspiration, and frequent film location, for the California Institute of Science in "Numb3rs". On film, the Pacific Tech of "The War of the Worlds" and "Real Genius" is based on Caltech.
In nonfiction, two 2007 documentaries examine aspects of Caltech: "Curious", its researchers, and "Quantum Hoops", its men's basketball team.
Given its Los Angeles-area location, the grounds of the Institute are often host to short scenes in movies and television. The Athenaeum dining club appears in the "Beverly Hills Cop" series, "The X-Files", "True Romance", and "The West Wing". | https://en.wikipedia.org/wiki?curid=5786 |
Carlo Goldoni
Carlo Osvaldo Goldoni (25 February 1707 – 6 February 1793) was an Italian playwright and librettist from the Republic of Venice. His works include some of Italy's most famous and best-loved plays. Audiences have admired the plays of Goldoni for their ingenious mix of wit and honesty. His plays offered his contemporaries images of themselves, often dramatizing the lives, values, and conflicts of the emerging middle classes. Though he wrote in French and Italian, his plays make rich use of the Venetian language, regional vernacular, and colloquialisms. Goldoni also wrote under the pen name and title "Polisseno Fegeio, Pastor Arcade", which he claimed in his memoirs the "Arcadians of Rome" bestowed on him.
There is an abundance of autobiographical information on Goldoni, most of which comes from the introductions to his plays and from his "Memoirs". However, these memoirs are known to contain many errors of fact, especially about his earlier years.
In these memoirs, he paints himself as a born comedian, careless, light-hearted and with a happy temperament, proof against all strokes of fate, yet thoroughly respectable and honorable.
Goldoni was born in Venice in 1707, the son of Margherita Salvioni (or Saioni) and Giulio Goldoni. In his memoirs, Goldoni describes his father as a physician, and claims that he was introduced to theatre by his grandfather Carlo Alessandro Goldoni. In reality, it seems that Giulio was an apothecary; as for the grandfather, he had died four years before Carlo's birth. In any case, Goldoni was deeply interested in theatre from his earliest years, and all attempts to direct his activity into other channels were of no avail; his toys were puppets, and his books, plays.
His father placed him under the care of the philosopher Caldini at Rimini, but the youth soon ran away with a company of strolling players and returned to Venice. In 1723 his father matriculated him into the stern Collegio Ghislieri in Pavia, which imposed the tonsure and monastic habits on its students. However, he relates in his "Memoirs" that a considerable part of his time was spent in reading Greek and Latin comedies. He had already begun writing at this time and, in his third year, he composed a libellous poem ("Il colosso") in which he ridiculed the daughters of certain Pavian families. As a result of that incident (and/or of a visit paid with some schoolmates to a local brothel) he was expelled from the school and had to leave the city (1725). He studied law at Udine, and eventually took his degree at the University of Modena. He was employed as a law clerk at Chioggia and Feltre, after which he returned to his native city and began practicing.
Educated as a lawyer, and holding lucrative positions as secretary and counsellor, he seemed, indeed, at one time to have settled down to the practice of law, but following an unexpected summons to Venice, after an absence of several years, he changed his career, and thenceforth he devoted himself to writing plays and managing theatres. His father died in 1731. In 1732, to avoid an unwanted marriage, he left the town for Milan and then for Verona where the theatre manager Giuseppe Imer helped him on his way to becoming a comical poet as well as introducing him to his future wife, Nicoletta Conio. Goldoni returned with her to Venice, where he stayed until 1743.
Goldoni entered the Italian theatre scene with a tragedy, "Amalasunta", produced in Milan. The play was a critical and financial failure.
Submitting it to Count Prata, director of the opera, he was told that his piece "was composed with due regard for the rules of Aristotle and Horace, but not according to those laid down for the Italian drama." "In France", continued the count, "you can try to please the public, but here in Italy it is the actors and actresses whom you must consult, as well as the composer of the music and the stage decorators. Everything must be done according to a certain form which I will explain to you."
Goldoni thanked his critic, went back to his inn and ordered a fire, into which he threw the manuscript of his "Amalasunta".
His next play, "Belisario", written in 1734, was more successful, though of its success he afterward professed himself ashamed.
During this period he also wrote librettos for opera seria and served for a time as literary director of the San Giovanni Grisostomo, Venice's most distinguished opera house.
He wrote other tragedies for a time, but he was not long in discovering that his bent was for comedy. He had come to realize that the Italian stage needed reforming; adopting Molière as his model, he went to work in earnest and in 1738 produced his first real comedy, "L'uomo di mondo" ("The Man of the World"). During his many wanderings and adventures in Italy, he was constantly at work and when, at Livorno, he became acquainted with the manager Medebac, he determined to pursue the profession of playwriting in order to make a living. He was employed by Medebac to write plays for his theater in Venice. He worked for other managers and produced during his stay in that city some of his most characteristic works. He also wrote "Momolo Cortesan" in 1738. By 1743, he had perfected his hybrid style of playwriting (combining the model of Molière with the strengths of Commedia dell'arte and his own wit and sincerity). This style was typified in "La Donna di garbo", the first Italian comedy of its kind.
After 1748, Goldoni collaborated with the composer Baldassare Galuppi, making significant contributions to the new form of 'opera buffa'. Galuppi composed the score for more than twenty of Goldoni's librettos. As with his comedies, Goldoni's "opera buffa" integrate elements of the Commedia dell'arte with recognisable local and middle-class realities. His operatic works include two of the most successful musical comedies of the eighteenth century, "Il filosofo di campagna" ("The Country Philosopher"), set by Galuppi (1752) and "La buona figliuola" ("The Good Girl"), set by Niccolò Piccinni (1760).
In 1753, following his return from Bologna, he defected to the Teatro San Luca of the Vendramin family, where he staged most of his plays until 1762.
In 1757, he engaged in a bitter dispute with playwright Carlo Gozzi, which left him utterly disgusted with the tastes of his countrymen; so much so that in 1761 he moved to Paris, where he received a position at court and was put in charge of the Theatre Italien. He spent the rest of his life in France, composing most of his plays in French and writing his memoirs in that language.
Among the plays which he wrote in French, the most successful was "Le bourru bienfaisant", dedicated to Marie Adélaïde, a daughter of Louis XV and aunt to the dauphin, the future Louis XVI of France. It premiered on 4 February 1771, almost nine months after the dauphin's marriage to Marie Antoinette. Goldoni enjoyed considerable popularity in France; in 1769, when he retired to Versailles, the King gave him a pension. He lost this pension after the French Revolution. The Convention eventually voted to restore his pension the day after his death, and it was restored to his widow at the pleading of the poet André Chénier: "She is old", he urged, "she is seventy-six, and her husband has left her no heritage save his illustrious name, his virtues and his poverty."
In his "Memoirs" Goldoni amply discusses the state of Italian comedy when he began writing. At that time, Italian comedy revolved around the conventionality of the Commedia dell'arte, or improvised comedy. Goldoni took to himself the task of superseding the comedy of masks and the comedy of intrigue by representations of actual life and manners through the characters and their behaviors. He rightly maintained that Italian life and manners were susceptible of artistic treatment such as had not been given them before.
His works are a lasting monument to the changes that he initiated: a dramatic revolution that had been attempted but not achieved before. Goldoni's importance lay in providing good examples rather than precepts. Goldoni says that he took for his models the plays of Molière and that whenever a piece of his own succeeded he whispered to himself: "Good, but not yet Molière." Goldoni's plays are gentler and more optimistic in tone than Molière's.
It was this very success that was the object of harsh critiques by Carlo Gozzi, who accused Goldoni of having deprived the Italian theatre of the charms of poetry and imagination. The great success of Gozzi's fairy dramas so irritated Goldoni that it led to his self-exile to France.
Goldoni gave his country a classical form which, though it has since been cultivated, has yet to be perfected by a master.
Goldoni's plays that were written while he was still in Italy ignore religious and ecclesiastical subjects. This may be surprising, considering his staunch Catholic upbringing. No thoughts are expressed about death or repentance in his memoirs or in his comedies. After his move to France, his position became clearer, as his plays took on a clear anti-clerical tone and often satirized the hypocrisy of monks and of the Church.
Goldoni was inspired by his love of humanity and the admiration he had for his fellow men. He wrote, and was obsessed with, the relationships that humans establish with one another, their cities and homes, the Humanist movement, and the study of philosophy. The moral and civil values that Goldoni promotes in his plays are those of rationality, civility, humanism, the importance of the rising middle-class, a progressive stance to state affairs, honor and honesty. Goldoni had a dislike for arrogance, intolerance and the abuse of power.
Goldoni's main characters are no abstract examples of human virtue, nor monstrous examples of human vice. They occupy the middle ground of human temperament. Goldoni maintains an acute sensibility for the differences in social classes between his characters as well as environmental and generational changes. Goldoni pokes fun at the arrogant nobility and the pauper who lacks dignity.
As in other theatrical works of the time and place, the characters in Goldoni's Italian comedies spoke originally either the literary Tuscan variety (which became modern Italian) or the Venetian dialect, depending on their station in life. However, in some printed editions of his plays he often turned the Venetian texts into Tuscan, too.
One of his best known works is the comic play "Servant of Two Masters", which has been translated and adapted internationally numerous times. In 1966 it was adapted into an opera buffa by the American composer Vittorio Giannini. In 2011, Richard Bean adapted the play for the National Theatre of Great Britain as "One Man, Two Guvnors". Its popularity led to a transfer to the West End and in 2012 to Broadway.
The film "Carlo Goldoni – Venice, Grand Theatre of the World", directed by Alessandro Bettero, was released in 2007 and is available in English, Italian, French, and Japanese.
The following is a small sampling of Goldoni's enormous output. | https://en.wikipedia.org/wiki?curid=5790 |
Cumulative distribution function
In probability theory and statistics, the cumulative distribution function (CDF) of a real-valued random variable $X$, or just distribution function of $X$, evaluated at $x$, is the probability that $X$ will take a value less than or equal to $x$.
In the case of a scalar continuous distribution, it gives the area under the probability density function from minus infinity to $x$. Cumulative distribution functions are also used to specify the distribution of multivariate random variables.
The cumulative distribution function of a real-valued random variable $X$ is the function given by
$$F_X(x) = \operatorname{P}(X \le x),$$
where the right-hand side represents the probability that the random variable $X$ takes on a value less than or equal to $x$. The probability that $X$ lies in the semi-closed interval $(a, b]$, where $a < b$, is therefore
$$\operatorname{P}(a < X \le b) = F_X(b) - F_X(a).$$
In the definition above, the "less than or equal to" sign, "≤", is a convention, not a universally used one (e.g. Hungarian literature uses "<"). The probability density function of a continuous random variable can be determined from the cumulative distribution function by differentiating, using the Fundamental Theorem of Calculus; i.e. given $F(x)$,
$$f(x) = \frac{dF(x)}{dx},$$
as long as the derivative exists.
The CDF of a continuous random variable $X$ can be expressed as the integral of its probability density function $f_X$ as follows:
$$F_X(x) = \int_{-\infty}^{x} f_X(t) \, dt.$$
In the case of a random variable $X$ which has a distribution having a discrete component at a value $b$,
$$\operatorname{P}(X = b) = F_X(b) - \lim_{x \to b^{-}} F_X(x).$$
If $F_X$ is continuous at $b$, this equals zero and there is no discrete component at $b$.
Every cumulative distribution function $F_X$ is non-decreasing and right-continuous, which makes it a càdlàg function. Furthermore,
$$\lim_{x \to -\infty} F_X(x) = 0, \qquad \lim_{x \to +\infty} F_X(x) = 1.$$
Every function with these four properties is a CDF, i.e., for every such function, a random variable can be defined such that the function is the cumulative distribution function of that random variable.
If $X$ is a purely discrete random variable, then it attains values $x_1, x_2, \ldots$ with probability $p_i = \operatorname{P}(X = x_i)$, and the CDF of $X$ will be discontinuous at the points $x_i$:
$$F_X(x) = \operatorname{P}(X \le x) = \sum_{x_i \le x} p_i.$$
If the CDF $F_X$ of a real-valued random variable $X$ is continuous, then $X$ is a continuous random variable; if furthermore $F_X$ is absolutely continuous, then there exists a Lebesgue-integrable function $f_X$ such that
$$F_X(b) - F_X(a) = \operatorname{P}(a < X \le b) = \int_{a}^{b} f_X(x) \, dx$$
for all real numbers $a$ and $b$. The function $f_X$ is equal to the derivative of $F_X$ almost everywhere, and it is called the probability density function of the distribution of $X$.
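This derivative relationship is easy to verify symbolically. The following minimal sketch (assuming SymPy is available) recovers the density of the exponential distribution, discussed below, by differentiating its CDF:

```python
import sympy as sp

x, lam = sp.symbols("x lambda", positive=True)

F = 1 - sp.exp(-lam * x)   # exponential CDF, valid for x >= 0
f = sp.diff(F, x)          # density f = dF/dx

print(sp.simplify(f))      # lambda*exp(-lambda*x)
```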
As an example, suppose $X$ is uniformly distributed on the unit interval $[0, 1]$. Then the CDF of $X$ is given by
$$F_X(x) = \begin{cases} 0 & x < 0 \\ x & 0 \le x \le 1 \\ 1 & x > 1. \end{cases}$$
Suppose instead that $X$ takes only the discrete values 0 and 1, with equal probability. Then the CDF of $X$ is given by
$$F_X(x) = \begin{cases} 0 & x < 0 \\ 1/2 & 0 \le x < 1 \\ 1 & x \ge 1. \end{cases}$$
Suppose $X$ is exponentially distributed. Then the CDF of $X$ is given by
$$F_X(x; \lambda) = \begin{cases} 1 - e^{-\lambda x} & x \ge 0 \\ 0 & x < 0. \end{cases}$$
Here λ > 0 is the parameter of the distribution, often called the rate parameter.
Suppose $X$ is normally distributed. Then the CDF of $X$ is given by
$$F(x; \mu, \sigma) = \frac{1}{\sigma \sqrt{2\pi}} \int_{-\infty}^{x} \exp\left( -\frac{(t - \mu)^2}{2\sigma^2} \right) dt.$$
Here the parameter $\mu$ is the mean or expectation of the distribution, and $\sigma$ is its standard deviation.
Suppose $X$ is binomially distributed. Then the CDF of $X$ is given by
$$F(k; n, p) = \operatorname{P}(X \le k) = \sum_{i=0}^{\lfloor k \rfloor} \binom{n}{i} p^{i} (1 - p)^{n - i}.$$
Here $p$ is the probability of success, the CDF gives the probability of at most $\lfloor k \rfloor$ successes in a sequence of $n$ independent experiments, and $\lfloor k \rfloor$ is the "floor" under $k$, i.e. the greatest integer less than or equal to $k$.
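All of the parametric CDFs above are available in standard statistics libraries. A sketch assuming SciPy (note that SciPy parameterizes the exponential by the scale $1/\lambda$ rather than by the rate):

```python
from scipy import stats

# Uniform on [0, 1]: F(0.3) = 0.3
print(stats.uniform(loc=0, scale=1).cdf(0.3))

# Exponential with rate lambda = 2, i.e. scale = 1/2: F(1) = 1 - exp(-2)
print(stats.expon(scale=0.5).cdf(1.0))

# Standard normal: F(0) = 0.5
print(stats.norm(loc=0, scale=1).cdf(0.0))

# Binomial with n = 10, p = 0.5: P(X <= 4) ~ 0.377
print(stats.binom(n=10, p=0.5).cdf(4))
```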
Sometimes, it is useful to study the opposite question and ask how often the random variable is "above" a particular level. This is called the complementary cumulative distribution function (ccdf) or simply the tail distribution or exceedance, and is defined as
$$\bar{F}_X(x) = \operatorname{P}(X > x) = 1 - F_X(x).$$
This has applications in statistical hypothesis testing, for example, because the one-sided p-value is the probability of observing a test statistic "at least" as extreme as the one observed. Thus, provided that the test statistic, "T", has a continuous distribution, the one-sided p-value is simply given by the ccdf: for an observed value $t$ of the test statistic,
$$p = \operatorname{P}(T \ge t) = 1 - F_T(t).$$
In survival analysis, $\bar{F}_X(x)$ is called the survival function and denoted $S(x)$, while the term "reliability function" is common in engineering.
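In practice the ccdf is often computed directly rather than as $1 - F_X(x)$, since subtracting from 1 loses precision far in the tail. A sketch assuming SciPy, whose `sf` ("survival function") method implements the ccdf, with a hypothetical observed statistic `t_obs`:

```python
from scipy import stats

t_obs = 2.5  # hypothetical observed value of a test statistic T ~ N(0, 1)

# One-sided p-value: p = P(T >= t_obs) = 1 - F_T(t_obs)
print(stats.norm.sf(t_obs))     # about 0.0062

# sf avoids the catastrophic cancellation of 1 - cdf deep in the tail:
print(stats.norm.sf(10))        # about 7.6e-24
print(1 - stats.norm.cdf(10))   # 0.0 in double precision
```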
Z-table:
One of the most popular applications of the cumulative distribution function is the standard normal table, also called the unit normal table or Z table, which tabulates the values of the cumulative distribution function of the standard normal distribution. A Z-table is used not only for probabilities below a value (the original application of the cumulative distribution function), but also for probabilities above and/or between values of the standard normal distribution, and the method extends to any normal distribution, as shown in the sketch below.
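The three standard Z-table lookups (below a value, above a value, and between values) all reduce to arithmetic on the standard normal CDF, and any normal distribution reduces to the standard one via $z = (x - \mu)/\sigma$. A sketch assuming SciPy, with hypothetical values of $\mu$, $\sigma$, and $x$:

```python
from scipy.stats import norm

z = norm(loc=0, scale=1)  # standard normal

print(z.cdf(1.96))                  # P(Z <= 1.96)          ~ 0.975
print(1 - z.cdf(1.96))              # P(Z > 1.96)           ~ 0.025
print(z.cdf(1.96) - z.cdf(-1.96))   # P(-1.96 < Z <= 1.96)  ~ 0.95

# Extension to any normal distribution via standardization:
mu, sigma, x = 100, 15, 130
print(z.cdf((x - mu) / sigma))      # same as norm(mu, sigma).cdf(x)
```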
While the plot of a cumulative distribution often has an S-like shape, an alternative illustration is the folded cumulative distribution or mountain plot, which folds the top half of the graph over, thus using two scales, one for the upslope and another for the downslope. This form of illustration emphasises the median and dispersion (specifically, the mean absolute deviation from the median) of the distribution or of the empirical results.
The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form $\operatorname{P}(Z \le 1 + 2i)$ make no sense. However, expressions of the form $\operatorname{P}(\Re(Z) \le 1, \Im(Z) \le 3)$ make sense. Therefore, we define the cumulative distribution of a complex random variable via the joint distribution of its real and imaginary parts:
$$F_Z(z) = F_{\Re(Z), \Im(Z)}(\Re(z), \Im(z)) = \operatorname{P}(\Re(Z) \le \Re(z), \Im(Z) \le \Im(z)).$$
Generalization of the above yields
$$F_{\mathbf{Z}}(\mathbf{z}) = \operatorname{P}(\Re(Z_1) \le \Re(z_1), \Im(Z_1) \le \Im(z_1), \ldots, \Re(Z_N) \le \Re(z_N), \Im(Z_N) \le \Im(z_N))$$
as the definition for the CDF of a complex random vector $\mathbf{Z} = (Z_1, \ldots, Z_N)^{T}$.
The concept of the cumulative distribution function makes an explicit appearance in statistical analysis in two (similar) ways. Cumulative frequency analysis is the analysis of the frequency of occurrence of values of a phenomenon less than a reference value. The empirical distribution function is a formal direct estimate of the cumulative distribution function for which simple statistical properties can be derived and which can form the basis of various statistical hypothesis tests. Such tests can assess whether there is evidence against a sample of data having arisen from a given distribution, or evidence against two samples of data having arisen from the same (unknown) population distribution.
The Kolmogorov–Smirnov test is based on cumulative distribution functions and can be used to test whether two empirical distributions are different or whether an empirical distribution is different from an ideal distribution. The closely related Kuiper's test is useful if the domain of the distribution is cyclic, as in day of the week. For instance, Kuiper's test might be used to see if the number of tornadoes varies during the year or if sales of a product vary by day of the week or day of the month.
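Both uses are available in standard libraries; a sketch assuming SciPy, with simulated data, comparing a sample against an ideal distribution and two samples against each other:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(size=500)           # drawn from N(0, 1)
b = rng.uniform(-2, 2, size=500)   # drawn from a different distribution

# One-sample test: is `a` consistent with a standard normal?
print(stats.kstest(a, "norm"))     # large p-value expected

# Two-sample test: do `a` and `b` come from the same distribution?
print(stats.ks_2samp(a, b))        # tiny p-value expected
```
| https://en.wikipedia.org/wiki?curid=5793 |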
Central tendency
In statistics, a central tendency (or measure of central tendency) is a central or typical value for a probability distribution. It may also be called a center or location of the distribution. Colloquially, measures of central tendency are often called "averages." The term "central tendency" dates from the late 1920s.
The most common measures of central tendency are the arithmetic mean, the median, and the mode. A middle tendency can be calculated for either a finite set of values or for a theoretical distribution, such as the normal distribution. Occasionally authors use central tendency to denote "the tendency of quantitative data to cluster around some central value."
The central tendency of a distribution is typically contrasted with its "dispersion" or "variability"; dispersion and central tendency are among the most frequently characterized properties of distributions. Analysis may judge whether data has a strong or a weak central tendency based on its dispersion.
The following may be applied to one-dimensional data. Depending on the circumstances, it may be appropriate to transform the data before calculating a central tendency. Examples are squaring the values or taking logarithms. Whether a transformation is appropriate and what it should be, depend heavily on the data being analyzed.
Any of the above may be applied to each dimension of multi-dimensional data, but the results may not be invariant to rotations of the multi-dimensional space. In addition, there are measures of central tendency defined directly for multi-dimensional data, such as the geometric median and the centroid.
Several measures of central tendency can be characterized as solving a variational problem, in the sense of the calculus of variations, namely minimizing variation from the center. That is, given a measure of statistical dispersion, one asks for a measure of central tendency that minimizes variation: such that variation from the center is minimal among all choices of center. In a quip, "dispersion precedes location". These measures are initially defined in one dimension, but can be generalized to multiple dimensions. This center may or may not be unique. In the sense of "L""p" spaces, the correspondence is: the mode minimizes "L"0 variation, the median minimizes "L"1 variation, the mean minimizes "L"2 variation, and the midrange minimizes "L"∞ variation.
The associated functions are called "p"-norms: respectively 0-"norm", 1-norm, 2-norm, and ∞-norm. The function corresponding to the "L"0 space is not a norm, and is thus often referred to in quotes: 0-"norm".
In equations, for a given (finite) data set "X", thought of as a vector $\mathbf{x} = (x_1, \ldots, x_n)$, the dispersion about a point "c" is the "distance" from $\mathbf{x}$ to the constant vector $\mathbf{c} = (c, \ldots, c)$ in the "p"-norm (normalized by the number of points "n"):
$$f_p(c) = \left\| \mathbf{x} - \mathbf{c} \right\|_p = \left( \frac{1}{n} \sum_{i=1}^{n} \left| x_i - c \right|^p \right)^{1/p}.$$
For $p = 0$ and $p = \infty$ these functions are defined by taking limits, respectively as $p \to 0$ and $p \to \infty$. For $p = 0$ the limiting values are $0^0 = 0$ and $a^0 = 1$ for $a \ne 0$, so the difference becomes simply equality, and the 0-norm counts the number of "unequal" points. For $p = \infty$ the largest number dominates, and thus the ∞-norm is the maximum difference.
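The correspondence can be checked numerically by scanning candidate centers and minimizing each dispersion directly. A minimal sketch assuming NumPy, with a small made-up data set:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0, 3.0, 10.0])
cs = np.linspace(x.min(), x.max(), 10001)   # grid of candidate centers c

def argmin_center(dispersion):
    return cs[int(np.argmin([dispersion(c) for c in cs]))]

print(argmin_center(lambda c: np.mean(np.abs(x - c))))   # ~2.0: the median
print(argmin_center(lambda c: np.mean((x - c) ** 2)))    # ~3.6: the mean
print(argmin_center(lambda c: np.max(np.abs(x - c))))    # ~5.5: the midrange
```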
The mean ("L"2 center) and midrange ("L"∞ center) are unique (when they exist), while the median ("L"1 center) and mode ("L"0 center) are not in general unique. This can be understood in terms of convexity of the associated functions (coercive functions).
The 2-norm and ∞-norm are strictly convex, and thus (by convex optimization) the minimizer is unique (if it exists), and exists for bounded distributions. Thus standard deviation about the mean is lower than standard deviation about any other point, and the maximum deviation about the midrange is lower than the maximum deviation about any other point.
The 1-norm is not "strictly" convex, whereas strict convexity is needed to ensure uniqueness of the minimizer. Correspondingly, the median (in this sense of minimizing) is not in general unique, and in fact any point between the two central points of a discrete distribution minimizes average absolute deviation.
The 0-"norm" is not convex (hence not a norm). Correspondingly, the mode is not unique – for example, in a uniform distribution "any" point is the mode.
Instead of a single central point, one can ask for multiple points such that the variation from these points is minimized. This leads to cluster analysis, where each point in the data set is clustered with the nearest "center". Most commonly, using the 2-norm generalizes the mean to "k"-means clustering, while using the 1-norm generalizes the (geometric) median to "k"-medians clustering. Using the 0-norm simply generalizes the mode (most common value) to using the "k" most common values as centers.
Unlike the single-center statistics, this multi-center clustering cannot in general be computed in a closed-form expression, and instead must be computed or approximated by an iterative method; one general approach is expectation–maximization algorithms.
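A minimal sketch of the iterative idea, assuming NumPy and one-dimensional data: Lloyd's algorithm alternates assigning each point to its nearest center with re-estimating each center as the mean of its cluster (the 2-norm case, i.e. "k"-means; swapping the mean for the median would give "k"-medians). Empty clusters are not handled in this sketch.

```python
import numpy as np

def lloyd_kmeans(x, k, iters=100, seed=0):
    """Plain Lloyd's algorithm for 1-D data."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)  # initialize from the data
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        # Update step: each center moves to the mean of its cluster
        # (np.median here instead would give k-medians).
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return centers

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 200), rng.normal(8, 1, 200)])
print(np.sort(lloyd_kmeans(data, k=2)))  # roughly [0, 8]
```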
The notion of a "center" as minimizing variation can be generalized in information geometry as a distribution that minimizes divergence (a generalized distance) from a data set. The most common case is maximum likelihood estimation, where the maximum likelihood estimate (MLE) maximizes likelihood (minimizes expected surprisal), which can be interpreted geometrically by using entropy to measure variation: the MLE minimizes cross entropy (equivalently, relative entropy, Kullback–Leibler divergence).
A simple example of this is for the center of nominal data: instead of using the mode (the only single-valued "center"), one often uses the empirical measure (the frequency distribution divided by the sample size) as a "center". For example, given binary data, say heads or tails, if a data set consists of 2 heads and 1 tails, then the mode is "heads", but the empirical measure is 2/3 heads, 1/3 tails, which minimizes the cross-entropy (total surprisal) from the data set. This perspective is also used in regression analysis, where least squares finds the solution that minimizes the distances from it, and analogously in logistic regression, a maximum likelihood estimate minimizes the surprisal (information distance).
For unimodal distributions the following bounds are known and are sharp:
$$\frac{|\theta - \mu|}{\sigma} \le \sqrt{3}, \qquad \frac{|\nu - \mu|}{\sigma} \le \sqrt{0.6}, \qquad \frac{|\theta - \nu|}{\sigma} \le \sqrt{3},$$
where "μ" is the mean, "ν" is the median, "θ" is the mode, and "σ" is the standard deviation.
For every distribution, $|\nu - \mu| / \sigma \le 1$. | https://en.wikipedia.org/wiki?curid=5794 |
Celebrity
Celebrity refers to the fame and wide public recognition of an individual or a group, or, occasionally, a particular animal, as a direct result of the attention given to them by the mass media. Attaining celebrity status is often associated with having wealth (i.e. having "fame and fortune"). Involvement in sports or the entertainment industry is commonly associated with acquiring celebrity status, and political leaders often become celebrities as well. People may also become celebrities due to media attention on their lifestyle, wealth, or controversial actions, or even for their connection to a famous person.
In his 2020 book "Dead Famous: an unexpected history of celebrity", British historian Greg Jenner uses the definition:
Although his book is subtitled "from Bronze Age to Silver Screen", and despite the fact that "Until very recently, sociologists argued that "celebrity" was invented just over 100 years ago, in the flickering glimmer of early Hollywood" and the suggestion that some medieval saints might qualify, Jenner asserts that the earliest celebrities lived in the early 1700s, his first example being Henry Sacheverell.
Athletes in Ancient Greece were welcomed home as heroes, had songs and poems written in their honor, and received free food and gifts from those seeking celebrity endorsement. Ancient Rome similarly lauded actors and notorious gladiators, and Julius Caesar appeared on a coin in his own lifetime (a departure from the usual depiction of battles and divine lineage).
In the early 12th century, Thomas Becket became famous following his murder. He was promoted by the Christian Church as a martyr and images of him and scenes from his life became widespread in just a few years. In a pattern often repeated, what started as an explosion of popularity (often referred to with the suffix 'mania') turned into long-lasting fame: pilgrimages to Canterbury Cathedral where he was killed became instantly fashionable and the fascination with his life and death have inspired plays and films.
The cult of personality (particularly in the west) can be traced back to the Romantics in the 18th century, whose livelihood as artists and poets depended on the currency of their reputation. The establishment of cultural hot-spots became an important factor in the process of generating fame: for example, London and Paris in the 18th and 19th centuries. Newspapers started including gossip columns and certain clubs and events became places to be seen in order to receive publicity.
The movie industry spread around the globe in the first half of the 20th century, and with it came the now-familiar concept of the instantly recognizable faces of its superstars. Yet celebrity was not always tied to actors in films, especially when cinema was starting out as a medium. As Paul McDonald states in "The Star System: Hollywood's Production of Popular Identities", "in the first decade of the twentieth century, American film production companies withheld the names of film performers, despite requests from audiences, fearing that public recognition would drive performers to demand higher salaries." Public fascination went well beyond the on-screen exploits of movie stars, and their private lives became headline news: for example, in Hollywood the marriages of Elizabeth Taylor and in Bollywood the affairs of Raj Kapoor in the 1950s.
The second half of the century saw television and popular music bring new forms of celebrity, such as the rock star and the pop group, epitomised by Elvis Presley and the Beatles, respectively. John Lennon's highly controversial 1966 quote: "We're more popular than Jesus now," which he later insisted was not a boast, and that he was not in any way comparing himself with Christ, gives an insight into both the adulation and notoriety that fame can bring. Unlike movies, television created celebrities who were not primarily actors; for example, presenters, talk show hosts, and newsreaders. However, most of these are only famous within the regions reached by their particular broadcaster, and only a few such as Oprah Winfrey, Jerry Springer, or David Frost could be said to have broken through into wider stardom.
In the '60s and early '70s, the book publishing industry began to persuade major celebrities to put their names on autobiographies and other titles in a genre called celebrity publishing. In most cases, the book was not written by the celebrity but by a ghost-writer, but the celebrity would then be available for a book tour and appearances on talk shows.
People may become celebrities in a wide range of ways; from their professions, following appearances in the media, or by complete accident. The term "instant celebrity" describes someone who becomes a celebrity in a very short time. Someone who achieves a small amount of transient fame (through, say, hype or mass media) may become labeled a "B-grade celebrity". Often, the generalization extends to someone who falls short of mainstream or persistent fame but who seeks to extend or exploit it.
There are, of course, no guarantees of success for an individual to become a celebrity. Though celebrities come from many different working fields, most celebrities are typically associated with the fields of sports and entertainment, or a person may be a public figure who is commonly recognizable in mass media with commercial and critical acclaim.
Though glamour and wealth may certainly play a role for famous celebrities, most people in the sports and entertainment spheres, be it music, film, television, radio, modelling, comedy, literature, etc., live in obscurity, and only a small percentage achieve fame and fortune.
Outside of the sports and entertainment sphere, the top inventors and professionals such as doctors, lawyers, and scientists are unlikely to become celebrities even if they are enormously successful in their field, due to society's disinterest in science, invention, medicine, and non-fictional courtroom law. American microbiologist Maurice Hilleman is credited with saving more lives than any other medical scientist of the 20th century. After Hilleman's death, Ralph Nader wrote, "Yet almost no one knew about him, saw him on television, or read about him in newspapers or magazines. His anonymity, in comparison with Madonna, Michael Jackson, Jose Canseco, or an assortment of grade B actors, tells something about our society's and media's concepts of celebrity; much less of the heroic."
Many athletes who are unable to turn professional take a second job or even sometimes abandon their athletic aspirations in order to make ends meet. A small percentage of entertainers and athletes can make a decent living, but the vast majority will spend their careers toiling through hard work, determination, rejection, and frequent unemployment. For minor league to amateur athletes, earnings are usually on the lower end of the pay scale. Many of them take second jobs on the side or even venture into other occupations within the field of sports such as coaching, general management, refereeing, or recruiting and scouting up-and-coming athletes.
The Screen Actors Guild, a union representing actors and actresses throughout Hollywood reports that the average television and film actor earns less than US$50,000 annually; the median hourly wage for actors was $18.80 in May 2015. Actors sometimes alternate between theater, television, and film or even branch into other occupations within the entertainment industry such as becoming a singer, comedian, producer, or a television host in order to be monetarily diversified, as doing one gig pays comparatively very little. For instance, David Letterman is well known for branching into late night television as a talk show host while honing his skills as a stand-up comedian, Barbra Streisand ventured into acting while operating as a singer, and Clint Eastwood achieved even greater fame in Hollywood as a film director and producer than for his acting credentials.
According to American entertainment magnate Master P, entertainers and professional athletes make up less than 1% of all millionaires in the entire world. Less than 1% of all runway models are known to make more than US$1,000 for every fashion showcase. According to the US Bureau of Labor Statistics, the median wage for commercial and print models was only $11.22 per hour in 2006, and modeling was also listed as one of the top ten worst jobs in the United States.
"Forbes" Magazine releases an annual "Forbes" Celebrity 100 list of the highest-paid celebrities in the world. The total earnings for all top celebrity 100 earners totaled $4.5 billion in 2010 alone.
For instance, Forbes ranked media mogul and talk show host Oprah Winfrey as the top earner in "Forbes" magazine's annual ranking of the most powerful celebrities, with earnings of $290 million in the past year. Forbes cites that Lady Gaga reportedly earned over $90 million in 2010. In 2011, golfer Tiger Woods was one of the highest-earning celebrity athletes, with an income of $74 million, and he is consistently ranked among the highest-paid athletes in the world. In 2013, Madonna was ranked as the fifth most powerful and the highest-earning celebrity of the year with earnings of $125 million. She has consistently been among the most powerful and highest-earning celebrities in the world, occupying third place in the Forbes Celebrity 100 of 2009 with $110 million of earnings, and taking tenth place in the 2011 edition of the list with annual earnings of $58 million.
Celebrity endorsements have proven very successful around the world where, due to increasing consumerism, an individual is often considered to own a status symbol when they purchase a celebrity-endorsed product. Although it has become commonplace for celebrities to attach their names to endorsements purely for quick money, some celebrities have gone beyond merely lending their names and have put their entrepreneurial spirit to work, involving themselves in the business aspects of entertainment and building their own business brands beyond their traditional salaried activities. Along with investing their salaried wages into growing business endeavors, several celebrities have become innovative business leaders in their respective industries, gaining the admiration of their peers and contributing to the country's economy.
Numerous celebrities have ventured into becoming business moguls and established themselves as entrepreneurs, idolizing many well known American business leaders such as Bill Gates and Warren Buffett. For instance, basketball legend Michael Jordan became an active entrepreneur involved with many sports-related ventures including investing a minority stake in the Charlotte Bobcats, Paul Newman started his own salad dressing business after leaving behind a distinguished acting career, and rap musician Birdman started his own record label, clothing line, and an oil business while maintaining a career as a rap artist. Brazilian football legend and World Cup winner Ronaldo became the majority owner of La Liga club Real Valladolid in 2018. Other celebrities such as Tyler Perry, George Lucas, and Steven Spielberg have become successful entrepreneurs through starting their own film production companies and running their own movie studios beyond their traditional activities of screenwriting, directing, animating, producing, and acting.
Various examples of celebrity turned entrepreneurs included in the table below are:
Tabloid magazines and talk TV shows bestow a great deal of attention on celebrities. To stay in the public eye and build wealth in addition to their salaried labor, numerous celebrities have participated in and branched into various business ventures and endorsements, including: animation, publishing, fashion design, cosmetics, consumer electronics, household items and appliances, cigarettes, soft drinks and alcoholic beverages, hair care, hairdressing, jewelry design, fast food, credit cards, video games, writing, and toys.
In addition to various endorsements, some celebrities have been involved in business and investment-related ventures, including: baby and toddler related items; sports team ownership; fashion retailing; establishments such as restaurants, cafes, hotels, and casinos; movie theaters; advertising and event planning; management-related ventures such as sports management, financial services, model management, and talent management; record labels; film production; television production; publishing, such as book and music publishing; massage therapy; salons; health and fitness; and real estate.
Although some celebrities have achieved additional financial success from various business ventures, the vast majority of celebrities are not successful businesspeople and still rely on salaried wages to earn a living. Most businesses and investments are well known to have a 90 to 95 percent failure rate within the first five years of operation, and not all celebrities succeed with their businesses and other related side ventures. Some celebrities have gone broke or filed for bankruptcy as a result of dabbling in such side businesses or endorsements. One might question the validity of such comparisons, however, since celebrities are already well known, have mass appeal, and are well exposed to the general public. The average entrepreneur, who is not well known to the general public, does not have the same marketing flexibility and status that most celebrities enjoy. Therefore, compared to the average person who starts a business, celebrities have the cards and odds stacked in their favor: they hold an unfair advantage in exposing their business ventures and endorsements and can easily capture a more significant share of the market than the average entrepreneur.
Celebrities often have fame comparable to royalty. As a result, there is a strong public curiosity about their private affairs. The release of Kim Kardashian's sex tape with rapper Ray J in 2003 brought her to a new level of fame, leading to magazine covers, book deals, and reality TV series.
Celebrities may be resented for their accolades, and the public may have a love/hate relationship with celebrities. Due to the high visibility of celebrities' private lives, their successes and shortcomings are often made very public. Celebrities are alternately portrayed as glowing examples of perfection, when they garner awards, or as decadent or immoral if they become associated with a scandal. When seen in a positive light, celebrities are frequently portrayed as possessing skills and abilities beyond average people; for example, celebrity actors are routinely celebrated for acquiring new skills necessary for filming a role within a very brief time, and to a level that amazes the professionals who train them. Similarly, some celebrities with very little formal education can sometimes be portrayed as experts on complicated issues. Some celebrities have been very vocal about their political views. For example, Matt Damon expressed his displeasure with 2008 US vice presidential nominee Sarah Palin, as well as with the 2011 United States debt-ceiling crisis.
Famous for being famous, in popular culture terminology, refers to someone who attains celebrity status for no particular identifiable reason, or who achieves fame through association with a celebrity. The term is a pejorative, suggesting that the individual has no particular talents or abilities. Even when their fame arises from a particular talent or action on their part, the term will sometimes still apply if their fame is perceived as disproportionate to what they earned through their own talent or work.
The coinages "famesque" and "celebutante" are of similar pejorative gist.
Also known as being "internet famous", contemporary fame does not always involve a physical red carpet.
A report by BBC highlighted a longtime trend of Asian internet celebrities such as Chinese celebrity Wang Hong (birth name Ling Ling). According to BBC, there are two kinds of online celebrities in China—those who create original content, such as Papi Jiang, who is regularly censored by Chinese authorities for cussing in her videos, and those such as Wang Hong and Zhang Dayi, who fall under the second category, as they have clothing and cosmetics businesses on Taobao, China's equivalent of Amazon.
Most high-profile celebrities participate in social networking and photo or video hosting platforms such as YouTube, Twitter, Facebook, Instagram, and Snapchat. Social networking sites allow celebrities to communicate directly with their fans, removing the middle-man of traditional media. Social media humanizes celebrities in a way that arouses public fascination, as evident by the success of magazines such as "Us Weekly" and "People Weekly". Celebrity blogging has also spawned stars such as Perez Hilton, who is well known not only for blogging but also for outing celebrities.
Social media and the rise of the smartphone have changed how celebrities are treated and how people gain the platform of fame. Not everything is as concealed as it was in old Hollywood, because now everything is put out on the internet by fans or even the celebrities themselves. Websites like Twitter, Facebook, Instagram, and YouTube allow people to become celebrities overnight. For example, Justin Bieber got his start on YouTube by posting videos of himself singing and was discovered. All of his fans had direct access to his content and were able to interact with him on several social media platforms. Social media has substantially changed what it means to be a celebrity: Instagram and YouTube allow regular people to become rich and famous all from inside their home, and fans can connect with their favorite celebrity without ever meeting them in person. Because everything is shared on social media, it is harder for celebrities to live private lives.
Social media sites have also contributed to the fame of some celebrities, such as Tila Tequila who became known through MySpace.
Another example of a celebrity is a family that has notable ancestors or is known for its wealth. In some cases, a well-known family is associated with a particular field. For example, the Kennedy family is associated with US politics; The House of Windsor with royalty; The Hilton and Rothschild families with business; the Jackson family with popular music; and the Osbourne, Chaplin, Kardashian, Baldwin, and Barrymore families with television and film.
Access to celebrities is strictly controlled by their entourage of staff which includes managers, publicists, agents, personal assistants, and bodyguards. Even journalists find it difficult to access celebrities for interviews. Writer and actor Michael Musto said, "You have to go through many hoops just to talk to a major celebrity. You have to get past three different sets of publicists: the publicist for the event, the publicist for the movie, and then the celebrity's personal publicist. They all have to approve you."
Celebrities often hire one or more bodyguards (or close protection officer) to protect themselves and their families from threats ranging from the mundane (intrusive paparazzi photographers or autograph-seeking fans) to serious (assault, kidnapping, assassination, or stalking). The bodyguard travels with the celebrity during professional activities (movie shoots or concerts) and personal activities such as recreation and errands.
Celebrities also typically have security staff at their home, to protect them from similar threats.
Andy Warhol famously coined the phrase "15 minutes of fame" in reference to short-lived publicity. Certain "15 minutes of fame" celebrities can be average people seen with an A-list celebrity, who are sometimes noticed on entertainment news channels such as E! News. These persons are ordinary people becoming celebrities, often based on the ridiculous things they do. "In fact, many reality show contestants fall into this category: the only thing that qualifies them to be on TV is that they're real."
John Cleese said being famous offers some advantages such as financial wealth and easier access to things that are more difficult for non-famous people to access, such as the ability to more easily meet other famous or powerful people, but that being famous also often comes with the disadvantage of creating the conditions in which the celebrity finds themselves acting, at least temporarily (although sometimes over extended periods of time), in a superficial, inauthentic fashion.
Common threats such as stalking have spawned celebrity worship syndrome where a person becomes overly involved with the details of a celebrity's personal life. Psychologists have indicated that though many people obsess over glamorous film, television, sport and music stars, the disparity in salaries in society seems to value professional athletes and entertainment industry-based professionals. One study found that singers, musicians, actors and athletes die younger on average than writers, composers, academics, politicians and businesspeople, with a greater incidence of cancer and especially lung cancer. However, it was remarked that the reasons for this remained unclear, with theories including innate tendencies towards risk-taking as well as the pressure or opportunities of particular types of fame.
Furthermore, some have said fame might have negative psychological effects, and may lead to increasingly selfish tendencies and psychopathy. An academic study on the subject said that fame has an addictive quality to it. When a celebrity's fame recedes over time, the celebrity may find it difficult to adjust psychologically.
Recently, there has been more attention toward the impact celebrities have on health decisions of the population at large. It is believed that the public will follow celebrities' health advice to some extent. This can have positive impacts when the celebrities give solid, evidence-informed health advice, however, it can also have detrimental effects if the health advice is not accurate enough. | https://en.wikipedia.org/wiki?curid=5796 |
Cluster sampling
Cluster sampling is a sampling plan used when mutually homogeneous yet internally heterogeneous groupings are evident in a statistical population. It is often used in marketing research. In this sampling plan, the total population is divided into these groups (known as clusters) and a simple random sample of the groups is selected. The elements in each cluster are then sampled. If all elements in each sampled cluster are sampled, then this is referred to as a "one-stage" cluster sampling plan. If a simple random subsample of elements is selected within each of these groups, this is referred to as a "two-stage" cluster sampling plan. A common motivation for cluster sampling is to reduce the total number of interviews and costs given the desired accuracy. For a fixed sample size, the expected random error is smaller when most of the variation in the population is present internally within the groups, and not between the groups.
The population within a cluster should ideally be as heterogeneous as possible, but there should be homogeneity between clusters. Each cluster should be a small-scale representation of the total population. The clusters should be mutually exclusive and collectively exhaustive. A random sampling technique is then used on any relevant clusters to choose which clusters to include in the study. In single-stage cluster sampling, all the elements from each of the selected clusters are sampled. In two-stage cluster sampling, a random sampling technique is applied to the elements from each of the selected clusters.
The main difference between cluster sampling and stratified sampling is that in cluster sampling the cluster is treated as the sampling unit so sampling is done on a population of clusters (at least in the first stage). In stratified sampling, the sampling is done on elements within each stratum. In stratified sampling, a random sample is drawn from each of the strata, whereas in cluster sampling only the selected clusters are sampled. A common motivation of cluster sampling is to reduce costs by increasing sampling efficiency. This contrasts with stratified sampling where the motivation is to increase precision.
There is also multistage cluster sampling, where at least two stages are taken in selecting elements from clusters.
Without modifying the estimated parameter, cluster sampling is unbiased when the clusters are approximately the same size. In this case, the parameter is computed by combining all the selected clusters. When the clusters are of different sizes there are several options:
One method is to sample clusters and then survey all elements in that cluster. Another method is a two-stage method of sampling a fixed proportion of units (be it 5% or 50%, or another number, depending on cost considerations) from within each of the selected clusters. Relying on the sample drawn from these options will yield an unbiased estimator. However, the sample size is no longer fixed upfront. This leads to a more complicated formula for the standard error of the estimator, as well as issues with the optics of the study plan (since the power analysis and the cost estimations often relate to a specific sample size).
A third possible solution is to use probability proportionate to size sampling. In this sampling plan, the probability of selecting a cluster is proportional to its size, so that a large cluster has a greater probability of selection than a small cluster. The advantage here is that when clusters are selected with probability proportionate to size, the same number of interviews should be carried out in each sampled cluster so that each unit sampled has the same probability of selection.
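A sketch of the first stage of such a plan, assuming NumPy; the `sizes` array of cluster sizes is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

sizes = np.array([120, 45, 300, 80, 55, 210])  # hypothetical cluster sizes
p = sizes / sizes.sum()                        # selection probability proportional to size

# Draw 3 clusters with probability proportionate to size (with replacement
# here for simplicity; without-replacement PPS schemes are more involved).
chosen = rng.choice(len(sizes), size=3, replace=True, p=p)
print(chosen)

# A fixed number of interviews m is then carried out in each sampled cluster,
# giving every unit in the population roughly the same selection probability.
m = 20
```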
An example of cluster sampling is area sampling or geographical cluster sampling. Each cluster is a geographical area. Because a geographically dispersed population can be expensive to survey, greater economy than simple random sampling can be achieved by grouping several respondents within a local area into a cluster. It is usually necessary to increase the total sample size to achieve equivalent precision in the estimators, but cost savings may make such an increase in sample size feasible.
Cluster sampling is used to estimate high mortalities in cases such as wars, famines and natural disasters.
A major use arises when a sampling frame of all elements is not available; in that case we can resort only to cluster sampling.
Two-stage cluster sampling, a simple case of multistage sampling, is obtained by selecting cluster samples in the first stage and then selecting a sample of elements from every sampled cluster. Consider a population of "N" clusters in total. In the first stage, "n" clusters are selected using ordinary cluster sampling method. In the second stage, simple random sampling is usually used. It is used separately in every cluster and the numbers of elements selected from different clusters are not necessarily equal. The total number of clusters "N", number of clusters selected "n", and numbers of elements from selected clusters need to be pre-determined by the survey designer. Two-stage cluster sampling aims at minimizing survey costs and at the same time controlling the uncertainty related to estimates of interest. This method can be used in health and social sciences. For instance, researchers used two-stage cluster sampling to generate a representative sample of the Iraqi population to conduct mortality surveys. Sampling in this method can be quicker and more reliable than other methods, which is why this method is now used frequently.
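A minimal two-stage sketch assuming NumPy, with a hypothetical synthetic population of clusters: the first stage draws a simple random sample of clusters, and the second stage draws a simple random subsample of elements within each selected cluster:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical population: 50 clusters, each with 20 to 99 elements.
clusters = [rng.normal(loc=i % 5, size=rng.integers(20, 100)) for i in range(50)]

n = 10  # clusters drawn in the first stage
m = 15  # elements drawn from each selected cluster in the second stage

first_stage = rng.choice(len(clusters), size=n, replace=False)
sample = np.concatenate(
    [rng.choice(clusters[i], size=m, replace=False) for i in first_stage]
)
print(sample.shape)  # (n * m,) = (150,)
```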
Cluster sampling methods can lead to significant bias when working with a small number of clusters. For instance, it can be necessary to cluster at the state or city level, units that may be small and fixed in number. Microeconometrics methods for panel data often use short panels, which is analogous to having few observations per cluster and many clusters. The small cluster problem can be viewed as an incidental parameter problem. While the point estimates can be reasonably precisely estimated if the number of observations per cluster is sufficiently high, we need the number of clusters $G \to \infty$ for the asymptotics to kick in. If the number of clusters is low, the estimated covariance matrix can be downward biased.
A small number of clusters is a risk when there is serial correlation or when there is intraclass correlation, as in the Moulton context. When there are few clusters, we tend to underestimate the serial correlation across observations when a random shock occurs, or the intraclass correlation in a Moulton setting. Several studies have highlighted the consequences of serial correlation and the small-cluster problem.
In the framework of the Moulton factor, an intuitive explanation of the small cluster problem can be derived from the formula for the Moulton factor. Assume for simplicity that the number of observations per cluster is fixed at "n". Below, $V_c(\hat{\beta})$ stands for the covariance matrix adjusted for clustering, $V(\hat{\beta})$ stands for the covariance matrix not adjusted for clustering, and $\rho$ stands for the intraclass correlation:
$$\frac{V_c(\hat{\beta})}{V(\hat{\beta})} = 1 + (n - 1)\rho.$$
The ratio on the left-hand side provides an indication of how much the unadjusted scenario overestimates the precision. Therefore, a high number means a strong downward bias of the estimated covariance matrix. The small cluster problem can be interpreted as a large "n": when the total amount of data is fixed and the number of clusters is low, the number of observations within each cluster can be high. It follows that inference when the number of clusters is small will not have correct coverage.
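A quick numeric illustration of the inflation factor reconstructed above: even modest intraclass correlation inflates the true variance severely when clusters are large, which is exactly what the unadjusted covariance matrix misses.

```python
# Moulton variance inflation factor: V_clustered / V_unadjusted = 1 + (n - 1) * rho
def moulton_factor(n, rho):
    return 1 + (n - 1) * rho

for n in (10, 100, 1000):
    f = moulton_factor(n, rho=0.05)
    print(n, f, f ** 0.5)  # cluster size, variance ratio, standard-error ratio
```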
Several solutions for the small cluster problem have been proposed. One can use a bias-corrected cluster-robust variance matrix, make T-distribution adjustments, or use bootstrap methods with asymptotic refinements, such as the percentile-t or wild bootstrap, that can lead to improved finite sample inference. Cameron, Gelbach and Miller (2008) provide microsimulations for different methods and find that the wild bootstrap performs well in the face of a small number of clusters. | https://en.wikipedia.org/wiki?curid=5797 |
Charles Baudelaire
Charles Pierre Baudelaire (9 April 1821 – 31 August 1867) was a French poet who also produced notable work as an essayist, art critic, and one of the first translators of Edgar Allan Poe.
His most famous work, a book of lyric poetry titled "Les Fleurs du mal" ("The Flowers of Evil"), expresses the changing nature of beauty in the rapidly industrializing Paris during the mid-19th century. Baudelaire's highly original style of prose-poetry influenced a whole generation of poets including Paul Verlaine, Arthur Rimbaud and Stéphane Mallarmé, among many others. He is credited with coining the term "modernity" ("modernité") to designate the fleeting, ephemeral experience of life in an urban metropolis, and the responsibility of artistic expression to capture that experience.
Baudelaire was born in Paris, France, on 9 April 1821, and baptized two months later at Saint-Sulpice Roman Catholic Church. His father, Joseph-François Baudelaire (1759-1827), a senior civil servant and amateur artist, was 34 years older than Baudelaire's mother, Caroline (née Dufaÿs) (1794-1871). Joseph-François died during Baudelaire's childhood, at rue Hautefeuille, Paris, on February 10, 1827. The following year, Caroline married Lieutenant Colonel Jacques Aupick, who later became a French ambassador to various noble courts. Baudelaire's biographers have often seen this as a crucial moment, considering that finding himself no longer the sole focus of his mother's affection left him with a trauma, which goes some way to explaining the excesses later apparent in his life. He stated in a letter to her that, "There was in my childhood a period of passionate love for you." Baudelaire regularly begged his mother for money throughout his career, often promising that a lucrative publishing contract or journalistic commission was just around the corner.
Baudelaire was educated in Lyon, where he boarded. At 14, he was described by a classmate as "much more refined and distinguished than any of our fellow pupils...we are bound to one another...by shared tastes and sympathies, the precocious love of fine works of literature." Baudelaire was erratic in his studies, at times diligent, at other times prone to "idleness". Later, he attended the Lycée Louis-le-Grand in Paris, studying law, a popular course for those not yet decided on any particular career. He began to frequent prostitutes and may have contracted gonorrhea and syphilis during this period. He also began to run up debts, mostly for clothes. Upon gaining his degree in 1839, he told his brother "I don't feel I have a vocation for anything." His stepfather had in mind a career in law or diplomacy, but instead Baudelaire decided to embark upon a literary career. His mother later recalled: "Oh, what grief! If Charles had let himself be guided by his stepfather, his career would have been very different...He would not have left a name in literature, it is true, but we should have been happier, all three of us."
His stepfather sent him on a voyage to Calcutta, India in 1841 in the hope of ending his dissolute habits. The trip provided strong impressions of the sea, sailing, and exotic ports, that he later employed in his poetry. (Baudelaire later exaggerated his aborted trip to create a legend about his youthful travels and experiences, including "riding on elephants".) On returning to the taverns of Paris, he began to compose some of the poems of "Les Fleurs du Mal". At 21, he received a sizable inheritance but squandered much of it within a few years. His family obtained a decree to place his property in trust, which he resented bitterly, at one point arguing that allowing him to fail financially would have been the one sure way of teaching him to keep his finances in order.
Baudelaire became known in artistic circles as a dandy and free-spender, going through much of his inheritance and allowance in a short period of time. During this time, Jeanne Duval became his mistress. She was rejected by his family. His mother thought Duval a "Black Venus" who "tortured him in every way" and drained him of money at every opportunity. Baudelaire made a suicide attempt during this period.
He took part in the Revolutions of 1848 and wrote for a revolutionary newspaper. However, his interest in politics was passing, as he was later to note in his journals.
In the early 1850s, Baudelaire struggled with poor health, pressing debts, and irregular literary output. He often moved from one lodging to another to escape creditors. He undertook many projects that he was unable to complete, though he did finish translations of stories by Edgar Allan Poe.
Upon the death of his stepfather in 1857, Baudelaire received no mention in the will but he was heartened nonetheless that the division with his mother might now be mended. At 36, he wrote her: "believe that I belong to you absolutely, and that I belong only to you." His mother died on 16 August 1871, outliving her son by almost four years.
His first published work, under the pseudonym Baudelaire Dufaÿs, was his art review "Salon of 1845", which attracted immediate attention for its boldness. Many of his critical opinions were novel in their time, including his championing of Delacroix, and some of his views seem remarkably in tune with the future theories of the Impressionist painters.
In 1846, Baudelaire wrote his second Salon review, gaining additional credibility as an advocate and critic of Romanticism. His continued support of Delacroix as the foremost Romantic artist gained widespread notice. The following year Baudelaire's novella "La Fanfarlo" was published.
Baudelaire was a slow and very attentive worker. However, he was often sidetracked by indolence, emotional distress and illness, and it was not until 1857 that he published "Les Fleurs du mal" ("The Flowers of Evil"), his first and most famous volume of poems, brought out by his friend the publisher Auguste Poulet-Malassis. Some of these poems had already appeared in the "Revue des deux mondes" ("Review of Two Worlds") in 1855, and some had appeared as "fugitive verse" in various French magazines during the previous decade.
The poems found a small, yet appreciative audience. However, greater public attention was given to their subject matter. The effect on fellow artists was, as Théodore de Banville stated, "immense, prodigious, unexpected, mingled with admiration and with some indefinable anxious fear". Gustave Flaubert, recently attacked in a similar fashion for "Madame Bovary" (and acquitted), was impressed and wrote to Baudelaire: "You have found a way to rejuvenate Romanticism...You are as unyielding as marble, and as penetrating as an English mist."
The principal themes of sex and death were considered scandalous for the period. He also touched on lesbianism, sacred and profane love, metamorphosis, melancholy, the corruption of the city, lost innocence, the oppressiveness of living, and wine. Notable in some poems is Baudelaire's use of the imagery of smell and fragrance, employed to evoke feelings of nostalgia and past intimacy.
The book, however, quickly became a byword for unwholesomeness among mainstream critics of the day. Some critics called a few of the poems "masterpieces of passion, art and poetry," but other poems were deemed to merit no less than legal action to suppress them. J. Habas led the charge against Baudelaire, writing in "Le Figaro": "Everything in it which is not hideous is incomprehensible, everything one understands is putrid." Baudelaire responded to the outcry in a prophetic letter to his mother:
"You know that I have always considered that literature and the arts pursue an aim independent of morality. Beauty of conception and style is enough for me. But this book, whose title ("Fleurs du mal") says everything, is clad, as you will see, in a cold and sinister beauty. It was created with rage and patience. Besides, the proof of its positive worth is in all the ill that they speak of it. The book enrages people. Moreover, since I was terrified myself of the horror that I should inspire, I cut out a third from the proofs. They deny me everything, the spirit of invention and even the knowledge of the French language. I don't care a rap about all these imbeciles, and I know that this book, with its virtues and its faults, will make its way in the memory of the lettered public, beside the best poems of V. Hugo, Th. Gautier and even Byron."
Baudelaire, his publisher and the printer were successfully prosecuted for creating an offense against public morals. They were fined, but Baudelaire was not imprisoned. Six of the poems were suppressed, but printed later as "Les Épaves" ("The Wrecks") (Brussels, 1866). Another edition of "Les Fleurs du mal", without these poems, but with considerable additions, appeared in 1861. Many notables rallied behind Baudelaire and condemned the sentence. Victor Hugo wrote to him: "Your "fleurs du mal" shine and dazzle like stars...I applaud your vigorous spirit with all my might." Baudelaire did not appeal the judgment, but his fine was reduced. Nearly 100 years later, on 11 May 1949, Baudelaire was vindicated, the judgment officially reversed, and the six banned poems reinstated in France.
In the poem "Au lecteur" ("To the Reader") that prefaces "Les Fleurs du mal", Baudelaire accuses his readers of hypocrisy and of being as guilty of sins and lies as the poet:
Baudelaire next worked on a translation and adaptation of Thomas De Quincey's "Confessions of an English Opium Eater". Other works in the years that followed included "Petits Poèmes en prose" ("Small Prose Poems"); a series of art reviews published in "Le Pays" on the "Exposition universelle" (World's Fair); studies on Gustave Flaubert (in "L'Artiste", 18 October 1857) and on Théophile Gautier ("Revue contemporaine", September 1858); various articles contributed to Eugène Crépet's "Poètes français" ("French Poets"); "Les Paradis artificiels: opium et haschisch" ("Artificial Paradises: Opium and Hashish") (1860); and "Un Dernier Chapitre de l'histoire des oeuvres de Balzac" ("A Final Chapter of the History of the Works of Balzac") (1880), originally an article "Comment on paye ses dettes quand on a du génie" ("How One Pays One's Debts When One Has Genius"), in which his criticism turns against his friends Honoré de Balzac, Théophile Gautier, and Gérard de Nerval.
By 1859, his illnesses, his long-term use of laudanum, his life of stress, and his poverty had taken a toll and Baudelaire had aged noticeably. But at last, his mother relented and agreed to let him live with her for a while at Honfleur. Baudelaire was productive and at peace in the seaside town, his poem "Le Voyage" being one example of his efforts during that time. In 1860, he became an ardent supporter of Richard Wagner.
His financial difficulties increased again, however, particularly after his publisher Poulet Malassis went bankrupt in 1861. In 1864, he left Paris for Belgium, partly in the hope of selling the rights to his works and to give lectures. His long-standing relationship with Jeanne Duval continued on-and-off, and he helped her to the end of his life. Baudelaire's relationships with actress Marie Daubrun and with courtesan Apollonie Sabatier, though the source of much inspiration, never produced any lasting satisfaction. He smoked opium, and in Brussels he began to drink to excess. Baudelaire suffered a massive stroke in 1866 and paralysis followed. After more than a year of aphasia, he received the last rites of the Catholic Church. The last two years of his life were spent in a semi-paralyzed state in "maisons de santé" in Brussels and in Paris, where he died on 31 August 1867. Baudelaire is buried in the Cimetière du Montparnasse, Paris.
Many of Baudelaire's works were published posthumously. After his death, his mother paid off his substantial debts, and she found some comfort in Baudelaire's emerging fame. "I see that my son, for all his faults, has his place in literature." She lived another four years.
Baudelaire is one of the major innovators in French literature. His poetry is influenced by the French romantic poets of the earlier 19th century, although its attention to the formal features of verse connects it more closely to the work of the contemporary "Parnassians". As for theme and tone, his works reject the belief in the supremacy of nature and the fundamental goodness of man as typically espoused by the romantics and expressed by them in a rhetorical, effusive and public voice. In its place they offer a new urban sensibility: an awareness of individual moral complexity, an interest in vice (linked with decadence), refined sensual and aesthetic pleasures, and the use of urban subject matter (the city, the crowd, individual passers-by), all expressed in highly ordered verse, sometimes through a cynical and ironic voice. Formally, the use of sound to create atmosphere, and of "symbols" (images that take on an expanded function within the poem), betrays a move towards considering the poem as a self-referential object, an idea further developed by the Symbolists Verlaine and Mallarmé, who acknowledged Baudelaire as a pioneer in this regard.
Beyond his innovations in versification and the theories of symbolism and "correspondences", an awareness of which is essential to any appreciation of the literary value of his work, aspects of his work that regularly receive much critical discussion include the role of women, the theological direction of his work and his alleged advocacy of "satanism", his experience of drug-induced states of mind, the figure of the dandy, his stance regarding democracy and its implications for the individual, his response to the spiritual uncertainties of the time, his criticisms of the bourgeois, and his advocacy of modern music and painting (e.g., Wagner, Delacroix). He made Paris the subject of modern poetry. He brought the city's details to life in the eyes and hearts of his readers.
Baudelaire was an active participant in the artistic life of his times. As critic and essayist, he wrote extensively and perceptively about the luminaries and themes of French culture. He was frank with friends and enemies, rarely took the diplomatic approach, and sometimes responded with verbal violence, which often undermined his cause. His associations were numerous, including Gustave Courbet, Honoré Daumier, Félicien Rops, Franz Liszt, Champfleury, Victor Hugo, Gustave Flaubert, and Balzac.
In 1847, Baudelaire became acquainted with the works of Poe, in which he found tales and poems that had, he claimed, long existed in his own brain but never taken shape. Baudelaire saw in Poe a precursor and tried to be his French contemporary counterpart. From this time until 1865, he was largely occupied with translating Poe's works; his translations were widely praised. Baudelaire was not the first French translator of Poe, but his "scrupulous translations" were considered among the best. These were published as "Histoires extraordinaires" ("Extraordinary Stories") (1856), "Nouvelles histoires extraordinaires" ("New Extraordinary Stories") (1857), "Aventures d'Arthur Gordon Pym", and "Histoires grotesques et sérieuses" ("Grotesque and Serious Stories") (1865). Two essays on Poe are to be found in his "Oeuvres complètes" ("Complete Works") (vols. v. and vi.).
A strong supporter of the Romantic painter Delacroix, Baudelaire called him "a poet in painting". Baudelaire also absorbed much of Delacroix's aesthetic ideas as expressed in his journals. As Baudelaire elaborated in his "Salon of 1846": "As one contemplates his series of pictures, one seems to be attending the celebration of some grievous mystery...This grave and lofty melancholy shines with a dull light...plaintive and profound like a melody by Weber." Delacroix, though appreciative, kept his distance from Baudelaire, particularly after the scandal of "Les Fleurs du mal". In private correspondence, Delacroix stated that Baudelaire "really gets on my nerves" and he expressed his unhappiness with Baudelaire's persistent comments about "melancholy" and "feverishness".
Baudelaire had no formal musical training, and knew little of composers beyond Beethoven and Weber. Weber was in some ways Wagner's precursor, using the leitmotif and conceiving the idea of the "total art work" ("Gesamtkunstwerk"), both of which gained Baudelaire's admiration. Before even hearing Wagner's music, Baudelaire studied reviews and essays about him, and formulated his impressions. Later, Baudelaire put them into his non-technical analysis of Wagner, which was highly regarded, particularly his essay "Richard Wagner et Tannhäuser à Paris". Baudelaire's reaction to music was passionate and psychological. "Music engulfs (possesses) me like the sea." After attending three Wagner concerts in Paris in 1860, Baudelaire wrote to the composer: "I had a feeling of pride and joy in understanding, in being possessed, in being overwhelmed, a truly sensual pleasure like that of rising in the air." Baudelaire's writings contributed to the elevation of Wagner and to the cult of Wagnerism that swept Europe in the following decades.
Gautier, writer and poet, earned Baudelaire's respect for his perfection of form and his mastery of language, though Baudelaire thought he lacked deeper emotion and spirituality. Both strove to express the artist's inner vision, a stance Heinrich Heine had earlier articulated: "In artistic matters, I am a supernaturalist. I believe that the artist can not find all his forms in nature, but that the most remarkable are revealed to him in his soul." Gautier's frequent meditations on death and the horror of life are themes which influenced Baudelaire's writings. In gratitude for their friendship and commonality of vision, Baudelaire dedicated "Les Fleurs du mal" to Gautier.
Manet and Baudelaire became constant companions from around 1855. In the early 1860s, Baudelaire accompanied Manet on daily sketching trips and often met him socially. Manet also lent Baudelaire money and looked after his affairs, particularly when Baudelaire went to Belgium. Baudelaire encouraged Manet to strike out on his own path and not succumb to criticism. "Manet has great talent, a talent which will stand the test of time. But he has a weak character. He seems to me crushed and stunned by shock." In his painting "Music in the Tuileries", Manet includes portraits of his friends Théophile Gautier, Jacques Offenbach, and Baudelaire. While it is difficult to differentiate who influenced whom, both Manet and Baudelaire discussed and expressed some common themes through their respective arts. Baudelaire praised the modernity of Manet's subject matter: "almost all our originality comes from the stamp that 'time' imprints upon our feelings." When Manet's famous "Olympia" (1863), a portrait of a nude prostitute, provoked a scandal for its blatant realism mixed with an imitation of Renaissance motifs, Baudelaire worked privately to support his friend, though he offered no public defense (he was, however, ill at the time). When Baudelaire returned from Belgium after his stroke, Manet and his wife were frequent visitors at the nursing home and she played passages from Wagner for Baudelaire on the piano.
Nadar (Félix Tournachon) was a noted caricaturist, scientist and important early photographer. Baudelaire admired Nadar, one of his close friends, and wrote: "Nadar is the most amazing manifestation of vitality." They moved in similar circles and Baudelaire made many social connections through him. Nadar's ex-mistress Jeanne Duval became Baudelaire's mistress around 1842. Baudelaire became interested in photography in the 1850s but, denouncing it as an art form, advocated its return to "its real purpose, which is that of being the servant to the sciences and arts". Photography should not, according to Baudelaire, encroach upon "the domain of the impalpable and the imaginary". Nadar remained a stalwart friend right to Baudelaire's last days and wrote his obituary notice in "Le Figaro".
Many of Baudelaire's philosophical proclamations were considered scandalous and intentionally provocative in his time. He wrote on a wide range of subjects, drawing criticism and outrage from many quarters.
Baudelaire's influence on the direction of modern French (and English) language literature was considerable. The most significant French writers to come after him were generous with tributes; four years after his death, Arthur Rimbaud praised him in a letter as 'the king of poets, a true God'. In 1895, Stéphane Mallarmé published "Le Tombeau de Charles Baudelaire", a sonnet in Baudelaire's memory. Marcel Proust, in an essay published in 1922, stated that along with Alfred de Vigny, Baudelaire was "the greatest poet of the nineteenth century".
In the English-speaking world, Edmund Wilson credited Baudelaire as providing an initial impetus for the Symbolist movement by virtue of his translations of Poe. In 1930, T.S. Eliot, while asserting that Baudelaire had not yet received a "just appreciation" even in France, claimed that the poet had "great genius" and asserted that his "technical mastery which can hardly be overpraised...has made his verse an inexhaustible study for later poets, not only in his own language". In a lecture delivered in French on "Edgar Allan Poe and France" (Edgar Poe et la France) in Aix-en-Provence in April 1948, Eliot stated that "I am an English poet of American origin who learnt his art under the aegis of Baudelaire and the Baudelairian lineage of poets." Eliot also alluded to Baudelaire's poetry directly in his own poetry. For example, he quoted the last line of Baudelaire's "Au Lecteur" in the last line of Section I of "The Waste Land".
At the same time that Eliot was affirming Baudelaire's importance from a broadly conservative and explicitly Christian viewpoint, left-wing critics such as Wilson and Walter Benjamin were able to do so from a dramatically different perspective. Benjamin translated Baudelaire's "Tableaux Parisiens" into German and published a major essay on translation as the foreword.
In the late 1930s, Benjamin used Baudelaire as a starting point and focus for "Das Passagenwerk", his monumental attempt at a materialist assessment of 19th-century culture. For Benjamin, Baudelaire's importance lay in his anatomies of the crowd, of the city and of modernity. He says that, in "Les Fleurs du mal", "the specific devaluation of the world of things, as manifested in the commodity, is the foundation of Baudelaire's allegorical intention." | https://en.wikipedia.org/wiki?curid=5804 |
Classical guitar
The classical guitar (also known as the classic guitar, nylon-string guitar or Spanish guitar) is a member of the guitar family used in classical music. An acoustic wooden string instrument with strings made of gut or nylon, it is a precursor of the modern acoustic and electric guitars, both of which use metal strings. Classical guitars derive from the Spanish vihuela and gittern of the fifteenth and sixteenth centuries, which evolved into the seventeenth- and eighteenth-century baroque guitar and, by the mid nineteenth century, the modern classical guitar.
For a right-handed player, the traditional classical guitar has twelve frets clear of the body and is properly held on the left leg, so that the hand that plucks or strums the strings does so near the back of the sound hole (this is called the classical position). The modern steel string guitar, on the other hand, usually has fourteen frets clear of the body (see Dreadnought) and is commonly played off the hip.
The phrase "classical guitar" may refer to either of two concepts other than the instrument itself:
The term "modern classical guitar" is sometimes used to distinguish the classical guitar from older forms of guitar, which are in their broadest sense also called "classical", or more specifically, "early guitars". Examples of early guitars include the six-string early romantic guitar (c. 1790–1880), and the earlier baroque guitars with five courses.
The materials and the methods of classical guitar construction may vary, but the typical shape is either that of the "modern classical guitar" or that of the "historic classical guitar", similar to the early romantic guitars of France and Italy. Classical guitar strings, once made of gut, are now made of such polymers as nylon, with fine wire wound about the acoustically lower (bass-side) strings.
A guitar family tree may be identified. The flamenco guitar derives from the modern classical, but has differences in material, construction and sound.
Today's "modern classical guitar" was established by the late designs of the 19th-century Spanish luthier, Antonio Torres Jurado.
The classical guitar has a long history, over the course of which one is able to distinguish various instruments and repertoires.
Both instrument and repertoire can be viewed from a combination of various perspectives:
Historical (chronological period of time)
Geographical
Cultural
While "classical guitar" is today mainly associated with the modern classical guitar design, there is an increasing interest in early guitars; and understanding the link between historical repertoire and the particular period guitar that was originally used to perform this repertoire. The musicologist and author Graham Wade writes:
Nowadays it is customary to play this repertoire on reproductions of instruments authentically modelled on concepts of musicological research with appropriate adjustments to techniques and overall interpretation. Thus over recent decades we have become accustomed to specialist artists with expertise in the art of vihuela (a 16th-century type of guitar popular in Spain), lute, Baroque guitar, 19th-century guitar, etc.
Different types of guitars have different sound aesthetics, e.g. different colour-spectrum characteristics (the way the sound energy is spread between the fundamental frequency and the overtones), different response, etc. These differences are due to differences in construction; for example, modern classical guitars usually use fan-bracing, whereas earlier guitars had ladder-bracing, and luthiers voiced the instruments differently.
There is a historical parallel between musical styles (baroque, classical, romantic, flamenco, jazz) and the style of "sound aesthetic" of the musical instruments used, for example: Robert de Visée played a baroque guitar with a very different sound aesthetic from the guitars used by Mauro Giuliani and Luigi Legnani – they used 19th century guitars. These guitars in turn sound different from the Torres models used by Segovia that are suited for interpretations of romantic-modern works such as Moreno Torroba.
When considering the guitar from a historical perspective, the musical instrument used is as important as the musical language and style of the particular period. As an example: it is impossible to give a historically informed performance of de Visée or Corbetta (baroque guitarist-composers) on a modern classical guitar. The reason is that the baroque guitar used courses, pairs of strings tuned in unison and plucked together. This gives baroque guitars an unmistakable sound characteristic and tonal texture that is an integral part of an interpretation. Additionally, the sound aesthetic of the baroque guitar (with its strong overtone presence) is very different from that of modern classical guitars, as is shown below.
Today's use of Torres and post-Torres type guitars for repertoire of all periods is sometimes critically viewed: Torres and post-Torres style modern guitars (with their fan-bracing and design) have a thick and strong tone, very suitable for modern-era repertoire. However, they are considered to emphasize the fundamental too heavily (at the expense of overtone partials) for earlier repertoire (Classical/Romantic: Carulli, Sor, Giuliani, Mertz, ...; Baroque: de Visee, ...; etc.). "Andrés Segovia presented the Spanish guitar as a versatile model for all playing styles" to the extent, that still today, "many guitarists have tunnel-vision of the world of the guitar, coming from the modern Segovia tradition".
While fan-braced modern classical Torres and post-Torres style instruments coexisted with traditional ladder-braced guitars at the beginning of the 20th century, the traditional forms eventually fell away. Some attribute this to the popularity of Segovia, considering him "the catalyst for change toward the Spanish design and the so-called 'modern' school in the 1920s and beyond." The styles of music performed on ladder-braced guitars were becoming more and more unfashionable; and, e.g. in Germany, musicians were in part turning towards folk music (Schrammel-music and the Contraguitar), but this only remained localized in Germany and Austria and became unfashionable again. On the other hand, Segovia was playing in concerts around the world, popularizing his modern classical guitar, as well as a new style of music in the 1920s: Spanish romantic-modern style with guitar works by Moreno Torroba, de Falla, etc. Some people consider it to have been this influence of Segovia which led to the domination of the Torres instrument. Factories all over the world began producing them in large numbers.
It was the 19th-century classical guitarist Francisco Tárrega who first popularized the Torres design as a classical solo instrument.
Composers of the Renaissance period who wrote for four-course guitar include Alonso Mudarra, Miguel de Fuenllana, Adrian Le Roy, and Guillaume de Morlaye.
Some well known composers of the Baroque guitar were Gaspar Sanz, Robert de Visée, Francesco Corbetta and Santiago de Murcia.
From approximately 1780 to 1850, the guitar had numerous composers and performers, some of whom are noted below.
Hector Berlioz studied the guitar as a teenager; Franz Schubert owned at least two and wrote for the instrument; and Ludwig van Beethoven, after hearing Giuliani play, commented the instrument was "a miniature orchestra in itself". Niccolò Paganini was also a guitar virtuoso and composer. He once wrote: "I love the guitar for its harmony; it is my constant companion in all my travels". He also said, on another occasion: "I do not like this instrument, but regard it simply as a way of helping me to think."
The guitarist and composer Francisco Tárrega (November 29, 1852 – December 15, 1909) was one of the great guitar virtuosos and teachers and is considered the father of modern classical guitar playing. As professor of guitar at the conservatories of Madrid and Barcelona, he defined many elements of the modern classical technique and elevated the importance of the guitar in the classical music tradition.
At the beginning of the 1920s, Andrés Segovia popularized the guitar with tours and early phonograph recordings. Segovia collaborated with the composers Federico Moreno Torroba and Joaquin Turina with the aim of extending the guitar repertoire with new music. Segovia's tour of South America revitalized public interest in the guitar and helped the guitar music of Manuel Ponce and Heitor Villa-Lobos reach a wider audience. The composers Alexandre Tansman and Mario Castelnuovo-Tedesco were commissioned by Segovia to write new pieces for the guitar. Luiz Bonfá popularized Brazilian musical styles such as the newly created Bossa Nova, which was well received by audiences in the USA.
The classical guitar repertoire also includes modern contemporary works – sometimes termed "New Music" – such as Elliott Carter's "Changes", Cristóbal Halffter's "Codex I", Luciano Berio's "Sequenza XI", Maurizio Pisati's "Sette Studi", Maurice Ohana's "Si Le Jour Paraît", Sylvano Bussotti's "Rara (eco sierologico)", Ernst Krenek's "Suite für Guitarre allein, Op. 164", Franco Donatoni's "Algo: Due pezzi per chitarra", Paolo Coggiola's "Variazioni Notturne", etc.
Performers who are known for including modern repertoire include Jürgen Ruck, Elena Càsoli, Leo Brouwer (when he was still performing), John Schneider, Reinbert Evers, Maria Kämmerling, Siegfried Behrend, David Starobin, Mats Scheidegger, Magnus Andersson, etc.
This type of repertoire is usually performed by guitarists who have particularly chosen to focus on the avant-garde in their performances.
Within the contemporary music scene itself, there are also works which are generally regarded as extreme. These include works such as Brian Ferneyhough's "Kurze Schatten II", Sven-David Sandström's "away from" and Rolf Riehm's "Toccata Orpheus" etc. which are notorious for their extreme difficulty.
There are also a variety of databases documenting modern guitar works such as Sheer Pluck and others.
The evolution of the classical guitar and its repertoire spans more than four centuries. It has a history that was shaped by contributions from earlier instruments, such as the lute, the vihuela, and the baroque guitar.
The origins of the modern guitar are not known with certainty. Some believe it is indigenous to Europe, while others think it is an imported instrument. Guitar-like instruments appear in ancient carvings and statues recovered from Egyptian, Sumerian, and Babylonian civilizations. On this view, contemporary Iranian instruments such as the tanbur and setar would be distantly related to the European guitar, as they all derive ultimately from the same ancient origins, but by very different historical routes and influences.
During the late Middle Ages, gitterns called "guitars" were in use, but their construction and tuning were different from those of modern guitars. The "Guitarra Latina" in Spain had curved sides and a single hole. The "Guitarra Morisca", which appears to have had Moorish influences, had an oval soundbox and many sound holes on its soundboard. By the 15th century, a four-course double-string instrument called the vihuela de mano, tuned like the later modern guitar except on one string and of similar construction, first appeared in Spain and spread to France and Italy. In the 16th century, a fifth double-string was added. During this time, composers wrote mostly in tablature notation. In the middle of the 16th century, influences from the vihuela and the renaissance guitar were combined and the baroque five-string guitar appeared in Spain. The baroque guitar quickly superseded the vihuela in popularity in Spain, France and Italy, and Italian players and composers became prominent. In the late 18th century the six-string guitar quickly became popular at the expense of the five-string guitars. During the 19th century the Spanish luthier and player Antonio de Torres gave the modern classical guitar its definitive form, with a broadened body, increased waist curve, thinned belly, and improved internal bracing. The modern classical guitar replaced an older form used for the accompaniment of song and dance called flamenco, and a modified version, known as the flamenco guitar, was created.
Alonso de Mudarra's book "Tres Libros de Música", published in Spain in 1546, contains the earliest known written pieces for a four-course guitarra. This four-course "guitar" was popular in France, Spain, and Italy. In France this instrument gained popularity among aristocrats. A considerable volume of music was published in Paris from the 1550s to the 1570s: Simon Gorlier's Le Troysième Livre... mis en tablature de Guiterne was published in 1551. In 1551 Adrian Le Roy also published his Premier Livre de Tablature de Guiterne, and in the same year he also published Briefve et facile instruction pour apprendre la tablature a bien accorder, conduire, et disposer la main sur la Guiterne. Robert Ballard, Grégoire Brayssing from Augsburg, and Guillaume Morlaye (c. 1510 – c. 1558) significantly contributed to its repertoire. Morlaye's Le Premier Livre de Chansons, Gaillardes, Pavannes, Bransles, Almandes, Fantasies – which has a four-course instrument illustrated on its title page – was published in partnership with Michel Fedenzat, and among other music, they published six books of tablature by lutenist Albert de Rippe (who was very likely Guillaume's teacher).
The written history of the classical guitar can be traced back to the early 16th century with the development of the "vihuela" in Spain. While the lute was then becoming popular in other parts of Europe, the Spaniards did not take to it well because of its association with the Moors. Instead, the lute-like vihuela appeared with two more strings that gave it more range and complexity. In its most developed form, the vihuela was a guitar-like instrument with six double strings made of gut, tuned like a modern classical guitar with the exception of the third string, which was tuned half a step lower. It had a high sound and was rather large to hold. Few have survived, and most of what is known today comes from diagrams and paintings.
The earliest extant six-string guitar is believed to have been built in 1779 by Gaetano Vinaccia (1759 – after 1831) in Naples, Italy; however, the date on the label is a little ambiguous. The Vinaccia family of luthiers is known for developing the mandolin. This guitar has been examined and does not show tell-tale signs of modifications from a double-course guitar.
The authenticity of guitars allegedly produced before the 1790s is often in question. This also corresponds to when Moretti's 6-string method appeared, in 1792.
The "modern classical guitar" (also known as the "Spanish guitar"), the immediate forerunner of today's guitars, was developed in the 19th century by Antonio de Torres Jurado, Ignacio Fleta, Hermann Hauser Sr., and Robert Bouchet.
Fingerstyle technique is used almost exclusively on the modern classical guitar. The thumb traditionally plucks the bass, or root note, whereas the fingers play the melody and its accompanying parts. Noted players include Francisco Tárrega, Emilio Pujol, and Andrés Segovia.
In the 20th century, many non-guitarist composers wrote for the instrument, which previously only players of the instrument had done. Notable composers for the guitar in this period included Francisco Tárrega (1852–1909), Roberto Gerhard (1896–1970), and Heitor Villa-Lobos (1887–1959).
The modern classical guitar is usually played in a seated position, with the instrument resting on the left lap – and the left foot placed on a footstool. Alternatively – if a footstool is not used – a "guitar support" can be placed between the guitar and the left lap (the support usually attaches to the instrument's side with suction cups). (There are of course exceptions, with some performers choosing to hold the instrument another way.)
Right-handed players use the fingers of the right hand to pluck the strings, with the thumb plucking from the top of a string downwards (downstroke) and the other fingers plucking from the bottom of string upwards (upstroke). The little finger in classical technique as it evolved in the 20th century is used only to ride along with the ring finger without striking the strings and to thus physiologically facilitate the ring finger's motion.
In contrast, Flamenco technique, and classical compositions evoking Flamenco, employ the little finger semi-independently in the Flamenco four-finger rasgueado, the rapid strumming of the string by the fingers in reverse order using the back of the fingernail, a familiar characteristic of Flamenco.
Flamenco technique, in the performance of the rasgueado, also uses the upstroke of the four fingers and the downstroke of the thumb: the string is hit not only with the inner, fleshy side of the fingertip but also with the outer, fingernail side. This was also used in a technique of the vihuela called dedillo which has recently begun to be introduced on the classical guitar.
Some modern guitarists, such as Štěpán Rak and Kazuhito Yamashita, use the little finger independently, compensating for the little finger's shortness by maintaining an extremely long fingernail. Rak and Yamashita have also generalized the use of the upstroke of the four fingers and the downstroke of the thumb (the same technique as in the rasgueado of the Flamenco: as explained above the string is hit not only with the inner, fleshy side of the fingertip but also with the outer, fingernail side) both as a free stroke and as a rest stroke.
As with other plucked instruments (such as the lute), the musician directly touches the strings (usually plucking) to produce the sound. This has important consequences: Different tone/timbre (of a single note) can be produced by plucking the string in different manners and in different positions. For example, plucking an open string will sound brighter than playing the same note(s) on a fretted position (which would have a warmer tone).
The instrument's versatility means it can create a variety of tones, but this also makes the instrument harder to learn than a standard acoustic guitar.
In guitar "scores" the five fingers of the right-hand (which pluck the strings) are designated by the first letter of their Spanish names namely p = thumb ("pulgar"), i = index finger ("índice"), m = middle finger ("mayor"), a = ring finger ("anular"), c = little finger or pinky ("meñique/chiquito")
The four fingers of the left hand (which stop the strings) are designated 1 = index, 2 = middle, 3 = ring finger, 4 = little finger; 0 designates an open string, that is, a string that is not stopped by a finger of the left hand and whose full length thus vibrates when plucked. On the classical guitar, the thumb of the left hand is never used to stop strings from above (as is done on the electric guitar): the neck of a classical guitar is too wide, and the normal position of the thumb used in classical guitar technique does not make that possible.
Scores (contrary to "tablatures") do not systematically indicate the string to be plucked (although in most cases the choice is obvious). When an indication of the string is required the strings are designated 1 to 6 (from the 1st the high E to the 6th the low E) with figures 1 to 6 inside circles.
The positions (that is, where on the fretboard the first finger of the left hand is placed) are also not systematically indicated, but when they are (mostly in the case of the execution of "barrés"), they are indicated with Roman numerals, from the first position I (index finger of the left hand placed on the 1st fret: F–B flat–E flat–A flat–C–F) to the twelfth position XII (index finger placed on the 12th fret: E–A–D–G–B–E; the 12th fret is placed where the body begins), or even higher, up to position XIX (the classical guitar most often having 19 frets, with the 19th fret most often split and not usable to fret the 3rd and 4th strings).
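The Roman-numeral position arithmetic above is just semitone addition along each string. The following sketch is illustrative only (not from the source article): it assumes standard tuning and twelve-tone equal temperament, and reproduces the note names quoted above for positions I and XII.

```python
# Hypothetical helper, assuming standard tuning E-A-D-G-B-E (6th to 1st string).
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
OPEN_STRINGS = ["E", "A", "D", "G", "B", "E"]  # 6th (low E) to 1st (high e)

def notes_at_position(fret: int) -> list[str]:
    """Note sounded on each string when the index finger frets `fret`."""
    return [NOTES[(NOTES.index(open_note) + fret) % 12] for open_note in OPEN_STRINGS]

print(notes_at_position(1))   # position I:  F, A#, D#, G#, C, F (i.e. F-Bb-Eb-Ab-C-F)
print(notes_at_position(12))  # position XII: E, A, D, G, B, E
```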
To achieve tremolo effects and rapid, fluent scale passages, the player must practice alternation, that is, never plucking a string with the same finger twice in a row.
Using p to indicate the thumb, i the index finger, m the middle finger and a the ring finger, common alternation patterns include i-m-i-m (simple alternation for scales and melody) and p-a-m-i (the classical tremolo pattern).
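As a toy illustration of the alternation rule, the function below (hypothetical, not part of the source) checks that a right-hand fingering pattern written with the p/i/m/a letters never repeats the same finger twice in a row, even when the pattern is cycled continuously.

```python
def is_valid_alternation(pattern: str) -> bool:
    """True if no finger is plucked twice in a row, including across repeats."""
    n = len(pattern)
    # Compare each finger with the next, wrapping around so repeated cycles
    # of the pattern also avoid consecutive repeats.
    return all(pattern[k] != pattern[(k + 1) % n] for k in range(n))

print(is_valid_alternation("im"))    # True: basic i-m alternation
print(is_valid_alternation("pami"))  # True: the classical tremolo pattern
print(is_valid_alternation("imm"))   # False: m repeats
```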
Music written specifically for the classical guitar dates from the addition of the sixth string (the baroque guitar normally had five pairs of strings) in the late 18th century.
A guitar recital may include a variety of works, e.g. works written originally for the lute or vihuela by composers such as John Dowland (b. England 1563) and Luis de Narváez (b. Spain c. 1500), and also music written for the harpsichord by Domenico Scarlatti (b. Italy 1685), for the baroque lute by Sylvius Leopold Weiss (b. Germany 1687), for the baroque guitar by Robert de Visée (b. France c. 1650) or even Spanish-flavored music written for the piano by Isaac Albéniz (b. Spain 1860) and Enrique Granados (b. Spain 1867). The most important composer who did not write for the guitar but whose music is often played on it is Johann Sebastian Bach (b. Germany 1685), whose baroque lute works have proved highly adaptable to the instrument.
Of music written originally for guitar, the earliest important composers are from the classical period and include Fernando Sor (b. Spain 1778) and Mauro Giuliani (b. Italy 1781), both of whom wrote in a style strongly influenced by Viennese classicism. In the 19th century guitar composers such as Johann Kaspar Mertz (b. Slovakia, Austria 1806) were strongly influenced by the dominance of the piano. Not until the end of the nineteenth century did the guitar begin to establish its own unique identity. Francisco Tárrega (b. Spain 1852) was central to this, sometimes incorporating stylized aspects of flamenco's Moorish influences into his romantic miniatures. This was part of late 19th century mainstream European musical nationalism. Albéniz and Granados were central to this movement; their evocation of the guitar was so successful that their compositions have been absorbed into standard guitar repertoire.
The steel-string and electric guitars characteristic of the rise of rock and roll in the post-WWII era became more widely played in North America and the English-speaking world. Agustín Barrios Mangoré of Paraguay composed many works and brought into the mainstream the characteristics of Latin American music, as did the Brazilian composer Heitor Villa-Lobos. Andrés Segovia commissioned works from Spanish composers such as Federico Moreno Torroba and Joaquín Rodrigo, Italians such as Mario Castelnuovo-Tedesco and Latin American composers such as Manuel Ponce of Mexico. Other prominent Latin American composers are Leo Brouwer of Cuba, Antonio Lauro of Venezuela and Enrique Solares of Guatemala. Julian Bream of Britain managed to get nearly every British composer from William Walton to Benjamin Britten to Peter Maxwell Davies to write significant works for guitar. Bream's collaborations with tenor Peter Pears also resulted in song cycles by Britten, Lennox Berkeley and others. There are significant works by composers such as Hans Werner Henze of Germany, Gilbert Biberian of England and Roland Chadwick of Australia.
The classical guitar also became widely used in popular music and rock & roll in the 1960s after guitarist Mason Williams popularized the instrument in his instrumental hit "Classical Gas". Guitarist Christopher Parkening is quoted in the book "Classical Gas: The Music of Mason Williams" as saying that it is the most requested guitar piece besides "Malagueña" and perhaps the best-known instrumental guitar piece today.
In the field of New Flamenco, the works and performances of Spanish composer and player Paco de Lucía are known worldwide.
Not many classical guitar concertos were written over the course of the guitar's history. Nevertheless, some guitar concertos are nowadays widely known and popular, especially Joaquín Rodrigo's "Concierto de Aranjuez" (with the famous theme from the 2nd movement) and "Fantasía para un gentilhombre". Composers who wrote famous guitar concertos include Antonio Vivaldi (originally for mandolin or lute), Mauro Giuliani, Heitor Villa-Lobos, Mario Castelnuovo-Tedesco, Manuel Ponce, Leo Brouwer, and Lennox Berkeley.
Nowadays, more and more contemporary composers decide to write guitar concertos; among them, "Bosco Sacro" by Federico Biscione, for guitar and string orchestra, is one of the most inspired.
The classical guitar is distinguished by a number of characteristics:
The fretboard (also called the fingerboard) is a piece of wood embedded with metal frets that constitutes the top of the neck. It is flat or slightly curved. The curvature of the fretboard is measured by the fretboard radius, which is the radius of a hypothetical circle of which the fretboard's surface constitutes a segment. The smaller the fretboard radius, the more noticeably curved the fretboard is. Fretboards are most commonly made of ebony, but may also be made of rosewood, some other hardwood, or of phenolic composite ("micarta").
Frets are the metal strips (usually nickel alloy or stainless steel) embedded along the fingerboard and placed at points that divide the length of string mathematically. The strings' vibrating length is determined when the strings are pressed down behind the frets. Each fret produces a different pitch, and adjacent frets are spaced a half-step apart on the 12-tone scale. The ratio of the spacings of two consecutive frets is the twelfth root of two, 2^(1/12), whose numeric value is about 1.059463. The twelfth fret divides the string in two exact halves and the 24th fret (if present) divides the string in half yet again. Every twelve frets represents one octave. This arrangement of frets results in equal tempered tuning.
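Since each fret must leave 1/2^(1/12) of the previous vibrating length, fret positions can be computed directly from the scale length. The sketch below is a minimal illustration (the 650 mm scale length is the conventional full-size value, used here as an assumption):

```python
SEMITONE = 2 ** (1 / 12)  # ~1.059463, the equal-temperament semitone ratio

def fret_distance_from_nut(scale_mm: float, fret: int) -> float:
    """Distance from the nut to a given fret on an equal-tempered fretboard."""
    return scale_mm - scale_mm / (SEMITONE ** fret)

scale = 650.0  # assumed full-size scale length in mm
print(round(fret_distance_from_nut(scale, 1), 1))   # ~36.5 mm to the 1st fret
print(round(fret_distance_from_nut(scale, 12), 1))  # 325.0 mm: the exact midpoint
```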
A classical guitar's frets, fretboard, tuners, headstock, all attached to a long wooden extension, collectively constitute its neck. The wood for the fretboard usually differs from the wood in the rest of the neck. The bending stress on the neck is considerable, particularly when heavier gauge strings are used.
This is the point where the neck meets the body. In the traditional Spanish neck joint, the neck and block are one piece, with the sides inserted into slots cut in the block. Other necks are built separately and joined to the body either with a dovetail joint, mortise or flush joint. These joints are usually glued and can be reinforced with mechanical fasteners. Recently many manufacturers use bolt-on fasteners. Bolt-on neck joints were once associated only with less expensive instruments but now some top manufacturers and hand builders are using variations of this method. Some people believed that the Spanish-style one-piece neck/block and glued dovetail necks have better sustain, but testing has failed to confirm this.
While most traditional Spanish style builders use the one piece neck/heel block, Fleta, a prominent Spanish builder, used a dovetail joint due to the influence of his early training in violin making.
One reason for the introduction of mechanical joints was to make it easier to repair necks. This is more of a problem with steel-string guitars than with nylon strings, which have about half the string tension. This is why nylon-string guitars often do not include a truss rod either.
The body of the instrument is a major determinant of the overall sound variety for acoustic guitars. The guitar top, or soundboard, is a finely crafted and engineered element often made of spruce or red cedar. Considered the most prominent factor in determining the sound quality of a guitar, this thin (often 2 or 3 mm thick) piece of wood has a uniform thickness and is strengthened by different types of internal bracing. The back is made of rosewood, with Brazilian rosewood especially coveted, though mahogany or other decorative woods are sometimes used.
The majority of the sound is caused by vibration of the guitar top as the energy of the vibrating strings is transferred to it. Different patterns of wood bracing have been used through the years by luthiers (Torres, Hauser, Ramírez, Fleta, and C.F. Martin being among the most influential designers of their times), not only to strengthen the top against collapsing under the tremendous stress exerted by the tensioned strings, but also to affect the resonation of the top. Some contemporary guitar makers have introduced new construction concepts, such as a "double-top" consisting of two extra-thin wooden plates separated by Nomex, or carbon-fiber-reinforced lattice-pattern bracing. The back and sides are made out of a variety of woods such as mahogany, maple, cypress, Indian rosewood and highly regarded Brazilian rosewood ("Dalbergia nigra"). Each one is chosen for its aesthetic effect and structural strength, and such choice can also play a significant role in determining the instrument's timbre. These are also strengthened with internal bracing, and decorated with inlays and purfling.
The body of a classical guitar is a resonating chamber that projects the vibrations of the body through a "sound hole", allowing the acoustic guitar to be heard without amplification. The sound hole is normally a single round hole in the top of the guitar (under the strings), though some have different placement, shapes, or numbers of holes. How much air an instrument can move determines its maximum volume.
The top, back and sides of a classical guitar body are very thin, so a flexible piece of wood called "kerfing" (because it is often scored, or "kerfed" so it bends with the shape of the rim) is glued into the corners where the rim meets the top and back. This interior reinforcement provides 5 to 20 mm of solid gluing area for these corner joints.
During final construction, a small section of the outside corners is carved or routed out and filled with binding material on the outside corners and decorative strips of material next to the binding, which are called "purfling". This binding serves to seal off the endgrain of the top and back. Binding and purfling materials are generally made of either wood or high quality plastic materials.
The main purpose of the bridge on a classical guitar is to transfer the vibration from the strings to the soundboard, which vibrates the air inside of the guitar, thereby amplifying the sound produced by the strings. The bridge holds the strings in place on the body. Also, the position of the saddle, usually a strip of bone or plastic that supports the strings off the bridge, determines the distance to the nut (at the top of the fingerboard).
The modern full-size classical guitar has a scale length of around 650 mm (25.6 in), with an overall instrument length of roughly one metre. The scale length has remained quite consistent since it was chosen by the originator of the instrument, Antonio de Torres. This length may have been chosen because it is twice the length of a violin string. As the guitar is tuned to one octave below that of the violin, the same size gut could be used for the first strings of both instruments.
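To make the octave argument concrete: for an ideal string at fixed tension and mass per unit length, frequency is inversely proportional to vibrating length, so doubling the length halves the pitch. The figures below are illustrative assumptions, not values from the text.

```python
# Idealized string: frequency scales as 1/length at fixed tension and density.
violin_string_mm = 325.0                 # assumed violin string length, for illustration
guitar_scale_mm = 2 * violin_string_mm   # 650.0 mm, the guitar scale above

# The same gut string stretched over twice the length sounds an octave lower.
print(violin_string_mm / guitar_scale_mm)  # 0.5 -> half the frequency, one octave down
```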
Smaller-scale instruments are produced to assist children in learning the instrument, as the smaller scale leads to the frets being closer together, making it easier for smaller hands; both the scale length and the overall instrument length are reduced proportionally. Full-size instruments are sometimes referred to as 4/4, while the smaller sizes are 3/4, 1/2 or 1/4.
These sizes are not absolute, as luthiers may choose variations around these nominal scale-lengths.
Guitars can be described in size from largest to smallest as:
– Contra or Octave bass;
– Bass baritone or Quint bass;
– Prime or Quart bass;
– Terz treble;
– Alto Requinto;
– Quart;
– Quint;
– Soprano, Octave or Piccolo.
A variety of different tunings are used. The most common by far, which one could call the "standard tuning", is: e' – b – g – d – A – E.
The above order is the tuning from the "1st string" (the highest-pitched string e', spatially the bottom string in playing position) to the "6th string" (the lowest-pitched string E, spatially the upper string in playing position, and hence comfortable to pluck with the thumb).
The explanation for this "asymmetrical" tuning (in the sense that the maj 3rd is not between the two middle strings, as in the tuning of the viola da gamba) is probably that the guitar originated as a 4-string instrument (actually an instrument with 4 double courses of strings, see above) with a maj 3rd between the 2nd and 3rd strings, and it only became a 6-string instrument by gradual addition of a 5th string and then a 6th string tuned a 4th apart:
"The development of the modern tuning can be traced in stages. One of the tunings from the 16th century is C-F-A-D. This is equivalent to the top four strings of the modern guitar tuned a tone lower. However, the absolute pitch for these notes is not equivalent to modern "concert pitch". The tuning of the four-course guitar was moved up by a tone and toward the end of the 16th century, five-course instruments were in use with an added lower string tuned to A. This produced A-D-G-B-E, one of a wide number of variant tunings of the period. The low E string was added during the 18th century."
This tuning is such that neighboring strings are at most 5 semitones apart.
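A quick arithmetic check of that claim (illustrative, not from the source), using MIDI note numbers for the open strings E2 A2 D3 G3 B3 E4:

```python
open_strings = [40, 45, 50, 55, 59, 64]  # MIDI numbers, 6th string to 1st
intervals = [hi - lo for lo, hi in zip(open_strings, open_strings[1:])]
print(intervals)       # [5, 5, 5, 4, 5] -> the major 3rd sits between G and B
print(max(intervals))  # 5: neighboring strings are at most 5 semitones apart
```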
There are also a variety of commonly used alternate tunings.
The Guitar Foundation of America is an American classical guitar organisation that was founded in 1973 at the National Guitar Convention sponsored by the American String Teachers Association. It is also the world's largest multi-national guitar organisation. It publishes journals and other publications and holds the annual GFA Convention and competitions, including the prestigious International Concert Artists Competition, International Youth Competitions and International Ensemble Competition. Jérémy Jouve is one of the winners of the GFA International Concert Artists Competition (ICAC), in 2003. | https://en.wikipedia.org/wiki?curid=5810 |
C. S. Lewis
Clive Staples Lewis (29 November 1898 – 22 November 1963) was a British writer and lay theologian. He held academic positions in English literature at both Oxford University (Magdalen College, 1925–1954) and Cambridge University (Magdalene College, 1954–1963). He is best known for his works of fiction, especially "The Screwtape Letters", "The Chronicles of Narnia", and "The Space Trilogy", and for his non-fiction Christian apologetics, such as "Mere Christianity", "Miracles", and "The Problem of Pain".
Lewis and fellow novelist J. R. R. Tolkien were close friends. They both served on the English faculty at Oxford University and were active in the informal Oxford literary group known as the Inklings. According to Lewis's memoir "Surprised by Joy", he was baptised in the Church of Ireland, but fell away from his faith during adolescence. Lewis returned to Anglicanism at the age of 32, owing to the influence of Tolkien and other friends, and he became an "ordinary layman of the Church of England". Lewis's faith profoundly affected his work, and his wartime radio broadcasts on the subject of Christianity brought him wide acclaim.
Lewis wrote more than 30 books which have been translated into more than 30 languages and have sold millions of copies. The books that make up "The Chronicles of Narnia" have sold the most and have been popularised on stage, TV, radio, and cinema. His philosophical writings are widely cited by Christian apologists from many denominations.
In 1956, Lewis married American writer Joy Davidman; she died of cancer four years later at the age of 45. Lewis died on 22 November 1963 from kidney failure, one week before his 65th birthday. In 2013, on the 50th anniversary of his death, Lewis was honoured with a memorial in Poets' Corner in Westminster Abbey.
Clive Staples Lewis was born in Belfast, Ireland, on 29 November 1898. His father was Albert James Lewis (1863–1929), a solicitor whose father Richard had come to Ireland from Wales during the mid-19th century. His mother was Florence Augusta Lewis, née Hamilton (1862–1908), known as Flora, the daughter of Thomas Hamilton, a Church of Ireland priest, and great-granddaughter of both Bishop Hugh Hamilton and John Staples. He had an elder brother, Warren Hamilton Lewis (known as "Warnie"). He was baptised on 29 January 1899 by his maternal grandfather in St Mark's Church, Dundela.
When his dog Jacksie was killed by a car, the four-year-old Lewis adopted the name Jacksie. At first, he would answer to no other name, but later accepted Jack, the name by which he was known to friends and family for the rest of his life. When he was seven, his family moved into "Little Lea", the family home of his childhood, in the Strandtown area of East Belfast.
As a boy, Lewis was fascinated with anthropomorphic animals; he fell in love with Beatrix Potter's stories and often wrote and illustrated his own animal stories. He and his brother Warnie created the world of Boxen, inhabited and run by animals. Lewis loved to read; his father's house was filled with books, and he felt that finding a book to read was as easy as walking into a field and "finding a new blade of grass".
Lewis was schooled by private tutors until age nine, when his mother died of cancer in 1908. His father then sent him to live and study at Wynyard School in Watford, Hertfordshire. Lewis's brother had enrolled there three years previously. The school was closed not long afterward due to a lack of pupils; the headmaster Robert "Oldie" Capron was soon after committed to a psychiatric hospital. Lewis then attended Campbell College in the east of Belfast, about a mile from his home, but left after a few months due to respiratory problems. He was then sent to the health-resort town of Malvern, Worcestershire, where he attended the preparatory school Cherbourg House, which Lewis calls "Chartres" in his autobiography. It was during this time that Lewis abandoned his childhood Christian faith and became an atheist, becoming interested in mythology and the occult. In September 1913, Lewis enrolled at Malvern College, where he remained until the following June. He found the school socially competitive. After leaving Malvern, he studied privately with William T. Kirkpatrick, his father's old tutor and former headmaster of Lurgan College.
As a teenager, Lewis was wonder-struck by the songs and legends of what he called "Northernness", the ancient literature of Scandinavia preserved in the Icelandic sagas. These legends intensified an inner longing that he would later call "joy". He also grew to love nature; its beauty reminded him of the stories of the North, and the stories of the North reminded him of the beauties of nature. His teenage writings moved away from the tales of Boxen, and he began using different art forms, such as epic poetry and opera, to try to capture his new-found interest in Norse mythology and the natural world. Studying with Kirkpatrick ("The Great Knock", as Lewis afterward called him) instilled in him a love of Greek literature and mythology and sharpened his debate and reasoning skills. In 1916, Lewis was awarded a scholarship at University College, Oxford.
Lewis experienced a certain cultural shock on first arriving in England: "No Englishman will be able to understand my first impressions of England," Lewis wrote in "Surprised by Joy". "The strange English accents with which I was surrounded seemed like the voices of demons. But what was worst was the English landscape ... I have made up the quarrel since; but at that moment I conceived a hatred for England which took many years to heal."
From boyhood, Lewis had immersed himself in Norse and Greek mythology, and later in Irish mythology and literature. He also expressed an interest in the Irish language, though there is not much evidence that he laboured to learn it. He developed a particular fondness for W. B. Yeats, in part because of Yeats's use of Ireland's Celtic heritage in poetry. In a letter to a friend, Lewis wrote, "I have here discovered an author exactly after my own heart, whom I am sure you would delight in, W. B. Yeats. He writes plays and poems of rare spirit and beauty about our old Irish mythology."
In 1921, Lewis met Yeats twice, since Yeats had moved to Oxford. Lewis was surprised to find his English peers indifferent to Yeats and the Celtic Revival movement, and wrote: "I am often surprised to find how utterly ignored Yeats is among the men I have met: perhaps his appeal is purely Irish – if so, then thank the gods that I am Irish." Early in his career, Lewis considered sending his work to the major Dublin publishers, writing: "If I do ever send my stuff to a publisher, I think I shall try Maunsel, those Dublin people, and so tack myself definitely onto the Irish school." After his conversion to Christianity, his interests gravitated towards Christian theology and away from pagan Celtic mysticism.
Lewis occasionally expressed a somewhat tongue-in-cheek chauvinism toward the English. Describing an encounter with a fellow Irishman, he wrote: "Like all Irish people who meet in England, we ended by criticisms on the invincible flippancy and dullness of the Anglo-Saxon race. After all, there is no doubt, "ami", that the Irish are the only people: with all their faults, I would not gladly live or die among another folk." Throughout his life, he sought out the company of other Irish people living in England and visited Northern Ireland regularly. In 1958 he spent his honeymoon there at the Old Inn, Crawfordsburn, which he called "my Irish life".
Various critics have suggested that it was Lewis's dismay over the sectarian conflict in his native Belfast which led him to eventually adopt such an ecumenical brand of Christianity. As one critic has said, Lewis "repeatedly extolled the virtues of all branches of the Christian faith, emphasising a need for unity among Christians around what the Catholic writer called 'Mere Christianity', the core doctrinal beliefs that all denominations share". On the other hand, Paul Stevens of the University of Toronto has written that "Lewis' mere Christianity masked many of the political prejudices of an old-fashioned Ulster Protestant, a native of middle-class Belfast for whom British withdrawal from Northern Ireland even in the 1950s and 1960s was unthinkable."
Lewis entered Oxford in the 1917 summer term, studying at University College, and shortly after, he joined the Officers' Training Corps at the university as his "most promising route into the army". From there, he was drafted into a Cadet Battalion for training. After his training, he was commissioned into the Third Battalion of the Somerset Light Infantry of the British Army as a Second Lieutenant. Within months of entering Oxford, the British Army shipped him to France to fight in the First World War.
On his 19th birthday (29 November 1917) he arrived at the front line in the Somme Valley in France, where he experienced trench warfare for the first time. On 15 April 1918, Lewis was wounded and two of his colleagues were killed by a British shell falling short of its target. He suffered from depression and homesickness during his convalescence and, upon his recovery in October, he was assigned to duty in Andover, England. He was demobilised in December 1918 and soon restarted his studies. In a later letter, Lewis wrote that his experience of the horror of war, together with the loss of his mother and his unhappiness at school, formed the basis of his pessimism and atheism.
After Lewis returned to Oxford University, he received a First in Honour Moderations (Greek and Latin literature) in 1920, a First in Greats (Philosophy and Ancient History) in 1922, and a First in English in 1923. In 1924 he became a philosophy tutor at University College and, in 1925, was elected a Fellow and Tutor in English Literature at Magdalen College, where he served for 29 years until 1954.
During his army training, Lewis shared a room with another cadet, Edward Courtnay Francis "Paddy" Moore (1898–1918). Maureen Moore, Paddy's sister, said that the two made a mutual pact that if either died during the war, the survivor would take care of both of their families. Paddy was killed in action in 1918 and Lewis kept his promise. Paddy had earlier introduced Lewis to his mother, Janie King Moore, and a friendship quickly sprang up between Lewis, who was 18 when they met, and Janie, who was 45. The friendship with Moore was particularly important to Lewis while he was recovering from his wounds in hospital, as his father did not visit him.
Lewis lived with and cared for Moore until she was hospitalised in the late 1940s. He routinely introduced her as his mother, referred to her as such in letters, and developed a deeply affectionate friendship with her. Lewis's own mother had died when he was a child, and his father was distant, demanding, and eccentric.
Speculation regarding their relationship resurfaced with the 1990 publication of A. N. Wilson's biography of Lewis. Wilson (who never met Lewis) attempted to make a case for their having been lovers for a time. Wilson's biography was not the first to address the question of Lewis's relationship with Moore. George Sayer knew Lewis for 29 years, and he had sought to shed light on the relationship during the period of 14 years prior to Lewis's conversion to Christianity. In his biography "Jack: A Life of C. S. Lewis", he wrote:
Later Sayer changed his mind. In the introduction to the 1997 edition of his biography of Lewis he wrote:
However, the romantic nature of the relationship is doubted by other writers; for example, Philip Zaleski and Carol Zaleski write in "The Fellowship" that
Lewis spoke well of Mrs. Moore throughout his life, saying to his friend George Sayer, "She was generous and taught me to be generous, too." In December 1917, Lewis wrote in a letter to his childhood friend Arthur Greeves that Janie and Greeves were "the two people who matter most to me in the world".
In 1930, Lewis moved into The Kilns with his brother Warnie, Mrs. Moore, and her daughter Maureen. The Kilns was a house in the district of Headington Quarry on the outskirts of Oxford, now part of the suburb of Risinghurst. They all contributed financially to the purchase of the house, which passed to Maureen, who by then was Dame Maureen Dunbar, when Warren died in 1973.
Moore suffered from dementia in her later years and was eventually moved into a nursing home, where she died in 1951. Lewis visited her every day in this home until her death.
Lewis was raised in a religious family that attended the Church of Ireland. He became an atheist at age 15, though he later described his young self as being paradoxically "very angry with God for not existing" and "equally angry with him for creating a world". His early separation from Christianity began when he started to view his religion as a chore and a duty; around this time, he also gained an interest in the occult, as his studies expanded to include such topics. Lewis quoted Lucretius ("De rerum natura", 5.198–9) as having one of the strongest arguments for atheism:
which he translated poetically as follows:
Lewis's interest in the works of George MacDonald was part of what turned him from atheism. This can be seen particularly well through this passage in Lewis's "The Great Divorce", chapter nine, when the semi-autobiographical main character meets MacDonald in Heaven:
He eventually returned to Christianity, having been influenced by arguments with his Oxford colleague and Christian friend J. R. R. Tolkien, whom he seems to have met for the first time on 11 May 1926, and the book "The Everlasting Man" by G. K. Chesterton. Lewis vigorously resisted conversion, noting that he was brought into Christianity like a prodigal, "kicking, struggling, resentful, and darting his eyes in every direction for a chance to escape". He described his last struggle in "Surprised by Joy":
After his conversion to theism in 1929, Lewis converted to Christianity in 1931, following a long discussion during a late-night walk along Addison's Walk with his close friends Tolkien and Hugo Dyson. He records making a specific commitment to Christian belief while on his way to the zoo with his brother. He became a member of the Church of England – somewhat to the disappointment of Tolkien, who had hoped that he would join the Catholic Church.
Lewis was a committed Anglican who upheld a largely orthodox Anglican theology, though in his apologetic writings he made an effort to avoid espousing any one denomination. Some believe that in his later writings he proposed ideas such as the purification of venial sins after death in purgatory ("The Great Divorce" and "Letters to Malcolm") and mortal sin ("The Screwtape Letters"), which are generally considered Roman Catholic teachings, although they are also widely held in Anglicanism (particularly in high church Anglo-Catholic circles). Regardless, Lewis considered himself an entirely orthodox Anglican to the end of his life, reflecting that he had initially attended church only to receive communion and had been repelled by the hymns and the poor quality of the sermons. He later came to consider himself honoured by worshipping with men of faith who came in shabby clothes and work boots and who sang all the verses to all the hymns. In 1935, Lewis and his brother Warren donated a stained glass window in memory of their parents to their childhood church of St. Mark's, Dundela, Belfast.
After the outbreak of the Second World War in 1939, the Lewises took child evacuees from London and other cities into The Kilns. Lewis was only 40 when the war started, and he tried to re-enter military service, offering to instruct cadets; but his offer was not accepted. He rejected the recruiting office's suggestion of writing columns for the Ministry of Information in the press, as he did not want to "write lies" to deceive the enemy. He later served in the local Home Guard in Oxford.
From 1941 to 1943, Lewis spoke on religious programmes broadcast by the BBC from London while the city was under periodic air raids. The broadcasts were much appreciated by civilians and servicemen at that stage of the war. For example, Air Chief Marshal Sir Donald Hardman wrote:
The broadcasts were anthologised in "Mere Christianity". From 1941, he also spent his summer holiday weekends visiting R.A.F. stations to speak on his faith, at the invitation of the R.A.F.'s Chaplain-in-Chief, Maurice Edwards.
During the same wartime period, Lewis was invited to become the first President of the Oxford Socratic Club in January 1942, a position he held enthusiastically until he resigned on his appointment to Cambridge University in 1954.
Lewis was named on the last list of honours by George VI in December 1951 as a Commander of the Order of the British Empire (CBE) but declined so as to avoid association with any political issues.
In 1954, Lewis accepted the newly founded chair of Mediaeval and Renaissance Literature at Magdalene College, Cambridge, where he finished his career. He maintained a strong attachment to the city of Oxford, keeping a home there and returning on weekends until his death in 1963.
In later life, Lewis corresponded with Joy Davidman Gresham, an American writer of Jewish background, a former Communist, and a convert from atheism to Christianity. She was separated from her alcoholic and abusive husband, novelist William L. Gresham, and came to England with her two sons, David and Douglas. Lewis at first regarded her as an agreeable intellectual companion and personal friend, and it was on this level that he agreed to enter into a civil marriage contract with her so that she could continue to live in the UK. The civil marriage took place at the register office, 42 St Giles', Oxford, on 23 April 1956. Lewis's brother Warren wrote: "For Jack the attraction was at first undoubtedly intellectual. Joy was the only woman whom he had met ... who had a brain which matched his own in suppleness, in width of interest, and in analytical grasp, and above all in humour and a sense of fun." After complaining of a painful hip, she was diagnosed with terminal bone cancer, and the relationship developed to the point that they sought a Christian marriage. Since she was divorced, this was not straightforward in the Church of England at the time, but a friend, the Rev. Peter Bide, performed the ceremony at her bed in the Churchill Hospital on 21 March 1957.
Gresham's cancer soon went into remission, and the couple lived together as a family with Warren Lewis until 1960, when her cancer recurred and she died on 13 July. Earlier that year, the couple took a brief holiday in Greece and the Aegean; Lewis was fond of walking but not of travel, and this marked his only crossing of the English Channel after 1918. Lewis's book "A Grief Observed" describes his experience of bereavement in such a raw and personal fashion that he originally released it under the pseudonym N. W. Clerk to keep readers from associating the book with him. Ironically, many friends recommended the book to Lewis as a method for dealing with his own grief. After Lewis's death, his authorship was made public by Faber's, with the permission of the executors.
Lewis continued to raise Gresham's two sons after her death. Douglas Gresham is a Christian like Lewis and his mother, while David Gresham turned to his mother's ancestral faith, becoming Orthodox Jewish in his beliefs. His mother's writings had featured the Jews in an unsympathetic manner, particularly one "shohet" (ritual slaughterer). David informed Lewis that he was going to become a ritual slaughterer to present this type of Jewish religious functionary to the world in a more favourable light. In a 2005 interview, Douglas Gresham acknowledged that he and his brother were not close, but he did say that they are in email contact. Douglas remains involved in the affairs of the Lewis estate.
In early June 1961, Lewis began suffering from nephritis, which resulted in blood poisoning. His illness caused him to miss the autumn term at Cambridge, though his health gradually began improving in 1962 and he returned that April. His health continued to improve and, according to his friend George Sayer, Lewis was fully himself by early 1963. On 15 July that year, he fell ill and was admitted to the hospital; he suffered a heart attack at 5:00 pm the next day and lapsed into a coma, unexpectedly waking the following day at 2:00 pm. After he was discharged from the hospital, Lewis returned to the Kilns, though he was too ill to return to work. As a result, he resigned from his post at Cambridge in August.
Lewis's condition continued to decline, and he was diagnosed with end-stage kidney failure in mid-November. He collapsed in his bedroom at 5:30 pm on 22 November, exactly one week before his 65th birthday, and died a few minutes later. He is buried in the churchyard of Holy Trinity Church, Headington, Oxford. His brother Warren died on 9 April 1973 and was buried in the same grave.
Media coverage of Lewis's death was almost completely overshadowed by news of the assassination of US President John F. Kennedy, which occurred on the same day (approximately 55 minutes following Lewis's collapse), as did the death of English writer Aldous Huxley, author of "Brave New World". This coincidence was the inspiration for Peter Kreeft's book "Between Heaven and Hell: A Dialog Somewhere Beyond Death with John F. Kennedy, C. S. Lewis, & Aldous Huxley". Lewis is commemorated on 22 November in the church calendar of the Episcopal Church.
Lewis began his academic career as an undergraduate student at Oxford University, where he won a triple first, the highest honours in three areas of study. He was then elected a Fellow of Magdalen College, Oxford, where he worked for nearly thirty years, from 1925 to 1954. In 1954, he was awarded the newly founded chair of Mediaeval and Renaissance Literature at Cambridge University, and was elected a fellow of Magdalene College. Concerning his appointed academic field, he argued that there was no such thing as an English Renaissance. Much of his scholarly work concentrated on the later Middle Ages, especially its use of allegory. His "The Allegory of Love" (1936) helped reinvigorate the serious study of late medieval narratives such as the "Roman de la Rose".
Lewis was commissioned to write the volume "English Literature in the Sixteenth Century (Excluding Drama)" for the Oxford History of English Literature. His book "A Preface to "Paradise Lost"" is still cited as a criticism of that work. His last academic work, "The Discarded Image: An Introduction to Medieval and Renaissance Literature" (1964), is a summary of the medieval world view, a reference to the "discarded image" of the cosmos.
Lewis was a prolific writer, and his circle of literary friends became an informal discussion society known as the "Inklings", including J. R. R. Tolkien, Nevill Coghill, Lord David Cecil, Charles Williams, Owen Barfield, and his brother Warren Lewis. Glyer points to December 1929 as the Inklings' beginning date. Lewis's friendship with Coghill and Tolkien grew during their time as members of the Kolbítar, an Old Norse reading group that Tolkien founded and which ended around the time of the inception of the Inklings. At Oxford, he was the tutor of poet John Betjeman, critic Kenneth Tynan, mystic Bede Griffiths, novelist Roger Lancelyn Green and Sufi scholar Martin Lings, among many other undergraduates. Curiously, the religious and conservative Betjeman detested Lewis, whereas the anti-establishment Tynan retained a lifelong admiration for him.
Of Tolkien, Lewis writes in "Surprised by Joy":
In addition to his scholarly work, Lewis wrote several popular novels, including the science fiction "Space Trilogy" for adults and the Narnia fantasies for children. Most deal implicitly with Christian themes such as sin, humanity's fall from grace, and redemption.
His first novel after becoming a Christian was "The Pilgrim's Regress" (1933), which depicted his experience with Christianity in the style of John Bunyan's "The Pilgrim's Progress". The book was poorly received by critics at the time, although David Martyn Lloyd-Jones, one of Lewis's contemporaries at Oxford, gave him much-valued encouragement. Asked by Lloyd-Jones when he would write another book, Lewis replied, "When I understand the meaning of prayer."
The "Space Trilogy" (also called the "Cosmic Trilogy" or "Ransom Trilogy") dealt with what Lewis saw as the dehumanising trends in contemporary science fiction. The first book, "Out of the Silent Planet", was apparently written following a conversation with his friend J.R.R. Tolkien about these trends. Lewis agreed to write a "space travel" story and Tolkien a "time travel" one, but Tolkien never completed "The Lost Road", linking his Middle-earth to the modern world. Lewis's main character Elwin Ransom is based in part on Tolkien, a fact to which Tolkien alludes in his letters.
The second novel, "Perelandra", depicts a new Garden of Eden on the planet Venus, a new Adam and Eve, and a new "serpent figure" to tempt Eve. The story can be seen as an account of what might have happened if the terrestrial Adam had defeated the serpent and avoided the Fall of Man, with Ransom intervening in the novel to "ransom" the new Adam and Eve from the deceptions of the enemy. The third novel, "That Hideous Strength", develops the theme of nihilistic science threatening traditional human values, embodied in Arthurian legend.
Many ideas in the trilogy, particularly opposition to dehumanisation as portrayed in the third book, are presented more formally in "The Abolition of Man", based on a series of lectures by Lewis at Durham University in 1943. Lewis stayed in Durham, where he says he was overwhelmed by the magnificence of the cathedral. "That Hideous Strength" is in fact set in the environs of "Edgestow" university, a small English university like Durham, though Lewis disclaims any other resemblance between the two.
Walter Hooper, Lewis's literary executor, discovered a fragment of another science-fiction novel apparently written by Lewis called "The Dark Tower". Ransom appears in the story but it is not clear whether the book was intended as part of the same series of novels. The manuscript was eventually published in 1977, though Lewis scholar Kathryn Lindskoog doubts its authenticity.
"The Chronicles of Narnia" is a series of seven fantasy novels for children and is considered a classic of children's literature. Written between 1949 and 1954 and illustrated by Pauline Baynes, the series is Lewis's most popular work, having sold over 100 million copies in 41 languages . It has been adapted several times, complete or in part, for radio, television, stage and cinema.
The books contain Christian ideas intended to be easily accessible to young readers. In addition to Christian themes, Lewis also borrows characters from Greek and Roman mythology, as well as traditional British and Irish fairy tales.
Lewis wrote several works on Heaven and Hell. One of these, "The Great Divorce", is a short novella in which a few residents of Hell take a bus ride to Heaven, where they are met by people who dwell there. The proposition is that they can stay if they choose, in which case they can call the place where they had come from "Purgatory", instead of "Hell", but many find it not to their taste. The title is a reference to William Blake's "The Marriage of Heaven and Hell", a concept that Lewis found a "disastrous error". This work deliberately echoes two other more famous works with a similar theme: the "Divine Comedy" of Dante Alighieri, and Bunyan's "The Pilgrim's Progress".
Another short work, "The Screwtape Letters", consists of letters of advice from senior demon Screwtape to his nephew Wormwood on the best ways to tempt a particular human and secure his damnation. Lewis's last novel was "Till We Have Faces", which he thought of as his most mature and masterly work of fiction but which was never a popular success. It is a retelling of the myth of Cupid and Psyche from the unusual perspective of Psyche's sister. It is deeply concerned with religious ideas, but the setting is entirely pagan, and the connections with specific Christian beliefs are left implicit.
Before Lewis's conversion to Christianity, he published two books: "Spirits in Bondage", a collection of poems, and "Dymer", a single narrative poem. Both were published under the pen name Clive Hamilton. Other narrative poems have since been published posthumously, including "Launcelot", "The Nameless Isle", and "The Queen of Drum".
He also wrote "The Four Loves", which rhetorically explains four categories of love: friendship, eros, affection, and charity.
In 2009, a partial draft was discovered of "Language and Human Nature", which Lewis had begun co-writing with J. R. R. Tolkien, but which was never completed.
Lewis is also regarded by many as one of the most influential Christian apologists of his time, in addition to his career as an English professor and an author of fiction. "Mere Christianity" was voted best book of the 20th century by "Christianity Today" in 2000. He has been called "The Apostle to the Skeptics" due to his approach to religious belief as a sceptic, and his following conversion.
Lewis was very interested in presenting an argument from reason against metaphysical naturalism and for the existence of God. "Mere Christianity", "The Problem of Pain", and "Miracles" were all concerned, to one degree or another, with refuting popular objections to Christianity, such as the question, "How could a good God allow pain to exist in the world?" He also became a popular lecturer and broadcaster, and some of his writing originated as scripts for radio talks or lectures (including much of "Mere Christianity").
According to George Sayer, losing a 1948 debate with Elizabeth Anscombe, also a Christian, led Lewis to re-evaluate his role as an apologist, and his future works concentrated on devotional literature and children's books. Anscombe had a completely different recollection of the debate's outcome and its emotional effect on Lewis. Victor Reppert also disputes Sayer, listing some of Lewis's post-1948 apologetic publications, including the second and revised edition of his "Miracles" in 1960, in which Lewis addressed Anscombe's criticism. Noteworthy too is Roger Teichman's suggestion in "The Philosophy of Elizabeth Anscombe" that the intellectual impact of Anscombe's paper on Lewis's philosophical self-confidence should not be over-rated: "... it seems unlikely that he felt as irretrievably crushed as some of his acquaintances have made out; the episode is probably an inflated legend, in the same category as the affair of Wittgenstein's Poker. Certainly, Anscombe herself believed that Lewis's argument, though flawed, was getting at something very important; she thought that this came out more in the improved version of it that Lewis presented in a subsequent edition of "Miracles" – though that version also had 'much to criticize in it'."
Lewis wrote an autobiography titled "Surprised by Joy", which places special emphasis on his own conversion. He also wrote many essays and public speeches on Christian belief, many of which were collected in "God in the Dock" and "The Weight of Glory and Other Addresses".
His most famous works, the "Chronicles of Narnia", contain many strong Christian messages and are often considered allegory. Lewis, an expert on the subject of allegory, maintained that the books were not allegory, and preferred to call the Christian aspects of them "suppositional". As Lewis wrote in a letter to a Mrs. Hook in December 1958:
In a much-cited passage from "Mere Christianity", Lewis challenged the view that Jesus was a great moral teacher but not God. He argued that Jesus made several implicit claims to divinity, which would logically exclude that claim:
Although this argument is sometimes called "Lewis's trilemma", Lewis did not invent it but rather developed and popularised it. It has also been used by Christian apologist Josh McDowell in his book "More Than a Carpenter". It has been widely repeated in Christian apologetic literature, but largely ignored by professional theologians and biblical scholars.
Lewis's Christian apologetics, and this argument in particular, have been criticised. Philosopher John Beversluis described Lewis's arguments as "textually careless and theologically unreliable", and this particular argument as logically unsound and an example of a false dilemma. Theologian John Hick argues that New Testament scholars do not now support the view that Jesus claimed to be God. New Testament scholar N. T. Wright criticises Lewis for failing to recognise the significance of Jesus' Jewish identity and setting – an oversight which "at best, drastically short-circuits the argument" and which lays Lewis open to criticism that his argument "doesn't work as history, and it backfires dangerously when historical critics question his reading of the gospels", although he believes this "doesn't undermine the eventual claim".
Lewis used a similar argument in "The Lion, the Witch and the Wardrobe", when the old Professor advises the young heroes that their sister's claims of a magical world must logically be taken as either lies, madness, or truth.
One of the main theses in Lewis's apologia is that there is a common morality known throughout humanity, which he calls "natural law". In the first five chapters of "Mere Christianity", Lewis discusses the idea that people have a standard of behaviour to which they expect others to adhere. Lewis claims that people all over the earth know what this law is and know when they break it. He goes on to claim that there must be someone or something behind such a universal set of principles.
Lewis also portrays Universal Morality in his works of fiction. In "The Chronicles of Narnia" he describes Universal Morality as the "deep magic" which everyone knew.
In the second chapter of "Mere Christianity", Lewis recognises that "many people find it difficult to understand what this Law of Human Nature ... is." And he responds first to the idea "that the Moral Law is simply our herd instinct" and second to the idea "that the Moral Law is simply a social convention". In responding to the second idea Lewis notes that people often complain that one set of moral ideas is better than another, but that this actually argues for there existing some "Real Morality" to which they are comparing other moralities. Finally, he notes that sometimes differences in moral codes are exaggerated by people who confuse differences in beliefs about morality with differences in beliefs about facts:
Lewis also had fairly progressive views on the topic of "animal morality", in particular the suffering of animals, as is evidenced by several of his essays: most notably, "On Vivisection" and "On the Pains of Animals".
Lewis continues to attract a wide readership. In 2008, "The Times" ranked him eleventh on their list of "the 50 greatest British writers since 1945". Readers of his fiction are often unaware of what Lewis considered the Christian themes of his works. His Christian apologetics are read and quoted by members of many Christian denominations. In 2013, on the 50th anniversary of his death, Lewis joined some of Britain's greatest writers recognised at Poets' Corner, Westminster Abbey. The dedication service, at noon on 22 November 2013, included a reading from "The Last Battle" by Douglas Gresham, younger stepson of Lewis. Flowers were laid by Walter Hooper, trustee and literary advisor to the Lewis Estate. An address was delivered by former Archbishop of Canterbury Rowan Williams. The floor stone inscription is a quotation from an address by Lewis:
Lewis has been the subject of several biographies, a few of which were written by close friends, such as Roger Lancelyn Green and George Sayer. In 1985, the screenplay "Shadowlands" by William Nicholson dramatised Lewis's life and relationship with Joy Davidman Gresham. It was aired on British television starring Joss Ackland and Claire Bloom. This was also staged as a theatre play starring Nigel Hawthorne in 1989 and made into the 1993 feature film "Shadowlands" starring Anthony Hopkins and Debra Winger.
Many books have been inspired by Lewis, including "A Severe Mercy" by his correspondent and friend Sheldon Vanauken. "The Chronicles of Narnia" has been particularly influential: modern children's literature influenced to varying degrees by Lewis's series includes Daniel Handler's "A Series of Unfortunate Events", Eoin Colfer's "Artemis Fowl", Philip Pullman's "His Dark Materials", and J. K. Rowling's "Harry Potter". Pullman is an atheist and is known to be sharply critical of C. S. Lewis's work, accusing Lewis of featuring religious propaganda, misogyny, racism, and emotional sadism in his books. However, he has also modestly praised "The Chronicles of Narnia" for being a "more serious" work of literature in comparison with Tolkien's "trivial" "The Lord of the Rings". Authors of adult fantasy literature such as Tim Powers have also testified to being influenced by Lewis's work.
In "A Sword Between the Sexes? C. S. Lewis and the Gender Debates", Mary Stewart Van Leeuwen finds in Lewis's work "a hierarchical and essentialist view of class and gender" corresponding to an upbringing during the Edwardian era.
Most of Lewis's posthumous work has been edited by his literary executor Walter Hooper. Kathryn Lindskoog, an independent Lewis scholar, argued that Hooper's scholarship is not reliable and that he has made false statements and attributed forged works to Lewis. C. S. Lewis's stepson Douglas Gresham denies the forgery claims, saying, "The whole controversy thing was engineered for very personal reasons ... Her fanciful theories have been pretty thoroughly discredited."
A bronze statue of Lewis's character Digory from "The Magician's Nephew" stands in Belfast's Holywood Arches in front of the Holywood Road Library.
Several C. S. Lewis Societies exist around the world, including one which was founded in Oxford in 1982. The C.S. Lewis Society at the University of Oxford meets at Pusey House during term time to discuss papers on the life and works of Lewis and the other Inklings, and generally appreciate all things Lewisian.
Live-action film adaptations have been made of three of "The Chronicles of Narnia": "The Lion, the Witch and the Wardrobe" (2005), "Prince Caspian" (2008) and "The Voyage of the Dawn Treader" (2010).
Lewis is featured as a main character in "The Chronicles of the Imaginarium Geographica" series by James A. Owen. He is one of two characters in Mark St. Germain's 2009 play "Freud's Last Session", which imagines a meeting between Lewis, aged 40, and Sigmund Freud, aged 83, at Freud's house in Hampstead, London, in 1939, as the Second World War is about to break out.
The CS Lewis Nature Reserve, on ground owned by Lewis, lies behind his house, The Kilns. There is public access. | https://en.wikipedia.org/wiki?curid=5813 |
Chinese dominoes
Chinese dominoes are used in several tile-based games, namely, tien gow, pai gow, tiu u and kap tai shap. In Cantonese they are called "gwat pai" (), which literally means "bone tiles"; it is also the name of a northern Chinese game, where the rules are quite different from the southern Chinese version of Tien Gow.
Ming author Xie Zhaozhe (1567–1624) records the legend of dominoes having been presented to Song Emperor Huizong in 1112. However, Huizong's contemporary Li Qingzhao (1084 – ) made no mention of dominoes in her compendium of games. The oldest confirmed written mention of dominoes in China comes from the "Former Events in Wulin" (i.e. the capital Hangzhou), written by the Yuan Dynasty (1271–1368) author Zhou Mi (1232–1298), who listed "pupai" (gambling plaques or dominoes) as well as dice as items sold by peddlers during the reign of Song Emperor Xiaozong. Andrew Lo asserts that Zhou Mi meant dominoes when referring to "pupai", since the Ming author Lu Rong (1436–1494) explicitly defined "pupai" as dominoes (in regard to a story of a suitor who won a maiden's hand by drawing out four winning "pupai" from a set). Tiles dating from the 12th to 14th centuries have survived. Unlike most modern tiles, they are white with black and red pips.
During the Qing dynasty (1644–1912), the suits known as "Chinese" and "barbarian" were renamed "civil" and "military" respectively, to avoid offending the ruling Manchus. Tiles with blank ends, like those found in Western "double-six" dominoes, once existed during the 17th century. These games employed two sets of "double-six" tiles. It is possible that these were the types of dominoes that made their way to Europe the following century.
Each tile pattern in the Chinese domino set represents one outcome of a throw of two six-sided dice. Each combination is used only once, so there are 21 unique patterns. Eleven of these patterns are repeated, giving a set of 32 tiles divided into two "suits" or groups called "civil" and "military". There are no markings on the tiles to distinguish the suits; a player must simply remember which tiles belong to which group.
The tile set contains two each of eleven civil suit tiles (6-6, 1-1, 4-4, 1-3, 5-5, 3-3, 2-2, 5-6, 4-6, 1-6, 1-5) and one each of ten military suit tiles (3-6, 4-5; 2-6, 3-5; 2-5, 3-4; 2-4; 1-4, 2-3; 1-2). Each civil tile also has a Chinese name (and common rough translation to English): the 6-6 is "tin" (heaven), 1-1 is "dei" (earth), 4-4 is "yan" (man), 1-3 is "ngo" (goose or harmony), 5-5 is "mui" (plum flower), 3-3 is "cheung" (long), 2-2 is "ban" (board), 5-6 is "fu" (hatchet), 4-6 is "ping" (partition), 1-6 is "tsat" (long leg seven), and 1-5 is "luk" (big head six).
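The composition of the set can be worked out mechanically from the description above. The following Python sketch (written for illustration here, not taken from any source on the games; the variable names are my own) enumerates the 21 dice patterns and assembles the 32-tile set:

```python
from itertools import combinations_with_replacement

# All 21 distinct outcomes of a throw of two six-sided dice,
# represented as (low, high) pairs.
patterns = list(combinations_with_replacement(range(1, 7), 2))
assert len(patterns) == 21

# The eleven civil-suit patterns listed above appear twice in the set;
# the remaining ten (military) patterns appear once.
CIVIL = {(6, 6), (1, 1), (4, 4), (1, 3), (5, 5), (3, 3),
         (2, 2), (5, 6), (4, 6), (1, 6), (1, 5)}

tiles = []
for pattern in patterns:
    copies = 2 if pattern in CIVIL else 1
    tiles.extend([pattern] * copies)

assert len(tiles) == 32  # 11 civil tiles doubled + 10 military tiles
```

Note that the suits live only in the players' memories (and here in the CIVIL set); nothing on a physical tile marks them.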
The civil tiles are ranked according to the Chinese cultural significance of the tile names, and must be memorized. The hendiatris of Heaven, Earth and Man dates back over two thousand years, while the harmony of the three has appeared in dice and domino games since at least the Ming dynasty. Remembering the suits and rankings of the tiles is easier if one understands the Chinese names of the tiles and the symbolism behind them. The military tiles are named and ranked according to the total points on the tiles. For example, the "nines" (3-6 and 4-5) rank higher than the "eights" (2-6 and 3-5).
The military tiles (since there is only one of each) are also considered to be five mixed "pairs" (for example, the 3-6 and 4-5 tiles "match" because they have the same total points and both belong to the military suit). Among the military tiles, individual tiles of the same pair (such as 1-4 and 2-3) rank equally. The 2-4 and 1-2 are an odd pair: they are the only tiles in the whole set that do not match other tiles in the normal sense. This pair when played together is considered a suit on its own, called the "gi jun" (supreme). It is the highest ranking pair in the game of Pai Gow, though the tiles rank low individually (in their normal order). When a tile of this pair is played individually in the game of Tien Gow, each takes its regular ranking among other military suit tiles according to the total points. The rankings of the individual tiles are similar in most games. However, the ranking of combination tiles is slightly different in Pai Gow and Tien Gow.
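The matching rule just described can be stated compactly in code. The sketch below is again illustrative only (the function name is invented for this example): civil tiles match their duplicates, military tiles match by equal total points, and 2-4 with 1-2 form the special supreme pair.

```python
CIVIL = {(6, 6), (1, 1), (4, 4), (1, 3), (5, 5), (3, 3),
         (2, 2), (5, 6), (4, 6), (1, 6), (1, 5)}
GI_JUN = {(2, 4), (1, 2)}  # the "gi jun" (supreme) pair

def is_matching_pair(t1, t2):
    """Return True if tiles t1 and t2 form a pair (tiles as (low, high))."""
    if {t1, t2} == GI_JUN:
        return True                    # supreme pair: matches only itself
    if t1 in CIVIL or t2 in CIVIL:
        return t1 == t2                # civil tiles pair with their duplicates
    return sum(t1) == sum(t2)          # military tiles pair by total points

assert is_matching_pair((3, 6), (4, 5))      # the "nines"
assert is_matching_pair((2, 4), (1, 2))      # gi jun
assert not is_matching_pair((2, 4), (2, 6))  # no match across point totals
```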
Using the same coloring scheme as traditional Chinese dice, every half-domino with 1 or 4 pips has those pips colored red (for example, the 4-5 domino has four red pips and five white pips). The only exception is the pair of 6-6 tiles: half of the pips on the 6-6 domino are colored red to make them stand out as the top-ranking tiles.
There are also sets where the tiles have Xiangqi characters next to the pips. As Xiangqi also has 32 pieces, these dual-use sets can be used to play Giog. Variant sets include the Digging Flowers game, in which some tiles have flowers or frames printed on them while others have their values duplicated and may have mahjong-type flower and blank tiles.
The eponymous game of Bone Tiles ("gǔpái" in Mandarin) is played in northern and central China and as far south as Hunan. The name suggests that it is or became the default game played with dominoes in those regions. It is a trick-taking game similar to Tien Gow but has been simplified. In single-tile tricks, the civil and military suits have been merged into a single suit. In double-tile tricks, there is a new ranking order similar to Pai Gow. Triple-tile and quadruple-tile tricks are not allowed as in older versions of Tien Gow. Scoring has been simplified to number of stacks won. | https://en.wikipedia.org/wiki?curid=5814 |
Estonian language
Estonian is a Uralic language of the Finnic branch spoken in Estonia. It is the official language of Estonia, spoken natively by about 1.1 million people: 922,000 people in Estonia and 160,000 outside Estonia. It is a Southern Finnic language and is the second-most-spoken language among all the Finnic languages.
Estonian is closely related to Finnish and belongs to the Finnic branch of the Uralic language family. Alongside Finnish, Hungarian, and Maltese, Estonian is one of the four official languages of the European Union that are not of Indo-European origin. Despite some overlap in vocabulary due to borrowing, in terms of origin Estonian and Finnish are not related to their nearest geographical neighbours, Swedish, Latvian, and Russian, which are all Indo-European languages.
Although the Estonian and Germanic languages are of very different origins, one can identify many similar words in Estonian and German, for example. This is primarily because the Estonian language has borrowed nearly one third of its vocabulary from Germanic languages, mainly from Low Saxon (Middle Low German) during the period of German rule, and from High German (including Standard German). The percentage of Low Saxon and High German loanwords can be estimated at 22–25 percent, with Low Saxon making up about 15 percent. Swedish and Russian are the other two important sources of borrowings.
Estonian is a predominantly agglutinative language, but unlike Finnish it has lost vowel harmony, front vowels now occurring exclusively on the first or stressed syllable, although in older texts and in South Estonian dialects vowel harmony can still be recognized. Furthermore, the loss of word-final sounds has been extensive, and this has made its inflectional morphology markedly more fusional, especially with respect to noun and adjective inflection. Estonian thus represents a transitional form from an agglutinating to a fusional language, having developed a rich morphological system over the course of its history.
Word order is considerably more flexible than in English, but the basic order is subject–verb–object.
The two historical Estonian languages (sometimes considered dialects), North and South Estonian, arose from the migration of the ancestors of modern Estonians into the territory of Estonia in at least two different waves, the two groups speaking considerably different Finnic vernaculars. Modern standard Estonian has evolved on the basis of the dialects of Northern Estonia.
The oldest written records of the Finnic languages of Estonia date from the 13th century. "Origines Livoniae" in the Chronicle of Henry of Livonia contains Estonian place names, words and fragments of sentences.
The earliest extant samples of connected (north) Estonian are the so-called Kullamaa prayers dating from 1524 and 1528. In 1525 the first book in the Estonian language was printed: a Lutheran text that never reached its readers and was destroyed immediately after publication.
The first extant Estonian book is a bilingual German-Estonian translation of the Lutheran catechism by S. Wanradt and J. Koell dating to 1535, during the Protestant Reformation period. An Estonian grammar book to be used by priests was printed in German in 1637. The New Testament was translated into southern Estonian in 1686 (northern Estonian, 1715). The two languages were united based on northern Estonian by Anton thor Helle.
Writings in Estonian became more significant in the 19th century during the Estophile Enlightenment Period (1750–1840).
The birth of native Estonian literature was in 1810 to 1820, when the patriotic and philosophical poems by Kristjan Jaak Peterson were published. Peterson, who was the first student at the then German-language University of Dorpat to acknowledge his Estonian origin, is commonly regarded as a herald of Estonian national literature and considered the founder of modern Estonian poetry. His birthday, March 14, is celebrated in Estonia as Mother Tongue Day. A fragment from Peterson's poem "Kuu" expresses a claim to reestablish the birthright of the Estonian language:
In English:
In the period from 1525 to 1917, 14,503 titles were published in Estonian; by comparison, between 1918 and 1940, 23,868 titles were published.
In modern times, Jaan Kross, Jaan Kaplinski and Viivi Luik are three of Estonia's best-known and most translated writers.
Writings in Estonian became significant only in the 19th century with the spread of the ideas of the Age of Enlightenment, during the Estophile Enlightenment Period (1750–1840). Although Baltic Germans at large regarded the future of Estonians as being a fusion with themselves, the Estophile educated class admired the ancient culture of the Estonians and their era of freedom before the conquests by Danes and Germans in the 13th century.
After the Estonian War of Independence in 1919, the Estonian language became the state language of the newly independent country. In 1945, 97.3% of Estonia's population considered themselves ethnic Estonians and spoke the language.
When Estonia was invaded and occupied by the Soviet Union in World War II, the status of the Estonian language changed to the first of two official languages (Russian being the other one). As in Latvia, many immigrants entered Estonia under Soviet encouragement. In the second half of the 1970s, the pressure of bilingualism (for Estonians) intensified, resulting in widespread knowledge of Russian throughout the country. Russian was termed ‘the language of friendship of nations’ and was taught to Estonian children, sometimes as early as kindergarten. Although teaching Estonian to non-Estonians in schools was compulsory, in practice learning the language was often considered unnecessary.
During the Perestroika era, The Law on the Status of the Estonian Language was adopted in January 1989. The 1991 collapse of the Soviet Union led to the restoration of the Republic of Estonia's independence. Estonian went back to being the only state language in Estonia which in practice meant that use of Estonian was promoted while the use of Russian was discouraged.
The return of Soviet-era immigrants to their countries of origin has brought the proportion of Estonians in Estonia back above 70%. Again as in Latvia, many of the remaining non-Estonians in Estonia have adopted the Estonian language; about 40% had done so at the 2000 census.
The Estonian dialects are divided into two groups – the northern and southern dialects, historically associated with the cities of Tallinn in the north and Tartu in the south, in addition to a distinct "kirderanniku" dialect, Northeastern coastal Estonian.
The northern group consists of the "keskmurre" or central dialect that is also the basis for the standard language, the "läänemurre" or western dialect, roughly corresponding to Lääne County and Pärnu County, the "saarte murre" (islands' dialect) of Saaremaa, Hiiumaa, Muhu and Kihnu, and the "idamurre" or eastern dialect on the northwestern shore of Lake Peipus.
South Estonian consists of the Tartu, Mulgi, Võro and Seto varieties. These are sometimes considered either variants of South Estonian or separate languages altogether. Also, Seto and Võro distinguish themselves from each other less by language and more by their culture and their respective Christian confession.
Estonian employs the Latin script as the basis for its alphabet, which adds the letters "ä", "ö", "ü", and "õ", plus the later additions "š" and "ž". The letters "c", "q", "w", "x" and "y" are limited to proper names of foreign origin, and "f", "z", "š", and "ž" appear in loanwords and foreign names only. "Ö" and "Ü" are pronounced similarly to their equivalents in Swedish and German. Unlike in standard German but like Swedish (when followed by 'r') and Finnish, "Ä" is pronounced [æ], as in English "mat". The vowels Ä, Ö and Ü are clearly separate phonemes and inherent in Estonian, although the letter shapes come from German. The letter "õ" denotes , unrounded , or a close-mid back unrounded vowel. It is almost identical to the Bulgarian ъ and the Vietnamese ơ, and is also used to transcribe the Russian ы.
Although the Estonian orthography is generally guided by phonemic principles, with each grapheme corresponding to one phoneme, there are some historical and morphological deviations from this: for example preservation of the morpheme in declension of the word (writing b, g, d in places where p, k, t is pronounced) and in the use of 'i' and 'j'. Where it is very impractical or impossible to type "š" and "ž", they are substituted with "sh" and "zh" in some written texts, although this is considered incorrect. Otherwise, the "h" in "sh" represents a voiceless glottal fricative, as in "Pasha" ("pas-ha"); this also applies to some foreign names.
Modern Estonian orthography is based on the "Newer Orthography" created by Eduard Ahrens in the second half of the 19th century based on Finnish orthography. The "Older Orthography" it replaced was created in the 17th century by Bengt Gottfried Forselius and Johann Hornung based on standard German orthography. Earlier writing in Estonian had by and large used an "ad hoc" orthography based on Latin and Middle Low German orthography. Some influences of the standard German orthography — for example, writing 'W'/'w' instead of 'V'/'v' — persisted well into the 1930s.
Estonian words and names quoted in international publications from Soviet sources are often back-transliterations from the Russian transliteration. Examples are the use of "ya" for "ä" (e.g. Pyarnu instead of Pärnu), "y" instead of "õ" (e.g., Pylva instead of Põlva) and "yu" instead of "ü" (e.g., Pyussi instead of Püssi). Even in the "Encyclopædia Britannica" one can find "ostrov Khiuma", where "ostrov" means "island" in Russian and "Khiuma" is back-transliteration from Russian instead of "Hiiumaa" ("Hiiumaa" > Хийума(а) > "Khiuma").
There are 9 vowels and 36 diphthongs, 28 of which are native to Estonian. All nine vowels can appear as the first component of a diphthong, but only /ɑ e i o u/ occur as the second component. A vowel characteristic of Estonian is the unrounded back vowel /ɤ/, which may be close-mid back, close back, or close-mid central.
Typologically, Estonian represents a transitional form from an agglutinating language to a fusional language. The canonical word order is SVO (subject–verb–object).
In Estonian, nouns and pronouns do not have grammatical gender, but nouns and adjectives decline in fourteen cases: nominative, genitive, partitive, illative, inessive, elative, allative, adessive, ablative, translative, terminative, essive, abessive, and comitative, with the case and number of the adjective(s) always agreeing with that of the noun (except in the terminative, essive, abessive and comitative, where there is agreement only for the number, the adjective being in the genitive form). Thus the illative for "kollane maja" ("a yellow house") is "kollasesse majja" ("into a yellow house"), but the terminative is "kollase majani" ("as far as a yellow house"). With respect to the Proto-Finnic language, elision has occurred; thus, the actual case marker may be absent, but the stem is changed, cf. "maja – majja" and the Ostrobothnia dialect of Finnish "maja – majahan".
The direct object of the verb appears either in the accusative (for total objects) or in the partitive (for partial objects). The accusative coincides with the genitive in the singular and with nominative in the plural. Accusative vs. partitive case opposition of the object used with transitive verbs creates a telicity contrast, just as in Finnish. This is a rough equivalent of the perfective vs. imperfective aspect opposition.
The verbal system lacks a distinctive future tense (the present tense serves here) and features special forms to express an action performed by an undetermined subject (the "impersonal").
Although the Estonian and Germanic languages are of very different origins and Estonian vocabulary is considered quite different from that of the Indo-European family, one can identify many similar words in Estonian and English, for example. This is primarily because the Estonian language has borrowed nearly one third of its vocabulary from Germanic languages, mainly from Low Saxon (Middle Low German) during the period of German rule, and from High German (including standard German). The percentage of Low Saxon and High German loanwords can be estimated at 22–25 percent, with Low Saxon making up about 15 percent.
Often 'b' & 'p' are interchangeable, for example 'baggage' becomes 'pagas', 'lob' (to throw) becomes 'loopima'. The initial letter 's' before another consonant is often dropped, for example 'skool' becomes 'kool', 'stool' becomes 'tool'.
Estonian language planners such as Ado Grenzstein (a journalist active in Estonia in the 1870s–90s) tried to use formation "ex nihilo", Urschöpfung; i.e. they created new words out of nothing.
The most famous reformer of Estonian, Johannes Aavik (1880–1973), used creations "ex nihilo" (cf. ‘free constructions’, Tauli 1977), along with other sources of lexical enrichment such as derivations, compositions and loanwords (often from Finnish; cf. Saareste and Raun 1965: 76). In Aavik's dictionary (1921), which lists approximately 4000 words, there are many words which were (allegedly) created "ex nihilo", many of which are in common use today. Examples are
Many of the coinages that have been considered (often by Aavik himself) as words concocted "ex nihilo" could well have been influenced by foreign lexical items, for example words from Russian, German, French, Finnish, English and Swedish. Aavik had a broad classical education and knew Ancient Greek, Latin and French. Consider "roim" ‘crime’ versus English "crime" or "taunima" ‘to condemn, disapprove’ versus Finnish "tuomita" ‘to condemn, to judge’ (these Aavikisms appear in Aavik's 1921 dictionary). These words might be better regarded as a peculiar manifestation of morpho-phonemic adaptation of a foreign lexical item.
Article 1 of the Universal Declaration of Human Rights in Estonian: | https://en.wikipedia.org/wiki?curid=10223 |
E-Prime
E-Prime (short for English-Prime or English Prime, sometimes denoted É or E′) is a version of the English language that excludes all forms of the verb "to be", including all conjugations, contractions and archaic forms.
Some scholars advocate using E-Prime as a device to clarify thinking and strengthen writing. A number of other scholars have criticized E-Prime's utility.
D. David Bourland Jr., who had studied under Alfred Korzybski, devised E-Prime as an addition to Korzybski's general semantics in the late 1940s. Bourland published the concept in a 1965 essay entitled "A Linguistic Note: Writing in E-Prime" (originally published in "General Semantics Bulletin"). The essay quickly generated controversy within the general semantics field, partly because practitioners of general semantics sometimes saw Bourland as attacking the verb "to be" as such, and not just certain usages.
Bourland collected and published three volumes of essays in support of his innovation. The first (1991), co-edited by Paul Dennithorne Johnston, bore the title: "To Be or Not: An E-Prime Anthology".
For the second, "More E-Prime: To Be or Not II", published in 1994, he added a third editor, Jeremy Klein. Bourland and Johnston then edited a third book, "E-Prime III: a third anthology", published in 1997.
In the English language, the verb 'to be' (also known as the "copula") has several distinct functions: marking identity ("The cat is my only pet"), class membership ("The cat is an animal"), predication ("The cat is furry"), serving as an auxiliary ("The cat is sleeping"; "The cat is bitten by the dog"), and asserting existence ("There is a cat") or location ("The cat is on the mat").
Bourland sees specifically the "identity" and "predication" functions as pernicious, but advocates eliminating all forms for the sake of simplicity. In the case of the "existence" form (and, less idiomatically, the "location" form), one might simply substitute the verb "exists". Other copula-substitutes in English include "taste", "feel", "smell", "sound", "grow", "remain", "stay", and "turn", among others, which a user of E-Prime might use instead of "to be".
Words not used in E-Prime include: be, being, been, am, is, isn't, are, aren't, was, wasn't, were, and weren't.
Contractions formed from a pronoun and a form of "to be" are also not used, including: I'm, you're, we're, they're, he's, she's, it's, there's, here's, where's, how's, what's, who's, and that's. E-Prime also prohibits contractions of "to be" found in nonstandard dialects of English, such as "ain't".
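Because the exclusions form a fixed word list, checking a text against E-Prime lends itself to a simple mechanical scan. The Python sketch below is a hypothetical illustration written for this article, not a tool from the E-Prime literature; note that a contraction such as "he's" can also stand for "he has" (which E-Prime permits), so a serious checker would need grammatical context.

```python
import re

# Forms of "to be" and the contractions excluded by E-Prime, as listed above.
FORBIDDEN = {
    "be", "being", "been", "am", "is", "isn't", "are", "aren't",
    "was", "wasn't", "were", "weren't", "ain't",
    "i'm", "you're", "we're", "they're", "he's", "she's", "it's",
    "there's", "here's", "where's", "how's", "what's", "who's", "that's",
}

def e_prime_violations(text):
    """Return the words in text that E-Prime excludes.

    Caveat: "he's", "it's", etc. may contract "has", which E-Prime allows;
    this naive scan flags them anyway.
    """
    words = re.findall(r"[a-z']+", text.lower())
    return [w for w in words if w in FORBIDDEN]

print(e_prime_violations("The cat is on the mat."))   # ['is']
print(e_prime_violations("I feel depressed today."))  # []
```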
Bourland and other advocates also suggest that use of E-Prime leads to a less dogmatic style of language that reduces the possibility of misunderstanding or conflict.
Kellogg and Bourland describe misuse of the verb "to be" as creating a "deity mode of speech", allowing "even the most ignorant to transform their opinions magically into god-like pronouncements on the nature of things".
While teaching at the University of Florida, Alfred Korzybski counseled one group of students to eliminate the infinitive and verb forms of "to be" from their vocabulary, whereas a second group continued to use "I am," "You are," "They are" statements as usual. For example, instead of saying, "I am depressed," a student was asked to eliminate that emotionally primed verb and to say something else, such as, "I feel depressed when ..." or "I tend to make myself depressed about ..."
Korzybski observed improvement "of one full letter grade" by "students who did not generalize by using that infinitive".
Albert Ellis advocated the use of E-Prime when discussing psychological distress to encourage framing these experiences as temporary (see also Solution focused brief therapy) and to encourage a sense of agency by specifying the subject of statements. According to Ellis, rational emotive behavior therapy "has favored E-Prime more than any other form of psychotherapy and I think it is still the only form of therapy that has some of its main books written in E-Prime". However, Ellis did not always use E-Prime because he believed it interferes with readability.
Many authors have questioned E-Prime's effectiveness at improving readability and reducing prejudice (Lakoff, 1992; Murphy, 1992; Parkinson, 1992; Kenyon, 1992; French, 1992, 1993; Lohrey, 1993). These authors observed that a communication under the copula ban can remain extremely unclear and imply prejudice, while losing important speech patterns, such as identities and identification. Further, prejudices and judgments that are made are more difficult to notice or refute. James D. French, a computer programmer at the University of California, Berkeley, summarized ten arguments against E-Prime (in the context of general semantics) as follows:
According to an article (written in E-Prime and advocating a role for E-Prime in ESL and EFL programs) published by the Office of English Language Programs of the Bureau of Educational and Cultural Affairs in the State Department of the United States, "Requiring students to avoid the verb to be on every assignment would deter students from developing other fundamental skills of fluent writing." | https://en.wikipedia.org/wiki?curid=10224 |
Elliptic curve
In mathematics, an elliptic curve is a plane algebraic curve defined by an equation of the form
"y"2 = "x"3 + "ax" + "b"
which is non-singular; that is, the curve has no cusps or self-intersections. (When the coefficient field has characteristic 2 or 3, the above equation is not quite general enough to comprise all non-singular cubic curves; see below.)
Formally, an elliptic curve is a smooth, projective, algebraic curve of genus one, on which there is a specified point "O". An elliptic curve is an abelian variety – that is, it has a multiplication defined algebraically, with respect to which it is an abelian group – and "O" serves as the identity element. Often the curve itself, without "O" specified, is called an elliptic curve; the point "O" is often taken to be the curve's "point at infinity" in the projective plane.
If "y"2 = "P"("x"), where "P" is any polynomial of degree three in "x" with no repeated roots, the solution set is a nonsingular plane curve of genus one, an elliptic curve. If "P" has degree four and is square-free this equation again describes a plane curve of genus one; however, it has no natural choice of identity element. More generally, any algebraic curve of genus one, for example from the intersection of two quadric surfaces embedded in three-dimensional projective space, is called an elliptic curve, provided that it has at least one rational point to act as the identity.
Using the theory of elliptic functions, it can be shown that elliptic curves defined over the complex numbers correspond to embeddings of the torus into the complex projective plane. The torus is also an abelian group, and in fact this correspondence is also a group isomorphism.
Elliptic curves are especially important in number theory, and constitute a major area of current research; for example, they were used in Andrew Wiles's proof of Fermat's Last Theorem. They also find applications in elliptic curve cryptography (ECC) and integer factorization.
An elliptic curve is "not" an ellipse: see elliptic integral for the origin of the term. Topologically, a complex elliptic curve is a torus.
Although the formal definition of an elliptic curve is fairly technical and requires some background in algebraic geometry, it is possible to describe some features of elliptic curves over the real numbers using only introductory algebra and geometry.
In this context, an elliptic curve is a plane curve defined by an equation of the form
"y"2 = "x"3 + "ax" + "b"
where "a" and "b" are real numbers. This type of equation is called a Weierstrass equation.
The definition of elliptic curve also requires that the curve be non-singular. Geometrically, this means that the graph has no cusps, self-intersections, or isolated points. Algebraically, this holds if and only if the discriminant
Δ = −16(4"a"3 + 27"b"2)
is not equal to zero. (Although the factor −16 is irrelevant to whether or not the curve is non-singular, this definition of the discriminant is useful in a more advanced study of elliptic curves.)
The (real) graph of a non-singular curve has "two" components if its discriminant is positive, and "one" component if it is negative. For example, the curve "y"2 = "x"3 − "x" has discriminant 64 (two components), while "y"2 = "x"3 − "x" + 1 has discriminant −368 (one component).
When working in the projective plane, we can define a group structure on any smooth cubic curve. In Weierstrass normal form, such a curve will have an additional point at infinity, "O", at the homogeneous coordinates [0:1:0] which serves as the identity of the group.
Since the curve is symmetrical about the x-axis, given any point "P", we can take −"P" to be the point opposite it. We take −"O" to be just "O".
If "P" and "Q" are two points on the curve, then we can uniquely describe a third point, "P" + "Q", in the following way. First, draw the line that intersects "P" and "Q". This will generally intersect the cubic at a third point, "R". We then take "P" + "Q" to be −"R", the point opposite "R".
This definition for addition works except in a few special cases related to the point at infinity and intersection multiplicity. The first is when one of the points is "O". Here, we define "P" + "O" = "P" = "O" + "P", making "O" the identity of the group. Next, if "P" and "Q" are opposites of each other, we define "P" + "Q" = "O". Lastly, if "P" = "Q" we only have one point, thus we can't define the line between them. In this case, we use the tangent line to the curve at this point as our line. In most cases, the tangent will intersect a second point "R" and we can take its opposite. However, if "P" happens to be an inflection point (a point where the concavity of the curve changes), we take "R" to be "P" itself and "P" + "P" is simply the point opposite itself.
For a cubic curve not in Weierstrass normal form, we can still define a group structure by designating one of its nine inflection points as the identity "O". In the projective plane, each line will intersect a cubic at three points when accounting for multiplicity. For a point "P", −"P" is defined as the unique third point on the line passing through "O" and "P". Then, for any "P" and "Q", "P" + "Q" is defined as −"R" where "R" is the unique third point on the line containing "P" and "Q".
Let "K" be a field over which the curve is defined (i.e., the coefficients of the defining equation or equations of the curve are in "K") and denote the curve by "E". Then the "K"-rational points of "E" are the points on "E" whose coordinates all lie in "K", including the point at infinity. The set of "K"-rational points is denoted by "E"("K"). It, too, forms a group, because properties of polynomial equations show that if "P" is in "E"("K"), then −"P" is also in "E"("K"), and if two of "P", "Q", and "R" are in "E"("K"), then so is the third. Additionally, if "K" is a subfield of "L", then "E"("K") is a subgroup of "E"("L").
The above group can be described algebraically as well as geometrically. Given the curve "y"2 = "x"3 + "ax" + "b" over the field "K" (whose characteristic we assume to be neither 2 nor 3), and points "P" = ("xP", "yP") and "Q" = ("xQ", "yQ") on the curve, assume first that "xP" ≠ "xQ". Let "y" = "sx" + "d" be the line that intersects "P" and "Q", which has the following slope:
"s" = ("yP" − "yQ") / ("xP" − "xQ")
Since "K" is a field, "s" is well-defined. The line equation and the curve equation have an identical "y" in the points "xP", "xQ", and "xR", so
("sx" + "d")2 = "x"3 + "ax" + "b",
which is equivalent to "x"3 − "s"2"x"2 + ("a" − 2"sd")"x" + ("b" − "d"2) = 0. We know that this equation has its roots in exactly the same "x"-values as
("x" − "xP")("x" − "xQ")("x" − "xR") = 0.
We equate the coefficient for "x"2 and solve for "xR"; "yR" follows from the line equation. This defines "R" = ("xR", "yR") = −("P" + "Q") with
"xR" = "s"2 − "xP" − "xQ" and "yR" = "yP" + "s"("xR" − "xP").
If "xP" = "xQ", then there are two options: if "yP" = −"yQ" (third and fourth panes below), including the case where "yP" = "yQ" = 0 (fourth pane), then the sum is defined as 0; thus, the inverse of each point on the curve is found by reflecting it across the "x"-axis. If "yP" = "yQ" ≠ 0, then "Q" = "P" and "R" = ("x""R", "y""R") = −("P" + "P") = −2"P" = −2"Q" (second pane below with "P" shown for "R") is given by
The formulation of elliptic curves as the embedding of a torus in the complex projective plane follows naturally from a curious property of Weierstrass's elliptic functions. These functions and their first derivative are related by the formula
℘′("z")2 = 4℘("z")3 − "g"2℘("z") − "g"3
Here, "g"2 and "g"3 are constants; ℘("z") is the Weierstrass elliptic function and ℘′("z") its derivative. It should be clear that this relation is in the form of an elliptic curve (over the complex numbers). The Weierstrass functions are doubly periodic; that is, they are periodic with respect to a lattice Λ; in essence, the Weierstrass functions are naturally defined on a torus "T" = C/Λ. This torus may be embedded in the complex projective plane by means of the map
"z" ↦ [℘("z") : ℘′("z") : 1].
This map is a group isomorphism of the torus (considered with its natural group structure) with the chord-and-tangent group law on the cubic curve which is the image of this map. It is also an isomorphism of Riemann surfaces from the torus to the cubic curve, so topologically, an elliptic curve is a torus. If the lattice Λ is related by multiplication by a non-zero complex number "c" to a lattice "c"Λ, then the corresponding curves are isomorphic. Isomorphism classes of elliptic curves are specified by the j-invariant.
The isomorphism classes can be understood in a simpler way as well. The constants "g"2 and "g"3, called the modular invariants, are uniquely determined by the lattice, that is, by the structure of the torus. However, all real polynomials factorize completely into linear factors over the complex numbers, since the field of complex numbers is the algebraic closure of the reals. So, the elliptic curve may be written as
"y"2 = "x"("x" − 1)("x" − λ)
One finds that "g"2 and "g"3 are determined by λ, so that the modular discriminant is
Δ = 16λ2(λ − 1)2
Here, λ is sometimes called the modular lambda function.
Note that the uniformization theorem implies that every compact Riemann surface of genus one can be represented as a torus.
This also allows an easy understanding of the torsion points on an elliptic curve: if the lattice Λ is spanned by the fundamental periods ω1 and ω2, then the "n"-torsion points are the (equivalence classes of) points of the form
("a"/"n")ω1 + ("b"/"n")ω2
for "a" and "b" integers in the range from 0 to "n"−1.
Over the complex numbers, every elliptic curve has nine inflection points. Every line through two of these points also passes through a third inflection point; the nine points and twelve lines formed in this way form a realization of the Hesse configuration.
A curve "E" defined over the field of rational numbers is also defined over the field of real numbers. Therefore, the law of addition (of points with real coordinates) by the tangent and secant method can be applied to "E". The explicit formulae show that the sum of two points "P" and "Q" with rational coordinates has again rational coordinates, since the line joining "P" and "Q" has rational coefficients. This way, one shows that the set of rational points of "E" forms a subgroup of the group of real points of "E". As this group, it is an abelian group, that is, "P" + "Q" = "Q" + "P".
The most important result is that all points can be constructed by the method of tangents and secants starting with a "finite" number of points. More precisely the Mordell–Weil theorem states that the group "E"(Q) is a finitely generated (abelian) group. By the fundamental theorem of finitely generated abelian groups it is therefore a finite direct sum of copies of Z and finite cyclic groups.
The proof of that theorem rests on two ingredients: first, one shows that for any integer "m" > 1, the quotient group "E"(Q)/"mE"(Q) is finite (weak Mordell–Weil theorem). Second, one introduces a height function "h" on the rational points "E"(Q) defined by "h"("P"0) = 0 and "h"("P") = log max(|"p"|, |"q"|) if "P" (unequal to the point at infinity "P"0) has as abscissa the rational number "x" = "p"/"q" (with coprime "p" and "q"). This height function "h" has the property that "h"("mP") grows roughly like the square of "m". Moreover, only finitely many rational points with height smaller than any constant exist on "E".
The proof of the theorem is thus a variant of the method of infinite descent and relies on the repeated application of Euclidean divisions on "E": let "P" ∈ "E"(Q) be a rational point on the curve; writing "P" as the sum 2"P"1 + "Q"1, where "Q"1 is a fixed representative of "P" in "E"(Q)/2"E"(Q), the height of "P"1 is about 1/4 of the one of "P" (more generally, replacing 2 by any "m" > 1, and 1/4 by 1/"m"2). Redoing the same with "P"1, that is to say "P"1 = 2"P"2 + "Q"2, then "P"2 = 2"P"3 + "Q"3, etc. finally expresses "P" as an integral linear combination of points "Qi" and of points whose height is bounded by a fixed constant chosen in advance: by the weak Mordell–Weil theorem and the second property of the height function, "P" is thus expressed as an integral linear combination of a finite number of fixed points.
So far, the theorem is not effective since there is no known general procedure for determining the representatives of "E"(Q)/"mE"(Q).
The rank of "E"(Q), that is the number of copies of Z in "E"(Q) or, equivalently, the number of independent points of infinite order, is called the "rank" of "E". The Birch and Swinnerton-Dyer conjecture is concerned with determining the rank. One conjectures that it can be arbitrarily large, even if only examples with relatively small rank are known. The elliptic curve with biggest exactly known rank is
It has rank 20, found by Noam Elkies and Zev Klagsbrun in 2020. Curves of rank at least 28 are known, but their rank is not exactly known.
As for the groups constituting the torsion subgroup of "E"(Q), the following is known: the torsion subgroup of "E"(Q) is one of the 15 following groups (a theorem due to Barry Mazur): Z/"N"Z for "N" = 1, 2, ..., 10, or 12, or Z/2Z × Z/2"N"Z with "N" = 1, 2, 3, 4. Examples for every case are known. Moreover, elliptic curves whose Mordell–Weil groups over Q have the same torsion groups belong to a parametrized family.
The "Birch and Swinnerton-Dyer conjecture" (BSD) is one of the Millennium problems of the Clay Mathematics Institute. The conjecture relies on analytic and arithmetic objects defined by the elliptic curve in question.
On the analytic side, an important ingredient is a function of a complex variable, "L", the Hasse–Weil zeta function of "E" over Q. This function is a variant of the Riemann zeta function and Dirichlet L-functions. It is defined as an Euler product, with one factor for every prime number "p".
For a curve "E" over Q given by a minimal equation
with integral coefficients formula_20, reducing the coefficients modulo "p" defines an elliptic curve over the finite field F"p" (except for a finite number of primes "p", where the reduced curve has a singularity and thus fails to be elliptic, in which case "E" is said to be of bad reduction at "p").
The zeta function of an elliptic curve over a finite field F"p" is, in some sense, a generating function assembling the information of the number of points of "E" with values in the finite field extensions F"pn" of F"p". It is given by
"Z"("E"/F"p"; "T") = exp(Σ"n"≥1 #"E"(F"pn") "T""n"/"n")
The interior sum of the exponential resembles the development of the logarithm and, in fact, the so-defined zeta function is a rational function:
"Z"("E"/F"p"; "T") = (1 − "ap""T" + "pT"2) / ((1 − "T")(1 − "pT"))
where the 'trace of Frobenius' term "ap" is defined to be the negative of the difference between the number of points on the elliptic curve "E" over F"p" and the 'expected' number "p" + 1, viz.:
"ap" = "p" + 1 − #"E"(F"p")
There are two points to note about this quantity. First, these "ap" are not to be confused with the "ai" in the definition of the curve "E" above: this is just an unfortunate clash of notation. Second, we may define the same quantities and functions over an arbitrary finite field of characteristic "p", with "q" = "pn" replacing "p" everywhere.
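For small primes, "ap" can be obtained by naive point counting, as in the following Python sketch (full enumeration over all coordinates, practical only for small "p"):

def trace_of_frobenius(a, b, p):
    # a_p = p + 1 - #E(F_p) for y^2 = x^3 + a*x + b over F_p, p a prime > 3.
    count = 1  # the point at infinity
    for x in range(p):
        rhs = (x * x * x + a * x + b) % p
        for y in range(p):
            if (y * y) % p == rhs:
                count += 1
    return p + 1 - count

# Example: y^2 = x^3 - x over F_5 has 8 points, so a_5 = 6 - 8 = -2.
print(trace_of_frobenius(-1, 0, 5))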
The Hasse–Weil zeta function of "E" over Q is then defined by collecting this information together, for all primes "p". It is defined by
"L"("E", "s") = Π"p" (1 − "ap""p"−"s" + ε("p")"p"1−2"s")−1
where ε("p") = 1 if "E" has good reduction at "p" and 0 otherwise (in which case "ap" is defined differently from the method above: see Silverman (1986) below).
This product converges for Re("s") > 3/2 only. Hasse's conjecture affirms that the "L"-function admits an analytic continuation to the whole complex plane and satisfies a functional equation relating, for any "s", "L"("E", "s") to "L"("E", 2 − "s"). In 1999 this was shown to be a consequence of the proof of the Shimura–Taniyama–Weil conjecture, which asserts that every elliptic curve over "Q" is a modular curve, which implies that its "L"-function is the "L"-function of a modular form whose analytic continuation is known.
One can therefore speak about the values of "L"("E", "s") at any complex number "s". The Birch–Swinnerton-Dyer conjecture relates the arithmetic of the curve to the behavior of its "L"-function at "s" = 1. More precisely, it affirms that the order of the "L"-function at "s" = 1 equals the rank of "E" and predicts the leading term of the Laurent series of "L"("E", "s") at that point in terms of several quantities attached to the elliptic curve.
Much like the Riemann hypothesis, this conjecture has multiple consequences, including the following two:
The modularity theorem, once known as the Taniyama–Shimura–Weil conjecture, states that every elliptic curve "E" over Q is a modular curve, that is to say, its Hasse–Weil zeta function is the "L"-function of a modular form of weight 2 and level "N", where "N" is the conductor of "E" (an integer divisible by the same prime numbers as the discriminant of "E", Δ("E")). In other words, if, for Re("s") > 3/2, one writes the "L"-function in the form
"L"("E", "s") = Σ"n">0 "a"("n")"n"−"s",
the expression
Σ"n">0 "a"("n")"q""n", with "q" = exp(2π"iz"),
defines a parabolic modular newform of weight 2 and level "N". For prime numbers ℓ not dividing "N", the coefficient "a"(ℓ) of the form equals ℓ minus the number of solutions of the minimal equation of the curve modulo ℓ.
For example, to the elliptic curve "y"2 − "y" = "x"3 − "x" with discriminant (and conductor) 37, is associated the form
"f"("z") = "q" − 2"q"2 − 3"q"3 + 2"q"4 − 2"q"5 + 6"q"6 − "q"7 + ..., with "q" = exp(2π"iz").
For prime numbers ℓ not equal to 37, one can verify the property about the coefficients. Thus, for ℓ = 3, there are 6 solutions of the equation modulo 3: (0, 0), (0, 1), (2, 0), (1, 0), (1, 1), (2, 1); thus "a"(3) = 3 − 6 = −3.
The conjecture, going back to the 1950s, was completely proven by 1999 using ideas of Andrew Wiles, who proved it in 1994 for a large family of elliptic curves.
There are several formulations of the conjecture. Showing that they are equivalent is difficult and was a main topic of number theory in the second half of the 20th century. The modularity of an elliptic curve "E" of conductor "N" can be expressed also by saying that there is a non-constant rational map defined over Q, from the modular curve "X"0("N") to "E". In particular, the points of "E" can be parametrized by modular functions.
For example, the curve "y"2 − "y" = "x"3 − "x" admits a modular parametrization by two functions "x"("z") and "y"("z") given by series in "q" = exp(2π"iz"). The functions "x"("z") and "y"("z") are modular of weight 0 and level 37; in other words they are meromorphic, defined on the upper half-plane Im("z") > 0, and satisfy
"x"(("az" + "b")/("cz" + "d")) = "x"("z")
and likewise for "y"("z"), for all integers "a", "b", "c", "d" with "ad" − "bc" = 1 and 37|"c".
Another formulation depends on the comparison of Galois representations attached on the one hand to elliptic curves, and on the other hand to modular forms. The latter formulation has been used in the proof of the conjecture. Dealing with the level of the forms (and the connection to the conductor of the curve) is particularly delicate.
The most spectacular application of the conjecture is the proof of Fermat's Last Theorem (FLT). Suppose that for a prime "p" ≥ 5, the Fermat equation
"a""p" + "b""p" = "c""p"
has a solution with non-zero integers, hence a counter-example to FLT. Then as Yves Hellegouarch was the first to notice, the elliptic curve
"y"2 = "x"("x" − "a""p")("x" + "b""p")
of discriminant
Δ = 16("abc")2"p"
cannot be modular. Thus, the proof of the Taniyama–Shimura–Weil conjecture for this family of elliptic curves (called Hellegouarch–Frey curves) implies FLT. The proof of the link between these two statements, based on an idea of Gerhard Frey (1985), is difficult and technical. It was established by Kenneth Ribet in 1987.
This section is concerned with points "P" = ("x", "y") of "E" such that "x" is an integer. The following theorem is due to C. L. Siegel: the set of points "P" = ("x", "y") of "E"(Q) such that "x" is an integer is finite. This theorem can be generalized to points whose "x" coordinate has a denominator divisible only by a fixed finite set of prime numbers.
The theorem can be formulated effectively. For example, if the Weierstrass equation of "E" has integer coefficients bounded by a constant "H", then the coordinates ("x", "y") of a point of "E" with both "x" and "y" integer satisfy an explicit bound in terms of "H".
The Sato–Tate conjecture is a statement about how the error term "aq" in Hasse's theorem varies with the different primes "q", when an elliptic curve "E" over Q is reduced modulo "q". It was proven (for almost all such curves) in 2006 due to the results of Taylor, Harris and Shepherd-Barron, and says that the error terms are equidistributed.
Elliptic curves over finite fields are notably applied in cryptography and for the factorization of large integers. These algorithms often make use of the group structure on the points of "E". Algorithms that are applicable to general groups, for example the group of invertible elements in finite fields, F*"q", can thus be applied to the group of points on an elliptic curve. For example, the discrete logarithm is such an algorithm. The appeal is that choosing an elliptic curve allows for more flexibility than choosing "q" (and thus the group of units in F"q"). Also, the group structure of elliptic curves is generally more complicated.
Elliptic curves over finite fields are used in some cryptographic applications as well as for integer factorization. Typically, the general idea in these applications is that a known algorithm which makes use of certain finite groups is rewritten to use the groups of rational points of elliptic curves. For more see also:
Serge Lang, in the introduction to the book cited below, stated that "It is possible to write endlessly on elliptic curves. (This is not a threat.)" The following short list is thus at best a guide to the vast expository literature available on the theoretical, algorithmic, and cryptographic aspects of elliptic curves. | https://en.wikipedia.org/wiki?curid=10225 |
Equidae
Equidae (sometimes known as the horse family) is the taxonomic family of horses and related animals, including the extant horses, donkeys, and zebras, and many other species known only from fossils. All extant species are in the genus "Equus". Equidae belongs to the order Perissodactyla, which includes the extant tapirs and rhinoceroses, and several extinct families.
The term equid refers to any member of this family, including any equine.
The oldest known fossils assigned to Equidae date from the early Eocene, 54 million years ago. They used to be assigned to the genus "Hyracotherium", but the type species of that genus is now regarded as not a member of this family. The other species have been split off into different genera. These early equids were fox-sized animals with three toes on the hind feet, and four on the front feet. They were herbivorous browsers on relatively soft plants, and were already adapted for running. The complexity of their brains suggests that they were already alert and intelligent animals. Later species reduced the number of toes and developed teeth more suited for grinding up grasses and other tough plant food.
The equids, like other perissodactyls, are hindgut fermenters. They have evolved specialized teeth that cut and shear tough plant matter to accommodate their fibrous diet. Their seemingly inefficient digestion strategy is a result of their size at the time of its evolution: they would already have had to be relatively large mammals for such a strategy to support them.
The family became relatively diverse during the Miocene, with many new species appearing. By this time, equids were more truly horse-like, having developed the typical body shape of the modern animals. Many of these species bore the main weight of their bodies on their central, third, toe, with the others becoming reduced, and barely touching the ground, if at all. The sole surviving genus, "Equus", had evolved by the early Pleistocene, and spread rapidly through the world. | https://en.wikipedia.org/wiki?curid=10229 |
List of economists
This is an incomplete alphabetical list by surname of notable economists, experts in the social science of economics, past and present. For a history of economics, see the article History of economic thought. Only economists with biographical articles in Wikipedia are listed here.
| https://en.wikipedia.org/wiki?curid=10231 |
ELIZA
ELIZA is an early natural language processing computer program created from 1964 to 1966 at the MIT Artificial Intelligence Laboratory by Joseph Weizenbaum. Created to demonstrate the superficiality of communication between humans and machines, Eliza simulated conversation by using a "pattern matching" and substitution methodology that gave users an illusion of understanding on the part of the program, but had no built-in framework for contextualizing events. Directives on how to interact were provided by "scripts", written originally in MAD-Slip, which allowed ELIZA to process user inputs and engage in discourse following the rules and directions of the script. The most famous script, DOCTOR, simulated a Rogerian psychotherapist (in particular, Carl Rogers, who was well known for simply parroting back at patients what they had just said),
and used rules, dictated in the script, to respond with non-directional questions to user inputs. As such, ELIZA was one of the first chatterbots and one of the first programs capable of attempting the Turing test.
ELIZA's creator, Weizenbaum, regarded the program as a method to show the superficiality of communication between man and machine, but was surprised by the number of individuals who attributed human-like feelings to the computer program, including Weizenbaum's secretary. Many academics believed that the program would be able to positively influence the lives of many people, particularly those suffering from psychological issues, and that it could aid doctors working on such patients' treatment. While ELIZA was capable of engaging in discourse, it could not converse with true understanding. However, many early users were convinced of ELIZA's intelligence and understanding, despite Weizenbaum's insistence to the contrary.
Joseph Weizenbaum’s ELIZA, running the DOCTOR script, was created to provide a parody of "the responses of a non-directional psychotherapist in an initial psychiatric interview" and to "demonstrate that the communication between man and machine was superficial". While ELIZA is most well known for acting in the manner of a psychotherapist, this mannerism is due to the data and instructions supplied by the DOCTOR script. ELIZA itself examined the text for keywords, applied values to said keywords, and transformed the input into an output; the script that ELIZA ran determined the keywords, set the values of keywords, and set the rules of transformation for the output. Weizenbaum chose to make the DOCTOR script in the context of psychotherapy to "sidestep the problem of giving the program a data base of real-world knowledge", as in a Rogerian therapeutic situation, the program had only to reflect back the patient's statements. The algorithms of DOCTOR allowed for a deceptively intelligent response, which deceived many individuals when first using the program.
Weizenbaum named his program ELIZA after Eliza Doolittle, a working-class character in George Bernard Shaw's "Pygmalion". According to Weizenbaum, ELIZA's ability to be "incrementally improved" by various users made it similar to Eliza Doolittle, since Eliza Doolittle was taught to speak with an upper-class accent in Shaw's play. However, unlike in Shaw's play, ELIZA is incapable of learning new patterns of speech or new words through interaction alone. Edits must be made directly to ELIZA’s active script in order to change the manner by which the program operates.
Weizenbaum first implemented ELIZA in his own SLIP list-processing language, where, depending upon the initial entries by the user, the illusion of human intelligence could appear, or be dispelled through several interchanges. Some of ELIZA's responses were so convincing that Weizenbaum and several others have anecdotes of users becoming emotionally attached to the program, occasionally forgetting that they were conversing with a computer. Weizenbaum's own secretary reportedly asked Weizenbaum to leave the room so that she and ELIZA could have a real conversation. Weizenbaum was surprised by this, later writing: "I had not realized ... that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people."
In 1966, interactive computing (via a teletype) was new. It was 15 years before the personal computer became familiar to the general public, and three decades before most people encountered attempts at natural language processing in Internet services like Ask.com or PC help systems such as Microsoft Office Clippit. Although those programs included years of research and work, ELIZA remains a milestone simply because it was the first time a programmer had attempted such a human-machine interaction with the goal of creating the illusion (however brief) of human–"human" interaction.
At the ICCC 1972 ELIZA met another early artificial-intelligence program named PARRY and had a computer-only conversation. While ELIZA was built to be a "Doctor", PARRY was intended to simulate a patient with schizophrenia.
Weizenbaum originally wrote ELIZA in MAD-Slip for the IBM 7094, as a program to make natural-language conversation possible with a computer. To accomplish this, Weizenbaum identified five "fundamental technical problems" for ELIZA to overcome: the identification of critical words, the discovery of a minimal context, the choice of appropriate transformations, the generation of responses appropriate to the transformation or in the absence of critical words and the provision of an ending capacity for ELIZA scripts. Weizenbaum solved these problems in his ELIZA program and made ELIZA such that it had no built-in contextual framework or universe of discourse. However, this required ELIZA to have a script of instructions on how to respond to inputs from users.
ELIZA starts its process of responding to an input by a user by first examining the text input for a "keyword". A "keyword" is a word designated as important by the acting ELIZA script, which assigns to each keyword a precedence number, or a RANK, designed by the programmer. If such words are found, they are put into a "keystack", with the keyword of the highest RANK at the top. The input sentence is then manipulated and transformed as the rule associated with the keyword of the highest RANK directs. For example, when the DOCTOR script encounters words such as "alike" or "same", it would output a message pertaining to similarity, in this case "In what way?", as these words had a high precedence number. This also demonstrates how certain words, as dictated by the script, can be manipulated regardless of contextual considerations, such as switching first-person pronouns and second-person pronouns and vice versa, as these too had high precedence numbers. Such words with high precedence numbers are deemed superior to conversational patterns and are treated independently of contextual patterns.
Following the first examination, the next step of the process is to apply an appropriate transformation rule, which includes two parts: the "decomposition rule" and the "reassembly rule". First, the input is reviewed for syntactical patterns in order to establish the minimal context necessary to respond. Using the keywords and other nearby words from the input, different disassembly rules are tested until an appropriate pattern is found. Using the script's rules, the sentence is then "dismantled" and arranged into sections of the component parts as the "decomposition rule for the highest-ranking keyword" dictates. The example that Weizenbaum gives is the input "I are very helpful" (remembering that "I" is "You" transformed), which is broken into (1) empty (2) "I" (3) "are" (4) "very helpful". The decomposition rule has broken the phrase into four small segments that contain both the keywords and the information in the sentence.
The decomposition rule then designates a particular reassembly rule, or set of reassembly rules, to follow when reconstructing the sentence. The reassembly rule then takes the fragments of the input that the decomposition rule had created, rearranges them, and adds in programmed words to create a response. Using Weizenbaum's example previously stated, such a reassembly rule would take the fragments and apply them to the phrase "What makes you think I am (4)", which would result in "What makes you think I am very helpful". This example is rather simple, since depending upon the disassembly rule, the output could be significantly more complex and use more of the input from the user. However, from this reassembly, ELIZA then sends the constructed sentence to the user in the form of text on the screen.
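The mechanism can be illustrated compactly. The following Python fragment is a toy sketch of keyword ranks with decomposition and reassembly rules, not Weizenbaum's original code; the rules and ranks are invented for the example, and a fuller version would also swap first- and second-person pronouns:

import re

# Each rule: (rank, decomposition pattern, reassembly template).
RULES = [
    (10, re.compile(r".*\b(?:alike|same)\b.*"), "In what way?"),
    (5,  re.compile(r".*\bI (?:am|feel) (.*)"), "How long have you felt {0}?"),
    (0,  re.compile(r".*"), "Please go on."),   # fallback when no keyword matches
]

def respond(sentence):
    # Try rules in order of decreasing rank; reassemble from the match groups.
    for rank, pattern, template in sorted(RULES, key=lambda r: -r[0]):
        m = pattern.match(sentence)
        if m:
            return template.format(*m.groups())

print(respond("I am depressed"))  # How long have you felt depressed?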
These steps represent the bulk of the procedures that ELIZA follows in order to create a response from a typical input, though there are several specialized situations that ELIZA/DOCTOR can respond to. One Weizenbaum specifically wrote about was when there is no keyword. One solution was to have ELIZA respond with a remark that lacked content, such as "I see" or "Please go on". The second method was to use a "MEMORY" structure, which recorded prior recent inputs, and would use these inputs to create a response referencing a part of the earlier conversation when encountered with no keywords. This was possible due to Slip’s ability to tag words for other usage, which simultaneously allowed ELIZA to examine, store and repurpose words for usage in outputs.
While these functions were all framed in ELIZA's programming, the exact manner by which the program dismantled, examined, and reassembled inputs is determined by the operating script. However, the script is not static and can be edited, or a new one created, as is necessary for the operation in the context needed (thus how ELIZA can "learn" new information). This would allow the program to be applied in multiple situations, including the well-known DOCTOR script, which simulates a Rogerian psychotherapist.
Weizenbaum's original MAD-SLIP implementation was re-written in Lisp by Bernie Cosell. A BASIC version appeared in Creative Computing in 1977 (although it was written in 1973 by Jeff Shrager). This version, which was ported to many of the earliest personal computers, appears to have been subsequently translated into many other versions in many other languages.
Another version of Eliza popular among software engineers is the version that comes with the default release of GNU Emacs, and which can be accessed by typing M-x doctor from most modern Emacs implementations.
In 1969, George Lucas and Walter Murch incorporated an Eliza-like dialogue interface in their screenplay for the feature film "THX-1138". Inhabitants of the underground future world of THX, when stressed, would retreat to "confession booths" and initiate a one-sided Eliza-formula conversation with a Jesus-faced computer who claimed to be "Omm".
ELIZA influenced a number of early computer games by demonstrating additional kinds of interface designs. Don Daglow wrote an enhanced version of the program called "Ecala" on a DEC PDP-10 minicomputer at Pomona College in 1973 before writing the computer role-playing game "Dungeon" (1975).
In the 2008 anime RD Sennou Chousashitsu, aka "Real Drive", a character named Eliza Weizenbaum appears, an obvious tribute to ELIZA and Joseph Weizenbaum. Her behavior in the story often mimics the responses of the ELIZA program.
The 2011 video game "Deus Ex: Human Revolution" features an artificial-intelligence Picus TV Network newsreader named Eliza Cassan.
In January 2018, the twelfth episode of the American sitcom "Young Sheldon" starred the protagonist "conversing" with ELIZA, hoping to resolve a domestic issue.
On July 19, 2018, ELIZA was in a brief mention by the protagonist of the movie "Zoe" to support his reasoning behind why his relationship with Zoe, a hyper-realistic AI, wasn't real.
On August 12, 2019, independent game developer Zachtronics published a visual novel called "Eliza", about an AI-based counseling service inspired by ELIZA.
Lay responses to ELIZA were disturbing to Weizenbaum and motivated him to write his book "Computer Power and Human Reason: From Judgment to Calculation", in which he explains the limits of computers, as he wants to make clear in people's minds his opinion that the anthropomorphic views of computers are just a reduction of the human being and any life form for that matter. In the independent documentary film "Plug & Pray" (2010) Weizenbaum said that only people who misunderstood ELIZA called it a sensation.
The Israeli poet David Avidan, who was fascinated with future technologies and their relation to art, desired to explore the use of computers for writing literature. He conducted several conversations with an APL implementation of ELIZA and published them – in English, and in his own translation to Hebrew – under the title "My Electronic Psychiatrist – Eight Authentic Talks with a Computer". In the foreword he presented it as a form of constrained writing.
There are many programs based on ELIZA in different programming languages. In 1980 a company called "Don't Ask Software" created a version called "Abuse" for the Apple II, Atari, and Commodore 64 computers, which verbally abused the user based on the user's input. Other versions adapted ELIZA around a religious theme, such as ones featuring Jesus (both serious and comedic), and another Apple II variant called "I Am Buddha". The 1980 game "The Prisoner" incorporated ELIZA-style interaction within its gameplay. In 1988 the British artist and friend of Weizenbaum Brian Reffin Smith created and showed at the exhibition "Salamandre" in the Musée du Berry, Bourges, France, two art-oriented ELIZA-style programs written in BASIC, one called "Critic" and the other "Artist", running on two separate Amiga 1000 computers. The visitor was supposed to help them converse by typing in to "Artist" what "Critic" said, and vice versa. The secret was that the two programs were identical. GNU Emacs formerly had a "psychoanalyze-pinhead" command that simulates a session between ELIZA and Zippy the Pinhead. The Zippyisms were removed due to copyright issues, but the DOCTOR program remains.
ELIZA has been referenced in popular culture and continues to be a source of inspiration for programmers and developers focused on artificial intelligence. It was also featured in a 2012 exhibit at Harvard University titled "Go Ask A.L.I.C.E", as part of a celebration of mathematician Alan Turing's 100th birthday. The exhibit explores Turing's lifelong fascination with the interaction between humans and computers, pointing to ELIZA as one of the earliest realizations of Turing's ideas. | https://en.wikipedia.org/wiki?curid=10235 |
ELIZA effect
The ELIZA effect, in computer science, is the tendency to unconsciously assume computer behaviors are analogous to human behaviors; that is, anthropomorphisation.
In its specific form, the ELIZA effect refers only to "the susceptibility of people to read far more understanding than is warranted into strings of symbols—especially words—strung together by computers". A trivial example of the specific form of the Eliza effect, given by Douglas Hofstadter, involves an automated teller machine which displays the words "THANK YOU" at the end of a transaction. A (very) casual observer might think that the machine is actually expressing gratitude; however, the machine is only printing a preprogrammed string of symbols.
More generally, the ELIZA effect describes any situation where, based solely on a system's output, users perceive computer systems as having "intrinsic qualities and abilities which the software controlling the (output) cannot possibly achieve" or "assume that [outputs] reflect a greater causality than they actually do". In both its specific and general forms, the ELIZA effect is notable for occurring even when users of the system are aware of the determinate nature of output produced by the system. From a psychological standpoint, the ELIZA effect is the result of a subtle cognitive dissonance between the user's awareness of programming limitations and their behavior towards the output of the program. The discovery of the ELIZA effect was an important development in artificial intelligence, demonstrating the principle of using social engineering rather than explicit programming to pass a Turing test.
The effect is named for the 1966 chatterbot ELIZA, developed by MIT computer scientist Joseph Weizenbaum. When executing Weizenbaum's DOCTOR script, ELIZA parodied a Rogerian psychotherapist, largely by rephrasing the patient's replies as questions.
Though designed strictly as a mechanism to support "natural language conversation" with a computer, ELIZA's "DOCTOR" script was found to be surprisingly successful in eliciting emotional responses from users who, in the course of interacting with the program, began to ascribe understanding and motivation to the program's output. As Weizenbaum later wrote, "I had not realized ... that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people." Indeed, ELIZA's code had not been designed to evoke this reaction in the first place. Upon observation, researchers discovered users unconsciously assuming ELIZA's questions implied interest and emotional involvement in the topics discussed, "even when they consciously knew that ELIZA did not simulate emotion". | https://en.wikipedia.org/wiki?curid=10236 |
Exponentiation by squaring
In mathematics and computer programming, exponentiating by squaring is a general method for fast computation of large positive integer powers of a number, or more generally of an element of a semigroup, like a polynomial or a square matrix. Some variants are commonly referred to as square-and-multiply algorithms or binary exponentiation. These can be of quite general use, for example in modular arithmetic or powering of matrices. For semigroups for which additive notation is commonly used, like elliptic curves used in cryptography, this method is also referred to as double-and-add.
The method is based on the observation that, for a positive integer "n", we have
"x""n" = "x"("x"2)("n"−1)/2 if "n" is odd, and "x""n" = ("x"2)"n"/2 if "n" is even.
This method uses the bits of the exponent to determine which powers are computed.
This example shows how to compute "x"13 using this method.
The exponent, 13, is 1101 in binary. The bits are used in left to right order.
The exponent has 4 bits, so there are 4 iterations.
First, initialize the result to 1: "r" = 1. Then, for each bit from left to right, square "r", and additionally multiply by "x" when the bit is 1. For 1101, this gives "r" = "x", then "x"3, then "x"6, and finally "x"13.
If we write "n" in binary as ("n""k"−1 ... "n"0)2 with "n""k"−1 = 1, then this is equivalent to defining a sequence "r""k"−1, ..., "r"0 by letting "r""k"−1 = "x" and then defining "ri" = ("ri"+1)2 · "x""ni" for "i" = "k" − 2, ..., 0, where "r"0 will equal "x""n".
This may be implemented as the following recursive algorithm:
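In Python, the recursion may be sketched as follows (the function name is illustrative; the negative-exponent case assumes an invertible base):

def exp_by_squaring(x, n):
    if n < 0:
        return exp_by_squaring(1 / x, -n)
    if n == 0:
        return 1
    if n % 2 == 0:
        return exp_by_squaring(x * x, n // 2)            # x^n = (x^2)^(n/2)
    return x * exp_by_squaring(x * x, (n - 1) // 2)      # x^n = x * (x^2)^((n-1)/2)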
Although not tail-recursive, this algorithm may be rewritten into a tail recursive algorithm by introducing an auxiliary function:
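With an auxiliary accumulator, a tail-recursive Python sketch looks like this (Python itself does not eliminate tail calls; the shape merely mirrors the transformation):

def exp_by_squaring_tail(x, n):
    def aux(acc, x, n):
        # Invariant: the answer equals acc * x^n.
        if n == 0:
            return acc
        if n % 2 == 0:
            return aux(acc, x * x, n // 2)
        return aux(acc * x, x * x, (n - 1) // 2)
    return aux(1, x, n)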
A tail-recursive variant may also be constructed using a pair of accumulators instead of an auxiliary function, as seen in the F# example below. The accumulators a1 and a2 can be thought of as storing the values "x""i" and "x""j", where "i" and "j" are initialized to 1 and 0 respectively. In the even case "i" is doubled, and in the odd case "j" is increased by "i". The final result is a2 = "x""j", where "j" = "n" upon termination.
let exp_by_squaring x n =
    let rec aux a1 a2 n =
        if n = 0 then a2                                  // result: a2 = x^j with j = n
        elif n % 2 = 0 then aux (a1 * a1) a2 (n / 2)      // even: double i
        else aux (a1 * a1) (a2 * a1) (n / 2)              // odd: j <- j + i, then double i
    aux x 1 n
The iterative version of the algorithm also uses only a bounded auxiliary space, repeatedly squaring the base and conditionally multiplying into an accumulator as the bits of the exponent are scanned.
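A Python sketch of this bit-scanning loop:

def exp_by_squaring_iterative(x, n):
    # Scans the bits of n from least significant to most significant.
    result = 1
    while n > 0:
        if n % 2 == 1:       # current bit set: multiply into the result
            result *= x
        x *= x               # square the base for the next bit
        n //= 2
    return result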
A brief analysis shows that such an algorithm uses ⌊log2 "n"⌋ squarings and at most ⌊log2 "n"⌋ multiplications, where ⌊ ⌋ denotes the floor function. More precisely, the number of multiplications is one less than the number of ones present in the binary expansion of "n". For "n" greater than about 4 this is computationally more efficient than naively multiplying the base with itself repeatedly.
Each squaring results in approximately double the number of digits of the previous, and so, if multiplication of two "d"-digit numbers is implemented in O("d""k") operations for some fixed "k", then the complexity of computing "x""n" is O(("nd")"k"), dominated by the cost of the final squarings.
This algorithm calculates the value of "xn" after expanding the exponent in base 2"k". It was first proposed by Brauer in 1939. In the algorithm below we make use of the following function "f"(0) = ("k", 0) and "f"("m") = ("s", "u"), where "m" = "u"·2"s" with "u" odd.
Algorithm: after precomputing "x"2, "x"3, ..., "x"2"k"−1, scan the exponent in base-2"k" digits from the most significant digit down; for each digit, square the accumulator "k" times and, if the digit is non-zero, multiply by the precomputed power it indicates (the function "f" above serves to handle trailing zero bits within a digit efficiently).
For optimal efficiency, "k" should be chosen as the smallest integer for which the cost of the table precomputation is offset by the multiplications it saves; the optimal "k" grows slowly with the number of bits of "n".
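A Python sketch of the 2"k"-ary idea (simplified: it omits the trailing-zero optimization via "f" that the full method uses):

def exp_2k_ary(x, n, k):
    base = 1 << k
    # Precompute x^0 .. x^(2^k - 1).
    table = [1] * base
    for i in range(1, base):
        table[i] = table[i - 1] * x
    # Digits of n in base 2^k, most significant first.
    digits = []
    while n > 0:
        digits.append(n % base)
        n //= base
    digits.reverse()
    result = 1
    for d in digits:
        for _ in range(k):       # k squarings shift the accumulator by one digit
            result *= result
        if d:
            result *= table[d]   # multiply in the digit's precomputed power
    return result

print(exp_2k_ary(2, 13, 2))  # 8192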
This method is an efficient variant of the 2"k"-ary method. For example, to calculate the exponent 398, which has binary expansion (110 001 110)2, we take a window of length 3 using the 2"k"-ary method algorithm and calculate 1, x3, x6, x12, x24, x48, x49, x98, x99, x198, x199, x398.
But, we can also compute 1, x3, x6, x12, x24, x48, x96, x192, x198, x199, x398, which saves one multiplication and amounts to evaluating (110 001 110)2
Here is the general algorithm:
Algorithm: scan the exponent's bits from the most significant end; on a zero bit, simply square the accumulator; on a one bit, take the longest window of at most "k" bits that ends in a one, square once per bit in the window, and multiply by the precomputed odd power the window encodes. Because every window ends in a one, only the odd powers "x", "x"3, ..., "x"2"k"−1 need to be precomputed.
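A Python sketch of a sliding-window exponentiation (window-selection details vary between presentations; this one assumes "n" ≥ 1):

def sliding_window_pow(x, n, k):
    # Precompute odd powers x^1, x^3, ..., x^(2^k - 1).
    x2 = x * x
    odd = {1: x}
    for i in range(3, 1 << k, 2):
        odd[i] = odd[i - 2] * x2
    bits = bin(n)[2:]
    result, i = 1, 0
    while i < len(bits):
        if bits[i] == "0":
            result *= result           # lone zero bit: a single squaring
            i += 1
        else:
            # Longest window of at most k bits that ends in a one.
            j = min(i + k, len(bits))
            while bits[j - 1] == "0":
                j -= 1
            w = int(bits[i:j], 2)      # odd value encoded by the window
            for _ in range(j - i):
                result *= result
            result *= odd[w]
            i = j
    return result

print(sliding_window_pow(2, 398, 3))  # equals 2**398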
Many algorithms for exponentiation do not provide defence against side-channel attacks. Namely, an attacker observing the sequence of squarings and multiplications can (partially) recover the exponent involved in the computation. This is a problem if the exponent should remain secret, as with many public-key cryptosystems. A technique called "Montgomery's ladder" addresses this concern.
Given the binary expansion of a positive, non-zero integer "n" = ("n""k"−1..."n"0)2 with "n"k−1 = 1, we can compute "xn" as follows:
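A Python sketch of the ladder (shown over plain integers for clarity; a cryptographic implementation would add modular reduction and branch-free, constant-time selection):

def montgomery_ladder(x, n):
    # Invariant throughout the loop: r1 == r0 * x.
    r0, r1 = 1, x
    for bit in bin(n)[2:]:            # bits of n, most significant first
        if bit == "0":
            r0, r1 = r0 * r0, r0 * r1
        else:
            r0, r1 = r0 * r1, r1 * r1
    return r0

print(montgomery_ladder(3, 13))  # 1594323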
The algorithm performs a fixed sequence of operations (up to log "n"): a multiplication and squaring takes place for each bit in the exponent, regardless of the bit's specific value. A similar algorithm for multiplication by doubling exists.
This specific implementation of Montgomery's ladder is not yet protected against cache timing attacks: memory access latencies might still be observable to an attacker, as different variables are accessed depending on the value of bits of the secret exponent. Modern cryptographic implementations use a "scatter" technique to make sure the processor always misses the faster cache.
There are several methods which can be employed to calculate "xn" when the base is fixed and the exponent varies. As one can see, precomputations play a key role in these algorithms.
Yao's method is orthogonal to the 2"k"-ary method, where the exponent is expanded in radix "b" = 2"k" and the computation is as performed in the algorithm above. Let "n", "b", "h", and "w" be integers.
Let the exponent be written as
"n" = "n"0"b"0 + "n"1"b"1 + ... + "n""w"−1"b""w"−1,
where 0 ≤ "ni" < "h" for all "i" = 0, ..., "w" − 1, and the "bi" are fixed basis values (for example "bi" = "b""i").
Let "xi" = "x""bi".
Then the algorithm uses the equality
"x""n" = "x"0"n"0 · "x"1"n"1 · ... · "x""w"−1"n""w"−1 = Π"j"=1..."h"−1 (Π"ni"="j" "xi")"j".
Given an element "x" of "G", an exponent "n" written in the above form, and the precomputed values "x""b"0, ..., "x""b""w"−1, the element "x""n" is calculated by accumulating the partial products "u" as described below.
If we set "h" = 2"k" and "bi" = "hi", then the values "ni" are simply the digits of "n" in base "h". Yao's method collects in "u" first those "xi" that appear to the highest power "h" − 1; in the next round those with power "h" − 2 are collected in "u" as well, etc. The variable "y" is multiplied by "u" in each round, so the "xi" collected first contribute "h" − 1 times, those collected next "h" − 2 times, and so on.
The algorithm uses "w" + "h" − 2 multiplications, and "w" + 1 elements must be stored to compute "x""n".
The Euclidean method was first introduced in "Efficient exponentiation using precomputation and vector addition chains" by P. de Rooij.
This method for computing "x""n" in group "G", where "n" is a natural integer, whose algorithm is given below, is using the following equality recursively:
"x"1"n"1 · "x"2"n"2 = ("x"1"q" · "x"2)"n"2 · "x"1"r",
where "q" = ⌊"n"1/"n"2⌋ and "r" = "n"1 mod "n"2.
In other words, a Euclidean division of the exponent "n"1 by "n"2 is used to return a quotient "q" and a rest "r".
Given the base element "x" in group "G", and the exponent "n" written as in Yao's method, the element "x""n" is calculated using "l" precomputed values "x""b"0, ..., "x""b""l"−1 and then the algorithm below.
The algorithm first finds the largest exponent "nM" among the "ni", and then the largest value "nN" among the remaining "ni" with "i" ≠ "M".
Then it raises "xM" to the power ⌊"nM"/"nN"⌋, multiplies this value with "xN", assigns the result to "xN", and replaces "nM" with "nM" modulo "nN".
The same idea allows fast computation of large exponents modulo a number. Especially in cryptography, it is useful to compute powers in a ring of integers modulo "q". It can also be used to compute integer powers in a group, using the rule "x"−"n" = ("x"−1)"n".
The method works in every semigroup and is often used to compute powers of matrices.
For example, the evaluation of
13789^722341 (mod 2345)
would take a very long time and lots of storage space if the naïve method were used: compute 13789^722341, then take the remainder when divided by 2345. Even using a more effective method will take a long time: square 13789, take the remainder when divided by 2345, multiply the result by 13789, and so on. This will take less than 722341 modular multiplications.
Applying above "exp-by-squaring" algorithm, with "*" interpreted as "x" * "y" = "xy" mod 2345 (that is, a multiplication followed by a division with remainder) leads to only 27 multiplications and divisions of integers, which may all be stored in a single machine word.
This is a non-recursive implementation of the above algorithm in Ruby.
An explicit rounding step is redundant when integer division implicitly rounds towards zero, as strongly-typed languages with integer division would do (n = n >> 1 has the same effect as halving). n & 1 is the rightmost bit of the binary representation of n, so if it is 1, then the number is odd, and if it is zero, then the number is even; it is also n modulo 2 (n % 2 has the same effect).
def power(x, n)
  result = 1
  while n > 0
    result *= x if (n & 1) == 1  # multiply when the rightmost bit is set
    x *= x                       # square the base
    n >>= 1                      # shift to the next bit
  end
  result
end
parameter x = 3, parameter n = 13 (binary 1101)
result := 3 (leftmost bit: start with the base)
result := 3 × 3 × 3 = 27 (next bit 1: square and multiply)
result := 27 × 27 = 729 (next bit 0: square only)
result := 729 × 729 × 3 = 1594323 (last bit 1: square and multiply)
This example is based on the algorithm above, calculated by hand from left to right over the bits of the exponent. The leading 1 contributes just the base itself; after that, each 1 bit means square and multiply, and each 0 bit means square only. The final result is 3 to the power 13, namely 1594323.
Exponentiation by squaring may also be used to calculate the product of 2 or more powers. If the underlying group or semigroup is commutative, then it is often possible to reduce the number of multiplications by computing the product simultaneously.
The formula "a"7×"b"5 may be calculated within 3 steps:
so one gets 8 multiplications in total.
A faster solution is to calculate both powers simultaneously, scanning the binary expansions 111 and 101 of the two exponents together: compute "a"·"b", square it and multiply by "a" (giving "a"3"b"2), square again (giving "a"6"b"4), and multiply by "a"·"b" once more,
which needs only 6 multiplications in total. Note that "a"×"b" is calculated twice; the result could be stored after the first calculation, which reduces the count of multiplication to 5.
Example with numbers: for "a" = 2 and "b" = 3, one computes 2·3 = 6, then 62 = 36, 36·2 = 72, 722 = 5184, and finally 5184·6 = 31104, which indeed equals 27×35 = 128×243.
Calculating the powers simultaneously instead of calculating them separately always reduces the count of multiplications if at least two of the exponents are greater than 1.
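A Python sketch of the simultaneous method for two bases (names are illustrative; both exponents are assumed to be at least 1):

def simultaneous_pow(a, A, b, B):
    # Precompute the product needed for each possible pair of bits.
    table = {(0, 1): b, (1, 0): a, (1, 1): a * b}
    result = 1
    for i in range(max(A.bit_length(), B.bit_length()) - 1, -1, -1):
        result *= result                       # one squaring per bit position
        bits = ((A >> i) & 1, (B >> i) & 1)
        if bits != (0, 0):
            result *= table[bits]              # multiply in the bit pair's factor
    return result

print(simultaneous_pow(2, 7, 3, 5))  # 31104 = 128 * 243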
The example above "a"7×"b"5 may also be calculated with only 5 multiplications if the expression is transformed before calculation: "a"7×"b"5 = (("a"·"b")2·"a")2·("a"·"b"), where "a"·"b" is computed once and then reused.
Generalization of the transformation shows the following scheme: for calculating "a""A"×"b""B"×...×"m""M"×"n""N", with the exponents ordered so that "A" ≥ "B" ≥ ... ≥ "N", repeatedly apply the identity "x""X"·"y""Y" = ("x"·"y")"Y"·"x""X"−"Y" to factor out the product of the bases sharing the smallest remaining exponent.
Transformation before calculation often reduces the count of multiplications, but in some cases it also increases the count (see the last one of the examples below), so it may be a good idea to check the count of multiplications before using the transformed expression for calculation.
For the following expressions the count of multiplications is shown for calculating each power separately, calculating them simultaneously without transformation, and calculating them simultaneously after transformation.
In certain computations it may be more efficient to allow negative coefficients and hence use the inverse of the base, provided inversion in "G" is "fast" or has been precomputed. For example, when computing "x""n" with "n" = 2"k" − 1, the binary method requires "k" − 1 multiplications and "k" − 1 squarings. However, one could instead perform "k" squarings to get "x" to the power 2"k" and then multiply by "x"−1 to obtain the result.
To this end we define the signed-digit representation of an integer "n" in radix "b" as
"n" = Σ"i"=0..."l"−1 "ni" "b""i" with |"ni"| < "b".
"Signed binary representation" corresponds to the particular choice "b" = 2 and "ni" ∈ {−1, 0, 1}. It is denoted by ("n""l"−1 ... "n"0)"s". There are several methods for computing this representation, and it is not unique. For example, take "n" = 478: two distinct signed-binary representations are given by (1 1 1 0 1 1 1 1 0)"s" and (1 0 0 0 −1 0 0 0 −1 0)"s", writing −1 for a negative digit. Since the binary method computes a multiplication for every non-zero entry in the base-2 representation of "n", we are interested in finding the signed-binary representation with the smallest number of non-zero entries, that is, the one with "minimal" Hamming weight. One method of doing this is to compute the representation in non-adjacent form, or NAF for short, which is one that satisfies "ni" · "ni"+1 = 0 for all "i". For example, the NAF representation of 478 is (1 0 0 0 −1 0 0 0 −1 0), that is, 512 − 32 − 2. This representation always has minimal Hamming weight. A simple algorithm to compute the NAF representation of a given integer "n" = ("nl" "nl"−1 ... "n"0)2 with "nl" = "nl"−1 = 0 is the following:
Another algorithm by Koyama and Tsuruoka does not require the condition that "nl" = "nl"−1 = 0; it still minimizes the Hamming weight.
Exponentiation by squaring can be viewed as a suboptimal addition-chain exponentiation algorithm: it computes the exponent by an addition chain consisting of repeated exponent doublings (squarings) and/or incrementing exponents by "one" (multiplying by "x") only. More generally, if one allows "any" previously computed exponents to be summed (by multiplying those powers of "x"), one can sometimes perform the exponentiation using fewer multiplications (but typically using more memory). The smallest power where this occurs is for "n" = 15: "x"15 = "x"3·(("x"3)2)2 needs only five multiplications ("x"2, "x"3 = "x"·"x"2, ("x"3)2, (("x"3)2)2, and the final product), whereas exponentiation by squaring needs six.
In general, finding the "optimal" addition chain for a given exponent is a hard problem, for which no efficient algorithms are known, so optimal chains are typically only used for small exponents (e.g. in compilers where the chains for small powers have been pre-tabulated). However, there are a number of heuristic algorithms that, while not being optimal, have fewer multiplications than exponentiation by squaring at the cost of additional bookkeeping work and memory usage. Regardless, the number of multiplications never grows more slowly than Θ(log "n"), so these algorithms only improve asymptotically upon exponentiation by squaring by a constant factor at best. | https://en.wikipedia.org/wiki?curid=10237 |
Exon
An exon is any part of a gene that will encode a part of the final mature RNA produced by that gene after introns have been removed by RNA splicing. The term "exon" refers to both the DNA sequence within a gene and to the corresponding sequence in RNA transcripts. In RNA splicing, introns are removed and exons are covalently joined to one another as part of generating the mature messenger RNA. Just as the entire set of genes for a species constitutes the genome, the entire set of exons constitutes the exome.
The term "exon" derives from the expressed region and was coined by American biochemist Walter Gilbert in 1978: "The notion of the cistron… must be replaced by that of a transcription unit containing regions which will be lost from the mature messengerwhich I suggest we call introns (for intragenic regions)alternating with regions which will be expressedexons."
This definition was originally made for protein-coding transcripts that are spliced before being translated. The term later came to include sequences removed from rRNA and tRNA, and it also was used later for RNA molecules originating from different parts of the genome that are then ligated by trans-splicing.
Although unicellular eukaryotes such as yeast have either no introns or very few, metazoans and especially vertebrate genomes have a large fraction of non-coding DNA. For instance, in the human genome only 1.1% of the genome is spanned by exons, whereas 24% is in introns, with 75% of the genome being intergenic DNA. This can provide a practical advantage in omics-aided health care (such as precision medicine) because it makes commercialized whole exome sequencing a smaller and less expensive challenge than commercialized whole genome sequencing. The large variation in genome size and C-value across life forms has posed an interesting challenge called the C-value enigma.
Across all eukaryotic genes in GenBank, there were (in 2002), on average, 5.48 exons per gene. The average exon encoded 30–36 amino acids. While the longest exon in the human genome is 11,555 bp long, several exons have been found to be only 2 bp long. A single-nucleotide exon has been reported from the "Arabidopsis" genome.
In protein-coding genes, the exons include both the protein-coding sequence and the 5′- and 3′-untranslated regions (UTR). Often the first exon includes both the 5′-UTR and the first part of the coding sequence, but exons containing only regions of 5′-UTR or (more rarely) 3′-UTR occur in some genes, i.e. the UTRs may contain introns. Some non-coding RNA transcripts also have exons and introns.
Mature mRNAs originating from the same gene need not include the same exons, since different introns in the pre-mRNA can be removed by the process of alternative splicing.
Exonization is the creation of a new exon, as a result of mutations in introns.
Exon trapping or 'gene trapping' is a molecular biology technique that exploits intron-exon splicing to find new genes. The first exon of a 'trapped' gene splices into the exon contained in the insertional DNA. This new exon contains the ORF for a reporter gene that can now be expressed using the enhancers that control the target gene. A scientist knows that a new gene has been trapped when the reporter gene is expressed.
Splicing can be experimentally modified so that targeted exons are excluded from mature mRNA transcripts by blocking the access of splice-directing small nuclear ribonucleoprotein particles (snRNPs) to pre-mRNA using Morpholino antisense oligos. This has become a standard technique in developmental biology. Morpholino oligos can also be targeted to prevent molecules that regulate splicing (e.g. splice enhancers, splice suppressors) from binding to pre-mRNA, altering patterns of splicing. | https://en.wikipedia.org/wiki?curid=10238 |
Exxon
Exxon is the brand name of the oil and natural resources company Exxon Corporation, which prior to 1972 was known as Standard Oil Company of New Jersey. In 1999, Exxon Corporation merged with Mobil to form ExxonMobil. The "Exxon" brand is still used by ExxonMobil's downstream operations as a brand for certain gas stations, motor fuel and related products (the highest concentration of which are located in New Jersey, Pennsylvania, Texas and in the Mid-Atlantic and Southeastern states). Standard Oil Company of New Jersey was one of the Seven Sisters that dominated the global petroleum industry from the mid-1940s to the 1970s.
Exxon replaced the Esso, Enco, and Humble brands in the United States in 1973. The Esso name was a trademark of Standard Oil Company of New Jersey, and attracted protests from other Standard Oil spinoffs because of its phonetic similarity to the acronym of the name of the parent company, Standard Oil. As a result, Standard Oil Company of New Jersey was restricted from using Esso in the U.S., except in those states awarded to it in the 1911 Standard Oil antitrust settlement.
In states where it was restricted from using the Esso name, the company marketed under the Humble or Enco brands. The Humble brand was used at Texas stations for decades, as those operations were under the direction of Standard Oil Company of New Jersey affiliate Humble Oil & Refining Company. In the middle to late 1950s, use of the Humble brand spread to other southwestern states, including Arizona, New Mexico, and Oklahoma.
In 1959, Standard Oil Company of New Jersey secured full control of Humble Oil and restructured it into its U.S. marketing and refining division, to market nationwide under the Enco, Esso and Humble brands. Enco was created as an acronym for the phrase "Energy Company". Humble introduced the Enco brand in 1960 in Oklahoma and surrounding states, to replace Humble's subsidiary Oklahoma and Pate brands. Humble also tried marketing under Enco in Ohio, but Standard Oil Company of Ohio (Sohio) protested that the Enco name and logo (a white oval with blue border and red lettering) too closely resembled that of Esso. Consequently, stations in Ohio were rebranded as Humble, and remained so until the Exxon brand came into use.
After the Enco brand was discontinued in Ohio, it was moved to other non-Esso states. In 1961, Humble stations in Arizona, New Mexico, Oklahoma and Texas were rebranded to Enco. That same year, Enco appeared on former Carter stations in the Midwest and the Pacific Northwest.
In 1963, Humble Oil and Tidewater Oil Company began negotiating a sale of Tidewater's West Coast refining and marketing operations. The sale would have given Humble Oil many existing Flying A stations and distributorships, as well as a refinery in California, the nation's fastest-growing gasoline market. However, the Justice Department objected to the sale on anti-trust grounds. (In 1966, Phillips Petroleum Company bought Tidewater's western properties and rebranded all Flying A outlets to Phillips 66.)
Humble Oil continued to expand its West Coast operations, adding California to its marketing territory, building many new Enco stations and rebranding others. In 1967, Humble Oil purchased all remaining Signal stations from Standard Oil Company of California (Chevron) and rebranded them as Enco outlets, greatly increasing Enco's presence in California. Finally, in 1969, Humble Oil opened a new refinery in Benicia, California.
In 1966, the U.S. Justice Department ordered Humble Oil to "cease and desist" from using the Esso brand at stations in several southeastern states, following protests from Standard Oil of Kentucky (Kyso), which was a Standard Oil of California subsidiary in the process of rebranding its Standard stations to Chevron. By 1967, Humble Oil's Esso stations in the Southeast were rebranded to Enco.
In the 1960s and early 1970s, Humble Oil continued to have difficulties promoting itself as a nationwide marketer of petroleum products, despite a number of high-profile marketing strategies. These included the popular "Put a Tiger in Your Tank" advertising campaign and accompanying tiger mascot created by American illustrator Bob Jones, to promote Enco Extra and Esso Extra gasolines. Humble Oil also used similar logotypes for both brands, featured the Humble name in all Enco and Esso advertising, and adopted uniform station designs regardless of brand. In addition, Humble Oil was a major promoter and broadcast sponsor for college football in the Pacific-8 (now Pac-12) and Southwestern conferences.
But Humble Oil still faced stiff competition from national brands such as Shell and Texaco, which at that time was the only company to market under one brand name in all 50 states. By the late 1960s, Humble officials realized that the time had come to develop a new brand name that could be used nationwide.
At first, consideration was given to simply rebranding all stations as Enco, but that was shelved when it was learned that the word "Enco" is similar in pronunciation to the Japanese slang term "enko", meaning "stalled car" (an abbreviation of "enjin no kosho", "engine breakdown").
In 1972, Exxon was unveiled as the new, unified brand name for all former Enco and Esso outlets. At the same time, the company changed its corporate name from Standard Oil of New Jersey to Exxon Corporation. The rebranding came after successful test-marketing of the Exxon name, under two experimental logos, in the fall and winter of 1971–1972. Along with the new name, Exxon settled on a rectangular logo using red lettering and blue trim on a white background, similar to the familiar color scheme on the old Enco and Esso logos.
The company initially planned to change its name to "Exon", in keeping with the four-letter format of Enco and Esso. However, during the planning process it was noted that James Exon was the governor of Nebraska, and renaming the company after a sitting governor seemed ill-advised. George T. Piercy, a senior member of the board of directors, suggested adding a second X, resulting in the new Exxon name.
The unrestricted international use of the popular Esso brand prompted Exxon to continue using it outside the U.S. Esso is the only widely used Standard Oil descendant brand left in existence. Others, such as Chevron, maintain a few Standard-branded stations in specific states in order to retain their trademarks and prevent others from using them.
Under the guidance of its paid consultants at Boston Consulting Group, Exxon announced in the 1970s that it would compete against IBM and Xerox. The mantra was "Information Is the Oil of the 21st Century". It launched Exxon Office Systems, which predictably failed, since "the giant oil company failed to fully realize the subtleties of managing small high-tech companies." In the early 1980s, Exxon retailed its fax machines and software through Sears. Exxon announced the closure of the venture at the end of 1984.
In 1989, Exxon announced that it was moving its headquarters, including about 300 employees, from Manhattan, New York City to the Las Colinas area of Irving, Texas. Exxon sold the Exxon Building (1251 Avenue of the Americas), its former headquarters in Rockefeller Center, to a unit of Mitsui Real Estate Development Co. Ltd. in 1986 for $610 million. John Walsh, president of Exxon subsidiary Friendswood Development Company, stated that Exxon left New York because the costs were too high. In 2009 Exxon partnered with Turner Ridge Capital Management to develop and finance their U.S. alternative energy infrastructure.
In 2016, ExxonMobil successfully asked a U.S. federal court to lift the aforementioned trademark injunction that banned it from using the Esso brand in various states. By this time, as a result of numerous mergers and rebranding, the remaining Standard Oil companies that previously objected to the Esso name had been acquired by BP. ExxonMobil cited trademark surveys showing that the Esso name no longer caused the confusion it had more than seven decades earlier. BP also had no objection to lifting the ban. ExxonMobil did not specify whether it would now open new stations in the U.S. under the Esso name; it was primarily concerned about the additional expense of maintaining separate marketing, letterheads, packaging, and other materials that omit "Esso".
The rectangular Exxon logo, with the blue strip at the bottom and red lettering with the two 'X's interlinked together, was designed by noted industrial stylist Raymond Loewy. The interlinked 'X's are incorporated in the modern-day ExxonMobil corporate logo; in mid-2016, as part of a corporate rebranding accompanying the launch of ExxonMobil's "Synergy" fuel products, the mixed-case Exxon wordmark from the ExxonMobil corporate logo became the brand's main logo.
In 1985, Minolta introduced an autofocus SLR camera system named "Maxxum" in the United States. Originally, cameras (such as the Maxxum 7000), lenses, and flashes used a logo with the X's crossed in "MAXXUM". Exxon considered this a violation of its trademark; as a result, Minolta was allowed to distribute equipment already produced but was required to change the stylized crossed 'XX' in new production. ExxonMobil similarly sued 21st Century Fox over its cable channel FXX, but the parties agreed to dismiss the suit in October 2015.
Exxon is ExxonMobil's primary retail gasoline brand in most of the United States, with the highest concentration of retail outlets located in New Jersey, Pennsylvania, Texas and in the Mid-Atlantic and Southeastern states. The Exxon brand also has a market presence in the following metropolitan areas:
Mobil is the company's primary retail gasoline brand in California, Florida, New York, New England, the Great Lakes and the Midwest. Esso is ExxonMobil's primary gasoline brand worldwide except in Australia, Guam, Mexico, Nigeria and New Zealand, where the Mobil brand is used exclusively. In Colombia, Canada and Egypt, both the Esso and Mobil brands are used, as they formerly were in Malaysia and Japan, until those operations were rebranded as Petron in 2013 and ENEOS in 2019, respectively. The Mobil brand is also applied to Esso fuel tanks in Hong Kong and Singapore. | https://en.wikipedia.org/wiki?curid=10239 |
Exxon Valdez oil spill
The "Exxon Valdez" oil spill occurred in Prince William Sound, Alaska, March 24, 1989, when "Exxon Valdez", an oil tanker owned by Exxon Shipping Company, bound for Long Beach, California, struck Prince William Sound's Bligh Reef, west of Tatitlek, Alaska, at 12:04 a.m. and spilled (or 37,000 metric tonnes) of crude oil over the next few days. It is considered the worst oil spill worldwide in terms of damage to the environment. The "Valdez" spill is the second largest in US waters, after the 2010 "Deepwater Horizon" oil spill, in terms of volume released. Prince William Sound's remote location, accessible only by helicopter, plane, or boat, made government and industry response efforts difficult and severely taxed existing response plans. The region is a habitat for salmon, sea otters, seals and seabirds. The oil, originally extracted at the Prudhoe Bay Oil Field, eventually affected of coastline, of which were heavily or moderately oiled.
The ship was carrying 53.1 million US gallons (201,000 m3) of oil, of which about 10.8 million US gallons (41,000 m3) were spilled into Prince William Sound.
Multiple factors have been identified as contributing to the incident.
Captain Joseph Hazelwood, who was widely reported to have been drinking heavily that night, was not at the controls when the ship struck the reef. Exxon blamed Captain Hazelwood for the grounding of the tanker, but Hazelwood accused the corporation of making him a scapegoat. As the senior officer in command of the ship, he was accused of being intoxicated and thereby contributing to the disaster, but he was cleared of this charge at his 1990 trial after witnesses testified that he was sober around the time of the accident. In light of the other findings, investigative reporter Greg Palast stated in 2008, "Forget the drunken skipper fable. At the helm, the third mate may never have collided with Bligh Reef had he looked at his RAYCAS radar. But the radar was not turned on. In fact, the tanker's radar was left broken and disabled for more than a year before the disaster, and Exxon management knew it. It was just too expensive to fix and operate."
Additional contributing factors were identified in an MIT course entitled "Software System Safety" by Professor Nancy G. Leveson.
This disaster resulted in the International Maritime Organization introducing comprehensive marine pollution prevention rules (MARPOL) through various conventions. The rules were ratified by member countries and, under the International Safety Management (ISM) Code, ships are now operated with a common objective of "safer ships and cleaner oceans".
In 2009, "Exxon Valdez" Captain Joseph Hazelwood offered a "heartfelt apology" to the people of Alaska, suggesting he had been wrongly blamed for the disaster: "The true story is out there for anybody who wants to look at the facts, but that's not the sexy story and that's not the easy story," he said. Hazelwood said he felt Alaskans always gave him a fair shake.
Chemical dispersant, a surfactant and solvent mixture, was applied to the slick by a private company on March 24 using a helicopter, but the helicopter missed the target area. Scientific data on the dispersant's toxicity were thin or incomplete, and public acceptance of a new, widespread chemical treatment was lacking. Landowners, fishing groups, and conservation organizations questioned the use of chemicals on hundreds of miles of shoreline when other alternatives may have been available.
According to a report by David Kirby for TakePart, the main component of the Corexit formulation used during cleanup, 2-butoxyethanol, was identified as "one of the agents that caused liver, kidney, lung, nervous system, and blood disorders among cleanup crews in Alaska following the 1989 Exxon Valdez spill."
Mechanical cleanup was started shortly afterwards using booms and skimmers, but the skimmers were not readily available during the first 24 hours following the spill, and thick oil and kelp tended to clog the equipment. Despite civilian insistence on a complete cleanup, only 10% of the total oil was actually removed. Exxon was widely criticized for its slow response to cleaning up the disaster, and John Devens, the mayor of Valdez, said his community felt betrayed by Exxon's inadequate response to the crisis. More than 11,000 Alaska residents, along with some Exxon employees, worked throughout the region to try to restore the environment.
Because Prince William Sound contained many rocky coves where the oil collected, the decision was made to displace it with high-pressure hot water. However, this also displaced and destroyed the microbial populations on the shoreline; many of these organisms (e.g. plankton) are the basis of the coastal marine food chain, and others (e.g. certain bacteria and fungi) are capable of facilitating the biodegradation of oil. At the time, both scientific advice and public pressure was to clean everything, but since then, a much greater understanding of natural and facilitated remediation processes has developed, due in part to the opportunity for study presented by the "Exxon Valdez" spill. Despite the extensive cleanup attempts, less than ten percent of the oil was recovered.
Both the long-term and short-term effects of the oil spill have been studied. Immediate effects included the deaths of 100,000 to as many as 250,000 seabirds, at least 2,800 sea otters, approximately 12 river otters, 300 harbor seals, 247 bald eagles, and 22 orcas, and an unknown number of salmon and herring.
Although the volume of oil has declined considerably, with the remaining oil amounting to only about 0.14–0.28% of the original spilled volume, studies suggest the area of oiled beach has changed little since 1992. A study by the National Marine Fisheries Service, NOAA, in Juneau determined that by 2001 approximately 90 tonnes of oil remained on beaches in Prince William Sound in the sandy soil of the contaminated shoreline, with annual loss rates declining from 68% per year prior to 1992 to 4% per year after 2001.
The remaining oil lasting far longer than anticipated has resulted in more long-term losses of species than had been expected. Laboratory experiments found that at levels as low as one part per billion, polycyclic aromatic hydrocarbons are toxic for salmon and herring eggs. Species as diverse as sea otters, harlequin ducks and orcas suffered immediate and long-term losses. Oiled mussel beds and other tidal shoreline habitats may take up to 30 years to recover.
ExxonMobil denied concerns over remaining oil, stating that they anticipated the remaining fraction would not cause long-term ecological impacts. According to the conclusions of ExxonMobil's study: "We've done 350 peer-reviewed studies of Prince William Sound, and those studies conclude that Prince William Sound has recovered, it's healthy and it's thriving."
On March 24, 2014, the twenty-fifth anniversary of the spill, NOAA scientists reported that some species seem to have recovered, with the sea otter the latest creature to return to pre-spill numbers. Scientists who have monitored the spill area for the last 25 years report that concern remains for one of two pods of local orca whales, with fears that one pod may eventually die out. Federal scientists estimate that between 16,000 and 21,000 US gallons (61 to 79 m3) of oil remains on beaches in Prince William Sound and up to 450 miles (725 km) away. Some of the oil does not appear to have biodegraded at all. A USGS scientist who analyses the remaining oil along the coastline states that it remains among rocks and between tide marks. "The oil mixes with seawater and forms an emulsion...Left out, the surface crusts over but the inside still has the consistency of mayonnaise – or mousse." Alaska state senator Berta Gardner is urging Alaskan politicians to demand that the US government force ExxonMobil to pay the final $92 million (£57 million) still owed from the court settlement. The major part of the money would be spent to finish cleaning up oiled beaches and attempting to restore the crippled herring population.
In the case of "Exxon v. Baker", an Anchorage jury awarded $287 million for actual damages and $5 billion for punitive damages. To protect itself in case the judgment was affirmed, Exxon obtained a $4.8 billion credit line from J.P. Morgan & Co., who created the first modern credit default swap so that they would not have to hold as much money in reserve against the risk of Exxon's default.
Meanwhile, Exxon appealed the ruling, and the 9th U.S. Circuit Court of Appeals ordered the trial judge, Russel Holland, to reduce the punitive damages. On December 6, 2002, Holland announced that he had reduced the damages to $4 billion, which he concluded was justified by the facts of the case and was not grossly excessive. Exxon appealed again and the case returned to Holland to be reconsidered in light of a recent Supreme Court ruling in a similar case. Holland increased the punitive damages to $4.5 billion, plus interest.
After more appeals, in December 2006 the damages award was cut to $2.5 billion. The court of appeals cited recent Supreme Court rulings relative to limits on punitive damages.
Exxon appealed again. On May 23, 2007, the 9th Circuit Court of Appeals denied ExxonMobil's request for a third hearing and let stand its ruling that Exxon owed $2.5 billion in punitive damages. Exxon then appealed to the Supreme Court, which agreed to hear the case. On February 27, 2008, the Supreme Court heard oral arguments. Justice Samuel Alito, who at the time owned between $100,000 and $250,000 in Exxon stock, recused himself from the case. In a decision issued June 25, 2008, written by Justice David Souter, the court vacated the $2.5 billion award and remanded the case back to the lower court, finding that the damages were excessive with respect to maritime common law. Exxon's actions were deemed "worse than negligent but less than malicious." The punitive damages were further reduced to an amount of $507.5 million. The Court's ruling was that maritime punitive damages should not exceed the compensatory damages, supported by a precedent dating from 1818. Senate Judiciary Committee Chairman Patrick J. Leahy has decried the ruling as "another in a line of cases where this Supreme Court has misconstrued congressional intent to benefit large corporations."
Exxon's official position was that punitive damages greater than $25 million were not justified because the spill resulted from an accident, and because Exxon spent an estimated $2 billion cleaning up the spill and a further $1 billion to settle related civil and criminal charges. Attorneys for the plaintiffs contended that Exxon bore responsibility for the accident because the company "put a drunk in charge of a tanker in Prince William Sound."
Exxon recovered a significant portion of clean-up and legal expenses through insurance claims associated with the grounding of the "Exxon Valdez". Also, in 1991, Exxon made a quiet, separate financial settlement of damages with a group of seafood producers known as the Seattle Seven for the disaster's effect on the Alaskan seafood industry. The agreement granted $63.75 million to the Seattle Seven, but stipulated that the seafood companies would have to repay almost all of any punitive damages awarded in other civil proceedings. The $5 billion in punitive damages was awarded later, and the Seattle Seven's share could have been as high as $750 million if the damages award had held. Other plaintiffs have objected to this secret arrangement, and when it came to light, Judge Holland ruled that Exxon should have told the jury at the start that an agreement had already been made, so the jury would know exactly how much Exxon would have to pay.
As of December 15, 2009, Exxon had paid the entire $507.5 million in punitive damages, including lawsuit costs, plus interest, which were further distributed to thousands of plaintiffs.
In October 1989, Exxon filed suit against the State of Alaska, charging that the state had interfered with Exxon's attempts to clean up the spill by refusing to approve the use of dispersant chemicals until the night of the 26th. The state disputed the claim, stating that there was a long-standing agreement to allow the use of dispersants to clean up spills, thus Exxon did not require permission to use them, and that in fact Exxon had not had enough dispersant on hand to effectively handle a spill of the size created by the "Valdez". Exxon filed claims in October 1990 against the Coast Guard, asking to be reimbursed for cleanup costs and damages awarded to plaintiffs in any lawsuits filed by the State of Alaska or the federal government against Exxon. The company claimed that the Coast Guard was "wholly or partially responsible" for the spill, because they had granted mariners' licenses to the crew of the Valdez, and because they had given the "Valdez" permission to leave regular shipping lanes to avoid ice. They also reiterated the claim that the Coast Guard had delayed cleanup by refusing to give permission to immediately use chemical dispersants on the spill.
The Oil Spill Recovery Institute was formed, with the approval of the United States Congress, to seek solutions. Collaborating with InnoCentive, it found a partial solution to the problem of the oil's flow during recovery.
A report by the US National Response Team summarized the event and made a number of recommendations, such as changes to the work patterns of Exxon crew in order to address the causes of the accident.
In response to the spill, the United States Congress passed the Oil Pollution Act of 1990 (OPA). The legislation included a clause that prohibits any vessel that, after March 22, 1989, has caused an oil spill of more than 1 million US gallons (3,800 m3) in any marine area, from operating in Prince William Sound.
In April 1998, the company argued in a legal action against the Federal government that the ship should be allowed back into Alaskan waters. Exxon claimed OPA was effectively a bill of attainder, a regulation that was unfairly directed at Exxon alone. In 2002, the 9th Circuit Court of Appeals ruled against Exxon. As of 2002, OPA had prevented 18 ships from entering Prince William Sound.
OPA also set a schedule for the gradual phase-in of a double-hull design, providing an additional layer between the oil tanks and the ocean. While a double hull would likely not have prevented the "Valdez" disaster, a Coast Guard study estimated that it would have cut the amount of oil spilled by 60 percent.
The "Exxon Valdez" supertanker was towed to San Diego, arriving on July 10. Repairs began on July 30. Approximately of steel were removed and replaced. In June 1990 the tanker, renamed "Exxon Mediterranean", left harbor after $30 million of repairs. In 1993, owned by SeaRiver Maritime, it was named "S/R Mediterranean", then in 2005 "Mediterranean". In 2008 the vessel was acquired by a Hong Kong company that operated it as "Dong Fang Ocean", then in 2011 renamed it "Oriental Nicety". In August 2012, it was beached at Alang, India, and dismantled.
In the aftermath of the spill, Alaska governor Steve Cowper issued an executive order requiring two tugboats to escort every loaded tanker from Valdez out through Prince William Sound to Hinchinbrook Entrance. As the plan evolved in the 1990s, one of the two routine tugboats was replaced with an Escort Response Vehicle (ERV). Tankers at Valdez are no longer single-hulled; Congress enacted legislation requiring all tankers to be double-hulled as of 2015.
In 1991, following the collapse of the local marine population (particularly clams, herring and seals) the Chugach Alaska Corporation, an Alaska Native Corporation, filed for Chapter 11 bankruptcy protection. It has since recovered.
According to several studies funded by the state of Alaska, the spill had both short-term and long-term economic effects. These included the loss of recreational sport fisheries, reduced tourism, and an estimate of what economists call "existence value", which is the value to the public of a pristine Prince William Sound.
The economy of the city of Cordova, Alaska was adversely affected after the spill damaged stocks of salmon and herring in the area. The village of Chenega was transformed into an emergency base and media outlet. The local villagers had to cope with a tripling of their population from 80 to 250. When asked how they felt about the situation, a village councillor noted that they were too shocked and busy to be depressed; others emphasized the human costs of leaving children unattended while their parents worked to clean up. Many Native Americans were worried that too much time was spent on the fishery and not enough on the land that supports subsistence hunting.
In 2010, a CNN report alleged that many oil spill cleanup workers involved in the "Exxon Valdez" response had subsequently become sick. Anchorage lawyer Dennis Mestas found that this was true for 6,722 of 11,000 worker files he was able to inspect. Access to the records was controlled by Exxon. Exxon responded in a statement to CNN:
After 20 years, there is no evidence suggesting that either cleanup workers or the residents of the communities affected by the Valdez spill have had any adverse health effects as a result of the spill or its cleanup.
In 1992, Exxon released a video titled "Scientists and the Alaska Oil Spill" for distribution to schools. Critics said the video misrepresented the clean-up process.
In December 1994, the Unabomber assassinated Burson-Marsteller executive Thomas Mosser, accusing him of having "helped Exxon clean up its public image after the "Exxon Valdez" incident".
Several weeks after the spill, "Saturday Night Live" aired a pointed sketch featuring Kevin Nealon, Phil Hartman, and Victoria Jackson as cleanup workers struggling to scrub the oil off of animals and rocks on a beach in Prince William Sound.
In the 1995 film "Waterworld", the "Exxon Valdez" is the flagship of the movie's villain, "The Deacon", the leader of a band of scavenging raiders. In the ship is a portrait of their patron saint, Joseph Hazelwood.
In the second Forrest Gump novel, "Gump and Co." by Winston Groom, Gump commandeers the "Exxon Valdez" and accidentally crashes it.
Composer Jonathan Larson wrote a song called "Iron Mike" about the oil spill. The song is written in the style of a sea shanty. It was first professionally recorded by George Salazar for the album "The Jonathan Larson Project". | https://en.wikipedia.org/wiki?curid=10243 |
Édouard de Pomiane
Édouard Alexandre de Pomiane, sometimes Édouard Pozerski (20 April 1875 in Paris – 26 January 1964 in Paris) was a French scientist, radio broadcaster and food writer.
His parents emigrated from Poland in 1863 after the January Uprising, changed their name from "Pozerski" to "de Pomiane", and became French citizens.
De Pomiane worked as a physician at the Institut Pasteur in Paris, where he gave Félix d'Herelle a place to work on bacteriophages.
His best known works that have been translated into English are "Cooking in Ten Minutes" and "Cooking with Pomiane". His writing was remarkable in its time for its directness (he frequently uses a strange second-person voice, telling you—the reader—what you are seeing and smelling as you follow a recipe) and for his general disdain for upper-class elaborate French cuisine. He travelled widely and quite a few of his recipes are from abroad. His recipes often take pains to demystify cooking by explaining the chemical processes at work.
"Vingt Plats Qui Donnent Goutte"IMG_7221.jpeg 1935 edition. | https://en.wikipedia.org/wiki?curid=10244 |
Edward VI of England
Edward VI (12 October 1537 – 6 July 1553) was the King of England and Ireland from 28 January 1547 until his death. He was crowned on 20 February at the age of nine. Edward was the son of Henry VIII and Jane Seymour, and England's first monarch to be raised as a Protestant. During his reign, the realm was governed by a regency council because he never reached maturity. The council was first led by his uncle Edward Seymour, 1st Duke of Somerset (1547–1549), and then by John Dudley, 1st Earl of Warwick (1550–1553), who from 1551 was Duke of Northumberland.
Edward's reign was marked by economic problems and social unrest that in 1549 erupted into riot and rebellion. An expensive war with Scotland, at first successful, ended with military withdrawal from Scotland and Boulogne-sur-Mer in exchange for peace. The transformation of the Church of England into a recognisably Protestant body also occurred under Edward, who took great interest in religious matters. Although his father, Henry VIII, had severed the link between the Church and Rome, Henry VIII had never permitted the renunciation of Catholic doctrine or ceremony. It was during Edward's reign that Protestantism was established for the first time in England with reforms that included the abolition of clerical celibacy and the Mass, and the imposition of compulsory services in English.
In February 1553, at age 15, Edward fell ill. When his sickness was discovered to be terminal, he and his Council drew up a "Devise for the Succession", to prevent the country's return to Catholicism. Edward named his first cousin once removed, Lady Jane Grey, as his heir, excluding his half-sisters, Mary and Elizabeth. This decision was disputed following Edward's death, and Jane was deposed by Mary nine days after becoming queen. During her reign, Mary reversed Edward's Protestant reforms, which nonetheless became the basis of the Elizabethan Religious Settlement of 1559.
Edward was born on 12 October 1537 in his mother's room inside Hampton Court Palace, in Middlesex. He was the son of King Henry VIII by his third wife, Jane Seymour. Throughout the realm, the people greeted the birth of a male heir, "whom we hungered for so long", with joy and relief. "Te Deums" were sung in churches, bonfires lit, and "their was shott at the Tower that night above two thousand gonnes". Queen Jane, appearing to recover quickly from the birth, sent out personally signed letters announcing the birth of "a Prince, conceived in most lawful matrimony between my Lord the King's Majesty and us". Edward was christened on 15 October, with his half-sisters, the 21-year-old Lady Mary as godmother and the 4-year-old Lady Elizabeth carrying the chrisom; and the Garter King of Arms proclaimed him as Duke of Cornwall and Earl of Chester. The Queen, however, fell ill on 23 October from presumed postnatal complications, and died the following night. Henry VIII wrote to Francis I of France that "Divine Providence ... hath mingled my joy with bitterness of the death of her who brought me this happiness".
Edward was a healthy baby who suckled strongly from the outset. His father was delighted with him; in May 1538, Henry was observed "dallying with him in his arms ... and so holding him in a window to the sight and great comfort of the people". That September, the Lord Chancellor, Lord Audley, reported Edward's rapid growth and vigour; and other accounts describe him as a tall and merry child. The tradition that Edward VI was a sickly boy has been challenged by more recent historians. At the age of four, he fell ill with a life-threatening "quartan fever", but, despite occasional illnesses and poor eyesight, he enjoyed generally good health until the last six months of his life.
Edward was initially placed in the care of Margaret Bryan, "lady mistress" of the prince's household. She was succeeded by Blanche Herbert, Lady Troy. Until the age of six, Edward was brought up, as he put it later in his "Chronicle", "among the women". The formal royal household established around Edward was, at first, under Sir William Sidney, and later Sir Richard Page, stepfather of Edward Seymour's wife, Anne Stanhope. Henry demanded exacting standards of security and cleanliness in his son's household, stressing that Edward was "this whole realm's most precious jewel". Visitors described the prince, who was lavishly provided with toys and comforts, including his own troupe of minstrels, as a contented child.
From the age of six, Edward began his formal education under Richard Cox and John Cheke, concentrating, as he recalled himself, on "learning of tongues, of the scripture, of philosophy, and all liberal sciences". He received tuition from Elizabeth's tutor, Roger Ascham, and Jean Belmain, learning French, Spanish and Italian. In addition, he is known to have studied geometry and learned to play musical instruments, including the lute and the virginals. He collected globes and maps and, according to coinage historian C. E. Challis, developed a grasp of monetary affairs that indicated a high intelligence. Edward's religious education is assumed to have favoured the reforming agenda. His religious establishment was probably chosen by Archbishop Thomas Cranmer, a leading reformer. Both Cox and Cheke were "reformed" Catholics or Erasmians and later became Marian exiles. By 1549, Edward had written a treatise on the pope as Antichrist and was making informed notes on theological controversies. Many aspects of Edward's religion were essentially Catholic in his early years, including celebration of the mass and reverence for images and relics of the saints.
Both Edward's sisters were attentive to their brother and often visited him – on one occasion, Elizabeth gave him a shirt "of her own working". Edward "took special content" in Mary's company, though he disapproved of her taste for foreign dances; "I love you most", he wrote to her in 1546. In 1543, Henry invited his children to spend Christmas with him, signalling his reconciliation with his daughters, whom he had previously illegitimised and disinherited. The following spring, he restored them to their place in the succession with a Third Succession Act, which also provided for a regency council during Edward's minority. This unaccustomed family harmony may have owed much to the influence of Henry's new wife, Catherine Parr, of whom Edward soon became fond. He called her his "most dear mother" and in September 1546 wrote to her: "I received so many benefits from you that my mind can hardly grasp them."
Other children were brought to play with Edward, including the granddaughter of Edward's chamberlain, Sir William Sidney, who in adulthood recalled the prince as "a marvellous sweet child, of very mild and generous condition". Edward was educated with sons of nobles, "appointed to attend upon him" in what was a form of miniature court. Among these, Barnaby Fitzpatrick, son of an Irish peer, became a close and lasting friend. Edward was more devoted to his schoolwork than his classmates and seems to have outshone them, motivated to do his "duty" and compete with his sister Elizabeth's academic prowess. Edward's surroundings and possessions were regally splendid: his rooms were hung with costly Flemish tapestries, and his clothes, books, and cutlery were encrusted with precious jewels and gold. Like his father, Edward was fascinated by military arts, and many of his portraits show him wearing a gold dagger with a jewelled hilt, in imitation of Henry. Edward's "Chronicle" enthusiastically details English military campaigns against Scotland and France, and adventures such as John Dudley's near capture at Musselburgh in 1547.
On 1 July 1543, Henry VIII signed the Treaty of Greenwich with the Scots, sealing the peace with Edward's betrothal to the seven-month-old Mary, Queen of Scots. The Scots were in a weak bargaining position after their defeat at Solway Moss the previous November, and Henry, seeking to unite the two realms, stipulated that Mary be handed over to him to be brought up in England. When the Scots repudiated the treaty in December 1543 and renewed their alliance with France, Henry was enraged. In April 1544, he ordered Edward's uncle, Edward Seymour, Earl of Hertford, to invade Scotland and "put all to fire and sword, burn Edinburgh town, so razed and defaced when you have sacked and gotten what ye can of it, as there may remain forever a perpetual memory of the vengeance of God lightened upon [them] for their falsehood and disloyalty". Seymour responded with the most savage campaign ever launched by the English against the Scots. The war, which continued into Edward's reign, has become known as "The Rough Wooing".
The nine-year-old Edward wrote to his father and stepmother on 10 January 1547 from Hertford thanking them for his new year's gift of their portraits from life. By 28 January 1547, Henry VIII was dead. Those close to the throne, led by Edward Seymour and William Paget, agreed to delay the announcement of the king's death until arrangements had been made for a smooth succession. Seymour and Sir Anthony Browne, the Master of the Horse, rode to collect Edward from Hertford and brought him to Enfield, where Lady Elizabeth was living. He and Elizabeth were then told of the death of their father and heard a reading of the will.
The Lord Chancellor, Thomas Wriothesley, announced Henry's death to parliament on 31 January, and general proclamations of Edward's succession were ordered. The new king was taken to the Tower of London, where he was welcomed with "great shot of ordnance in all places there about, as well out of the Tower as out of the ships". The following day, the nobles of the realm made their obeisance to Edward at the Tower, and Seymour was announced as Protector. Henry VIII was buried at Windsor on 16 February, in the same tomb as Jane Seymour, as he had wished.
Edward VI was crowned at Westminster Abbey four days later on Sunday 20 February. The ceremonies were shortened, because of the "tedious length of the same which should weary and be hurtsome peradventure to the King's majesty, being yet of tender age", and also because the Reformation had rendered some of them inappropriate.
On the eve of the coronation, Edward progressed on horseback from the Tower to the Palace of Westminster through thronging crowds and pageants, many based on the pageants for a previous boy king, Henry VI. He laughed at a Spanish tightrope walker who "tumbled and played many pretty toys" outside St Paul's Cathedral.
At the coronation service, Cranmer affirmed the royal supremacy and called Edward a second Josiah, urging him to continue the reformation of the Church of England, "the tyranny of the Bishops of Rome banished from your subjects, and images removed". After the service, Edward presided at a banquet in Westminster Hall, where, he recalled in his "Chronicle", he dined with his crown on his head.
Henry VIII's will named sixteen executors, who were to act as Edward's Council until he reached the age of eighteen. These executors were supplemented by twelve men "of counsail" who would assist the executors when called on. The final state of Henry VIII's will has been the subject of controversy. Some historians suggest that those close to the king manipulated either him or the will itself to ensure a share-out of power to their benefit, both material and religious. In this reading, the composition of the Privy Chamber shifted towards the end of 1546 in favour of the reforming faction. In addition, two leading conservative Privy Councillors were removed from the centre of power.
Stephen Gardiner was refused access to Henry during his last months. Thomas Howard, 3rd Duke of Norfolk, found himself accused of treason; the day before the king's death his vast estates were seized, making them available for redistribution, and he spent the whole of Edward's reign in the Tower of London. Other historians have argued that Gardiner's exclusion was based on non-religious matters, that Norfolk was not noticeably conservative in religion, that conservatives remained on the Council, and that the radicalism of such men as Sir Anthony Denny, who controlled the dry stamp that replicated the king's signature, is debatable.
Whatever the case, Henry's death was followed by a lavish hand-out of lands and honours to the new power group. The will contained an "unfulfilled gifts" clause, added at the last minute, which allowed Henry's executors to freely distribute lands and honours to themselves and the court, particularly to Edward Seymour, 1st Earl of Hertford, the new king's uncle who became Lord Protector of the Realm, Governor of the King's Person, and Duke of Somerset.
In fact, Henry VIII's will did not provide for the appointment of a Protector. It entrusted the government of the realm during his son's minority to a Regency Council that would rule collectively, by majority decision, with "like and equal charge". Nevertheless, a few days after Henry's death, on 4 February, the executors chose to invest almost regal power in Edward Seymour, now Duke of Somerset. Thirteen out of the sixteen (the others being absent) agreed to his appointment as Protector, which they justified as their joint decision "by virtue of the authority" of Henry's will. Somerset may have done a deal with some of the executors, who almost all received hand-outs. He is known to have done so with William Paget, private secretary to Henry VIII, and to have secured the support of Sir Anthony Browne of the Privy Chamber.
Somerset's appointment was in keeping with historical precedent, and his eligibility for the role was reinforced by his military successes in Scotland and France. In March 1547, he secured letters patent from King Edward granting him the almost monarchical right to appoint members to the Privy Council himself and to consult them only when he wished. In the words of historian Geoffrey Elton, "from that moment his autocratic system was complete". He proceeded to rule largely by proclamation, calling on the Privy Council to do little more than rubber-stamp his decisions.
Somerset's takeover of power was smooth and efficient. The imperial ambassador, François van der Delft, reported that he "governs everything absolutely", with Paget operating as his secretary, though he predicted trouble from John Dudley, Viscount Lisle, who had recently been raised to Earl of Warwick in the share-out of honours. In fact, in the early weeks of his Protectorate, Somerset was challenged only by the Chancellor, Thomas Wriothesley, whom the Earldom of Southampton had evidently failed to buy off, and by his own brother. Wriothesley, a religious conservative, objected to Somerset's assumption of monarchical power over the Council. He then found himself abruptly dismissed from the chancellorship on charges of selling off some of his offices to delegates.
Somerset faced less manageable opposition from his younger brother Thomas Seymour, who has been described as a "worm in the bud". As King Edward's uncle, Thomas Seymour demanded the governorship of the king's person and a greater share of power. Somerset tried to buy his brother off with a barony, an appointment to the Lord Admiralship, and a seat on the Privy Council—but Thomas was bent on scheming for power. He began smuggling pocket money to King Edward, telling him that Somerset held the purse strings too tight, making him a "beggarly king". He also urged him to throw off the Protector within two years and "bear rule as other kings do"; but Edward, schooled to defer to the Council, failed to co-operate. In the spring of 1547, using Edward's support to circumvent Somerset's opposition, Thomas Seymour secretly married Henry VIII's widow Catherine Parr, whose Protestant household included the 11-year-old Lady Jane Grey and the 13-year-old Lady Elizabeth.
In summer 1548, a pregnant Catherine Parr discovered Thomas Seymour embracing Lady Elizabeth. As a result, Elizabeth was removed from Catherine Parr's household and transferred to Sir Anthony Denny's. That September, Catherine Parr died shortly after childbirth, and Thomas Seymour promptly resumed his attentions to Elizabeth by letter, planning to marry her. Elizabeth was receptive, but, like Edward, unready to agree to anything unless permitted by the Council. In January 1549, the Council had Thomas Seymour arrested on various charges, including embezzlement at the Bristol mint. King Edward, whom Seymour was accused of planning to marry to Lady Jane Grey, himself testified about the pocket money. Lack of clear evidence for treason ruled out a trial, so Seymour was condemned instead by an Act of Attainder and beheaded on 20 March 1549.
Somerset's only undoubted skill was as a soldier, which he had proven on expeditions to Scotland and in the defence of Boulogne-sur-Mer in 1546. From the first, his main interest as Protector was the war against Scotland. After a crushing victory at the Battle of Pinkie in September 1547, he set up a network of garrisons in Scotland, stretching as far north as Dundee. His initial successes, however, were followed by a loss of direction, as his aim of uniting the realms through conquest became increasingly unrealistic. The Scots allied with France, who sent reinforcements for the defence of Edinburgh in 1548. The Queen of Scots was moved to France, where she was betrothed to the Dauphin. The cost of maintaining the Protector's massive armies and his permanent garrisons in Scotland also placed an unsustainable burden on the royal finances. A French attack on Boulogne in August 1549 at last forced Somerset to begin a withdrawal from Scotland.
During 1548, England was subject to social unrest. After April 1549, a series of armed revolts broke out, fuelled by various religious and agrarian grievances. The two most serious rebellions, which required major military intervention to put down, were in Devon and Cornwall and in Norfolk. The first, sometimes called the Prayer Book Rebellion, arose from the imposition of Protestantism, and the second, led by a tradesman called Robert Kett, mainly from the encroachment of landlords on common grazing ground. A complex aspect of the social unrest was that the protesters believed they were acting legitimately against enclosing landlords with the Protector's support, convinced that the landlords were the lawbreakers.
The same justification for outbreaks of unrest was voiced throughout the country, not only in Norfolk and the west. The origin of the popular view of Somerset as sympathetic to the rebel cause lies partly in his series of sometimes liberal, often contradictory, proclamations, and partly in the uncoordinated activities of the commissions he sent out in 1548 and 1549 to investigate grievances about loss of tillage, encroachment of large sheep flocks on common land, and similar issues. Somerset's commissions were led by an evangelical M.P. called John Hales, whose socially liberal rhetoric linked the issue of enclosure with Reformation theology and the notion of a godly commonwealth. Local groups often assumed that the findings of these commissions entitled them to act against offending landlords themselves. King Edward wrote in his "Chronicle" that the 1549 risings began "because certain commissions were sent down to pluck down enclosures".
Whatever the popular view of Somerset, the disastrous events of 1549 were taken as evidence of a colossal failure of government, and the Council laid the responsibility at the Protector's door. In July 1549, Paget wrote to Somerset: "Every man of the council have misliked your proceedings ... would to God, that, at the first stir you had followed the matter hotly, and caused justice to be ministered in solemn fashion to the terror of others ...".
The sequence of events that led to Somerset's removal from power has often been called a "coup d'état". By 1 October 1549, Somerset had been alerted that his rule faced a serious threat. He issued a proclamation calling for assistance, took possession of the king's person, and withdrew for safety to the fortified Windsor Castle, where Edward wrote, "Me thinks I am in prison". Meanwhile, a united Council published details of Somerset's government mismanagement. They made clear that the Protector's power came from them, not from Henry VIII's will. On 11 October, the Council had Somerset arrested and brought the king to Richmond. Edward summarised the charges against Somerset in his "Chronicle": "ambition, vainglory, entering into rash wars in mine youth, negligent looking on Newhaven, enriching himself of my treasure, following his own opinion, and doing all by his own authority, etc." In February 1550, John Dudley, Earl of Warwick, emerged as the leader of the Council and, in effect, as Somerset's successor. Although Somerset was released from the Tower and restored to the Council, he was executed for felony in January 1552 after scheming to overthrow Dudley's regime. Edward noted his uncle's death in his "Chronicle": "the duke of Somerset had his head cut off upon Tower Hill between eight and nine o'clock in the morning".
Historians contrast the efficiency of Somerset's takeover of power, in which they detect the organising skills of allies such as Paget, the "master of practices", with the subsequent ineptitude of his rule. By autumn 1549, his costly wars had lost momentum, the crown faced financial ruin, and riots and rebellions had broken out around the country. Until recent decades, Somerset's reputation with historians was high, in view of his many proclamations that appeared to back the common people against a rapacious landowning class. More recently, however, he has often been portrayed as an arrogant and aloof ruler, lacking in political and administrative skills.
In contrast, Somerset's successor John Dudley, Earl of Warwick, made Duke of Northumberland in 1551, was once regarded by historians merely as a grasping schemer who cynically elevated and enriched himself at the expense of the crown. Since the 1970s, the administrative and economic achievements of his regime have been recognised, and he has been credited with restoring the authority of the royal Council and returning the government to an even keel after the disasters of Somerset's protectorate.
The Earl of Warwick's rival for leadership of the new regime was Thomas Wriothesley, 1st Earl of Southampton, whose conservative supporters had allied with Dudley's followers to create a unanimous Council, which they, and observers such as the Holy Roman Emperor Charles V's ambassador, expected to reverse Somerset's policy of religious reform. Warwick, on the other hand, pinned his hopes on the king's strong Protestantism and, claiming that Edward was old enough to rule in person, moved himself and his people closer to the king, taking control of the Privy Chamber. Paget, accepting a barony, joined Warwick when he realised that a conservative policy would not bring the emperor onto the English side over Boulogne. Southampton prepared a case for executing Somerset, aiming to discredit Warwick through Somerset's statements that he had done all with Warwick's co-operation. As a counter-move, Warwick convinced parliament to free Somerset, which it did on 14 January 1550. Warwick then had Southampton and his followers purged from the Council after winning the support of Council members in return for titles, and was made Lord President of the Council and great master of the king's household. Although not called a Protector, he was now clearly the head of the government.
As Edward was growing up, he was able to understand more and more government business. However, his actual involvement in decisions has long been a matter of debate, and during the 20th century, historians have presented the whole gamut of possibilities, "balanc[ing] an articulate puppet against a mature, precocious, and essentially adult king", in the words of Stephen Alford. A special "Counsel for the Estate" was created when Edward was fourteen. Edward chose the members himself. In the weekly meetings with this Council, Edward was "to hear the debating of things of most importance". A major point of contact with the king was the Privy Chamber, and there Edward worked closely with William Cecil and William Petre, the Principal Secretaries. The king's greatest influence was in matters of religion, where the Council followed the strongly Protestant policy that Edward favoured.
The Duke of Northumberland's mode of operation was very different from Somerset's. Careful to make sure he always commanded a majority of councillors, he encouraged a working council and used it to legitimise his authority. Lacking Somerset's blood-relationship with the king, he added members to the Council from his own faction in order to control it. He also added members of his family to the royal household. He saw that to achieve personal dominance, he needed total procedural control of the Council. In the words of historian John Guy, "Like Somerset, he became quasi-king; the difference was that he managed the bureaucracy on the pretence that Edward had assumed full sovereignty, whereas Somerset had asserted the right to near-sovereignty as Protector".
Warwick's war policies were more pragmatic than Somerset's, and they have earned him criticism for weakness. In 1550, he signed a peace treaty with France that agreed to withdrawal from Boulogne and recalled all English garrisons from Scotland. In 1551, Edward was betrothed to Elisabeth of Valois, King Henry II's daughter, and was made a Knight of Saint Michael. In practice, Warwick realised that England could no longer support the cost of wars. At home, he took measures to police local unrest. To forestall future rebellions, he kept permanent representatives of the crown in the localities, including lords lieutenant, who commanded military forces and reported back to central government.
Working with William Paulet and Walter Mildmay, Warwick tackled the disastrous state of the kingdom's finances. However, his regime first succumbed to the temptations of a quick profit by further debasing the coinage. The economic disaster that resulted caused Warwick to hand the initiative to the expert Thomas Gresham. By 1552, confidence in the coinage was restored, prices fell, and trade at last improved. Though a full economic recovery was not achieved until Elizabeth's reign, its origins lay in the Duke of Northumberland's policies. The regime also cracked down on widespread embezzlement of government finances, and carried out a thorough review of revenue collection practices, which has been called "one of the more remarkable achievements of Tudor administration".
In the matter of religion, the regime of Northumberland followed the same policy as that of Somerset, supporting an increasingly vigorous programme of reform. Although Edward VI's practical influence on government was limited, his intense Protestantism made a reforming administration obligatory; his succession was managed by the reforming faction, who continued in power throughout his reign. The man Edward trusted most, Thomas Cranmer, Archbishop of Canterbury, introduced a series of religious reforms that revolutionised the English church from one that—while rejecting papal supremacy—remained essentially Catholic, to one that was institutionally Protestant. The confiscation of church property that had begun under Henry VIII resumed under Edward—notably with the dissolution of the chantries—to the great monetary advantage of the crown and the new owners of the seized property. Church reform was therefore as much a political as a religious policy under Edward VI. By the end of his reign, the church had been financially ruined, with much of the property of the bishops transferred into lay hands.
The religious convictions of both Somerset and Northumberland have proved elusive for historians, who are divided on the sincerity of their Protestantism. There is less doubt, however, about the religious fervour of King Edward, who was said to have read twelve chapters of scripture daily and enjoyed sermons, and was commemorated by John Foxe as a "godly imp". Edward was depicted during his life and afterwards as a new Josiah, the biblical king who destroyed the idols of Baal. He could be priggish in his anti-Catholicism and once asked Catherine Parr to persuade Lady Mary "to attend no longer to foreign dances and merriments which do not become a most Christian princess". Edward's biographer Jennifer Loach cautions, however, against accepting too readily the pious image of Edward handed down by the reformers, as in John Foxe's influential "Acts and Monuments", where a woodcut depicts the young king listening to a sermon by Hugh Latimer. In the early part of his life, Edward conformed to the prevailing Catholic practices, including attendance at mass: but he became convinced, under the influence of Cranmer and the reformers among his tutors and courtiers, that "true" religion should be imposed in England.
The English Reformation advanced under pressure from two directions: from the traditionalists on the one hand and the zealots on the other, who led incidents of iconoclasm (image-smashing) and complained that reform did not go far enough. Reformed doctrines were made official, such as justification by faith alone and communion for laity as well as clergy in both kinds, of bread and wine. The Ordinal of 1550 replaced the divine ordination of priests with a government-run appointment system, authorising ministers to preach the gospel and administer the sacraments rather than, as before, "to offer sacrifice and celebrate mass both for the living and the dead". Cranmer set himself the task of writing a uniform liturgy in English, detailing all weekly and daily services and religious festivals, to be made compulsory in the first Act of Uniformity of 1549. The "Book of Common Prayer" of 1549, intended as a compromise, was attacked by traditionalists for dispensing with many cherished rituals of the liturgy, such as the elevation of the bread and wine, while some reformers complained about the retention of too many "popish" elements, including vestiges of sacrificial rites at communion. The prayer book was also opposed by many senior Catholic clerics, including Stephen Gardiner, Bishop of Winchester, and Edmund Bonner, Bishop of London, who were both imprisoned in the Tower and, along with others, deprived of their sees.
After 1551, the Reformation advanced further, with the approval and encouragement of Edward, who began to exert more personal influence in his role as Supreme Head of the church. The new changes were also a response to criticism from such reformers as John Hooper, Bishop of Gloucester, and the Scot John Knox, who was employed as a minister in Newcastle upon Tyne under the Duke of Northumberland and whose preaching at court prompted the king to oppose kneeling at communion. Cranmer was also influenced by the views of the continental reformer Martin Bucer, who died in England in 1551, by Peter Martyr, who was teaching at Oxford, and by other foreign theologians. The progress of the Reformation was further speeded by the consecration of more reformers as bishops. In the winter of 1551–52, Cranmer rewrote the "Book of Common Prayer" in less ambiguous reformist terms, revised canon law, and prepared a doctrinal statement, the Forty-two Articles, to clarify the practice of the reformed religion, particularly in the divisive matter of the communion service. Cranmer's formulation of the reformed religion, finally divesting the communion service of any notion of the real presence of God in the bread and the wine, effectively abolished the mass. According to Elton, the publication of Cranmer's revised prayer book in 1552, supported by a second Act of Uniformity, "marked the arrival of the English Church at Protestantism". The prayer book of 1552 remains the foundation of the Church of England's services. However, Cranmer was unable to implement all these reforms once it became clear in spring 1553 that King Edward, upon whom the whole Reformation in England depended, was dying.
In February 1553, Edward VI became ill, and by June, after several improvements and relapses, he was in a hopeless condition. The king's death and the succession of his Catholic half-sister Mary would jeopardise the English Reformation, and Edward's Council and officers had many reasons to fear it. Edward himself opposed Mary's succession, not only on religious grounds but also on those of legitimacy and male inheritance, which also applied to Elizabeth. He composed a draft document, headed "My devise for the succession", in which he undertook to change the succession, most probably inspired by his father Henry VIII's precedent. He passed over the claims of his half-sisters and, at last, settled the Crown on his first cousin once removed, the 16-year-old Lady Jane Grey, who on 25 May 1553 had married Lord Guilford Dudley, a younger son of the Duke of Northumberland. In the document he writes:
In his document Edward provided, in case of "lack of issue of my body", for the succession of male heirs only, that is, Jane Grey's mother's male heirs, Jane's, or her sisters'. As his death approached and possibly persuaded by Northumberland, he altered the wording so that Jane and her sisters themselves should be able to succeed. Yet Edward conceded Jane's right only as an exception to male rule, demanded by reality, an example not to be followed if Jane or her sisters had only daughters. In the final document both Mary and Elizabeth were excluded because of bastardy; since both had been declared bastards under Henry VIII and never made legitimate again, this reason could be advanced for both sisters. The provisions to alter the succession directly contravened Henry VIII's Third Succession Act of 1543 and have been described as bizarre and illogical.
In early June, Edward personally supervised the drafting of a clean version of his devise by lawyers, to which he lent his signature "in six several places." Then, on 15 June he summoned high ranking judges to his sickbed, commanding them on their allegiance "with sharp words and angry countenance" to prepare his devise as letters patent and announced that he would have these passed in parliament. His next measure was to have leading councillors and lawyers sign a bond in his presence, in which they agreed faithfully to perform Edward's will after his death. A few months later, Chief Justice Edward Montagu recalled that when he and his colleagues had raised legal objections to the devise, Northumberland had threatened them "trembling for anger, and ... further said that he would fight in his shirt with any man in that quarrel". Montagu also overheard a group of lords standing behind him conclude "if they refused to do that, they were traitors". At last, on 21 June, the devise was signed by over a hundred notables, including councillors, peers, archbishops, bishops, and sheriffs; many of them later claimed that they had been bullied into doing so by Northumberland, although in the words of Edward's biographer Jennifer Loach, "few of them gave any clear indication of reluctance at the time".
It was now common knowledge that Edward was dying, and foreign diplomats suspected that some scheme to debar Mary was under way. France found the prospect of the emperor's cousin on the English throne disagreeable and engaged in secret talks with Northumberland, indicating support. The diplomats were certain that the overwhelming majority of the English people backed Mary, but nevertheless believed that Queen Jane would be successfully established.
For centuries, the attempt to alter the succession was mostly seen as a one-man-plot by the Duke of Northumberland. Since the 1970s, however, many historians have attributed the inception of the "devise" and the insistence on its implementation to the king's initiative. Diarmaid MacCulloch has made out Edward's "teenage dreams of founding an evangelical realm of Christ", while David Starkey has stated that "Edward had a couple of co-operators, but the driving will was his". Among other members of the Privy Chamber, Northumberland's intimate Sir John Gates has been suspected of suggesting to Edward to change his devise so that Lady Jane Grey herself—not just any sons of hers—could inherit the Crown. Whatever the degree of his contribution, Edward was convinced that his word was law and fully endorsed disinheriting his half-sisters: "barring Mary from the succession was a cause in which the young King believed."
Edward became ill during January 1553 with a fever and cough that gradually worsened. The imperial ambassador, Jean Scheyfve, reported that "he suffers a good deal when the fever is upon him, especially from a difficulty in drawing his breath, which is due to the compression of the organs on the right side". Edward felt well enough in early April to take the air in the park at Westminster and to move to Greenwich, but by the end of the month he had weakened again. By 7 May he was "much amended", and the royal doctors had no doubt of his recovery. A few days later the king was watching the ships on the Thames, sitting at his window. However, he relapsed, and on 11 June Scheyfve, who had an informant in the king's household, reported that "the matter he ejects from his mouth is sometimes coloured a greenish yellow and black, sometimes pink, like the colour of blood". Now his doctors believed he was suffering from "a suppurating tumour" of the lung and admitted that Edward's life was beyond recovery. Soon, his legs became so swollen that he had to lie on his back, and he lost the strength to resist the disease. To his tutor John Cheke he whispered, "I am glad to die".
Edward made his final appearance in public on 1 July, when he showed himself at his window in Greenwich Palace, horrifying those who saw him by his "thin and wasted" condition. During the next two days, large crowds arrived hoping to see the king again, but on 3 July, they were told that the weather was too chilly for him to appear. Edward died at the age of 15 at Greenwich Palace at 8 pm on 6 July 1553. According to John Foxe's legendary account of his death, his last words were: "I am faint; Lord have mercy upon me, and take my spirit". He was buried in the Henry VII Lady Chapel at Westminster Abbey on 8 August 1553, with reformed rites performed by Thomas Cranmer. The procession was led by "a grett company of chylderyn in ther surples" and watched by Londoners "wepyng and lamenting"; the funeral chariot, draped in cloth of gold, was topped by an effigy of Edward, with crown, sceptre, and garter. Edward's burial place was unmarked until as late as 1966, when an inscribed stone was laid in the chapel floor by Christ's Hospital school to commemorate their founder. The inscription reads as follows: "In Memory Of King Edward VI Buried In This Chapel This Stone Was Placed Here By Christ's Hospital In Thanksgiving For Their Founder 7 October 1966".
The cause of Edward VI's death is not certain. As with many royal deaths in the 16th century, rumours of poisoning abounded, but no evidence has been found to support these. The Duke of Northumberland, whose unpopularity was underlined by the events that followed Edward's death, was widely believed to have ordered the imagined poisoning. Another theory held that Edward had been poisoned by Catholics seeking to bring Mary to the throne. The surgeon who opened Edward's chest after his death found that "the disease whereof his majesty died was the disease of the lungs". The Venetian ambassador reported that Edward had died of consumption—in other words, tuberculosis—a diagnosis accepted by many historians. Skidmore believes that Edward contracted tuberculosis after a bout of measles and smallpox in 1552 that suppressed his natural immunity to the disease. Loach suggests instead that his symptoms were typical of acute bronchopneumonia, leading to a "suppurating pulmonary infection" or lung abscess, septicaemia, and kidney failure.
Lady Mary was last seen by Edward in February, and was kept informed about the state of her half-brother's health by Northumberland and through her contacts with the imperial ambassadors. Aware of Edward's imminent death, she left Hunsdon House, near London, and sped to her estates around Kenninghall in Norfolk, where she could count on the support of her tenants. Northumberland sent ships to the Norfolk coast to prevent her escape or the arrival of reinforcements from the continent. He delayed the announcement of the king's death while he gathered his forces, and Jane Grey was taken to the Tower on 10 July. On the same day, she was proclaimed queen in the streets of London, to murmurings of discontent. The Privy Council received a message from Mary asserting her "right and title" to the throne and commanding that the Council proclaim her queen, as she had already proclaimed herself. The Council replied that Jane was queen by Edward's authority and that Mary, by contrast, was illegitimate and supported only by "a few lewd, base people".
Northumberland soon realised that he had miscalculated drastically, not least in failing to secure Mary's person before Edward's death. Although many of those who rallied to Mary were conservatives hoping for the defeat of Protestantism, her supporters also included many for whom her lawful claim to the throne overrode religious considerations. Northumberland was obliged to relinquish control of a nervous Council in London and launch an unplanned pursuit of Mary into East Anglia, from where news was arriving of her growing support, which included a number of nobles and gentlemen and "innumerable companies of the common people". On 14 July Northumberland marched out of London with three thousand men, reaching Cambridge the next day; meanwhile, Mary rallied her forces at Framlingham Castle in Suffolk, gathering an army of nearly twenty thousand by 19 July.
It now dawned on the Privy Council that it had made a terrible mistake. Led by the Earl of Arundel and the Earl of Pembroke, on 19 July the Council publicly proclaimed Mary as queen; Jane's nine-day reign came to an end. The proclamation triggered wild rejoicing throughout London. Stranded in Cambridge, Northumberland proclaimed Mary himself—as he had been commanded to do by a letter from the Council. William Paget and the Earl of Arundel rode to Framlingham to beg Mary's pardon, and Arundel arrested Northumberland on 24 July. Northumberland was beheaded on 22 August, shortly after renouncing Protestantism. His recantation dismayed his daughter-in-law, Jane, who followed him to the scaffold on 12 February 1554, after her father's involvement in Wyatt's rebellion.
Although Edward reigned for only six years and died at the age of 15, his reign made a lasting contribution to the English Reformation and the structure of the Church of England. The last decade of Henry VIII's reign had seen a partial stalling of the Reformation, a drifting back to more conservative values. By contrast, Edward's reign saw radical progress in the Reformation. In those six years, the Church transferred from an essentially Catholic liturgy and structure to one that is usually identified as Protestant. In particular, the introduction of the Book of Common Prayer, the Ordinal of 1550, and Cranmer's Forty-two Articles formed the basis for English Church practices that continue to this day. Edward himself fully approved these changes, and though they were the work of reformers such as Thomas Cranmer, Hugh Latimer, and Nicholas Ridley, backed by Edward's determinedly evangelical Council, the fact of the king's religion was a catalyst in the acceleration of the Reformation during his reign.
Queen Mary's attempts to undo the reforming work of her brother's reign faced major obstacles. Despite her belief in the papal supremacy, she ruled constitutionally as the Supreme Head of the English Church, a contradiction under which she bridled. She found herself entirely unable to restore the vast number of ecclesiastical properties handed over or sold to private landowners. Although she burned a number of leading Protestant churchmen, many reformers either went into exile or remained subversively active in England during her reign, producing a torrent of reforming propaganda that she was unable to stem. Nevertheless, Protestantism was not yet "printed in the stomachs" of the English people, and had Mary lived longer, her Catholic reconstruction might have succeeded, leaving Edward's reign, rather than hers, as a historical aberration.
On Mary's death in 1558, the English Reformation resumed its course, and most of the reforms instituted during Edward's reign were reinstated in the Elizabethan Religious Settlement. Queen Elizabeth replaced Mary's councillors and bishops with ex-Edwardians, such as William Cecil, Northumberland's former secretary, and Richard Cox, Edward's old tutor, who preached an anti-Catholic sermon at the opening of parliament in 1559. Parliament passed an Act of Uniformity the following spring that restored, with modifications, Cranmer's prayer book of 1552; and the Thirty-nine Articles of 1563 were largely based on Cranmer's Forty-two Articles. The theological developments of Edward's reign provided a vital source of reference for Elizabeth's religious policies, though the internationalism of the Edwardian Reformation was never revived. | https://en.wikipedia.org/wiki?curid=10245 |
EDSAC
The Electronic Delay Storage Automatic Calculator (EDSAC) was an early British computer. Inspired by John von Neumann's seminal "First Draft of a Report on the EDVAC", the machine was constructed by Maurice Wilkes and his team at the University of Cambridge Mathematical Laboratory in England. EDSAC was the second electronic digital stored-program computer to go into regular service.
Later the project was supported by J. Lyons & Co. Ltd., a British firm, who were rewarded with the first commercially applied computer, LEO I, based on the EDSAC design. Work on EDSAC started during 1947, and it ran its first programs on 6 May 1949, when it calculated a table of square numbers and a list of prime numbers. EDSAC was finally shut down on 11 July 1958, having been superseded by EDSAC 2, which remained in use until 1965.
As soon as EDSAC was operational, it began serving the University's research needs. It used mercury delay lines for memory, and derated vacuum tubes for logic. Power consumption was 11 kW of electricity. Cycle time was 1.5 ms for all ordinary instructions, 6 ms for multiplication. Input was via five-hole punched tape and output was via a teleprinter.
Initially registers were limited to an accumulator and a multiplier register. In 1953, David Wheeler, returning from a stay at the University of Illinois, designed an index register as an extension to the original EDSAC hardware.
A magnetic tape drive was added in 1952 but never worked sufficiently well to be of real use.
Until 1952, the available main memory (instructions and data) was only 512 18-bit words, and there was no backing store. The delay lines (or "tanks") were arranged in two batteries providing 512 words each. The second battery came into operation in 1952.
The full 1024-word delay line store was not available until 1955 or early 1956, limiting programs to about 800 words until then.
John Lindley (diploma student 1958–1959) mentioned "the incredible difficulty we had ever to produce a single correct piece of paper tape with the crude and unreliable home-made punching, printing and verifying gear available in the late 50s".
The EDSAC's main memory consisted of 1024 locations, though only 512 locations were initially installed. Each contained 18 bits, but the topmost bit was always unavailable due to timing problems, so only 17 bits were used. An instruction consisted of a five-bit instruction code, one spare bit, a ten-bit operand (usually a memory address), and a length bit to control whether the instruction used a 17-bit or a 35-bit operand (two consecutive words, little-endian). By design, all instruction codes were represented by one mnemonic letter, so that the "Add" instruction, for example, used the EDSAC character code for the letter A.
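Since the field widths are stated explicitly, the layout can be made concrete with a short sketch. The following Python snippet is an illustration rather than original EDSAC software: it assumes the fields run opcode, spare bit, address, length bit from the most significant end, and the opcode value used in the example is invented.

```python
# A minimal sketch of the 17-bit order layout described above (field order
# assumed: 5-bit opcode, 1 spare bit, 10-bit address, 1 length bit).

def pack_order(opcode: int, address: int, long_operand: bool) -> int:
    """Build a 17-bit order word from its fields."""
    assert 0 <= opcode < 32 and 0 <= address < 1024
    return (opcode << 12) | (address << 1) | int(long_operand)

def unpack_order(word: int):
    """Split a 17-bit order word back into (opcode, address, long_operand)."""
    return (word >> 12) & 0b11111, (word >> 1) & 0b1111111111, bool(word & 1)

# Example: a hypothetical "Add" order referencing location 100, short operand.
word = pack_order(0b00011, 100, False)   # opcode value is invented
assert unpack_order(word) == (0b00011, 100, False)
```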
Internally, the EDSAC used two's complement binary numbers. Numbers were either 17 bits (one word) or 35 bits (two words) long. Unusually, the multiplier was designed to treat numbers as fixed-point fractions in the range −1 ≤ "x" < 1, i.e. the binary point was immediately to the right of the sign. The accumulator could hold 71 bits, including the sign, allowing two long (35-bit) numbers to be multiplied without losing any precision.
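The number format lends itself to a small worked example. The sketch below models the stated convention in Python, assuming the usual reading in which a 17-bit two's-complement integer is scaled by 2**-16 to give a fraction in the range −1 ≤ x < 1; the helper names are invented.

```python
# Interpreting a 17-bit two's-complement word as a fixed-point fraction,
# with the binary point immediately to the right of the sign bit.

BITS = 17
SCALE = 1 << (BITS - 1)   # 2**16; the least significant bit is worth 1/SCALE

def word_to_fraction(word: int) -> float:
    """Read a 17-bit word as a fraction in [-1, 1)."""
    if word & (1 << (BITS - 1)):   # sign bit set, so the value is negative
        word -= 1 << BITS
    return word / SCALE

def fraction_to_word(x: float) -> int:
    """Encode a fraction in [-1, 1) as a 17-bit two's-complement word."""
    assert -1.0 <= x < 1.0
    return round(x * SCALE) & ((1 << BITS) - 1)

assert word_to_fraction(fraction_to_word(-0.5)) == -0.5
assert word_to_fraction(fraction_to_word(0.25)) == 0.25
```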
The instructions available were:
There was no division instruction (though various division subroutines were supplied) and no way to directly load a number into the accumulator: a "sTore and zero accumulator" instruction followed by an "Add" instruction was necessary for this. There was no unconditional jump instruction, nor was there a procedure call instruction – it had not yet been invented.
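Why loading a value took two orders can be seen in a toy model. The snippet below imitates only the accumulator behaviour described above; the addresses, values and scratch location are invented.

```python
# Loading the accumulator with the contents of location 10, using only
# "T n" (store accumulator at n, then zero it) and "A n" (add location n).

memory = {10: 42, 99: 0}   # location 99 serves as a scratch cell (invented)
acc = 7                    # whatever value happened to be left in the accumulator

# T 99: store and zero the accumulator, discarding its old contents
memory[99], acc = acc, 0
# A 10: add location 10 into the now-empty accumulator
acc += memory[10]

assert acc == 42           # the accumulator now holds a clean copy of location 10
```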
Maurice Wilkes discussed relative addressing modes for the EDSAC in a paper published in 1953. He was making the proposals to facilitate the use of subroutines.
The "initial orders" were hard-wired on a set of uniselector switches and loaded into the low words of memory at startup. By May 1949, the initial orders provided a primitive relocating assembler taking advantage of the mnemonic design described above, all in 31 words. This was the world's first assembler, and arguably the start of the global software industry. There is a simulation of EDSAC available and a full description of the initial orders and first programs.
The first calculation done by EDSAC was a square number program run on 6 May 1949. The program was written by Beatrice Worsley who had come from Canada to study the machine.
The machine was used by other members of the University to solve real problems, and many early techniques were developed that are now included in operating systems.
Users prepared their programs by punching them (in assembler) onto a paper tape. They soon became adept at holding the paper tape up to the light and reading back the codes. When a program was ready, it was hung on a length of line strung up near the paper tape reader. The machine operators, who were present during the day, selected the next tape from the line and loaded it into EDSAC – an arrangement recognisable today as a job queue. If the program printed something, the tape and the printout were returned to the user; otherwise the user was informed of the memory location at which it had stopped. Debuggers were some time away, but a CRT screen could be set to display the contents of a particular piece of memory – used, for example, to see whether a number was converging. A loudspeaker was connected to the accumulator's sign bit; experienced users knew the healthy and unhealthy sounds of programs, particularly programs 'hung' in a loop. After office hours, certain "Authorised Users" were allowed to run the machine for themselves, which went on late into the night until a valve blew – which, according to one such user, usually happened.
The early programmers had to make use of techniques frowned upon today – especially modifying the code as it ran. As there was no index register until much later, the only way of accessing an array was to alter which memory location a particular instruction referenced.
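A toy model makes the technique concrete. The sketch below reuses the illustrative 17-bit layout from earlier (it is not real EDSAC code): a loop sums a four-element array by repeatedly executing one "add" order and bumping that order's own address field.

```python
# Summing an array with no index register: the order at location 0 is
# modified in place so that it points at successive array elements.

memory = [0] * 32
memory[10:14] = [3, 1, 4, 1]              # a four-element "array" at location 10
memory[0] = (0b00011 << 12) | (10 << 1)   # an "add location 10" order (invented encoding)

total = 0
for _ in range(4):
    address = (memory[0] >> 1) & 0b1111111111   # decode the order's address field
    total += memory[address]                    # execute the (current) add order
    memory[0] += 1 << 1                         # self-modify: advance the address field

assert total == 3 + 1 + 4 + 1
```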
David Wheeler, who earned the world's first Computer Science PhD working on the project, is credited with inventing the concept of a subroutine. Users wrote programs that called a routine by jumping to the start of the subroutine with the return address (i.e. the location-plus-one of the jump itself) in the accumulator (a Wheeler Jump). By convention the subroutine expected this and the first thing it did was to modify its concluding jump instruction to that return address. Multiple and nested subroutines could be called so long as the user knew the length of each one in order to calculate the location to jump to; recursive calls were forbidden. The user then copied the code for the subroutine from a master tape onto their own tape following the end of their own program. (However, Turing discussed subroutines in a paper of 1945 on design proposals for the NPL ACE, going so far as to invent the concept of a return address stack, which would have allowed recursion.)
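The convention can be sketched with plain Python data rather than real EDSAC orders; the operation names below are invented. The two essential steps are visible: the caller leaves the return address in the accumulator, and the subroutine's first act is to plant that address in its own final jump.

```python
# A toy machine illustrating the Wheeler Jump. Each instruction is an
# (operation, argument) pair held in a mutable program list.

def run(program, start=0):
    pc, acc = start, 0
    while True:
        op, arg = program[pc]
        if op == 'JUMP_WITH_LINK':        # caller: acc := return address, then jump
            acc, pc = pc + 1, arg
        elif op == 'PLANT_RETURN':        # subroutine prologue: patch its final jump
            program[arg] = ('JUMP', acc)
            pc += 1
        elif op == 'JUMP':
            pc = arg
        elif op == 'HALT':
            return acc

program = [
    ('JUMP_WITH_LINK', 2),   # 0: call the subroutine at 2; acc becomes 1
    ('HALT', None),          # 1: execution resumes here after the return
    ('PLANT_RETURN', 3),     # 2: subroutine: overwrite slot 3 with ('JUMP', acc)
    ('JUMP', None),          # 3: becomes ('JUMP', 1), returning to the caller
]
assert run(program) == 1
```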
The subroutine concept led to the availability of a substantial subroutine library. By 1951, 87 subroutines in the following categories were available for general use: floating point arithmetic; arithmetic operations on complex numbers; checking; division; exponentiation; routines relating to functions; differential equations; special functions; power series; logarithms; miscellaneous; print and layout; quadrature; read (input); "n"th root; trigonometric functions; counting operations (simulating repeat until loops, while loops and for loops); vectors; and matrices.
EDSAC was designed specifically to form part of the Mathematical Laboratory's support service for calculation. The first scientific paper to be published using a computer for calculations was by Ronald Fisher. Wilkes and Wheeler had used EDSAC to solve a differential equation relating to gene frequencies for him. In 1951, Miller and Wheeler used the machine to discover a 79-digit prime – the largest known at the time.
The winners of three Nobel Prizes – John Kendrew and Max Perutz (Chemistry, 1962), Andrew Huxley (Medicine, 1963) and Martin Ryle (Physics, 1974) – benefitted from EDSAC's revolutionary computing power. In their prize acceptance speeches, each acknowledged the role that EDSAC had played in their research.
In the early 1960s Peter Swinnerton-Dyer used the EDSAC computer to calculate the number of points modulo "p" (denoted by "Np") for a large number of primes "p" on elliptic curves whose rank was known. Based on these numerical results, Birch and Swinnerton-Dyer conjectured that "Np" for a curve "E" with rank "r" obeys an asymptotic law, the Birch and Swinnerton-Dyer conjecture, considered one of the top unsolved problems in mathematics as of 2016.
In 1952, Sandy Douglas developed "OXO", a version of noughts and crosses (tic-tac-toe) for the EDSAC, with graphical output to a VCR97 6" cathode ray tube. This may well have been the world's first video game.
Another video game was created by Stanley Gill and involved a dot (termed a sheep) approaching a line in which one of two gates could be opened. The Stanley Gill game was controlled via the lightbeam of the EDSAC's paper tape reader. Interrupting it (such as by the player placing their hand in it) would open the upper gate. Leaving the beam unbroken would result in the lower gate opening.
EDSAC's successor, EDSAC 2, was commissioned in 1958.
In 1961, an EDSAC 2 version of Autocode, an ALGOL-like high-level programming language for scientists and engineers, was developed by David Hartley.
In the mid-1960s, a successor to the EDSAC 2 was planned, but the move was instead made to the Titan, a prototype Atlas 2 developed from the Atlas Computer of the University of Manchester, Ferranti, and Plessey.
On 13 January 2011, the Computer Conservation Society announced that it planned to build a working replica of EDSAC, at the National Museum of Computing (TNMoC) in Bletchley Park, supervised by Andrew Herbert, who studied under Maurice Wilkes. The first parts of the recreation were switched on in November 2014, and the ongoing project is open to visitors of the museum. In 2016, two original EDSAC operators, Margaret Marrs and Joyce Wheeler, visited the museum to assist the project. As of November 2016, the fully completed and operational replica was expected to be commissioned by the autumn of 2017; however, project delays have postponed its opening. | https://en.wikipedia.org/wiki?curid=10251 |
E. H. Shepard
Ernest Howard Shepard OBE, MC (10 December 1879 – 24 March 1976) was an English artist and book illustrator. He is known especially for illustrations of the anthropomorphic animal and soft toy characters in "The Wind in the Willows" and "Winnie-the-Pooh".
Shepard was born in St John's Wood, London. Having shown some promise in drawing at St Paul's School, in 1897 he enrolled in the Heatherley School of Fine Art in Chelsea. After a productive year there, he attended the Royal Academy Schools, winning a Landseer scholarship in 1899 and a British Institute prize in 1900. There he met Florence Eleanor Chaplin, whom he married in 1904. By 1906 Shepard had become a successful illustrator, having produced work for illustrated editions of Aesop's Fables, "David Copperfield", and "Tom Brown's Schooldays", while at the same time working as an illustrator on the staff of "Punch". The couple bought a house in London, but in 1905 moved to Shamley Green, near Guildford.
Shepard was a prolific painter, showing in a number of exhibitions. He exhibited at the Royal Society of Artists, Birmingham – a traditional venue for generic painters – as well as in the more radical atmosphere of Glasgow's Institute of Fine Arts, where some of the most innovative artists were on show. He was twice an exhibitor at the Walker Art Gallery in Liverpool, one of the largest provincial galleries in the country, and also exhibited at the Manchester Art Gallery, a Victorian institution later part of the public libraries. But at heart, Shepard was a Londoner, showing sixteen times at the Royal Academy on Piccadilly. His wife, also a painter, found a venue in London's West End for her own modest output during a 25-year career.
In his mid-thirties when World War I broke out in 1914, Shepard received a commission as a second lieutenant in the Royal Garrison Artillery, an arm of the Royal Artillery. He was assigned to 105th Siege Battery, which crossed to France in May 1916 and went into action at the Battle of the Somme.
By the autumn of 1916, Shepard started working for the Intelligence Department sketching the combat area within the view of his battery position. On 16 February 1917, he was made an acting captain whilst second-in-command of his battery, and briefly served as an acting major in late April and early May of that year during the Battle of Arras before reverting to acting captain. He was promoted to substantive lieutenant on 1 July 1917. Whilst acting as Captain, he was awarded the Military Cross. His citation read:
Later in 1917 105th Siege Battery participated in the final stages of the Battle of Passchendaele where it came under heavy fire and suffered a number of casualties. At the end of the year it was sent to help retrieve a disastrous situation on the Italian Front, travelling by rail via Verona before coming into action on the Montello Hill.
Shepard missed the Second Battle of the Piave River in April 1918, being on leave in England (where he was invested with his MC by King George V at Buckingham Palace) and where he was attending a gunnery course. He was back in Italy with his battery for the victory at Vittorio Veneto. After the Armistice of Villa Giusti in November 1918, Shepard was promoted to acting major in command of the battery, and given the duty of administering captured enemy guns. Demobilisation began at Christmas 1918 and 105th Siege Battery was disbanded in March 1919.
Throughout the war he had been contributing to "Punch". He was hired as a regular staff cartoonist in 1921 and became lead cartoonist in 1945. He was removed from this post in 1953 by "Punch"'s new editor, Malcolm Muggeridge.
Shepard was recommended to A. A. Milne in 1923 by another "Punch" staffer, E. V. Lucas. Milne initially thought Shepard's style was not what he wanted, but used him to illustrate the book of poems "When We Were Very Young". Happy with the results, Milne then insisted Shepard illustrate "Winnie-the-Pooh". Realising his illustrator's contribution to the book's success, the writer arranged for Shepard to receive a share of his royalties. Milne also inscribed a copy of "Winnie-the-Pooh" with the following personal verse:
Eventually Shepard came to resent "that silly old bear" as he felt that the Pooh illustrations overshadowed his other work.
Shepard modelled Pooh not on the toy owned by Milne's son Christopher Robin but on "Growler", a stuffed bear owned by his own son. (Growler no longer exists, having been given to his granddaughter Minnie Hunt and subsequently destroyed by a neighbour's dog.) His Pooh work is so famous that 300 of his preliminary sketches were exhibited at the Victoria and Albert Museum in 1969, when he was 90 years old.
A Shepard painting of Winnie the Pooh, believed to have been painted in the 1930s for a Bristol teashop, is his only known oil painting of the famous teddy bear. It was purchased at an auction for $243,000 in London late in 2000. The painting is displayed in the Pavilion Gallery at Assiniboine Park in Winnipeg, Manitoba, Canada.
Shepard wrote two autobiographies: "Drawn from Memory" (1957) and "Drawn From Life" (1961).
In 1972, Shepard gave his personal collection of papers and illustrations to the University of Surrey. These now form the E.H. Shepard Archive.
Shepard was made an Officer of the Order of the British Empire in the 1972 Birthday Honours.
Shepard lived at Melina Place in St John's Wood and from 1955 in Lodsworth, West Sussex. He and Florence had two children, Graham (born 1907) and Mary (born 1909), who both became illustrators. Lt. Graham Shepard died when his ship HMS "Polyanthus" was sunk by German submarine U-952 in September 1943. Mary married E.V. Knox, the editor of "Punch", and became known as the illustrator of the "Mary Poppins" series of children's books. Florence Shepard died in 1927. In November 1943 Shepard married Norah Carroll, a nurse at St Mary's Hospital, Paddington; they remained married until his death in 1976. In 1966, Shepard called the short film "Winnie the Pooh and the Honey Tree" a travesty. | https://en.wikipedia.org/wiki?curid=10252 |
Enterobacteriaceae
Enterobacteriaceae is a large family of Gram-negative bacteria. It was first proposed by Rahn in 1936, and now includes over 30 genera and more than 100 species. Its classification above the level of family is still a subject of debate, but one classification places it in the order Enterobacterales of the class Gammaproteobacteria in the phylum Proteobacteria.
Enterobacteriaceae includes, along with many harmless symbionts, many of the more familiar pathogens, such as "Salmonella", "Escherichia coli", "Klebsiella", and "Shigella". Other disease-causing bacteria in this family include "Enterobacter" and "Citrobacter". Members of the Enterobacteriaceae can be informally referred to as enterobacteria or "enteric bacteria", as several members live in the intestines of animals. The family name derives from "enterobacterium" with the suffix "-aceae" that designates a family – not from the genus "Enterobacter" (which would give "Enterobacteraceae") – and the type genus is "Escherichia".
Members of the Enterobacteriaceae are bacilli (rod-shaped), and are typically 1–5 μm in length. They typically appear as medium to large-sized grey colonies on blood agar, although some can express pigments.
Most have many flagella used to move about, but a few genera are nonmotile. Most members of Enterobacteriaceae have peritrichous, type I fimbriae involved in the adhesion of the bacterial cells to their hosts.
They are not spore-forming.
Like other proteobacteria, Enterobacteriaceae stain Gram-negative, and they are facultative anaerobes, fermenting sugars to produce lactic acid and various other end products. Most also reduce nitrate to nitrite, although exceptions exist. Unlike most similar bacteria, Enterobacteriaceae generally lack cytochrome c oxidase, although there are exceptions.
Catalase reactions vary among Enterobacteriaceae.
Many members of this family are normal members of the gut microbiota in humans and other animals, while others are found in water or soil, or are parasites on a variety of different animals and plants.
"Escherichia coli" is one of the most important model organisms, and its genetics and biochemistry have been closely studied.
Some enterobacteria, such as "Salmonella" and "Shigella", are important pathogens because they produce endotoxins. Endotoxins reside in the cell wall and are released when the cell dies and the cell wall disintegrates. When released into the bloodstream following cell lysis, these endotoxins cause a systemic inflammatory and vasodilatory response, the most severe form of which is known as endotoxic shock, which can be rapidly fatal.
The following genera have been validly published, thus they have "Standing in Nomenclature". The year the genus was proposed is listed in parentheses after the genus name.
The following genera have been effectively, but not validly, published, thus they do not have "Standing in Nomenclature". The year the genus was proposed is listed in parentheses after the genus name.
To identify different genera of Enterobacteriaceae, a microbiologist may run a series of tests in the lab. These include:
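Typical identification schemes combine the results of several such biochemical tests. Purely as an illustration (the tests chosen here – indole, citrate and hydrogen sulphide production – and the simplified profiles are textbook conventions, not the article's own list, and the function name is invented), the results might be combined like this:

```python
# An illustrative sketch only: a tiny decision procedure over three classic
# biochemical tests. The profiles are textbook simplifications (e.g. E. coli
# is typically indole-positive and citrate-negative), not a clinical key.

def identify(indole: bool, citrate: bool, h2s: bool) -> str:
    """Suggest a tentative identification from three test results."""
    if h2s and not indole:
        return "Proteus mirabilis (tentative)"
    if indole and not citrate:
        return "Escherichia coli (tentative)"
    if citrate and not indole:
        return "Klebsiella pneumoniae (tentative)"
    return "not resolved by this simplified key"

print(identify(indole=True, citrate=False, h2s=False))   # Escherichia coli (tentative)
```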
In a clinical setting, three species make up 80 to 95% of all isolates identified. These are "Escherichia coli", "Klebsiella pneumoniae", and "Proteus mirabilis". However, "Proteus mirabilis" is now considered a part of the Morganellaceae, a sister clade within the Enterobacterales.
Several Enterobacteriaceae strains have been isolated which are resistant to antibiotics including carbapenems, which are often claimed as "the last line of antibiotic defense" against resistant organisms. For instance, some "Klebsiella pneumoniae" strains are carbapenem resistant. | https://en.wikipedia.org/wiki?curid=10253 |
Essendon Football Club
The Essendon Football Club, nicknamed the Bombers, is a professional Australian rules football club that plays in the Australian Football League (AFL), the sport's premier competition. Thought to have formed in 1872, the club played its first recorded game on 7 June 1873 against a Carlton Second 20, winning 1 goal to nil. The club became a senior club in the Victorian Football Association in 1878, one year after the VFA formed. It is historically associated with Essendon, a suburb in the north-west of Melbourne, Victoria. Since 2013, the club has been headquartered at The Hangar, Melbourne Airport, and plays its home games at either Docklands Stadium or the Melbourne Cricket Ground; throughout most of its history the club's home ground and headquarters was Windy Hill, Essendon, where it played from 1922 until 1991. While it stopped playing games at the ground thereafter, Windy Hill remained its training and administration base until the end of 2013. Dyson Heppell is the current team captain.
A founding member club of both the Victorian Football Association, in 1877, and the Victorian Football League (renamed the AFL in 1990), in 1896, Essendon is one of Australia's best-known football clubs. Essendon has won 16 VFL/AFL premierships, which, along with Carlton, is the most of any club in the competition. The club won four consecutive VFA premierships between 1891 and 1894, a feat unmatched in that competition's history.
The club was founded by members of the Royal Agricultural Society, the Melbourne Hunt Club and the Victorian Woolbrokers. The Essendon Football Club is thought to have formed in 1872 at a meeting at the home of a well-known brewery family, the McCrackens, whose Ascot Vale property hosted a team of local junior players.
Robert McCracken, the owner of several city hotels, was the founder and first president of the Essendon Football club and his son, Alex, its first secretary. Alex later became president of the newly formed VFL. Alex's cousin, Collier McCracken, who had already played with Melbourne, was the team's first captain.
The club played its first recorded match against the Carlton second twenty on 7 June 1873, with Essendon winning by one goal. Essendon played 13 matches in its first season, winning seven, with four draws and losing two. The club was one of the inaugural junior members of the Victorian Football Association (VFA) in 1877, and began competing as a senior club from the 1878 season. During its early years in the Association, Essendon played its home matches at Flemington Hill, but moved to the East Melbourne Cricket Ground in 1881.
In 1878, Essendon played in the first match on what would be considered by modern standards to be a full-sized field at Flemington Hill. In 1879 Essendon played Melbourne in one of the earliest night matches recorded, when the ball was painted white. In 1883 the team played four matches in eight days in Adelaide, defeating Port Adelaide (on 16 June), a combined South Australian team (on 18 June) and South Adelaide (on 20 June), and losing to Norwood (on 23 June).
In 1891 Essendon won their first VFA premiership, which they repeated in 1892, 1893 and 1894. One of the club's greatest players, Albert Thurgood played for the club during this period, making his debut in 1892. Essendon (18 wins, 2 draws) was undefeated in the 1893 season.
At the end of the 1896 season, Essendon, along with seven other clubs, formed the Victorian Football League. Essendon's first VFL game, in 1897, was against Geelong at Corio Oval in Geelong. Essendon won its first VFL premiership by winning the 1897 VFL finals series. It won the premiership again in 1901, defeating Collingwood in the Grand Final, and won successive premierships in 1911 and 1912 over Collingwood and South Melbourne respectively.
The club is recorded as having played at McCracken's Paddock, Glass's Paddock and Flemington Hill. These are likely three different names for the one ground, given that McCracken's Paddock was a parcel of land that sat within the larger Glass's Paddock, which in turn was situated in an area widely known at the time as Flemington Hill. In 1882 the club moved home games to the East Melbourne Cricket Ground (since demolished), after an application to play on the Essendon Cricket Ground (later known as Windy Hill) was voted down on the basis that the mayor of the City of Essendon, James Taylor, considered the ground "to be suitable only for the gentleman's game of cricket".
The club became known by the nickname "the Same Old Essendon", from the title and hook of the principal song performed by a band of supporters which regularly occupied a section of the grandstand at the club's games. The nickname first appeared in print in the local "North Melbourne Advertiser" in 1889, and ended up gaining wide use, often as the diminutive "Same Olds".
This move away from Essendon, at a time when fans would walk to their local ground, didn't go down too well with many Essendon people; and, as a consequence, a new team and club was formed in 1900, unconnected with the first (although it played in the same colours), that was based at the Essendon Cricket Ground, and playing in the Victorian Football Association. It was known firstly as Essendon Town and, after 1905, as Essendon (although it was often called Essendon A, with the A standing for association).
After the 1921 season, the East Melbourne Cricket Ground was closed and demolished to expand the Flinders Street Railyard. Having played at the East Melbourne Cricket Ground from 1882 to 1921, and having won four VFA premierships (1891–1894) and four VFL premierships (1897, 1901, 1911 and 1912) whilst there, Essendon was looking for a new home, and was offered grounds at the current Royal Melbourne Showgrounds, at Victoria Park, at Arden St, North Melbourne, and the Essendon Cricket Ground. The Essendon City Council offered the (VFL) team the Essendon Cricket Ground, announcing that it would be prepared to spend over £12,000 on improvements, including a new grandstand, scoreboard and re-fencing of the oval.
The club's first preference was to move to North Melbourne – a move which the North Melbourne Football Club (then in the VFA) saw as an opportunity to get into the VFL. Most of Essendon's members and players were from the North Melbourne area, and sportswriters believed that Essendon would have been taken over by or rebranded as North Melbourne within only a few years of the move. However, the VFA, desperate for its own strategic reasons not to lose its use of the North Melbourne Cricket Ground, successfully appealed to the State Government to block Essendon's move to North Melbourne. With its preferred option off the table, the club returned to Essendon, and the Essendon VFA club disbanded, with most of its players moving to North Melbourne.
The old "Same Olds" nickname fell into disuse, and by 1922 the other nicknames "Sash Wearers" and "Essendonians" that had been variously used from time to time were also abandoned. The team became universally known as "the Dons" (from EssenDON); it was not until much later, during the War years of the early 1940s, that they became known as "The Bombers" – due to Windy Hill's proximity to the Essendon Aerodrome.
In the 1922 season, playing in Essendon for the first time in decades, Essendon reached the final four for the first time since 1912, finishing in third place. In the 1923 season the club topped the ladder with 13 wins from 16 games. After a 17-point second semi final loss to South Melbourne, Essendon defeated Fitzroy (who had beaten South Melbourne) in the challenge final: Essendon 8.15 (63) to Fitzroy 6.10 (46). Amongst Essendon's best players were half forward flanker George "Tich" Shorten, centre half forward Justin McCarthy, centre half back Tom Fitzmaurice, rover Frank Maher and wingman Jack Garden.
This was one of Essendon's most famous sides, dubbed the "Mosquito Fleet", due to the number of small, very fast players in the side. Six players were 5'6" (167 cm) or smaller.
The 1924 season proved to be arguably the strangest year in Essendon's entire history. For the first time since 1897 there was no ultimate match – either "Challenge Final" or "Grand Final" – to determine the premiers; instead, the top four clubs after the home and away season played a round-robin to determine the premiership. Essendon, having previously defeated both Fitzroy (by 40 points) and South Melbourne (by 33 points), clinched the premiership despite a 20-point loss to Richmond: with the Tigers having already lost a match to Fitzroy by a substantial margin, the Dons were declared premiers by virtue of their superior percentage, meaning that Essendon again managed to win successive premierships. But the poor crowds for the finals meant the round-robin format was never attempted again, leaving Essendon with the unique record of winning the only two VFL premierships (1897 and 1924) decided without a grand final.
Prominent contributors to Essendon's 1924 Premiership success included back pocket Clyde Donaldson, follower Norm Beckton, half back flanker Roy Laing, follower Charlie May and rover Charlie Hardy.
The 1924 season was not without controversy, with rumours of numerous players accepting bribes. Regardless of the accuracy of these allegations, the club's image was tarnished, and the side experienced its lowest period during the decade that followed, with poor results on the field and decreased support off it.
There was worse to follow, with various Essendon players publicly blaming each other for the poor performance against Richmond, and then, with dissension still rife in the ranks, the side plummeted to a humiliating 28-point loss to VFA premiers Footscray in a special charity match played a week later in front of 46,100 people, in aid of "Dame Nellie Melba's Limbless Soldiers' Appeal Fund", purportedly (but not officially) for the championship of Victoria.
The club's fortunes dipped alarmingly, and persistently. Indeed, after finishing third in the 1926 season, it was to be 14 years before Essendon would even contest a finals series.
The 1933 season was probably the start of the Essendon revival, seeing the debut of Dick Reynolds, regarded as one of Essendon's greatest players. His impact was immediate: he won his first Brownlow Medal aged 19. His three Brownlow victories (1934, 1937, 1938) equalled the record of Haydn Bunton, Sr (1931, 1932, 1935), a mark later also reached by Bob Skilton (1959, 1963, 1968) and Ian Stewart (1965, 1966, 1971).
Reynolds went on to arguably even greater achievements as a coach, a position to which he was first appointed, jointly with Harry Hunter, in 1939 (this was while Reynolds was still a player). A year later he took the reins on a solo basis and was rewarded with immediate success (at least in terms of expectations at the time which, after so long in the wilderness, were somewhat modest). He was regarded as having a sound tactical knowledge of the game and being an inspirational leader, as he led the side into the finals in 1940 for the first time since 1926, when the side finished 3rd. Melbourne, which defeated Essendon by just 5 points in the preliminary final, later went on to trounce Richmond by 39 points in the grand final.
The Essendon Football Club adopted the nickname The Bombers in April 1940.
1941 brought Essendon's first grand final appearance since 1923, but the side again lowered its colours to Melbourne. By the following year the war had considerably weakened the competition, with Geelong forced to pull out due to travel restrictions caused by petrol rationing. Attendances at games also declined dramatically, whilst some clubs had to move from their normal grounds, which were being used for military purposes. Many players were lost to football because of their military service. Nevertheless, Essendon went on to win the 1942 Premiership, with Western Australian Wally Buttsworth in irrepressible form at centre half back. Finally, the long-awaited premiership was Essendon's after comprehensively outclassing Richmond in the grand final, 19.18 (132) to 11.13 (79). The match was played at Carlton in front of 49,000 spectators.
In any case, there could be no such reservations about Essendon's next premiership, which came just four years later. Prior to that Essendon lost a hard-fought grand final to Richmond in 1943 by 5 points, finished 3rd in 1944, and dropped to 8th in 1945.
After World War II, Essendon enjoyed great success. In the five years immediately after the war, Essendon won three premierships (1946, 1949, 1950) and were runners-up twice (1947, 1948). In 1946, Essendon were clearly the VFL's supreme force, topping the ladder after the roster games and surviving a drawn second semi final against Collingwood to win the replay a week later, 10.16 (76) to 8.9 (57). Then, in the grand final against Melbourne, Essendon set a grand final record score of 22.18 (150) to Melbourne's 13.9 (87), with seven-goal centre half forward Gordon Lane, rover Bill Hutchinson, and defenders Wally Buttsworth, Cec Ruddell and Harold Lambert among the best players.
The 1947 Grand Final has to go down in the ledger as 'one of the ones that got away', Essendon losing to Carlton by a single point despite managing 30 scoring shots to 21. As if to prove that lightning does occasionally strike twice, the second of the 'ones that got away' came just a year later, the Dons finishing with a lamentable 7.27, to tie with Melbourne (who managed 10.9) in the 1948 grand final. A week later Essendon waved the premiership good-bye, as Melbourne raced to a 13.11 (89) to 7.8 (50) triumph. The club's Annual Report made an assessment that was at once restrained and, as was soon to emerge, tacitly and uncannily prophetic:
It is very apparent that no team is complete without a spearhead and your committee has high hopes of rectifying that fault this coming season.
The 1949 season heralded the arrival on the VFL scene of John Coleman, arguably the greatest player in Essendon's history, and, in the view of some, the finest player the game has known. In his first ever appearance for the Dons, against Hawthorn in Round 1 1949, he booted 12 of his side's 18 goals to create an opening round record which was to endure for forty-five years. More importantly, however, he went on to maintain the same high level of performance throughout the season, kicking precisely 100 goals for the year to become the first player to top the ton since Richmond's Jack Titus in 1940.
The Coleman factor was just what Essendon needed to enable them to take that vital final step to premiership glory, but even so it was not until the business end of the season that this became clear. Essendon struggled to make the finals in 4th place, but once there they suddenly ignited to put in one of the most consistently devastating September performances in VFL history.
Collingwood succumbed first as the Dons powered their way to an 82-point first semi final victory, and a fortnight later it was the turn of the North Melbourne Football Club as Essendon won the preliminary final a good deal more comfortably than the ultimate margin of 17 points suggested. In the grand final, Essendon were pitted against Carlton and, in a match that was a travesty as a contest, overwhelmed the Blues to the tune of 73 points, 18.17 (125) to 6.16 (52). Best for the Dons included pacy Aboriginal half back flanker Norm McDonald, ruckman Bob McLure, and rovers Bill Hutchinson and Ron McEwin. John Coleman also did well, registering 6 majors.
A year later Essendon were, if anything, even more dominant, defeating North Melbourne in both the second semi final and the grand final to secure consecutive VFL premierships for the third time. Best afield in the grand final, in what was officially his swansong as a player, was captain-coach Dick Reynolds, who received sterling support from the likes of Norm McDonald, ruckman and back pocket Wally May, back pocket Les Gardiner, and big Bob McLure.
With 'King Richard' still holding court as coach in 1951, albeit now in a non-playing capacity, Essendon seemed on course for a third consecutive flag, but a controversial four-week suspension handed to John Coleman on the eve of the finals effectively put paid to their chances. Coleman was reported for retaliation after twice being struck by his Carlton opponent, Harry Caspar, and without him the Dons were rated a four-goal poorer team. Nevertheless, they still managed to battle their way to a sixth successive grand final with wins over Footscray by 8 points in the first semi final and Collingwood by 2 points in the preliminary final.
The Dons sustained numerous injuries in the preliminary final, and the selectors sprang a surprise on grand final day by naming the officially retired Dick Reynolds as 20th man. Even 'King Richard' was powerless to prevent the inevitable: although Essendon led at half time, Geelong kicked five goals to two behinds in the third quarter to set up victory by 11 points.
Essendon slumped to 8th in 1952, but John Coleman remained in irrepressible form, managing 103 goals for the year. Hugh Buggy noted in "The Argus": "It was the wettest season for twenty-two years and Coleman showed that since the war he was without peer in the art of goal kicking."
Two seasons later Coleman's career was ended when he dislocated a knee during the Round 8 clash with North Melbourne at Essendon. Aged just twenty-five, he had kicked 537 goals in only 98 VFL games, in what was generally a fairly low-scoring period for the game. His meteoric rise and fall were clearly the stuff of legend, and few if any players, either before or since, have had such an immense impact over so brief a period.
Without Coleman, Essendon's fortunes plummeted, and there were to be no further premierships in the 1950s. The nearest miss came in 1957 when the Bombers (as they were popularly known by this time) earned premiership favouritism after a superb 16-point second semi final defeat of Melbourne, only to lose by over 10 goals against the same side a fortnight later.
1959 saw another grand final loss to Melbourne, this time by 37 points, but the fact that the average age of the Essendon side was only 22 was seen as providing considerable cause for optimism. However, it was to take another three years, and a change of coach, before the team's obvious potential was translated into tangible success.
John Coleman succeeded Dick Reynolds as Essendon coach in 1961, ending the Reynolds era at the club. Essendon finished that season in mid-table, and supporters were not expecting too much of the following year. Instead, the club blitzed the opposition in 1962, losing only two matches, both to the previous year's grand finalists, and finishing top of the ladder. The finals posed no problems for the resurgent Dons, who easily accounted for Carlton in the season's climax to win the 1962 premiership, a remarkable result for Coleman, who had claimed the ultimate prize in Australian football in only his second season of coaching. As is so often the case after a flag, the following two years were below standard. A further premiership in 1965, won from fourth position on the ladder, was also unexpected, given periods of poor form during the season; the Bombers were a different club when the finals came around, with some of the credit for the improvement given to the influence of Brian Sampson and Ted Fordham during the finals. Coleman's time as coach turned out to be much like his playing career: highly successful but cut short, in this case when he had to stand down due to health problems in 1967. Only six years later, on the eve of the 1973 season, he died of a heart attack at just 44 years of age.
Following Coleman's retirement, the club experienced tough times on and off the field. Finals appearances were rare for a side often in contention for the wooden spoon. Essendon did manage to reach the 1968 VFL Grand Final, but lost to Carlton by just three points and did not make it back to the big stage for a decade and a half.
During the period from 1968 until 1980, five different coaches were tried, none lasting longer than four years. Off the field the club also went through troubled times: in 1970 five players went on strike before the season had even begun, demanding higher payments. Essendon made the finals in 1972 and 1973 under the autocratic direction of Des Tuddenham, recruited from Collingwood, but were beaten badly in successive elimination finals by St. Kilda and did not taste finals action again until the very end of the decade. The Essendon sides of the 1970s were involved in many rough and tough encounters under Tuddenham, who himself came to loggerheads with Ron Barassi in a quarter-time huddle in which the two coaches exchanged heated words. Essendon had tough but talented players in the likes of "Rotten Ronnie" Ron Andrews, alongside experienced men such as Barry Davis, Ken Fletcher, Geoff Blethyn, Neville Fields and West Australian import Graham Moss. A controversial half-time all-in brawl with Richmond at Windy Hill in May 1974 and a 1975 encounter with Carlton were testimony to the era: following the Carlton match, "The Herald" described Windy Hill as "Boot Hill" because of the extent of the fights and the high number of reported players (eight in all – four from Carlton and four from Essendon). The peak of these incidents came in 1980, when new recruit Phil Carman made headlines for head-butting an umpire. The tribunal suspended him for sixteen weeks, and although most people thought this a fair (or even lenient) sentence, he took his case to the Supreme Court, gathering even more unwanted publicity for the club.

Despite this, the club had recruited many talented young players in the late 1970s who would emerge as club greats, among them Simon Madden, Tim Watson and Paul Van der Haar. Terry Daniher and his brother Neale came via a trade with South Melbourne, and Roger Merrett joined soon afterwards, forming the nucleus of what would become the formidable Essendon sides of the 1980s. This raw but talented group of youngsters took Essendon to an elimination final in 1979 under coach Barry Davis, but the side was again thrashed, this time at the hands of Fitzroy. Davis resigned at the end of the 1980 season after missing out on a finals appearance.
One of the few highlights for Essendon supporters during this time came when Graham Moss won the 1976 Brownlow Medal; he was the only Bomber to do so in the forty years from 1953 to 1993. Even that was bittersweet, as Moss then quit VFL football to return to his native Western Australia, where he finished his career as a player and coach at Claremont Football Club. In many ways, Moss's career reflects Essendon's mixed fortunes during the decade.
Former Richmond player Kevin Sheedy took over as senior coach in 1981. Under him Essendon reached the 1983 grand final, the club's first since 1968, but Hawthorn won by a then-record 83 points.
In 1984, Essendon won the pre-season competition and completed the regular season on top of the ladder. The club then played, and beat, Hawthorn in the 1984 VFL Grand Final to win its 13th premiership, its first since 1965. The teams met again in the 1985 Grand Final, which Essendon also won. At the start of 1986, Essendon were considered unbackable favourites for a third successive flag, but a succession of injuries to key players Paul Van der Haar (only fifteen games from 1986 to 1988), Tim Watson, Darren Williams, Roger Merrett and Simon Madden saw the club win only eight of its last eighteen games in 1986 and only nine games (plus a draw with Geelong) in 1987. In July 1987, the Bombers suffered a humiliation at the hands of Sydney, who fell just two points short of the then-highest score in VFL history.
In 1988, Essendon rebounded to sixth place with twelve wins, including a 140-point thrashing of Brisbane in which a record sixteen individual players kicked goals. In 1989, they improved further to finish second on the ladder with only five losses, and thrashed Geelong in the qualifying final. However, after a fiery encounter with Hawthorn ended in a convincing defeat, the Bombers were no match for Geelong the following week.
In 1990, Essendon were pace-setters almost from the start, but the disruption caused by the drawn qualifying final between Collingwood and West Coast proved a blow from which they never recovered. The Magpies comprehensively beat them in both the second semi final and the grand final.
Following the 1991 season, Essendon moved its home games from its traditional home ground at Windy Hill to the larger and newly renovated MCG. This move generated large increases in game attendance, membership and revenue for the club. The club's training and administrative base remained at Windy Hill until 2013.
Following the retirements of Tim Watson and Simon Madden in the early 1990s, the team was rebuilt around new players such as Gavin Wanganeen, Joe Misiti, Mark Mercuri, Michael Long, Dustin Fletcher (son of Ken) and James Hird, who was taken at No. 79 in the 1990 draft. This side became known as the "Baby Bombers", as its core was made up of young players early in their careers.
The team won the 1993 Grand Final against Carlton and that same year, Gavin Wanganeen won the Brownlow Medal, the first awarded to an Essendon player since 1976. Three years later, James Hird was jointly awarded the medal with Michael Voss of Brisbane.
In 2000, Essendon won 20 consecutive matches before losing to the Western Bulldogs in round 21. The team went on to win its 16th premiership, defeating Melbourne in the grand final and completing the most dominant single season in AFL/VFL history. The loss to the Bulldogs was Essendon's only defeat in the entire calendar year (the club also won the 2000 pre-season competition).
Essendon was less successful after 2001. Lucrative contracts given to a number of premiership players had placed serious pressure on the club's salary cap, forcing it to trade several key players: Blake Caracella, Chris Heffernan, Justin Blumfield, Gary Moorcroft and Damien Hardwick had all departed by the end of 2002, and in 2004 Mark Mercuri, Sean Wellman and Joe Misiti retired. The club remained competitive, but progressed no further than the second week of the finals in each of 2002, 2003 and 2004. Sheedy signed a new three-year contract at the end of 2004.
In 2005, Essendon missed the finals for the first time since 1997; and in 2006 the club suffered its worst season under Sheedy, and its worst for more than 70 years, finishing second-last with only three wins and a draw from twenty-two games. One of those wins came against defending premiers Sydney, with newly appointed captain Matthew Lloyd kicking eight goals. Lloyd had replaced James Hird as captain at the start of the season, but after Lloyd suffered a season-ending hamstring injury two weeks after that phenomenal performance against Sydney fullback Leo Barry, David Hille was appointed captain for the remainder of the season. The club improved its on-field position in 2007, but again missed the finals.
Sheedy's contract was not renewed after 2007, ending his 27-year tenure as Essendon coach. Matthew Knights replaced him and coached the club for three seasons, reaching the finals once, with an eighth-place finish in 2009 at the expense of reigning premiers Hawthorn. On 29 August 2010, shortly after the end of the 2010 home-and-away season, Knights was dismissed as coach.
On 28 September 2010, former captain James Hird was named as Essendon's new coach from 2011 on a four-year deal. Mark Thompson, a dual premiership-winning coach and a triple-premiership Essendon player, later joined Hird on the coaching panel. In his first season, Essendon finished eighth. The club started strongly in 2012, sitting fourth with a 10–3 record at the halfway mark of the season, but won only one more match for the year, finishing eleventh and missing the finals.
In 2013 the club moved its training and administrative base to the True Value Solar Centre, a new facility in the suburb of Melbourne Airport which it had developed in conjunction with the Australian Paralympic Committee. Essendon holds a 37-year lease at the facility, and maintains a lease at Windy Hill to use the venue for home matches for its reserves team in the Victorian Football League, and for a social club and merchandise store on the site.
During 2013, the club was investigated by the AFL and the Australian Sports Anti-Doping Authority (ASADA) over its 2012 player supplements and sports science program, specifically over allegations of the illegal use of peptide supplements. An internal review found the club to have "established a supplements program that was experimental, inappropriate and inadequately vetted and controlled", and on 27 August 2013 the club was found guilty of bringing the game into disrepute for this reason. Among its penalties, the club was fined A$2 million, stripped of early draft picks in the following two drafts, and forfeited its place in the 2013 finals series (having originally finished seventh on the ladder); Hird was suspended from coaching for twelve months. Several office-bearers also resigned their posts during the controversy, including chairman David Evans and CEO Ian Robson.
In the midst of the supplements saga, assistant coach Mark Thompson took over as senior coach for the 2014 season during Hird's suspension. He led the club back to the finals for a seventh-place finish, but in a tense elimination final against archrivals North Melbourne, the Bombers led by as many as 27 points at half time before a resurgent Kangaroos side came back to win by 12 points. After the 2014 season, Thompson left the club to make way for Hird's return to the senior coaching role.
In June 2014, thirty-four players were issued show-cause notices alleging the use of the banned peptide Thymosin beta-4 during the program. The players faced the AFL Anti-Doping Tribunal over the 2014–15 off-season, and on 31 March 2015 the tribunal returned a not-guilty verdict, determining that it was "not comfortably satisfied" that the players had been administered the peptide.
Hird returned as senior coach for the 2015 season, but after a strong start the club's form declined severely following the announcement that WADA would appeal the decision of the AFL Anti-Doping Tribunal. The effect of the appeal on the team's morale was devastating, and Essendon won only six games for the year. Under extreme pressure, Hird resigned on 18 August 2015 following a disastrous 112-point loss to Adelaide. Former West Coast Eagles premiership coach John Worsfold was appointed as the new senior coach on a three-year contract.
On 12 January 2016, the Court of Arbitration for Sport overturned the AFL anti-doping tribunal's decision, finding that 34 past and present Essendon players had taken the banned substance Thymosin beta-4. As a result, all 34 players, 12 of whom were still at the club, were given two-year suspensions, though each ban was effectively shortened because the players had already served provisional suspensions during the 2014–15 off-season.
Essendon therefore contested the 2016 season with twelve of its regular senior players under suspension. To allow the club to remain competitive, the AFL permitted Essendon to upgrade all five of its rookie-listed players and to sign an additional ten players to cover the loss of the suspended players for the season.
Due to this unprecedented situation, many in the football community predicted the club would go through the 2016 AFL season without a win; however, the Bombers managed three victories, in rounds 2, 21 and 23. The absence of the club's most experienced players also accelerated the development of its young players: Zach Merrett and Orazio Fantasia had breakout years, while Darcy Parish and Anthony McDonald-Tipungwuti impressed in their debut seasons. Merrett acted as captain in the side's round 21 win over the Suns. The club eventually finished on the bottom of the ladder, claiming its first wooden spoon since 1933.
Essendon made its final financial settlement related to the supplements saga in September 2017, just before the finals began. The club also improved vastly on its 2016 performance, finishing seventh in the home-and-away season and becoming the first team since West Coast in 2011 to go from wooden spooner to a finals appearance, though it lost its only final, to Sydney.
The 2017 season also saw the retirements of much-loved club legend and former captain Jobe Watson, midfielder Brent Stanton, and former Geelong star James Kelly, who later took up a development coaching role at the club. Midfielder Heath Hocking, who had played 126 games for the club, was delisted.
Expectations were high for the 2018 season after an outstanding off-season in which the club recruited Jake Stringer, Adam Saad and Devon Smith from the Western Bulldogs, Gold Coast Suns and Greater Western Sydney Giants respectively, signings expected to throw Essendon firmly into premiership contention.
After beating the previous year's runner-up, Adelaide (which went on to beat reigning premiers Richmond the following round), in round one, Essendon's form slumped severely; the side won only one game in the next seven rounds and lost to the then-winless Carlton in round eight. Senior assistant coach Mark Neeld was sacked by the club the following Monday.
The team's form then improved sharply, with wins against top-eight sides Geelong, GWS, eventual premiers West Coast and Sydney, and ten victories from the last 13 games of the season. The revival ultimately fell short, however: an eight-point loss to reigning premiers Richmond in round 22 ended any hopes of reaching the finals.
The 2018 season was capped off by the club not offering veteran Brendon Goddard a new contract for 2019.
Essendon acquired Dylan Shiel from Greater Western Sydney in one of the most high-profile trades of the 2018 AFL trade period. The Bombers showed inconsistent form throughout the 2019 season but qualified for the finals for the second time in three seasons, finishing eighth on the ladder with 12 wins and 10 losses. They were no match for the West Coast Eagles in the elimination final, however, losing by 55 points to end their season, a defeat that extended the club's drought of finals wins to 15 years; Essendon had not won a final since 2004.
Essendon's first recorded jumpers were navy blue ("The Footballers", edited by Thomas Power, 1875), although the club wore 'red and black caps and hose'. In 1877 "The Footballers" records the addition of 'a red sash over left shoulder', the first record of a red sash as part of the club jumper, and by 1878 there were newspaper reports referring to Essendon players as 'the men in the sash'.
Given that blue and navy blue were the most popular colours of the period, it is thought that Essendon adopted the red sash in 1877 to distinguish its players from others in similarly coloured jumpers.
In 2007, the AFL Commission laid down the requirement that all clubs produce an alternative jumper for use in matches where jumpers are considered to clash. From 2007 to 2011, the Essendon clash guernsey was the same design as its home guernsey but with a substantially wider sash, making the guernsey predominantly red rather than predominantly black. This was changed after 2011, when the AFL deemed that the wider sash did not provide sufficient contrast.
From 2012 to 2016, Essendon's clash guernsey was predominantly grey, with a red sash fimbriated in black; the grey field contained, in small print, the names of all Essendon premiership players.
Before the 2016 season, Essendon changed its clash guernsey to a predominantly red design with the sash in black. As on the grey jumper, the names of Essendon premiership players were printed outside the sash.
Following Adam Ramanauskas's personal battle with cancer, a "Clash for Cancer" match against Melbourne was launched in 2006 as a joint venture between Essendon and the Cancer Council of Victoria to raise funds for the organisation. Although a formal request to the AFL was denied, players wore yellow armbands for the match, which resulted in the club being fined $20,000. In 2007, the AFL agreed to allow yellow armbands to be incorporated into the left sleeve of the jumper. The "Clash for Cancer" match against Melbourne has become an annual event, although in 2012, 2013, 2014 and 2016 other clubs, including the Sydney Swans and Brisbane Lions, were the opponents instead of Melbourne. In 2009, the jumpers were auctioned along with the yellow boots worn by some players during the match.
The club's theme song, "See the Bombers Fly Up", is thought to have been written around 1959 by Kevin Andrews in the home of player Jeff Gamble, where Andrews was living at the time. The song is based on the tune of Johnnie Hamp's 1929 song "(Keep Your) Sunny Side Up", taken at an increased tempo. Gamble came up with the line 'See the bombers fly up, up', while Andrews contributed all or most of the rest. At the time, "(Keep Your) Sunny Side Up" was the theme song of "Sunnyside Up", a popular Melbourne-based TV show on Channel 7. The official version of the song was recorded in 1972 by the Fable Singers and is still used today.
As with the songs of all other AFL clubs, it is played before every match, and again at the conclusion of matches in which the team is victorious.
Songwriter Mike Brady, of "Up There Cazaly" fame, penned an updated version of the song in 1999, complete with a new verse arrangement, but it was not well received; it is, however, occasionally played at club functions.
The club's mascot, "Skeeta Reynolds", is a mosquito named after Dick Reynolds, created in honour of the club's back-to-back premiership sides of the 1920s, which were known as the "Mosquito Fleet". The mascot was first named through a competition run in the Bomber magazine, with "Skeeta" the winning entry; this was later extended to "Skeeta Reynolds". He appears as a red mosquito in an Essendon jumper and wears a red and black scarf.
Essendon has a four-way rivalry with Carlton, Collingwood and Richmond, the four being the biggest and most supported clubs in Victoria. Matches between these clubs are often close regardless of form and ladder position, and if out of the race themselves, each of the four is eager to deny the others a finals spot or a premiership. Essendon also has a fierce rivalry with Hawthorn stemming from the 1980s, one made even more heated when Matthew Lloyd knocked out Brad Sewell with a bump, sparking an all-in brawl between the two sides.
Lindsay Tanner has served as chairman of the board since late 2015.
Essendon's board members are Paul Brasher, Melissa Verner Green, Sean Wellman, David Barham, Catherine Lio, Ken Lay, Simon Madden, Andrew Muir and Lindsay Tanner.
The club's apparel is currently produced by Under Armour. The club's apparel has also been produced by Reebok, Fila, Puma, Adidas and ISC.
"See Essendon Football Club honours."
To celebrate the 125th anniversary of the club, as well as 100 years of the VFL/AFL, Essendon announced its "Team of the Century" in 1997.
In 2002, a club panel chose and ranked the 25 greatest players to have played for Essendon.
The Essendon reserves team first competed in the Victorian Football League's reserves competition when that competition was established in 1919. The team won eight premierships between 1919 and 1999, the last of them in 1999, the final season of the competition (by then known as the Victorian State Football League). From 2000 until 2002, the club's reserves team competed in the new Victorian Football League competition.
At the end of 2002, the club dissolved its reserves team and established a reserves affiliation with the Bendigo Football Club in the VFL. The affiliation ran for ten years from 2003 until 2012, allowing reserves players from the Essendon list to play with Bendigo.
The club re-established its reserves team in 2013, seeking greater developmental autonomy. The reserves team has since competed in the VFL. The team plays its home games at Windy Hill. The team is made up of AFL senior listed players and VFL contracted players.
Essendon also has a women's team that competes in the women's VFL. In December 2017, the team announced the recruitment of two daughters of Essendon AFL legends: Stephanie Hird (daughter of James) and Michaela Long (daughter of Michael).
The same month, Essendon also entered esports by acquiring the Australian "League of Legends" team Abyss ESports, making them the second AFL team to acquire an esports division, after Adelaide acquired Legacy ESports in May.
In 2018, the Essendon Football Club, along with four other AFL clubs, entered the Victorian Wheelchair Football League.
Enid Blyton
Enid Mary Blyton (11 August 1897 – 28 November 1968) was an English children's writer whose books have been among the world's best-sellers since the 1930s, selling more than 600 million copies. Blyton's books are still enormously popular, and have been translated into 90 languages. She wrote on a wide range of topics including education, natural history, fantasy, mystery, and biblical narratives and is best remembered today for her Noddy, Famous Five, Malory Towers and Secret Seven series.
Her first book, "Child Whispers", a 24-page collection of poems, was published in 1922. Following the commercial success of her early novels such as "Adventures of the Wishing-Chair" (1937) and "The Enchanted Wood" (1939), Blyton went on to build a literary empire, sometimes producing fifty books a year in addition to her prolific magazine and newspaper contributions. Her writing was unplanned and sprang largely from her unconscious mind: she typed her stories as events unfolded before her. The sheer volume of her work and the speed with which it was produced led to rumours that Blyton employed an army of ghost writers, a charge she vigorously denied.
Blyton's work became increasingly controversial among literary critics, teachers and parents from the 1950s onwards because of the allegedly unchallenging nature of her writing and the themes of her books, particularly the Noddy series. Some libraries and schools banned her works, and the BBC refused to broadcast them from the 1930s until the 1950s because they were perceived to lack literary merit. Her books have been criticised as elitist, sexist, racist, xenophobic and at odds with the more progressive environment that emerged in post-Second World War Britain, but they have continued to be best-sellers since her death in 1968.
She felt she had a responsibility to provide her readers with a strong moral framework, so she encouraged them to support worthy causes. In particular, through the clubs she set up or supported, she encouraged and organised them to raise funds for animal and paediatric charities. The story of Blyton's life was dramatised in a BBC film entitled "Enid", featuring Helena Bonham Carter in the title role and first broadcast in the United Kingdom on BBC Four in 2009. There have also been several adaptations of her books for stage, screen and television.
Enid Blyton was born on 11 August 1897 in East Dulwich, South London, the eldest of the three children of Thomas Carey Blyton (1870–1920), a cutlery salesman (the 1911 census records his occupation as a mantle manufacturer and a dealer in women's suits, skirts, etc.), and his wife Theresa Mary ("née" Harrison; 1874–1950). Enid's younger brothers, Hanly (1899–1983) and Carey (1902–1976), were born after the family had moved to a semi-detached villa in Beckenham, then a village in Kent. A few months after her birth Enid almost died from whooping cough, but was nursed back to health by her father, whom she adored. Thomas Blyton ignited Enid's interest in nature; in her autobiography she wrote that he "loved flowers and birds and wild animals, and knew more about them than anyone I had ever met". He also passed on his interest in gardening, art, music, literature and the theatre, and the pair often went on nature walks, much to the disapproval of Enid's mother, who showed little interest in her daughter's pursuits. Enid was devastated when he left the family shortly after her thirteenth birthday to live with another woman. Enid and her mother did not have a good relationship, and she did not attend either of her parents' funerals.
From 1907 to 1915 Blyton attended St Christopher's School in Beckenham, where she enjoyed physical activities and became school tennis champion and captain of lacrosse. She was not so keen on all the academic subjects but excelled in writing, and in 1911 she entered Arthur Mee's children's poetry competition. Mee offered to print her verses, encouraging her to produce more. Blyton's mother considered her efforts at writing to be a "waste of time and money", but she was encouraged to persevere by Mabel Attenborough, the aunt of school friend Mary Potter.
Blyton's father taught her to play the piano, which she mastered well enough for him to believe that she might follow in his sister's footsteps and become a professional musician. Blyton considered enrolling at the Guildhall School of Music, but decided she was better suited to becoming a writer. After finishing school in 1915 as head girl, she moved out of the family home to live with her friend Mary Attenborough, before going to stay with George and Emily Hunt at Seckford Hall near Woodbridge in Suffolk. Seckford Hall, with its allegedly haunted room and secret passageway, provided inspiration for her later writing. At Woodbridge Congregational Church Blyton met Ida Hunt, who taught at Ipswich High School and suggested that Blyton train there as a teacher. Blyton was introduced to the children at the nursery school and, recognising her natural affinity with them, enrolled in a National Froebel Union teacher training course at the school in September 1916. By this time she had almost ceased contact with her family.
Blyton's manuscripts had been rejected by publishers on many occasions, which only made her more determined to succeed: "it is partly the struggle that helps you so much, that gives you determination, character, self-reliance – all things that help in any profession or trade, and most certainly in writing". In March 1916 her first poems were published in "Nash's Magazine". She completed her teacher training course in December 1918, and the following month obtained a teaching appointment at Bickley Park School, a small independent establishment for boys in Bickley, Kent. Two months later Blyton received a teaching certificate with distinctions in zoology and principles of education; 1st class passes in botany, geography, practice and history of education, child hygiene and class teaching; and a 2nd class pass in literature and elementary mathematics. In 1920 she moved to Southernhay, in Hook Road, Surbiton, as nursery governess to the four sons of architect Horace Thompson and his wife Gertrude, with whom Blyton spent four happy years. Owing to a shortage of schools in the area, her charges were soon joined by the children of neighbours, and a small school developed at the house.
In 1920 Blyton relocated to Chessington, and began writing in her spare time. The following year she won the "Saturday Westminster Review" writing competition with her essay "On the Popular Fallacy that to the Pure All Things are Pure". Publications such as "The Londoner", "Home Weekly" and "The Bystander" began to show an interest in her short stories and poems.
Blyton's first book, "Child Whispers", a 24-page collection of poems, was published in 1922. It was illustrated by a schoolfriend, Phyllis Chase, who collaborated on several of her early works. Also in that year Blyton began writing in annuals for Cassell and George Newnes, and her first piece of writing, "Peronel and his Pot of Glue", was accepted for publication in "Teachers' World". Her success was boosted in 1923 when her poems were published alongside those of Rudyard Kipling, Walter de la Mare and G. K. Chesterton in a special issue of "Teachers' World". Blyton's educational texts were quite influential in the 1920s and '30s, her most sizeable being the three-volume "The Teacher's Treasury" (1926), the six-volume "Modern Teaching" (1928), the ten-volume "Pictorial Knowledge" (1930), and the four-volume "Modern Teaching in the Infant School" (1932).
In July 1923 Blyton published "Real Fairies", a collection of thirty-three poems written especially for the book with the exception of "Pretending", which had appeared earlier in "Punch" magazine. The following year she published "The Enid Blyton Book of Fairies", illustrated by Horace J. Knowles, and in 1926 the "Book of Brownies". Several books of plays appeared in 1927, including "A Book of Little Plays" and "The Play's the Thing" with the illustrator Alfred Bestall.
In the 1930s Blyton developed an interest in writing stories related to various myths, including those of ancient Greece and Rome; "The Knights of the Round Table", "Tales of Ancient Greece" and "Tales of Robin Hood" were published in 1930. In "Tales of Ancient Greece" Blyton retold sixteen well-known ancient Greek myths, but used the Latin rather than the Greek names of deities and invented conversations between the characters. "The Adventures of Odysseus", "Tales of the Ancient Greeks and Persians" and "Tales of the Romans" followed in 1934.
The first of twenty-eight books in Blyton's Old Thatch series, "The Talking Teapot and Other Tales", was published in 1934, the same year as "Brer Rabbit Retold" (the Brer Rabbit character originally featured in the Uncle Remus stories of Joel Chandler Harris); her first serial story and first full-length book, "Adventures of the Wishing-Chair", followed in 1937. "The Enchanted Wood", the first book in the Faraway Tree series, published in 1939, is about a magic tree inspired by the Norse mythology that had fascinated Blyton as a child. According to Blyton's daughter Gillian, the inspiration for the magic tree came from "thinking up a story one day and suddenly she was walking in the enchanted wood and found the tree. In her imagination she climbed up through the branches and met Moon-Face, Silky, the Saucepan Man and the rest of the characters. She had all she needed." As in the Wishing-Chair series, these fantasy books typically involve children being transported into a magical world in which they meet fairies, goblins, elves, pixies and other mythological creatures.
Blyton's first full-length adventure novel, "The Secret Island", was published in 1938, featuring the characters of Jack, Mike, Peggy and Nora. Described by "The Glasgow Herald" as a "Robinson Crusoe-style adventure on an island in an English lake", "The Secret Island" was a lifelong favourite of Gillian's and spawned the Secret series. The following year Blyton released her first book in the Circus series and her initial book in the Amelia Jane series, "Naughty Amelia Jane!" According to Gillian the main character was based on a large handmade doll given to her by her mother on her third birthday.
During the 1940s Blyton became a prolific author, her success enhanced by her "marketing, publicity and branding that was far ahead of its time". In 1940 Blyton published two books – "Three Boys and a Circus" and "Children of Kidillin" – under the pseudonym of Mary Pollock (middle name plus first married name), in addition to the eleven published under her own name that year. So popular were Pollock's books that one reviewer was prompted to observe that "Enid Blyton had better look to her laurels". But Blyton's readers were not so easily deceived and many complained about the subterfuge to her and her publisher, with the result that all six books published under the name of Mary Pollock – two in 1940 and four in 1943 – were reissued under Blyton's name. Later in 1940 Blyton published the first of her boarding school story books and the first novel in the Naughtiest Girl series, "The Naughtiest Girl in the School", which followed the exploits of the mischievous schoolgirl Elizabeth Allen at the fictional Whyteleafe School. The first of her six novels in the St. Clare's series, "The Twins at St. Clare's", appeared the following year, featuring the twin sisters Patricia and Isabel O'Sullivan.
In 1942 Blyton released the first book in the Mary Mouse series, "Mary Mouse and the Dolls' House", about a mouse exiled from her mousehole who becomes a maid at a dolls' house. Twenty-three books in the series were produced between 1942 and 1964; 10,000 copies were sold in 1942 alone. The same year, Blyton published the first novel in the Famous Five series, "Five on a Treasure Island", with illustrations by Eileen Soper. Its popularity resulted in twenty-one books between then and 1963, and the characters of Julian, Dick, Anne, George (Georgina) and Timmy the dog became household names in Britain. Matthew Grenby, author of "Children's Literature", states that the five were involved with "unmasking hardened villains and solving serious crimes", although the novels were "hardly 'hard-boiled' thrillers". Blyton based the character of Georgina, a tomboy she described as "short-haired, freckled, sturdy, and snub-nosed" and "bold and daring, hot-tempered and loyal", on herself.
Blyton had an interest in biblical narratives, and retold Old and New Testament stories. "The Land of Far-Beyond" (1942) is a Christian parable along the lines of John Bunyan's "The Pilgrim's Progress" (1678), with contemporary children as the main characters. In 1943 she published "The Children's Life of Christ", a collection of fifty-nine short stories related to the life of Jesus, with her own slant on popular biblical stories, from the Nativity and the Three Wise Men through to the trial, the crucifixion and the resurrection. "Tales from the Bible" was published the following year, followed by "The Boy with the Loaves and Fishes" in 1948.
The first book in Blyton's Five Find-Outers series, "The Mystery of the Burnt Cottage", was published in 1943, as was the second book in the Faraway Tree series, "The Magic Faraway Tree", which in 2003 was voted 66th in the BBC's Big Read poll to find the UK's favourite book. Several of Blyton's works during this period have seaside themes: "John Jolly by the Sea" (1943), a picture book intended for younger readers, was published in a booklet format by Evans Brothers. Other books with a maritime theme include "The Secret of Cliff Castle" and "Smuggler Ben", both attributed to Mary Pollock in 1943; "The Island of Adventure", the first in the Adventure series of eight novels from 1944 onwards; and various novels of the Famous Five series, such as "Five on a Treasure Island" (1942), "Five on Kirrin Island Again" (1947) and "Five Go Down to the Sea" (1953).
Capitalising on her success, with a loyal and ever-growing readership, Blyton added a new book to many of her series, such as the Famous Five, the Five Find-Outers and St. Clare's, every year, in addition to many other novels, short stories and books. In 1946 she launched the first in the Malory Towers series of six books based around the schoolgirl Darrell Rivers, "First Term at Malory Towers", which became extremely popular, particularly with girls.
The first book in Blyton's Barney Mysteries series, "The Rockingdown Mystery", was published in 1949, as was the first of her fifteen Secret Seven novels. The Secret Seven Society consists of Peter, his sister Janet, and their friends Colin, George, Jack, Pam and Barbara, who meet regularly in a shed in the garden to discuss peculiar events in their local community. Blyton rewrote the stories so they could be adapted into cartoons, which appeared in "Mickey Mouse Weekly" in 1951 with illustrations by George Brook. The French author Evelyne Lallemand continued the series in the 1970s, producing an additional twelve books, nine of which were translated into English by Anthea Bell between 1983 and 1987.
Blyton's Noddy, about a little wooden boy from Toyland, first appeared in the "Sunday Graphic" on 5 June 1949, and in November that year "Noddy Goes to Toyland", the first of at least two dozen books in the series, was published. The idea was conceived by one of Blyton's publishers, Sampson, Low, Marston and Company, who in 1949 arranged a meeting between Blyton and the Dutch illustrator Harmsen van der Beek. Despite having to communicate via an interpreter, he provided some initial sketches of how Toyland and its characters would be represented. Four days after the meeting Blyton sent the text of the first two Noddy books to her publisher, to be forwarded to van der Beek. The Noddy books became one of her most successful and best-known series, and were hugely popular in the 1950s. An extensive range of sub-series, spin-offs and strip books were produced throughout the decade, including "Noddy's Library", "Noddy's Garage of Books", "Noddy's Castle of Books", "Noddy's Toy Station of Books" and "Noddy's Shop of Books".
In 1950 Blyton established the company Darrell Waters Ltd to manage her affairs. By the early 1950s she had reached the peak of her output, often publishing more than fifty books a year, and she remained extremely prolific throughout much of the decade. By 1955 Blyton had written her fourteenth Famous Five novel, "Five Have Plenty of Fun", her fifteenth Mary Mouse book, "Mary Mouse in Nursery Rhyme Land", her eighth book in the Adventure series, "The River of Adventure", and her seventh Secret Seven novel, "Secret Seven Win Through". She completed the sixth and final book of the Malory Towers series, "Last Term at Malory Towers", in 1951.
Blyton published several further books featuring the character of Scamp the terrier, following on from "The Adventures of Scamp", a novel she had released in 1943 under the pseudonym of Mary Pollock. "Scamp Goes on Holiday" (1952) and "Scamp and Bimbo", "Scamp at School", "Scamp and Caroline" and "Scamp Goes to the Zoo" (1954) were illustrated by Pierre Probst. She introduced the character of Bom, a stylish toy drummer dressed in a bright red coat and helmet, alongside Noddy in "TV Comic" in July 1956. A book series began the same year with "Bom the Little Toy Drummer", featuring illustrations by R. Paul-Hoye, and followed with "Bom and His Magic Drumstick" (1957), "Bom Goes Adventuring" and "Bom Goes to Ho Ho Village" (1958), "Bom and the Clown" and "Bom and the Rainbow" (1959) and "Bom Goes to Magic Town" (1960). In 1958 she produced two annuals featuring the character, the first of which included twenty short stories, poems and picture strips.
Many of Blyton's series, including Noddy and The Famous Five, continued to be successful in the 1960s; by 1962, 26 million copies of Noddy had been sold. Blyton concluded several of her long-running series in 1963, publishing the last books of The Famous Five ("Five Are Together Again") and The Secret Seven ("Fun for the Secret Seven"); she also produced three more Brer Rabbit books with the illustrator Grace Lodge: "Brer Rabbit Again", "Brer Rabbit Book", and "Brer Rabbit's a Rascal". In 1962 many of her books were among the first to be published by Armada Books in paperback, making them more affordable to children.
After 1963 Blyton's output was generally confined to short stories and books intended for very young readers, such as "Learn to Count with Noddy" and "Learn to Tell Time with Noddy" in 1965, and "Stories for Bedtime" and the Sunshine Picture Story Book collection in 1966. Her declining health and a falling off in readership among older children have been put forward as the principal reasons for this change in trend. Blyton published her last book in the Noddy series, "Noddy and the Aeroplane", in February 1964. In May the following year she published "Mixed Bag", a song book with music written by her nephew Carey, and in August she released her last full-length books, "The Man Who Stopped to Help" and "The Boy Who Came Back".
Blyton cemented her reputation as a children's writer when in 1926 she took over the editing of "Sunny Stories", a magazine that typically included the re-telling of legends, myths, stories and other articles for children. That same year she was given her own column in "Teachers' World", entitled "From my Window". Three years later she began contributing a weekly page in the magazine, in which she published letters from her fox terrier dog Bobs. They proved to be so popular that in 1933 they were published in book form as "Letters from Bobs", and sold ten thousand copies in the first week. Her most popular feature was "Round the Year with Enid Blyton", which consisted of forty-eight articles covering aspects of natural history such as weather, pond life, how to plant a school garden and how to make a bird table. Among Blyton's other nature projects was her monthly "Country Letter" feature that appeared in "The Nature Lover" magazine in 1935.
"Sunny Stories" was renamed "Enid Blyton's Sunny Stories" in January 1937, and served as a vehicle for the serialisation of Blyton's books. Her first Naughty Amelia Jane story, about an anti-heroine based on a doll owned by her daughter Gillian, was published in the magazine. Blyton stopped contributing in 1952, and it closed down the following year, shortly before the appearance of the new fortnightly "Enid Blyton Magazine" written entirely by Blyton. The first edition appeared on 18 March 1953, and the magazine ran until September 1959.
Noddy made his first appearance in the "Sunday Graphic" in 1949, the same year as Blyton's first daily Noddy strip for the London "Evening Standard". It was illustrated by van der Beek until his death in 1953.
Blyton worked in a wide range of fictional genres, from fairy tales to animal, nature, detective, mystery, and circus stories, but she often "blurred the boundaries" in her books, and encompassed a range of genres even in her short stories. In a 1958 article published in "The Author", she wrote that there were a "dozen or more different types of stories for children", and she had tried them all, but her favourites were those with a family at their centre.
Blyton described her writing technique in a series of letters to the psychologist Peter McKellar.
In one of these letters she describes how in just five days she wrote the 60,000-word book "The River of Adventure", the eighth in her Adventure series, by listening to what she referred to as her "under-mind", which she contrasted with her "upper conscious mind". Blyton was unwilling to conduct any research or planning before beginning work on a new book, which, coupled with the lack of variety in her life, according to Druce almost inevitably created the danger that she might unconsciously plagiarise the books she had read, including her own; in Druce's view, she clearly did. Gillian recalled that her mother "never knew where her stories came from", but that she used to talk about them "coming from her 'mind's eye'", as did William Wordsworth and Charles Dickens. Blyton had thought it "was made up of every experience she'd ever had, everything she'd seen or heard or read, much of which had long disappeared from her conscious memory", but she never knew the direction her stories would take. She explained further in her autobiography: "If I tried to think out or invent the whole book, I could not do it. For one thing, it would bore me and for another, it would lack the 'verve' and the extraordinary touches and surprising ideas that flood out from my imagination."
Blyton's daily routine varied little over the years. She usually began writing soon after breakfast, with her portable typewriter on her knee and her favourite red Moroccan shawl nearby; she believed that the colour red acted as a "mental stimulus" for her. Stopping only for a short lunch break she continued writing until five o'clock, by which time she would usually have produced 6,000–10,000 words.
A 2000 article in "The Malay Mail" considers the children in Blyton's books to have "lived in a world shaped by the realities of post-war austerity", enjoying freedom without the political correctness of today, which offers modern readers of Blyton's novels a form of escapism. Brandon Robshaw of "The Independent" refers to the Blyton universe as "crammed with colour and character", "self-contained and internally consistent", noting that Blyton exemplifies a strong mistrust of adults and figures of authority in her works, creating a world in which children govern. Gillian noted that in her mother's adventure, detective and school stories for older children, "the hook is the strong storyline with plenty of cliffhangers, a trick she acquired from her years of writing serialised stories for children's magazines. There is always a strong moral framework in which bravery and loyalty are (eventually) rewarded". Blyton herself wrote that "my love of children is the whole foundation of all my work".
Victor Watson, Assistant Director of Research at Homerton College, Cambridge, believes that Blyton's works reveal an "essential longing and potential associated with childhood", and notes how the opening pages of "The Mountain of Adventure" present a "deeply appealing ideal of childhood". He argues that Blyton's work differs from that of many other authors in its approach, describing the narrative of The Famous Five series for instance as "like a powerful spotlight, it seeks to illuminate, to explain, to demystify. It takes its readers on a roller-coaster story in which the darkness is always banished; everything puzzling, arbitrary, evocative is either dismissed or explained". Watson further notes how Blyton often used minimalist visual descriptions and introduced a few careless phrases such as "gleamed enchantingly" to appeal to her young readers.
From the mid-1950s rumours began to circulate that Blyton had not written all the books attributed to her, a charge she found particularly distressing. She published an appeal in her magazine asking children to let her know if they heard such stories and, after one mother informed her that she had attended a parents' meeting at her daughter's school during which a young librarian had repeated the allegation, Blyton decided in 1955 to begin legal proceedings. The librarian was eventually forced to make a public apology in open court early the following year, but the rumours that Blyton operated "a 'company' of ghost writers" persisted, as some found it difficult to believe that one woman working alone could produce such a volume of work.
Blyton's Conservative personal politics were often on view in her fiction. In "The Mystery of the Missing Necklace" (a Five Find-Outers instalment), she uses the character of young Elizabeth ("Bets") to deliver a statement praising Winston Churchill, describing the politician as a "statesman".
Blyton felt a responsibility to provide her readers with a positive moral framework, and she encouraged them to support worthy causes. Her view, expressed in a 1957 article, was that children should help animals and other children rather than adults.
Blyton and the members of the children's clubs she promoted via her magazines raised a great deal of money for various charities; according to Blyton, membership of her clubs meant "working for others, for no reward". The largest of the clubs she was involved with was the Busy Bees, the junior section of the People's Dispensary for Sick Animals, which Blyton had actively supported since 1933. The club had been set up by Maria Dickin in 1934, and after Blyton publicised its existence in the "Enid Blyton Magazine" it attracted 100,000 members in three years. Such was Blyton's popularity among children that after she became Queen Bee in 1952 more than 20,000 additional members were recruited in her first year in office. The Enid Blyton Magazine Club was formed in 1953. Its primary objective was to raise funds to help those children with cerebral palsy who attended a centre in Cheyne Walk, in Chelsea, London, by furnishing an on-site hostel among other things.
The Famous Five series gathered such a following that readers asked Blyton if they might form a fan club. She agreed, on condition that it serve a useful purpose, and suggested that it could raise funds for the Shaftesbury Society Babies' Home in Beaconsfield, on whose committee she had served since 1948. The club was established in 1952, and provided funds for equipping a Famous Five Ward at the home, a paddling pool, sun room, summer house, playground, birthday and Christmas celebrations, and visits to the pantomime. By the late 1950s Blyton's clubs had a membership of 500,000, and raised £35,000 in the six years of the "Enid Blyton Magazine"'s run.
By 1974 the Famous Five Club had a membership of 220,000, and was growing at the rate of 6,000 new members a year. The Beaconsfield home it was set up to support closed in 1967, but the club continued to raise funds for other paediatric charities, including an Enid Blyton bed at Great Ormond Street Hospital and a mini-bus for disabled children at Stoke Mandeville Hospital.
Blyton capitalised upon her commercial success as an author by negotiating agreements with jigsaw puzzle and games manufacturers from the late 1940s onwards; by the early 1960s some 146 different companies were involved in merchandising Noddy alone. In 1948 Bestime released four jigsaw puzzles featuring her characters, and the first Enid Blyton board game appeared, "Journey Through Fairyland", created by BGL. The first card game, Faraway Tree, appeared from Pepys in 1950. In 1954 Bestime released the first four jigsaw puzzles of the Secret Seven, and the following year a Secret Seven card game appeared.
Bestime released the Little Noddy Car Game in 1953 and the Little Noddy Leap Frog Game in 1955, and in 1956 American manufacturer Parker Brothers released Little Noddy's Taxi Game, a board game which features Noddy driving about town, picking up various characters. Bestime released its Plywood Noddy Jigsaws series in 1957 and a Noddy jigsaw series featuring cards appeared from 1963, with illustrations by Robert Lee. Arrow Games became the chief producer of Noddy jigsaws in the late 1970s and early 1980s. Whitman manufactured four new Secret Seven jigsaw puzzles in 1975, and produced four new Malory Towers ones two years later. In 1979 the company released a Famous Five adventure board game, Famous Five Kirrin Island Treasure. Stephen Thraves wrote eight Famous Five adventure game books, published by Hodder & Stoughton in the 1980s. The first adventure game book of the series, "The Wreckers' Tower Game", was published in October 1984.
On 28 August 1924 Blyton married Major Hugh Alexander Pollock, DSO (1888–1971), at Bromley Register Office, without inviting her family. They married shortly after his divorce from his first wife, with whom he had two sons, one of them already deceased. Pollock was editor of the book department in the publishing firm of George Newnes, which became her regular publisher. It was he who requested that Blyton write a book about animals, "The Zoo Book", which was completed in the month before they married. They initially lived in a flat in Chelsea before moving to Elfin Cottage in Beckenham in 1926, and then to Old Thatch in Bourne End (called Peterswood in her books) in 1929. Blyton's first daughter, Gillian, was born on 15 July 1931, and after a miscarriage in 1934 she gave birth to a second daughter, Imogen, on 27 October 1935.
In 1938 Blyton and her family moved to a house in Beaconsfield, named Green Hedges by Blyton's readers following a competition in her magazine. By the mid-1930s Pollock had withdrawn increasingly from public life and become a secret alcoholic, possibly because his meetings as a publisher with Winston Churchill had revived the trauma he suffered during the First World War. With the outbreak of the Second World War he became involved in the Home Guard, and renewed his acquaintance with Ida Crowe, an aspiring writer nineteen years his junior whom he had met years before. He offered her a post as secretary at his Home Guard training centre at Denbies, a Gothic mansion in Surrey belonging to Lord Ashcombe, and they entered into a romantic relationship. Blyton's marriage to Pollock had been troubled for years, and according to Crowe's memoir Blyton had begun a series of affairs, including a lesbian relationship with one of the children's nannies. In 1941 Blyton met Kenneth Fraser Darrell Waters, a London surgeon with whom she began a serious affair. Pollock discovered the liaison and threatened to initiate divorce proceedings against Blyton; because she feared that exposure of her adultery would ruin her public image, it was ultimately agreed that Blyton would instead file for divorce against Pollock. According to Crowe's memoir, Blyton promised that if Pollock admitted to infidelity she would allow him parental access to their daughters, but after the divorce he was forbidden to contact them, and Blyton ensured he was subsequently unable to find work in publishing. Pollock married Crowe on 26 October 1943, eventually resumed his heavy drinking, and was forced to petition for bankruptcy in 1950.
Blyton and Darrell Waters married at the City of Westminster Register Office on 20 October 1943. She changed the surname of her daughters to Darrell Waters and publicly embraced her new role as a happily married and devoted doctor's wife. After discovering she was pregnant in the spring of 1945, Blyton miscarried five months later, following a fall from a ladder. The baby would have been Darrell Waters's first child, and the son for whom both of them longed.
She was also a keen tennis player who sometimes played in the nude, nude tennis being "a common practice in those days among the more louche members of the middle classes".
Blyton's health began to deteriorate in 1957, when during a round of golf she started to complain of feeling faint and breathless, and by 1960 she was displaying signs of dementia. Her agent George Greenfield recalled that it was "unthinkable" for the "most famous and successful of children's authors with her enormous energy and computer-like memory" to be losing her mind and suffering from what is now known as Alzheimer's disease in her mid-sixties. Blyton's situation was worsened by her husband's declining health throughout the 1960s; he suffered from severe arthritis in his neck and hips, deafness, and became increasingly ill-tempered and erratic until his death on 15 September 1967.
The story of Blyton's life was dramatised in a BBC film entitled "Enid", which aired in the United Kingdom on BBC Four on 16 November 2009. Helena Bonham Carter, who played the title role, described Blyton as "a complete workaholic, an achievement junkie and an extremely canny businesswoman" who "knew how to brand herself, right down to the famous signature".
During the months following her husband's death, Blyton became increasingly ill and moved into a nursing home three months before her death. She died at the Greenways Nursing Home, Hampstead, North London, on 28 November 1968, aged 71. A memorial service was held at St James's Church, Piccadilly and she was cremated at Golders Green Crematorium, where her ashes remain. Blyton's home, Green Hedges, was auctioned on 26 May 1971 and demolished in 1973; the site is now occupied by houses and a street named Blyton Close. An English Heritage blue plaque commemorates Blyton at Hook Road in Chessington, where she lived from 1920 to 1924. In 2014, a plaque recording her time as a Beaconsfield resident from 1938 until her death in 1968 was unveiled in the town hall gardens, next to small iron figures of Noddy and Big Ears.
Since her death and the publication of her daughter Imogen's 1989 autobiography, "A Childhood at Green Hedges", Blyton has emerged as an emotionally immature, unstable and often malicious figure. Imogen considered her mother to be "arrogant, insecure, pretentious, very skilled at putting difficult or unpleasant things out of her mind, and without a trace of maternal instinct. As a child, I viewed her as a rather strict authority. As an adult I pitied her." Blyton's eldest daughter, Gillian, remembered her rather differently, however, as "a fair and loving mother, and a fascinating companion".
The Enid Blyton Trust for Children was established in 1982, with Imogen as its first chairman, and in 1985 it established the National Library for the Handicapped Child. "Enid Blyton's Adventure Magazine" began publication in September 1985 and, on 14 October 1992, the BBC began publishing "Noddy Magazine" and released the Noddy CD-Rom in October 1996.
The first Enid Blyton Day was held at Rickmansworth on 6 March 1993 and, in October 1996, the Enid Blyton award, The Enid, was given to those who have made outstanding contributions to children. The Enid Blyton Society was formed in early 1995, to provide "a focal point for collectors and enthusiasts of Enid Blyton" through its thrice-annual "Enid Blyton Society Journal", its annual Enid Blyton Day and its website. On 16 December 1996, Channel 4 broadcast a documentary about Blyton, "Secret Lives". To celebrate her centenary in 1997, exhibitions were put on at the London Toy & Model Museum (now closed), Hereford and Worcester County Museum and Bromley Library and, on 9 September, the Royal Mail issued centenary stamps.
The London-based entertainment and retail company Trocadero plc purchased Blyton's Darrell Waters Ltd in 1995 for £14.6 million and established a subsidiary, Enid Blyton Ltd, to handle all intellectual properties, character brands and media in Blyton's works. The group changed its name to Chorion in 1998 but, after financial difficulties in 2012, sold its assets. Hachette UK acquired from Chorion world rights in the Blyton estate in March 2013, including The Famous Five series but excluding the rights to Noddy, which had been sold to DreamWorks Classics (formerly Classic Media, now a subsidiary of DreamWorks Animation) in 2012.
Blyton's granddaughter, Sophie Smallwood, wrote a new Noddy book to celebrate the character's 60th birthday, 46 years after the last book was published; "Noddy and the Farmyard Muddle" (2009) was illustrated by Robert Tyndall. In February 2011, the manuscript of a previously unknown Blyton novel, "Mr Tumpy's Caravan", was discovered by the archivist at Seven Stories, National Centre for Children's Books in a collection of papers belonging to Blyton's daughter Gillian, purchased by Seven Stories in 2010 following her death. It was initially thought to belong to a comic strip collection of the same name published in 1949, but it appears to be unrelated and is believed to be something written in the 1930s, which had been rejected by a publisher.
In a 1982 survey of 10,000 eleven-year-old children, Blyton was voted their most popular writer. She is the world's fourth most-translated author, behind Agatha Christie, Jules Verne and William Shakespeare, with her books translated into 90 languages. From 2000 to 2010, Blyton was listed as a Top Ten author, selling almost 8 million copies (worth £31.2 million) in the UK alone. In 2003, "The Magic Faraway Tree" was voted 66th in the BBC's Big Read. In the 2008 Costa Book Awards, Blyton was voted Britain's best-loved author. Her books continue to be very popular among children in Commonwealth nations such as India, Pakistan, Sri Lanka, Singapore, Malta, New Zealand and Australia, and around the world. They have also seen a surge of popularity in China, where they are "big with every generation". In March 2004, Chorion and the Chinese publisher Foreign Language Teaching and Research Press negotiated an agreement over the Noddy franchise, which included bringing the character to an animated series on television, with a potential audience of a further 95 million children under the age of five. Chorion spent around £10 million digitising Noddy and, as of 2002, had made television agreements with at least 11 countries worldwide.
Novelists influenced by Blyton include the crime writer Denise Danks, whose fictional detective Georgina Powers is based on George from the Famous Five. Peter Hunt's "A Step off the Path" (1985) is also influenced by the Famous Five, and the St. Clare's and Malory Towers series provided the inspiration for Jacqueline Wilson's "Double Act" (1996) and Adèle Geras's Egerton Hall trilogy (1990–92) respectively.
A.H. Thompson, who compiled an extensive overview of censorship efforts in the United Kingdom's public libraries, dedicated an entire chapter of his 1975 study to "The Enid Blyton Affair".
Blyton's range of plots and settings has been described as limited, repetitive and continually recycled. Many of her books were critically assessed by teachers and librarians, deemed unfit for children to read, and removed from syllabuses and public libraries. Responding to claims that her moral views were "dependably predictable", Blyton commented that "most of you could write down perfectly correctly all the things that I believe in and stand for – you have found them in my books, and a writer's books are always a faithful reflection of himself".
From the 1930s to the 1950s the BBC operated a "de facto" ban on dramatising Blyton's books for radio, considering her to be a "second-rater" whose work was without literary merit. The children's literary critic Margery Fisher likened Blyton's books to "slow poison", and Jean E. Sutcliffe of the BBC's schools broadcast department wrote of Blyton's ability to churn out "mediocre material", noting that "her capacity to do so amounts to genius ... anyone else would have died of boredom long ago". Michael Rosen, Children's Laureate from 2007 until 2009, wrote that "I find myself flinching at occasional bursts of snobbery and the assumed level of privilege of the children and families in the books." The children's author Anne Fine presented an overview of the concerns about Blyton's work and responses to them on BBC Radio 4 in November 2008, in which she noted the "drip, drip, drip of disapproval" associated with the books. Blyton's response to her critics was that she was uninterested in the views of anyone over the age of 12, claiming that half the attacks on her work were motivated by jealousy and the rest came from "stupid people who don't know what they're talking about because they've never read any of my books".
Despite criticism by contemporaries that the quality of her work suffered from the 1950s as its volume increased, Blyton nevertheless capitalised on being generally regarded at the time as "a more 'savoury', English alternative" to what some considered an "invasion" of Britain by American culture, in the form of "rock music, horror comics, television, teenage culture, delinquency, and Disney".
According to British academic Nicholas Tucker, the works of Enid Blyton have been "banned from more public libraries over the years than is the case with any other adult or children's author", though such attempts to quell the popularity of her books over the years seem to have been largely unsuccessful, and "she still remains very widely read".
Some librarians felt that Blyton's restricted use of language, a conscious product of her teaching background, was prejudicial to an appreciation of more literary qualities. In a scathing article published in "Encounter" in 1958, the journalist Colin Welch remarked that it was "hard to see how a diet of Miss Blyton could help with the 11-plus or even with the Cambridge English Tripos", but reserved his harshest criticism for Blyton's Noddy, describing him as an "unnaturally priggish ... sanctimonious ... witless, spiritless, snivelling, sneaking doll."
The author and educational psychologist Nicholas Tucker notes that it was common to see Blyton cited as people's favourite or least favourite author according to their age, and argues that her books create an "encapsulated world for young readers that simply dissolves with age, leaving behind only memories of excitement and strong identification". Fred Inglis considers Blyton's books to be technically easy to read, but to also be "emotionally and cognitively easy". He mentions that the psychologist Michael Woods believed that Blyton was different from many other older authors writing for children in that she seemed untroubled by presenting them with a world that differed from reality. Woods surmised that Blyton "was a child, she thought as a child, and wrote as a child ... the basic feeling is essentially pre-adolescent ... Enid Blyton has no moral dilemmas ... Inevitably Enid Blyton was labelled by rumour a child-hater. If true, such a fact should come as no surprise to us, for as a child herself all other children can be nothing but rivals for her." Inglis argues, though, that Blyton was clearly devoted to children and put an enormous amount of energy into her work, with a powerful belief in "representing the crude moral diagrams and garish fantasies of a readership". Blyton's daughter Imogen has stated that she "loved a relationship with children through her books", but real children were an intrusion, and there was no room for intruders in the world that Blyton occupied through her writing.
Accusations of racism in Blyton's books were first made by Lena Jeger in a "Guardian" article published in 1966, in which she was critical of Blyton's "The Little Black Doll", published a few months earlier. Sambo, the black doll of the title, is hated by his owner and the other toys owing to his "ugly black face", and runs away. A shower of rain washes his face clean, after which he is welcomed back home with his now pink face. Jamaica Kincaid also considers the Noddy books to be "deeply racist" because of the blonde children and the black golliwogs. In Blyton's 1944 novel "The Island of Adventure", a black servant named Jo-Jo is very intelligent, but is particularly cruel to the children.
Accusations of xenophobia were also made. As George Greenfield observed, "Enid was very much part of that between-the-wars middle class which believed that foreigners were untrustworthy or funny or sometimes both". The publisher Macmillan conducted an internal assessment of Blyton's "The Mystery That Never Was", submitted to them at the height of her fame in 1960. The review was carried out by the author and books editor Phyllis Hartnoll, in whose view "There is a faint but unattractive touch of old-fashioned xenophobia in the author's attitude to the thieves; they are 'foreign' ... and this seems to be regarded as sufficient to explain their criminality." Macmillan rejected the manuscript, but it was published by William Collins in 1961, and then again in 1965 and 1983.
Blyton's depictions of boys and girls are considered by many critics to be sexist. In a "Guardian" article published in 2005 Lucy Mangan proposed that The Famous Five series depicts a power struggle between Julian, Dick and George (Georgina), in which the female characters either act like boys or are talked down to, as when Dick lectures George: "it's really time you gave up thinking you're as good as a boy".
In December 2016 the Royal Mint discussed featuring Blyton on a commemorative 50p coin but dismissed the idea because she was "known to have been a racist, sexist, homophobe and not a very well-regarded writer".
To address criticisms levelled at Blyton's work, some later editions have been altered to reflect more politically progressive attitudes towards issues such as race, gender, violence between young persons, and the treatment of children by adults, as well as legal changes in Britain as to what young children are allowed to do (e.g. purchasing fireworks) in the years since the stories were originally written. Modern reprints of the Noddy series, for instance, substitute teddy bears or goblins for golliwogs. In the 1986 revision of "Here Comes Noddy Again", the golliwogs who steal Noddy's car and dump him naked in the Dark Wood are replaced by goblins, who strip Noddy only of his shoes and hat and return at the end of the story to apologise.
"The Faraway Tree"'s Dame Slap, who made regular use of corporal punishment, was changed to Dame Snap who no longer did so, and the names of Dick and Fanny in the same series were changed to Rick and Frannie. Characters in the Malory Towers and St. Clare's series are no longer spanked or threatened with a spanking, but are instead scolded. References to George's short hair making her look like a boy were removed in revisions to "Five on a Hike Together", reflecting the idea that girls need not have long hair to be considered feminine or normal. Anne of "The Famous Five" stating that boys cannot wear pretty dresses or like girl's dolls was removed. In "The Adventurous Four", the names of the young twin girls were changed from Jill and Mary to Pippa and Zoe.
In 2010 Hodder, the publisher of the Famous Five series, announced its intention to update the language used in the books, of which it sold more than half a million copies a year. The changes, which Hodder described as "subtle", mainly affect the dialogue rather than the narrative: "school tunic" becomes "uniform"; "mother and father" and "mother and daddy" (the latter, used by young female characters, was deemed sexist) both become "mum and dad"; "bathing" is replaced by "swimming"; and "jersey" by "jumper". Some commentators see the changes as necessary to encourage modern readers, whereas others regard them as unnecessary and patronising. In 2016 Hodder's parent company Hachette announced that it would abandon the revisions as, based on feedback, they had not been a success.
In 1954 Blyton adapted Noddy for the stage, producing the "Noddy in Toyland" pantomime in just two or three weeks. The production was staged at the 2660-seat Stoll Theatre in Kingsway, London at Christmas. Its popularity resulted in the show running during the Christmas season for five or six years. Blyton was delighted with its reception by children in the audience, and attended the theatre three or four times a week. TV adaptations of Noddy since 1954 include one in the 1970s narrated by Richard Briers. In 1955 a stage play based on the Famous Five was produced, and in January 1997 the King's Head Theatre embarked on a six-month tour of the UK with "The Famous Five Musical", to commemorate Blyton's centenary. On 21 November 1998 "The Secret Seven Save the World" was first performed at the Sherman Theatre in Cardiff.
There have also been several film and television adaptations of the Famous Five: by the Children's Film Foundation in 1957 and 1964, Southern Television in 1978–79, and Zenith Productions in 1995–97. The series was also adapted for the German film "Fünf Freunde", directed by Mike Marzuk and released in 2011.
The Comic Strip, a group of British comedians, produced two extreme parodies of the Famous Five for Channel 4 television: "Five Go Mad in Dorset", broadcast in 1982, and "Five Go Mad on Mescalin", broadcast the following year. A third in the series, "Five Go to Rehab", was broadcast on Sky in 2012.
Blyton's "The Faraway Tree" series of books has also been adapted to television and film. On 29 September 1997 the BBC began broadcasting an animated series called "The Enchanted Lands", based on the series. It was announced in October 2014 that a deal had been signed with publishers Hachette for "The Faraway Tree" series to be adapted into a live-action film by director Sam Mendes’ production company. Marlene Johnson, head of children's books at Hachette, said: "Enid Blyton was a passionate advocate of children’s storytelling, and The Magic Faraway Tree is a fantastic example of her creative imagination."
Blyton's "Malory Towers" has been adapted into a musical of the same name by Emma Rice's theatre company. It was scheduled to do a UK spring tour in 2020 which has been postponed due to the COVID-19 pandemic.
In 2019, "Malory Towers" was adapted as a 13 part TV series for the BBC. It is made partly in Toronto and partly in the UK in association with Canada's Family Channel. The series went to air in the UK from April 2020.
Seven Stories, the National Centre for Children's Books in Newcastle upon Tyne, holds the largest public collection of Blyton's papers and typescripts. The Seven Stories collection contains a significant number of Blyton's typescripts, including the previously unpublished novel, "Mr Tumpy's Caravan", as well as personal papers and diaries. The purchase of the material in 2010 was made possible by special funding from the Heritage Lottery Fund, the MLA/V&A Purchase Grant Fund, and two private donations. | https://en.wikipedia.org/wiki?curid=10258 |
Epipalaeolithic Near East
The Epipalaeolithic Near East designates the Epipalaeolithic ("Final Old Stone Age", also known as Mesolithic) in the prehistory of the Near East. It is the period after the Upper Palaeolithic and before the Neolithic, between approximately 20,000 and 10,000 years Before Present (BP). The people of the Epipalaeolithic were nomadic hunter-gatherers who generally lived in small, seasonal camps rather than permanent villages. They made sophisticated stone tools using microliths—small, finely produced blades that were hafted in wooden implements. These are the primary artifacts by which archaeologists recognise and classify Epipalaeolithic sites.
The start of the Epipalaeolithic is defined by the appearance of microliths. Although this is an arbitrary boundary, the Epipalaeolithic does differ significantly from the preceding Upper Palaeolithic. Epipalaeolithic sites are more numerous, better preserved, and can be accurately radiocarbon dated. The period coincides with the gradual retreat of glacial climatic conditions between the Last Glacial Maximum and the start of the Holocene, and it is characterised by population growth and economic intensification. The Epipalaeolithic ended with the "Neolithic Revolution" and the onset of domestication, food production, and sedentism, although archaeologists now recognise that these trends began in the Epipalaeolithic.
The period may be subdivided into Early, Middle and Late Epipalaeolithic: the Early Epipalaeolithic corresponds to the Kebaran culture, c. 20,000 to 14,500 years ago; the Middle Epipalaeolithic to the Geometric Kebaran, a late phase of the Kebaran; and the Late Epipalaeolithic to the Natufian, 14,500–11,500 BP. The Natufian overlaps with the incipient Neolithic Revolution, the Pre-Pottery Neolithic A.
The Early Epipalaeolithic, also known as Kebaran, lasted from 20,000 to 12,150 BP. It followed the Upper Paleolithic Levantine Aurignacian (formerly called Antelian) period throughout the Levant. By the end of the Levantine Aurignacian, gradual changes took place in stone industries. Small stone tools called microliths and retouched bladelets can be found for the first time. The microliths of this culture period differ markedly from the Aurignacian artifacts.
By 18,000 BP the climate and environment had changed, starting a period of transition. The Levant became more arid and the forest vegetation retreated, to be replaced by steppe. The cool and dry period ended at the beginning of Mesolithic 1. The hunter-gatherers of the Aurignacian would have had to modify their way of living and their pattern of settlement to adapt to the changing conditions. The crystallization of these new patterns resulted in Mesolithic 1. The people developed new types of settlements and new stone industries.
The inhabitants of a small Mesolithic 1 site in the Levant left little more than their chipped stone tools behind. The industry consisted of small tools made of bladelets struck off single-platform cores. Besides bladelets, burins and end-scrapers have been found, along with a few bone tools and some ground stone. These so-called Mesolithic sites of Asia are far less numerous than those of the Neolithic, and the archaeological remains are very poor.
The type site is Kebara Cave south of Haifa. The Kebaran was characterised by small, geometric microliths. The people were thought to lack the specialised grinders and pounders found in later Near Eastern cultures.
The Kebaran is preceded by the Athlitian phase of the Levantine Aurignacian (formerly called Antelian) and followed by the proto-agrarian Natufian culture of the Epipalaeolithic. The appearance of the Kebaran culture, of microlithic type, implies a significant rupture in the cultural continuity of the Levantine Upper Paleolithic. The Kebaran culture, with its use of microliths, is associated also with the use of the bow and arrow and the domestication of the dog. The Kebaran is also characterised by the earliest collecting and processing of wild cereals, known from the excavation of grain-grinding tools. This was the first step towards the Neolithic Revolution. The Kebaran people are believed to have migrated seasonally, dispersing to upland environments in the summer, and gathering in caves and rockshelters near lowland lakes in the winter. This diversity of environments may be the reason for the variety of tools found in their toolkits.
The Kebaran is generally thought to have been ancestral to the later Natufian culture that occupied much of the same range.
The earliest evidence for the use of composite cereal-harvesting tools is the glossed flint blades found at the site of Ohalo II, a 23,000-year-old fisher-hunter-gatherers' camp on the shore of the Sea of Galilee, Northern Israel. The Ohalo site is dated at the junction of the Upper Paleolithic and the Early Epipalaeolithic, and has been attributed to both periods. The wear traces on the tools indicate that these were used for harvesting near-ripe, semi-green wild cereals, shortly before the grains ripen enough to disperse naturally. The study shows that the tools were not used intensively, and they reflect two harvesting modes: flint knives held by hand and inserts hafted into a handle. The finds reveal the existence of cereal-harvesting techniques and tools some 8,000 years before the Natufian, and 12,000 years before the establishment of sedentary farming communities in the Near East during the Neolithic Revolution. Furthermore, the new finds accord well with evidence for the earliest ever cereal cultivation at the site, and for the use of stone-made grinding implements.
Evidence for symbolic behavior of Late Pleistocene foragers in the Levant has been found in engraved limestone plaquettes from the Epipalaeolithic open-air site Ein Qashish South in the Jezreel Valley, Israel. The engravings were uncovered in Kebaran and Geometric Kebaran deposits (ca. 23,000 and ca. 16,500 BP), and include the image of a bird, the first figurative representation known so far from a pre-Natufian Epipalaeolithic site, together with geometric motifs such as chevrons, cross-hatchings, and ladders. Some of the engravings closely resemble roughly contemporary European finds, and may be interpreted as "systems of notations" or "artificial memory systems" related to the timing of seasonal resources and related important events for nomadic groups.
Similar-looking signs and patterns are well known from the context of the local Natufian, a final Epipalaeolithic period when sedentary or semi-sedentary foragers started practicing agriculture.
The Late Epipalaeolithic is also called the Natufian culture. This period is characterised by the early rise of agriculture, which would later emerge fully in the Neolithic. Radiocarbon dating places the Natufian culture between 12,500 and 9500 BCE, just before the end of the Pleistocene. The earliest-known battle occurred during the Mesolithic period at a site in Sudan, known as Cemetery 117.
The Natufian culture is commonly split into two subperiods: Early Natufian (12,500–10,800 BCE) (Christopher Delage gives 13,000–11,500 BP uncalibrated, equivalent to 13,700–11,500 BCE) and Late Natufian (10,800–9500 BCE). The Late Natufian most likely occurred in tandem with the Younger Dryas. The following period is often called the Pre-Pottery Neolithic; specialists do not discuss "Mesolithic pottery" in terms of the Levant.
Until recently, it was thought that the Arabian peninsula was too arid and inhospitable for human settlement in the Late Pleistocene. The earliest known sites belonged to the early Neolithic, c. 9000 to 8000 BP, and it was supposed that people were able to recolonise the region then due to the wetter climate of the early Holocene.
However, in 2014, archaeologists working in the southern Nefud desert discovered an Epipalaeolithic site dating to between 12,000 and 10,000 BP. The site is located in the Jubbah basin, a palaeolake which retained water in the otherwise dry conditions of the Terminal Pleistocene. The stone tools found bore a close resemblance to the Geometric Kebaran, a Levantine industry associated with the Middle Epipalaeolithic. The excavators of the site therefore proposed that northern Arabia was colonised by foragers from the Levant around 15,000 years ago. These groups may then have been cut off by the drying climate and retreated to "refugia" like the Jubbah palaeolake.
The Epipalaeolithic is best understood in the southern Levant, where the period is well documented thanks to good preservation at the sites, at least of animal remains. The most prevalent animal food sources in the Levant during this period were deer, gazelle, and ibex of various species, and smaller animals including birds, lizard, fox, tortoise, and hare. Less common were aurochs, wild equids, wild boar, wild cattle, and hartebeest. At Neve David near Haifa, 15 mammal species and two reptile species were found. Although the site was then very close to the coast, rather few seashells were found (7 genera), and the piercing of many of them suggests they may have been collected as ornaments rather than food.
However, the period seems to be marked by an increase in plant foods and a decrease in meat eating. Over 40 plant species have been identified at one site in the Jordan Valley, and some grains were processed and baked. Stones with evidence of grinding have also been found. Plant foods were most likely the main food sources through to the Pre-Pottery Neolithic A, which introduced the widespread agricultural growing of crops.
Enrico Fermi
Enrico Fermi (29 September 1901 – 28 November 1954) was an Italian (later naturalized American) physicist and the creator of the world's first nuclear reactor, the Chicago Pile-1. He has been called the "architect of the nuclear age" and the "architect of the atomic bomb". He was one of very few physicists to excel in both theoretical physics and experimental physics. Fermi held several patents related to the use of nuclear power, and was awarded the 1938 Nobel Prize in Physics for his work on induced radioactivity by neutron bombardment and for the discovery of transuranium elements. He made significant contributions to the development of statistical mechanics, quantum theory, and nuclear and particle physics.
Fermi's first major contribution involved the field of statistical mechanics. After Wolfgang Pauli formulated his exclusion principle in 1925, Fermi followed with a paper in which he applied the principle to an ideal gas, employing a statistical formulation now known as Fermi–Dirac statistics. Today, particles that obey the exclusion principle are called "fermions". Pauli later postulated the existence of an uncharged invisible particle emitted along with an electron during beta decay, to satisfy the law of conservation of energy. Fermi took up this idea, developing a model that incorporated the postulated particle, which he named the "neutrino". His theory, later referred to as Fermi's interaction and now called weak interaction, described one of the four fundamental interactions in nature. Through experiments inducing radioactivity with the recently discovered neutron, Fermi discovered that slow neutrons were more easily captured by atomic nuclei than fast ones, and he developed the Fermi age equation to describe this. After bombarding thorium and uranium with slow neutrons, he concluded that he had created new elements. Although he was awarded the Nobel Prize for this discovery, the new elements were later revealed to be nuclear fission products.
Fermi left Italy in 1938 to escape new Italian racial laws that affected his Jewish wife, Laura Capon. He emigrated to the United States, where he worked on the Manhattan Project during World War II. Fermi led the team that designed and built Chicago Pile-1, which went critical on 2 December 1942, demonstrating the first human-created, self-sustaining nuclear chain reaction. He was on hand when the X-10 Graphite Reactor at Oak Ridge, Tennessee, went critical in 1943, and when the B Reactor at the Hanford Site did so the next year. At Los Alamos, he headed F Division, part of which worked on Edward Teller's thermonuclear "Super" bomb. He was present at the Trinity test on 16 July 1945, where he used his Fermi method to estimate the bomb's yield.
After the war, Fermi served under J. Robert Oppenheimer on the General Advisory Committee, which advised the Atomic Energy Commission on nuclear matters. After the detonation of the first Soviet fission bomb in August 1949, he strongly opposed the development of a hydrogen bomb on both moral and technical grounds. He was among the scientists who testified on Oppenheimer's behalf at the 1954 hearing that resulted in the denial of Oppenheimer's security clearance. Fermi did important work in particle physics, especially related to pions and muons, and he speculated that cosmic rays arose when material was accelerated by magnetic fields in interstellar space. Many awards, concepts, and institutions are named after Fermi, including the Enrico Fermi Award, the Enrico Fermi Institute, the Fermi National Accelerator Laboratory, the Fermi Gamma-ray Space Telescope, the Enrico Fermi Nuclear Generating Station, and the synthetic element fermium, making him one of 16 scientists who have elements named after them.
Enrico Fermi was born in Rome, Italy, on 29 September 1901. He was the third child of Alberto Fermi, a division head in the Ministry of Railways, and Ida de Gattis, an elementary school teacher. His sister, Maria, was two years older than him, his brother Giulio a year older. After the two boys were sent to a rural community to be wet nursed, Enrico rejoined his family in Rome when he was two and a half. Although he was baptised a Roman Catholic in accordance with his grandparents' wishes, his family was not particularly religious; Enrico was an agnostic throughout his adult life. As a young boy he shared the same interests as his brother Giulio, building electric motors and playing with electrical and mechanical toys. Giulio died during an operation on a throat abscess in 1915 and Maria died in an airplane crash near Milan in 1959.
At a local market Fermi found a physics book, the 900-page "Elementorum physicae mathematicae". Written in Latin by the Jesuit Father Andrea Caraffa, a professor at the Collegio Romano, it presented mathematics, classical mechanics, astronomy, optics, and acoustics as they were understood at the time of its 1840 publication. With a scientifically inclined friend, Enrico Persico, Fermi pursued projects such as building gyroscopes and measuring the acceleration of Earth's gravity. Adolfo Amidei, a colleague of Fermi's father, gave him books on physics and mathematics, which he assimilated quickly.
Fermi graduated from high school in July 1918 and, at Amidei's urging, applied to the "Scuola Normale Superiore" in Pisa. Having already lost one son, his parents were reluctant to let him live away from home, but allowed him to live in the school's lodgings for four years. Fermi took first place in the difficult entrance exam, which included an essay on the theme of "Specific characteristics of Sounds"; the 17-year-old Fermi chose to use Fourier analysis to derive and solve the partial differential equation for a vibrating rod, and after interviewing Fermi the examiner declared he would become an outstanding physicist.
At the "Scuola Normale Superiore" Fermi played pranks with fellow student Franco Rasetti; the two became close friends and collaborators. Fermi was advised by Luigi Puccianti, director of the physics laboratory, who said there was little he could teach Fermi and often asked Fermi to teach him something instead. Fermi's knowledge of quantum physics was such that Puccianti asked him to organize seminars on the topic. During this time Fermi learned tensor calculus, a technique key to general relativity. Fermi initially chose mathematics as his major, but soon switched to physics. He remained largely self-taught, studying general relativity, quantum mechanics, and atomic physics.
In September 1920, Fermi was admitted to the Physics department. Since there were only three students in the department—Fermi, Rasetti, and Nello Carrara—Puccianti let them freely use the laboratory for whatever purposes they chose. Fermi decided that they should research X-ray crystallography, and the three worked to produce a Laue photograph—an X-ray photograph of a crystal. During 1921, his third year at the university, Fermi published his first scientific works in the Italian journal "Nuovo Cimento". The first was entitled "On the dynamics of a rigid system of electrical charges in translational motion". A sign of things to come was that the mass was expressed as a tensor—a mathematical construct commonly used to describe something moving and changing in three-dimensional space. In classical mechanics, mass is a scalar quantity, but in relativity it changes with velocity. The second paper was "On the electrostatics of a uniform gravitational field of electromagnetic charges and on the weight of electromagnetic charges". Using general relativity, Fermi showed that a charge has a weight equal to U/c², where U is the electrostatic energy of the system and c is the speed of light.
The first paper seemed to point out a contradiction between the electrodynamic theory and the relativistic one concerning the calculation of the electromagnetic masses, as the former predicted a value of 4/3 U/c². Fermi addressed this the next year in a paper "Concerning a contradiction between electrodynamic and the relativistic theory of electromagnetic mass", in which he showed that the apparent contradiction was a consequence of relativity. This paper was sufficiently well regarded that it was translated into German and published in the German scientific journal "Physikalische Zeitschrift" in 1922. That year, Fermi submitted his article "On the phenomena occurring near a world line" to an Italian journal. In this article he examined the Principle of Equivalence, and introduced the so-called "Fermi coordinates". He proved that on a world line close to the time line, space behaves as if it were a Euclidean space.
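In modern notation, the two masses in question and the discrepancy Fermi resolved can be restated as follows (a summary sketch of the published claims, not additional material from the source):

```latex
% U = electrostatic self-energy of the charge distribution, c = speed of light.
% Fermi's relativistic result for the mass (weight) of the charges:
m_{\text{rel}} = \frac{U}{c^{2}}
% The older electrodynamic calculation instead gave
m_{\text{em}} = \frac{4}{3}\,\frac{U}{c^{2}},
% the classical "4/3 problem"; Fermi showed the disagreement to be only
% apparent, dissolving once the problem is treated fully relativistically.
```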
Fermi submitted his thesis, "A theorem on probability and some of its applications", to the "Scuola Normale Superiore" in July 1922, and received his laurea at the unusually young age of 20. The thesis was on X-ray diffraction images. Theoretical physics was not yet considered a discipline in Italy, and the only thesis that would have been accepted was one on experimental physics. For this reason, Italian physicists were slow in embracing the new ideas like relativity coming from Germany. Since Fermi was quite at home in the lab doing experimental work, this did not pose insurmountable problems for him.
While writing the appendix for the Italian edition of the book "Fundamentals of Einstein Relativity" by August Kopff in 1923, Fermi was the first to point out that hidden inside the famous Einstein equation (E = mc²) was an enormous amount of nuclear potential energy to be exploited. "It does not seem possible, at least in the near future", he wrote, "to find a way to release these dreadful amounts of energy—which is all to the good because the first effect of an explosion of such a dreadful amount of energy would be to smash into smithereens the physicist who had the misfortune to find a way to do it."
In 1924 Fermi was initiated into the Masonic Lodge "Adriano Lemmi" of the Grand Orient of Italy.
Fermi spent a semester studying under Max Born at the University of Göttingen, where he met Werner Heisenberg and Pascual Jordan. Fermi then studied in Leiden with Paul Ehrenfest from September to December 1924 on a fellowship from the Rockefeller Foundation obtained through the intercession of the mathematician Vito Volterra. Here Fermi met Hendrik Lorentz and Albert Einstein, and became friends with Samuel Goudsmit and Jan Tinbergen. From January 1925 to late 1926, Fermi taught mathematical physics and theoretical mechanics at the University of Florence, where he teamed up with Rasetti to conduct a series of experiments on the effects of magnetic fields on mercury vapour. He also participated in seminars at the Sapienza University of Rome, giving lectures on quantum mechanics and solid state physics. While giving lectures on the new quantum mechanics based on the remarkable accuracy of predictions of the Schrödinger equation, Fermi would often say, "It has no business to fit so well!"
After Wolfgang Pauli announced his exclusion principle in 1925, Fermi responded with a paper "On the quantization of the perfect monoatomic gas", in which he applied the exclusion principle to an ideal gas. The paper was especially notable for Fermi's statistical formulation, which describes the distribution of particles in systems of many identical particles that obey the exclusion principle. This was independently developed soon after by the British physicist Paul Dirac, who also showed how it was related to the Bose–Einstein statistics. Accordingly, it is now known as Fermi–Dirac statistics. After Dirac, particles that obey the exclusion principle are today called "fermions", while those that do not are called "bosons".
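In its now-standard textbook form (not spelled out in the source), Fermi–Dirac statistics gives the mean occupancy of a single-particle state of energy ε as:

```latex
% μ = chemical potential, k_B = Boltzmann constant, T = temperature.
\bar{n}(\epsilon) = \frac{1}{e^{(\epsilon-\mu)/k_{B}T} + 1}
% The "+1" in the denominator caps the occupancy at one particle per
% state, encoding the exclusion principle; Bose–Einstein statistics
% differs only in having "−1" there, allowing unlimited occupancy.
```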
Professorships in Italy were granted by competition ("concorso") for a vacant chair, the applicants being rated on their publications by a committee of professors. Fermi applied for a chair of mathematical physics at the University of Cagliari on Sardinia, but was narrowly passed over in favor of Giovanni Giorgi. In 1926, at the age of 24, he applied for a professorship at the Sapienza University of Rome. This was a new chair, one of the first three in theoretical physics in Italy, that had been created by the Minister of Education at the urging of Professor Orso Mario Corbino, who was the University's professor of experimental physics, the Director of the Institute of Physics, and a member of Benito Mussolini's cabinet. Corbino, who also chaired the selection committee, hoped that the new chair would raise the standard and reputation of physics in Italy. The committee chose Fermi ahead of Enrico Persico and Aldo Pontremoli, and Corbino helped Fermi recruit his team, which was soon joined by notable students such as Edoardo Amaldi, Bruno Pontecorvo, Ettore Majorana and Emilio Segrè, and by Franco Rasetti, whom Fermi had appointed as his assistant. They were soon nicknamed the "Via Panisperna boys" after the street where the Institute of Physics was located.
Fermi married Laura Capon, a science student at the University, on 19 July 1928. They had two children: Nella, born in January 1931, and Giulio, born in February 1936. On 18 March 1929, Fermi was appointed a member of the Royal Academy of Italy by Mussolini, and on 27 April he joined the Fascist Party. He later opposed Fascism when Mussolini promulgated the 1938 racial laws in order to bring Italian Fascism ideologically closer to German National Socialism. These laws threatened Laura, who was Jewish, and put many of Fermi's research assistants out of work.
During their time in Rome, Fermi and his group made important contributions to many practical and theoretical aspects of physics. In 1928, he published his "Introduction to Atomic Physics", which provided Italian university students with an up-to-date and accessible text. Fermi also conducted public lectures and wrote popular articles for scientists and teachers in order to spread knowledge of the new physics as widely as possible. Part of his teaching method was to gather his colleagues and graduate students together at the end of the day and go over a problem, often from his own research. A sign of success was that foreign students now began to come to Italy. The most notable of these was the German physicist Hans Bethe, who came to Rome as a Rockefeller Foundation fellow, and collaborated with Fermi on a 1932 paper "On the Interaction between Two Electrons".
At this time, physicists were puzzled by beta decay, in which an electron was emitted from the atomic nucleus. To satisfy the law of conservation of energy, Pauli postulated the existence of an invisible particle with no charge and little or no mass that was also emitted at the same time. Fermi took up this idea, which he developed in a tentative paper in 1933, and then a longer paper the next year that incorporated the postulated particle, which Fermi called a "neutrino". His theory, later referred to as Fermi's interaction, and still later as the theory of the weak interaction, described one of the four fundamental forces of nature. The neutrino was detected after his death, and his interaction theory showed why it was so difficult to detect. When he submitted his paper to the British journal "Nature", that journal's editor turned it down because it contained speculations which were "too remote from physical reality to be of interest to readers". Thus Fermi saw the theory published in Italian and German before it was published in English.
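In modern notation (which postdates Fermi's papers and is given here purely as an illustration), the process his theory describes is the beta decay of a neutron:

```latex
% A neutron in the nucleus decays into a proton, an electron,
% and an (anti)neutrino:
n \;\longrightarrow\; p + e^{-} + \bar{\nu}_{e}
% Because three bodies share the released energy, the electron emerges
% with a continuous range of energies — the observation that led Pauli
% to postulate the extra particle in the first place.
```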
The physicist Fred L. Wilson produced an English translation of the paper in 1968, noting its significance in his introduction.
In January 1934, Irène Joliot-Curie and Frédéric Joliot announced that they had bombarded elements with alpha particles and induced radioactivity in them. By March, Fermi's assistant Gian-Carlo Wick had provided a theoretical explanation using Fermi's theory of beta decay. Fermi decided to switch to experimental physics, using the neutron, which James Chadwick had discovered in 1932. In March 1934, Fermi wanted to see if he could induce radioactivity with Rasetti's polonium-beryllium neutron source. Neutrons had no electric charge, and so would not be deflected by the positively charged nucleus. This meant that they needed much less energy to penetrate the nucleus than charged particles, and so would not require a particle accelerator, which the Via Panisperna boys did not have.
Fermi hit upon the idea of replacing the polonium-beryllium neutron source with a radon-beryllium one, which he created by filling a glass bulb with beryllium powder, evacuating the air, and then adding 50 mCi of radon gas, supplied by Giulio Cesare Trabacchi. This created a much stronger neutron source, whose effectiveness declined with the 3.8-day half-life of radon. He knew that this source would also emit gamma rays, but, on the basis of his theory, he believed that this would not affect the results of the experiment. He started by bombarding platinum, an element with a high atomic number that was readily available, without success. He turned to aluminium, which emitted an alpha particle and produced sodium, which then decayed into magnesium by beta particle emission. He tried lead, without success, and then fluorine in the form of calcium fluoride, which emitted an alpha particle and produced nitrogen, decaying into oxygen by beta particle emission. In all, he induced radioactivity in 22 different elements. Fermi rapidly reported the discovery of neutron-induced radioactivity in the Italian journal "La Ricerca Scientifica" on 25 March 1934.
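As a minimal sketch of how quickly such a source weakens (assuming simple exponential decay with the 3.8-day half-life quoted above; the code is illustrative, not from the source):

```python
# Illustrative sketch: strength of a radon-beryllium neutron source
# decaying with radon-222's 3.8-day half-life. The 50 mCi starting
# activity is the figure given in the text; everything else is assumed.

HALF_LIFE_DAYS = 3.8
INITIAL_ACTIVITY_MCI = 50.0

def activity(t_days: float) -> float:
    """Remaining activity in mCi after t_days: A(t) = A0 * 2^(-t / T_half)."""
    return INITIAL_ACTIVITY_MCI * 2.0 ** (-t_days / HALF_LIFE_DAYS)

if __name__ == "__main__":
    for t in (0.0, 3.8, 7.6, 15.2):
        print(f"day {t:5.1f}: {activity(t):6.2f} mCi")
    # day 0.0: 50.00, day 3.8: 25.00, day 7.6: 12.50, day 15.2: 3.12
    # -> such a source had to be recharged with fresh radon regularly.
```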
The natural radioactivity of thorium and uranium made it hard to determine what was happening when these elements were bombarded with neutrons but, after correctly eliminating the presence of elements lighter than uranium but heavier than lead, Fermi concluded that they had created new elements, which he called hesperium and ausonium. The chemist Ida Noddack suggested that some of the experiments could have produced lighter elements than lead rather than new, heavier elements. Her suggestion was not taken seriously at the time because her team had not carried out any experiments with uranium, nor built a theoretical basis for this possibility. At that time, fission was thought to be improbable if not impossible on theoretical grounds. While physicists expected elements with higher atomic numbers to form from neutron bombardment of lighter elements, nobody expected neutrons to have enough energy to split a heavier atom into two light element fragments in the manner that Noddack suggested.
The Via Panisperna boys also noticed some unexplained effects. The experiment seemed to work better on a wooden table than a marble table top. Fermi remembered that Joliot-Curie and Chadwick had noted that paraffin wax was effective at slowing neutrons, so he decided to try that. When neutrons were passed through paraffin wax, they induced a hundred times as much radioactivity in silver as when it was bombarded without the paraffin. Fermi guessed that this was due to the hydrogen atoms in the paraffin; the hydrogen in the wood similarly explained the difference between the wooden and the marble table tops. This was confirmed by repeating the effect with water. He concluded that collisions with hydrogen atoms slowed the neutrons. The lower the atomic number of the nucleus a neutron collides with, the more energy it loses per collision, and therefore the fewer collisions are required to slow it down by a given amount. Fermi realised that this induced more radioactivity because slow neutrons were more easily captured than fast ones. He developed a diffusion equation to describe this, which became known as the Fermi age equation.
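The reasoning in this paragraph can be made quantitative with the standard reactor-physics formula for the mean logarithmic energy loss per elastic collision, which depends only on the target nucleus's mass number A. The sketch below uses textbook values (2 MeV fission neutrons slowed to 0.025 eV thermal energy), not figures from Fermi's own work:

```python
import math

# Mean logarithmic energy decrement per elastic collision for a target
# nucleus of mass number A (standard slowing-down theory).
def xi(A: int) -> float:
    if A == 1:
        return 1.0  # limiting value of the expression below as A -> 1
    return 1.0 + ((A - 1) ** 2 / (2 * A)) * math.log((A - 1) / (A + 1))

# Mean number of collisions to slow a neutron from e0 to e1 (both in eV).
def collisions_to_thermalise(A: int, e0: float = 2.0e6, e1: float = 0.025) -> float:
    return math.log(e0 / e1) / xi(A)

for name, A in (("hydrogen (paraffin, water)", 1), ("carbon (graphite)", 12)):
    print(f"{name:26s}: ~{collisions_to_thermalise(A):4.0f} collisions")
# hydrogen: ~18 collisions; carbon: ~115 — why hydrogen-rich paraffin
# and water slow neutrons so much more effectively per collision.
```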
In 1938 Fermi received the Nobel Prize in Physics at the age of 37 for his "demonstrations of the existence of new radioactive elements produced by neutron irradiation, and for his related discovery of nuclear reactions brought about by slow neutrons". After Fermi received the prize in Stockholm, he did not return home to Italy, but rather continued to New York City with his family in December 1938, where they applied for permanent residency. The decision to move to America and become U.S. citizens was due primarily to the racial laws in Italy.
Fermi arrived in New York City on 2 January 1939. He was immediately offered positions at five universities, and accepted one at Columbia University, where he had already given summer lectures in 1936. He received the news that in December 1938, the German chemists Otto Hahn and Fritz Strassmann had detected the element barium after bombarding uranium with neutrons, which Lise Meitner and her nephew Otto Frisch correctly interpreted as the result of nuclear fission. Frisch confirmed this experimentally on 13 January 1939. The news of Meitner and Frisch's interpretation of Hahn and Strassmann's discovery crossed the Atlantic with Niels Bohr, who was to lecture at Princeton University. Isidor Isaac Rabi and Willis Lamb, two Columbia University physicists working at Princeton, found out about it and carried it back to Columbia. Rabi said he told Enrico Fermi, though Fermi later gave the credit for bringing him the news to Lamb.
Noddack was proven right after all. Fermi had dismissed the possibility of fission on the basis of his calculations, but he had not taken into account the binding energy that would appear when a nuclide with an odd number of neutrons absorbed an extra neutron. For Fermi, the news came as a profound embarrassment, as the transuranic elements that he had partly been awarded the Nobel Prize for discovering had not been transuranic elements at all, but fission products. He added a footnote to this effect to his Nobel Prize acceptance speech.
The scientists at Columbia decided that they should try to detect the energy released in the nuclear fission of uranium when bombarded by neutrons. On 25 January 1939, in the basement of Pupin Hall at Columbia, an experimental team including Fermi conducted the first nuclear fission experiment in the United States. The other members of the team were Herbert L. Anderson, Eugene T. Booth, John R. Dunning, G. Norris Glasoe, and Francis G. Slack. The next day, the Fifth Washington Conference on Theoretical Physics began in Washington, D.C. under the joint auspices of George Washington University and the Carnegie Institution of Washington. There, the news on nuclear fission was spread even further, fostering many more experimental demonstrations.
French scientists Hans von Halban, Lew Kowarski, and Frédéric Joliot-Curie had demonstrated that uranium bombarded by neutrons emitted more neutrons than it absorbed, suggesting the possibility of a chain reaction; Fermi and Anderson demonstrated it too a few weeks later. Leó Szilárd obtained a large quantity of uranium oxide from the Canadian radium producer Eldorado Gold Mines Limited, allowing Fermi and Anderson to conduct experiments with fission on a much larger scale. Fermi and Szilárd collaborated on a design of a device to achieve a self-sustaining nuclear reaction—a nuclear reactor. Owing to the rate of absorption of neutrons by the hydrogen in water, it was unlikely that a self-sustaining reaction could be achieved with natural uranium and water as a neutron moderator. Fermi suggested, based on his work with neutrons, that the reaction could be achieved with uranium oxide blocks and graphite as a moderator instead of water. This would reduce the neutron capture rate, and in theory make a self-sustaining chain reaction possible. Szilárd came up with a workable design: a pile of uranium oxide blocks interspersed with graphite bricks. Szilárd, Anderson, and Fermi published a paper on "Neutron Production in Uranium". But their work habits and personalities were different, and Fermi had trouble working with Szilárd.
Fermi was among the first to warn military leaders about the potential impact of nuclear energy, giving a lecture on the subject at the Navy Department on 18 March 1939. The response fell short of what he had hoped for, although the Navy agreed to provide $1,500 towards further research at Columbia. Later that year, Szilárd, Eugene Wigner, and Edward Teller sent the famous letter signed by Einstein to U.S. President Roosevelt, warning that Nazi Germany was likely to build an atomic bomb. In response, Roosevelt formed the Advisory Committee on Uranium to investigate the matter.
The Advisory Committee on Uranium provided money for Fermi to buy graphite, and he built a pile of graphite bricks on the seventh floor of the Pupin Hall laboratory. By August 1941, he had six tons of uranium oxide and thirty tons of graphite, which he used to build a still larger pile in Schermerhorn Hall at Columbia.
The S-1 Section of the Office of Scientific Research and Development, as the Advisory Committee on Uranium was now known, met on 18 December 1941, with the U.S. now engaged in World War II, making its work urgent. Most of the effort sponsored by the Committee had been directed at producing enriched uranium, but Committee member Arthur Compton determined that a feasible alternative was plutonium, which could be mass-produced in nuclear reactors by the end of 1944. He decided to concentrate the plutonium work at the University of Chicago. Fermi reluctantly moved, and his team became part of the new Metallurgical Laboratory there.
The possible results of a self-sustaining nuclear reaction were unknown, so it seemed inadvisable to build the first nuclear reactor on the University of Chicago campus in the middle of the city. Compton found a location in the Argonne Woods Forest Preserve, outside Chicago. Stone & Webster was contracted to develop the site, but the work was halted by an industrial dispute. Fermi then persuaded Compton that he could build the reactor in the squash court under the stands of the University of Chicago's Stagg Field. Construction of the pile began on 6 November 1942, and Chicago Pile-1 went critical on 2 December. The shape of the pile was intended to be roughly spherical, but as work proceeded Fermi calculated that criticality could be achieved without finishing the entire pile as planned.
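In the standard notation of reactor physics (a general relation, not Fermi's specific criticality calculation), a pile is self-sustaining once each generation of fission neutrons exactly reproduces itself:

```latex
% n_i = number of neutrons in the i-th fission generation,
% k_eff = effective multiplication factor of the pile.
n_{i+1} = k_{\mathrm{eff}}\, n_{i}
\qquad\Longrightarrow\qquad
n_{i} = n_{0}\, k_{\mathrm{eff}}^{\,i}
% k_eff < 1: the chain dies out (subcritical);
% k_eff = 1: self-sustaining ("critical", as on 2 December 1942);
% k_eff > 1: the reaction grows (supercritical).
```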
This experiment was a landmark in the quest for energy, and it was typical of Fermi's approach. Every step was carefully planned, every calculation meticulously done. When the first self-sustained nuclear chain reaction was achieved, Compton made a coded phone call to James B. Conant, the chairman of the National Defense Research Committee.
To continue the research where it would not pose a public health hazard, the reactor was disassembled and moved to the Argonne Woods site. There Fermi directed experiments on nuclear reactions, reveling in the opportunities provided by the reactor's abundant production of free neutrons. The laboratory soon branched out from physics and engineering into using the reactor for biological and medical research. Initially, Argonne was run by Fermi as part of the University of Chicago, but it became a separate entity with Fermi as its director in May 1944.
When the air-cooled X-10 Graphite Reactor at Oak Ridge went critical on 4 November 1943, Fermi was on hand just in case something went wrong. The technicians woke him early so that he could see it happen. Getting X-10 operational was another milestone in the plutonium project. It provided data on reactor design, training for DuPont staff in reactor operation, and produced the first small quantities of reactor-bred plutonium. Fermi became an American citizen in July 1944, the earliest date the law allowed.
In September 1944, Fermi inserted the first uranium fuel slug into the B Reactor at the Hanford Site, the production reactor designed to breed plutonium in large quantities. Like X-10, it had been designed by Fermi's team at the Metallurgical Laboratory, and built by DuPont, but it was much larger, and was water-cooled. Over the next few days, 838 tubes were loaded, and the reactor went critical. Shortly after midnight on 27 September, the operators began to withdraw the control rods to initiate production. At first all appeared to be well, but around 03:00, the power level started to drop and by 06:30 the reactor had shut down completely. The Army and DuPont turned to Fermi's team for answers. The cooling water was investigated to see if there was a leak or contamination. The next day the reactor suddenly started up again, only to shut down once more a few hours later. The problem was traced to neutron poisoning from xenon-135, a fission product with a half-life of 9.2 hours. DuPont had deviated from the Metallurgical Laboratory's original design in which the reactor had 1,500 tubes arranged in a circle, and had added 504 tubes to fill in the corners. The scientists had originally considered this over-engineering a waste of time and money, but Fermi realized that if all 2,004 tubes were loaded, the reactor could reach the required power level and efficiently produce plutonium.
In mid-1944, Robert Oppenheimer persuaded Fermi to join his Project Y at Los Alamos, New Mexico. Arriving in September, Fermi was appointed an associate director of the laboratory, with broad responsibility for nuclear and theoretical physics, and was placed in charge of F Division, which was named after him. F Division had four branches: F-1 Super and General Theory under Teller, which investigated the "Super" (thermonuclear) bomb; F-2 Water Boiler under L. D. P. King, which looked after the "water boiler" aqueous homogeneous research reactor; F-3 Super Experimentation under Egon Bretscher; and F-4 Fission Studies under Anderson. Fermi observed the Trinity test on 16 July 1945, and conducted an experiment to estimate the bomb's yield by dropping strips of paper into the blast wave. He paced off the distance they were blown by the explosion, and calculated the yield as ten kilotons of TNT; the actual yield was about 18.6 kilotons.
Along with Oppenheimer, Compton, and Ernest Lawrence, Fermi was part of the scientific panel that advised the Interim Committee on target selection. The panel agreed with the committee that atomic bombs would be used without warning against an industrial target. Like others at the Los Alamos Laboratory, Fermi found out about the atomic bombings of Hiroshima and Nagasaki from the public address system in the technical area. Fermi did not believe that atomic bombs would deter nations from starting wars, nor did he think that the time was ripe for world government. He therefore did not join the Association of Los Alamos Scientists.
Fermi became the Charles H. Swift Distinguished Professor of Physics at the University of Chicago on 1 July 1945, although he did not depart the Los Alamos Laboratory with his family until 31 December 1945. He was elected a member of the U.S. National Academy of Sciences in 1945. The Metallurgical Laboratory became the Argonne National Laboratory on 1 July 1946, the first of the national laboratories established by the Manhattan Project. The short distance between Chicago and Argonne allowed Fermi to work at both places. At Argonne he continued experimental physics, investigating neutron scattering with Leona Marshall. He also discussed theoretical physics with Maria Mayer, helping her develop insights into spin–orbit coupling that would lead to her receiving the Nobel Prize.
The Manhattan Project was replaced by the Atomic Energy Commission (AEC) on 1 January 1947. Fermi served on the AEC General Advisory Committee, an influential scientific committee chaired by Robert Oppenheimer. He also liked to spend a few weeks of each year at the Los Alamos National Laboratory, where he collaborated with Nicholas Metropolis, and with John von Neumann on Rayleigh–Taylor instability, the science of what occurs at the border between two fluids of different densities.
After the detonation of the first Soviet fission bomb in August 1949, Fermi, along with Isidor Rabi, wrote a strongly worded report for the committee, opposing the development of a hydrogen bomb on moral and technical grounds. Nonetheless, Fermi continued to participate in work on the hydrogen bomb at Los Alamos as a consultant. Along with Stanislaw Ulam, he calculated that not only would the amount of tritium needed for Teller's model of a thermonuclear weapon be prohibitive, but a fusion reaction could still not be assured to propagate even with this large quantity of tritium. Fermi was among the scientists who testified on Oppenheimer's behalf at the Oppenheimer security hearing in 1954 that resulted in denial of Oppenheimer's security clearance.
In his later years, Fermi continued teaching at the University of Chicago. His PhD students in the postwar period included Owen Chamberlain, Geoffrey Chew, Jerome Friedman, Marvin Goldberger, Tsung-Dao Lee, Arthur Rosenfeld and Sam Treiman. Jack Steinberger was a graduate student, and Mildred Dresselhaus was highly influenced by Fermi during the year she overlapped with him as a PhD student. Fermi conducted important research in particle physics, especially related to pions and muons. He made the first predictions of pion-nucleon resonance, relying on statistical methods, since he reasoned that exact answers were not required when the theory was wrong anyway. In a paper coauthored with Chen Ning Yang, he speculated that pions might actually be composite particles. The idea was elaborated by Shoichi Sakata. It has since been supplanted by the quark model, in which the pion is made up of quarks, completing Fermi's model and vindicating his approach.
Fermi wrote a paper "On the Origin of Cosmic Radiation" in which he proposed that cosmic rays arose through material being accelerated by magnetic fields in interstellar space, which led to a difference of opinion with Teller. Fermi examined the issues surrounding magnetic fields in the arms of a spiral galaxy. He mused about what is now referred to as the "Fermi paradox": the contradiction between the presumed probability of the existence of extraterrestrial life and the fact that contact has not been made.
Toward the end of his life, Fermi questioned his faith in society at large to make wise choices about nuclear technology. He said:
Fermi underwent what was called an "exploratory" operation in Billings Memorial Hospital in October 1954, after which he returned home. Fifty days later he died of stomach cancer at age 53 in his home in Chicago. His memorial service was held at the University of Chicago chapel, where colleagues Samuel K. Allison, Emilio Segrè, and Herbert L. Anderson spoke to mourn the loss of one of the world's "most brilliant and productive physicists." His body was interred at Oak Woods Cemetery.
Fermi received numerous awards in recognition of his achievements, including the Matteucci Medal in 1926, the Nobel Prize for Physics in 1938, the Hughes Medal in 1942, the Franklin Medal in 1947, and the Rumford Prize in 1953. He was awarded the Medal for Merit in 1946 for his contribution to the Manhattan Project. Fermi was elected a Foreign Member of the Royal Society (FRS) in 1950. The Basilica of Santa Croce, Florence, known as the "Temple of Italian Glories" for its many graves of artists, scientists and prominent figures in Italian history, has a plaque commemorating Fermi. In 1999, "Time" named Fermi on its list of the top 100 persons of the twentieth century. Fermi was widely regarded as an unusual case of a 20th-century physicist who excelled both theoretically and experimentally. The scientist and novelist C. P. Snow wrote that "if Fermi had been born a few years earlier, one could well imagine him discovering Rutherford's atomic nucleus, and then developing Bohr's theory of the hydrogen atom. If this sounds like hyperbole, anything about Fermi is likely to sound like hyperbole".
Fermi was known as an inspiring teacher, and was noted for his attention to detail, simplicity, and careful preparation of his lectures. Later, his lecture notes were transcribed into books. His papers and notebooks are held today at the University of Chicago. Victor Weisskopf noted how Fermi "always managed to find the simplest and most direct approach, with the minimum of complication and sophistication." He disliked complicated theories, and while he had great mathematical ability, he would never use it when the job could be done much more simply. He was famous for getting quick and accurate answers to problems that would stump other people. Later on, his method of getting approximate and quick answers through back-of-the-envelope calculations became informally known as the "Fermi method", and is widely taught.
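The "Fermi method" is easy to illustrate in a few lines. Below is a minimal sketch using the classic question of how many piano tuners work in Chicago; every numeric input is an illustrative assumption, and the point is the chain of rough, independently estimated factors rather than the exact answer.

```python
# A back-of-the-envelope "Fermi estimate". All inputs are assumed values.
population = 9_000_000          # assumed Chicago metro population
people_per_household = 2        # assumed average household size
piano_fraction = 1 / 20         # assumed share of households with a piano
tunings_per_piano_year = 1      # assumed tunings per piano per year
tunings_per_tuner_day = 2       # assumed jobs a tuner finishes per day
workdays_per_year = 250         # assumed working days per year

pianos = population / people_per_household * piano_fraction
tunings_needed = pianos * tunings_per_piano_year
tuner_capacity = tunings_per_tuner_day * workdays_per_year

print(round(tunings_needed / tuner_capacity))  # ~450: an order-of-magnitude answer
```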
Fermi was fond of pointing out that when Alessandro Volta was working in his laboratory, Volta had no idea where the study of electricity would lead. Fermi is generally remembered for his work on nuclear power and nuclear weapons, especially the creation of the first nuclear reactor, and the development of the first atomic and hydrogen bombs. His scientific work has stood the test of time. This includes his theory of beta decay, his work with non-linear systems, his discovery of the effects of slow neutrons, his study of pion-nucleon collisions, and his Fermi–Dirac statistics. His speculation that a pion was not a fundamental particle pointed the way towards the study of quarks and leptons.
Many things bear Fermi's name. These include the Fermilab particle accelerator and physics lab in Batavia, Illinois, which was renamed in his honor in 1974, and the Fermi Gamma-ray Space Telescope, which was named after him in 2008, in recognition of his work on cosmic rays. Three nuclear reactor installations have been named after him: the Fermi 1 and Fermi 2 nuclear power plants in Newport, Michigan, the Enrico Fermi Nuclear Power Plant at Trino Vercellese in Italy, and the RA-1 Enrico Fermi research reactor in Argentina. A synthetic element isolated from the debris of the 1952 Ivy Mike nuclear test was named fermium, in honor of Fermi's contributions to the scientific community. This makes him one of 16 scientists who have elements named after them.
Since 1956, the United States Atomic Energy Commission has named its highest honor, the Fermi Award, after him. Recipients of the award include well-known scientists like Otto Hahn, Robert Oppenheimer, Edward Teller and Hans Bethe.
For a full list of his papers, see pages 75–78 in ref.
Editor war
The editor war is the rivalry between users of the Emacs and vi (now usually Vim, or recently Neovim) text editors. The rivalry has become a lasting part of hacker culture and the free software community.
The Emacs vs vi debate was one of the original "holy wars" conducted on Usenet groups, with many flame wars fought between those insisting that their editor of choice is the paragon of editing perfection, and insulting the other, since at least 1985. Related battles have been fought over operating systems, programming languages, version control systems, and even source code indent style.
The most important historical differences between vi and Emacs are presented in the following table:
In the past, many small editors modeled after or derived from vi flourished. This was due to the importance of conserving memory with the comparatively minuscule amount available at the time. As computers have become more powerful, many vi clones, Vim in particular, have grown in size and code complexity. These vi variants of today, as with the old lightweight Emacs variants, tend to have many of the perceived benefits and drawbacks of the opposing side. For example, Vim without any extensions requires about ten times the disk space required by vi, and recent versions of Vim can have more extensions and run slower than Emacs. In "The Art of Unix Programming", Eric S. Raymond called Vim's supposed light weight when compared with Emacs "a shared myth". Moreover, with the large amounts of RAM in modern computers, both Emacs and vi are lightweight compared to large integrated development environments such as Eclipse, which tend to draw derision from Emacs and vi users alike.
Tim O'Reilly said, in 1999, that O'Reilly Media's tutorial on vi sells twice as many copies as that on Emacs (but noted that Emacs came with a free manual). Many programmers use either Emacs or vi, or one of their various offshoots; Linus Torvalds, for example, uses MicroEMACS. Also in 1999, vi creator Bill Joy said that vi was "written for a world that doesn't exist anymore" and stated that Emacs was written on much more capable machines with faster displays so they could have "funny commands with the screen shimmering and all that, and meanwhile, I'm sitting at home in sort of World War II surplus housing at Berkeley with a modem and a terminal that can just barely get the cursor off the bottom line".
In addition to Emacs and vi workalikes, pico and its free and open-source clone nano and other text editors such as ne often have their own third-party advocates in the editor wars, though not to the extent of Emacs and vi.
Both Emacs and vi can lay claim to being among the longest-lived application programs of all time, as well as being the two most commonly used text editors on Linux and Unix. Many operating systems, especially Linux and BSD derivatives, bundle multiple text editors with the operating system to cater to user demand. For example, a default installation of macOS contains Emacs, ed, nano, TextEdit, and Vim. Frequently, at some point in the discussion, someone will point out that ed is the "standard text editor".
The Church of Emacs, formed by Emacs and the GNU Project's creator Richard Stallman, is a parody religion. While it refers to vi as the "editor of the beast" (vi-vi-vi being 6-6-6 in Roman numerals), it does not oppose the use of vi; rather, it calls proprietary software anathema. ("Using a free version of vi is not a sin but a penance.") The Church of Emacs has its own newsgroup, alt.religion.emacs, that has posts purporting to support this belief system.
Stallman has referred to himself as St IGNU-cius, a saint in the Church of Emacs.
Supporters of vi have created an opposing Cult of vi, argued by the more hard-line Emacs users to be an attempt to "ape their betters".
Regarding vi's modal nature (a common point of frustration for new users), some Emacs users joke that vi has two modes: "beep repeatedly" and "break everything". vi users enjoy joking that Emacs's key sequences induce carpal tunnel syndrome, or mentioning one of many satirical expansions of the acronym EMACS, such as "Escape Meta Alt Control Shift" (a jab at Emacs's reliance on modifier keys), "Eight Megabytes And Constantly Swapping" (in a time when that was a great amount of memory), "EMACS Makes Any Computer Slow" (a recursive acronym like those Stallman uses), or "Eventually Munches All Computer Storage", in reference to Emacs's high system resource requirements. GNU EMACS has been expanded to "Generally Not Used, Except by Middle-Aged Computer Scientists", referencing its most ardent fans and its declining usage among younger programmers compared to more graphically oriented editors such as TextMate. The Emacs distribution includes the full list.
As a poke at Emacs' creeping featurism, vi advocates have been known to describe Emacs as "a great operating system, lacking only a decent editor". Emacs advocates have been known to respond that the editor is actually very good, but the operating system could use improvement (referring to Emacs' famous lack of concurrency).
A game among UNIX users, either to test the depth of an Emacs user's understanding of the editor or to poke fun at the complexity of Emacs, involved predicting what would happen if a user held down a modifier key and typed their own name. The joke originated with users of the older TECO editor, which was the implementation basis, via macros, of the original Emacs.
Due to the unintuitive character sequence to exit vi (":q!"), hackers joke about a proposed method of creating a pseudorandom character sequence by having a user unfamiliar with vi seated in front of an open editor and asking them to exit the program.
Electric guitar
An electric guitar is a guitar that requires external amplification in order to be heard at typical performance volumes. It uses one or more pickups to convert the vibration of its strings into electrical signals, which ultimately are reproduced as sound by loudspeakers. The sound can be shaped or electronically altered to achieve different timbres or tonal qualities, often making the result quite different from the sound of an acoustic guitar. Often, this is done through the use of effects such as reverb, distortion and "overdrive"; the latter is considered to be a key element of electric blues guitar music and rock guitar playing.
Invented in 1932, the electric guitar was adopted by jazz guitar players, who wanted to play single-note guitar solos in large big band ensembles. Early proponents of the electric guitar on record include Les Paul, Lonnie Johnson, Sister Rosetta Tharpe, T-Bone Walker, and Charlie Christian. During the 1950s and 1960s, the electric guitar became the most important instrument in popular music. It has evolved into an instrument that is capable of a multitude of sounds and styles in genres ranging from pop and rock to country music, blues and jazz. It served as a major component in the development of electric blues, rock and roll, rock music, heavy metal music and many other genres of music.
Electric guitar design and construction varies greatly in the shape of the body and the configuration of the neck, bridge, and pickups. Guitars may have a fixed bridge or a spring-loaded hinged bridge, which lets players "bend" the pitch of notes or chords up or down, or perform vibrato effects. The sound of an electric guitar can be modified by new playing techniques such as string bending, tapping, and hammering-on, using audio feedback, or slide guitar playing.
There are several types of electric guitar, including: the solid-body guitar; various types of hollow-body guitars; the six-string guitar (the most common type), which is usually tuned E, B, G, D, A, E, from highest to lowest strings; the seven-string guitar, which typically adds a low B string below the low E; the eight-string guitar, which typically adds a low E or F# string below the low B; and the twelve-string guitar, which has six pairs of strings.
In pop and rock music, the electric guitar is often used in two roles: as a rhythm guitar, which plays the chord sequences or progressions, and riffs, and sets the beat (as part of a rhythm section); and as a lead guitar, which provides instrumental melody lines, melodic instrumental fill passages, and solos. In a small group, such as a power trio, one guitarist switches between both roles. In large rock and metal bands, there is often a rhythm guitarist and a lead guitarist.
Many experiments at electrically amplifying the vibrations of a string instrument were made dating back to the early part of the 20th century. Patents from the 1910s show telephone transmitters were adapted and placed inside violins and banjos to amplify the sound. Hobbyists in the 1920s used carbon button microphones attached to the bridge; however, these detected vibration from the bridge on top of the instrument, resulting in a weak signal. With numerous people experimenting with electrical instruments in the 1920s and early 1930s, there are many claimants to have been the first to invent an electric guitar.
Electric guitars were originally designed by acoustic guitar makers and instrument manufacturers. The demand for amplified guitars began during the big band era; as orchestras increased in size, guitar players soon realized the necessity in guitar amplification and electrification. The first electric guitars used in jazz were hollow archtop acoustic guitar bodies with electromagnetic transducers. Early electric guitar manufacturers include Rickenbacker in 1932; Dobro in 1933; National, AudioVox and Volu-tone in 1934; Vega, Epiphone (Electrophone and Electar), and Gibson in 1935 and many others by 1936.
The first electrically amplified stringed instrument to be marketed commercially was designed in 1931 by George Beauchamp, the general manager of the National Guitar Corporation, with Paul Barth, who was vice president. The maple body prototype for the one-piece cast aluminium "frying pan" was built by Harry Watson, factory superintendent of the National Guitar Corporation. George Beauchamp, along with Adolph Rickenbacker, invented the electromagnetic pickups: coils wrapped around a magnet created a magnetic field, and the vibrating steel strings disturbed this field, inducing a corresponding electrical signal in the coils. Commercial production began in late summer of 1932 by the Ro-Pat-In Corporation (Electro-Patent-Instrument Company), in Los Angeles, a partnership of Beauchamp, Adolph Rickenbacker (originally Rickenbacher), and Paul Barth. In 1934, the company was renamed the Rickenbacker Electro Stringed Instrument Company. In that year Beauchamp applied for a United States patent for an "Electrical Stringed Musical Instrument" and the patent was later issued in 1937. By the time it was patented, other manufacturers were already making their own electric guitar designs.
By early-mid 1935, Electro String Instrument Corporation had achieved mainstream success with the "A-22" "Frying Pan" steel guitar, and set out to capture a new audience through its release of the "Electro-Spanish Model B" and the "Electro-Spanish Ken Roberts", which was the first full 25" scale electric guitar ever produced.
The Electro-Spanish Ken Roberts was revolutionary for its time, providing players a full 25" scale, with easy access to 17 frets free of the body.
Unlike other lap-steel electrified instruments produced during the time, the Electro-Spanish Ken Roberts was designed to play standing vertical, upright with a strap. The Electro-Spanish Ken Roberts was also the first instrument to feature a hand-operated vibrato as a standard appointment, a device called the "Vibrola," invented by Doc Kauffman.
It is estimated that fewer than 50 Electro-Spanish Ken Roberts were constructed between 1933 and 1937; fewer than 10 are known to survive today.
The solid-body electric guitar is made of solid wood, without functionally resonating air spaces. The first solid-body Spanish standard guitar was offered by Vivi-Tone no later than 1934. This model featured a guitar-shaped body of a single sheet of plywood affixed to a wood frame. Another early, substantially solid Spanish electric guitar, called the Electro Spanish, was marketed by the Rickenbacker guitar company in 1935 and made of Bakelite. By 1936, the Slingerland company introduced a wooden solid-body electric model, the Slingerland Songster 401 (and a lap steel counterpart, the Songster 400).
Gibson's first production electric guitar, marketed in 1936, was the ES-150 model ("ES" for "Electric Spanish", and "150" reflecting the $150 price of the instrument, along with matching amplifier). The ES-150 guitar featured a single-coil, hexagonally shaped "bar" pickup, which was designed by Walt Fuller. It became known as the "Charlie Christian" pickup (named for the great jazz guitarist who was among the first to perform with the ES-150 guitar). The ES-150 achieved some popularity but suffered from unequal loudness across the six strings.
A functioning solid-body electric guitar was designed and built in 1940 by Les Paul from an Epiphone acoustic archtop, as an experiment. His "log guitar" — a wood post with a neck attached and two hollow-body halves attached to the sides for appearance only — shares nothing in common for design or hardware with the solid-body Gibson Les Paul, designed by Ted McCarty and introduced in 1952.
The feedback associated with amplified hollow-bodied electric guitars was understood long before Paul's "log" was created in 1940; Gage Brewer's Ro-Pat-In of 1932 had a top so heavily reinforced that it essentially functioned as a solid-body instrument.
Early proponents of the electric guitar on record include Alvino Rey (Phil Spitalney Orchestra), Les Paul (Fred Waring Orchestra), George Barnes (under many aliases), Eddie Durham, Lonnie Johnson, Floyd Smith, Sister Rosetta Tharpe, Big Bill Broonzy, T-Bone Walker, George Van Eps, Charlie Christian (Benny Goodman Orchestra), Tampa Red, Memphis Minnie, and Arthur Crudup. According to jazz historian James Lincoln Collier, Floyd Smith can be credited as the first person to rig up an amplified guitar. According to Collier, "Floyd's Guitar Blues" may be the first important use of the electric guitar on record.
Unlike acoustic guitars, solid-body electric guitars have no vibrating soundboard to amplify string vibration. Instead, solid-body instruments depend on electric pickups and an amplifier (or amp) and speaker. The solid body ensures that the amplified sound reproduces the string vibration alone, thus avoiding the wolf tones and unwanted feedback associated with amplified acoustic guitars. These guitars are generally made of hardwood covered with a hard polymer finish, often polyester or lacquer. In large production facilities, the wood is stored for three to six months in a wood-drying kiln before being cut to shape. Premium custom-built guitars are frequently made with much older, hand-selected wood.
One of the first solid-body guitars was invented by Les Paul. Gibson did not present their Gibson Les Paul guitar prototypes to the public, as they did not believe the solid-body style would catch on. Another early solid-body Spanish style guitar, resembling what would become Gibson's Les Paul guitar a decade later, was developed in 1941 by O.W. Appleton, of Nogales, Arizona. Appleton made contact with both Gibson and Fender but was unable to sell the idea behind his "App" guitar to either company. In 1946, Merle Travis commissioned steel guitar builder Paul Bigsby to build him a solid-body Spanish-style electric. Bigsby delivered the guitar in 1948. The first mass-produced solid-body guitars were the Fender Esquire and Fender Broadcaster (the latter later to become the Fender Telecaster), first made in 1948, five years after Les Paul made his prototype. The Gibson Les Paul appeared soon after to compete with the Broadcaster. Another notable solid-body design is the Fender Stratocaster, which was introduced in 1954 and became extremely popular among musicians in the 1960s and 1970s for its wide tonal capabilities and more comfortable ergonomics than other models.
The history of electric guitars is summarized by Guitar World magazine; the earliest electric guitar on their top 10 list is the Ro-Pat-In Electro A-25 "Frying Pan" (1932), described as 'the first fully functioning solid-body electric guitar to be manufactured and sold'. The most recent electric guitar on the list is the Ibanez Jem (1987), which featured '24 frets', 'an impossibly thin neck' and was 'designed to be the ultimate shredder machine'. Numerous other important electric guitars are on the list, including the Gibson ES-150 (1936), Fender Telecaster (1951), Gibson Les Paul (1952), Gretsch 6128 Duo Jet (1953), Fender Stratocaster (1954), Rickenbacker 360/12 (1964), Van Halen Frankenstein (1975), and Paul Reed Smith Custom (1985); many of these guitars were 'successors' to earlier designs. Electric guitar designs eventually became culturally important and visually iconic, with various model companies selling miniature model versions of particularly famous electric guitars, for example the Gibson SG used by Angus Young of the group AC/DC.
Some solid-bodied guitars, such as the Gibson Les Paul Supreme, the PRS Singlecut, and the Fender Telecaster Thinline, are built with hollow chambers in the body. These chambers are designed to not interfere with the critical bridge and string anchor point on the solid body. In the case of Gibson and PRS, these are called "chambered bodies". The motivation for this may be to reduce weight, to achieve a semi-acoustic tone (see below) or both.
Semi-acoustic guitars have a hollow body (similar in depth to a solid-body guitar) and electronic pickups mounted on the body. They work in a similar way to solid-body electric guitars except that, because the hollow body also vibrates, the pickups convert a combination of string and body vibration into an electrical signal. Whereas chambered guitars are made, like solid-body guitars, from a single block of wood, the bodies of semi-acoustic and full hollow-body guitars are made from thin sheets of wood. They do not provide enough acoustic volume for live performance, but they can be used unplugged for quiet practice. Semi-acoustics are noted for being able to provide a sweet, plaintive, or funky tone. They are used in many genres, including blues, funk, sixties pop, and indie rock. They generally have cello-style F-shaped sound holes. These can be blocked off to prevent feedback, as in B. B. King's famous Lucille. Feedback can also be reduced by making them with a solid block in the middle of the soundbox.
Full hollow-body guitars have large, deep bodies made of glued-together sheets, or "plates", of wood. They can often be played at the same volume as an acoustic guitar and therefore can be used unplugged at intimate gigs. They qualify as electric guitars inasmuch as they have fitted pickups. Historically, archtop guitars with retrofitted pickups were among the very earliest electric guitars. The instrument originated during the Jazz Age, in the 1920s and 1930s, and is still considered the classic jazz guitar (nicknamed "jazzbox"). Like semi-acoustic guitars, they often have f-shaped sound holes.
Having humbucker pickups (sometimes just a neck pickup) and usually strung heavily, jazzboxes are noted for their warm, rich tone. A variation with single-coil pickups, and sometimes with a Bigsby tremolo, has long been popular in country and rockabilly; it has a distinctly more twangy, biting tone than the classic jazzbox. The term "archtop" refers to a method of construction subtly different from the typical acoustic (or "folk" or "western" or "steel-string" guitar): the top is formed from a moderately thick piece of wood, which is then carved into a thin domed shape, whereas conventional acoustic guitars have a thin, flat top.
Some steel-string acoustic guitars are fitted with pickups purely as an alternative to using a separate microphone. They may also be fitted with a piezoelectric pickup under the bridge, attached to the bridge mounting plate, or with a low-mass microphone (usually a condenser mic) inside the body of the guitar that converts the vibrations in the body into electronic signals. Combinations of these types of pickups may be used, with an integral mixer/preamp/graphic equalizer. Such instruments are called electric acoustic guitars. They are regarded as acoustic guitars rather than electric guitars, because the pickups do not produce a signal directly from the vibration of the strings, but rather from the vibration of the guitar top or body.
Electric acoustic guitars should not be confused with semi-acoustic guitars, which have pickups of the type found on solid-body electric guitars, or solid-body hybrid guitars with piezoelectric pickups.
Electric guitar design and construction vary greatly in the shape of the body and the configuration of the neck, bridge, and pickups. However, some features are present on most guitars; the numbers below key the parts to a standard labeled diagram of the instrument. The headstock (1) contains the metal machine heads (1.1), which use a worm gear for tuning. The nut (1.4)—a thin fret-like strip of metal, plastic, graphite or bone—supports the strings at the headstock end of the instrument. The frets (2.3) are thin metal strips that stop the string at the correct pitch when the player pushes a string against the fingerboard. The truss rod (1.2) is a metal rod (usually adjustable) that counters the tension of the strings to keep the neck straight. Position markers (2.2) provide the player with a reference to the playing position on the fingerboard.
The neck and fretboard (2.1) extend from the body. At the neck joint (2.4), the neck is either glued or bolted to the body. The body (3) is typically made of wood with a hard, polymerized finish. Strings vibrating in the magnetic field of the pickups (3.1, 3.2) produce an electric current in the pickup winding that passes through the tone and volume controls (3.8) to the output jack. Some guitars have piezo pickups, in addition to or instead of magnetic pickups.
Some guitars have a fixed bridge (3.4). Others have a spring-loaded hinged bridge called a "vibrato bar", "tremolo bar", or "whammy bar", which lets players bend notes or chords up or down in pitch or perform a vibrato embellishment. A plastic pickguard on some guitars protects the body from scratches or covers the control cavity, which holds most of the wiring.
The degree to which the choice of woods and other materials in the solid-guitar body (3) affects the sonic character of the amplified signal is disputed. Many believe it is highly significant, while others think the difference between woods is subtle. In acoustic and archtop guitars, wood choices more clearly affect tone.
Woods typically used in solid-body electric guitars include alder (brighter, but well rounded), swamp ash (similar to alder, but with more pronounced highs and lows), mahogany (dark, bassy, warm), poplar (similar to alder), and basswood (very neutral). Maple, a very bright tonewood, is also a popular body wood, but is very heavy. For this reason it is often placed as a "cap" on a guitar made primarily of another wood. Cheaper guitars are often made of cheaper woods, such as plywood, pine or agathis—not true hardwoods—which can affect durability and tone. Though most guitars are made of wood, any material may be used. Materials such as plastic, metal, and even cardboard have been used in some instruments.
The guitar output jack typically provides a monaural signal. Many guitars with active electronics use a jack with an extra contact normally used for stereo. These guitars use the extra contact to break the ground connection to the on-board battery to preserve battery life when the guitar is unplugged. These guitars require a mono plug to close the internal switch and connect the battery to ground. Standard guitar cables use a high-impedance mono plug. These have a tip and sleeve configuration referred to as a TS phone connector. The voltage is usually around 1 to 9 millivolts.
A few guitars feature stereo output, such as Rickenbacker guitars equipped with "Rick-O-Sound". There are a variety of ways the "stereo" effect may be implemented. Commonly, but not exclusively, stereo guitars route the neck and bridge pickups to separate output buses on the guitar. A stereo cable then routes each pickup to its own signal chain or amplifier. For these applications, the most popular connector is a high-impedance plug with a tip, ring and sleeve configuration, also known as a TRS phone connector. Some studio instruments, notably certain Gibson Les Paul models, incorporate a low-impedance three-pin XLR connector for balanced audio. Many exotic arrangements and connectors exist that support features such as midi and hexaphonic pickups.
The bridge and tailpiece, while serving separate purposes, work closely together to affect playing style and tone. There are four basic types of bridge and tailpiece systems on electric guitars. Within these four types are many variants.
A hard-tail guitar bridge anchors the strings at or directly behind the bridge and is fastened securely to the top of the instrument. These are common on carved-top guitars, such as the Gibson Les Paul and the Paul Reed Smith models, and on slab-body guitars, such as the Music Man Albert Lee and Fender guitars that are not equipped with a vibrato arm.
A "floating" or "trapeze" tailpiece (similar to a violin's) fastens to the body at the base of the guitar. These appear on Rickenbackers, Gretsches, Epiphones, a wide variety of archtop guitars, particularly Jazz guitars, and the 1952 Gibson Les Paul.
A "tremolo arm" or "vibrato tailpiece" style bridge and tailpiece system, often called a "whammy bar" or "trem", uses a lever ("vibrato arm") attached to the bridge that can temporarily slacken or tighten the strings to alter the pitch. A player can use this to create a vibrato or a portamento effect. Early vibrato systems were often unreliable and made the guitar go out of tune easily. They also had a limited pitch range. Later Fender designs were better, but Fender held the patent on these, so other companies used older designs for many years.
With expiration of the Fender patent on the Stratocaster-style vibrato, various improvements on this type of internal, multi-spring vibrato system are now available. Floyd Rose introduced one of the first improvements on the vibrato system in many years when, in the late 1970s, he experimented with "locking" nuts and bridges that prevent the guitar from losing tuning, even under heavy vibrato bar use.
The fourth type of system employs string-through body anchoring. The strings pass over the bridge saddles, then through holes through the top of the guitar body to the back. The strings are typically anchored in place at the back of the guitar by metal ferrules. Many believe this design improves a guitar's sustain and timbre.
A few examples of string-through body guitars are the Fender Telecaster Thinline, the Fender Telecaster Deluxe, the B.C. Rich IT Warlock and Mockingbird, and the Schecter Omen 6 and 7 series.
Compared to an acoustic guitar, which has a hollow body, electric guitars make much less audible sound when their strings are plucked, so electric guitars are normally plugged into a guitar amplifier and speaker. When an electric guitar is played, string movement produces a signal by generating (i.e., inducing) a small electric current in the magnetic pickups, which are magnets wound with coils of very fine wire.
The signal passes through the tone and volume circuits to the output jack, and through a cable to an amplifier. The current induced is proportional to such factors as string density and the amount of movement over the pickups.
Because of their natural inductive qualities, magnetic pickups tend to pick up ambient, usually unwanted electromagnetic interference or EMI. This mains hum results in a tone of 50 or 60 cycles per second depending on the powerline frequency of the local alternating current supply.
The resulting hum is particularly strong with single-coil pickups. Double-coil or "humbucker" pickups were invented as a way to reduce or counter the sound. The high combined inductance of the two coils also leads to the richer, "fatter" tone associated with humbucking pickups.
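Hum rejection can also be sketched in software. The following hedged example applies a narrow digital notch filter at the powerline frequency to a synthetic signal; this is a post-processing illustration only, not how humbuckers work (they cancel hum electromagnetically, by combining two coils of opposite winding and magnetic polarity), and the sample rate and filter Q are assumptions.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 44_100        # assumed sample rate in Hz
hum_hz = 60.0      # use 50.0 in regions with 50 Hz mains
quality = 30.0     # notch bandwidth is hum_hz / quality, i.e. 2 Hz here

t = np.arange(fs) / fs                                 # one second of audio
clean = np.sin(2 * np.pi * 440.0 * t)                  # stand-in guitar note (A4)
noisy = clean + 0.5 * np.sin(2 * np.pi * hum_hz * t)   # add synthetic mains hum

b, a = iirnotch(hum_hz, quality, fs=fs)   # design the notch filter
restored = filtfilt(b, a, noisy)          # zero-phase filtering

# The 60 Hz component is strongly attenuated while the 440 Hz note
# passes nearly intact, so the residual error stays small.
print(float(np.max(np.abs(restored - clean))))
```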
Electric guitar necks vary in composition and shape. The primary metric of guitar necks is the "scale length", which is the vibrating length of the strings from nut to bridge. A typical Fender guitar uses a 25.5-inch scale length, while Gibson uses a 24.75-inch scale length in their "Les Paul". While the scale length of the Les Paul is often described as 24.75 inches, it has varied through the years by as much as a half inch.
Frets are positioned proportionally to scale length—the shorter the scale length, the closer the fret spacing. Opinions vary regarding the effect of scale length on tone and feel. Popular opinion holds that longer scale length contributes to greater amplitude. Reports of playing feel are greatly complicated by the many factors involved in this perception. String gauge and design, neck construction and relief, guitar setup, playing style and other factors contribute to the subjective impression of playability or feel.
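The proportionality between fret spacing and scale length follows directly from equal temperament: fret n lies at a distance L(1 - 2^(-n/12)) from the nut, so every fret position scales linearly with the scale length L. A short sketch, using the two scale lengths mentioned above:

```python
# Fret positions under twelve-tone equal temperament: each fret shortens
# the vibrating length by a factor of 2**(1/12).
def fret_distance_from_nut(scale_length: float, fret: int) -> float:
    return scale_length * (1.0 - 2.0 ** (-fret / 12.0))

for scale in (25.5, 24.75):  # Fender- and Gibson-style scale lengths, in inches
    first = fret_distance_from_nut(scale, 1)
    twelfth = fret_distance_from_nut(scale, 12)
    print(f'{scale}" scale: fret 1 at {first:.2f}", fret 12 at {twelfth:.2f}"')
# Fret 12 lands at exactly half the scale length (the octave), and the shorter
# scale places every fret proportionally closer together.
```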
Necks are described as "bolt-on", "set-in", or "neck-through", depending on how they attach to the body. Set-in necks are glued to the body in the factory. This is the traditional type of joint. Leo Fender pioneered bolt-on necks on electric guitars to facilitate easy adjustment and replacement. Neck-through instruments extend the neck the length of the instrument, so that it forms the center of the body. While a set-in neck can be carefully unglued by a skilled luthier, and a bolt-on neck can simply be unscrewed, a neck-through design is difficult or even impossible to repair, depending on the damage. Historically, the bolt-on style has been more popular for ease of installation and adjustment. Since bolt-on necks can be easily removed, there is an after-market in replacement bolt-on necks from companies such as Warmoth and Mighty Mite. Some instruments—notably most Gibson models—continue to use set-in glued necks. Neck-through bodies are somewhat more common in bass guitars.
Materials for necks are selected for dimensional stability and rigidity, and some allege that they influence tone. Hardwoods are preferred, with maple, mahogany, and ash topping the list. The neck and fingerboard can be made from different materials; for example, a guitar may have a maple neck with a rosewood or ebony fingerboard. In the 1970s, designers began to use exotic man-made materials such as aircraft-grade aluminum, carbon fiber, and ebonol. Makers known for these unusual materials include John Veleno, Travis Bean, Geoff Gould, and Alembic.
Aside from possible engineering advantages, some feel that in relation to the rising cost of rare tonewoods, man-made materials may be economically preferable and more ecologically sensitive. However, wood remains popular in production instruments, though sometimes in conjunction with new materials. Vigier guitars, for example, use a wooden neck reinforced by embedding a light, carbon fiber rod in place of the usual heavier steel bar or adjustable steel truss rod. After-market necks made entirely from carbon fiber fit existing bolt-on instruments. Few, if any, extensive formal investigations have been widely published that confirm or refute claims over the effects of different woods or materials on electric guitar sound.
Several neck shapes appear on guitars, including shapes known as C necks, U necks, and V necks. These refer to the cross-sectional shape of the neck (especially near the nut). Several sizes of fret wire are available, with traditional players often preferring thin frets, and metal shredders liking thick frets. Thin frets are considered better for playing chords, while thick frets allow lead guitarists to bend notes with less effort.
An electric guitar with a folding neck called the "Foldaxe" was designed and built for Chet Atkins by Roger C. Field. Steinberger guitars developed a line of exotic, carbon fiber instruments without headstocks, with tuning done on the bridge instead.
Fingerboards vary as much as necks. The fingerboard surface usually has a cross-sectional radius that is optimized to accommodate finger movement for different playing techniques. Fingerboard radius typically ranges from nearly flat (a very large radius) to radically arched (a small radius). The vintage Fender Telecaster, for example, has a typical small radius of approximately 7.25 inches. Some manufacturers have experimented with fret profile and material, fret layout, number of frets, and modifications of the fingerboard surface for various reasons. Some innovations were intended to improve playability by ergonomic means, such as Warmoth Guitars' compound radius fingerboard. Scalloped fingerboards added enhanced microtonality during fast legato runs. Fanned frets intend to provide each string with an optimal playing tension and enhanced musicality. Some guitars have no frets—and others, like the Gittler guitar, have no neck in the traditional sense.
While an acoustic guitar's sound depends largely on the vibration of the guitar's body and the air inside it, the sound of an electric guitar depends largely on the signal from the pickups. The signal can be "shaped" on its path to the amplifier via a range of effect devices or circuits that modify the tone and characteristics of the signal. Amplifiers and speakers also add coloration to the final sound.
Modern electric guitars most commonly have two or three magnetic pickups. Identical pickups produce different tones depending on location between the neck and bridge. Bridge pickups produce a bright or trebly timbre (i.e. more high frequency content), and neck pickups are warmer or more bassy (i.e. more fundamental frequency content). The type of pickup also affects tone. The sound of dual-coil pickups is often described as warm and thick; conversely, single-coil pickups are often described as sounding clear and bright.
Where there is more than one pickup, a switch selects between the outputs of individual pickups or some combination; two-pickup guitars have three-way switches, and three-pickup guitars have five-way switches. Further circuitry sometimes combines pickups in different ways. For instance, phase switching places one pickup out of phase with the other(s), resulting in destructive interference of lower-frequency harmonics (a sound often described as "thin" or "hollow"). A pickup's timbre can be altered by making changes to the individual pickup's circuit. One such alteration is coil splitting, in which one coil of a humbucker pickup is removed from the circuit in order to produce a tone similar to a single-coil pickup (this switching is often accomplished using push-pull potentiometers).
The final stages of on-board sound-shaping circuitry are the volume control (potentiometer) and tone control (a low-pass filter which "rolls off" the treble frequencies). Where there are individual volume controls for different pickups whose signals can be combined, adjusting the balance between pickups away from a straight 50:50 mix changes the timbre of the final sound.
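As a rough illustration, the passive tone control can be modeled as a first-order RC low-pass filter whose cutoff frequency is f_c = 1/(2*pi*R*C). The component values below are typical-looking assumptions rather than any specific guitar's wiring, and the model ignores pickup inductance and circuit loading, which shift the real response considerably.

```python
import math

def cutoff_hz(resistance_ohms: float, capacitance_farads: float) -> float:
    """Cutoff of an idealized first-order RC low-pass filter."""
    return 1.0 / (2.0 * math.pi * resistance_ohms * capacitance_farads)

pot = 250_000    # assumed 250 kilohm tone potentiometer, fully rolled off
cap = 0.022e-6   # assumed 0.022 microfarad tone capacitor

print(f"{cutoff_hz(pot, cap):.0f} Hz")  # ~29 Hz: nearly all treble removed
```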
The solid-body electric guitar does not produce enough sound for an audience to hear it in a performance setting unless it is electronically amplified, that is, plugged into an amplifier, mixing console, or PA system.
Guitar amplifier design uses a different approach than sound reinforcement system power amplifiers and home "hi-fi" stereo systems. Audio amplifiers generally are intended to accurately reproduce the source signal without adding unwanted tonal coloration (i.e., they have a flat frequency response) or unwanted distortion. In contrast, most guitar amplifiers provide tonal coloration and overdrive or distortion of various types. A common tonal coloration sought by guitarists is rolling off some of the high frequencies.
Guitar amplifiers generally incorporate at least a few effects, the most basic being tone controls for bass and treble. There may be some form of "overdrive" control, where the preamplifier's output is increased to the point where the amplitude overloads the input of the power amplifier stage, causing clipping.
In the 1960s, the tonal palette of the electric guitar was further modified by introducing effect units in the signal path before the amplifier.
Effects units have been created in several formats, the most common of which are the stompbox "pedal" and the rackmount unit. A stomp box (or pedal) is a small metal or plastic box containing the circuitry, which is placed on the floor in front of the musician, and is activated by one or more switches intended to be pressed with the foot. Pedals are smaller than rackmount effects. A rackmount effects unit may contain an electronic circuit nearly identical to a stompbox-based effect, but cased to be mounted in a standard equipment rack. Rack-mount effects units often contain several types of effect. They are controlled by knobs or switches on the front panel or by a MIDI digital control interface.
Typical effects include reverb, distortion, and overdrive.
A multi-effects device is a single electronic effects pedal or rack-mount device that contains many electronic effects. Most of these devices allow users to "pre-set" their desired combinations of effects, offering the ability to easily alter the guitar's tonal dynamics, even mid-song. Some multi-FX pedals contain modelled versions of well-known effects pedals or amplifiers.
By the 1990s, software effects became capable of digitally replicating the analog effects used in the past, with varying degrees of quality.
The sound of a guitar is shaped not only by electronic sound effects but also by a variety of new playing techniques that were developed, or became possible, in combination with electric amplification. These are called extended techniques.
Many techniques, such as axial finger vibrato, pull-offs, hammer-ons, palm muting, harmonics and altered tunings, are also used on the classical and acoustic guitar. Shred guitar is a genre involving a number of extended techniques.
Embryo drawing
Embryo drawing is the illustration of embryos in their developmental sequence. In plants and animals, an embryo develops from a zygote, the single cell that results when an egg and sperm fuse during fertilization. In animals, the zygote divides repeatedly to form a ball of cells, which then forms a set of tissue layers that migrate and fold to form an early embryo. Images of embryos provide a means of comparing embryos of different ages, and species. To this day, embryo drawings are made in undergraduate developmental biology lessons.
Comparing different embryonic stages of different animals is a tool that can be used to infer relationships between species, and thus biological evolution. This has been a source of quite some controversy, both now and in the past. Ernst Haeckel was a pioneer in this field. By comparing different embryonic stages of different vertebrate species, he formulated the recapitulation theory. This theory states that an animal's embryonic development follows exactly the same sequence as the sequence of its evolutionary ancestors. Haeckel's work and the ensuing controversy linked the fields of developmental biology and comparative anatomy into comparative embryology. From a more modern perspective, Haeckel's drawings were the beginnings of the field of evolutionary developmental biology (evo-devo).
The study of comparative embryology aims to prove or disprove that vertebrate embryos of different classes (e.g. mammals vs. fish) follow a similar developmental path due to their common ancestry. Such developing vertebrates have similar genes, which determine the basic body plan. However, further development allows for the distinguishing of distinct characteristics as adults.
In current biology, fundamental research in developmental biology and evolutionary developmental biology is no longer driven by morphological comparisons between embryos, but more by molecular biology. This is partly because Haeckel's drawings were very inaccurate.
The accuracy of Ernst Haeckel's drawings of embryos has caused much controversy, recently among intelligent design proponents and, in the past, among Haeckel's intellectual opponents. Although the early embryos of different species exhibit similarities, Haeckel apparently exaggerated these similarities in support of his recapitulation theory, sometimes known as the Biogenetic Law or "ontogeny recapitulates phylogeny". Furthermore, Haeckel even proposed theoretical life-forms to accommodate certain stages in embryogenesis. A recent review concluded that the "biogenetic law is supported by several recent studies - if applied to single characters only".
Nineteenth-century critics such as Karl von Baer and Wilhelm His did not believe that living embryos reproduce the evolutionary process, and they produced embryo drawings of their own which emphasized the differences in early embryological development. The late 20th and early 21st century critic Stephen Jay Gould objected to the continued use of Haeckel's embryo drawings in textbooks.
On the other hand, Michael K. Richardson, Professor of Evolutionary Developmental Zoology at Leiden University, while recognizing that some criticisms of the drawings are legitimate (indeed, it was he and his co-workers who began the modern criticisms in 1998), has supported the drawings as teaching aids, and has said that "on a fundamental level, Haeckel was correct".
Haeckel's illustrations show vertebrate embryos at different stages of development, which exhibit embryonic resemblance as support for evolution, recapitulation as evidence of the Biogenetic Law, and phenotypic divergence as evidence of von Baer's laws. The series of twenty-four embryos from the early editions of Haeckel's "Anthropogenie" remain the most famous. The different species are arranged in columns, and the different stages in rows. Similarities can be seen along the first two rows; the appearance of specialized characters in each species can be seen in the columns and a diagonal interpretation leads one to Haeckel's idea of recapitulation.
Haeckel's embryo drawings are primarily intended to express his theory of embryonic development, the Biogenetic Law, which in turn assumes (but is not crucial to) the evolutionary concept of common descent. His postulation of embryonic development coincides with his understanding of evolution as a developmental process. In and around 1800, embryology fused with comparative anatomy as the primary foundation of morphology. Ernst Haeckel, along with Karl von Baer and Wilhelm His, are primarily influential in forming the preliminary foundations of ‘phylogenetic embryology’ based on principles of evolution. Haeckel's ‘Biogenetic Law’ portrays the parallel relationship between an embryo's development and phylogenetic history. The term, ‘recapitulation,’ has come to embody Haeckel's Biogenetic Law, for embryonic development is a recapitulation of evolution. Haeckel proposes that all classes of vertebrates pass through an evolutionarily conserved “phylotypic” stage of development, a period of reduced phenotypic diversity among higher embryos. Only in later development do particular differences appear. Haeckel portrays a concrete demonstration of his Biogenetic Law through his "Gastrea" theory, in which he argues that the early cup-shaped gastrula stage of development is a universal feature of multi-celled animals. An ancestral form existed, known as the gastrea, which was a common ancestor to the corresponding gastrula.
Haeckel argues that certain features in embryonic development are conserved and palingenetic, while others are caenogenetic. Caenogenesis represents “the blurring of ancestral resemblances in development,” which are said to be the result of certain adaptations to embryonic life due to environmental changes. In his drawings, Haeckel cites the notochord, pharyngeal arches and clefts, pronephros and neural tube as palingenetic features. However, the yolk sac, extra-embryonic membranes, egg membranes and endocardial tube are considered caenogenetic features. The addition of terminal adult stages and the telescoping, or driving back, of such stages to descendant's embryonic stages are likewise representative of Haeckelian embryonic development. In addressing his embryo drawings to a general audience, Haeckel does not cite any sources, which gives his opponents the freedom to make assumptions regarding the originality of his work.
Haeckel was not the only one to create a series of drawings representing embryonic development. Karl E. von Baer and Haeckel both struggled to model one of the most complex problems facing embryologists at the time: the arrangement of general and special characters during development in different species of animals. In relation to developmental timing, von Baer's scheme of development differs from Haeckel's scheme. Von Baer's scheme of development need not be tied to developmental stages defined by particular characters, where recapitulation involves heterochrony. Heterochrony represents a gradual alteration in the original phylogenetic sequence due to embryonic adaptation.
Von Baer also noted early on that embryos of different species could not be distinguished from one another as easily as adults can be.
Von Baer's laws governing embryonic development are specific rejections of recapitulation. As a response to Haeckel's theory of recapitulation, von Baer enunciates his best-known laws of development. Von Baer's laws state that general features of animals appear earlier in the embryo than special features, where less general features stem from the most general, each embryo of a species departs more and more from a predetermined passage through the stages of other animals, and there is never a complete morphological similarity between an embryo and a lower adult. Von Baer's embryo drawings display that individual development proceeds from general features of the developing embryo in early stages through differentiation into special features specific to the species, establishing that linear evolution could not occur. Embryological development, in von Baer's mind, is a process of differentiation, "a movement from the more homogeneous and universal to the more heterogeneous and individual."
Von Baer argues that embryos will resemble each other before attaining characteristics differentiating them as part of a specific family, genus or species, but embryos are not the same as the final forms of lower organisms.
Wilhelm His was one of Haeckel's most authoritative and primary opponents advocating physiological embryology. His "Anatomie menschlicher Embryonen" (Anatomy of human embryos) employs a series of his most important drawings chronicling developing embryos from the end of the second week through the end of the second month of pregnancy. In 1878, His begins to engage in serious study of the anatomy of human embryos for his drawings. During the 19th century, embryologists often obtained early human embryos from abortions and miscarriages, postmortems of pregnant women and collections in anatomical museums. In order to construct his series of drawings, His collected specimens which he manipulated into a form that he could operate with.
In His’ "Normentafel", he displays specific individual embryos rather than ideal types. His does not produce norms from aborted specimens, but rather visualizes the embryos in order to make them comparable, specifically subjecting his specimens to criticism and comparison with other cases. Ultimately, His’ critical work in embryonic development culminates in his production of a series of embryo drawings of increasing length and degree of development. His’ depiction of embryological development differs strongly from Haeckel's, for His argues that the phylogenetic explanation of ontogenetic events is unnecessary: all ontogenetic events are the “mechanical” result of differential cell growth. His’ embryology is not explained in terms of ancestral history.
The debate between Haeckel and His was ultimately fueled by Wilhelm Krause's description of an embryo, which Krause propelled directly into the ongoing feud. Haeckel speculates that the allantois is formed in a similar way in both humans and other mammals. His, on the other hand, accuses Haeckel of altering and playing with the facts. Although Haeckel is proven right about the allantois, the use of Krause's embryo as justification turns out to be problematic, for the embryo is that of a bird rather than a human. The underlying debate between Haeckel and His derives from differing viewpoints regarding the similarity or dissimilarity of vertebrate embryos. In response to Haeckel's evolutionary claim that the essential identity of all vertebrates in the first month of embryonic life is proof of common descent, His insists that a more skilled observer would recognize even sooner that early embryos can be distinguished. His also counters Haeckel's sequence of drawings in the "Anthropogenie" with what he refers to as “exact” drawings highlighting specific differences. Ultimately, His goes so far as to accuse Haeckel of “faking” his embryo illustrations to make the vertebrate embryos appear more similar than they are in reality, and of creating early human embryos conjured in his imagination rather than obtained through empirical observation. His completes his denunciation by pronouncing that Haeckel had “‘relinquished the right to count as an equal in the company of serious researchers.’”
Haeckel encountered substantial opposition to his artistic depictions of embryonic development during the late nineteenth and early twentieth centuries. His opponents believed that he de-emphasized the differences between early embryonic stages in order to make the similarities between embryos of different species appear more pronounced.
The first suggestion of fakery against Haeckel was made in late 1868 by Ludwig Rütimeyer in the "Archiv für Anthropologie". Rütimeyer was a professor of zoology and comparative anatomy at the University of Basel who rejected natural selection as merely mechanistic and proposed an anti-materialist view of nature. Rütimeyer claimed that Haeckel had taken liberties with established truth, and that he had presented the same image three consecutive times as the embryo of the dog, the chicken, and the turtle.
Theodor Bischoff (1807–1882) was a strong opponent of Darwinism. As a pioneer in mammalian embryology, he was one of Haeckel's strongest critics. Although Bischoff's 1840 surveys depict how similar the early embryos of humans are to those of other vertebrates, he later contended that such hasty generalization was inconsistent with his more recent findings on the dissimilarity between hamster embryos and those of rabbits and dogs. Nevertheless, Bischoff's main argument concerned Haeckel's drawings of human embryos, for Haeckel was later accused of miscopying the dog embryo from him. Throughout Haeckel's time, criticism of his embryo drawings was often due in part to his critics' belief that his representations of embryological development were “crude schemata.”
Michael Richardson and his colleagues, in a July 1997 issue of "Anatomy and Embryology", demonstrated that Haeckel falsified his drawings in order to exaggerate the similarity of the phylotypic stage.
In a March 2000 issue of "Natural History", Stephen Jay Gould argued that Haeckel "exaggerated the similarities by idealizations and omissions," and that his drawings are simply inaccurate and falsified. On the other hand, Michael Richardson, one of those who criticized Haeckel's drawings, has argued that "Haeckel's much-criticized drawings are important as phylogenetic hypotheses, teaching aids, and evidence for evolution".
But even Richardson admitted in "Science" magazine in 1997 that his team's investigation showed Haeckel's drawings to be "one of the most famous fakes in biology."
Some versions of Haeckel's drawings can still be found in many modern biology textbooks in discussions of the history of embryology, with clarification that they are no longer considered valid.
Although Charles Darwin accepted Haeckel's support for natural selection, he was tentative in using Haeckel's ideas in his writings; with regard to embryology, Darwin relied far more on von Baer's work. Haeckel's work was published in 1866 and 1874, years after Darwin's "The Origin of Species" (1859).
Despite this opposition, Haeckel has influenced many disciplines in science through his drive to integrate such disciplines as taxonomy and embryology into the Darwinian framework and to investigate phylogenetic reconstruction through his Biogenetic Law. Haeckel also served as a mentor to many important scientists, including Anton Dohrn, Richard and Oscar Hertwig, Wilhelm Roux, and Hans Driesch.
One of Haeckel's earliest proponents was Carl Gegenbaur at the University of Jena (1865–1873), a period during which both men were absorbing the impact of Darwin's theory. The two quickly sought to integrate their knowledge into an evolutionary program. In determining the relationships between "phylogenetic linkages" and "evolutionary laws of form," both Gegenbaur and Haeckel relied on a method of comparison. As Gegenbaur argued, the task of comparative anatomy lies in explaining the form and organization of the animal body in order to provide evidence for the continuity and evolution of a series of organs in the body. Haeckel then provided a means of pursuing this aim with his biogenetic law, in which he proposed to compare an individual's various stages of development with its ancestral line. Although Haeckel stressed comparative embryology and Gegenbaur promoted the comparison of adult structures, both believed that the two methods could work in conjunction toward the goal of evolutionary morphology.
The philologist and anthropologist Friedrich Müller used Haeckel's concepts as a source for his ethnological research, which involved the systematic comparison of the folklore, beliefs and practices of different societies. Müller's work relies on theoretical assumptions very similar to Haeckel's and reflects the German practice of maintaining strong connections between empirical research and the philosophical framework of science. Language is particularly important, for it establishes a bridge between natural science and philosophy. For Haeckel, language specifically represented the concept that all phenomena of human development relate to the laws of biology. Although Müller did not specifically advocate Haeckel's embryo drawings, the two shared a common understanding of development from lower to higher forms, for Müller saw humans as the last link in an endless chain of evolutionary development.
Modern acceptance of Haeckel's Biogenetic Law, despite the current rejection of Haeckelian views, finds support in a certain degree of parallelism between ontogeny and phylogeny. A. M. Khazen states that "ontogeny is obliged to repeat the main stages of phylogeny," while A. S. Rautian argues that the reproduction of ancestral patterns of development is a key aspect of certain biological systems. Rolf Siewing acknowledges the similarity of embryos in different species, along with the laws of von Baer, but does not believe that one should compare embryos with adult stages of development. According to M. S. Fischer, reconsideration of the Biogenetic Law is possible as a result of two fundamental innovations in biology since Haeckel's time: cladistics and developmental genetics.
In defense of Haeckel's embryo drawings, the principal argument is that of "schematisation": the drawings were not intended as technical and scientific depictions, but rather as schematic drawings and reconstructions for a lay audience. Therefore, as R. Gursch argues, Haeckel's embryo drawings should be regarded as "reconstructions"; although they are open to criticism, they should not be considered falsifications of any sort. Modern defenses of Haeckel's embryo drawings still acknowledge their inaccuracy, but regard charges of fraud as unreasonable. As Erland Nordenskiöld argues, charges of fraud against Haeckel are unnecessary. R. Bender goes so far as to reject His’ claims regarding the fabrication of certain stages of development in Haeckel's drawings, arguing that, compared with published embryos, Haeckel's drawings are faithful representations of real stages of embryonic development.
Haeckel's embryo drawings, as comparative plates, were at first copied only into biology textbooks rather than into texts on the study of embryology. Even though Haeckel's program in comparative embryology virtually collapsed after the First World War, his embryo drawings have often been reproduced and redrawn with increased precision and accuracy in works that have kept the study of comparative embryology alive. Nevertheless, neither His-inspired human embryology nor developmental biology is concerned with the comparison of vertebrate embryos. Although Stephen Jay Gould's 1977 book "Ontogeny and Phylogeny" helped to reassess Haeckelian embryology, it did not address the controversy over Haeckel's embryo drawings. Still, new interest in evolution in and around 1977 inspired developmental biologists to look more closely at Haeckel's illustrations.
Enthalpy
Enthalpy, a property of a thermodynamic system, is the sum of the system's internal energy and the product of its pressure and volume. In a system contained so as to prevent mass transfer, for processes at constant pressure, the heat absorbed or released equals the change in enthalpy.
The unit of measurement for enthalpy in the International System of Units (SI) is the joule. Other historical conventional units still in use include the British thermal unit (BTU) and the calorie.
Enthalpy comprises a system's internal energy, which is the energy required to create the system, plus the amount of work required to make room for it by displacing its environment and establishing its volume and pressure.
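In symbols, with U the internal energy, p the pressure and V the volume of the system:

$$ H = U + pV $$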
Enthalpy is a state function that depends only on the prevailing equilibrium state identified by the system's internal energy, pressure, and volume. It is an extensive quantity.
Change in enthalpy (ΔH) is the preferred expression of system energy change in many chemical, biological, and physical measurements at constant pressure, because it simplifies the description of energy transfer. In a system enclosed so as to prevent matter transfer, at constant pressure, the enthalpy change equals the energy transferred from the environment through heat transfer or work other than expansion work.
The total enthalpy, H, of a system cannot be measured directly. The same situation exists in classical mechanics: only a change or difference in energy carries physical meaning. Enthalpy itself is a thermodynamic potential, so in order to measure the enthalpy of a system, we must refer to a defined reference point; therefore what we measure is the change in enthalpy, ΔH. The change ΔH is positive in endothermic reactions, and negative in heat-releasing, exothermic processes.
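Written out, with H_i and H_f denoting the enthalpies of the initial and final states of the process:

$$ \Delta H = H_f - H_i, \qquad \Delta H > 0 \ \text{(endothermic)}, \qquad \Delta H < 0 \ \text{(exothermic)} $$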
For processes under constant pressure, ΔH is equal to the change in the internal energy of the system plus the pressure-volume work done by the system on its surroundings (which is positive for an expansion and negative for a contraction). This means that the change in enthalpy under such conditions is the heat absorbed or released by the system through a chemical reaction or by external heat transfer. Enthalpies for chemical substances at constant pressure usually refer to the standard state: most commonly a pressure of 1 bar (100 kPa). The standard state does not, strictly speaking, specify a temperature (see standard state), but expressions for enthalpy generally reference the standard heat of formation at 25 °C (298.15 K).
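For a closed system at constant pressure that performs only pressure-volume work, this reads:

$$ \Delta H = \Delta U + p\,\Delta V = Q_p, $$

where Q_p is the heat exchanged with the surroundings at constant pressure.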
The enthalpy of an ideal gas is a function of temperature only, so it does not depend on pressure. Real materials at common temperatures and pressures usually closely approximate this behavior, which greatly simplifies enthalpy calculation and use in practical designs and analyses.
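As an illustration of this simplification, here is a minimal Python sketch (not part of the source text; the function name and example values are chosen for illustration) that computes the enthalpy change of an ideal gas from a constant molar heat capacity, an assumption that holds only over modest temperature ranges:

```python
R = 8.314  # universal gas constant, J/(mol*K)

def enthalpy_change(n_mol: float, c_p: float, t1: float, t2: float) -> float:
    """Enthalpy change (in joules) for n_mol of an ideal gas with constant
    molar heat capacity c_p (J/(mol*K)) taken from temperature t1 to t2 (K).
    Because an ideal gas's enthalpy depends on temperature alone,
    dH = n * c_p * dT holds regardless of how the pressure varies."""
    return n_mol * c_p * (t2 - t1)

# Example: 1 mol of a diatomic ideal gas (c_p = 7/2 R) heated from 300 K to 400 K.
print(f"dH = {enthalpy_change(1.0, 3.5 * R, 300.0, 400.0):.0f} J")  # ~2910 J
```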
The word "enthalpy" was coined relatively late, in the early 20th century, in analogy with the 19th-century terms "energy" (introduced in its modern sense by Thomas Young in 1802) and "entropy" (coined in analogy to "energy" by Rudolf Clausius in 1865). Where "energy" uses the root of the Greek word ("ergon") "work" to express the idea of "work-content" and where "entropy" uses the Greek word ("tropē") "transformation" to express the idea of "transformation-content", so by analogy, "enthalpy" uses the root of the Greek word ("thalpos") "warmth, heat" to express the idea of "heat-content".
The term does in fact stand in for the older term "heat content", which is now mostly deprecated as misleading, because the enthalpy change equals the heat absorbed only for a process at constant pressure, not in the general case (when pressure is variable).
Josiah Willard Gibbs used the term "a heat function for constant pressure" for clarity.
Introduction of the concept of "heat content" is associated with Benoît Paul Émile Clapeyron and Rudolf Clausius (Clausius–Clapeyron relation, 1850).
The term "enthalpy" first appeared in print in 1909. It is attributed to Heike Kamerlingh Onnes, who most likely introduced it orally the year before, at the first meeting of the Institute of Refrigeration in Paris. | https://en.wikipedia.org/wiki?curid=10274 |
Erdoğan Atalay
Erdoğan Atalay (born September 22, 1966 in Hanover, Germany) is a Turkish-German actor.
In 2017 he married his girlfriend and manager Katja Ohneck.
He made his first appearance as a minor actor in "Aladdin and the Miracle Lamp" at the National Theatre of Hanover before studying acting at the Hochschule für Musik und Theater Hamburg. Afterwards he took on his first parts in several German television series such as "Music Groschenweise", "Employment for Lohbeck", "Double Employment" and "The Guard". In 1996 Action Concept engaged him for the role he has played successfully ever since: Semir Gerkhan in the German television series "Alarm für Cobra 11 – Die Autobahnpolizei".
Ennio Morricone
Ennio Morricone, OMRI (born 10 November 1928) is an Italian composer, orchestrator, conductor, and former trumpet player, writing in a wide range of musical styles. Since 1961, Morricone has composed over 400 scores for cinema and television, as well as over 100 classical works. His score to "The Good, the Bad and the Ugly" (1966) is considered one of the most influential soundtracks in history and was inducted into the Grammy Hall of Fame. His filmography includes over 70 award-winning films, including all Sergio Leone films (since "A Fistful of Dollars"), all Giuseppe Tornatore films (since "Cinema Paradiso"), "The Battle of Algiers", Dario Argento's "Animal Trilogy", "1900", "Days of Heaven", several major films in French cinema, in particular the "La Cage aux Folles" comedy trilogy and "Le Professionnel", as well as "The Thing", "The Mission", "The Untouchables", "Mission to Mars", "Bugsy", "Disclosure", "In the Line of Fire", "Bulworth", "Ripley's Game" and "The Hateful Eight".
After playing the trumpet in jazz bands in the 1940s, he became a studio arranger for RCA Victor and in 1955 started ghost writing for film and theatre. Throughout his career, he has composed music for artists such as Paul Anka, Mina, Milva, Zucchero and Andrea Bocelli. From 1960 to 1975, Morricone gained international fame for composing music for Westerns and—with an estimated 10 million copies sold—"Once Upon a Time in the West" is one of the best-selling scores worldwide. From 1966 to 1980, he was a main member of Il Gruppo, one of the first experimental composers collectives, and in 1969 he co-founded Forum Music Village, a prestigious recording studio. From the 1970s, Morricone excelled in Hollywood, composing for prolific American directors such as Don Siegel, Mike Nichols, Brian De Palma, Barry Levinson, Oliver Stone, Warren Beatty, John Carpenter and Quentin Tarantino. In 1977, he composed the official theme for the 1978 FIFA World Cup. He continued to compose music for European productions, such as "Marco Polo", "La piovra", "Nostromo", "Fateless" and "En mai, fais ce qu'il te plait". Morricone's music has been reused in television series, including "The Simpsons" and "The Sopranos", and in many films, including "Inglourious Basterds" and "Django Unchained". He also scored seven Westerns for Sergio Corbucci, Duccio Tessari's "Ringo" duology and Sergio Sollima's "The Big Gundown" and "Face to Face". Morricone worked extensively for other film genres with directors such as Bernardo Bertolucci, Mauro Bolognini, Giuliano Montaldo, Roland Joffé, Roman Polanski and Henri Verneuil. His acclaimed soundtrack for "The Mission" (1986) was certified gold in the United States. The album "Yo-Yo Ma Plays Ennio Morricone" stayed 105 weeks on the "Billboard" Top Classical Albums.
Morricone's best-known compositions include "The Ecstasy of Gold", "Se Telefonando", "Man with a Harmonica", "Here's to You", the UK No. 2 single "Chi Mai", "Gabriel's Oboe" and "E Più Ti Penso". In 1971, he received a "Targa d'Oro" for worldwide sales of 22 million, and by 2016 Morricone had sold over 70 million records worldwide. In 2007, he received the Academy Honorary Award "for his magnificent and multifaceted contributions to the art of film music." He has been nominated for a further six Oscars. In 2016, Morricone received his first competitive Academy Award for his score to Quentin Tarantino's film "The Hateful Eight", at the time becoming the oldest person ever to win a competitive Oscar. His other achievements include three Grammy Awards, three Golden Globes, six BAFTAs, ten David di Donatello, eleven Nastro d'Argento, two European Film Awards, the Golden Lion Honorary Award and the Polar Music Prize in 2010. Morricone has influenced many artists from film scoring to other styles and genres, including Hans Zimmer, Danger Mouse, Dire Straits, Muse, Metallica, and Radiohead.
Morricone was born in Rome, the son of Libera Ridolfi and Mario Morricone, a musician. His family came from Arpino, near Frosinone. Morricone, who had four siblings, Adriana, Aldo (who died accidentally before turning four years old, owing to his nanny's mistakenly feeding him cherries, to which he was severely allergic), Maria and Franca, lived in Trastevere, in the centre of Rome, with his parents. Mario, his father, was a trumpet player who worked professionally in different light-music orchestras, while his mother Libera set up a small textile business.
His first teacher was his father Mario Morricone, who taught him how to read music and also to play several instruments. Compelled to take up the trumpet, he entered the National Academy of St Cecilia, to take trumpet lessons under the guidance of Umberto Semproni.
Morricone formally entered the conservatory in 1940 at age 12, enrolling in a four-year harmony program. He completed it within six months. He studied the trumpet, composition, and choral music, under the direction of Goffredo Petrassi, who influenced him; Morricone has since dedicated his concert pieces to Petrassi. In 1941, Morricone was chosen among the students of the National Academy of St Cecilia to be a part of the Orchestra of the Opera directed by Carlo Zecchi on the occasion of a tour of the Veneto region. In 1946, he received his diploma in trumpet. After he graduated, he continued to work in classical composition and arrangement.
Having received the "Diploma in Instrumentation for Band Arrangement" (fanfare) with a mark of 9/10 in 1952, the composer concluded his studies at the Conservatory of Santa Cecilia in 1954, obtaining a final mark of 9.5/10 in his Diploma in Composition under the composer Goffredo Petrassi.
Morricone wrote his first compositions when he was six years old and was encouraged to develop his natural talents. In 1946, he composed "Il Mattino" ("The Morning") for voice and piano on a text by Fukuko, first in a group of seven "youth" Lieder.
In the following years, he continued to write music for the theatre as well as classical music for voice and piano, such as "Imitazione", based on a text by Italian poet Giacomo Leopardi, "Intimità", based on a text by Olinto Dini, "Distacco I" and "Distacco II" with words by R. Gnoli, "Oboe Sommerso" for baritone and five instruments with words by poet Salvatore Quasimodo and "Verrà la Morte", for contralto and piano, based on a text by novelist Cesare Pavese.
In 1953, Morricone was asked by Gorni Kramer and Lelio Luttazzi to write an arrangement for some medleys in an American style for a series of evening radio shows. The composer continued with the composition of other 'serious' classical pieces, thus demonstrating the flexibility and eclecticism which has always been an integral part of his character. Many orchestral and chamber compositions date, in fact, from the period between 1954 and 1959: "Musica per archi e pianoforte" (1954), "Invenzione, Canone e Ricercare per piano"; "Sestetto per flauto, oboe, fagotto, violino, viola e violoncello" (1955), "Dodici Variazione per oboe, violoncello e piano"; "Trio per clarinetto, corno e violoncello"; "Variazione su un tema di Frescobaldi" (1956); "Quattro pezzi per chitarra" (1957); "Distanze per violino, violoncello e piano"; "Musica per undici violini, Tre Studi per flauto, clarinetto e fagotto" (1958); and the "Concerto per orchestra" (1957), dedicated to his teacher Goffredo Petrassi.
Morricone soon gained popularity by writing his first background music for radio dramas and quickly moved into film.
Composing for radio, television and pop artists
Morricone's career as an arranger started in 1950 with the piece "Mamma Bianca" (Narciso Parigi). On the occasion of the "Anno Santo" (Holy Year), he arranged a large group of popular songs of devotion for radio broadcasting.
In 1956, Morricone started to support his family by playing in a jazz band and arranging pop songs for the Italian broadcasting service RAI. He was hired by RAI in 1958, but quit his job on his first day at work when he was told that broadcasting of music composed by employees was forbidden by a company rule. Subsequently, Morricone became a top studio arranger at RCA Victor, working with Renato Rascel, Rita Pavone, Domenico Modugno and Mario Lanza.
Throughout his career Morricone has composed songs for several national and international jazz and pop artists. In 1962 Morricone worked with American jazz singer Helen Merrill as an arranger on the EP "Helen Merrill Sings Italian Songs" on the RCA Italiana label. Among the artists he wrote and arranged for were Gianni Morandi ("Go Kart Twist", 1962), Alberto Lionello ("La donna che vale", 1959), Edoardo Vianello ("Ornella", 1960; "Cicciona cha-cha", 1960; "Faccio finta di dormire", 1961; "T'ho conosciuta", 1963), Nora Orlandi ("Arianna", 1960), Jimmy Fontana ("Twist no. 9"; "Nicole", 1962), Rita Pavone ("Come te non c'è nessuno" and "Pel di carota" from 1962, arranged by Luis Bacalov), Catherine Spaak ("Penso a te"; "Questi vent'anni miei", 1964), Luigi Tenco ("Quello che conta"; "Tra tanta gente", 1962), Gino Paoli ("Nel corso", from 1963, written by Morricone with Paoli), Renato Rascel ("Scirocco", 1964), Paul Anka ("Ogni Volta"), Amii Stewart, Rosy Armen ("L'Amore Gira"), Milva ("Ridevi", "Metti Una Sera A Cena"), Françoise Hardy ("Je changerais d'avis", 1966), Mireille Mathieu ("Mon ami de toujours"; "Pas vu, pas pris", 1971; "J'oublie la pluie et le soleil", 1974) and Demis Roussos ("I Like The World", 1970).
In 1963, the composer co-wrote (with Roby Ferrante) the music for the composition "Ogni volta" ("Every Time"), a song that was performed by Paul Anka for the first time during the Festival di San Remo in 1964. This song was arranged and conducted by Morricone and sold over three million copies worldwide, including one million copies in Italy alone.
Another particular success was his composition "Se telefonando." Performed by Mina, it was a standout track of "Studio Uno 66", the fifth-biggest-selling album of 1966 in Italy. Morricone's sophisticated arrangement of "Se telefonando" was a combination of melodic trumpet lines, Hal Blaine–style drumming, a string set, a '60s Europop female choir, and intensive subsonic-sounding trombones. The song, which reached No. 7 on the Italian hit parade, had eight transitions of tonality, building tension throughout the chorus. During the following decades, the song was covered by several performers in Italy and abroad, most notably by Françoise Hardy and Iva Zanicchi (1966), Delta V (2005), Vanessa and the O's (2007), and Neil Hannon (2008). In a reader's poll conducted by the newspaper la Repubblica to celebrate Mina's 70th birthday in 2010, 30,000 voters picked the track as the best song ever recorded by Mina.
In 1987, Morricone co-wrote 'It Couldn't Happen Here' with the Pet Shop Boys. Other notable compositions for international artists include: "La metà di me" and "Immagina" (1988) by Ruggero Raimondi, "Libera l'amore" (1989) performed by Zucchero, "Love Affair" (1994) by k.d. lang, "Ha fatto un sogno" (1997) by Antonello Venditti, "Di Più" (1997) by Tiziana Tosca Donati, "Come un fiume tu" (1998), "Un Canto" (1998) and "Conradian" (2006) by Andrea Bocelli, "Ricordare" (1998) and "Salmo" (2000) by Angelo Branduardi and "My heart and I" (2001) by Sting.
After graduating in 1954, Morricone started writing and arranging music as a ghost writer for films credited to other already well-known composers, while also arranging for many light music orchestras of the RAI television network, working most notably with Armando Trovajoli, Alessandro Cicognini and Carlo Savina. He occasionally adopted Anglicized pseudonyms, such as Dan Savio and Leo Nichols.
In 1959, Morricone was the conductor (and uncredited co-composer) for Mario Nascimbene's score to "Morte Di Un Amico" (Death of a Friend), an Italian drama directed by Franco Rossi. In the same year, he composed music for the theatre show "Il Lieto Fine" by Luciano Salce.
The 1960s began on a positive note: 1961 in fact marked his real film debut, with Luciano Salce's "Il Federale" (The Fascist). In an interview with American composer Fred Karlin, Morricone discussed his beginnings, stating, "My first films were light comedies or costume movies that required simple musical scores that were easily created, a genre that I never completely abandoned even when I went on to much more important films with major directors".
"Il Federale" marked the beginning of a long-run collaboration with Luciano Salce. In 1962 Morricone composed the jazz-influenced score for Salce's comedy "La voglia matta (Crazy Desire)". That year Morricone arranged also Italian singer Edoardo Vianello's summer hit "Pinne, Fucile e Occhiali", a cha-cha song, peppered with added water effects, unusual instrumental sounds and unexpected stops and starts.
Morricone wrote more works in the climate of the Italian avant-garde. A few of these compositions have been made available on CD, such as "Ut", his trumpet concerto dedicated to the soloist Mauro Maur, one of his favorite musicians; some have yet to be premiered.
From 1964 up to their eventual disbandment in 1980, he was part of "Gruppo di Improvvisazione di Nuova Consonanza" (G.I.N.C.), a group of composers who performed and recorded avant-garde free improvisations. The Rome-based avant-garde ensemble was dedicated to the development of improvisation and new music methods. The ensemble functioned as a laboratory of sorts, working with anti-musical systems and sound techniques in an attempt to redefine the new music ensemble and explore "New Consonance."
Known as "The Group" or "Il Gruppo," they released seven albums across the Deutsche Grammophon, RCA and Cramps labels: "Gruppo di Improvvisazione Nuova Consonanza" (1966), "The Private Sea of Dreams" (1967), "Improvisationen" (1968), "The Feed-back" (1970), "Improvvisazioni a Formazioni Variate" (1973), "Nuova Consonanza" (1975) and "Musica su Schemi" (1976). Perhaps the most famous of these is their album entitled "The Feed-back", which combines free jazz and avant-garde classical music with funk; the album is frequently sampled by hip hop DJs and is considered to be one of the most collectable records in existence, often fetching over $1,000 at auction.
Morricone played a key role in The Group and was among the core members in its revolving line-up; in addition to serving as their trumpet player, he directed them on many occasions and they can be heard on a large number of his scores from the 1970s.
Held in high regard in avant-garde music circles, they are considered to be the first experimental composers collective, their only peers being the British improvisation collective AMM. Their influence can be heard in free improvising ensembles from the European movements including Evan Parker Electro-Acoustic Ensemble, the Swiss electronic free improvisation group Voice Crack, John Zorn and in the techniques of modern classical music and avant-garde jazz groups. The ensemble's groundbreaking work informed their work in composition. The ensemble also performed in varying capacities with Morricone, contributing to some of his '60s and '70s Italian soundtracks, including "A Quiet Place in the Country" (1969) and "Cold Eyes of Fear" (1971).
His earliest scores were Italian light comedy and costume pictures, where Morricone learned to write simple, memorable themes. During the sixties and seventies he composed the scores for comedies such as "Diciottenni al sole" (1962), "Il Successo" (1963), Lina Wertmüller's "I basilischi" (1963), "Slalom" (1965), "Menage all'italiana" (1965), "How I Learned to Love Women" (1966), "L'harem" (1967), "A Fine Pair" (1968), "L'Alibi" (1969), "Questa specie d'amore" (1972), "Forza "G"" (1972) and "Fiorina la vacca" (1972).
His best-known scores for comedies includes "La Cage aux Folles" (1978) and "La Cage aux Folles II" (1980), both directed by Édouard Molinaro, "Il ladrone" (1980), Georges Lautner's "" (1985), Pedro Almodóvar's "Tie Me Up! Tie Me Down!" (1990) and Warren Beatty's "Bulworth" (1998). Morricone has never ceased to arrange and write music for comedies. In 2007, he composed a lighthearted score for the Italian romantic comedy "Tutte le Donne della mia Vita" by Simona Izzo, the director who co-wrote the Morricone-scored religious mini-series "Il Papa Buono".
Though his first films were undistinguished, Morricone's arrangement of an American folk song intrigued director and former schoolmate Sergio Leone. Before being associated with Leone, Morricone had already composed some music for less-known western movies such as "Duello nel Texas" (aka "Gunfight at Red Sands") (1963). In 1962, Morricone met American folksinger Peter Tevis, who is credited with singing the lyrics of Morricone's songs such as "A Gringo Like Me" (from "Gunfight at Red Sands") and "Lonesome Billy" (from "Bullets Don't Argue").
The turning point in Morricone's career took place in 1964, the year in which his third child, Andrea Morricone, who would also become a film composer, was born. Film director Sergio Leone hired Morricone, and together they created a distinctive score to accompany Leone's different version of the Western, "A Fistful of Dollars" (1964).
The "Dollars" trilogy
Because budget strictures limited Morricone's access to a full orchestra, he used gunshots, cracking whips, whistling, voices, a jew's harp, trumpets, and the new Fender electric guitar, instead of orchestral arrangements of Western standards à la John Ford. Morricone used his special effects to punctuate and comically tweak the action, cluing in the audience to the taciturn man's ironic stance. Though sonically bizarre for a movie score, Morricone's music was viscerally true to Leone's vision.
As memorable as Leone's close-ups, harsh violence, and black comedy, Morricone's work helped to expand the musical possibilities of film scoring. Morricone was initially billed on the film as Dan Savio. "A Fistful of Dollars" came out in Italy in 1964 and was released in America three years later, greatly popularizing the so-called Spaghetti Western genre. For the American release, Sergio Leone and Ennio Morricone adopted American-sounding names, calling themselves Bob Robertson and Dan Savio respectively. Over its theatrical release, the film grossed more than any other Italian film up to that point. It debuted in the United States in January 1967 and eventually grossed $14.5 million in its American release, against its budget of $200,000–250,000.
With the score of "A Fistful of Dollars", Morricone began his 20-year collaboration with his childhood friend Alessandro Alessandroni and his Cantori Moderni. Alessandroni provided the whistling and the twanging guitar on the film scores, while his Cantori Moderni were a flexible troupe of modern singers. Morricone specifically exploited the solo soprano of the group, Edda Dell'Orso, at the height of her powers, calling her "an extraordinary voice at my disposal".
The composer subsequently scored Leone's other two "Dollars Trilogy" (or "Man With No Name Trilogy") spaghetti westerns: "For a Few Dollars More" (1965) and "The Good, the Bad and the Ugly" (1966). All three films starred the American actor Clint Eastwood as "The Man With No Name" and depicted Leone's own intense vision of the mythical West. Unusually, some of the music was written before the films were shot: Leone wanted the music to be an important part of his films, and he kept scenes longer because he did not want the music to end. According to Morricone, this explains why the films are so slow.
Despite the small film budgets, the "Dollars Trilogy" was a box-office success. The available budget for "The Good, the Bad and the Ugly" was about $1.2 million, but it became the most successful film of the trilogy, grossing $25.1 million in the United States and over 2.3 billion lire (1.2 million EUR) in Italy alone. Morricone's score became a major success and sold over three million copies worldwide, earning him over 200 million dollars. On 14 August 1968 the original score was certified gold by the RIAA for the sale of 500,000 copies in the United States alone.
The main theme of "For a Few Dollars More" ("Per Qualche Dollaro in Più") was covered by Hugo Montenegro ("For a Few Dollars More"), Babe Ruth ("Theme From a Few Dollars More"), Golden Palominos ("For A Few Dollars More"), Material ("For a Few Dollars More"), and Matti Heinivaho ("Arosusi"). More recently, a Techno-Industrial cover was done by Komor Kommando ("Hasta Luego"). A remix was done by Terranova, "For a Few Dollars More (Terranova Remix)".
Hugo Montenegro's version of the main theme of "The Good, the Bad and the Ugly" sold over one million copies worldwide. Montenegro's album with the same name included a selection of Morricone's compositions from the "Dollars Trilogy". In the United States, the album was certified gold by the RIAA on 9 September 1969. The main theme was later sampled by artists such as Cameo ("Word Up!"), Bomb the Bass and LL Cool J.
"The Ecstasy of Gold" became one of Morricone's best-known compositions. The opening scene of Jeff Tremaine's "Jackass Number Two" (2006), in which the cast is chased through a suburban neighborhood by bulls, is accompanied by this piece. While punk rock band the Ramones used "The Ecstasy of Gold" as closing theme during their live performances, Metallica uses "The Ecstasy of Gold" as the introductory music for its concerts since 1983 This composition is also included on Metallica's live symphonic album "S&M" as well as the live album "". An instrumental metal cover by Metallica (with minimal vocals by lead singer James Hetfield) appeared on the 2007 Morricone tribute album "We All Love Ennio Morricone". This metal version was nominated for a Grammy Award in the category of Best Rock Instrumental Performance. In 2009, the Grammy Award-winning hip-hop artist Coolio extensively sampled the theme for his song "Change".
Subsequent to the success of the "Dollars trilogy", Morricone also composed the scores for "Once Upon a Time in the West" (1968) and Leone's last credited western film, "A Fistful of Dynamite" (1971), as well as the scores for "My Name Is Nobody" (1973) and "A Genius, Two Partners and a Dupe" (1975), both produced by Sergio Leone.
Morricone's score for "Once Upon a Time in the West" is one of the best-selling original instrumental scores in the world today, with up to 10 million copies sold, including one million copies in France and over 800,000 copies in the Netherlands. One of the main themes from the score, "A Man with Harmonica" (L'uomo dell'armonica), became known worldwide and sold over 1,260,000 copies in France alone. The theme was later sampled in popular songs such as Beats International's "Dub Be Good to Me" (1990) and the Orb's ambient single "Little Fluffy Clouds" (1990), and film composer Hans Zimmer sampled it in 2007 as part of his composition "Parlay".
The collaboration with Leone is considered one of the exemplary collaborations between a director and a composer. Morricone's last score for Leone was for his last film, the gangster drama "Once Upon a Time in America" (1984). Leone died on 30 April 1989 of a heart attack at the age of 60. Before his death, Leone had been part-way through planning a film on the Siege of Leningrad, set during World War II. By 1989, Leone had been able to acquire $100 million in financing from independent backers for the war epic, and he had convinced Morricone to compose the film score. The project was canceled when Leone died two days before he was to officially sign on for the film. In early 2003, Italian filmmaker Giuseppe Tornatore announced he would direct a film called "Leningrad". The film has yet to go into production, and Morricone has been cagey thus far as to details on account of Tornatore's superstitious nature; no further details about this film have been released.
Two years after the start of his collaboration with Sergio Leone, Morricone also started to score music for another Spaghetti Western director, Sergio Corbucci. The composer wrote music for Corbucci's "Navajo Joe" (1966), "The Hellbenders" (1967), "The Mercenary/The Professional Gun" (1968), "The Great Silence" (1968), "Compañeros" (1970), "Sonny and Jed" (1972) and "What Am I Doing in the Middle of the Revolution?" (1972).
In addition, Morricone composed music for the western films by Sergio Sollima, "The Big Gundown" (with Lee Van Cleef, 1966), "Face to Face" (1967) and "Run, Man, Run" (1968), as well as the 1970 crime thriller "Violent City" (with Charles Bronson) and the poliziottesco film "Revolver" (1973).
Other relevant scores for less popular Spaghetti Westerns include "Duello nel Texas" (1963), "A Pistol for Ringo" (1965), "The Return of Ringo" (1965), "Seven Guns for the MacGregors" (1966), "The Hills Run Red" (1966), Giulio Petroni's "Death Rides a Horse" (1967) and "Tepepa" (1968), "A Bullet for the General" (1967), "Guns for San Sebastian" (with Charles Bronson and Anthony Quinn, 1968), "A Sky Full of Stars for a Roof" (1968), "The Five Man Army" (1969), Don Siegel's "Two Mules for Sister Sara" (1970), "Life Is Tough, Eh Providence?" (1972) and "Buddy Goes West" (1981).
With Leone's films, Ennio Morricone's name had been put firmly on the map. Most of Morricone's film scores of the 1960s were composed outside the Spaghetti Western genre, while still using Alessandroni's team. This music included the themes for "Il Malamondo" (1964), "Slalom" (1965) and "Listen, Let's Make Love" (1967). In 1968, Morricone reduced his work outside the movie business and wrote scores for 20 films in that year alone, including psychedelic accompaniment for Mario Bava's superhero romp "Danger: Diabolik" (1968).
His talent and creativity were such that many other directors were soon keen to collaborate with him, and over the next few years Morricone scored many films by politically committed directors: collaborating with Marco Bellocchio ("Fists in the Pocket", 1965), Gillo Pontecorvo ("The Battle of Algiers" (1966) and "Queimada!" (1969) with Marlon Brando), Roberto Faenza ("H2S", 1968), Giuliano Montaldo ("Sacco e Vanzetti", 1971), Giuseppe Patroni Griffi ("'Tis Pity She's a Whore", 1971), Mauro Bolognini ("Drama of the Rich", 1974), Umberto Lenzi ("Almost Human", 1974), Pier Paolo Pasolini ("Salò, or the 120 Days of Sodom", 1975), Bernardo Bertolucci ("Novecento", 1976) and Tinto Brass ("The Key", 1983).
In 1970, Morricone wrote the score for "Violent City". That same year, he received his first Nastro d'Argento for the music in "Metti, una sera a cena" (Giuseppe Patroni Griffi, 1969), and his second only a year later for "Sacco e Vanzetti" (Giuliano Montaldo, 1971), a memorable collaboration with the legendary American folk singer and activist Joan Baez. His soundtrack for "Sacco e Vanzetti" contains another well-known composition by Morricone, the folk song "Here's to You", sung by Joan Baez. For the lyrics, Baez was inspired by a letter from Bartolomeo Vanzetti: "Father, yes, I am a prisoner / Fear not to relay my crime". The song became a hit in several countries, selling over 790,000 copies in France alone, and was later included in films such as "The Life Aquatic with Steve Zissou" as well as being used as the closing theme of a video game.
In the same year, Morricone composed the score for the lesser-known drama "Maddalena" (1971) by the Polish film director Jerzy Kawalerowicz, which included the composition "Chi Mai". The theme later appeared on the million-selling score for Georges Lautner's "Le Professionnel" (1981), as well as in the TV series "An Englishman's Castle" (1978) and "The Life and Times of David Lloyd George" (1981). Because of its appearance on the latter, "Chi Mai" reached number 2 on the UK Singles Chart in 1981. The single was certified gold by the BPI on 1 May 1981 and sold over 900,000 copies in France alone. "Chi Mai" is also the name of the online community about Morricone, which offers a repository of information and a free online magazine called "Maestro", containing reviews, articles, discoveries and free comments.
In the beginning of the 1970s, Morricone achieved success with other singles, including "A Fistful of Dynamite" (1971) and "God With Us" (1974), which sold 477,000 and 378,000 copies respectively in France alone.
Between 1967 and 1993 the composer had a long-term collaboration with director Mauro Bolognini. Morricone wrote more than 15 film scores for Bolognini, including "Le streghe" (1966), "L'assoluto naturale" (1969), "Un bellissimo novembre" (1969), "Metello" (1970), "Chronicle of a Homicide" (1972), "Libera, My Love" (1973), "Per le antiche scale" (1975), "La Dame aux camelias" (1980), "Mosca addio" (1987), "Gli indifferenti" (1988) and "Husband and Lovers" (1992).
Ennio Morricone's eclecticism and knack for creating highly poignant, melodic and emotional music found great scope also in horror movies, such as the baroque thrillers of Dario Argento, from "The Bird with the Crystal Plumage" (1969), "The Cat o' Nine Tails" (1970) and "Four Flies on Grey Velvet" (1971) to "The Stendhal Syndrome" (1996) and "The Phantom of the Opera" (1998). His other horror scores include "Nightmare Castle" (1965), "A Quiet Place in the Country" (1968), "The Antichrist" (1974), "Autopsy" (1975) and "Night Train Murders" (1975).
In addition, Morricone's music has also been featured in many popular and cult Italian giallo films, such as "Senza sapere niente di lei" (1969), "Forbidden Photos of a Lady Above Suspicion" (1970), "A Lizard in a Woman's Skin" (1971), "Cold Eyes of Fear" (1971), "The Fifth Cord" (1971), "Short Night of Glass Dolls" (1971), "My Dear Killer" (1972), "What Have You Done to Solange?" (1972), "Black Belly of the Tarantula" (1972), "Who Saw Her Die?" (1972) and "Spasmo" (1974).
In 1977 Morricone scored Alberto De Martino's apocalyptic horror film "Holocaust 2000", starring Kirk Douglas. In 1982 he composed the score for John Carpenter's science fiction horror movie "The Thing". Morricone's main theme for the film was echoed in Marco Beltrami's score for the film's prequel, released in 2011.
The "Dollars Trilogy" was not released in the United States until 1967 when United Artists, who had already enjoyed success distributing the British-produced James Bond films in the United States, decided to release Sergio Leone's Spaghetti Westerns. The American release gave Morricone an exposure in America and his film music became quite popular in the United States.
One of Morricone's first contributions for an American director was his music for the religious epic film "The Bible: In the Beginning" by John Huston. According to Sergio Miceli's book "Morricone, la musica, il cinema", Morricone wrote about 15 or 16 minutes of music, which were recorded for a screen test and conducted by Franco Ferrara. At first Morricone's teacher Goffredo Petrassi had been engaged to write the score for the big-budget epic, but Huston preferred another composer. RCA Records then proposed Morricone, who was under contract with them, but a conflict arose between the film's producer Dino De Laurentiis and RCA: the producer wanted exclusive rights to the soundtrack, while RCA still had a monopoly on Morricone at that time and did not want to release the composer. Subsequently, Morricone's work was rejected because RCA would not allow him to work for Dino De Laurentiis alone. The composer reused parts of his unused score for "The Bible: In the Beginning" in such films as "The Return of Ringo" (1965) by Duccio Tessari and Alberto Negrin's "The Secret of the Sahara" (1987).
Morricone never left Rome to compose his music and never learned to speak English. Given that the composer always worked across a wide field of composition genres, from the absolute music he produced throughout his life to applied music, serving as orchestrator and conductor in the recording field and as a composer for theatre, radio and cinema, the impression arises that he never really cared that much about his standing in the eyes of Hollywood.
In 1970, Morricone composed the music for Don Siegel's "Two Mules for Sister Sara", an American-Mexican western film starring Shirley MacLaine and Clint Eastwood. The same year the composer also delivered the title theme "The Men from Shiloh" for the American Western television series "The Virginian", as well as the score for Phil Karlson's war film "Hornets' Nest", starring Rock Hudson; two years later he scored "Bluebeard", starring Richard Burton.
In 1974 Morricone wrote music for some episodes of the science-fiction television series "Space: 1999", directed by Lee H. Katzin, and the following year he scored the George Kennedy revenge thriller "The "Human" Factor", the final film of director Edward Dmytryk. Two years later he composed the score for "Exorcist II: The Heretic", John Boorman's sequel to William Friedkin's 1973 film "The Exorcist". The horror film was a major disappointment at the box office: it grossed $30,749,142 in the United States, turning a profit but still disappointing in comparison to the original film's gross. The same year he scored the Dino De Laurentiis-produced adventure film "Orca", starring Richard Harris, which was also only a minor hit but later developed a cult following.
In 1978, the composer worked with Terrence Malick on "Days of Heaven", starring Richard Gere. During the lengthy editing process of the romantic drama, which won an Academy Award for Best Cinematography and received three additional nominations, including one for the score, Terrence Malick and Billy Weber made use of a temporary score dominated by Morricone's music for the Bernardo Bertolucci film "Novecento". Malick also chose the ethereal "Aquarium" music from Camille Saint-Saëns's "The Carnival of the Animals" to frame the film. When Malick decided he wanted Morricone to score his movie, he sent a version of it to Italy with the "Novecento" temp track in place. Morricone agreed to the assignment, and because the composer did not fly and so would not travel to the United States, Malick brought the movie to him in Italy, where Morricone wrote the score and the music was then recorded. In "Days of Heaven", Morricone's elegiac music coexists with pre-existing selections.
Despite the fact that Morricone had produced some of the most popular and widely imitated film music ever written throughout the 1960s and '70s, "Days of Heaven" earned him his first Oscar nomination for Best Original Score, with his score up against Jerry Goldsmith's "The Boys from Brazil", Dave Grusin's "Heaven Can Wait", Giorgio Moroder's "Midnight Express" (the eventual winner) and John Williams's "Superman: The Movie" at the Oscar ceremonies in 1979.
In 1979, Morricone provided the music for the thriller "Bloodline", directed by Terence Young, best known for directing the James Bond films "Dr. No" (1962), "From Russia with Love" (1963), and "Thunderball" (1965). Subsequently, the composer was asked to score Michael Ritchie's "The Island" (1980, starring Michael Caine), Gordon Willis's thriller "Windows" (1980), Andrew Bergman's comedy "So Fine" (1981) starring Ryan O'Neal, Matt Cimber's film "Butterfly" (1982), starring Pia Zadora, Samuel Fuller's controversial drama film "White Dog" (1982) and "Thieves After Dark" (1984), Jerry London's critically acclaimed TV movie "The Scarlet and the Black" (1983), starring Gregory Peck, and Richard Fleischer's box-office bomb "Red Sonja" (1985), starring Arnold Schwarzenegger and Brigitte Nielsen.
Morricone's most fruitful and often long-term collaborations in English-language cinema have been with directors such as Brian De Palma, Barry Levinson, Warren Beatty, Oliver Stone and especially Roland Joffé, for whom Morricone wrote one of his best-known scores, the highly evocative soundtrack for "The Mission" (1986).
Association with Roland Joffé
"The Mission", directed by Joffé, was about a piece of history considerably more distant, as Spanish Jesuit missionaries see their work undone as a tribe of Paraguayan natives fall within a territorial dispute between the Spanish and Portuguese. At one point the score was one of the world's best-selling film scores, selling over 3 million copies worldwide.
Morricone finally received a second Oscar nomination for "The Mission", but his original score lost out to Herbie Hancock's coolly arranged jazz on Bertrand Tavernier's "Round Midnight". It was considered a surprising and controversial win, given that much of the music in the latter film was pre-existing. Morricone stated the following during a 2001 interview with "The Guardian": "I definitely felt that I should have won for The Mission. Especially when you consider that the Oscar-winner that year was Round Midnight, which was not an original score. It had a very good arrangement by Herbie Hancock, but it used existing pieces. So there could be no comparison with The Mission. There was a theft!" His score for "The Mission" was ranked number 1 in a poll of the all-time greatest film scores; the top 10 list was compiled by 40 film composers such as Michael Giacchino and Carter Burwell. The score is also ranked 23rd on the AFI's list of the 25 greatest film scores of all time.
The composer also wrote the music for three other movies by Joffé: "Fat Man and Little Boy" (1989, starring Paul Newman), "City of Joy" (1992, starring Patrick Swayze) and the opening film of the 2000 Cannes Film Festival, "Vatel", starring Gérard Depardieu, Uma Thurman and Tim Roth.
On three occasions, Brian De Palma worked with Morricone: "The Untouchables" (1987), the 1989 war drama "Casualties of War" and the science fiction film "Mission to Mars" (2000). De Palma's "The Untouchables", starring rising star Kevin Costner as Eliot Ness, Robert De Niro as Al Capone and the Oscar-winning Sean Connery, was released in 1987. Morricone's score for "The Untouchables" resulted in his third nomination for the Academy Award for Best Original Score.
In a 2001 interview with "The Guardian", Morricone stated that he had good experiences with De Palma: "De Palma is delicious! He respects music, he respects composers. For The Untouchables, everything I proposed to him was fine, but then he wanted a piece that I didn't like at all, and of course we didn't have an agreement on that. It was something I didn't want to write – a triumphal piece for the police. I think I wrote nine different pieces for this in total and I said, 'Please don't choose the seventh!' because it was the worst. And guess what he chose? The seventh one. But it really suits the movie."
Another American director, Barry Levinson, commissioned the composer on two occasions. First, for the crime-drama "Bugsy", starring Warren Beatty, which received ten Oscar nominations, winning two for Best Art Direction-Set Decoration (Dennis Gassner, Nancy Haigh) and Best Costume Design.
The highest-grossing American movie for which the composer wrote a complete score was for Levinson's "Disclosure" in 1994, starring Michael Douglas and Demi Moore.
"He doesn't have a piano in his studio, I always thought that with composers, you sit at the piano, and you try to find the melody. There's no such thing with Morricone. He hears a melody, and he writes it down. He hears the orchestration completely done", said Barry Levinson in an interview.
During his career in Hollywood, Morricone was approached for numerous other projects, including the Gregory Nava drama "A Time of Destiny" (1988), "Frantic" by Polish-French director Roman Polanski (1988, starring Harrison Ford), Franco Zeffirelli's 1990 drama film "Hamlet" (starring Mel Gibson and Glenn Close), the neo-noir crime film "State of Grace" by Phil Joanou (1990, starring Sean Penn and Ed Harris), "Rampage" (1992) by William Friedkin, and the romantic drama "Love Affair" (1994) by Warren Beatty.
None of the aforementioned films were box office successes, but fortunately Morricone was also commissioned for more successful motion pictures such as "In the Line of Fire" (1993) by Wolfgang Petersen, starring Clint Eastwood and John Malkovich, the horror film "Wolf" (1994, Mike Nichols), which featured Jack Nicholson and Michelle Pfeiffer in the lead roles, and "Bulworth" by Warren Beatty.
In 1997, Morricone composed the music for "Lolita" (by Adrian Lyne) and Oliver Stone's "U Turn", starring Sean Penn and Jennifer Lopez. A year later, Ennio Morricone wrote a complete score for the 1998 drama "What Dreams May Come", but Vincent Ward found the music too emotional and replaced Morricone with Michael Kamen.
One of his last complete scores for an American-related project includes the 2002 thriller "Ripley's Game", starring John Malkovich, by Liliana Cavani.
The background music from the "target practice" shooting scene in Quentin Tarantino's "Django Unchained" had also been featured in kung fu theater movies.
Besides the 500 original film scores Morricone has composed for movies and television series over a career of more than six decades, his music has also been reused in more than 150 other film projects. Morricone's compositions have appeared in the German TV series "Derrick" (1989), the live-action comedy film "Inspector Gadget", "Ally McBeal" (2001), "The Simpsons" (2002), "The Sopranos" (2001–2002) and more recently in "Dancing with the Stars" (2010).
In 2014, Morricone's song "Giù La Testa" was featured in Florian Habicht's feature film "Pulp: a Film about Life, Death & Supermarkets", an unconventional rockumentary about British group Pulp which premiered at SXSW that year.
Quentin Tarantino borrowed Morricone's music for several of his films. The main title of "Death Rides a Horse" (1967) can be heard in "Kill Bill: Volume 1", while "Kill Bill: Volume 2" contains music originally from "For a Few Dollars More", "The Good, the Bad and the Ugly", "The Mercenary" and "Navajo Joe". The themes "Paranoia Prima" and "Unexpected Violence" ("Violenza inattesa"), originally from "The Cat o' Nine Tails" and "The Bird with the Crystal Plumage" respectively, were used in Tarantino's "Death Proof" (2007).
In 2009, Tarantino originally wanted Morricone to compose the film score for "Inglourious Basterds". Morricone was unable to, because the film's sped-up production schedule conflicted with his scoring of Giuseppe Tornatore's "Baarìa". However, Tarantino did use eight tracks composed by Morricone in the film, with four of them included on the soundtrack. The tracks came originally from Morricone's scores for "The Big Gundown" (1966), "Revolver" (1973) and "Allonsanfàn" (1974).
In 2012, Morricone composed the song "Ancora Qui" with lyrics by Italian singer Elisa for Tarantino's "Django Unchained"; the track appeared on the soundtrack together with three existing Morricone compositions. "Ancora Qui" was one of the contenders for an Academy Award nomination in the Best Original Song category, but ultimately it was not nominated. On 4 January 2013, Morricone presented Tarantino with a Life Achievement Award at a special ceremony held as a continuation of the International Rome Film Festival. In 2014, Morricone was misquoted as claiming that he would "never work" with Tarantino again, but he later agreed to write an original film score for Tarantino's "The Hateful Eight", which won the Academy Award for Best Original Score in 2016. The nomination made Morricone, at the time, the second-oldest nominee in Academy history, behind Gloria Stuart; the win was his first competitive Oscar, and at the age of 87 he became the oldest person at the time to win a competitive Oscar.
In 1988 Morricone started an ongoing and very successful collaboration with Italian director Giuseppe Tornatore. His first score for Tornatore was for the drama film "Cinema Paradiso". The international version of the film won the Special Jury Prize at the 1989 Cannes Film Festival and the 1989 Best Foreign Language Film Oscar. Morricone received a BAFTA award with his son Andrea, and a David di Donatello for his score. In 2002, the director's cut 173-minute version was released (known in the U.S. as "Cinema Paradiso: The New Version").
After the success of "Cinema Paradiso", the composer wrote the music for all subsequent films by Tornatore: the drama film "Everybody's Fine" (Stanno Tutti Bene, 1990), "A Pure Formality" (1994) starring Gérard Depardieu and Roman Polanski, "The Star Maker" (1995), "The Legend of 1900" (1998) starring Tim Roth, the 2000 romantic drama "Malèna" (which featured Monica Bellucci) and the psychological thriller mystery film "La sconosciuta" (2006).
More recently, Morricone composed the scores for "Baarìa" (2009), "The Best Offer" (2013), starring Geoffrey Rush, Jim Sturgess and Donald Sutherland, and the romantic drama "The Correspondence" (2016), starring Jeremy Irons and Olga Kurylenko.
The composer won several music awards for his scores to Tornatore's movies: he received a fifth Academy Award nomination and a Golden Globe nomination for "Malèna", and for "The Legend of 1900" he won the Golden Globe Award for Best Original Score.
Morricone has worked for television, from a single title piece to variety shows and documentaries to TV series, including "Moses the Lawgiver" (1974), "The Life and Times of David Lloyd George" (1981), "Marco Polo" (1982) (which won two Primetime Emmys), "The Secret of the Sahara" (1987), "The Endless Game" (1989), "I Promessi Sposi" and "Nostromo" (1996).
He wrote the score for the Mafia television series "La piovra" seasons 2 to 10 from 1985 to 2001, including the themes "Droga e sangue" ("Drugs and Blood"), "La Morale", and "L'Immorale". Morricone worked as the conductor of seasons 3 to 5 of the series. He also worked as the music supervisor for the television project "La bibbia" ("The Bible").
In the late 1990s, he collaborated with his son Andrea on the "Ultimo" crime dramas, resulting in "Ultimo" (1998), "Ultimo 2 – La sfida" (1999), "Ultimo 3 – L'infiltrato" (2004) and "Ultimo 4 – L'occhio del falco" (2013).
In the 2000s, Morricone continued to compose music for successful television series such as "Il Cuore nel Pozzo" (2005), "Karol: A Man Who Became Pope" (2005), "La provinciale" (2006), "Giovanni Falcone" (2007), "Pane e libertà" (2009) and "Come Un Delfino 1–2" (2011–2013).
With an estimated 13 million viewers, "Karol: A Man Who Became Pope" became an incredible success. Morricone wrote additional music for the sequel, "Karol: The Pope, The Man" (2006), which portrayed Karol's life as Pope from his papal inauguration to his death. The two scores were released in 2005 and 2006 respectively; one year later, a double-disc album containing both scores was released.
In 2003, Morricone scored another epic for Japanese television: "Musashi", the Taiga drama about Miyamoto Musashi, Japan's legendary warrior. Part of his "applied music" has since been applied to Italian television films.
Morricone provided the string arrangements on Morrissey's "Dear God Please Help Me" from the album "Ringleader of the Tormentors" in 2006.
Since 2004, Morricone has written music almost exclusively for Italian television movies and mini-series, especially for directors such as Giuseppe Tornatore, Alberto Negrin, Giuliano Montaldo, and Franza Di Rosa.
In 2008, the composer recorded music for a Lancia commercial, featuring Richard Gere and directed by Harald Zwart (known for directing "The Pink Panther 2").
In the spring and summer of 2010, Morricone collaborated with Hayley Westenra on her album "Paradiso". The album features new songs written by Morricone as well as some of his best-known film compositions of the previous 50 years. Westenra recorded the album with Morricone's orchestra in Rome during the summer of 2010.
Since 1995, he has composed the music for several advertising campaigns of Dolce & Gabbana; the commercials were directed by Giuseppe Tornatore.
In 2013, Morricone collaborated with Italian singer-songwriter Laura Pausini on a new version of her hit single "La solitudine" for her 20th-anniversary greatest hits album "20 – The Greatest Hits".
In 2014, Ennio Morricone became honorary chairman of the First International Open Competition of author's music videos, "Mediamusic". The final of the competition was scheduled for 1 March 2015 in Moscow.
He wrote the score for Christian Carion's "En mai, fais ce qu'il te plaît" (2015) and for Tornatore's "The Correspondence" (2016), featuring Jeremy Irons and Olga Kurylenko.
In July 2015, Quentin Tarantino announced after the screening of footage of his movie "The Hateful Eight" at the San Diego Comic-Con International that Morricone would score the film, the first Western that Morricone had scored since 1981. The score was critically acclaimed and won several awards, including the Golden Globe Award for Best Original Score and the Academy Award for Best Original Score.
Before receiving his diplomas in trumpet, composition and instrumentation from the conservatory, Morricone was already active as a trumpet player, often performing in an orchestra that specialized in music written for films. After completing his education at Saint Cecilia, the composer honed his orchestration skills as an arranger for Italian radio and television. To support himself, he moved to RCA in the early sixties and entered the front ranks of the Italian recording industry. In 1964, Morricone also became a founding member of the Rome-based avant-garde ensemble Gruppo di Improvvisazione Nuova Consonanza. During the group's existence (until 1978), Morricone performed with it several times as a trumpet player.
To ready his music for live performance, he joined smaller pieces of music together into longer suites. Rather than single pieces, which would require the audience to applaud every few minutes, Morricone thought it best to create a series of suites lasting from 15 to 20 minutes, which form a sort of symphony in various movements, alternating successful pieces with personal favorites. In concert, Morricone normally has 180 to 200 musicians and vocalists under his baton, performing multiple genre-crossing collections of music. Rock, symphonic and ethnic instruments share the stage.
On 20 September 1984 Morricone conducted the Orchestre national des Pays de la Loire at "Cinésymphonie '84" ("Première nuit de la musique de film/First night of film music") in the French concert hall Salle Pleyel in Paris. He performed some of his best-known compositions such as "Metti, una sera a cena", "Novecento" and "The Good, the Bad and the Ugly". Michel Legrand and Georges Delerue performed on the same evening.
On 15 October 1987, Morricone gave a concert in front of 12,000 people in the Sportpaleis in Antwerp, Belgium, with the Dutch Metropole Orchestra and the Italian operatic soprano Alide Maria Salvetta. A live album recorded at this concert was released in the same year.
On 9 June 2000 Morricone went to the Flanders International Film Festival Ghent to conduct his music together with the National Orchestra of Belgium. During the concert's first part, the screening of "The Life and Death of King Richard III" (1912) was accompanied with live music by Morricone. It was the very first time that the score was performed live in Europe. The second part of the evening consisted of an anthology of the composer's work. The event took place on the eve of Euro 2000, the European Football Championship in Belgium and the Netherlands.
Morricone had performed over 250 concerts as of 2001. Since 2001, the composer has been on a world tour, the latter part sponsored by Giorgio Armani, with the Orchestra Roma Sinfonietta, touring London (Barbican 2001; 75th birthday "Concerto", Royal Albert Hall 2003), Paris, Verona, and Tokyo. Morricone performed his classic film scores at the Munich Philharmonie in 2005 and at the Hammersmith Apollo Theatre in London on 1 and 2 December 2006.
He made his North American concert debut on 3 February 2007 at Radio City Music Hall in New York City. The previous evening, Morricone had presented at the United Nations a concert comprising some of his film themes, as well as the cantata "Voci dal silenzio", to welcome the new Secretary-General Ban Ki-moon. A "Los Angeles Times" review bemoaned the poor acoustics and opined of Morricone: "His stick technique is adequate, but his charisma as a conductor is zero." Morricone, though, has said: "Conducting has never been important to me. If the audience comes for my gestures, they had better stay outside."
On 12 December 2007 Morricone conducted the Orchestra Roma Sinfonietta at the Wiener Stadthalle in Vienna, presenting a selection of his own works.
Together with the Roma Sinfonietta and the Belfast Philharmonic Choir, Morricone performed at the Opening Concerts of the Belfast Festival at Queen's, in the Waterfront Hall on 17 and 18 October 2008.
Morricone and Orchestra Roma Sinfonietta also held a concert at the Belgrade Arena (Belgrade, Serbia) on 14 February 2009.
On 10 April 2010 Morricone conducted a concert at the Royal Albert Hall in London with the Orchestra Roma Sinfonietta and (as in all of his previous London concerts) the Crouch End Festival Chorus. On 11 September he conducted a concert in Verona.
On 26 February 2012, Morricone made his Australian debut, conducting the Western Australian Youth Orchestra together with a 100-voice chorus (made up primarily of WASO chorus members) at the Burswood Theatre, part of Crown Perth (formerly the Burswood Entertainment Complex), in Perth. On 2 March 2012 he conducted the Adelaide Symphony Orchestra at Elder Park, Adelaide, as part of the Adelaide Festival of Arts.
On 22 December 2012 Morricone conducted the 85-piece Belgian orchestra "Orkest der Lage Landen" and a 100-piece choir during a two-hour concert in the Sportpaleis in Antwerp.
In November 2013, Morricone began a world tour to coincide with the 50th anniversary of his film music career, performing in venues such as the Crocus City Hall in Moscow, and in Santiago, Chile; Berlin, Germany (O2 World); Budapest, Hungary; and Vienna (Stadthalle). In June 2014, Morricone had to cancel U.S. tour dates in New York (Barclays Center) and Los Angeles (Nokia Theatre L.A. Live) because of a back procedure he had undergone on 20 February, and he postponed the rest of his world tour.
In November 2014, Morricone stated that he would resume his European tour starting from February 2015.
In the late 1960s, Morricone and three other Italian composers (Piero Piccioni, Armando Trovajoli and Luis Bacalov) founded Forum Music Village in Rome, previously called the Ortophonic recording studio. The studio has some peculiarities, among them the ability to record a church organ directly in the studio.
Morricone has been using the studio to create his scores for the past 40 years. The studio has hosted many directors who have worked alongside him, including Brian De Palma, Oliver Stone and Barry Levinson.
The Academy Award-winning scores of "Il Postino" by Luis Bacalov and "Life Is Beautiful" by Nicola Piovani were recorded in Studio A of Forum Music Village.
Notable artists who have recorded at Forum Music Village are Quincy Jones, Jon and Vangelis, Plácido Domingo, Andrea Bocelli, Red Hot Chili Peppers, Will.i.am, Yo-Yo Ma, Morrissey, Bruno Nicolai, Alessandro Alessandroni, Goblin, Pino Donaggio, Nicola Piovani, Danger Mouse, Daniele Luppi and Cher.
On 13 October 1956 he married Maria Travia, whom he had met in 1950. Travia has written lyrics to complement her husband's pieces. Her works include the Latin texts for "The Mission". They have three sons and a daughter, in order of birth: Marco (1957), Alessandra (1961), the conductor and film composer Andrea (1964), and Giovanni Morricone (1966), a filmmaker, who lives in New York City.
Morricone has lived in Italy his entire life and has never desired to live in Hollywood. Morricone is also not fluent in English and will give interviews only in Italian, his native language.
On 25 June 2019, "The New York Times Magazine" listed Ennio Morricone among hundreds of artists whose material was reportedly destroyed in the 2008 Universal fire.
Ennio Morricone has influenced many artists from other styles and genres, including Danger Mouse, Dire Straits, Muse, Metallica, Radiohead and Hans Zimmer.
Ennio Morricone has sold well over 70 million records worldwide during his career that spanned over seven decades, including 6.5 million albums and singles in France, over three million in the United States and more than two million albums in Korea. In 1971, the composer received his first golden record (disco d'oro) for the sale of 1,000,000 records in Italy and a "Targa d'Oro" for the worldwide sales of 22 million.
Ennio Morricone has been involved with at least 19 different movies grossing over $20 million at the box office. Other successful movies featuring Morricone's work are "La Luna" (1979), "Kill Bill: Volume 1" and "Kill Bill: Volume 2" (2003, 2004), "Inglourious Basterds" (2009) and "Django Unchained" (2012), though the tracks used in these were sampled from older pictures.
Ennio Morricone received his first Academy Award nomination in 1979 for the score to "Days of Heaven" (Terrence Malick, 1978).
In 1984, the U.S. distributor of Sergio Leone's "Once Upon a Time in America" reportedly failed to file the proper paperwork so that Morricone's score, regarded as one of his best, would be eligible for consideration for an Academy Award.
Two years later, Morricone received his second Oscar nomination for "The Mission". He also received Oscar nominations for his scores to "The Untouchables" (1987), "Bugsy" (1991), "Malèna" (2000), and "The Hateful Eight" (2016). On 28 February 2016, Morricone won his first competitive Academy Award for his score to "The Hateful Eight."
Morricone and Alex North are the only composers to receive the Academy Honorary Award since the award's introduction in 1928. Ennio Morricone received the Academy Honorary Award on 25 February 2007, presented by Clint Eastwood, "for his magnificent and multifaceted contributions to the art of film music." With the statuette came a standing ovation. In conjunction with the honor, Morricone released a tribute album, "We All Love Ennio Morricone", that featured as its centerpiece Celine Dion's rendition of "I Knew I Loved You" (based on "Deborah's Theme" from "Once Upon a Time in America"), which she performed at the ceremony. Behind-the-scenes studio production and recording footage of "I Knew I Loved You" can be viewed in the debut episode of the QuincyJones.com Podcast. The lyric, as with Morricone's "Love Affair", had been written by Alan and Marilyn Bergman. Morricone's acceptance speech was in his native Italian tongue and was interpreted by Clint Eastwood, who stood to his left. Eastwood and Morricone had in fact met two days earlier for the first time in 40 years at a reception.
AFI
In 2005, four film scores by Ennio Morricone were nominated by the American Film Institute for a place in the AFI's list of the Top 25 American Film Scores of All Time. His score for "The Mission" was ranked 23rd on the list.
Grammy Awards
Morricone was nominated seven times for a Grammy Award. In 2009, The Recording Academy inducted his score for "The Good, the Bad and the Ugly" (1966) into the Grammy Hall of Fame. | https://en.wikipedia.org/wiki?curid=10277
List of explosives used during World War II
Almost all the common explosives listed here were mixtures of several common components.
This is only a partial list; there were many others. Many of these compositions are now obsolete and only encountered in legacy munitions and unexploded ordnance.
Two nuclear explosives, containing uranium and plutonium respectively, were also used in the bombings of Hiroshima and Nagasaki. | https://en.wikipedia.org/wiki?curid=10278
Erlang (unit)
The erlang (symbol E) is a dimensionless unit that is used in telephony as a measure of offered load or carried load on service-providing elements such as telephone circuits or telephone switching equipment. A single cord circuit has the capacity to be used for 60 minutes in one hour. Full utilization of that capacity, 60 minutes of traffic, constitutes 1 erlang.
Carried traffic in erlangs is the average number of concurrent calls measured over a given period (often one hour), while offered traffic is the traffic that would be carried if all call-attempts succeeded. How much offered traffic is carried in practice will depend on what happens to unanswered calls when all servers are busy.
The CCITT named the international unit of telephone traffic the erlang in 1946 in honor of Agner Krarup Erlang. In Erlang's analysis of efficient telephone line usage he derived the formulae for two important cases, Erlang-B and Erlang-C, which became foundational results in teletraffic engineering and queueing theory. His results, which are still used today, relate quality of service to the number of available servers. Both formulae take offered load as one of their main inputs (in erlangs), which is often expressed as call arrival rate times average call length.
A distinguishing assumption behind the Erlang B formula is that there is no queue, so that if all service elements are already in use then a newly arriving call will be blocked and subsequently lost. The formula gives the probability of this occurring. In contrast, the Erlang C formula provides for the possibility of an unlimited queue and it gives the probability that a new call will need to wait in the queue due to all servers being in use. Erlang's formulae apply quite widely, but they may fail when congestion is especially high causing unsuccessful traffic to repeatedly retry. One way of accounting for retries when no queue is available is the Extended Erlang B method.
When used to represent carried traffic, a value (which can be a non-integer such as 43.5) followed by “erlangs” represents the average number of concurrent calls carried by the circuits (or other service-providing elements), where that average is calculated over some reasonable period of time. The period over which the average is calculated is often one hour, but shorter periods (e.g., 15 minutes) may be used where it is known that there are short spurts of demand and a traffic measurement is desired that does not mask these spurts.
One erlang of carried traffic refers to a single resource being in continuous use, or two channels each being in use fifty percent of the time, and so on. For example, if an office has two telephone operators who are both busy all the time, that would represent two erlangs (2 E) of traffic; or a radio channel that is occupied continuously during the period of interest (e.g., one hour) is said to have a load of 1 erlang.
When used to describe offered traffic, a value followed by “erlangs” represents the average number of concurrent calls that would have been carried if there were an unlimited number of circuits (that is, if the call-attempts that were made when all circuits were in use had not been rejected). The relationship between offered traffic and carried traffic depends on the design of the system and user behavior. Three common models are (a) callers whose call-attempts are rejected go away and never come back, (b) callers whose call-attempts are rejected try again within a fairly short space of time, and (c) the system allows users to wait in queue until a circuit becomes available.
A third measurement of traffic is instantaneous traffic, expressed as a certain number of erlangs, meaning the exact number of calls taking place at a point in time. In this case the number is an integer. Traffic-level-recording devices, such as moving-pen recorders, plot instantaneous traffic.
The concepts and mathematics introduced by Agner Krarup Erlang have broad applicability beyond telephony. They apply wherever users arrive more or less at random to receive exclusive service from any one of a group of service-providing elements without prior reservation, for example, where the service-providing elements are ticket-sales windows, toilets on an airplane, or motel rooms. (Erlang’s models do not apply where the service-providing elements are shared between several concurrent users or different amounts of service are consumed by different users, for instance, on circuits carrying data traffic.)
The goal of Erlang’s traffic theory is to determine exactly how many service-providing elements should be provided in order to satisfy users, without wasteful over-provisioning. To do this, a target is set for the grade of service (GoS) or quality of service (QoS). For example, in a system where there is no queuing, the GoS may be that no more than 1 call in 100 is blocked (i.e., rejected) due to all circuits being in use (a GoS of 0.01), which becomes the target probability of call blocking, "Pb", when using the Erlang B formula.
There are several resulting formulae, including Erlang B, Erlang C and the related Engset formula, based on different models of user behavior and system operation. These may each be derived by means of a special case of continuous-time Markov processes known as a birth–death process. The more recent Extended Erlang B method provides a further traffic solution that draws on Erlang's results.
Offered traffic (in erlangs) is related to the call arrival rate, "λ", and the average call-holding time (the average time of a phone call), "h", by:
"E" = "λ" × "h",
provided that "h" and "λ" are expressed using the same units of time (seconds and calls per second, or minutes and calls per minute).
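As a minimal illustration (the traffic figures below are invented for the example, not taken from any standard), the relationship "E" = "λ" × "h" is a single multiplication once both quantities share a time unit:

```python
# Offered traffic E = λh, with λ and h in the same time unit (minutes here).
calls_per_minute = 2.0       # λ: busy-hour call arrival rate
mean_holding_minutes = 3.0   # h: average call-holding time
offered_traffic = calls_per_minute * mean_holding_minutes
print(offered_traffic, "erlangs")  # 6.0 erlangs
```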
The practical measurement of traffic is typically based on continuous observations over several days or weeks, during which the instantaneous traffic is recorded at regular, short intervals (such as every few seconds). These measurements are then used to calculate a single result, most commonly the busy-hour traffic (in erlangs). This is the average number of concurrent calls during a given one-hour period of the day, where that period is selected to give the highest result. (This result is called the time-consistent busy-hour traffic). An alternative is to calculate a busy-hour traffic value separately for each day (which may correspond to slightly different times each day) and take the average of these values. This generally gives a slightly higher value than the time-consistent busy-hour value.
Where the existing busy-hour carried traffic, "E"c, is measured on an already overloaded system, with a significant level of blocking, it is necessary to take account of the blocked calls in estimating the busy-hour offered traffic "E"o (which is the traffic value to be used in the Erlang formulae). The offered traffic can be estimated by "E"o = "E"c/(1 − "P"b). For this purpose, where the system includes a means of counting blocked calls and successful calls, "P"b can be estimated directly from the proportion of calls that are blocked. Failing that, "P"b can be estimated by using "E"c in place of "E"o in the Erlang formula and the resulting estimate of "P"b can then be used in "E"o = "E"c/(1 − "P"b) to provide a first estimate of "E"o.
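A short sketch of the direct method just described, using hypothetical busy-hour counts (the variable names and figures are ours, chosen only for the example):

```python
# Estimate busy-hour offered traffic on an overloaded system when the
# equipment can count both blocked and successful call attempts.
carried_ec = 85.0        # measured busy-hour carried traffic E_c (erlangs)
blocked_calls = 120      # call attempts rejected during the busy hour
successful_calls = 2280  # call attempts carried during the busy hour

pb = blocked_calls / (blocked_calls + successful_calls)  # P_b = 0.05
offered_eo = carried_ec / (1.0 - pb)                     # E_o = E_c / (1 - P_b)
print(round(offered_eo, 1), "erlangs")  # about 89.5 erlangs
```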
Another method of estimating "E"o in an overloaded system is to measure the busy-hour call arrival rate, "λ" (counting successful calls and blocked calls), and the average call-holding time (for successful calls), "h", and then estimate "E"o using the formula "E" = "λh".
For a situation where the traffic to be handled is completely new traffic, the only choice is to try to model expected user behavior. For example, one could estimate active user population, "N", expected level of use, "U" (number of calls/transactions per user per day), busy-hour concentration factor, "C" (proportion of daily activity that will fall in the busy hour), and average holding time/service time, "h" (expressed in minutes). A projection of busy-hour offered traffic would then be "E"o = ("N" × "U" × "C" / 60) × "h" erlangs. (The division by 60 translates the busy-hour call/transaction arrival rate into a per-minute value, to match the units in which "h" is expressed.)
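A worked example of this projection, with entirely hypothetical inputs:

```python
# Projecting busy-hour offered traffic for completely new demand:
# E_o = (N * U * C / 60) * h
users_n = 10_000          # N: active user population
calls_per_user_u = 3      # U: calls per user per day
busy_hour_share_c = 0.17  # C: share of daily calls in the busy hour
holding_minutes_h = 2.5   # h: average holding time in minutes

offered_eo = (users_n * calls_per_user_u * busy_hour_share_c / 60) * holding_minutes_h
print(offered_eo, "erlangs")  # 212.5 erlangs
```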
The Erlang B formula (or Erlang-B with a hyphen), also known as the Erlang loss formula, is a formula for the blocking probability that describes the probability of call losses for a group of identical parallel resources (telephone lines, circuits, traffic channels, or equivalent), sometimes referred to as an M/M/c/c queue. It is, for example, used to dimension a telephone network's links. The formula was derived by Agner Krarup Erlang and is not limited to telephone networks, since it describes a probability in a queuing system (albeit a special case with a number of servers but no queueing space for incoming calls to wait for a free server). Hence, the formula is also used in certain inventory systems with lost sales.
The formula applies under the condition that an unsuccessful call, because the line is busy, is not queued or retried, but instead simply vanishes forever. It is assumed that call attempts arrive following a Poisson process, so call arrival instants are independent. Further, it is assumed that the message lengths (holding times) are exponentially distributed (a Markovian system), although the formula turns out to apply under general holding time distributions.
The Erlang B formula assumes an infinite population of sources (such as telephone subscribers), which jointly offer traffic to "N" servers (such as telephone lines). The rate expressing the frequency at which new calls arrive, λ, (birth rate, traffic intensity, etc.) is constant, and does "not" depend on the number of active sources. The total number of sources is assumed to be infinite.
The Erlang B formula calculates the blocking probability of a buffer-less loss system, where a request that is not served immediately is aborted, so that no requests become queued. Blocking occurs when a new request arrives at a time when all available servers are currently busy. The formula also assumes that blocked traffic is cleared and does not return.
The formula provides the GoS (grade of service) which is the probability "P"b that a new call arriving to the resources group is rejected because all resources (servers, lines, circuits) are busy: "B"("E", "m"), where "E" is the total offered traffic in erlang, offered to "m" identical parallel resources (servers, communication channels, traffic lanes):
"P"b = "B"("E", "m") = ("E"^"m" / "m"!) / (Σ "E"^"i" / "i"!, the sum running over "i" = 0, 1, ..., "m")
where:
"E" = "λh" is the offered traffic in erlangs,
"m" is the number of identical parallel resources, and
"P"b is the probability of blocking.
Note: The "erlang" is a dimensionless load unit calculated as the mean arrival rate, λ, multiplied by the mean call holding time, "h".
See Little's law, which shows that the erlang unit must be dimensionless for Little's law to be dimensionally sane.
This may be expressed recursively as follows, in a form that is used to simplify the calculation of tables of the Erlang B formula:
"B"("E", 0) = 1
"B"("E", "m") = ("E" · "B"("E", "m" − 1)) / ("m" + "E" · "B"("E", "m" − 1)) for "m" = 1, 2, 3, ...
Typically, instead of "B"("E", "m") the inverse 1/"B"("E", "m") is calculated in numerical computation in order to ensure numerical stability:
1/"B"("E", 0) = 1
1/"B"("E", "m") = 1 + ("m"/"E") · 1/"B"("E", "m" − 1) for "m" = 1, 2, 3, ...
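The inverse recursion translates directly into a short routine. The following Python sketch (the function and variable names are our own, not standard) computes "B"("E", "m") without evaluating large factorials:

```python
def erlang_b(traffic_e: float, servers_m: int) -> float:
    """Erlang B blocking probability B(E, m), computed via the numerically
    stable inverse recursion 1/B(E, m) = 1 + (m/E) * 1/B(E, m - 1)."""
    inv_b = 1.0  # 1/B(E, 0) = 1: with no servers, every call is blocked
    for m in range(1, servers_m + 1):
        inv_b = 1.0 + (m / traffic_e) * inv_b
    return 1.0 / inv_b

# Example: 90 erlangs offered to 100 circuits; prints a blocking
# probability of a few percent.
print(erlang_b(90.0, 100))
```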
The Erlang B formula is decreasing and convex in "m".
It requires that call arrivals can be modeled by a Poisson process, which is not always a good match, but the formula is valid for any statistical distribution of call holding times with finite mean.
It applies to traffic transmission systems that do not buffer traffic.
More modern examples compared to POTS where Erlang B is still applicable, are optical burst switching (OBS) and several current approaches to optical packet switching (OPS).
Erlang B was developed as a trunk-sizing tool for telephone networks with holding times in the minutes range, but, being a mathematical equation, it applies at any time-scale.
Extended Erlang B differs from the classic Erlang-B assumptions by allowing for a proportion of blocked callers to try again, causing an increase in offered traffic from the initial baseline level. It is an iterative calculation rather than a formula and adds an extra parameter, the recall factor "R"f, which defines the recall attempts.
The steps in the process are as follows. It starts at iteration "k" = 0 with a known initial baseline level of traffic "E"0, which is successively adjusted to calculate a sequence of new offered traffic values "E"k+1, each of which accounts for the recalls arising from the previously calculated offered traffic "E"k.
1. Calculate the probability of a caller being blocked on their first attempt, "P"b = "B"("E"k, "m"), as above for Erlang B.
2. Calculate the probable number of blocked calls: "B"k = "E"k · "P"b.
3. Calculate the number of recalls, "R"k, assuming a fixed recall factor, "R"f: "R"k = "B"k · "R"f.
4. Calculate the new offered traffic: "E"k+1 = "E"0 + "R"k, where "E"0 is the initial (baseline) level of traffic.
5. Return to step 1, substituting "E"k+1 for "E"k, and iterate until a stable value of "E" is obtained.
Once a satisfactory value of "E" has been found, the blocking probability "P"b and the recall factor can be used to calculate the probability that all of a caller's attempts are lost, not just their first call but also any subsequent retries.
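The five steps above amount to a fixed-point iteration. A minimal Python sketch follows, assuming blocking stays below 100% so the loop converges (the erlang_b helper repeats the inverse-recursion sketch from the Erlang B section; all names are ours):

```python
def erlang_b(traffic_e: float, servers_m: int) -> float:
    """Erlang B blocking probability via the stable inverse recursion."""
    inv_b = 1.0
    for m in range(1, servers_m + 1):
        inv_b = 1.0 + (m / traffic_e) * inv_b
    return 1.0 / inv_b

def extended_erlang_b(base_traffic: float, servers: int,
                      recall_factor: float, tol: float = 1e-9):
    """Iterate steps 1-5 until the offered traffic E stabilises."""
    offered = base_traffic                       # E_0: baseline traffic
    while True:
        pb = erlang_b(offered, servers)          # step 1: P_b
        blocked = offered * pb                   # step 2: blocked calls B_k
        recalls = blocked * recall_factor        # step 3: recalls R_k
        new_offered = base_traffic + recalls     # step 4: E_{k+1} = E_0 + R_k
        if abs(new_offered - offered) < tol:     # step 5: stable value of E
            return new_offered, pb
        offered = new_offered

# Example: 20 erlangs baseline on 22 circuits, half of blocked callers retry.
print(extended_erlang_b(20.0, 22, 0.5))
```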
The Erlang C formula expresses the probability that an arriving customer will need to queue (as opposed to immediately being served). Like the Erlang B formula, Erlang C assumes an infinite population of sources, which jointly offer traffic of "E" erlangs to "m" servers. However, if all the servers are busy when a request arrives from a source, the request is queued. An unlimited number of requests may be held in the queue in this way simultaneously. This formula calculates the probability of queuing offered traffic, assuming that blocked calls stay in the system until they can be handled. This formula is used to determine the number of agents or customer service representatives needed to staff a call centre, for a specified desired probability of queuing. However, the Erlang C formula assumes that callers never hang up while in queue, which makes the formula predict that more agents should be used than are really needed to maintain a desired service level.
"P"w = (("E"^"m" / "m"!) · ("m" / ("m" − "E"))) / (Σ "E"^"i" / "i"! for "i" = 0, 1, ..., "m" − 1, plus ("E"^"m" / "m"!) · ("m" / ("m" − "E")))
where:
"E" is the total offered traffic in erlangs,
"m" is the number of servers, and
"P"w is the probability that an arriving customer has to wait for service.
It is assumed that the call arrivals can be modeled by a Poisson process and that call holding times are described by an exponential distribution.
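As a sketch, Erlang C can also be computed from the Erlang B result through the standard identity "C"("E", "m") = "m" · "B" / ("m" − "E" · (1 − "B")), valid for "E" < "m" (again, the function names below are our own):

```python
def erlang_b(traffic_e: float, servers_m: int) -> float:
    """Erlang B blocking probability via the stable inverse recursion."""
    inv_b = 1.0
    for m in range(1, servers_m + 1):
        inv_b = 1.0 + (m / traffic_e) * inv_b
    return 1.0 / inv_b

def erlang_c(traffic_e: float, servers_m: int) -> float:
    """Probability that an arriving call has to queue (Erlang C),
    using C = m*B / (m - E*(1 - B)); requires E < m."""
    if traffic_e >= servers_m:
        raise ValueError("offered traffic must be less than the number of servers")
    b = erlang_b(traffic_e, servers_m)
    return servers_m * b / (servers_m - traffic_e * (1.0 - b))

# Example: 18 erlangs offered to 20 call-centre agents.
print(erlang_c(18.0, 20))  # probability that a caller must wait
```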
When Erlang developed the Erlang-B and Erlang-C traffic equations, they were built on a set of assumptions. These assumptions are accurate under most conditions; however, in the event of extremely high traffic congestion, Erlang's equations fail to accurately predict the correct number of circuits required because of re-entrant traffic. This is termed a high-loss system, where congestion breeds further congestion at peak times. In such cases, it is first necessary for many additional circuits to be made available so that the high loss can be alleviated. Once this action has been taken, congestion will return to reasonable levels and Erlang's equations can then be used to determine exactly how many circuits are really required.
An example of an instance which would cause such a high-loss system to develop would be if a TV-based advertisement were to announce a particular telephone number to call at a specific time. In this case, a large number of people would simultaneously phone the number provided. If the service provider had not catered for this sudden peak demand, extreme traffic congestion would develop and Erlang's equations could not be used. | https://en.wikipedia.org/wiki?curid=10283
Eligible receiver
In gridiron football, not all players on offense are entitled to receive a forward pass. Only an eligible pass receiver may legally catch a forward pass, and only an eligible receiver may advance beyond the neutral zone if a forward pass crosses into the neutral zone. If the pass is received by a non-eligible receiver, it is "illegal touching" (five yards and loss of down). If an ineligible receiver is beyond the neutral zone when a forward pass crossing the neutral zone is thrown, a foul of "ineligible receiver downfield" (five yards, but no loss of down) is called. Each league has slightly different rules regarding who is considered an eligible receiver.
The NCAA rulebook defines eligible receivers for college football in Rule 7, Section 3, Article 3. The determining factors are the player's position on the field at the snap and their jersey number. Specifically, any players on offense wearing numbers between 50 and 79 are always ineligible. All defensive players are eligible receivers and offensive players who are not wearing an ineligible number are eligible receivers if they meet one of the following three criteria:
Players may only wear eligible numbers at an ineligible position when it is obvious that a punt or field goal is to be attempted.
If a player is to change between eligible and ineligible positions, they must physically change jersey numbers to reflect the position.
A receiver loses his eligibility by leaving the field of play unless he was forced out by a defensive player and immediately attempts to get back inbounds (Rule 7-3-4). All players on the field become eligible as soon as the ball is touched by a defensive player or an official during play (Rule 7-3-5).
In both American and Canadian professional football, every player on the defensive team is considered eligible. The offensive team must have at least seven players lined up on the line of scrimmage. Of the players on the line of scrimmage, only the two players on the ends of the line of scrimmage are eligible receivers. The remaining players are in the backfield (four in American football, five in Canadian football), including the quarterback. These backfield players are also eligible receivers. In the National Football League (NFL), a quarterback who takes his stance behind center as a T-formation quarterback is not eligible unless, before the ball is snapped, he legally moves to a position at least one yard behind the line of scrimmage or on the end of the line, and is stationary in that position for at least one second before the snap, but is nonetheless not counted toward the seven men required on the line of scrimmage.
If, for example, eight men line up on the line of scrimmage, the team loses an eligible receiver. This can often happen when a flanker or slot receiver, who is supposed to line up behind the line of scrimmage, instead lines up on the line of scrimmage between the offensive line and a split end. In most cases where a pass is caught by an ineligible receiver, it is usually because the quarterback was under pressure and threw it to an offensive lineman out of desperation.
Eligible receivers must wear certain uniform numbers, so that the officials can more easily distinguish between eligible and ineligible receivers. In the NFL, running backs must wear numbers 20 to 49, tight ends must wear numbers 80 to 89 (or 40 to 49 if the numbers 80 to 89 have been exhausted), and wide receivers must wear numbers 10 to 19 or 80 to 89. In the CFL, ineligible receivers must wear numbers 50 to 69; all other numbers (including 0 and 00) may be worn by eligible receivers. A player who is not wearing a number that corresponds to an eligible receiver is ineligible even if he lines up in an eligible position. However, a player who reports to the referee that he intends to be eligible in the following play is allowed to line up and act as an eligible receiver. An example of this was a 1985 NFL game in which William Perry, wearing number 72 and normally a defensive lineman, was made an eligible receiver on an offensive play and successfully caught a touchdown pass. A more recent and more commonly cited example is former New England Patriots linebacker Mike Vrabel lining up as a tight end in goal-line situations. In the 2018 season, George Fant also lined up at the tight end position for the Seattle Seahawks due to injuries to the starting tight ends Ed Dickson and Will Dissly. In the 2019 season, the Atlanta Falcons declared right tackle Ty Sambrailo eligible on many plays before throwing him the ball for a 35-yard touchdown against the Tampa Bay Buccaneers.
Before the snap of the ball, in the American game, backfield players may only move parallel to the line of scrimmage, only one back may be in motion at any given time, and if forward motion has occurred, the back must be still for a full second before the snap. The receiver may be in motion laterally or away from the line of scrimmage at the snap. A breach of this rule results in a penalty for illegal procedure (five yards). However, in the Canadian game, eligible receivers may move in any direction before the snap, any number may be in motion at any one time, and there is no need to be motionless before the snap.
The rules on eligible receivers only apply to forward passes. Any player may legally catch a backwards or lateral pass.
In the American game, once the play has started, eligible receivers can become ineligible depending on how the play develops. Any eligible receiver that goes out of bounds is no longer an eligible receiver and cannot receive a forward pass, unless that player re-establishes by taking three steps in bounds. Also, if a pass is touched by any defensive player or eligible offensive receiver (tipped by a defensive lineman, slips through a receiver's hands, etc.), every offensive player immediately becomes eligible. In the CFL all players become eligible receivers if a pass is touched by a member of the defensive team. A proposed rule change in the XFL would make all players behind the line of scrimmage eligible receivers, regardless of position or number.
In high school football, the rules of eligibility are roughly the same as in the college game. However, as of February 2009, at least five players must wear numbers between 50 and 79 on first, second, or third down, which by rule would make them ineligible receivers. This was because of a change in the definition of a scrimmage-kick formation made by the National Federation of State High School Associations (NFHS). The change was intended to close a loophole in the rules which allowed teams to run an A-11 offense, in which a team could legally be exempted from eligibility numbering restrictions if the player receiving the snap was at least seven yards behind the line of scrimmage.
In 2019, the NFHS changed the rules slightly, instead measuring the number of players behind the line of scrimmage, limiting that number to four. The minimum number of players on the offensive line was reduced from seven to five; however, because of the limit on backs, the only way to legally play with fewer than five ineligible receivers is to play with fewer than 11 players on the field. | https://en.wikipedia.org/wiki?curid=10285 |
Enver Hoxha
Enver Hoxha (16 October 1908 – 11 April 1985) was an Albanian communist politician who served as the First Secretary of the Party of Labour of Albania from 1941 until his death in 1985. He was also a member of the Politburo of the Party of Labour of Albania, chairman of the Democratic Front of Albania and commander-in-chief of the armed forces from 1944 until his death. He served as the 22nd Prime Minister of Albania from 1944 to 1954 and at various times also served as foreign minister and defence minister of the People's Socialist Republic of Albania.
Born in Gjirokastër in 1908, Hoxha became a grammar school teacher in 1936. Following Italy's invasion of Albania, he joined the Communist Party of Albania (renamed the Party of Labour of Albania in 1948) at its creation in 1941. Hoxha was elected First Secretary in March 1943 at the age of 34. Less than two years after the liberation of the country, the monarchy was abolished, King Zog was deposed, and Hoxha emerged as the country's dominant leader.
"40 Years of Socialist Albania", an Albanian publication from 1984, describes how during his four-decade rule, he focused on rebuilding the country, which was left in ruins after WWII, building Albania's first railway line, raising the adult literacy rate from 5% to 98%, eliminating epidemics, electrifying the country and leading Albania towards becoming agriculturally self-sufficient. Detractors criticize him for a series of political repressions which included the establishment and use of forced labor camps, extrajudicial killings and executions that targeted and eliminated dissidents, a large number of which were carried out by the Sigurimi secret police.
Hoxha's government was characterized by his proclaimed firm adherence to anti-revisionist Marxism–Leninism from the mid-1970s onwards. After his break with Maoism in the 1976–1978 period, numerous Maoist parties around the world declared themselves Hoxhaist. The International Conference of Marxist–Leninist Parties and Organizations (Unity & Struggle) is the best-known association of these parties today.
Hoxha was born in Gjirokastër in southern Albania (then a part of the Ottoman Empire), the son of Halil Hoxha, a Muslim Tosk cloth merchant who travelled widely across Europe and the US, and Gjylihan (Gjylo) Hoxha née "Çuçi".
The Hoxha family was attached to the Bektashi Order. In 1916 his father brought him to seek the blessing of Baba Selim of the Zall Teqe.
After elementary school, he continued his studies at the city's senior high school, "Liria". He started his studies at the Gjirokastër Lyceum in 1923. After the lyceum was closed, Hoxha was awarded, through the intervention of Eqrem Libohova, a state scholarship to continue his studies at the French-language Albanian National Lyceum in Korçë, which he attended until 1930.
In 1930, Hoxha went to study at the University of Montpellier in France on a state scholarship in the faculty of natural science, but he lost the scholarship for neglecting his studies. He later went to Paris, where he presented himself to anti-Zogist immigrants as the brother-in-law of Bahri Omari.
From 1935 to 1936, he was employed as a secretary at the Albanian consulate in Brussels. After returning to Albania, he worked as a contract teacher at the Gymnasium of Tirana. Hoxha taught French and morals at the Korça Lyceum from 1937 to 1939 and also served as the caretaker of the school library.
On 7 April 1939, Albania was invaded by Fascist Italy. The Italians established a puppet government, the Albanian Kingdom (1939–43), under Shefqet Vërlaci. At the end of 1939, he was transferred to the Gjirokastra Gymnasium, but he soon returned to Tirana. He was helped by his best friend, Esat Dishnica, who introduced Hoxha to Dishnica's cousin Ibrahim Biçakçiu. Hoxha started to sleep in Biçakçiu's tobacco factory "Flora", and after a while Dishnica opened a shop with the same name, where Hoxha began working. He was a sympathizer of Korça's Communist Group.
On 8 November 1941, the Communist Party of Albania (renamed the Party of Labour of Albania in 1948) was founded. Hoxha, of the Korça group, was chosen by the two Yugoslav envoys as a Muslim representative among the seven members of the provisional Central Committee. The First Consultative Meeting of Activists of the Communist Party of Albania was held in Tirana from 8 to 11 April 1942, with Hoxha himself delivering the main report on 8 April 1942.
In July 1942, Hoxha wrote "Call to the Albanian Peasantry", issued in the name of the Communist Party of Albania. The call sought to enlist support in Albania for the war against the fascists. The peasants were encouraged to hoard their grain and refuse to pay taxes or livestock levies brought by the government. After the September 1942 Conference at Pezë, the National Liberation Movement was founded with the purpose of uniting the anti-fascist Albanians, regardless of ideology or class.
In March 1943, the first National Conference of the Communist Party formally elected Hoxha First Secretary. During WWII, the Soviet Union's role in Albania was negligible. On 10 July 1943, the Albanian partisans were organised into regular units of companies, battalions and brigades and named the Albanian National Liberation Army. The organization received military support from the British intelligence service, SOE. A General Headquarters was created, with Spiro Moisiu as commander and Hoxha as political commissar. The Yugoslav Partisans had a much more practical role, helping to plan attacks and exchanging supplies, but communication between them and the Albanians was limited and letters would often arrive late, sometimes well after a plan had already been agreed upon by the National Liberation Army without consultation from the Yugoslav partisans.
Within Albania, repeated attempts were made during the war to remedy the communications difficulties which faced partisan groups. In August 1943, a secret meeting, the Mukje Conference, was held between the anti-communist Balli Kombëtar (National Front) and the Communist Party of Albania, and it produced an agreement on joint action. To encourage the Balli Kombëtar to sign, the Greater Albania sections that included Kosovo (part of Yugoslavia) and Chamëria were made part of the agreement.
A problem developed when the Yugoslav Communists disagreed with the goal of a Greater Albania and asked the Communists in Albania to withdraw their agreement. According to Hoxha, Josip Broz Tito had not agreed that "Kosovo was Albanian" and that Serbian opposition made the transfer an unwise option. After the Albanian Communists repudiated the Greater Albania agreement, the Balli Kombëtar condemned the Communists, who in turn accused the Balli Kombëtar of siding with the Italians. The Balli Kombëtar, however, lacked support from the people. After judging the Communists as an immediate threat, the Balli Kombëtar sided with Nazi Germany, fatally damaging its image among those fighting the Fascists. The Communists quickly added to their ranks many of those disillusioned with the Balli Kombëtar and took centre stage in the fight for liberation.
The Permet National Congress held during that time called for a "new democratic Albania for the people". Although the monarchy was not formally abolished, King Zog was barred from returning to the country, which further increased the Communists' control. The Anti-Fascist Committee for National Liberation was founded, chaired by Hoxha. On 22 October 1944, the Committee became the Democratic Government of Albania after a meeting in Berat and Hoxha was chosen as interim Prime Minister. Tribunals were set up to try alleged war criminals who were designated "enemies of the people" and were presided over by Koçi Xoxe.
After liberation on 29 November 1944, several Albanian partisan divisions crossed the border into German-occupied Yugoslavia, where they fought alongside Tito's partisans and the Soviet Red Army in a joint campaign which succeeded in driving out the last pockets of German resistance. Marshal Tito, at a Yugoslav conference in later years, thanked Hoxha for the assistance the Albanian partisans had given during the War for National Liberation ("Lufta Nacionalçlirimtare"). The Democratic Front, dominated by the Albanian Communist Party, succeeded the National Liberation Front in August 1945, and the first post-war election was held on 2 December. The Front was the only legal political organisation allowed to stand in the elections, and the government reported that 93% of Albanians voted for it.
On 11 January 1946, Zog was officially deposed and Albania was proclaimed the People's Republic of Albania (renamed the People's Socialist Republic of Albania in 1976). As First Secretary, Hoxha was "de facto" head of state and the most powerful man in the country.
Albanians celebrate their independence day on 28 November (which is the date on which they declared their independence from the Ottoman Empire in 1912), while in the former People's Socialist Republic of Albania the national day was 29 November, the day the country was liberated from the Italians. Both days are currently national holidays.
Hoxha declared himself a Marxist–Leninist and strongly admired Soviet leader Joseph Stalin.
During the period of 1945–1950, the government adopted policies and actions intended to consolidate power which included extrajudicial killings and executions that targeted and eliminated anti-communists.
The Agrarian Reform Law was passed in August 1945. It confiscated land from beys and large landowners, giving it without compensation to peasants. Before the law was passed, 52% of all land was owned by large landowners; this declined to 16% afterwards. Illiteracy, which stood at 90–95% in rural areas in 1939, fell to 30% by 1950, and by 1985 it was equal to that of a Western country.
The State University of Tirana was established in 1957, which was the first of its kind in Albania. The Medieval Gjakmarrja (blood feud) was banned. Malaria, the most widespread disease, was successfully fought through advances in health care, the use of DDT, and through the draining of swamplands. From 1965 to 1985, no cases of malaria were reported, whereas previously Albania had the greatest number of infected patients in Europe. No cases of syphilis had been recorded for 30 years.
By 1949, the US and British intelligence organisations were working with King Zog and the mountain men of his personal guard. They recruited Albanian refugees and émigrés from Egypt, Italy and Greece, trained them in Cyprus, Malta and the Federal Republic of Germany (West Germany), and infiltrated them into Albania. Guerrilla units entered Albania in 1950 and 1952, but they were killed or captured by Albanian security forces. Kim Philby, a Soviet double agent working as a liaison officer between the British intelligence service and the US Central Intelligence Agency, had leaked details of the infiltration plan to Moscow, and the security breach claimed the lives of about 300 infiltrators.
At this point, relations with Yugoslavia had begun to change. The roots of the change began on 20 October 1944 at the Second Plenary Session of the Communist Party of Albania. The Session considered the problems that the post-independence Albanian government would face. However, the Yugoslav delegation led by Velimir Stoinić accused the party of "sectarianism and opportunism" and blamed Hoxha for these errors. He also stressed the view that the Yugoslav Communist partisans spearheaded the Albanian partisan movement.
Anti-Yugoslav members of the Albanian Communist Party had begun to think that this was a plot by Tito who intended to destabilize the Party. Koçi Xoxe, Sejfulla Malëshova and others who supported Yugoslavia were looked upon with deep suspicion. Tito's position on Albania was that it was too weak to stand on its own and that it would do better as a part of Yugoslavia. Hoxha alleged that Tito had made it his goal to get Albania into Yugoslavia, firstly by creating the Treaty of Friendship, Co-operation and Mutual Aid in 1946. In time, Albania began to feel that the treaty was heavily slanted towards Yugoslav interests, much like the Italian agreements with Albania under Zog that made the nation dependent upon Italy.
The first issue was that the Albanian lek was revalued in terms of the Yugoslav dinar as a customs union was formed, and Albania's economic plan came to be decided more by Yugoslavia. Albanian economists H. Banja and V. Toçi stated that the relationship between Albania and Yugoslavia during this period was exploitative and that it constituted attempts by Yugoslavia to make the Albanian economy an "appendage" to the Yugoslav economy. Hoxha then began to accuse Yugoslavia of misconduct.
Stalin advised Hoxha that Yugoslavia was attempting to annex Albania: "We did not know that the Yugoslavs, under the pretext of 'defending' your country against an attack from the Greek fascists, wanted to bring units of their army into the PRA [People's Republic of Albania]. They tried to do this in a very secretive manner. In reality, their aim in this direction was utterly hostile, for they intended to overturn the situation in Albania." By June 1947, the Central Committee of Yugoslavia began publicly condemning Hoxha, accusing him of talking an individualistic and anti-Marxist line. When Albania responded by making agreements with the Soviet Union to purchase a supply of agricultural machinery, Yugoslavia said that Albania could not enter into any agreements with other countries without Yugoslav approval.
Koçi Xoxe tried to stop Hoxha from improving relations with Bulgaria, reasoning that Albania would be more stable with one trading partner rather than with many. Nako Spiru, an anti-Yugoslav member of the Party, condemned Xoxe and vice versa. With no one coming to Spiru's defense, he viewed the situation as hopeless and feared that Yugoslav domination of his nation was imminent, which caused him to commit suicide in November.
At the Eighth Plenum of the Central Committee of the Party which lasted from 26 February to 8 March 1948, Xoxe was implicated in a plot to isolate Hoxha and consolidate his own power. He accused Hoxha of being responsible for the decline in relations with Yugoslavia and stated that a Soviet military mission should be expelled in favor of a Yugoslav counterpart. Hoxha managed to remain firm and his support had not declined. When Yugoslavia publicly broke with the Soviet Union, Hoxha's support base grew stronger. Then, on 1 July 1948, Tirana called on all Yugoslav technical advisors to leave the country and unilaterally declared all treaties and agreements between the two countries null and void. Xoxe was expelled from the party and on 13 June 1949, he was executed by hanging.
After the break with Yugoslavia, Hoxha aligned himself with the Soviet Union, for which he had a great admiration. From 1948 to 1960, $200 million in Soviet aid was given to Albania for technical and infrastructural expansion. Albania was admitted to the Comecon on 22 February 1949 and remained important both as a way to pressure Yugoslavia and to serve as a pro-Soviet force in the Adriatic Sea. A submarine base was built on the island of Sazan near Vlorë, posing a possible threat to the United States Sixth Fleet. Relations remained close until the death of Stalin on 5 March 1953. His death was met with 14 days of national mourning in Albania—more than in the Soviet Union. Hoxha assembled the entire population in the capital's largest square featuring a statue of Stalin, requested that they kneel, and made them take a two-thousand word oath of "eternal fidelity" and "gratitude" to their "beloved father" and "great liberator" to whom the people owed "everything".
Under Nikita Khrushchev, Stalin's successor, aid was reduced and Albania was encouraged to adopt Khrushchev's specialization policy. Under this policy, Albania would develop its agricultural output to supply the Soviet Union and the other Warsaw Pact nations, while those nations developed specific resource outputs of their own; in theory, this would strengthen the Warsaw Pact by offsetting the shortages of certain resources that many member nations faced. However, it also meant that Albanian industrial development, which Hoxha stressed heavily, would have to be significantly curtailed.
From 16 May to 17 June 1955, Nikolai Bulganin and Anastas Mikoyan visited Yugoslavia, and Khrushchev renounced the expulsion of Yugoslavia from the Communist bloc. Khrushchev also began making references to Palmiro Togliatti's theory of polycentrism. Hoxha, who had not been consulted on any of this, was offended, and Yugoslavia began pressing him to rehabilitate the image of Koçi Xoxe, which he steadfastly refused to do. In 1956, at the Twentieth Congress of the Communist Party of the Soviet Union, Khrushchev condemned the cult of personality that had been built up around Joseph Stalin and denounced him for many grave mistakes. Khrushchev then announced the theory of peaceful coexistence, which angered Hoxha greatly. The Institute of Marxist–Leninist Studies, led by Hoxha's wife Nexhmije, quoted Vladimir Lenin: "The fundamental principle of the foreign policy of a socialist country and of a Communist party is proletarian internationalism; not peaceful coexistence." Hoxha now took a more active stand against perceived revisionism.
Unity within the Albanian Party of Labour began to decline as well: a special delegate meeting held in Tirana in April 1956, attended by 450 delegates, produced unexpected results. The delegates "criticized the conditions in the party, the negative attitude toward the masses, the absence of party and socialist democracy, the economic policy of the leadership, etc." while also calling for discussions of the cult of personality and of the Twentieth Party Congress.
In 1956, Hoxha called for a resolution upholding the current leadership of the Party. The resolution was accepted, and all of the delegates who had spoken out were expelled from the party and imprisoned. Hoxha stated that this had been yet another of many Yugoslav-organized attempts to overthrow the leadership of Albania. The incident further consolidated his power, making Khrushchev-style reforms nearly impossible. In the same year, Hoxha traveled to the People's Republic of China, at a time when relations between China and the Soviet Union were deteriorating, and met Mao Zedong. Relations with China improved: China's share of the aid Albania received rose from 4.2% in 1955, before the visit, to 21.6% in 1957.
In an effort to keep Albania in the Soviet sphere, the Soviet Union increased aid, but the Albanian leadership continued to move closer to China. Relations with the Soviet Union remained at the same level until 1960, when Khrushchev met Sofoklis Venizelos, a liberal Greek politician. Khrushchev sympathized with the concept of an autonomous Greek North Epirus and hoped to use Greek claims to keep the Albanian leadership in line with Soviet interests. Hoxha reacted by sending only Hysni Kapo, a member of the Albanian Political Bureau, to the Third Congress of the Romanian Workers' Party in Bucharest, an event that heads of state were normally expected to attend. As relations between the two countries continued to deteriorate in the course of the meeting, Khrushchev said:
Relations with the Soviet Union began to decline rapidly. A hardline policy was adopted, and the Soviets reduced aid shipments, particularly grain, at a time when Albania needed them because of the danger of a flood-induced famine. In July 1960, a plot to overthrow the Albanian government, allegedly organized by the Soviet-trained Rear Admiral Teme Sejko, was discovered. Afterwards, two pro-Soviet members of the Party, Liri Belishova and Koço Tashko, were expelled, amid a humorous incident involving Tashko's pronunciation of "точка" (Russian for "full stop").
In August, the Party's Central Committee sent a letter of protest to the Central Committee of the Communist Party of the Soviet Union, stating its displeasure at having an anti-Albanian Soviet ambassador in Tirana. The Fourth Congress of the Party, held from 13 to 20 February 1961, was the last meeting in Albania attended by the Soviet Union or the other Eastern European nations. During the congress, the Soviet Union was condemned while China was praised, and Mehmet Shehu stated that although many members of the Party had been accused of tyranny, the charge was baseless, for unlike the Soviet Union, Albania was led by genuine Marxists.
The Soviet Union retaliated by threatening "dire consequences" if the condemnations were not retracted. Days later, Khrushchev and Antonín Novotný, President of Czechoslovakia (Albania's largest source of aid besides the Soviet Union), threatened to cut off economic aid. In March, Albania was not invited to the meeting of the Warsaw Pact nations, although it had been a founding member in 1955, and in April all Soviet technicians were withdrawn from the country. In May nearly all Soviet troops at the Orikum naval base were withdrawn, leaving the Albanians with four submarines and other military equipment.
On 7 November 1961, Hoxha made a speech in which he called Khrushchev a "revisionist, an anti-Marxist and a defeatist". He portrayed Stalin as the last Communist leader of the Soviet Union and began to stress Albania's independence. By 11 November, the USSR and every other Warsaw Pact nation had broken relations with Albania, and Albania was unofficially excluded, by not being invited, from both the Warsaw Pact and Comecon. When the Soviet Union attempted to claim control of the port of Vlorë under a lease agreement, the Albanian Party passed a law prohibiting any other nation from owning an Albanian port through lease or otherwise. The Soviet–Albanian split was now complete.
As Hoxha's leadership continued, he took on an increasingly theoretical stance, writing critiques grounded in theory and in the events of the day; his most notable criticisms were his condemnations of Maoism after 1978. A major achievement under Hoxha was the advancement of women's rights. Albania had been one of the most patriarchal countries in Europe, if not the most. The "Code of Lekë", which regulated the status of women, states: "A woman is known as a sack, made to endure as long as she lives in her husband's house." Women were not allowed to inherit anything from their parents, and discrimination extended even to the case of the murder of a pregnant woman:
Women were forbidden from obtaining a divorce, and a wife's parents were obliged to return a runaway daughter to her husband or else suffer shame that could result in a generations-long blood feud. During World War II the Albanian Communists encouraged women to join the partisans, and after the war women were encouraged to take up menial jobs, as the education necessary for higher-level work was out of most women's reach. In 1938, 4% of women worked in the various sectors of the economy; by 1970 this figure had risen to 38%, and by 1982 to 46%.
During the Cultural and Ideological Revolution (discussed below), women were encouraged to take up "all" jobs, including government posts: by 1985, women made up 40.7% of the People's Councils and 30.4% of the People's Assembly, and two women had entered the Central Committee. In 1978, 15.1 times as many females attended eight-year schools as had done so in 1938, and 175.7 times as many attended secondary schools. By 1978, 101.9 times as many women attended higher schools as in 1957. Hoxha said of women's rights in 1967: "The entire party and country should hurl into the fire and break the neck of anyone who dared trample underfoot the sacred edict of the party on the defense of women's rights."

In 1969, direct taxation was abolished, and during this period the quality of schooling and health care continued to improve. An electrification campaign begun in 1960 was expected to bring electricity to the entire nation by 1985; instead, this was achieved on 25 October 1970, making Albania the first nation in the world with complete electrification. During the Cultural and Ideological Revolution of 1967–1968 the military moved away from traditional Communist army tactics and began to adhere to the Maoist strategy known as people's war, which included the abolition of military ranks; these were not fully restored until 1991. Mehmet Shehu said of the country's health service in 1979:
Hoxha's legacy also included a network of 173,371 one-man concrete bunkers across a country of 3 million inhabitants, intended to serve as look-outs and gun emplacements, along with chemical weapons. The bunkers were built to be strong and transportable, so that they could be lowered into a prepared hole by crane or helicopter. They ranged from machine-gun pillboxes and beach bunkers to underground naval facilities and even mountain and underground air force installations.
Hoxha's internal policies were true to the Stalinist paradigm he admired, and the personality cult organized around him by the Party in the 1970s bore a striking resemblance to Stalin's. At times it reached an intensity as extreme as the personality cult of Kim Il-sung (which Hoxha condemned), with Hoxha portrayed as a genius commenting on virtually every facet of life, from culture to economics to military matters. Every schoolbook required one or more quotations from him on the subjects being studied, and the Party honored him with titles such as Supreme Comrade, Sole Force and Great Teacher. He adopted a new type of military salute for the People's Army, known today as the Hoxhaist salute, in which soldiers clench the right fist and raise it to shoulder level. It replaced the Zogist salute, which had been used by the Royal Albanian Army for many years.
Hoxha's governance was also distinguished by its encouragement of a high birthrate. A woman who bore an above-average number of children received the government award of "Heroine Mother" (in Albanian: "Nënë Heroinë") along with cash rewards. Abortion was heavily restricted in order to keep birth rates high, though it was not completely banned: it was permitted when the birth endangered the mother's life, with each case decided by a district medical commission. As a result, the population of Albania tripled from 1 million in 1944 to around 3 million in 1985.
In Albania's Third Five-Year Plan, China promised a loan of $125 million to build twenty-five chemical, electrical and metallurgical plants called for under the Plan. Even so, the nation experienced a difficult transition: Chinese technicians were less skilled than Soviet ones, and the great distance between the two nations, together with Albania's poor relations with its neighbors, further complicated matters. Unlike Yugoslavia or the USSR, China had comparatively little economic influence on Albania during Hoxha's leadership, whereas during the previous fifteen years (1946–1961) at least 50% of the economy had been tied to foreign commerce.
By the time the 1976 Constitution was promulgated, prohibiting foreign debt, aid and investment, Albania had become essentially self-sufficient, although it lacked modern technology. Ideologically, Hoxha found Mao's initial views to be in line with Marxism–Leninism: Mao condemned Nikita Khrushchev's alleged revisionism and was also critical of Yugoslavia. Aid from China was interest-free and did not have to be repaid until Albania could afford to do so.
China never intervened in decisions about Albania's economic output, and Chinese technicians worked for the same wages as Albanian workers, unlike Soviet technicians, who sometimes earned more than three times Hoxha's own pay. Albanian newspaper articles were reprinted in Chinese newspapers and read on Chinese radio, and Albania led the movement to give the People's Republic of China a seat on the UN Security Council, an effort which proved successful in 1971 when the People's Republic took over the Republic of China's seat. During this period Albania became the world's second-largest producer of chromium, an important export for the country. Strategically, the Adriatic Sea was also attractive to China, and the Chinese leadership hoped to gain more allies in Eastern Europe with Albania's help, although this effort failed. Zhou Enlai visited Albania in January 1964, and on 9 January "The 1964 Sino-Albanian Joint Statement" was signed in Tirana. The statement said of relations between socialist countries:
Like Albania, China defended the "purity" of Marxism by attacking both US imperialism and "Soviet and Yugoslav revisionism", treating the two equally under a "dual adversary" theory. Yugoslavia was viewed as both a "special detachment of U.S. imperialism" and a "saboteur against world revolution". These views, however, began to change in China, which became one of the major issues Albania had with the alliance. Also unlike Yugoslavia and the Soviet Union, the Sino-Albanian alliance lacked "an organizational structure for regular consultations and policy coordination, and it was also characterized by an informal relationship which was conducted on an 'ad hoc' basis." Mao made a speech on 3 November 1966 in which he claimed that Albania was the only Marxist–Leninist state in Europe, stating that "an attack on Albania will have to reckon with great People's China. If the U.S. imperialists, the modern Soviet revisionists or any of their lackeys dare to touch Albania in the slightest, nothing lies ahead for them but a complete, shameful and memorable defeat." Likewise, Hoxha stated: "You may rest assured, comrades, that come what may in the world at large, our two parties and our two peoples will certainly remain together. They will fight together and they will win together."
Following the Cultural Revolution, China entered a four-year period of relative diplomatic isolation, during which Sino-Albanian relations reached their zenith. On 20 August 1968, Albania condemned the Soviet invasion of Czechoslovakia, as well as the Brezhnev doctrine, and on 5 September it officially withdrew from the Warsaw Pact. Relations with China began to deteriorate on 15 July 1971, when United States President Richard Nixon agreed to visit China to meet with Zhou Enlai. Hoxha felt betrayed, and the government was in a state of shock. On 6 August, a letter was sent from the Central Committee of the Albanian Party of Labour to the Central Committee of the Communist Party of China, calling Nixon a "frenzied anti-Communist". The letter stated:
The result was a 1971 message from the Chinese leadership stating that Albania could not depend on an indefinite flow of further Chinese aid, and in 1972 Albania was advised to "curb its expectations about further Chinese contributions to its economic development". By 1973, Hoxha wrote in his diary "Reflections on China" that the Chinese leaders:
In response, trade with Comecon countries (though trade with the Soviet Union remained blocked) and with Yugoslavia grew. Trade with Third World nations rose from $0.5 million in 1973 to $8.3 million in 1974, from 0.1% to 1.6% of Albania's total trade. Following Mao's death on 9 September 1976, Hoxha remained optimistic about Sino-Albanian relations, but in August 1977 Hua Guofeng, the new leader of China, stated that Mao's Three Worlds Theory would become official foreign policy. Hoxha viewed this as a way for China to justify treating the U.S. as the "secondary enemy" while viewing the Soviet Union as the main one, thus allowing China to trade with the U.S. In his words, "the Chinese plan of the 'third world' is a major diabolical plan, with the aim that China should become another superpower, precisely by placing itself at the head of the 'third world' and the 'non-aligned world'." From 30 August to 7 September 1977, Tito visited Beijing and was welcomed by the Chinese leadership. At this point, the Albanian Party of Labour declared that China was now a revisionist state akin to the Soviet Union and Yugoslavia, and that Albania was the only Marxist–Leninist state on Earth. Hoxha stated:
On 13 July 1978, China announced that it was cutting off all aid to Albania. For the first time in modern history, Albania did not have either an ally or a major trading partner.
Certain clauses in the 1976 constitution effectively circumscribed the exercise of political liberties which the government interpreted as contrary to the established order. In addition, the government denied the population access to information other than that disseminated by the government-controlled media. Internally, the Sigurimi followed the repressive methods of the NKVD, MGB, KGB, and the East German Stasi. At one point, every third Albanian had either been interrogated by the Sigurimi or incarcerated in labour camps. To eliminate dissent, the government imprisoned thousands in forced-labour camps or executed them for crimes such as alleged treachery or for disrupting the proletarian dictatorship. Travel abroad was forbidden after 1968 to all but those who were on official business.
Albania, at that time the only predominantly Muslim country in Europe, largely owing to Turkish influence in the region, had not identified religion with ethnicity as the Ottoman Empire had: in the Empire, Muslims were viewed as Turks, Orthodox Christians as Greeks, and Catholics as Latins. Hoxha nevertheless regarded religion as a serious issue, feeling that it fueled Greek separatism in southern Albania and divided the nation in general. The Agrarian Reform Law of 1945 confiscated much of the church's property in the country. Catholics were the earliest religious community to be targeted, since the Vatican was seen as an agent of Fascism and anti-Communism. In 1946 the Jesuit Order was banned, and the Franciscans followed in 1947. "Decree No. 743" (On Religion) sought a national church and forbade religious leaders to associate with foreign powers.
Mother Teresa, a Catholic nun whose family resided in Albania during Hoxha's rule, was denied the chance to see them because she was viewed as a dangerous agent. Despite multiple requests, and despite many countries asking on her behalf, she was never granted the opportunity to see her mother and sister, both of whom died during Hoxha's rule; the nun herself was only able to visit Albania five years after the communist regime collapsed. Dom Lush Gjergji, in his book "Our Mother Teresa", describes one of her trips to the embassy, where she was crying as she left the building, saying: "Dear God, I can understand and accept that I should suffer, but it is so hard to understand and accept why my mother has to suffer. In her old age she has no other wish than to see us one last time."
The Party focused on atheist education in schools, a tactic that proved effective primarily because the high birthrate encouraged after the war meant that a large share of the population passed through the new school system. During periods the religious consider holy, such as Lent and Ramadan, foods scorned by believers (dairy products, meat, etc.) were deliberately distributed in schools and factories, and those who refused to eat them were denounced.
Starting on 6 February 1967, the Party began to promote secularism over Abrahamic religions. Hoxha, who had declared a "Cultural and Ideological Revolution" after being partly inspired by China's Cultural Revolution, encouraged communist students and workers to use more forceful tactics to discourage religious practices, although violence was initially condemned.
According to Hoxha, the surge in anti-theist activity began with the youth. The result of this "spontaneous, unprovoked movement" was the demolition or conversion of all 2,169 churches and mosques in Albania. State atheism became official policy, and Albania was declared the world's first atheist state. Town and city names that echoed religious themes were abandoned for neutral secular ones, and religiously based personal names were made illegal; the "Dictionary of People's Names", published in 1982, contained 3,000 approved, secular names. In 1992, Monsignor Dias, the Papal Nuncio for Albania appointed by Pope John Paul II, said that of the three hundred Catholic priests present in Albania before the Communists came to power, only thirty were still active. Promotion of religion was banned and all clerics were outlawed as reactionaries; religious figures who refused to embrace the principles of Marxism–Leninism were either arrested or carried on their activities in hiding.
Enver Hoxha had declared during the anti-religious campaign that "the only religion of Albania is Albanianism", a quotation from the poem "O moj Shqiperi" ("O Albania") by the 19th-century Albanian writer Pashko Vasa.
Muzafer Korkuti, one of the dominant figures in post-war Albanian archaeology and now Director of the Institute of Archaeology in Tirana, said this in an interview on 10 July 2002:
Efforts were focused on the question of Illyrian–Albanian continuity.
An Illyrian origin of the Albanians (without denying "Pelasgian" roots) continued to play a significant role in Albanian nationalism, resulting in a revival of given names supposedly of "Illyrian" origin, at the expense of given names associated with Christianity. At first, Albanian nationalist writers opted for the Pelasgians as the forefathers of the Albanians, but as this form of nationalism flourished in Albania under Enver Hoxha, the Pelasgians became a secondary element to the Illyrian theory of Albanian origins, which could claim some support in scholarship.
The Illyrian descent theory soon became one of the pillars of Albanian nationalism, especially because it could provide some evidence of continuity of an Albanian presence both in Kosovo and Southern Albania, i.e., areas that were subject to ethnic conflicts between Albanians, Serbs and Greeks. Under the government of Enver Hoxha, an autochthonous ethnogenesis was promoted and physical anthropologists tried to demonstrate that Albanians were different from any other Indo-European populations, a theory now disproved.
These scholars claimed that the Illyrians were the most ancient people in the Balkans and greatly extended the age of the Illyrian language.
Hoxha and his government were hostile to Western (American and British-led) popular culture as it manifested itself in the mass media, along with the consumerism and social liberalism associated with it. In a speech to the Fourth Plenum of the Central Committee of the PLA (PLA-CC) on 26 June 1973, Hoxha declared a definitive break from any such Western bourgeois influence and what he described as its "degenerated bourgeois culture". Criticising the "spread of certain vulgar, alien tastes in music and art", which ran "contrary to socialist ethics and the positive traditions of our people", including "degenerate importations such as long hair, extravagant dress, screaming jungle music, coarse language, shameless behaviour and so on", Hoxha declared:
A new Constitution was decided upon by the Seventh Congress of the Albanian Party of Labour on 1–7 November 1976. According to Hoxha, "The old Constitution was the Constitution of the building of the foundations of socialism, whereas the new Constitution will be the Constitution of the complete construction of a socialist society."
Self-reliance was now stressed more than ever. Citizens were encouraged to train in the use of weapons, an activity that was also taught in schools, so that partisan units could be formed quickly in case of need.
Borrowing and foreign investment were banned under Article 26 of the Constitution, which read: "The granting of concessions to, and the creation of foreign economic and financial companies and other institutions or ones formed jointly with bourgeois and revisionist capitalist monopolies and states as well as obtaining credits from them are prohibited in the People's Socialist Republic of Albania." Hoxha said of borrowing money and allowing investment from other countries:
During this period Albania was the most isolated and poorest country in Europe, socially backward by European standards and with the lowest standard of living on the continent. However, as a result of economic self-sufficiency, Albania had a minimal foreign debt: in 1983 it imported goods worth $280 million and exported goods worth $290 million, producing a trade surplus of $10 million.
In 1981, Hoxha ordered the execution of several party and government officials in a new purge. Prime Minister Mehmet Shehu, the second-most powerful man in Albania and Hoxha's closest comrade-in-arms for 40 years, was reported to have committed suicide in December 1981. He was subsequently condemned as a "traitor" to Albania and accused of serving multiple foreign intelligence agencies; it is generally believed that he either was killed or shot himself in a power struggle or over foreign-policy differences with Hoxha. During this period Hoxha also wrote a large assortment of books, resulting in over 65 volumes of collected works, condensed into six volumes of selected works.
Hoxha suffered a heart attack in 1973 from which he never fully recovered. In increasingly precarious health from the late 1970s onward, he turned most state functions over to Ramiz Alia. In his final days he was confined to a wheelchair, suffering from diabetes, which he had developed in 1948, and from cerebral ischemia, which had afflicted him since 1983. On 9 April 1985 he was struck by ventricular fibrillation; over the next forty-eight hours he had repeated episodes of the arrhythmia, and he died in the early morning of 11 April 1985 at the age of 76. Hoxha's body lay in state at the building of the Presidium of the People's Assembly for three days before he was buried on 15 April, after a memorial service on Skanderbeg Square. The government refused to accept any foreign delegations at the funeral and condemned even the Soviet message of condolences as "unacceptable". After the burial he was succeeded as head of state by Ramiz Alia, who gained control of the party leadership two days later.
Hoxha's death left Albania with a legacy of isolation and fear of the outside world. Despite some economic progress made by Hoxha, the country was in economic stagnation; Albania had been the poorest European country throughout much of the Cold War period. Following the transition to capitalism in 1992, Hoxha's legacy diminished, so that by the early 21st century very little of it was still in place in Albania.
The surname "Hoxha" is the Albanian variant of Hodja (from ), a title given to his ancestors due to their efforts to teach Albanians about Islam. In addition, among the population, he was widely known by his nickname of "Dulla", a short form for the Muslim name Abdullah stemming from his Muslim roots.
Hoxha's parents were Halil and Gjylihan (Gjylo) Hoxha, and he had three sisters: Fahrije, Haxhire and Sanije. His uncle Hysen Hoxha was a militant who campaigned vigorously for the independence of Albania, which came when Enver was four years old. His grandfather Beqir was involved in the Gjirokastër section of the League of Prizren.
Hoxha's son Sokol Hoxha was the CEO of the Albanian Post and Telecommunication service and is married to Liliana Hoxha. Sali Berisha, later the democratically elected President of Albania, was often seen socializing with Sokol Hoxha and other close relatives of leading communist figures.
Hoxha's daughter, Pranvera, is an architect. Along with her husband, Klement Kolaneci, she designed the Enver Hoxha Museum in Tirana, a white-tiled pyramid. Some sources have referred to the edifice, said to be the most expensive ever constructed in Albanian history, as the "Enver Hoxha Mausoleum", though this was not an official appellation. The museum opened in 1988, three years after her father's death, and in 1991 was transformed into a conference centre and exhibition venue renamed Pyramid of Tirana.
Banda Mustafaj was a group of four Albanian émigrés, led by Xhevdet Mustafa, who sought to assassinate Enver Hoxha in 1982. The gang was connected to counter-revolutionary elements such as the Albanian mafia and members of the royal House of Zogu. The plan failed: two of its members were killed and a third was arrested. It was the only serious attempt on Hoxha's life. | https://en.wikipedia.org/wiki?curid=10286
Hirohito
Hirohito (29 April 1901 – 7 January 1989), known posthumously as Emperor Shōwa, was the 124th Emperor of Japan, reigning from 25 December 1926 until his death. At the start of his reign, Japan was already one of the great powers—the ninth-largest economy in the world, the third-largest naval power, and one of the four permanent members of the Council of the League of Nations. He was the head of state under the Constitution of the Empire of Japan during Japan's imperial expansion, militarization, and involvement in World War II. After Japan's surrender, he was not prosecuted for war crimes as many other leading government figures were, and his degree of involvement in wartime decisions remains controversial. During the post-war period, he became the symbol of the state of Japan under the post-war constitution and of Japan's recovery, and by the end of his reign Japan had emerged as the world's second-largest economy.
Born in Tokyo's Aoyama Palace (during the reign of his grandfather, Emperor Meiji) on 29 April 1901, Hirohito was the first son of 21-year-old Crown Prince Yoshihito (the future Emperor Taishō) and 17-year-old Crown Princess Sadako (the future Empress Teimei). He was the grandson of Emperor Meiji and Yanagihara Naruko. His childhood title was Prince Michi. On the 70th day after his birth, Hirohito was removed from the court and placed in the care of the family of Count Kawamura Sumiyoshi, a former vice-admiral, who was to rear him as if he were his own grandchild. At the age of 3, Hirohito and his brother Yasuhito were returned to court when Kawamura died – first to the imperial mansion in Numazu, Shizuoka, then back to the Aoyama Palace. In 1908 he began elementary studies at the Gakushūin (Peers School).
When his grandfather, Emperor Meiji, died on 30 July 1912, Hirohito's father, Yoshihito, assumed the throne, and Hirohito became the heir apparent. At the same time, he was formally commissioned in both the army and navy as a second lieutenant and ensign, respectively, and was also decorated with the Grand Cordon of the Order of the Chrysanthemum. In 1914, he was promoted to the ranks of Lieutenant in the army and Sub-Lieutenant in the navy, then to Captain and Lieutenant in 1916. Hirohito was formally proclaimed Crown Prince and heir apparent on 2 November 1916; but an investiture ceremony was not strictly necessary to confirm this status as heir to the throne.
Hirohito attended Gakushūin Peers' School from 1908 to 1914 and then a special institute for the crown prince (Tōgū-gogakumonsho) from 1914 to 1921. In 1920 Hirohito was promoted to the rank of Major in the army and Lieutenant Commander in the navy.
From 3 March to 3 September 1921 (Taisho 10), the Crown Prince made official visits to the United Kingdom, France, the Netherlands, Belgium, Italy and Vatican City. This was the first visit to Europe by the Crown Prince, and despite strong opposition in Japan, this was realized by the efforts of elder Japanese statesmen (Genrō) such as Yamagata Aritomo and Saionji Kinmochi.
The departure of Prince Hirohito was widely reported in the newspapers. He travelled aboard the Japanese battleship Katori, which departed from Yokohama and called at Naha, Hong Kong, Singapore, Colombo, Suez, Cairo, and Gibraltar, arriving in Portsmouth two months later, on 9 May; he reached the British capital, London, the same day. He was welcomed in the UK as a partner of the Anglo-Japanese Alliance and met King George V and Prime Minister David Lloyd George. That evening a banquet was held at Buckingham Palace, where he met George V and Prince Arthur of Connaught; George V is said to have treated the Crown Prince, who was nervous in an unfamiliar foreign country, like a father, which relieved his tension. The next day he met Prince Edward at Windsor Castle, and banquets were held daily thereafter. In London he toured the British Museum, the Tower of London, the Bank of England, Lloyd's Marine Insurance, Oxford University, the Army University and the Naval War College, and enjoyed the theatre at the New Oxford Theatre and the Delhi Theatre. At Cambridge University he heard Professor Tanner's lecture on the "Relationship between the British Royal Family and its People" and was awarded an honorary doctorate. He visited Edinburgh, Scotland, from the 19th to the 20th, receiving an honorary Doctor of Laws from the University of Edinburgh, and stayed for three days at the residence of John Stewart-Murray, 7th Duke of Atholl, later remarking: "The rise of Bolsheviks won't happen if you live a simple life like the Duke of Atholl."
In Italy, he met with King Vittorio Emanuele III and others, attended official banquets in various countries, and visited places such as the fierce battlefields of World War I.
After his return to Japan, Hirohito became Regent of Japan (Sesshō) on 29 November 1921, in place of his ailing father who was affected by a mental illness. In 1923 he was promoted to the rank of Lieutenant-Colonel in the army and Commander in the navy, and to army Colonel and Navy Captain in 1925.
During Hirohito's regency, a number of important events occurred:
In the Four-Power Treaty on Insular Possessions signed on 13 December 1921, Japan, the United States, Britain, and France agreed to recognize the status quo in the Pacific, and Japan and Britain agreed to terminate formally the Anglo-Japanese Alliance. The Washington Naval Treaty was signed on 6 February 1922. Japan withdrew troops from the Siberian Intervention on 28 August 1922. The Great Kantō earthquake devastated Tokyo on 1 September 1923. On 27 December 1923, Daisuke Namba attempted to assassinate Hirohito in the Toranomon Incident but his attempt failed. During interrogation, he claimed to be a communist and was executed, but some have suggested that he was in contact with the Nagacho faction in the Army.
Prince Hirohito married his distant cousin Princess Nagako Kuni (the future Empress Kōjun), the eldest daughter of Prince Kuniyoshi Kuni, on 26 January 1924. They had two sons and five daughters (see Issue).
The daughters who lived to adulthood left the imperial family as a result of the American reforms of the Japanese imperial household in October 1947 (in the case of Princess Shigeko) or under the terms of the Imperial Household Law at the moment of their subsequent marriages (in the cases of Princesses Kazuko, Atsuko, and Takako).
On 25 December 1926, Hirohito assumed the throne upon the death of his father, Yoshihito. The Crown Prince was said to have received the succession ("senso"). The Taishō era's end and the Shōwa era's beginning (Enlightened Peace) were proclaimed. The deceased Emperor was posthumously renamed Emperor Taishō within days. Following Japanese custom, the new Emperor was never referred to by his given name but rather was referred to simply as "His Majesty the Emperor" which may be shortened to "His Majesty." In writing, the Emperor was also referred to formally as "The Reigning Emperor."
In November 1928, the Emperor's ascension was confirmed in ceremonies ("sokui") which are conventionally identified as "enthronement" and "coronation" ("Shōwa no tairei-shiki"); but this formal event would have been more accurately described as a public confirmation that his Imperial Majesty possesses the Japanese Imperial Regalia, also called the Three Sacred Treasures, which have been handed down through the centuries.
The first part of Hirohito's reign took place against a background of financial crisis and increasing military power within the government, exercised through both legal and extralegal means. The Imperial Japanese Army and Imperial Japanese Navy had held veto power over the formation of cabinets since 1900, and between 1921 and 1944 there were no fewer than 64 incidents of political violence.
Hirohito narrowly escaped assassination by a hand grenade thrown by a Korean independence activist, Lee Bong-chang, in Tokyo on 9 January 1932, in the Sakuradamon Incident.
Another notable case was the assassination of moderate Prime Minister Inukai Tsuyoshi in 1932, which marked the end of civilian control of the military. This was followed by an attempted military coup in February 1936, the February 26 incident, mounted by junior Army officers of the "Kōdōha" faction who had the sympathy of many high-ranking officers including Prince Chichibu (Yasuhito), one of the Emperor's brothers. This revolt was occasioned by a loss of political support by the militarist faction in Diet elections. The coup resulted in the murders of a number of high government and Army officials.
When Chief Aide-de-camp Shigeru Honjō informed him of the revolt, the Emperor immediately ordered that it be put down and referred to the officers as "rebels" ("bōto"). Shortly thereafter, he ordered Army Minister Yoshiyuki Kawashima to suppress the rebellion within the hour, and he asked Honjō for reports every 30 minutes. The next day, when told by Honjō that little progress was being made by the high command in quashing the rebels, the Emperor told him, "I myself will lead the Konoe Division and subdue them." The rebellion was suppressed following his orders on 29 February.
Starting from the Mukden Incident in 1931, Japan occupied Chinese territories and established puppet governments. Such "aggression was recommended to Hirohito" by his chiefs of staff and Prime Minister Fumimaro Konoe, and Hirohito never personally objected to any invasion of China. His main concern seems to have been the possibility of an attack by the Soviet Union in the north. His questions to his chief of staff, Prince Kan'in, and to the minister of the army, Hajime Sugiyama, were mostly about the time it would take to crush Chinese resistance.
According to Akira Fujiwara, Hirohito endorsed the policy of qualifying the invasion of China as an "incident" instead of a "war"; therefore, he did not issue any notice to observe international law in this conflict (unlike what his predecessors did in previous conflicts officially recognized by Japan as wars), and the Deputy Minister of the Japanese Army instructed the chief of staff of Japanese China Garrison Army on 5 August not to use the term "prisoners of war" for Chinese captives. This instruction led to the removal of the constraints of international law on the treatment of Chinese prisoners. The works of Yoshiaki Yoshimi and Seiya Matsuno show that the Emperor also authorized, by specific orders ("rinsanmei"), the use of chemical weapons against the Chinese. During the invasion of Wuhan, from August to October 1938, the Emperor authorized the use of toxic gas on 375 separate occasions, despite the resolution adopted by the League of Nations on 14 May condemning Japanese use of toxic gas.
On 27 September 1940, ostensibly under Hirohito's leadership, Japan was a contracting partner of the Tripartite Pact with Germany and Italy forming the Axis Powers. Before that, in July 1939, the Emperor quarrelled with his brother, Prince Chichibu, who was visiting him three times a week to support the treaty, and reprimanded the army minister Seishirō Itagaki. But after the success of the Wehrmacht in Europe, the Emperor consented to the alliance.
On 4 September 1941, the Japanese Cabinet met to consider war plans prepared by Imperial General Headquarters and decided that:
The objectives to be obtained were clearly defined: a free hand to continue with the conquest of China and Southeast Asia, no increase in US or British military forces in the region, and cooperation by the West "in the acquisition of goods needed by our Empire."
On 5 September, Prime Minister Konoe informally submitted a draft of the decision to the Emperor, just one day in advance of the Imperial Conference at which it would be formally implemented. That evening, the Emperor met with the chief of staff of the army, Sugiyama, the chief of staff of the navy, Osami Nagano, and Prime Minister Konoe. The Emperor questioned Sugiyama about the chances of success of an open war with the Occident. When Sugiyama answered positively, the Emperor scolded him:
Chief of Naval General Staff Admiral Nagano, a former Navy Minister and vastly experienced, later told a trusted colleague, "I have never seen the Emperor reprimand us in such a manner, his face turning red and raising his voice."
Nevertheless, all speakers at the Imperial Conference were united in favor of war rather than diplomacy. Baron Yoshimichi Hara, President of the Imperial Council and the Emperor's representative, then questioned them closely, producing replies to the effect that war would be considered only as a last resort from some, and silence from others.
At this point, the Emperor astonished all present by addressing the conference personally and, in breaking the tradition of Imperial silence, left his advisors "struck with awe." (Prime Minister Fumimaro Konoe's description of the event.) Hirohito stressed the need for peaceful resolution of international problems, expressed regret at his ministers' failure to respond to Baron Hara's probings, and recited a poem written by his grandfather, Emperor Meiji, which, he said, he had read "over and over again":
Recovering from their shock, the ministers hastened to express their profound wish to explore all possible peaceful avenues. The Emperor's presentation was in line with his practical role as leader of the State Shinto religion.
At this time, Army Imperial Headquarters was continually communicating with the Imperial household in detail about the military situation. On 8 October, Sugiyama signed a 47-page report to the Emperor (sōjōan) outlining in minute detail plans for the advance into Southeast Asia. During the third week of October, Sugiyama gave the Emperor a 51-page document, "Materials in Reply to the Throne," about the operational outlook for the war.
As war preparations continued, Prime Minister Fumimaro Konoe found himself more and more isolated and gave his resignation on 16 October. He justified himself to his chief cabinet secretary, Kenji Tomita, by stating:
The army and the navy recommended the candidacy of Prince Naruhiko Higashikuni, one of the Emperor's uncles. According to the Shōwa "Monologue," written after the war, the Emperor then said that if the war were to begin while a member of the imperial house was prime minister, the imperial house would have to carry the responsibility and he was opposed to this.
Instead, the Emperor chose the hard-line General Hideki Tōjō, who was known for his devotion to the imperial institution, and asked him to make a policy review of what had been sanctioned by the Imperial Conferences. On 2 November Tōjō, Sugiyama, and Nagano reported to the Emperor that the review of eleven points had been in vain. Emperor Hirohito gave his consent to the war and then asked: "Are you going to provide justification for the war?" The decision for war against the United States was presented for approval to Hirohito by General Tōjō, Naval Minister Admiral Shigetarō Shimada, and Japanese Foreign Minister Shigenori Tōgō.
On 3 November, Nagano explained in detail the plan of the attack on Pearl Harbor to the Emperor. On 5 November Emperor Hirohito approved in imperial conference the operations plan for a war against the Occident and had many meetings with the military and Tōjō until the end of the month. On 25 November Henry L. Stimson, United States Secretary of War, noted in his diary that he had discussed with US President Franklin D. Roosevelt the severe likelihood that Japan was about to launch a surprise attack and that the question had been "how we should maneuver them [the Japanese] into the position of firing the first shot without allowing too much danger to ourselves."
On the following day, 26 November 1941, US Secretary of State Cordell Hull presented the Japanese ambassador with the Hull note, which as one of its conditions demanded the complete withdrawal of all Japanese troops from French Indochina and China. Japanese Prime Minister Hideki Tojo said to his cabinet, "This is an ultimatum." On 1 December an Imperial Conference sanctioned the "War against the United States, United Kingdom and the Kingdom of the Netherlands."
On 8 December (7 December in Hawaii), 1941, in simultaneous attacks, Japanese forces struck at the Hong Kong Garrison, the US Fleet in Pearl Harbor and in the Philippines, and began the invasion of Malaya.
With the nation fully committed to the war, the Emperor took a keen interest in military progress and sought to boost morale. According to Akira Yamada and Akira Fujiwara, the Emperor made major interventions in some military operations. For example, he pressed Sugiyama four times, on 13 and 21 January and 9 and 26 February 1942, to increase troop strength and launch an attack on Bataan. On 9 February, 19 March, and 29 May, the Emperor ordered the Army Chief of Staff to examine the possibilities for an attack on Chungking, which led to Operation Gogo.
Some accounts hold that as the tide of war began to turn against Japan (around late 1942 and early 1943), the flow of information to the palace gradually bore less and less relation to reality, while others suggest that the Emperor worked closely with Prime Minister Hideki Tojo, continued to be well and accurately briefed by the military, and knew Japan's military position precisely right up to the point of surrender. The chief of staff of the General Affairs section of the Prime Minister's office, Shuichi Inada, remarked to Tōjō's private secretary, Sadao Akamatsu:
In the first six months of war, all the major engagements had been victories. Japanese advances were stopped in the summer of 1942 with the Battle of Midway and the landing of American forces on Guadalcanal and Tulagi in August. The Emperor played an increasingly influential role in the war; in eleven major episodes he was deeply involved in supervising the actual conduct of operations. Hirohito pressured the High Command to order an early attack on the Philippines in 1941–42, including the fortified Bataan peninsula. He secured the deployment of army air power in the Guadalcanal campaign. Following Japan's withdrawal from Guadalcanal, he demanded a new offensive in New Guinea, which was duly carried out but failed badly. Unhappy with the navy's conduct of the war, he criticized its withdrawal from the central Solomon Islands and demanded naval battles against the Americans for the losses they had inflicted in the Aleutians. The battles were disasters. Finally, it was at his insistence that plans were drafted for the recapture of Saipan and, later, for an offensive in the Battle of Okinawa. With the Army and Navy bitterly feuding, he settled disputes over the allocation of resources, and he helped plan military offensives.
The media, under tight government control, repeatedly portrayed him as lifting popular morale even as Japanese cities came under heavy air attack in 1944–45 and food and housing shortages mounted. Japanese retreats and defeats were celebrated by the media as successes that portended "Certain Victory." Only gradually did it become apparent to the Japanese people that the situation was very grim, as shortages of food, medicine, and fuel grew and U.S. submarines began wiping out Japanese shipping. Starting in mid-1944, American raids on the major cities of Japan made a mockery of the unending tales of victory. Later that year, with the downfall of Tojo's government, two other prime ministers were appointed to continue the war effort, Kuniaki Koiso and Kantarō Suzuki, each with the formal approval of the Emperor. Both were unsuccessful, and Japan was nearing disaster.
In early 1945, in the wake of the losses in the Battle of Leyte, Emperor Hirohito began a series of individual meetings with senior government officials to consider the progress of the war. All but ex-Prime Minister Fumimaro Konoe advised continuing the war. Konoe feared a communist revolution even more than defeat in war and urged a negotiated surrender. In February 1945 during the first private audience with the Emperor he had been allowed in three years, Konoe advised Hirohito to begin negotiations to end the war. According to Grand Chamberlain Hisanori Fujita, the Emperor, still looking for a "tennozan" (a great victory) in order to provide a stronger bargaining position, firmly rejected Konoe's recommendation.
With each passing week victory became less likely. In April, the Soviet Union issued notice that it would not renew its neutrality agreement. Japan's ally Germany surrendered in early May 1945. In June the cabinet reassessed the war strategy, only to decide more firmly than ever on a fight to the last man. This strategy was officially affirmed at a brief Imperial Council meeting, at which, as was normal, the Emperor did not speak.
The following day, Lord Keeper of the Privy Seal Kōichi Kido prepared a draft document which summarized the hopeless military situation and proposed a negotiated settlement. Extremists in Japan were also calling for a death-before-dishonor mass suicide, modeled on the "47 Ronin" incident. By mid-June 1945 the cabinet had agreed to approach the Soviet Union to act as a mediator for a negotiated surrender but not before Japan's bargaining position had been improved by repulse of the anticipated Allied invasion of mainland Japan.
On 22 June the Emperor met with his ministers saying, "I desire that concrete plans to end the war, unhampered by existing policy, be speedily studied and that efforts be made to implement them." The attempt to negotiate a peace via the Soviet Union came to nothing. There was always the threat that extremists would carry out a coup or foment other violence. On 26 July 1945, the Allies issued the Potsdam Declaration demanding unconditional surrender. The Japanese government council, the Big Six, considered that option and recommended to the Emperor that it be accepted only if one to four conditions were agreed upon, including a guarantee of the Emperor's continued position in Japanese society. The Emperor decided not to surrender.
That changed after the atomic bombings of Hiroshima and Nagasaki and the Soviet declaration of war. On 9 August, Emperor Hirohito told Kōichi Kido: "The Soviet Union has declared war and today began hostilities against us." On 10 August, the cabinet drafted an "Imperial Rescript ending the War" following the Emperor's indication that the declaration did not comprise any demand which prejudiced the prerogatives of His Majesty as a Sovereign Ruler.
On 12 August 1945, the Emperor informed the imperial family of his decision to surrender. One of his uncles, Prince Yasuhiko Asaka, asked whether the war would be continued if the "kokutai" (national polity) could not be preserved. The Emperor simply replied "Of course." On 14 August the Suzuki government notified the Allies that it had accepted the Potsdam Declaration.
On 15 August, a recording of the Emperor's surrender speech ("Gyokuon-hōsō", literally "Jewel Voice Broadcast") was broadcast over the radio (the first time the Emperor was heard on the radio by the Japanese people), announcing Japan's acceptance of the Potsdam Declaration. During the historic broadcast the Emperor stated: "Moreover, the enemy has begun to employ a new and most cruel bomb, the power of which to do damage is, indeed, incalculable, taking the toll of many innocent lives. Should we continue to fight, not only would it result in an ultimate collapse and obliteration of the Japanese nation, but also it would lead to the total extinction of human civilization." The speech also noted that "the war situation has developed not necessarily to Japan's advantage" and ordered the Japanese to "endure the unendurable." The speech, using formal, archaic Japanese, was not readily understood by many commoners. According to historian Richard Storry in "A History of Modern Japan", the Emperor typically used "a form of language familiar only to the well-educated" and to the more traditional samurai families.
A faction of the army opposed to the surrender attempted a coup d'état on the evening of 14 August, prior to the broadcast. They seized the Imperial Palace (the Kyūjō incident), but the physical recording of the emperor's speech was hidden and preserved overnight. The coup was crushed by the next morning, and the speech was broadcast.
In his first ever press conference given in Tokyo in 1975, when he was asked what he thought of the bombing of Hiroshima, the Emperor answered: "It's very regrettable that nuclear bombs were dropped and I feel sorry for the citizens of Hiroshima but it couldn't be helped because that happened in wartime" (shikata ga nai).
The issue of Emperor Hirohito's war responsibility remains controversial, and there is no consensus among scholars. During the war, the Allies frequently depicted Hirohito alongside Hitler and Mussolini as one of the three Axis dictators. The apologist thesis, which holds that Hirohito was a "powerless figurehead" with no involvement in wartime policies, was the dominant post-war narrative until 1989. Since Hirohito's death, critical historians have argued that he wielded more power than previously believed, while moderates argue that he had some involvement but that his power was limited by cabinet members, ministers and other figures of the military oligarchy.
Historians who follow this thesis believe Emperor Hirohito was directly responsible for the atrocities committed by the imperial forces in the Second Sino-Japanese War and in World War II. They feel that he, and some members of the imperial family such as his brother Prince Chichibu, his cousins Prince Takeda and Prince Fushimi, and his uncles Prince Kan'in, Prince Asaka, and Prince Higashikuni, should have been tried for war crimes.
The debate over Hirohito's responsibility for war crimes concerns how much real control the Emperor had over the Japanese military during the two wars. Officially, the imperial constitution, adopted under Emperor Meiji, gave full power to the Emperor. Article 4 prescribed that, "The Emperor is the head of the Empire, combining in Himself the rights of sovereignty, and exercises them, according to the provisions of the present Constitution," while according to article 6, "The Emperor gives sanction to laws and orders them to be promulgated and executed," and article 11, "The Emperor has the supreme command of the Army and the Navy." The Emperor was thus the leader of the Imperial General Headquarters.
Poison gas weapons, such as phosgene, were produced by Unit 731 and authorized by specific orders given by Hirohito himself, transmitted by the chief of staff of the army. For example, Hirohito authorised the use of toxic gas 375 times during the Battle of Wuhan from August to October 1938.
Historians such as Herbert Bix, Akira Fujiwara, Peter Wetzler, and Akira Yamada assert that the post-war view focusing on imperial conferences misses the importance of numerous "behind the chrysanthemum curtain" meetings where the real decisions were made between the Emperor, his chiefs of staff, and the cabinet. Historians such as Fujiwara and Wetzler, based on the primary sources and the monumental work of Shirō Hara, have produced evidence suggesting that the Emperor worked through intermediaries to exercise a great deal of control over the military and was neither bellicose nor a pacifist but an opportunist who governed in a pluralistic decision-making process. American historian Herbert P. Bix argues that Emperor Hirohito might have been the prime mover of most of the events of the two wars.
The view promoted by both the Japanese Imperial Palace and the American occupation forces immediately after World War II portrayed Emperor Hirohito as a powerless figurehead behaving strictly according to protocol while remaining at a distance from the decision-making processes. This view was endorsed by Prime Minister Noboru Takeshita in a speech on the day of Hirohito's death, in which Takeshita asserted that the war "had broken out against [Hirohito's] wishes." Takeshita's statement provoked outrage in East Asia and in Commonwealth nations such as the United Kingdom, Canada, Australia, and New Zealand. According to historian Fujiwara, "The thesis that the Emperor, as an organ of responsibility, could not reverse cabinet decision is a myth fabricated after the war." Historian Yinan He agrees, stating that the exoneration of the Emperor rested on a myth used to whitewash the complicity of many wartime political actors, including Hirohito himself.
In Japan, debate over the Emperor's responsibility was taboo while he was still alive. After his death, however, debate began to surface over the extent of his involvement and thus his culpability.
In the years immediately after Hirohito's death, the debate in Japan was fierce. Susan Chira reported, "Scholars who have spoken out against the late Emperor have received threatening phone calls from Japan's extremist right wing." One example of actual violence occurred in 1990, when the mayor of Nagasaki, Hitoshi Motoshima, was shot and critically wounded by a member of the ultranationalist group Seikijuku. A year earlier, in 1989, Motoshima had broken what was characterized as "one of [Japan's] most sensitive taboos" by asserting that Emperor Hirohito bore responsibility for World War II.
Kentarō Awaya argues that post-war Japanese public opinion supporting protection of the Emperor was influenced by U.S. propaganda promoting the view that the Emperor together with the Japanese people had been fooled by the military.
Regarding Hirohito's exemption from trial before the International Military Tribunal for the Far East, opinions were not unanimous. Sir William Webb, the president of the tribunal, declared: "This immunity of the Emperor, as contrasted with the part he played in launching the war in the Pacific, is, I think, a matter which the tribunal should take into consideration in imposing the sentences."
An account from the Vice Interior Minister in 1941, Michio Yuzawa, asserts that Hirohito was "at ease" with the attack on Pearl Harbor "once he had made a decision."
In late July 2018, the bookseller Takeo Hatano, an acquaintance of the descendants of Michio Yuzawa (Japanese Vice Interior Minister in 1941), released to Japan's "Yomiuri Shimbun" newspaper a memo by Yuzawa that Hatano had kept for nine years since he received it from Yuzawa's family. The bookseller said: "It took me nine years to come forward, as I was afraid of a backlash. But now I hope the memo would help us figure out what really happened during the war, in which 3.1 million people were killed."
Takahisa Furukawa, expert on wartime history from Nihon University, confirmed the authenticity of the memo, calling it "the first look at the thinking of Emperor Hirohito and Prime Minister Hideki Tojo on the eve of the Japanese attack on Pearl Harbor."
In this document, Yuzawa details a conversation he had with Tojo a few hours before the attack, quoting Tojo's account of the Emperor's demeanor; historian Furukawa drew his conclusions about the two leaders' thinking from this memo.
In August 2018 the diary of Shinobu Kobayashi, the Emperor's chamberlain between 1974 and 2000, was released. This diary contains numerous quotes from Hirohito (see below).
Jennifer Lind, associate professor of government at Dartmouth College and a specialist in Japanese war memory, and historian Takahisa Furukawa each drew similar conclusions from these quotes.
After the death of Emperor Showa, on 14 February 1989 (Heisei 1), Osamu Mimura (味村治), Director-General of the Cabinet Legislation Bureau, stated in the House of Councillors (under Prime Minister Noboru Takeshita's cabinet) that the Emperor bore no war responsibility under domestic or international law, on two grounds: the immunity provided by the Constitution of the Empire of Japan, and the fact that he was neither charged nor prosecuted at the International Military Tribunal for the Far East.
It is also argued that the Emperor did not defy the military oligarchy that took Japan into World War II until the first atomic bomb fell on Hiroshima, a view supported by Hirohito's own statements in interviews. It is also pointed out that the emperors had for millennia held great symbolic authority but little political power, so Hirohito had little reason to defy the military oligarchy: he could not overturn the cabinet's decision to start the war, and he was neither trained nor accustomed to do so. Hirohito said he received reports about military operations only after the military commanders had made detailed decisions, and that he made his own decision only twice: during the February 26 Incident and at the end of World War II.
Some scholars defend Hirohito's exemption from trial before the International Military Tribunal for the Far East. The Indian jurist Radhabinod Pal, for example, opposed the tribunal and wrote a 1,235-page dissenting judgment. He found the entire prosecution case weak regarding the alleged conspiracy to commit aggressive war together with the brutalization and subjugation of conquered nations, writing that there was "no evidence, testimonial or circumstantial, concomitant, prospectant, retrospectant, that would in any way lead to the inference that the government in any way permitted the commission of such offenses." He added that conspiracy to wage aggressive war was not illegal in 1937, or at any point since, and supported the acquittal of all the defendants. Pal considered the Japanese military operations justified, pointing to Chiang Kai-shek's support for the Western powers' trade boycotts, particularly the United States boycott of oil exports to Japan, and argued that the attacks on neighboring territories were justified to protect the Japanese Empire from a hostile environment, especially the Soviet Union; he regarded these as self-defense operations, which are not criminal. Pal said "the real culprits are not before us" and concluded that "only a lost war is an international crime."
A January 1989 declassified British government assessment of Hirohito said the Emperor was "uneasy with Japan's drift to war in the 1930s and 1940s but was too weak to alter the course of events." The dispatch by John Whitehead, former ambassador of the United Kingdom to Japan, to Foreign Secretary Geoffrey Howe was declassified on Thursday 20 July 2017 at the National Archives in London.
Writing in 1989, Britain's ambassador to Japan, John Whitehead, concluded that Hirohito was ultimately "powerless" and that comparisons with Hitler were "ridiculously wide of the mark." Had Hirohito pressed his views too insistently, he could have been isolated or replaced with a more pliant member of the royal family. The pre-war Meiji Constitution defined the emperor as "sacred" and all-powerful, but according to Whitehead, Hirohito's power was limited in practice by ministers and the military. Whitehead also argued that after World War II, Hirohito's humility was fundamental to the Japanese people's acceptance of the new 1947 constitution and the Allied occupation.
Shinobu Kobayashi was the Emperor's chamberlain from April 1974 until June 2000, when Empress Kojun died. For 26 years Kobayashi kept a diary recording Hirohito's remarks almost daily. It was made public on Wednesday 22 August 2018, after Kyodo News borrowed the diary from Kobayashi's family and analyzed it with Kazutoshi Hando, a writer and expert on Showa-era history, and the nonfiction writer Masayasu Hosaka. Some quotes from the diary follow:
On 27 May 1980, the Emperor wanted to express his regret about the Sino-Japanese War to Chinese Premier Hua Guofeng, who was visiting at the time, but was stopped by senior members of the Imperial Household Agency, who feared a backlash from far-right groups.
On 7 April 1987, two years before his death, a diary entry shows the Emperor haunted by discussions of his World War II responsibility and losing the will to live; Prince Takamatsu had died that February.
After the Emperor remarked that "there is nothing good in living long," Kobayashi tried to soothe him; senior chamberlain Ryogo Urabe's diary entry of the same day corroborates this, noting that Kobayashi "tried to soothe" the Emperor.
Even as the Emperor chose his uncle Prince Higashikuni as prime minister to assist the occupation, numerous leaders attempted to have Hirohito put on trial for alleged war crimes. Many members of the imperial family, such as Princes Chichibu, Takamatsu, and Higashikuni, pressured the Emperor to abdicate so that one of the princes could serve as regent until Crown Prince Akihito came of age. On 27 February 1946, the Emperor's youngest brother, Prince Mikasa (Takahito), even stood up in the privy council and indirectly urged the Emperor to step down and accept responsibility for Japan's defeat. According to the diary of Minister of Welfare Ashida, "Everyone seemed to ponder Mikasa's words. Never have I seen His Majesty's face so pale."
U.S. General Douglas MacArthur insisted that Emperor Hirohito retain the throne, seeing the Emperor as a symbol of the continuity and cohesion of the Japanese people. Some historians criticize the decision to exonerate the Emperor and to exempt from criminal prosecution all members of the imperial family implicated in the war, such as Prince Chichibu, Prince Asaka, Prince Higashikuni, and Prince Hiroyasu Fushimi.
Before the war crimes trials actually convened, SCAP, its International Prosecution Section (IPS), and Japanese officials worked behind the scenes not only to prevent the imperial family from being indicted, but also to influence the testimony of the defendants to ensure that no one implicated the Emperor. High officials in court circles and the Japanese government collaborated with Allied GHQ in compiling lists of prospective war criminals, while the individuals arrested as "Class A" suspects and incarcerated solemnly vowed to protect their sovereign against any possible taint of war responsibility. Thus, "months before the Tokyo tribunal commenced, MacArthur's highest subordinates were working to attribute ultimate responsibility for Pearl Harbor to Hideki Tōjō" by allowing "the major criminal suspects to coordinate their stories so that the Emperor would be spared from indictment." According to John W. Dower, "This successful campaign to absolve the Emperor of war responsibility knew no bounds. Hirohito was not merely presented as being innocent of any formal acts that might make him culpable to indictment as a war criminal, he was turned into an almost saintly figure who did not even bear moral responsibility for the war." According to Bix, "MacArthur's truly extraordinary measures to save Hirohito from trial as a war criminal had a lasting and profoundly distorting impact on Japanese understanding of the lost war."
Hirohito was not put on trial, but he was forced to explicitly reject the quasi-official claim that the Emperor of Japan was an "arahitogami", i.e., an incarnate divinity. This was motivated by the fact that, according to the Japanese constitution of 1889, the Emperor had a divine power over his country which derived from the Shinto belief that the Japanese Imperial Family was the offspring of the sun goddess Amaterasu. Hirohito was, however, persistent in the idea that the Emperor of Japan should be considered a descendant of the gods. In December 1945, he told his vice-grand-chamberlain Michio Kinoshita: "It is permissible to say that the idea that the Japanese are descendants of the gods is a false conception; but it is absolutely impermissible to call chimerical the idea that the Emperor is a descendant of the gods." In any case, the "renunciation of divinity" was noted more by foreigners than by Japanese, and seems to have been intended for the consumption of the former. The theory of a constitutional monarchy had already had some proponents in Japan. In 1935, when Tatsukichi Minobe advocated the theory that sovereignty resides in the state, of which the Emperor is just an organ (the "tennō kikan setsu"), it caused a furor. He was forced to resign from the House of Peers and his post at the Tokyo Imperial University, his books were banned, and an attempt was made on his life. Not until 1946 was the tremendous step taken to alter the Emperor's title from "imperial sovereign" to "constitutional monarch."
Although the Emperor had supposedly repudiated claims to divinity, his public position was deliberately left vague, partly because General MacArthur thought he was likely to be a useful partner in getting the Japanese to accept the occupation, and partly due to behind-the-scenes maneuvering by Shigeru Yoshida to thwart attempts to cast him as a European-style monarch.
Nevertheless, Hirohito's status as a limited constitutional monarch was formalized with the enactment of the 1947 Constitution, officially an amendment to the Meiji Constitution. It defined the Emperor as "the symbol of the state and the unity of the people," and stripped him of even nominal power in government matters. His role was limited to matters of state as delineated in the Constitution, and in most cases his actions in that realm were carried out in accordance with the binding instructions of the Cabinet.
Following the Iranian Revolution and the end of the short-lived Central African Empire, both in 1979, Hirohito found himself the last monarch in the world to bear any variation of the highest royal title "emperor." By pure coincidence he was also the longest-reigning monarch in the world by this time, which meant that he was ranked first in the diplomatic order of precedence which distinguishes monarchs only by time in office and not by title.
For the rest of his life, Hirohito was an active figure in Japanese life and performed many of the duties commonly associated with a constitutional head of state. He and his family maintained a strong public presence, often holding public walkabouts and making appearances at special events and ceremonies. In 1947, for example, the Emperor made a public visit to Hiroshima and gave a speech to a massive crowd, encouraging the city's citizens. He also played an important role in rebuilding Japan's diplomatic image, traveling abroad to meet many foreign leaders, including Queen Elizabeth II (1971) and President Gerald Ford (1975). He was not only the first reigning emperor to travel beyond Japan, but also the first to meet a President of the United States. His status and image became strongly positive in the United States.
In 1971 (Showa 46), the Emperor toured seven European countries, including the United Kingdom, the Netherlands, and, once again, Switzerland, for 17 days from 27 September to 14 October. Unlike his previous visit, made by ship, he traveled aboard a special Japan Airlines Douglas DC-8. Though not counted as an official visit, the Emperor made a stopover in Anchorage, Alaska, where he met United States President Richard Nixon, who had come from Washington, D.C., at the Alaska District Army Command House at Elmendorf Air Force Base.
The talks between Emperor Showa and President Nixon had not been planned at the outset, since the United States stop was initially intended only for refueling en route to Europe; the meeting was arranged hurriedly at the request of the United States. Although the Japanese side accepted the request, Minister for Foreign Affairs Takeo Fukuda telephoned the Japanese ambassador to the United States, Nobuhiko Ushiba, who had promoted the talks, saying, "That will cause me a great deal of trouble. We want to correct the perceptions of the other party." Fukuda was worried that President Nixon would use the talks with the Emperor to repair the deteriorating Japan-U.S. relations, and was concerned that the premise of the symbolic emperor system could be shaken.
The tour began warmly: royal ties ran deep in Denmark and Belgium, and the couple were warmly welcomed in France, where Hirohito was reunited with the Duke of Windsor, the former Edward VIII, who had been forced to leave Britain after his abdication and lived in virtual exile; the two chatted for a while. However, protests were held in Britain and the Netherlands by veterans who had served in the South-East Asian theatre. In the Netherlands, right-wing demonstrators threw raw eggs and vacuum flasks at the imperial couple; the protests were so severe that Empress Kojun, who accompanied the Emperor, was left exhausted. In the United Kingdom, protesters stood in silence and turned their backs as the Emperor's carriage passed, while others wore red gloves to symbolize the dead. The satirical magazine "Private Eye" used a racist double entendre to refer to the emperor's visit ("nasty Nip in the air").
At a press conference on 12 November, after returning to Japan, Emperor Showa said that the protests and opposition had not surprised him, as he had received reports in advance, and that he did not think the welcome he received from each country could be ignored. At a press conference three years later marking his golden wedding anniversary with the Empress, he mentioned this visit to Europe as his most enjoyable memory in 50 years.
In 1975, at the invitation of then-US President Gerald R. Ford, the Emperor visited the United States for 14 days, from 30 September to 14 October; it was the first visit to the United States by a reigning Japanese emperor. The United States Army, Navy, and Air Force, as well as the Marine Corps and the Coast Guard, honored the state visit. Before and after the Emperor's trip, anti-American left-wing organizations such as the East Asia Anti-Japan Armed Front carried out a series of terrorist attacks in Japan.
After arriving in Williamsburg, Emperor Showa stayed in the United States for two weeks and, confounding prior expectations, was warmly welcomed everywhere he went, including Washington, D.C. and Los Angeles. The official meeting with President Ford took place on 2 October; the Emperor laid flowers at the graves of unknown soldiers at Arlington National Cemetery on 3 October and visited the Rockefeller house in New York on 4 October, with U.S. media placing his photograph on their front pages. During the New York visit, the Pearl Harbor Survivors Association, made up of survivors of the attack on Pearl Harbor, adopted a resolution welcoming the Emperor. Throughout the visit he showed his scholarly side, spending many occasions at botanical gardens.
In a speech at the White House state dinner, Hirohito read out thanks to the United States for helping to rebuild Japan after the war. During his stay in Los Angeles he visited Disneyland; a smiling photo of him next to Mickey Mouse adorned the newspapers, and there was talk of his purchase of a Mickey Mouse watch. Two types of commemorative stamps and stamp sheets were issued on the day of the couple's return to Japan, a sign of what a major undertaking the visit had been. This was Emperor Showa's last visit to the United States. The official press conferences held by the Emperor and Empress before and after the visit also marked a breakthrough.
The Emperor was deeply interested in and well-informed about marine biology, and the Imperial Palace contained a laboratory from which the Emperor published several papers in the field under his personal name "Hirohito." His contributions included the description of several dozen species of Hydrozoa new to science.
Emperor Hirohito maintained an official boycott of the Yasukuni Shrine from 1978 until his death, after it was revealed to him that Class-A war criminals had secretly been enshrined there following its post-war rededication. The boycott has been maintained by his son Akihito.
On 20 July 2006, "Nihon Keizai Shimbun" published a front-page article about the discovery of a memorandum detailing the reason that the Emperor stopped visiting Yasukuni. The memorandum, kept by former chief of the Imperial Household Agency Tomohiko Tomita, confirms for the first time that the enshrinement of 14 Class-A war criminals in Yasukuni was the reason for the boycott. Tomita recorded the contents of his conversations with the Emperor in detail in his diaries and notebooks. According to the memorandum, in 1988 the Emperor expressed his strong displeasure at the decision made by Yasukuni Shrine to include Class-A war criminals in the list of war dead honored there, saying, "At some point, Class-A criminals became enshrined, including Matsuoka and Shiratori. I heard Tsukuba acted cautiously." Tsukuba is believed to refer to Fujimaro Tsukuba, the chief Yasukuni priest at the time, who decided not to enshrine the war criminals despite having received, in 1966, the list of war dead compiled by the government. The Emperor continued: "What's on the mind of Matsudaira's son, who is the current head priest? Matsudaira had a strong wish for peace, but the child didn't know the parent's heart. That's why I have not visited the shrine since. This is my heart." Matsudaira is believed to refer to Yoshitami Matsudaira, who was the grand steward of the Imperial Household immediately after the end of World War II. His son, Nagayoshi, succeeded Fujimaro Tsukuba as chief priest of Yasukuni and decided to enshrine the war criminals in 1978. Nagayoshi Matsudaira died in 2006, which some commentators have speculated is the reason for the release of the memo.
On 22 September 1987, the Emperor underwent surgery on his pancreas after having digestive problems for several months. The doctors discovered that he had duodenal cancer. The Emperor appeared to be making a full recovery for several months after the surgery. About a year later, however, on 19 September 1988, he collapsed in his palace, and his health worsened over the next several months as he suffered from continuous internal bleeding. On 7 January 1989, at 7:55 AM, the grand steward of Japan's Imperial Household Agency, Shoichi Fujimori, officially announced the death of Emperor Hirohito at 6:33 AM and revealed details about his cancer for the first time. Hirohito was survived by his wife, his five surviving children, ten grandchildren, and one great-grandchild.
At the time of his death, he was both the longest-lived and longest-reigning historical Japanese emperor, as well as the longest-reigning monarch in the world at the time. The latter distinction passed to King Bhumibol Adulyadej of Thailand.
The Emperor was succeeded by his son Akihito, whose enthronement ceremony was held on 12 November 1990.
The Emperor's death ended the Shōwa era; the new Heisei era took effect at midnight the following day. From 7 January until 31 January, the Emperor's formal appellation was "Departed Emperor." His definitive posthumous name, Shōwa Tennō, was determined on 13 January and formally released on 31 January by Toshiki Kaifu, the prime minister.
On 24 February, Emperor Hirohito's state funeral was held, and unlike that of his predecessor, it was formal but not conducted in a strictly Shinto manner. A large number of world leaders attended the funeral. Emperor Hirohito is buried in the Musashi Imperial Graveyard in Hachiōji, alongside Emperor Taishō, his father.
Emperor Showa and Empress Kojun had seven children, two sons and five daughters.
Emsworth
Emsworth is a small town in Hampshire on the south coast of England, near the border with West Sussex. It lies at the north end of an arm of Chichester Harbour, a large and shallow inlet from the English Channel, and is roughly equidistant from Portsmouth and Chichester.
Emsworth has a population of approximately 10,000. The town has a basin for small yachts and fishing boats, which fills at high tide and can be emptied through a sluice at low tide. In geodemographic terms, the town is the heart of the cross-county Emsworth built-up area, the remainder of which comprises Westbourne, Southbourne and Nutbourne; the area had a combined population of 18,777 in 2011, a density of 30.5 people per hectare, and is served by two railway stations.
Emsworth began as a Saxon village. At first it was linked to the settlement of Warblington nearby. People from Emsworth worshipped at St Peter's Chapel or in the church at Warblington. Emsworth was not mentioned in the Domesday Book of 1086, as it was included with Warblington.
Emsworth's name derives from the Anglo-Saxon "Æmeles worþ", meaning "a man called Æmele's enclosure".
Emsworth grew larger and more important: in 1239 it was granted the right to hold a market, and there was also an annual fair. In 1332, Emsworth ("Empnesworth") was one of Hampshire's four Customs Ports.
During the 18th and 19th centuries, Emsworth was still a port. Emsworth was known for shipbuilding, boat building and rope making. Grain from the area was ground into flour by tidal mills and transported by ship to places such as London and Portsmouth. Timber from the area was also exported in the 18th and 19th centuries. The River Ems, which is named after the town (not, as often believed, the town being named after the river), flows into the Slipper millpond. The mill itself is now used as offices.
In the 19th century Emsworth had as many as 30 pubs and beer houses; today, only nine remain.
At the beginning of the 19th century, Emsworth had a population of less than 1,200, though it was still considered a large village for the time. By the end of the 18th century it had become fashionable for wealthy people to spend the summer by the sea, and in 1805 a bathing house was built where people could bathe in seawater.
The parish Church of St James was built in 1840. Queen Victoria visited Emsworth in 1842, resulting in Queen Street and Victoria Road being named after her. In 1847 the London, Brighton and South Coast Railway (now the West Coastway line) came to Emsworth, with a railway station built to serve the town.
By 1901 the population of Emsworth was about 2,000; it grew rapidly during the 20th century, reaching about 5,000 by mid-century. In 1906 construction began on the post office, with local cricketer George Wilder laying an inscribed brick. The Emsworth Recreation Ground, so renamed in 1909, is the current home of Emsworth Cricket Club, which was founded in 1811; cricket has been played on the same ground, Cold Harbour Lawn, since 1761.
In 1902 the once-famous Emsworth oyster industry went into rapid decline after many guests at mayoral banquets in Southampton and Winchester became seriously ill, and four died, from eating oysters. The infection was traced to oysters sourced from Emsworth, whose oyster beds had been contaminated with raw sewage. Oyster fishing at Emsworth was halted until new sewers were dug, but the industry never completely recovered. Recently, Emsworth's last remaining oyster boat, "The Terror," was restored and is now sailing again.
During the Second World War, nearby Thorney Island was used as a Royal Air Force station and played a defensive role in the Battle of Britain. The north of Emsworth at this time was used for growing flowers, and further north lay woodland (today Hollybank Woods). In the run-up to D-Day, the Canadian Army used these woods as one of its pre-invasion assembly points for men and materiel; the foundations of their barracks can still be seen. In the 1960s large parts of this area were developed with a mix of bungalows and terraced housing.
For a few years (2001 to 2007), Emsworth held a food festival, the largest event of its type in the UK, with more than 50,000 visitors in 2007. The festival was cancelled after numerous complaints about disruption to nearby residents and businesses.
The harbour is now used almost exclusively for recreational sailing. The town has two sailing clubs, Emsworth Sailing Club (established in 1919) and Emsworth Slipper Sailing Club (established in 1921), the latter based at Quay Mill, a former tide mill. Both clubs organise a programme of racing and social events during the sailing season.
In April 2014, Emsworth Sailing Club received national media coverage when retired Royal Navy Captain Clifford 'John' Caughey drove his car into the clubhouse, causing a loud explosion and requiring thirty firefighters to extinguish the blaze.
The Emsworth Museum is administered by the Emsworth Maritime & Historical Trust; it is run by volunteers and has limited opening hours.
Emsworth is twinned with Saint-Aubin-sur-Mer in Normandy, France.
The town is part of the Havant constituency, which has been a Conservative seat since the 1983 election. The current Member of Parliament is Alan Mak. The town is represented on Havant Borough Council by Councillors Colin Mackey, Rivka Cresswell and Lulu Bowerman; the local Hampshire County Councillor is Ray Bolton. The town has branches of the Conservative Party, the Liberal Democrats, the Labour Party and the United Kingdom Independence Party.
Emsworth railway station is on the West Coastway Line. It has services that run to Portsmouth, Southampton, Brighton and London Victoria.
Stagecoach South operates the number 700 bus route between Brighton and Southsea.